WorldWideScience

Sample records for modelling assessment irma

  1. Utilizing NASA Earth Observations to Assess Impacts of Hurricanes Andrew and Irma on Mangrove Forests in Biscayne Bay National Park, FL

    Science.gov (United States)

    Kumar, A.; Weber, S.; Remillard, C.; Escobar Pardo, M. L.; Hashemi Tonekaboni, N.; Cameron, C.; Linton, S.; Rickless, D.; Rivero, R.; Madden, M.

    2017-12-01

Extreme weather events, such as hurricanes, pose major threats to coastal communities around the globe. However, mangrove forests along coastlines act as barriers and subdue the impacts associated with these catastrophic events. The Biscayne Bay National Park mangrove forest, located near the city of Miami Beach, was recently affected by Category 4 Hurricane Irma in September 2017. This study analyzed the impact of Hurricane Irma on Biscayne Bay National Park mangroves. Several remote sensing datasets, including Landsat 8 Operational Land Imager (OLI), Sentinel-2 Multispectral Imager (MSI), PlanetScope, and aerial imagery, were utilized to assess pre- and post-hurricane conditions. The high-resolution aerial imagery and PlanetScope data were used to map damaged areas within the national park. Additionally, Landsat 8 OLI and Sentinel-2 MSI data were utilized to estimate changes in biophysical parameters, including gross primary productivity (GPP), before and after Hurricane Irma. This project also examined damage associated with Hurricane Andrew (1992) using historical Landsat 5 Thematic Mapper (TM) data. These results were compared to GPP estimates following Hurricane Irma and suggested that Hurricane Andrew's impact on Biscayne Bay National Park was greater than Irma's. The results of this study will help enhance the mangrove health monitoring and shoreline management programs led by officials at the City of Miami Beach Public Works Department.

  2. Comparative assessment of quality of immunoradiometric assay (IRMA) and chemiluminescence immunometric assay (CHEIMA) for estimation of thyroid stimulating hormone (TSH)

    International Nuclear Information System (INIS)

    Sajid, K.M.

    2009-01-01

Biological substances like hormones, vitamins and enzymes are found in minute quantities in blood. Their estimation requires very sensitive and specific methods. The most modern method for estimation of thyroid stimulating hormone in serum is the non-isotopic, enzyme-enhanced chemiluminescence immunometric method. In our laboratory, the immunoradiometric assay has been in routine use for many years. Recently, interest has grown in establishing non-isotopic techniques in the laboratories of PAEC. However, the main requirement for adopting the new procedures is to compare their results, cost and other benefits with the existing method. The immunoassay laboratory of MINAR therefore conducted a study to compare the two methods. A total of 173 cases (males: 34, females: 139, age: between 1 and 65 years) of clinically confirmed thyroid status were included in the study. Serum samples of these cases were analyzed by the two methods, and results were compared by plotting precision profiles and correlation plots and by calculating the sensitivities and specificities of the methods. As the results in all the samples were not normally distributed, the Wilcoxon rank sum test was applied to compare the analytical results of the two methods. The comparison shows that the results obtained by the two methods are not completely similar (p=0.0003293), although analysis of samples in groups shows that some similarity exists between the results of hypo- and hyperthyroid patients (p<=0.156 and p<=0.6138). This shows that results obtained by these two methods could sometimes disagree in the final diagnosis. Although TSH-CHEIMA is analytically more sensitive than TSH-IRMA, the clinical sensitivities and specificities of the two methods are not significantly different. The TSH-CHEIMA test completes in almost 2 hours, whereas TSH-IRMA takes about 6 hours to complete. A comparison of costs shows that TSH-CHEIMA is almost 5 times more expensive than TSH-IRMA. We conclude that the two methods could sometimes disagree but the two techniques have almost same
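The rank-sum comparison described above can be sketched in a few lines. The helper below is a hypothetical normal-approximation implementation (not the authors' software), and the sample values are invented:

```python
# Minimal Wilcoxon rank-sum (Mann-Whitney) sketch with a normal
# approximation; tie handling uses average ranks. Illustrative only.
import math

def rank_sum_test(x, y):
    """Return the z-statistic and two-sided p-value for group x vs. group y."""
    combined = sorted((v, g) for g, vs in ((0, x), (1, y)) for v in vs)
    vals = [v for v, _ in combined]
    n = len(vals)
    ranks = [0.0] * n
    i = 0
    while i < n:                       # assign average ranks to tied values
        j = i
        while j < n and vals[j] == vals[i]:
            j += 1
        avg = (i + j + 1) / 2          # 1-based ranks i+1 .. j averaged
        for k in range(i, j):
            ranks[k] = avg
        i = j
    w = sum(ranks[k] for k in range(n) if combined[k][1] == 0)
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2        # mean of the rank sum under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```

Applied to TSH results from the two assays, a small p-value (as with the study's p=0.0003293) would indicate the methods' results are not drawn from the same distribution.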

  3. Nowcasting, forecasting and hindcasting Harvey and Irma inundation in near-real time using a continental 2D hydrodynamic model

    Science.gov (United States)

    Sampson, C. C.; Wing, O.; Quinn, N.; Smith, A.; Neal, J. C.; Schumann, G.; Bates, P.

    2017-12-01

During an ongoing natural disaster, data are required on: (1) the current situation (nowcast); (2) its likely immediate evolution (forecast); and (3) a consistent post-event view of what actually happened (hindcast or reanalysis). We describe methods used to achieve all three tasks for flood inundation during the Harvey and Irma events using a continental-scale 2D hydrodynamic model (Wing et al., 2017). The model solves the local inertial form of the shallow water equations over a regular grid of 1 arc second (~30 m). Terrain data are taken from the USGS National Elevation Dataset, with known flood defences represented using the U.S. Army Corps of Engineers National Levee Dataset. Channels are treated as sub-grid-scale features using the HydroSHEDS global hydrography dataset. The model is driven using river flows, rainfall and coastal water levels. It simulates river flooding in basins > 50 km², and fluvial and coastal flooding everywhere. Previous wide-area validation tests show this model to be capable of matching FEMA maps and USGS local models built with bespoke data, with hit rates of 86% and 92% respectively (Wing et al., 2017). Boundary conditions were taken from NOAA QPS data to produce nowcast and forecast simulations in near real time, before updating with NOAA observations to produce the hindcast. During the event, simulation results were supplied to major insurers and multinationals, who used them to estimate their likely capital exposure and to mitigate flood damage to their infrastructure whilst the event was underway. Simulations were validated against modelled flood footprints computed by FEMA and USACE, and composite satellite imagery produced by the Dartmouth Flood Observatory. For the Harvey event, hit rates ranged from 60-84% against these data sources, but a lack of metadata meant it was difficult to perform like-for-like comparisons. The satellite data also appeared to miss known flooding in urban areas that was picked up in the models. Despite
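The hit-rate scores quoted for validation can be read as a simple binary-pattern metric: the fraction of benchmark-wet cells that the model also marks wet. A minimal sketch, assuming wet/dry grids flattened to boolean lists (the function name is ours, not from the model code):

```python
def hit_rate(model_wet, benchmark_wet):
    """Fraction of benchmark-flooded cells the model also flags as flooded."""
    hits = sum(1 for m, b in zip(model_wet, benchmark_wet) if b and m)
    wet = sum(1 for b in benchmark_wet if b)
    return hits / wet if wet else float("nan")
```

A hit rate of 86% against FEMA maps, for example, means the model reproduced 86% of the benchmark's flooded area; the metric says nothing about over-prediction, which is one reason like-for-like comparisons need consistent metadata.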

  4. Hva er det med Irma?

    Directory of Open Access Journals (Sweden)

    Anne Beate Reinertsen

    2016-10-01

Full Text Available Abstract: This article is about formative quality assessment in a posthuman or newmaterial perspective; embodied knowledges. Theory and method are written together in immanence to envision complexity. The aim and scope of the article is to follow the flow of events Irma produces, opening up for an affirmative poetical critique praxis. Quality is forwarded as an intensity and force in a moment only. The moment is therefore the only structure of the text, as moving quality. The intention and inner logic of the text is therefore designed to work against fixed definitions and conceptualizations of what quality and quality assessment are. In this way I hope to show what posthuman and newmaterial approaches can contribute to building cultures of innovation in which quality is assessed and produced again and again. They put differences to work and open up for creating moments of educational justice. Sammendrag (translated from Norwegian): The article is about quality assessment in a newmaterial perspective: embodied knowledge. Theory and method are written together in immanent simultaneity to give a picture of complexity. Both the purpose and aim of the article is to follow the flow that Irma as an event produces, in order to open up potentiality for an affirming or affirmative poeticizing critique practice and a new understanding of, and assessment for, quality. Quality is written forth as an event, intensity or force in a moment. The moment is therefore the text's only and load-bearing structure, as quality in motion. The inner logic of the text is thus intended to counteract attempts to create defined, fixed or definite conceptions of what quality is and how quality can be assessed. In this way I hope to show what newmaterial perspectives can do to build cultures of innovation in which quality is assessed and produced again and again. They set difference in motion and open up for creating just educational moments.

  5. Irma G. Enriquez-Maldonado

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics. Irma G. Enriquez-Maldonado. Articles written in Journal of Genetics. Volume 88 Issue 2 August 2009 pp 249-252 Research Note. Association of matrix metalloproteinase-2 gene promoter polymorphism with myocardial infarction susceptibility in a Mexican population.

  6. West Florida Shelf Response to Hurricane Irma

    Science.gov (United States)

    Liu, Y.; Weisberg, R. H.; Chen, J.; Merz, C. R.; Law, J.; Zheng, L.

    2017-12-01

Hurricane Irma impacted the west Florida continental shelf (WFS) as it transited the state of Florida during September 10-12, 2017, making landfall first at Cudjoe Key and then again at Naples as a Category 2 hurricane. The WFS response to Hurricane Irma is analyzed using a combination of in situ observations and numerical model simulations. The observations include water-column velocity (from Acoustic Doppler Current Profilers), sea surface temperature and meteorological records from three moorings on the shelf, surface currents from high-frequency radars, and coastal tide gauge records. The West Florida Coastal Ocean Model (WFCOM) employed here downscales from the deep Gulf of Mexico, across the shelf and into the estuaries by nesting the unstructured-grid FVCOM in the Gulf of Mexico HYCOM. Both the observations and the model simulations revealed strong upwelling and vertical mixing followed by downwelling as the storm passed by. This was accompanied by a rapid drop in sea surface temperature of approximately 4ºC and large decreases in sea level with associated negative surges, causing drying in the Florida Bay, Charlotte Harbor and Tampa Bay estuaries and the Big Bend region. The transport and exchange of water between the shelf and the estuaries, and between the shelf and the Florida Keys reef tract, during the hurricane may have important implications for ecosystem studies within the region.

  7. Irma Optimisti "Female mathematics" / Raivo Kelomees

    Index Scriptorium Estoniae

    Kelomees, Raivo, 1960-

    2007-01-01

On Irma Optimisti's exhibition "Female mathematics" at the Muu gallery in Helsinki, part of the project "Conceptuality and craftsmanship (Käsitteellisyys ja käsityöläisyys), or woman and technology". Also published in the newspaper Eesti Päevaleht, 27 February 1996.

  8. Rapid-response flood mapping during Hurricanes Harvey, Irma and Maria by the Global Flood Partnership (GFP)

    Science.gov (United States)

    Cohen, S.; Alfieri, L.; Brakenridge, G. R.; Coughlan, E.; Galantowicz, J. F.; Hong, Y.; Kettner, A.; Nghiem, S. V.; Prados, A. I.; Rudari, R.; Salamon, P.; Trigg, M.; Weerts, A.

    2017-12-01

The Global Flood Partnership (GFP; https://gfp.jrc.ec.europa.eu) is a multi-disciplinary group of scientists, operational agencies and flood risk managers focused on developing efficient and effective global flood management tools. Launched in 2014, its aim is to establish a partnership for global flood forecasting, monitoring and impact assessment to strengthen preparedness and response and to reduce global disaster losses. International organizations, the private sector, national authorities, universities and research agencies contribute to the GFP on a voluntary basis and benefit from a global network focused on flood risk reduction. At the onset of Hurricane Harvey, the GFP was 'activated' using email requests via its mailing service. Soon after, flood inundation maps, based on remote sensing analysis and modeling, were shared by different agencies, institutions, and individuals. These products were disseminated, with varying degrees of effectiveness, to federal, state and local agencies via email and data-sharing services. This generated a broad data-sharing network, which was utilized at the early stages of Hurricane Irma's impact just two weeks after Harvey. In this presentation, we will describe the extent and chronology of the GFP response to Hurricanes Harvey, Irma and Maria. We will assess the potential usefulness of this effort for event managers in various types of organizations and discuss future improvements to be implemented.

  9. Estimating the human influence on Hurricanes Harvey, Irma and Maria

    Science.gov (United States)

    Wehner, M. F.; Patricola, C. M.; Risser, M. D.

    2017-12-01

Attribution of the human-induced climate change influence on the physical characteristics of individual extreme weather events has become an advanced science over the past decade. However, it is only recently that such quantification of anthropogenic influences on event magnitudes and probabilities of occurrence could be applied to very extreme storms such as hurricanes. We present results from two different classes of attribution studies for the impactful Atlantic hurricanes of 2017. The first is an analysis of the record rainfall amounts during Hurricane Harvey in the Houston, Texas area. We analyzed observed precipitation from the Global Historical Climatology Network with a covariate-based extreme value statistical analysis, accounting for both the external influence of global warming and the internal influence of ENSO. We found that human-induced climate change likely increased Hurricane Harvey's total rainfall by at least 19%, and likely increased the chances of the observed rainfall by a factor of at least 3.5. This suggests that changes exceeded Clausius-Clapeyron scaling, motivating attribution studies using dynamical climate models. The second analysis consists of two sets of hindcast simulations of Hurricanes Harvey, Irma, and Maria using the Weather Research and Forecasting model (WRF) at 4.5 km resolution. The first uses realistic boundary and initial conditions and present-day greenhouse gas forcings, while the second uses perturbed conditions and pre-industrial greenhouse gas forcings to simulate counterfactual storms without anthropogenic influences. These simulations quantify the fraction of Harvey's precipitation attributable to human activities and test the super-Clausius-Clapeyron scaling suggested by the observational analysis. We will further quantify the human influence on intensity for Harvey, Irma, and Maria.
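The Clausius-Clapeyron comparison in the abstract can be made concrete with a small sketch. The ~7%-per-kelvin moisture-scaling rate and the ~1 K warming figure used below are illustrative textbook assumptions, not values taken from the study:

```python
def cc_expected_increase(delta_t_kelvin, rate_per_k=0.07):
    """Clausius-Clapeyron scaling: ~7% more atmospheric moisture per
    kelvin of warming, compounded over delta_t_kelvin."""
    return (1 + rate_per_k) ** delta_t_kelvin - 1

def risk_ratio(p_factual, p_counterfactual):
    """Probability (risk) ratio used in event attribution: how much more
    likely the event is in the factual vs. counterfactual climate."""
    return p_factual / p_counterfactual
```

Under an assumed ~1 K of warming, cc_expected_increase(1.0) gives about 0.07, well below the observed >=19% rainfall increase; that gap is the "super Clausius-Clapeyron" signal the dynamical hindcasts are designed to test.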

  10. Kinetic Consideration of AFP IRMA assay

    International Nuclear Information System (INIS)

    Aly, M. A.; Moustafa, K.A.

    2003-01-01

Alpha-fetoprotein (AFP) is a glycoprotein produced by the yolk sac and later by the fetal liver during pregnancy. When the neural tube is not properly formed, large amounts of AFP pass into the amniotic fluid and reach the mother's blood. During pregnancy, the major interest in AFP determination in maternal serum and amniotic fluid is in the early diagnosis of fetal abnormalities. AFP is also used as a tumor marker for hepatocellular carcinoma. There are many different techniques for measuring AFP in blood, but the more accurate one is the immunoassay technique. The kinetics of the interaction between the AFP antigen and two matched antibodies, one labeled with the radioactive isotope 125I (tracer) and the other unlabelled and attached to a solid support (tube), are studied using the two-site (sandwich) immunoradiometric assay (IRMA) technique. We present here a method for determining the rate constants, using an advanced computer program (RKY) based on the Nelder-Mead optimization principle. The rate constants at three temperatures and three different antigen concentrations, as well as the half-time of exchange (t1/2), were calculated
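The binding kinetics behind such a measurement can be illustrated under a standard pseudo-first-order assumption (antigen in excess); this is a textbook sketch with invented rate constants, not the RKY program's actual model:

```python
# Reversible binding Ag + Ab <-> Ag:Ab under pseudo-first-order conditions.
import math

def bound_fraction(t, k_on, k_off, ag0):
    """Fraction of tracer bound at time t; k_on*ag0 and k_off set the
    observed relaxation rate k_obs toward equilibrium."""
    k_obs = k_on * ag0 + k_off
    eq = k_on * ag0 / k_obs            # equilibrium bound fraction
    return eq * (1.0 - math.exp(-k_obs * t))

def half_time(k_on, k_off, ag0):
    """Half-time of exchange, t1/2 = ln 2 / k_obs."""
    return math.log(2.0) / (k_on * ag0 + k_off)
```

By construction, the bound fraction at t1/2 is exactly half of its equilibrium value, which is the property a fitting routine (e.g. a Nelder-Mead search over k_on and k_off) exploits when matching timed counts.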

  11. Emergency Response Imagery Related to Hurricanes Harvey, Irma, and Maria

    Science.gov (United States)

    Worthem, A. V.; Madore, B.; Imahori, G.; Woolard, J.; Sellars, J.; Halbach, A.; Helmricks, D.; Quarrick, J.

    2017-12-01

NOAA's National Geodetic Survey (NGS) Remote Sensing Division acquired and rapidly disseminated emergency response imagery related to the three recent hurricanes Harvey, Irma, and Maria. Aerial imagery was collected using a Trimble Digital Sensor System, a high-resolution digital camera, by means of NOAA's King Air 350ER and DeHavilland Twin Otter (DHC-6) aircraft. The emergency response images are used to assess the before-and-after effects of the hurricanes' damage. The imagery aids emergency responders, such as FEMA, the Coast Guard, and other state and local governments, in developing recovery strategies and efforts by prioritizing the areas most affected and distributing appropriate resources. Collected imagery is also used to provide damage assessment for use in long-term recovery and rebuilding efforts. Additionally, the imagery allows evacuated persons to see images of their homes and neighborhoods remotely. Each of the individual images is processed through ortho-rectification and merged into a uniform mosaic image. These remotely sensed datasets are publicly available and often used by web-based map servers as well as federal, state, and local government agencies. This poster will show the imagery collected for these three hurricanes and the processes involved in getting data quickly into the hands of those that need it most.

  12. Tracking the Aftermath of Irma in Antigua and Barbuda

    Science.gov (United States)

    Friedman, E.; Look, C.

    2017-12-01

The twin-island nation of Antigua and Barbuda was among the first places heavily impacted by Hurricane Irma. The powerful imagery generated of destruction and abandonment stood as a warning for many in the U.S. Virgin Islands, Puerto Rico, and the coastal United States. This paper presents findings on how resilience in the aftermath of Irma's destruction has functioned to distinguish those who are sustainable from those who are not. The two sister islands experienced completely different outcomes from Hurricane Irma, with Antigua relatively 'untouched' and approximately 90% of Barbuda destroyed, presenting a contradictory identity of the twin-island nation both as a victim of climate change and as a land of economic opportunity, "open for business". This contradiction will be unpacked through analysis of language from formal practitioner interviews, informal unstructured discussions, local climate-risk reduction policies, local newspapers, and social media.

  13. Impact of Hurricane Irma in the post-recovery of Matthew in South Carolina, the South Atlantic Bight (Western Atlantic)

    Science.gov (United States)

    Harris, M. S.; Levine, N. S.; Jaume, S. C.; Hendricks, J. K.; Rubin, N. D.; Hernandez, J. L.

    2017-12-01

The impacts of Hurricane Irma in September 2017 on the Southeastern United States (SEUS, Western Atlantic) were felt primarily on the active coastline, with the third-highest inland storm surge in Charleston and Savannah since the 19th century. Coastal geometry, waves, and wind duration had a strong influence on the storm surge and coastal erosion impacts regionally. To the north and immediate south, impacts were much less. A full year after the 2016 hurricane season (Hurricane Matthew), the lack of regional recovery reduced protection against Irma. The most devastating impacts of Irma in the South Atlantic Bight (SAB) occurred 300 to 500 km away from the eye, on the opposite side of the Floridian peninsula. As Irma devastated the Caribbean, winds started to increase off the SAB in the early morning of September 8, continuing for the next 3 days and blowing directly towards the SC and GA coasts. Tide gauges started to respond the night of September 8, while waves started arriving in the SEUS around September 6. Coastal erosion pre- and post-Irma has been calculated for central SC using vertical and oblique aerial photos. Citizen science initiatives through the Charleston Resilience Network have provided on-the-ground data during storms when transportation infrastructure was closed, and allow for post-storm ground-truthing of surge and impacts. This information was collected through Facebook, Google, and other social media. Pictures with timestamps and water heights were collected and are validating inundation flood maps generated for the Charleston, SC region. The maps have 1-m horizontal and 7- to 15-cm vertical accuracy. Inundation surfaces were generated at MHHW up to a maximum surge in 6-inch increments. The flood extents of the modeled surge and the photographic evidence show a high correspondence. Storm surge measurements from RTK-GPS provide regional coverage of surge elevations from the coast inland, and allow for testing of modeled results and model tuning. With Hurricane Irma
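The incremental inundation surfaces can be generated from a simple list of water levels. A minimal sketch, where the 0.1524 m (6-inch) step above MHHW comes from the abstract but the function name and tolerance handling are ours:

```python
def surge_levels(max_surge_m, step_m=0.1524):
    """Water levels above MHHW in 6-inch (0.1524 m) steps, up to the
    maximum surge; each level would seed one inundation surface."""
    levels = []
    h = step_m
    while h <= max_surge_m + 1e-9:     # tolerance guards float round-off
        levels.append(round(h, 4))
        h += step_m
    return levels
```

Each returned level would be intersected with the DEM (1-m horizontal, 7- to 15-cm vertical accuracy) to produce one flood-extent polygon for comparison against the geotagged photos.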

  14. Magnetic particle separation technique: a reliable and simple tool for RIA/IRMA and quantitative PCR assay

    International Nuclear Information System (INIS)

    Shen Rongsen; Shen Decun

    1998-01-01

Five types of magnetic particles, without or with aldehyde, amino and carboxyl functional groups respectively, were used to immobilize first or second antibody by three models, i.e., physical adsorption, chemical coupling and immuno-affinity, forming four types of magnetic particle antibodies. The second antibody immobilized on polyacrolein magnetic particles through aldehyde functional groups and the first antibodies immobilized on carboxylic polystyrene magnetic particles through carboxyl functional groups are recommended for RIAs and/or IRMAs. Streptavidin immobilized on commercial magnetic particles through amino functional groups was successfully applied to separating the specific PCR product for quantification of human cytomegalovirus. In the paper, typical data on the reliability of these magnetic particle ligands are reported and the simplicity of the magnetic particle separation technique is discussed. The results showed that the technique is a reliable and simple tool for RIA/IRMA and quantitative PCR assays. (author)

  15. Proposal for the development of IRMA kits for prostate specific antigen, PSA

    International Nuclear Information System (INIS)

    Abdul, A.B.

    1997-01-01

The following are the major objectives of this research proposal: (1) To establish a protocol for biotinylation of monoclonal antibodies (mabs) or polyclonal antibodies against the antigen PSA. This shall include the purification procedure using size exclusion chromatography on HPLC for use in binding assays to determine binding capacity with PSA. (2) To establish an immunoassay protocol for IRMAs, using the technique of immobilizing the capture mabs on a solid phase (surfaces of polystyrene) and the radioiodine-labeled streptavidin-biotinylated bridge system. This will include optimization of the assay design and a Quality Control Assessment with the inclusion of standards derived from the Agency, and subsequent work to determine the assay's sensitivity (Minimum Detection Limit) and working range (the high-dose hook effect phenomenon). An in-house quality control would also be useful to determine the assay's suitability for screening the tumour marker in patient samples obtained from neighboring hospitals (such as the Science University of Malaysia Hospital and the National University Hospital) and private clinical pathology laboratories (such as the Pantai Medical Centre), comparing the results concurrently with existing commercial immunoassay kits (RIA/IRMA). This work and that described earlier in (1) shall be done entirely at MINT. (3) To perform a coordinated external Quality Control Assurance Programme with other research institutes (such as the Department of Immunology, Medical Faculty, Science University of Malaysia and government hospitals) in Malaysia on several batches of the IRMA kits (produced at MINT and proven to be suitable for screening PSA in human serum from in-house Quality Control data, as mentioned earlier in (2)).
This coordinated work shall include analyzing and documenting all values obtained from a group of patient samples in clinical conditions, such as batch-to-batch variation, inter- and intra-assay variations, and mean values for negative

  16. Using High-Resolution Imagery to Characterize Disturbance from Hurricane Irma in South Florida Wetlands

    Science.gov (United States)

    Lagomasino, D.; Cook, B.; Fatoyinbo, T.; Morton, D. C.; Montesano, P.; Neigh, C. S. R.; Wooten, M.; Gaiser, E.; Troxler, T.

    2017-12-01

Hurricane Irma, one of the strongest hurricanes recorded in the Atlantic, first made landfall in the Florida Keys before coming ashore in southwestern Florida near Everglades National Park (ENP) on September 9th and 10th of this year. Strong winds and storm surge impacted a 100+ km stretch of the southern Florida Gulf Coast, resulting in extensive damage to coastal and inland ecosystems. Impacts from previous catastrophic storms in the region have led to irreversible changes to vegetation communities and, in some areas, ecosystem collapse. The processes that drive coastal wetland vulnerability and resilience are largely a function of the severity of the impact to forest structure and ground elevation. Remotely sensed imagery plays an important role in measuring changes to the landscape, particularly for extensive and inaccessible regions like the mangroves in ENP. We have estimated changes in coastal vegetation structure and soil elevation using a combination of repeat measurements from ground, airborne, and satellite platforms. At the ground level, we used before-and-after Structure-from-Motion models to capture the change in below-canopy structure as a result of stem breakage and fallen branches. Using airborne imagery collected before and after Hurricane Irma by Goddard's Lidar, Hyperspectral, and Thermal (G-LiHT) Airborne Imager, we measured the change in forest structure and soil elevation. This unique data acquisition covered an area of over 130,000 ha in the regions most heavily impacted by storm surge. Lastly, we also combined commercial and NASA satellite Earth observations to measure forest structural changes across the entire South Florida coast. An analysis of long-term observations from the Landsat data archive highlights the heterogeneity of hurricane and other environmental disturbances along the Florida coast. These findings captured coastal disturbance legacies that have the potential to influence the trajectory of mangrove resilience and vulnerability

  17. NASA Earth Science Disasters Program Response Activities During Hurricanes Harvey, Irma, and Maria in 2017

    Science.gov (United States)

    Bell, J. R.; Schultz, L. A.; Molthan, A.; Kirschbaum, D.; Roman, M.; Yun, S. H.; Meyer, F. J.; Hogenson, K.; Gens, R.; Goodman, H. M.; Owen, S. E.; Lou, Y.; Amini, R.; Glasscoe, M. T.; Brentzel, K. W.; Stefanov, W. L.; Green, D. S.; Murray, J. J.; Seepersad, J.; Struve, J. C.; Thompson, V.

    2017-12-01

The 2017 Atlantic hurricane season included a series of storms that impacted the United States and the Caribbean, breaking a 12-year drought of landfalls in the mainland United States (Harvey and Irma), with additional impacts from the combination of Irma and Maria felt in the Caribbean. These storms caused widespread devastation, resulting in a significant need to support federal partners in response to these destructive weather events. The NASA Earth Science Disasters Program provided support to federal partners including the Federal Emergency Management Agency (FEMA) and the National Guard Bureau (NGB) by leveraging remote sensing and other expertise through NASA Centers and partners in academia throughout the country. The NASA Earth Science Disasters Program leveraged products from the GPM mission to monitor cyclone intensity, assist with cyclone center tracking, and quantify precipitation. Multispectral imagery from the NASA-NOAA Suomi-NPP mission and the VIIRS Day-Night Band proved useful for monitoring power outages and recovery. Synthetic Aperture Radar (SAR) data from the Copernicus Sentinel-1 satellites operated by the European Space Agency were used to create flood inundation and damage assessment maps that were useful for damage density mapping. Using additional datasets made available through the USGS Hazards Data Distribution System and the activation of the International Charter: Space and Major Disasters, the NASA Earth Science Disasters Program created additional flood products from optical and radar remote sensing platforms, along with PI-led efforts to derive products from other international partner assets such as the COSMO-SkyMed system. Given the significant flooding impacts from Harvey in the Houston area, NASA provided airborne L-band SAR collections from the UAVSAR system, which captured the daily evolution of record flooding, helping to guide response and mitigation decisions for critical infrastructure and public safety.
We

  18. Impact of Hurricane Irma on Little Ambergris Cay, Turks and Caicos

    Science.gov (United States)

    Stein, N.; Grotzinger, J. P.; Hayden, A.; Quinn, D. P.; Trower, L.; Lingappa, U.; Present, T. M.; Gomes, M.; Orzechowski, E. A.; Fischer, W. W.

    2017-12-01

Little Ambergris Cay (21.3° N, 71.7° W) is a 6 km long, 1.6 km wide island on the Caicos platform. The island was the focus of mapping campaigns in July 2016, August 2017, and following Hurricane Irma in September 2017. The cay is lined with lithified upper-shoreface and eolian ooid grainstone forming a 1-4 m high bedrock rim that is locally breached, allowing tides to inundate an interior basin lined with extensive microbial mats. The island was mapped in July 2016 using UAV- and satellite-based images and in situ measurements. Sedimentologic and biofacies were mapped onto a 15 cm/pixel visible-light orthomosaic of the cay made from more than 1500 UAV images, and a corresponding stereogrammetric digital elevation model (DEM) was used to track how microbial mat texture varies in response to water depth. An identical UAV-based visible-light map of the island was made in August 2017. On September 7th, 2017, the eye of Hurricane Irma directly crossed Little Ambergris Cay with sustained winds exceeding 170 mph. The island was remapped with a UAV on September 24th, yielding a 5 cm/pixel UAV-based visible-light orthomosaic and a corresponding DEM. In situ observations and comparison with previous UAV maps show that Irma caused significant channel and bedrock erosion, scouring and removal of broad tracts of microbial mats, and blanketing of large portions of the interior basin by ooid sediment, including smothering of mats by up to 1 m of sediment. The southern rim of the cay was overtopped by water and sediment, indicating a storm surge of at least 3 m. Blocks of rock more than 1 m in length and 50 cm thick were separated from bedrock on the north side of the island and washed higher to form imbricated boulder deposits. Hundreds of 5-30 cm diameter imbricated rip-up intraclasts of rounded microbial mat now line exposed bedrock in the interior basin. Fresh ooid sediment and microbial mats were sampled from three sites: on desiccated mats 50 cm above tide level, on

  19. Significant Wave Height under Hurricane Irma derived from SAR Sentinel-1 Data

    Science.gov (United States)

    Lehner, S.; Pleskachevsky, A.; Soloviev, A.; Fujimura, A.

    2017-12-01

while making landfall on Cuba and the Florida Keys, where Irma still hit as a category 3 to 4 hurricane. Results are compared to the WW3 model, which had not previously been validated over an area under such strong and variable wind conditions. A new theory on hurricane intensification based on the Kelvin-Helmholtz instability is discussed, and a first comparison to the SAR data is given.

  20. Mapping Daily and Maximum Flood Extents at 90-m Resolution During Hurricanes Harvey and Irma Using Passive Microwave Remote Sensing

    Science.gov (United States)

    Galantowicz, J. F.; Picton, J.; Root, B.

    2017-12-01

    Passive microwave remote sensing can provide a distinct perspective on flood events by virtue of wide sensor fields of view, frequent observations from multiple satellites, and sensitivity through clouds and vegetation. During Hurricanes Harvey and Irma, we used AMSR2 (Advanced Microwave Scanning Radiometer 2, JAXA) data to map flood extents starting from the first post-storm rain-free sensor passes. Our standard flood mapping algorithm (FloodScan) derives flooded fraction from 22-km microwave data (AMSR2 or NASA's GMI) in near real time and downscales it to 90-m resolution using a database built from topography, hydrology, and Global Surface Water Explorer data and normalized to microwave data footprint shapes. During Harvey and Irma we tested experimental versions of the algorithm designed to map the maximum post-storm flood extent rapidly and made a variety of map products available immediately for use in storm monitoring and response. The maps have several unique features, including spanning the entire storm-affected area and providing multiple post-storm updates as flood water shifted and receded. From the daily maps we derived secondary products such as flood duration, maximum flood extent (Figure 1), and flood depth. In this presentation, we describe flood extent evolution, maximum extent, and local details as detected by the FloodScan algorithm in the wake of Harvey and Irma. We compare FloodScan results to other available flood mapping resources, note observed shortcomings, and describe improvements made in response. We also discuss how best-estimate maps could be updated in near real time by merging FloodScan products and data from other remote sensing systems and hydrological models.
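The secondary products described above (maximum flood extent, flood duration) follow directly from a stack of daily binary flood maps. A minimal sketch of that derivation, assuming the daily maps are already gridded and co-registered; this illustrates the idea, not the FloodScan implementation itself:

```python
import numpy as np

def flood_summary(daily_maps):
    """Derive secondary products from a stack of daily binary flood maps.

    daily_maps: array of shape (days, rows, cols), 1 = flooded, 0 = dry.
    Returns (max_extent, duration): max_extent is the union of all daily
    flood extents; duration counts flooded days per pixel.
    """
    stack = np.asarray(daily_maps, dtype=bool)
    max_extent = stack.any(axis=0)   # maximum post-storm flood extent
    duration = stack.sum(axis=0)     # flood duration per pixel, in days
    return max_extent, duration

# Hypothetical 3-day sequence over a 2x3 grid
days = [
    [[1, 1, 0], [0, 0, 0]],
    [[1, 0, 0], [1, 0, 0]],
    [[0, 0, 0], [1, 0, 0]],
]
extent, dur = flood_summary(days)
```

The union captures water that shifted between days, which a single-day snapshot would miss.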

  1. Detection of HBsAg and Anti HBc on donors of a blood bank by IRMA and ELISA methods

    International Nuclear Information System (INIS)

    Freire Martinez, D.Y.

    1985-10-01

    A comparative evaluation of two methods, immunoradiometric assay (IRMA) and enzyme immunoassay (ELISA), for detecting HBsAg and Anti-HBc was made to determine which is more advantageous and reliable. The study was made on 300 donors of the Hospital San Juan de Dios Blood Bank. In comparison with the reference method (IRMA), ELISA showed 91.67% sensitivity. Anti-HBc detection by IRMA is more reliable than HBsAg detection by IRMA or ELISA for determining the carrier state.

  2. Development of IRMA reagent and methodology for PSA

    International Nuclear Information System (INIS)

    Najafi, R.

    1997-01-01

    The PSA test is a solid-phase, two-site immunoassay. Rabbit anti-PSA is coated onto the surface of the solid phase, and monoclonal anti-PSA is labeled with I-125. The PSA molecules present in the standard solution or serum are 'sandwiched' between the two antibodies. After formation of the coated antibody-antigen-labeled antibody complex, the unbound labeled antibody is removed by washing. The complex is measured with a gamma counter; the concentration of analyte is proportional to the counts of the test sample. To develop an IRMA PSA kit, three essential reagents must be prepared: antibody-coated solid phase, labeled antibody, and standards, which are then optimized to obtain a standard curve suitable for measuring specimen PSA in the desired concentration range. The type of solid phase and the procedure used to coat or bind antibody to it remain the main points of debate in developing and setting up RIA/IRMA kits. In our experiments, polystyrene beads, because they are easy to coat with antibody as well as easy to use, can be considered a suitable solid phase. Most antibodies are passively adsorbed to a plastic surface (e.g. polystyrene, polypropylene, or polyvinyl chloride) from a dilute buffer; the antibody-coated plastic surface then acts as the solid-phase reagent. Poor efficiency, the time required to reach equilibrium, and lack of reproducibility, especially batch-to-batch variation between materials, are disadvantages of this simple coating procedure. Improvements can be made by coating a second antibody on the surface of the beads and letting the second and primary antibodies react. It is also possible to further enhance the coating efficiency of the beads by using Staphylococcus aureus Protein A. Protein A is a major component of the Staphylococcus aureus cell wall that has an affinity for the Fc segment of immunoglobulin G (IgG) of some species, including human, rabbit, and mouse.
    This property of Staphylococcal Protein A has made it a very useful tool in the purification of classes and subclasses
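The quantification step described above, reading specimen PSA off a standard curve of bound counts versus calibrator concentration, can be sketched with a simple monotone interpolation; real kits typically fit a smooth curve such as a four-parameter logistic, and all calibrator values below are hypothetical:

```python
import numpy as np

def concentration_from_counts(counts, std_conc, std_counts):
    """Interpolate specimen concentration from a sandwich-IRMA standard curve.

    In a two-site IRMA the bound counts rise monotonically with analyte
    concentration, so monotone interpolation of the calibrators gives a
    first estimate of specimen concentration.
    """
    std_conc = np.asarray(std_conc, dtype=float)
    std_counts = np.asarray(std_counts, dtype=float)
    return float(np.interp(counts, std_counts, std_conc))

# Hypothetical calibrators: PSA ng/mL vs. bound I-125 counts per minute
std_conc = [0.0, 1.0, 4.0, 10.0, 40.0, 100.0]
std_counts = [250, 900, 3100, 7400, 26000, 58000]

psa = concentration_from_counts(5250, std_conc, std_counts)  # specimen counts
```

A specimen giving 5250 cpm falls halfway between the 4.0 and 10.0 ng/mL calibrators here, interpolating to 7.0 ng/mL.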

  3. Sedimentary and Vegetative Impacts of Hurricane Irma to Coastal Wetland Ecosystems across Southwest Florida

    Science.gov (United States)

    Moyer, R. P.; Khan, N.; Radabaugh, K.; Engelhart, S. E.; Smoak, J. M.; Horton, B.; Rosenheim, B. E.; Kemp, A.; Chappel, A. R.; Schafer, C.; Jacobs, J. A.; Dontis, E. E.; Lynch, J.; Joyse, K.; Walker, J. S.; Halavik, B. T.; Bownik, M.

    2017-12-01

    Since 2014, our collaborative group has been working in coastal marshes and mangroves across Southwest Florida, including Tampa Bay, Charlotte Harbor, Ten Thousand Islands, Biscayne Bay, and the lower Florida Keys. All existing field sites were located within 50 km of Hurricane Irma's eye path, with a few sites in the lower Florida Keys and Naples/Ten Thousand Islands region suffering direct eyewall hits. As a result, we have been conducting storm-impact and damage assessments at these locations with the primary goal of understanding how major hurricanes contribute to and/or modify the sedimentary record of mangroves and salt marshes. We have also assessed changes to the vegetative structure of the mangrove forests at each site. Preliminary findings indicate a reduction in mangrove canopy cover from 70-90% pre-storm to 30-50% post-Irma, and a reduction in tree height of approximately 1.2 m. Sedimentary deposits consisting of fine carbonate mud up to 12 cm thick were imported into the mangroves of the lower Florida Keys, Biscayne Bay, and the Ten Thousand Islands. Import of siliciclastic mud up to 5 cm thick was observed in Charlotte Harbor. In addition to fine mud, all sites had imported tidal wrack consisting of mixed seagrass and mangrove leaf litter, with some deposits as thick as 6 cm. In areas with newly opened canopy, a microbial layer was coating the surface of the imported wrack layer. Overwash and shoreline erosion were also documented at two sites in the lower Keys and Biscayne Bay, and will be monitored for change and recovery over the next few years. Because active research was being conducted, a wealth of pre-storm data exists; these locations are thus uniquely positioned to quantify hurricane impacts on the sedimentary record and standing biomass across a wide geographic area. Because intensity changed along the storm path, damage metrics can be compared directly against environmental setting, wind speed, storm surge, and distance to the eyewall.

  4. Historia, memoria y impunidad: el caso de Irma Flaquer

    Directory of Open Access Journals (Sweden)

    June Carolyn Erlick

    2005-12-01

    In Guatemala, perhaps more than in any other country, truth commissions have emphasized testimonial narratives as documentation of past abuses. However, this documentation has kept its focus on the victims and the crimes committed against them. Recovering the lives of the victims through narrative offers another way to restore memory and transform it into history. The life and work of the courageous Guatemalan journalist Irma Flaquer was documented by the American Press Association project "Unpunished Crimes against Journalists." As a result, under the auspices of the Inter-American Commission on Human Rights, the Guatemalan government admitted its responsibility in the journalist's disappearance and reopened the case. Thus, the reconstruction of memory through narrative techniques resulted not only in the reconstruction of history, but in its transformation.

  5. Measurement of some tumor markers by IRMA in vietnam

    International Nuclear Information System (INIS)

    Tran Xuan Truong

    2004-01-01

    A perfect tumor marker could be used in several ways: for population screening, for diagnosis, for monitoring therapy, and for follow-up to detect early evidence of cancer recurrence. To achieve this ideal status, a tumor marker would require total negativity in healthy subjects, total positivity for a single tumor type, and close correlation between plasma tumor marker concentration and tumor size. The advent of monoclonal antibodies has had a dramatic impact in oncology, where new tumor markers have been discovered and assay methods for all tumor markers have been improved commercially. The analytical performance of these new methods is potentially as good as that of the best immunoradiometric assays for other analytes. In Vietnam, we used immunoradiometric assay (IRMA) for the first time for the measurement of some tumor markers in normal subjects and in cancer diseases: thyroglobulin (TG) for thyroid cancer, cancer antigen 15-3 (CA15-3) for breast cancer, and cancer antigen 72-4 (CA72-4) for stomach cancer. We would like to apply CA72-4 in the indication of stomach cancer, CA15-3 in the differential diagnosis of breast cancer, and TG in the differential diagnosis of thyroid cancer. All of these tumor markers were also used in clinical follow-up and early detection of recurrence and metastatic cancer. Further research on them is warranted. (authors)

  6. Measurement of some tumour markers by IRMA in Vietnam

    International Nuclear Information System (INIS)

    Tran Xuan Truong

    2004-01-01

    Full text: Determination of tumour markers may be useful for screening, monitoring therapy, and follow-up of cancers. To achieve an ideal status, a tumour marker would require total negativity in healthy subjects, total positivity for a single tumour type, and close correlation between plasma tumour marker concentration and tumour size. With advances in monoclonal antibody production, assay methods for all tumour markers have been improved and made available commercially; moreover, many new tumour markers have been identified. In Vietnam, we used immunoradiometric assay (IRMA) for the first time for the measurement of a few tumour markers in normal subjects and in some cancer diseases: thyroglobulin (TG) for thyroid cancer, cancer antigen 15-3 (CA15-3) for breast cancer, and cancer antigen 72-4 (CA72-4) for stomach cancer. The concentration of tumour markers in normal subjects was found to be 3.0-4.0 U/ml for CA72-4 (n=24), 15.0-19.1 U/ml for CA15-3 (n=26), and 3.6-7.3 ng/ml for TG (n=33). We would like to apply the detection of these tumour markers in the evaluation of cancerous diseases, viz. CA72-4 in stomach cancer, CA15-3 in breast cancer, and TG in thyroid cancer. All these tumour markers would be helpful in clinical follow-up and early detection of recurrence and metastatic cancer. (author)

  7. High Temporal Resolution Tropospheric Wind Profile Observations at NASA Kennedy Space Center During Hurricane Irma

    Science.gov (United States)

    Decker, Ryan K.; Barbre, Robert E., Jr.; Huddleston, Lisa; Brauer, Thomas; Wilfong, Timothy

    2018-01-01

    The NASA Kennedy Space Center (KSC) operates a 48-MHz Tropospheric/Stratospheric Doppler Radar Wind Profiler (TDRWP) on a continual basis, generating wind profiles between 2-19 km in support of space launch vehicle operations. A benefit of the continual operation of the system is the ability to provide unique observations of severe weather events such as hurricanes. Over the past two Atlantic hurricane seasons the TDRWP has made high temporal resolution wind profile observations of Hurricane Irma in 2017 and Hurricane Matthew in 2016. Hurricane Irma was responsible for power outages to approximately 2/3 of Florida's population during its movement over the state (Stein, 2017). An overview of the TDRWP system configuration, a brief summary of the Hurricane Irma and Matthew storm tracks in proximity to KSC, characteristics of the tropospheric wind observations from the TDRWP during both events, and a discussion of the dissemination of TDRWP data during the events will be presented.

  8. Withstanding trauma: the significance of Emma Eckstein's circumcision to Freud's Irma dream.

    Science.gov (United States)

    Bonomi, Carlo

    2013-07-01

    The author considers the medical rationale for Wilhelm Fliess's operation on Emma Eckstein's nose in February 1895 and interprets the possible role that this played in Freud's dream of Irma's injection five months later. The author's main argument is that Emma likely endured female castration as a child and that she therefore experienced the surgery to her nose in 1895 as a retraumatization of her childhood trauma. The author further argues that Freud's unconscious identification with Emma, which broke through in his dream of Irma's injection with resistances and apotropaic defenses, served to accentuate his own "masculine protest". The understanding brought to light by the present interpretation of Freud's Irma dream, when coupled with our previous knowledge of Freud, allows us to better grasp the unconscious logic and origins of psychoanalysis itself. © 2013 The Psychoanalytic Quarterly, Inc.

  9. Microphysical Structures of Hurricane Irma Observed by Polarimetric Radar

    Science.gov (United States)

    Didlake, A. C.; Kumjian, M. R.

    2017-12-01

    This study examines dual-polarization radar observations of Hurricane Irma as its center passed near the WSR-88D radar in Puerto Rico, capturing much-needed microphysical information on a mature tropical cyclone. Twenty hours of observations continuously sampled the inner core precipitation features. These data were analyzed by annulus and azimuth, providing a bulk characterization of the primary eyewall, secondary eyewall, and rainbands as they varied around the storm. Polarimetric radar variables displayed distinct signatures of convective and stratiform precipitation in the primary eyewall and rainbands that were organized in a manner consistent with the expected kinematic asymmetry of a storm in weak environmental wind shear but with moderate low-level storm-relative flow. In the front quadrants of the primary eyewall, vertical profiles of differential reflectivity (ZDR) exhibit increasing values with decreasing height, consistent with convective precipitation processes. In particular, the front-right quadrant exhibits a signature in reflectivity (ZH) and ZDR indicating larger, sparser drops, which is consistent with a stronger updraft present in this quadrant. In the rear quadrants, a sharply peaked ZDR maximum occurs within the melting layer, which is attributed to stratiform processes. In the rainbands, the convective-to-stratiform transition can be seen traveling from the front-right to the front-left quadrant. The front-right quadrant exhibits lower co-polar correlation coefficient (ρHV) values in the 3-8 km altitude layer, suggesting larger vertical spreading of various hydrometeors that occurs in convective vertical motions. The front-left quadrant exhibits larger ρHV values, suggesting less diversity of hydrometeor shapes, consistent with stratiform processes. The secondary eyewall did not exhibit a clear signature of processes preferred in a specific quadrant, and a temporal analysis of the secondary eyewall revealed a complex evolution of its structure.

  10. Are recent hurricane (Harvey, Irma, Maria) disasters natural?

    Science.gov (United States)

    Trenberth, K. E.; Lijing, C.; Jacobs, P.; Abraham, J. P.

    2017-12-01

    Yes and no! Hurricanes are certainly natural, but human-caused climate change is supersizing them, and unbridled growth is exacerbating the risk of major damages. The addition of heat-trapping gases to the atmosphere has led to observed increases in upper ocean heat content (OHC). This human-caused increase in OHC supports higher sea surface temperatures (SSTs) and atmospheric moisture. These elevated temperatures and increased moisture availability fuel tropical storms, allowing them to grow larger, longer lasting, and more intense, with widespread heavy rainfalls. Our preliminary analysis of OHC through August 2017 shows that it was not only by far the highest on record globally, but also the highest on record in the Gulf of Mexico prior to Hurricane Harvey. The human influence on the climate is also evident in rising sea levels, which increase risks from storm surges. These climatic changes are taking place against a background of growing habitation along coasts, which further increases the risk storms pose to life and property. This combination of planning choices and climatic change illustrates the tragedy of global warming, as evidenced by Harvey in Houston, Irma in the Caribbean and Florida, and Maria in Puerto Rico. However, future damages and loss of life can be mitigated by stopping or slowing human-caused climate change, and through proactive planning (e.g., better building codes, increased-capacity drainage systems, shelters, and evacuation plans). We discuss the climatic and planning contexts of the unnatural disasters of the 2017 Atlantic hurricane season, including novel indices of climate-hurricane influence.

  11. Reproductive hormones disorders of Sudanese females using immunoradiometric assay (IRMA)

    International Nuclear Information System (INIS)

    Ali, N. I.; Almahi, W. A. A.; Abdalla, O. M.; Bafarag, S. M. I.; Abdelgadir, O. M.; Eltayeb, M. A. H.; Hassan, A. M. E.; Hassan, A. M. E.

    2004-12-01

    In this study, fertility hormones were measured in 587 infertile Sudanese females referred from gynecological clinics. The ages of these females ranged from 16 to 50 years, divided into seven groups. Eighty-seven percent of them were in the age range between 21 and 40 years, which corresponds to the female fertile period, and 5.6% were under 20 years. The sensitive IRMA method was used to measure the hormone concentrations. The objective of this study was to find out the percentage of hormonal disorders and its relation to age in infertile Sudanese females. The age group 21-25 was the most affected by polycystic ovary syndrome (PCOS), representing 5.1% of the total number of patients; the least affected was the age group 41-45, with a percentage of 0.4%. LH and FSH in the age group 31-35 were found to be higher than in the other groups, representing 11.4% and 7.8% of the total number of patients, respectively. The lowest percentages of high LH and FSH were found in the most fertile age group (15-20): 1.7% and 1.0% of the total number of studied patients, respectively. Patients in the age range 26-30 with hyperprolactinaemia represented 10.4% of patients, while those in the age range 46-50 represented the lowest percentage (1.2%). The percentages of patients with high LH and high FSH were 44.5% and 29.1%, respectively, while hyperprolactinaemia among the infertile Sudanese females was found to be 38.2%. (Author)

  12. Using the integrated rural mobility and access (IRMA) approach in prospering rural South Africa

    CSIR Research Space (South Africa)

    Chakwizira, J

    2008-11-01

    "…standard of living, freedom, dignity, self-esteem and respect from others." Rural development impact technology in practice - the IRMA project in Mpumalanga, a case study: inherent to South Africa is a dual socio-economic 'access divide', clearly...

  13. Antibodies immobilized on magnetic particles for RIA and IRMA of thyroid related hormones

    International Nuclear Information System (INIS)

    Wayan, R.S.; Djayusman, D.S.

    1996-01-01

    In Indonesia, radioimmunoassay kits based on the magnetic method of separation need to be imported and are very expensive; local production of these kits would be economical. Different types of magnetic particles have been used for immobilizing antibodies for use in RIA of T3 and T4, IRMA-TSH, and neonatal IRMA-TSH. The particles studied here include magnetic cellulose (SCIPAC, U.K.), magnetite (Hungary), silanized iron oxide (China), and Latex-M. Various parameters have been studied in order to optimize the antibody immobilization procedures as well as the assays based on these immunoadsorbents. The assays developed by us have been compared with commercial kits from Amersham, NETRIA, and DPC. The work includes immobilization of second antibodies for RIA of T4 and immobilization of anti-TSH for IRMA-TSH. Among the magnetic particles studied, magnetite and silanized iron oxide were found to be satisfactory on account of the simplicity of immobilization, high binding capacity, and low non-specific binding. Good assay performance for RIA of T3 and T4 was obtained using magnetic particles with immobilized second antibodies; however, the quality of the first antibody plays an important role in the sensitivity and precision of the assay. Good correlation was obtained with the Amersham kit (y = 1.06x - 0.12, r = 0.987). Assay performance of IRMA-TSH using in-house prepared magnetic particles with immobilized anti-TSH is also found to be comparable with the Amersham, NETRIA, and DPC kits. (author). 4 refs, 6 figs, 1 tab
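The comparison statistics quoted above (y = 1.06x - 0.12, r = 0.987 against the Amersham kit) are a least-squares regression and Pearson correlation over paired assay results. A minimal sketch with made-up paired values (not the study's data):

```python
import numpy as np

def compare_methods(x_reference, y_inhouse):
    """Least-squares comparison of an in-house assay against a reference kit.

    Returns (slope, intercept, r): the regression line y = slope*x + intercept
    and Pearson's correlation coefficient.
    """
    x = np.asarray(x_reference, dtype=float)
    y = np.asarray(y_inhouse, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, intercept, r

# Hypothetical paired T4 results from the reference kit and the in-house assay
ref = [2.0, 4.0, 6.0, 8.0, 10.0]
inh = [2.1, 4.0, 6.2, 7.9, 10.1]
slope, intercept, r = compare_methods(ref, inh)
```

A slope near 1, an intercept near 0, and r close to 1 indicate the two assays agree across the measuring range, which is what the reported regression implies.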

  14. Comparison study of IRMA Ca-125 kits: local production versus Immunotech

    International Nuclear Information System (INIS)

    Puji Widayati; Sri Hartini; Agus Ariyanto

    2012-01-01

    An immunoradiometric assay (IRMA) is an immunoassay technique that uses a radionuclide as the tracer to detect low quantities of analyte. The technique is based on the reaction between antigen (Ag) contained in the sample or standard (the tumor marker) and a radiolabeled antibody (Ab*) present in excess, forming an antigen-antibody complex (Ag-Ab*). This technique is suitable for tumor marker testing in serum, which has a complex matrix and varying analyte concentrations. The tumor marker used for monitoring ovarian cancer is CA-125, an antigenic glycoprotein that is formed in the ovary and released into the bloodstream of patients suffering from ovarian cancer. The aim of this research was to compare the local IRMA CA-125 kit (produced by the Center for Radioisotopes and Radiopharmaceuticals, National Nuclear Energy Agency) with the imported IRMA kit (Immunotech, France) on 245 samples obtained from the PPTA-BATAN clinic and Dharmais Cancer Hospital. The results showed 184 samples as true negative, 46 samples as true positive for ovarian cancer, 13 samples as false negative, and 2 samples as false positive. This comparison study gave a diagnostic sensitivity of 95.83% and a diagnostic specificity of 93.40%. (author)
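The quoted diagnostic figures follow the standard confusion-matrix formulas, sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP). The reported 95.83% and 93.40% are reproduced when 2 samples are taken as false negatives and 13 as false positives, i.e. with the two labels in the printed counts interchanged; the sketch below uses that consistent assignment:

```python
def diagnostic_performance(tp, tn, fp, fn):
    """Diagnostic sensitivity and specificity from a 2x2 comparison table.

    sensitivity = TP / (TP + FN): fraction of reference-positive samples detected.
    specificity = TN / (TN + FP): fraction of reference-negative samples confirmed.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Counts arranged so the reported 95.83% / 93.40% figures are reproduced
sens, spec = diagnostic_performance(tp=46, tn=184, fp=13, fn=2)
```

With 46 true positives and 2 false negatives, sensitivity is 46/48 = 95.83%; with 184 true negatives and 13 false positives, specificity is 184/197 = 93.40%.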

  15. A yeast synthetic network for in vivo assessment of reverse-engineering and modeling approaches.

    Science.gov (United States)

    Cantone, Irene; Marucci, Lucia; Iorio, Francesco; Ricci, Maria Aurelia; Belcastro, Vincenzo; Bansal, Mukesh; Santini, Stefania; di Bernardo, Mario; di Bernardo, Diego; Cosma, Maria Pia

    2009-04-03

    Systems biology approaches are extensively used to model and reverse engineer gene regulatory networks from experimental data. Conversely, synthetic biology allows "de novo" construction of a regulatory network to seed new functions in the cell. At present, the usefulness and predictive ability of modeling and reverse engineering cannot be assessed and compared rigorously. We built in the yeast Saccharomyces cerevisiae a synthetic network, IRMA, for in vivo "benchmarking" of reverse-engineering and modeling approaches. The network is composed of five genes regulating each other through a variety of regulatory interactions; it is negligibly affected by endogenous genes, and it is responsive to small molecules. We measured time series and steady-state expression data after multiple perturbations. These data were used to assess state-of-the-art modeling and reverse-engineering techniques. A semiquantitative model was able to capture and predict the behavior of the network. Reverse engineering based on differential equations and Bayesian networks correctly inferred regulatory interactions from the experimental data.
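The differential-equation modeling style benchmarked with IRMA can be illustrated with a toy two-gene cascade integrated by the Euler method. This is a generic sketch, not the published five-gene IRMA model; all rate constants are invented:

```python
import numpy as np

def simulate_cascade(k_act=1.0, k_deg=0.5, K=0.5, n=2, dt=0.01, steps=2000):
    """Euler integration of a toy two-gene cascade (activator -> target).

    dx1/dt = k_act - k_deg*x1                      (constitutive activator)
    dx2/dt = k_act*x1^n/(K^n + x1^n) - k_deg*x2    (Hill-type activation)

    Returns the state after `steps` Euler steps, which approximates the
    steady state for the default parameters.
    """
    x = np.zeros(2)
    for _ in range(steps):
        hill = x[0] ** n / (K ** n + x[0] ** n)
        dx = np.array([k_act - k_deg * x[0],
                       k_act * hill - k_deg * x[1]])
        x = x + dt * dx
    return x

steady = simulate_cascade()
```

Reverse-engineering methods of the kind benchmarked on IRMA fit the parameters of such ODE systems (or infer the wiring itself) from time-series and steady-state expression data like those measured in the study.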

  16. Freud's struggle with misogyny: homosexuality and guilt in the dream of Irma's injection.

    Science.gov (United States)

    Lotto, D

    2001-01-01

    The highly condensed dream element trimethylamin is central to the dream of Irma's injection. After a brief review of the medical literature on trimethylamine (TMA), it is suggested that two important meanings of this chemical and its properties lie in its disguised reference to disparaging views of women, as well as to Freud's homosexual connection to Wilhelm Fliess. Freud's misogynistic and homosexual impulses were stimulated by Fliess's recent surgical error committed while operating on Freud's patient Emma Eckstein. Evidence is presented that the collaboration between Freud and Fliess in performing an aggressive act toward a woman was for Freud an enactment of a childhood situation in which he and his nephew John had ganged up on John's sister Pauline. The later relationship between Freud, Jung, and Sabina Spielrein is seen as an additional reenactment of this childhood triangle. An examination of Freud's associations to and analysis of the Irma dream, as well as some of his later relationships with women, indicates that guilt and the wish to make reparation were also prominent themes in Freud's inner life.

  17. Kinetics and equilibrium in the immunoradiometric assay (IRMA) of thyroglobuline.

    Science.gov (United States)

    Garcia Gomez, J; Moreno Frigols, J L

    2002-01-01

    This paper studies the kinetics of the reaction of thyroglobulin with its specific antibody immobilised on the inner wall of the reaction tube, and the subsequent binding of the immunocomplex formed with a second 125I-labelled antibody. These reactions are used in the immunoradiometric determination of thyroglobulin. The independent variables were analyte and labelled-antibody concentrations, temperature, viscosity, and the medium's ionic strength. For the global process, mono-exponential kinetics were found, with rate constants dependent on the concentrations; this dependence fits the models discussed in the paper. The viscosity results clearly indicate its negative influence on the direct reaction rate. Ionic strength shows noticeable, but not too relevant, effects, which suggests that the variation caused by the glycerol addition is not due to the influence of the dielectric constant of the solutions used. The effect of temperature shows activation parameters similar to the viscous-flow energy of water, which suggests diffusion control for the global process.
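Mono-exponential binding kinetics of the kind described above have the form B(t) = A(1 - exp(-kt)). A minimal sketch of fitting such a time course, using synthetic noise-free data so that the fit simply recovers the generating parameters (the values are invented, not the paper's data):

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, a, k):
    """Mono-exponential approach to equilibrium: B(t) = a * (1 - exp(-k*t))."""
    return a * (1.0 - np.exp(-k * t))

# Hypothetical binding time course; real IRMA data would carry counting error
t = np.linspace(0, 60, 30)                 # minutes
counts = mono_exp(t, a=5000.0, k=0.08)     # synthetic bound counts

popt, _ = curve_fit(mono_exp, t, counts, p0=[4000.0, 0.05])
a_fit, k_fit = popt
```

Repeating such fits at different analyte and antibody concentrations yields observed rate constants whose concentration dependence can then be tested against the kinetic models discussed in the paper.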

  18. Towards sustainable flood risk management in the Rhine and Meuse river basins: synopsis of the findings of IRMA-SPONGE

    NARCIS (Netherlands)

    Hooijer, A.; Klijn, F.; Pedroli, G.B.M.; Os, van A.G.

    2004-01-01

    Recent flood events in western Europe have shown the need for improved flood risk management along the Rhine and Meuse rivers. In response, the IRMA-SPONGE research programme was established, consisting of 13 research projects, in which over 30 organizations from six countries co-operated. The aim

  19. Unfulfilled farmer expectations: the case of the Insect Resistant Maize for Africa (IRMA) project in Kenya

    Directory of Open Access Journals (Sweden)

    Mabeya Justin

    2012-11-01

    Background: Maize is the most important staple food in Kenya; any reduction in production and yield therefore often becomes a national food security concern. To address the challenge posed by the maize stem borer, the Insect Resistant Maize for Africa (IRMA) agricultural biotechnology public-private partnership (PPP) project was launched in 1999. There were, however, pre-existing concerns regarding the use of genetic engineering in crop production and skepticism about private sector involvement. The purpose of this case study was to understand the role of trust in the IRMA partnership by identifying the challenges to, and practices for, building trust in the project. Methods: Data were collected by conducting face-to-face, semi-structured interviews; reviewing publicly available project documents; and direct observations. The data were analyzed to generate recurring and emergent themes on how trust is understood and built among the partners in the IRMA project and between the project and the community. Results: Clear and continued communication with stakeholders is of paramount importance to building trust, especially regarding competition among partners for project management positions, a lack of clarity on ownership of intellectual property rights (IPRs), and the influence of anti-genetic modification (GM) organizations. Awareness creation about IRMA's anticipated products raised the end users' expectations, which went unfulfilled due to failure to deliver Bacillus thuringiensis (Bt)-based products, thereby diminishing trust between the project and the community. Conclusions: Four key issues were identified from the results of the study. First, the inability to deliver the intended products to the end user diminished stakeholders' trust and interest in the project. Second, full and honest disclosure of information by partners when entering into project agreements is crucial to ensuring progress in a project. Third

  20. Performance of the FV3-powered Next Generation Global Prediction System for Harvey and Irma, and a vision for a "beyond weather timescale" prediction system for long-range hurricane track and intensity predictions

    Science.gov (United States)

    Lin, S. J.; Bender, M.; Harris, L.; Hazelton, A.

    2017-12-01

    The performance of a GFDL-developed, FV3-based Next Generation Global Prediction System (NGGPS) for Harvey and Irma will be reported. We will cover track and intensity errors (versus operational models), heavy precipitation (Harvey), rapid intensification, and simulated structure (in comparison with ground-based radar), and point to the need for a future long-range (from day 5 up to 30 days), physically based ensemble hurricane prediction system that provides useful information to forecasters beyond the usual weather timescale.

  1. Nanotechnologies: Risk assessment model

    Science.gov (United States)

    Giacobbe, F.; Monica, L.; Geraci, D.

    2009-05-01

    The development and use of nanomaterials has grown widely in recent years. Hence, it is necessary to carry out a careful and targeted risk assessment for the safety of workers. The objective of this research is a specific assessment model for workplaces where personnel manipulate nanoparticles. The model mainly takes into account the number of exposed workers, the dimensions of the particles, the information found in the safety data sheets, and the uncertainties about the danger level arising from exposure to nanomaterials. The evaluation algorithm considers normal work conditions, abnormal conditions (e.g. breakdown of an air filter), and emergency situations (e.g. package cracking). It was necessary to define several risk conditions in order to quantify the risk in increasing levels ("low", "middle" and "high"). Each level includes appropriate behavioural procedures. In particular, for the high level it is advisable that the user carry out urgent interventions to reduce the risk level (e.g. the use of a vacuum box for manipulation, high-efficiency protective PPE, etc.). The model has been implemented in a research laboratory where titanium dioxide and carbon nanotubes are used. The outcome of this specific evaluation was a risk level of middle.
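An evaluation algorithm of this kind, combining exposure factors into a banded "low"/"middle"/"high" result, can be sketched as a simple score. The weights and thresholds below are invented for illustration and are not the authors' actual algorithm:

```python
def risk_level(exposed_workers, particle_nm, hazard_known, condition="normal"):
    """Hypothetical scoring sketch of a low/middle/high risk banding.

    Inputs mirror the factors named in the model: number of exposed workers,
    particle size, availability of hazard information (safety data sheets),
    and the work condition (normal / abnormal / emergency).
    """
    score = 0
    score += 2 if exposed_workers > 10 else 1 if exposed_workers > 3 else 0
    score += 2 if particle_nm < 100 else 0         # nanoscale particles weigh more
    score += 0 if hazard_known else 2              # uncertainty raises the risk
    score += {"normal": 0, "abnormal": 1, "emergency": 2}[condition]
    if score >= 5:
        return "high"
    if score >= 3:
        return "middle"
    return "low"

# e.g. a small lab handling carbon nanotubes with incomplete hazard data
level = risk_level(exposed_workers=2, particle_nm=50, hazard_known=False)
```

Under these invented weights, the small-lab example bands as "middle", while the same material handled during an emergency by a large exposed group would band as "high".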

  2. Nanotechnologies: Risk assessment model

    International Nuclear Information System (INIS)

    Giacobbe, F; Monica, L; Geraci, D

    2009-01-01

    The development and use of nanomaterials have grown widely in recent years, making a careful, targeted risk assessment necessary for worker safety. The objective of this research is a specific assessment model for workplaces where personnel manipulate nanoparticles. The model mainly takes into account the number of exposed workers, the dimensions of the particles, the information found in the safety data sheets, and the uncertainties about the level of danger arising from exposure to nanomaterials. The evaluation algorithm considers normal working conditions, abnormal conditions (e.g. a failed air filter) and emergency situations (e.g. package cracking). Several risk conditions had to be defined in order to quantify the risk in increasing levels ('low', 'middle' and 'high'). Each level entails appropriate behavioural procedures. In particular, at the high level, it is advisable that the user carry out urgent interventions to reduce the risk (e.g. the use of a vacuum box for manipulation, high-efficiency protective PPE, etc.). The model has been implemented in a research laboratory where titanium dioxide and carbon nanotubes are used; the resulting evaluation gave a risk level of middle.

  3. Integrated Assessment Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Edmonds, James A.; Calvin, Katherine V.; Clarke, Leon E.; Janetos, Anthony C.; Kim, Son H.; Wise, Marshall A.; McJeon, Haewon C.

    2012-10-31

    This paper discusses the role of Integrated Assessment Models (IAMs) in climate change research. IAMs are an interdisciplinary research platform that constitutes a consistent scientific framework in which the large-scale interactions between human and natural Earth systems can be examined. In so doing, IAMs provide insights that would otherwise be unavailable from traditional single-discipline research. By providing a broader view of the issue, IAMs constitute an important tool for decision support. IAMs are also a home for human Earth system research and provide natural Earth system scientists with information about the nature of human intervention in global biogeophysical and geochemical processes.

  4. Kinetic Studies on the Total Human Chorionic Gonadotropin (hCG) IRMA Assay

    International Nuclear Information System (INIS)

    Moustafa, K.A.; Aly, M.A.M.; Al-Kolaly, M.T.; Abou El-Nour, F.

    2002-01-01

    Human chorionic gonadotropin (hCG) is a two-chain glycoprotein hormone normally found in blood and urine only during pregnancy. It is secreted by placental tissue, beginning with the primitive trophoblast, almost from the time of implantation. The kinetics of the interaction between the hCG antigen and two matched antibodies, one labelled with the radioactive isotope 125I (tracer) and the other unlabelled and attached to a solid support (tube), are studied using the two-site (sandwich) immunoradiometric assay (IRMA) technique. A new method for determining the rate constants, using an advanced computer program (RKY) based on the Nelder-Mead optimisation principle, is introduced. The rate constants at three temperatures and three different antigen concentrations, as well as the half-time of exchange (t), were calculated
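As a hedged sketch of the fitting step, rate constants for a binding curve can be recovered by Nelder-Mead minimisation of a least-squares objective, which is the optimisation principle the record attributes to the RKY program. The pseudo-first-order kinetic model, the parameter values and the SciPy workflow below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np
from scipy.optimize import minimize

def bound_fraction(t, k_on, k_off, antigen_conc):
    """Pseudo-first-order approach of a binding reaction to equilibrium."""
    k_obs = k_on * antigen_conc + k_off
    b_eq = k_on * antigen_conc / k_obs
    return b_eq * (1.0 - np.exp(-k_obs * t))

# Synthetic "measurements" generated from known constants, for demonstration.
t = np.linspace(0.0, 120.0, 25)              # incubation time, minutes
conc = 1.0                                   # antigen concentration (arbitrary units)
data = bound_fraction(t, 0.02, 0.005, conc)  # hypothetical true k_on, k_off

def sse(params):
    k_on, k_off = params
    if k_on <= 0.0 or k_off <= 0.0:          # keep the rates physical
        return 1e9
    return float(np.sum((bound_fraction(t, k_on, k_off, conc) - data) ** 2))

fit = minimize(sse, x0=[0.1, 0.1], method="Nelder-Mead")
k_on_hat, k_off_hat = fit.x
t_half = np.log(2.0) / (k_on_hat * conc + k_off_hat)  # half-time of exchange
```

Repeating the fit at each temperature and antigen concentration would yield the tabulated rate constants and half-times the abstract mentions.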

  5. Endothelial dysfunction and inflammation predict development of diabetic nephropathy in the Irbesartan in Patients with Type 2 Diabetes and Microalbuminuria (IRMA 2) study

    DEFF Research Database (Denmark)

    Persson, Frederik; Rossing, Peter; Hovind, Peter

    2008-01-01

    OBJECTIVE: To evaluate risk factors for progression from persistent microalbuminuria to diabetic nephropathy in the Irbesartan in Patients with Type 2 diabetes and Microalbuminuria (IRMA 2) study, including biomarkers of endothelial dysfunction, chronic low-grade inflammation, growth factors...

  6. ["... my friend Leopold was percussing her through her bodice...". Leopold von Auenbrugger in Sigmund Freud's dream of Irma's injection].

    Science.gov (United States)

    Reicheneder, Johann Georg

    2011-01-01

    This paper provides a psychoanalytic interpretation of an element in the Irma dream that Freud had ignored in his own interpretation. The allusion to Leopold von Auenbrugger, the originator of percussion as a method of clinical investigation, which appears in the manifest dream, reflects Freud's hopes and fears about how his Interpretation of Dreams, and the new human science established there, would be received by his medical colleagues.

  7. Overuse Injury Assessment Model

    National Research Council Canada - National Science Library

    Stuhmiller, James H; Shen, Weixin; Sih, Bryant

    2005-01-01

    .... Previously, we developed a preliminary model that predicted the stress fracture rate and used biomechanical modeling, nonlinear optimization for muscle force, and bone structural analysis to estimate...

  8. Examining the effects of hurricanes Matthew and Irma on water quality in the intracoastal waterway, St. Augustine, FL.

    Science.gov (United States)

    Ward, N. D.; Osborne, T.; Dye, T.; Julian, P.

    2017-12-01

    The last several years have been marked by a high incidence of Atlantic tropical cyclones making landfall as powerful hurricanes or tropical storms. For example, in 2016 Hurricane Matthew devastated parts of the Caribbean and the southeastern United States. In 2017, this region was further battered by hurricanes Irma and Maria. Here, we present water quality data collected in the intracoastal waterway near the Whitney Lab for Marine Bioscience during hurricanes Matthew and Irma, a region that experienced flooding during both storms. YSI Exo 2 sondes were deployed to measure pH, salinity, temperature, dissolved O2, fluorescent dissolved organic matter (fDOM), turbidity, and chlorophyll-a (Chl-a) at a 15-minute interval. The Hurricane Matthew sonde deployment failed as soon as the storm hit, but revealed an interesting phenomenon leading up to the storm that was also observed during Irma. Salinity in the intracoastal waterway (off the Whitney Lab dock) typically varies from purely marine to 15-20 psu throughout the tidal cycle. However, several days before both storms approached the Florida coast (i.e. when they were near the Caribbean), the salinity signal became purely marine, overriding any tidal signal. Anecdotally, storm drains were already filled up to street level prior to the storm hitting, poising the region for immense flooding and storm surge. The opposite effect was observed after Irma moved past Florida: water became much fresher than normal for several days, and it took almost a week to return to "normal" salinity tidal cycles. As both storms hit, turbidity increased by an order of magnitude for a period of several hours. fDOM and O2 behaved similarly to salinity during and after Irma, showing a mostly marine signal (e.g. higher O2, lower fDOM) in the lead-up and a brief switch to more freshwater influence the week after the storm. Chl-a peaked several days after the storm, presumably due to mobilization of nutrient-rich flood and waste waters and subsequent algae

  9. Overuse Injury Assessment Model

    Science.gov (United States)

    2006-03-01

    bones in celiac disease patients." Am J Gastroenterol 98(2): 382-390. Ferretti, J. L. 1997. "Noninvasive Assessment of Bone Architecture and...altered gait, pretest anthropometry, diet and nutrition, genetics, endocrine status and hormones, bone disease (pathology), age, initial bone health state...and between subjects can be expected. Consequently, large numbers of subjects are required to obtain statistically significant results. Even with

  10. Measuring and building resilience after big storms: Lessons learned from Super-Storm Sandy for the Harvey, Irma, Jose, and Maria coasts

    Science.gov (United States)

    Murdoch, P. S.; Penn, K. M.; Taylor, S. M.; Subramanian, B.; Bennett, R.

    2017-12-01

    As we recover from recent large storms, we need information to support increased environmental and socio-economic resilience of the Nation's coasts. Defining baseline conditions, tracking effects of mitigation actions, and measuring the uncertainty of resilience to future disturbance are essential so that the best management practices can be determined. The US Dept. of the Interior invested over $787 million in 2013 to understand and mitigate coastal storm vulnerabilities and enhance resilience of the Northeast coast following Super-Storm Sandy. Several lessons learned from that investment have direct application to mitigation and restoration needs following Hurricanes Harvey, Irma, Jose and Maria. New models of inundation, overwash, and erosion developed during the Sandy projects have already been applied to coastlines before and after these recent storms. Results from wetland, beach, back-bay, estuary, and built-environment projects improved models of inundation and erosion from surge and waves. Tests of nature-based infrastructure for mitigating coastal disturbance yielded new concepts for best practices. Ecological and socio-economic measurements established for detecting disturbance and tracking recovery provide baseline data critical to early detection of vulnerabilities. The Sandy lessons and preliminary applications to the recent storms could help define best resilience practices before more costly mitigation or restoration efforts are required.

  11. Integrated Environmental Assessment Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Guardanz, R.; Gimeno, B. S.; Bermejo, V.; Elvira, S.; Martin, F.; Palacios, M.; Rodriguez, E.; Donaire, I. [Ciemat, Madrid (Spain)

    2000-07-01

    This report describes the results of the Spanish participation in the project Coupling CORINAIR data to cost-effective emission reduction strategies based on critical thresholds (EU/LIFE97/ENV/FIN/336). The subproject focused on three tasks: developing tools to improve knowledge of the spatial and temporal details of emissions of air pollutants in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings into a modelling framework that can assess more accurately the impact of air pollutants on temperate ecosystems. The results obtained during the execution of this project have significantly improved models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

  12. MAPPING THE EXTENT AND MAGNITUDE OF SEVERE FLOODING INDUCED BY HURRICANE IRMA WITH MULTI-TEMPORAL SENTINEL-1 SAR AND INSAR OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    B. Zhang

    2018-04-01

    During Hurricane Irma's passage over Florida in September 2017, many sections of the state experienced heavy rain and subsequent flooding. In order to drain water out of potential flooding zones and assess property damage, it is important to map the extent and magnitude of the flooded areas at various stages of the storm. We use Synthetic Aperture Radar (SAR) and Interferometric SAR (InSAR) observations, acquired by Sentinel-1 before, during and after the hurricane's passage, which enable us to evaluate surface conditions during different stages of the hurricane. This study uses multi-temporal images acquired under dry conditions before the hurricane to constrain the background backscattering signature. Flooded areas are detected when the backscattering during the hurricane is statistically significantly different from the average dry conditions. The detected change can be either an increase or a decrease in backscattering, depending on the scattering characteristics of the surface. In addition, water level change information in Palmdale, South Florida is extracted from an interferogram with the aid of a local water gauge as the reference. The results of our flooding analysis revealed that the majority of the study area in South Florida was flooded during Hurricane Irma.
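The detection rule the abstract describes (flag pixels whose storm-time backscatter departs significantly from the dry-condition statistics) can be sketched as a per-pixel z-test over a stack of pre-storm images. The 3-sigma threshold and the synthetic arrays below are illustrative assumptions, not the authors' processing chain:

```python
import numpy as np

# Minimal per-pixel change test: a pixel is flagged as flooded when its
# backscatter during the hurricane falls outside the dry-condition mean
# by more than z_thresh standard deviations.

def flood_mask(dry_stack, storm_image, z_thresh=3.0):
    """dry_stack: (n_dates, H, W) pre-storm backscatter in dB.
    storm_image: (H, W) co-registered image acquired during the storm."""
    mean = dry_stack.mean(axis=0)
    std = dry_stack.std(axis=0) + 1e-6          # avoid division by zero
    z = (storm_image - mean) / std
    # Either a drop (smooth open water) or a rise (double-bounce in
    # flooded vegetation or urban areas) counts as a detection.
    return np.abs(z) > z_thresh

rng = np.random.default_rng(0)
dry = rng.normal(-10.0, 0.5, size=(6, 4, 4))    # stable dry backscatter
storm = rng.normal(-10.0, 0.5, size=(4, 4))
storm[0, 0] = -20.0                              # open-water drop
storm[1, 1] = -2.0                               # double-bounce rise
mask = flood_mask(dry, storm)                    # flags both altered pixels
```

On real Sentinel-1 data the same logic runs on calibrated, co-registered dB images, with speckle filtering applied before the test.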

  13. Ultrasensitive human thyrotropin (h TSH) immunoradiometric assay (IRMA) set up, through identification and minimization of non specific bindings

    International Nuclear Information System (INIS)

    Peroni, C.N.

    1994-01-01

    An IRMA for hTSH based on magnetic solid-phase separation was studied, with particular attention to its non-specific binding. This was identified as a product of the interaction between an altered form of the radioiodinated anti-hTSH monoclonal antibody (125I-mAb) and the uncoupled magnetizable cellulose particles (matrix). Apparently this form of 125I-mAb is a type of aggregate that can be partly resolved from the main peak on Sephadex G-200 and further minimized via a single pre-incubation with the same matrix. Solid-phase saturation with milk proteins, tracer storage at 4 °C and serum addition during incubation were also found particularly effective in preventing its formation. These findings were used to reproducibly decrease non-specific binding to values 60/B0) up to values of 300-500. This way we obtained hTSH radioassays with functional sensitivities of about 0.05 mIU/L and analytical sensitivities of the order of 0.02 mIU/L, which classify them at least among the best second-generation assays and which are excellent indeed for magnetic IRMAs. A more optimistic sensitivity calculation, based on Rodbard's definition, provided values down to 0.008 mIU/L. Such sensitivities, moreover, were obtained in a very reproducible way and over the entire useful tracer life. (author). 83 refs, 13 figs, 25 tabs

  14. Characterization of Landslide Sites in Puerto Rico after Hurricanes Irma and María

    Science.gov (United States)

    Hughes, K. S.; Morales Vélez, A. C.

    2017-12-01

    Thousands of landslides in Puerto Rico and the U.S. Virgin Islands were triggered by the passage of Hurricanes Irma (Sep. 6) and María (Sep. 20) in 2017. Both were classified as Category 5 hurricanes on the Saffir-Simpson scale before making landfall. Most of the mass wasting occurred in the rugged mountainous regions of Puerto Rico and—along with bridge collapse, flooding, and the threat of dam failure—left many communities isolated for up to a month or longer. Aerial photography collected by FEMA and the Civil Air Patrol has allowed the rapid inventory of landslide sites across the archipelago by the USGS and other groups. Using this dataset and other local information, we identified a list of priority sites that were documented in detail as part of an NSF-GEER (Geotechnical Extreme Event Reconnaissance) mission. The juvenile landscape and short-wavelength topography in most of Puerto Rico present considerable landslide risk that is exaggerated during heavy rainfall events like Hurricane María. Our preliminary work shows that natural escarpments, de-vegetated pastureland in mountainous areas, and road cuts along incised river valleys were areas of concentrated failures during these storms. Notably, the northern karst area suffered fewer failures than the arc basement rocks exposed elsewhere on the island. In addition to previously active landslides at specific sites on the island, new landslides along PR-143 in the municipality of Barranquitas, PR-431 in the municipality of Lares, and PR-109 in the municipality of Añasco are among the important mass wasting events that were a focus of the GEER team and remain important in our ongoing research. A team of undergraduate and graduate students led by faculty at the University of Puerto Rico in Mayagüez is working to characterize the complete inventory of landslides in terms of underlying geology, soil type, slope, curvature, rainfall amounts during both atmospheric events, and other local geomorphic and

  15. The Assessment Cycle: A Model for Learning through Peer Assessment

    Science.gov (United States)

    Reinholz, Daniel

    2016-01-01

    This paper advances a model describing how peer assessment supports self-assessment. Although prior research demonstrates that peer assessment promotes self-assessment, the connection between these two activities is underspecified. This model, the assessment cycle, draws from theories of self-assessment to elaborate how learning takes place…

  16. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized and some approaches to modelling and quantifying it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
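A minimal sketch of the mixture idea: the predictive failure probability is a belief-weighted combination of candidate reliability models. The two exponential candidates and their weights below are hypothetical illustrations, not values from the report:

```python
import math

# Mixture approach to model uncertainty: instead of committing to one
# failure model, weight several candidates by the analyst's belief in
# each and combine their predictions.

def mixture_unreliability(t, models):
    """P(failure by time t) under a mixture of exponential models.
    models: list of (weight, failure_rate) pairs, weights summing to 1."""
    assert abs(sum(w for w, _ in models) - 1.0) < 1e-9
    return sum(w * (1.0 - math.exp(-lam * t)) for w, lam in models)

# Two competing rate estimates, e.g. from sparse operating experience.
candidates = [(0.7, 1e-4), (0.3, 5e-4)]   # (belief weight, failures/hour)
p_fail = mixture_unreliability(1000.0, candidates)
```

The mixture prediction always lies between the predictions of its components, which is one reason the report can also discuss its disadvantages: the mixture cannot express outcomes outside the span of the chosen candidate models.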

  17. Tumour associated antigen CA-50, CA-242 immunoradiometric assay (IRMA) in genitourinary malignancy and gastrointestinal carcinoma early diagnosis

    International Nuclear Information System (INIS)

    Chen Zhizhou.

    1992-04-01

    The tumour markers CA-50 and CA-242 were measured by immunoradiometric assay (IRMA) to investigate their usefulness in the diagnosis of cancer of the pancreas, biliary tract, liver, breast, lung, and gastrointestinal and genitourinary systems. The cutoff points, derived from studies on normal subjects and those with proven benign disease, were 20 U/ml and 12 U/ml for CA-50 and CA-242 respectively. Both markers were found to be generally useful, with significant differences between malignant and non-malignant disease. The highest positive rates were found in cancers of the pancreas and gall bladder. The overall rate of false positives was low. It is concluded that measurements of CA-50 and CA-242 are useful in the detection of malignancy, particularly of the pancreas and biliary tract. 2 figs, 2 tabs

  18. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  19. Short-term impacts of Hurricanes Irma and Maria on tropical stream chemistry as measured by in-situ sensors

    Science.gov (United States)

    McDowell, W. H.; Potter, J.; López-Lloreda, C.

    2017-12-01

    High-intensity hurricanes have been shown to alter tropical forest productivity and stream chemistry for years to decades in the montane rain forest of Puerto Rico, but much less is known about the immediate ecosystem response to these extreme events. Here we report the short-term impacts of Hurricanes Irma and Maria on the chemistry of Quebrada Sonadora immediately before and after the storms. We place the results from our 15-minute sensor record in the context of long-term weekly sampling that spans 34 years and includes two earlier major hurricanes (Hugo and Georges). As expected, turbidity during Maria was the highest in our sensor record (> 1000 NTU). Contrary to our expectations, we found that solute-flow behavior changed with the advent of the storms. Specific conductance showed a dilution response to flow before the storms, but then changed to an enrichment response during and after Maria. This switch in system behavior is likely due to the deposition of marine aerosols during the hurricane. Nitrate concentrations showed very little response to discharge prior to the recent hurricanes, but large increases in concentration occurred at high flow both during and after the hurricanes. Baseflow nitrate concentrations decreased immediately after Irma to below the long-term background concentrations, which we attribute to the immobilization of N on organic debris choking the stream channel. Within three weeks of Hurricane Maria, baseflow nitrate concentrations began to rise, likely due to mineralization of N from decomposing canopy vegetation on the forest floor and reduced N uptake by hurricane-damaged vegetation. The high-frequency sensors are providing new insights into the response of this ecosystem in the days and weeks following two major disturbance events. The flipping of the nitrate response to storms, from source-limited to transport-limited, suggests that these two severe hurricanes have fundamentally altered the nitrogen cycle at the site in ways
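The dilution-versus-enrichment behaviour discussed above is commonly diagnosed from the slope of the log-log concentration-discharge relation: a negative slope indicates dilution, a positive slope enrichment. The sketch below uses synthetic data and an illustrative chemostatic tolerance, and is not the authors' analysis:

```python
import numpy as np

# Classify solute behaviour from the slope b of log(C) = a + b*log(Q):
# b < 0 -> dilution, b > 0 -> enrichment, |b| small -> chemostatic.

def cq_behaviour(discharge, concentration, tol=0.05):
    b, _a = np.polyfit(np.log(discharge), np.log(concentration), 1)
    if b < -tol:
        return "dilution"
    if b > tol:
        return "enrichment"
    return "chemostatic"

q = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # discharge (arbitrary units)
pre_storm_sc = 200.0 * q ** -0.3          # conductance diluting with flow
post_storm_sc = 150.0 * q ** 0.25         # switch to enrichment after the storm
pre = cq_behaviour(q, pre_storm_sc)       # "dilution"
post = cq_behaviour(q, post_storm_sc)     # "enrichment"
```

Running this diagnostic on the 15-minute sensor record, split into pre- and post-storm windows, is one simple way to detect the switch in system behaviour the abstract reports.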

  20. Irrigation in dose assessments models

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, Ulla; Barkefors, Catarina [Studsvik RadWaste AB, Nykoeping (Sweden)

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. 
Some numerical results are presented showing that surface contamination in general gives the
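The two irrigation pathways described above (direct contamination by interception of irrigation water on the crop, plus indirect root uptake from soil) can be sketched with a generic activity-balance function. The functional form and all parameter values are simplified illustrations, not SKB's assessment model:

```python
# Illustrative two-pathway crop contamination sketch.  Real assessment
# models add weathering, translocation, soil build-up over repeated
# irrigation seasons, and nuclide-specific parameters.

def crop_concentration(water_conc_bq_per_m3, irrigation_m_per_yr,
                       interception_fraction, yield_kg_per_m2,
                       soil_conc_bq_per_kg, transfer_factor):
    """Radionuclide concentration in the crop (Bq/kg fresh weight)."""
    # Pathway 1: activity intercepted by foliage, per kg of harvested crop.
    intercepted = (water_conc_bq_per_m3 * irrigation_m_per_yr
                   * interception_fraction / yield_kg_per_m2)
    # Pathway 2: root uptake via a soil-to-plant transfer factor.
    root_uptake = soil_conc_bq_per_kg * transfer_factor
    return intercepted + root_uptake

# Hypothetical values: 1000 Bq/m3 water, 0.15 m/yr irrigation, 30 %
# interception, 2 kg/m2 yield, 50 Bq/kg soil, transfer factor 0.1.
c = crop_concentration(1000.0, 0.15, 0.3, 2.0, 50.0, 0.1)  # -> 27.5 Bq/kg
```

Even in this toy version, the interception fraction multiplies the dominant term, which is consistent with the sensitivity analysis's finding that interception and retention on vegetation surfaces are important parameters.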

  1. Irrigation in dose assessments models

    International Nuclear Information System (INIS)

    Bergstroem, Ulla; Barkefors, Catarina

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. 
Some numerical results are presented showing that surface contamination in general gives the

  2. Regional-scale impact of storm surges on groundwaters of Texas, Florida and Puerto Rico after 2017 hurricanes Harvey, Irma, Jose, Maria

    Science.gov (United States)

    Sellier, W. H.; Dürr, H. H.

    2017-12-01

    Hurricanes and related storm surges have devastating effects on near-shore infrastructure and above-ground installations. They also heavily impact groundwater resources, with potentially millions of people dependent on these resources as a freshwater source. Destruction of casings and direct incursions of saline and/or polluted waters have been widely observed. It is uncertain how extensive the effects on underground water systems are, especially in limestone karst areas such as Florida and Puerto Rico. Here, we report regional-scale water level changes in groundwater systems of Texas, Florida and Puerto Rico for the 2017 Hurricanes Harvey, Irma, Jose and Maria. We collected regional-scale data from the USGS Waterdata portal. Puerto Rico shows the strongest increase in groundwater levels in wells during Hurricane Maria, with less reaction to the preceding storms Irma and Jose. Increases in water levels range from 0.5 to 11 m, with maximum storm surges in Puerto Rico around 3 m. These wells are located throughout Puerto Rico, on the coast and inland. In Florida, most wells that show a response during Hurricane Irma are located in the Miami region. Wells located on the west coast show smaller responses, with the exception of one well located directly on Hurricane Irma's track. These wells show an increase of 0.2 to 1.7 m. In Texas, wells located in proximity to Hurricane Harvey's track show an increase in water level. The effect of groundwater level increases is not limited to the Texas coast, but extends inland as well; an increase between 0.03 and 2.9 m is seen. Storm surges for both Florida and Texas ranged from 1.8 to 3.7 m at maximum. We discuss the findings in the context of local and regional geology and hydrogeology (presence of connected aquifer systems, faulting, presence of carbonate/karst systems, etc.).

  3. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently, based on their preference for the sensory modality or representational system (visual, auditory or kinesthetic) they tend to favor most (their primary representational system, or PRS). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of the foundation of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  4. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently, based on their preference for the sensory modality or representational system (visual, auditory or kinesthetic) they tend to favor most (their primary representational system, or PRS). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch), which in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task: the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, which may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of the foundation of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result

  5. The Irma-sponge Project Frhymap: Flood Risk and Hydrological Mapping

    Science.gov (United States)

    Hoffmann, L.; Pfister, L.

    In the context of both increasing socio-economic developments in floodplains and the recent heavy floodings that have occurred in the Rhine and Meuse basins, the need for reliable hydro-climatological data, easily transposable hydrological and hydraulic models, and risk management tools has increased crucially. In the FRHYMAP project, some of these issues were addressed within a common mesoscale experimental basin: the Alzette river basin, located in the Grand Duchy of Luxembourg. The various aspects of flooding events, from the hydro-climatological analysis of field data to the risk assessment of socio-economic impacts, taking into account past and future climate and land-use changes, were analysed by the six participating research institutes (CREBS, L; CEREG, F; DLR, D; EPFL, CH; UB, D; VUB, B). Hydro-climatological data analysis over the past five decades has shown that in the study area, the increase in westerly and south-westerly atmospheric circulation patterns induced higher winter rainfall totals, leading to more frequent groundwater resurgences and ultimately also to higher daily maximum streamflow of the Alzette. The thus increased flood hazard nonetheless has a certain spatial variability, closely linked to the rainfall distribution patterns, which depend strongly on the topographical characteristics of the study area. Although the overall regime of the Alzette depends more on climate fluctuations, land-use changes (mining activities, urbanisation) had a marked effect on the rainfall-runoff relationship in some sub-basins over the last decades. By linking model parameters to physiographical basin characteristics, regionalised and thus easily transposable hydrological models were developed. Within a study area with very few long-term observation series, this technique, combined with the use of hydraulic models, allowed hydrological hazard-producing and hydrological risk-exposed areas to be defined. The

  6. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  7. Dose assessment models. Annex A

    International Nuclear Information System (INIS)

    1982-01-01

    The models presented in this chapter have been separated into 2 general categories: environmental transport models which describe the movement of radioactive materials through all sectors of the environment after their release, and dosimetric models to calculate the absorbed dose following an intake of radioactive materials or exposure to external irradiation. Various sections of this chapter also deal with atmospheric transport models, terrestrial models, and aquatic models.
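Atmospheric transport models of the kind grouped in this annex are commonly built around the standard Gaussian plume equation. The sketch below is a generic illustration, not a model from the annex itself; the release rate, wind speed, release height, and dispersion parameters are hypothetical, and the dispersion parameters are assumed to be supplied for the downwind distance of interest.

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration (Bq/m^3) with ground reflection.

    Q: release rate (Bq/s), u: wind speed (m/s), H: effective release height (m),
    y: crosswind offset (m), z: receptor height (m),
    sigma_y/sigma_z: horizontal/vertical dispersion parameters (m)
    evaluated at the downwind distance of interest.
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # image source: ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical example: 1e9 Bq/s release, 5 m/s wind, plume centerline at ground level
chi = gaussian_plume(Q=1e9, u=5.0, y=0.0, z=0.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
```

In a full assessment model, the concentration field would then feed the food-chain and dosimetry submodels described in the chapter.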

  8. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and to insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose.

  9. Sensitivity Assessment of Ozone Models

    Energy Technology Data Exchange (ETDEWEB)

    Shorter, Jeffrey A.; Rabitz, Herschel A.; Armstrong, Russell A.

    2000-01-24

    The activities under this contract effort were aimed at developing sensitivity analysis techniques and fully equivalent operational models (FEOMs) for applications in the DOE Atmospheric Chemistry Program (ACP). MRC developed a new model representation algorithm that uses a hierarchical, correlated function expansion containing a finite number of terms. A full expansion of this type is an exact representation of the original model, and each of the expansion functions is explicitly calculated using the original model. Once calculated, the expansion functions are assembled into a fully equivalent operational model (FEOM) that can directly replace the original model.
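Hierarchical correlated function expansions of this kind are often illustrated with a first-order "cut" variant, in which each component function is obtained by varying one input at a time about a reference point. The sketch below is a generic illustration of that idea, not MRC's algorithm; the toy model, cut point, and grids are all hypothetical.

```python
import numpy as np

def cut_hdmr_first_order(f, cut_point, grids):
    """Tabulate the zeroth- and first-order terms of a cut-type expansion of f.

    f: model mapping a parameter vector to a scalar output.
    cut_point: reference parameter vector c.
    grids: list of 1-D arrays of sample values for each input variable.
    Returns f0 = f(c) and component functions f_i(x_i) = f(c with x_i varied) - f0.
    """
    c = np.asarray(cut_point, dtype=float)
    f0 = f(c)
    components = []
    for i, grid in enumerate(grids):
        values = []
        for v in grid:
            x = c.copy()
            x[i] = v          # vary one input at a time about the cut point
            values.append(f(x) - f0)
        components.append(np.array(values))
    return f0, components

def surrogate_eval(f0, components, grids, x):
    """Evaluate the first-order surrogate f0 + sum_i f_i(x_i) by interpolation."""
    return f0 + sum(np.interp(xi, grid, fi)
                    for xi, grid, fi in zip(x, grids, components))

# Hypothetical toy model: additively separable, so the first-order surrogate is exact
model = lambda x: x[0] ** 2 + 3 * x[1]
grids = [np.linspace(-1.0, 1.0, 21), np.linspace(-1.0, 3.0, 41)]
f0, comps = cut_hdmr_first_order(model, [0.0, 0.0], grids)
surrogate_value = surrogate_eval(f0, comps, grids, [0.5, 2.0])
```

Once tabulated, the surrogate can replace the original model in repeated evaluations, which is the operational benefit the abstract describes.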

  10. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  11. Predictive Model Assessment for Count Data

    National Research Council Canada - National Science Library

    Czado, Claudia; Gneiting, Tilmann; Held, Leonhard

    2007-01-01

    .... In case studies, we critique count regression models for patent data, and assess the predictive performance of Bayesian age-period-cohort models for larynx cancer counts in Germany. Key words: Calibration...
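Predictive assessment for count data in this line of work rests on calibration checks and proper scoring rules. As a minimal illustration, the logarithmic score for Poisson predictive distributions can be computed directly; the observed counts and predictive means below are hypothetical, not data from the study.

```python
import math

def poisson_log_score(observed, means):
    """Mean negative log predictive density (logarithmic score) for Poisson forecasts.

    Lower scores indicate better probabilistic predictions.
    """
    def logpmf(k, mu):
        # log of the Poisson pmf: k*ln(mu) - mu - ln(k!)
        return k * math.log(mu) - mu - math.lgamma(k + 1)
    return -sum(logpmf(k, mu) for k, mu in zip(observed, means)) / len(observed)

# Hypothetical counts and predictive means
score = poisson_log_score([2, 0, 3, 1, 4], [1.8, 0.5, 2.9, 1.2, 3.5])
```

Because the score is proper, a predictive mean close to the data-generating rate scores better than a badly miscalibrated one.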

  12. A Quantitative Software Risk Assessment Model

    Science.gov (United States)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  13. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  14. Assessment of the Rescorla-Wagner model.

    Science.gov (United States)

    Miller, R R; Barnet, R C; Grahame, N J

    1995-05-01

    The Rescorla-Wagner model has been the most influential theory of associative learning to emerge from the study of animal behavior over the last 25 years. Recently, equivalence to this model has become a benchmark in assessing connectionist models, with such equivalence often achieved by incorporating the Widrow-Hoff delta rule. This article presents the Rescorla-Wagner model's basic assumptions, reviews some of the model's predictive successes and failures, relates the failures to the model's assumptions, and discusses the model's heuristic value. It is concluded that the model has had a positive influence on the study of simple associative learning by stimulating research and contributing to new model development. However, this benefit should neither lead to the model being regarded as inherently "correct" nor imply that its predictions can be profitably used to assess other models.
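The model's trial-by-trial learning rule, dV_i = alpha_i * beta * (lambda - sum(V)), is the prediction-error update that the abstract relates to the Widrow-Hoff delta rule. A minimal sketch follows; the cue names, saliences, and learning-rate values are hypothetical, and the second phase illustrates the model's well-known blocking prediction.

```python
def rescorla_wagner_trial(V, present, alpha, beta, lam):
    """One Rescorla-Wagner update: dV_i = alpha_i * beta * (lambda - V_total).

    V: dict of associative strengths per cue; present: cues on this trial;
    alpha: per-cue salience; beta: US learning rate; lam: US asymptote.
    """
    v_total = sum(V[c] for c in present)   # summed strength of all present cues
    error = lam - v_total                  # prediction error (the delta-rule term)
    for c in present:
        V[c] += alpha[c] * beta * error
    return V

# Hypothetical acquisition followed by a blocking demonstration
V = {"light": 0.0, "tone": 0.0}
alpha = {"light": 0.5, "tone": 0.5}
for _ in range(50):                        # phase 1: train light alone to asymptote
    rescorla_wagner_trial(V, ["light"], alpha, beta=0.2, lam=1.0)
for _ in range(50):                        # phase 2: compound training; tone is blocked
    rescorla_wagner_trial(V, ["light", "tone"], alpha, beta=0.2, lam=1.0)
```

After phase 1 the light alone predicts the US, so the prediction error during compound training is near zero and the tone acquires almost no associative strength, which is the blocking result.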

  15. Early Grade Writing Assessment: An Instrument Model

    Science.gov (United States)

    Jiménez, Juan E.

    2017-01-01

    The United Nations Educational, Scientific, and Cultural Organization promoted the creation of a model instrument for individual assessment of students' foundational writing skills in the Spanish language that was based on a literature review and existing writing tools and assessments. The purpose of the "Early Grade Writing Assessment"…

  16. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S.

    2016-01-01

    Need to Assess the Skill of Ecosystem Models Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of biophysical models. A range of skill assessment methods have been reviewed but skill assessment of full marine ecosystem models has not yet been attempted. Northeast US Atlantis Marine Ecosystem Model We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation), and a suite of time-series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. Skill Assessment Is Both Possible and Advisable We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. 
We show that it is possible to not only assess the skill of a complicated marine ecosystem model, but that
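The four metrics named in the abstract (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) can be computed from any paired forecast-observation series. A minimal sketch with hypothetical data follows, taking modeling efficiency in its Nash-Sutcliffe form; the series below are illustrative, not the NEUS data.

```python
import numpy as np
from scipy.stats import spearmanr

def skill_metrics(obs, pred):
    """Forecast skill metrics for paired observed/predicted series."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = pred - obs
    return {
        "AAE": np.abs(err).mean(),                    # average absolute error
        "RMSE": np.sqrt((err ** 2).mean()),           # root mean squared error
        # modeling efficiency (Nash-Sutcliffe): 1 is perfect, <0 worse than the mean
        "MEF": 1.0 - (err ** 2).sum() / ((obs - obs.mean()) ** 2).sum(),
        "rho": spearmanr(obs, pred).correlation,      # Spearman rank correlation
    }

# Hypothetical biomass series (observed vs. forecast)
metrics = skill_metrics([5.1, 4.8, 5.6, 6.0, 5.9], [5.0, 4.9, 5.4, 6.2, 5.7])
```

Reporting several metrics together, as the study does, guards against a model that scores well on error magnitude but poorly on rank order, or vice versa.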

  17. Caries risk assessment models in caries prediction

    Directory of Open Access Journals (Sweden)

    Amila Zukanović

    2013-11-01

    Full Text Available Objective. The aim of this research was to assess the efficiency of different multifactor models in caries prediction. Material and methods. Data from the questionnaire and objective examination of 109 examinees was entered into the Cariogram, Previser and Caries-Risk Assessment Tool (CAT) multifactor risk assessment models. Caries risk was assessed with the help of all three models for each patient, classifying them as low-, medium- or high-risk patients. The development of new caries lesions over a period of three years [Decay Missing Filled Tooth (DMFT) increment = difference between Decay Missing Filled Tooth Surface (DMFTS) index at baseline and follow-up] provided for examination of the predictive capacity of the different multifactor models. Results. The data gathered showed that the different multifactor risk assessment models give significantly different results (Friedman test: Chi square = 100.073, p=0.000). Cariogram is the model which identified the majority of examinees as medium-risk patients (70%). The other two models were more radical in risk assessment, giving more unfavorable risk profiles for patients. In only 12% of the patients did the three multifactor models assess the risk in the same way. Previser and CAT gave the same results in 63% of cases; the Wilcoxon test showed that there is no statistically significant difference in caries risk assessment between these two models (Z = -1.805, p=0.071). Conclusions. Evaluation of three different multifactor caries risk assessment models (Cariogram, PreViser and CAT) showed that only the Cariogram can successfully predict new caries development in 12-year-old Bosnian children.
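The nonparametric tests reported in this study are available in standard statistics libraries. The sketch below runs the same kind of Friedman and pairwise Wilcoxon comparisons; the risk categories (1 = low, 2 = medium, 3 = high) for ten patients are hypothetical, not the study's data.

```python
from scipy.stats import friedmanchisquare, wilcoxon

# Hypothetical risk categories assigned by three assessment models to the same patients
cariogram = [2, 2, 1, 2, 3, 2, 2, 1, 2, 2]
previser  = [3, 2, 2, 3, 3, 2, 3, 2, 3, 3]
cat       = [3, 3, 2, 3, 2, 2, 3, 3, 3, 2]

# Omnibus test: do the three models assign risk differently overall?
stat, p = friedmanchisquare(cariogram, previser, cat)

# Pairwise follow-up: Previser vs. CAT on the same paired ratings
w_stat, w_p = wilcoxon(previser, cat)
```

The Friedman test is the matched-samples analogue of a one-way ANOVA on ranks, which is why it suits the repeated-ratings design used here.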

  18. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  19. Risk assessment models in the tourism sector

    Directory of Open Access Journals (Sweden)

    Simanavicius Arturas

    2015-05-01

    Full Text Available Tourism is one of the most prominent contemporary success stories. The industry began to grow significantly only around 1960, and during the last 50 years tourism revenues and the number of outbound travellers have increased many times over. The tourism sector is therefore highly attractive for new business initiation and for the development of its dynamic growth: new activities, new trends and technologies, new markets and rapid changes. Purpose of the article: to analyze the prevailing risks in the tourism sector and to identify business risk assessment models. Researchers pay considerable attention to risk analysis, and a series of theoretical, methodological and practical studies of risk analysis have been made, but economists have paid little attention to tourism risk in practice. The tourism risk assessment models analyzed in the article showed their adaptability to the tourism industry. The review of tourism economic risk assessment models showed that it is appropriate to use a procedural approach in classifying tourism risks, one related to the stages of tourism product identification. It is logical to link the identification of risks to the stages of tourism services, as certain risk groups prevail in each stage. The aim of the article: to analyze tourism risk assessment models and, on the basis of this analysis, to develop a further tourism risk assessment model. The article's originality lies in the prepared tourism risk assessment model, which is versatile and can be used in different countries for assessing tourism risks.

  20. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of biophysical models. A range of skill assessment methods have been reviewed but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation), and a suite of time-series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is possible to not only assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. 
Our methods are applicable

  1. Model of MSD Risk Assessment at Workplace

    OpenAIRE

    K. Sekulová; M. Šimon

    2015-01-01

    This article focuses on a model for assessing the risk of upper-extremity musculoskeletal disorders (MSDs) at the workplace. The model uses risk factors that are responsible for damage to the musculoskeletal system. Based on statistical calculations, the model is able to define the MSD risk faced by workers exposed to these risk factors. The model can also indicate how much the MSD risk would decrease if these risk factors were eliminated.

  2. Hurricane Irma's Effects on Dune and Beach Morphology at Matanzas Inlet, Atlantic Coast of North Florida: Impacts and Inhibited Recovery?

    Science.gov (United States)

    Adams, P. N.; Conlin, M. P.; Johnson, H. A.; Paniagua-Arroyave, J. F.; Woo, H. B.; Kelly, B. P.

    2017-12-01

    During energetic coastal storms, surge from low atmospheric pressure, high wave set-up, and increased wave activity contribute to significant morphologic change within the dune and upper beach environments of barrier island systems. Hurricane Irma made landfall on the southwestern portion of the Florida peninsula, as a category 4 storm on Sept 10th, 2017 and tracked northward along the axis of the Florida peninsula for two days before dissipating over the North American continent. Observations along the North Florida Atlantic coast recorded significant wave heights of nearly 7 m and water levels that exceeded predictions by 2 meters on the early morning of Sept. 11th. At Fort Matanzas National Monument, the dune and upper beach adjacent to Matanzas Inlet experienced landward retreat during the storm, diminishing the acreage of dune and scrub habitat for federally-listed endangered and threatened animal species, including the Anastasia beach mouse, gopher tortoises, and several protected shore birds. Real Time Kinematic (RTK) GPS surveys, conducted prior to the passage of the storm (Sept. 8) and immediately after the storm (Sept. 13) document dune scarp retreat >10 m in places and an average retreat of 7.8 m (+/- 5.2 m) of the 2-m beach contour, attributable to the event, within the study region. Although it is typical to see sedimentary recovery at the base of dunes within weeks following an erosive event of this magnitude, our follow up RTK surveys, two weeks (Sept. 26) and five weeks (Oct. 19) after the storm, document continued dune retreat and upper beach lowering. Subsequent local buoy observations during the offshore passage of Hurricanes Jose, Maria (Sept. 17 and 23, respectively) and several early-season Nor'easters recorded wave heights well above normal (2-3 meters) from the northeast. The lack of recovery may reveal a threshold vulnerability of the system, in which the timing of multiple moderate-to-high wave events, in the aftermath of a land falling

  3. A Model for Situation and Threat Assessment

    National Research Council Canada - National Science Library

    Steinberg, Alan

    2006-01-01

    .... The activity relates to levels 2 and 3 of the familiar JDL data fusion model. Level 2, Situation Assessment, involves such applications as scene understanding, force structure analysis and many other types of situational analysis...

  4. Evaluation of models in performance assessment

    International Nuclear Information System (INIS)

    Dormuth, K.W.

    1993-01-01

    The reliability of models used for performance assessment for high-level waste repositories is a key factor in making decisions regarding the management of high-level waste. Model reliability may be viewed as a measure of the confidence that regulators and others have in the use of these models to provide information for decision making. The degree of reliability required for the models will increase as implementation of disposal proceeds and decisions become increasingly important to safety. Evaluation of the models by using observations of real systems provides information that assists the assessment analysts and reviewers in establishing confidence in the conclusions reached in the assessment. A continuing process of model calibration, evaluation, and refinement should lead to increasing reliability of models as implementation proceeds. However, uncertainty in the model predictions cannot be eliminated, so decisions will always be made under some uncertainty. Examples from the Canadian program illustrate the process of model evaluation using observations of real systems and its relationship to performance assessment. 21 refs., 2 figs

  5. Qualitative and Quantitative Detection of Urinary Human Complement Factor H-Related Protein (BTA Stat and BTA TRAK) and Fragments of Cytokeratins 8, 18 (UBC Rapid and UBC IRMA) as Markers for Transitional Cell Carcinoma of the Bladder

    Czech Academy of Sciences Publication Activity Database

    Babjuk, M.; Koštířová, M.; Mudra, K.; Pecher, S.; Smolová, H.; Pecen, Ladislav; Ibrahim, Z.; Dvořáček, J.; Jarolím, L.; Novák, J.; Zima, T.

    2002-01-01

    Vol. 41, No. 1 (2002), pp. 34-39 ISSN 0302-2838 R&D Projects: GA MZd NC5961 Institutional research plan: AV0Z1030915 Keywords: TCC of bladder * BTA stat and BTA TRAK * UBC Rapid and UBC IRMA * urinary cytology Subject RIV: BA - General Mathematics Impact factor: 1.798, year: 2002

  6. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human-system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and, ultimately, errors of commission in nuclear power plants. Quantitative or prescriptive models for predicting an operator's situation assessment in a given situation provide many benefits, such as HSI design solutions, human performance data, and human reliability estimates. Unfortunately, few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. Thus we proposed a new computational situation assessment model of nuclear power plant operators. The proposed model, which incorporates significant cognitive factors, uses a Bayesian belief network (BBN) as its model architecture. It is believed that communication between nuclear power plant operators affects their situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. The proposed model succeeded in representing the operators' behavior, and this paper shows the details
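A Bayesian belief network of the kind used as the model architecture can be illustrated with a minimal two-node fragment updated by enumeration. The plant states, the HSI reading, and all probabilities below are hypothetical, chosen only to show how evidence shifts the belief.

```python
# Hypothetical two-node fragment: PlantState -> AlarmReading
prior = {"normal": 0.9, "loca": 0.1}            # P(PlantState)
likelihood = {                                   # P(AlarmReading | PlantState)
    "normal": {"alarm": 0.05, "no_alarm": 0.95},
    "loca":   {"alarm": 0.90, "no_alarm": 0.10},
}

def posterior(reading):
    """Posterior belief over plant states after observing one HSI reading."""
    joint = {s: prior[s] * likelihood[s][reading] for s in prior}  # P(s) * P(reading | s)
    z = sum(joint.values())                                        # normalizing constant
    return {s: p / z for s, p in joint.items()}

belief = posterior("alarm")   # the operator's updated situation assessment
```

In a full model, nodes for communication between operators would be added in the same way, with their conditional probability tables encoding how shared information modifies each operator's belief.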

  7. evaluation of models for assessing groundwater vulnerability

    African Journals Online (AJOL)

    DR. AMINU

    applied models for groundwater vulnerability assessment mapping. ... of other models have not been applied to ground water studies in Nigeria, unlike other parts of .... [Table fragment, DRASTIC soil-media ratings (Aller et al., 1987): Clay Loam, 3; Muck, 2; Nonshrinking and nonaggregated clay, 1. Table 2: Assigned weights for DRASTIC parameters.]

  8. A Simple Model of Self-Assessments

    NARCIS (Netherlands)

    S. Dominguez Martinez (Silvia); O.H. Swank (Otto)

    2006-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too

  9. A simple model of self-assessment

    NARCIS (Netherlands)

    Dominguez-Martinez, S.; Swank, O.H.

    2009-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too positive and

  10. Models and parameters for environmental radiological assessments

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.W. [ed.]

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base. (ACR)

  11. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base

  12. The Model for Assessment of Telemedicine (MAST)

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Clemensen, Jane; Caffery, Liam J

    2017-01-01

    The evaluation of telemedicine can be achieved using different evaluation models or theoretical frameworks. This paper presents a scoping review of published studies which have applied the Model for Assessment of Telemedicine (MAST). MAST includes pre-implementation assessment (e.g. by use of participatory design), followed by multidisciplinary assessment, including description of the patients and the application, and assessment of safety, clinical effectiveness, patient perspectives, economic aspects, organisational aspects and socio-cultural, legal and ethical aspects. Twenty-two studies met the inclusion criteria and were included in the review. In this article, research design and methods used in the multidisciplinary assessment are described, strengths and weaknesses are analysed, and recommendations for future research are presented.

  13. Predictions of models for environmental radiological assessment

    International Nuclear Information System (INIS)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa; Mahler, Claudio Fernando

    2011-01-01

    In the field of environmental impact assessment, models are used for estimating the source term, environmental dispersion and transfer of radionuclides, exposure pathways, radiation dose, and the risk for human beings. Although it is recognized that specific local data are important to improve the quality of dose assessment results, obtaining them can in fact be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite the subjectivity of modelers, exposure scenarios and pathways, the codes used, and general parameters. The various models available use different mathematical approaches with different complexities that can result in different predictions. Thus, for the same inputs, different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. A model intercomparison exercise supplied incompatible results for 137Cs and 60Co, underlining the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be compared on a common basis. The results of the intercomparison exercise are presented briefly. (author)

  14. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.
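    Benchmark comparisons like those described above usually come down to a few summary statistics per site and variable. A minimal sketch of the root mean square error and correlation-coefficient metrics, using invented numbers in place of model output and observed NEE:

```python
import numpy as np

# Invented NEE values; in practice these would be model output and
# flux-tower observations aligned in time.
obs = np.array([1.2, 0.8, -0.5, -1.1, 0.3, 0.9])
model = np.array([2.0, 1.5, 0.2, -0.3, 1.1, 1.6])

# Root mean square error: overall magnitude of model-observation mismatch
rmse = np.sqrt(np.mean((model - obs) ** 2))
# Pearson correlation: how well the model tracks the observed variation
corr = np.corrcoef(model, obs)[0, 1]
print(rmse, corr)
```

    Note that a consistent positive offset (such as the overestimated carbon source described above) inflates the RMSE while leaving the correlation high, which is why both statistics are reported.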

  15. Underwater noise modelling for environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Farcas, Adrian [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom); Thompson, Paul M. [Lighthouse Field Station, Institute of Biological and Environmental Sciences, University of Aberdeen, Cromarty IV11 8YL (United Kingdom); Merchant, Nathan D., E-mail: nathan.merchant@cefas.co.uk [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom)

    2016-02-15

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.
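    As a rough illustration of the kind of calculation that underlies noise exposure predictions, the sketch below applies a simple sonar-equation estimate (geometric spreading plus absorption). Real EIA noise modelling uses far more sophisticated propagation models, and all parameter values here are illustrative assumptions:

```python
import math

# Back-of-envelope received-level estimate: source level minus transmission
# loss from geometric spreading and frequency-dependent absorption.
# Spreading and absorption coefficients are illustrative assumptions.
def received_level_db(source_level_db, range_m,
                      spreading=20.0, alpha_db_per_km=0.05):
    """Received level (dB re 1 uPa) at a given range from the source."""
    transmission_loss = (spreading * math.log10(range_m)
                         + alpha_db_per_km * range_m / 1000.0)
    return source_level_db - transmission_loss

# e.g. a 220 dB impulsive source heard 10 km away
print(received_level_db(220.0, 10_000.0))
```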

  16. Bioavailability in the boris assessment model

    International Nuclear Information System (INIS)

    Norden, M.; Avila, R.; Gonze, M.A.; Tamponnet, C.

    2004-01-01

    The fifth framework EU project BORIS (Bioavailability Of Radionuclides In Soils: role of biological components and resulting improvement of prediction models) has three scientific objectives. The first is to improve understanding of the mechanisms governing the transfer of radionuclides to plants. The second is to improve existing predictive models of radionuclide interaction with soils by incorporating the knowledge acquired from the experimental results. The third and last objective is to extract from the experimental results a scientific basis for the development of bioremediation methods for radionuclide-contaminated soils, and to assess the role of additional non-radioactive pollutants in radionuclide bioavailability. This paper is focused on the second objective. The purpose of the BORIS assessment model is to describe the behaviour of radionuclides in the soil-plant system with the aim of predicting the time dynamics of the bioavailability of radionuclides in soil and the radionuclide concentrations in plants. To be useful, the assessment model should be rather simple and use only a few parameters which are commonly available, or possible to measure, for different sites. The model shall take into account, as much as possible, the results of the experimental studies and the mechanistic models developed in the BORIS project. One possible approach is to introduce in the assessment model a quantitative relationship between the bioavailability of the radionuclides in soil and the soil properties. To do this an operational definition of bioavailability is needed. Here operational means experimentally measurable, directly or indirectly, and that the bioavailability can be translated into a mathematical expression. This paper describes the reasoning behind the chosen definition of bioavailability for the assessment model, how to derive operational expressions for the bioavailability and how to use them in the assessment model. (author)
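    To make the idea of an operational, measurable definition concrete, the sketch below uses the exchangeable (soil-solution) fraction derived from a distribution coefficient Kd as a stand-in bioavailability expression. This is a generic textbook relationship with invented parameter values, not the actual BORIS formulation:

```python
# Generic sketch: bioavailability approximated by the fraction of the
# radionuclide inventory residing in soil solution (invented values;
# not the BORIS model itself).
def bioavailable_fraction(kd_l_per_kg, theta=0.3, rho_kg_per_l=1.4):
    """Solution fraction for volumetric water content theta, bulk
    density rho and solid/liquid distribution coefficient Kd."""
    return theta / (theta + rho_kg_per_l * kd_l_per_kg)

def plant_concentration(c_soil_bq_per_kg, kd_l_per_kg, transfer_factor):
    # Plant concentration assumed proportional to the bioavailable pool
    return transfer_factor * bioavailable_fraction(kd_l_per_kg) * c_soil_bq_per_kg

# A strongly sorbing soil (high Kd) yields low plant uptake
print(plant_concentration(1000.0, kd_l_per_kg=270.0, transfer_factor=50.0))
```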

  17. Real-Time Tracking of the Extreme Rainfall of Hurricanes Harvey, Irma, and Maria using UCI CHRS's iRain System

    Science.gov (United States)

    Shearer, E. J.; Nguyen, P.; Ombadi, M.; Palacios, T.; Huynh, P.; Furman, D.; Tran, H.; Braithwaite, D.; Hsu, K. L.; Sorooshian, S.; Logan, W. S.

    2017-12-01

    During the 2017 hurricane season, three major hurricanes (Harvey, Irma, and Maria) devastated the Atlantic coast of the US and the Caribbean Islands. Harvey set the record for the rainiest storm in continental US history, Irma was the longest-lived powerful hurricane ever observed, and Maria was the costliest storm in Puerto Rican history. The recorded maximum precipitation totals for these storms were 65, 16, and 20 inches respectively. These events provided the Center for Hydrometeorology and Remote Sensing (CHRS) an opportunity to test its global real-time satellite precipitation observation system, iRain, on extreme storm events. The iRain system has been under development through a collaboration between CHRS at the University of California, Irvine (UCI) and UNESCO's International Hydrological Program (IHP). iRain provides near real-time high resolution (0.04°, approx. 4 km) global (60°N - 60°S) satellite precipitation data estimated by the PERSIANN-Cloud Classification System (PERSIANN-CCS) algorithm developed by scientists at CHRS. The user-interactive and web-accessible iRain system allows users to visualize and download real-time global satellite precipitation estimates and to track the development and path of the current 50 largest storms globally from data generated by the PERSIANN-CCS algorithm. iRain continues to prove an effective tool for measuring real-time precipitation amounts of extreme storms, especially in locations that do not have extensive rain gauge or radar coverage, such as large portions of the world's oceans and continents such as Africa and Asia. CHRS has also created a mobile app version of the system named "iRain UCI", available for iOS and Android devices. During these storms, real-time rainfall data generated by PERSIANN-CCS was consistently comparable to radar and rain gauge data. This presentation evaluates iRain's efficiency as a tool for extreme precipitation monitoring and provides an evaluation of the

  18. Assessing alternative conceptual models of fracture flow

    International Nuclear Information System (INIS)

    Ho, C.K.

    1995-01-01

    The numerical code TOUGH2 was used to assess alternative conceptual models of fracture flow. The models considered included the equivalent continuum model (ECM) and the dual permeability (DK) model. A one-dimensional, layered, unsaturated domain was studied with a saturated bottom boundary and a constant infiltration at the top boundary. Two different infiltration rates were used in the studies. In addition, the connection areas between the fracture and matrix elements in the dual permeability model were varied. Results showed that the two conceptual models of fracture flow produced different saturation and velocity profiles, even under steady-state conditions. The magnitudes of the discrepancies were sensitive to two parameters that affect the flux between the fractures and matrix in the dual permeability model: (1) the fracture-matrix connection areas and (2) the capillary pressure gradients between the fracture and matrix elements.

  19. An Integrated Ecological Modeling System for Assessing ...

    Science.gov (United States)

    We demonstrate a novel, spatially explicit assessment of the current condition of aquatic ecosystem services, with a limited sensitivity analysis for the atmospheric contaminant mercury. The Integrated Ecological Modeling System (IEMS) forecasts water quality and quantity, habitat suitability for aquatic biota, fish biomasses, population densities, productivities, and contamination by methylmercury across headwater watersheds. We applied this IEMS to the Coal River Basin (CRB), West Virginia (USA), an 8-digit hydrologic unit watershed, by simulating a network of 97 stream segments using the SWAT watershed model, a watershed mercury loading model, the WASP water quality model, the PiSCES fish community estimation model, a fish habitat suitability model, the BASS fish community and bioaccumulation model, and an ecoservices post-processor. Model application was facilitated by automated data retrieval and model setup, and by updated model wrappers and interfaces for data transfers between these models from a prior study. This companion study evaluates baseline predictions of ecoservices provided for 1990-2010 for the population of streams in the CRB and serves as a foundation for future model development. Published in the journal Ecological Modeling. Highlights: • Demonstrate a spatially-explicit IEMS for multiple scales. • Design a flexible IEMS for

  20. Personalized pseudophakic model for refractive assessment.

    Directory of Open Access Journals (Sweden)

    Filomena J Ribeiro

    PURPOSE: To test a pseudophakic eye model that allows for intraocular lens power (IOL) calculation, both in normal eyes and in extreme conditions, such as post-LASIK. METHODS: PARTICIPANTS: The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing the same method to be tested in the same eye after changing only the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF), weighted according to the contrast sensitivity function (CSF), were applied. RESULTS: Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). CONCLUSIONS: Results suggest that personalized pseudophakic eye models and ray-tracing allow the use of the same methodology regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  1. Personalized pseudophakic model for refractive assessment.

    Science.gov (United States)

    Ribeiro, Filomena J; Castanheira-Dinis, António; Dias, João M

    2012-01-01

    To test a pseudophakic eye model that allows for intraocular lens power (IOL) calculation, both in normal eyes and in extreme conditions, such as post-LASIK. The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing the same method to be tested in the same eye after changing only the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF), weighted according to the contrast sensitivity function (CSF), were applied. Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). Results suggest that personalized pseudophakic eye models and ray-tracing allow the use of the same methodology regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  2. An Office Automation Needs Assessment Model

    Science.gov (United States)

    1985-08-01

    This study assessed the office automation needs of an Army hospital. Based on a literature review and interviews with industry experts, a model was developed to assess office automation needs. The model was applied to the needs of the Clinical Support Division. The author identified a need for a strategic plan for office automation prior to analysis of a specific service for automation, and recommended the establishment of a Hospital Automation Advisory Council to centralize policy recommendations for office automation.

  3. Quality assessment of human behavior models

    NARCIS (Netherlands)

    Doesburg, W.A. van

    2007-01-01

    Accurate and efficient models of human behavior offer great potential in military and crisis management applications. However, little attention has been given to the manner in which it can be determined whether this potential is actually realized. In this study a quality assessment approach that ...

  4. A STATISTICAL MODEL FOR STOCK ASSESSMENT OF ...

    African Journals Online (AJOL)

    Assessment of the status of southern bluefin tuna (SBT) by Australia and Japan has used a method (ADAPT) that imposes a number of structural restrictions, and is ... over time within the bounds of specific structure, and (3) autocorrelation in recruitment processes is considered within the likelihood framework of the model.

  5. Testing spatial heterogeneity with stock assessment models

    DEFF Research Database (Denmark)

    Jardim, Ernesto; Eero, Margit; Silva, Alexandra

    2018-01-01

    This paper describes a methodology that combines meta-population theory and stock assessment models to gain insights about spatial heterogeneity of the meta-population in an operational time frame. The methodology was tested with stochastic simulations for different degrees of connectivity between ...

  6. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to revealing several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures, and also in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  7. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities, and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with the predictions of such models must be established. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. The methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model, and importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling is offered as a promising candidate for sensitivity analyses. Several different ways of viewing the validity of a model are presented, and criteria are given for selecting models for environmental assessment purposes.
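    The analysis of multiplicative chain models with independent lognormal inputs mentioned above rests on the fact that a product of lognormal factors is itself lognormal, with log-space mean and variance equal to the sums of the factors' log-space means and variances. A small sketch with invented parameter values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Multiplicative chain: output = factor1 * factor2 * factor3 * factor4,
# each factor independent and lognormal (log-space parameters invented).
mus = np.array([0.0, -1.0, -2.0, 0.5])
sigmas = np.array([0.3, 0.5, 0.4, 0.2])

n = 100_000
factors = rng.lognormal(mean=mus, sigma=sigmas, size=(n, len(mus)))
dose = factors.prod(axis=1)

# Monte Carlo log-mean of the product vs the analytic value sum(mus)
mc_mu = np.log(dose).mean()
analytic_mu = mus.sum()
print(mc_mu, analytic_mu)  # the two should agree closely
```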

  8. Review and assessment of pool scrubbing models

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez, J.

    1996-07-01

    Decontamination of fission-product-bearing bubbles as they pass through aqueous pools is a crucial phenomenon for source term evaluation of hypothetical risk-dominant sequences of Light Water Reactors. In the present report a peer review and assessment of the models encapsulated in the SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both the modelling and the experimental points of view. In addition, RHR and SGTR sequences were simulated with the SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas warranting further research have been identified. (Author) 73 refs.

  9. Radionuclide transport and dose assessment modelling in biosphere assessment 2009

    International Nuclear Information System (INIS)

    Hjerpe, T.; Broed, R.

    2010-11-01

    Following the guidelines set forth by the Ministry of Trade and Industry (now the Ministry of Employment and the Economy), Posiva is preparing to submit a construction license application for the final disposal of spent nuclear fuel at the Olkiluoto site, Finland, by the end of the year 2012. Disposal will take place in a geological repository implemented according to the KBS-3 method. The long-term safety section supporting the license application will be based on a safety case that, according to the internationally adopted definition, will be a compilation of the evidence, analyses and arguments that quantify and substantiate the safety and the level of expert confidence in the safety of the planned repository. This report documents in detail the conceptual and mathematical models and key data used in the landscape model set-up, radionuclide transport modelling, and radiological consequences analysis applied in the 2009 biosphere assessment. The resulting environmental activity concentrations in the landscape model due to constant unit geosphere release rates, and the corresponding annual doses, are also calculated and presented in this report. This provides the basis for understanding the behaviour of the applied landscape model and the subsequent dose calculations. (orig.)

  10. Models for assessing and managing credit risk

    Directory of Open Access Journals (Sweden)

    Neogradi Slađana

    2014-01-01

    This essay deals with the definition of a model for assessing and managing credit risk. Risk is an inseparable component of any ordinary credit transaction. The essay examines the different aspects of the identification and classification of risk in the banking industry, as well as the key components of modern risk management. The first part analyses the impact of credit risk on banks and empirical models for detecting the financial difficulties in which a company may find itself; on the basis of these models a bank can reduce the number of risky assets it approves. The second part considers models for managing credit risk, with emphasis on Basel I, II and III, and the third part identifies the model that is most appropriate and most effective for measuring credit risk in domestic banks.

  11. Integrated assessment models of global climate change

    International Nuclear Information System (INIS)

    Parson, E.A.; Fisher-Vanden, K.

    1997-01-01

    The authors review recent work in the integrated assessment modeling of global climate change. This field has grown rapidly since 1990. Integrated assessment models seek to combine knowledge from multiple disciplines in formal integrated representations; inform policy-making, structure knowledge, and prioritize key uncertainties; and advance knowledge of broad system linkages and feedbacks, particularly between socio-economic and bio-physical processes. They may combine simplified representations of the socio-economic determinants of greenhouse gas emissions, the atmosphere and oceans, impacts on human activities and ecosystems, and potential policies and responses. The authors summarize current projects, grouping them according to whether they emphasize the dynamics of emissions control and optimal policy-making, uncertainty, or spatial detail. They review the few significant insights that have been claimed from work to date and identify important challenges for integrated assessment modeling in its relationships to disciplinary knowledge and to broader assessment seeking to inform policy- and decision-making. 192 refs., 2 figs

  12. Assessment of galactic cosmic ray models

    Science.gov (United States)

    Mrigakshi, Alankrita Isha; Matthiä, Daniel; Berger, Thomas; Reitz, Günther; Wimmer-Schweingruber, Robert F.

    2012-08-01

    Among several factors involved in the development of a manned space mission concept, the astronauts' health is a major concern that needs to be considered carefully. Galactic cosmic rays (GCRs), which mainly consist of high-energy nuclei ranging from hydrogen to iron and beyond, pose a major radiation health risk in long-term space missions. It is therefore necessary to assess the radiation exposure of astronauts in order to estimate their radiation risks. This can be done either by performing direct measurements or by making computer-based simulations from which the dose can be derived. A necessary prerequisite for an accurate estimation of the exposure using simulations is a reliable description of the GCR spectra. The aim of this work is to compare GCR models and to test their applicability for the exposure assessment of astronauts. To achieve this, commonly used models capable of describing both light and heavy GCR particle spectra were evaluated by investigating the model spectra for various particles over several decades. The updated Badhwar-O'Neill model published in the year 2010, CREME2009 (which uses the International Standard model for GCR), CREME96 and the Burger-Usoskin model were examined. Hydrogen, helium, oxygen and iron nuclei spectra calculated by the different models are compared with measurements from various high-altitude balloon and space-borne experiments. During certain epochs in the last decade, there are large discrepancies between the GCR energy spectra described by the models and the measurements. All the models exhibit weaknesses in describing the increased GCR flux that was observed in 2009-2010.

  13. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating that the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the required level of model abstraction is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best estimate models is examined. It is concluded that a conservative approach for repository performance, based on a limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision.

  14. Empirical generalization assessment of neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1995-01-01

    This paper addresses the assessment of generalization performance of neural network models by use of empirical techniques. We suggest using the cross-validation scheme combined with a resampling technique to obtain an estimate of the generalization performance distribution of a specific model. This enables the formulation of a bulk of new generalization performance measures. Numerical results demonstrate the viability of the approach compared to the standard technique of using algebraic estimates like the FPE. Moreover, we consider the problem of comparing the generalization performance of different competing models. Since all models are trained on the same data, a key issue is to take this dependency into account. The optimal split of the data set of size N into a cross-validation set of size Nγ and a training set of size N(1-γ) is discussed. Asymptotically (large data sets), γopt→1.
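    A minimal sketch of the resampled cross-validation idea, using ordinary least squares on synthetic data as a stand-in for a neural network: repeating the random split yields a whole distribution of generalization estimates rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression task standing in for a neural-network problem
N = 200
X = rng.normal(size=(N, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=N)

def holdout_error(X, y, gamma, rng):
    """Train least squares on N(1-gamma) points, test on the gamma*N held out."""
    idx = rng.permutation(len(y))
    n_val = int(gamma * len(y))
    val, train = idx[:n_val], idx[n_val:]
    w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
    return np.mean((y[val] - X[val] @ w) ** 2)

# Resampling the split gives an empirical distribution of the generalization
# error, from which new performance measures can be formed.
errors = np.array([holdout_error(X, y, gamma=0.2, rng=rng) for _ in range(500)])
print(errors.mean(), errors.std())
```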

  15. Models and phantoms for internal dose assessment

    International Nuclear Information System (INIS)

    Giussani, Augusto

    2015-01-01

    Radiation doses delivered by incorporated radionuclides cannot be directly measured; they are assessed by means of biokinetic and dosimetric models and computational phantoms. For emitters of short-range radiation such as alpha particles or Auger electrons, doses at the organ level, as usually defined in internal dosimetry, are no longer relevant. Modelling inter- and intra-cellular radiation transport and the local patterns of deposition at the molecular or cellular level are the challenging tasks of micro- and nano-dosimetry. Over time, the physiological and anatomical realism of models and phantoms has increased; however, the information required to characterise the greater complexity of recent models is not always available. Uncertainty studies in internal dose assessment provide a valuable contribution here for testing the significance of new dose estimates and of their discrepancies from previous values. Some of the challenges, limitations and future perspectives of the use of models and phantoms in internal dosimetry are discussed in the present manuscript. (authors)

  16. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  17. An assessment model for quality management

    Science.gov (United States)

    Völcker, Chr.; Cass, A.; Dorling, A.; Zilioli, P.; Secchi, P.

    2002-07-01

    SYNSPACE, together with InterSPICE and Alenia Spazio, is developing an assessment method to determine the capability of an organisation in the area of quality management. The method, sponsored by the European Space Agency (ESA), is called S9kS (SPiCE-9000 for SPACE). S9kS is based on ISO 9001:2000 with additions from the quality standards issued by the European Committee for Space Standardization (ECSS) and ISO 15504 - Process Assessment. The result is a reference model that supports the expansion of the generic process assessment framework provided by ISO 15504 to non-software areas. In order to be compliant with ISO 15504, requirements from ISO 9001 and ECSS-Q-20 and Q-20-09 have been turned into process definitions in terms of Purpose and Outcomes, supported by a list of detailed indicators such as Practices, Work Products and Work Product Characteristics. In coordination with this project, the capability dimension of ISO 15504 has been revised to be consistent with ISO 9001. As the contributions from ISO 9001 and the space quality assurance standards are separable, the stripped-down version S9k offers organisations in all industries an assessment model based solely on ISO 9001, and is therefore of interest to all organisations which intend to improve their quality management system based on ISO 9001.

  18. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage over land use. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values up to a factor of 4.5 for tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than a half of the amount predicted by the standard SDC methods.
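    At its core, an SDC model is a depth-to-damage lookup per land-use class, scaled by the exposed asset value. A minimal sketch; the curve values below are invented for illustration, not the calibrated values from the study:

```python
import numpy as np

# Illustrative stage-damage curve for one land-use class: water depth (m)
# versus damage as a fraction of the maximum damage value (invented numbers).
depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
damage_fraction = np.array([0.0, 0.25, 0.45, 0.70, 1.0])

def flood_damage(water_depth_m, exposed_value):
    """Potential direct damage: interpolate the SDC, scale by exposed value."""
    frac = np.interp(water_depth_m, depths, damage_fraction)
    return frac * exposed_value

# e.g. 1.5 m of water over assets worth 1,000,000
print(flood_damage(1.5, 1_000_000))
```

    Calibration against ex-post compensation records amounts to adjusting the curve points (and the maximum damage values) so that predicted damages match observed ones per land-use category.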

  19. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
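
    As a rough illustration of the log-normal idea, the sketch below pools several experts' median estimates and uncertainty factors on the log scale with equal weights. This is a generic pooling sketch under stated assumptions, not the specific classical or Bayesian estimators of the report.

```python
# Equal-weight pooling of expert judgements under a log-normal model.
# Each expert gives a median estimate and a 95% uncertainty factor k
# (expert's 95th percentile = median * k). Both are hypothetical inputs.
import math

def pool_lognormal(medians, k_factors):
    """Pool expert medians/uncertainty factors on the log scale.
    Returns (pooled median, pooled 95% uncertainty factor)."""
    mus = [math.log(m) for m in medians]
    sigmas = [math.log(k) / 1.645 for k in k_factors]  # k = exp(1.645*sigma)
    mu = sum(mus) / len(mus)
    var = sum(s * s for s in sigmas) / len(sigmas) ** 2  # variance of the mean
    return math.exp(mu), math.exp(1.645 * math.sqrt(var))

# Two experts estimate an initiator frequency (per year), each uncertain
# by a factor of 10; the pooled median is the geometric mean.
median, k = pool_lognormal([1e-3, 1e-5], [10.0, 10.0])
print(median, k)
```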

  20. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport including radioactive chain decay are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity.

  1. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport including radioactive chain decay are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity.

  2. Assessment of MARMOT Grain Growth Model

    Energy Technology Data Exchange (ETDEWEB)

    Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Fuel Modeling and Simulation Dept.; Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Fuel Modeling and Simulation Dept.; Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Fuel Modeling and Simulation Dept.; Brown, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pokharel, R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-01

    This report assesses the MARMOT grain growth model by comparing modeling predictions with experimental results from thermal annealing. The purpose here is threefold: (1) to demonstrate the validation approach of using thermal annealing experiments with non-destructive characterization, (2) to test the reconstruction capability and computation efficiency in MOOSE, and (3) to validate the grain growth model and the associated parameters that are implemented in MARMOT for UO2. To assure a rigorous comparison, the 2D and 3D initial experimental microstructures of UO2 samples were characterized using non-destructive Synchrotron x-ray. The same samples were then annealed at 2273K for grain growth, and their initial microstructures were used as initial conditions for simulated annealing at the same temperature using MARMOT. After annealing, the final experimental microstructures were characterized again to compare with the results from simulations. So far, comparison between modeling and experiments has been done for 2D microstructures, and 3D comparison is underway. The preliminary results demonstrated the usefulness of the non-destructive characterization method for MARMOT grain growth model validation. A detailed analysis of the 3D microstructures is in progress to fully validate the current model in MARMOT.

  3. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors from the disposal of low-level waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example.
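
    A toy Monte Carlo propagation in the spirit of the probabilistic PA described above (the actual model is built in GoldSim; all distributions and parameter values below are invented purely for illustration):

```python
# Toy Monte Carlo dose propagation: sample uncertain inputs, propagate
# through a simple dose chain, and summarize the output distribution.
# Every distribution and constant here is a hypothetical placeholder.
import random
import statistics

random.seed(42)

def sample_dose():
    release = random.lognormvariate(0.0, 0.5)   # contaminant release, Bq/yr (scaled)
    dilution = random.uniform(1e-6, 1e-5)       # groundwater dilution, yr/m3
    intake = 600.0                              # receptor water intake, m3/yr
    dcf = 5e-8                                  # dose conversion factor, Sv/Bq
    return release * dilution * intake * dcf

# Each realization is one possible combination of input parameter values.
doses = sorted(sample_dose() for _ in range(10_000))
print("mean dose:", statistics.mean(doses))
print("95th percentile:", doses[int(0.95 * len(doses))])
```

    The statistical summaries (mean, percentiles) are what would be compared against a regulatory performance objective.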

  4. The MESORAD dose assessment model: Computer code

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Bander, T.J.; Scherpelz, R.I.

    1988-10-01

    MESORAD is a dose equivalent model for emergency response applications that is designed to be run on minicomputers. It has been developed by the Pacific Northwest Laboratory for use as part of the Intermediate Dose Assessment System in the US Nuclear Regulatory Commission Operations Center in Washington, DC, and the Emergency Management System in the US Department of Energy Unified Dose Assessment Center in Richland, Washington. This volume describes the MESORAD computer code and contains a listing of the code. The technical basis for MESORAD is described in the first volume of this report (Scherpelz et al. 1986). A third volume of the documentation is planned. That volume will contain utility programs and input and output files that can be used to check the implementation of MESORAD. 18 figs., 4 tabs

  5. Mapping the Extent and Magnitude of Severe Flooding Induced by Hurricanes Harvey, Irma, and Maria with Sentinel-1 SAR and InSAR Observations

    Science.gov (United States)

    Zhang, B.; Koirala, R.; Oliver-Cabrera, T.; Wdowinski, S.; Osmanoglu, B.

    2017-12-01

    Hurricanes can cause winds, rainfall and storm surge, all of which could result in flooding. Between August and September 2017, Hurricanes Harvey, Irma and Maria made landfall over Texas, Florida and Puerto Rico causing destruction and damages. Flood mapping is important for water management and to estimate risks and property damage. Though water gauges are able to monitor water levels, they are normally distributed sparsely. To map flooding products of these extreme events, we use Synthetic Aperture Radar (SAR) observations acquired by the European satellite constellation Sentinel-1. We obtained two acquisitions from before each flooding event, a single acquisition during the hurricane, and two after each event, a total of five acquisitions. We use both amplitude and phase observations to map the extent and magnitude of flooding, respectively. To map flooding extents, we use amplitude images from before, after and, if possible, during the hurricane pass. A calibration is used to convert the image raw data to the backscatter coefficient, termed sigma nought. We generate a composite of the two image layers using red and green bands to show the change of sigma nought between acquisitions, which directly reflects the extent of flooding. Because inundation can result in either an increase or decrease of sigma nought values depending on the surface scattering characteristics, we map flooded areas in locations where sigma nought changes were above a detection threshold. To study the magnitude of flooding we study Interferometric Synthetic Aperture Radar (InSAR) phase changes. Changes in the water level can be detected by the radar when the signal is reflected away from the water surface and bounces off another object (e.g. trees and/or buildings), known as the double bounce phase. To generate meaningful interferograms, we compare phase information with the nearest water gauge records to verify our results.
Preliminary results show that the three hurricanes caused flooding condition over
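
    The amplitude-based flood-extent step described above can be sketched as a per-pixel change detection on sigma nought. The sample values and the 3 dB threshold below are illustrative, not the study's calibration.

```python
# Flag pixels whose backscatter (sigma nought, in dB) changed by more than
# a detection threshold between pre- and post-event acquisitions. Because
# inundation can raise sigma nought (double bounce) or lower it (specular
# reflection off open water), the absolute change is used.
THRESHOLD_DB = 3.0  # illustrative threshold, not the study's value

def flood_mask(sigma0_pre, sigma0_post, threshold_db=THRESHOLD_DB):
    """Return True for pixels exceeding the change-detection threshold."""
    return [abs(post - pre) > threshold_db
            for pre, post in zip(sigma0_pre, sigma0_post)]

pre  = [-12.0, -8.5, -15.0, -9.0]   # hypothetical pre-event dB values
post = [-18.5, -8.0, -11.0, -2.5]   # hypothetical post-event dB values
print(flood_mask(pre, post))  # [True, False, True, True]
```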

  6. Hurricanes Harvey and Irma - High-Resolution Flood Mapping and Monitoring from Sentinel SAR with the Depolarization Reduction Algorithm for Global Observations of InundatioN (DRAGON)

    Science.gov (United States)

    Nghiem, S. V.; Brakenridge, G. R.; Nguyen, D. T.

    2017-12-01

    Hurricane Harvey inflicted historical catastrophic flooding across extensive regions around Houston and southeast Texas after making landfall on 25 August 2017. The Federal Emergency Management Agency (FEMA) requested urgent support for flood mapping and monitoring in an emergency response to the extreme flood situation. An innovative satellite remote sensing method, called the Depolarization Reduction Algorithm for Global Observations of inundatioN (DRAGON), has been developed and implemented for use with Sentinel synthetic aperture radar (SAR) satellite data at a resolution of 10 meters to identify, map, and monitor inundation including pre-existing water bodies and newly flooded areas. Results from this new method are hydrologically consistent and have been verified with known surface waters (e.g., coastal ocean, rivers, lakes, reservoirs, etc.), with clear-sky high-resolution WorldView images (where waves can be seen on surface water in inundated areas within a small spatial coverage), and with other flood maps from the consortium of Global Flood Partnership derived from multiple satellite datasets (including clear-sky Landsat and MODIS at lower resolutions). Figure 1 is a high-resolution (4K UHD) image of a composite inundation map for the region around Rosharon (in Brazoria County, south of Houston, Texas). This composite inundation map reveals extensive flooding on 29 August 2017 (four days after Hurricane Harvey made landfall), and the inundation was still persistent in most of the west and south of Rosharon one week later (5 September 2017) while flooding was reduced in the east of Rosharon. Hurricane Irma brought flooding to a number of areas in Florida. As of 10 September 2017, Sentinel SAR flood maps reveal inundation in the Florida Panhandle and over lowland surfaces on several islands in the Florida Keys. However, Sentinel SAR results indicate that flooding along the Florida coast was not extreme even though Irma was a Category-5 hurricane that might

  7. An Exploratory Study: Assessment of Modeled Dioxin ...

    Science.gov (United States)

    EPA has released an external review draft entitled, An Exploratory Study: Assessment of Modeled Dioxin Exposure in Ceramic Art Studios (External Review Draft). The public comment period and the external peer-review workshop are separate processes that provide opportunities for all interested parties to comment on the document. In addition to consideration by EPA, all public comments submitted in accordance with this notice will also be forwarded to EPA’s contractor for the external peer-review panel prior to the workshop. EPA has released this draft document solely for the purpose of pre-dissemination peer review under applicable information quality guidelines. This document has not been formally disseminated by EPA. It does not represent and should not be construed to represent any Agency policy or determination. The purpose of this report is to describe an exploratory investigation of potential dioxin exposures to artists/hobbyists who use ball clay to make pottery and related products.

  8. Review of early assessment models of innovative medical technologies

    DEFF Research Database (Denmark)

    Fasterholdt, Iben; Krahn, Murray D; Kidholm, Kristian

    2017-01-01

    INTRODUCTION: Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models...... methods assessing cost-effectiveness are most prevalent in early assessment, but seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model.

  9. Teachers' Development Model to Authentic Assessment by Empowerment Evaluation Approach

    Science.gov (United States)

    Charoenchai, Charin; Phuseeorn, Songsak; Phengsawat, Waro

    2015-01-01

    The purposes of this study were 1) to study teachers' authentic assessment, teachers' comprehension of authentic assessment, and teachers' needs for authentic assessment development; 2) to create a teachers' development model; 3) to try out the teachers' development model; and 4) to evaluate the effectiveness of the teachers' development model. The research is divided into 4…

  10. Modelling Beginning Teachers' Assessment Literacy: The Contribution of Training, Self-Efficacy, and Conceptions of Assessment

    Science.gov (United States)

    Levy-Vered, Adi; Alhija, Fadia Nasser-Abu

    2015-01-01

    Teachers devote a substantial amount of their time to assessment-related activities. This study aimed to describe beginning teachers' assessment literacy and to examine a structural model that binds assessment literacy with assessment training, self-efficacy, and conceptions of assessment. Data were collected from 327 Israeli inductee teachers and…

  11. Robust flood area detection using a L-band synthetic aperture radar: Preliminary application for Florida, the U.S. affected by Hurricane Irma

    Science.gov (United States)

    Nagai, H.; Ohki, M.; Abe, T.

    2017-12-01

    Urgent crisis response for a hurricane-induced flood needs urgent provision of a flood map covering a broad region. However, there are no standard threshold values for automatic flood identification from pre- and post-event images obtained by satellite-based synthetic aperture radars (SARs). This problem could hamper prompt data provision for operational use. Furthermore, one pre-flood SAR image does not always represent potential water surfaces and river flows, especially in tropical flatlands that are greatly influenced by the seasonal precipitation cycle. We are, therefore, developing a new method of flood mapping using PALSAR-2, an L-band SAR, which is less affected by temporal surface changes. Specifically, a mean-value image and a standard-deviation image are calculated from a series of pre-flood SAR images. These are combined with a post-flood SAR image to obtain the normalized backscatter amplitude difference (NoBADi), in which the difference between the post-flood image and the mean-value image is divided by the standard-deviation image to emphasize anomalous water extents. Flooding areas are then automatically obtained from the NoBADi images as lower-value pixels, avoiding potential water surfaces. We applied this method to PALSAR-2 images acquired on Sept. 8, 10, and 12, 2017, covering flooding areas in a central region of the Dominican Republic and west Florida, the U.S., affected by Hurricane Irma. The output flooding outlines are validated against flooding areas manually delineated from high-resolution optical satellite images, resulting in higher consistency and less uncertainty than previous methods (i.e., a simple pre- and post-flood difference and pre- and post-event coherence changes). The NoBADi method has a great potential to obtain a reliable flood map for future flood hazards, not hampered by cloud cover, seasonal surface changes, and "casual" thresholds in the flood identification process.
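
    The NoBADi normalization described above reduces to a simple per-pixel score: the difference between the post-flood image and the pre-flood mean, divided by the pre-flood standard deviation, so pixels with large natural variability (e.g. seasonal water) are de-emphasized. The sample values and the -2.0 cutoff below are illustrative choices, not the paper's calibration.

```python
# NoBADi-style normalized backscatter difference: (post - mean) / std,
# computed per pixel from a post-flood image and pre-flood statistics.
# Strongly negative scores indicate anomalously low backscatter (flooding).
def nobadi(post, pre_mean, pre_std, eps=1e-6):
    """Per-pixel normalized difference; eps guards against zero std."""
    return [(p - m) / (s + eps) for p, m, s in zip(post, pre_mean, pre_std)]

def flood_pixels(scores, cutoff=-2.0):
    """Flag pixels whose score falls below the (illustrative) cutoff."""
    return [v < cutoff for v in scores]

post     = [-20.0, -10.0, -14.0]   # hypothetical post-flood dB values
pre_mean = [-10.0, -10.5, -13.0]   # mean of pre-flood image series
pre_std  = [  1.5,   2.0,   3.0]   # std of pre-flood image series
scores = nobadi(post, pre_mean, pre_std)
print(flood_pixels(scores))  # [True, False, False]
```

    The third pixel drops 1 dB but has high natural variability, so it is not flagged; a plain pre/post difference would treat it the same as a stable pixel.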

  12. A Model for Assessing the Liability of Seemingly Correct Software

    Science.gov (United States)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.

  13. Hierarchical Model of Assessing and Selecting Experts

    Science.gov (United States)

    Chernysheva, T. Y.; Korchuganova, M. A.; Borisov, V. V.; Min'kov, S. L.

    2016-04-01

    Revealing experts’ competences is a multi-objective issue. The authors deal with methods of assessing the competence of experts, treated as assessment objects, and with quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, based on paired comparison matrices and scores; quality parameters are taken into account as well. A worked example of calculating and assessing experts is given.
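
    The pairwise-comparison step of an analytic hierarchy process can be sketched as follows: priority weights are approximated here with the row geometric-mean method (a common approximation to the principal eigenvector). The comparison matrix values are illustrative, not from the paper.

```python
# AHP priority weights from a pairwise comparison matrix, approximated
# by normalized row geometric means.
import math

def ahp_weights(matrix):
    """Normalized row geometric means of a square comparison matrix."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgements: expert A is rated 3x as competent as B and
# 5x as competent as C; B is rated 2x as competent as C.
M = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 0.5, 1.0]]
print(ahp_weights(M))  # weights sum to 1; expert A ranks highest
```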

  14. Hierarchical Model of Assessing and Selecting Experts

    OpenAIRE

    Chernysheva, Tatiana Yurievna; Korchuganova, Mariya Anatolievna; Borisov, V. V.; Minkov, S. L.

    2016-01-01

    Revealing experts' competences is a multi-objective issue. The authors deal with methods of assessing the competence of experts, treated as assessment objects, and with quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, based on paired comparison matrices and scores; quality parameters are taken into account as well. A worked example of calculating and assessing experts is given.

  15. The Development of a Secondary School Health Assessment Model

    Science.gov (United States)

    Sriring, Srinual; Erawan, Prawit; Sriwarom, Monoon

    2015-01-01

    The objectives of this research were to: 1) survey information relating to secondary school health; 2) construct a model of health assessment and a handbook for using the model in secondary schools; and 3) develop an assessment model for secondary schools. The research included 3 phases: (1) involved a survey of…

  16. FORMATIVE ASSESSMENT MODEL OF LEARNING SUCCESS ACHIEVEMENTS

    Directory of Open Access Journals (Sweden)

    Mikhailova Elena Konstantinovna

    2013-05-01

    Full Text Available The paper is devoted to the problem of assessment of the school students’ learning success achievements. The problem is investigated from the viewpoint of assessing the students’ learning outcomes that is aimed to ensure the teachers and students with the means and conditions to improve the educational process and results.

  17. A comparison of models for risk assessment

    International Nuclear Information System (INIS)

    Kellerer, A.M.; Jing Chen

    1993-01-01

    Various mathematical models have been used to represent the dependence of excess cancer risk on dose, age and time since exposure. For solid cancers, i.e. all cancers except leukaemia, the so-called relative risk model is usually employed. However, there can be quite different relative risk models. The most usual model for the quantification of excess tumour rate among the atomic bomb survivors has been a dependence of the relative risk on age at exposure, but it has been shown recently that an age attained model can be equally applied to represent the observations among the atomic bomb survivors. The differences between the models and their implications are explained. It is also shown that the age attained model is similar to the approaches that have been used in the analysis of lung cancer incidence among radon exposed miners. A more unified approach to modelling of radiation risks can thus be achieved. (3 figs.)

  18. Assessment of multi class kinematic wave models

    NARCIS (Netherlands)

    Van Wageningen-Kessels, F.L.M.; Van Lint, J.W.C.; Vuik, C.; Hoogendoorn, S.P.

    2012-01-01

    In the last decade many multi class kinematic wave (MCKW) traffic flow models have been proposed. MCKW models introduce heterogeneity among vehicles and drivers. For example, they take into account differences in (maximum) velocities and driving style. Nevertheless, the models are macroscopic and the

  19. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem of assessing information security risks in ERP systems. ERP-system functions and architecture are studied. A model of malicious impacts on the levels of the ERP-system architecture is composed. A model-based risk assessment method, combining quantitative and qualitative approaches to risk assessment, is developed; it is built on a partial unification of three methods for studying information security risks: security models with full overlapping, the CRAMM technique, and the FRAP technique.

  20. Accuracy assessment of landslide prediction models

    International Nuclear Information System (INIS)

    Othman, A N; Mohd, W M N W; Noraini, S

    2014-01-01

    The increasing population and expansion of settlements over hilly areas has greatly increased the impact of natural disasters such as landslides. Therefore, it is important to develop models which could accurately predict landslide hazard zones. Over the years, various techniques and models have been developed to predict landslide hazard zones. The aim of this paper is to assess the accuracy of landslide prediction models developed by the authors. The methodology involved the selection of the study area, data acquisition, data processing, model development and data analysis. The development of these models is based on nine different landslide inducing parameters i.e. slope, land use, lithology, soil properties, geomorphology, flow accumulation, aspect, proximity to river and proximity to road. Rank sum, rating, pairwise comparison and AHP techniques are used to determine the weights for each of the parameters used. Four (4) different models which consider different parameter combinations are developed by the authors. Results obtained are compared to landslide history and the accuracies for Model 1, Model 2, Model 3 and Model 4 are 66.7%, 66.7%, 60% and 22.9% respectively. From the results, rank sum, rating and pairwise comparison can be useful techniques to predict landslide hazard zones.

  1. Proposing an Environmental Excellence Self-Assessment Model

    DEFF Research Database (Denmark)

    Meulengracht Jensen, Peter; Johansen, John; Wæhrens, Brian Vejrum

    2013-01-01

    This paper presents an Environmental Excellence Self-Assessment (EEA) model based on the structure of the European Foundation of Quality Management Business Excellence Framework. Four theoretical scenarios for deploying the model are presented as well as managerial implications, suggesting...

  2. The housing market: modeling and assessment methods

    Directory of Open Access Journals (Sweden)

    Zapadnjuk Evgenij Aleksandrovich

    2016-10-01

    Full Text Available This paper analyzes the theoretical foundations of an econometric simulation model that can be used to study the housing sector. It shows methods for the practical use of correlation and regression models in analyzing the status and development prospects of the housing market.

  3. Attention modeling for video quality assessment

    DEFF Research Database (Denmark)

    You, Junyong; Korhonen, Jari; Perkis, Andrew

    2010-01-01

    averaged spatiotemporal pooling. The local quality is derived from visual attention modeling and quality variations over frames. Saliency, motion, and contrast information are taken into account in modeling visual attention, which is then integrated into IQMs to calculate the local quality of a video frame...

  4. Assessing a Theoretical Model on EFL College Students

    Science.gov (United States)

    Chang, Yu-Ping

    2011-01-01

    This study aimed to (1) integrate relevant language learning models and theories, (2) construct a theoretical model of college students' English learning performance, and (3) assess the model fit between empirically observed data and the theoretical model proposed by the researchers of this study. Subjects of this study were 1,129 Taiwanese EFL…

  5. Assessing Model Characterization of Single Source ...

    Science.gov (United States)

    Aircraft measurements made downwind from specific coal fired power plants during the 2013 Southeast Nexus field campaign provide a unique opportunity to evaluate single source photochemical model predictions of both O3 and secondary PM2.5 species. The model did well at predicting downwind plume placement. The model shows similar patterns of an increasing fraction of PM2.5 sulfate ion to the sum of SO2 and PM2.5 sulfate ion by distance from the source compared with ambient based estimates. The model was less consistent in capturing downwind ambient based trends in conversion of NOX to NOY from these sources. Source sensitivity approaches capture near-source O3 titration by fresh NO emissions, in particular subgrid plume treatment. However, capturing this near-source chemical feature did not translate into better downwind peak estimates of single source O3 impacts. The model estimated O3 production from these sources but often was lower than ambient based source production. The downwind transect ambient measurements, in particular secondary PM2.5 and O3, have some level of contribution from other sources which makes direct comparison with model source contribution challenging. Model source attribution results suggest contribution to secondary pollutants from multiple sources even where primary pollutants indicate the presence of a single source. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, deci

  6. Automatic Assessment of 3D Modeling Exams

    Science.gov (United States)

    Sanna, A.; Lamberti, F.; Paravati, G.; Demartini, C.

    2012-01-01

    Computer-based assessment of exams provides teachers and students with two main benefits: fairness and effectiveness in the evaluation process. This paper proposes a fully automatic evaluation tool for the Graphic and Virtual Design (GVD) curriculum at the First School of Architecture of the Politecnico di Torino, Italy. In particular, the tool is…

  7. Auditory modelling for assessing room acoustics

    NARCIS (Netherlands)

    Van Dorp Schuitman, J.

    2011-01-01

    The acoustics of a concert hall, or any other room, are generally assessed by measuring room impulse responses for one or multiple source and receiver location(s). From these responses, objective parameters can be determined that should be related to various perceptual attributes of room acoustics.

  8. Structure ignition assessment model (SIAM)

    Science.gov (United States)

    Jack D. Cohen

    1995-01-01

    Major wildland/urban interface fire losses, principally residences, continue to occur. Although the problem is not new, the specific mechanisms are not well known on how structures ignite in association with wildland fires. In response to the need for a better understanding of wildland/urban interface ignition mechanisms and a method of assessing the ignition risk,...

  9. Highly Integrated Model Assessment Technology and Tools

    Science.gov (United States)

    Pirnay-Dummer, Pablo; Ifenthaler, Dirk; Spector, J. Michael

    2010-01-01

    Effective and efficient measurement of the development of skill and knowledge, especially in domains of human activity that involve complex and challenging problems, is important with regard to workplace and academic performance. However, there has been little progress in the area of practical measurement and assessment, due in part to the lack of…

  10. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    9 km (m2), and 3 km (m3) will be evaluated over D1 (o1), D2 (o2), and D3 (o3), respectively. The goal would be to assess and calculate error...consistent domain. For RDA, the innermost domain masking files are needed and should be placed in the same directory as the Point-Stat configuration

  11. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard

    2012-01-01

    Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore in 2009...

  12. Wall Paint Exposure Assessment Model (WPEM)

    Science.gov (United States)

    WPEM uses mathematical models developed from small chamber data to estimate the emissions of chemicals from oil-based (alkyd) and latex wall paint which is then combined with detailed use, workload and occupancy data to estimate user exposure.

  13. New Diagnostics to Assess Model Performance

    Science.gov (United States)

    Koh, Tieh-Yong

    2013-04-01

    The comparison of model performance between the tropics and the mid-latitudes is particularly problematic for observables like temperature and humidity: in the tropics, these observables have little variation and so may give an apparent impression that model predictions are often close to observations; on the contrary, they vary widely in mid-latitudes and so the discrepancy between model predictions and observations might be unnecessarily over-emphasized. We have developed a suite of mathematically rigorous diagnostics that measures normalized errors accounting for the observed and modeled variability of the observables themselves. Another issue in evaluating model performance is the relative importance of getting the variance of an observable right versus getting the modeled variation to be in phase with the observed. The correlation-similarity diagram was designed to analyse the pattern error of a model by breaking it down into contributions from amplitude and phase errors. A final and important question pertains to the generalization of scalar diagnostics to analyse vector observables like wind. In particular, measures of variance and correlation must be properly derived to avoid the mistake of ignoring the covariance between north-south and east-west winds (hence wrongly assuming that the north-south and east-west directions form a privileged vector basis for error analysis). There is also a need to quantify systematic preferences in the direction of vector wind errors, which we make possible by means of an error anisotropy diagram. Although the suite of diagnostics is mentioned with reference to model verification here, it is generally applicable to quantify differences between two datasets (e.g. from two observation platforms). Reference publication: Koh, T. Y. et al. (2012), J. Geophys. Res., 117, D13109, doi:10.1029/2011JD017103. also available at http://www.ntu.edu.sg/home/kohty
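The amplitude/phase breakdown behind the correlation-similarity diagram can be illustrated with the standard centred pattern-error identity. This is a minimal sketch: the function name and normalization are ours, not the paper's exact diagnostics.

```python
import numpy as np

def pattern_error_decomposition(model, obs):
    """Split the centred pattern error into amplitude and phase
    contributions using the identity
    E'^2 = (sigma_m - sigma_o)^2 + 2*sigma_m*sigma_o*(1 - r)."""
    m = np.asarray(model, float) - np.mean(model)
    o = np.asarray(obs, float) - np.mean(obs)
    sm, so = m.std(), o.std()
    r = np.corrcoef(m, o)[0, 1]
    e2 = np.mean((m - o) ** 2)      # centred pattern error squared
    amplitude = (sm - so) ** 2      # variance (amplitude) mismatch
    phase = 2 * sm * so * (1 - r)   # de-phasing contribution
    return e2, amplitude, phase
```

A model can thus score a large pattern error either by getting the variance wrong (amplitude term) or by being out of phase with the observations (phase term), which is exactly the distinction the diagram is designed to expose.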

  14. Assessment of Energy Efficient and Model Based Control

    Science.gov (United States)

    2017-06-15

    ARL-TR-8042 ● JUNE 2017 ● US Army Research Laboratory. Assessment of Energy-Efficient and Model-Based Control, by Craig Lennon.

  15. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  16. Assessing The Performance of Hydrological Models

    Science.gov (United States)

    van der Knijff, Johan

    The performance of hydrological models is often characterized using the coefficient of efficiency, E. The sensitivity of E to extreme streamflow values, and the difficulty of deciding what value of E should be used as a threshold to identify 'good' models or model parameterizations, have proven to be serious shortcomings of this index. This paper reviews some alternative performance indices that have appeared in the literature. Legates and McCabe (1999) suggested a more generalized form of E, E'(j,B). Here, j is a parameter that controls how much emphasis is put on extreme streamflow values, and B defines a benchmark or 'null hypothesis' against which the results of the model are tested. E'(j,B) was used to evaluate a large number of parameterizations of a conceptual rainfall-runoff model, using 6 different combinations of j and B. First, the effect of j and B is explained. Second, it is demonstrated how the index can be used to explicitly test hypotheses about the model and the data. This approach appears to be particularly attractive if the index is used as a likelihood measure within a GLUE-type analysis.
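As a concrete reference for the indices discussed above, a minimal sketch of the generalized coefficient of efficiency; the function name and default benchmark are illustrative, and the form follows the Legates-McCabe generalization with exponent j and benchmark B.

```python
import numpy as np

def generalized_efficiency(obs, sim, j=2.0, benchmark=None):
    """Generalized coefficient of efficiency,
    E'(j, B) = 1 - sum(|O - P|^j) / sum(|O - B|^j).
    With j = 2 and B = mean(obs) this reduces to the classic
    coefficient of efficiency E."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    if benchmark is None:                       # default null model: mean flow
        benchmark = np.full_like(obs, obs.mean())
    num = np.sum(np.abs(obs - sim) ** j)
    den = np.sum(np.abs(obs - np.asarray(benchmark, float)) ** j)
    return 1.0 - num / den
```

Lowering j (e.g. j = 1) reduces the emphasis on extreme streamflow values, while passing, say, seasonal mean flows as the benchmark tests the model against a stronger null hypothesis than the overall mean.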

  17. High Resolution Satellite Data reveals Massive Export of Carbon and Nitrogen-Rich Seagrass Wrack from Greater Florida Bay to the Open Ocean after Hurricane Irma

    Science.gov (United States)

    Dierssen, H. M.; Hedley, J. D.; Russell, B. J.; Vaudrey, J. M.; Perry, R. A.

    2017-12-01

    Episodic storms are known to be important drivers of ocean ecosystem processes, but the impacts are notoriously difficult to quantify with traditional sampling techniques. Here, we use stunning high spatial resolution satellite imagery from Sentinel 2A collected 13 September 2017, only days after Hurricane Irma passed directly over the Florida Keys, to quantify massive amounts of floating vegetative material. The Category 4 storm brought wind gusts over 35 m s^-1 and created turbulence in the water column that scoured the seafloor. The imagery reveals an initial estimate of 40 km^2 of surface drifting material. Although the identity of the brown material cannot be fully determined without a hyperspectral sensor, the accumulations are consistent with our past research showing large aggregations of seagrass leaves or "wrack" advected under high winds from dense beds of Syringodium filiforme within Greater Florida Bay to the oceanic waters of the Atlantic. Using measurements of wrack collected from this area, we estimate that this single event corresponds to a total export of 9.7 x 10^10 gC and 2.7 x 10^9 gN from the seagrass beds. This high amount of export is not considered typical for many types of tropical seagrass meadows, which are thought to largely recycle nutrients within the beds. Elemental analysis of seagrass leaves from Greater Florida Bay is consistent with nitrogen fixation in the beds, which could provide the means to sustain a large export of nitrogen from the meadows. As the wrack travels at the sea surface, some of these nutrients are exuded into the surrounding waters, providing a subsidy of dissolved and particulate carbon and nitrogen and making the wrack an ecological hot spot for organisms. Although wrack can potentially remain floating for months, its ultimate fate is either to wash ashore, providing connectivity between marine and terrestrial ecosystems, or to sink to the seafloor. If most
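The reported totals imply simple per-area figures that can be checked by arithmetic. The densities computed below are derived from the abstract's numbers, not independently reported values.

```python
# Back-of-envelope consistency check of the reported export totals.
area_m2 = 40e6           # ~40 km^2 of floating wrack, in m^2
total_C_g = 9.7e10       # reported carbon export (gC)
total_N_g = 2.7e9        # reported nitrogen export (gN)

c_per_m2 = total_C_g / area_m2         # implied gC per m^2 of wrack
n_per_m2 = total_N_g / area_m2         # implied gN per m^2 of wrack
cn_mass_ratio = total_C_g / total_N_g  # implied C:N ratio by mass

print(f"{c_per_m2:.0f} gC/m^2, {n_per_m2:.1f} gN/m^2, C:N (mass) ~ {cn_mass_ratio:.0f}")
```

The implied areal density of roughly 2.4 kgC per square metre of drifting wrack gives a sense of how thick the aggregations must be for the reported totals to hold.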

  18. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible
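The stochastic procedure recommended above, propagating uncertain parameter estimates into a distribution of predicted values and ranking parameters by their contribution to the overall uncertainty, can be sketched with a toy multiplicative dose model. The parameter names and lognormal spreads are invented for illustration and are not taken from any actual assessment.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy dose model: dose = intake * transfer * dose_factor,
# with all three parameters uncertain (illustrative spreads).
n = 10_000
params = {
    "intake":      rng.lognormal(np.log(100.0), 0.3, n),
    "transfer":    rng.lognormal(np.log(0.01),  0.6, n),
    "dose_factor": rng.lognormal(np.log(5e-5),  0.2, n),
}
dose = params["intake"] * params["transfer"] * params["dose_factor"]

# A distribution of predicted values rather than a single point estimate.
p05, p50, p95 = np.percentile(dose, [5, 50, 95])

# Rank parameters by contribution to predicted uncertainty: for a
# multiplicative model, the correlation of log-parameter with
# log-output reflects each parameter's share of the spread.
contrib = {k: np.corrcoef(np.log(v), np.log(dose))[0, 1]
           for k, v in params.items()}
ranked = sorted(contrib, key=contrib.get, reverse=True)
```

Here the transfer factor, given the widest spread, dominates the output uncertainty, which is exactly the kind of ranking the abstract describes for directing further data collection.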

  19. A test method for verifying system survival-probability assessment models

    International Nuclear Information System (INIS)

    Jia Rui; Wu Qiang; Fu Jiwei; Cao Leituan; Zhang Junnan

    2014-01-01

    Because of limitations in funding and test conditions, large complex systems can often be tested with only a small number of sub-samples. Under single-sample conditions, making an accurate evaluation of performance is important for the reinforcement of complex systems. If the assessment model itself can be experimentally validated, the technical maturity of the assessment can be significantly improved. This paper presents a test method for verifying a system survival-probability assessment model: using sample test results from the test system, the method verifies the correctness of the assessment model and of the a priori information. (authors)

  20. A normative model for assessing competitive strategy

    OpenAIRE

    Ungerer, Gerard David; Cayzer, Steve

    2016-01-01

    The hyper-competitive nature of e-business has raised the need for a generic way to appraise the merit of a developed business strategy. Although progress has been made in the domain of strategy evaluation, the established literature differs over the ‘tests’ that a strategy must pass to be considered well-constructed. This paper therefore investigates the existing strategy-evaluation literature to propose a more integrated and comprehensive normative strategic assessment that can be used to e...

  1. Description and comparison of energy impact assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Burnett, R.A.; Fraley, D.W.

    1977-04-01

    During the past few years, the need for more comprehensive analytical techniques for assessing the environmental, economic, and social impacts of energy supply-demand systems and related public policy-making activities has increased. The research and academic communities have responded to this need by developing a wide range of models and other analytical tools for energy impact estimation. The models generally fall into two categories: large-scale and specialized. This report examines the general features and shortcomings of current large-scale and specialized modeling efforts from the point of view of energy impact assessment. Characteristics deemed desirable in large-scale energy-impact-assessment models and related studies are discussed. An outline of criteria for describing and comparing such models is presented, and using these criteria, seven large-scale energy models and one impact-assessment study are described and compared in considerable detail. Tables are also presented that summarize the results of the categorizations.

  2. Persistence Modeling for Assessing Marketing Strategy Performance

    NARCIS (Netherlands)

    M.G. Dekimpe (Marnik); D.M. Hanssens (Dominique)

    2003-01-01

    The question of long-run market response lies at the heart of any marketing strategy that tries to create a sustainable competitive advantage for the firm or brand. A key challenge, however, is that only short-run results of marketing actions are readily observable. Persistence modeling

  3. SOIL QUALITY ASSESSMENT USING FUZZY MODELING

    Science.gov (United States)

    Maintaining soil productivity is essential if agriculture production systems are to be sustainable, thus soil quality is an essential issue. However, there is a paucity of tools for measurement for the purpose of understanding changes in soil quality. Here the possibility of using fuzzy modeling t...

  4. Assessment of Response Surface Models using Independent Confirmation Point Analysis

    Science.gov (United States)

    DeLoach, Richard

    2010-01-01

    This paper highlights various advantages that confirmation-point residuals have over conventional model design-point residuals in assessing the adequacy of a response surface model fitted by regression techniques to a sample of experimental data. Particular advantages are highlighted for the case of design matrices that may be ill-conditioned for a given sample of data. The impact of both aleatory and epistemological uncertainty in response model adequacy assessments is considered.
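The idea of confirmation-point residuals can be illustrated with a minimal regression sketch: fit a response surface to design-point data, then judge adequacy with residuals at independent points that played no role in the fit. The quadratic response, noise level, and point counts below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical true response of the system under test.
def true_response(x):
    return 1.0 + 2.0 * x - 0.5 * x ** 2

# Fit a quadratic response surface to noisy design-point data.
x_design = np.linspace(-1.0, 1.0, 9)
y_design = true_response(x_design) + rng.normal(0.0, 0.05, x_design.size)
coeffs = np.polyfit(x_design, y_design, deg=2)     # least-squares fit

# Residuals at held-out confirmation points (not used in the fit).
x_conf = rng.uniform(-1.0, 1.0, 5)
y_conf = true_response(x_conf) + rng.normal(0.0, 0.05, x_conf.size)
resid_conf = y_conf - np.polyval(coeffs, x_conf)

# Confirmation residuals comparable to the replication noise suggest
# an adequate surface; systematically larger ones suggest lack of fit.
rms_conf = np.sqrt(np.mean(resid_conf ** 2))
```

Because the confirmation points are independent of the design matrix, their residuals remain informative even when the design matrix is ill-conditioned, which is the advantage the paper highlights.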

  5. A modelling framework for MSP-oriented cumulative effects assessment

    OpenAIRE

    Stefano Menegon; Daniel Depellegrin; Giulio Farella; Elena Gissi; Michol Ghezzo; Alessandro Sarretta; Chiara Venier; Andrea Barbanti

    2018-01-01

    This research presents a comprehensive Cumulative Effects Assessment (CEA) based on the Tools4MSP modelling framework, tested for the Italian Adriatic Sea. The CEA incorporates five methodological advancements: (1) linear and non-linear ecosystem response to anthropogenic pressures/effects, (2) modelling of additive, dominant and antagonist stressor effects, (3) implementation of a convolution distance model for stressor dispersion modelling, (4) application of a CEA backsourcing (CEA-B) model to ...

  6. A normative model for assessing competitive strategy

    Directory of Open Access Journals (Sweden)

    Ungerer, Gerard David

    2016-12-01

    Full Text Available The hyper-competitive nature of e-business has raised the need for a generic way to appraise the merit of a developed business strategy. Although progress has been made in the domain of strategy evaluation, the established literature differs over the ‘tests’ that a strategy must pass to be considered well-constructed. This paper therefore investigates the existing strategy-evaluation literature to propose a more integrated and comprehensive normative strategic assessment that can be used to evaluate and refine a business's competitive strategy, adding to its robustness and survivability.

  7. Mathematical Models for Camouflage Pattern Assessment

    Science.gov (United States)

    2013-04-01

    University of Chile, Centro de Modelamiento Matemático, Facultad de Ciencias Físicas y Matemáticas, http://www.cmm.uchile.cl DISTRIBUTION A: Distribution approved for public release. Final Report: Camouflage Assessment, January 2013. Abstract: The main...mathematical details are to be found in Appendix B, along with summaries of some state-of-the-art work involving non-local segmentation considering the

  8. Defining assessment projects and scenarios for policy support: Use of ontology in Integrated Assessment Modelling

    NARCIS (Netherlands)

    Janssen, S.; Ewert, F.; Hongtao, Li; Anthanasiadis, I.N.; Wien, J.J.F.; Therond, O.; Knapen, M.J.R.; Bezlepkina, I.; Alkan-Olsson, J.; Rizzoli, A.E.; Belhouchette, H.; Svensson, M.; Ittersum, van M.K.

    2009-01-01

    Integrated Assessment and Modelling (IAM) provides an interdisciplinary approach to support ex-ante decision-making by combining quantitative models representing different systems and scales into a framework for integrated assessment. Scenarios in IAM are developed in the interaction between

  9. National Built Environment Health Impact Assessment Model ...

    Science.gov (United States)

    Behavioral (activity, diet, social interaction) and exposure (air pollution, traffic injury, and noise) related health impacts of land use and transportation investment decisions are becoming better understood and quantified. Research has shown relationships between density, mix, street connectivity, access to parks, shops, transit, presence of sidewalks and bikeways, and healthy food with physical activity, obesity, cardiovascular disease, type II diabetes, and some mental health outcomes. This session demonstrates successful integration of health impact assessment into multiple scenario planning tool platforms. Detailed evidence on chronic disease and related costs associated with contrasting land use and transportation investments is built into a general-purpose module that can be accessed by multiple platforms. Funders, researchers, and end users of the tool will present a detailed description of the key elements of the approach, how it has been applied, and how it will evolve. A critical focus will be placed on the equity and social justice considerations inherent in the assessment of health disparities featured in the session. Health impacts of community design have significant cost-benefit implications. Recent research is now extending relationships between community design features and chronic disease to health care costs. This session will demonstrate the recent application of this evidence on health impacts to the newly adopted Los Angeles Regional Transpo

  10. Criteria Assessment Model for Sustainable Product Development

    Science.gov (United States)

    Mohd Turan, Faiz; Johan, Kartina; Hisyamudin Muhd Nor, Nik

    2016-11-01

    The instability in today's market and the ever-increasing and emerging demands by customers for mass-customized and hybrid products are driving companies and decision makers to seek cost-effective and time-efficient improvements in their product development process. Design concept evaluation, which concludes the conceptual design phase, is one of the most critical decision points in product development. It relates to the final success of product development, because poor criteria assessment in design concept evaluation can rarely be compensated for at later stages. This has led to real pressure for the adaptation of new developmental architecture and operational parameters to remain competitive in the market. In this paper, a new integrated design concept evaluation based on the fuzzy technique for order preference by similarity to ideal solution (Fuzzy-TOPSIS) is presented, and it also attempts to incorporate sustainability practices in assessing the criteria. Prior to Fuzzy-TOPSIS, a new scale of “Weighting criteria” for the survey process is developed to quantify the evaluation criteria. This method will help engineers improve the effectiveness and objectivity of sustainable product development. A case example from industry is presented to demonstrate the efficacy of the proposed methodology. The result of the example shows that the new integrated method provides an alternative to existing methods of design concept evaluation.
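For reference, the crisp TOPSIS procedure underlying the paper's fuzzy variant can be sketched as follows. The scores and weights are illustrative; the fuzzy version replaces crisp ratings with fuzzy numbers but follows the same normalize-weight-distance steps.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS ranking sketch. Rows are design concepts,
    columns are evaluation criteria; `benefit` marks criteria
    where larger values are better."""
    X = np.asarray(matrix, float)
    w = np.asarray(weights, float) / np.sum(weights)
    V = w * X / np.linalg.norm(X, axis=0)       # weighted, vector-normalized
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)              # closeness; higher is better

# Three concepts scored on cost (lower better) and two benefit criteria.
scores = topsis([[3.0, 8.0, 7.0],
                 [5.0, 9.0, 6.0],
                 [4.0, 6.0, 8.0]],
                weights=[0.4, 0.35, 0.25],
                benefit=[False, True, True])
```

The closeness coefficient lies in [0, 1] and equals 1 only for a concept that coincides with the ideal solution, which is what makes it a convenient single score for ranking design concepts.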

  11. Sustainability Assessment Model in Product Development

    Science.gov (United States)

    Turan, Faiz Mohd; Johan, Kartina; Nor, Nik Hisyamudin Muhd; Omar, Badrul

    2017-08-01

    Faster and more efficient development of innovative and sustainable products has become the focus for manufacturing companies seeking to remain competitive in today's technologically driven world. Design concept evaluation, which concludes the conceptual design phase, is one of the most critical decision points. It relates to the final success of product development, because poor criteria assessment in design concept evaluation can rarely be compensated for at later stages. Furthermore, consumers, investors, shareholders and even competitors base their decisions on what to buy or invest in, from whom, and on what companies report, and sustainability is a critical component of this. In this research, a new methodology for sustainability assessment in product development for Malaysian industry has been developed using an integration of green project management, a new scale of “Weighting criteria” and Rough-Grey Analysis. This method will help design engineers improve the effectiveness and objectivity of sustainable design concept evaluation, enable them to make better-informed decisions before finalising their choice and consequently create value for the company or industry. The new framework is expected to provide an alternative to existing methods.

  12. ITER plasma safety interface models and assessments

    International Nuclear Information System (INIS)

    Uckan, N.A.; Bartels, H-W.; Honda, T.; Amano, T.; Boucher, D.; Post, D.; Wesley, J.

    1996-01-01

    Physics models and requirements to be used as a basis for safety analysis studies are developed and physics results motivated by safety considerations are presented for the ITER design. Physics specifications are provided for enveloping plasma dynamic events for Category I (operational event), Category II (likely event), and Category III (unlikely event). A safety analysis code SAFALY has been developed to investigate plasma anomaly events. The plasma response to ex-vessel component failure and machine response to plasma transients are considered

  13. Beyond citation analysis: a model for assessment of research impact.

    Science.gov (United States)

    Sarli, Cathy C; Dubinsky, Ellen K; Holmes, Kristi L

    2010-01-01

    Is there a means of assessing research impact beyond citation analysis? The case study took place at the Washington University School of Medicine Becker Medical Library. This case study analyzed the research study process to identify indicators beyond citation count that demonstrate research impact. The authors discovered a number of indicators that can be documented for assessment of research impact, as well as resources to locate evidence of impact. As a result of the project, the authors developed a model for assessment of research impact, the Becker Medical Library Model for Assessment of Research. Assessment of research impact using traditional citation analysis alone is not a sufficient tool for assessing the impact of research findings, and it is not predictive of subsequent clinical applications resulting in meaningful health outcomes. The Becker Model can be used by both researchers and librarians to document research impact to supplement citation analysis.

  14. Interactive Rapid Dose Assessment Model (IRDAM): user's guide

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This User's Guide provides instruction in the setup and operation of the equipment necessary to run IRDAM. Instructions are also given on how to load the magnetic disks and access the interactive part of the program. Two other companion volumes to this one provide additional information on IRDAM. Reactor Accident Assessment Methods (NUREG/CR-3012, Volume 2) describes the technical bases for IRDAM including methods, models and assumptions used in calculations. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios

  15. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The objective of this paper is to give an overall description of the author's PhD study concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  16. Evaluation of habitat suitability index models for assessing biotic resources

    Science.gov (United States)

    John C. Rennie; Joseph D. Clark; James M. Sweeney

    2000-01-01

    Existing habitat suitability index (HSI) models are evaluated for assessing the biotic resources on Champion International Corporation (CIC) lands with data from a standard and an expanded timber inventory. Forty HSI models for 34 species that occur in the Southern Appalachians have been identified from the literature. All of the variables for 14 models are provided (...

  17. A Multiregional Impact Assessment Model for disaster analysis

    NARCIS (Netherlands)

    Koks, E.E.; Thissen, M.

    2016-01-01

    This paper presents a recursive dynamic multiregional supply-use model, combining linear programming and input–output (I–O) modeling to assess the economy-wide consequences of a natural disaster on a pan-European scale. It is a supply-use model which considers production technologies and allows for

  18. A Risk Assessment Model for Campylobacter in Broiler Meat

    NARCIS (Netherlands)

    Nauta, M.J.; Jacobs-Reitsma, W.F.; Havelaar, A.H.

    2007-01-01

    A quantitative microbiological risk assessment model describes the transmission of Campylobacter through the broiler meat production chain and at home, from entering the processing plant until consumption of a chicken breast fillet meal. The exposure model is linked to a dose-response model to allow

  19. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    Science.gov (United States)

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  20. Thermal ecological risk assessment - methodology for modeling

    International Nuclear Information System (INIS)

    Markandeya, S.G.

    2007-01-01

    Discharge of hot effluents into natural water bodies poses a potential risk to aquatic life. The stipulations imposed by the MoEF, Government of India for protecting the environment are in place. However, due to a lack of quality scientific information, these stipulations are generally conservative in nature and hence questionable. A Coordinated Research Project on Thermal Ecological Studies, successfully completed recently, suggested implementing a multi-factorially estimated mixing zone concept. In the present paper, a risk-based assessment methodology is proposed as an alternative approach. The methodology is presented only conceptually and briefly, and further refinement may be necessary. It would make it possible to account in a suitable manner for variations in plant operational conditions, climatic conditions, and the geographical and hydraulic characteristics of the water body. (author)

  1. Road Assessment Model and Pilot Application in China

    Directory of Open Access Journals (Sweden)

    Tiejun Zhang

    2014-01-01

    Full Text Available Risk assessment of roads is an effective approach for road agencies to determine safety improvement investments. It can increase the cost-effective returns in crash and injury reductions. To develop a powerful Chinese risk assessment model, the Research Institute of Highway (RIOH) is developing the China Road Assessment Programme (ChinaRAP) model to characterize traffic crashes in China, in partnership with the International Road Assessment Programme (iRAP). The ChinaRAP model is based upon RIOH's achievements and the iRAP models. This paper documents part of ChinaRAP's research work, mainly including the RIOH model and its pilot application in a province in China.

  2. Modelling fog in probabilistic consequence assessment

    International Nuclear Information System (INIS)

    Underwood, B.Y.

    1993-02-01

    Earlier work examined the potential influence of foggy weather conditions on the probabilistic assessment of the consequences of accidental releases of radioactive material to the atmosphere (PCA), in particular the impact of a fraction of the released aerosol becoming incorporated into droplets. A major uncertainty emerging from the initial scoping study concerned estimation of the fraction of the released material that would be taken up into droplets. An objective is to construct a method for handling in a PCA context the effect of fog on deposition, basing the method on the experience gained from prior investigations. There are two aspects to explicitly including the effect of fog in PCA: estimating the probability of occurrence of various types of foggy condition and calculating the impact on the conventional end-points of consequence assessment. For the first, a brief outline is given of the use of meteorological data by PCA computer codes, followed by a discussion of some routinely-recorded meteorological parameters that are pertinent to fog, such as the present-weather code and horizontal visibility. Four stylized scenarios are defined to cover a wide range of situations in which particle growth by uptake of water may have an important impact on deposition. A description is then given of the way in which routine meteorological data could be used to flag the presence of each of these conditions in the meteorological data file used by the PCA code. The approach developed to calculate the impact on deposition is pitched at a level of complexity appropriate to the PCA context; it reflects the physical constraints of the system and accounts for the specific characteristics of the released aerosol. (Author)

  3. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    Full Text Available This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. In this paper, we first provide a brief introduction to the basics of the ER rule and emphasize its strengths in representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of the risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using utility-based transformation. The proposed hierarchical risk assessment framework can potentially be applied to various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.
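The conjunctive aggregation step can be illustrated with a minimal sketch (a simplification in the spirit of the ER rule, not the paper's implementation): each expert's belief distribution over shared assessment grades is discounted by its weight, with the remainder treated as unassigned mass, and the two are combined conjunctively. Grade names, weights and belief degrees below are illustrative.

```python
# Minimal sketch of conjunctively combining two weighted belief
# distributions over shared assessment grades, in the spirit of the
# evidential reasoning (ER) rule. All numbers are illustrative.

def combine(beliefs1, w1, beliefs2, w2, grades):
    # Discount each source by its weight; the remainder is unassigned
    # mass (global ignorance), not belief in any single grade.
    m1 = {g: w1 * beliefs1.get(g, 0.0) for g in grades}
    m2 = {g: w2 * beliefs2.get(g, 0.0) for g in grades}
    r1, r2 = 1.0 - w1, 1.0 - w2
    raw = {g: m1[g] * m2[g] + m1[g] * r2 + r1 * m2[g] for g in grades}
    norm = sum(raw.values()) + r1 * r2   # conflicting grade pairs drop out
    return {g: raw[g] / norm for g in grades}

expert1 = {"low": 0.7, "high": 0.3}
expert2 = {"low": 0.6, "high": 0.4}
fused = combine(expert1, 0.8, expert2, 0.8, ["low", "high"])
```

With both experts leaning towards "low", the fused distribution favors "low" even more strongly once residual ignorance is set aside; lowering a source's weight moves mass back into ignorance instead of into any grade.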

  4. Modeling and assessing international climate financing

    Science.gov (United States)

    Wu, Jing; Tang, Lichun; Mohamed, Rayman; Zhu, Qianting; Wang, Zheng

    2016-06-01

    Climate financing is a key issue in current negotiations on climate protection. This study establishes a climate financing model based on a mechanism in which donor countries set up funds for climate financing and recipient countries use the funds exclusively for carbon emission reduction. The burden-sharing principles are based on GDP, historical emissions, and consumption-based emissions. Using this model, we develop and analyze a series of scenario simulations, including a financing program negotiated at the Cancun Climate Change Conference (2010) and several subsequent programs. Results show that sustained climate financing can help to combat global climate change. However, the Cancun Agreements are projected to result in a reduction of only 0.01°C in global warming by 2100 compared to the scenario without climate financing. Longer-term climate financing programs should be established to achieve more significant benefits. Our model and simulations also show that climate financing has economic benefits for developing countries. Developed countries will suffer a slight GDP loss in the early stages of climate financing, but the long-term economic growth and the eventual benefits of climate mitigation will compensate for this slight loss. Different burden-sharing principles have very similar effects on global temperature change and economic growth of recipient countries, but they do result in differences in GDP changes for Japan and the FSU. The GDP-based principle results in a larger share of financial burden for Japan, while the historical emissions-based principle results in a larger share of financial burden for the FSU. A larger burden share leads to a greater GDP loss.

  5. A narrative review of research impact assessment models and methods.

    Science.gov (United States)

    Milat, Andrew J; Bauman, Adrian E; Redman, Sally

    2015-03-18

    Research funding agencies continue to grapple with assessing research impact. Theoretical frameworks are useful tools for describing and understanding research impact. The purpose of this narrative literature review was to synthesize evidence that describes processes and conceptual models for assessing policy and practice impacts of public health research. The review involved keyword searches of electronic databases, including MEDLINE, CINAHL, PsycINFO, EBM Reviews, and Google Scholar in July/August 2013. Review search terms included 'research impact', 'policy and practice', 'intervention research', 'translational research', 'health promotion', and 'public health'. The review included theoretical and opinion pieces, case studies, descriptive studies, frameworks and systematic reviews describing processes, and conceptual models for assessing research impact. The review was conducted in two phases: initially, abstracts were retrieved and assessed against the review criteria followed by the retrieval and assessment of full papers against review criteria. Thirty one primary studies and one systematic review met the review criteria, with 88% of studies published since 2006. Studies comprised assessments of the impacts of a wide range of health-related research, including basic and biomedical research, clinical trials, health service research, as well as public health research. Six studies had an explicit focus on assessing impacts of health promotion or public health research and one had a specific focus on intervention research impact assessment. A total of 16 different impact assessment models were identified, with the 'payback model' the most frequently used conceptual framework. Typically, impacts were assessed across multiple dimensions using mixed methodologies, including publication and citation analysis, interviews with principal investigators, peer assessment, case studies, and document analysis. The vast majority of studies relied on principal investigator

  6. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. This report discusses several distinct issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab
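The constant-Rd concept in model (1) reduces all sorption behavior to a single retardation factor in transport calculations; a minimal sketch, with parameter values that are illustrative rather than taken from the report:

```python
# Minimal sketch of the constant-Rd (distribution coefficient) concept:
# sorption is lumped into a retardation factor R = 1 + (rho_b / theta) * Kd,
# which slows a sorbing nuclide relative to the groundwater.
# All parameter values are illustrative.

def retardation_factor(bulk_density_g_cm3, porosity, kd_ml_g):
    """R = 1 + (rho_b / theta) * Kd."""
    return 1.0 + (bulk_density_g_cm3 / porosity) * kd_ml_g

v_water = 10.0                          # groundwater velocity, m/yr
R = retardation_factor(1.6, 0.3, 5.0)   # dimensionless retardation
v_nuclide = v_water / R                 # retarded nuclide velocity, m/yr
```

The critiques summarized in the abstract target exactly this lumping: a single constant Kd cannot track changes in water chemistry or surface properties along the flow path, which is what the parametric, isotherm, mass-action and surface-complexation models progressively relax.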

  7. Route Assessment for Unmanned Aerial Vehicle Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Xixia Sun

    2014-01-01

    Full Text Available An integrated route assessment approach based on cloud model is proposed in this paper, where various sources of uncertainties are well kept and modeled by cloud theory. Firstly, a systemic criteria framework incorporating models for scoring subcriteria is developed. Then, the cloud model is introduced to represent linguistic variables, and survivability probability histogram of each route is converted into normal clouds by cloud transformation, enabling both randomness and fuzziness in the assessment environment to be managed simultaneously. Finally, a new way to measure the similarity between two normal clouds satisfying reflexivity, symmetry, transitivity, and overlapping is proposed. Experimental results demonstrate that the proposed route assessment approach outperforms fuzzy logic based assessment approach with regard to feasibility, reliability, and consistency with human thinking.
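The normal clouds at the core of the approach can be sketched with a standard forward normal cloud generator, in which each drop carries both a random position and a fuzzy membership degree; the (Ex, En, He) values below are illustrative, not taken from the paper.

```python
# Minimal sketch of a forward normal cloud generator: a concept is
# described by expectation Ex, entropy En and hyper-entropy He, and
# each generated drop gets a position plus a membership degree.
# Parameter values are illustrative.
import math
import random

def normal_cloud(ex, en, he, n=1000, seed=0):
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        en_i = rng.gauss(en, he)          # randomize the entropy itself
        if en_i == 0.0:
            en_i = 1e-12                  # avoid division by zero
        x = rng.gauss(ex, abs(en_i))      # cloud drop position
        mu = math.exp(-(x - ex) ** 2 / (2.0 * en_i ** 2))
        drops.append((x, mu))
    return drops

drops = normal_cloud(ex=5.0, en=1.0, he=0.1)
```

Because the entropy itself is randomized by He, the drops express second-order uncertainty, which is what lets the assessment handle randomness and fuzziness simultaneously.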

  8. Tactical Medical Logistics Planning Tool: Modeling Operational Risk Assessment

    National Research Council Canada - National Science Library

    Konoske, Paula

    2004-01-01

    ...) models the patient flow from the point of injury through more definitive care, and (2) supports operations research and systems analysis studies, operational risk assessment, and field medical services planning. TML+...

  9. Indoor Air Quality Building Education and Assessment Model Forms

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  10. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  11. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    Full Text Available The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to the limited number of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  12. The Global Modeling Initiative Assessment Model: Model Description, Integration and Testing of the Transport Shell

    Energy Technology Data Exchange (ETDEWEB)

    Rotman, D.A.; Tannahill, J.R.; Kinnison, D.E.; Connell, P.S.; Bergmann, D.; Proctor, D.; Rodriquez, J.M.; Lin, S.J.; Rood, R.B.; Prather, M.J.; Rasch, P.J.; Considine, D.B.; Ramaroson, R.; Kawa, S.R.

    2000-04-25

    We describe the three dimensional global stratospheric chemistry model developed under the NASA Global Modeling Initiative (GMI) to assess the possible environmental consequences from the emissions of a fleet of proposed high speed civil transport aircraft. This model was developed through a unique collaboration of the members of the GMI team. Team members provided computational modules representing various physical and chemical processes, and analysis of simulation results through extensive comparison to observation. The team members' modules were integrated within a computational framework that allowed transportability and simulations on massively parallel computers. A unique aspect of this model framework is the ability to interchange and intercompare different submodules to assess the sensitivity of numerical algorithms and model assumptions to simulation results. In this paper, we discuss the important attributes of the GMI effort, describe the GMI model computational framework and the numerical modules representing physical and chemical processes. As an application of the concept, we illustrate an analysis of the impact of advection algorithms on the dispersion of a NO{sub y}-like source in the stratosphere which mimics that of a fleet of commercial supersonic transports (High-Speed Civil Transport (HSCT)) flying between 17 and 20 kilometers.
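The sensitivity to advection algorithms mentioned above is easy to see in miniature: a first-order upwind scheme smears a sharp tracer pulse through numerical diffusion, which is precisely the kind of scheme-dependent dispersion an intercomparison framework has to quantify. The toy 1-D grid below is illustrative and unrelated to the GMI code itself.

```python
# Minimal sketch: numerical diffusion of a first-order upwind advection
# scheme on a periodic 1-D grid. The tracer pulse keeps its total mass
# but spreads as it drifts; grid size and Courant number are illustrative.

def upwind_step(q, c):
    # c = u*dt/dx (Courant number), 0 < c <= 1, wind u > 0, periodic domain
    return [q[i] - c * (q[i] - q[i - 1]) for i in range(len(q))]

q = [0.0] * 50
q[10] = 1.0                      # sharp tracer pulse at cell 10
for _ in range(20):
    q = upwind_step(q, 0.5)      # pulse drifts 10 cells and spreads
```

Mass is conserved exactly, but after 20 steps the peak has dropped well below 1; a higher-order scheme would keep the pulse sharper, changing the simulated dispersion of an aircraft-like source.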

  13. Model of environmental life cycle assessment for coal mining operations.

    Science.gov (United States)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources, based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment, with the potential to mitigate the impact of the coal industry on the environment.
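The three time frames correspond to global warming potential (GWP) horizons in the IPCC characterization; a minimal sketch of horizon-dependent CO2-equivalents, using the IPCC AR4 GWP values for methane (72, 25 and 7.6 for 20, 100 and 500 years) and emission amounts that are purely illustrative:

```python
# Minimal sketch: the same emissions yield different CO2-equivalent
# totals depending on the time horizon, because methane's global warming
# potential (GWP) decays with horizon. GWP values are the IPCC AR4
# figures for CH4; emission amounts are illustrative.
GWP_CH4 = {20: 72.0, 100: 25.0, 500: 7.6}   # kg CO2-eq per kg CH4

def co2_equivalent(co2_kg, ch4_kg, horizon_years):
    return co2_kg + ch4_kg * GWP_CH4[horizon_years]

totals = {h: co2_equivalent(1000.0, 50.0, h) for h in (20, 100, 500)}
```

For a methane-heavy source such as a coal mine, the 20-year total is several times the 500-year total, which is why the choice of horizon matters for ranking unit processes.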

  14. A risk assessment model for selecting cloud service providers

    OpenAIRE

    Cayirci, Erdal; Garaga, Alexandr; Santana de Oliveira, Anderson; Roudier, Yves

    2016-01-01

    The Cloud Adoption Risk Assessment Model is designed to help cloud customers in assessing the risks that they face by selecting a specific cloud service provider. It evaluates background information obtained from cloud customers and cloud service providers to analyze various risk scenarios. This facilitates decision making in selecting the cloud service provider with the most preferable risk profile, based on aggregated risks to security, privacy, and service delivery. Based on this model we ...

  15. ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL

    OpenAIRE

    Diah Hari Suryaningrum

    2012-01-01

    This paper aims to propose a new model for assessing individual performance on information technology adoption. The new model to assess individual performance was derived from two different theories: the decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to expand these theories, some of their efforts might lack theoretical assumptions. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien...

  16. Modeling the Assessment of Agricultural Enterprises Headcount Analysis

    OpenAIRE

    Viatkina, Tatyana

    2014-01-01

    The modern procedures for assessing enterprises' labour resources have been analyzed. An algorithm for calculating the enterprise performance potential efficiency ratio, and an assessment of the enterprise's performance potential based on quantitative and qualitative characteristics, have been provided. The model for assessing the effectiveness of labour management of an enterprise, branch or region subject to such factors as motivation, labour expenses, staff rotation and its qualifications has be...

  17. Why Reinvent the Wheel? Let's Adapt Our Institutional Assessment Model.

    Science.gov (United States)

    Aguirre, Francisco; Hawkins, Linda

    This paper reports on the implementation of an Integrated Assessment and Strategic Planning (IASP) process to comply with accountability requirements at the community college of New Mexico State University at Alamogordo. The IASP model adapted an existing compliance matrix and applied it to the business college program in 1995 to assess and…

  18. A model for assessing the environmental impact of transport

    OpenAIRE

    Malgorzata Latuszynska; Roma Strulak-Wojcikiewicz

    2013-01-01

    Environmental effects of transport, with a particular focus on the natural environment have been discussed. The authors present methods for assessing the influence of investments in transport infrastructure on the environment, as well as the concept of a simulation model which integrates various methods and approaches used to assess the impact of such investments on the environment. (original abstract)

  19. Evolution of oil trajectory, fate and impact assessment models

    International Nuclear Information System (INIS)

    French, D.P.

    1998-01-01

    Oil fates and effects modelling may be used for a wide variety of purposes. Natural resource damage assessment is just one such role. Modelling is particularly useful for ecological risk assessment. Modelling allows quantification of potential impacts and probabilities of those impacts. The relative impacts of various spills can be used to focus response efforts. Maximum liabilities for accidental spills may be estimated. The results of various management strategies may be investigated. A model system may be used to educate the public about potential impacts of various spill scenarios. A number of oil trajectory and fates models are available around the world. However, fewer model developers have carried out the analysis needed to quantitatively address the impacts of oil spills. This review focuses on the development of coupled oil fates and effects models. (author)

  20. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences made in three projects with the use of computer models from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models serve best the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  1. Plenary lecture: innovative modeling approaches applicable to risk assessments.

    Science.gov (United States)

    Oscar, T P

    2011-06-01

    Proper identification of safe and unsafe food at the processing plant is important for maximizing the public health benefit of food by ensuring both its consumption and safety. Risk assessment is a holistic approach to food safety that consists of four steps: 1) hazard identification; 2) exposure assessment; 3) hazard characterization; and 4) risk characterization. Risk assessments are modeled by mapping the risk pathway as a series of unit operations and associated pathogen events and then using probability distributions and a random sampling method to simulate the rare, random, variable and uncertain nature of pathogen events in the risk pathway. To model pathogen events, a rare event modeling approach is used that links a discrete distribution for incidence of the pathogen event with a continuous distribution for extent of the pathogen event. When applied to risk assessment, rare event modeling leads to the conclusion that the most highly contaminated food at the processing plant does not necessarily pose the highest risk to public health because of differences in post-processing risk factors among distribution channels and consumer populations. Predictive microbiology models for individual pathogen events can be integrated with risk assessment models using the rare event modeling method.
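The rare event construction described here — a discrete distribution for whether a pathogen event occurs, linked with a continuous distribution for its extent — can be sketched as a Monte Carlo loop. The incidence probability and log-normal parameters below are illustrative, not taken from the lecture.

```python
# Minimal sketch of rare event modeling: a Bernoulli draw decides whether
# the contamination event occurs at all, and only then is its extent
# drawn from a continuous (here log-normal) distribution.
# All parameter values are illustrative.
import random

def sample_units(p_incidence=0.02, mu=1.0, sigma=0.8, n=100_000, seed=42):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        if rng.random() < p_incidence:                   # discrete: incidence
            out.append(rng.lognormvariate(mu, sigma))    # continuous: extent
        else:
            out.append(0.0)
    return out

samples = sample_units()
prevalence = sum(s > 0 for s in samples) / len(samples)
```

Most simulated units carry zero contamination, and the risk is driven by the thin positive tail — the property that makes plain averages misleading and motivates the rare event treatment.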

  2. Communications Assessment Model (CAM): Processes and Products Associated with Modeling, Simulation and Assessment for DoD Networks

    National Research Council Canada - National Science Library

    Collins, Daniel

    1999-01-01

    The CAM process is a communications assessment and modeling tool that evaluates the impact of communications demands on current and evolving theater-level Defense Information Infrastructure/Defense...

  3. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  4. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  5. Global Drought Assessment using a Multi-Model Dataset

    NARCIS (Netherlands)

    Lanen, van H.A.J.; Huijgevoort, van M.H.J.; Corzo Perez, G.; Wanders, N.; Hazenberg, P.; Loon, van A.F.; Estifanos, S.; Melsen, L.A.

    2011-01-01

    Large-scale models are often applied to study past drought (forced with global reanalysis datasets) and to assess future drought (using downscaled, bias-corrected forcing from climate models). The EU project WATer and global CHange (WATCH) provides a 0.5 degree global dataset of meteorological

  6. Capabilities For Modelling Of Conversion Processes In Life Cycle Assessment

    DEFF Research Database (Denmark)

    Damgaard, Anders; Zarrin, Bahram; Tonini, Davide

    Life cycle assessment was traditionally used for modelling of product design and optimization. This is also seen in the conventional LCA software, which is optimized for the modelling of single material streams of a homogeneous nature that are assembled into a final product. There has therefore been...

  7. Assessing Children's Mathematical Thinking in Practical Modelling Situations.

    Science.gov (United States)

    Tanner, Howard; Jones, Sonia

    2002-01-01

    Investigates the use of mathematical modeling tasks with 11- and 12-year-old students and the development of mathematical thinking skills using practical modeling activities. Analyzes the development of students' mathematical thinking using interviews as a form of dynamic assessment. Reports that some students proved to be naturally mindful and…

  8. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    2012-01-01

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one "preferred" GIA model has been used, without any consideration of the possible errors involved. Lacking a rigorous assessment of systematic errors in GIA modeling, the reliability of the results is uncertain. GIA sensitivity and uncertainties associated with the viscosity models have been explored in the literature. However, at least two major sources of errors remain. The first is associated with the ice models, spatial distribution of ice and history of melting (this is especially the case of Antarctica), the second with the numerical implementation of model features relevant to sea level modeling...

  9. Ultrasensitive human thyrotropin (h TSH) immunoradiometric assay (IRMA) set up, through identification and minimization of non specific bindings; Ensaio imunoradiometrico ultra-sensivel de tireotrofina humana (hTSH) obtido mediante a identificacao e minimizacao de ligacoes inespecificas

    Energy Technology Data Exchange (ETDEWEB)

    Peroni, C.N.

    1994-12-31

    An IRMA of hTSH, based on magnetic solid-phase separation, was studied especially with respect to its non-specific bindings. These were identified as a product of the interaction between an altered form of the radioiodinated anti-hTSH monoclonal antibody ({sup 125}I-mAB) and the uncoupled magnetizable cellulose particle (matrix). Apparently this form of {sup 125}I-mAB is a type of aggregate that can be partly resolved from the main peak on Sephadex G-200 and further minimized via a single pre-incubation with the same matrix. Solid-phase saturation with milk proteins, tracer storage at 4 deg C and serum addition during incubation were also found particularly effective in preventing its formation. These findings were used to reproducibly decrease non-specific bindings to values <0.1% (or <70 cpm), thus increasing the signal-to-noise ratio (B{sub 60}/B{sub 0}) up to values of 300-500. In this way we obtained hTSH radioassays with functional sensitivities of about 0.05 mIU/L and analytical sensitivities of the order of 0.02 mIU/L, which place them at least among the best second-generation assays and make them excellent indeed for magnetic IRMAs. A more optimistic sensitivity calculation, based on Rodbard's definition, provided values down to 0.008 mIU/L. Such sensitivities, moreover, were obtained in a very reproducible way throughout the useful tracer life. (author). 83 refs, 13 figs, 25 tabs.

  10. Future directions for LDEF ionizing radiation modeling and assessments

    Science.gov (United States)

    Armstrong, T. W.; Colborn, B. L.

    1993-01-01

    A calculational program utilizing data from radiation dosimetry measurements aboard the Long Duration Exposure Facility (LDEF) satellite to reduce the uncertainties in current models defining the ionizing radiation environment is in progress. Most of the effort to date has been on using LDEF radiation dose measurements to evaluate models defining the geomagnetically trapped radiation, which has provided results applicable to radiation design assessments being performed for Space Station Freedom. Plans for future data comparisons, model evaluations, and assessments using additional LDEF data sets (LET spectra, induced radioactivity, and particle spectra) are discussed.

  11. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    Directory of Open Access Journals (Sweden)

    Moiz Mumtaz

    2012-01-01

    Full Text Available Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures.
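The internal-dose idea behind PBPK modeling can be illustrated, in drastically simplified form, with a one-compartment model and first-order elimination; real PBPK models chain many perfusion-limited tissue compartments coupled by blood flows. The dose and rate constant below are illustrative and unrelated to the ATSDR toolkit.

```python
# Minimal sketch (not a PBPK model proper): one compartment with
# first-order elimination, stepped with explicit Euler. All values
# are illustrative.

def simulate(dose_mg=100.0, k_elim_per_h=0.1, hours=10.0, dt=0.01):
    amount = dose_mg                     # bolus dose at t = 0, mg
    steps = int(hours / dt)
    for _ in range(steps):
        amount -= k_elim_per_h * amount * dt
    return amount

remaining = simulate()                   # close to 100 * exp(-1) mg
```

Screening tools of the kind described above extend this building block with tissue partition coefficients and metabolic terms, so that an external exposure can be translated into a target-organ dose.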

  12. Model summary report for the safety assessment SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik; Zetterstroem Evins, Lena (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Lindgren, Maria (Kemakta Konsult AB, Stockholm (Sweden))

    2010-12-15

    This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. In order to better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks and the computer codes used can be identified. As a large number of computer codes are used in the assessment, their complexity differs to a large extent; some of the codes are commercial while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled will be different for different types of codes (for instance due to the fact that for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion on how the requirements are met.

  13. Model summary report for the safety assessment SR-Site

    International Nuclear Information System (INIS)

    Vahlund, Fredrik; Zetterstroem Evins, Lena; Lindgren, Maria

    2010-12-01

    This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. In order to better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks and the computer codes used can be identified. As a large number of computer codes are used in the assessment, their complexity differs to a large extent; some of the codes are commercial while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled will be different for different types of codes (for instance due to the fact that for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion on how the requirements are met.

  14. Fire models for assessment of nuclear power plant fires

    International Nuclear Information System (INIS)

    Nicolette, V.F.; Nowlen, S.P.

    1989-01-01

    This paper reviews the state-of-the-art in available fire models for the assessment of nuclear power plant fires. The advantages and disadvantages of three basic types of fire models (zone, field, and control volume), and Sandia's experience with these models, are discussed. It is shown that the type of fire model selected to solve a particular problem should be based on the information that is required. Areas of concern that relate to all nuclear power plant fire models are identified. 17 refs., 6 figs

  15. Economic assessment model architecture for AGC/AVLIS selection

    International Nuclear Information System (INIS)

    Hoglund, R.L.

    1984-01-01

    The economic assessment model architecture described provides the flexibility and completeness in economic analysis that the selection between AGC and AVLIS demands. Process models which are technology-specific will provide the first-order responses of process performance and cost to variations in process parameters. The economics models can be used to test the impacts of alternative deployment scenarios for a technology. Enterprise models provide global figures of merit for evaluating the DOE perspective on the uranium enrichment enterprise, and business analysis models compute the financial parameters from the private investor's viewpoint

  16. Assessment of Teacher Perceived Skill in Classroom Assessment Practices Using IRT Models

    Science.gov (United States)

    Koloi-Keaikitse, Setlhomo

    2017-01-01

    The purpose of this study was to assess teacher perceived skill in classroom assessment practices. Data were collected from a sample of (N = 691) teachers selected from government primary, junior secondary, and senior secondary schools in Botswana. Item response theory models were used to identify teacher response on items that measured their…
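
The item response theory models mentioned in this record can be illustrated with a two-parameter logistic (2PL) item response function, a standard IRT form. The sketch below is a generic illustration, not the study's actual model; the discrimination (`a`) and difficulty (`b`) parameters are invented.

```python
import math

# Sketch of a two-parameter logistic (2PL) IRT item response function, the
# general kind of model applied to survey responses. The discrimination (a)
# and difficulty (b) parameters below are invented for illustration.

def p_endorse(theta, a=1.5, b=0.0):
    """Probability that a respondent with latent trait theta endorses the item."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Endorsement probability rises monotonically with the latent trait.
probs = [p_endorse(theta) for theta in (-2.0, 0.0, 2.0)]
```

At the item's difficulty (theta = b) the endorsement probability is exactly 0.5; the discrimination parameter controls how sharply the curve rises around that point.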

  17. Summary Diagrams for Coupled Hydrodynamic-Ecosystem Model Skill Assessment

    Science.gov (United States)

    2009-01-01

    numerical model study of the Georges Bank ecosystem. Part II: biological-physical model. Deep-Sea Research II 48, 457-482. Friedrichs, M.A.M., Dusenberry, ... upper ocean ecology, photochemistry, and optics. NRL Technical Memorandum NRL/MR/7330-07-9026. Naval Research Laboratory, Stennis Space Center. ... Wallhead, P.J., Martin, A.P., Srokosz, M.A., Franks, P.J.S., in press. Predicting the bulk plankton dynamics of Georges Bank: model skill assessment

  18. Combining catchment and instream modelling to assess physical habitat quality

    DEFF Research Database (Denmark)

    Olsen, Martin

    the physical habitat quality of stream Ledreborg using a habitat hydraulic model • to assess the present and potential physical habitat quality of stream Ledreborg • to evaluate the suitability and applicability of habitat hydraulic models to Danish stream management Results • Precipitation and evaporation...... the best potential physical habitat quality for trout fry and juvenile trout and the lowest potential physical habitat quality for adult trout. This finding supports previous evaluations of the stream as a trout habitat, concluding that stream Ledreborg has very few suitable habitats for adult trout...... in the modelling. • Although more time consuming than present Danish methods for assessment of physical habitat quality in streams, the habitat hydraulic models can be used to evaluate physical habitat conditions at reach level and work as a basis for a more objective assessment method....

  19. A model for assessing human cognitive reliability in PRA studies

    International Nuclear Information System (INIS)

    Hannaman, G.W.; Spurgin, A.J.; Lukic, Y.

    1985-01-01

    This paper summarizes the status of a research project sponsored by EPRI as part of the Probabilistic Risk Assessment (PRA) technology improvement program and conducted by NUS Corporation to develop a model of Human Cognitive Reliability (HCR). The model was synthesized from features identified in a review of existing models. The model development was based on the hypothesis that the key factors affecting crew response times are separable. The inputs to the model consist of key parameters whose values can be determined by PRA analysts for each accident situation being assessed. The output is a set of curves which represent the probability of control room crew non-response as a function of time for different conditions affecting their performance. The non-response probability is then a contributor to the overall non-success of operating crews to achieve a functional objective identified in the PRA study. Because the data were sparse, simulator data and some small-scale tests were utilized to illustrate the calibration of interim HCR model coefficients for different types of cognitive processing. The model can potentially help PRA analysts make human reliability assessments more explicit. The model incorporates concepts from psychological models of human cognitive behavior, information from current collections of human reliability data sources, and crew response time data from simulator training exercises.
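
The HCR output described above, crew non-response probability as a decaying function of normalized time, can be sketched with a Weibull-shaped curve of the kind calibrated from simulator data. The shape, scale, and delay coefficients below are illustrative placeholders, not the published HCR values.

```python
import math

# Illustrative Weibull-shaped crew non-response curve of the kind the HCR
# model calibrates from simulator exercises. Time is normalized by the median
# crew response time; beta/eta/gamma are placeholder coefficients, not the
# published HCR values.

def non_response_prob(t_norm, beta=1.2, eta=0.7, gamma=0.5):
    """Probability that the control room crew has NOT responded by t_norm."""
    if t_norm <= gamma:
        return 1.0  # before the delay time, no response is credited
    return math.exp(-(((t_norm - gamma) / eta) ** beta))

# The curve starts at 1 and decays monotonically toward 0.
probs = [non_response_prob(t / 10.0) for t in range(51)]
```

In a PRA, such a curve would be read at the time window available for the operator action to obtain the non-response contribution to the event tree.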

  20. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  1. Model of environmental life cycle assessment for coal mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Burchart-Korol, Dorota, E-mail: dburchart@gig.eu; Fugiel, Agata, E-mail: afugiel@gig.eu; Czaplicka-Kolarz, Krystyna, E-mail: kczaplicka@gig.eu; Turek, Marian, E-mail: mturek@gig.eu

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of

  2. Model of environmental life cycle assessment for coal mining operations

    International Nuclear Information System (INIS)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-01-01

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of
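
The GHG assessment over 20-, 100- and 500-year time frames described in these records amounts to weighting each gas by a horizon-specific global warming potential. The sketch below uses approximate IPCC AR4 GWP values for methane and hypothetical emission figures; a real LCA would take both from the characterization method actually applied.

```python
# Sketch of the CO2-equivalent aggregation behind GHG results reported for
# 20-, 100- and 500-year horizons. The methane GWP factors are approximate
# IPCC AR4 values and the emission figures are hypothetical.

GWP_CH4 = {20: 72.0, 100: 25.0, 500: 7.6}  # kg CO2-eq per kg CH4

def co2_equivalent(co2_kg, ch4_kg, horizon):
    """Aggregate CO2 and CH4 emissions into kg CO2-eq for one time horizon."""
    return co2_kg + ch4_kg * GWP_CH4[horizon]

# Hypothetical unit-process emissions: 1 t CO2 and 50 kg CH4 (mine methane).
totals = {h: co2_equivalent(1000.0, 50.0, h) for h in (20, 100, 500)}
```

The shorter the horizon, the more heavily methane weighs in the total, which is why coal-mining assessments with large methane emissions report all three time frames.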

  3. Utility of Social Modeling for Proliferation Assessment - Preliminary Findings

    International Nuclear Information System (INIS)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-01-01

    Methodologies for assessing proliferation risk are often focused on the inherent vulnerability of nuclear energy systems and associated safeguards. For example, an accepted approach involves ways to measure the intrinsic and extrinsic barriers to potential proliferation. This paper describes a preliminary investigation into the non-traditional use of social and cultural information to improve proliferation assessment and advance the approach to assessing nuclear material diversion. Proliferation resistance assessments, safeguards assessments, and related studies typically create technical information about the vulnerability of a nuclear energy system to diversion of nuclear material. The purpose of this research project is to find ways to integrate social information with technical information by explicitly considering the role of culture, groups, and/or individuals in factors that affect the possibility of proliferation. When final, this work is expected to describe and demonstrate the utility of social science modeling in proliferation and proliferation risk assessments.

  4. Quantile hydrologic model selection and model structure deficiency assessment : 1. Theory

    NARCIS (Netherlands)

    Pande, S.

    2013-01-01

    A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies

  5. Model summary report for the safety assessment SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik

    2006-10-15

    This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. In order to better understand how these codes are related, Assessment Model Flowcharts (AMFs) have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A large number of different computer codes are used in the assessment, some of which are commercial while others were developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined: It must be demonstrated that the code is suitable for its purpose; It must be demonstrated that the code has been properly used; and, It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled differ between codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met.

  6. Model summary report for the safety assessment SR-Can

    International Nuclear Information System (INIS)

    Vahlund, Fredrik

    2006-10-01

    This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. In order to better understand how these codes are related, Assessment Model Flowcharts (AMFs) have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A large number of different computer codes are used in the assessment, some of which are commercial while others were developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined: It must be demonstrated that the code is suitable for its purpose; It must be demonstrated that the code has been properly used; and, It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled differ between codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met.

  7. PARALLEL MODELS OF ASSESSMENT: INFANT MENTAL HEALTH AND THERAPEUTIC ASSESSMENT MODELS INTERSECT THROUGH EARLY CHILDHOOD CASE STUDIES.

    Science.gov (United States)

    Gart, Natalie; Zamora, Irina; Williams, Marian E

    2016-07-01

    Therapeutic Assessment (TA; S.E. Finn & M.E. Tonsager, 1997; J.D. Smith, 2010) is a collaborative, semistructured model that encourages self-discovery and meaning-making through the use of assessment as an intervention approach. This model shares core strategies with infant mental health assessment, including close collaboration with parents and caregivers, active participation of the family, a focus on developing new family stories and increasing parents' understanding of their child, and reducing isolation and increasing hope through the assessment process. The intersection of these two theoretical approaches is explored, using case studies of three infants/young children and their families to illustrate the application of TA to infant mental health. The case of an 18-month-old girl whose parents fear that she has bipolar disorder illustrates the core principles of the TA model, highlighting the use of assessment intervention sessions and the clinical approach to preparing assessment feedback. The second case follows an infant with a rare genetic syndrome from ages 2 to 24 months, focusing on the assessor-parent relationship and the importance of a developmental perspective. Finally, assessment of a 3-year-old boy illustrates the development and use of a fable as a tool to provide feedback to a young child about assessment findings and recommendations. © 2016 Michigan Association for Infant Mental Health.

  8. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to a growing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments involve different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway. Some ideas on integrated modelling tools for impact assessment studies are included. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. Model choice should be based on available data and possible data acquisition, available manpower, computer and software resources, and the needed output and accuracy. 58 refs
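
A core computation behind the habitat simulation models discussed above is a weighted usable area: each cell's area from the hydraulic model is weighted by a 0-1 habitat suitability index derived from simulated depth and velocity. The suitability curves and cell data below are invented for illustration.

```python
# Toy weighted-usable-area (WUA) calculation of the kind habitat hydraulic
# models perform: each cell's area is weighted by a 0-1 suitability index
# from simulated depth and velocity. Curves and cell data are invented.

def suitability(depth_m, velocity_ms):
    """Composite suitability index (0 = unusable, 1 = ideal) for one life stage."""
    s_depth = 1.0 if 0.2 <= depth_m <= 0.8 else 0.0       # usable depth band
    s_vel = max(0.0, 1.0 - abs(velocity_ms - 0.3) / 0.5)  # preferred velocity ~0.3 m/s
    return s_depth * s_vel

def weighted_usable_area(cells):
    """cells: (area_m2, depth_m, velocity_ms) tuples from a hydraulic model."""
    return sum(area * suitability(d, v) for area, d, v in cells)

wua = weighted_usable_area([(2.0, 0.5, 0.3), (3.0, 1.2, 0.4), (1.0, 0.3, 0.6)])
```

Rerunning the same calculation for depth and velocity fields simulated at different discharges is what lets such models compare habitat quality across flow alternatives.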

  9. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  10. Addressing challenges in single species assessments via a simple state-space assessment model

    DEFF Research Database (Denmark)

    Nielsen, Anders

    Single-species and age-structured fish stock assessments still remain the main tool for managing fish stocks. A simple state-space assessment model is presented as an alternative to (semi-)deterministic procedures and fully parametric statistical catch-at-age models. It offers a solution...... to some of the key challenges of these models. Compared to the deterministic procedures it solves a list of problems originating from falsely assuming that age-classified catches are known without errors and allows quantification of uncertainties of estimated quantities of interest. Compared to full...
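
The state-space idea in this record, a latent stock state evolving with process noise and observed with error, can be illustrated with a minimal scalar random-walk model filtered by a Kalman filter. This toy sketch is not the assessment model itself; the noise parameters and observations are invented.

```python
# Minimal scalar state-space sketch: a latent state (e.g. log-abundance)
# follows a random walk with process noise q and is observed with error r.
# A Kalman filter recovers the state; q, r and the observations are invented.

def kalman_filter(obs, q=0.1, r=0.2, x0=0.0, p0=1.0):
    """Return filtered state estimates for a random-walk state-space model."""
    x, p = x0, p0
    estimates = []
    for y in obs:
        p = p + q                # predict: variance grows by process noise
        k = p / (p + r)          # Kalman gain
        x = x + k * (y - x)      # update toward the observation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

ests = kalman_filter([1.0, 1.2, 0.9, 1.1])
```

Unlike a deterministic procedure that treats catches as exact, the filter carries an explicit state variance `p`, which is what allows uncertainties of estimated quantities to be quantified.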

  11. Assessing The Policy Relevance of Regional Air Quality Models

    Science.gov (United States)

    Holloway, T.

    This work presents a framework for discussing the policy relevance of models, and regional air quality models in particular. We define four criteria: 1) The scientific status of the model; 2) Its ability to address primary environmental concerns; 3) The position of modeled environmental issues on the political agenda; and 4) The role of scientific input into the policy process. This framework is applied to current work simulating the transport of nitric acid in Asia with the ATMOS-N model, to past studies on air pollution transport in Europe with the EMEP model, and to future applications of the United States Environmental Protection Agency (US EPA) Models-3. The Lagrangian EMEP model provided critical input to the development of the 1994 Oslo and 1999 Gothenburg Protocols to the Convention on Long-Range Transboundary Air Pollution, as well as to the development of EU directives, via its role as a component of the RAINS integrated assessment model. Our work simulating reactive nitrogen in Asia follows the European example in part, with the choice of ATMOS-N, a regional Lagrangian model, to calculate source-receptor relationships for the RAINS-Asia integrated assessment model. However, given differences between ATMOS-N and the EMEP model, as well as differences between the scientific and political climates facing Europe ten years ago and Asia today, the role of these two models in the policy process is very different. We characterize the different aspects of policy relevance between these models using our framework, and consider how the current-generation US EPA air quality model compares, in light of its Eulerian structure, different objectives, and the policy context of the US.

  12. The Trauma Outcome Process Assessment Model: A Structural Equation Model Examination of Adjustment

    Science.gov (United States)

    Borja, Susan E.; Callahan, Jennifer L.

    2009-01-01

    This investigation sought to operationalize a comprehensive theoretical model, the Trauma Outcome Process Assessment, and test it empirically with structural equation modeling. The Trauma Outcome Process Assessment reflects a robust body of research and incorporates known ecological factors (e.g., family dynamics, social support) to explain…

  13. Designing and Assessing Interactive Systems Using Task Models

    OpenAIRE

    Palanque, Philippe; Martinie, Célia; Winckler, Marco

    2017-01-01

    Part 6: Courses; International audience; This two-part course takes a practical approach to introduce the principles, methods and tools in task modelling. Part 1: A non-technical introduction demonstrates that task models support successful design of interactive systems. Part 2: A more technical interactive hands-on exercise of how to “do it right”, such as: How to go from task analysis to task models? How to assess (through analysis and simulation) that a task model is correct? How to identi...

  14. Environmental impact assessments and geological repositories: A model process

    International Nuclear Information System (INIS)

    Webster, S.

    2000-01-01

    In a recent study carried out for the European Commission, the scope and application of environmental impact assessment (EIA) legislation and current EIA practice in European Union Member States and applicant countries of Central and Eastern Europe was investigated, specifically in relation to the geological disposal of radioactive waste. This paper reports the study's investigations into a model approach to EIA in the context of geological repositories, including the role of the assessment in the overall decision processes and public involvement. (author)

  15. Agricultural climate impacts assessment for economic modeling and decision support

    Science.gov (United States)

    Thomson, A. M.; Izaurralde, R. C.; Beach, R.; Zhang, X.; Zhao, K.; Monier, E.

    2013-12-01

    A range of approaches can be used in the application of climate change projections to agricultural impacts assessment. Climate projections can be used directly to drive crop models, which in turn can be used to provide inputs for agricultural economic or integrated assessment models. These model applications, and the transfer of information between models, must be guided by the state of the science. But the methodology must also account for the specific needs of stakeholders and the intended use of model results beyond pure scientific inquiry, including meeting the requirements of agencies responsible for designing and assessing policies, programs, and regulations. Here we present methodology and results of two climate impacts studies that applied climate model projections from CMIP3 and from the EPA Climate Impacts and Risk Analysis (CIRA) project in a crop model (EPIC - Environmental Policy Integrated Climate) in order to generate estimates of changes in crop productivity for use in an agricultural economic model for the United States (FASOM - Forest and Agricultural Sector Optimization Model). The FASOM model is a forward-looking dynamic model of the US forest and agricultural sector used to assess market responses to changing productivity of alternative land uses. The first study, focused on climate change impacts on the USDA crop insurance program, was designed to use available daily climate projections from the CMIP3 archive. The decision to focus on daily data limited the climate model and time period selection significantly; however, for the intended purpose of assessing impacts on crop insurance payments, consideration of extreme event frequency was critical for assessing periodic crop failures. In a second, coordinated impacts study designed to assess the relative difference in climate impacts under a no-mitigation policy and different future climate mitigation scenarios, the stakeholder specifically requested an assessment of a

  16. Consensus-based training and assessment model for general surgery.

    Science.gov (United States)

    Szasz, P; Louridas, M; de Montbrun, S; Harris, K A; Grantcharov, T P

    2016-05-01

    Surgical education is becoming competency-based with the implementation of in-training milestones. Training guidelines should reflect these changes and determine the specific procedures for such milestone assessments. This study aimed to develop a consensus view regarding operative procedures and tasks considered appropriate for junior and senior trainees, and the procedures that can be used as technical milestone assessments for trainee progression in general surgery. A Delphi process was followed where questionnaires were distributed to all 17 Canadian general surgery programme directors. Items were ranked on a 5-point Likert scale, with consensus defined as Cronbach's α of at least 0·70. Items rated 4 or above on the 5-point Likert scale by 80 per cent of the programme directors were included in the models. Two Delphi rounds were completed, with 14 programme directors taking part in round one and 11 in round two. The overall consensus was high (Cronbach's α = 0·98). The training model included 101 unique procedures and tasks, 24 specific to junior trainees, 68 specific to senior trainees, and nine appropriate to all. The assessment model included four procedures. A system of operative procedures and tasks for junior- and senior-level trainees has been developed along with an assessment model for trainee progression. These can be used as milestones in competency-based assessments. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
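
The two decision rules in this record, consensus when Cronbach's α is at least 0.70 and item inclusion when at least 80 per cent of raters score an item 4 or higher on the 5-point Likert scale, can be sketched directly. All ratings below are hypothetical.

```python
# Sketch of the abstract's two decision rules: consensus when Cronbach's
# alpha >= 0.70 across raters, and item inclusion when >= 80% of raters give
# the item 4 or 5 on the 5-point Likert scale. All ratings are hypothetical.

def cronbach_alpha(ratings):
    """ratings: one list per rater, one score per item."""
    k = len(ratings[0])                       # number of items
    def var(xs):                              # sample variance (n - 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var = sum(var(col) for col in zip(*ratings))
    total_var = var([sum(row) for row in ratings])
    return (k / (k - 1)) * (1 - item_var / total_var)

def include_item(scores, threshold=0.8):
    """Keep an item if at least 80% of raters scored it 4 or above."""
    return sum(s >= 4 for s in scores) / len(scores) >= threshold

ratings = [[5, 5, 4], [4, 4, 3], [5, 4, 4], [4, 3, 2]]  # 4 raters x 3 items
alpha = cronbach_alpha(ratings)                          # consensus check
keep = [include_item(col) for col in zip(*ratings)]      # per-item inclusion
```

Note that the two rules are independent: raters can agree strongly (high α) while still rating an item too low for inclusion, as the third item here shows.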

  17. Ecohydrological modeling for large-scale environmental impact assessment.

    Science.gov (United States)

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach level accuracy similar to regional-scale models thereby allowing for impacts assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents the different tools available for risk assessment in fractured clayey tills and discusses their advantages and limitations. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verification of the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately describe contaminant transport in fractured media and to develop practical tools with the relevant processes and level of complexity.

  19. Assessment of the assessment: Evaluation of the model quality estimates in CASP10

    KAUST Repository

    Kryshtafovych, Andriy

    2013-08-31

    The article presents an assessment of the ability of the thirty-seven model quality assessment (MQA) methods participating in CASP10 to provide an a priori estimation of the quality of structural models, and of the 67 tertiary structure prediction groups to provide confidence estimates for their predicted coordinates. The assessment of MQA predictors is based on the methods used in previous CASPs, such as correlation between the predicted and observed quality of the models (both at the global and local levels), accuracy of methods in distinguishing between good and bad models as well as good and bad regions within them, and ability to identify the best models in the decoy sets. Several numerical evaluations were used in our analysis for the first time, such as comparison of global and local quality predictors with reference (baseline) predictors and a ROC analysis of the predictors' ability to differentiate between the well and poorly modeled regions. For the evaluation of the reliability of self-assessment of the coordinate errors, we used the correlation between the predicted and observed deviations of the coordinates and a ROC analysis of correctly identified errors in the models. A modified two-stage procedure for testing MQA methods in CASP10 whereby a small number of models spanning the whole range of model accuracy was released first followed by the release of a larger number of models of more uniform quality, allowed a more thorough analysis of abilities and inabilities of different types of methods. Clustering methods were shown to have an advantage over the single- and quasi-single- model methods on the larger datasets. At the same time, the evaluation revealed that the size of the dataset has smaller influence on the global quality assessment scores (for both clustering and nonclustering methods), than its diversity. Narrowing the quality range of the assessed models caused significant decrease in accuracy of ranking for global quality predictors but
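
    The ROC analyses described above reduce, for a single predictor, to an AUC computation; the rank-sum sketch below uses invented per-region scores, not CASP10 data:

```python
def roc_auc(scores, labels):
    """AUC via the Mann-Whitney identity: the probability that a randomly
    chosen well-modelled region scores higher than a poorly modelled one."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted quality scores; label 1 = region observed to be well modelled
scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.3]
labels = [1, 1, 0, 1, 0, 0]
auc = roc_auc(scores, labels)   # 8/9: one poor region outranks one good one
```

    An AUC of 0.5 corresponds to a predictor no better than chance at separating well from poorly modeled regions.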

  20. Assessment of the assessment: evaluation of the model quality estimates in CASP10.

    Science.gov (United States)

    Kryshtafovych, Andriy; Barbato, Alessandro; Fidelis, Krzysztof; Monastyrskyy, Bohdan; Schwede, Torsten; Tramontano, Anna

    2014-02-01

    The article presents an assessment of the ability of the thirty-seven model quality assessment (MQA) methods participating in CASP10 to provide an a priori estimation of the quality of structural models, and of the 67 tertiary structure prediction groups to provide confidence estimates for their predicted coordinates. The assessment of MQA predictors is based on the methods used in previous CASPs, such as correlation between the predicted and observed quality of the models (both at the global and local levels), accuracy of methods in distinguishing between good and bad models as well as good and bad regions within them, and ability to identify the best models in the decoy sets. Several numerical evaluations were used in our analysis for the first time, such as comparison of global and local quality predictors with reference (baseline) predictors and a ROC analysis of the predictors' ability to differentiate between the well and poorly modeled regions. For the evaluation of the reliability of self-assessment of the coordinate errors, we used the correlation between the predicted and observed deviations of the coordinates and a ROC analysis of correctly identified errors in the models. A modified two-stage procedure for testing MQA methods in CASP10 whereby a small number of models spanning the whole range of model accuracy was released first followed by the release of a larger number of models of more uniform quality, allowed a more thorough analysis of abilities and inabilities of different types of methods. Clustering methods were shown to have an advantage over the single- and quasi-single- model methods on the larger datasets. At the same time, the evaluation revealed that the size of the dataset has smaller influence on the global quality assessment scores (for both clustering and nonclustering methods), than its diversity. Narrowing the quality range of the assessed models caused significant decrease in accuracy of ranking for global quality predictors but

  1. Radioactive waste disposal assessment - overview of biosphere processes and models

    International Nuclear Information System (INIS)

    Coughtrey, P.J.

    1992-09-01

    This report provides an overview of biosphere processes and models in the general context of the radiological assessment of radioactive waste disposal as a basis for HMIP's response to biosphere aspects of Nirex's submissions for disposal of radioactive wastes in a purpose-built repository at Sellafield, Cumbria. The overview takes into account published information from the UK as available from Nirex's safety and assessment research programme and HMIP's disposal assessment programme, as well as that available from studies in the UK and elsewhere. (Author)

  2. Confidence assessment. Site-descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    2008-12-15

    The objective of this report is to assess the confidence that can be placed in the Laxemar site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Laxemar). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. Procedures for this assessment have been progressively refined during the course of the site descriptive modelling, and applied to all previous versions of the Forsmark and Laxemar site descriptive models. They include assessment of whether all relevant data have been considered and understood, identification of the main uncertainties and their causes, possible alternative models and their handling, and consistency between disciplines. The assessment then forms the basis for an overall confidence statement. The confidence in the Laxemar site descriptive model, based on the data available at the conclusion of the surface-based site investigations, has been assessed by exploring: confidence in the site characterization database; remaining issues and their handling; handling of alternatives; consistency between disciplines; and the main reasons for confidence and lack of confidence in the model. Generally, the site investigation database is of high quality, as assured by the quality procedures applied. It is judged that the Laxemar site descriptive model has an overall high level of confidence. Because of the relatively robust geological model that describes the site, the overall confidence in the Laxemar site descriptive model is judged to be high, even though details of the spatial variability remain unknown. The overall reason for this confidence is the wide spatial distribution of the data and the consistency between

  3. Guide for developing conceptual models for ecological risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W., II

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, routes, media, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.

  4. EASETECH – A LCA model for assessment of environmental technologies

    DEFF Research Database (Denmark)

    Damgaard, Anders; Baumeister, Hubert; Astrup, Thomas Fruergaard

    2014-01-01

    EASETECH is a new model for the environmental assessment of environmental technologies developed in collaboration between DTU Environment and DTU Compute. EASETECH is based on experience gained in the field of waste management modelling over the last decade and applies the same concepts to systems with different kinds of material flows, such as sludge, wastewater, biomass for energy production and treatment of contaminated soil. The primary aim of EASETECH is to perform life cycle assessment (LCA) of complex systems handling heterogeneous material flows. The main novelties of the model compared to other LCA software are as follows. The focus is put on material flow modelling. This means that each material flow is characterized as a mix of material fractions with different properties. Flows in terms of mass and composition are computed throughout the integrated system, including rejects, slags and ashes.

  5. Guide for developing conceptual models for ecological risk assessments

    International Nuclear Information System (INIS)

    Suter, G.W., II.

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, routes, media, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.

  6. Assessing biocomputational modelling in transforming clinical guidelines for osteoporosis management.

    Science.gov (United States)

    Thiel, Rainer; Viceconti, Marco; Stroetmann, Karl

    2011-01-01

    Biocomputational modelling as developed by the European Virtual Physiological Human (VPH) Initiative is the area of ICT most likely to revolutionise in the longer term the practice of medicine. Using the example of osteoporosis management, a socio-economic assessment framework is presented that captures how the transformation of clinical guidelines through VPH models can be evaluated. Applied to the Osteoporotic Virtual Physiological Human Project, a consequent benefit-cost analysis delivers promising results, both methodologically and substantially.

  7. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large-scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment.
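
    Compartment modules of this kind amount to coupled first-order transfer equations. The two-compartment sketch below uses invented rate constants, not GEMA's calibrated values, to show the core bookkeeping:

```python
import numpy as np

# Two-compartment sketch of a landscape module: activity (Bq) washes from
# a soil compartment into a lake compartment, and both lose activity to
# radioactive decay.  The rate constants are illustrative only.
k_wash = 0.05                 # soil -> lake transfer, fraction per year
k_decay = np.log(2) / 30.0    # decay constant for a 30-year half-life

def step(soil, lake, dt=0.1):
    """One explicit-Euler step of the coupled transfer equations."""
    d_soil = -(k_wash + k_decay) * soil
    d_lake = k_wash * soil - k_decay * lake
    return soil + dt * d_soil, lake + dt * d_lake

soil, lake = 1000.0, 0.0      # 1000 Bq deposited on soil at t = 0
for _ in range(1000):         # integrate 100 years in 0.1-year steps
    soil, lake = step(soil, lake)
# After a century most of the surviving inventory sits in the lake.
```

    Linking many such modules through their fluxes is what lets a modular system represent an extended drainage network.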

  8. A Protocol for the Global Sensitivity Analysis of Impact Assessment Models in Life Cycle Assessment.

    Science.gov (United States)

    Cucurachi, S; Borgonovo, E; Heijungs, R

    2016-02-01

    The life cycle assessment (LCA) framework has established itself as the leading tool for the assessment of the environmental impact of products. Several works have established the need for integrating the LCA and risk analysis methodologies, given their many common aspects. One of the ways to reach such integration is through guaranteeing that uncertainties in LCA modeling are carefully treated. It has been claimed that more attention should be paid to quantifying the uncertainties present in the various phases of LCA. Though the topic has been attracting increasing attention from practitioners and experts in LCA, there is still a lack of understanding and a limited use of the available statistical tools. In this work, we introduce a protocol to conduct global sensitivity analysis in LCA. The article focuses on the life cycle impact assessment (LCIA), and particularly on the relevance of global techniques for the development of trustable impact assessment models. We use a novel characterization model developed for the quantification of the impacts of noise on humans as a test case. We show that global SA is fundamental to guarantee that the modeler has a complete understanding of: (i) the structure of the model and (ii) the importance of uncertain model inputs and the interaction among them. © 2015 Society for Risk Analysis.
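
    One widely used global technique such a protocol can draw on is the variance-based Sobol index. The pick-and-freeze estimator below is a generic sketch, not the protocol's own implementation, and the two-input "impact model" is invented:

```python
import numpy as np

def first_order_sobol(f, dim, n=50_000, seed=0):
    """First-order Sobol indices via the Saltelli pick-and-freeze
    estimator: S_i = E[y_B (y_ABi - y_A)] / Var(y)."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, dim))           # two independent input samples
    B = rng.random((n, dim))
    yA, yB = f(A), f(B)
    var_y = np.var(np.concatenate([yA, yB]))
    indices = []
    for i in range(dim):
        AB = A.copy()
        AB[:, i] = B[:, i]             # resample only input i
        indices.append(np.mean(yB * (f(AB) - yA)) / var_y)
    return np.array(indices)

# Invented linear impact model: analytically S1 = 0.8, S2 = 0.2
S = first_order_sobol(lambda x: x[:, 0] + 0.5 * x[:, 1], dim=2)
```

    Indices near zero flag inputs whose uncertainty the modeler can safely ignore; large indices mark the inputs and interactions that dominate output variance.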

  9. Biosphere models for safety assessment of radioactive waste disposal

    International Nuclear Information System (INIS)

    Proehl, G.; Olyslaegers, G.; Zeevaert, T.; Kanyar, B.; Bergstroem, U.; Hallberg, B.; Mobbs, S.; Chen, Q.; Kowe, R.

    2004-01-01

    The aim of the BioMoSA project has been to contribute to confidence building in biosphere models for application in performance assessments of radioactive waste disposal. The detailed objectives of this project are: development and testing of practical biosphere models for application in long-term safety studies of radioactive waste disposal at different European locations; identification of features, events and processes that need to be modelled on a site-specific rather than on a generic basis; comparison of the results and quantification of the variability of site-specific models developed according to the reference biosphere methodology; development of a generic biosphere tool for application in long-term safety studies; comparison of results from site-specific models to those from the generic one; and identification of possibilities and limitations for the application of the generic biosphere model. (orig.)

  10. Assessment of closure coefficients for compressible-flow turbulence models

    Science.gov (United States)

    Huang, P. G.; Bradshaw, P.; Coakley, T. J.

    1992-01-01

    A critical assessment is made of the closure coefficients used for turbulence length scale in existing models of the transport equation, with reference to the extension of these models to compressible flow. It is shown that to satisfy the compressible 'law of the wall', the model coefficients must actually be functions of density gradients. The magnitude of the errors that result from neglecting this dependence on density varies with the variable used to specify the length scale. Among the models investigated, the k-omega model yields the best performance, although it is not completely free from errors associated with density terms. Models designed to reduce the density-gradient effect to an insignificant level are proposed.

  11. Mesorad dose assessment model. Volume 1. Technical basis

    International Nuclear Information System (INIS)

    Scherpelz, R.I.; Bander, T.J.; Athey, G.F.; Ramsdell, J.V.

    1986-03-01

    MESORAD is a dose assessment model for emergency response applications. Using release data for as many as 50 radionuclides, the model calculates: (1) external doses resulting from exposure to radiation emitted by radionuclides contained in elevated or deposited material; (2) internal dose commitment resulting from inhalation; and (3) total whole-body doses. External doses from airborne material are calculated using semi-infinite and finite cloud approximations. At each stage in model execution, the appropriate approximation is selected after considering the cloud dimensions. Atmospheric processes are represented in MESORAD by a combination of Lagrangian puff and Gaussian plume dispersion models, a source depletion (deposition velocity) dry deposition model, and a wet deposition model using washout coefficients based on precipitation rates.
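
    The Gaussian plume component of such a dispersion model follows the textbook ground-level concentration formula. In the sketch below the linear σy, σz growth is a placeholder for the stability-class dispersion curves an operational code would use:

```python
import math

def plume_concentration(Q, u, x, y, H, a=0.08, b=0.06):
    """Ground-level concentration (g/m^3) from a continuous point source,
    using the standard Gaussian plume equation.  The spread coefficients
    a and b are illustrative stand-ins for stability-class curves."""
    sigma_y = a * x      # lateral spread at downwind distance x, m
    sigma_z = b * x      # vertical spread, m
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-H**2 / (2 * sigma_z**2)))

# Hypothetical release: 1 g/s source, 5 m/s wind, receptor 1 km downwind
# on the plume centreline, 50 m effective release height
c = plume_concentration(Q=1.0, u=5.0, x=1000.0, y=0.0, H=50.0)
```

    A puff model applies the same Gaussian kernel to discrete released parcels, which is what lets it follow changing wind fields.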

  12. A Consensus Model: Shifting assessment practices in dietetics tertiary education.

    Science.gov (United States)

    Bacon, Rachel; Kellett, Jane; Dart, Janeane; Knight-Agarwal, Cathy; Mete, Rebecca; Ash, Susan; Palermo, Claire

    2018-02-21

    The aim of this research was to evaluate a Consensus Model for competency-based assessment. An evaluative case study was used to allow a holistic examination of a constructivist-interpretivist programmatic model of assessment. Using a modified Delphi process, the competence of all 29 students enrolled in their final year of a Master of Nutrition and Dietetics course was assessed by a panel (with expertise in competency-based assessment; industry and academic representation) from a course e-portfolio (that included the judgements of student performance made by worksite educators) and a panel interview. Data were triangulated with assessments from a capstone internship. Qualitative descriptive studies with worksite educators (focus groups n = 4, n = 5, n = 8) and students (personal interviews n = 29) explored stakeholder experiences analysed using thematic analysis. Panel consensus was achieved for all cases by the third round and corroborated by internship outcomes. For 34% of students this differed from the 'interpretations' of their performance made by their worksite educator/s. Emerging qualitative themes from stakeholder data found the model: (i) supported sustainable assessment practices; (ii) shifted the power relationship between students and worksite educators and (iii) provided a fair method to assess competence. To maximise benefits, more refinement, resources and training are required. This research questions competency-based assessment practices based on discrete placement units and supports a constructivist-interpretivist programmatic approach where evidence across a whole course of study is considered by a panel of assessors. © 2018 Dietitians Association of Australia.

  13. Modeling Characteristics of an Operational Probabilistic Safety Assessment (PSA)

    International Nuclear Information System (INIS)

    Anoba, Richard C.; Khalil, Yehia; Fluehr, J.J. III; Kellogg, Richard; Hackerott, Alan

    2002-01-01

    Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at nuclear power plants. Since the issuance of Generic Letter 88-20 and subsequent Individual Plant Examinations (IPEs)/Individual Plant Examinations for External Events (IPEEEs), the NRC has issued several Regulatory Guides such as RG 1.182 to describe the use of PSA in risk informed regulation activities. The PSA models developed for the IPEs were typically based on a 'snapshot' of the risk profile at the nuclear power plant. The IPE models contain implicit assumptions and simplifications that limit the ability to realistically assess current issues. For example, IPE modeling assumptions related to plant configuration limit the ability to perform online equipment out-of-service assessments. The lack of model symmetry results in skewed risk results. IPE model simplifications related to initiating events have resulted in non-conservative estimates of risk impacts when equipment is removed from service. The IPE models also do not explicitly address all external events that are potentially risk significant as equipment is removed from service. (authors)

  14. An analytical model for the assessment of airline expansion strategies

    Directory of Open Access Journals (Sweden)

    Mauricio Emboaba Moreira

    2014-01-01

    Full Text Available Purpose: The purpose of this article is to develop an analytical model to assess airline expansion strategies by combining generic business strategy models with airline business models. Methodology and approach: A number of airline business models are examined, as are Porter's (1983) industry five forces that drive competition, complemented by Nalebuff and Brandenburger's (1996) sixth force, and the basic elements of the general environment in which the expansion process takes place. A system of points and weights is developed to create a score among the 904,736 possible combinations considered. The model's outputs are generic expansion strategies with quantitative assessments for each specific combination of elements inputted. Originality and value: The analytical model developed is original because it combines for the first time and explicitly elements of the general environment, industry environment, airline business models and the generic expansion strategy types. Besides, it creates a system of scores that may be used to drive the decision process toward the choice of a specific strategic expansion path. Research implications: The analytical model may be adapted to industries other than the airline industry by substituting the element “airline business model” with the corresponding business-model elements of those industries.

  15. Skill and independence weighting for multi-model assessments

    International Nuclear Information System (INIS)

    Sanderson, Benjamin M.; Wehner, Michael; Knutti, Reto

    2017-01-01

    We present a weighting strategy for use with the CMIP5 multi-model archive in the fourth National Climate Assessment, which considers both skill in the climatological performance of models over North America as well as the inter-dependency of models arising from common parameterizations or tuning practices. The method exploits information relating to the climatological mean state of a number of projection-relevant variables as well as metrics representing long-term statistics of weather extremes. The weights, once computed, can be used to compute weighted means and significance information from an ensemble containing multiple initial condition members from potentially co-dependent models of varying skill. Two parameters in the algorithm determine the degree to which model climatological skill and model uniqueness are rewarded; these parameters are explored and final values are defended for the assessment. The influence of model weighting on projected temperature and precipitation changes is found to be moderate, partly due to a compensating effect between model skill and uniqueness. However, more aggressive skill weighting and weighting by targeted metrics is found to have a more significant effect on inferred ensemble confidence in future patterns of change for a given projection.
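
    The two-parameter skill-times-uniqueness scheme described can be sketched as follows; the Gaussian radii and the toy model distances are illustrative choices, not the assessment's values:

```python
import numpy as np

def model_weights(errors, distances, d_q=0.8, d_u=0.5):
    """Skill-times-uniqueness weights in the spirit of Sanderson et al.:
    skill decays with distance to observations, and near-duplicate models
    split their weight.  d_q and d_u are the two free radius parameters."""
    errors = np.asarray(errors, dtype=float)
    distances = np.asarray(distances, dtype=float)
    skill = np.exp(-(errors / d_q) ** 2)
    similarity = np.exp(-(distances / d_u) ** 2)
    np.fill_diagonal(similarity, 0.0)        # a model is not its own twin
    uniqueness = 1.0 / (1.0 + similarity.sum(axis=1))
    w = skill * uniqueness
    return w / w.sum()                       # normalise for weighted means

# Toy ensemble: models 0 and 1 are near-duplicates, model 2 is distinct
errors = [0.4, 0.4, 0.6]
distances = [[0.0, 0.1, 1.0],
             [0.1, 0.0, 1.0],
             [1.0, 1.0, 0.0]]
w = model_weights(errors, distances)
```

    The compensating effect noted in the abstract is visible here: the duplicated pair is individually more skillful, yet the distinct model ends up with the largest single weight.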

  16. A Corrosion Risk Assessment Model for Underground Piping

    Science.gov (United States)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.
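
    A Monte Carlo combination of the kind described might look like the sketch below; every distribution and threshold is an invented illustration, not ARC data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical uncertain inputs for a single buried pipe segment
wall0 = rng.normal(6.0, 0.3, n)             # initial wall thickness, mm
rate = rng.lognormal(np.log(0.08), 0.5, n)  # corrosion rate, mm/yr
service_years = 30.0
t_min = 3.0                                 # minimum thickness to carry the stress, mm

remaining = wall0 - rate * service_years    # thickness left after service life
p_fail = float(np.mean(remaining < t_min))  # fraction of trials that fail
```

    Repeating the simulation over many segments, with wrap protection modifying the rate distribution, yields the distribution of failure probabilities the paper describes; wide input distributions like the lognormal rate here are exactly how "lack of knowledge" dominates the result.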

  17. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  18. Model Test Bed for Evaluating Wave Models and Best Practices for Resource Assessment and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Neary, Vincent Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Yang, Zhaoqing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Wang, Taiping [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Gunawan, Budi [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Dallman, Ann Renee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies

    2016-03-01

    A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101 Ed. 1.0 (2015). Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.

  19. Evaluation of models for assessing groundwater vulnerability to ...

    African Journals Online (AJOL)

    This paper examines, based on a review and synthesis of available material, the presently most applied models for groundwater vulnerability assessment mapping. The approaches and the pros and cons of each method are evaluated in terms of both the conditions of their implementation and the result obtained. The paper ...

  20. Assessment of the Quality Management Models in Higher Education

    Science.gov (United States)

    Basar, Gulsun; Altinay, Zehra; Dagli, Gokmen; Altinay, Fahriye

    2016-01-01

    This study involves the assessment of the quality management models in Higher Education by explaining the importance of quality in higher education and by examining the higher education quality assurance system practices in other countries. The qualitative study was carried out with the members of the Higher Education Planning, Evaluation,…

  1. Queuing Models: A Tool For Assessing The Profitability Of Barbing ...

    African Journals Online (AJOL)

    The study considered small scale business as an option in reducing the unemployment rate in our society. The study uses queuing models to assess the profitability of barbing salon business in Agbor town of Delta State. The result of the study indicates that the distribution of inter-arrival times, service times, and waiting ...
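
    The single-server analysis such a study rests on can be sketched with the standard M/M/1 formulas; the arrival and service rates below are invented for illustration:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 results: Poisson arrivals, exponential service,
    one server (one barber's chair).  Rates are customers per hour."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = arrival_rate / service_rate            # server utilisation
    L = rho / (1 - rho)                          # mean customers in the shop
    W = 1.0 / (service_rate - arrival_rate)      # mean time in the shop, hours
    Wq = rho / (service_rate - arrival_rate)     # mean wait before the cut, hours
    return rho, L, W, Wq

# Hypothetical salon: 4 customers/hour arrive; a cut takes 12 minutes (5/hour)
rho, L, W, Wq = mm1_metrics(4.0, 5.0)            # 0.8, 4.0, 1.0 h, 0.8 h
```

    Utilisation times price per cut, minus idle-time cost, is the usual bridge from these metrics to a profitability estimate.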

  2. Modeling of Information Security Strategic Planning Methods and Expert Assessments

    Directory of Open Access Journals (Sweden)

    Alexander Panteleevich Batsula

    2014-09-01

    The article addresses the problem of increasing the level of information security. A method for increasing the level of information security is developed through modeling of strategic planning with SWOT analysis supported by expert assessments.

  3. Vulnerability Assessment Models to Drought: Toward a Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Kiumars Zarafshani

    2016-06-01

    Drought is regarded as a slow-onset natural disaster that causes inevitable damage to water resources and to farm life. Currently, crisis management is the basis of drought mitigation plans; however, studies thus far indicate that effective drought management strategies are based on risk management. As a primary tool in mitigating the impact of drought, vulnerability assessment can be used as a benchmark in drought mitigation plans and to enhance farmers’ ability to cope with drought. Moreover, literature pertaining to drought has focused extensively on its impact, only awarding limited attention to vulnerability assessment as a tool. Therefore, the main purpose of this paper is to develop a conceptual framework for designing a vulnerability model in order to assess farmers’ level of vulnerability before, during and after the onset of drought. Use of this developed drought vulnerability model would aid disaster relief workers by enhancing the adaptive capacity of farmers when facing the impacts of drought. The paper starts with the definition of vulnerability and outlines different frameworks on vulnerability developed thus far. It then identifies various approaches to vulnerability assessment and finally offers the most appropriate model. The paper concludes that the introduced model can guide drought mitigation programs in countries that are impacted the most by drought.

  4. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  5. Groundwater Impacts of Radioactive Wastes and Associated Environmental Modeling Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Rui; Zheng, Chunmiao; Liu, Chongxuan

    2012-11-01

    This article provides a review of the major sources of radioactive wastes and their impacts on groundwater contamination. The review discusses the major biogeochemical processes that control the transport and fate of radionuclide contaminants in groundwater, and describes the evolution of mathematical models designed to simulate and assess the transport and transformation of radionuclides in groundwater.

  6. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  7. A model for assessing Medicago Sativa L. hay quality | Scholtz ...

    African Journals Online (AJOL)

    A study was conducted to identify chemical parameters and/or models for assessing Medicago sativa L. hay quality, using near infrared reflectance spectroscopy (NIRS) analysis and the Cornell Net Carbohydrate and Protein System (CNCPS) milk prediction as a criterion of accuracy. Milk yield (MY) derived from the ...

  8. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  9. Confidence Intervals for Assessing Heterogeneity in Generalized Linear Mixed Models

    Science.gov (United States)

    Wagler, Amy E.

    2014-01-01

    Generalized linear mixed models are frequently applied to data with clustered categorical outcomes. The effect of clustering on the response is often difficult to practically assess partly because it is reported on a scale on which comparisons with regression parameters are difficult to make. This article proposes confidence intervals for…

  10. A Comprehensive Assessment Model for Critical Infrastructure Protection

    Directory of Open Access Journals (Sweden)

    Häyhtiö Markus

    2017-12-01

    International business demands seamless service and IT infrastructure throughout the entire supply chain. However, dependencies between different parts of this vulnerable ecosystem form a fragile web. Assessment of the financial effects of abnormalities in any part of the network is required in order to protect the network in a financially viable way. The contractual environment between the actors in a supply chain, spanning different business domains and functions, requires a management model that enables network-wide protection of critical infrastructure. In this paper the authors introduce such a model. It can be used to assess financial differences between centralized and decentralized protection of critical infrastructure. As an end result of this assessment, business resilience to unknown threats can be improved across the entire supply chain.

  11. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol. 2

    Energy Technology Data Exchange (ETDEWEB)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input) in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The user's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.
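    IRDAM's actual equations are documented in NUREG/CR-3012; as a hedged illustration of the kind of calculation involved, the sketch below evaluates a generic ground-level Gaussian plume dilution factor at a set of fixed downwind distances. The dispersion-coefficient fits, the wind speed and the particular distance set are illustrative assumptions, not IRDAM's values:

```python
import math

def plume_dilution(x, wind_speed, sigma_y, sigma_z):
    """Ground-level centreline dilution factor chi/Q (s/m^3) for a
    ground-release Gaussian plume: chi/Q = 1 / (pi * u * sigma_y * sigma_z)."""
    return 1.0 / (math.pi * wind_speed * sigma_y(x) * sigma_z(x))

# illustrative power-law dispersion coefficients (roughly neutral stability)
sigma_y = lambda x: 0.08 * x / math.sqrt(1 + 0.0001 * x)
sigma_z = lambda x: 0.06 * x / math.sqrt(1 + 0.0015 * x)

# one plausible set of six fixed distances spanning IRDAM's 500-20,000 m range
distances = [500, 1000, 2000, 5000, 10000, 20000]   # metres (illustrative)
chi_over_q = [plume_dilution(x, 2.0, sigma_y, sigma_z) for x in distances]
```

    A dose estimate then multiplies chi/Q by the release rate and a nuclide-specific dose conversion factor, which is where the noble-gas and radioiodine data enter.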

  12. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  13. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways, resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks', reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)
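    The GoldSim/TDRW setup itself cannot be reproduced in a snippet, but the sensitivity the authors describe can be illustrated with a toy surrogate in which dose decays exponentially with the summed transport resistance along a ten-segment pathway. All distributions and parameters below are illustrative assumptions:

```python
import math, random, statistics

random.seed(1)
N_REAL, N_SEG = 20000, 10

def dose(total_resistance):
    """Toy surrogate: dose falls off exponentially with the summed
    transport resistance along a pathway."""
    return math.exp(-total_resistance)

# (a) limited variability: one lognormal resistance shared by all segments,
#     so a single low draw makes the whole pathway "fast"
doses_corr = [dose(N_SEG * random.lognormvariate(0.0, 1.0))
              for _ in range(N_REAL)]

# (b) broad variability: independent resistance per segment, so low draws
#     are offset by "bottleneck" segments elsewhere on the streamline
doses_indep = [dose(sum(random.lognormvariate(0.0, 1.0) for _ in range(N_SEG)))
               for _ in range(N_REAL)]

mean_corr = statistics.mean(doses_corr)
mean_indep = statistics.mean(doses_indep)
```

    Consistent with the abstract, the correlated treatment yields the higher mean dose, because fully correlated segments produce many more realizations in which the entire pathway is fast.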

  14. Predictive assessment of models for dynamic functional connectivity.

    Science.gov (United States)

    Nielsen, Søren F V; Schmidt, Mikkel N; Madsen, Kristoffer H; Mørup, Morten

    2018-05-01

    In neuroimaging, it has become evident that models of dynamic functional connectivity (dFC), which characterize how intrinsic brain organization changes over time, can provide a more detailed representation of brain function than traditional static analyses. Many dFC models in the literature represent functional brain networks as a meta-stable process with a discrete number of states; however, there is a lack of consensus on how to perform model selection and learn the number of states, as well as a lack of understanding of how different modeling assumptions influence the estimated state dynamics. To address these issues, we consider a predictive likelihood approach to model assessment, where models are evaluated based on their predictive performance on held-out test data. Examining several prominent models of dFC (in their probabilistic formulations), we demonstrate our framework on synthetic data and apply it to two real-world examples: a face recognition EEG experiment and resting-state fMRI. Our results show that both EEG and fMRI are better characterized using dynamic modeling approaches than by their static counterparts, but we also demonstrate that one must be cautious when interpreting dFC, because parameter settings and modeling assumptions, such as window lengths and emission models, can have a large impact on the estimated states and consequently on the interpretation of the brain dynamics.
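    The paper's probabilistic dFC models are too large for a snippet, but the held-out predictive-likelihood idea can be sketched: fit a static (single-Gaussian) model and a crude two-state model on training data from a synthetic state-switching signal, then compare their log-likelihoods on held-out test data. The generator, the sign-split fitting shortcut (standing in for EM) and all parameters are illustrative assumptions:

```python
import math, random

random.seed(3)

def log_gauss(x, mu, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def simulate(n, p_switch=0.05):
    """Synthetic meta-stable signal: a hidden state flips occasionally
    and sets the emission mean (a cartoon of dFC state switching)."""
    state, xs = 0, []
    for _ in range(n):
        if random.random() < p_switch:
            state = 1 - state
        xs.append(random.gauss((-1.0, 1.0)[state], 0.5))
    return xs

train, test = simulate(2000), simulate(2000)

# static model: one Gaussian fitted to the training data
mu = sum(train) / len(train)
var = sum((x - mu) ** 2 for x in train) / len(train)
static_ll = sum(log_gauss(x, mu, var) for x in test)

# two-state model, crudely fitted by splitting the training data by sign
comps = []
for part in ([x for x in train if x < 0], [x for x in train if x >= 0]):
    m = sum(part) / len(part)
    v = sum((x - m) ** 2 for x in part) / len(part)
    comps.append((len(part) / len(train), m, v))
dynamic_ll = sum(
    math.log(sum(w * math.exp(log_gauss(x, m, v)) for w, m, v in comps))
    for x in test)
```

    On this bimodal signal the two-state model attains the higher held-out likelihood, mirroring the paper's finding that dynamic models predict better than static ones; with a unimodal generator the ranking would flip, which is exactly the kind of assumption-dependence the authors warn about.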

  15. Agent Model Development for Assessing Climate-Induced Geopolitical Instability.

    Energy Technology Data Exchange (ETDEWEB)

    Boslough, Mark B.; Backus, George A.

    2005-12-01

    We present the initial stages of development of new agent-based computational methods to generate and test hypotheses about linkages between environmental change and international instability. This report summarizes the first year's effort of an originally proposed three-year Laboratory Directed Research and Development (LDRD) project. The preliminary work focused on a set of simple agent-based models and benefited from lessons learned in previous related projects and case studies of human response to climate change and environmental scarcity. Our approach was to define a qualitative model using extremely simple cellular agent models akin to Lovelock's Daisyworld and Schelling's segregation model. Such models do not require significant computing resources, and users can modify behavior rules to gain insights. One of the difficulties in agent-based modeling is finding the right balance between model simplicity and real-world representation. Our approach was to keep agent behaviors as simple as possible during the development stage (described herein) and to ground them with a realistic geospatial Earth system model in subsequent years. This work is directed toward incorporating projected climate data--including various CO2 scenarios from the Intergovernmental Panel on Climate Change (IPCC) Third Assessment Report--and ultimately toward coupling a useful agent-based model to a general circulation model.
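    As an illustration of the "extremely simple cellular agent" style the report starts from, here is a minimal Schelling-type segregation model; the grid size, vacancy fraction and tolerance threshold are arbitrary choices, not values from the report:

```python
import random

random.seed(0)
SIZE, EMPTY_FRAC, SIMILAR_WANTED = 20, 0.1, 0.5

def neighbours(grid, r, c):
    # Moore neighbourhood on a torus
    return [grid[(r + dr) % SIZE][(c + dc) % SIZE]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def similarity(grid, r, c):
    """Fraction of occupied neighbours sharing this agent's type."""
    occ = [x for x in neighbours(grid, r, c) if x is not None]
    return sum(x == grid[r][c] for x in occ) / len(occ) if occ else 1.0

def mean_similarity(grid):
    vals = [similarity(grid, r, c)
            for r in range(SIZE) for c in range(SIZE) if grid[r][c] is not None]
    return sum(vals) / len(vals)

# random initial grid of two agent types with some vacant cells
cells = [None if random.random() < EMPTY_FRAC else random.choice("AB")
         for _ in range(SIZE * SIZE)]
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

before = mean_similarity(grid)
for _ in range(50):                      # sweeps over the whole grid
    moved = False
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] is not None and similarity(grid, r, c) < SIMILAR_WANTED:
                empties = [(i, j) for i in range(SIZE) for j in range(SIZE)
                           if grid[i][j] is None]
                i, j = random.choice(empties)
                grid[i][j], grid[r][c] = grid[r][c], None   # relocate agent
                moved = True
    if not moved:                        # everyone is content
        break
after = mean_similarity(grid)
```

    Even with agents tolerating a 50/50 neighbourhood, the relocation rule drives mean neighbourhood similarity well above its initial value, the kind of emergent macro-behaviour such toy models are used to explore before grounding them in realistic geospatial data.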

  16. Assessing disease stress and modeling yield losses in alfalfa

    Science.gov (United States)

    Guan, Jie

    weight, percentage reflectance (810 nm), and green leaf area index (GLAI). Percentage reflectance (810 nm) assessments had a stronger relationship with dry weight and green leaf area index than percentage defoliation assessments. Our research conclusively demonstrates that percentage reflectance measurements can be used to nondestructively assess green leaf area index, which is a direct measure of plant health and an indirect measure of productivity. It also demonstrates that remote sensing is superior to the visual assessment method for assessing alfalfa stress and for modeling yield and GLAI in the alfalfa foliar disease pathosystem.

  17. Assessing the sustainability of biofuels: A logic-based model

    International Nuclear Information System (INIS)

    Gnansounou, Edgard

    2011-01-01

    Over the last decade, the production and consumption of biofuels increased rapidly worldwide, in an attempt to reduce GHG (greenhouse gas) emissions, diversify transportation fuels, promote renewable energy, and create or maintain employment, especially in rural areas and developing countries. Although policy instruments being currently implemented in industrialized regions focus on sustainable biofuels, the definition and assessment of sustainability remains a highly debated issue. Several countries have adopted compulsory targets or financial incentives for promoting biofuels, and only a few countries have accounted for sustainability certification schemes for those biofuels within their policy framework. In this paper, a logic-based model for assessing the sustainability of biofuels is presented. The model uses a hierarchical structure to link multiple factors from the more specific variables to the most general one, sustainability performance. The strengths and limitations of the model are discussed and the anticipated improvements are provided.
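    The paper's model is logic-based rather than a plain weighted sum, but its hierarchical linking of specific variables up to a single sustainability-performance score can be sketched as a recursive aggregation over a criteria tree. The weights, the tree shape and the indicator names below are hypothetical, for illustration only:

```python
def aggregate(node, scores):
    """Recursively aggregate leaf indicator scores (0-1) up a weighted tree."""
    if isinstance(node, str):          # leaf: look up a measured indicator
        return scores[node]
    return sum(w * aggregate(child, scores) for w, child in node)

# hypothetical hierarchy: (weight, subtree-or-indicator) pairs per level
sustainability = [
    (0.4, [(0.6, "ghg_reduction"), (0.4, "land_use")]),         # environmental
    (0.3, [(0.5, "production_cost"), (0.5, "energy_balance")]), # economic
    (0.3, [(1.0, "rural_employment")]),                         # social
]

scores = {"ghg_reduction": 0.7, "land_use": 0.5, "production_cost": 0.6,
          "energy_balance": 0.8, "rural_employment": 0.9}
performance = aggregate(sustainability, scores)
```

    A logic-based variant would replace the weighted sum with inference rules at each node, but the bottom-up flow from specific variables to the most general one is the same.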

  18. Probabilistic Rotor Life Assessment Using Reduced Order Models

    Directory of Open Access Journals (Sweden)

    Brian K. Beachkofski

    2009-01-01

    Probabilistic failure assessments for integrally bladed disks are system reliability problems where a failure in at least one blade constitutes a rotor system failure. Turbine engine fan and compressor blade life is dominated by High Cycle Fatigue (HCF) initiated either by pure HCF or Foreign Object Damage (FOD). To date, performing an HCF life assessment for the entire rotor system has been too costly in analysis time to be practical. Although the substantial run-time has previously precluded a full-rotor probabilistic analysis, reduced order models make this process tractable, as demonstrated in this work. The system model includes frequency prediction, modal stress variation, mistuning amplification, FOD effect, and random material capability. The model has many random variables, which are most easily handled through simple random sampling.
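    The reduced-order engine models themselves are not reproducible here, but the system-reliability structure ("a failure in at least one blade constitutes a rotor system failure") and the simple random sampling are easy to sketch. The stress and material-capability distributions are illustrative assumptions:

```python
import random

random.seed(7)
N_BLADES, N_TRIALS = 24, 100_000

def blade_fails():
    """Toy limit state: failure when random vibratory stress exceeds
    random material capability (both lognormal, illustrative parameters)."""
    stress = random.lognormvariate(0.0, 0.25)
    capability = random.lognormvariate(1.0, 0.25)
    return stress > capability

# weakest-link system: the rotor fails if at least one blade fails
rotor_failures = sum(
    any(blade_fails() for _ in range(N_BLADES)) for _ in range(N_TRIALS))
p_rotor = rotor_failures / N_TRIALS

# per-blade probability for comparison
p_blade = sum(blade_fails() for _ in range(N_TRIALS)) / N_TRIALS
```

    At small per-blade probabilities the weakest-link effect makes the rotor-level probability roughly N_BLADES times the per-blade one, which is why a full-rotor assessment differs materially from a single-blade analysis.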

  19. Accuracy assessment of global barotropic ocean tide models

    DEFF Research Database (Denmark)

    Stammer, D.; Ray, R. D.; Andersen, Ole Baltazar

    2014-01-01

    The accuracy of state-of-the-art global barotropic tide models is assessed using bottom pressure data, coastal tide gauges, satellite altimetry, various geodetic data on Antarctic ice shelves, and independently tracked satellite orbit perturbations. Tide models under review include empirical, purely … shallow-water regions and also in the deep ocean. The root-sum-square differences between tide observations and the best models for eight major constituents are approximately 0.9, 5.0, and 6.5 cm for pelagic, shelf, and coastal conditions, respectively. Large intermodel discrepancies occur in high latitudes …, but testing in those regions is impeded by the paucity of high-quality in situ tide records. Long-wavelength components of models tested by analyzing satellite laser ranging measurements suggest that several models are comparably accurate for use in precise orbit determination, but analyses of GRACE …
  20. Assessing uncertainty in SRTM elevations for global flood modelling

    Science.gov (United States)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial for constraining uncertainty about elevations and assessing the impact of these errors on flood prediction. An assessment of SRTM error was carried out by Rodriguez et al. (2006), but it did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical SRTM errors in the landscape that matters most to flood models: the floodplain. This study therefore attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for three deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90 m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above the native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using the LISFLOOD-FP hydrodynamic model, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product is an inundation map giving the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
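    The study's error covariance is estimated empirically against LIDAR; as a hedged stand-in, the sketch below draws spatially correlated vertical-error realisations along a one-dimensional transect using an AR(1) chain. The error standard deviation, the correlation coefficient and the toy transect are illustrative assumptions, not values from the paper:

```python
import math, random

random.seed(0)

def perturbed_dem(dem, sigma, rho):
    """Add one spatially correlated error realisation to a DEM transect:
    e[i] = rho * e[i-1] + sqrt(1 - rho^2) * N(0, sigma), which keeps the
    marginal error standard deviation at sigma for every cell."""
    e = [random.gauss(0.0, sigma)]
    for _ in range(1, len(dem)):
        e.append(rho * e[-1] + math.sqrt(1 - rho ** 2) * random.gauss(0.0, sigma))
    return [z + err for z, err in zip(dem, e)]

dem = [2.0 + 3.0 * i / 49 for i in range(50)]        # toy 50-cell transect (m)
catalogue = [perturbed_dem(dem, sigma=1.5, rho=0.9) for _ in range(200)]
```

    Each member of the catalogue is one plausible DEM; running the flood model over the whole catalogue is what yields the per-pixel flooding probabilities described in the abstract.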

  1. The Assessment of Patient Clinical Outcome: Advantages, Models, Features of an Ideal Model

    Directory of Open Access Journals (Sweden)

    Mou’ath Hourani

    2016-06-01

    Background: The assessment of patient clinical outcome focuses on measuring various aspects of the health status of a patient who is under healthcare intervention. Patient clinical outcome assessment is a very significant process in the clinical field, as it allows health care professionals to better understand the effectiveness of their health care programs and thus to enhance health care quality in general. It is thus vital that a high-quality, informative review of current issues regarding the assessment of patient clinical outcome be conducted. Aims & Objectives: This paper (1) summarizes the advantages of the assessment of patient clinical outcome; (2) reviews some of the existing patient clinical outcome assessment models, namely simulation, Markov, Bayesian belief network, Bayesian statistics and conventional statistics, and Kaplan-Meier analysis models; and (3) demonstrates the desired features that should be fulfilled by a well-established, ideal patient clinical outcome assessment model. Material & Methods: An integrative review of the literature was performed using Google Scholar to explore the field of patient clinical outcome assessment. Conclusion: This paper will directly support researchers, clinicians and health care professionals in their understanding of developments in the domain of the assessment of patient clinical outcome, thus enabling them to propose ideal assessment models.
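    Of the models the review lists, Kaplan-Meier analysis is compact enough to sketch; here is a minimal product-limit estimator over hypothetical follow-up data (the times and censoring flags are invented for illustration):

```python
def kaplan_meier(times, events):
    """Product-limit survival curve. events[i] is True when an event was
    observed at times[i] and False when the record is right-censored."""
    data = sorted(zip(times, events))
    at_risk, s, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        ties = sum(1 for tt, _ in data[i:] if tt == t)      # records at t
        deaths = sum(1 for tt, e in data[i:] if tt == t and e)
        if deaths:                       # censored times only shrink the risk set
            s *= (at_risk - deaths) / at_risk
            curve.append((t, s))
        at_risk -= ties
        i += ties
    return curve

# hypothetical follow-up: months to event, with two right-censored patients
curve = kaplan_meier([3, 5, 5, 8, 12, 16],
                     [True, True, False, True, False, True])
```

    The resulting step curve is the nonparametric survival estimate clinicians read off when comparing interventions.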

  2. Avian collision risk models for wind energy impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    Masden, E.A., E-mail: elizabeth.masden@uhi.ac.uk [Environmental Research Institute, North Highland College-UHI, University of the Highlands and Islands, Ormlie Road, Thurso, Caithness KW14 7EE (United Kingdom); Cook, A.S.C.P. [British Trust for Ornithology, The Nunnery, Thetford IP24 2PU (United Kingdom)

    2016-01-15

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components (e.g. the tower), angle of approach and uncertainty. Ten models were cited in the literature; all included the probability of a single bird colliding with a wind turbine during passage through the rotor-swept area, and the majority included a measure of the number of birds at risk. Seven of the ten models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision, and these were used by others. Six of the ten models were deterministic and included the most frequently used models in the UK, with only four including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be 'data hungry', as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only four of the models included variability or uncertainty. • Collision risk models have limitations and can be 'data hungry'. • It is vital that the most appropriate model is used for a given task.
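    The skeleton shared by the reviewed CRMs, birds passing through the rotor-swept area times a single-transit collision probability times the proportion not avoiding the turbine, can be written down directly. The numbers below are purely illustrative; real models derive the transit collision probability from rotor geometry, blade speed and bird morphology:

```python
def expected_collisions(n_transits, p_collision, avoidance_rate):
    """Deterministic CRM skeleton: birds at risk x probability that one
    passage through the rotor-swept area ends in collision x the
    proportion of passages not avoided."""
    return n_transits * p_collision * (1.0 - avoidance_rate)

# illustrative inputs: 10,000 rotor transits per year, 8% single-transit
# collision probability, 98% avoidance
annual_collisions = expected_collisions(10_000, 0.08, 0.98)
```

    The abstract's point about uncertainty amounts to replacing these constants with sampled distributions, as only the most recent (Bayesian) models do.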

  3. Radionuclide release rates from spent fuel for performance assessment modeling

    International Nuclear Information System (INIS)

    Curtis, D.B.

    1994-01-01

    In a scenario of aqueous transport from a high-level radioactive waste repository, the concentration of radionuclides in water in contact with the waste constitutes the source term for transport models, and as such represents a fundamental component of all performance assessment models. Many laboratory experiments have been done to characterize release rates and understand the processes influencing radionuclide release rates from irradiated nuclear fuel. Natural analogues of these waste forms have been studied to obtain information regarding the long-term stability of potential waste forms in complex natural systems. This information from diverse sources must be brought together to develop and defend the methods used to define source terms for performance assessment models. In this manuscript, examples of measures of radionuclide release rates from spent nuclear fuel or analogues of nuclear fuel are presented. Each example represents a very different approach to obtaining a numerical measure, and each has its limitations. There is no way to obtain an unambiguous measure of this or any parameter used in performance assessment codes for evaluating the effects of processes operative over many millennia. The examples are intended to suggest that, in the absence of the ability to evaluate accuracy and precision, the consistency of a broadly based set of data can be used as circumstantial evidence to defend the choice of parameters used in performance assessments.

  4. Modeling risk assessment for nuclear processing plants with LAVA

    International Nuclear Information System (INIS)

    Smith, S.T.; Tisinger, R.M.

    1988-01-01

    Using the Los Alamos Vulnerability and Risk Assessment (LAVA) methodology, the authors developed a model for assessing risks associated with nuclear processing plants. LAVA is a three-part systematic approach to risk assessment. The first part is the mathematical methodology; the second is the general personal-computer-based software engine; and the third is the application itself. The methodology provides a framework for creating applications for the software engine to operate upon; all application-specific information is data. Using LAVA, the authors build knowledge-based expert systems to assess risks in application systems comprising a subject system and a safeguards system. The subject system is modeled as sets of threats, assets, and undesirable outcomes. The safeguards system is modeled as sets of safeguards functions for protecting the assets from the threats by preventing or ameliorating the undesirable outcomes; sets of safeguards subfunctions, whose performance determines whether a function is adequate and complete; and sets of issues, presented as interactive questionnaires, whose measures (in both monetary and linguistic terms) define both the weaknesses in the safeguards system and the potential costs of an undesirable outcome occurring.

  5. Training courses on integrated safety assessment modelling for waste repositories

    International Nuclear Information System (INIS)

    Mallants, D.

    2007-01-01

    Near-surface and deep repositories for radioactive waste are being developed and evaluated all over the world. Existing repositories for low- and intermediate-level waste also often need to be re-evaluated to extend their license or to obtain permission for final closure. The evaluation encompasses both technical feasibility and a safety analysis. Long-term safety is usually demonstrated by means of performance or safety assessment. For this purpose, computer models are used that calculate the migration of radionuclides from the conditioned radioactive waste, through the engineered barriers, to the environment (groundwater, surface water, and biosphere). Integrated safety assessment modelling addresses all relevant radionuclide pathways from source to receptor (man), using in combination various computer codes in which the most relevant physical, chemical, mechanical, or even microbiological processes are mathematically described. SCK-CEN organizes training courses in integrated safety assessment modelling intended for individuals who have a controlling or supervising role within the national radwaste agencies or regulatory authorities, or for technical experts who carry out the actual post-closure safety assessment for an existing or new repository. Courses are organised by the Department of Waste and Disposal

  6. AgMIP: Next Generation Models and Assessments

    Science.gov (United States)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6

  7. The role of computer modelling in structural integrity assessment

    International Nuclear Information System (INIS)

    Sauve, R.G.

    2002-01-01

    There is little doubt that computer technology has spawned extraordinary advances in the traditional fields of science and engineering, along with the introduction of new disciplines and technologies. In particular, significant developments directly related to modern computer technology that have had a profound impact on the field of structural integrity include: computational methods (probabilistic, parametric, data analysis); the finite element technique; and computer-aided design and engineering. In fact, it can be argued that these developments have re-defined and expanded the role of structural integrity assessments by providing comprehensive modelling capabilities to the designers and engineers involved in failure analyses. As computer processing speeds and capacity have increased, so has the role of computer modelling in assessments of component structural integrity. While innovation in these fields has been packaged into the various CAE software used by the engineering community, the advantages of simulation have only just begun to be realized. With product development cycles shrinking with a view to improving time-to-market, the role of initial testing is being reduced in favour of computer modelling and simulation to assess component life and durability. For ageing structures, the evaluation of remaining life and the impact of degraded structural integrity becomes tractable with state-of-the-art computational methods. Needless to say, for complex structures, computer modelling coupled with testing provides a robust method that can avoid costly and sometimes fatal errors in design. Computer modelling brings together a number of disciplines, including numerical techniques such as the finite element method, fracture mechanics, continuum mechanics, dynamics, heat transfer, structural reliability and probabilistic methods. One of the salient features of current methods is the ability to handle large, complex steady-state or transient dynamic problems that

  8. Comparative flood damage model assessment: towards a European approach

    Science.gov (United States)

    Jongman, B.; Kreibich, H.; Apel, H.; Barredo, J. I.; Bates, P. D.; Feyen, L.; Gericke, A.; Neal, J.; Aerts, J. C. J. H.; Ward, P. J.

    2012-12-01

    There is a wide variety of flood damage models in use internationally, differing substantially in their approaches and economic estimates. Since these models are increasingly used as a basis for investment and planning decisions on an increasingly large scale, there is a need to reduce the uncertainties involved and develop a harmonised European approach, in particular with respect to the EU Flood Risks Directive. In this paper we present a qualitative and quantitative assessment of seven flood damage models, using two case studies of past flood events in Germany and the United Kingdom. The qualitative analysis shows that modelling approaches vary strongly, and that current methodologies for estimating infrastructural damage are not as well developed as methodologies for the estimation of damage to buildings. The quantitative results show that the model outcomes are very sensitive to uncertainty in both vulnerability (i.e. depth-damage functions) and exposure (i.e. asset values), with the former having a larger effect than the latter. We conclude that care needs to be taken when using aggregated land use data for flood risk assessment, and that it is essential to adjust asset values to the regional economic situation and property characteristics. We call for the development of a flexible but consistent European framework that applies best practice from existing models while providing room for including necessary regional adjustments.
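
    The core calculation shared by the damage models compared above can be sketched as a depth-damage function applied to exposed asset values. The curve and asset values below are invented for illustration; they are not taken from any of the seven models in the study.

```python
# Hypothetical illustration of a depth-damage flood model: damage at a
# location is a damage fraction (from a depth-damage curve) times the
# exposed asset value. The curve and values are invented.

def damage_fraction(depth_m, curve):
    """Piecewise-linear interpolation of a depth-damage curve.
    `curve` is a sorted list of (depth_m, fraction) points."""
    if depth_m <= curve[0][0]:
        return curve[0][1]
    if depth_m >= curve[-1][0]:
        return curve[-1][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if d0 <= depth_m <= d1:
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)

# Invented residential curve: no damage when dry, total loss at 5 m depth.
curve = [(0.0, 0.0), (0.5, 0.2), (1.0, 0.4), (2.0, 0.6), (5.0, 1.0)]

def expected_damage(depths, asset_value, curve):
    """Total damage over several buildings of equal asset value."""
    return sum(damage_fraction(d, curve) * asset_value for d in depths)

depths = [0.3, 0.8, 1.5]  # flood depths at three buildings (m)
print(expected_damage(depths, 200_000, curve))
```

    The sensitivity result in the abstract corresponds to perturbing `curve` (vulnerability) versus `asset_value` (exposure) and comparing the spread in the output.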

  9. Operational ocean models in the Adriatic Sea: a skill assessment

    Directory of Open Access Journals (Sweden)

    J. Chiggiato

    2008-02-01

    Full Text Available In the framework of the Mediterranean Forecasting System (MFS) project, the performance of regional numerical ocean forecasting systems is assessed by means of model-model and model-data comparison. The three operational systems considered in this study are: the Adriatic REGional Model (AREG); the Adriatic Regional Ocean Modelling System (AdriaROMS); and the Mediterranean Forecasting System General Circulation Model (MFS-GCM). AREG and AdriaROMS are regional implementations (with some dedicated variations) of POM and ROMS, respectively, while MFS-GCM is an OPA-based system. The assessment is done through standard scores. In situ and remote sensing data are used to evaluate system performance. In particular, a set of CTD measurements collected across the whole western Adriatic during January 2006 and one year of satellite-derived sea surface temperature (SST) measurements allow us to assess a full three-dimensional picture of the quality of the operational forecasting systems during January 2006 and to draw some preliminary conclusions on the temporal fluctuation of scores estimated on surface quantities between summer 2005 and summer 2006.

    The regional systems share a negative bias in simulated temperature and salinity. Nonetheless, they outperform the MFS-GCM at the shallowest locations. Results on amplitude and phase errors are improved in areas shallower than 50 m, while degraded in deeper locations, where the major model deficiencies are related to overestimation of vertical mixing. In a basin-wide overview, the two regional models show differences in the local displacement of errors. In addition, in locations where the regional models are mutually correlated, the aggregated mean squared error was found to be smaller, which is a useful outcome of having several operational systems in the same region.
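
    The "standard scores" mentioned above, and the observation that aggregating two operational systems can reduce the mean squared error, can be sketched as follows. The SST-like numbers are synthetic, chosen only so that both "models" carry a cold bias as the regional systems in the study do; they are not the MFS data.

```python
import math

def bias(model, obs):
    """Mean model-minus-observation error."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root mean squared error."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# Synthetic SST-like series (deg C); both models are biased cold.
obs   = [14.0, 15.0, 16.0, 17.0]
mod_a = [13.5, 14.6, 15.8, 16.4]
mod_b = [13.8, 14.4, 15.5, 16.8]
ens   = [(a + b) / 2 for a, b in zip(mod_a, mod_b)]  # two-model mean

print(bias(mod_a, obs), bias(mod_b, obs))
print(rmse(mod_a, obs), rmse(mod_b, obs), rmse(ens, obs))
```

    With these numbers the two-model mean scores a lower RMSE than either system alone, the aggregation effect noted in the abstract.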

  10. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.

    2014-09-16

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.
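
    A minimal pure-Python sketch of the pivotal idea (our paraphrase, not the paper's full method): for a zero-mean Gaussian field y ~ N(0, Σ(θ)), the quadratic form yᵀΣ(θ)⁻¹y is exactly chi-square with n degrees of freedom at the true θ, so evaluating it at posterior draws of θ gives discrepancies with a known nominal distribution. The exponential covariance, sites, and parameter values below are invented.

```python
import math
import random

def cholesky(A):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

def solve_lower(L, b):
    """Forward substitution: solve L x = b."""
    x = []
    for i in range(len(b)):
        x.append((b[i] - sum(L[i][k] * x[k] for k in range(i))) / L[i][i])
    return x

def pivotal_discrepancy(y, cov):
    """y' cov^{-1} y: chi-square with len(y) dof when y ~ N(0, cov)."""
    z = solve_lower(cholesky(cov), y)  # whitened residuals
    return sum(v * v for v in z)

def exp_cov(sites, sigma2, rho):
    """Invented exponential covariance on 1-D locations."""
    return [[sigma2 * math.exp(-abs(s - t) / rho) for t in sites] for s in sites]

random.seed(1)
sites = [0.0, 0.4, 1.1, 2.0, 3.2]
L = cholesky(exp_cov(sites, 1.0, 1.0))
zs = [random.gauss(0, 1) for _ in sites]
y = [sum(L[i][k] * zs[k] for k in range(i + 1)) for i in range(len(sites))]

# At the true parameters the discrepancy reduces to a chi-square(5) draw;
# in a Bayesian workflow one would evaluate it at each posterior draw of theta.
print(pivotal_discrepancy(y, exp_cov(sites, 1.0, 1.0)))
```

    Because y was built as y = Lz, the discrepancy at the true parameters recovers the sum of squared standard normals, illustrating why its nominal distribution is known.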

  11. Cost Model for Risk Assessment of Company Operation in Audit

    Directory of Open Access Journals (Sweden)

    S. V.

    2017-12-01

    Full Text Available This article explores an approach to assessing the risk of termination of company operation by building a cost model. This model gives auditors information on managers' understanding of the factors influencing change in the value of assets and liabilities, and on methods to identify such change in more effective and reliable ways. Based on this information, the auditor can assess the adequacy of management personnel's use of the assumption of continuity of company operation when preparing financial statements. Financial uncertainty entails real manifestations of factors creating risks of costs and revenue losses, which in the long run can be a reason for termination of company operation and therefore need to be foreseen in the auditor's assessment of the adequacy of use of the continuity assumption in the financial statements prepared by company management. The purpose of the study is to explore and develop a methodology for using cost models to assess the risk of termination of company operation in an audit. The methodology for assessing audit risk by analyzing methods of company valuation has not previously been dealt with. A review of methodologies for assessing the risks of termination of company operation in the course of an audit gives grounds for the conclusion that cost models can be an effective means of identifying and assessing such risks. The analysis of the above methods gives an understanding of the existing system for company valuation, integrated into the management system, and of the consequences of its use, i.e. comparison of asset price data with accounting data and market-value data. Overvalued or undervalued company assets may be a sign of future sale or liquidation of a company, which may signal a high probability of termination of company operation. A wrong choice or application of valuation methods can be indicative of the risk of non

  12. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  13. A multi-model assessment of terrestrial biosphere model data needs

    Science.gov (United States)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight into the data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of Ecosystem Demography model v2 (ED) outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis studied only one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and to a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial
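
    The final step described above, combining parameter uncertainties with model sensitivities into fractional contributions, can be sketched with a first-order variance decomposition. This is our paraphrase of the general technique, not PEcAn's actual code, and the trait names and numbers are invented.

```python
# Sketch of a first-order variance decomposition: the output variance
# contributed by parameter i is approximated as (dF/dtheta_i)^2 * Var(theta_i),
# then normalised so the fractions sum to one.

def fractional_contributions(sensitivities, variances):
    partial = [s * s * v for s, v in zip(sensitivities, variances)]
    total = sum(partial)
    return [p / total for p in partial]

# Invented example: sensitivity of GPP to three traits at the parameter median,
# with posterior variances from a (hypothetical) trait meta-analysis.
sens = [2.0, 0.5, 1.0]   # dGPP/dtheta_i
var  = [0.1, 4.0, 0.25]  # Var(theta_i)
print(fractional_contributions(sens, var))
```

    Note how the second parameter dominates despite its small sensitivity, because its posterior variance is large; this is the kind of prioritisation result the analysis produces per variable, site, and timescale.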

  14. Modeling and Representing National Climate Assessment Information using Linked Data

    Science.gov (United States)

    Zheng, J.; Tilmes, C.; Smith, A.; Zednik, S.; Fox, P. A.

    2012-12-01

    Every four years, earth scientists work together on a National Climate Assessment (NCA) report, which integrates, evaluates, and interprets the findings of climate change and its impacts on affected industries such as agriculture, the natural environment, and energy production and use. Given the amount of information presented in each report, and the wide range of information sources and topics, it can be difficult for users to find and identify desired information. To ease information discovery, well-structured metadata is needed that describes the report's key statements and conclusions and provides traceable provenance of the data sources used. We present an assessment ontology developed to describe the terms, concepts and relations required for the NCA metadata. Wherever possible, the assessment ontology reuses terms from well-known ontologies such as the Semantic Web for Earth and Environmental Terminology (SWEET) ontology and the Dublin Core (DC) vocabulary. We have generated sample National Climate Assessment metadata conforming to our assessment ontology and publicly exposed it via a SPARQL endpoint and website. We have also modeled provenance information for the NCA writing activities using the W3C recommendation-candidate PROV-O ontology. Using this provenance, users will be able to trace the sources of information used in the assessment and therefore make trust decisions. In the future, we plan to implement a faceted browser over the metadata to enhance metadata traversal and information discovery.

  15. Plasma-safety assessment model and safety analyses of ITER

    International Nuclear Information System (INIS)

    Honda, T.; Okazaki, T.; Bartels, H.-H.; Uckan, N.A.; Sugihara, M.; Seki, Y.

    2001-01-01

    A plasma-safety assessment model has been provided on the basis of the plasma physics database of the International Thermonuclear Experimental Reactor (ITER) to analyze events including plasma behavior. The model was implemented in a safety analysis code (SAFALY), which consists of a 0-D dynamic plasma model and a 1-D thermal behavior model of the in-vessel components. Unusual plasma events of ITER, e.g., overfueling, were calculated using the code and plasma burning is found to be self-bounded by operation limits or passively shut down due to impurity ingress from overheated divertor targets. Sudden transition of divertor plasma might lead to failure of the divertor target because of a sharp increase of the heat flux. However, the effects of the aggravating failure can be safely handled by the confinement boundaries. (author)
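
    The passive-shutdown behaviour described above can be illustrated with a toy 0-D power balance, loosely in the spirit of a 0-D burning-plasma model but not the SAFALY code: dW/dt = P_heat - P_rad - W/tau_E, where impurity ingress raises the radiated power P_rad and quenches the stored energy W. All values are invented round numbers.

```python
# Toy 0-D plasma power balance (illustrative only, not SAFALY):
#   dW/dt = P_heat - P_rad - W / tau_E
# Impurity ingress raises P_rad, so the stored energy passively decays
# toward a lower steady state tau_E * (P_heat - P_rad).

def evolve(W0, p_heat, p_rad, tau_e, dt=0.01, steps=2000):
    """Forward-Euler integration of the stored energy W over steps*dt seconds."""
    W = W0
    for _ in range(steps):
        W += dt * (p_heat - p_rad - W / tau_e)
    return W

# Units are nominal: W in MJ, powers in MW, tau_E in s.
W_clean    = evolve(W0=300.0, p_heat=120.0, p_rad=20.0, tau_e=3.0)
W_impurity = evolve(W0=300.0, p_heat=120.0, p_rad=90.0, tau_e=3.0)
print(W_clean, W_impurity)
```

    With clean conditions the plasma sits at its steady state; with the higher radiated power the stored energy decays toward the much lower steady state, the passive shutdown the abstract describes.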

  16. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    International Nuclear Information System (INIS)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-01-01

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  17. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-08-31

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  18. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    Full Text Available We describe the results of a multi-year effort to strengthen consideration of the human dimension in endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing the diverse bodies of expertise and information that are necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach, where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and biodiversity risk assessment in particular.
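
    The meta-modeling pattern described above, independent models exchanging a shared state structure each time step, can be sketched minimally as follows. The two "models" and all rates here are invented toys standing in for full disease and population models.

```python
# Minimal illustration of the meta-model idea: two self-contained models
# (population growth and disease mortality) run independently but exchange
# a small shared state dictionary each time step. All rates are invented.

def population_step(state):
    state["N"] += 0.1 * state["N"]  # 10% annual growth
    return state

def disease_step(state):
    if state["outbreak"]:
        state["N"] *= 0.7           # 30% mortality in outbreak years
    return state

def metamodel(years, outbreak_years):
    state = {"N": 1000.0, "outbreak": False}
    for year in range(years):
        state["outbreak"] = year in outbreak_years
        state = population_step(state)  # model 1 updates the shared state
        state = disease_step(state)     # model 2 reacts to the same state
    return state["N"]

print(metamodel(5, outbreak_years={2}))
```

    Each component could be replaced by a full simulation program; only the passed state structure couples them, which is what lets existing discipline-specific models be linked without rewriting them.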

  19. Empirical assessment of a threshold model for sylvatic plague

    DEFF Research Database (Denmark)

    Davis, Stephen; Leirs, Herwig; Viljugrein, H.

    2007-01-01

    Plague surveillance programmes established in Kazakhstan, Central Asia, during the previous century, have generated large plague archives that have been used to parameterize an abundance threshold model for sylvatic plague in great gerbil (Rhombomys opimus) populations. Here, we assess the model...... examine six hypotheses that could explain the resulting false positive predictions, namely (i) including end-of-outbreak data erroneously lowers the estimated threshold, (ii) too few gerbils were tested, (iii) plague becomes locally extinct, (iv) the abundance of fleas was too low, (v) the climate...

  20. Connecting single-stock assessment models through correlated survival

    DEFF Research Database (Denmark)

    Albertsen, Christoffer Moesgaard; Nielsen, Anders; Thygesen, Uffe Høgsbro

    2017-01-01

    the corresponding partial correlations. We consider six models where the partial correlation matrix between stocks follows a band structure ranging from independent assessments to complex correlation structures. Further, a simulation study illustrates the importance of handling correlated data sufficiently...... times. We propose a simple alternative. In three case studies each with two stocks, we improve the single-stock models, as measured by Akaike information criterion, by adding correlation in the cohort survival. To limit the number of parameters, the correlations are parameterized through...

  1. Radiological assessments of land disposal options: recent model developments

    International Nuclear Information System (INIS)

    Fearn, H.S.; Pinner, A.V.; Hemming, C.R.

    1984-10-01

    This report describes progress in the development of methodologies and models for assessing the radiological impact of the disposal of low and intermediate level wastes by (i) shallow land burial in simple trenches (land 1), (ii) shallow land burial in engineered facilities (land 2), and (iii) emplacement in mined repositories or existing cavities (land 3/4). In particular, the report describes wasteform leaching models for unconditioned and cemented waste, the role of the engineered barriers of a shallow land burial facility in reducing the magnitude of doses arising from groundwater contact, and a detailed consideration of the interactions between radioactive carbon and various media. (author)

  2. Adding Value to Ecological Risk Assessment with Population Modeling

    DEFF Research Database (Denmark)

    Forbes, Valery E.; Calow, Peter; Grimm, Volker

    2011-01-01

    Current measures used to estimate the risks of toxic chemicals are not relevant to the goals of the environmental protection process, and thus ecological risk assessment (ERA) is not used as extensively as it should be as a basis for cost-effective management of environmental resources. Appropriate...... population models can provide a powerful basis for expressing ecological risks that better inform the environmental management process and thus that are more likely to be used by managers. Here we provide at least five reasons why population modeling should play an important role in bridging the gap between...

  3. Model of reliability assessment in ultrasonic nondestructive inspection

    International Nuclear Information System (INIS)

    Park, Ik Keun; Park, Un Su; Kim, Hyun Mook; Park, Yoon Won; Kang, Suk Chull; Choi, Young Hwan; Lee, Jin Ho

    2001-01-01

    An ultrasonic inspection system consists of the operator, the equipment, and the procedure, and the reliability of inspection results depends on each of these. Furthermore, the reliability of nondestructive testing is influenced by the inspection environment, the materials involved, and the types of defect. It is therefore very difficult to estimate the reliability of NDT, owing to these various factors. In this study, the probability of detection, estimated using a logistic probability model and Monte Carlo simulation, was used to quantify the reliability of ultrasonic inspection. The utility of the NDT reliability assessment is verified by analysing round-robin test data with these models.
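
    A logistic probability-of-detection (POD) model of the kind mentioned above can be sketched as follows; the coefficients b0 and b1 are invented for illustration and would in practice be fitted to hit/miss inspection data.

```python
import math

# Hedged sketch of a logistic POD curve used in NDT reliability studies:
#   POD(a) = 1 / (1 + exp(-(b0 + b1 * ln a)))
# where a is flaw size. The coefficients here are invented, not fitted.

def pod(flaw_size_mm, b0=-4.0, b1=3.0):
    x = b0 + b1 * math.log(flaw_size_mm)
    return 1.0 / (1.0 + math.exp(-x))

def a90(b0=-4.0, b1=3.0):
    """Flaw size detected with 90% probability (a common NDT figure of merit)."""
    return math.exp((math.log(9.0) - b0) / b1)

print(pod(1.0), pod(5.0), a90())
```

    The `a90` value inverts the curve at POD = 0.9, since logit(0.9) = ln 9; larger flaws have a higher detection probability, as the monotone curve guarantees.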

  4. Exploring harmonization between integrated assessment and capacity expansion models

    Science.gov (United States)

    Iyer, G.; Brown, M.; Cohen, S.; Macknick, J.; Patel, P.; Wise, M. A.; Horing, J.

    2017-12-01

    Forward-looking quantitative models of the electric sector are extensively used to provide science-based strategic decision support to national, international and private-sector entities. Given that these models are used to inform a wide range of stakeholders and influence policy decisions, it is vital to examine how the models' underlying data and structure influence their outcomes. We conduct several experiments harmonizing key model characteristics between ReEDS, an electric-sector-only model, and GCAM, an integrated assessment model, to understand how different degrees of harmonization impact model outcomes. ReEDS has high spatial, temporal, and process detail but lacks electricity demand elasticity and endogenous representations of other economic sectors, while GCAM has internally consistent representations of energy (including the electric sector), agriculture, and land-use systems but relatively aggregate representations of the factors influencing electric sector investments. We vary the degree of harmonization in electricity demand, fuel prices, technology costs and performance, and variable renewable energy resource characteristics. We then identify the prominent sources of divergence in key outputs (electricity capacity, generation, and price) across the models and study how convergence between the models can be improved with permutations of harmonized characteristics. The remaining inconsistencies help to establish how differences in the models' underlying data, construction, perspective, and methodology play into each model's outcome. There are three broad contributions of this work. First, our study provides a framework to link models with similar scope but different resolutions. Second, our work provides insight into how the harmonization of assumptions contributes to a unified and robust portrayal of the US electricity sector under various potential futures. Finally, our study enhances the understanding of the influence of structural uncertainty

  5. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically sounder methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which restrict its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  6. Regional Persistent Organic Pollutants' Environmental Impact Assessment and Control Model

    Directory of Open Access Journals (Sweden)

    Jurgis Staniskis

    2008-10-01

    Full Text Available The sources of formation, environmental distribution and fate of persistent organic pollutants (POPs) are increasingly seen as topics to be addressed and solved at the global scale. Accordingly, there are already two international agreements concerning persistent organic pollutants: the 1998 Protocol on Persistent Organic Pollutants to the 1979 Convention on Long-Range Transboundary Air Pollution (Aarhus Protocol), and the Stockholm Convention on Persistent Organic Pollutants. For the assessment of environmental pollution by POPs, for risk assessment, and for the evaluation of new pollutants as potential candidates for inclusion in the POPs lists of the Stockholm Convention and/or the Aarhus Protocol, a set of different models has been developed or is under development. Multimedia models help describe and understand the environmental processes leading to global contamination by POPs and the actual risk to the environment and human health. However, there is still a lack of tools based on a systematic and integrated approach to the difficulties of POPs management in the region.

  7. Sensitivity of Coastal Flood Risk Assessments to Digital Elevation Models

    Directory of Open Access Journals (Sweden)

    Bas van de Sande

    2012-07-01

    Full Text Available Most coastal flood risk studies make use of a Digital Elevation Model (DEM) in addition to a projected flood water level in order to estimate the flood inundation and the associated damages to property and livelihoods. The resolution and accuracy of a DEM are critical in a flood risk assessment, as land elevation largely determines whether a location will be flooded or will remain dry during a flood event. Especially in low-lying deltaic areas, the land elevation variation is usually in the order of only a few decimeters, and an offset of several decimeters in the elevation data has a significant impact on the accuracy of the risk assessment. Publicly available DEMs are often used in studies for coastal flood risk assessments. The accuracy of these datasets is relatively low, in the order of meters, and is especially low in comparison to the level of accuracy required for a flood risk assessment in a deltaic area. For a coastal zone area in Nigeria (Lagos State), an accurate LiDAR DEM dataset was adopted as ground truth for terrain elevation. In the case study, the LiDAR DEM was compared to various publicly available DEMs, and the coastal flood risk assessment using the publicly available DEMs was compared to one using the LiDAR DEM. It can be concluded that the publicly available DEMs do not meet the accuracy requirements of coastal flood risk assessments, especially in coastal and deltaic areas. For this particular case study, the publicly available DEMs greatly overestimated the land elevation Z-values and thereby underestimated the coastal flood risk for the Lagos State area. These findings are of interest when selecting datasets for coastal flood risk assessments in low-lying deltaic areas.
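
    The mechanism behind the underestimation, a positive vertical bias in the DEM shrinking the mapped flood extent, can be sketched in a few lines. The elevations, bias, and flood level below are invented; the study's actual DEM offsets were in the order of meters, as the abstract notes.

```python
# Illustration of DEM sensitivity in coastal flood mapping: a cell floods
# when its elevation is below the projected flood level. A (hypothetical)
# positive elevation bias, as in coarse public DEMs, shrinks the mapped
# flood extent and hence underestimates the risk.

def flooded_cells(dem, level):
    """Count cells whose elevation lies below the flood level."""
    return sum(1 for z in dem if z < level)

dem_lidar = [0.2, 0.5, 0.9, 1.1, 1.4, 2.0]  # "ground truth" elevations (m)
bias = 1.0                                  # assumed vertical offset (m)
dem_public = [z + bias for z in dem_lidar]  # biased public DEM

level = 1.2                                 # projected flood level (m)
print(flooded_cells(dem_lidar, level), flooded_cells(dem_public, level))
```

    With the LiDAR-like elevations most cells flood, while the biased DEM maps no flooding at all, which is the direction of error reported for the Lagos State case.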

  8. Melodie: A global risk assessment model for radioactive waste repositories

    International Nuclear Information System (INIS)

    Lewi, J.; Assouline, M.; Bareau, J.; Raimbault, P.

    1987-03-01

    The Institute of Protection and Nuclear Safety (IPSN), which is part of the French Atomic Energy Commission (C.E.A.), has been developing since 1984, in collaboration with different groups inside and outside the C.E.A., a computer model for risk assessment of nuclear waste repositories in deep geological formations. The main characteristics of the submodels, the data processing structure and some examples of applications are presented

  9. A Fuzzy Knowledge Representation Model for Student Performance Assessment

    DEFF Research Database (Denmark)

    Badie, Farshad

    Knowledge representation models based on Fuzzy Description Logics (DLs) can provide a foundation for reasoning in intelligent learning environments. While basic DLs are suitable for expressing crisp concepts and binary relationships, Fuzzy DLs are capable of processing degrees of truth/completeness about vague or imprecise information. This paper tackles the issue of representing fuzzy classes using OWL2 in a dataset describing Performance Assessment Results of Students (PARS).

  10. Modeling marine surface microplastic transport to assess optimal removal locations

    OpenAIRE

    Sherman, P; Van Sebille, E

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal to assess the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the ...

  11. Gaia: automated quality assessment of protein structure models.

    Science.gov (United States)

    Kota, Pradeep; Ding, Feng; Ramachandran, Srinivas; Dokholyan, Nikolay V

    2011-08-15

    Increasing use of structural modeling for understanding structure-function relationships in proteins has led to the need to ensure that the protein models being used are of acceptable quality. Quality of a given protein structure can be assessed by comparing various intrinsic structural properties of the protein to those observed in high-resolution protein structures. In this study, we present tools to compare a given structure to high-resolution crystal structures. We assess packing by calculating the total void volume, the percentage of unsatisfied hydrogen bonds, the number of steric clashes and the scaling of the accessible surface area. We assess covalent geometry by determining bond lengths, angles, dihedrals and rotamers. The statistical parameters for the above measures, obtained from high-resolution crystal structures enable us to provide a quality-score that points to specific areas where a given protein structural model needs improvement. We provide these tools that appraise protein structures in the form of a web server Gaia (http://chiron.dokhlab.org). Gaia evaluates the packing and covalent geometry of a given protein structure and provides quantitative comparison of the given structure to high-resolution crystal structures. dokh@unc.edu Supplementary data are available at Bioinformatics online.
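
    One of the checks listed above, counting steric clashes, can be sketched as a pairwise distance test. This is a simplified stand-in, not Gaia's actual criterion: the coordinates are invented, all atom pairs are treated as non-bonded, and the 2.2 Å cutoff is illustrative.

```python
import math

# Simplified steric-clash count: pairs of atoms closer than a cutoff
# distance (in Angstroms). Coordinates and cutoff are illustrative only.

def clashes(coords, cutoff=2.2):
    n = 0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            if math.dist(coords[i], coords[j]) < cutoff:
                n += 1
    return n

# Four invented atom positions (x, y, z) in Angstroms.
atoms = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 1.0, 0.0)]
print(clashes(atoms))
```

    A real assessor would exclude covalently bonded pairs and use element-specific radii; comparing the resulting count against statistics from high-resolution crystal structures is what turns it into a quality score.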

  12. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    Full Text Available The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to the economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative, and holistic approach to sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods (LCA, LCC, and SLCA); stakeholder analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent-based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to the development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between the social and natural sciences as well as between critical and problem-solving research.
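
    The MCDA step in the framework can be sketched as a weighted-sum scoring of alternatives against stakeholder-weighted criteria. The criteria, weights, and scores below are invented for illustration; real MCDA would normalise raw indicator values and may use more elaborate aggregation rules.

```python
# Sketch of a weighted-sum multi-criteria scoring step (one common MCDA
# aggregation). Criteria, weights, and scores are invented.

def weighted_score(scores, weights):
    """Weighted average of normalised criterion scores (all on a 0-1 scale)."""
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

weights = {"GHG": 0.5, "cost": 0.3, "jobs": 0.2}  # hypothetical stakeholder weights
options = {
    "biofuel_A": {"GHG": 0.8, "cost": 0.4, "jobs": 0.6},
    "biofuel_B": {"GHG": 0.5, "cost": 0.9, "jobs": 0.4},
}
ranked = sorted(options, key=lambda o: weighted_score(options[o], weights),
                reverse=True)
print(ranked)
```

    Varying the weights across stakeholder groups and re-ranking is one simple way such a framework surfaces which criteria (hotspots) actually drive the decision.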

  13. Habitat hydraulic models - a tool for Danish stream quality assessment?

    DEFF Research Database (Denmark)

    Olsen, Martin

    In relation to the European Water Framework Directive (WFD), Danish water management has to change to a holistic management approach considering both groundwaters and surface waters at the same time. Furthermore, the WFD introduces the concept "good ecological status", where the quality of the biological community [...] in Danish stream management and stream quality assessment. The stream Ledreborg catchment is modelled using a precipitation-run-off model (NAM) and, as an addition to the normal calibration procedure (Kronvang et al., 2000), the model is calibrated using DAISY-adjusted evaporation data. The impact of groundwater abstraction upon stream discharge is assessed and, in relation to this, the relative importance of variations in precipitation, evaporation/temperature and groundwater abstraction is discussed. Physical habitat preferences for trout in the stream Ledreborg are assessed through a series of field [...]

  14. Modelling Tradescantia fluminensis to assess long term survival

    Directory of Open Access Journals (Sweden)

    Alex James

    2015-06-01

    Full Text Available We present a simple Poisson process model for the growth of Tradescantia fluminensis, an invasive plant species that inhibits the regeneration of native forest remnants in New Zealand. The model was parameterised with data derived from field experiments in New Zealand and then verified with independent data. The model gave good predictions, which showed that its underlying assumptions are sound. However, this simple model had less predictive power for outputs based on variance, suggesting that some assumptions were lacking. We therefore extended the model to include higher variability between plants, thereby improving its predictions. This high-variance model suggests that control measures that promote node death at the base of the plant or restrict the main stem growth rate will be more effective than those that reduce the number of branching events. The extended model forms a good basis for assessing the efficacy of various forms of control of this weed, including the recently released leaf-feeding tradescantia leaf beetle (Neolema ogloblini).
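    A minimal sketch of the modelling idea, not the authors' parameterisation: branching events follow a Poisson process, and the extended model lets the rate vary between plants (a gamma-mixed Poisson), which raises the variance without changing the mean. All rates below are invented:

```python
import math
import random

random.seed(1)

def poisson(lam, rng=random):
    """Poisson sample via Knuth's method (adequate for modest rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

rate, t, n_plants = 0.5, 10.0, 5000  # invented: 0.5 branchings/week for 10 weeks

# Simple model: every plant shares one Poisson rate, so variance == mean.
simple = [poisson(rate * t) for _ in range(n_plants)]

# Extended model: the rate itself varies between plants (gamma-mixed Poisson),
# which raises the variance while keeping the same mean.
mixed = [poisson(random.gammavariate(2.0, rate * t / 2.0)) for _ in range(n_plants)]

print(mean(simple), variance(simple))  # variance close to the mean
print(mean(mixed), variance(mixed))    # same mean, clearly overdispersed
```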

  15. Individual-based model for radiation risk assessment

    Science.gov (United States)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, a respective mathematical model is used as well. This approach is applied to describe the effects of low-level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on the mortality dynamics of the former in the absence of radiation, are utilized. The life span probability and life span shortening predicted by the model agree with the corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical-system cells when estimating radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe after-effects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.
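    The link between critical-system dynamics and life-span probability can be illustrated with the standard survival relation S(t) = exp(-∫ h(τ) dτ). The hazard function below, which rises as a critical cell pool is depleted under a chronic dose, is purely hypothetical:

```python
import math

def survival(hazard, t_end, dt=0.01):
    """S(t_end) = exp(-integral of the hazard), via the trapezoid rule."""
    t, acc = 0.0, 0.0
    while t < t_end:
        acc += 0.5 * (hazard(t) + hazard(t + dt)) * dt
        t += dt
    return math.exp(-acc)

def hazard(t, base=0.01, dose_rate=0.02):
    """Hypothetical: hazard grows as the critical cell pool declines."""
    x = math.exp(-dose_rate * t)  # normalized critical-cell level under chronic dose
    return base / x               # depletion of the pool raises the hazard

s = survival(hazard, 10.0)
print(round(s, 3))
```

    With individual radiosensitivity, `dose_rate` would itself be drawn from a distribution across subjects, widening the spread of predicted life spans.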

  16. New techniques and models for assessing ischemic heart disease risks

    Directory of Open Access Journals (Sweden)

    I.N. Yakovina

    2017-09-01

    Full Text Available The paper focuses on the tasks of creating and implementing a new technique aimed at assessing ischemic heart disease risk. The technique is based on a laboratory-diagnostic complex which includes oxidative, lipid-lipoprotein, inflammatory and metabolic biochemical parameters; a system of logic-mathematic models used for obtaining numeric risk assessments; and a program module which allows one to calculate and analyze the results. We justified our models in the course of our research, which included 172 patients suffering from ischemic heart disease (IHD) combined with coronary atherosclerosis verified by coronary arteriography and 167 patients who didn't have ischemic heart disease. Our research program included demographic and social data, questioning on tobacco and alcohol addiction, questioning about dietary habits, chronic disease case history and medication intake, cardiologic questioning as per Rose, anthropometry, blood pressure measured three times, spirometry, and electrocardiogram recording with decoding as per the Minnesota code. We detected the biochemical parameters of each patient and adjusted our task of creating techniques and models for assessing ischemic heart disease risks on the basis of inflammatory, oxidative, and lipid biological markers. We created a system of logic and mathematic models which is a universal scheme for laboratory parameter processing allowing for dissimilar data specificity. The system of models is universal, but the diagnostic approach to the applied biochemical parameters is specific. The created program module (calculator) helps a physician to obtain a result on the basis of laboratory research data; the result characterizes the numeric risks of coronary atherosclerosis and ischemic heart disease for a patient. It also allows one to obtain a visual image of a system of parameters and their deviation from a conditional «standard – pathology» boundary. The complex is implemented into practice by the Scientific
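    The abstract does not give the form of the logic-mathematic models; a common shape for such numeric risk calculators is a logistic function of weighted biomarkers. The markers, weights and intercept below are invented for illustration only:

```python
import math

def ihd_risk(markers, weights, intercept):
    """Logistic risk score in [0, 1] from biochemical markers (illustrative)."""
    z = intercept + sum(weights[k] * v for k, v in markers.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights and a hypothetical patient's laboratory panel
weights = {"ldl_mmol_l": 0.6, "crp_mg_l": 0.25, "oxidative_index": 0.4}
patient = {"ldl_mmol_l": 4.2, "crp_mg_l": 3.0, "oxidative_index": 1.5}

risk = ihd_risk(patient, weights, intercept=-4.0)
print(round(risk, 3))
```

    A calculator of this shape also supports the "deviation from a standard-pathology boundary" view: the boundary is simply the surface where the linear score z crosses zero.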

  17. Sustainable BECCS pathways evaluated by an integrated assessment model

    Science.gov (United States)

    Kato, E.

    2017-12-01

    Negative emissions technologies, particularly Bioenergy with Carbon Capture and Storage (BECCS), are key components of mitigation strategies in the ambitious future socioeconomic scenarios analysed by integrated assessment models. Generally, scenarios aiming to keep the mean global temperature rise below 2°C above pre-industrial levels require net negative carbon emissions at the end of the 21st century. Also, in the context of the Paris Agreement, which acknowledges "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", RD&D for negative emissions technologies in this decade has a crucial role in enabling early deployment of the technology. Because producing the bioenergy feedstock needed to reach the anticipated level of gross negative emissions potentially requires extensive use of land and water, research on how to develop sustainable BECCS scenarios is needed. Here, we present BECCS deployment scenarios that consider an economically viable flow of the bioenergy system, including power generation and conversion processes to liquid and gaseous fuels for transportation and heat, with consideration of sustainable global biomass use. In the modelling process, detailed bioenergy representations, i.e. various feedstocks and conversion technologies with and without CCS, are implemented in the integrated assessment (IA) model GRAPE (Global Relationship Assessment to Protect the Environment). Also, to overcome a general discrepancy in assumed future agricultural yields between 'top-down' IA models and 'bottom-up' estimates, which would crucially affect the land-use pattern, we applied yield changes of food and energy crops consistent with process-based biophysical crop models in consideration of changing climate conditions. Using the framework, economically viable strategies for implementing a sustainable bioenergy and BECCS flow are evaluated in scenarios targeting to keep the global average

  18. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate the significant adverse effects of candidate chemicals under either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; (2) directly estimating the exposure of the environment, biota and humans, to provide information to complement measurements, or where measurements are not available or are limited; (3) identifying the key processes and chemical and/or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and (4) predicting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model, the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.

  19. An integrated model for the assessment of global water resources Part 2: Applications and assessments

    Science.gov (United States)

    Hanasaki, N.; Kanae, S.; Oki, T.; Masuda, K.; Motoya, K.; Shirakawa, N.; Shen, Y.; Tanaka, K.

    2008-07-01

    To assess global water resources from the perspective of subannual variation in water availability and water use, an integrated water resources model was developed. In a companion report, we presented the global meteorological forcing input used to drive the model and six modules, namely, the land surface hydrology module, the river routing module, the crop growth module, the reservoir operation module, the environmental flow requirement module, and the anthropogenic withdrawal module. Here, we present the results of the model application and global water resources assessments. First, the timing and volume of simulated agricultural water use were examined, because agricultural use accounts for approximately 85% of total consumptive water withdrawal in the world. The estimated crop calendar showed good agreement with earlier reports for wheat, maize, and rice in major countries of production. In major countries, the error in the planting date was ±1 mo, but there were some exceptional cases. The estimated irrigation water withdrawal also showed fair agreement with country statistics, but tended to be underestimated in countries in the Asian monsoon region. The results indicate the validity of the model and the input meteorological forcing, because site-specific parameter tuning was not used in the series of simulations. Finally, global water resources were assessed on a subannual basis using a newly devised index. This index located water-stressed regions that were undetected in earlier studies. These regions, which are indicated by a gap in the subannual distribution of water availability and water use, include the Sahel, the Asian monsoon region, and southern Africa. The simulation results show that the reservoir operations of major reservoirs (>1 km3) and the allocation of environmental flow requirements can alter the population under high water stress by approximately -11% to +5% globally. The integrated model is applicable to assessments of various global
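    A subannual stress index of the kind described, comparing monthly water use to monthly availability rather than annual totals, can be sketched as follows (the monsoon-like availability series and the 0.4 stress threshold are invented for illustration):

```python
def subannual_stress(availability, use, threshold=0.4):
    """Fraction of months in which use/availability exceeds a stress threshold.

    An annual ratio can hide months of scarcity: a monsoon region may look
    unstressed on annual totals yet be highly stressed in the dry season.
    """
    assert len(availability) == len(use) == 12
    stressed = sum(1 for a, u in zip(availability, use) if u / a > threshold)
    return stressed / 12.0

# Hypothetical monsoon-like regime: most runoff in Jun-Sep, steady demand
avail = [20, 15, 10, 10, 15, 120, 150, 140, 90, 30, 25, 20]
use = [12] * 12

annual_ratio = sum(use) / sum(avail)  # looks moderate on an annual basis
print(round(annual_ratio, 2), round(subannual_stress(avail, use), 2))
```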

  1. Stratiform chromite deposit model: Chapter E in Mineral deposit models for resource assessment

    Science.gov (United States)

    Schulte, Ruth F.; Taylor, Ryan D.; Piatak, Nadine M.; Seal, Robert R.

    2012-01-01

    A new descriptive stratiform chromite deposit model was prepared that provides a framework for understanding the characteristics of stratiform chromite deposits worldwide. Previous stratiform chromite deposit models developed by the U.S. Geological Survey (USGS) have been referred to as Bushveld chromium, because the Bushveld Complex in South Africa is the only stratified, mafic-ultramafic intrusion presently mined for chromite and is the most intensely researched. As part of the ongoing effort by the USGS Mineral Resources Program to update existing deposit models for the upcoming national mineral resource assessment, this revised stratiform chromite deposit model includes new data on the geological, mineralogical, geophysical, and geochemical attributes of stratiform chromite deposits worldwide. This model will be a valuable tool in future chromite resource and environmental assessments and will supplement previously published models used for mineral resource evaluation.

  2. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
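    The timeline/planner/scheduler arrangement can be sketched with a priority queue: the planner seeds events with occurrence times, the scheduler pops them in strict time order, and each outcome may enable follow-on events. The event names, times and probabilities here are invented, not IMM values:

```python
import heapq
import random

random.seed(7)

def run_timeline(horizon_days):
    """One Monte Carlo instance: pop events in time order, queue consequences."""
    queue = [(30.0, "medical_event")]  # planner seeds the timeline
    outcomes = []
    while queue:
        t, event = heapq.heappop(queue)  # scheduler: strict time order
        if t > horizon_days:
            continue
        outcomes.append((t, event))
        if event == "medical_event" and random.random() < 0.3:
            # a time-line dependency: the event escalates some days later
            heapq.heappush(queue, (t + random.uniform(1.0, 5.0), "escalation"))
    return outcomes

# Thousands of instances are run and outcomes collected, as in static PRA,
# but here each outcome depends on its position in the time line.
runs = [run_timeline(180.0) for _ in range(10000)]
p_escalation = sum(any(e == "escalation" for _, e in run) for run in runs) / len(runs)
print(round(p_escalation, 2))
```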

  3. Dental caries: an updated medical model of risk assessment.

    Science.gov (United States)

    Kutsch, V Kim

    2014-04-01

    Dental caries is a transmissible, complex biofilm disease that creates prolonged periods of low pH in the mouth, resulting in a net mineral loss from the teeth. Historically, the disease model for dental caries consisted of mutans streptococci and Lactobacillus species, and the dental profession focused on restoring the lesions/damage from the disease by using a surgical model. The current recommendation is to implement a risk-assessment-based medical model called CAMBRA (caries management by risk assessment) to diagnose and treat dental caries. Unfortunately, many of the suggestions of CAMBRA have been overly complicated and confusing for clinicians. The risk of caries, however, is usually related to just a few common factors, and these factors result in common patterns of disease. This article examines the biofilm model of dental caries, identifies the common disease patterns, and discusses their targeted therapeutic strategies to make CAMBRA more easily adaptable for the privately practicing professional. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  4. Improving treatment outcome assessment in a mouse tuberculosis model.

    Science.gov (United States)

    Mourik, Bas C; Svensson, Robin J; de Knegt, Gerjo J; Bax, Hannelore I; Verbon, Annelies; Simonsson, Ulrika S H; de Steenwinkel, Jurriaan E M

    2018-04-09

    Preclinical treatment outcome evaluation of tuberculosis (TB) occurs primarily in mice. Current designs compare relapse rates of different regimens at selected time points, but lack information about the correlation between treatment length and treatment outcome, which is required to efficiently estimate a regimen's treatment-shortening potential. Therefore we developed a new approach. BALB/c mice were infected with a Mycobacterium tuberculosis Beijing genotype strain and were treated with rifapentine-pyrazinamide-isoniazid-ethambutol (RpZHE), rifampicin-pyrazinamide-moxifloxacin-ethambutol (RZME) or rifampicin-pyrazinamide-moxifloxacin-isoniazid (RZMH). Treatment outcome was assessed in n = 3 mice after 9 different treatment lengths between 2-6 months. Next, we created a mathematical model that best fitted the observational data and used this for inter-regimen comparison. The observed data were best described by a sigmoidal Emax model, in favor over linear or conventional Emax models. Estimating regimen-specific parameters showed significantly higher curative potentials for RZME and RpZHE compared to RZMH. In conclusion, we provide a new design for treatment outcome evaluation in a mouse TB model, which (i) provides accurate tools for assessment of the relationship between treatment length and predicted cure, (ii) allows for efficient comparison between regimens and (iii) adheres to the reduction and refinement principles of laboratory animal use.
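    The sigmoidal Emax relation between treatment length t and predicted cure has the standard form E(t) = Emax·t^γ / (t50^γ + t^γ). A sketch with invented parameters (not the fitted values from the study) shows how a lower t50 signals treatment-shortening potential:

```python
def sigmoid_emax(t, emax, t50, gamma):
    """Predicted cure fraction after treatment length t (months)."""
    return emax * t ** gamma / (t50 ** gamma + t ** gamma)

# Invented parameters for two regimens; a lower t50 means a shorter
# treatment reaches the same predicted cure.
for name, t50 in [("regimen A", 3.0), ("regimen B", 4.5)]:
    cures = [round(sigmoid_emax(t, emax=1.0, t50=t50, gamma=4.0), 2)
             for t in (2, 3, 4, 5, 6)]
    print(name, cures)
```

    Unlike a linear fit, this curve flattens near full cure, which is why it better describes outcomes observed across many treatment lengths.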

  5. Independent Assessment of Instrumentation for ISS On-Orbit NDE. Volume 1

    Science.gov (United States)

    Madaras, Eric I

    2013-01-01

    The International Space Station (ISS) Structural and Mechanical Systems Manager requested that the NASA Engineering and Safety Center (NESC) provide a quantitative assessment of commercially available nondestructive evaluation (NDE) instruments for potential application to the ISS. This work supports risk mitigation as outlined in the ISS Integrated Risk Management Application (IRMA) Watch Item #4669, which addresses the requirement for structural integrity after an ISS pressure wall leak in the event of a penetration due to micrometeoroid or orbital debris (MMOD) impact. This document contains the outcome of the NESC assessment.

  6. Application of a leakage model to assess exfiltration from sewers.

    Science.gov (United States)

    Karpf, C; Krebs, P

    2005-01-01

    The exfiltration of wastewater from sewer systems in urban areas causes a deterioration of soil and possibly groundwater quality. Besides the simulation of transport and degradation processes in the unsaturated zone and in the aquifer, the analysis of the potential impact requires the estimation of the quantity and temporal variation of wastewater exfiltration. Exfiltration can be assessed by the application of a leakage model. The hydrological approach was originally developed to simulate the interactions between groundwater and surface water; it was adapted to allow for modelling of interactions between groundwater and the sewer system. In order to approximate the exfiltration-specific model parameters, infiltration-specific parameters were used as a basis. Scenario analysis of the exfiltration in the City of Dresden from 1997 to 1999 and during the flood event in August 2002 shows the variation and the extent of exfiltration rates.
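    In the hydrological leakage approach, the exchange flux is proportional to the head difference between sewer and groundwater, Q = C_L·(h_sewer - h_gw): positive for exfiltration, negative for infiltration. A sign-aware sketch with an invented leakage coefficient and heads:

```python
def leakage_flux(h_sewer, h_groundwater, leakage_coeff):
    """Exchange flux (m3/d): positive = exfiltration, negative = infiltration.

    leakage_coeff lumps defect area and soil/clogging-layer conductance; as in
    the paper, the exfiltration side would be approximated from
    infiltration-specific parameters.
    """
    return leakage_coeff * (h_sewer - h_groundwater)

# Hypothetical daily heads (m above datum) across a wet-weather event:
# a rising groundwater table flips the system from exfiltration to infiltration.
heads = [(101.2, 100.8), (101.9, 100.9), (100.5, 101.1)]
for h_s, h_gw in heads:
    q = leakage_flux(h_s, h_gw, leakage_coeff=5.0)
    print("exfiltration" if q > 0 else "infiltration", round(q, 1))
```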

  7. Screening model for assessing doses from radiological accidents

    International Nuclear Information System (INIS)

    Sjoreen, A.L.; Athey, G.F.; Sakenas, C.A.; McKenna, T.J.

    1988-01-01

    A new dose assessment model, called RASCAL, has been written for the US Nuclear Regulatory Commission (NRC) for use during response to emergencies. RASCAL is designed to provide rough estimates of the health effects from a radiological accident in progress when only limited information is available. RASCAL will be used by NRC personnel who report to the site of a nuclear accident to conduct an independent evaluation of dose projections. It was written to correct the technical and operational problems in the NRC's previous model and to be more appropriate to the personal computers presently in use by the NRC. The model has been constructed to be easy to modify, with separate modules for estimation of the quantity of radioactivity released, its transport through the atmosphere, and the resulting radiologic dose to man. RASCAL results can be displayed in graphical or ASCII form. 4 refs
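    The atmospheric transport step in rapid-assessment tools of this kind is typically a straight-line Gaussian plume; the ground-level centreline concentration for a release rate Q is χ = Q/(π·σy·σz·u)·exp(-h²/(2σz²)). The power-law dispersion coefficients below are crude stand-ins, not RASCAL's actual parameterisation:

```python
import math

def sigma(x_m, a, b):
    """Crude power-law dispersion coefficient (illustrative constants only)."""
    return a * x_m ** b

def plume_concentration(q_bq_s, u_m_s, x_m, release_height_m):
    """Ground-level centreline air concentration (Bq/m3) at downwind distance x."""
    sy = sigma(x_m, 0.08, 0.90)  # horizontal spread, "neutral-ish" stand-in
    sz = sigma(x_m, 0.06, 0.85)  # vertical spread
    return (q_bq_s / (math.pi * sy * sz * u_m_s)
            * math.exp(-release_height_m ** 2 / (2 * sz ** 2)))

# For an elevated release the ground-level concentration peaks at some
# intermediate distance, then falls off as the plume keeps spreading.
for x in (500.0, 1000.0, 5000.0):
    c = plume_concentration(q_bq_s=1e9, u_m_s=3.0, x_m=x, release_height_m=30.0)
    print(int(x), round(c, 1))
```

    A dose module would then multiply such concentrations by breathing rate and dose conversion factors per nuclide.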

  8. Healthcare quality maturity assessment model based on quality drivers.

    Science.gov (United States)

    Ramadan, Nadia; Arafeh, Mazen

    2016-04-18

    Purpose - Healthcare providers differ in their readiness and maturity levels regarding quality and quality management system application. The purpose of this paper is to present a useful quantitative quality maturity-level assessment tool for healthcare organizations. Design/methodology/approach - The model proposes five quality maturity levels (chaotic, primitive, structured, mature and proficient) based on six quality drivers: top management, people, operations, culture, quality focus and accreditation. Findings - Healthcare managers can apply the model to identify the status quo and quality shortcomings and to evaluate ongoing progress. Practical implications - The model has been incorporated in an interactive Excel worksheet that visually displays the quality maturity-level risk meter. The tool has been applied successfully to local hospitals. Originality/value - The proposed six quality driver scales appear to measure healthcare provider maturity levels on a single quality meter.
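    The maturity meter can be sketched as a mapping from the six driver scores onto the five levels; the paper's actual scales and weights are not given here, so equal weighting and a 1-5 score per driver are illustrative assumptions:

```python
def maturity_level(driver_scores):
    """Map six quality-driver scores (each 1-5) to one of five maturity levels.

    Equal weighting of drivers is an assumption for illustration; the
    published model may weight or scale drivers differently.
    """
    levels = ["chaotic", "primitive", "structured", "mature", "proficient"]
    avg = sum(driver_scores.values()) / len(driver_scores)
    return levels[min(int(avg) - 1, 4)], round(avg, 2)

# A hypothetical hospital's self-assessment
hospital = {"top management": 4, "people": 3, "operations": 3,
            "culture": 2, "quality focus": 4, "accreditation": 3}
print(maturity_level(hospital))
```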

  9. Experimental assessment and modelling of nitrate utilisation for primary sludge.

    Science.gov (United States)

    Avcioğlu, E; Sözen, S; Orhon, D; van Loosdrecht, M C M

    2002-01-01

    The electron acceptor utilisation potential of filtered primary sludge under anoxic conditions was experimentally investigated. Major kinetic and stoichiometric parameters were assessed by means of model evaluation of the nitrate profiles obtained in batch reactors. ASM1, modified for endogenous decay, and ASM3 were used for model simulation. Both models provided consistent interpretations of the experimental data. ASM1 yielded µH and YHD values of 6.1 d⁻¹ and 0.64 g cell COD (g COD)⁻¹, respectively, for heterotrophic anoxic growth. The corresponding storage mechanism associated with ASM3 could be characterised by a kSTO of 13 g COD (g COD d)⁻¹ and a YSTO of 0.78 g COD (g COD)⁻¹. The high kSTO value suggests re-evaluation of the concept of readily biodegradable substrate as defined in ASM3 and tested in the study.
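    In ASM1, anoxic growth consumes (1 - Y_H)/(2.86·Y_H) g NO3-N per g of biomass COD grown (2.86 g O2-equivalent per g NO3-N). A batch nitrate profile of the kind the parameters were fitted against can be sketched by Euler integration; the initial concentrations and half-saturation constant are invented, while the growth rate and yield follow the abstract:

```python
def simulate_nitrate(mu_max=6.1, y_h=0.64, k_s=20.0,
                     x0=50.0, s0=200.0, n0=30.0, dt=1e-4, t_end=0.5):
    """Euler integration of anoxic heterotrophic growth (time in days).

    dX/dt = mu*X;  dS/dt = -mu*X/Y;  dN/dt = -(1-Y)/(2.86*Y)*mu*X,
    with Monod dependence of mu on the substrate S (COD and N in mg/L).
    """
    x, s, n, t = x0, s0, n0, 0.0
    series = [(0.0, n0)]
    while t < t_end and s > 0 and n > 0:
        mu = mu_max * s / (k_s + s)
        growth = mu * x * dt  # biomass COD grown this step
        x += growth
        s -= growth / y_h
        n -= (1 - y_h) / (2.86 * y_h) * growth
        t += dt
        series.append((t, n))
    return series

profile = simulate_nitrate()
print(round(profile[-1][1], 2))  # residual nitrate once substrate is exhausted
```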

  10. Evaluating intersectoral collaboration: a model for assessment by service users

    Directory of Open Access Journals (Sweden)

    Bengt Ahgren

    2009-02-01

    Full Text Available Introduction: DELTA was launched as a project in 1997 to improve intersectoral collaboration in the rehabilitation field. In 2005 DELTA was transformed into a local association for financial co-ordination between the institutions involved. Based on a study of the DELTA service users, the purpose of this article is to develop and to validate a model that can be used to assess the integration of welfare services from the perspective of the service users. Theory: The foundation of integration is a well-functioning structure of integration. Without such structural conditions, it is difficult to develop a process of integration that combines the resources and competences of the collaborating organisations to create services advantageous for the service users. In this way, both the structure and the process will contribute to the outcome of integration. Method: The study was carried out as a retrospective cross-sectional survey over two weeks, including all current service users of DELTA. The questionnaire contained 32 questions, which were derived from the theoretical framework and research on service users, capturing perceptions of integration structure, process and outcome. Ordinal scales and open questions were used for the assessment. Results: The survey had a response rate of 82% and no serious biases were detected. The study shows that the users of the rehabilitation services perceived the services as well integrated, relevant and adapted to their needs. The assessment model was tested for reliability and validity and a few modifications were suggested. Some key measurement themes were derived from the study. Conclusion: The model developed in this study is an important step towards an assessment of service integration from the perspective of the service users. It needs to be further refined, however, before it can be used in other evaluations of collaboration in the provision of integrated welfare services.

  11. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness-of-fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
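    Two of the listed techniques, method of moments and maximum likelihood, coincide conveniently for the lognormal: the MLE is simply the mean and standard deviation of the log-data. A sketch on synthetic data (the "input parameter" and its true parameters are invented):

```python
import math
import random

random.seed(42)

def fit_lognormal_mle(data):
    """Lognormal MLE: mean and (population) std of the log-transformed data."""
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    return mu, sigma

# Synthetic performance-assessment input parameter, true mu=1.0, sigma=0.5
sample = [random.lognormvariate(1.0, 0.5) for _ in range(5000)]
mu_hat, sigma_hat = fit_lognormal_mle(sample)
print(round(mu_hat, 2), round(sigma_hat, 2))  # estimates close to (1.0, 0.5)
```

    Goodness of fit would then be checked, e.g., by a probability plot of the log-data against normal quantiles.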

  13. Dose Assessment Model for Chronic Atmospheric Releases of Tritium

    International Nuclear Information System (INIS)

    Shen Huifang; Yao Rentai

    2010-01-01

An improved dose assessment model for chronic atmospheric releases of tritium is proposed. The model explicitly considers two chemical forms of tritium, and it is based on a conservative assumption about the transfer of tritiated water (HTO) from air into concentrations of HTO and organically bound tritium (OBT) in vegetable and animal products. The tritium concentration in plant products is calculated separately for leafy and non-leafy plants, and the contribution to each plant type from tritium in the soil is taken into account. When calculating HTO concentrations in animal products, the model uses the average water fraction of each product and an average tritium concentration of ingested water weighted by the fraction of water supplied by each source, including skin absorption, inhalation, drinking water and food. The annual doses include the ingestion pathway, together with the contributions from inhalation and skin absorption. Concentrations in foodstuffs and annual adult doses calculated with the specific activity model, the NEWTRI model and the proposed model were compared. The results indicate that the proposed model can accurately predict tritium doses received through the food chain from chronic atmospheric releases. (authors)
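The source-weighted ingested-water concentration described above reduces to a simple weighted average, which then scales with the product's water fraction. A minimal sketch, with all fractions and concentrations hypothetical (and ignoring the OBT pathway):

```python
# Hypothetical tritium concentrations (Bq/L) in each water source an animal takes in,
# with the fraction of total water intake supplied by each source.
sources = {
    "drinking_water":  (0.60, 120.0),  # (fraction of intake, HTO concentration Bq/L)
    "food_water":      (0.30, 200.0),
    "inhalation":      (0.07, 80.0),
    "skin_absorption": (0.03, 80.0),
}

def weighted_hto_concentration(sources):
    """Average HTO concentration of ingested water, weighted by source fraction."""
    total_fraction = sum(f for f, _ in sources.values())
    assert abs(total_fraction - 1.0) < 1e-9, "source fractions must sum to 1"
    return sum(f * c for f, c in sources.values())

c_water = weighted_hto_concentration(sources)

# The HTO concentration in the animal product (Bq/kg fresh weight) then scales
# with the product's average water fraction (roughly 0.87 for cow's milk).
water_fraction_milk = 0.87
c_milk = water_fraction_milk * c_water
```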

  14. Task-based dermal exposure models for regulatory risk assessment.

    Science.gov (United States)

    Warren, Nicholas D; Marquart, Hans; Christopher, Yvette; Laitinen, Juha; VAN Hemmen, Joop J

    2006-07-01

The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of new measurements of dermal exposure together with detailed contextual information. This article describes the development of a set of generic task-based models capable of predicting potential dermal exposure to both solids and liquids in a wide range of situations. To facilitate modelling of the wide variety of dermal exposure situations, six separate models were made for groupings of exposure scenarios called Dermal Exposure Operation units (DEO units). These task-based groupings cluster exposure scenarios with regard to the expected routes of dermal exposure and the expected influence of exposure determinants. Within these groupings, linear mixed effect models were used to estimate the influence of various exposure determinants and to estimate components of variance. The models predict median potential dermal exposure rates for the hands and the rest of the body from the values of relevant exposure determinants. These rates are expressed as mg or µl of product per minute. Using these median potential dermal exposure rates and an accompanying geometric standard deviation allows a range of exposure percentiles to be calculated.
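Given a predicted median rate and its geometric standard deviation, any exposure percentile follows from the implied lognormal distribution: the p-th percentile is the median multiplied by GSD raised to the corresponding standard-normal quantile. A small sketch with hypothetical numbers:

```python
from statistics import NormalDist

def exposure_percentile(median, gsd, p):
    """p-th percentile (p in 0-100) of a lognormal exposure rate,
    given its median and geometric standard deviation (GSD)."""
    z = NormalDist().inv_cdf(p / 100.0)
    return median * gsd ** z

# Hypothetical model output: median potential exposure of 1.5 mg/min with GSD 3.0
median_rate, gsd = 1.5, 3.0
p50 = exposure_percentile(median_rate, gsd, 50)  # equals the median
p90 = exposure_percentile(median_rate, gsd, 90)  # upper-tail estimate for risk assessment
```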

  15. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  16. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
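The copula-based simulation described in this record can be illustrated with off-the-shelf tools. The sketch below uses a Gaussian copula (one common choice; the paper does not specify which family) with illustrative exponential failure-time marginals, in Python rather than the R/WinBUGS stack named above:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def gaussian_copula_sample(n, rho, marginals):
    """Draw n samples whose dependence follows a bivariate Gaussian copula with
    correlation rho, and whose marginals are the given frozen scipy distributions."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    u = stats.norm.cdf(z)  # uniforms carrying the Gaussian dependence structure
    # Map each uniform margin through the inverse CDF of the desired marginal.
    return np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

# Two dependent component failure times, each exponential (illustrative rates only).
marginals = [stats.expon(scale=100.0), stats.expon(scale=250.0)]
samples = gaussian_copula_sample(20_000, rho=0.8, marginals=marginals)

# Rank (Spearman) correlation reflects the copula, not the marginals.
rho_s, _ = stats.spearmanr(samples[:, 0], samples[:, 1])
```

For a Gaussian copula the theoretical Spearman correlation is (6/π)·arcsin(ρ/2) ≈ 0.79 at ρ = 0.8, which the sample estimate should approach.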

  17. An Iterative Uncertainty Assessment Technique for Environmental Modeling

    International Nuclear Information System (INIS)

    Engel, David W.; Liebetrau, Albert M.; Jarman, Kenneth D.; Ferryman, Thomas A.; Scheibe, Timothy D.; Didier, Brett T.

    2004-01-01

The reliability of and confidence in predictions from model simulations are crucial--these predictions can significantly affect risk assessment decisions. For example, the fate of contaminants at the U.S. Department of Energy's Hanford Site has critical impacts on long-term waste management strategies. In the uncertainty estimation efforts for the Hanford Site-Wide Groundwater Modeling program, computational issues severely constrain both the number of uncertain parameters that can be considered and the degree of realism that can be included in the models. Substantial improvements in the overall efficiency of uncertainty analysis are needed to fully explore and quantify significant sources of uncertainty. We have combined state-of-the-art statistical and mathematical techniques in a unique iterative, limited sampling approach to efficiently quantify both local and global prediction uncertainties resulting from model input uncertainties. The approach is designed for application to widely diverse problems across multiple scientific domains. Results are presented for both an analytical model where the response surface is "known" and a simplified contaminant fate transport and groundwater flow model. The results show that our iterative method for approximating a response surface (for subsequent calculation of uncertainty estimates) of specified precision requires less computing time than traditional approaches based upon noniterative sampling methods.

  18. Nitrogen Risk Assessment Model for Scotland: I. Nitrogen leaching

    Directory of Open Access Journals (Sweden)

    S. M. Dunn

    2004-01-01

The Nitrogen Risk Assessment Model for Scotland (NIRAMS) has been developed for prediction of streamwater N concentrations draining from agricultural land in Scotland. The objective of the model is to predict N concentrations for ungauged catchments, to fill gaps in monitoring data and to provide guidance in relation to policy development. The model uses nationally available data sets of land use, soils, topography and meteorology and has been developed within a Geographic Information System (GIS). The model includes modules to calculate N inputs to the land, residual N remaining at the end of the growing season, weekly time-series of leached N and transport of N at the catchment scale. This paper presents the methodology for calculating N balances for different land uses and for predicting the time sequence of N leaching after the end of the growing season. Maps are presented of calculated residual N and N leaching for the whole of Scotland and the spatial variability in N leaching is discussed. The results demonstrate the high variability in N leaching across Scotland. The simulations suggest that, in the areas with greatest residual N, the losses of N are not directly proportional to the amount of residual N, because of their coincidence with lower rainfall. In the companion paper, the hydrological controls on N transport within NIRAMS are described, and results of the full model testing are presented.

    Keywords: nitrogen, diffuse pollution, agriculture, leaching, land use, model, national, catchment

  19. Accuracy Assessment of Recent Global Ocean Tide Models around Antarctica

    Science.gov (United States)

    Lei, J.; Li, F.; Zhang, S.; Ke, H.; Zhang, Q.; Li, W.

    2017-09-01

Due to the coverage limitation of T/P-series altimeters, the lack of bathymetric data under large ice shelves, and the inaccurate definitions of coastlines and grounding lines, the accuracy of ocean tide models around Antarctica is poorer than in the deep oceans. Using tidal measurements from tide gauges, gravimetric data and GPS records, the accuracy of seven state-of-the-art global ocean tide models (DTU10, EOT11a, GOT4.8, FES2012, FES2014, HAMTIDE12, TPXO8) is assessed, as well as that of the most widely used conventional model, FES2004. Four regions (the Antarctic Peninsula region, the Amery Ice Shelf region, the Filchner-Ronne Ice Shelf region and the Ross Ice Shelf region) are reported separately. The standard deviations of the eight main constituents between the selected models are large in polar regions, especially under the big ice shelves, suggesting that the uncertainty in these regions remains large. Comparisons with in situ tidal measurements show that the most accurate model is TPXO8, and that all models perform worst in the Weddell Sea and Filchner-Ronne Ice Shelf regions. The accuracy of tidal predictions around Antarctica is gradually improving.
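One common way to quantify model-versus-observation agreement for a single tidal constituent (the record does not state which misfit statistic was used) is the vector difference between the observed and modeled amplitude/phase pairs, treated as phasors, followed by an RMS over stations. A sketch with made-up M2 values:

```python
import math

def constituent_misfit(amp_obs, pha_obs, amp_mod, pha_mod):
    """Vector difference (same units as amplitude) between observed and modeled
    tidal constituents, each treated as the phasor amp * exp(i * phase)."""
    dx = amp_obs * math.cos(math.radians(pha_obs)) - amp_mod * math.cos(math.radians(pha_mod))
    dy = amp_obs * math.sin(math.radians(pha_obs)) - amp_mod * math.sin(math.radians(pha_mod))
    return math.hypot(dx, dy)

def rms_misfit(diffs):
    """RMS of the constituent vector differences over a set of stations."""
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical M2 amplitudes (cm) and Greenwich phase lags (deg) at three gauges.
obs = [(30.0, 100.0), (45.0, 250.0), (12.0, 40.0)]
mod = [(28.0, 105.0), (47.0, 245.0), (12.5, 42.0)]
rms = rms_misfit([constituent_misfit(ao, po, am, pm)
                  for (ao, po), (am, pm) in zip(obs, mod)])
```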

  20. Engaging Students through Assessment: The Success and Limitations of the ASPAL (Authentic Self and Peer Assessment for Learning) Model

    Science.gov (United States)

    Kearney, Sean P.; Perkins, Tim

    2014-01-01

    In 2011 the authors created a model of self- and peer-assessment known as Authentic Self and Peer Assessment for Learning (ASPAL) in an attempt to better engage seemingly disengaged students in their undergraduate coursework. The model focuses on authentic assessment tasks and engages students by involving them in every step of the process from…

  1. Assessment for Improvement: Two Models for Assessing a Large Quantitative Reasoning Requirement

    Directory of Open Access Journals (Sweden)

    Mary C. Wright

    2015-03-01

We present two models for assessment of a large and diverse quantitative reasoning (QR) requirement at the University of Michigan. These approaches address two key challenges in assessment: (1) dissemination of findings for curricular improvement and (2) resource constraints associated with measurement of large programs. Approaches we present for data collection include convergent validation of self-report surveys, as well as use of mixed methods and learning analytics. Strategies we present for dissemination of findings include meetings with instructors to share data and best practices, sharing of results through social media, and use of easily accessible dashboards. These assessment approaches may be of particular interest to universities with large numbers of students engaging in a QR experience, projects that involve multiple courses with diverse instructional goals, or those who wish to promote evidence-based curricular improvement.

  2. A prediction model for assessing residential radon concentration in Switzerland

    International Nuclear Information System (INIS)

    Hauri, Dimitri D.; Huss, Anke; Zimmermann, Frank; Kuehni, Claudia E.; Röösli, Martin

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, 80% randomly selected measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², the Akaike's information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (50th, 50th–90th and 90th percentile) and compared with measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40–111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69–215 Bq/m³) in the medium category, and 219 Bq/m³ (108–427 Bq/m³) in the highest category. Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% overall variability (adjusted R²). In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be
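The two evaluation statistics used in this record, Spearman rank correlation between predicted and measured radon levels and a weighted Kappa on the three exposure categories, can be reproduced directly; the category data below are invented for illustration:

```python
import numpy as np
from scipy import stats

def weighted_kappa(a, b, n_cat, weights="linear"):
    """Cohen's weighted kappa for two integer-coded category vectors (0..n_cat-1)."""
    a, b = np.asarray(a), np.asarray(b)
    conf = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):          # build the confusion matrix
        conf[i, j] += 1
    conf /= conf.sum()
    idx = np.arange(n_cat)
    w = np.abs(idx[:, None] - idx[None, :]).astype(float)  # disagreement weights
    if weights == "quadratic":
        w = w ** 2
    expected = np.outer(conf.sum(axis=1), conf.sum(axis=0))  # chance agreement
    return 1.0 - (w * conf).sum() / (w * expected).sum()

# Hypothetical measured vs. model-predicted radon categories
# (0: below 50th, 1: 50th-90th, 2: above 90th percentile).
measured  = [0, 0, 1, 2, 1, 0, 2, 1, 0, 1, 2, 0]
predicted = [0, 1, 1, 2, 0, 0, 2, 1, 1, 1, 1, 0]

kappa = weighted_kappa(measured, predicted, n_cat=3)
rho, _ = stats.spearmanr(measured, predicted)  # rank agreement
```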

  3. The Network Performance Assessment Model - Regulation with a Reference Network

    International Nuclear Information System (INIS)

    Larsson, Mats B.O.

    2003-11-01

A new model - the Network Performance Assessment Model - has been developed gradually since 1998 in order to evaluate and benchmark local electricity grids. The model is intended as a regulation tool for the Swedish local electricity networks, used by the Swedish Energy Agency. In spring 2004 the Network Performance Assessment Model will come into operation, based on the companies' results for 2003. The mission of the Network Performance Assessment Model is to evaluate the networks from a customer's point of view and to establish a fair price level. To do this, the performance of the operator is evaluated against a price level that the consumer can be expected to accept, can agree to as fair, and is prepared to pay. This price level is based on an average cost, namely the cost of an efficient grid that would be built today with already known technology. The performances are accounted for in Customer Values: values that can be created by someone but cannot be created better by anyone else. The starting point is to look at the companies from a customer's point of view. The factors that cannot be influenced by the companies are evaluated by fixed rules, valid for all companies, and these rules reflect the differences between them. The cost of a connection is evaluated from the actual facts, i.e. the distances between the subscribers and the capacity demanded by each subscriber. This is done by creating a reference network with just enough capacity to fulfil subscriber demand: an efficient grid with no spare capacity and no excess capacity. The companies' existing grids are of no importance, either for dimensioning or for choice of technology. The factors that the company can influence, for example connection reliability, are evaluated from a customer perspective by measuring the actual reliability as the number and length of interruptions.
When implemented to the regulation the Network

  4. When less is more: Psychometric properties of Norwegian short-forms of the Ambivalent Sexism Scales (ASI and AMI) and the Illinois Rape Myth Acceptance (IRMA) Scale.

    Science.gov (United States)

    Bendixen, Mons; Kennair, Leif Edward Ottesen

    2017-12-01

This paper reports on the development and psychometric properties of short forms of the Ambivalent Sexism Scales toward women (ASI; Glick & Fiske, 1996) and men (AMI; Glick & Fiske, 1999), and of a scale measuring rape stereotypes (IRMA; McMahon & Farmer, 2011). The short-form AMI/ASI were applied to examine gender and educational differences in university students (N = 512) and in high school students (N = 1381), and to predict individual differences in rape stereotypes in the latter. The short forms demonstrated good to excellent psychometric properties across samples of emerging adults. Relative to female students, male students reported markedly more hostility toward women and more stereotypical beliefs about rape. Despite sampling from a highly gender-egalitarian and secular culture, these gender differences are on a par with those reported internationally. Rape stereotypes were predicted by sexism in high school students; additional predictors were educational program, relationship status, and acceptance of derogatory sexual slurs. The paper questions the validity of separate constructs for benevolent sexism toward women versus men. The short-form versions of the scales may substitute for the original versions in future research and help prevent attrition while measuring the same constructs. © 2017 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  5. Literature Review and Assessment of Plant and Animal Transfer Factors Used in Performance Assessment Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, David E.; Cataldo, Dominic A.; Napier, Bruce A.; Krupka, Kenneth M.; Sasser, Lyle B.

    2003-07-20

    A literature review and assessment was conducted by Pacific Northwest National Laboratory (PNNL) to update information on plant and animal radionuclide transfer factors used in performance-assessment modeling. A group of 15 radionuclides was included in this review and assessment. The review is composed of four main sections, not including the Introduction. Section 2.0 provides a review of the critically important issue of physicochemical speciation and geochemistry of the radionuclides in natural soil-water systems as it relates to the bioavailability of the radionuclides. Section 3.0 provides an updated review of the parameters of importance in the uptake of radionuclides by plants, including root uptake via the soil-groundwater system and foliar uptake due to overhead irrigation. Section 3.0 also provides a compilation of concentration ratios (CRs) for soil-to-plant uptake for the 15 selected radionuclides. Section 4.0 provides an updated review on radionuclide uptake data for animal products related to absorption, homeostatic control, approach to equilibration, chemical and physical form, diet, and age. Compiled transfer coefficients are provided for cow’s milk, sheep’s milk, goat’s milk, beef, goat meat, pork, poultry, and eggs. Section 5.0 discusses the use of transfer coefficients in soil, plant, and animal modeling using regulatory models for evaluating radioactive waste disposal or decommissioned sites. Each section makes specific suggestions for future research in its area.

  6. Psychometric model for safety culture assessment in nuclear research facilities

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Andrade, D.A.; Mesquita, R.N. de

    2017-01-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric qualities. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in an organization’s safety culture. A significant number of instruments for assessing safety culture based on psychometric models that evaluate safety climate through questionnaires, grounded in reliability and validity evidence, have been published in the health and ‘safety at work’ areas. However, there are few safety culture assessment instruments with these characteristics (reliability and validity) available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by a reliability analysis and by validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil, which comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas to be improved in the assessed organization. Good evidence of reliability with Cronbach's alpha
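The reliability statistic cited at the end of this record, Cronbach's alpha, is computed directly from the item-score matrix as α = k/(k−1) · (1 − Σ item variances / total-score variance). A minimal sketch with hypothetical Likert responses (not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 safety-climate items.
scores = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
]
alpha = cronbach_alpha(scores)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.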

  7. Psychometric model for safety culture assessment in nuclear research facilities

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, C.S. do, E-mail: claudio.souza@ctmsp.mar.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 São Paulo, SP (Brazil); Andrade, D.A., E-mail: delvonei@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: rnavarro@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil)

    2017-04-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric qualities. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in an organization’s safety culture. A significant number of instruments for assessing safety culture based on psychometric models that evaluate safety climate through questionnaires, grounded in reliability and validity evidence, have been published in the health and ‘safety at work’ areas. However, there are few safety culture assessment instruments with these characteristics (reliability and validity) available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by a reliability analysis and by validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil, which comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas to be improved in the assessed organization. Good evidence of reliability with Cronbach's alpha

  8. Assessment of performance of survival prediction models for cancer prognosis

    Directory of Open Access Journals (Sweden)

    Chen Hung-Chia

    2012-07-01

Background: Cancer survival studies are commonly analyzed using survival-time prediction models for cancer prognosis. A number of different performance metrics are used to ascertain the concordance between the predicted risk score of each patient and the actual survival time, but these metrics can sometimes conflict. Alternatively, patients are sometimes divided into two classes according to a survival-time threshold, and binary classifiers are applied to predict each patient’s class. Although this approach has several drawbacks, it does provide natural performance metrics such as positive and negative predictive values to enable unambiguous assessments. Methods: We compare the survival-time prediction and survival-time threshold approaches to analyzing cancer survival studies. We review and compare common performance metrics for the two approaches. We present new randomization tests and cross-validation methods to enable unambiguous statistical inferences for several performance metrics used with the survival-time prediction approach. We consider five survival prediction models consisting of one clinical model, two gene expression models, and two models from combinations of clinical and gene expression models. Results: A public breast cancer dataset was used to compare several performance metrics using five prediction models. (1) For some prediction models, the hazard ratio from fitting a Cox proportional hazards model was significant, but the two-group comparison was insignificant, and vice versa. (2) The randomization test and cross-validation were generally consistent with the p-values obtained from the standard performance metrics. (3) Binary classifiers highly depended on how the risk groups were defined; a slight change of the survival threshold for assignment of classes led to very different prediction results. Conclusions: (1) Different performance metrics for evaluation of a survival prediction model may give different conclusions in
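One widely used metric of the concordance between predicted risk scores and actual survival times, of the kind this record compares, is Harrell's concordance index (C-index): the fraction of comparable patient pairs whose risk ordering matches their survival ordering, with censored subjects contributing only pairs where the comparison is known. A sketch with invented data:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index: fraction of comparable pairs whose predicted risk
    ordering matches the observed survival ordering (risk ties count 0.5)."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable only if subject i had the event before time j.
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# Hypothetical survival data: months, event indicator (1=death, 0=censored), risk score.
times  = [5, 10, 12, 20, 24, 30]
events = [1, 1, 0, 1, 0, 1]
risks  = [2.1, 1.5, 1.8, 1.0, 0.7, 0.4]
c_index = concordance_index(times, events, risks)
```

A C-index of 0.5 corresponds to random ranking and 1.0 to perfect concordance; this quadratic-time version is fine for illustration, though production implementations sort first.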

  9. Accuracy of virtual models in the assessment of maxillary defects

    International Nuclear Information System (INIS)

    Kamburoglu, Kivanc; Kursun, Sebnem; Kilic, Cenk; Eozen, Tuncer

    2015-01-01

This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields-of-view (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm³ (FOV60); 2) 80 X 80 mm FOV, 0.160 mm³ (FOV80); and 3) 100 X 100 mm FOV, 0.250 mm³ (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicon impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning silicon models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance was set to p=0.05. A comparison of the results obtained by the observers and methods revealed the p values to be smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements when compared to the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.

  10. Accuracy of virtual models in the assessment of maxillary defects

    Energy Technology Data Exchange (ETDEWEB)

    Kamburoglu, Kivanc [Dept. of Dentomaxillofacial Radiology, Faculty of Dentistry, Ankara University, Ankara (Turkmenistan); Kursun, Sebnem [Division of Dentomaxillofacial Radiology, Ministry of Health, Oral and Dental Health Center, Bolu (Turkmenistan); Kilic, Cenk; Eozen, Tuncer [Gealhane Military Medical Academy, Ankara, (Turkmenistan)

    2015-03-15

    This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields-of-views (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm{sup 3} (FOV{sub 60}); 2) 80 X 80 mm FOV, 0.160 mm{sup 3} (FOV{sub 80}); and 3) 100 X 100 mm FOV, 0.250 mm{sup 3} (FOV{sub 100}). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicon impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning silicon models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance was set to p=0.05. A comparison of the results obtained by the observers and methods revealed the p values to be smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements when compared to the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.

  11. An ethical assessment model for digital disease detection technologies.

    Science.gov (United States)

    Denecke, Kerstin

    2017-09-20

    Digital epidemiology, also referred to as digital disease detection (DDD), has successfully provided methods and strategies for using information technology to support infectious disease monitoring and surveillance and to understand attitudes and concerns about infectious diseases. However, Internet-based research and social media usage in epidemiology and healthcare pose new technical, functional and formal challenges. The focus of this paper is on the ethical issues to be considered when integrating digital epidemiology with existing practices. Taking existing ethical guidelines and the results from the EU project M-Eco and SORMAS as a starting point, we develop an ethical assessment model aimed at providing support in identifying relevant ethical concerns in future DDD projects. The assessment model has four dimensions: user, application area, data source and methodology. The model supports becoming aware of, identifying and describing the ethical dimensions of a DDD technology or use case, and identifying the ethical issues of the technology's use from different perspectives. It can be applied in an interdisciplinary meeting to collect different viewpoints on a DDD system even before implementation starts, and aims at triggering discussions and finding solutions for risks that might be unacceptable even in the development phase. From the answers, ethical issues concerning confidence, privacy, data and patient security or justice may be judged and weighted.

  12. A Zebrafish Heart Failure Model for Assessing Therapeutic Agents.

    Science.gov (United States)

    Zhu, Xiao-Yu; Wu, Si-Qi; Guo, Sheng-Ya; Yang, Hua; Xia, Bo; Li, Ping; Li, Chun-Qi

    2018-03-20

    Heart failure is a leading cause of death, and the development of effective and safe therapeutic agents for heart failure has proven challenging. In this study, taking advantage of larval zebrafish, we developed a zebrafish heart failure model for drug screening and efficacy assessment. Zebrafish at 2 dpf (days postfertilization) were treated with verapamil at a concentration of 200 μM for 30 min, conditions determined to be optimal for model development. Tested drugs were administered into zebrafish either by direct soaking or by circulation microinjection. After treatment, zebrafish were randomly selected and subjected either to visual observation and image acquisition or to video recording under a Zebralab Blood Flow System. The therapeutic effects of drugs on zebrafish heart failure were quantified by calculating the efficiency of heart dilatation, venous congestion, cardiac output, and blood flow dynamics. All 8 human heart failure therapeutic drugs (LCZ696, digoxin, irbesartan, metoprolol, qiliqiangxin capsule, enalapril, shenmai injection, and hydrochlorothiazide) showed significant preventive and therapeutic effects on zebrafish heart failure (p < 0.05). The zebrafish heart failure model developed and validated in this study could be used for in vivo heart failure studies and for rapid screening and efficacy assessment of preventive and therapeutic drugs.

  13. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics that identify adeptness in extreme events, while maintaining simplicity for management decisions; flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique that ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as rank correlation, were also explored but removed when their information was found to be largely duplicative of other metrics. While equal weights are applied here, the weights could be altered to favor preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process but were found useful in an independent context and will be briefly reported.
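The weighted tally described in this record can be sketched as a simple rank-and-sum over metrics. The sketch below is illustrative only (the metric names and scores are invented, and equal weights are the default, as in the abstract):

```python
# Hypothetical sketch of a weighted multi-metric tally: each model is ranked
# per metric, and weighted ranks are summed; the lowest tally wins.
def rank_models(scores, weights=None, lower_is_better=("abs_error", "bias", "rmsd")):
    """scores: {model: {metric: value}}; returns model names, best first."""
    metrics = sorted({m for s in scores.values() for m in s})
    weights = weights or {m: 1.0 for m in metrics}  # equal weights by default
    tally = {model: 0.0 for model in scores}
    for metric in metrics:
        # Order models for this metric (ascending if smaller is better).
        reverse = metric not in lower_is_better
        ordered = sorted(scores, key=lambda mdl: scores[mdl][metric], reverse=reverse)
        for rank, model in enumerate(ordered, start=1):
            tally[model] += weights[metric] * rank  # rank 1 = best
    return sorted(scores, key=lambda mdl: tally[mdl])

models = {
    "ModelA": {"abs_error": 0.8, "bias": 0.1, "correlation": 0.92},
    "ModelB": {"abs_error": 1.1, "bias": 0.3, "correlation": 0.95},
}
print(rank_models(models))  # → ['ModelA', 'ModelB']
```

Altering the `weights` argument reproduces the abstract's point that the consolidation can be tilted toward preferred metrics.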

  14. Energy-based numerical models for assessment of soil liquefaction

    Directory of Open Access Journals (Sweden)

    Amir Hossein Alavi

    2012-07-01

    This study presents promising variants of genetic programming (GP), namely linear genetic programming (LGP) and multi-expression programming (MEP), to evaluate the liquefaction resistance of sandy soils. Generalized LGP- and MEP-based relationships were developed between the strain energy density required to trigger liquefaction (capacity energy) and the factors affecting the liquefaction characteristics of sands. The correlations were established on the basis of well-established and widely dispersed experimental results obtained from the literature. To verify the applicability of the derived models, they were employed to estimate the capacity energy values for the portions of the test results not included in the analysis. The external validation of the models was verified using statistical criteria recommended by researchers. Sensitivity and parametric analyses were performed for further verification of the correlations. The results indicate that the proposed correlations effectively capture the liquefaction resistance of a number of sandy soils and provide significantly better prediction performance than the models found in the literature. Furthermore, the best LGP and MEP models perform better than the optimal traditional GP model. The verification phases confirm the efficiency of the derived correlations for general application to the assessment of strain energy at the onset of liquefaction.

  15. A transportable system of models for natural resource damage assessment

    International Nuclear Information System (INIS)

    Reed, M.; French, D.

    1992-01-01

    A system of computer models has been developed for assessment of natural resource economic damages resulting from spills of oil and hazardous materials in marine and fresh water environments. Under USA federal legislation, the results of the model system are presumed correct in damage litigation proceedings. The model can address a wide range of spatial and temporal scales. The equations describing the motion of both pollutants and biota are solved in three dimensions. The model can simulate continuous releases of a contaminant, with representation of complex coastal boundaries, variable bathymetry, multiple shoreline types, and spatially variable ecosystem habitats. A graphic user interface provides easy control of the system in addition to the ability to display elements of the underlying geographical information system data base. The model is implemented on a personal computer and on a UNIX workstation. The structure of the system is such that transport to new geographic regions can be accomplished relatively easily, requiring only the development of the appropriate physical, toxicological, biological, and economic data sets. Applications are currently in progress for USA inland and coastal waters, the Adriatic Sea, the Strait of Sicily, the Gulf of Suez, and the Baltic Sea. 4 refs., 2 figs

  16. System reliability assessment with an approximate reasoning model

    Energy Technology Data Exchange (ETDEWEB)

    Eisenhawer, S.W.; Bott, T.F.; Helm, T.M.; Boerigter, S.T.

    1998-12-31

    The projected service life of weapons in the US nuclear stockpile will exceed the original design life of their critical components. Interim metrics are needed to describe weapon states for use in simulation models of the nuclear weapons complex. The authors present an approach to this problem based upon the theory of approximate reasoning (AR) that allows meaningful assessments to be made in an environment where reliability models are incomplete. AR models are designed to emulate the inference process used by subject matter experts. The emulation is based upon a formal logic structure that relates evidence about components. This evidence is translated using natural language expressions into linguistic variables that describe membership in fuzzy sets. The authors introduce a metric that measures the acceptability of a weapon to nuclear deterrence planners. Implication rule bases are used to draw a series of forward chaining inferences about the acceptability of components, subsystems and individual weapons. They describe each component in the AR model in some detail and illustrate its behavior with a small example. The integration of the acceptability metric into a prototype model to simulate the weapons complex is also described.

  17. An integrated urban drainage system model for assessing renovation scheme.

    Science.gov (United States)

    Dong, X; Zeng, S; Chen, J; Zhao, D

    2012-01-01

    Due to sustained economic growth in China over the last three decades, urbanization has been on a rapidly expanding track. In recent years, regional industrial relocations have also accelerated across the country, from the east coast to the west inland. These changes have led to a large-scale redesign of urban infrastructures, including drainage systems. To steer the reconstructed infrastructure towards better sustainability, a tool is required for assessing the efficiency and environmental performance of different renovation schemes. This paper developed an integrated dynamic modeling tool consisting of three models describing the sewer, the wastewater treatment plant (WWTP) and the receiving water body, respectively. Three auxiliary modules were also incorporated to conceptualize the model, calibrate the simulations, and analyze the results. The integrated modeling tool was applied to a case study in Shenzhen City, one of the most dynamic cities in China and one facing considerable environmental degradation challenges. The renovation scheme proposed to improve the environmental performance of Shenzhen City's urban drainage system was modeled and evaluated, and the simulation results yielded suggestions for its further improvement.

  18. Integrated science model for assessment of climate change. Revision 1

    International Nuclear Information System (INIS)

    Jain, A.K.; Wuebbles, D.J.; Kheshgi, H.S.

    1994-04-01

    Past measurements show that greenhouse gas concentrations, many of which are affected by human-related activities, are increasing in the atmosphere. There is wide consensus that this increase influences the earth's energy balance, and concern that it will cause significant change in climate. Many different policies could be adopted in response to the prospects of greenhouse warming. Models are used by policy makers to analyze the range of possible policy options developed in response to concerns about climate change. A fully integrated assessment model that spans the many aspects of climate change, including economics, energy options, effects of climate, and impacts of climate change, would be a useful tool. With this goal in mind, science modules that estimate the effect of greenhouse gas emissions on global temperature and sea level are being developed. This is a report on the current characteristics and performance of an Integrated Science Model, which consists of coupled modules for the carbon cycle, atmospheric chemistry of other trace gases, radiative forcing by greenhouse gases, an energy balance model for global temperature, and a model for sea level response.

  19. Assessing the Validity of the Simplified Potential Energy Clock Model for Modeling Glass-Ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Grillet, Anne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Strong, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Dai, Steve Xunhu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2017-10-01

    Glass-ceramic seals may be the future of hermetic connectors at Sandia National Laboratories. They have been shown capable of surviving higher temperatures and pressures than amorphous glass seals. More advanced finite-element material models are required to enable model-based design and provide evidence that the hermetic connectors can meet design requirements. Glass-ceramics are composite materials with both crystalline and amorphous phases. The latter gives rise to (non-linearly) viscoelastic behavior. Given their complex microstructures, glass-ceramics may be thermorheologically complex, a behavior outside the scope of currently implemented constitutive models at Sandia. However, it was desired to assess whether the Simplified Potential Energy Clock (SPEC) model is capable of capturing the material response. Available data for SL 16.8 glass-ceramic was used to calibrate the SPEC model. Model accuracy was assessed by comparing model predictions with the temperature dependence of the shear moduli and with high-temperature 3-point bend creep data. It is shown that the model can predict the temperature dependence of the shear moduli and the 3-point bend creep data. Analysis of the results is presented, along with suggestions for future experiments and model development. Though further calibration is likely necessary, SPEC has been shown capable of modeling glass-ceramic behavior in the glass transition region but requires further analysis below the transition region.

  20. A short mnemonic to support the comprehensive geriatric assessment model.

    Science.gov (United States)

    Han, Brenda; Grant, Cristin

    2016-10-06

    With an increasing number of older people using emergency services, researchers have raised concerns about the quality of care in an environment that is not designed to address older patients' specific needs and conditions. The comprehensive geriatric assessment (CGA) model was developed to address these issues, and to optimise healthcare delivery to older adults. This article introduces a complementary mnemonic, FRAIL, that refers to important elements of health information to consider before initiating care for older patients - falls/functional decline, reactions, altered mental status, illnesses, and living situation. It is not intended to replace the CGA, but can help to quickly identify high-risk older patients who warrant a more in-depth clinical assessment with CGA.

  1. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT]; Webster, Mort [MIT]

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, the research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes to develop probability density functions (PDFs) for the appropriate parameters of two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.
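The probabilistic scenario idea in this record amounts to sampling an uncertain parameter from its fitted PDF and propagating it through an emissions model. A minimal sketch, with an invented Gaussian stand-in for the AEEI distribution and a toy exponential emissions path (all numbers illustrative, not from the report):

```python
import random

# Toy emissions trajectory: emissions grow with the economy (growth) but
# decline with autonomous efficiency improvement (aeei). All values invented.
def emissions_path(aeei, growth=0.03, e0=10.0, years=50):
    return [e0 * (1 + growth - aeei) ** t for t in range(years + 1)]

random.seed(42)
scenarios = []
for _ in range(1000):
    aeei = random.gauss(0.01, 0.004)  # draw the uncertain parameter from its PDF
    scenarios.append(emissions_path(aeei)[-1])  # keep year-50 emissions

scenarios.sort()
print(f"5th-95th percentile of year-50 emissions: "
      f"{scenarios[50]:.1f} - {scenarios[950]:.1f}")
```

Selecting a few scenarios that span this sampled distribution is, in spirit, how a small scenario set can cover the joint uncertainty space.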

  2. Bioprocesses: Modelling needs for process evaluation and sustainability assessment

    DEFF Research Database (Denmark)

    Jiménez-Gonzaléz, Concepcion; Woodley, John

    2010-01-01

    The next generation of process engineers will face a new set of challenges, with the need to devise new bioprocesses, with high selectivity for pharmaceutical manufacture, and for lower value chemicals manufacture based on renewable feedstocks. In this paper the current and predicted future roles of process systems engineering and life cycle inventory and assessment in the design, development and improvement of sustainable bioprocesses are explored. The existing process systems engineering software tools will prove essential to assist this work. However, the existing tools will also require further development such that they can also be used to evaluate processes against sustainability metrics, as well as economics, as an integral part of assessments. Finally, property models will also be required based on compounds not currently present in existing databases. It is clear that many new opportunities ...

  3. Some Challenges Posed by Coal Bed Methane Regional Assessment Modeling.

    Science.gov (United States)

    Moore, Catherine R; Doherty, John; Howell, Stephen; Erriah, Leon

    2015-01-01

    Coal measures (coal bearing rock strata) can contain large reserves of methane. These reserves are being exploited at a rapidly increasing rate in many parts of the world. To extract coal seam gas, thousands of wells are drilled at relatively small spacing to depressurize coal seams to induce desorption and allow subsequent capture of the gas. To manage this process effectively, the effect of coal bed methane (CBM) extraction on regional aquifer systems must be properly understood and managed. Groundwater modeling is an integral part of this management process. However, modeling of CBM impacts presents some unique challenges, as processes that are operative at two very different scales must be adequately represented in the models. The impacts of large-scale gas extraction may be felt over a large area, yet despite the significant upscaling that accompanies construction of a regional model, near-well conditions and processes cannot be ignored. These include the highly heterogeneous nature of many coal measures, and the dual-phase flow of water and gas that is induced by coal seam depressurization. To understand these challenges, a fine-scale model was constructed incorporating a detailed representation of lithological heterogeneity to ensure that near-well processes and conditions could be examined. The detail of this heterogeneity was at a level not previously employed in models built to assess groundwater impacts arising from CBM extraction. A dual-phase reservoir simulator was used to examine depressurization and water desaturation processes in the vicinity of an extractive wellfield within this fine-scale model. A single-phase simulator was then employed so that depressurization errors incurred by neglecting near-well, dual-phase flow could be explored. Two models with fewer lithological details were then constructed in order to examine the nature of depressurization errors incurred by upscaling and to assess the interaction of the upscaling process with the

  4. Simplified Predictive Models for CO2 Sequestration Performance Assessment

    Science.gov (United States)

    Mishra, Srikanta; RaviGanesh, Priya; Schuetter, Jared; Mooney, Douglas; He, Jincong; Durlofsky, Louis

    2014-05-01

    We present results from an ongoing research project that seeks to develop and validate a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formation. The overall research goal is to provide tools for predicting: (a) injection well and formation pressure buildup, and (b) lateral and vertical CO2 plume migration. Simplified modeling approaches that are being developed in this research fall under three categories: (1) Simplified physics-based modeling (SPM), where only the most relevant physical processes are modeled, (2) Statistical-learning based modeling (SLM), where the simulator is replaced with a "response surface", and (3) Reduced-order method based modeling (RMM), where mathematical approximations reduce the computational burden. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. In the first category (SPM), we use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of fractional-flow curve, variance of layer permeability values, and the nature of vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. In the second category (SLM), we develop statistical "proxy models" using the simulation domain described previously with two different approaches: (a) classical Box-Behnken experimental design with a quadratic response surface fit, and (b) maximin Latin Hypercube sampling (LHS) based design with a Kriging metamodel fit using a quadratic trend and Gaussian correlation structure. For roughly the same number of
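The maximin Latin hypercube design used for the SLM proxy models in this record can be sketched in a few lines: generate several random Latin hypercube designs on the unit hypercube and keep the one whose minimum pairwise point distance is largest. This is a minimal sketch of the sampling step only (candidate counts and sizes are arbitrary), not of the Kriging metamodel fit:

```python
import itertools, math, random

def latin_hypercube(n, dims, rng):
    """One random LH design: exactly one point per stratum in each dimension."""
    cols = []
    for _ in range(dims):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return list(zip(*cols))

def min_distance(design):
    """Smallest pairwise Euclidean distance in the design."""
    return min(math.dist(a, b) for a, b in itertools.combinations(design, 2))

def maximin_lhs(n, dims, candidates=50, seed=0):
    """Keep the candidate design with the largest minimum pairwise distance."""
    rng = random.Random(seed)
    designs = [latin_hypercube(n, dims, rng) for _ in range(candidates)]
    return max(designs, key=min_distance)

design = maximin_lhs(n=20, dims=3)
print(f"min pairwise distance: {min_distance(design):.3f}")
```

Each column of the returned design hits every one of the `n` strata exactly once, which is the Latin property; the maximin criterion then spreads the points apart.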

  5. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
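The defect-density comparison in this record reduces to a simple ratio, conventionally reported as defects per thousand lines of code (KLOC). A sketch with invented counts (not the study's data):

```python
# Defect density = defect count per thousand source lines of code (KLOC).
def defect_density(defects, sloc):
    return defects / (sloc / 1000.0)

projects = {
    "climate_model": (25, 400_000),     # (defect count, source lines of code)
    "open_source_app": (3200, 500_000),
}
for name, (defects, sloc) in projects.items():
    print(f"{name}: {defect_density(defects, sloc):.3f} defects/KLOC")
```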

  6. Development and weighting of a life cycle assessment screening model

    Science.gov (United States)

    Bates, Wayne E.; O'Shaughnessy, James; Johnson, Sharon A.; Sisson, Richard

    2004-02-01

    Nearly all life cycle assessment tools available today are high-priced, comprehensive, quantitative models requiring a significant amount of data collection and data input. In addition, most of the available software packages require a great deal of training time to learn to operate the model software. Even after this time investment, results are not guaranteed because of the number of estimations and assumptions often necessary to run the model. As a result, product development and design teams and environmental specialists need a simplified tool that allows the qualitative evaluation and "screening" of various design options. This paper presents the development and design of a generic, qualitative life cycle screening model and demonstrates its applicability and ease of use. The model uses qualitative environmental, health and safety factors, based on site- or product-specific issues, to sensitize the overall results for a given set of conditions. The paper also evaluates the impact of different population input ranking values on model output. The final analysis is based on site- or product-specific variables. The user can then evaluate various design changes and the apparent impact or improvement on the environment, health and safety, compliance cost and overall corporate liability. Major input parameters can be varied, and factors such as materials use, pollution prevention, waste minimization, worker safety, product life, environmental impacts, return on investment, and recycling are evaluated. The flexibility of the model format is discussed to demonstrate its applicability and usefulness within nearly any industry sector. Finally, an example using audience input value scores is compared to other population input results.

  7. Assessment of RANS CFD modelling for pressurised thermal shock analysis

    International Nuclear Information System (INIS)

    Sander M. Willemsen; Ed M.J. Komen

    2005-01-01

    The most severe Pressurised Thermal Shock (PTS) scenario is a cold-water Emergency Core Coolant (ECC) injection into the cold leg during a LOCA. The injected ECC water mixes with the hot fluid present in the cold leg and flows towards the downcomer, where further mixing takes place. When the cold mixture comes into contact with the Reactor Pressure Vessel (RPV) wall, it may lead to large temperature gradients and consequently to high stresses in the RPV wall. Knowledge of these thermal loads is important for RPV remnant life assessments. The existing thermal-hydraulic system codes currently applied for this purpose are based on one-dimensional approximations and therefore cannot predict the complex three-dimensional flows occurring during ECC injection. Computational Fluid Dynamics (CFD) can be applied to predict these phenomena, with the ultimate benefit of improved remnant RPV life assessment. This paper presents an assessment of various Reynolds-Averaged Navier-Stokes (RANS) CFD approaches for modelling the complex mixing phenomena occurring during ECC injection. The assessment was performed by comparing numerical results obtained using advanced turbulence models available in the CFX 5.6 CFD code, in combination with a hybrid meshing strategy, with experimental results from the Upper Plenum Test Facility (UPTF). The UPTF was a full-scale 'simulation' of the primary system of the four-loop 1300 MWe Siemens/KWU Pressurised Water Reactor at Grafenrheinfeld. The test vessel upper plenum internals, downcomer and primary coolant piping were replicas of the reference plant, while other components, such as the core, coolant pumps and steam generators, were replaced by simulators. From the extensive test programme, a single-phase fluid-fluid mixing experiment in the cold leg and downcomer was selected. Prediction of the mixing and stratification is assessed by comparison with the measured temperature profiles at several locations.

  8. Permafrost Degradation Risk Zone Assessment using Simulation Models

    DEFF Research Database (Denmark)

    Daanen, R.P.; Ingeman-Nielsen, Thomas; Marchenko, S.

    2011-01-01

    In this proof-of-concept study we focus on linking large-scale climate and permafrost simulations to small-scale engineering projects, bridging the gap between climate and permafrost sciences on the one hand and technical recommendations for the adaptation of planned infrastructure on the other ... as the potential active layer increase due to climate warming and surface alterations. PTP is then used in a simple risk assessment procedure useful for engineering applications. The modelling shows that climate warming will result in continuing wide-spread permafrost warming and degradation in Greenland ...

  9. Status and challenges in risk assessment - the DREAM model

    Energy Technology Data Exchange (ETDEWEB)

    Johnsen, Staale

    1998-12-01

    This publication relates to the Norwegian DREAM project. The objective of the project is to develop a general environmental risk analysis methodology that allows for time-varying sub-lethal exposures of marine biota to discharge plumes composed of mixtures of chemicals. The DREAM project builds on research in this area over the past five years; a forerunner of the DREAM model, PROVANN, is presently available and has been applied in regional risk assessment studies in the Norwegian seas. 7 figs., 2 tabs.

  10. Uncertainty Assessment in Long Term Urban Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    ... on the rainfall inputs. In order to handle the uncertainties, three different stochastic approaches are investigated, applying a case catchment in the town of Frejlev: (1) a reliability approach in which a parameterization of the rainfall input is conducted in order to generate synthetic rainfall events and find ... return periods, and even within the return periods specified in the design criteria. If urban drainage models are based on standard parameters and hence not calibrated, the uncertainties are even larger. The greatest uncertainties are shown to be the rainfall input and the assessment of the contributing ...

  11. Model quality assessment using distance constraints from alignments

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Karplus, Kevin

    2008-01-01

    ... that model which is closest to the true structure. In this article, we present a new approach for addressing the MQA problem. It is based on distance constraints extracted from alignments to templates of known structure, and is implemented in the Undertaker program for protein structure prediction. One novel ... with the best MQA methods that were assessed at CASP7. We also propose a new evaluation measure, Kendall's tau, which is more interpretable than the conventional measures used for evaluating MQA methods (Pearson's r and Spearman's rho). We show clear examples where Kendall's tau agrees much more with our intuition ...
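Kendall's tau, the rank-agreement measure this record proposes for evaluating MQA methods, is simple to compute: the fraction of concordant minus discordant pairs between two score lists. A minimal sketch (tau-a, no tie handling; the quality scores are invented):

```python
import itertools

def kendall_tau(xs, ys):
    """Kendall's tau-a between paired score lists (assumes no ties)."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in itertools.combinations(zip(xs, ys), 2):
        s = (x1 - x2) * (y1 - y2)  # same sign in both lists => concordant pair
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(xs)
    return (concordant - discordant) / (n * (n - 1) / 2)

predicted_quality = [0.9, 0.7, 0.8, 0.4]   # an MQA method's scores
true_quality = [0.85, 0.60, 0.75, 0.50]    # true model quality
print(kendall_tau(predicted_quality, true_quality))  # → 1.0 (identical ranking)
```

A tau of 1 means the MQA method ranks the candidate models exactly as the true quality does, and -1 means it ranks them exactly backwards, which is what makes the measure easy to interpret.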

  12. Probabilistic Modeling and Risk Assessment of Cable Icing

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee

    This dissertation addresses issues related to the icing of structures, with special emphasis on bridge cables. Cable-supported bridges in cold climates suffer from ice accreting on the cables, which poses three different undesirable situations. Firstly, the changed shape of the cable due to ice accretion can lead to large-amplitude vibrations, which might reduce the fatigue life of the cables significantly. Secondly, ice shedding from the cables poses a safety issue for the users of the bridge, which leads to the third issue: the consequences of ice shedding from the bridge cables, which can cause socioeconomically expensive bridge closures and traffic disruptions. The objective is to develop a simple model that can be used to assess the occurrence probability of ice accretion on bridge cables from readily available meteorological variables. This model is used ...

  13. Lysimeter data as input to performance assessment models

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.

    1998-01-01

    The Field Lysimeter Investigations: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste forms in a disposal environment. Waste forms fabricated using ion-exchange resins from EPICOR-117 prefilters employed in the cleanup of the Three Mile Island (TMI) Nuclear Power Station are being tested to develop a low-level waste data base and to obtain information on survivability of waste forms in a disposal environment. The program includes reviewing radionuclide releases from those waste forms in the first 7 years of sampling and examining the relationship between code input parameters and lysimeter data. Also, lysimeter data are applied to performance assessment source term models, and initial results from use of data in two models are presented

  14. Operation quality assessment model for video conference system

    Science.gov (United States)

    Du, Bangshi; Qi, Feng; Shao, Sujie; Wang, Ying; Li, Weijian

    2018-01-01

    Video conference systems have become an important support platform for smart grid operation and management, and their operation quality is of growing concern to grid enterprises. First, an evaluation indicator system covering network, business, and operation-maintenance aspects was established on the basis of the video conference system's operation statistics. Then, an operation quality assessment model combining a genetic algorithm with a regularized BP neural network was proposed, which outputs the operation quality level of the system within a time period and provides company managers with optimization advice. The simulation results show that the proposed evaluation model offers the advantages of fast convergence and high prediction accuracy in contrast with a plain regularized BP neural network, and that its generalization ability is superior to LM-BP and Bayesian BP neural networks.

  15. Usage models in reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Pulkkinen, U.; Korhonen, J.

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA-project report series will address the diversity requirements in safety-critical software-based systems, generation of test data from operational profiles, and handling of programmable automation in plant PSA studies. In this report the issues related to statistical testing, and especially automated test case generation, are considered. The goal is to find an efficient method for building usage models for the generation of a statistically significant set of test cases and to gather practical experience with this method by applying it in a case study. The scope of the study also includes tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.)
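
    As an illustration of the usage-model approach the report investigates, a Markov-chain usage model can drive automated test case generation by random walks over the states of use. The states, transition probabilities, and function names below are invented for this sketch and are not taken from the OHA report:

```python
import random

random.seed(7)

# Toy usage model: each state maps to (next_state, probability) pairs.
# States and probabilities are assumed for illustration only.
USAGE_MODEL = {
    "start":  [("menu", 1.0)],
    "menu":   [("query", 0.6), ("config", 0.3), ("exit", 0.1)],
    "query":  [("menu", 0.7), ("exit", 0.3)],
    "config": [("menu", 1.0)],
}

def generate_test_case(model, start="start", end="exit", max_len=50):
    """Random walk over the usage model; each path is one statistical test case."""
    path, state = [start], start
    while state != end and len(path) < max_len:
        states, weights = zip(*model[state])
        state = random.choices(states, weights=weights)[0]
        path.append(state)
    return path

# A statistically sampled suite of 100 test cases
suite = [generate_test_case(USAGE_MODEL) for _ in range(100)]
```

Each generated path visits states in proportion to the modelled operational profile, which is what makes the resulting test set statistically representative of expected use.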

  16. Assessing policies towards sustainable transport in Europe: an integrated model

    International Nuclear Information System (INIS)

    Zachariadis, Theodoros

    2005-01-01

    A transport simulation and forecast model is presented, which is designed for the assessment of policy options aiming to achieve sustainability in transportation. Starting from a simulation of the economic behaviour of consumers and producers within a microeconomic optimisation framework and the resulting calculation of the modal split, the allocation of the vehicle stock into vintages and technological groups is modelled. In a third step, a technology-oriented algorithm, which incorporates the relevant state-of-the-art knowledge in Europe, calculates emissions of air pollutants and greenhouse gases as well as appropriate indicators for traffic congestion, noise and road accidents. The paper outlines the methodology and the basic data sources used in connection with work done so far in Europe, presents the outlook according to a 'reference case' run for the 15 current European Union Member States up to 2030, displays aggregate results from a number of alternative scenarios and outlines elements of future work

  17. GERMcode: A Stochastic Model for Space Radiation Risk Assessment

    Science.gov (United States)

    Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2012-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal

  18. TECHNICAL PRODUCT RISK ASSESSMENT: STANDARDS, INTEGRATION IN THE ERM MODEL AND UNCERTAINTY MODELING

    Directory of Open Access Journals (Sweden)

    Mirko Djapic

    2016-03-01

    Full Text Available Through the introduction of the New Approach to technical harmonization and standardization, the European Union has accomplished a breakthrough in the field of technical product safety and conformity assessment, integrating product safety requirements into the process of product development. This is achieved by quantifying risk levels with the aim of determining the scope of the required safety measures and systems. The theory of probability is used as a tool for modeling uncertainties in the assessment of that risk. In the last forty years, however, new mathematical theories have been developed that have proven better at modeling uncertainty when there are not enough data about uncertain events, which is usually the case in product development. Bayesian networks, based on the modeling of subjective probability, and evidence networks, based on the Dempster-Shafer theory of belief functions, have proved to be excellent tools for modeling uncertainty when we do not have enough information about all aspects of an event.

  19. Quality assessment in higher education using the SERVQUAL model

    Directory of Open Access Journals (Sweden)

    Sabina Đonlagić

    2015-01-01

    Full Text Available The economy of Bosnia and Herzegovina is striving towards growth and increased employment, and empirical studies worldwide have shown that higher education contributes to the socio-economic development of a country. Universities are important for the generation, preservation and dissemination of knowledge in order to contribute to the socio-economic benefits of a country. Higher education institutions are being pressured to improve the value of their activities, and providing quality higher education service to students should be taken seriously. In this paper we address the emerging demand for quality in higher education. Higher education institutions should assess the quality of their services and establish methods for improving it, and quality assurance activities should be integrated into the management process at these institutions. This paper addresses the issue of service quality measurement in higher education institutions. The most frequently used model in this context is the SERVQUAL model, which measures quality from the students' point of view, since students are considered one of the most important stakeholders of a higher education institution. The main objective of this research is to provide empirical evidence that the adapted SERVQUAL model can be used in higher education and to identify the service quality gap based on its application at one institution of higher education (a Faculty of Economics in Bosnia and Herzegovina). Furthermore, the results of the gap analysis using the SERVQUAL methodology provide relevant information on the areas in which improvement is necessary in order to enhance service quality.

  20. Biosphere model for assessing doses from nuclear waste disposal

    International Nuclear Information System (INIS)

    Zach, R.; Amiro, B.D.; Davis, P.A.; Sheppard, S.C.; Szekeley, J.G.

    1994-01-01

    The biosphere model, BIOTRAC, for predicting long-term nuclide concentrations and radiological doses from Canada's nuclear fuel waste disposal concept of a vault deep in plutonic rock of the Canadian Shield is presented. This generic, boreal-zone biosphere model is based on scenario analysis and systems variability analysis using Monte Carlo simulation techniques. Conservatism is used to bridge uncertainties, even though this creates a small amount of extra nuclide mass. Environmental change over the very long assessment period is mainly handled through distributed parameter values. The dose receptors are a critical group of humans and four generic non-human target organisms. BIOTRAC includes six integrated submodels, and it interfaces smoothly with a geosphere model. This interface includes a bedrock well. The geosphere model defines the discharge zones of deep groundwater where nuclides released from the vault enter the biosphere occupied by the dose receptors. The size of one of these zones is reduced when water is withdrawn from the bedrock well. Sensitivity analysis indicates that 129I is by far the most important radionuclide. Results also show that bedrock-well water leads to higher doses to man than lake water, but the former doses decrease with the size of the critical group. Under comparable circumstances, doses to the non-human biota are greater than those for man.

  1. The Use of Logistic Model in RUL Assessment

    Science.gov (United States)

    Gumiński, R.; Radkowski, S.

    2017-12-01

    The paper takes on the issue of assessment of remaining useful life (RUL). The goal of the paper is to develop a method that enables the use of diagnostic information in the task of reducing the uncertainty related to technical risk. Prediction of the RUL of a system is a very important task for maintenance strategy. In the literature, the RUL of an engineering system is defined as the first future time instant at which thresholds on conditions (safety, operational quality, maintenance cost, etc.) are violated. Knowledge of RUL offers the possibility of planning testing and repair activities, and building models of damage development is important for this task. In the presented work, a logistic function is used to model fatigue crack development. It should be remembered that modeling every phase of damage development at once is very difficult; modeling each phase separately, especially with on-line diagnostic information included, is more effective. Particular attention was paid to the possibility of forecasting the occurrence of fatigue damage based on the analysis of the structure of a vibroacoustic signal.
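
    As a sketch of the idea, a logistic damage-growth curve can be inverted to find the first time a crack threshold is violated, which is the RUL definition given above. All parameter values, units, and function names here are illustrative assumptions, not the paper's calibration:

```python
import math

def crack_length(t, a_max=10.0, k=0.8, t_mid=50.0):
    """Logistic model of fatigue crack length (mm) versus time (assumed parameters)."""
    return a_max / (1.0 + math.exp(-k * (t - t_mid)))

def remaining_useful_life(t_now, a_crit, a_max=10.0, k=0.8, t_mid=50.0):
    """First future instant when the crack reaches the critical threshold a_crit,
    minus the current time; zero if the threshold is already violated."""
    # Invert the logistic curve: t_crit = t_mid - ln(a_max/a_crit - 1)/k
    t_crit = t_mid - math.log(a_max / a_crit - 1.0) / k
    return max(t_crit - t_now, 0.0)

rul = remaining_useful_life(t_now=40.0, a_crit=8.0)
```

In practice the parameters would be re-estimated as new diagnostic observations arrive, which is how on-line information narrows the RUL uncertainty.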

  2. SCORING ASSESSMENT AND FORECASTING MODELS BANKRUPTCY RISK OF COMPANIES

    Directory of Open Access Journals (Sweden)

    SUSU Stefanita

    2014-07-01

    Full Text Available Bankruptcy risk has been the subject of many research studies that aim at identifying the time of bankruptcy, the factors that contribute to this state, and the indicators that best express this orientation towards bankruptcy. The threats to enterprises require that managers continually know the economic and financial situation and the vulnerable areas with development potential. Managers need to identify and properly manage the threats that would prevent them from achieving their targets. The methods known in the literature for assessing and evaluating bankruptcy risk comprise static, functional, strategic, scoring and nonfinancial models. This article addresses the Altman and Conan-Holder models, known internationally, as well as a model developed at the national level by two professors from prestigious universities in our country: the Robu-Mironiuc model. These models are applied to data from the profit and loss account and balance sheet of the Turism Covasna company, over which the bankruptcy risk analysis is performed. The results of the analysis are interpreted while trying to formulate solutions for the economic and financial viability of the entity.
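
    The Altman model referenced above is the classic Z-score, whose published form for public manufacturing firms can be sketched as follows. The balance-sheet figures in the example are hypothetical and are not Turism Covasna's data:

```python
def altman_z(wc, re, ebit, mve, sales, ta, tl):
    """Original Altman (1968) Z-score for a public manufacturing firm."""
    x1 = wc / ta      # working capital / total assets
    x2 = re / ta      # retained earnings / total assets
    x3 = ebit / ta    # EBIT / total assets
    x4 = mve / tl     # market value of equity / total liabilities
    x5 = sales / ta   # sales / total assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    """Classify against Altman's published cut-offs."""
    if z > 2.99:
        return "safe"
    if z < 1.81:
        return "distress"
    return "grey"

# Hypothetical firm, all figures in the same currency unit
z = altman_z(wc=120, re=200, ebit=90, mve=500, sales=900, ta=1000, tl=400)
```

A score in the grey zone is exactly the situation where the article's comparison of several scoring models (Altman, Conan-Holder, Robu-Mironiuc) becomes useful.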

  3. The MARINA model (Model to Assess River Inputs of Nutrients to seAs)

    OpenAIRE

    Strokal, Maryna; Kroeze, Carolien; Wang, Mengru; Bai, Zhaohai; Ma, Lin

    2016-01-01

    Chinese agriculture has been developing fast towards industrial food production systems that discharge nutrient-rich wastewater into rivers. As a result, nutrient export by rivers has been increasing, resulting in coastal water pollution. We developed a Model to Assess River Inputs of Nutrients to seAs (MARINA) for China. The MARINA Nutrient Model quantifies river export of nutrients by source at the sub-basin scale as a function of human activities on land. MARINA is a downscaled version for...

  4. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    Science.gov (United States)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  5. Uncertainty assessment in building energy performance with a simplified model

    Directory of Open Access Journals (Sweden)

    Titikpina Fally

    2015-01-01

    Full Text Available To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared to the measured consumption when the building is operational. When evaluating this performance, many buildings show significant differences between the calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved not only in measurement but also those induced by the propagation of the dynamic and static input data through the model being used. The evaluation of measurement uncertainty is based on both the knowledge about the measurement process and the input quantities which influence the result of measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics presented in the Guide to the Expression of Uncertainty in Measurement (GUM) as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for the estimation of the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. An office building has been monitored, and multiple temperature sensors have been mounted at candidate locations to obtain the required data. The monitored zone is composed of six offices and has an overall surface of 102 m2.
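
    A minimal sketch of the MCS approach mentioned above: assumed input distributions are propagated through a deliberately simplified steady-state heat-loss model. The U-value, temperature distributions, and sample size are illustrative assumptions, not the monitored building's data; only the 102 m2 floor area is taken from the abstract:

```python
import random
import statistics

random.seed(42)

def heat_loss_kwh(u, area, dT, hours=24 * 30):
    """Simplified steady-state transmission loss over one month (kWh)."""
    return u * area * dT * hours / 1000.0

# Assumed input distributions (illustrative, not measured values)
samples = [
    heat_loss_kwh(u=random.gauss(0.35, 0.05),   # U-value, W/(m2 K)
                  area=102.0,                    # monitored zone area, m2
                  dT=random.gauss(15.0, 2.0))    # indoor-outdoor temp. diff., K
    for _ in range(10_000)
]

mean_q = statistics.mean(samples)   # best estimate of monthly loss
std_q = statistics.stdev(samples)   # standard uncertainty of the output
```

Unlike the GUM's first-order propagation, MCS makes no linearity assumption: the full output distribution is available, so coverage intervals can be read off the sample quantiles directly.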

  6. FLS and FES: comprehensive models of training and assessment.

    Science.gov (United States)

    Vassiliou, Melina C; Dunkin, Brian J; Marks, Jeffrey M; Fried, Gerald M

    2010-06-01

    The Fundamentals of Laparoscopic surgery (FLS) is a validated program for the teaching and evaluation of the basic knowledge and skills required to perform laparoscopic surgery. The educational component includes didactic, Web-based material and a simple, affordable physical simulator with specific tasks and a recommended curriculum. FLS certification requires passing a written multiple-choice examination and a proctored manual skills examination in the FLS simulator. The metrics for the FLS program have been rigorously validated to meet the highest educational standards, and certification is now a requirement for the American Board of Surgery. This article summarizes the validation process and the FLS-related research that has been done to date. The Fundamentals of Endoscopic Surgery is a program modeled after FLS with a similar mission for flexible endoscopy. It is currently in the final stages of development and will be launched in April 2010. The program also includes learning and assessment components, and is undergoing the same meticulous validation process as FLS. These programs serve as models for the creation of simulation-based tools to teach skills and assess competence with the intention of optimizing patient safety and the quality of surgical education. Copyright 2010 Elsevier Inc. All rights reserved.

  7. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G. Saulnier and W. Statham

    2006-04-16

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics analogous to the Yucca Mountain repository site: (1) Analogous source--UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--fractured, welded, and altered rhyolitic ash-flow tuffs; (3) Analogous climate--semiarid to arid; (4) Analogous setting--volcanic tuffs overlying carbonate rocks; (5) Analogous geochemistry--oxidizing conditions; and (6) Analogous hydrogeology--the ore deposit lies in the unsaturated zone above the water table.

  8. Modelling Approach to Assess Future Agricultural Water Demand

    Science.gov (United States)

    Spano, D.; Mancosu, N.; Orang, M.; Sarreshteh, S.; Snyder, R. L.

    2013-12-01

    The combination of long-term climate changes (e.g., warmer average temperatures) and extreme events (e.g., droughts) can have decisive impacts on water demand, with further implications for ecosystems. In countries already affected by water scarcity, water management problems are becoming increasingly serious. Sustainable management of available water resources at the global, regional, and site-specific level is necessary. In agriculture, the first step is to compute how much water crops need under given climate conditions. A modelling approach is one way to compute the crop water requirement (CWR). In this study, the improved version of the SIMETAW model was used. The model is a user-friendly soil water balance model developed by the University of California, Davis, the California Department of Water Resources, and the University of Sassari. The SIMETAW# model assesses CWR and generates hypothetical irrigation schedules for a wide range of irrigated crops experiencing full, deficit, or no irrigation. The model computes the evapotranspiration of applied water (ETaw), which is the net amount of irrigation water needed to match losses due to crop evapotranspiration (ETc). ETaw is determined by first computing reference evapotranspiration (ETo) using the daily standardized Reference Evapotranspiration equation. ETaw is computed as ETaw = CETc - CEr, where CETc and CEr are the cumulative total crop ET and cumulative effective rainfall, respectively. Crop evapotranspiration is estimated as ETc = ETo x Kc, where Kc is the corrected midseason tabular crop coefficient, adjusted for climate conditions. The net irrigation amounts are determined from a daily soil water balance, using an integrated approach that considers soil and crop management information and the daily ETc estimates.
Using input information on irrigation system distribution uniformity and runoff, when appropriate, the model estimates the applied water to the low quarter of the
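
    The ETaw bookkeeping described above can be sketched as a daily sum. The five-day inputs are hypothetical, and the code illustrates only the published equations (ETc = ETo x Kc, ETaw = CETc - CEr), not the SIMETAW# implementation:

```python
def daily_etc(eto, kc):
    """Crop evapotranspiration: ETc = ETo x Kc (mm/day)."""
    return eto * kc

def etaw(daily_eto, daily_kc, daily_effective_rain):
    """ETaw = cumulative ETc minus cumulative effective rainfall (CETc - CEr),
    floored at zero when rain fully covers the crop's demand."""
    cetc = sum(daily_etc(e, k) for e, k in zip(daily_eto, daily_kc))
    cer = sum(daily_effective_rain)
    return max(cetc - cer, 0.0)

# Hypothetical 5-day example (all values in mm/day)
demand = etaw(daily_eto=[5.0, 5.5, 6.0, 5.8, 5.2],
              daily_kc=[1.05] * 5,
              daily_effective_rain=[0.0, 3.0, 0.0, 0.0, 1.5])
```

The full model would additionally track soil water storage day by day and adjust for distribution uniformity and runoff, as the abstract notes.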

  9. Pluripotent stem cells: An in vitro model for nanotoxicity assessments.

    Science.gov (United States)

    Handral, Harish K; Tong, Huei Jinn; Islam, Intekhab; Sriram, Gopu; Rosa, Vinicus; Cao, Tong

    2016-10-01

    The advent of technology has led to an established range of engineered nanoparticles that are used in diverse applications, such as cell-cell interactions, cell-material interactions, medical therapies and the targeted modulation of cellular processes. The exponential increase in the utilization of nanomaterials and the growing number of associated criticisms have highlighted the potential risks of nanomaterials to human health and the ecosystem. The existing in vivo and in vitro platforms show limitations, with fluctuations being observed in the results of toxicity assessments. Pluripotent stem cells (PSCs) are a viable source of cells that are capable of developing into the specialized cells of the human body. PSCs can be efficiently used to screen new biomaterials/drugs and are potential candidates for studying impairments of biophysical morphology at both the cellular and tissue levels during interactions with nanomaterials and for diagnosing toxicity. Three-dimensional in vitro models obtained using PSC-derived cells would provide a realistic, patient-specific platform for toxicity assessments and for drug screening applications. The current review focuses on PSCs as an alternative in vitro platform for assessing the hazardous effects of nanomaterials on health systems and highlights the importance of PSC-derived in vitro platforms. Copyright © 2016 John Wiley & Sons, Ltd.

  10. EXPERT MODEL OF LAND SUITABILITY ASSESSMENT FOR CROPS

    Directory of Open Access Journals (Sweden)

    Boris Đurđević

    2010-12-01

    Full Text Available A total of 17404 soil samples (2003-2009) were analysed in eastern Croatia. The largest number of soil samples belongs to Osijek-Baranja county, which, together with both eastern sugar beet factories (Osijek and Županja), conducts the soil fertility control (~4200 samples/yr). The computer model for the assessment of land suitability for crops, supported by GIS, proved to be fast, efficient and sufficiently reliable in terms of the number of analyzed soil samples. It allows the visualization of the agricultural area and prediction of its production properties for the purposes of analysis, planning and rationalization of agricultural production. With more precise data about the soil (soil, climate) and a reliable Digital Soil Map of Croatia, the model could be acceptable not only for evaluating the suitability for growing different crops but also for their need for fertilizer, necessary machinery, repairs (liming) and other measures of organic matter input, with the aim of eliminating or reducing the effects of limiting factors in primary agricultural production. The assessment of the relative suitability of soil for crop production by the computer model and the geostatistical kriging method in Osijek-Baranja county showed: (1) average soil suitability is 60.06 percent; (2) kriging predicted that 51751 ha (17.16%) are not suitable (N1) for growing crops, whereas (a) 86142 ha (28.57%) of land are marginally suitable (S3), (b) 132789 ha (44.04%) are moderately suitable (S2) and (c) 30772 ha (10.28%) are of excellent fertility (S1). The large volume of eastern Croatian land data showed that the computer-geostatistical model for determining soil suitability for growing crops is automated, fast and simple to use, suitable for GIS implementation and for automatically retrieving the necessary suitability indicators from the input bases (land, analytical and climate) as well as data from digital soil maps, and able to: (a) visualize the suitability for soil tillage, (b) predict the

  11. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  12. The MARINA model (Model to Assess River Inputs of Nutrients to seAs)

    NARCIS (Netherlands)

    Strokal, Maryna; Kroeze, Carolien; Wang, Mengru; Bai, Zhaohai; Ma, Lin

    2016-01-01

    Chinese agriculture has been developing fast towards industrial food production systems that discharge nutrient-rich wastewater into rivers. As a result, nutrient export by rivers has been increasing, resulting in coastal water pollution. We developed a Model to Assess River Inputs of Nutrients

  13. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows for varying a cut-off parameter that enables selection of the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8% and 80.4%, and balanced accuracy: 80.6% and 80.8%) and highest inter-rater agreement [kappa (κ): 0
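
    One simple way to realize such a Bayesian consensus with a tunable cut-off is a naive-Bayes combination of the binary tool outputs. The sensitivities, specificities, prior, and function names below are invented for illustration; this is a sketch of the general technique, not the authors' published model:

```python
def ensemble_posterior(predictions, sens, spec, prior=0.5):
    """Naive-Bayes combination of binary tool predictions (1 = active/toxic).
    sens/spec are each tool's assumed sensitivity and specificity."""
    p_pos, p_neg = prior, 1.0 - prior
    for y, se, sp in zip(predictions, sens, spec):
        if y == 1:
            p_pos *= se           # P(tool says toxic | toxic)
            p_neg *= (1.0 - sp)   # P(tool says toxic | non-toxic)
        else:
            p_pos *= (1.0 - se)
            p_neg *= sp
    return p_pos / (p_pos + p_neg)

def classify(predictions, sens, spec, cutoff=0.5):
    """Lowering the cut-off trades specificity for sensitivity (fewer false negatives)."""
    return int(ensemble_posterior(predictions, sens, spec) >= cutoff)

# Three hypothetical tools, two of which flag the chemical
post = ensemble_posterior([1, 1, 0], sens=[0.8, 0.7, 0.9], spec=[0.75, 0.8, 0.7])
```

Sweeping the cut-off from high to low traces out the sensitivity/specificity trade-off the abstract describes, which is what makes the ensemble tunable toward minimizing false negatives.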

  14. Korean risk assessment model for breast cancer risk prediction.

    Directory of Open Access Journals (Sweden)

    Boyoung Park

    Full Text Available PURPOSE: We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. METHODS: Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risk produced using the modified Gail model, which applied Korean incidence and mortality data and the parameter estimators from the original Gail model, with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and the National Cancer Center (NCC) cohort. RESULTS: The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risk for the cases than for the controls (p<0.001 and <0.001, respectively). The observed incidence of breast cancer in the two cohorts was similar to the expected incidence from the KoBCRAT (KMCC, p = 0.880; NCC, p = 0.878). The AUC using the KoBCRAT was 0.61 for the KMCC and 0.89 for the NCC cohort. CONCLUSIONS: Our findings suggest that the KoBCRAT is a better tool for predicting the risk of breast cancer in Korean women, especially urban women.

  15. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    Energy Technology Data Exchange (ETDEWEB)

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
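
    The quotient method that ECORSK.5 modifies reduces, at its core, to per-contaminant hazard quotients summed into a hazard index. The doses and toxicity reference values below are hypothetical, and this sketch omits the spatial gridding over Ecological Exposure Units that the model performs:

```python
def hazard_quotient(daily_dose, trv):
    """EPA quotient method: exposure dose divided by a toxicity reference value."""
    return daily_dose / trv

def hazard_index(doses, trvs):
    """Sum of per-contaminant hazard quotients for a receptor's home range;
    an index above 1 flags the potential for adverse effects."""
    return sum(hazard_quotient(d, t) for d, t in zip(doses, trvs))

# Hypothetical contaminant doses (mg/kg-day) and reference values (mg/kg-day)
hi = hazard_index(doses=[0.02, 0.005, 0.4], trvs=[0.1, 0.05, 2.0])
```

In the full model these doses come from food and incidental soil ingestion accumulated over the gridded exposure unit, and the resulting indices are compared to risk evaluation criteria.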

  16. Models for dose assessments. Modules for various biosphere types

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.; Aggeryd, I. [Studsvik Eco and Safety AB, Nykoeping (Sweden)]

    1999-12-01

    The main objective of this study was to provide a basis for illustrations of yearly dose rates to the most exposed individual from hypothetical leakages of radionuclides from a deep bedrock repository for spent nuclear fuel and other radioactive waste. The results of this study will be used in the safety assessment SR 97 and in a study on the design and long-term safety of a repository planned to contain long-lived low and intermediate level waste. The repositories will be designed to isolate the radionuclides for several hundred thousand years. In the SR 97 study, however, hypothetical scenarios for leakage are postulated. Radionuclides are hence assumed to be transported in the geosphere by groundwater and to eventually discharge into the biosphere. This may occur in several types of ecosystems. A number of categories of such ecosystems were identified, and the turnover of radionuclides was modelled separately for each ecosystem. Previous studies had focused on generic models for wells, lakes and coastal areas. These models were, in this study, developed further to use site-specific data. In addition, flows of groundwater containing radionuclides to agricultural land and peat bogs were considered. All these categories are referred to as modules in this report. Forest ecosystems were not included, due to a general lack of knowledge of biospheric processes in connection with discharge of groundwater in forested areas. Examples of each type of module were run with the assumption of a continuous annual release into the biosphere of 1 Bq for each radionuclide during 10 000 years. The results are presented as ecosystem specific dose conversion factors (EDFs) for each nuclide at the year 10 000, assuming stationary ecosystems and prevailing living conditions and habits. All calculations were performed with uncertainty analyses included. Simplifications and assumptions in the modelling of biospheric processes are discussed. The use of modules may be seen as a step
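
The module idea can be illustrated with a single well-mixed compartment receiving the report's unit release of 1 Bq per year. The real SR 97 modules are multi-compartment and site specific; every parameter below is invented:

```python
# Minimal single-compartment sketch: a constant release of 1 Bq/yr into a
# lake-like reservoir with first-order removal (radioactive decay plus water
# turnover). Parameters are illustrative, not taken from the SR 97 models.
import math

Q = 1.0                      # release rate, Bq/yr
decay = math.log(2) / 30.0   # e.g. a Cs-137-like half-life of ~30 yr
turnover = 0.5               # water exchange rate, 1/yr
lam = decay + turnover       # effective removal rate, 1/yr

def activity(t):
    """Compartment inventory (Bq) after t years of constant release."""
    return (Q / lam) * (1.0 - math.exp(-lam * t))

# Over long times the inventory approaches the steady state Q / lam, which is
# why dose conversion factors can be quoted at a fixed end year (here 10 000).
steady = Q / lam
print(activity(10000.0), steady)
```

The dose factor would then follow from the inventory via exposure pathways (water use, food, and so on), which this sketch does not model.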

  18. Environmental assessment of amine-based carbon capture: Scenario modelling with life cycle assessment (LCA)

    Energy Technology Data Exchange (ETDEWEB)

    Brekke, Andreas; Askham, Cecilia; Modahl, Ingunn Saur; Vold, Bjoern Ivar; Johnsen, Fredrik Moltu

    2012-07-01

    This report contains a first attempt at introducing the environmental impacts associated with amines and their derivatives in a life cycle assessment (LCA) of gas power production with carbon capture, and at comparing these with the other environmental impacts associated with the production system. The report aims to identify data gaps and methodological challenges connected both to modelling the toxicity of amines and derivatives and to the weighting of environmental impacts. A scenario-based modelling exercise was performed on a theoretical gas power plant with carbon capture, where emission levels of nitrosamines were varied from zero (gas power without CCS) to a worst-case level (outside the probable range of actual carbon capture facilities). Because of extensive research and development on solvents and emissions from carbon capture facilities in recent years, data used in the exercise may be outdated, and results should therefore not be taken at face value. The results from the exercise showed that, according to USEtox, emissions of nitrosamines are less important than emissions of formaldehyde with regard to toxicity related to operation of (i.e. both inputs to and outputs from) a carbon capture facility. If characterisation factors for emissions of metals are included, these outweigh all other toxic emissions in the study. None of the most recent weighting methods in LCA includes characterisation factors for nitrosamines, which are therefore not part of the environmental ranking. These results show that the EDecIDe project has an important role to play in developing LCA methodology useful for assessing the environmental performance of amine-based carbon capture in particular and CCS in general. The EDecIDe project will examine the toxicity models used in LCA in more detail, specifically USEtox. The applicability of the LCA compartment models and site-specificity issues for a Norwegian/Arctic situation will be explored. This applies to the environmental compartments
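
The toxicity ranking described above follows the usual LCA pattern: each emitted mass is multiplied by a substance-specific characterisation factor (CF) and the products are summed, so a substance without a factor (here, the nitrosamines) drops out of the score entirely. A hedged sketch with placeholder masses and factors (not USEtox values):

```python
# Sketch of LCA impact aggregation. Substance names, masses, and CFs are
# placeholders chosen for illustration only.
def impact_score(emissions_kg, cf):
    """Sum mass * characterisation factor; substances without a CF score 0."""
    return sum(m * cf.get(s, 0.0) for s, m in emissions_kg.items())

emissions = {"formaldehyde": 10.0, "nitrosamine": 0.5, "nickel": 0.01}
cf_human_tox = {"formaldehyde": 2.0, "nickel": 500.0}  # no CF for nitrosamine

score = impact_score(emissions, cf_human_tox)
print(score)  # the nitrosamine emission contributes nothing to the score
```

This is exactly the gap the report points to: a missing characterisation factor silently excludes a substance from the ranking.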

  19. Toxicological risk assessment of complex mixtures through the WTox model

    Directory of Open Access Journals (Sweden)

    William Gerson Matias

    2015-01-01

    Full Text Available Mathematical models are important tools for environmental management and risk assessment. Predictions about the toxicity of chemical mixtures must be enhanced due to the complexity of effects that can be caused to living species. In this work, the environmental risk was assessed by addressing the need to study the relationship between the organism and xenobiotics. Therefore, five toxicological endpoints were applied through the WTox Model, and with this methodology we obtained the risk classification of potentially toxic substances. Acute and chronic toxicity, cytotoxicity and genotoxicity were observed in the organisms Daphnia magna, Vibrio fischeri and Oreochromis niloticus. A case study was conducted with solid wastes from the textile, metal-mechanic, and pulp and paper industries. The results have shown that several industrial wastes induced mortality, reproductive effects, micronucleus formation and increases in the rate of lipid peroxidation and DNA methylation in the organisms tested. These results, analyzed together through the WTox Model, allowed classification of the environmental risk of the industrial wastes. The evaluation showed that the toxicological environmental risk of the samples analyzed can be classified as significant or critical.

  20. Model-driven Privacy Assessment in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Neureiter, Christian [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-09

    In a smart grid, data and information are transported, transmitted, stored, and processed, with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications, and privacy impacts therefore have to be taken into account. For an effective smart grid, well-integrated solutions are crucial, and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in the early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of the qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  1. Total water storage assessment using GRACE and a hydrological model

    Science.gov (United States)

    Fang, B.; Sridhar, V. R.; Billah, M.; Lakshmi, V.

    2017-12-01

    Seven climate and hydrological datasets from in-situ, gridded, model, and remote sensing data are used to estimate seasonal and annual variations in the water budget for the Chesapeake Bay watershed. Water storage estimates computed from different combinations of water budget inputs and model output within the water balance framework are compared with Gravity Recovery and Climate Experiment (GRACE)-derived terrestrial water storage change (TWSC). Among the estimates, a combined application of gridded in-situ and remotely sensed budget components produced reliable estimates of monthly water storage that matched the GRACE TWSC estimates. Water storage estimates generated by the Variable Infiltration Capacity (VIC) model were close to the GRACE estimates in the winter and spring seasons but diverged in summer and fall. When precipitation was limited, combined input of the water budget components showed the highest agreement in change in water storage in the Susquehanna River basin (combined -32 mm and GRACE -34 mm).
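
The storage comparison rests on simple water-balance bookkeeping: the change in terrestrial water storage over a period is precipitation minus evapotranspiration minus runoff. A sketch with illustrative numbers:

```python
# Water-balance bookkeeping behind a GRACE comparison. The monthly values
# below are invented for illustration.
def storage_change(p_mm, et_mm, q_mm):
    """dS = P - ET - Q, all terms in mm of water over the same period."""
    return p_mm - et_mm - q_mm

# A dry month: modest rain, high evapotranspiration, some runoff
ds = storage_change(p_mm=40.0, et_mm=55.0, q_mm=17.0)
print(ds)  # -32.0 mm of storage loss, comparable in kind to a GRACE TWSC value
```

In the study, each budget term comes from a different dataset (gauges, gridded products, remote sensing), and the resulting dS series is compared against the GRACE-derived storage change.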

  2. A hierarchical network modeling method for railway tunnels safety assessment

    Science.gov (United States)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Liu, Xumin

    2017-02-01

    Using network theory to model risk-related knowledge on accidents is regarded as potentially very helpful in risk management. A large amount of defect detection data for railway tunnels is collected every autumn in China, and it is extremely important to discover the regularities hidden in this database. In this paper, based on network theories and using data mining techniques, a new method is proposed for mining risk-related regularities to support risk management in railway tunnel projects. A hierarchical network (HN) model which takes into account the tunnel structures, tunnel defects, potential failures and accidents is established. An improved Apriori algorithm is designed to rapidly and effectively mine correlations between tunnel structures and tunnel defects. An algorithm is then presented to mine the risk-related regularities table (RRT) from the frequent patterns. Finally, a safety assessment method is proposed that considers actual defects together with the possible defect risks obtained from the RRT. This method can not only generate quantitative risk results but also reveal the key defects and critical defect risks. This paper is a further development of accident-causation network modeling methods and can provide guidance for specific maintenance measures.
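
The paper's improved Apriori algorithm is not specified here, but the underlying step it accelerates, counting the support of structure/defect combinations across inspection records, can be sketched as follows (records and labels are invented):

```python
# Apriori-style support counting for pairs: find structure/defect labels that
# co-occur in at least min_support inspection records. Illustrative only; the
# paper's improved algorithm is not reproduced here.
from itertools import combinations
from collections import Counter

records = [
    {"lining_crack", "water_seepage", "arch_section"},
    {"lining_crack", "arch_section"},
    {"water_seepage", "drainage_blocked"},
    {"lining_crack", "water_seepage", "arch_section"},
]

def frequent_pairs(records, min_support=2):
    counts = Counter()
    for r in records:
        for pair in combinations(sorted(r), 2):
            counts[pair] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

print(frequent_pairs(records))
```

Frequent pairs like (arch_section, lining_crack) are the raw material from which a risk-related regularities table could then be assembled.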

  3. Distributional aspects of emissions in climate change integrated assessment models

    International Nuclear Information System (INIS)

    Cantore, Nicola

    2011-01-01

    The recent failure of the Copenhagen negotiations shows that concrete actions are needed to create the conditions for a consensus over global emission reduction policies. A wide coalition of countries in international climate change agreements could be facilitated by the perceived fairness, among rich and poor countries, of the abatement sharing at the international level. In this paper I use two popular climate change integrated assessment models to investigate the path of future inequality in the emissions distribution and to decompose its components and sources. Results prove to be consistent with previous empirical studies, are robust to model comparison, and show that gaps in GDP across world regions will still play a crucial role in explaining different countries' contributions to global warming. - Research highlights: → I implement a scenario analysis with two global climate change models. → I analyse inequality in the distribution of emissions. → I decompose emissions inequality components. → I find that GDP per capita is the main Kaya identity source of emissions inequality. → Current rich countries will mostly remain responsible for emissions inequality.
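
The Kaya identity mentioned in the highlights factors a region's emissions into population, GDP per capita, energy intensity of GDP, and carbon intensity of energy. A sketch with invented numbers shows how a GDP-per-capita gap alone can set the emissions gap between two regions:

```python
# Kaya identity: CO2 = population * (GDP/capita) * (energy/GDP) * (CO2/energy).
# All regional figures below are invented for illustration.
def kaya_emissions(pop, gdp_per_cap, energy_per_gdp, co2_per_energy):
    return pop * gdp_per_cap * energy_per_gdp * co2_per_energy

rich = kaya_emissions(pop=1.0e9, gdp_per_cap=40_000, energy_per_gdp=5e-3, co2_per_energy=0.06)
poor = kaya_emissions(pop=1.0e9, gdp_per_cap=4_000, energy_per_gdp=5e-3, co2_per_energy=0.06)
print(rich / poor)  # with equal intensities, the GDP gap alone sets the ratio
```

Decomposing emissions inequality then amounts to asking which of the four factors differs most across regions, which is the sense in which GDP per capita is identified as the main source.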

  4. Modelling of a CFD Microscale Model and Its Application in Wind Energy Resource Assessment

    Directory of Open Access Journals (Sweden)

    Yue Jie-shun

    2016-01-01

    Full Text Available The prediction of the wind field near wind turbines has a significant effect on the safety as well as the economy of wind power generation. To assess the wind resource distribution within complex terrain, a computational fluid dynamics (CFD) based microscale wind farm forecast model is developed. The model uses the Reynolds Averaged Navier-Stokes (RANS) model to characterize the turbulence. By using the results of the Weather Research and Forecasting (WRF) mesoscale weather forecast model as the input of the CFD model, a coupled CFD-WRF model is established. A special method is used for the treatment of the information interchange on the lateral boundary between the two models. This coupled model is applied in predicting the wind field near a wind turbine in Hong Gang-zi, Jilin, China, and the results from this simulation are compared to real measured data. On this basis, the accuracy and efficiency of the turbulence characterization schemes are discussed. It indicates that this coupling system is easy to implement and can make the two separate models work in parallel. The CFD model coupled with WRF has the advantage of high accuracy and fast speed, which makes it valid for wind power generation.

  5. Independent Assessment of Instrumentation for ISS On-Orbit NDE. Volume 2; Appendices

    Science.gov (United States)

    Madaras, Eric I.

    2013-01-01

    The International Space Station (ISS) Structural and Mechanical Systems Manager requested that the NASA Engineering and Safety Center (NESC) provide a quantitative assessment of commercially available nondestructive evaluation (NDE) instruments for potential application to the ISS. This work supports risk mitigation as outlined in the ISS Integrated Risk Management Application (IRMA) Watch Item #4669, which addresses the requirement for structural integrity after an ISS pressure wall leak in the event of a penetration due to micrometeoroid and orbital debris (MMOD) impact. This document contains the appendices to the final report.

  6. Precast concrete unit assessment through GPR survey and FDTD modelling

    Science.gov (United States)

    Campo, Davide

    2017-04-01

    Precast concrete elements are widely used in United Kingdom house building, offering ease of assembly and added value such as structural integrity and sound and thermal insulation; the most common precast components include walls, beams, floors, panels, lintels, stairs, etc. Failure to follow the manufacturer's instructions during assembly, however, may induce cracking and short- or long-term loss of bearing capacity. GPR is a well-established non-destructive technique employed in the assessment of structural elements because of its real-time imaging, rapid data collection and ability to discriminate the finest structural details. In this work, GPR has been used to investigate two different precast elements: precast reinforced concrete planks constituting the roof slab of a school, and precast wood-cement blocks with pre-fitted insulation material used to build a perimeter wall of a private building. Visible cracks affected both constructions. For the assessment surveys, a GSSI 2.0 GHz GPR antenna was used because of the high resolution required and the small size of the antenna case (155 by 90 by 105 mm), enabling scanning up to 45 mm from any obstruction. Finite Difference Time Domain (FDTD) numerical modelling was also performed to build a scenario of the expected GPR signal response, both for preliminary real-time interpretation and to help resolve uncertainties due to complex reflection patterns: simulated radargrams were built using Reflex Software v. 8.2, reproducing the GPR pulse used for the surveys in terms of wavelet, nominal frequency, sample frequency and time window. Model geometries were derived from the design projects available both for the planks and the blocks; the electromagnetic properties of the materials (concrete, reinforcing bars, air-filled void, insulation and wooden concrete) were inferred from both values reported in the literature and a preliminary interpretation of radargrams where internal layer interfaces were clearly recognizable and

  7. Selection of hydrologic modeling approaches for climate change assessment: A comparison of model scale and structures

    Science.gov (United States)

    Surfleet, Christopher G.; Tullos, Desirèe; Chang, Heejun; Jung, Il-Won

    2012-09-01

    A wide variety of approaches to hydrologic (rainfall-runoff) modeling of river basins confounds our ability to select, develop, and interpret models, particularly in the evaluation of prediction uncertainty associated with climate change assessment. To inform the model selection process, we characterized and compared three structurally distinct approaches and spatial scales of parameterization for modeling catchment hydrology: a large-scale approach (using the VIC model; 671,000 km2 area), a basin-scale approach (using the PRMS model; 29,700 km2 area), and a site-specific approach (using the GSFLOW model; 4700 km2 area), all forced by the same future climate estimates. For each approach, we present measures of fit to historic observations and predictions of future response, as well as estimates of model parameter uncertainty, when available. While the site-specific approach generally had the best fit to historic measurements, the performance of the model approaches varied. The site-specific approach generated the best fit at unregulated sites, the large-scale approach performed best just downstream of flood control projects, and model performance varied at the farthest downstream sites, where streamflow regulation is mitigated to some extent by unregulated tributaries and water diversions. These results illustrate how selection of a modeling approach and interpretation of climate change projections require (a) appropriate parameterization of the models for the climate and hydrologic processes governing runoff generation in the area under study, (b) understanding and justifying the assumptions and limitations of the model, and (c) estimates of uncertainty associated with the modeling approach.
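
The paper does not state which measure of fit it used, but a common choice for comparing simulated and observed streamflow is the Nash-Sutcliffe efficiency (NSE), sketched here with invented flows:

```python
# Nash-Sutcliffe efficiency: 1 is a perfect match; values at or below 0 mean
# the model is no better than simply predicting the mean of the observations.
# The flow series below are invented for illustration.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [10.0, 12.0, 8.0, 15.0, 11.0]   # observed flows, e.g. m3/s
sim = [9.0, 13.0, 8.5, 14.0, 12.0]    # simulated flows
print(round(nse(obs, sim), 3))
```

Scoring each modeling approach against gauged flows with a metric like this is what "measures of fit to historic observations" amounts to in practice.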

  8. Model for assessing alpha doses for a Reference Japanese Man

    International Nuclear Information System (INIS)

    Kawamura, Hisao

    1993-01-01

    In view of the development of the nuclear fuel cycle in this country, it is urgently important to establish dose assessment models and related human and environmental parameters for long-lived radionuclides. In the current program, intake and body content of actinides (Pu, Th, U) and related alpha-emitting nuclides (Ra and daughters) have been studied, as well as physiological aspects of Reference Japanese Man as the basic model of man for dosimetry. The ultimate objective is to examine the applicability of the existing models, particularly those recommended by the ICRP for workers, to members of the public. The result of an interlaboratory intercomparison of 239Pu + 240Pu determination, including our result, was published. Alpha-spectrometric determinations of 226Ra in bone yielded a representative bone concentration level in Tokyo and a Ra-Ca O.R. (bone-diet) which appear consistent with the literature values for Sapporo and Kyoto obtained by Ohno using a Rn emanation method. Specific effective energies for alpha radiation from 226Ra and daughters were calculated using the ICRP dosimetric model for bone, incorporating the masses of source and target organs of Reference Japanese Man. Reference Japanese data, including the adult, adolescent, child and infant of both sexes, were extensively and intensively studied by Tanaka as part of the activities of the ICRP Task Group on Reference Man Revision. Normal data for the physical measurements, mass and dimension of internal organs and body surfaces, and some of the body composition were analysed in view of the nutritional data in the Japanese population. Some of the above work is to be continued. (author)
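
The ICRP bookkeeping such assessments build on is, at its simplest, annual intake multiplied by a nuclide- and pathway-specific dose coefficient, summed over nuclides. A sketch with illustrative intakes (the coefficients are of the order of published ICRP ingestion values for adults, but should be treated as placeholders):

```python
# Committed effective dose from ingestion: intake (Bq) * dose coefficient
# (Sv/Bq), summed over nuclides. Intakes and coefficients are illustrative
# placeholders, not values from this study.
def committed_dose_sv(intakes_bq, dose_coeff_sv_per_bq):
    return sum(intakes_bq[n] * dose_coeff_sv_per_bq[n] for n in intakes_bq)

intakes = {"Ra-226": 20.0, "Pu-239": 0.05}     # assumed annual ingestion, Bq
coeffs = {"Ra-226": 2.8e-7, "Pu-239": 2.5e-7}  # assumed Sv/Bq, adult ingestion

print(committed_dose_sv(intakes, coeffs))      # committed dose in Sv/yr
```

The study's real work lies upstream of this arithmetic: establishing Reference Japanese Man organ masses and specific effective energies so that the coefficients themselves are appropriate for the Japanese population.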

  9. Modelling Global Land Use and Social Implications in the Sustainability Assessment of Biofuels

    DEFF Research Database (Denmark)

    Kløverpris, Jesper; Wenzel, Henrik

    2007-01-01

    Cross-fertilising environmental, economic and geographical modelling to improve the environmental assessment of biofuels.

  10. Status of thermalhydraulic modelling and assessment: Open issues

    International Nuclear Information System (INIS)

    Bestion, D.; Barre, F. [CEA, Grenoble (France)]

    1997-01-01

    This paper presents the status of the physical modelling in present codes used for nuclear reactor thermalhydraulics (TRAC, RELAP5, CATHARE, ATHLET, ...) and attempts to list the unresolved or partially resolved issues. First, the capabilities and limitations of present codes are presented. They are mainly known from a synthesis of the assessment calculations performed for both separate effect tests and integral effect tests. It is also instructive to list all the assumptions and simplifications which were made in establishing the system of equations and the constitutive relations; many of the present limitations are associated with physical situations where these assumptions are not valid. Finally, recommendations are proposed to extend the capabilities of these codes.

  12. Model analysis: Representing and assessing the dynamics of student learning

    Directory of Open Access Journals (Sweden)

    Lei Bao

    2006-02-01

    Full Text Available Decades of education research have shown that students can simultaneously possess alternate knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice tests such as the Force Concept Inventory and the Force and Motion Conceptual Evaluation provide instructors with tools to probe their students' conceptual knowledge of physics. However, many existing quantitative analysis methods focus on the binary question of whether a student answers a question correctly or not. This greatly limits the capacity of the standardized multiple-choice tests for assessing students' alternative knowledge. In addition, the context-dependence issue, which suggests that a student may apply the correct knowledge in some situations and revert to alternative types of knowledge in others, is often treated as random noise in current analyses. In this paper, we present model analysis, which applies qualitative research to establish a quantitative representation framework. With this method, students' alternative knowledge and the probabilities for students to use such knowledge in a range of equivalent contexts can be quantitatively assessed. This provides a way to analyze research-based multiple-choice questions that can generate much richer information than is available from score-based analysis.
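
Model analysis (after Bao and Redish) represents each student by the square roots of the probabilities of using each mental model, and summarizes the class with a density matrix whose diagonal gives average model use and whose off-diagonal terms reflect mixed, context-dependent model use. A sketch with invented response data:

```python
# Class density matrix D = (1/N) * sum over students of u u^T, where
# u = (sqrt(p1), ..., sqrt(pm)) holds the probabilities of using each mental
# model. The student data below are invented for illustration.
import math

def student_state(probs):
    return [math.sqrt(p) for p in probs]

def density_matrix(class_probs):
    n = len(class_probs)          # number of students
    m = len(class_probs[0])       # number of mental models
    D = [[0.0] * m for _ in range(m)]
    for probs in class_probs:
        u = student_state(probs)
        for i in range(m):
            for j in range(m):
                D[i][j] += u[i] * u[j] / n
    return D

# Three students, two models (correct vs. alternative): fraction of equivalent
# questions each student answered with each model
class_probs = [[1.0, 0.0], [0.25, 0.75], [0.5, 0.5]]
D = density_matrix(class_probs)
print(D[0][0], D[1][1])  # diagonal: average use of each model; trace is 1
```

A class of consistent students gives a nearly diagonal matrix, while large off-diagonal terms signal the context-dependent mixing that score-based analysis treats as noise.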

  13. Modeling interactions between land cover and climate in integrated assessment models (Invited)

    Science.gov (United States)

    Calvin, K. V.

    2013-12-01

    Integrated Assessment Models (IAMs) link representations of the regionally disaggregated global economy, energy system, agriculture and land use, terrestrial carbon cycle, oceans and climate in an internally consistent framework. These models are often used as science-based decision-support tools for evaluating the consequences of climate, energy, and other policies, and their use in this framework is likely to increase in the future. Additionally, these models are used to develop future scenarios of emissions and land cover for use in climate models (e.g., the RCPs and CMIP5). Land use is strongly influenced by assumptions about population, income, diet, ecosystem productivity change, and climate policy. Population, income, and diet determine the amount of food production needed in the future. Assumptions about future changes in crop yields due to agronomic developments influence the amount of land needed to produce food crops. Climate policy has implications for land when land-based mitigation options (e.g., afforestation and bioenergy) are considered. IAMs consider each of these factors in their computation of future land use. As each of these factors is uncertain, IAMs use scenario analysis to explore the implications of each. For example, IAMs have been used to explore the effect of different mitigation policies on land cover, quantifying the trade-offs of each policy in terms of land cover, energy prices, food prices, and mitigation costs. Furthermore, IAMs are beginning to explore the effect of climate change on land productivity, and the implications that changes in productivity have for mitigation efforts. In this talk, we describe the implications for future land use and land cover of a variety of socioeconomic, technological, and policy drivers in several IAMs. Additionally, we will discuss the effects of future land cover on climate and the effects of climate on future land cover, as simulated

  14. A model of the pre-assessment learning effects of assessment is operational in an undergraduate clinical context

    Science.gov (United States)

    2012-01-01

    Background No validated model exists to explain the learning effects of assessment, a problem when designing and researching assessment for learning. We recently developed a model explaining the pre-assessment learning effects of summative assessment in a theory teaching context. The challenge now is to validate this model. The purpose of this study was to explore whether the model was operational in a clinical context as a first step in this process. Methods Given the complexity of the model, we adopted a qualitative approach. Data from in-depth interviews with eighteen medical students were subject to content analysis. We utilised a code book developed previously using grounded theory. During analysis, we remained alert to data that might not conform to the coding framework and open to the possibility of deploying inductive coding. Ethical clearance and informed consent were obtained. Results The three components of the model i.e., assessment factors, mechanism factors and learning effects were all evident in the clinical context. Associations between these components could all be explained by the model. Interaction with preceptors was identified as a new subcomponent of assessment factors. The model could explain the interrelationships of the three facets of this subcomponent i.e., regular accountability, personal consequences and emotional valence of the learning environment, with previously described components of the model. Conclusions The model could be utilized to analyse and explain observations in an assessment context different to that from which it was derived. In the clinical setting, the (negative) influence of preceptors on student learning was particularly prominent. In this setting, learning effects resulted not only from the high-stakes nature of summative assessment but also from personal stakes, e.g. for esteem and agency. The results suggest that to influence student learning, consequences should accrue from assessment that are immediate

  15. Modeling marine surface microplastic transport to assess optimal removal locations

    International Nuclear Information System (INIS)

    Sherman, Peter; Van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations, scaled to a large data set of microplastic observations from surface trawls, was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal of assessing the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic, and reducing the impact on ecosystems, using plankton growth as a proxy. The simulations show that the optimal removal locations are primarily located off the coast of China and in the Indonesian Archipelago for both scenarios. Our estimates show that 31% of the modeled microplastic mass can be removed by 2025 using 29 plastic collectors operating at a 45% capture efficiency at these locations, compared to only 17% when the 29 plastic collectors are moored in the North Pacific garbage patch, between Hawaii and California. The overlap of ocean surface microplastics and phytoplankton growth can be reduced by 46% at our proposed locations, while sinks in the North Pacific can reduce the overlap by only 14%. These results are an indication that oceanic plastic removal might be more effective, both in removing a greater microplastic mass and in reducing potential harm to marine life, when performed closer to shore rather than inside the plastic accumulation zones in the centers of the gyres. (letter)

  16. Mentalized affectivity: A new model and assessment of emotion regulation.

    Directory of Open Access Journals (Sweden)

    David M Greenberg

    Full Text Available Here we introduce a new assessment of emotion regulation called the Mentalized Affectivity Scale (MAS). A large online adult sample (N = 2,840) completed the 60-item MAS along with a battery of psychological measures. Results revealed a robust three-component structure underlying mentalized affectivity, which we labeled: Identifying emotions (the ability to identify emotions and to reflect on the factors that influence them); Processing emotions (the ability to modulate and distinguish complex emotions); and Expressing emotions (the tendency to express emotions outwardly or inwardly). Hierarchical modeling suggested that Processing emotions delineates from Identifying them, and Expressing emotions delineates from Processing them. We then showed how these components are associated with personality traits, well-being, trauma, and 18 different psychological disorders (including mood, neurological, and personality disorders). Notably, those with anxiety, mood, and personality disorders showed a profile of high Identifying and low Processing compared to controls. Further, results showed how mentalized affectivity scores varied across psychological treatment modalities and years spent in therapy. Taken together, the model of mentalized affectivity advances prior theory and research on emotion regulation and the MAS is a useful and reliable instrument that can be used in both clinical and non-clinical settings in psychology, psychiatry, and neuroscience.

  17. A new model of Ishikawa diagram for quality assessment

    Science.gov (United States)

    Liliana, Luca

    2016-11-01

    The paper presents the results of a study concerning the use of the Ishikawa diagram in analyzing the causes that determine errors in the evaluation of parts precision in the machine construction field. The studied problem was "errors in the evaluation of parts precision", and this constitutes the head of the Ishikawa diagram skeleton. All the possible main and secondary causes that could generate the studied problem were identified. The best-known Ishikawa models are 4M, 5M and 6M, the initials standing, in order, for: materials, methods, man, machines, mother nature, measurement. The paper shows the potential causes of the studied problem, which were first grouped in three categories, as follows: causes that lead to errors in assessing dimensional accuracy, causes that determine errors in the evaluation of shape and position deviations, and causes of errors in roughness evaluation. We took into account the main components of parts precision in the machine construction field. For each of the three categories of causes, potential secondary causes were distributed into groups of M (man, methods, machines, materials, environment/medio ambiente-sp.). We opted for a new model of Ishikawa diagram, resulting from the composition of three fish skeletons corresponding to the main categories of parts accuracy.

  18. Mentalized affectivity: A new model and assessment of emotion regulation.

    Science.gov (United States)

    Greenberg, David M; Kolasi, Jonela; Hegsted, Camilla P; Berkowitz, Yoni; Jurist, Elliot L

    2017-01-01

    Here we introduce a new assessment of emotion regulation called the Mentalized Affectivity Scale (MAS). A large online adult sample (N = 2,840) completed the 60-item MAS along with a battery of psychological measures. Results revealed a robust three-component structure underlying mentalized affectivity, which we labeled: Identifying emotions (the ability to identify emotions and to reflect on the factors that influence them); Processing emotions (the ability to modulate and distinguish complex emotions); and Expressing emotions (the tendency to express emotions outwardly or inwardly). Hierarchical modeling suggested that Processing emotions delineates from Identifying them, and Expressing emotions delineates from Processing them. We then showed how these components are associated with personality traits, well-being, trauma, and 18 different psychological disorders (including mood, neurological, and personality disorders). Notably, those with anxiety, mood, and personality disorders showed a profile of high Identifying and low Processing compared to controls. Further, results showed how mentalized affectivity scores varied across psychological treatment modalities and years spent in therapy. Taken together, the model of mentalized affectivity advances prior theory and research on emotion regulation and the MAS is a useful and reliable instrument that can be used in both clinical and non-clinical settings in psychology, psychiatry, and neuroscience.

  19. Life assessment of combustor liner using unified constitutive models

    Science.gov (United States)

    Tong, M. T.; Thompson, R. L.

    1988-01-01

    Hot section components of gas turbine engines are subject to severe thermomechanical loads during each mission cycle. Inelastic deformation can be induced in localized regions leading to eventual fatigue cracking. Assessment of durability requires reasonably accurate calculation of the structural response at the critical location for crack initiation. In recent years nonlinear finite element computer codes have become available for calculating inelastic structural response under cyclic loading. NASA-Lewis sponsored the development of unified constitutive material models and their implementation in nonlinear finite element computer codes for the structural analysis of hot section components. These unified models were evaluated with regard to their effect on the life prediction of a hot section component. The component considered was a gas turbine engine combustor liner. A typical engine mission cycle was used for the thermal and structural analyses. The analyses were performed on a CRAY computer using the MARC finite element code. The results were compared with laboratory test results, in terms of crack initiation lives.

  20. Revenue Risk Modelling and Assessment on BOT Highway Project

    Science.gov (United States)

    Novianti, T.; Setyawan, H. Y.

    2018-01-01

    An infrastructure project delivered through a public-private partnership under a BOT (Build-Operate-Transfer) arrangement, such as a highway, is risky. Assessment of risk factors is therefore essential, as the project has a concession period and is influenced by macroeconomic factors throughout it. In this study, pre-construction risks of a highway were examined using a Delphi method to create a space for offline expert discussions, a fault tree analysis to map the intuition of experts and to build a model of the underlying risk events, and fuzzy logic to interpret the linguistic data of the risk models. The losses of revenue due to tariff risk, traffic-volume risk, force majeure, and non-revenue events were then measured. The results showed that the loss of revenue caused by tariff risk was 10.5% of the normal total revenue; by traffic-volume risk, 21.0% of total revenue; by force majeure, 12.2% of the normal income; and by non-revenue events, 6.9% of the normal revenue. Traffic volume was found to be the major risk of a highway project because it relates to customer preferences.

  1. Kamchia watershed groundwater recharge assessment by the CLM3 model

    Directory of Open Access Journals (Sweden)

    Nitcheva Olga

    2018-01-01

    Full Text Available Estimating groundwater recharge is an important part of water resources evaluation. In spite of the numerous existing methods, it remains a difficult quantity to estimate. This is due to its dependence on many meteorological, hydrogeological, soil-type and cover conditions, and to the impossibility of direct measurement. Employing hydrological models directly accounts for the influence of the natural factors cited above. The Community Land Model (CLM3), being loaded with all land-featuring data on a global scale, including an adequate simulation of the soil filtration process by the Richards equation, together with the possibility of inputting the NCEP/NCAR Reanalysis database for the meteorological forcing, makes it possible to avoid to a great extent the difficulties in groundwater (GW) recharge estimation. The paper presents the results of an experiment on monthly GW recharge estimation during 2013, worked out for the Kamchia river watershed in Bulgaria. The computed monthly and annual values are presented on GIS maps and are compared with existing assessments made by other methods. The results demonstrate the soundness and applicability of the approach.
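For reference, the Richards equation that the abstract cites for soil water movement is commonly written in its one-dimensional mixed form (this is the standard textbook form, not necessarily CLM3's exact discretization):

```latex
\frac{\partial \theta}{\partial t} =
  \frac{\partial}{\partial z}\!\left[ K(\theta)
  \left( \frac{\partial \psi}{\partial z} + 1 \right) \right] - S
```

where \(\theta\) is volumetric water content, \(K(\theta)\) the unsaturated hydraulic conductivity, \(\psi\) the matric potential, \(z\) depth (positive upward), and \(S\) a sink term such as root uptake. The downward flux leaving the soil column's lower boundary is what a land-surface model typically reports as recharge.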

  2. A Model for Assessing the Gender Aspect in Economic Policy

    Directory of Open Access Journals (Sweden)

    Ona Rakauskienė

    2015-06-01

    Full Text Available The purpose of the research is to develop a conceptual model for assessing the impact of the gender aspect on economic policy at the macro- and microeconomic levels. The research methodology is based on analysing scientific approaches to the gender aspect in economics and gender-responsive budgeting, as well as determining the impact of the gender aspect on GDP, foreign trade, the state budget and the labour market. The major findings are, first, the main idea of a conceptual model proposing that a socio-economic picture of society can be accepted as complete only when it includes, alongside the public and private sectors, the care/reproductive sector, which is dominated by women and creates added value in the form of educated human resources; second, that macroeconomics is not neutral in terms of gender equality. Gender asymmetry is manifested not only at the level of microeconomics (the labour market and business) but also at the level of macroeconomics (GDP, the state budget and foreign trade), which has a negative impact on economic growth and state budget revenues. In this regard, economic decisions must be made according to the principles of gender equality, and in order to achieve gender equality in economics the gender aspect has to be implemented at the macroeconomic level as well.

  3. Modeling marine surface microplastic transport to assess optimal removal locations

    Science.gov (United States)

    Sherman, Peter; van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal to assess the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the impact on ecosystems, using plankton growth as a proxy. The simulations show that the optimal removal locations are primarily located off the coast of China and in the Indonesian Archipelago for both scenarios. Our estimates show that 31% of the modeled microplastic mass can be removed by 2025 using 29 plastic collectors operating at a 45% capture efficiency from these locations, compared to only 17% when the 29 plastic collectors are moored in the North Pacific garbage patch, between Hawaii and California. The overlap of ocean surface microplastics and phytoplankton growth can be reduced by 46% at our proposed locations, while sinks in the North Pacific can only reduce the overlap by 14%. These results are an indication that oceanic plastic removal might be more effective in removing a greater microplastic mass and in reducing potential harm to marine life when closer to shore than inside the plastic accumulation zones in the centers of the gyres.

  4. The modelling and assessment of whale-watching impacts

    Science.gov (United States)

    New, Leslie; Hall, Ailsa J.; Harcourt, Robert; Kaufman, Greg; Parsons, E.C.M.; Pearson, Heidi C.; Cosentino, A. Mel; Schick, Robert S

    2015-01-01

    In recent years there has been significant interest in modelling cumulative effects and the population consequences of individual changes in cetacean behaviour and physiology due to disturbance. One potential source of disturbance that has garnered particular interest is whale-watching. Though perceived as ‘green’ or eco-friendly tourism, there is evidence that whale-watching can result in statistically significant and biologically meaningful changes in cetacean behaviour, raising the question whether whale-watching is in fact a long term sustainable activity. However, an assessment of the impacts of whale-watching on cetaceans requires an understanding of the potential behavioural and physiological effects, data to effectively address the question and suitable modelling techniques. Here, we review the current state of knowledge on the viability of long-term whale-watching, as well as logistical limitations and potential opportunities. We conclude that an integrated, coordinated approach will be needed to further understanding of the possible effects of whale-watching on cetaceans.

  5. Mixture modeling methods for the assessment of normal and abnormal personality, part II: longitudinal models.

    Science.gov (United States)

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Studying personality and its pathology as it changes, develops, or remains stable over time offers exciting insight into the nature of individual differences. Researchers interested in examining personal characteristics over time have a number of time-honored analytic approaches at their disposal. In recent years there have also been considerable advances in person-oriented analytic approaches, particularly longitudinal mixture models. In this methodological primer we focus on mixture modeling approaches to the study of normative and individual change in the form of growth mixture models and ipsative change in the form of latent transition analysis. We describe the conceptual underpinnings of each of these models, outline approaches for their implementation, and provide accessible examples for researchers studying personality and its assessment.

  6. Hydrodynamic and Ecological Assessment of Nearshore Restoration: A Modeling Study

    International Nuclear Information System (INIS)

    Yang, Zhaoqing; Sobocinski, Kathryn L.; Heatwole, Danelle W.; Khangaonkar, Tarang; Thom, Ronald M.; Fuller, Roger

    2010-01-01

    Along the Pacific Northwest coast, much of the estuarine habitat has been diked over the last century for agricultural land use, residential and commercial development, and transportation corridors. As a result, many of the ecological processes and functions have been disrupted. To protect coastal habitats that are vital to aquatic species, many restoration projects are currently underway to restore the estuarine and coastal ecosystems through dike breaches, setbacks, and removals. Information on physical processes and hydrodynamic conditions is critical for the assessment of the success of restoration actions. Restoration of a 160-acre property at the mouth of the Stillaguamish River in Puget Sound has been proposed. The goal is to restore native tidal habitats and estuary-scale ecological processes by removing the dike. In this study, a three-dimensional hydrodynamic model was developed for the Stillaguamish River estuary to simulate estuarine processes. The model was calibrated to observed tide, current, and salinity data for existing conditions and applied to simulate the hydrodynamic responses to two restoration alternatives. Responses were evaluated at the scale of the restoration footprint. Model data were combined with biophysical data to predict habitat responses at the site. Results showed that the proposed dike removal would result in desired tidal flushing and conditions that would support four habitat types on the restoration footprint. At the estuary scale, restoration would substantially increase the proportion of area flushed with freshwater (< 5 ppt) at flood tide. Potential implications of predicted changes in salinity and flow dynamics are discussed relative to the distribution of tidal marsh habitat.

  7. Water quality modelling for ephemeral rivers: Model development and parameter assessment

    Science.gov (United States)

    Mannina, Giorgio; Viviani, Gaspare

    2010-11-01

    River water quality models can be valuable tools for the assessment and management of receiving water body quality. However, such water quality models require accurate model calibration in order to specify model parameters. Reliable model calibration requires an extensive array of water quality data that are generally rare and resource-intensive, both economically and in terms of human resources, to collect. In the case of small rivers, such data are scarce due to the fact that these rivers are generally considered too insignificant, from a practical and economic viewpoint, to justify the investment of such considerable time and resources. As a consequence, the literature contains very few studies on the water quality modelling for small rivers, and such studies as have been published are fairly limited in scope. In this paper, a simplified river water quality model is presented. The model is an extension of the Streeter-Phelps model and takes into account the physico-chemical and biological processes most relevant to modelling the quality of receiving water bodies (i.e., degradation of dissolved carbonaceous substances, ammonium oxidation, algal uptake and denitrification, dissolved oxygen balance, including depletion by degradation processes and supply by physical reaeration and photosynthetic production). The model has been applied to an Italian case study, the Oreto river (IT), which has been the object of an Italian research project aimed at assessing the river's water quality. For this reason, several monitoring campaigns have been previously carried out in order to collect water quantity and quality data on this river system. In particular, twelve river cross sections were monitored, and both flow and water quality data were collected for each cross section. The results of the calibrated model show satisfactory agreement with the measured data and results reveal important differences between the parameters used to model small rivers as compared to

  8. An Assessment of Mean Areal Precipitation Methods on Simulated Stream Flow: A SWAT Model Performance Assessment

    Directory of Open Access Journals (Sweden)

    Sean Zeiger

    2017-06-01

    Full Text Available Accurate mean areal precipitation (MAP) estimates are essential input forcings for hydrologic models. However, the selection of the most accurate method to estimate MAP can be daunting because there are numerous methods to choose from (e.g., proximate gauge, direct weighted average, surface-fitting, and remotely sensed methods). Multiple methods (n = 19) were used to estimate MAP with precipitation data from 11 distributed monitoring sites and 4 remotely sensed data sets. Each method was validated against the hydrologic model simulated stream flow using the Soil and Water Assessment Tool (SWAT). SWAT was validated using a split-site method and the observed stream flow data from five nested-scale gauging sites in a mixed-land-use watershed of the central USA. Cross-validation results showed the error associated with surface-fitting and remotely sensed methods ranging from −4.5 to −5.1%, and −9.8 to −14.7%, respectively. Split-site validation results showed percent bias (PBIAS) values that ranged from −4.5 to −160%. Second-order polynomial functions especially overestimated precipitation and subsequent stream flow simulations (PBIAS = −160%) in the headwaters. The results indicated that using an inverse-distance weighted, linear polynomial interpolation or multiquadric function method to estimate MAP may improve SWAT model simulations. Collectively, the results highlight the importance of spatially distributed observed hydroclimate data for precipitation and subsequent stream flow estimations. The MAP methods demonstrated in the current work can be used to reduce hydrologic model uncertainty caused by watershed physiographic differences.
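One of the better-performing methods named above, inverse-distance weighting, is straightforward to implement; a minimal sketch (the gauge layout is hypothetical, not the study's 11-site network):

```python
def idw_precip(gauges, point, power=2.0):
    """Inverse-distance-weighted precipitation estimate at `point`.

    gauges: iterable of ((x, y), precip) pairs in consistent units;
    power: distance exponent (2 is the common default).
    """
    num = den = 0.0
    for (gx, gy), p in gauges:
        d2 = (gx - point[0]) ** 2 + (gy - point[1]) ** 2
        if d2 == 0.0:
            return p  # estimation point coincides with a gauge
        w = d2 ** (-power / 2.0)  # w = 1 / distance**power
        num += w * p
        den += w
    return num / den

# Two hypothetical gauges reporting 10 mm and 20 mm
gauges = [((0.0, 0.0), 10.0), ((2.0, 0.0), 20.0)]
```

A watershed-scale MAP value is then the (area-weighted) average of such point estimates over the basin grid.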

  9. Towards an Integrated Model for Developing Sustainable Assessment Skills

    Science.gov (United States)

    Fastre, Greet M. J.; van der Klink, Marcel R.; Sluijsmans, Dominique; van Merrienboer, Jeroen J. G.

    2013-01-01

    One of the goals of current education is to ensure that graduates can act as independent lifelong learners. Graduates need to be able to assess their own learning and interpret assessment results. The central question in this article is how to acquire sustainable assessment skills, enabling students to assess their performance and learning…

  10. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties for each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  11. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
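For readers unfamiliar with the index for moderated mediation: in the standard linear specification M = a0 + a1·X + a2·W + a3·X·W and Y = b0 + c′·X + b·M, the indirect effect at moderator level W is (a1 + a3·W)·b, and the index is a3·b, the change in the indirect effect per unit of the moderator. A minimal sketch with hypothetical coefficients:

```python
def indirect_effect(a1, a3, b, w):
    """Indirect effect of X on Y through M at moderator value w,
    under M = a0 + a1*X + a2*W + a3*X*W and Y = b0 + c*X + b*M."""
    return (a1 + a3 * w) * b

def moderated_mediation_index(a3, b):
    """Change in the indirect effect per one-unit increase in W."""
    return a3 * b

# Hypothetical coefficients, e.g. from fitted regressions
a1, a3, b = 0.5, 0.2, 0.4
```

The paper's point is that a3·b can be unbiased under weaker confounding conditions than the indirect effect (a1 + a3·W)·b itself.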

  12. Asian water futures - Multi scenarios, models and criteria assessment -

    Science.gov (United States)

    Satoh, Yusuke; Burek, Peter; Wada, Yoshihide; Flrörke, Martina; Eisner, Stephanie; Hanasaki, Naota; Kahil, Taher; Tramberend, Sylvia; Fischer, Günther; Wiberg, David

    2016-04-01

    A better understanding of the current and future availability of water resources is essential for the implementation of the recently agreed Sustainable Development Goals (SDGs). Long-term, efficient strategies for coping with current and potential future water-related challenges are urgently required. Although Representative Concentration Pathways (RCPs) and Shared Socioeconomic Pathways (SSPs) were developed for the impact assessment of climate change, very few assessments have yet used the SSPs to assess water resources. The IIASA Water Futures and Solutions Initiative (WFaS) therefore developed a set of water use scenarios consistent with the RCPs and SSPs, applying the latest climate change scenarios. Here this study focuses on results for Asian countries for the period 2010-2050. We present three conceivable future pathways of Asian water resources, determined by feasible combinations of two RCPs and three SSPs. Such a scenario approach provides valuable insights for identifying appropriate strategies by highlighting gaps between a "scenario world" and reality. In addition, a multi-criteria analysis is applied for the assessment of future water resources: a classification system for countries and watersheds that consists of two broad dimensions, (i) economic and institutional adaptive capacity and (ii) hydrological complexity. The latter is composed of several sub-indexes, including total renewable water resources per capita, the ratio of water demand to renewable water resources, variability of runoff, and the dependency ratio on external water resources. Furthermore, the analysis uses a multi-model approach to estimate runoff and discharge using 5 GCMs and 5 global hydrological models (GHMs). Three of these GHMs calculate water use based on a consistent set of scenarios in addition to water availability. As a result, we have projected hot spots of water scarcity in Asia and their spatial and temporal change. For example, in a scenario based on SSP2 and RCP6.0, by 2050, in total 2.1 billion people
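One of the hydrological-complexity sub-indexes mentioned, the ratio of water demand to renewable resources, is often banded into stress classes; a sketch using the widely cited 20%/40% criticality thresholds (the thresholds and values are illustrative conventions, not WFaS's exact classification):

```python
def stress_class(withdrawal, renewable):
    """Classify water stress from the withdrawal-to-availability ratio
    (both in the same units, e.g. km^3/yr for a basin or country)."""
    ratio = withdrawal / renewable
    if ratio < 0.2:
        return "low"
    elif ratio < 0.4:
        return "moderate"
    else:
        return "severe"
```

In a multi-model setting such a classification would be computed per GCM/GHM combination, and the spread across the 25 pairings gives a measure of scenario uncertainty.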

  13. A generic hydroeconomic model to assess future water scarcity

    Science.gov (United States)

    Neverre, Noémie; Dumas, Patrice

    2015-04-01

    We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of the CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined from a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine the localization of irrigated crops. Then, crop irrigation requirements are computed for the different stages of the growing season using the Allen (1998) method with Hargreaves potential evapotranspiration. The economic value of irrigation water is based on a yield-comparison approach between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Blondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists in building three-block inverse demand functions in which the volume limits of the blocks evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water cost data. Projected demands are then confronted with future water availability. Operating rules of the reservoirs and water allocation between demands are based on
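The point-expansion step described above, building a demand function around one observed price-quantity point using a price elasticity from the literature, has a simple constant-elasticity form; a sketch (the numbers are hypothetical, not the study's calibration):

```python
def demand(price, p0, q0, elasticity):
    """Constant-elasticity demand expanded around the observed point
    (p0, q0): Q(P) = q0 * (P / p0)**elasticity, with elasticity < 0."""
    return q0 * (price / p0) ** elasticity

# Hypothetical observation: 100 units demanded at unit price 1.0,
# with a price elasticity of -0.5
p0, q0, e = 1.0, 100.0, -0.5
```

Inverting this curve and slicing it into volume blocks (whose limits shift with GDP per capita, as the abstract describes) yields the block-wise inverse demand functions used in the allocation step.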

  14. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    International Nuclear Information System (INIS)

    G.J. Saulnier Jr; W. Statham

    2006-01-01

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site: (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: semiarid to arid; (4) Analogous geochemistry: oxidizing conditions; and (5) Analogous hydrogeology: the ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration. Three boreholes were drilled at the Nopal I mine site in 2003 and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory analysis.

  15. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G.J. Saulnier Jr; W. Statham

    2006-03-10

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site: (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: semiarid to arid; (4) Analogous geochemistry: oxidizing conditions; and (5) Analogous hydrogeology: the ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration. Three boreholes were drilled at the Nopal I mine site in 2003 and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory

  16. Two agricultural production data libraries for risk assessment models

    International Nuclear Information System (INIS)

    Baes, C.F. III; Shor, R.W.; Sharp, R.D.; Sjoreen, A.L.

    1985-01-01

    Two data libraries based on the 1974 US Census of Agriculture are described. The data packages (AGDATC and AGDATG) are available from the Radiation Shielding Information Center (RSIC), Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831. Agricultural production and land-use information by county (AGDATC) or by 1/2 by 1/2 degree longitude-latitude grid cell (AGDATG) provides geographical resolution of the data. The libraries were designed for use in risk assessment models that simulate the transport of radionuclides from sources of airborne release through food chains to man. However, they are also suitable for use in the assessment of other airborne pollutants that can affect man through the food ingestion pathway, such as effluents from synfuels or coal-fired power plants. The principal significance of the data libraries is that they provide default location-specific food-chain transport parameters when site-specific information is unavailable. Plant food categories in the data libraries include leafy vegetables, vegetables and fruits exposed to direct deposition of airborne pollutants, vegetables and fruits protected from direct deposition, and grains. Livestock feeds are also tabulated in four categories: pasture, grain, hay, and silage. Pasture was estimated by a material balance of cattle and sheep inventories, forage feed requirements, and reported harvested forage. Cattle (Bos spp.), sheep (Ovis aries), goat (Capra hircus), hog (Sus scrofa), chicken (Gallus domesticus), and turkey (Meleagris gallopavo) inventories or sales are also tabulated in the data libraries and can be used to provide estimates of meat, egg, and milk production. Honey production is also given. Population, irrigation, and meteorological information are also listed.
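The libraries' role as default providers can be illustrated with a minimal sketch: a site-specific value is preferred, and the location-keyed library entry is used only as a fallback. The county codes, parameter names, and production values below are hypothetical illustrations, not actual AGDATC entries.

```python
# Hypothetical stand-in for the county-level AGDATC library:
# county FIPS code -> agricultural production parameters (illustrative units).
AGDATC = {
    "47001": {"leafy_veg_kg_ha": 120.0, "pasture_kg_ha": 2400.0},
    "47009": {"leafy_veg_kg_ha": 95.0,  "pasture_kg_ha": 1900.0},
}

def food_chain_parameter(county, name, site_specific=None):
    """Prefer a site-specific value; otherwise fall back to the library default."""
    if site_specific is not None:
        return site_specific
    return AGDATC[county][name]

print(food_chain_parameter("47001", "pasture_kg_ha"))          # library default
print(food_chain_parameter("47001", "pasture_kg_ha", 2650.0))  # site-specific value wins
```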

  17. Pesticide fate modeling in soils with the crop model STICS: Feasibility for assessment of agricultural practices.

    Science.gov (United States)

    Queyrel, Wilfried; Habets, Florence; Blanchoud, Hélène; Ripoche, Dominique; Launay, Marie

    2016-01-15

    Numerous pesticide fate models are available, but few of them are able to take into account specific agricultural practices, such as catch crops, mixed crops or tillage, in their predictions. In order to better integrate crop management and crop growth into the simulation of diffuse agricultural pollution, and to manage both pesticide and nitrogen pollution, a pesticide fate module was implemented in the crop model STICS. The objectives of the study were: (i) to implement a pesticide fate module in the crop model STICS; (ii) to evaluate the model performance using experimental data from three sites with different pedoclimatic contexts, one in The Netherlands and two in northern France; (iii) to compare the simulations with several pesticide fate models; and (iv) to test the impact of specific agricultural practices on the transfer of the dissolved fraction of pesticides. The evaluations were carried out with three herbicides: bentazone, isoproturon, and atrazine. The strategy applied in this study relies on a non-calibration approach and sensitivity tests to assess the operating limits of the model. To this end, the evaluation was performed with default values found in the literature and completed by sensitivity tests. The extended version of STICS, named STICS-Pest, shows results similar to those of other pesticide fate models widely used in the literature. Moreover, STICS-Pest was able to estimate realistic crop growth and catch-crop dynamics, illustrating agricultural practices that lead to a reduction of nitrate and a change in pesticide leaching. The dynamic plot-scale model STICS-Pest is able to simulate nitrogen and pesticide fluxes when the hydrologic context is within the validity range of the reservoir (or capacity) model. According to these initial results, the model may be a relevant tool for studying the effect of long-term agricultural practices on pesticide residue dynamics in soil and the associated diffuse pollution transfer. Copyright © 2015 Elsevier
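The kind of process such a pesticide fate module represents can be sketched with two standard building blocks: first-order degradation in the soil layer, and a dissolved fraction set by a linear sorption isotherm. The half-life and Kd values below are hypothetical illustrations, not STICS-Pest parameters.

```python
import math

def dissolved_fraction(kd, theta=0.30, bulk_density=1.4):
    """Fraction of pesticide mass in solution for a linear Kd [L/kg] isotherm,
    given volumetric water content theta and soil bulk density [kg/L]."""
    return theta / (theta + kd * bulk_density)

def residue_after(days, dt50):
    """Remaining mass fraction after first-order decay with half-life dt50 [days]."""
    return math.exp(-math.log(2.0) / dt50 * days)

# Bentazone-like scenario (hypothetical values): fast decay, weak sorption.
print(round(residue_after(30, dt50=13.0), 3))   # remaining fraction after 30 days
print(round(dissolved_fraction(kd=0.1), 3))     # fraction available for leaching
```

The dissolved fraction is what the abstract's agricultural-practice scenarios act on: practices that slow water flux or speed degradation reduce the leachable pool.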

  18. Concepts, methods and models to assess environmental impact

    International Nuclear Information System (INIS)

    Pentreath, R.J.

    2002-01-01

    individual sites, is also planned in Canada. A somewhat conceptually different approach is that of an attempt to develop a hierarchical system for environmental protection based on a narrowly defined set of Reference Fauna and Flora analogous to that of Reference Man - consisting of defined dose models, data sets to estimate exposures, and data on biological effects, to provide a set of 'derived consideration levels' of dose-effect relationships for individual fauna and flora that could be used to help decision making (along with other relevant biological information) in different circumstances. Research work is also underway to produce systematic frameworks - also using a 'reference fauna and flora approach' - for assessing environmental impact in specific geographic areas, such as European and Arctic ecosystems. (author)

  19. Implications of model uncertainty for the practice of risk assessment

    International Nuclear Information System (INIS)

    Laskey, K.B.

    1994-01-01

    A model is a representation of a system that can be used to answer questions about the system's behavior. The term model uncertainty refers to problems in which there is no generally agreed upon, validated model that can be used as a surrogate for the system itself. Model uncertainty affects both the methodology appropriate for building models and how models should be used. This paper discusses representations of model uncertainty, methodologies for exercising and interpreting models in the presence of model uncertainty, and the appropriate use of fallible models for policy making.

  20. A Multimedia Hydrological Fate Modeling Framework To Assess Water Consumption Impacts in Life Cycle Assessment.

    Science.gov (United States)

    Núñez, Montserrat; Rosenbaum, Ralph K; Karimpour, Shooka; Boulay, Anne-Marie; Lathuillière, Michael J; Margni, Manuele; Scherer, Laura; Verones, Francesca; Pfister, Stephan

    2018-03-30

    Many new methods have recently been developed to address environmental consequences of water consumption in life cycle assessment (LCA). However, such methods can only partially be compared and combined, because their modeling structures and metrics are inconsistent. Moreover, they focus on specific water sources (e.g., river) and omit the transport flows between water compartments (e.g., from river to atmosphere via evaporation) and regions (e.g., atmospheric advection). Consequently, they provide only a partial view of the local and global hydrological cycle and the derived impacts on the environment. This paper proposes consensus-based guidelines for a harmonized development of the next generation of water consumption LCA indicators, with a focus on the consequences of water consumption for ecosystem quality. To include consideration of the multimedia water fate between compartments of the water cycle, we provide spatial regionalization and temporal specification guidance. The principles and recommendations of the paper are applied to an illustrative case study. The guidelines set the basis for a more accurate, novel way of modeling water consumption impacts in LCA. The environmental relevance of this LCA impact category will improve, yet much research is needed to make the guidelines operational.

  1. TSUNAMI RISK ASSESSMENT MODELLING IN CHABAHAR PORT, IRAN

    Directory of Open Access Journals (Sweden)

    M. R. Delavar

    2017-09-01

    The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 on the Makran Coast in the north of the Oman Sea. This destructive tsunami killed over 4,000 people in Southern Pakistan and India, and caused great loss of life and devastation along the coasts of Western India, Iran and Oman. According to the report "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation at Chabahar port was 367 m inland, with a run-up height of 3.6 m above sea level. In addition, the maximum inundation at Pasni (Pakistan) reached 3 km from the coastline. For the two beaches of Gujarat (India) and Oman, the maximum run-up height was 3 m above sea level. In this paper, we first use the Makran 1945 seismic parameters to simulate the tsunami in its generation, propagation and inundation phases. The effect of the tsunami on Chabahar port is simulated using the ComMIT model, which is based on the Method of Splitting Tsunami (MOST). In this process, the results are compared with documented eyewitness accounts and some reports from researchers for calibration and validation. Next, we used the model to perform a risk assessment for Chabahar port in the south of Iran under the worst-case tsunami scenario. The simulated results showed that the tsunami waves would reach the Chabahar coastline 11 minutes after generation and that, 9 minutes later, over 9.4 km2 of dry land would be flooded, with maximum wave amplitude reaching up to 30 meters.
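The minutes-scale arrival time is consistent with the long-wave celerity c = sqrt(g*h) that underlies shallow-water tsunami solvers such as MOST. A back-of-the-envelope sketch, using a hypothetical source distance and average depth rather than the study's actual bathymetry:

```python
import math

def travel_time_minutes(distance_km, depth_m, g=9.81):
    """Tsunami travel time over water of roughly constant depth, using the
    long-wave (shallow-water) celerity c = sqrt(g*h)."""
    c = math.sqrt(g * depth_m)             # wave celerity [m/s]
    return distance_km * 1000.0 / c / 60.0

# e.g. a source ~100 km offshore over ~1500 m average depth (hypothetical)
print(round(travel_time_minutes(100.0, 1500.0), 1))  # arrival in ~tens of minutes
```

In a real simulation the celerity varies with the bathymetry along the path, which is why numerical propagation models are used instead of this constant-depth estimate.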

  2. Tsunami Risk Assessment Modelling in Chabahar Port, Iran

    Science.gov (United States)

    Delavar, M. R.; Mohammadi, H.; Sharifi, M. A.; Pirooz, M. D.

    2017-09-01

    The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 on the Makran Coast in the north of the Oman Sea. This destructive tsunami killed over 4,000 people in Southern Pakistan and India, and caused great loss of life and devastation along the coasts of Western India, Iran and Oman. According to the report "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation at Chabahar port was 367 m inland, with a run-up height of 3.6 m above sea level. In addition, the maximum inundation at Pasni (Pakistan) reached 3 km from the coastline. For the two beaches of Gujarat (India) and Oman, the maximum run-up height was 3 m above sea level. In this paper, we first use the Makran 1945 seismic parameters to simulate the tsunami in its generation, propagation and inundation phases. The effect of the tsunami on Chabahar port is simulated using the ComMIT model, which is based on the Method of Splitting Tsunami (MOST). In this process, the results are compared with documented eyewitness accounts and some reports from researchers for calibration and validation. Next, we used the model to perform a risk assessment for Chabahar port in the south of Iran under the worst-case tsunami scenario. The simulated results showed that the tsunami waves would reach the Chabahar coastline 11 minutes after generation and that, 9 minutes later, over 9.4 km2 of dry land would be flooded, with maximum wave amplitude reaching up to 30 meters.

  3. Assessment and Review of GIA Models for Antarctica and Greenland

    Science.gov (United States)

    Ivins, E. R.

    2011-12-01

    One of the major obstacles to reducing the uncertainty of GRACE-based ice mass balance estimates for the ice sheets during 2002-2011 is our poor control on the ongoing glacial isostatic adjustment (GIA) of bedrock. These adjustments cause vertical motions of rock at the crustal surface and at great depths within the mantle, and they are sources of positive mass trend when measured in space gravimetry data. The poorly understood signal in Antarctica may be large enough to manifest uncertainties that approach 190 Gt/yr (Velicogna and Wahr, 2006), dominating the background error in trend for Antarctica, and potentially corrupting solutions for mass balance for ice drainage basins in the north and central parts of Greenland. This source of error is independent of method, and corrupts the interpretable trend in both spherical harmonic field and mascon releases. Two of the most recent GRACE mass balance assessments for GIA in Antarctica employ combinations of the ICE5G and IJ05 models, both of which are now more than half a decade old. In part because of the urgency to report to the IPCC on the mass balance of Antarctica with greater certainty, GIA modeling has been the focus of intense research in the past 6-7 years. Significant progress lies in three areas of research: i.) Constraints on paleo-ice sheet reconstruction coming from dated sedimentary coring ('bathtub rings') and moraine and nunatak rock nuclide exposures ('dip sticks of the past'). These data are now rich enough that, for some areas of Antarctica, we now know more about ice mass evolution since the Last Glacial Maximum (21 thousand years ago) than we do for the great Laurentide ice sheet of North America; ii.) Integration of simple ice dynamics models that are specifically constrained by these data (Whitehouse et al., 2011, ISAES XI Edinburgh); iii.) A more robust GPS data set for vertical motion trends of longer legacy (almost two decades in some cases) approaching 50 individual station records on

  4. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; DeStefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation statuses, that predict habitat suitability from i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
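A habitat suitability equation of the kind described is typically a weighted combination of habitat variables scaled to 0-1. The sketch below is a hedged illustration only: the variables, weights, and land-use comparison are hypothetical, not the authors' fitted equations or actual i-Tree fields.

```python
def suitability(canopy_cover, native_frac, large_tree_frac,
                weights=(0.5, 0.3, 0.2)):
    """Return a 0-1 habitat suitability index for one land-use unit as a
    weighted sum of normalized habitat variables, clipped to [0, 1]."""
    variables = (canopy_cover, native_frac, large_tree_frac)
    score = sum(w * v for w, v in zip(weights, variables))
    return max(0.0, min(1.0, score))

# Compare two hypothetical land-use types for a canopy-nesting species.
print(suitability(0.60, 0.70, 0.40))  # parkland-like
print(suitability(0.15, 0.30, 0.10))  # commercial-like
```

Applying such an equation per land-use polygon and averaging over a city gives the kind of city-level and land-use-level suitability comparisons the abstract describes.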

  5. Risk assessment to an integrated planning model for UST programs

    International Nuclear Information System (INIS)

    Ferguson, K.W.

    1993-01-01

    The US Postal Service maintains the largest civilian fleet in the United States, totaling approximately 180,000 vehicles. To support the fleet's daily energy requirements, the Postal Service also operates one of the largest networks of underground storage tanks, nearly 7,500 nationwide. A program to apply risk assessment to planning, budget development and other management actions was implemented in September 1989. Working closely with a consultant, the Postal Service developed regulatory and environmental risk criteria and weighting factors for a ranking model. The primary objective was to identify relative risks for each underground tank at individual facilities. Relative risks at each facility were central to prioritizing scheduled improvements to the tank network. The survey was conducted on 302 underground tanks in the Northeast Region of the US. An environmental and regulatory risk score was computed for each UST. By ranking the tanks according to their risk scores, tanks were classified into management action categories including, but not limited to, underground tank testing, retrofit, repair, replacement and closure.
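The weighted risk-ranking model described above can be sketched in a few lines: each tank receives a score from weighted criterion ratings, and the ranked list drives the choice of management action. The criteria, weights, and ratings below are hypothetical illustrations, not the Postal Service's actual model.

```python
# Hypothetical criteria and weights (must sum to 1.0 for a 0-1 score).
WEIGHTS = {"tank_age": 0.40, "soil_permeability": 0.35, "depth_to_water": 0.25}

def risk_score(tank):
    """Weighted sum of normalized (0-1) criterion ratings for one tank."""
    return sum(WEIGHTS[k] * tank[k] for k in WEIGHTS)

tanks = [
    {"id": "UST-001", "tank_age": 0.9, "soil_permeability": 0.6, "depth_to_water": 0.8},
    {"id": "UST-002", "tank_age": 0.3, "soil_permeability": 0.2, "depth_to_water": 0.4},
]

# Rank tanks from highest to lowest relative risk to prioritize actions.
ranked = sorted(tanks, key=risk_score, reverse=True)
print([t["id"] for t in ranked])  # highest-risk tank first
```

Thresholds on the score would then map tanks to action categories (testing, retrofit, repair, replacement, closure), as in the program described.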

  6. PFI redux? Assessing a new model for financing hospitals.

    Science.gov (United States)

    Hellowell, Mark

    2013-11-01

    There is a growing need for investments in hospital facilities to improve the efficiency and quality of health services. In recent years, publicly financed hospital organisations in many countries have utilised private finance arrangements, variously called private finance initiatives (PFIs), public-private partnerships (PPPs) or P3s, to address their capital requirements. However, such projects have become more difficult to implement since the onset of the global financial crisis, which has led to a reduction in the supply of debt capital and an increase in its price. In December 2012, the government of the United Kingdom outlined a comprehensive set of reforms to the private finance model in order to revive this important source of capital for hospital investments. This article provides a critical assessment of the 'Private Finance 2' reforms, focusing on their likely impact on the supply and cost of capital. It concludes that constraints in supply are likely to continue, in part due to regulatory constraints facing both commercial banks and institutional investors, while the cost of capital is likely to increase, at least in the short term. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. Econometric model as a regulatory tool in electricity distribution - Case Network Performance Assessment Model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate as natural monopolies, since building parallel networks is not cost-effective. Monopoly companies face no pressure from open markets to keep their prices and costs at a reasonable level, so regulation is needed to prevent misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. A regulation method that determines the allowed income of each company with a generic computation model can be seen as an econometric model. As a special case of an econometric model, the method called the Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both a theoretical analysis and calculations for an example network area are presented in this document to find the major directing effects of the model. The NPAM parameters used in the calculations of this research report were dated 30 March 2004 and were the most recent available at the time the analysis was done. However, since NPAM is under development, the parameters have been changing constantly; therefore, slight changes in the results could occur if the calculations were repeated with the latest parameters. The main conclusions, however, are the same and do not depend on the exact parameters. (orig.)

  8. Econometric model as a regulatory tool in electricity distribution. Case network performance assessment model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate as natural monopolies, since building parallel networks is not cost-effective. Monopoly companies face no pressure from open markets to keep their prices and costs at a reasonable level, so regulation is needed to prevent misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. A regulation method that determines the allowed income of each company with a generic computation model can be seen as an econometric model. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. As a special case of an econometric model, the method called the Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both a theoretical analysis and calculations for an example network area are presented in this document to find the major directing effects of the model. The NPAM parameters used in the calculations of this research report were dated 30 March 2004 and were the most recent ones available at the time the analysis was done. However, since NPAM has been under development, the parameters have been changing constantly; therefore, slight changes might occur in the numerical results if the calculations were repeated with the latest set of parameters. However, the main conclusions are the same and do not depend on the exact parameters.

  9. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    waste LCA models. This review infers that some of the differences among waste LCA models are inherent to the time at which they were developed: models developed later are expected to benefit from earlier modelling assumptions, knowledge and issues. Models developed in different countries furthermore rely...

  10. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    International Nuclear Information System (INIS)

    Dershowitz, B.; Eiben, T.; Follin, S.; Andersson, Johan

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, the Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg, is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow-wetted surface normalized with regard to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [TL⁻¹]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [LT⁻¹]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and statistical
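The pathway parameters named above can be sketched as sums over the segments of one discrete fracture pathway: travel time accumulates length over advective velocity, and the F-factor accumulates flow-wetted surface per unit flow rate. The segment geometry and flow values below are hypothetical; the actual analysis derives them from the Aberg DFN model.

```python
def pathway_parameters(segments, darcy_flux):
    """Sum travel time and F-factor over one pathway's segments.
    segments: list of (length_m, aperture_m, flow_m3_per_yr, width_m)."""
    travel_time = 0.0   # advective travel time [yr], dimension [T]
    f_factor = 0.0      # flow-wetted surface / flow rate [yr/m], dimension [TL^-1]
    for length, aperture, flow, width in segments:
        velocity = flow / (width * aperture)   # advective velocity [m/yr]
        travel_time += length / velocity
        wetted_area = 2.0 * length * width     # both fracture walls [m2]
        f_factor += wetted_area / flow
    return travel_time, f_factor, darcy_flux   # darcy_flux: [m/yr], [LT^-1]

# Two hypothetical segments from a canister location to the discharge.
segments = [(50.0, 1e-4, 5.0, 2.0), (120.0, 2e-4, 8.0, 3.0)]
t, f, q = pathway_parameters(segments, darcy_flux=0.02)
print(round(t, 3), round(f, 1), q)
```

These three outputs correspond to the travel time, F-factor, and canister flux handed to transport codes such as FARF31 and COMP23.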

  11. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    Energy Technology Data Exchange (ETDEWEB)

    Dershowitz, B.; Eiben, T. [Golder Associates Inc., Seattle (United States); Follin, S.; Andersson, Johan [Golder Grundteknik KB, Stockholm (Sweden)

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, the Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg, is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow-wetted surface normalized with regard to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [TL⁻¹]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [LT⁻¹]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and

  12. Report on the model developments in the sectoral assessments

    DEFF Research Database (Denmark)

    Iglesias, Ana; Termansen, Mette; Bouwer, Laurens

    2014-01-01

    The Objective of this Deliverable D3.2 is to describe the models developed in BASE that is, the experimental setup for the sectoral modelling. The model development described in this deliverable will then be implemented in the adaptation and economic analysis in WP6 in order to integrate adaptati...... of impacts is necessary. - How are models linked to the economic model? - How is the information from the case studies and the information from the sectoral models mutually supportive?...

  13. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    Science.gov (United States)

    Attada, Raju; Kumar, Prashant; Dasari, Hari Prasad

    2018-04-01

    Assessment of land surface models (LSMs) in monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), Noah, and Noah with multi-parameterization (Noah-MP) LSMs, coupled with the Weather Research and Forecasting (WRF) model, are chosen based on the nature of their complexity, ranging from a simple slab model to multi-parameterization options. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to the different LSMs is discussed. Our results reveal that the monsoon features are reproduced by the WRF model with all LSMs, but with some regional discrepancies. The model simulations with the selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and the dry zone over the southern tip of India. An unrealistic precipitation pattern over the equatorial western Indian Ocean is simulated by the WRF-LSM-based experiments. The spatial and temporal distributions of the top 2-m soil characteristics (soil temperature and soil moisture) are well represented in the RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with the RUC, Noah, and Noah-MP LSM-based experiments significantly improved the skill for 2-m temperature and moisture compared to the TDS (chosen as the base) LSM-based experiments. Furthermore, the simulations with the Noah, RUC, and Noah-MP LSMs exhibit minimum error in thermodynamic fields. In the case of surface wind speed, the TDS LSM performed better than the other LSM experiments. A significant improvement is noticeable in simulated rainfall with the Noah, RUC, and Noah-MP LSMs over the TDS LSM. Thus, this study emphasizes the importance of choosing/improving LSMs for simulating ISM phenomena.

  14. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    KAUST Repository

    Attada, Raju

    2018-04-17

    Assessment of land surface models (LSMs) in monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), Noah, and Noah with multi-parameterization (Noah-MP) LSMs, coupled with the Weather Research and Forecasting (WRF) model, are chosen based on the nature of their complexity, ranging from a simple slab model to multi-parameterization options. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to the different LSMs is discussed. Our results reveal that the monsoon features are reproduced by the WRF model with all LSMs, but with some regional discrepancies. The model simulations with the selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and the dry zone over the southern tip of India. An unrealistic precipitation pattern over the equatorial western Indian Ocean is simulated by the WRF-LSM-based experiments. The spatial and temporal distributions of the top 2-m soil characteristics (soil temperature and soil moisture) are well represented in the RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with the RUC, Noah, and Noah-MP LSM-based experiments significantly improved the skill for 2-m temperature and moisture compared to the TDS (chosen as the base) LSM-based experiments. Furthermore, the simulations with the Noah, RUC, and Noah-MP LSMs exhibit minimum error in thermodynamic fields. In the case of surface wind speed, the TDS LSM performed better than the other LSM experiments. A significant improvement is noticeable in simulated rainfall with the Noah, RUC, and Noah-MP LSMs over the TDS LSM. Thus, this study emphasizes the importance of choosing/improving LSMs for simulating ISM phenomena.

  15. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests, in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work, for which there are characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next, we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.
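One classical degradation-data approach of the kind surveyed above fits a degradation path per unit and extrapolates a "pseudo failure time" where the path crosses a critical threshold. The sketch below assumes a linear path and hypothetical wear data, not the article's train-wheel data set.

```python
def fit_line(times, values):
    """Ordinary least squares for y = a + b*t; returns (a, b)."""
    n = len(times)
    tm = sum(times) / n
    ym = sum(values) / n
    b = sum((t - tm) * (y - ym) for t, y in zip(times, values)) / \
        sum((t - tm) ** 2 for t in times)
    return ym - b * tm, b

def pseudo_failure_time(times, values, threshold):
    """Time at which the fitted degradation path reaches the failure threshold."""
    a, b = fit_line(times, values)
    return (threshold - a) / b

# One unit's wear measurements (hypothetical units) vs. a failure threshold.
times = [0.0, 100.0, 200.0, 300.0]
wear = [0.0, 1.1, 1.9, 3.1]
print(round(pseudo_failure_time(times, wear, threshold=10.0), 1))
```

Repeating this for every unit yields a sample of pseudo failure times, which can then be analyzed with a standard lifetime distribution; a Bayesian treatment instead places priors on the path parameters.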

  16. FOREST ECOSYSTEM DYNAMICS ASSESSMENT AND PREDICTIVE MODELLING IN EASTERN HIMALAYA

    Directory of Open Access Journals (Sweden)

    S. P. S. Kushwaha

    2012-09-01

    Full Text Available This study focused on forest ecosystem dynamics assessment and predictive modelling of deforestation and forest cover in a part of north-eastern India, i.e. forest areas along the West Bengal, Bhutan, Arunachal Pradesh and Assam borders in the Eastern Himalaya, using temporal satellite imagery from 1975, 1990 and 2009, and predicted the forest cover for 2028 using a Cellular Automata Markov Model (CAMM). The exercise highlighted large-scale deforestation in the study area in both the 1975–1990 and 1990–2009 forest cover vectors. A net loss of 2,334.28 km2 of forest cover was noticed between 1975 and 2009, and at the current rate of deforestation a forest area of 4,563.34 km2 will be lost by 2028. The annual rate of deforestation worked out to 0.35% and 0.78% during 1975–1990 and 1990–2009 respectively. Bamboo forest increased by 24.98% between 1975 and 2009 due to the opening up of the forests. Forests in the Kokrajhar, Barpeta, Darrang, Sonitpur, and Dhemaji districts of Assam were noticed to be worst-affected, while Lower Subansiri, West and East Siang, Dibang Valley, Lohit and Changlang in Arunachal Pradesh were severely affected. Among the different forest types, the maximum loss was seen for sal forest (37.97% between 1975 and 2009), which is expected to deplete further by 60.39% by 2028. Tropical moist deciduous forest was the next category, which decreased from 5,208.11 km2 to 3,447.28 km2 (33.81%) during the same period, with a possible further depletion to 2,288.81 km2 (56.05%) by 2028. The study noted a progressive loss of forests in the study area between 1975 and 2009 through 1990 and predicted that, unless checked, the area faces further depletion of the invaluable climax forests in the region, especially sal and moist deciduous forests. The exercise demonstrated the high potential of remote sensing and geographic information systems for forest ecosystem dynamics assessment, and the efficacy of CAMM in predicting forest cover change.
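Annual deforestation rates of the kind quoted above are conventionally computed with a compound-rate formula; a sketch using the FAO-style logarithmic form, with hypothetical forest areas since the abstract reports only the resulting rates:

```python
import math

def annual_change_rate(area_start, area_end, years):
    """Compound annual rate of forest-cover change (% per year),
    r = (1/t) * ln(A_end / A_start) * 100; negative means loss."""
    return (math.log(area_end / area_start) / years) * 100.0

# Hypothetical areas (km^2) for illustration only.
r = annual_change_rate(20000.0, 18500.0, 19)  # a 19-year interval
print(f"{r:.2f} % per year")  # negative: net forest loss
```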

  17. Utility of Social Modeling for Proliferation Assessment - Enhancing a Facility-Level Model for Proliferation Resistance Assessment of a Nuclear Energy System

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Brothers, Alan J.; Gastelum, Zoe N.; Olson, Jarrod; Thompson, Sandra E.

    2009-10-26

    The Utility of Social Modeling for Proliferation Assessment project (PL09-UtilSocial) investigates the use of social and cultural information to improve nuclear proliferation assessments, including nonproliferation assessments, Proliferation Resistance (PR) assessments, safeguards assessments, and other related studies. These assessments often use and create technical information about a host State and its posture towards proliferation, the vulnerability of a nuclear energy system (NES) to an undesired event, and the effectiveness of safeguards. The objective of this project is to find and integrate social and technical information by explicitly considering the role of cultural, social, and behavioral factors relevant to proliferation, and to describe and demonstrate if and how social science modeling has utility in proliferation assessment. This report describes a modeling approach and how it might be used to support a location-specific PR assessment of a particular NES. The report demonstrates the use of social modeling to enhance an existing assessment process that relies primarily on technical factors. This effort builds on a literature review and preliminary assessment performed as the first stage of the project and compiled in PNNL-18438. This report describes an effort to answer questions about whether it is possible to incorporate social modeling into a PR assessment in such a way that we can determine the effects of social factors on a primarily technical assessment. This report provides: 1. background information about relevant social factors literature; 2. background information about a particular PR assessment approach relevant to this particular demonstration; 3. a discussion of social modeling undertaken to find and characterize social factors that are relevant to the PR assessment of a nuclear facility in a specific location; 4. a description of an enhancement concept that integrates social factors into an existing, technically

  18. Forecasting consequences of accidental release: how reliable are current assessment models

    International Nuclear Information System (INIS)

    Rohwer, P.S.; Hoffman, F.O.; Miller, C.W.

    1983-01-01

    This paper focuses on uncertainties in model output used to assess accidents. We begin by reviewing the historical development of assessment models and the associated interest in uncertainties as these evolutionary processes occurred in the United States. This is followed by a description of the sources of uncertainties in assessment calculations. Types of models appropriate for the assessment of accidents are identified. A summary is provided of results from our analyses of uncertainty in results obtained with current methodology for assessing routine and accidental radionuclide releases to the environment. We conclude with a discussion of preferred procedures and suggested future directions to improve the state of the art of radiological assessments.

  19. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    Energy Technology Data Exchange (ETDEWEB)

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability at coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for ultrasupercritical (USC) applications to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical boilers (565 C at 24 MPa). Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al{sub 2}O{sub 3} or Cr{sub 2}O{sub 3}. The computational modeling results identified a new series of Fe-25Cr-40Ni, with or without 10 wt.% Al, nanocrystalline coatings that maintain long-term stability by forming a diffusion barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of a continuous Al{sub 2}O{sub 3} scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed and, among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating, with a minimum number of shallow defects, and the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al{sub 2}O{sub 3} scale, than the widely used MCrAlY coatings.
However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al

  20. [Homeostasis model assessment (HOMA) values in Chilean elderly subjects].

    Science.gov (United States)

    Garmendia, María Luisa; Lera, Lydia; Sánchez, Hugo; Uauy, Ricardo; Albala, Cecilia

    2009-11-01

    The homeostasis model assessment for insulin resistance (HOMA-IR) estimates insulin resistance using basal insulin and glucose values and has good concordance with values obtained with the euglycemic clamp. However, it has a high variability that depends on environmental, genetic and physiologic factors. Therefore it is imperative to establish normal HOMA values in different populations. To report HOMA-IR values in Chilean elderly subjects and to determine the best cutoff point to diagnose insulin resistance. Cross-sectional study of 1003 subjects older than 60 years, of whom 803 (71% women) did not have diabetes. In 154 subjects, an oral glucose tolerance test was also performed. Insulin resistance (IR) was defined as the HOMA value corresponding to the 75th percentile of subjects without overweight or underweight. The behavior of HOMA-IR in metabolic syndrome was studied and receiver operating characteristic (ROC) curves were calculated, using glucose intolerance, defined as a blood glucose over 140 mg/dl, and hyperinsulinemia, defined as a serum insulin over 60 microU/ml, two hours after the glucose load. The median HOMA-IR value was 1.7. The 75th percentile in subjects without obesity or underweight was 2.57. The area under the ROC curve, when comparing HOMA-IR with glucose intolerance and hyperinsulinemia, was 0.8 (95% confidence interval 0.72-0.87), with HOMA-IR values ranging from 2.04 to 2.33. HOMA-IR is a useful method to determine insulin resistance in epidemiological studies. The HOMA-IR cutoff point for insulin resistance defined in this population was 2.6.
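HOMA-IR is conventionally computed with the Matthews formula from fasting glucose and insulin; a minimal sketch applying the 2.6 cutoff reported for this population (the example values are illustrative, not study data):

```python
def homa_ir(glucose_mg_dl, insulin_uU_ml):
    """Matthews et al. HOMA-IR: fasting insulin (uU/mL) x fasting
    glucose (mg/dL) / 405 (equivalently, glucose in mmol/L / 22.5)."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

CUTOFF = 2.6  # insulin-resistance cutoff reported in this study

def is_insulin_resistant(glucose_mg_dl, insulin_uU_ml):
    return homa_ir(glucose_mg_dl, insulin_uU_ml) >= CUTOFF

print(homa_ir(90, 7))                  # ~1.56, below the cutoff
print(is_insulin_resistant(100, 12))   # 100*12/405 ~ 2.96, above it
```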

  1. The PP&L Nuclear Department model for conducting self-assessments

    International Nuclear Information System (INIS)

    Murthy, M.L.R.; Vernick, H.R.; Male, A.M.; Burchill, W.E.

    1995-01-01

    The nuclear department of Pennsylvania Power & Light Company (PP&L) has initiated an aggressive, methodical, self-assessment program. Self-assessments are conducted to prevent problems, improve performance, and monitor results. The assessment activities are conducted by, or for, an individual having responsibility for performing the work being assessed. This individual, or customer, accepts ownership of the assessment effort and commits to implementing the recommendations agreed on during the assessment. This paper discusses the main elements of the assessment model developed by PP&L and the results the model has achieved to date

  2. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  3. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    Energy Technology Data Exchange (ETDEWEB)

    Sobolik, S.R.; Ho, C.K.; Dunn, E. [Sandia National Labs., Albuquerque, NM (United States); Robey, T.H. [Spectra Research Inst., Albuquerque, NM (United States); Cruz, W.T. [Univ. del Turabo, Gurabo (Puerto Rico)

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface- based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document.

  4. Sensitivity of hydrological performance assessment analysis to variations in material properties, conceptual models, and ventilation models

    International Nuclear Information System (INIS)

    Sobolik, S.R.; Ho, C.K.; Dunn, E.; Robey, T.H.; Cruz, W.T.

    1996-07-01

    The Yucca Mountain Site Characterization Project is studying Yucca Mountain in southwestern Nevada as a potential site for a high-level nuclear waste repository. Site characterization includes surface- based and underground testing. Analyses have been performed to support the design of an Exploratory Studies Facility (ESF) and the design of the tests performed as part of the characterization process, in order to ascertain that they have minimal impact on the natural ability of the site to isolate waste. The information in this report pertains to sensitivity studies evaluating previous hydrological performance assessment analyses to variation in the material properties, conceptual models, and ventilation models, and the implications of this sensitivity on previous recommendations supporting ESF design. This document contains information that has been used in preparing recommendations for Appendix I of the Exploratory Studies Facility Design Requirements document

  5. Assessing Models of Public Understanding In ELSI Outreach Materials

    Energy Technology Data Exchange (ETDEWEB)

    Bruce V. Lewenstein, Ph.D.; Dominique Brossard, Ph.D.

    2006-03-01

    issues has been used in educational public settings to affect public understanding of science. After a theoretical background discussion, our approach is three-fold. First, we will provide an overview, a "map", of DOE-funded outreach programs within the overall ELSI context to identify the importance of the educational component, and to present the criteria we used to select relevant and representative case studies. Second, we will document the history of the case studies. Finally, we will explore an intertwined set of research questions: (1) to identify what we can expect such projects to accomplish, in other words, to determine the goals that can reasonably be achieved by different types of outreach; (2) to point out how the case study approach could be useful for DOE-ELSI outreach as a whole; and (3) to use the case study approach as a basis to test theoretical models of science outreach, in order to assess to what extent those models accord with real-world outreach activities. For this last goal, we aim at identifying which practices among ELSI outreach activities contribute most to dissemination or to participation; in other words, in which cases outreach materials spark action in terms of public participation in decisions about scientific issues.

  6. Assessing risk factors for dental caries: a statistical modeling approach.

    Science.gov (United States)

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and it has received considerable attention in the scientific literature. From the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most of the cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only a few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which require inferences to be adjusted for 'optimism'.

  7. Ecological models for regulatory risk assessments of pesticides: Developing a strategy for the future.

    NARCIS (Netherlands)

    Thorbek, P.; Forbes, V.; Heimbach, F.; Hommen, U.; Thulke, H.H.; Brink, van den P.J.

    2010-01-01

    Ecological Models for Regulatory Risk Assessments of Pesticides: Developing a Strategy for the Future provides a coherent, science-based view on ecological modeling for regulatory risk assessments. It discusses the benefits of modeling in the context of registrations, identifies the obstacles that

  8. Multi-Level Model of Contextual Factors and Teachers' Assessment Practices: An Integrative Review of Research

    Science.gov (United States)

    Fulmer, Gavin W.; Lee, Iris C. H.; Tan, Kelvin H. K.

    2015-01-01

    We present a multi-level model of contextual factors that may influence teachers' assessment practices, and use this model in a selected review of existing literature on teachers' assessment knowledge, views and conceptions with respect to these contextual factors. Adapting Kozma's model, we distinguish three levels of influence on teachers'…

  9. A Model of the Pre-Assessment Learning Effects of Summative Assessment in Medical Education

    Science.gov (United States)

    Cilliers, Francois J.; Schuwirth, Lambert W. T.; Herman, Nicoline; Adendorff, Hanelie J.; van der Vleuten, Cees P. M.

    2012-01-01

    It has become axiomatic that assessment impacts powerfully on student learning. However, surprisingly little research has been published emanating from authentic higher education settings about the nature and mechanism of the pre-assessment learning effects of summative assessment. Less still emanates from health sciences education settings. This…

  10. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
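The toxicokinetic part of such TK-TD refinements is often a one-compartment model driving internal dose from a time-varying external exposure. A minimal sketch, not the authors' calibrated model, with all rate constants and the exposure pattern chosen purely for illustration:

```python
def simulate_tk(c_ext, k_in, k_out, dt=0.01):
    """One-compartment toxicokinetic model,
    dC_int/dt = k_in * C_ext(t) - k_out * C_int,
    integrated with forward Euler; c_ext gives the external
    concentration at each time step."""
    c_int = 0.0
    trace = []
    for c in c_ext:
        c_int += dt * (k_in * c - k_out * c_int)
        trace.append(c_int)
    return trace

# Pulsed exposure: constant concentration for 5 time units, then clean water.
exposure = [1.0] * 500 + [0.0] * 500
internal = simulate_tk(exposure, k_in=2.0, k_out=0.5)
peak = max(internal)  # internal dose peaks at the end of the pulse, then decays
```

The simulated time course of internal dose can then be compared against dose-mortality thresholds, or fed into a toxicodynamic module (e.g. a GUTS-type survival model) to predict the time course of survival.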

  11. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    Science.gov (United States)

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  12. The PASS model for the assessment of cognitive functioning in ...

    African Journals Online (AJOL)

    Diversity is an acknowledged characteristic of the South African society. Traditional standardised methods of assessment for cognitive functioning have been discouraged or abandoned, as they have been found to be discriminatory. Arguing for a systematic assessment process, a previous researcher has stated that ...

  13. Modelling self-assessed vulnerability to HIV and its associated ...

    African Journals Online (AJOL)

    Background: Globally, individuals' self-assessment of vulnerability to HIV infection is important to maintain safer sexual behaviour and reduce risky behaviours. However, determinants of self-perceived risk of HIV infection are not well documented and differ. We assessed the level of self-perceived vulnerability to HIV ...

  14. A Fuzzy Group Decision Making Model for Ordinal Peer Assessment

    Science.gov (United States)

    Capuano, Nicola; Loia, Vincenzo; Orciuoli, Francesco

    2017-01-01

    Massive Open Online Courses (MOOCs) are becoming an increasingly popular choice for education but, to reach their full extent, they require the resolution of new issues like assessing students at scale. A feasible approach to tackle this problem is peer assessment, in which students also play the role of assessor for assignments submitted by…

  15. Evaluation of models for assessing Medicago sativa L. hay quality

    African Journals Online (AJOL)

    UFS Campus

    ) model of Weiss et al. (1992), using lignin to determine truly digestible NDF, ... quality evaluation model for commercial application. ... The almost perfect relationship (r = 0.98; Table 1) between TDNlig of lucerne hay and MY, predicted.

  16. A SELF-ASSESSMENT MODEL IN TEACHING ACADEMIC WRITING FOR INDONESIAN EFL LEARNERS

    Directory of Open Access Journals (Sweden)

    Taufiqulloh

    2014-12-01

    Full Text Available This self-assessment model was developed to help EFL students improve their achievement in academic writing, more particularly essay writing. In the English Department of Pancasakti University Tegal, academic writing is the course subject that develops models and practices of essay writing so that students are actively engaged in rhetorical problem-solving. The development of this self-assessment model was derived from the analysis of both theoretical and empirical studies of self-assessment in EFL writing. The self-assessment model developed in this study consists of four kinds of self-assessment instruments: a self-edit checklist of writing dimensions (CWD), a checklist of writing strategies (CWS), a survey questionnaire of writing interest and awareness (SWIA), and a questionnaire of learning monitoring strategies (QLMS). This self-assessment model can be an alternative model in teaching academic writing to EFL students at the university level, more particularly students of the English Department, Pancasakti University Tegal.

  17. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    Science.gov (United States)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method for screening out insensitive parameters, followed by MARS-based Sobol' sensitivity indices for quantifying each parameter's contribution to the response variance due to its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by an adaptive surrogate-based multi-objective optimization procedure, using a MARS model for approximating the parameter-response relationship and the SCE-UA algorithm for searching the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about 65-90% reduction in 1-NSE and 60-95% reduction in |RB|. 
The validation exercise indicated a large improvement in model performance with about 40
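The two performance metrics behind the reported reductions, 1-NSE and |RB|, are standard in streamflow calibration and can be computed as follows (the flow series here are illustrative, not data from the study):

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1 is a perfect fit; 1 - NSE is the quantity reduced in the study."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def relative_bias(obs, sim):
    """RB = (total simulated - total observed) / total observed;
    |RB| is its magnitude."""
    return (sum(sim) - sum(obs)) / sum(obs)

# Illustrative daily flows:
obs = [1.0, 2.0, 3.0, 4.0]
sim = [1.1, 1.9, 3.2, 3.8]
print(nse(obs, sim), abs(relative_bias(obs, sim)))
```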

  18. Validation of crop weather models for crop assessment arid yield ...

    African Journals Online (AJOL)

    IRSIS and CRPSM models were used in this study to see how closely they could predict grain yields for selected stations in Tanzania. Input for the models comprised of weather, crop and soil data collected from five selected stations. Simulation results show that IRSIS model tends to over predict grain yields of maize, ...

  19. Linear regression models for quantitative assessment of left ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-07-04

    Jul 4, 2008 ... computed. Linear regression models for the prediction of left ventricular structures were established. Prediction models for ... study aimed at establishing linear regression models that could be used in the prediction ... Is white coat hypertension associated with arterial disease or left ventricular hypertrophy?

  20. Conceptual model for assessment of inhalation exposure: Defining modifying factors

    NARCIS (Netherlands)

    Tielemans, E.; Schneider, T.; Goede, H.; Tischer, M.; Warren, N.; Kromhout, H.; Tongeren, M. van; Hemmen, J. van; Cherrie, J.W.

    2008-01-01

    The present paper proposes a source-receptor model to schematically describe inhalation exposure to help understand the complex processes leading to inhalation of hazardous substances. The model considers a stepwise transfer of a contaminant from the source to the receptor. The conceptual model is

  1. Model for Environmental Assessment of Container Ship Transport

    DEFF Research Database (Denmark)

    Kristensen, Hans Otto Holmegaard

    2012-01-01

    A generic computer model for systematic investigations of container ship designs is described in this paper. The primary statistical data on container ships used for the model development are also presented. The model can be used to calculate exhaust gas emissions from container ships, including...

  2. Mass movement hazard assessment model in the slope profile

    Science.gov (United States)

    Colangelo, A. C.

    2003-04-01

    The central aim of this work is to assess the spatial behaviour of critical depths for slope stability, and the behaviour of their correlated variables at the soil-regolith transition, along slope profiles over granite, migmatite and mica-schist parent materials in a humid tropical environment. To this end, we measured the shear strength of residual soils and regolith materials with a "Cohron Sheargraph" apparatus and evaluated the shear stress behaviour at the soil-regolith boundary along slope profiles in each referred lithology. In the limit equilibrium approach applied here, we adapt the infinite slope model to the analysis of the whole slope profile by means of a finite-slice solution, as in the Fellenius or Bishop methods. In our case, we assume that the potential rupture surface occurs at the soil-regolith or soil-rock boundary of the slope material. For each slice, the factor of safety was calculated considering the shear strength (cohesion and friction) of the material, the soil-regolith boundary depth, the soil moisture content, the slope gradient, the gradient of the top of the subsurface flow, and the apparent soil bulk density. The correlations showed the relative weight of the cohesion, internal friction angle, apparent bulk density and slope gradient variables in the evaluation of critical depth behaviour for different simulated soil moisture content levels at the slope-profile scale. Some important results refer to the central role of the soil bulk-density variable along the slope profile, during soil evolution and at the present day, because of the intense clay production, mainly kaolinite and gibbsite, in the B and C horizons in the humid tropical environment. An increase in soil clay content produces a fall in the friction angle and bulk density of the material, especially when some montmorillonite or illite clay is present.
We have also observed, at threshold conditions, that a slight change in the soil bulk-density value may drastically disturb the equilibrium of
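The per-slice factor of safety described above is conventionally the infinite-slope, Mohr-Coulomb expression; a sketch with hypothetical soil parameters (not values measured in the study), showing how pore-water pressure lowers the factor of safety:

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg, u=0.0):
    """Infinite-slope factor of safety (Mohr-Coulomb):
    FS = (c + (gamma*z*cos^2(beta) - u) * tan(phi))
         / (gamma*z*sin(beta)*cos(beta))
    c: cohesion (kPa), phi_deg: friction angle, gamma: unit weight
    (kN/m^3), z: depth to the soil-regolith boundary (m),
    beta_deg: slope angle, u: pore-water pressure at the boundary (kPa)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma * z * math.cos(beta) ** 2 - u) * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Hypothetical slice: FS < 1 indicates instability.
fs_dry = infinite_slope_fs(c=10.0, phi_deg=30.0, gamma=18.0, z=2.0,
                           beta_deg=35.0)
fs_wet = infinite_slope_fs(c=10.0, phi_deg=30.0, gamma=18.0, z=2.0,
                           beta_deg=35.0, u=15.0)
print(fs_dry, fs_wet)  # rising pore pressure pushes FS below 1
```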

  3. Models used to assess the performance of photovoltaic systems.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Klise, Geoffrey T.

    2009-12-01

    This report documents the various photovoltaic (PV) performance models and software developed and utilized by researchers at Sandia National Laboratories (SNL) in support of the Photovoltaics and Grid Integration Department. In addition to PV performance models, hybrid system and battery storage models are discussed. A hybrid system using other distributed sources and energy storage can help reduce the variability inherent in PV generation, and due to the complexity of combining multiple generation sources and system loads, these models are invaluable for system design and optimization. Energy storage plays an important role in reducing PV intermittency, and battery storage models are used to understand the best configurations and technologies to store PV-generated electricity. Other researchers' models used by SNL are discussed, including some widely known models that incorporate algorithms developed at SNL. Other models included in the discussion are not used by, or were not adopted from, SNL research but may provide some benefit to researchers working on PV array performance, hybrid system models and energy storage. The paper is organized into three sections describing the different software models as applied to photovoltaic performance, hybrid systems, and battery storage. For each model there is a description which includes where to find the model, whether it is currently maintained, and any references that may be available. Modeling improvements underway at SNL include quantifying the uncertainty of individual system components, the overall uncertainty in modeled vs. measured results, and modeling large PV systems. SNL is also conducting research into the overall reliability of PV systems.

  4. Sociocultural Behavior Influence Modelling & Assessment: Current Work and Research Frontiers.

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, Michael Lewis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2018-01-01

A common problem associated with the effort to better assess potential behaviors of various individuals within different countries is the sheer difficulty of comprehending the dynamic nature of populations, particularly over time and considering feedback effects. This paper discusses a theory-based analytical capability designed to enable analysts to better assess the influence of events on individuals interacting within a country or region. These events can include changes in policy, man-made or natural disasters, migration, war, or other changes in environmental/economic conditions. In addition, this paper describes potential extensions of this type of research to enable more timely and accurate assessments.

  5. Damage severity assessment in wind turbine blade laboratory model through fuzzy finite element model updating

    Science.gov (United States)

    Turnbull, Heather; Omenzetter, Piotr

    2017-04-01

The recent shift towards development of clean, sustainable energy sources has provided a new challenge in terms of structural safety and reliability: with aging, manufacturing defects, harsh environmental and operational conditions, and extreme events such as lightning strikes, wind turbines can become damaged, resulting in production losses and environmental degradation. To monitor the current structural state of the turbine, structural health monitoring (SHM) techniques would be beneficial. Physics-based SHM, in the form of calibration of finite element models (FEMs) by inverse techniques, is adopted in this research. Fuzzy finite element model updating (FFEMU) techniques for damage severity assessment of a small-scale wind turbine blade are discussed and implemented. The main advantage of FFEMU is its ability to account in a simple way for uncertainty within the model updating problem. Uncertainty quantification techniques, such as fuzzy sets, enable a convenient mathematical representation of the various uncertainties. Experimental frequencies obtained from modal analysis of a small-scale wind turbine blade were described by fuzzy numbers to model measurement uncertainty. During this investigation, damage severity estimation was investigated through the addition of small masses of varying magnitude to the trailing edge of the structure. This structural modification, intended to be in lieu of damage, enabled non-destructive experimental simulation of structural change. A numerical model was constructed with multiple variable additional masses simulated on the blade's trailing edge and used as updating parameters. Objective functions for updating were constructed and minimized using both the particle swarm optimization and firefly algorithms. FFEMU obtained a prediction of the baseline material properties of the blade whilst also successfully predicting, with sufficient accuracy, a larger magnitude of structural alteration and its location.
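The updating loop described above (minimizing a frequency-residual objective with particle swarm optimization) can be sketched in miniature. Everything here is illustrative: the "measured" frequencies, the one-parameter surrogate standing in for the FE model, and the PSO constants are assumptions, not values from the study.

```python
import random
random.seed(1)

# "Measured" natural frequencies (Hz) of the blade -- invented values.
f_measured = [12.0, 48.0, 95.0]

def model_freqs(mass):
    # Toy surrogate for the FE model: an added trailing-edge mass lowers
    # every natural frequency as f0 / sqrt(1 + mass) (illustrative physics).
    f0 = [12.3, 49.2, 97.4]
    return [f / (1.0 + mass) ** 0.5 for f in f0]

def objective(mass):
    # Sum of squared relative frequency residuals, as in FE model updating.
    return sum(((fm - fe) / fe) ** 2
               for fm, fe in zip(model_freqs(mass), f_measured))

def pso(obj, lo, hi, n=20, iters=60):
    # Minimal particle swarm optimiser (inertia + cognitive + social terms).
    xs = [random.uniform(lo, hi) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]
    pval = [obj(x) for x in xs]
    gbest = pbest[min(range(n), key=lambda i: pval[i])]
    for _ in range(iters):
        for i in range(n):
            vs[i] = (0.7 * vs[i]
                     + 1.5 * random.random() * (pbest[i] - xs[i])
                     + 1.5 * random.random() * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # keep inside bounds
            v = obj(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = xs[i], v
                if v < obj(gbest):
                    gbest = xs[i]
    return gbest

mass_hat = pso(objective, 0.0, 1.0)
print(round(mass_hat, 3))
```

The fuzzy part of FFEMU would repeat this minimization at several membership levels of the fuzzy measured frequencies, producing an interval rather than a point estimate for the updated parameter.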

  6. Assessment of rainfall-runoff modelling for climate change mitigation

    Science.gov (United States)

    Otieno, Hesbon; Han, Dawei; Woods, Ross

    2015-04-01

Sustainable water resources management requires reliable methods for quantifying hydrological variables. This is a major challenge in developing countries because sparse gauge networks yield inadequate data. Both abundance and shortage of water can occur in a catchment within the same year, with deficit situations becoming an increasingly common phenomenon in Kenya. This work compares the performance of two models for generating synthetic flow data in the Tana River catchment in Kenya. One is the simpler USGS Thornthwaite monthly water balance model, which uses a monthly time step and has three parameters. To explore alternative modelling schemes, the more complex Pitman model, with 19 parameters, was also applied in the catchment. It is uncertain whether the complex model (Pitman) will outperform the simple one, because a model with a large number of parameters may do well on the current system but poorly in the future. To check this, we calibrated the models on older data (1970-1985) and validated them on recent data (after 1985) to see which model is robust over time. This study is relevant and useful to water resources managers in scenario analysis for water resources management, planning, and development in African countries with similar climates and catchment conditions.
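Below is a toy three-parameter monthly water-balance model in the spirit of (but not identical to) the USGS Thornthwaite model mentioned above. The parameter names, default values, and the omission of snow and temperature-based PET are simplifying assumptions for illustration.

```python
def monthly_water_balance(precip, pet, capacity=150.0, runoff_frac=0.5,
                          direct_frac=0.05, storage0=0.0):
    """Toy monthly water balance (not the published USGS formulation).

    precip, pet : monthly precipitation and potential ET series (mm)
    capacity    : soil-moisture storage capacity (mm)       -- parameter 1
    runoff_frac : fraction of surplus released per month    -- parameter 2
    direct_frac : fraction of precipitation as direct runoff -- parameter 3
    Returns the simulated monthly runoff series (mm).
    """
    storage, surplus, runoff = storage0, 0.0, []
    for p, e in zip(precip, pet):
        direct = direct_frac * p
        p_eff = p - direct
        if p_eff >= e:                      # wet month: refill soil first
            storage += p_eff - e
            if storage > capacity:          # overflow becomes surplus
                surplus += storage - capacity
                storage = capacity
        else:                               # dry month: draw down soil
            storage = max(0.0, storage - (e - p_eff))
        q = runoff_frac * surplus           # release part of the surplus
        surplus -= q
        runoff.append(direct + q)
    return runoff

# Two wet months followed by two dry months (invented values, mm).
q = monthly_water_balance([200, 180, 20, 5], [60, 70, 90, 100])
```

The surplus store delays runoff, so simulated flow persists into the dry months even after precipitation drops, which is the behaviour a calibration against observed flow would exploit.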

  7. Bayesian Dimensionality Assessment for the Multidimensional Nominal Response Model

    Directory of Open Access Journals (Sweden)

    Javier Revuelta

    2017-06-01

Full Text Available This article introduces Bayesian estimation and evaluation procedures for the multidimensional nominal response model. The utility of this model is to perform a nominal factor analysis of items that consist of a finite number of unordered response categories. The key aspect of the model, in comparison with the traditional factor model, is that there is a slope for each response category on the latent dimensions, instead of slopes associated with the items. The extended parameterization of the multidimensional nominal response model requires large samples for estimation. When the sample size is moderate or small, some of these parameters may be weakly empirically identifiable and the estimation algorithm may run into difficulties. We propose a Bayesian MCMC inferential algorithm to estimate the parameters and the number of dimensions underlying the multidimensional nominal response model. Two Bayesian approaches to model evaluation were compared: discrepancy statistics (DIC, WAIC, and LOO) that provide an indication of the relative merit of different models, and the standardized generalized discrepancy measure, which requires resampling data and is computationally more involved. A simulation study was conducted to compare these two approaches, and the results show that the standardized generalized discrepancy measure can be used to reliably estimate the dimensionality of the model whereas the discrepancy statistics are questionable. The paper also includes an example with real data in the context of learning styles, in which the model is used to conduct an exploratory factor analysis of nominal data.
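The model's defining feature, category-specific slopes on the latent dimensions, amounts to a softmax over category logits. A minimal sketch of the category-probability computation for one item; the parameter values are invented for illustration:

```python
import math

def nominal_response_probs(theta, slopes, intercepts):
    """Category-response probabilities for one item under the
    (multidimensional) nominal response model: each *category* k has its
    own slope vector a_k and intercept c_k, and

        P(k | theta) = exp(a_k . theta + c_k) / sum_j exp(a_j . theta + c_j)
    """
    logits = [sum(a * t for a, t in zip(a_k, theta)) + c_k
              for a_k, c_k in zip(slopes, intercepts)]
    m = max(logits)                       # stabilise the softmax
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Illustrative item with 3 unordered categories on 2 latent dimensions;
# the first category is fixed as reference (zero slopes and intercept).
slopes = [[0.0, 0.0], [1.2, -0.3], [0.4, 1.5]]
intercepts = [0.0, -0.5, -1.0]
p = nominal_response_probs([0.8, 0.2], slopes, intercepts)
```

Bayesian estimation would place priors on the slopes and intercepts and sample them by MCMC; the likelihood evaluated at each draw is a product of these category probabilities.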

  8. Regional scale ecological risk assessment: using the relative risk model

    National Research Council Canada - National Science Library

    Landis, Wayne G

    2005-01-01

    ...) in the performance of regional-scale ecological risk assessments. The initial chapters present the methodology and the critical nature of the interaction between risk assessors and decision makers...

  9. A practical guideline for human error assessment: A causal model

    Science.gov (United States)

    Ayele, Y. Z.; Barabadi, A.

    2017-12-01

To meet availability targets and reduce system downtime, effective maintenance is of great importance. However, maintenance performance is affected in complex ways by human factors; hence, to achieve effective maintenance operations, these factors need to be assessed and quantified. To avoid the inadequacies of traditional human error assessment (HEA) approaches, the application of Bayesian Networks (BNs) is gaining popularity. The main purpose of this paper is to propose a BN-based HEA framework for maintenance operations. The proposed framework aids in assessing the effects of human performance influencing factors on the likelihood of human error during maintenance activities. Further, the paper investigates how operational issues must be considered in system failure-rate analysis, maintenance planning, and prediction of human error in pre- and post-maintenance operations. The goal is to assess how performance monitoring and evaluation of human factors can lead to better operation and maintenance.
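A minimal illustration of the BN idea: two performance-influencing factors with priors and a conditional probability table for error, marginalized by enumeration. The variables and all probabilities are hypothetical, not taken from the paper's framework.

```python
# Hypothetical two-factor network: Fatigue and Training influence the
# probability of a human error during a maintenance task.
p_fatigue = {True: 0.3, False: 0.7}          # prior on fatigue
p_trained = {True: 0.8, False: 0.2}          # prior on adequate training

# CPT: P(error = True | fatigue, trained) -- illustrative numbers only.
p_error = {
    (True,  True):  0.10,
    (True,  False): 0.40,
    (False, True):  0.02,
    (False, False): 0.15,
}

def marginal_error():
    # Marginalise the error probability over both parent variables.
    return sum(p_fatigue[f] * p_trained[t] * p_error[(f, t)]
               for f in (True, False) for t in (True, False))

print(round(marginal_error(), 4))  # → 0.0802
```

A real HEA network would have more factors and would also support the reverse query, e.g. updating the belief about fatigue given that an error was observed, via Bayes' rule over the same tables.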

  10. CCIEA data and model output - California Current Integrated Ecosystem Assessment

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The California Current Integrated Ecosystem Assessment (CCIEA) is a joint project between staff at the NWFSC, SWFSC, NMML, ONMS, and WCRO to provide managers and...

  11. Life Cycle Assessment modeling of milk production in Iran

    OpenAIRE

    Hamzeh Soltanali; Bagher Emadi; Abbas Rohani; Mehdi Khojastehpour; Amin Nikkhah

    2015-01-01

Livestock units are among the sectors with the greatest impact on environmental pollution. Therefore, the aim of this study was to investigate the environmental impacts of milk production in Guilan province of Iran through the Life Cycle Assessment (LCA) methodology. The primary data were collected from 45 milk production units through a field survey with the help of a structured questionnaire. The reliability was assessed using Cronbach’s alpha coefficient and was estimated to be acceptable ...

  12. Tropospheric Ozone Assessment Report: Assessment of global-scale model performance for global and regional ozone distributions, variability, and trends

    Directory of Open Access Journals (Sweden)

    P. J. Young

    2018-01-01

    Full Text Available The goal of the Tropospheric Ozone Assessment Report (TOAR is to provide the research community with an up-to-date scientific assessment of tropospheric ozone, from the surface to the tropopause. While a suite of observations provides significant information on the spatial and temporal distribution of tropospheric ozone, observational gaps make it necessary to use global atmospheric chemistry models to synthesize our understanding of the processes and variables that control tropospheric ozone abundance and its variability. Models facilitate the interpretation of the observations and allow us to make projections of future tropospheric ozone and trace gas distributions for different anthropogenic or natural perturbations. This paper assesses the skill of current-generation global atmospheric chemistry models in simulating the observed present-day tropospheric ozone distribution, variability, and trends. Drawing upon the results of recent international multi-model intercomparisons and using a range of model evaluation techniques, we demonstrate that global chemistry models are broadly skillful in capturing the spatio-temporal variations of tropospheric ozone over the seasonal cycle, for extreme pollution episodes, and changes over interannual to decadal periods. However, models are consistently biased high in the northern hemisphere and biased low in the southern hemisphere, throughout the depth of the troposphere, and are unable to replicate particular metrics that define the longer term trends in tropospheric ozone as derived from some background sites. When the models compare unfavorably against observations, we discuss the potential causes of model biases and propose directions for future developments, including improved evaluations that may be able to better diagnose the root cause of the model-observation disparity. Overall, model results should be approached critically, including determining whether the model performance is acceptable for
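Two of the standard skill metrics behind statements like "biased high in the northern hemisphere" are the mean bias and the normalised mean bias. A minimal sketch with invented ozone values (ppb):

```python
def mean_bias(model, obs):
    """Mean bias (model - obs) and normalised mean bias (%), two common
    metrics for grading chemistry models against ozone observations."""
    diff = sum(m - o for m, o in zip(model, obs))
    mb = diff / len(obs)
    nmb = 100.0 * diff / sum(obs)
    return mb, nmb

# NH-like case: model consistently high relative to observed ozone (ppb).
mb, nmb = mean_bias([42.0, 55.0, 48.0], [38.0, 50.0, 45.0])
print(mb, round(nmb, 2))  # → 4.0 9.02
```

Multi-model assessments typically report such metrics per region and season, which is how the hemispheric high/low bias pattern described above is diagnosed.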

  13. Assessment of factors influencing finite element vertebral model predictions.

    Science.gov (United States)

    Jones, Alison C; Wilcox, Ruth K

    2007-12-01

    This study aimed to establish model construction and configuration procedures for future vertebral finite element analysis by studying convergence, sensitivity, and accuracy behaviors of semiautomatically generated models and comparing the results with manually generated models. During a previous study, six porcine vertebral bodies were imaged using a microcomputed tomography scanner and tested in axial compression to establish their stiffness and failure strength. Finite element models were built using a manual meshing method. In this study, the experimental agreement of those models was compared with that of semiautomatically generated models of the same six vertebrae. Both manually and semiautomatically generated models were assigned gray-scale-based, element-specific material properties. The convergence of the semiautomatically generated models was analyzed for the complete models along with material property and architecture control cases. A sensitivity study was also undertaken to test the reaction of the models to changes in material property values, architecture, and boundary conditions. In control cases, the element-specific material properties reduce the convergence of the models in comparison to homogeneous models. However, the full vertebral models showed strong convergence characteristics. The sensitivity study revealed a significant reaction to changes in architecture, boundary conditions, and load position, while the sensitivity to changes in material property values was proportional. The semiautomatically generated models produced stiffness and strength predictions of similar accuracy to the manually generated models with much shorter image segmentation and meshing times. Semiautomatic methods can provide a more rapid alternative to manual mesh generation techniques and produce vertebral models of similar accuracy. 
The representation of the boundary conditions, load position, and surrounding environment is crucial to the accurate prediction of the
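The gray-scale-based, element-specific material assignment described above is commonly done in two steps: a linear gray-value-to-density calibration followed by a density-modulus power law. A sketch with illustrative, uncalibrated coefficients (not the values used in this study):

```python
def element_modulus(gray, rho_slope=0.0008, rho_intercept=0.1,
                    c=6850.0, p=1.49):
    """Map a CT gray value to an element Young's modulus (MPa).

    Two-step recipe often used for element-specific properties:
      1. gray value -> apparent density rho (g/cm^3), linear calibration
      2. rho -> modulus via a power law E = c * rho**p
    All coefficients here are illustrative assumptions.
    """
    rho = max(0.0, rho_slope * gray + rho_intercept)
    return c * rho ** p

# Assign each element in a (hypothetical) mesh its own modulus.
gray_values = [150, 600, 1200]
moduli = [element_modulus(g) for g in gray_values]
```

Because every element gets its own modulus, the mesh no longer averages out local density variation, which is exactly why the control cases above converge differently from homogeneous models.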

  14. Compartmental models for assessing the fishery production in the Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Dalal, S.G.; Parulekar, A.H.

Compartmental models for assessing the fishery production in the Indian Ocean are discussed. The article examines the theoretical basis on which modern fishery science is built. The model shows that large changes in energy flux from one pathway...

  15. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    Science.gov (United States)

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  16. Computer-Based Resource Accounting Model for Automobile Technology Impact Assessment

    Science.gov (United States)

    1976-10-01

    A computer-implemented resource accounting model has been developed for assessing resource impacts of future automobile technology options. The resources tracked are materials, energy, capital, and labor. The model has been used in support of the Int...

  17. A spatial- and age-structured assessment model to estimate the ...

    African Journals Online (AJOL)

    , thereby indirectly negatively impacting juvenile abalone which rely on the urchins for shelter. A model is developed for abalone that is an extension of more standard age-structured assessment models because it explicitly takes spatial effects ...

  18. The Revised Hierarchical Model: A critical review and assessment

    OpenAIRE

    Kroll, Judith F.; van Hell, Janet G.; Tokowicz, Natasha; Green, David W.

    2010-01-01

    Brysbaert and Duyck (2009) suggest that it is time to abandon the Revised Hierarchical Model (Kroll and Stewart, 1994) in favor of connectionist models such as BIA+ (Dijkstra and Van Heuven, 2002) that more accurately account for the recent evidence on nonselective access in bilingual word recognition. In this brief response, we first review the history of the Revised Hierarchical Model (RHM), consider the set of issues that it was proposed to address, and then evaluate the evidence that supp...

  19. Alternative regression models to assess increase in childhood BMI

    OpenAIRE

    Beyerlein, Andreas; Fahrmeir, Ludwig; Mansmann, Ulrich; Toschke, André M

    2008-01-01

Abstract Background Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Methods Different regression approaches to predicting childhood BMI were compared by goodness-of-fit measures and ease of interpretation, including generalized linear models (GLMs), quantile regression, and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 childre...
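Quantile regression, one of the compared approaches, minimizes the check (pinball) loss; with an intercept-only model the minimizer is simply the empirical quantile, which the sketch below demonstrates by grid search on an invented skewed sample:

```python
def pinball_loss(q, data, tau):
    """Average check (pinball) loss of the constant predictor q at
    quantile level tau. Quantile regression minimises this loss; with
    only an intercept the minimiser is the empirical tau-quantile."""
    return sum((tau * (y - q)) if y >= q else ((tau - 1.0) * (y - q))
               for y in data) / len(data)

# Skewed toy "BMI" sample (invented); grid-search the best constant.
bmi = [14.2, 15.0, 15.1, 15.6, 16.0, 16.4, 17.2, 18.9, 21.5, 25.3]
grid = [14.0 + 0.01 * i for i in range(1200)]
q75 = min(grid, key=lambda q: pinball_loss(q, bmi, 0.75))
print(round(q75, 2))  # → 18.9
```

Adding covariates (age, sex) turns the constant q into a regression function, and fitting it at several tau values yields the BMI percentile curves that skew-aware methods such as GAMLSS also target.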

  20. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the model for the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can