WorldWideScience

Sample records for modelling assessment irma

  1. Utilizing NASA Earth Observations to Assess Impacts of Hurricanes Andrew and Irma on Mangrove Forests in Biscayne Bay National Park, FL

    Science.gov (United States)

    Kumar, A.; Weber, S.; Remillard, C.; Escobar Pardo, M. L.; Hashemi Tonekaboni, N.; Cameron, C.; Linton, S.; Rickless, D.; Rivero, R.; Madden, M.

    2017-12-01

    Extreme weather events, such as hurricanes, pose major threats to coastal communities around the globe. However, mangrove forests along coastlines act as barriers that subdue the impacts of these catastrophic events. The Biscayne Bay National Park mangrove forest near the city of Miami Beach was affected by the Category 4 Hurricane Irma in September 2017. This study analyzed the impact of Hurricane Irma on Biscayne Bay National Park mangroves. Several remote sensing datasets, including Landsat 8 Operational Land Imager (OLI), Sentinel-2 MultiSpectral Instrument (MSI), PlanetScope, and aerial imagery, were used to assess pre- and post-hurricane conditions. The high-resolution aerial imagery and PlanetScope data were used to map damaged areas within the national park. Additionally, Landsat 8 OLI and Sentinel-2 MSI data were used to estimate changes in biophysical parameters, including gross primary productivity (GPP), before and after Hurricane Irma. The project also examined damage associated with Hurricane Andrew (1992) using historical Landsat 5 Thematic Mapper (TM) data. Those results were compared with GPP estimates following Hurricane Irma and suggested that Hurricane Andrew's impact on Biscayne Bay National Park was greater than Irma's. The results of this study will help enhance the mangrove health monitoring and shoreline management programs led by officials at the City of Miami Beach Public Works Department.

  2. Adapting National Water Model Forecast Data to Local Hyper-Resolution H&H Models During Hurricane Irma

    Science.gov (United States)

    Singhofen, P.

    2017-12-01

    The National Water Model (NWM) is a remarkable undertaking. The foundation of the NWM is a 1 square kilometer grid used for near real-time modeling and flood forecasting of most rivers and streams in the contiguous United States. However, the NWM falls short in highly urbanized areas with complex drainage infrastructure. To overcome these shortcomings, the presenter proposes to leverage existing local hyper-resolution H&H models and adapt the NWM forcing data to them. Gridded near real-time rainfall, short-range forecasts (18-hour), and medium-range forecasts (10-day) during Hurricane Irma are applied to numerous detailed H&H models in highly urbanized areas of the State of Florida. Coastal and inland models are evaluated. Near real-time rainfall data are compared with gauge observations, and the ability to predict flooding in advance from forecast data is evaluated. Preliminary findings indicate that the near real-time rainfall data are consistently and significantly lower than observations. The forecast data are more promising: in most cases the medium-range forecasts provide 2 - 3 days of advance notice of peak flood conditions at a reasonable level of accuracy with respect to both timing and magnitude, and the short-range forecasts provide about 12 - 14 hours. Since these are hyper-resolution models, flood forecasts can be made at the street level, providing emergency response teams with valuable information for coordinating and dispatching limited resources.

  3. Comparative assessment of quality of immunoradiometric assay (IRMA) and chemiluminescence immunometric assay (CHEIMA) for estimation of thyroid stimulating hormone (TSH)

    International Nuclear Information System (INIS)

    Sajid, K.M.

    2009-01-01

    Biological substances such as hormones, vitamins and enzymes are found in minute quantities in blood, so their estimation requires very sensitive and specific methods. The most modern method for estimating thyroid stimulating hormone in serum is the non-isotopic, enzyme-enhanced chemiluminescence immunometric method. In our laboratory the immunoradiometric assay has been in routine use for many years. Recently, interest has grown in establishing non-isotopic techniques in the laboratories of PAEC; the main requirement for adopting the new procedures is to compare their results, cost and other benefits with the existing method. The immunoassay laboratory of MINAR therefore conducted a study to compare the two methods. A total of 173 cases (34 males, 139 females; ages 1 to 65 years) of clinically confirmed thyroid status were included in the study. Serum samples from these cases were analyzed by both methods, and the results were compared by plotting precision profiles and correlation plots and by calculating the sensitivities and specificities of the methods. As the results across all samples were not normally distributed, the Wilcoxon rank sum test was applied to compare the analytical results of the two methods. The comparison shows that the results obtained by the two methods are not completely similar (p=0.0003293), although analysis of samples in groups shows some similarity between the results for hypo- and hyperthyroid patients (p<=0.156 and p<=0.6138). This means the two methods could sometimes disagree in the final diagnosis. Although TSH-CHEIMA is analytically more sensitive than TSH-IRMA, the clinical sensitivities and specificities of the two methods are not significantly different. The TSH-CHEIMA test completes in almost 2 hours, whereas TSH-IRMA takes about 6 hours. Comparison of costs shows that TSH-CHEIMA is almost 5 times more expensive than TSH-IRMA. We conclude that the two methods could sometimes disagree but the two techniques have almost the same
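    A minimal sketch of the kind of comparison the abstract describes: a pure-Python Wilcoxon rank-sum test (normal approximation, mid-ranks for ties) applied to illustrative, made-up TSH values from two hypothetical assay runs; none of these numbers come from the study.

```python
import math

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum (Mann-Whitney) test via the normal
    approximation; tied observations receive mid-ranks. Returns (U, p)."""
    pooled = sorted(x + y)
    ranks, i = {}, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0   # mid-rank for this tie group
        i = j
    n1, n2 = len(x), len(y)
    r_x = sum(ranks[v] for v in x)             # rank sum of sample x
    u = r_x - n1 * (n1 + 1) / 2.0              # Mann-Whitney U statistic
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    if sigma == 0.0:
        return u, 1.0
    z = (u - mu) / sigma
    return u, math.erfc(abs(z) / math.sqrt(2.0))   # two-sided p-value

# Illustrative paired TSH results (mIU/L) from two hypothetical methods.
irma = [1.2, 2.5, 0.8, 3.1, 4.0, 2.2, 1.9, 0.5]
cheima = [1.1, 2.9, 0.7, 3.5, 4.4, 2.1, 2.3, 0.6]
u, p = rank_sum_p(irma, cheima)
print("methods differ" if p < 0.05 else "no significant difference")
```

    The normal approximation is adequate for the sample sizes in the study (n = 173); for very small groups an exact test would be preferable.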

  4. Nowcasting, forecasting and hindcasting Harvey and Irma inundation in near-real time using a continental 2D hydrodynamic model

    Science.gov (United States)

    Sampson, C. C.; Wing, O.; Quinn, N.; Smith, A.; Neal, J. C.; Schumann, G.; Bates, P.

    2017-12-01

    During an ongoing natural disaster, data are required on: (1) the current situation (nowcast); (2) its likely immediate evolution (forecast); and (3) a consistent post-event view of what actually happened (hindcast or reanalysis). We describe methods used to achieve all three tasks for flood inundation during the Harvey and Irma events using a continental-scale 2D hydrodynamic model (Wing et al., 2017). The model solves the local inertial form of the shallow water equations over a regular grid of 1 arc second (~30 m). Terrain data are taken from the USGS National Elevation Dataset, with known flood defences represented using the U.S. Army Corps of Engineers National Levee Database. Channels are treated as sub-grid-scale features using the HydroSHEDS global hydrography dataset. The model is driven by river flows, rainfall and coastal water levels. It simulates fluvial flooding in basins > 50 km², and pluvial and coastal flooding everywhere. Previous wide-area validation tests show this model capable of matching FEMA maps and USGS local models built with bespoke data, with hit rates of 86% and 92% respectively (Wing et al., 2017). Boundary conditions were taken from NOAA QPS data to produce nowcast and forecast simulations in near real time, before updating with NOAA observations to produce the hindcast. During the event, simulation results were supplied to major insurers and multinationals, who used them to estimate their likely capital exposure and to mitigate flood damage to their infrastructure while the event was underway. Simulations were validated against modelled flood footprints computed by FEMA and USACE, and against composite satellite imagery produced by the Dartmouth Flood Observatory. For the Harvey event, hit rates ranged from 60-84% against these data sources, but a lack of metadata made like-for-like comparisons difficult. The satellite data also appeared to miss known flooding in urban areas that was picked up by the models. Despite
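    The local inertial scheme this abstract references (following Bates et al., 2010, on which the Wing et al., 2017 model builds) can be sketched for a single cell interface. This is a hedged illustration of the published formulation, not the production model; function names and parameter values are invented for the example.

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def local_inertial_flux(q, h_flow, surf_slope, n, dt):
    """One explicit update of unit-width discharge q (m^2/s) across a cell
    interface in the local inertial approximation: the advection term of the
    shallow water equations is dropped, and Manning friction (coefficient n)
    is treated semi-implicitly, which keeps the scheme stable at larger dt."""
    if h_flow <= 0.0:                       # a dry interface carries no flow
        return 0.0
    numerator = q - G * h_flow * dt * surf_slope          # gravity/pressure
    friction = 1.0 + G * dt * n * n * abs(q) / h_flow ** (7.0 / 3.0)
    return numerator / friction

def stable_dt(dx, h_max, alpha=0.7):
    """CFL-style adaptive time step commonly paired with this scheme:
    dt = alpha * dx / sqrt(g * h_max), with a small floor on depth."""
    return alpha * dx / math.sqrt(G * max(h_max, 0.01))
```

    In a full 2D model, fluxes like this are computed on all four faces of each ~30 m cell and the water depth is then updated from the net flux divergence.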

  5. Selection of matched pair of monoclonal antibodies for development of immunoradiometric assay (IRMA) : our experience with IRMA of TSH

    International Nuclear Information System (INIS)

    Kadwad, V.B.; Jyotsna, N.; Sivaprasad, N.

    1998-01-01

    Full text: In an immunoradiometric assay (IRMA), two antibodies raised against two different epitopes of the same antigen are used, one bound to a solid phase (capture antibody) and the other labelled with 125I (detector antibody). The development of any IRMA thus involves proper selection of the capture and detector antibodies, preparation of the solid phase, labelling of the antibody, and assay optimization. Extensive studies have been carried out on these aspects in our laboratory, with greater emphasis on the behavior of different pairs of antibodies as sandwich partners: monoclonal-monoclonal and monoclonal-polyclonal. The parameters studied include the ease of radioiodination of different monoclonal antibodies and the effect of interchanging the capture and detector antibodies. Keeping TSH as a model, two different monoclonal antibodies, a polyclonal antibody and a tracer from a commercial TSH IRMA kit were used in this study. Based on these studies, an assay procedure for an in-house IRMA of TSH has been developed with a sensitivity of 0.1 μIU/ml and validated

  6. Irma Optimisti "Female mathematics" / Raivo Kelomees

    Index Scriptorium Estoniae

    Kelomees, Raivo, 1960-

    2007-01-01

    On Irma Optimisti's exhibition "Female mathematics" at the Muu Gallery in Helsinki, part of the project "Conceptuality and craftsmanship (Käsitteellisyys ja käsityöläisyys), or woman and technology". Also published in the newspaper "Eesti Päevaleht", 27 February 1996.

  7. Rapid-response flood mapping during Hurricanes Harvey, Irma and Maria by the Global Flood Partnership (GFP)

    Science.gov (United States)

    Cohen, S.; Alfieri, L.; Brakenridge, G. R.; Coughlan, E.; Galantowicz, J. F.; Hong, Y.; Kettner, A.; Nghiem, S. V.; Prados, A. I.; Rudari, R.; Salamon, P.; Trigg, M.; Weerts, A.

    2017-12-01

    The Global Flood Partnership (GFP; https://gfp.jrc.ec.europa.eu) is a multi-disciplinary group of scientists, operational agencies and flood risk managers focused on developing efficient and effective global flood management tools. Launched in 2014, its aim is to establish a partnership for global flood forecasting, monitoring and impact assessment to strengthen preparedness and response and to reduce global disaster losses. International organizations, the private sector, national authorities, universities and research agencies contribute to the GFP on a voluntary basis and benefit from a global network focused on flood risk reduction. At the onset of Hurricane Harvey, the GFP was `activated' through email requests to its mailing service. Soon after, flood inundation maps based on remote sensing analysis and modeling were shared by different agencies, institutions, and individuals. These products were disseminated, with varying degrees of effectiveness, to federal, state and local agencies via email and data-sharing services. This generated a broad data-sharing network that was utilized in the early stages of Hurricane Irma's impact, just two weeks after Harvey. In this presentation, we will describe the extent and chronology of the GFP response to Hurricanes Harvey, Irma and Maria. We will assess the potential usefulness of this effort for event managers in various types of organizations and discuss future improvements to be implemented.

  8. Estimating the human influence on Hurricanes Harvey, Irma and Maria

    Science.gov (United States)

    Wehner, M. F.; Patricola, C. M.; Risser, M. D.

    2017-12-01

    Attribution of the human-induced climate change influence on the physical characteristics of individual extreme weather events has become an advanced science over the past decade. However, it is only recently that such quantification of anthropogenic influences on event magnitudes and probabilities of occurrence could be applied to very extreme storms such as hurricanes. We present results from two different classes of attribution studies for the impactful Atlantic hurricanes of 2017. The first is an analysis of the record rainfall amounts during Hurricane Harvey in the Houston, Texas area. We analyzed observed precipitation from the Global Historical Climatology Network with a covariate-based extreme value statistical analysis, accounting for both the external influence of global warming and the internal influence of ENSO. We found that human-induced climate change likely increased Hurricane Harvey's total rainfall by at least 19%, and likely increased the chances of the observed rainfall by a factor of at least 3.5. This suggests that changes exceeded Clausius-Clapeyron scaling, motivating attribution studies using dynamical climate models. The second analysis consists of two sets of hindcast simulations of Hurricanes Harvey, Irma, and Maria using the Weather Research and Forecasting (WRF) model at 4.5 km resolution. The first uses realistic boundary and initial conditions and present-day greenhouse gas forcings, while the second uses perturbed conditions and pre-industrial greenhouse gas forcings to simulate counterfactual storms without anthropogenic influences. These simulations quantify the fraction of Harvey's precipitation attributable to human activities and test the super-Clausius-Clapeyron scaling suggested by the observational analysis. We will further quantify the human influence on the intensity of Harvey, Irma, and Maria.
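    The "super Clausius-Clapeyron" claim can be made concrete with a back-of-the-envelope check, assuming the commonly cited ~7% per kelvin increase in saturation vapour pressure; the function name and the warming figure are illustrative, not taken from the study.

```python
def cc_scaling(delta_t_k, rate_per_k=0.07):
    """Fractional increase in saturation vapour pressure (and, to first
    order, in extreme rainfall) expected from Clausius-Clapeyron scaling
    of ~7% per kelvin, compounded over delta_t_k kelvin of warming."""
    return (1.0 + rate_per_k) ** delta_t_k - 1.0

# For roughly 1 K of warming, CC scaling alone predicts ~7% more rainfall.
# The abstract's observational lower bound of a 19% increase for Harvey is
# why the authors describe the change as exceeding ("super") CC scaling.
expected = cc_scaling(1.0)
observed_lower_bound = 0.19
print(observed_lower_bound > expected)  # → True
```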

  9. Memoria di un’inguaribile combattente: Irma Adelman (In memory of an incurable fighter: Irma Adelman)

    Directory of Open Access Journals (Sweden)

    Giulia Zacchia

    2017-09-01

    The paper pays tribute to Irma Adelman, who passed away on February 24, 2017. Prof. Adelman is the only woman economist whose autobiography has been published in the series of recollections and reflections on the professional experiences of distinguished economists that “Moneta e Credito” and the “BNL Quarterly Review” started in 1979. The obituary highlights her main research themes, her research tools, and the ideas that most affected the current approach to economic development. It also offers glimpses of her personality and of the courage and resilience that could inspire young women in academia. JEL codes: B31; O15; O20; A12

  10. Kinetic Consideration of AFP IRMA assay

    International Nuclear Information System (INIS)

    Aly, M. A.; Moustafa, K.A.

    2003-01-01

    Alpha-fetoprotein (AFP) is a glycoprotein produced by the yolk sac and later by the fetal liver during pregnancy. When the neural tube is not properly formed, large amounts of AFP pass into the amniotic fluid and reach the mother's blood. During pregnancy, the major interest in AFP determination in maternal serum and amniotic fluid is the early diagnosis of fetal abnormalities. AFP is also used as a tumor marker for hepatocellular carcinoma. There are many techniques for measuring AFP in blood, but the more accurate ones are immunoassays. The kinetics of the interaction between the AFP antigen and two matched antibodies, one labeled with the radioactive isotope 125I (tracer) and the other unlabelled and attached to a solid support (tube), are studied using the two-site (sandwich) immunoradiometric assay (IRMA) technique. We present a method for determining the rate constants using a computer program (RKY) based on the Nelder-Mead optimization principle. The rate constants at three temperatures and three different antigen concentrations, as well as the half-time of exchange (t1/2), were calculated
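    The RKY program itself is not described in detail; as a hedged illustration of the Nelder-Mead principle it is said to use, here is a minimal simplex minimiser fitted to synthetic pseudo-first-order binding data. All names, data and parameter values are invented for the example; the paper's actual kinetic model may differ.

```python
import math

def nelder_mead(f, x0, steps, tol=1e-12, max_iter=2000):
    """Minimal Nelder-Mead simplex minimiser (reflection, expansion,
    contraction, shrink) in pure Python. A sketch of the principle only."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                      # build the initial simplex
        p = list(x0)
        p[i] += steps[i]
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if f(worst) - f(best) < tol:
            break
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [2.0 * cen[i] - worst[i] for i in range(n)]      # reflect
        if f(refl) < f(best):
            expa = [3.0 * cen[i] - 2.0 * worst[i] for i in range(n)]
            simplex[-1] = expa if f(expa) < f(refl) else refl   # expand
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            cont = [0.5 * (cen[i] + worst[i]) for i in range(n)]
            if f(cont) < f(worst):
                simplex[-1] = cont                              # contract
            else:                                               # shrink
                simplex = [best] + [[0.5 * (p[i] + best[i]) for i in range(n)]
                                    for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# Synthetic bound-signal data for pseudo-first-order association
# B(t) = Bmax * (1 - exp(-k * t)); Bmax = 100, k = 0.5 are illustrative.
times = [0.5, 1.0, 2.0, 4.0, 8.0]
obs = [100.0 * (1.0 - math.exp(-0.5 * t)) for t in times]

def sse(p):
    """Sum of squared errors between the model and the observations."""
    bmax, k = p
    return sum((bmax * (1.0 - math.exp(-k * t)) - o) ** 2
               for t, o in zip(times, obs))

bmax_fit, k_fit = nelder_mead(sse, [80.0, 0.2], steps=[10.0, 0.1])
```

    With exact synthetic data the fit recovers the generating parameters; on real counts one would weight the residuals by measurement error.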

  11. TSH IRMA of dried blood spots

    International Nuclear Information System (INIS)

    Tojinda, N.; Pattanachak, C.; Chongchirasiri, S.; Pattanachak, S.; Putrasreni, N.; Pleehachinda, R.; Suwanik, R.

    1990-01-01

    TSH determination is most useful for screening for neonatal hypothyroidism in populations in iodine-deficient areas. The NETRIA IRMA method for serum TSH was adapted for blood-spot TSH. Cord blood on S&S No. 903 filter paper was left to dry overnight. One 6 mm diameter spot per tube was mixed with assay buffer, diluted labelled monoclonal anti-TSH, and diluted anti-TSH solid phase. The mixture was rotated for 22-24 hours, washed twice with wash buffer, and counted for 1 minute. A standard curve with 0, 5, 10, 25, 50, 100, and 150 mIU/L whole blood was obtained, with a maximum binding of 25%. The precision profile was satisfactory, and results were consistent whether spots were stored at room temperature, 4°C or -20°C. The correlation between serum and blood-spot TSH values (n=120) gave r = 0.9541, with regression y = 1.6123(BS-TSH) + 1.382. The mean normal cord blood-spot TSH (n=142) was 5.27 mIU/L. The technique was found to be precise, sensitive and easy to perform. Mass screening with this method is underway
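    Applying the reported serum/blood-spot calibration (y = 1.6123·BS-TSH + 1.382, r = 0.9541) is a one-line conversion; the helper name below is invented for illustration and the regression should only be trusted within the assay's working range.

```python
def serum_equiv_tsh(blood_spot_tsh, slope=1.6123, intercept=1.382):
    """Estimate a serum-equivalent TSH (mIU/L) from a blood-spot value
    using the regression reported in the abstract:
    serum = 1.6123 * (BS-TSH) + 1.382 (r = 0.9541, n = 120)."""
    return slope * blood_spot_tsh + intercept

# The reported mean normal cord blood-spot TSH of 5.27 mIU/L maps to
# roughly 9.88 mIU/L serum-equivalent under this calibration.
print(round(serum_equiv_tsh(5.27), 2))  # → 9.88
```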

  12. Emergency Response Imagery Related to Hurricanes Harvey, Irma, and Maria

    Science.gov (United States)

    Worthem, A. V.; Madore, B.; Imahori, G.; Woolard, J.; Sellars, J.; Halbach, A.; Helmricks, D.; Quarrick, J.

    2017-12-01

    NOAA's National Geodetic Survey (NGS) Remote Sensing Division acquired and rapidly disseminated emergency response imagery related to the three recent hurricanes Harvey, Irma, and Maria. Aerial imagery was collected with a Trimble Digital Sensor System, a high-resolution digital camera, from NOAA's King Air 350ER and DeHavilland Twin Otter (DHC-6) aircraft. The emergency response images are used to assess the before and after effects of the hurricanes' damage. The imagery aids emergency responders, such as FEMA, the Coast Guard, and state and local governments, in developing recovery strategies by prioritizing the areas most affected and distributing appropriate resources. Collected imagery also provides damage assessment for use in long-term recovery and rebuilding efforts. Additionally, the imagery allows evacuated persons to see images of their homes and neighborhoods remotely. The individual images are ortho-rectified and merged into a uniform mosaic. These remotely sensed datasets are publicly available and often used by web-based map servers as well as federal, state, and local government agencies. This poster will show the imagery collected for these three hurricanes and the processes involved in getting the data quickly into the hands of those who need it most.

  13. Raising of antiserum and development of IRMA for serum ferritin

    International Nuclear Information System (INIS)

    Abdalla, Omer Mohamed; Ali, Nagi Ibrahim; Elbagir, Nabila Musa

    1998-02-01

    Antiserum to human liver ferritin was raised by immunizing sheep with purified human liver ferritin and was purified using ammonium sulphate. Part of it was chemically coupled to magnetisable particles, while the rest was physically adsorbed onto polystyrene beads, in order to develop two IRMAs. The purified anti-ferritin antibody was diluted 200,000-fold before being coated onto polystyrene beads or coupled to magnetisable particles. Assay validation, sensitivity and accuracy tests were performed for the two IRMAs. The polystyrene-bead IRMA system showed better performance than the magnetisable-particle system: the minimum detectable dose was 0.6 ng/ml in the bead system versus 6.0 ng/ml in the magnetisable one. In the bead system, the mean recovery of ferritin was 98.5%, and linearity tests showed a correlation coefficient of 0.996. Comparison of our coated-bead IRMA with NETRIA's serum ferritin IRMA showed a correlation coefficient of 0.982. (Author)

  14. Tracking the Aftermath of Irma in Antigua and Barbuda

    Science.gov (United States)

    Friedman, E.; Look, C.

    2017-12-01

    The twin-island nation of Antigua and Barbuda was among the first places heavily impacted by Hurricane Irma. The powerful imagery of destruction and abandonment stood as a warning for many in the U.S. Virgin Islands, Puerto Rico, and the coastal United States. This paper presents findings on how resilience in the aftermath of Irma's destruction has functioned to separate those deemed sustainable from those who are not. The two sister islands experienced completely different outcomes: Antigua was relatively `untouched', while approximately 90% of Barbuda was destroyed, presenting a contradictory identity of the twin-island nation as both a victim of climate change and a land of economic opportunity, "open for business". This contradiction is unpacked through analysis of language from formal practitioner interviews, informal unstructured discussions, local climate-risk reduction policies, local newspapers, and social media.

  15. Impact of Hurricane Irma in the post-recovery of Matthew in South Carolina, the South Atlantic Bight (Western Atlantic)

    Science.gov (United States)

    Harris, M. S.; Levine, N. S.; Jaume, S. C.; Hendricks, J. K.; Rubin, N. D.; Hernandez, J. L.

    2017-12-01

    The impacts of Hurricane Irma in September 2017 on the Southeastern United States (SEUS, Western Atlantic) were felt primarily on the active coastline, with the third highest inland storm surge in Charleston and Savannah since the 19th century. Coastal geometry, waves, and wind duration strongly influenced the storm surge and coastal erosion impacts regionally; to the north and immediate south, impacts were much smaller. A full year after the 2016 hurricane season (Hurricane Matthew), the lack of regional recovery reduced protection against Irma. The most devastating impacts of Irma in the South Atlantic Bight (SAB) occurred 300 to 500 km from the eye, on the opposite side of the Florida peninsula. As Irma devastated the Caribbean, winds off the SAB started to increase in the early morning of September 8, continuing for the next 3 days and blowing directly towards the SC and GA coasts. Tide gauges started to respond on the night of September 8, while waves had started arriving in the SEUS around September 6. Coastal erosion pre- and post-Irma has been calculated for central SC using vertical and oblique aerial photos. Citizen science initiatives through the Charleston Resilience Network provided on-the-ground data during the storm, when transportation infrastructure was closed, and allow for post-storm ground-truthing of surge and impacts. This information was collected through Facebook, Google, and other social media. Pictures with timestamps and water heights were collected and are validating inundation flood maps generated for the Charleston, SC region. The maps have 1-m horizontal and 7- to 15-cm vertical accuracy. Inundation surfaces were generated from MHHW up to the maximum surge in 6-inch increments. The flood extents of the modeled surge and the photographic evidence show high correspondence. Storm surge measurements from RTK-GPS provide regional coverage of surge elevations from the coast inland and allow for testing of modeled results and model tuning. With Hurricane Irma

  16. Magnetic particle separation technique: a reliable and simple tool for RIA/IRMA and quantitative PCR assay

    International Nuclear Information System (INIS)

    Shen Rongsen; Shen Decun

    1998-01-01

    Five types of magnetic particles, without functional groups or with aldehyde, amino or carboxyl functional groups respectively, were used to immobilize first or second antibodies by three modes, i.e. physical adsorption, chemical coupling and immuno-affinity, forming four types of magnetic-particle antibodies. The second antibody immobilized on polyacrolein magnetic particles through aldehyde groups, and the first antibodies immobilized on carboxylic polystyrene magnetic particles through carboxyl groups, are recommended for RIAs and/or IRMAs. Streptavidin immobilized on commercial magnetic particles through amino groups was successfully applied to separating a specific PCR product for quantification of human cytomegalovirus. The paper reports typical data on the reliability of these magnetic-particle ligands and discusses the simplicity of the magnetic particle separation technique. The results show that the technique is a reliable and simple tool for RIA/IRMA and quantitative PCR assays. (author)

  17. Proposal for the development of IRMA kits for prostate specific antigen, PSA

    International Nuclear Information System (INIS)

    Abdul, A.B.

    1997-01-01

    The following are the major objectives of this research proposal: (1) To establish a protocol for biotinylation of monoclonal antibodies (mabs) or polyclonal antibodies against the antigen PSA, including a purification procedure using size-exclusion chromatography on HPLC, for use in binding assays to determine binding capacity with PSA. (2) To establish an immunoassay protocol for IRMAs using capture mabs immobilized on a solid phase (polystyrene surfaces) and a radioiodine-labelled streptavidin-biotin bridge system. This will include optimization of the assay design and a quality control assessment, with standards obtained from the Agency, and subsequent work to determine the assay's sensitivity (minimum detection limit) and working range (including the high-dose hook effect). An in-house quality control programme would also be used to determine the assay's suitability for screening the tumour marker in patient samples obtained from neighbouring hospitals (such as the Science University of Malaysia Hospital and the National University Hospital) and a private clinical pathology laboratory (such as the Pantai Medical Centre), comparing the results concurrently with existing commercial immunoassay kits (RIA/IRMA). This work, and that described in (1), shall be done entirely at MINT. (3) To run a coordinated external quality assurance programme with other research institutes in Malaysia (such as the Department of Immunology, Medical Faculty, Science University of Malaysia, and government hospitals) on several batches of the IRMA kits produced at MINT and shown, from the in-house quality control data mentioned in (2), to be suitable for screening PSA in human serum. This coordinated work shall include analyzing and documenting all values obtained from groups of patient samples under clinical conditions, such as batch-to-batch variation, inter- and intra-assay variations and mean values for negative

  18. The Irma-sponge Program: Methodologies For Sustainable Flood Risk Management Along The Rhine and Meuse Rivers

    Science.gov (United States)

    Hooijer, A.; van Os, A. G.

    Recent flood events and socio-economic developments have increased awareness of the need for improved flood risk management along the Rhine and Meuse Rivers. In response, the IRMA-SPONGE program incorporated 13 research projects in which over 30 organisations from all 6 river basin countries co-operated. The program is financed partly by the European INTERREG Rhine-Meuse Activities (IRMA). The main aim of IRMA-SPONGE is defined as: "The development of methodologies and tools to assess the impact of flood risk reduction measures and of land-use and climate change scenarios. This to support the spatial planning process in establishing alternative strategies for an optimal realisation of the hydraulic, economical and ecological functions of the Rhine and Meuse River Basins." Further important objectives are to promote transboundary co-operation in flood risk management by both scientific and management organisations, and to promote public participation in flood management issues. The projects in the program are grouped in three clusters, looking at measures from different scientific angles. The results of the projects in each cluster have been evaluated to define recommendations for flood risk management; some of these outcomes call for a change to current practices, e.g.: 1. (Flood Risk and Hydrology cluster): hydrological changes due to climate change exceed those due to further land-use change, and are significant enough to necessitate a change in flood risk management strategies if the currently claimed protection levels are to be sustained. 2. (Flood Protection and Ecology cluster): to not only provide flood protection but also enhance the ecological quality of rivers and floodplains, new flood risk management concepts ought to integrate ecological knowledge from start to finish, with a clear perspective on the type of nature desired and the spatial and time scales considered. 3. (Flood Risk Management and Spatial Planning cluster): extreme

  19. Clinical application and evaluation of TSH(IRMA) determination

    International Nuclear Information System (INIS)

    Yuan Jimin; Zhu Cuiying; Cui Wenru

    1993-01-01

    The serum TSH level of 303 healthy persons ranged from 0.3-5.6 mU/l; TSH was also determined in 205 cases of hyperthyroidism, 56 cases of early-stage hyperthyroidism and 67 cases of subclinical hyperthyroidism. As far as sensitivity for the diagnosis and prognostic monitoring of thyrotoxicosis is concerned, TSH (IRMA) results (64 cases) correlated highly with the TSH stimulation test, so the use of the latter test can be greatly reduced. The establishment and application of TSH (IRMA) also brings a strategic change to the diagnostic procedure for thyroid function testing

  20. Using High-Resolution Imagery to Characterize Disturbance from Hurricane Irma in South Florida Wetlands

    Science.gov (United States)

    Lagomasino, D.; Cook, B.; Fatoyinbo, T.; Morton, D. C.; Montesano, P.; Neigh, C. S. R.; Wooten, M.; Gaiser, E.; Troxler, T.

    2017-12-01

    Hurricane Irma, one of the strongest hurricanes recorded in the Atlantic, first made landfall in the Florida Keys before coming ashore in southwestern Florida near Everglades National Park (ENP) on September 9 and 10, 2017. Strong winds and storm surge impacted a 100+ km stretch of the southern Florida Gulf Coast, resulting in extensive damage to coastal and inland ecosystems. Impacts from previous catastrophic storms in the region have led to irreversible changes in vegetation communities and, in some areas, ecosystem collapse. The processes that drive coastal wetland vulnerability and resilience are largely a function of the severity of the impact on forest structure and ground elevation. Remotely sensed imagery plays an important role in measuring changes to the landscape, particularly for extensive and inaccessible regions like the mangroves in ENP. We estimated changes in coastal vegetation structure and soil elevation using a combination of repeat measurements from ground, airborne, and satellite platforms. At the ground level, we used before-and-after Structure-from-Motion models to capture the change in below-canopy structure resulting from stem breakage and fallen branches. Using airborne imagery collected before and after Hurricane Irma by Goddard's Lidar, Hyperspectral, and Thermal (G-LiHT) Airborne Imager, we measured the change in forest structure and soil elevation. This unique data acquisition covered over 130,000 ha in the regions most heavily impacted by storm surge. Lastly, we combined commercial and NASA satellite Earth observations to measure forest structural changes across the entire South Florida coast. An analysis of long-term observations from the Landsat data archive highlights the heterogeneity of hurricane and other environmental disturbances along the Florida coast. These findings capture coastal disturbance legacies that have the potential to influence the trajectory of mangrove resilience and vulnerability

  1. NASA Earth Science Disasters Program Response Activities During Hurricanes Harvey, Irma, and Maria in 2017

    Science.gov (United States)

    Bell, J. R.; Schultz, L. A.; Molthan, A.; Kirschbaum, D.; Roman, M.; Yun, S. H.; Meyer, F. J.; Hogenson, K.; Gens, R.; Goodman, H. M.; Owen, S. E.; Lou, Y.; Amini, R.; Glasscoe, M. T.; Brentzel, K. W.; Stefanov, W. L.; Green, D. S.; Murray, J. J.; Seepersad, J.; Struve, J. C.; Thompson, V.

    2017-12-01

    The 2017 Atlantic hurricane season included a series of storms that impacted the United States and the Caribbean, breaking a 12-year drought of landfalls in the mainland United States (Harvey and Irma), with additional impacts from the combination of Irma and Maria felt in the Caribbean. These storms caused widespread devastation, resulting in a significant need to support federal partners in response to these destructive weather events. The NASA Earth Science Disasters Program provided support to federal partners including the Federal Emergency Management Agency (FEMA) and the National Guard Bureau (NGB) by leveraging remote sensing and other expertise through NASA Centers and partners in academia throughout the country. The NASA Earth Science Disasters Program leveraged NASA mission products from the GPM mission to monitor cyclone intensity, assist with cyclone center tracking, and quantify precipitation. Multispectral imagery from the NASA-NOAA Suomi-NPP mission and the VIIRS Day-Night Band proved useful for monitoring power outages and recovery. Synthetic Aperture Radar (SAR) data from the Copernicus Sentinel-1 satellites operated by the European Space Agency were used to create flood inundation and damage assessment maps that were useful for damage density mapping. Using additional datasets made available through the USGS Hazards Data Distribution System and the activation of the International Charter: Space and Major Disasters, the NASA Earth Science Disasters Program created additional flood products from optical and radar remote sensing platforms, along with PI-led efforts to derive products from other international partner assets such as the COSMO-SkyMed system. Given the significant flooding impacts from Harvey in the Houston area, NASA provided airborne L-band SAR collections from the UAVSAR system, which captured the daily evolution of record flooding, helping to guide response and mitigation decisions for critical infrastructure and public safety.

  2. Significant Wave Height under Hurricane Irma derived from SAR Sentinel-1 Data

    Science.gov (United States)

    Lehner, S.; Pleskachevsky, A.; Soloviev, A.; Fujimura, A.

    2017-12-01

    while making landfall on Cuba and the Florida Keys, where Irma still hit as a category 3 to 4 hurricane. Results are compared to the WW3 model, which previously could not be validated over an area with such strong and variable wind conditions. A new theory of hurricane intensification based on Kelvin-Helmholtz instability is discussed, and a first comparison with the SAR data is given.

  3. Impact of Hurricane Irma on Little Ambergris Cay, Turks and Caicos

    Science.gov (United States)

    Stein, N.; Grotzinger, J. P.; Hayden, A.; Quinn, D. P.; Trower, L.; Lingappa, U.; Present, T. M.; Gomes, M.; Orzechowski, E. A.; Fischer, W. W.

    2017-12-01

    Little Ambergris Cay (21.3° N, 71.7° W) is a 6 km long, 1.6 km wide island on the Caicos platform. The island was the focus of mapping campaigns in July 2016, August 2017, and following Hurricane Irma in September 2017. The cay is lined with lithified upper shoreface and eolian ooid grainstone forming a 1-4 m high bedrock rim that is locally breached, allowing tides to inundate an interior basin lined with extensive microbial mats. The island was mapped in July 2016 using UAV- and satellite-based images and in situ measurements. Sedimentologic facies and biofacies were mapped onto a 15 cm/pixel visible light orthomosaic of the cay made from more than 1500 UAV images, and a corresponding stereogrammetric digital elevation model (DEM) was used to track how microbial mat texture varies in response to water depth. An identical UAV-based visible light map of the island was made in August 2017. On September 7th, 2017, the eye of Hurricane Irma directly crossed Little Ambergris Cay with sustained winds exceeding 170 MPH. The island was remapped with a UAV on September 24th, yielding a 5 cm/pixel UAV-based visible light orthomosaic and a corresponding DEM. In situ observations and comparison with previous UAV maps show that Irma caused significant channel and bedrock erosion, scouring and removal of broad tracts of microbial mats, and blanketing of large portions of the interior basin by ooid sediment, including smothering of mats under up to 1 m of sediment. The southern rim of the cay was overtopped by water and sediment, indicating a storm surge of at least 3 m. Blocks of rock more than 1 m in length and 50 cm thick were separated from bedrock on the north side of the island and washed higher to form imbricated boulder deposits. Hundreds of 5-30 cm diameter imbricated rip-up intraclasts of rounded microbial mat now line exposed bedrock in the interior basin. Fresh ooid sediment and microbial mats were sampled from three sites: on desiccated mats 50 cm above tide level, on

  4. Optimization Of Preparation And Validation Of Hepatitis C Irma Kit

    International Nuclear Information System (INIS)

    Ariyanto, Agus; Wayan, R.S.; Sukiyati, Dj.; Darwati, Siti; Yunita, Fitri; Mondrida, Gina; Sulaiman; Yulianti, Veronika; Setiowati, Sri

    2000-01-01

    Optimization of the preparation and validation of a hepatitis C IRMA kit has been performed. Tracer was prepared by iodination of anti-hIgG with 125I using Chloramine-T and N-bromosuccinimide as oxidizing agents. Iodinated anti-hIgG was purified using PD-10 and Sephadex G-50 columns. Coated beads were prepared by immobilization of recombinant HCV antigen on polystyrene beads. To obtain a good-quality reagent, several parameters were optimized, including the column used for purification. To determine the validity of the kit, validation including a comparison study and determination of specificity and sensitivity was performed. A good-quality tracer was prepared with high yield (74%) and a high P/N value (15). The quality of the coated beads was also reasonably good (P/N 34.3), with a good density and dissociation index (0.335 and 0.99%, respectively). Both tracer and coated beads remained stable after 2 months of storage at 4 °C. In comparison with an ELISA kit, the specificity and sensitivity of the HCV IRMA kit were reasonably high, 88.6% and 92.3% respectively

  5. Mapping Daily and Maximum Flood Extents at 90-m Resolution During Hurricanes Harvey and Irma Using Passive Microwave Remote Sensing

    Science.gov (United States)

    Galantowicz, J. F.; Picton, J.; Root, B.

    2017-12-01

    Passive microwave remote sensing can provide a distinct perspective on flood events by virtue of wide sensor fields of view, frequent observations from multiple satellites, and sensitivity through clouds and vegetation. During Hurricanes Harvey and Irma, we used AMSR2 (Advanced Microwave Scanning Radiometer 2, JAXA) data to map flood extents starting from the first post-storm rain-free sensor passes. Our standard flood mapping algorithm (FloodScan) derives flooded fraction from 22-km microwave data (AMSR2 or NASA's GMI) in near real time and downscales it to 90-m resolution using a database built from topography, hydrology, and Global Surface Water Explorer data and normalized to microwave data footprint shapes. During Harvey and Irma we tested experimental versions of the algorithm designed to rapidly map the maximum post-storm flood extent, and made a variety of map products available immediately for use in storm monitoring and response. The maps have several unique features, including spanning the entire storm-affected area and providing multiple post-storm updates as flood water shifted and receded. From the daily maps we derived secondary products such as flood duration, maximum flood extent (Figure 1), and flood depth. In this presentation, we describe flood extent evolution, maximum extent, and local details as detected by the FloodScan algorithm in the wake of Harvey and Irma. We compare FloodScan results to other available flood mapping resources, note observed shortcomings, and describe improvements made in response. We also discuss how best-estimate maps could be updated in near real time by merging FloodScan products with data from other remote sensing systems and hydrological models.
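The footprint-to-grid downscaling step described above can be sketched in miniature: rank the fine cells inside a coarse footprint by a flood-susceptibility score and flag the most susceptible cells as flooded until the footprint's flooded fraction is met. This is an illustrative ranking scheme, not the actual FloodScan implementation; the function name, scores, and footprint size are all hypothetical.

```python
# Illustrative downscaling of a coarse-footprint flooded fraction to fine
# grid cells. Cells are ranked by a precomputed susceptibility score (in
# FloodScan this comes from topography/hydrology databases); the most
# susceptible cells are flagged as flooded until the footprint's flooded
# fraction is reached.

def downscale_flood(flooded_fraction, susceptibility):
    """Return one flooded/dry boolean per fine cell.

    flooded_fraction -- fraction of the coarse footprint that is flooded (0..1)
    susceptibility   -- one score per fine cell (higher = floods first)
    """
    n_flooded = round(flooded_fraction * len(susceptibility))
    # Cell indices sorted from most to least susceptible.
    order = sorted(range(len(susceptibility)),
                   key=lambda i: susceptibility[i], reverse=True)
    flooded = [False] * len(susceptibility)
    for i in order[:n_flooded]:
        flooded[i] = True
    return flooded

# Example: a footprint that is 40% flooded, downscaled over 10 fine cells.
scores = [0.9, 0.1, 0.8, 0.3, 0.7, 0.2, 0.6, 0.4, 0.5, 0.0]
mask = downscale_flood(0.4, scores)
```

The four highest-scoring cells are marked flooded, preserving the coarse flooded fraction while concentrating water where the ancillary data say it should pool.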

  6. Using the integrated rural mobility and access (IRMA) approach in prospering rural South Africa

    CSIR Research Space (South Africa)

    Chakwizira, J

    2008-11-01

    Full Text Available The settlement implications of current rural development approaches are outlined. The potential and impact of the integrated rural mobility and access approach (IRMA) in unlocking socio-economic and spatial livelihood opportunities are discussed. In this regard...

  7. Detection of HBsAg and Anti HBc on donors of a blood bank by IRMA and ELISA methods

    International Nuclear Information System (INIS)

    Freire Martinez, D.Y.

    1985-10-01

    A comparative evaluation of two methods, immunoradiometric assay (IRMA) and enzyme immunoassay (ELISA), for detecting HBsAg and anti-HBc was made to determine which is more advantageous and reliable. The study was conducted on 300 donors of the Hospital San Juan de Dios Blood Bank. In comparison with the reference method (IRMA), ELISA showed 91.67% sensitivity. Anti-HBc detection by IRMA is more reliable than HBsAg detection by either IRMA or ELISA for determining carrier state

  8. Development of IRMA reagent and methodology for PSA

    International Nuclear Information System (INIS)

    Najafi, R.

    1997-01-01

    The PSA test is a solid-phase two-site immunoassay. Rabbit anti-PSA is coated or bound on the surface of the solid phase, and monoclonal anti-PSA is labeled with I-125. The PSA molecules present in the standard solution or serum are 'sandwiched' between the two antibodies. After formation of the coated antibody-antigen-labeled antibody complex, the unbound labeled antibody is removed by washing. The complex is measured in a gamma counter; the concentration of analyte is proportional to the counts of the test sample. To develop an IRMA PSA kit, three essential reagents must be prepared: antibody-coated solid phase, labeled antibody, and standards. These are then optimized to obtain a standard curve suitable for measuring specimen PSA over the desired concentration range. The type of solid phase, and the procedure used to coat or bind antibody to it, is still the main subject of debate in the development of RIA/IRMA kits. In our experiments, polystyrene beads, because they are easy to coat with antibody as well as easy to use, can be considered a suitable solid phase. Most antibodies are passively adsorbed to a plastic surface (e.g. polystyrene, polypropylene, and polyvinyl chloride) from a dilute buffer; the antibody-coated plastic surface then acts as the solid-phase reagent. Poor efficiency, the time required to reach equilibrium, and lack of reproducibility, especially batch-to-batch variation between materials, are disadvantages of this simple coating procedure. Improvements can be made by coating a second antibody on the bead surface and allowing reaction between the second and primary antibodies. It is also possible to further enhance the coating efficiency of the beads by using Staphylococcus aureus Protein A. Protein A is a major component of the Staphylococcus aureus cell wall which has an affinity for the Fc segment of immunoglobulin G (IgG) of some species, including human, rabbit, and mouse. This property of staphylococcal Protein A has made it a very useful tool in the purification of classes and subclasses
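The sandwich-assay dose response described above (bound counts rising with analyte concentration) is typically read off a standard curve; a four-parameter logistic (4PL) is one common calibration model for such curves. The sketch below is illustrative only: the function names and parameter values are invented, not taken from the kit in this record.

```python
# Illustrative standard-curve calibration for a two-site (sandwich) IRMA.
# 4PL response: y = d + (a - d) / (1 + (x/c)^b), where for a sandwich assay
# a = counts at zero dose, d = plateau counts, c = mid-curve concentration,
# b = slope factor. All parameter values below are hypothetical.

def fourpl(x, a, b, c, d):
    """Predicted counts at concentration x from a fitted 4PL curve."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def concentration(counts, a, b, c, d):
    """Invert the 4PL to read a sample concentration off the standard curve."""
    return c * ((a - d) / (counts - d) - 1.0) ** (1.0 / b)

# Round-trip check with made-up calibration parameters:
a, b, c, d = 500.0, 1.2, 10.0, 40000.0   # hypothetical fit to standards
y = fourpl(4.0, a, b, c, d)              # counts for a 4 ng/mL standard
x = concentration(y, a, b, c, d)         # recovered concentration, ~4.0
```

In practice the four parameters would be fitted to the counts measured for the kit's standards, and patient sera would then be interpolated with the inverse function.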

  9. Sedimentary and Vegetative Impacts of Hurricane Irma to Coastal Wetland Ecosystems across Southwest Florida

    Science.gov (United States)

    Moyer, R. P.; Khan, N.; Radabaugh, K.; Engelhart, S. E.; Smoak, J. M.; Horton, B.; Rosenheim, B. E.; Kemp, A.; Chappel, A. R.; Schafer, C.; Jacobs, J. A.; Dontis, E. E.; Lynch, J.; Joyse, K.; Walker, J. S.; Halavik, B. T.; Bownik, M.

    2017-12-01

    Since 2014, our collaborative group has been working in coastal marshes and mangroves across Southwest Florida, including Tampa Bay, Charlotte Harbor, Ten Thousand Islands, Biscayne Bay, and the lower Florida Keys. All existing field sites were located within 50 km of Hurricane Irma's eye path, with a few sites in the Lower Florida Keys and Naples/Ten Thousand Islands region suffering direct eyewall hits. As a result, we have been conducting storm-impact and damage assessments at these locations with the primary goal of understanding how major hurricanes contribute to and/or modify the sedimentary record of mangroves and salt marshes. We have also assessed changes to the vegetative structure of the mangrove forests at each site. Preliminary findings indicate a reduction in mangrove canopy cover from 70-90% pre-storm to 30-50% post-Irma, and a reduction in tree height of approximately 1.2 m. Sedimentary deposits consisting of fine carbonate mud up to 12 cm thick were imported into the mangroves of the lower Florida Keys, Biscayne Bay, and the Ten Thousand Islands. Import of siliciclastic mud up to 5 cm thick was observed in Charlotte Harbor. In addition to fine mud, all sites had imported tidal wrack consisting of mixed seagrass and mangrove leaf litter, with some deposits as thick as 6 cm. In areas with newly opened canopy, a microbial layer was coating the surface of the imported wrack layer. Overwash and shoreline erosion were also documented at two sites in the lower Keys and Biscayne Bay, and will be monitored for change and recovery over the next few years. Because active research was being conducted, a wealth of pre-storm data exists; thus these locations are uniquely positioned to quantify hurricane impacts to the sedimentary record and standing biomass across a wide geographic area.
Due to changes in intensity along the storm path, direct comparisons of damage metrics can be made to environmental setting, wind speed, storm surge, and distance to eyewall.

  10. Measurement of some tumor markers by IRMA in vietnam

    International Nuclear Information System (INIS)

    Tran Xuan Truong

    2004-01-01

    A perfect tumor marker could be used in several different ways: for population screening, for diagnosis, for monitoring therapy, and for follow-up to detect early evidence of cancer recurrence. To achieve perfect status, a tumor marker would require total negativity in healthy subjects, total positivity for a single tumor type, and close correlation between plasma tumor marker concentration and tumor size. The advent of monoclonal antibodies has had a dramatic impact in oncology, where new tumor markers have been discovered and assay methods for all tumor markers have been improved commercially. The analytical performance of these new methods is potentially as good as that of the best immunoradiometric assays for other analytes. In Vietnam, for the first time, we used immunoradiometric assay (IRMA) for the measurement of some tumor markers in normal subjects and cancer patients: thyroglobulin (TG) for thyroid cancer, cancer antigen 15-3 (CA15-3) for breast cancer, and cancer antigen 72-4 (CA72-4) for stomach cancer. We applied CA72-4 to the indication of stomach cancer, CA15-3 to the differential diagnosis of breast cancer, and TG to the differential diagnosis of thyroid cancer. All of these tumor markers were also used in clinical follow-up and in the early detection of recurrent and metastatic cancer. Further research on them is warranted. (authors)

  11. Historia, memoria y impunidad: el caso de Irma Flaquer

    Directory of Open Access Journals (Sweden)

    June Carolyn Erlick

    2005-12-01

    Full Text Available In Guatemala, perhaps more than in any other country, truth commissions have emphasized testimonial narratives as documentation of past abuses. This documentation, however, has kept its focus on the victims and the crimes committed against them. Recovering the lives of victims through narrative offers another way of restoring memory and transforming it into history. The life and work of the courageous Guatemalan journalist Irma Flaquer was documented by the American Press Association project "Unpunished Crimes against Journalists." As a result, under the auspices of the Inter-American Commission on Human Rights, the government of Guatemala admitted its responsibility in the journalist's disappearance and reopened the case. Thus, the reconstruction of memory through narrative techniques resulted not only in the reconstruction of history, but in changing it.

  12. Withstanding trauma: the significance of Emma Eckstein's circumcision to Freud's Irma dream.

    Science.gov (United States)

    Bonomi, Carlo

    2013-07-01

    The author considers the medical rationale for Wilhelm Fliess's operation on Emma Eckstein's nose in February 1895 and interprets the possible role that this played in Freud's dream of Irma's injection five months later. The author's main argument is that Emma likely endured female castration as a child and that she therefore experienced the surgery to her nose in 1895 as a retraumatization of her childhood trauma. The author further argues that Freud's unconscious identification with Emma, which broke through in his dream of Irma's injection with resistances and apotropaic defenses, served to accentuate his own "masculine protest". The understanding brought to light by the present interpretation of Freud's Irma dream, when coupled with our previous knowledge of Freud, allows us to better grasp the unconscious logic and origins of psychoanalysis itself. © 2013 The Psychoanalytic Quarterly, Inc.

  13. High Temporal Resolution Tropospheric Wind Profile Observations at NASA Kennedy Space Center During Hurricane Irma

    Science.gov (United States)

    Decker, Ryan K.; Barbre, Robert E., Jr.; Huddleston, Lisa; Brauer, Thomas; Wilfong, Timothy

    2018-01-01

    The NASA Kennedy Space Center (KSC) operates a 48-MHz Tropospheric/Stratospheric Doppler Radar Wind Profiler (TDRWP) on a continual basis, generating wind profiles between 2 and 19 km in support of space launch vehicle operations. A benefit of the continual operability of the system is the ability to provide unique observations of severe weather events such as hurricanes. Over the past two Atlantic hurricane seasons, the TDRWP has made high temporal resolution wind profile observations of Hurricane Matthew in 2016 and Hurricane Irma in 2017. Hurricane Irma was responsible for power outages to approximately two-thirds of Florida's population during its movement over the state (Stein, 2017). An overview of the TDRWP system configuration, a brief summary of the Hurricane Irma and Matthew storm tracks in proximity to KSC, characteristics of the tropospheric wind observations from the TDRWP during both events, and a discussion of the dissemination of TDRWP data during the events will be presented.

  14. Microphysical Structures of Hurricane Irma Observed by Polarimetric Radar

    Science.gov (United States)

    Didlake, A. C.; Kumjian, M. R.

    2017-12-01

    This study examines dual-polarization radar observations of Hurricane Irma as its center passed near the WSR-88D radar in Puerto Rico, capturing much-needed microphysical information on a mature tropical cyclone. Twenty hours of observations continuously sampled the inner core precipitation features. These data were analyzed by annulus and azimuth, providing a bulk characterization of the primary eyewall, secondary eyewall, and rainbands as they varied around the storm. Polarimetric radar variables displayed distinct signatures of convective and stratiform precipitation in the primary eyewall and rainbands that were organized in a manner consistent with the expected kinematic asymmetry of a storm in weak environmental wind shear but with moderate low-level storm-relative flow. In the front quadrants of the primary eyewall, vertical profiles of differential reflectivity (ZDR) exhibit increasing values with decreasing height, consistent with convective precipitation processes. In particular, the front-right quadrant exhibits a signature in reflectivity (ZH) and ZDR indicating larger, sparser drops, which is consistent with a stronger updraft in this quadrant. In the rear quadrants, a sharply peaked ZDR maximum occurs within the melting layer, which is attributed to stratiform processes. In the rainbands, the convective-to-stratiform transition can be seen traveling from the front-right to the front-left quadrant. The front-right quadrant exhibits lower co-polar correlation coefficient (ρHV) values in the 3-8 km altitude layer, suggesting larger vertical spreading of various hydrometeors, which occurs in convective vertical motions. The front-left quadrant exhibits larger ρHV values, suggesting less diversity of hydrometeor shapes, consistent with stratiform processes. The secondary eyewall did not exhibit a clear signature of processes preferred in a specific quadrant, and a temporal analysis of the secondary eyewall revealed a complex evolution of its structure.
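The polarimetric variables discussed above have simple textbook definitions: ZDR is the logarithmic ratio of horizontal to vertical reflectivity, and ρHV is the correlation magnitude between co-polar complex returns. The sketch below applies those definitions to made-up numbers; it is not the WSR-88D processing chain.

```python
import math

# Textbook definitions of two polarimetric variables. Inputs are
# illustrative: z_h/z_v are linear-unit reflectivity factors and the
# complex "voltage" samples are invented, not real radar data.

def zdr_db(z_h, z_v):
    """Differential reflectivity, 10*log10(Zh/Zv), in dB.
    Positive ZDR suggests oblate (larger) raindrops."""
    return 10.0 * math.log10(z_h / z_v)

def rho_hv(v_h, v_v):
    """Co-polar correlation coefficient magnitude from paired complex samples.
    Values near 1 indicate uniform hydrometeor shapes (stratiform rain)."""
    num = abs(sum(h * v.conjugate() for h, v in zip(v_h, v_v)))
    den = math.sqrt(sum(abs(h) ** 2 for h in v_h) *
                    sum(abs(v) ** 2 for v in v_v))
    return num / den

# Horizontal return twice the vertical return gives ZDR of about 3 dB.
zdr = zdr_db(1000.0, 500.0)
```

Identical H and V sample series give ρHV of exactly 1; mixed hydrometeor populations decorrelate the channels and pull the value down, which is the signature the abstract reads in the 3-8 km layer.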

  15. Are recent hurricane (Harvey, Irma, Maria) disasters natural?

    Science.gov (United States)

    Trenberth, K. E.; Lijing, C.; Jacobs, P.; Abraham, J. P.

    2017-12-01

    Yes and no! Hurricanes are certainly natural, but human-caused climate change is supersizing them, and unbridled growth is exacerbating the risk of major damage. The addition of heat-trapping gases to the atmosphere has led to observed increases in upper ocean heat content (OHC). This human-caused increase in OHC supports higher sea surface temperatures (SSTs) and atmospheric moisture. These elevated temperatures and increased moisture availability fuel tropical storms, allowing them to grow larger, longer lasting, and more intense, with widespread heavy rainfall. Our preliminary analysis of OHC through August 2017 shows that not only was it by far the highest on record globally, but it was also the highest on record in the Gulf of Mexico prior to Hurricane Harvey. The human influence on the climate is also evident in rising sea levels, which increase the risks from storm surges. These climatic changes are taking place against a background of growing habitation along coasts, which further increases the risk storms pose to life and property. This combination of planning choices and climatic change illustrates the tragedy of global warming, as evidenced by Harvey in Houston, Irma in the Caribbean and Florida, and Maria in Puerto Rico. However, future damage and loss of life can be mitigated by stopping or slowing human-caused climate change and through proactive planning (e.g., better building codes, increased-capacity drainage systems, shelters, and evacuation plans). We discuss the climatic and planning contexts of the unnatural disasters of the 2017 Atlantic hurricane season, including novel indices of climate-hurricane influence.

  16. Reproductive hormones disorders of Sudanese females using immunoradiometric assay (IRMA)

    International Nuclear Information System (INIS)

    Ali, N. I.; Almahi, W. A. A.; Abdalla, O. M.; Bafarag, S. M. I.; Abdelgadir, O. M.; Eltayeb, M. A. H.; Hassan, A. M. E.; Hassan, A. M. E.

    2004-12-01

    In this study, fertility hormones were measured for 587 infertile Sudanese females referred from gynecological clinics. The ages of these females range from 16 to 50 years, divided into seven groups. Eighty-seven percent of them are in the age range between 21 and 40 years, which corresponds to the female fertile period, and 5.6% of them are under 20 years. The sensitive IRMA method was used for measuring hormone concentrations. The objective of this study was to find out the percentage of hormonal disorders and its relation to age in infertile Sudanese females. The age group (21-25) was the group most affected by polycystic ovary syndrome (PCOS), representing 5.1% of the total number of patients; the least affected was the age group (41-45), at 0.4%. LH and FSH in the age group (31-35) were found to be higher than in the other groups, representing 11.4% and 7.8% of the total number of patients, respectively. The lowest percentages of high LH and FSH levels were found in the most fertile age group (15-20): 1.7% and 1.0% of the studied patients, respectively. Those in the age range (26-30) with hyperprolactinaemia represented 10.4% of patients, while those in the age range (46-50) with hyperprolactinaemia represented the lowest percentage (1.2%). The percentage of patients having high LH and high FSH was 44.5% and 29.1% respectively, while hyperprolactinaemia among the infertile Sudanese females was found to be 38.2%. (Author)

  17. Amsterdam Book Design: Irma Boom, Hansje van Halem, Lesley Moore = Amsterdamskij knižnyj dizajn

    NARCIS (Netherlands)

    Lommen, M.

    2012-01-01

    Published on the occasion of the exhibition 'Amsterdam Book Design : Irma Boom, Hansje van Halem, Lesley Moore', from May 26th until June 17th 2012 in the creative platform Taiga Space, Saint Petersburg. In cooperation with Netherlands Institute Saint-Petersburg.

  18. Litterfall Production Prior to and during Hurricanes Irma and Maria in Four Puerto Rican Forests

    Directory of Open Access Journals (Sweden)

    Xianbin Liu

    2018-06-01

    Full Text Available Hurricanes Irma and Maria struck Puerto Rico on the 6th and 20th of September 2017, respectively. These two powerful Cat 5 hurricanes severely defoliated forest canopies and deposited massive amounts of litterfall in forests across the island. We established a 1-ha research plot in each of four forests (Guánica State Forest, Río Abajo State Forest, Guayama Research Area and Luquillo Experimental Forest) before September 2016, and had collected one full year of litterfall production data prior to the arrival of Hurricanes Irma and Maria. Hurricane-induced litterfall was collected within one week after Hurricane Irma, and within two weeks after Hurricane Maria. Each litterfall sample was sorted into leaves, wood (branches and bark), reproductive organs (flowers, fruits and seeds) and miscellaneous materials (mostly dead animal bodies or feces) after oven-drying to constant weight. Annual litterfall production prior to the arrival of Hurricanes Irma and Maria varied from 4.68 to 25.41 Mg/ha/year among the four forests, and annual litterfall consisted of 50–81% leaffall, 16–44% woodfall and 3–6% fallen reproductive organs. Hurricane Irma severely defoliated the Luquillo Experimental Forest, but had little effect on the other three forests, whereas Hurricane Maria defoliated all four forests. Total hurricane-induced litterfall from Hurricanes Irma and Maria amounted to 95–171% of the annual litterfall production, with leaffall and woodfall from the hurricanes amounting to 63–88% and 122–763% of their corresponding annual leaffall and woodfall, respectively. Hurricane-induced litterfall consisted of 30–45% leaves and 55–70% wood. Our data showed that Hurricanes Irma and Maria deposited a pulse of litter equivalent to or greater than the total annual litterfall input, with at least a doubled fraction of woody materials. This pulse of hurricane-induced debris and elevated proportion of woody material may trigger changes in

  19. The short-term effect on carbonate parameters from hurricanes Harvey, Irma, and Maria.

    Science.gov (United States)

    Jonsson, B. F.; Salisbury, J., II; Melendez Oyola, M.

    2017-12-01

    Tropical storms and hurricanes are events with potentially extreme impacts on ocean conditions. Strong winds generating vigorous vertical mixing and extensive precipitation affect both temperature and salinity in the mixed layer. The surface temperature, for example, decreased several degrees C in the wake of both Hurricanes Irma and Maria. While it is clear that the physical state of the surface ocean is affected by hurricanes, how such storms affect carbonate system variability is still an open question. Changes in temperature and salinity combined with extreme winds create the potential for changes in CO2 solubility and large net fluxes of CO2 across the air-sea interface. A deepening of the mixed layer from wind-driven mixing may further affect the carbonate system, as sub-surface waters rich in dissolved inorganic carbon and nutrients are entrained to the surface. To examine these processes, we evaluate simulated fields of temperature and salinity (from a 1/12° global data-assimilated general circulation model), satellite ocean color, and wind speed data within the context of a conceptual box model. Our model is compared to observed pCO2, wind speed, temperature, and salinity data from buoyed assets that survived the storms. We address total CO2 fluxes, the relative effects of temperature, salinity, and biology on the carbonate system, and the time scales over which the system is "restored" to its initial state. We explore the connection between the magnitude of the perturbation and the length of time it takes for the system to recover, and observe recovery over time scales lasting from days to weeks depending on the storm. Although not observed in these data, we speculate that, depending on the buoyancy frequency, recovery elsewhere could take place over monthly time scales, raising the potential that hurricanes could exacerbate or alleviate environmental stresses on calcifying marine organisms.
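The thermal part of the carbonate response described above can be illustrated with the widely used Takahashi temperature sensitivity of pCO2 (roughly 4.23% per °C at constant DIC and alkalinity). This sketch isolates only the cooling effect of a hurricane wake; the mixing and biological terms that the authors' box model also treats are omitted, and the numbers are illustrative.

```python
import math

# Thermal component of a surface-ocean carbonate box model: at constant
# DIC and alkalinity, pCO2 varies by about 4.23% per degree C (the
# commonly used Takahashi temperature coefficient).

TAKAHASHI_COEFF = 0.0423  # per degree C

def pco2_at_temperature(pco2_ref, t_ref, t_new):
    """pCO2 (uatm) adjusted from t_ref to t_new (deg C), thermal effect only."""
    return pco2_ref * math.exp(TAKAHASHI_COEFF * (t_new - t_ref))

# A 3 degree C wake cooling (comparable to that seen after Irma and Maria)
# lowers surface pCO2 by roughly 12% before mixing and biology act:
pco2_before = 410.0  # uatm, illustrative pre-storm value
pco2_after = pco2_at_temperature(pco2_before, 29.0, 26.0)
```

Because wind-driven entrainment simultaneously imports DIC-rich subsurface water, the observed pCO2 change is the net of this thermal drawdown and the opposing mixing term, which is why a box model is needed to separate them.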

  20. Evaluation of a modified IRMA for anti-D quantitation, using 3H protein A

    International Nuclear Information System (INIS)

    Dumasia, A.; Gupte, S.

    1993-01-01

    A modified immunoradiometric assay (IRMA) using tritiated (3H) protein A was developed to estimate anti-D concentration. The main advantages of the assay were the longer shelf life of the labelled reagent (more than two years), minimal radiation hazard, and low non-specific binding. Levels of anti-D were estimated in 23 Rh (D)-immunized women. A good correlation of anti-D concentration (μg/ml) with Rh antibody titre was observed (r = +0.89). Anti-D quantitated by the 3H protein A IRMA correlated well with the severity of Rh-HDN. This assay could quantitate anti-D in sera having exclusively the IgG3 subclass. (author). 20 refs., 2 figs., 2 tabs

  1. Coastal Sediment Distribution Patterns Following Category 5 Hurricanes (Irma and Maria): Pre and Post Hurricane High Resolution Multibeam Surveys of Eastern St. John, US Virgin Islands

    Science.gov (United States)

    Browning, T. N.; Sawyer, D. E.; Russell, P.

    2017-12-01

    In August of 2017 we collected high resolution multibeam data of the seafloor in a large embayment in eastern St. John, US Virgin Islands (USVI). One month later, the eyewall of Category 5 Hurricane Irma directly hit St. John as one of the largest hurricanes on record in the Atlantic Ocean. A week later, Category 5 Hurricane Maria passed over St. John. While the full extent of the impacts is still being assessed, the island experienced a severe loss of vegetation, infrastructure, buildings, roads, and boats. We mobilized less than two months afterward to conduct a repeat survey of the same area on St. John. We then compared these data to document and quantify the sediment influx and movement that occurred in coastal embayments as a result of Hurricanes Irma and Maria. Preliminary results indicate that the intense rain, wind, and storm surge likely produced an event deposit that can be mapped and volumetrically quantified in the bays of eastern St. John. The results of this study allow for a detailed understanding of the post-hurricane pulse of sediment that enters the marine environment, the seaward sediment flux, and the morphological changes to the bay floor.
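The volumetric quantification from repeat surveys described above amounts to differencing two co-registered bathymetry grids and summing deposition and erosion over the cell area. The toy grids, function name, and cell size below are invented for illustration; real multibeam change detection also requires careful co-registration and uncertainty thresholds.

```python
# Illustrative volumetric change estimate from repeat bathymetric surveys:
# difference two co-registered depth grids (meters, positive down) and
# accumulate deposition and erosion volumes over the cell area.

def sediment_volumes(depth_before, depth_after, cell_area_m2):
    """Return (deposition_m3, erosion_m3) between two depth grids."""
    deposition = 0.0
    erosion = 0.0
    for row_b, row_a in zip(depth_before, depth_after):
        for z_b, z_a in zip(row_b, row_a):
            dz = z_b - z_a          # shallower after the storm => deposition
            if dz > 0:
                deposition += dz * cell_area_m2
            else:
                erosion += -dz * cell_area_m2
    return deposition, erosion

# 2x2-cell toy grids at 1 m resolution: one cell shoals by 0.5 m
# (a storm deposit), one deepens by 0.2 m (scour).
before = [[10.0, 12.0], [11.0, 13.0]]
after = [[9.5, 12.0], [11.0, 13.2]]
dep, ero = sediment_volumes(before, after, cell_area_m2=1.0)
```

Summing the signed differences this way separates the event deposit volume from scoured volume, which is the quantity the repeat St. John surveys are designed to map.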

  2. Antibodies immobilized on magnetic particles for RIA and IRMA of thyroid related hormones

    International Nuclear Information System (INIS)

    Wayan, R.S.; Djayusman, D.S.

    1996-01-01

    In Indonesia, radioimmunoassay kits based on magnetic separation must be imported and are very expensive; local production of these kits would be economical. Different types of magnetic particles were used for immobilizing antibodies for RIA of T3 and T4 and for IRMA of TSH, including neonatal TSH screening. The particles studied include magnetic cellulose (SCIPAC, U.K.), magnetite (Hungary), silanized iron oxide (China) and Latex-M. Various parameters were studied in order to optimize the antibody immobilization procedures as well as the assays based on these immunoadsorbents, and the assays developed in-house were compared with commercial kits from Amersham, NETRIA and DPC. The work includes immobilization of second antibodies for RIA of T4 and immobilization of anti-TSH for IRMA of TSH. Among the magnetic particles studied, magnetite and silanized iron oxide were found to be satisfactory on account of their simplicity of immobilization, high binding capacity and low non-specific binding. Good assay performance for RIA of T3 and T4 was obtained using magnetic particles with immobilized second antibodies; however, the quality of the first antibody was found to play an important role in the sensitivity and precision of the assay. Good correlation was obtained with the Amersham kit (y = 1.06x - 0.12, r = 0.987). Assay performance of the TSH IRMA using in-house anti-TSH-coated magnetic particles was also found to be comparable with the Amersham, NETRIA and DPC kits. (author). 4 refs, 6 figs, 1 tab

  3. The development of IRMA method of hepatocarcinoma ferritin and its preliminary clinical application

    International Nuclear Information System (INIS)

    Ying Jilin; Deng Jinglan; Lin Guocheng; Fu Chenghua; Liu Yanfang; Xu Liqing

    1993-01-01

    A double-antibody sandwich IRMA using human hepatocarcinoma ferritin (HF) and an anti-HF monoclonal antibody has been established. The intra- and inter-assay CVs are 4.1-7.3% and 5.1-10.9%, respectively; the effective assay range is 5-640 μg/L and the mean recovery is 102.5%. Serum HF was measured in 75 normal persons and 230 carcinoma patients. The normal values for males and females were 14.3 ± 10.9 μg/L and 11.6 ± 7.0 μg/L, respectively, with no statistically significant difference between them. Serum HF levels in patients with hepatocarcinoma and pulmonary carcinoma were strikingly higher than normal (P<0.001), and HF levels in patients with breast and pancreatic cancer were also higher than normal (P<0.05). This suggests that the HF IRMA has diagnostic significance in the above tumors

  4. The development of IRMA method of hepatocarcinoma ferritin and its preliminary clinical application

    Energy Technology Data Exchange (ETDEWEB)

    Jilin, Ying; Jinglan, Deng; Guocheng, Lin; Chenghua, Fu; Yanfang, Liu; Liqing, Xu [Fourth Military Medical Coll., Xi'an (China). First Affiliated Hospital]

    1993-08-01

    A double-antibody sandwich IRMA using human hepatocarcinoma ferritin (HF) and an anti-HF monoclonal antibody has been established. The intra- and inter-assay CVs are 4.1-7.3% and 5.1-10.9%, respectively; the effective assay range is 5-640 μg/L and the mean recovery is 102.5%. Serum HF was measured in 75 normal persons and 230 carcinoma patients. The normal values for males and females were 14.3 ± 10.9 μg/L and 11.6 ± 7.0 μg/L, respectively, with no statistically significant difference between them. Serum HF levels in patients with hepatocarcinoma and pulmonary carcinoma were strikingly higher than normal (P<0.001), and HF levels in patients with breast and pancreatic cancer were also higher than normal (P<0.05). This suggests that the HF IRMA has diagnostic significance in the above tumors.

  5. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Like the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in the diagnostics of future scenarios from IAMs, formal hindcast experiments are only now being contemplated as evaluation exercises, and some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical-system models, in which there are few fixed, unchanging relationships. Of course, strict validation of even earth-system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have been grappling with some of the same questions over the last quarter century: for example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?", and we are still asking this question today. While arguably even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior.
A key step is the recognition of model boundaries, that is, what is inside

  6. Estimation of serum thyroglobulin using isotopic and non-isotopic methods: a comparison between RIA, IRMA and ELISA

    International Nuclear Information System (INIS)

    Ajay Kumar; Velumani, A.; Dandekar, S.R.; Shah, D.H.; Rajashekharrao, B.

    1997-01-01

    Three in-house immunoassays, RIA, IRMA and ELISA, for the estimation of serum Tg were compared for their assay characteristics and clinical utility. The inter- and intra-assay coefficients of variation of RIA and IRMA were comparable, and marginally better than those of ELISA. The sensitivity of IRMA was superior to that of RIA, which in turn was superior to that of ELISA. Incubation time for both IRMA and ELISA was 4 h, compared with 90 h required for RIA. The enzyme-conjugated antibody had the longest shelf life (9 months), followed by the radiolabeled antibody (3 months), while radiolabeled Tg had the shortest (3 weeks). There was no interference from circulating anti-Tg autoantibodies in the RIA of Tg, in contrast to the gross underestimation observed in both sandwich methods. The assays correlated well with each other (r > 0.85, n = 200) and had suitable clinical utility (sensitivity > 93% and specificity > 89%). Though each assay had its own merits and demerits, all served the clinical need for effective management of patients with differentiated thyroid carcinoma. (author)
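    The comparison statistics reported in this record — coefficients of variation for assay precision and correlation between paired methods — can be reproduced with a short sketch. The replicate and paired values below are hypothetical, purely to illustrate the calculations:

    ```python
    import statistics

    def cv_percent(replicates):
        """Intra-assay CV: SD of replicate measurements as a % of their mean."""
        return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

    def pearson_r(x, y):
        """Pearson correlation between paired results from two assays."""
        mx, my = statistics.mean(x), statistics.mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    # Hypothetical Tg replicates (μg/L) from one control serum:
    print(round(cv_percent([20.1, 21.4, 19.8, 20.9]), 1))
    # Hypothetical paired RIA vs IRMA results for the same sera:
    print(round(pearson_r([5, 40, 120, 300, 600], [6, 38, 130, 290, 610]), 3))
    ```

    A between-method correlation computed this way on serially diluted sera is how an "r > 0.85" agreement figure of the kind quoted above is typically obtained.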

  7. Towards sustainable flood risk management in the Rhine and Meuse river basins: synopsis of the findings of IRMA-SPONGE

    NARCIS (Netherlands)

    Hooijer, A.; Klijn, F.; Pedroli, G.B.M.; Os, van A.G.

    2004-01-01

    Recent flood events in western Europe have shown the need for improved flood risk management along the Rhine and Meuse rivers. In response, the IRMA-SPONGE research programme was established, consisting of 13 research projects, in which over 30 organizations from six countries co-operated. The aim

  8. Performance of the FV3-powered Next Generation Global Prediction System for Harvey and Irma, and a vision for a "beyond weather timescale" prediction system for long-range hurricane track and intensity predictions

    Science.gov (United States)

    Lin, S. J.; Bender, M.; Harris, L.; Hazelton, A.

    2017-12-01

    We report on the performance of the GFDL-developed, FV3-based Next Generation Global Prediction System (NGGPS) for Harvey and Irma, covering track and intensity errors (versus operational models), heavy precipitation (Harvey), rapid intensification, and simulated structure (in comparison with ground-based radar). We also point to the need for a future long-range (from day 5 up to 30 days) physically based ensemble hurricane prediction system to provide forecasters with useful information beyond the usual weather timescale.

  9. Unfulfilled farmer expectations: the case of the Insect Resistant Maize for Africa (IRMA) project in Kenya

    Directory of Open Access Journals (Sweden)

    Mabeya Justin

    2012-11-01

    Full Text Available Abstract. Background: Maize is the most important staple food in Kenya; any reduction in production and yield therefore often becomes a national food security concern. To address the challenge posed by the maize stem borer, the Insect Resistant Maize for Africa (IRMA) agricultural biotechnology public-private partnership (PPP) project was launched in 1999. There were, however, pre-existing concerns regarding the use of genetic engineering in crop production and skepticism about private-sector involvement. The purpose of this case study was to understand the role of trust in the IRMA partnership by identifying the challenges to, and practices for, building trust in the project. Methods: Data were collected through face-to-face, semi-structured interviews, review of publicly available project documents, and direct observation. The data were analyzed to generate recurring and emergent themes on how trust is understood and built among the partners in the IRMA project and between the project and the community. Results: Clear and continued communication with stakeholders is of paramount importance to building trust, especially regarding competition among partners over project management positions, a lack of clarity on ownership of intellectual property rights (IPRs), and the influence of anti-genetic-modification (GM) organizations. Awareness creation about IRMA's anticipated products raised end users' expectations, which went unfulfilled due to the failure to deliver Bacillus thuringiensis (Bt)-based products, thereby diminishing trust between the project and the community. Conclusions: Four key issues were identified from the results of the study. First, the inability to deliver the intended products to the end user diminished stakeholders' trust and interest in the project. Second, full and honest disclosure of information by partners when entering into project agreements is crucial to ensuring progress in a project. Third

  10. Denissen (Frans), André Baillon. Le gigolo d’Irma Idéal

    OpenAIRE

    Gnocchi, Maria Chiara

    2012-01-01

    A title with unusual resonances tickles the sober rule of the Archives du Futur, the collection published under the responsibility of the Archives et Musée de la Littérature in Brussels. André Baillon. Le gigolo d'Irma Idéal: a title well suited to a biography that has already, quite rightly, been called extra-ordinary (F. Ghysen, Le Carnet et les Instants, November 1998). Extra-ordinary, above all, is the protagonist of the volume: Baillon the great writer, Baillon ...

  11. Models for Pesticide Risk Assessment

    Science.gov (United States)

    In risk assessment, EPA considers the toxicity of a pesticide as well as the amount of the pesticide to which a person or the environment may be exposed. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  12. ["... my friend Leopold was percussing her through her bodice...". Leopold von Auenbrugger in Sigmund Freud's dream of Irma's injection].

    Science.gov (United States)

    Reicheneder, Johann Georg

    2011-01-01

    This paper provides a psychoanalytic interpretation of an element in the Irma dream that Freud had ignored in his own interpretation. The allusion to Leopold von Auenbrugger, the originator of percussion as a method of clinical investigation, which appears in the manifest dream reflects Freud's hopes and fears about how his Interpretation of Dreams and the new human science established there would be received by his medical colleagues.

  13. How do extreme streamflow due to hurricane IRMA compare during 1938-2017 in South Eastern US?

    Science.gov (United States)

    Anandhi, A.

    2017-12-01

    The question raised by Irma, Harvey, Maria, and other recent hurricanes is whether hurricanes are more frequent and intense than they have been in the past. Recent hurricanes were unusually strong, hitting the US coastline or territories as category 4 or 5 storms and dropping unusually large amounts of precipitation on the affected areas, creating extreme high-flow events in rivers and streams. The objective of this study is to determine how extreme the streamflows from recent hurricanes (e.g., Irma) are when compared to streamflows during the 1938-2017 period; extreme precipitation during Irma is also compared. Extreme high-flow measures are selected from the Indicators of Hydrologic Alteration (IHA): the distributions, timing, duration, frequency, magnitude, pulses, and days of extreme events in rivers of the southeastern United States and Gulf of Mexico Hydrologic Region 03. Streamflow data from 30 stations in the region with at least 79 years of record (1938-2017) are used, and historical precipitation changes are obtained from a meta-analysis of published literature. Our preliminary results indicate that the extremeness of streamflow from recent hurricanes varies with the IHA indicator selected. Some potential implications of these extreme events for the region's ecosystem are also discussed using causal chains and loops.
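    IHA-style high-flow indicators of the kind this record lists (magnitude, frequency, and duration of high pulses) can be computed directly from a daily discharge series. A minimal sketch follows, using a synthetic one-year record with a hypothetical hurricane-driven spike; the 90th-percentile pulse threshold and all values are illustrative, not from the study:

    ```python
    import numpy as np

    def high_flow_indicators(daily_q, threshold_pct=90):
        """IHA-style high-pulse statistics for a daily discharge record.

        Returns the record maximum, the high-pulse threshold (a percentile
        of the record), and the count and durations (days) of distinct
        pulses above that threshold.
        """
        q = np.asarray(daily_q, dtype=float)
        thresh = np.percentile(q, threshold_pct)
        above = np.r_[False, q > thresh, False]       # pad so edges pair up
        edges = np.flatnonzero(np.diff(above.astype(int)))
        starts, ends = edges[::2], edges[1::2]        # rise / fall indices
        return {
            "max_flow": q.max(),
            "threshold": thresh,
            "n_pulses": starts.size,
            "durations": ends - starts,               # length of each pulse
        }

    # Synthetic one-year record: steady base flow with a hypothetical
    # week-long hurricane-driven spike (values are illustrative).
    q = np.full(365, 100.0)
    q[250:257] = [150, 400, 900, 1200, 800, 300, 160]
    stats = high_flow_indicators(q)
    print(stats["n_pulses"], stats["durations"], stats["max_flow"])
    ```

    Ranking a storm's peak against 79 years of such statistics is one way to express "how extreme" an event like Irma is relative to the historical record.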

  14. Examining the effects of hurricanes Matthew and Irma on water quality in the intracoastal waterway, St. Augustine, FL.

    Science.gov (United States)

    Ward, N. D.; Osborne, T.; Dye, T.; Julian, P.

    2017-12-01

    The last several years have been marked by a high incidence of Atlantic tropical cyclones making landfall as powerful hurricanes or tropical storms. In 2016, Hurricane Matthew devastated parts of the Caribbean and the southeastern United States; in 2017, the region was further battered by hurricanes Irma and Maria. Here, we present water quality data collected in the intracoastal waterway near the Whitney Laboratory for Marine Bioscience during hurricanes Matthew and Irma, a region that experienced flooding during both storms. YSI EXO2 sondes were deployed to measure pH, salinity, temperature, dissolved O2, fluorescent dissolved organic matter (fDOM), turbidity, and chlorophyll-a (Chl-a) at a 15-minute interval. The Hurricane Matthew sonde deployment failed as soon as the storm hit, but revealed an interesting phenomenon leading up to the storm that was also observed during Irma. Salinity in the intracoastal waterway (off the Whitney Lab dock) typically varies from purely marine to 15-20 psu over the tidal cycle. However, several days before both storms approached the Florida coast (i.e., while they were still near the Caribbean), the salinity signal became purely marine, overriding any tidal signal. Anecdotally, storm drains were already filled to street level before the storms hit, poising the region for immense flooding and storm surge. The opposite effect was observed after Irma moved past Florida: the water became much fresher than normal for several days, and it took almost a week to return to "normal" salinity tidal cycles. As each storm hit, turbidity increased by an order of magnitude for a period of several hours. fDOM and O2 behaved similarly to salinity during and after Irma, showing a mostly marine signal (e.g., higher O2, lower fDOM) in the lead-up and briefly switching to more freshwater influence in the week after the storm. Chl-a peaked several days after the storm, presumably due to mobilization of nutrient-rich flood and waste waters and subsequent algae

  15. Fish Fauna of the Miliç River (Terme, Samsun) [Miliç Irmağı (Terme, Samsun) Balık Faunası].

    Directory of Open Access Journals (Sweden)

    Selma Uğurlu

    2015-12-01

    Full Text Available This study, carried out between April 2004 and July 2005, aimed to determine the fish species living in the Miliç River. To collect fish samples, seven stations representing the river's ecological characteristics were established along its course. During the study, a total of 286 fish specimens were caught using electrofishing equipment, dip nets, gill nets, cast nets, and hook-and-line. Sixteen species belonging to five families (Cyprinidae, Mugilidae, Syngnathidae, Blenniidae, Gobiidae) were identified.

  16. Integrated Environmental Assessment Modelling

    Energy Technology Data Exchange (ETDEWEB)

    Guardanz, R; Gimeno, B S; Bermejo, V; Elvira, S; Martin, F; Palacios, M; Rodriguez, E; Donaire, I [Ciemat, Madrid (Spain)]

    2000-07-01

    This report describes the results of the Spanish participation in the project "Coupling CORINAIR data to cost-effect emission reduction strategies based on critical threshold" (EU/LIFE97/ENV/FIN/336). The subproject focused on three tasks: developing tools to improve knowledge of the spatial and temporal detail of air pollutant emissions in Spain; exploiting existing experimental information on plant response to air pollutants in temperate ecosystems; and integrating these findings into a modelling framework that can assess more accurately the impact of air pollutants on temperate ecosystems. The results obtained during this project have significantly improved models of the impact of alternative emission control strategies on ecosystems and crops in the Iberian Peninsula. (Author) 375 refs.

  17. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
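    The stochastic procedure this record recommends — translating uncertain parameter estimates into a distribution of predicted values and ranking parameters by their relative contribution to the overall uncertainty — is essentially a Monte Carlo analysis. A minimal sketch with a hypothetical multiplicative exposure-pathway model follows; all distributions, sigmas, and parameter names are illustrative assumptions, not values from the report:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Hypothetical multiplicative pathway model (illustrative only):
    # dose ∝ soil concentration × soil-to-plant factor × feed-to-milk
    # factor × milk intake.  The lognormal sigmas encode the assumed
    # parameter uncertainty.
    c_soil = rng.lognormal(np.log(1.0), 0.3, n)    # Bq/kg
    bv     = rng.lognormal(np.log(0.1), 0.8, n)    # soil-to-plant transfer
    fm     = rng.lognormal(np.log(0.01), 0.6, n)   # feed-to-milk, d/L
    intake = rng.lognormal(np.log(0.5), 0.1, n)    # L/d

    dose = c_soil * bv * fm * intake               # relative dose
    params = {"c_soil": c_soil, "bv": bv, "fm": fm, "intake": intake}

    # A distribution of predictions rather than a single point estimate:
    lo, med, hi = np.percentile(dose, [2.5, 50, 97.5])

    # Rank parameters by their contribution to output uncertainty
    # (correlation of log-parameter with log-dose).
    log_dose = np.log(dose)
    importance = {k: abs(np.corrcoef(np.log(v), log_dose)[0, 1])
                  for k, v in params.items()}
    ranking = sorted(importance, key=importance.get, reverse=True)
    print(ranking, (lo, med, hi))
    ```

    In this sketch the parameter with the widest assumed distribution (here the transfer factor) dominates the predicted spread, which is exactly the kind of ranking the report proposes using to direct effort at the most influential parameters.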

  18. Measuring and building resilience after big storms: Lessons learned from Super-Storm Sandy for the Harvey, Irma, Jose, and Maria coasts

    Science.gov (United States)

    Murdoch, P. S.; Penn, K. M.; Taylor, S. M.; Subramanian, B.; Bennett, R.

    2017-12-01

    As we recover from recent large storms, we need information to support increased environmental and socio-economic resilience of the Nation's coasts. Defining baseline conditions, tracking the effects of mitigation actions, and measuring the uncertainty of resilience to future disturbance are essential so that best management practices can be determined. The US Department of the Interior invested over $787 million in 2013 to understand and mitigate coastal storm vulnerabilities and enhance resilience of the Northeast coast following Super-Storm Sandy. Several lessons learned from that investment have direct application to mitigation and restoration needs following Hurricanes Harvey, Irma, Jose and Maria. New models of inundation, overwash, and erosion developed during the Sandy projects have already been applied to coastlines before and after these recent storms. Results from wetland, beach, back-bay, estuary, and built-environment projects improved models of inundation and erosion from surge and waves. Tests of nature-based infrastructure for mitigating coastal disturbance yielded new concepts for best practices. Ecological and socio-economic measurements established for detecting disturbance and tracking recovery provide baseline data critical to early detection of vulnerabilities. The Sandy lessons and their preliminary applications to the recent storms could help define best resilience practices before more costly mitigation or restoration efforts are required.

  19. MAPPING THE EXTENT AND MAGNITUDE OF SEVERE FLOODING INDUCED BY HURRICANE IRMA WITH MULTI-TEMPORAL SENTINEL-1 SAR AND INSAR OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    B. Zhang

    2018-04-01

    Full Text Available During Hurricane Irma's passage over Florida in September 2017, many sections of the state experienced heavy rain and subsequent flooding. In order to drain water out of potential flooding zones and assess property damage, it is important to map the extent and magnitude of the flooded areas at various stages of the storm. We use Synthetic Aperture Radar (SAR) and Interferometric SAR (InSAR) observations acquired by Sentinel-1 before, during and after the hurricane's passage, which enable us to evaluate surface conditions during different stages of the hurricane. This study uses multi-temporal images acquired under dry conditions before the hurricane to constrain the background backscattering signature; flooded areas are detected where the backscattering during the hurricane differs statistically significantly from the average dry conditions. The detected change can be either an increase or a decrease in backscattering, depending on the scattering characteristics of the surface. In addition, water-level change information for Palmdale, South Florida is extracted from an interferogram with the aid of a local water gauge as the reference. The results of our flooding analysis reveal that the majority of the study area in South Florida was flooded during Hurricane Irma.
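    The detection scheme this record describes — a per-pixel baseline built from multi-temporal dry-condition images, with flooding flagged wherever storm-time backscatter departs significantly in either direction — can be sketched as a simple per-pixel z-score test. The array shapes, decibel values, and the threshold k below are illustrative assumptions, not the authors' actual processing chain:

    ```python
    import numpy as np

    def flood_map(dry_stack, storm_img, k=3.0):
        """Flag pixels whose storm-time backscatter (dB) departs from the
        multi-temporal dry baseline by more than k standard deviations.

        dry_stack: (t, rows, cols) pre-event acquisitions
        storm_img: (rows, cols) acquisition during/after the event
        """
        mu = dry_stack.mean(axis=0)
        sigma = dry_stack.std(axis=0, ddof=1) + 1e-6   # avoid divide-by-zero
        z = (storm_img - mu) / sigma
        # Open water darkens rough terrain (specular reflection) but can
        # brighten flooded vegetation (double bounce), so keep both signs.
        return np.abs(z) > k

    # Toy scene: five dry acquisitions near -8 dB, storm image with a
    # darkened (flooded) 2x2 patch near -15 dB.
    rng = np.random.default_rng(0)
    dry = -8.0 + 0.5 * rng.standard_normal((5, 4, 4))
    storm = -8.0 + 0.5 * rng.standard_normal((4, 4))
    storm[1:3, 1:3] = -15.0
    mask = flood_map(dry, storm)
    print(mask.astype(int))
    ```

    Keeping both signs of the change matters because, as the abstract notes, flooding can either raise or lower backscatter depending on the surface.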

  20. Reproductive Hormones Disorders Of Sudanese Infertile Females Using Immunoradiometric assay (IRMA)

    International Nuclear Information System (INIS)

    Ali, N. I.; Almahi, W. A. A.; Abdalla, O. M.; Bafarag, S. M. I.; Abdelgadir, O. M.; Eltayeb, M. A. H.; Hassan, A. M. E.; Hassan, A. M. E.

    2004-01-01

    In this study, fertility hormones were measured in 587 infertile Sudanese females referred from gynecological clinics. Their ages range from 16 to 50 years, divided into seven groups; 87% are between 21 and 40 years, corresponding to the female fertile period, and 5.6% are under 20 years. A sensitive immunoradiometric assay (IRMA) was used to measure hormone concentrations. The objective of this study was to find out the percentage of hormonal disorders and its relation to age in infertile Sudanese females. The age group 21-25 was the most affected by polycystic ovary syndrome (PCOS), representing 5.1% of the total number of patients; the least affected was the age group 41-45, at 0.4%. LH and FSH in the age group 31-35 were found to be higher than in the other groups, representing 11.4% and 7.8% of the total number of patients, respectively. The lowest percentages of elevated LH and FSH were found in the most fertile age group (15-20), at 1.7% and 1.0% of the patients studied, respectively. Patients aged 26-30 with hyperprolactinemia represented 10.4% of patients, while those aged 46-50 represented the lowest percentage (1.2%). Overall, 44.5% of patients had high LH and 29.1% had high FSH, while hyperprolactinemia among infertile Sudanese females was found to be 38.2%. (Authors)

  1. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to parameter uncertainties, the assumptions behind risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized and some approaches to modelling and quantifying it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs; some of their possible disadvantages are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  2. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation]

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to parameter uncertainties, the assumptions behind risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized and some approaches to modelling and quantifying it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs; some of their possible disadvantages are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  3. Minimization of nonspecific binding to improve the sensitivity of a magnetic immunoradiometric assay (IRMA) for human thyrotropin (hTSH)

    International Nuclear Information System (INIS)

    Peroni, C.N.; Ribela, M.T.C.P.; Bartolini, P.

    1996-01-01

    An IRMA of hTSH, based on magnetic solid-phase separation, was studied especially in terms of its nonspecific binding (B0). This was identified as a product of the interaction between the radioiodinated anti-hTSH monoclonal antibody (¹²⁵I-mAb) and the uncoupled magnetizable cellulose particle (matrix). The negative effects of B0 on assay performance were minimized and practically eliminated in the optimized system by tracer storage at 4 °C, repurification and pre-incubation with the same matrix, serum addition during incubation, and solid-phase saturation with milk proteins. These findings were used to reproducibly decrease nonspecific binding, raising the specific-to-nonspecific binding ratio (B60/B0) to values of 300-500. In this way, hTSH IRMAs were obtained with functional sensitivities of about 0.05 mIU/L and analytical sensitivities of the order of 0.02 mIU/L, which represents an approximately 10-fold increase in sensitivity compared with the non-optimized system. A more optimistic sensitivity calculation, based on Rodbard's definition, provided values down to 0.008 mIU/L. Such sensitivities, moreover, were obtained very reproducibly and over the whole useful tracer life. (author). 10 refs, 1 fig., 8 tabs

  4. Study of antibody immobilization on different magnetic particles utilized for the radioimmunoassay (RIA) and immunoradiometric assay (IRMA) of hormones

    International Nuclear Information System (INIS)

    Ribela, M.T.C.P.; Peroni, C.N.; Bartolini, P.

    1996-01-01

    A study was carried out on antibody immobilization on three different types of magnetic particles: plain magnetite (Institute of Isotopes, Hungary), silanized magnetite (Institute of Atomic Energy, China) and magnetizable cellulose (SCIPAC, UK). For radioimmunoassay (RIA) applications, an efficient second-antibody-coupled magnetic solid phase was prepared using plain magnetite and a purified anti-rabbit IgG antibody (Trilab, Brazil). A consistent bias, detected in comparison with a well-known commercial magnetic solid-phase kit, was practically eliminated by modifying the coupling and saturation procedure. Concerning two-site IRMA application, an extensive study was carried out on the matching and selection of anti-hTSH antibodies that could be used for capture and detection. Very satisfactory results were obtained with the three types of magnetic particles using different monoclonal and polyclonal antibodies and, in particular, two partner anti-hTSH mAbs from the National Institute of Health of Thailand. Utilizing also a recombinant hTSH standard preparation, calibrated and distributed by our laboratory (IPEN-CNEN/SP, Brazil), it was possible to obtain a complete set of in-house reagents for hTSH IRMA, prepared and tested under IAEA support. (author). 11 refs, 4 figs, 12 tabs

  5. Ultrasensitive human thyrotropin (h TSH) immunoradiometric assay (IRMA) set up, through identification and minimization of non specific bindings

    International Nuclear Information System (INIS)

    Peroni, C.N.

    1994-01-01

    An IRMA of hTSH, based on magnetic solid-phase separation, was studied especially with regard to its nonspecific binding. This was identified as a product of the interaction between an altered form of the radioiodinated anti-hTSH monoclonal antibody (¹²⁵I-mAb) and the uncoupled magnetizable cellulose particle (matrix). Apparently this form of ¹²⁵I-mAb is a type of aggregate that can be partly resolved from the main peak on Sephadex G-200 and further minimized via a single pre-incubation with the same matrix. Solid-phase saturation with milk proteins, tracer storage at 4 °C and serum addition during incubation were also found particularly effective in preventing its formation. These findings were used to reproducibly decrease nonspecific binding, raising the specific-to-nonspecific binding ratio (B60/B0) up to values of 300-500. In this way we obtained hTSH radioassays with functional sensitivities of about 0.05 mIU/L and analytical sensitivities of the order of 0.02 mIU/L, which classify them at least among the best second-generation assays and are excellent indeed for magnetic IRMAs. A more optimistic sensitivity calculation, based on Rodbard's definition, provided values down to 0.008 mIU/L. Such sensitivities, moreover, were obtained very reproducibly and over the whole useful tracer life. (author). 83 refs, 13 figs, 25 tabs

  6. Characterization of Landslide Sites in Puerto Rico after Hurricanes Irma and María

    Science.gov (United States)

    Hughes, K. S.; Morales Vélez, A. C.

    2017-12-01

    Thousands of landslides in Puerto Rico and the U.S. Virgin Islands were triggered by the passage of Hurricanes Irma (Sep. 6) and María (Sep. 20) in 2017. Both were classified as Category 5 hurricanes on the Saffir-Simpson scale before making landfall. Most of the mass wasting occurred in the rugged mountainous regions of Puerto Rico and—along with bridge collapse, flooding, and the threat of dam failure—left many communities isolated for up to a month or longer. Aerial photography collected by FEMA and the Civil Air Patrol has allowed for the rapid inventory of landslide sites across the archipelago by the USGS and other groups. Using this dataset and other local information, we identified a list of priority sites that were documented in detail as part of an NSF-GEER (Geotechnical Extreme Event Reconnaissance) mission. The juvenile landscape and short-wavelength topography in most of Puerto Rico present considerable landslide risk that is exacerbated during heavy rainfall events like Hurricane María. Our preliminary work shows that natural escarpments, de-vegetated pastureland in mountainous areas, and road cuts along incised river valleys were areas of concentrated failures during these storms. Notably, the northern karst area suffered fewer failures than the arc basement rocks exposed elsewhere on the island. In addition to previously active landslides at specific sites on the island, new landslides along PR-143 in the municipality of Barranquitas, PR-431 in the municipality of Lares, and PR-109 in the municipality of Añasco are among important mass wasting events that were a focus of the GEER team and remain important in our ongoing research. A team of undergraduate and graduate students led by faculty at the University of Puerto Rico in Mayagüez is working to characterize the complete inventory of landslides in terms of underlying geology, soil type, slope, curvature, rainfall amounts during both atmospheric events, and other local geomorphic and

  7. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game-changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: development of use case studies for science workflows; creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernible requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  8. The Comparison Of TSH IRMA Serum Level With TRH Test Value In Healthy People Who Are Suspected To Have Hyperthyroidism

    Directory of Open Access Journals (Sweden)

    Keshavarz zirak A

    2005-06-01

    Full Text Available Background: Subclinical hyperthyroidism is a state of subnormal serum TSH with T3 and T4 within the normal range; it is usually without overt clinical manifestations but can carry serious complications, especially in elderly patients. In Iranian people, serum TSH is generally assayed by the IRMA method. This study aimed to determine the significance of low serum TSH in these patients, for better management and decision-making when it is encountered. Materials and Methods: The population under study comprised subjects with serum TSH lower than 0.5 mU/l and normal thyroid hormones, without known thyroidal or non-thyroidal illness. Basal serum TSH and TSH 30 minutes after intravenous TRH injection were sampled, and the correlation of clinical signs and symptoms and basal TSH with subclinical hyperthyroidism was considered. Results: The population under study was categorized into five groups and the prevalence of subclinical hyperthyroidism was noted: in patients with b.TSH equal to or lower than 0.1 mU/l it was 100%; with 0.1-0.2 mU/l, 75%; with 0.2-0.3 mU/l, 38.5%; with 0.3-0.4 mU/l, 14.3%; and patients with TSH greater than 0.4 mU/l were all normal. After analysis of these data and determination of the sensitivity and specificity of IRMA, it was concluded that IRMA alone is not sufficient to distinguish subclinical hyperthyroidism, although there is a good linear (r=0.68; P<0.001) and cubic (r=0.79; P<0.001) relationship between b.TSH and d.TSH. Conclusion: Since the TRH test is not cost-effective for all cases, TSH levels lower than 0.25 mU/l can be considered subclinical hyperthyroidism and levels above 0.4 mU/l normal. In cases with a TSH level between 0.25 and 0.4 mU/l, the TRH test is needed in high-risk patients.
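The proposed cutoff can be evaluated against a reference diagnosis (here, the TRH test) by tabulating sensitivity and specificity. A minimal sketch with hypothetical TSH values and diagnoses, not the study's data:

```python
def cutoff_performance(values, has_disease, cutoff):
    """Sensitivity/specificity of calling 'subclinical hyperthyroidism'
    when basal TSH < cutoff, against a reference diagnosis."""
    tp = sum(v < cutoff and d for v, d in zip(values, has_disease))
    fn = sum(v >= cutoff and d for v, d in zip(values, has_disease))
    tn = sum(v >= cutoff and not d for v, d in zip(values, has_disease))
    fp = sum(v < cutoff and not d for v, d in zip(values, has_disease))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical basal TSH values (mU/l) and TRH-test reference diagnoses
tsh = [0.05, 0.15, 0.22, 0.28, 0.35, 0.45, 0.6]
dx = [True, True, True, False, True, False, False]
sens, spec = cutoff_performance(tsh, dx, 0.25)
print(sens, spec)  # → 0.75 1.0
```

Sweeping the cutoff over the observed range in this way is how a trade-off point such as 0.25 mU/l would be selected.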

  9. Tumour associated antigen CA-50, CA-242 immunoradiometric assay (IRMA) in genitourinary malignancy and gastrointestinal carcinoma early diagnosis

    International Nuclear Information System (INIS)

    Chen Zhizhou.

    1992-04-01

    Tumour markers CA-50 and CA-242 were measured by immunoradiometric assay (IRMA) to investigate their usefulness in the diagnosis of cancer of the pancreas, biliary tract, liver, breast, lung, and the gastrointestinal and genitourinary systems. The cutoff points, derived from studies on normal subjects and those with proven benign disease, were 20 U/ml and 12 U/ml for CA-50 and CA-242 respectively. Both markers were found to be generally useful, with significant differences between malignant and non-malignant disease. The highest positive rates were found in cancers of the pancreas and gall bladder. The overall rate of false positives was low. It is concluded that measurements of CA-50 and CA-242 are useful in the detection of malignancy, particularly of the pancreas and biliary tract. 2 figs, 2 tabs

  10. Benthic Invertebrate Fauna of the Köprüçay River (Antalya)

    Directory of Open Access Journals (Sweden)

    Melek ZEYBEK

    2014-07-01

    Full Text Available This study was carried out to determine the benthic invertebrate fauna of the Köprüçay River by collecting benthic invertebrate samples from 7 stations between February 2008 and January 2009. In total, 85 taxa comprising 21,318 individuals were identified. Examination of the collected samples yielded 26 taxa belonging to the order Ephemeroptera, 6 to Plecoptera, 23 to Trichoptera, 11 to Diptera, 8 to Odonata, 4 to Coleoptera, 1 to Hemiptera, 1 to the class Hirudinea, 2 to Gastropoda, and 3 to Malacostraca. This is the first study aimed at determining the benthic fauna of the Köprüçay River; all taxa identified are therefore reported for the first time for the region.

  11. Short-term impacts of Hurricanes Irma and Maria on tropical stream chemistry as measured by in-situ sensors

    Science.gov (United States)

    McDowell, W. H.; Potter, J.; López-Lloreda, C.

    2017-12-01

    High-intensity hurricanes have been shown to alter tropical forest productivity and stream chemistry for years to decades in the montane rain forest of Puerto Rico, but much less is known about the immediate ecosystem response to these extreme events. Here we report the short-term impacts of Hurricanes Irma and Maria on the chemistry of Quebrada Sonadora immediately before and after the storms. We place the results from our 15-minute sensor record in the context of long-term weekly sampling that spans 34 years and includes two earlier major hurricanes (Hugo and Georges). As expected, turbidity during Maria was the highest in our sensor record (> 1000 NTU). Contrary to our expectations, we found that solute-flow behavior changed with the advent of the storms. Specific conductance showed a dilution response to flow before the storms, but then changed to an enrichment response during and after Maria. This switch in system behavior is likely due to the deposition of marine aerosols during the hurricane. Nitrate concentrations showed very little response to discharge prior to the recent hurricanes, but large increases in concentration occurred at high flow both during and after the hurricanes. Baseflow nitrate concentrations decreased immediately after Irma to below the long-term background concentrations, which we attribute to the immobilization of N on organic debris choking the stream channel. Within three weeks of Hurricane Maria, baseflow nitrate concentrations began to rise. This is likely due to mineralization of N from decomposing canopy vegetation on the forest floor, and reduced N uptake by hurricane-damaged vegetation. The high-frequency sensors are providing new insights into the response of this ecosystem in the days and weeks following two major disturbance events. The flipping of nitrate response to storms, from source-limited to transport-limited, suggests that these two severe hurricanes have fundamentally altered the nitrogen cycle at the site in ways
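The dilution-versus-enrichment behavior described above is commonly diagnosed from the slope b of the power-law relation C = aQ^b, i.e. a log-log regression of concentration on discharge. A hedged sketch with hypothetical sensor readings, not the Quebrada Sonadora record:

```python
import math

def cq_slope(discharge, concentration):
    """Least-squares slope b of log10(C) versus log10(Q).
    b < 0 indicates dilution with flow; b > 0 indicates enrichment."""
    x = [math.log10(q) for q in discharge]
    y = [math.log10(c) for c in concentration]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical pre-storm record: specific conductance diluting with flow
q = [0.1, 0.5, 1.0, 5.0, 10.0]   # discharge, m3/s
sc = [120, 95, 80, 55, 45]       # specific conductance, uS/cm
print(cq_slope(q, sc) < 0)       # dilution response
```

Re-fitting b over a moving window of the 15-minute record is one way to detect the kind of regime switch (dilution to enrichment) reported here.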

  12. The SAVI vulnerability assessment model

    International Nuclear Information System (INIS)

    Winblad, A.E.

    1987-01-01

    The assessment model ''Systematic Analysis of Vulnerability to Intrusion'' (SAVI) presented in this report is a PC-based path analysis model. It can provide estimates of protection system effectiveness (or vulnerability) against a spectrum of outsider threats, including collusion with an insider adversary. It calculates one measure of system effectiveness, the probability of interruption P(I), for all potential adversary paths. SAVI can perform both theft and sabotage vulnerability analyses. For theft, the analysis is based on the assumption that adversaries should be interrupted before they can remove the target material from its normal location or take it beyond the site boundary. For sabotage, the analysis is based on the assumption that adversaries should be interrupted before completion of their sabotage task.
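The probability of interruption along a single path can be sketched with simplified timely-detection logic: detection at a path element interrupts the adversary only if the delay remaining along the path exceeds the guard-force response time. This is a generic path-analysis illustration with hypothetical numbers, not the SAVI algorithm itself:

```python
def prob_interruption(p_detect, delays, response_time):
    """P(I) for one adversary path.  p_detect[i] is the detection
    probability at element i; delays[i] is the task/delay time (s) at
    and beyond element i.  A first detection at element i counts only
    if the remaining delay exceeds the response time."""
    p_i = 0.0
    p_undetected = 1.0  # probability of no detection at earlier elements
    for i, pd in enumerate(p_detect):
        remaining = sum(delays[i:])
        if remaining > response_time:
            p_i += p_undetected * pd  # first, timely detection here
        p_undetected *= 1.0 - pd
    return p_i

# Hypothetical 4-element path: fence, door, vault wall, target task
print(prob_interruption([0.5, 0.9, 0.6, 0.3], [60, 120, 300, 90], 240))
```

Minimizing this quantity over all enumerated paths gives the system-level measure of effectiveness that a path-analysis tool reports.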

  13. Irrigation in dose assessments models

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, Ulla; Barkefors, Catarina [Studsvik RadWaste AB, Nykoeping (Sweden)

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. 
Some numerical results are presented showing that surface contamination in general gives the
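The interception pathway described above can be illustrated with a deliberately simplified model: daily irrigation deposits a fraction of the applied activity on vegetation surfaces, and the deposit is lost by first-order weathering. All parameter values are hypothetical, and this is a pedagogical sketch, not the SKB irrigation model:

```python
import math

def crop_concentration(water_conc, irrigation_rate, interception_fraction,
                       weathering_halflife_d, crop_yield, days):
    """Activity concentration on vegetation (Bq/kg) after `days` of daily
    irrigation: each day deposits water_conc (Bq/l) * irrigation_rate
    (l/m2/d) * interception_fraction on the canopy, and the standing
    deposit decays with the given weathering half-life (days)."""
    lam = math.log(2) / weathering_halflife_d
    deposit = water_conc * irrigation_rate * interception_fraction  # Bq/m2/d
    activity = 0.0
    for _ in range(days):
        activity = activity * math.exp(-lam) + deposit
    return activity / crop_yield  # divide by yield (kg/m2)

# Hypothetical: 10 Bq/l water, 5 l/m2/d, 25% interception, 14 d weathering
print(round(crop_concentration(10, 5, 0.25, 14, 2.0, 30), 2))
```

Root uptake from the contaminated soil would add a second, slower term; the sensitivity of the result to the interception fraction is exactly the behavior the abstract's sensitivity analysis highlights.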

  14. Irrigation in dose assessments models

    International Nuclear Information System (INIS)

    Bergstroem, Ulla; Barkefors, Catarina

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. 
Some numerical results are presented showing that surface contamination in general gives the

  15. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently, based on the sensory modality or representational system (visual, auditory or kinesthetic) we tend to favor most (our primary representational system (PRS)). Some of us therefore access and store our information primarily visually, some auditorily, and others kinesthetically (through feel and touch); this in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, because such models do not establish a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result

  16. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently, based on the sensory modality or representational system (visual, auditory or kinesthetic) we tend to favor most (our primary representational system (PRS)). Some of us therefore access and store our information primarily visually, some auditorily, and others kinesthetically (through feel and touch); this in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, because such models do not establish a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  17. Endothelial dysfunction and inflammation predict development of diabetic nephropathy in the Irbesartan in Patients with Type 2 Diabetes and Microalbuminuria (IRMA 2) study

    DEFF Research Database (Denmark)

    Persson, Frederik; Rossing, Peter; Hovind, Peter

    2008-01-01

    OBJECTIVE: To evaluate risk factors for progression from persistent microalbuminuria to diabetic nephropathy in the Irbesartan in Patients with Type 2 diabetes and Microalbuminuria (IRMA 2) study, including biomarkers of endothelial dysfunction, chronic low-grade inflammation, growth factors...... and advanced glycation end products (AGE peptides). METHODS: IRMA 2 was a 2-year multicentre, randomized, double-blind trial comparing irbesartan (150 and 300 mg once daily) versus placebo. The primary end-point was time to onset of diabetic nephropathy. Samples from a subgroup from the placebo and the 300 mg...... and vWf predicted the end-point. In addition, endothelial Z-score was associated with progression of albuminuria (p = 0.038). CONCLUSION: Endothelial dysfunction and possibly inflammation are novel predictors of progression to diabetic nephropathy in patients with type 2 diabetes and microalbuminuria...

  18. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  19. Regional-scale impact of storm surges on groundwaters of Texas, Florida and Puerto Rico after 2017 hurricanes Harvey, Irma, Jose, Maria

    Science.gov (United States)

    Sellier, W. H.; Dürr, H. H.

    2017-12-01

    Hurricanes and related storm surges have devastating effects on near-shore infrastructure and above-ground installations. They also heavily impact groundwater resources, with potentially millions of people dependent on these resources as a freshwater source. Destruction of casings and direct incursions of saline and/or polluted waters have been widely observed. It is uncertain how extensive the effects are on underground water systems, especially in limestone karst areas such as Florida and Puerto Rico. Here, we report regional-scale water level changes in groundwater systems of Texas, Florida and Puerto Rico for the 2017 Hurricanes Harvey, Irma, Jose and Maria. We collected regional-scale data from the USGS Waterdata portal. Puerto Rico shows the strongest increase in groundwater levels in wells during Hurricane Maria, with less reaction to the preceding storms Irma and Jose. Increases in water levels range from 0.5 to 11 m, with maximum storm surges in Puerto Rico around 3 m. These wells are located throughout Puerto Rico, on the coast and inland. In Florida, most wells that show a response during Hurricane Irma are located in the Miami region. Wells located on the west coast show smaller responses, with the exception of one well located directly on Hurricane Irma's track. These wells show an increase of 0.2 to 1.7 m. In Texas, wells located in proximity to Hurricane Harvey's track show an increase in water level of between 0.03 and 2.9 m; the effect is not limited to the Texas coast but extends inland as well. Storm surges for both Florida and Texas reached maxima of 1.8-3.7 m. We discuss the findings in the context of local and regional geology and hydrogeology (presence of connected aquifer systems, faulting, presence of carbonate/karst systems, etc.).

  20. Dose assessment models. Annex A

    International Nuclear Information System (INIS)

    1982-01-01

    The models presented in this chapter have been separated into two general categories: environmental transport models, which describe the movement of radioactive materials through all sectors of the environment after their release, and dosimetric models, which calculate the absorbed dose following an intake of radioactive materials or exposure to external irradiation. Various sections of this chapter also deal with atmospheric transport models, terrestrial models, and aquatic models.

  1. The Irma-sponge Project Frhymap: Flood Risk and Hydrological Mapping

    Science.gov (United States)

    Hoffmann, L.; Pfister, L.

    In the context of both increasing socio-economic development in floodplains and the recent heavy floods that have occurred in the Rhine and Meuse basins, the need for reliable hydro-climatological data, easily transposable hydrological and hydraulic models, and risk management tools has increased crucially. In the FRHYMAP project, some of these issues were addressed within a common mesoscale experimental basin: the Alzette river basin, located in the Grand Duchy of Luxembourg. The various aspects of flooding events, ranging from the hydro-climatological analysis of field data to the risk assessment of socio-economic impacts, taking into account past and future climate and land-use changes, were analysed by the six participating research institutes (CREBS, L; CEREG, F; DLR, D; EPFL, CH; UB, D; VUB, B). Hydro-climatological data analysis over the past five decades has shown that in the study area, the increase in westerly and south-westerly atmospheric circulation patterns induced higher winter rainfall totals, leading to more frequent groundwater resurgences and ultimately also to higher daily maximum streamflow of the Alzette. The thus-increased flood hazard nonetheless has a certain spatial variability, closely linked to the rainfall distribution patterns, which depend strongly on the topographical characteristics of the study area. Although the overall regime of the Alzette is more dependent on climate fluctuations, land-use changes (mining activities, urbanisation) had a marked effect on the rainfall-runoff relationship in some sub-basins over the last decades. By linking model parameters to physiographical basin characteristics, regionalised and thus easily transposable hydrological models were developed. Within a study area with very few long-term observation series, this technique, combined with the use of hydraulic models, allowed the delineation of hydrological hazard-producing and hydrological risk-exposed areas. The

  2. Assessing uncertainty in mechanistic models

    Science.gov (United States)

    Edwin J. Green; David W. MacFarlane; Harry T. Valentine

    2000-01-01

    Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...

  3. Quantitative comparison of in house irma/ria methods using reagents supplied in bulk with assays based on commercial kits

    International Nuclear Information System (INIS)

    Sajid, K.M.; Jafri, S.R.A.

    1997-01-01

    Due to the high cost of commercial kit assays, local trials were started to establish low-cost immunoassay techniques at MINAR, Multan. The first available alternative to commercial kits was the use of matched reagents supplied in bulk by NETRIA through INMOL, Lahore, under IAEA assistance. As quality is crucial in RIA estimations, this laboratory collected passive quality-control data from 50, 51 and 52 in-house assay batches of T3, T4 and TSH to compare with the same number of the last Amerlex RIAs (successive lots). A quantitative comparison based on computerized data analysis shows linear correlation between the results of the two assay systems, with low and acceptable imprecision in the T3 and T4 assays. In TSH assays both systems show high imprecision, although the Amerlex RIA system is relatively more precise than the in-house TSH-IRMA. T3 and T4 assays in both systems show wide working ranges covering all clinical regions. In TSH, the working ranges of both techniques do not cover all clinical regions: the in-house TSH assay excludes levels below 5.5 µIU/ml, whereas Amerlex excludes levels below 4.5 µIU/ml. This may reduce the clinical efficacy of these tests. Amerlex-M T3 and T4 assays show high negative drift with relatively less between-batch variation, whereas in-house assays show low positive drift with high between-batch variation. (author)
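A paired-sample comparison of this kind typically reduces to a correlation coefficient and a mean bias between the two systems. A minimal sketch with hypothetical paired results, not the MINAR data:

```python
import math

def compare_methods(a, b):
    """Pearson correlation and mean bias (b - a) for paired results
    from two assay systems run on the same samples."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb), mb - ma

# Hypothetical paired T4 results (nmol/l): in-house vs commercial kit
inhouse = [60, 85, 100, 120, 150, 180]
amerlex = [58, 88, 103, 118, 155, 176]
r, bias = compare_methods(inhouse, amerlex)
print(r > 0.99)  # strong linear agreement between the two systems
```

When both methods carry measurement error, an errors-in-variables fit (e.g. Deming regression) is the more rigorous choice, but Pearson r plus bias is the usual first screen.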

  4. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose

  5. Sensitivity Assessment of Ozone Models

    Energy Technology Data Exchange (ETDEWEB)

    Shorter, Jeffrey A.; Rabitz, Herschel A.; Armstrong, Russell A.

    2000-01-24

    The activities under this contract effort were aimed at developing sensitivity analysis techniques and fully equivalent operational models (FEOMs) for applications in the DOE Atmospheric Chemistry Program (ACP). MRC developed a new model representation algorithm that uses a hierarchical, correlated function expansion containing a finite number of terms. A full expansion of this type is an exact representation of the original model, and each of the expansion functions is explicitly calculated using the original model. Once calculated, the expansion functions are assembled into a fully equivalent operational model (FEOM) that can directly replace the original model.

  6. Assessment of Molecular Modeling & Simulation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-01-03

    This report reviews the development and applications of molecular and materials modeling in Europe and Japan in comparison to those in the United States. Topics covered include computational quantum chemistry, molecular simulations by molecular dynamics and Monte Carlo methods, mesoscale modeling of material domains, molecular-structure/macroscale property correlations like QSARs and QSPRs, and related information technologies like informatics and special-purpose molecular-modeling computers. The panel's findings include the following: The United States leads this field in many scientific areas. However, Canada has particular strengths in DFT methods and homogeneous catalysis; Europe in heterogeneous catalysis, mesoscale, and materials modeling; and Japan in materials modeling and special-purpose computing. Major government-industry initiatives are underway in Europe and Japan, notably in multi-scale materials modeling and in development of chemistry-capable ab-initio molecular dynamics codes.

  7. The Model for Assessment of Telemedicine (MAST)

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Clemensen, Jane; Caffery, Liam J

    2017-01-01

    The evaluation of telemedicine can be achieved using different evaluation models or theoretical frameworks. This paper presents a scoping review of published studies which have applied the Model for Assessment of Telemedicine (MAST). MAST includes pre-implementation assessment (e.g. by use...

  8. Assessment of the Rescorla-Wagner model.

    Science.gov (United States)

    Miller, R R; Barnet, R C; Grahame, N J

    1995-05-01

    The Rescorla-Wagner model has been the most influential theory of associative learning to emerge from the study of animal behavior over the last 25 years. Recently, equivalence to this model has become a benchmark in assessing connectionist models, with such equivalence often achieved by incorporating the Widrow-Hoff delta rule. This article presents the Rescorla-Wagner model's basic assumptions, reviews some of the model's predictive successes and failures, relates the failures to the model's assumptions, and discusses the model's heuristic value. It is concluded that the model has had a positive influence on the study of simple associative learning by stimulating research and contributing to new model development. However, this benefit should neither lead to the model being regarded as inherently "correct" nor imply that its predictions can be profitably used to assess other models.
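The model's trial-by-trial update can be sketched directly. Assuming the standard formulation dV = alpha * beta * (lambda - V_total), with illustrative parameter values:

```python
# Minimal sketch of the Rescorla-Wagner update rule (equivalent in form
# to the Widrow-Hoff delta rule mentioned above). Parameter values are
# illustrative: alpha (CS salience), beta (US learning rate),
# lam (asymptote of learning supported by the US).
def train(trials, alpha=0.3, beta=1.0, lam=1.0):
    V = 0.0                              # associative strength of a single CS
    history = []
    for _ in range(trials):
        V += alpha * beta * (lam - V)    # prediction-error driven update
        history.append(V)
    return history

h = train(20)
assert abs(h[0] - 0.3) < 1e-12   # first trial: 0.3 * (1 - 0)
assert h[-1] > 0.99              # associative strength approaches lambda
```

The negatively accelerated acquisition curve this produces is one of the model's classic predictive successes; its failures (e.g. latent inhibition) involve phenomena outside this simple error-correction scheme.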

  9. Understanding National Models for Climate Assessments

    Science.gov (United States)

    Dave, A.; Weingartner, K.

    2017-12-01

    National-level climate assessments have been produced or are underway in a number of countries. These efforts showcase a variety of approaches to mapping climate impacts onto human and natural systems, and involve a variety of development processes, organizational structures, and intended purposes. This presentation will provide a comparative overview of national 'models' for climate assessments worldwide, drawing from a geographically diverse group of nations with varying capacities to conduct such assessments. Using an illustrative sampling of assessment models, the presentation will highlight the range of assessment mandates and requirements that drive this work, methodologies employed, focal areas, and the degree to which international dimensions are included for each nation's assessment. This not only allows the U.S. National Climate Assessment to be better understood within an international context, but provides the user with an entry point into other national climate assessments around the world, enabling a better understanding of the risks and vulnerabilities societies face.

  10. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  11. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has so far been evaluated for only a limited set of biophysical models. A range of skill assessment methods have been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is possible not only to assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable
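The four skill metrics named in the abstract can be sketched as follows; the observation and forecast series are illustrative toy data, not NEUS Atlantis output:

```python
import math
from statistics import mean

# Illustrative observed vs. modeled time series
obs = [2.0, 3.5, 5.0, 4.0, 6.5]
mod = [2.4, 3.0, 5.5, 5.6, 6.0]

aae  = mean(abs(m - o) for m, o in zip(mod, obs))               # average absolute error
rmse = math.sqrt(mean((m - o) ** 2 for m, o in zip(mod, obs)))  # root mean squared error

# Modeling efficiency (Nash-Sutcliffe): 1 = perfect; <= 0 means no
# better than predicting the observed mean.
obar = mean(obs)
mef = 1 - sum((m - o) ** 2 for m, o in zip(mod, obs)) / sum((o - obar) ** 2 for o in obs)

# Spearman rank correlation = Pearson correlation of the ranks
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank + 1.0
    return r

def pearson(a, b):
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

rho = pearson(ranks(obs), ranks(mod))

assert aae < rmse              # RMSE always >= average absolute error
assert 0.6 < mef < 0.8         # decent but imperfect fit on these toy data
assert abs(rho - 0.9) < 1e-9   # ranks agree except one swapped pair
```

Using several complementary metrics, as the study does, guards against a model scoring well on error magnitude while failing to reproduce rank ordering, or vice versa.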

  12. Model of MSD Risk Assessment at Workplace

    OpenAIRE

    K. Sekulová; M. Šimon

    2015-01-01

    This article focuses on a model for assessing the risk of upper-extremity musculoskeletal disorders (MSDs) at the workplace. The model uses the risk factors that are responsible for damage to the musculoskeletal system. Based on statistical calculations, the model can define what MSD risk threatens workers who are exposed to these risk factors. The model can also estimate how the MSD risk would decrease if these risk factors were eliminated.

  13. Attention modeling for video quality assessment

    DEFF Research Database (Denmark)

    You, Junyong; Korhonen, Jari; Perkis, Andrew

    2010-01-01

    averaged spatiotemporal pooling. The local quality is derived from visual attention modeling and quality variations over frames. Saliency, motion, and contrast information are taken into account in modeling visual attention, which is then integrated into IQMs to calculate the local quality of a video frame...... average between the global quality and the local quality. Experimental results demonstrate that the combination of the global quality and local quality outperforms both sole global quality and local quality, as well as other quality models, in video quality assessment. In addition, the proposed video...... quality modeling algorithm can improve the performance of image quality metrics on video quality assessment compared to the normal averaged spatiotemporal pooling scheme....

  14. evaluation of models for assessing groundwater vulnerability

    African Journals Online (AJOL)

    DR. AMINU

    applied models for groundwater vulnerability assessment mapping. The approaches .... The overall 'pollution potential' or DRASTIC index is established by applying the formula: DRASTIC Index: ... affected by the structure of the soil surface.
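The DRASTIC index referred to in the truncated formula above is a weighted sum of seven hydrogeologic parameter ratings. A minimal sketch, using the standard DRASTIC parameter weights with hypothetical site ratings (1-10):

```python
# Standard DRASTIC parameter weights; the acronym names the seven factors.
weights = {
    "D": 5,  # Depth to water table
    "R": 4,  # net Recharge
    "A": 3,  # Aquifer media
    "S": 2,  # Soil media
    "T": 1,  # Topography (slope)
    "I": 5,  # Impact of the vadose zone
    "C": 3,  # hydraulic Conductivity
}

# Hypothetical site ratings on the usual 1-10 scale (illustrative only)
ratings = {"D": 7, "R": 6, "A": 8, "S": 6, "T": 10, "I": 8, "C": 4}

# Higher index -> higher groundwater pollution potential
drastic_index = sum(weights[p] * ratings[p] for p in weights)
print(drastic_index)  # 157 for these illustrative ratings
```

Vulnerability maps are then produced by computing this index cell by cell over a gridded study area and classifying the resulting values.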

  15. A Model for Situation and Threat Assessment

    Science.gov (United States)

    2006-12-01

    Alan Steinberg, CUBRC, Inc. (steinberg@cubrc.org), 8151 Needwood #T103, Derwood, MD 20855, United States; November 2005. A model is presented for situation and threat assessment. Objectives: advance the state-of

  16. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and finally errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment in a given situation, i.e. the results of situation assessment, provide many benefits such as HSI design solutions, human performance data, and human reliability. Unfortunately, only a few computational situation assessment models have been proposed for NPP operators, and those insufficiently embed human cognitive characteristics. Thus we proposed a new computational situation assessment model of nuclear power plant operators. The proposed model, incorporating significant cognitive factors, uses a Bayesian belief network (BBN) as its model architecture. It is believed that communication between nuclear power plant operators affects their situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. The proposed model succeeded in representing the operators' behavior, and this paper shows the details.
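The kind of belief updating a BBN-based situation assessment model performs can be sketched with a two-state example: an operator revises belief in a plant state from successive indicator readings. All state names and probabilities below are illustrative, not taken from the proposed model:

```python
# Prior belief over plant states (illustrative)
priors = {"normal": 0.95, "abnormal": 0.05}

# P(indicator reading | plant state) for two conditionally independent cues
likelihoods = {
    "high_pressure_alarm": {"normal": 0.02, "abnormal": 0.80},
    "rising_temperature":  {"normal": 0.10, "abnormal": 0.70},
}

def posterior(observations, belief):
    """Sequentially apply Bayes' rule for each observed indication."""
    for obs in observations:
        belief = {s: belief[s] * likelihoods[obs][s] for s in belief}
        z = sum(belief.values())                 # normalizing constant
        belief = {s: p / z for s, p in belief.items()}
    return belief

b = posterior(["high_pressure_alarm", "rising_temperature"], priors)
assert b["abnormal"] > 0.9   # two consistent cues overwhelm the prior
```

A full BBN extends this by wiring many such conditional dependencies (including, in the cited work, communication between operators) into one graph, but each node update follows the same Bayes-rule logic.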

  17. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standaard Karteringsmethode (Standard Mapping Method)

  18. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base

  19. Models and parameters for environmental radiological assessments

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C W [ed.

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base. (ACR)

  20. A Simple Model of Self-Assessments

    NARCIS (Netherlands)

    S. Dominguez Martinez (Silvia); O.H. Swank (Otto)

    2006-01-01

    textabstractWe develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too

  1. A simple model of self-assessment

    NARCIS (Netherlands)

    Dominguez-Martinez, S.; Swank, O.H.

    2009-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too positive and

  2. Predictions of models for environmental radiological assessment

    International Nuclear Information System (INIS)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa; Mahler, Claudio Fernando

    2011-01-01

    In the field of environmental impact assessment, models are used for estimating the source term, environmental dispersion and transfer of radionuclides, exposure pathways, radiation dose, and the risk to human beings. Although it is recognized that specific local data are important for improving the quality of dose assessment results, in practice obtaining them can be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite: the subjectivity of modelers, exposure scenarios and pathways, the codes used, and general parameters. The various models available utilize different mathematical approaches with different complexities that can result in different predictions. Thus, for the same inputs, different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. A model intercomparison exercise supplied incompatible results for 137Cs and 60Co, highlighting the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be confronted on a common basis of comparison. The results of the intercomparison exercise are presented briefly. (author)

  3. Underwater noise modelling for environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Farcas, Adrian [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom); Thompson, Paul M. [Lighthouse Field Station, Institute of Biological and Environmental Sciences, University of Aberdeen, Cromarty IV11 8YL (United Kingdom); Merchant, Nathan D., E-mail: nathan.merchant@cefas.co.uk [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom)

    2016-02-15

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.
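The core of a noise exposure prediction can be sketched as a source level reduced by a propagation loss. The example below assumes simple spherical spreading (20 log10 r) purely for illustration; the EIA noise models reviewed in the paper use site-specific propagation modelling, not this shortcut:

```python
import math

# Sketch of a received-level prediction: source level minus geometric
# spreading loss. Spherical spreading is an illustrative assumption;
# real underwater noise models account for bathymetry, seabed and
# water-column properties.
def received_level(source_level_db, range_m):
    """Received level (dB re 1 uPa) at range_m metres from the source."""
    return source_level_db - 20 * math.log10(range_m)

rl_1km = received_level(220, 1000)   # e.g. pile-driving-scale source at 1 km
rl_100m = received_level(220, 100)
assert abs(rl_1km - 160) < 1e-9      # 220 - 20*log10(1000) = 220 - 60
assert rl_100m > rl_1km              # levels rise closer to the source
```

Impact is then evaluated by comparing such predicted levels, over the affected area, against exposure criteria for the species of interest; the uncertainty discussed in the paper enters through both the propagation term and those criteria.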

  4. Underwater noise modelling for environmental impact assessment

    International Nuclear Information System (INIS)

    Farcas, Adrian; Thompson, Paul M.; Merchant, Nathan D.

    2016-01-01

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.

  5. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.

  6. The development of T3-RIA, T4-RIA and TSH-IRMA for in vitro testing of thyroid function

    International Nuclear Information System (INIS)

    Borza, V.; Neacsu, G.; Chariton, Despina

    1998-01-01

    Thyroxine (T4) and triiodothyronine (T3) are the two principal thyroid hormones; the release of these hormones and the control of different stages of their synthesis are performed by thyrotropin (TSH), secreted by the pituitary gland. In turn, T3 and T4 exert negative feedback on the pituitary, inhibiting the release of TSH. Measurements of the T3 and T4 content of unextracted serum, correlated with TSH values, are useful for investigating the pituitary-thyroid axis. This paper describes radioimmunological procedures for the measurement of T3 and T4 that use, as the method for separating bound from free radiolabeled antigen, precipitation of the antigen-antibody complex by polyethylene glycol (PEG). Antisera against T3 and T4 were produced by immunizing sheep with conjugates of the hormones and bovine albumin; T3 and T4 standards were made in horse serum free of these hormones. Binding of T3 and T4 to TBG in serum was inhibited by addition of 8-anilino-1-naphthalenesulfonic acid (ANS). The separation of the antigen-antibody complex was carried out using 25.5% PEG 6000. In order to develop a simple T3 solid-phase radioimmunoassay, the immobilization of anti-T3 antibodies on polystyrene tubes is also presented. The best results were obtained with an exposure time of the anti-T3 antibodies (diluted in buffer solution, pH 8.4-8.6) of 40 h at 4 °C. This study also describes the preparation of a 125I-labeled monoclonal antibody (Mab) anti-TSH, which will be used as a component of a TSH-IRMA kit to be realized in our department. The 125I-Mab anti-TSH has the following characteristics: specific activity = 20-24 μCi/μg and radioactive concentration ≅ 25 μCi/ml; the immunological properties of the tracer were also verified. The major result of this activity is that the total dependence on imported kits will be eliminated and costs will be reduced. (authors)

  7. Hurricane Irma's Effects on Dune and Beach Morphology at Matanzas Inlet, Atlantic Coast of North Florida: Impacts and Inhibited Recovery?

    Science.gov (United States)

    Adams, P. N.; Conlin, M. P.; Johnson, H. A.; Paniagua-Arroyave, J. F.; Woo, H. B.; Kelly, B. P.

    2017-12-01

    During energetic coastal storms, surge from low atmospheric pressure, high wave set-up, and increased wave activity contribute to significant morphologic change within the dune and upper beach environments of barrier island systems. Hurricane Irma made landfall on the southwestern portion of the Florida peninsula, as a category 4 storm on Sept 10th, 2017 and tracked northward along the axis of the Florida peninsula for two days before dissipating over the North American continent. Observations along the North Florida Atlantic coast recorded significant wave heights of nearly 7 m and water levels that exceeded predictions by 2 meters on the early morning of Sept. 11th. At Fort Matanzas National Monument, the dune and upper beach adjacent to Matanzas Inlet experienced landward retreat during the storm, diminishing the acreage of dune and scrub habitat for federally-listed endangered and threatened animal species, including the Anastasia beach mouse, gopher tortoises, and several protected shore birds. Real Time Kinematic (RTK) GPS surveys, conducted prior to the passage of the storm (Sept. 8) and immediately after the storm (Sept. 13) document dune scarp retreat >10 m in places and an average retreat of 7.8 m (+/- 5.2 m) of the 2-m beach contour, attributable to the event, within the study region. Although it is typical to see sedimentary recovery at the base of dunes within weeks following an erosive event of this magnitude, our follow up RTK surveys, two weeks (Sept. 26) and five weeks (Oct. 19) after the storm, document continued dune retreat and upper beach lowering. Subsequent local buoy observations during the offshore passage of Hurricanes Jose, Maria (Sept. 17 and 23, respectively) and several early-season Nor'easters recorded wave heights well above normal (2-3 meters) from the northeast. The lack of recovery may reveal a threshold vulnerability of the system, in which the timing of multiple moderate-to-high wave events, in the aftermath of a land falling

  8. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics that have a direct bearing on the model input process are reviewed in this paper, and reasons are given for using probability-based modeling of the inputs. The authors also present ways to model distributions for individual inputs, and multivariate input structures when dependence and other constraints may be present
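Joint modeling of dependent inputs, as discussed above, can be sketched by inducing a target correlation between two sampled inputs through a Cholesky factor of the correlation matrix. The target value and sample size are illustrative:

```python
import math
import random

# Sketch of sampling two dependent risk-model inputs: correlated
# standard normals built from independent draws via the 2x2 Cholesky
# factor of [[1, rho], [rho, 1]]. The correlation value is illustrative.
random.seed(42)        # fixed seed for a reproducible demonstration
rho = 0.7
n = 10_000

xs, ys = [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    u = z1
    v = rho * z1 + math.sqrt(1 - rho**2) * z2   # corr(u, v) = rho by construction
    xs.append(u)
    ys.append(v)

# Sample correlation should land close to the target
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
r = cov / (sx * sy)
assert 0.65 < r < 0.75
```

In an input-modeling step for a risk code, each correlated normal would then be transformed to the marginal distribution chosen for that input, preserving the dependence structure.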

  9. Bioavailability in the boris assessment model

    International Nuclear Information System (INIS)

    Norden, M.; Avila, R.; Gonze, M.A.; Tamponnet, C.

    2004-01-01

    The EU Fifth Framework project BORIS (Bioavailability Of Radionuclides In Soils: role of biological components and resulting improvement of prediction models) has three scientific objectives. The first is to improve understanding of the mechanisms governing the transfer of radionuclides to plants. The second is to improve existing predictive models of radionuclide interaction with soils by incorporating the knowledge acquired from the experimental results. The third and last objective is to extract from the experimental results a scientific basis for the development of bioremediation methods for radionuclide-contaminated soils, and to apprehend the role of additional non-radioactive pollutants in radionuclide bioavailability. This paper focuses on the second objective. The purpose of the BORIS assessment model is to describe the behaviour of radionuclides in the soil-plant system with the aim of predicting the time dynamics of the bioavailability of radionuclides in soil and the radionuclide concentrations in plants. To be useful, the assessment model should be rather simple and use only a few parameters that are commonly available, or possible to measure, for different sites. The model shall take into account, as much as possible, the results of the experimental studies and the mechanistic models developed in the BORIS project. One possible approach is to introduce in the assessment model a quantitative relationship between the bioavailability of the radionuclides in soil and the soil properties. To do this, an operational definition of bioavailability is needed. Here operational means experimentally measurable, directly or indirectly, and that the bioavailability can be translated into a mathematical expression. This paper describes the reasoning behind the chosen definition of bioavailability for the assessment model, how to derive operational expressions for the bioavailability, and how to use them in the assessment model. (author)

  10. Assessing alternative conceptual models of fracture flow

    International Nuclear Information System (INIS)

    Ho, C.K.

    1995-01-01

    The numerical code TOUGH2 was used to assess alternative conceptual models of fracture flow. The models considered included the equivalent continuum model (ECM) and the dual permeability (DK) model. A one-dimensional, layered, unsaturated domain was studied with a saturated bottom boundary and a constant infiltration at the top boundary. Two different infiltration rates were used in the studies. In addition, the connection areas between the fracture and matrix elements in the dual permeability model were varied. Results showed that the two conceptual models of fracture flow produced different saturation and velocity profiles, even under steady-state conditions. The magnitudes of the discrepancies were sensitive to two parameters that affected the flux between the fractures and matrix in the dual permeability model: (1) the fracture-matrix connection areas and (2) the capillary pressure gradients between the fracture and matrix elements

  11. Personalized pseudophakic model for refractive assessment.

    Directory of Open Access Journals (Sweden)

    Filomena J Ribeiro

    Full Text Available PURPOSE: To test a pseudophakic eye model that allows for intraocular lens power (IOL) calculation, both in normal eyes and in extreme conditions, such as post-LASIK. METHODS: PARTICIPANTS: The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing the same method to be tested in the same eye after changing only the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in an optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF), weighted according to the contrast sensitivity function (CSF), were applied. RESULTS: Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of the post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). CONCLUSIONS: Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  12. Personalized pseudophakic model for refractive assessment.

    Science.gov (United States)

    Ribeiro, Filomena J; Castanheira-Dinis, António; Dias, João M

    2012-01-01

    To test a pseudophakic eye model that allows for intraocular lens power (IOL) calculation, both in normal eyes and in extreme conditions, such as post-LASIK. The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing the same method to be tested in the same eye after changing only the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in an optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF), weighted according to the contrast sensitivity function (CSF), were applied. Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of the post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.
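The optimization criterion described (MTF values weighted by the contrast sensitivity function) can be sketched as a merit function used to rank candidate IOL powers. The frequency samples and curves below are illustrative, not the paper's data:

```python
# Sketch of a CSF-weighted MTF merit value for comparing candidate IOL
# powers in a ray-traced eye model. All numbers are illustrative.
freqs = [5, 10, 20, 30]            # spatial frequencies (cycles/degree)
csf   = [0.8, 1.0, 0.6, 0.3]       # normalized CSF weights at those frequencies
mtf_a = [0.90, 0.70, 0.40, 0.20]   # simulated MTF for candidate IOL power A
mtf_b = [0.85, 0.75, 0.50, 0.25]   # simulated MTF for candidate IOL power B

def merit(mtf):
    """CSF-weighted average of the MTF samples."""
    return sum(w * m for w, m in zip(csf, mtf)) / sum(csf)

# Choose the IOL power whose simulated optics give the higher merit
best = max([("A", merit(mtf_a)), ("B", merit(mtf_b))], key=lambda t: t[1])
assert best[0] == "B"   # B trades a little low-frequency MTF for more at mid/high
```

In the actual method the MTF for each candidate power comes from ray-tracing the personalized eye model; the merit function only supplies the ranking criterion.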

  13. Assessment of Venous Thrombosis in Animal Models.

    Science.gov (United States)

    Grover, Steven P; Evans, Colin E; Patel, Ashish S; Modarai, Bijan; Saha, Prakash; Smith, Alberto

    2016-02-01

    Deep vein thrombosis and common complications, including pulmonary embolism and post-thrombotic syndrome, represent a major source of morbidity and mortality worldwide. Experimental models of venous thrombosis have provided considerable insight into the cellular and molecular mechanisms that regulate thrombus formation and subsequent resolution. Here, we critically appraise the ex vivo and in vivo techniques used to assess venous thrombosis in these models. Particular attention is paid to imaging modalities, including magnetic resonance imaging, micro-computed tomography, and high-frequency ultrasound that facilitate longitudinal assessment of thrombus size and composition. © 2015 American Heart Association, Inc.

  14. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard

    2012-01-01

    Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore, in 2009 the European Commission initiated the development of a framework for assessing telemedicine applications, based on the users' need for information for decision making. This article presents the Model for ASsessment of Telemedicine applications (MAST) developed in this study.

  15. NILAI-NILAI PENDIDIKAN MULTIKULTURAL NOVEL DIFFERENT: KETIKA PERBEDAAN BUKAN SEBUAH PENGHALANG KARYA IRMA T. LESTARI

    Directory of Open Access Journals (Sweden)

    Setijani

    2017-08-01

    Full Text Available This study aims to describe and analyze the educational values of: (1) respecting ethnic differences; (2) respecting religious differences; (3) respecting racial differences; and (4) respecting cultural differences contained in the novel Different. The study is qualitative, using a sociology-of-literature approach. The research object is the novel Different, and the data were analyzed using the interactive model of Miles & Huberman. The results show that the multicultural educational values of the novel Different cover several points. (1) The value of respecting ethnic differences is reflected in the mutual respect between characters of Chinese ethnicity and characters of Javanese and Balinese ethnicity; they live peacefully and harmoniously from elementary school through university. (2) The value of respecting religious differences is reflected in the mutual respect among the characters in the novel, between characters and their parents, and between characters and lecturers and friends of different religions. (3) The value of respecting racial differences is reflected in the mutual respect between characters with Chinese physical features and characters with Javanese and Balinese physical features. (4) The value of respecting cultural differences is reflected in the mutual respect among a character of Chinese culture who is Confucian, a character of Javanese culture who is Muslim, and characters of Balinese culture who are Hindu and Christian.

  16. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1983-01-01

    This article reviews the forthcoming book Models and Parameters for Environmental Radiological Assessments, which presents a unified compilation of models and parameters for assessing the impact on man of radioactive discharges, both routine and accidental, into the environment. Models presented in this book include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Summaries are presented for each of the transport and dosimetry areas previously mentioned, and details are available in the literature cited. A chapter of example problems illustrates many of the methodologies presented throughout the text. Models and parameters presented are based on the results of extensive literature reviews and evaluations performed primarily by the staff of the Health and Safety Research Division of Oak Ridge National Laboratory.

  17. Conceptual models for cumulative risk assessment.

    Science.gov (United States)

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  18. Real-Time Tracking of the Extreme Rainfall of Hurricanes Harvey, Irma, and Maria using UCI CHRS's iRain System

    Science.gov (United States)

    Shearer, E. J.; Nguyen, P.; Ombadi, M.; Palacios, T.; Huynh, P.; Furman, D.; Tran, H.; Braithwaite, D.; Hsu, K. L.; Sorooshian, S.; Logan, W. S.

    2017-12-01

    During the 2017 hurricane season, three major hurricanes (Harvey, Irma, and Maria) devastated the Atlantic coast of the US and the Caribbean Islands. Harvey set the record for the rainiest storm in continental US history, Irma was the longest-lived powerful hurricane ever observed, and Maria was the costliest storm in Puerto Rican history. The recorded maximum precipitation totals for these storms were 65, 16, and 20 inches respectively. These events provided the Center for Hydrometeorology and Remote Sensing (CHRS) an opportunity to test its global real-time satellite precipitation observation system, iRain, for extreme storm events. The iRain system has been under development through a collaboration between CHRS at the University of California, Irvine (UCI) and UNESCO's International Hydrological Program (IHP). iRain provides near real-time, high-resolution (0.04°, approx. 4 km), global (60°N - 60°S) satellite precipitation data estimated by the PERSIANN-Cloud Classification System (PERSIANN-CCS) algorithm developed by scientists at CHRS. The user-interactive and web-accessible iRain system allows users to visualize and download real-time global satellite precipitation estimates and to track the development and path of the current 50 largest storms globally from data generated by the PERSIANN-CCS algorithm. iRain continues to prove an effective tool for measuring real-time precipitation amounts of extreme storms, especially in locations that lack extensive rain gauge or radar coverage, including large portions of the world's oceans and continents such as Africa and Asia. CHRS also created a mobile app version of the system named "iRain UCI", available for iOS and Android devices. During these storms, real-time rainfall data generated by PERSIANN-CCS was consistently comparable to radar and rain gauge data. This presentation evaluates iRain's efficiency as a tool for extreme precipitation monitoring.

  19. PRACTICAL APPLICATION OF A MODEL FOR ASSESSING

    Directory of Open Access Journals (Sweden)

    Petr NOVOTNÝ

    2015-12-01

    Full Text Available Rail transport is an important sub-sector of transport infrastructure. Disruption of its operation due to emergencies can reduce the functional parameters of provided services, with consequent impacts on society. Identifying the critical elements of this system enables its timely and effective protection. On that basis, the article presents a draft model for assessing the criticality of railway infrastructure elements. The model uses a systems approach and a multicriteria semi-quantitative analysis with weighted criteria to calculate the criticality of individual elements of the railway infrastructure. The article concludes with a practical application of the proposed model, including a discussion of the results.

  20. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology can be further improved towards utilization in more complex architectures and in other application domains, such as the nuclear field. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  1. Radionuclide transport and dose assessment modelling in biosphere assessment 2009

    International Nuclear Information System (INIS)

    Hjerpe, T.; Broed, R.

    2010-11-01

    Following the guidelines set forth by the Ministry of Trade and Industry (now Ministry of Employment and the Economy), Posiva is preparing to submit a construction license application for the final disposal of spent nuclear fuel at the Olkiluoto site, Finland, by the end of the year 2012. Disposal will take place in a geological repository implemented according to the KBS-3 method. The long-term safety section supporting the license application will be based on a safety case that, according to the internationally adopted definition, will be a compilation of the evidence, analyses and arguments that quantify and substantiate the safety and the level of expert confidence in the safety of the planned repository. This report documents in detail the conceptual and mathematical models and key data used in the landscape model set-up, radionuclide transport modelling, and radiological consequences analysis applied in the 2009 biosphere assessment. The resulting environmental activity concentrations in the landscape model due to constant unit geosphere release rates, and the corresponding annual doses, are also calculated and presented in this report. This provides the basis for understanding the behaviour of the applied landscape model and the subsequent dose calculations. (orig.)

  2. Review and assessment of pool scrubbing models

    International Nuclear Information System (INIS)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez, J.

    1996-01-01

    Decontamination of fission products bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both modelling and experimental point of view. In addition, RHR and SGTR sequences were simulated with SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas susceptible of further research have been identified. (Author) 73 refs

  3. Review and assessment of pool scrubbing models

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez, J.

    1996-07-01

    Decontamination of fission products bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both modelling and experimental point of view. In addition, RHR and SGTR sequences were simulated with SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas susceptible of further research have been identified. (Author) 73 refs.

  4. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes.
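
A minimal sketch of two of the techniques named above, combined: Latin hypercube samples of statistically independent, lognormally distributed inputs to a multiplicative chain model. All parameter values are invented for illustration; the built-in sanity check uses the fact that the log-median of a product of independent lognormals equals the sum of the input log-medians.

```python
import math
import random
from statistics import NormalDist

def lhs_lognormal(n, mu, sigma, rng):
    # Latin hypercube: one uniform draw per equal-probability stratum,
    # then shuffle the strata, then map through the lognormal inverse CDF
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    nd = NormalDist(mu, sigma)
    return [math.exp(nd.inv_cdf(p)) for p in u]

rng = random.Random(42)
n = 1000
# hypothetical transfer factors of a three-link multiplicative chain
p1 = lhs_lognormal(n, math.log(1e-3), 0.5, rng)
p2 = lhs_lognormal(n, math.log(2.0), 0.3, rng)
p3 = lhs_lognormal(n, math.log(5e-2), 0.4, rng)
output = [a * b * c for a, b, c in zip(p1, p2, p3)]

# analytic check: product of independent lognormals is lognormal,
# with log-median equal to the sum of the input log-medians
log_median_analytic = math.log(1e-3) + math.log(2.0) + math.log(5e-2)
log_median_sampled = math.log(sorted(output)[n // 2])
```

Because each input is stratified, the sample covers the tails of every distribution far more evenly than plain Monte Carlo at the same sample size, which is why LHS is attractive for sensitivity analyses.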

  5. Models for assessing and managing credit risk

    Directory of Open Access Journals (Sweden)

    Neogradi Slađana

    2014-01-01

    Full Text Available This essay deals with the definition of a model for assessing and managing credit risk. Risk is an inseparable component of any ordinary credit transaction. The essay looks at different aspects of the identification and classification of risk in the banking industry, as well as the key components of modern risk management. The first part analyzes the impact of credit risk on banks and empirical models for detecting the financial difficulties a company may face; on the basis of these models, a bank can reduce the number of risky assets it approves. The second part considers models for managing credit risk, with emphasis on Basel I, II and III, and the third part concludes with the model that is most appropriate and most effective for measuring credit risk in domestic banks.

  6. Integrated assessment models of global climate change

    International Nuclear Information System (INIS)

    Parson, E.A.; Fisher-Vanden, K.

    1997-01-01

    The authors review recent work in the integrated assessment modeling of global climate change. This field has grown rapidly since 1990. Integrated assessment models seek to combine knowledge from multiple disciplines in formal integrated representations; inform policy-making, structure knowledge, and prioritize key uncertainties; and advance knowledge of broad system linkages and feedbacks, particularly between socio-economic and bio-physical processes. They may combine simplified representations of the socio-economic determinants of greenhouse gas emissions, the atmosphere and oceans, impacts on human activities and ecosystems, and potential policies and responses. The authors summarize current projects, grouping them according to whether they emphasize the dynamics of emissions control and optimal policy-making, uncertainty, or spatial detail. They review the few significant insights that have been claimed from work to date and identify important challenges for integrated assessment modeling in its relationships to disciplinary knowledge and to broader assessment seeking to inform policy- and decision-making. 192 refs., 2 figs

  7. Revolutionary introduction of RIA/IRMA methodology in medical diagnostics: a study employing the technique for hyperprolactinemia and its correlation with hypothyroidism

    International Nuclear Information System (INIS)

    Tasneem, A.

    2011-01-01

    The aim of the study was to determine the incidence of hyperprolactinemia, its underlying causes and consequences, and to study its correlation with hypothyroidism. The study was carried out on 1365 male and female subjects referred to the Centre for Nuclear Medicine, Lahore, for hormonal estimation. Serum prolactin and thyroid stimulating hormone (TSH) levels were measured using IRMA kits. The prevalence of hyperprolactinemia turned out to be 4.90%. Menstrual irregularity appeared as a major consequence. The incidence rate was highest in the age range of 21-27 years. Hypothyroidism in hyperprolactinemic subjects was observed to be 22.7%. (i) Immunoradiometric assay is a microanalytical technique which can measure very minute amounts of antigen in serum. (ii) The prevalence of hypothyroidism among hyperprolactinemic subjects in our population is high enough to warrant estimating thyroid hormone levels in hyperprolactinemic patients. (author)

  8. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating that the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the level of model abstraction that is required is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best estimate models is examined. It is concluded that a conservative approach for repository performance, based on a limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision.

  9. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  10. Nuclear security assessment with Markov model approach

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Terao, Norichika

    2013-01-01

    Nuclear security risk assessment with the Markov model based on random event is performed to explore evaluation methodology for physical protection in nuclear facilities. Because the security incidences are initiated by malicious and intentional acts, expert judgment and Bayes updating are used to estimate scenario and initiation likelihood, and it is assumed that the Markov model derived from stochastic process can be applied to incidence sequence. Both an unauthorized intrusion as Design Based Threat (DBT) and a stand-off attack as beyond-DBT are assumed to hypothetical facilities, and performance of physical protection and mitigation and minimization of consequence are investigated to develop the assessment methodology in a semi-quantitative manner. It is shown that cooperation between facility operator and security authority is important to respond to the beyond-DBT incidence. (author)
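
The incidence-sequence idea described above can be sketched as a small absorbing Markov chain. The states, transition probabilities, and detection values below are invented for illustration, not taken from the paper; iterating the chain until the probability mass is absorbed yields the likelihood that an intrusion is neutralized versus completed.

```python
# states: 0 = intrusion neutralized (absorbing), 1 = adversary at perimeter,
#         2 = adversary at target,               3 = sabotage completed (absorbing)
P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.6, 0.0, 0.4, 0.0],   # detected at the perimeter with prob 0.6, else advances
    [0.5, 0.0, 0.0, 0.5],   # detected at the target with prob 0.5, else succeeds
    [0.0, 0.0, 0.0, 1.0],
]

def step(dist, P):
    # one transition of the chain: new_j = sum_i dist_i * P[i][j]
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [0.0, 1.0, 0.0, 0.0]   # the incidence starts at the perimeter
for _ in range(50):           # long enough for full absorption
    dist = step(dist, P)
```

With these numbers the sabotage probability is 0.4 * 0.5 = 0.2; strengthening detection at the perimeter (state 1) reduces it directly, which is the kind of semi-quantitative comparison the abstract describes.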

  11. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage for each land-use class. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy), SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show that non-calibrated SDC values overestimate asset damage by up to a factor of 4.5 for the tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than a half of the amount predicted by the standard SDC methods.
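
The depth-damage relation the abstract describes can be sketched as a piecewise-linear curve per land-use class. The curve points and land-use classes below are illustrative placeholders, not the calibrated Italian values from the study:

```python
# water depth (m) -> damage fraction, per land-use class (illustrative numbers)
SDC = {
    "residential": [(0.0, 0.00), (0.5, 0.25), (1.0, 0.40), (2.0, 0.60), (4.0, 0.85)],
    "industrial":  [(0.0, 0.00), (0.5, 0.15), (1.0, 0.30), (2.0, 0.55), (4.0, 0.80)],
}

def damage_fraction(land_use, depth):
    """Linear interpolation on the stage-damage curve; clamps at both ends."""
    curve = SDC[land_use]
    if depth <= curve[0][0]:
        return curve[0][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if depth <= d1:
            return f0 + (f1 - f0) * (depth - d0) / (d1 - d0)
    return curve[-1][1]

def asset_damage(land_use, depth, exposed_value):
    # potential economic damage = damage fraction x exposed asset value
    return damage_fraction(land_use, depth) * exposed_value
```

Calibration, in this picture, means refitting the curve points against ex-post compensation records; transferring curves from another country changes every interpolated fraction, which is how the factor-of-4.5 overestimation the abstract reports can arise.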

  12. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
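
As a hedged illustration of the log-normal cornerstone described above (not the report's actual estimator), expert frequency estimates can be pooled as a weighted geometric mean, i.e. averaged in log-space, with the log-space spread serving as a dispersion measure:

```python
import math

def pool_lognormal(estimates, weights=None):
    """Pool expert estimates of an event frequency as a weighted geometric
    mean (averaging in log-space, consistent with a log-normal error model).
    Returns (pooled estimate, log-space dispersion of the judgements)."""
    if weights is None:
        weights = [1.0] * len(estimates)
    wsum = sum(weights)
    log_mean = sum(w * math.log(x) for w, x in zip(weights, estimates)) / wsum
    log_var = sum(w * (math.log(x) - log_mean) ** 2
                  for w, x in zip(weights, estimates)) / wsum
    return math.exp(log_mean), math.sqrt(log_var)
```

The weights are where calibration enters: a classical approach would downweight experts with known bias or poor calibration, while a Bayesian approach would instead fold the same judgements into a posterior over the unknown frequency.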

  13. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed

  14. Review and assessment of pool scrubbing models

    International Nuclear Information System (INIS)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez-Jimenez, J.

    1996-01-01

    Decontamination of fission products bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both modelling and experimental point of view. In addition, RHR and SGTR sequences were simulated with SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas susceptible of further research have been identified. (Author) 73 refs

  15. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and that the model is fit for its intended purpose. In other words: does the model predict transport in fractured rock adequately enough to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity.

  16. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of low-level waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example.
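
A toy version of such a probabilistic model, in the spirit of the abstract's fictitious example: every parameter, distribution, and the performance objective below are invented for illustration. Each Monte Carlo realization draws uncertain inputs, propagates them through a simple release-dilution-intake-dose chain, and the resulting output distribution is summarized for comparison against an objective.

```python
import math
import random

def annual_dose(rng):
    # one realization of a hypothetical release -> concentration -> dose chain
    release = rng.lognormvariate(math.log(1.0), 0.5)    # Bq/yr (illustrative)
    dilution = rng.lognormvariate(math.log(1e-6), 0.7)  # (Bq/m3) per (Bq/yr)
    intake = rng.uniform(500, 900)                      # m3/yr of contaminated medium
    dcf = 1e-8                                          # Sv/Bq dose conversion factor
    return release * dilution * intake * dcf

rng = random.Random(7)
doses = sorted(annual_dose(rng) for _ in range(5000))
median = doses[len(doses) // 2]
p95 = doses[int(0.95 * len(doses))]
objective = 1e-4  # Sv/yr, a stand-in regulatory performance objective
```

Reporting the whole distribution (median, 95th percentile, probability of exceeding the objective) rather than a single number is exactly what distinguishes this probabilistic style of PA from a deterministic analysis.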

  17. Triangular model integrating clinical teaching and assessment.

    Science.gov (United States)

    Abdelaziz, Adel; Koshak, Emad

    2014-01-01

    Structuring clinical teaching is a challenge facing medical education curriculum designers. A variety of instructional methods on different domains of learning are indicated to accommodate different learning styles. Conventional methods of clinical teaching, like training in ambulatory care settings, are prone to the factor of coincidence in having varieties of patient presentations. Accordingly, alternative methods of instruction are indicated to compensate for the deficiencies of these conventional methods. This paper presents an initiative that can be used to design a checklist as a blueprint to guide appropriate selection and implementation of teaching/learning and assessment methods in each of the educational courses and modules based on educational objectives. Three categories of instructional methods were identified, and within each a variety of methods were included. These categories are classroom-type settings, health services-based settings, and community service-based settings. Such categories have framed our triangular model of clinical teaching and assessment.

  18. The MESORAD dose assessment model: Computer code

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Bander, T.J.; Scherpelz, R.I.

    1988-10-01

    MESORAD is a dose equivalent model for emergency response applications that is designed to be run on minicomputers. It has been developed by the Pacific Northwest Laboratory for use as part of the Intermediate Dose Assessment System in the US Nuclear Regulatory Commission Operations Center in Washington, DC, and the Emergency Management System in the US Department of Energy Unified Dose Assessment Center in Richland, Washington. This volume describes the MESORAD computer code and contains a listing of the code. The technical basis for MESORAD is described in the first volume of this report (Scherpelz et al. 1986). A third volume of the documentation is planned. That volume will contain utility programs and input and output files that can be used to check the implementation of MESORAD. 18 figs., 4 tabs

  19. Assessing the present and future probability of Hurricane Harvey’s rainfall

    OpenAIRE

    Emanuel, Kerry

    2017-01-01

    Significance Natural disasters such as the recent Hurricanes Harvey, Irma, and Maria highlight the need for quantitative estimates of the risk of such disasters. Statistically based risk assessment suffers from short records of often poor quality, and in the case of meteorological hazards, from the fact that the underlying climate is changing. This study shows how a recently developed physics-based risk assessment method can be applied to assessing the probabilities of extreme hurricane rainf...

  20. Triangular model integrating clinical teaching and assessment

    Directory of Open Access Journals (Sweden)

    Abdelaziz A

    2014-03-01

    Full Text Available Adel Abdelaziz,1,2 Emad Koshak3 1Medical Education Development Unit, Faculty of Medicine, Al Baha University, Al Baha, Saudi Arabia; 2Medical Education Department, Faculty of Medicine, Suez Canal University, Egypt; 3Dean and Internal Medicine Department, Faculty of Medicine, Al Baha University, Al Baha, Saudi Arabia Abstract: Structuring clinical teaching is a challenge facing medical education curriculum designers. A variety of instructional methods on different domains of learning are indicated to accommodate different learning styles. Conventional methods of clinical teaching, like training in ambulatory care settings, are prone to the factor of coincidence in having varieties of patient presentations. Accordingly, alternative methods of instruction are indicated to compensate for the deficiencies of these conventional methods. This paper presents an initiative that can be used to design a checklist as a blueprint to guide appropriate selection and implementation of teaching/learning and assessment methods in each of the educational courses and modules based on educational objectives. Three categories of instructional methods were identified, and within each a variety of methods were included. These categories are classroom-type settings, health services-based settings, and community service-based settings. Such categories have framed our triangular model of clinical teaching and assessment. Keywords: curriculum development, teaching, learning, assessment, apprenticeship, community-based settings, health service-based settings

  1. Assessing elders using the functional health pattern assessment model.

    Science.gov (United States)

    Beyea, S; Matzo, M

    1989-01-01

    The impact of older Americans on the health care system requires we increase our students' awareness of their unique needs. The authors discuss strategies to develop skills using Gordon's Functional Health Patterns Assessment for assessing older clients.

  2. Mapping the Extent and Magnitude of Severe Flooding Induced by Hurricanes Harvey, Irma, and Maria with Sentinel-1 SAR and InSAR Observations

    Science.gov (United States)

    Zhang, B.; Koirala, R.; Oliver-Cabrera, T.; Wdowinski, S.; Osmanoglu, B.

    2017-12-01

    Hurricanes can cause winds, rainfall and storm surge, all of which could result in flooding. Between August and September 2017, Hurricanes Harvey, Irma and Maria made landfall over Texas, Florida and Puerto Rico, causing destruction and damages. Flood mapping is important for water management and to estimate risks and property damage. Though water gauges are able to monitor water levels, they are normally distributed sparsely. To map flooding products of these extreme events, we use Synthetic Aperture Radar (SAR) observations acquired by the European satellite constellation Sentinel-1. We obtained two acquisitions from before each flooding event, a single acquisition during the hurricane, and two after each event, a total of five acquisitions. We use amplitude observations to map the extent of flooding and phase observations to map its magnitude. To map flooding extents, we use amplitude images from before, after and, if possible, during the hurricane pass. A calibration is used to convert the image raw data to backscatter coefficient, termed sigma nought. We generate a composite of the two image layers using red and green bands to show the change of sigma nought between acquisitions, which directly reflects the extent of flooding. Because inundation can result in either an increase or decrease of sigma nought values depending on the surface scattering characteristics, we map flooded areas in locations where sigma nought changes were above a detection threshold. To study the magnitude of flooding we study Interferometric Synthetic Aperture Radar (InSAR) phase changes. Changes in the water level can be detected when the radar signal reflects off the water surface and bounces again off another object (e.g., trees and/or buildings), known as the double-bounce phase. To generate meaningful interferograms, we compare phase information with the nearest water gauge records to verify our results.
Preliminary results show that the three hurricanes caused flooding conditions over
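
    The amplitude part of the workflow — compare calibrated sigma nought between pre- and post-event acquisitions and flag pixels whose change exceeds a detection threshold in either direction — can be sketched as below. The 3 dB threshold and the toy pixel values are assumptions for illustration, not the authors' settings:

    ```python
    import math

    def db(sigma0):
        """Convert a linear backscatter coefficient (sigma nought) to decibels."""
        return 10.0 * math.log10(sigma0)

    def flood_mask(pre, post, threshold_db=3.0):
        """Flag pixels whose sigma-nought change between pre- and post-event
        acquisitions exceeds the threshold in either direction: open water
        usually darkens (specular reflection away from the sensor), while
        flooded vegetation or buildings can brighten (double bounce)."""
        return [[abs(db(b) - db(a)) > threshold_db
                 for a, b in zip(row_pre, row_post)]
                for row_pre, row_post in zip(pre, post)]

    # Toy 1x4 scene (linear sigma nought): pixel 2 darkens, pixel 3 brightens.
    pre  = [[0.10, 0.10, 0.10, 0.10]]
    post = [[0.10, 0.09, 0.01, 0.60]]
    print(flood_mask(pre, post))  # [[False, False, True, True]]
    ```

    A red/green composite of the two calibrated layers, as the abstract describes, is the visual counterpart of this per-pixel difference.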

  3. Hurricanes Harvey and Irma - High-Resolution Flood Mapping and Monitoring from Sentinel SAR with the Depolarization Reduction Algorithm for Global Observations of InundatioN (DRAGON)

    Science.gov (United States)

    Nghiem, S. V.; Brakenridge, G. R.; Nguyen, D. T.

    2017-12-01

    Hurricane Harvey inflicted historical catastrophic flooding across extensive regions around Houston and southeast Texas after making landfall on 25 August 2017. The Federal Emergency Management Agency (FEMA) requested urgent support for flood mapping and monitoring in an emergency response to the extreme flood situation. An innovative satellite remote sensing method, called the Depolarization Reduction Algorithm for Global Observations of inundatioN (DRAGON), has been developed and implemented for use with Sentinel synthetic aperture radar (SAR) satellite data at a resolution of 10 meters to identify, map, and monitor inundation including pre-existing water bodies and newly flooded areas. Results from this new method are hydrologically consistent and have been verified with known surface waters (e.g., coastal ocean, rivers, lakes, reservoirs, etc.), with clear-sky high-resolution WorldView images (where waves can be seen on surface water in inundated areas within a small spatial coverage), and with other flood maps from the consortium of Global Flood Partnership derived from multiple satellite datasets (including clear-sky Landsat and MODIS at lower resolutions). Figure 1 is a high-resolution (4K UHD) image of a composite inundation map for the region around Rosharon (in Brazoria County, south of Houston, Texas). This composite inundation map reveals extensive flooding on 29 August 2017 (four days after Hurricane Harvey made landfall), and the inundation was still persistent in most of the west and south of Rosharon one week later (5 September 2017) while flooding was reduced in the east of Rosharon. Hurricane Irma brought flooding to a number of areas in Florida. As of 10 September 2017, Sentinel SAR flood maps reveal inundation in the Florida Panhandle and over lowland surfaces on several islands in the Florida Keys. However, Sentinel SAR results indicate that flooding along the Florida coast was not extreme even though Irma was a Category-5 hurricane that might

  4. An Exploratory Study: Assessment of Modeled Dioxin ...

    Science.gov (United States)

    EPA has released an external review draft entitled, An Exploratory Study: Assessment of Modeled Dioxin Exposure in Ceramic Art Studios (External Review Draft). The public comment period and the external peer-review workshop are separate processes that provide opportunities for all interested parties to comment on the document. In addition to consideration by EPA, all public comments submitted in accordance with this notice will also be forwarded to EPA’s contractor for the external peer-review panel prior to the workshop. EPA has released this draft document solely for the purpose of pre-dissemination peer review under applicable information quality guidelines. This document has not been formally disseminated by EPA. It does not represent and should not be construed to represent any Agency policy or determination. The purpose of this report is to describe an exploratory investigation of potential dioxin exposures to artists/hobbyists who use ball clay to make pottery and related products.

  5. Modelling saline intrusion for repository performance assessment

    International Nuclear Information System (INIS)

    Jackson, C.P.

    1989-04-01

    UK Nirex Ltd are currently considering the possibility of disposal of radioactive waste by burial in deep underground repositories. The natural pathway for radionuclides from such a repository to return to Man's immediate environment (the biosphere) is via groundwater. Thus analyses of the groundwater flow in the neighbourhood of a possible repository, and consequent radionuclide transport, form an important part of a performance assessment for a repository. Some of the areas in the UK that might be considered as possible locations for a repository are near the coast. If a repository is located in a coastal region, seawater may intrude into the groundwater flow system. As seawater is denser than fresh water, buoyancy forces acting on the intruding saline water may have significant effects on the groundwater flow system, and consequently on the time for radionuclides to return to the biosphere. Further, the chemistry of the repository near-field may be strongly influenced by the salinity of the groundwater. It is therefore important for Nirex to have a capability for reliably modelling saline intrusion to an appropriate degree of accuracy in order to make performance assessments for a repository in a coastal region. This report describes work undertaken in the Nirex Research programme to provide such a capability. (author)

  6. Accuracy Assessment of Different Digital Surface Models

    Directory of Open Access Journals (Sweden)

    Ugur Alganci

    2018-03-01

    Full Text Available Digital elevation models (DEMs), which can occur in the form of digital surface models (DSMs) or digital terrain models (DTMs), are widely used as important geospatial information sources for various remote sensing applications, including the precise orthorectification of high-resolution satellite images, 3D spatial analyses, multi-criteria decision support systems, and deformation monitoring. The accuracy of DEMs has direct impacts on specific calculations and process chains; therefore, it is important to select the most appropriate DEM by considering the aim, accuracy requirement, and scale of each study. In this research, DSMs obtained from a variety of satellite sensors were compared to analyze their accuracy and performance. For this purpose, freely available Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) 30 m, Shuttle Radar Topography Mission (SRTM) 30 m, and Advanced Land Observing Satellite (ALOS) 30 m resolution DSM data were obtained. Additionally, 3 m and 1 m resolution DSMs were produced from tri-stereo images from the SPOT 6 and Pleiades high-resolution (PHR) 1A satellites, respectively. Elevation reference data provided by the General Command of Mapping, the national mapping agency of Turkey—produced from 30 cm spatial resolution stereo aerial photos, with a 5 m grid spacing and ±3 m or better overall vertical accuracy at the 90% confidence interval (CI)—were used to perform accuracy assessments. Gross errors and water surfaces were removed from the reference DSM. The relative accuracies of the different DSMs were tested using a different number of checkpoints determined by different methods. In the first method, 25 checkpoints were selected from bare lands to evaluate the accuracies of the DSMs on terrain surfaces. In the second method, 1000 randomly selected checkpoints were used to evaluate the methods’ accuracies for the whole study area. In addition to the control point approach, vertical cross
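
    A checkpoint-based vertical accuracy assessment of this kind boils down to differencing DSM heights against reference heights and summarizing bias, RMSE, and a 90%-confidence linear error. A minimal sketch with made-up checkpoint values (the 1.6449 factor assumes normally distributed errors; it is not a figure from the paper):

    ```python
    import math
    import statistics

    def vertical_accuracy(dsm_heights, ref_heights):
        """Vertical accuracy at checkpoints: mean error (bias), RMSE, and LE90
        (linear error at 90% confidence, approx. 1.6449 * sigma for normally
        distributed errors). Heights in metres."""
        errors = [d - r for d, r in zip(dsm_heights, ref_heights)]
        bias = statistics.fmean(errors)
        rmse = math.sqrt(statistics.fmean(e * e for e in errors))
        le90 = 1.6449 * statistics.pstdev(errors)
        return bias, rmse, le90

    # Toy checkpoints: DSM heights vs. a flat 100 m reference surface.
    dsm = [101.2, 99.6, 100.4, 98.8, 100.3]
    ref = [100.0] * 5
    bias, rmse, le90 = vertical_accuracy(dsm, ref)
    print(f"bias = {bias:.2f} m, RMSE = {rmse:.2f} m, LE90 = {le90:.2f} m")
    ```

    Note that RMSE² = bias² + σ², so RMSE alone can hide a systematic offset that the bias term makes explicit.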

  7. Hierarchical Model of Assessing and Selecting Experts

    Science.gov (United States)

    Chernysheva, T. Y.; Korchuganova, M. A.; Borisov, V. V.; Min'kov, S. L.

    2016-04-01

    Revealing experts’ competences is a multi-objective issue. The authors consider methods for assessing the competence of experts, treated as objects of assessment, together with quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, based on paired-comparison matrices and scores; quality parameters are taken into account as well. A worked calculation and assessment of experts is given as an example.
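
    The analytic hierarchy process step — deriving priority weights from a reciprocal paired-comparison matrix — can be sketched with power iteration on the principal eigenvector. The example judgments below are hypothetical, not values from the paper:

    ```python
    def ahp_weights(pairwise, iters=100):
        """Priority weights from a reciprocal pairwise-comparison matrix via
        power iteration toward the principal eigenvector."""
        n = len(pairwise)
        w = [1.0 / n] * n
        for _ in range(iters):
            w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w)
            w = [x / s for x in w]
        return w

    def consistency_ratio(pairwise, w):
        """Saaty consistency ratio CR = CI / RI; CR < 0.1 is usually acceptable."""
        n = len(pairwise)
        lam = sum(sum(pairwise[i][j] * w[j] for j in range(n)) / w[i]
                  for i in range(n)) / n
        ci = (lam - n) / (n - 1)
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty random index
        return ci / ri

    # Hypothetical: three experts compared on one criterion
    # (A vs B = 3, A vs C = 5, B vs C = 2 on Saaty's 1-9 scale).
    m = [[1, 3, 5],
         [1 / 3, 1, 2],
         [1 / 5, 1 / 2, 1]]
    w = ahp_weights(m)
    print([round(x, 3) for x in w], round(consistency_ratio(m, w), 3))
    ```

    For this matrix the weights come out roughly 0.65 / 0.23 / 0.12, with a consistency ratio well under 0.1.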

  8. Hierarchical Model of Assessing and Selecting Experts

    OpenAIRE

    Chernysheva, Tatiana Yurievna; Korchuganova, Mariya Anatolievna; Borisov, V. V.; Minkov, S. L.

    2016-01-01

    Revealing experts' competences is a multi-objective issue. The authors consider methods for assessing the competence of experts, treated as objects of assessment, together with quality criteria. An analytic hierarchy process for assessing and ranking experts is offered, based on paired-comparison matrices and scores; quality parameters are taken into account as well. A worked calculation and assessment of experts is given as an example.

  9. Critical assessment of nuclear mass models

    International Nuclear Information System (INIS)

    Moeller, P.; Nix, J.R.

    1992-01-01

    Some of the physical assumptions underlying various nuclear mass models are discussed. The ability of different mass models to predict new masses that were not taken into account when the models were formulated and their parameters determined is analyzed. The models are also compared with respect to their ability to describe nuclear-structure properties in general. The analysis suggests future directions for mass-model development

  10. The Development of a Secondary School Health Assessment Model

    Science.gov (United States)

    Sriring, Srinual; Erawan, Prawit; Sriwarom, Monoon

    2015-01-01

    The objective of this research was to: 1) survey information relating to secondary school health; 2) construct a model of health assessment and a handbook for using the model in secondary schools; and 3) develop an assessment model for secondary schools. The research included 3 phases. (1) involved a survey of…

  11. FORMATIVE ASSESSMENT MODEL OF LEARNING SUCCESS ACHIEVEMENTS

    Directory of Open Access Journals (Sweden)

    Mikhailova Elena Konstantinovna

    2013-05-01

    Full Text Available The paper is devoted to the problem of assessment of the school students’ learning success achievements. The problem is investigated from the viewpoint of assessing the students’ learning outcomes that is aimed to ensure the teachers and students with the means and conditions to improve the educational process and results.

  12. Assessing Asset Pricing Models Using Revealed Preference

    OpenAIRE

    Jonathan B. Berk; Jules H. van Binsbergen

    2014-01-01

    We propose a new method of testing asset pricing models that relies on using quantities rather than prices or returns. We use the capital flows into and out of mutual funds to infer which risk model investors use. We derive a simple test statistic that allows us to infer, from a set of candidate models, the model that is closest to the model that investors use in making their capital allocation decisions. Using this methodology, we find that of the models most commonly used in the literature,...

  13. Assessing NARCCAP climate model effects using spatial confidence regions

    Directory of Open Access Journals (Sweden)

    J. P. French

    2017-07-01

    Full Text Available We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.
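
    The pointwise-versus-simultaneous distinction the abstract stresses can be illustrated with the simplest multiple-comparisons adjustment. The paper builds purpose-built simultaneous confidence regions; the Bonferroni correction below is only a stand-in to show why simultaneous intervals over a grid of m cells must be wider than pointwise ones:

    ```python
    from statistics import NormalDist

    def two_sided_z(conf):
        """z value for a two-sided normal interval at the given confidence level."""
        return NormalDist().inv_cdf(1.0 - (1.0 - conf) / 2.0)

    def halfwidths(std_errors, conf=0.95):
        """Pointwise vs. Bonferroni-adjusted simultaneous interval half-widths
        for m grid cells; the adjusted intervals control the family-wise
        error rate across all cells jointly."""
        m = len(std_errors)
        z_pt = two_sided_z(conf)
        z_sim = two_sided_z(1.0 - (1.0 - conf) / m)  # alpha split across m cells
        return [(z_pt * se, z_sim * se) for se in std_errors]

    # 100 grid cells with unit standard error.
    pt, sim = halfwidths([1.0] * 100)[0]
    print(f"pointwise half-width = {pt:.3f}, simultaneous = {sim:.3f}")
    ```

    With 100 cells the pointwise half-width is about 1.96 standard errors while the simultaneous one is about 3.48, which is exactly why pointwise maps overstate where effects "differ."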

  14. A comparison of models for risk assessment

    International Nuclear Information System (INIS)

    Kellerer, A.M.; Jing Chen

    1993-01-01

    Various mathematical models have been used to represent the dependence of excess cancer risk on dose, age and time since exposure. For solid cancers, i.e. all cancers except leukaemia, the so-called relative risk model is usually employed. However, there can be quite different relative risk models. The most usual model for the quantification of excess tumour rate among the atomic bomb survivors has been a dependence of the relative risk on age at exposure, but it has been shown recently that an age-attained model can be equally applied to represent the observations among the atomic bomb survivors. The differences between the models and their implications are explained. It is also shown that the age-attained model is similar to the approaches that have been used in the analysis of lung cancer incidence among radon-exposed miners. A more unified approach to modelling of radiation risks can thus be achieved. (3 figs.)
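
    The two relative-risk parameterizations contrasted in the abstract can be written side by side. The functional forms below follow the generic excess-relative-risk convention of the radioepidemiology literature; the symbols β, γ, η are illustrative, not coefficients from this paper:

    ```latex
    % Baseline cancer rate \lambda_0(a) at attained age a, dose D.
    % Age-at-exposure model: the excess relative risk is modified by age at exposure e:
    \lambda(a, e, D) = \lambda_0(a)\,\bigl[\,1 + \beta D\, e^{\gamma e}\,\bigr]
    % Attained-age model: the excess relative risk varies with attained age a instead
    % (a_0 is a reference age):
    \lambda(a, D) = \lambda_0(a)\,\bigl[\,1 + \beta D\,(a/a_0)^{\eta}\,\bigr]
    ```

    The unification the abstract points to is that the second form, with a power-law decline in attained age, resembles the models used for radon-exposed miners.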

  15. Assessment of multi class kinematic wave models

    NARCIS (Netherlands)

    Van Wageningen-Kessels, F.L.M.; Van Lint, J.W.C.; Vuik, C.; Hoogendoorn, S.P.

    2012-01-01

    In the last decade many multi class kinematic wave (MCKW) traffic flow models have been proposed. MCKW models introduce heterogeneity among vehicles and drivers. For example, they take into account differences in (maximum) velocities and driving style. Nevertheless, the models are macroscopic and the

  16. Theoretical Models, Assessment Frameworks and Test Construction.

    Science.gov (United States)

    Chalhoub-Deville, Micheline

    1997-01-01

    Reviews the usefulness of proficiency models influencing second language testing. Findings indicate that several factors contribute to the lack of congruence between models and test construction and make a case for distinguishing between theoretical models. Underscores the significance of an empirical, contextualized and structured approach to the…

  17. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem of assessing information security risks in ERP-systems. ERP-system functions and architecture are studied. A model of malicious impacts on the levels of the ERP-system architecture is composed. A model-based risk assessment, combining quantitative and qualitative approaches, is developed from a partial unification of three methods for studying information security risks: security models with full overlapping, the CRAMM technique, and the FRAP technique.

  18. Conceptual Models and Guidelines for Clinical Assessment of Financial Capacity.

    Science.gov (United States)

    Marson, Daniel

    2016-09-01

    The ability to manage financial affairs is a life skill of critical importance, and neuropsychologists are increasingly asked to assess financial capacity across a variety of settings. Sound clinical assessment of financial capacity requires knowledge and appreciation of applicable clinical conceptual models and principles. However, the literature has presented relatively little conceptual guidance for clinicians concerning financial capacity and its assessment. This article seeks to address this gap. The article presents six clinical models of financial capacity: (1) the early gerontological IADL model of Lawton, (2) the clinical skills model and (3) related cognitive psychological model developed by Marson and colleagues, (4) a financial decision-making model adapting earlier decisional capacity work of Appelbaum and Grisso, (5) a person-centered model of financial decision-making developed by Lichtenberg and colleagues, and (6) a recent model of financial capacity in the real world developed through the Institute of Medicine. Accompanying presentation of the models is discussion of conceptual and practical perspectives they represent for clinician assessment. Based on the models, the article concludes by presenting a series of conceptually oriented guidelines for clinical assessment of financial capacity. In summary, sound assessment of financial capacity requires knowledge and appreciation of clinical conceptual models and principles. Awareness of such models, principles and guidelines will strengthen and advance clinical assessment of financial capacity. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Robust flood area detection using a L-band synthetic aperture radar: Preliminary application for Florida, the U.S. affected by Hurricane Irma

    Science.gov (United States)

    Nagai, H.; Ohki, M.; Abe, T.

    2017-12-01

    Urgent crisis response for a hurricane-induced flood needs rapid delivery of a flood map covering a broad region. However, there are no standard threshold values for automatic flood identification from pre- and post-event images obtained by satellite-based synthetic aperture radars (SARs). This problem could hamper prompt data provision for operational uses. Furthermore, one pre-flood SAR image does not always represent potential water surfaces and river flows, especially in tropical flat lands which are greatly influenced by the seasonal precipitation cycle. We are, therefore, developing a new method of flood mapping using PALSAR-2, an L-band SAR, which is less affected by temporal surface changes. Specifically, a mean-value image and a standard-deviation image are calculated from a series of pre-flood SAR images. These are combined with a post-flood SAR image to obtain the normalized backscatter amplitude difference (NoBADi), in which the difference between the post-flood image and the mean-value image is divided by the standard-deviation image to emphasize anomalous water extents. Flooded areas are then automatically obtained from the NoBADi images as lower-value pixels, avoiding potential water surfaces. We applied this method to PALSAR-2 images acquired on Sept. 8, 10, and 12, 2017, covering flooding areas in a central region of the Dominican Republic and west Florida, the U.S., affected by Hurricane Irma. The output flooding outlines are validated with flooding areas manually delineated from high-resolution optical satellite images, resulting in higher consistency and less uncertainty than previous methods (i.e., a simple pre- and post-flood difference and pre- and post-flood coherence changes). The NoBADi method has great potential to provide a reliable flood map for future flood hazards, not hampered by cloud cover, seasonal surface changes, and "casual" thresholds in the flood identification process.
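
    The per-pixel NoBADi statistic described above is essentially a z-score of the post-flood image against its own pre-flood history. The sketch below reconstructs that idea from the abstract; the -2 threshold, epsilon guard, and toy values are assumptions, not the authors' code:

    ```python
    def nobadi(post, mean_img, std_img, eps=1e-6):
        """Normalized backscatter amplitude difference, per pixel:
        (post - pre-flood mean) / pre-flood standard deviation.
        Pre-existing water bodies score near zero because the mean image
        is already dark there, so they are not flagged as new flooding."""
        return [[(p - m) / max(s, eps) for p, m, s in zip(pr, mr, sr)]
                for pr, mr, sr in zip(post, mean_img, std_img)]

    def flood_pixels(nobadi_img, k=-2.0):
        """Flag pixels that are anomalously dark relative to their history."""
        return [[v < k for v in row] for row in nobadi_img]

    # Toy 1x2 scene: pixel 0 drops from its mean of 0.10 to 0.03
    # (3.5 standard deviations dark); pixel 1 is unchanged.
    post, mean_img, std_img = [[0.03, 0.10]], [[0.10, 0.10]], [[0.02, 0.02]]
    print(flood_pixels(nobadi(post, mean_img, std_img)))  # [[True, False]]
    ```

    Normalizing by the per-pixel standard deviation is what makes the detection robust to seasonal surface variability, since seasonally variable pixels simply have larger denominators.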

  20. Accuracy assessment of landslide prediction models

    International Nuclear Information System (INIS)

    Othman, A N; Mohd, W M N W; Noraini, S

    2014-01-01

    The increasing population and expansion of settlements over hilly areas has greatly increased the impact of natural disasters such as landslides. Therefore, it is important to develop models which could accurately predict landslide hazard zones. Over the years, various techniques and models have been developed to predict landslide hazard zones. The aim of this paper is to assess the accuracy of landslide prediction models developed by the authors. The methodology involved the selection of the study area, data acquisition, data processing, model development and data analysis. The development of these models is based on nine different landslide-inducing parameters, i.e. slope, land use, lithology, soil properties, geomorphology, flow accumulation, aspect, proximity to river and proximity to road. Rank sum, rating, pairwise comparison and AHP techniques are used to determine the weights for each of the parameters used. Four (4) different models which consider different parameter combinations are developed by the authors. Results obtained are compared to landslide history, and the accuracies for Model 1, Model 2, Model 3 and Model 4 are 66.7%, 66.7%, 60% and 22.9% respectively. From the results, rank sum, rating and pairwise comparison can be useful techniques to predict landslide hazard zones
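
    Of the weighting techniques named in the abstract, rank sum is the simplest: each parameter's weight is proportional to its inverted rank. A minimal sketch; the ordering of the nine parameters below is hypothetical, not the ranking used in the paper:

    ```python
    def rank_sum_weights(ranks):
        """Rank-sum weights: w_j = (n - r_j + 1) / sum_k (n - r_k + 1),
        where rank 1 is the most important parameter."""
        n = len(ranks)
        scores = [n - r + 1 for r in ranks]
        total = sum(scores)
        return [s / total for s in scores]

    # Hypothetical ranking of the nine landslide-inducing parameters,
    # from slope (rank 1, most important) to proximity to road (rank 9).
    ranks = [1, 2, 3, 4, 5, 6, 7, 8, 9]
    weights = rank_sum_weights(ranks)
    print([round(w, 3) for w in weights])  # slope gets 9/45 = 0.2
    ```

    Rating, pairwise comparison, and AHP replace this linear score with expert-assigned or eigenvector-derived values, but all four produce a weight vector summing to one that is then applied to the parameter layers.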

  1. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5.* Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  2. Quality assessment for radiological model parameters

    International Nuclear Information System (INIS)

    Funtowicz, S.O.

    1989-01-01

    A prototype framework for representing uncertainties in radiological model parameters is introduced. This follows earlier development in this journal of a corresponding framework for representing uncertainties in radiological data. Refinements and extensions to the earlier framework are needed in order to take account of the additional contextual factors consequent on using data entries to quantify model parameters. The parameter coding can in turn feed into methods for evaluating uncertainties in calculated model outputs. (author)

  3. Polytomous Rasch Models in Counseling Assessment

    Science.gov (United States)

    Willse, John T.

    2017-01-01

    This article provides a brief introduction to the Rasch model. Motivation for using Rasch analyses is provided. Important Rasch model concepts and key aspects of result interpretation are introduced, with major points reinforced using a simulation demonstration. Concrete guidelines are provided regarding sample size and the evaluation of items.
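
    The dichotomous Rasch model, and its polytomous extension referenced in the title, can be stated in a few lines. This is a generic sketch of the standard formulas (partial credit parameterization), with illustrative ability and threshold values:

    ```python
    import math

    def rasch_p(theta, b):
        """Dichotomous Rasch model: probability of a correct response for a
        person of ability theta on an item of difficulty b (logit scale)."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def pcm_probs(theta, thresholds):
        """Polytomous (partial credit) Rasch model: probability of each score
        category 0..K given step thresholds b_1..b_K."""
        numerators, cum = [1.0], 0.0
        for b in thresholds:
            cum += theta - b
            numerators.append(math.exp(cum))
        z = sum(numerators)
        return [v / z for v in numerators]

    print(rasch_p(0.0, 0.0))   # ability equals difficulty -> exactly 0.5
    print([round(p, 3) for p in pcm_probs(0.0, [-1.0, 1.0])])
    ```

    The polytomous form is what counseling assessments with rating-scale items (e.g., Likert responses) typically require, and fit statistics for both forms compare these model probabilities with observed responses.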

  4. Assessing a Theoretical Model on EFL College Students

    Science.gov (United States)

    Chang, Yu-Ping

    2011-01-01

    This study aimed to (1) integrate relevant language learning models and theories, (2) construct a theoretical model of college students' English learning performance, and (3) assess the model fit between empirically observed data and the theoretical model proposed by the researchers of this study. Subjects of this study were 1,129 Taiwanese EFL…

  5. Assessing Model Characterization of Single Source ...

    Science.gov (United States)

    Aircraft measurements made downwind from specific coal fired power plants during the 2013 Southeast Nexus field campaign provide a unique opportunity to evaluate single source photochemical model predictions of both O3 and secondary PM2.5 species. The model did well at predicting downwind plume placement. The model shows similar patterns of an increasing fraction of PM2.5 sulfate ion to the sum of SO2 and PM2.5 sulfate ion by distance from the source compared with ambient based estimates. The model was less consistent in capturing downwind ambient based trends in conversion of NOX to NOY from these sources. Source sensitivity approaches capture near-source O3 titration by fresh NO emissions, in particular subgrid plume treatment. However, capturing this near-source chemical feature did not translate into better downwind peak estimates of single source O3 impacts. The model estimated O3 production from these sources but often was lower than ambient based source production. The downwind transect ambient measurements, in particular secondary PM2.5 and O3, have some level of contribution from other sources which makes direct comparison with model source contribution challenging. Model source attribution results suggest contribution to secondary pollutants from multiple sources even where primary pollutants indicate the presence of a single source. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, deci

  6. Mathematical Models for Camouflage Pattern Assessment

    Science.gov (United States)

    2013-04-01

    Universidad de Chile, Centro de Modelamiento Matemático, Facultad de Ciencias Físicas y Matemáticas (http://www.cmm.uchile.cl). DISTRIBUTION A: Distribution approved for public release. Final Report: Camouflage Assessment, January 2013. Abstract: The main

  7. Structure ignition assessment model (SIAM)\\t

    Science.gov (United States)

    Jack D. Cohen

    1995-01-01

    Major wildland/urban interface fire losses, principally residences, continue to occur. Although the problem is not new, the specific mechanisms are not well known on how structures ignite in association with wildland fires. In response to the need for a better understanding of wildland/urban interface ignition mechanisms and a method of assessing the ignition risk,...

  8. Auditory modelling for assessing room acoustics

    NARCIS (Netherlands)

    Van Dorp Schuitman, J.

    2011-01-01

    The acoustics of a concert hall, or any other room, are generally assessed by measuring room impulse responses for one or multiple source and receiver location(s). From these responses, objective parameters can be determined that should be related to various perceptual attributes of room acoustics.

  9. Development of Risk Assessment Matrix for NASA Engineering and Safety Center

    Science.gov (United States)

    Malone, Roy W., Jr.; Moses, Kelly

    2004-01-01

    This paper describes a study, which had as its principal goal the development of a sufficiently detailed 5 x 5 Risk Matrix Scorecard. The purpose of this scorecard is to outline the criteria by which technical issues can be qualitatively and initially prioritized. The tool using this scorecard has been proposed to be one of the information resources the NASA Engineering and Safety Center (NESC) takes into consideration when making decisions with respect to incoming information on safety concerns across the entire NASA agency. The contents of this paper discuss in detail each element of the risk matrix scorecard, definitions for those elements and the rationale behind the development of those definitions. This scorecard development was performed in parallel with the tailoring of the existing Futron Corporation Integrated Risk Management Application (IRMA) software tool. IRMA was tailored to fit NESC needs for evaluating incoming safety concerns and was renamed the NESC Assessment Risk Management Application (NAFMA), which is still in the developmental phase.
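
    A 5 x 5 risk matrix of the kind described maps a likelihood level and a consequence level onto a cell and a coarse priority band. The level names, score products, and band boundaries below are illustrative; the actual NESC scorecard criteria are defined in the study itself:

    ```python
    LIKELIHOOD = {"very low": 1, "low": 2, "moderate": 3, "high": 4, "very high": 5}
    CONSEQUENCE = {"negligible": 1, "marginal": 2, "moderate": 3,
                   "critical": 4, "catastrophic": 5}

    def risk_cell(likelihood, consequence):
        """Map a (likelihood, consequence) pair onto a 5x5 matrix cell, a
        numeric score, and a coarse rating band (illustrative boundaries)."""
        l = LIKELIHOOD[likelihood]
        c = CONSEQUENCE[consequence]
        score = l * c
        band = "high" if score >= 15 else "medium" if score >= 6 else "low"
        return (l, c), score, band

    print(risk_cell("high", "critical"))        # ((4, 4), 16, 'high')
    print(risk_cell("very low", "negligible"))  # ((1, 1), 1, 'low')
    ```

    Tools like the IRMA/NAFMA application wrap a scoring function of this shape with definitions for each level so that incoming safety concerns are scored consistently.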

  10. Specialty Payment Model Opportunities and Assessment

    Science.gov (United States)

    Mulcahy, Andrew W.; Chan, Chris; Hirshman, Samuel; Huckfeldt, Peter J.; Kofner, Aaron; Liu, Jodi L.; Lovejoy, Susan L.; Popescu, Ioana; Timbie, Justin W.; Hussey, Peter S.

    2015-01-01

    Abstract Gastroenterology and cardiology services are common and costly among Medicare beneficiaries. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model. This article describes research related to the design of episode-based payment models for ambulatory gastroenterology and cardiology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare and Medicaid Services (CMS). The authors analyzed Medicare claims data to describe the frequency and characteristics of gastroenterology and cardiology index procedures, the practices that delivered index procedures, and the patients that received index procedures. The results of these analyses can help inform CMS decisions about the definition of episodes in an episode-based payment model; payment adjustments for service setting, multiple procedures, or other factors; and eligibility for the payment model. PMID:28083363

  11. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  12. Assessing The Performance of Hydrological Models

    Science.gov (United States)

    van der Knijff, Johan

    The performance of hydrological models is often characterized using the coefficient of efficiency, E. The sensitivity of E to extreme streamflow values, and the difficulty of deciding what value of E should be used as a threshold to identify 'good' models or model parameterizations, have proven to be serious shortcomings of this index. This paper reviews some alternative performance indices that have appeared in the literature. Legates and McCabe (1999) suggested a more generalized form of E, E'(j,B). Here, j is a parameter that controls how much emphasis is put on extreme streamflow values, and B defines a benchmark or 'null hypothesis' against which the results of the model are tested. E'(j,B) was used to evaluate a large number of parameterizations of a conceptual rainfall-runoff model, using 6 different combinations of j and B. First, the effect of j and B is explained. Second, it is demonstrated how the index can be used to explicitly test hypotheses about the model and the data. This approach appears to be particularly attractive if the index is used as a likelihood measure within a GLUE-type analysis.
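
    One plausible reading of the generalized index described above (an assumption on our part; the paper's exact formulation may differ) is E'(j,B) = 1 - sum|O - P|^j / sum|O - B|^j, which reduces to the classical coefficient of efficiency when j = 2 and the benchmark B is the observed mean:

```python
def generalized_efficiency(obs, pred, j=2.0, benchmark=None):
    """E'(j, B) = 1 - sum(|O - P|^j) / sum(|O - B|^j).

    j < 2 de-emphasizes extreme streamflow values; `benchmark` is the
    'null hypothesis' series the model must beat.  It defaults to the
    observed mean, which recovers the classical coefficient of
    efficiency when j = 2.
    """
    if benchmark is None:
        benchmark = [sum(obs) / len(obs)] * len(obs)
    num = sum(abs(o - p) ** j for o, p in zip(obs, pred))
    den = sum(abs(o - b) ** j for o, b in zip(obs, benchmark))
    return 1.0 - num / den
```

    With j = 1 the index is based on absolute rather than squared deviations, reducing the influence of peak flows; a persistence forecast (e.g., the previous day's flow) is a common alternative benchmark.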

  13. A normative model for assessing competitive strategy

    OpenAIRE

    Ungerer, Gerard David; Cayzer, Steve

    2016-01-01

    The hyper-competitive nature of e-business has raised the need for a generic way to appraise the merit of a developed business strategy. Although progress has been made in the domain of strategy evaluation, the established literature differs over the ‘tests’ that a strategy must pass to be considered well-constructed. This paper therefore investigates the existing strategy-evaluation literature to propose a more integrated and comprehensive normative strategic assessment that can be used to e...

  14. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible

  15. Assessing physical models used in nuclear aerosol transport models

    International Nuclear Information System (INIS)

    McDonald, B.H.

    1987-01-01

    Computer codes used to predict the behaviour of aerosols in water-cooled reactor containment buildings after severe accidents contain a variety of physical models. Special models are in place for describing agglomeration processes where small aerosol particles combine to form larger ones. Other models are used to calculate the rates at which aerosol particles are deposited on building structures. Condensation of steam on aerosol particles is currently a very active area in aerosol modelling. In this paper, the physical models incorporated in the current available international codes for all of these processes are reviewed and documented. There is considerable variation in models used in different codes, and some uncertainties exist as to which models are superior. 28 refs

  16. Review of early assessment models of innovative medical technologies.

    Science.gov (United States)

    Fasterholdt, Iben; Krahn, Murray; Kidholm, Kristian; Yderstræde, Knud Bonnet; Pedersen, Kjeld Møller

    2017-08-01

    Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models for early assessment in different health organisations and discusses which models hold most promise for hospital decision makers. A scoping review of published studies between 1996 and 2015 was performed using nine databases. The following information was collected: decision context, decision problem, and a description of the early assessment model. 2362 articles were identified and 12 studies fulfilled the inclusion criteria. An additional 12 studies were identified and included in the review by searching reference lists. The majority of the 24 early assessment studies were variants of traditional cost-effectiveness analysis. Around one fourth of the studies presented an evaluation model with a broader focus than cost-effectiveness. Uncertainty was mostly handled by simple sensitivity or scenario analysis. This review shows that evaluation models using established cost-effectiveness methods are most prevalent in early assessment but seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  17. Regional Models for Sediment Toxicity Assessment

    Science.gov (United States)

    This paper investigates the use of empirical models to predict the toxicity of sediment samples within a region to laboratory test organisms based on sediment chemistry. In earlier work, we used a large nationwide database of matching sediment chemistry and marine amphipod sedim...

  18. Persistence Modeling for Assessing Marketing Strategy Performance

    NARCIS (Netherlands)

    M.G. Dekimpe (Marnik); D.M. Hanssens (Dominique)

    2003-01-01

    textabstractThe question of long-run market response lies at the heart of any marketing strategy that tries to create a sustainable competitive advantage for the firm or brand. A key challenge, however, is that only short-run results of marketing actions are readily observable. Persistence modeling

  19. The air emissions risk assessment model (AERAM)

    International Nuclear Information System (INIS)

    Gratt, L.B.

    1991-01-01

    AERAM is an environmental analysis and power generation station investment decision support tool. AERAM calculates the public health risk (in terms of lifetime cancers) in the nearby population from pollutants released into the air. AERAM consists of four main subroutines: Emissions, Air, Exposure and Risk. The Emissions subroutine uses power plant parameters to calculate the expected release of the pollutants. Coal-fired and oil-fired power plant models are currently available; a gas-fired plant model is under preparation. The release of the pollutants into the air is followed by their dispersal in the environment. The dispersion in the Air subroutine uses the Environmental Protection Agency's model, Industrial Source Complex - Long Term. Additional dispersion models (Industrial Source Complex - Short Term and Cooling Tower Drift) are being implemented for future AERAM versions. The Exposure subroutine uses the ambient concentrations to compute population exposures for the pollutants of concern. The exposures are used with the corresponding dose-response models in the Risk subroutine to estimate both the total population risk and individual risk. The risk for the receptor/population centroid with the maximum concentration is also calculated for regulatory purposes. In addition, automated interfaces with AirTox (an air risk decision model) have been implemented to extend AERAM's steady-state single solution to the decision-under-uncertainty domain. AERAM has been used for public health risk assessment, for investment decisions on additional pollution control systems based on health risk reductions, and for the economics of fuel versus health risk tradeoffs. AERAM provides a state-of-the-art capability for evaluating the public health impact of airborne toxic substances in response to regulations and public concern

  20. A normative model for assessing competitive strategy

    Directory of Open Access Journals (Sweden)

    Ungerer, Gerard David

    2016-12-01

    Full Text Available The hyper-competitive nature of e-business has raised the need for a generic way to appraise the merit of a developed business strategy. Although progress has been made in the domain of strategy evaluation, the established literature differs over the ‘tests’ that a strategy must pass to be considered well-constructed. This paper therefore investigates the existing strategy-evaluation literature to propose a more integrated and comprehensive normative strategic assessment that can be used to evaluate and refine a business’s competitive strategy, adding to its robustness and survivability.

  1. Sustainability Assessment Model in Product Development

    Science.gov (United States)

    Turan, Faiz Mohd; Johan, Kartina; Nor, Nik Hisyamudin Muhd; Omar, Badrul

    2017-08-01

    Faster and more efficient development of innovative and sustainable products has become the focus for manufacturing companies seeking to remain competitive in today’s technologically driven world. Design concept evaluation, which concludes the conceptual design stage, is one of the most critical decision points. It strongly influences the final success of product development, because poor criteria assessment in design concept evaluation can rarely be compensated for at later stages. Furthermore, consumers, investors, shareholders and even competitors base their decisions on what to buy or invest in, from whom, and also on what companies report, and sustainability is a critical component of such reporting. In this research, a new methodology for sustainability assessment in product development for Malaysian industry has been developed using an integration of green project management, a new scale of “Weighting criteria” and Rough-Grey Analysis. This method will help design engineers improve the effectiveness and objectivity of sustainable design concept evaluation, enable them to make better-informed decisions before finalising their choice, and consequently create value for the company or industry. The new framework is expected to provide an alternative to existing methods.

  2. Interactive Rapid Dose Assessment Model (IRDAM): user's guide

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This User's Guide provides instruction in the setup and operation of the equipment necessary to run IRDAM. Instructions are also given on how to load the magnetic disks and access the interactive part of the program. Two other companion volumes to this one provide additional information on IRDAM. Reactor Accident Assessment Methods (NUREG/CR-3012, Volume 2) describes the technical bases for IRDAM including methods, models and assumptions used in calculations. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios

  3. ITER plasma safety interface models and assessments

    International Nuclear Information System (INIS)

    Uckan, N.A.; Bartels, H-W.; Honda, T.; Amano, T.; Boucher, D.; Post, D.; Wesley, J.

    1996-01-01

    Physics models and requirements to be used as a basis for safety analysis studies are developed and physics results motivated by safety considerations are presented for the ITER design. Physics specifications are provided for enveloping plasma dynamic events for Category I (operational event), Category II (likely event), and Category III (unlikely event). A safety analysis code SAFALY has been developed to investigate plasma anomaly events. The plasma response to ex-vessel component failure and machine response to plasma transients are considered

  4. Survivability Assessment: Modeling A Recovery Process

    OpenAIRE

    Paputungan, Irving Vitra; Abdullah, Azween

    2009-01-01

    Survivability is the ability of a system to continue operating, in a timely manner, in the presence of attacks, failures, or accidents. Recovery in survivability is the process by which a system heals or recovers from damage as early as possible in order to fulfil its mission as conditions permit. In this paper, we show a preliminary recovery model to enhance system survivability. The model focuses on how we preserve the system and resume its critical services under attack as soon as possible. Keywords: surv...

  5. Enterprise Cloud Adoption - Cloud Maturity Assessment Model

    OpenAIRE

    Conway, Gerry; Doherty, Eileen; Carcary, Marian; Crowley, Catherine

    2017-01-01

    The introduction and use of cloud computing by an organization has the promise of significant benefits that include reduced costs, improved services, and a pay-per-use model. Organizations that successfully harness these benefits will potentially have a distinct competitive edge, due to their increased agility and flexibility to rapidly respond to an ever changing and complex business environment. However, as cloud technology is a relatively new ph...

  6. Assessing testamentary and decision-making capacity: Approaches and models.

    Science.gov (United States)

    Purser, Kelly; Rosenfeld, Tuly

    2015-09-01

    The need for better and more accurate assessments of testamentary and decision-making capacity grows as Australian society ages and incidences of mentally disabling conditions increase. Capacity is a legal determination, but one on which medical opinion is increasingly being sought. The difficulties inherent within capacity assessments are exacerbated by the ad hoc approaches adopted by legal and medical professionals based on individual knowledge and skill, as well as the numerous assessment paradigms that exist. This can negatively affect the quality of assessments, and results in confusion as to the best way to assess capacity. This article begins by assessing the nature of capacity. The most common general assessment models used in Australia are then discussed, as are the practical challenges associated with capacity assessment. The article concludes by suggesting a way forward to satisfactorily assess legal capacity given the significant ramifications of getting it wrong.

  7. Performability assessment by model checking of Markov reward models

    NARCIS (Netherlands)

    Baier, Christel; Cloth, L.; Haverkort, Boudewijn R.H.M.; Hermanns, H.; Katoen, Joost P.

    2010-01-01

    This paper describes efficient procedures for model checking Markov reward models, that allow us to evaluate, among others, the performability of computer-communication systems. We present the logic CSRL (Continuous Stochastic Reward Logic) to specify performability measures. It provides flexibility

  8. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    Science.gov (United States)

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  9. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    Full Text Available This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. In this paper, we first provide a brief introduction to the basics of the ER rule and emphasize its strengths in representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of the risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using a utility-based transformation. The proposed hierarchical risk assessment framework can potentially be applied to various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.
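
    The aggregation and ranking steps of such a framework can be illustrated with a toy sketch. Note that the full ER rule is a recursive conjunctive combination of weighted belief distributions; the version below deliberately substitutes a simple reliability-discounted weighted average, and the grade set, weights, and utilities are hypothetical:

```python
# Simplified illustration of belief aggregation and utility-based risk
# ranking.  This is NOT the full ER rule: sources are combined with a
# reliability-discounted weighted average rather than the recursive
# conjunctive ER combination.

GRADES = ["low", "medium", "high"]
UTILITY = {"low": 0.0, "medium": 0.5, "high": 1.0}  # assumed grade utilities

def aggregate(assessments):
    """Combine (weight, reliability, belief-distribution) triples.

    Each source's weight is discounted by its reliability; the combined
    belief in each grade is the normalized weighted sum.
    """
    combined = {g: 0.0 for g in GRADES}
    total = 0.0
    for weight, reliability, beliefs in assessments:
        w = weight * reliability
        total += w
        for g in GRADES:
            combined[g] += w * beliefs.get(g, 0.0)
    return {g: v / total for g, v in combined.items()}

def risk_score(beliefs):
    """Utility-based transformation: expected utility of the belief profile."""
    return sum(beliefs[g] * UTILITY[g] for g in GRADES)
```

    Given a hierarchy, the same combination would be applied bottom-up: leaf attributes are aggregated into sub-criteria, and sub-criteria into the overall risk profile, which is then ranked by its expected utility.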

  10. Modelling fog in probabilistic consequence assessment

    International Nuclear Information System (INIS)

    Underwood, B.Y.

    1993-02-01

    Earlier work examined the potential influence of foggy weather conditions on the probabilistic assessment of the consequences of accidental releases of radioactive material to the atmosphere (PCA), in particular the impact of a fraction of the released aerosol becoming incorporated into droplets. A major uncertainty emerging from the initial scoping study concerned estimation of the fraction of the released material that would be taken up into droplets. An objective is to construct a method, based on the experience gained from prior investigations, for handling the effect of fog on deposition in a PCA context. There are two aspects to explicitly including the effect of fog in PCA: estimating the probability of occurrence of various types of foggy condition and calculating the impact on the conventional end-points of consequence assessment. For the first, a brief outline is given of the use of meteorological data by PCA computer codes, followed by a discussion of some routinely-recorded meteorological parameters that are pertinent to fog, such as the present-weather code and horizontal visibility. Four stylized scenarios are defined to cover a wide range of situations in which particle growth by uptake of water may have an important impact on deposition. A description is then given of the way in which routine meteorological data could be used to flag the presence of each of these conditions in the meteorological data file used by the PCA code. The approach developed to calculate the impact on deposition is pitched at a level of complexity appropriate to the PCA context; it reflects the physical constraints of the system and accounts for the specific characteristics of the released aerosol. (Author)

  11. A new assessment model and tool for pediatric nurse practitioners.

    Science.gov (United States)

    Burns, C

    1992-01-01

    This article presents a comprehensive assessment model for pediatric nurse practitioner (PNP) practice that integrates familiar elements of the classical medical history, Gordon's Functional Health Patterns, and developmental fields into one system. This model drives the diagnostic reasoning process toward consideration of a broad range of disease, daily living (nursing diagnosis), and developmental diagnoses, which represents PNP practice better than the medical model does.

  12. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The object of this paper is to make an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  13. The importance of trajectory modelling in accident consequence assessments

    International Nuclear Information System (INIS)

    Jones, J.A.; Williams, J.A.; Hill, M.D.

    1988-01-01

    Most atmospheric dispersion models used at present for probabilistic risk assessment (PRA) are linear: they take account of the wind speed but not of changes in wind direction after the first hour. A trajectory model, which follows the changing wind field, is therefore a more realistic description of the cloud's behaviour. However, the extra complexity means that the computing costs increase. This is an important factor for the MARIA code, which is intended to be run on computers of varying power. The numbers of early effects predicted by a linear model and a trajectory model in a probabilistic risk assessment were compared to see which model should be preferred. The trajectory model predicted about 25% fewer expected early deaths and 30% more people evacuated than the linear model. However, the trajectory model took about ten times longer to calculate its results. The choice between the two models may depend on the speed of the computer available

  14. Road Assessment Model and Pilot Application in China

    Directory of Open Access Journals (Sweden)

    Tiejun Zhang

    2014-01-01

    Full Text Available Risk assessment of roads is an effective approach for road agencies to determine safety improvement investments. It can increase the cost-effective returns in crash and injury reductions. To develop a robust Chinese risk assessment model, the Research Institute of Highway (RIOH) is developing the China Road Assessment Programme (ChinaRAP) model, in partnership with the International Road Assessment Programme (iRAP), to characterise traffic crashes in China. The ChinaRAP model is based upon RIOH’s achievements and iRAP models. This paper documents part of ChinaRAP’s research work, mainly including the RIOH model and its pilot application in a province in China.

  15. Modeling and assessing international climate financing

    Science.gov (United States)

    Wu, Jing; Tang, Lichun; Mohamed, Rayman; Zhu, Qianting; Wang, Zheng

    2016-06-01

    Climate financing is a key issue in current negotiations on climate protection. This study establishes a climate financing model based on a mechanism in which donor countries set up funds for climate financing and recipient countries use the funds exclusively for carbon emission reduction. The burden-sharing principles are based on GDP, historical emissions, and consumption-based emissions. Using this model, we develop and analyze a series of scenario simulations, including a financing program negotiated at the Cancun Climate Change Conference (2010) and several subsequent programs. Results show that sustained climate financing can help to combat global climate change. However, the Cancun Agreements are projected to result in a reduction of only 0.01°C in global warming by 2100 compared to the scenario without climate financing. Longer-term climate financing programs should be established to achieve more significant benefits. Our model and simulations also show that climate financing has economic benefits for developing countries. Developed countries will suffer a slight GDP loss in the early stages of climate financing, but the long-term economic growth and the eventual benefits of climate mitigation will compensate for this slight loss. Different burden-sharing principles have very similar effects on global temperature change and the economic growth of recipient countries, but they do result in differences in GDP changes for Japan and the FSU. The GDP-based principle results in a larger share of the financial burden for Japan, while the historical emissions-based principle results in a larger share of the financial burden for the FSU. A larger burden share leads to a greater GDP loss.

  16. Computational model for the assessment of oil spill damages

    Energy Technology Data Exchange (ETDEWEB)

    Seip, K L; Heiberg, A B; Brekke, K A

    1985-06-01

    A description is given of the method and the required data of a model for calculating oil spill damages. Eleven damage attributes are defined: shore length contaminated, shore restitution time, birds dead, restitution time for three groups of birds, open-sea damages (two types), and damages to recreation, the economy, and fisheries. The model has been applied in several cases of oil pollution assessment: in an examination of alternative models for the organization of oil spill combat in Norway, in the assessment of the damages caused by a blowout at Tromsoeflaket, and in assessing a possible increase in oil spill preparedness for Svalbard. 56 references.

  17. High Resolution Satellite Data reveals Massive Export of Carbon and Nitrogen-Rich Seagrass Wrack from Greater Florida Bay to the Open Ocean after Hurricane Irma

    Science.gov (United States)

    Dierssen, H. M.; Hedley, J. D.; Russell, B. J.; Vaudrey, J. M.; Perry, R. A.

    2017-12-01

    Episodic storms are known to be important drivers of ocean ecosystem processes, but their impacts are notoriously difficult to quantify with traditional sampling techniques. Here, we use stunning high spatial resolution satellite imagery from Sentinel-2A collected 13 September 2017, only days after Hurricane Irma passed directly over the Florida Keys, to quantify massive amounts of floating vegetative material. This Category 4 storm brought wind gusts over 35 m s⁻¹ and created turbulence in the water column that scoured the seafloor. The imagery reveals an initial estimate of 40 km² of surface drifting material. Although the identity of the brown material cannot be fully determined without a hyperspectral sensor, the accumulations are consistent with our past research showing large aggregations of seagrass leaves or "wrack" advected under high winds from dense beds of Syringodium filiforme within Greater Florida Bay to the oceanic waters of the Atlantic. Using measurements of wrack collected from this area, we estimate that this single event corresponds to a total export of 9.7 × 10¹⁰ gC and 2.7 × 10⁹ gN from the seagrass beds. This amount of export is not considered typical for many types of tropical seagrass meadows, which are thought to largely recycle nutrients within the beds. Elemental analysis of seagrass leaves from Greater Florida Bay is consistent with nitrogen fixation in the beds, which could provide the means to sustain a large export of nitrogen from the meadows. As the wrack travels at the sea surface, some of these nutrients are exuded into the surrounding waters, providing a subsidy of dissolved and particulate carbon and nitrogen and making the wrack an ecological hot spot for organisms. Although wrack can potentially remain floating for months, its ultimate fate is either to wash ashore, providing connectivity between marine and terrestrial ecosystems, or to sink to the seafloor. If most…
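
    The reported totals can be cross-checked arithmetically. The figures below are taken from the abstract; the per-area density assumes, as a simplification, that the carbon is spread evenly over the 40 km² of drifting material:

```python
# Implied areal density and C:N stoichiometry of the estimated wrack export.
AREA_KM2 = 40.0      # initial estimate of surface drifting material
CARBON_G = 9.7e10    # total carbon export (gC)
NITROGEN_G = 2.7e9   # total nitrogen export (gN)

area_m2 = AREA_KM2 * 1e6
density_gC_m2 = CARBON_G / area_m2        # ~2425 gC per m^2 of wrack
cn_mass = CARBON_G / NITROGEN_G           # ~35.9 C:N by mass
cn_molar = cn_mass * (14.007 / 12.011)    # ~41.9 C:N by moles
```

    The implied molar C:N of the exported material (~42) can then be compared against elemental analyses of Syringodium filiforme leaf tissue to check that the two export totals are mutually consistent.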

  18. Mathematical modeling in biology: A critical assessment

    Energy Technology Data Exchange (ETDEWEB)

    Buiatti, M. [Florence, Univ. (Italy). Dipt. di Biologia Animale e Genetica

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of the life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, the inherent difficulties of mathematising biological objects and processes, which derive from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at using the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning is given against the resulting dangers (reification of monodisciplinary metaphors, lack of falsification, etc.). Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples.

  19. Mathematical modeling in biology: A critical assessment

    International Nuclear Information System (INIS)

    Buiatti, M.

    1998-01-01

    The molecular revolution and the development of biology-derived industry have led in the last fifty years to an unprecedented 'leap forward' of the life sciences in terms of experimental data. Less success has been achieved in the organisation of such data and in the consequent development of adequate explanatory and predictive theories and models. After a brief historical excursus, the inborn difficulties of mathematisation of biological objects and processes, derived from the complex dynamics of life, are discussed along with the logical tools (simplifications, choice of observation points, etc.) used to overcome them. 'Autistic', monodisciplinary attitudes towards biological modeling among mathematicians, physicists and biologists, aimed in each case at using the tools of other disciplines to solve 'selfish' problems, are also taken into account, and a warning against the derived dangers (reification of monodisciplinary metaphors, lack of falsification, etc.) is given. Finally, 'top-down' (deductive) and 'bottom-up' (inductive) heuristic interactive approaches to mathematisation are critically discussed with the help of a series of examples.

  20. Assessing work disability for social security benefits: international models for the direct assessment of work capacity.

    Science.gov (United States)

    Geiger, Ben Baumberg; Garthwaite, Kayleigh; Warren, Jon; Bambra, Clare

    2017-08-25

    It has been argued that social security disability assessments should directly assess claimants' work capacity, rather than relying on proxies such as functioning. However, there is little academic discussion of how such assessments could be conducted. The article presents an account of different models of direct disability assessment based on case studies of the Netherlands, Germany, Denmark, Norway, the United States of America, Canada, Australia, and New Zealand, utilising over 150 documents and 40 expert interviews. Three models of direct work disability assessment can be observed: (i) structured assessment, which measures the functional demands of jobs across the national economy and compares these to claimants' functional capacities; (ii) demonstrated assessment, which looks at claimants' actual experiences in the labour market and infers a lack of work capacity from the failure of a concerted rehabilitation attempt; and (iii) expert assessment, based on the judgement of skilled professionals. Direct disability assessment within social security is not just theoretically desirable, but can be implemented in practice. We have shown that there are three distinct ways that this can be done, each with different strengths and weaknesses. Further research is needed to clarify the costs, validity/legitimacy, and consequences of these different models. Implications for rehabilitation: It has recently been argued that social security disability assessments should directly assess work capacity rather than simply assessing functioning, but we have little understanding of how this can be done in practice. Based on case studies of nine countries, we show that direct disability assessment can be implemented, and argue that there are three different ways of doing it. These are "demonstrated assessment" (using claimants' experiences in the labour market), "structured assessment" (matching functional requirements to workplace demands), and "expert assessment" (the judgement of skilled professionals).

  1. Pipeline modeling and assessment in unstable slopes

    Energy Technology Data Exchange (ETDEWEB)

    Caceres, Carlos Nieves [Oleoducto Central S.A., Bogota, Cundinamarca (Colombia); Ordonez, Mauricio Pereira [SOLSIN S.A.S, Bogota, Cundinamarca (Colombia)

    2010-07-01

    The OCENSA pipeline system is vulnerable to geotechnical problems such as faults, landslides and creeping slopes, which are well known in the Andes Mountains and in tropical countries like Colombia. This paper proposes a methodology to evaluate pipe behaviour during the soil displacements of slow landslides. Three different cases of analysis are examined, according to site characteristics. The process starts with a simplified analytical model and develops into 3D finite element numerical simulations applied to the on-site geometry of soil and pipe. Case 1 should be used when the unstable site is subject to landslides impacting significant lengths of pipeline, the pipeline is straight, and the landslide is simple from the geotechnical perspective. Case 2 should be used when the pipeline is straight and the landslide is complex (creeping slopes and non-conventional stabilization solutions). Case 3 should be used if the pipeline presents vertical or horizontal bends.

  2. A Multi-Actor Dynamic Integrated Assessment Model (MADIAM)

    OpenAIRE

    Weber, Michael

    2004-01-01

    The interactions between climate and the socio-economic system are investigated with a Multi-Actor Dynamic Integrated Assessment Model (MADIAM) obtained by coupling a nonlinear impulse response model of the climate sub-system (NICCS) to a multi-actor dynamic economic model (MADEM). The main goal is to initiate a model development that is able to treat the dynamics of the coupled climate socio-economic system, including endogenous technological change, in a non-equilibrium situation, thereby o...

  3. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. This paper provides some perspective on the issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes affecting adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass-action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab

  4. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1991-10-01

    Recently several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept to quantify radionuclide migration. Several distinct issues are raised by various critics. In this paper I provide some perspective on issues surrounding the modeling of nuclide retardation. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes affecting adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass-action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors, and the status of incorporating sophisticated adsorption models into transport codes.
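The constant-Rd model discussed in these two records typically enters transport codes through the retardation factor. A minimal sketch of that standard relationship, R = 1 + (rho_b / theta) * Kd; the parameter values below are illustrative, not taken from the paper:

```python
# Illustrative sketch (not from the paper): the constant-Rd (Kd) model feeds
# transport codes through the retardation factor
#   R = 1 + (rho_b / theta) * Kd
# which scales how much slower a sorbing nuclide moves than the groundwater.
def retardation_factor(kd_ml_per_g, bulk_density_g_per_ml, porosity):
    """Dimensionless retardation factor for linear, reversible sorption."""
    return 1.0 + (bulk_density_g_per_ml / porosity) * kd_ml_per_g

# Hypothetical sandy-aquifer values: Kd = 10 mL/g,
# bulk density 1.6 g/mL, porosity 0.3.
R = retardation_factor(10.0, 1.6, 0.3)
print(f"R = {R:.1f}")   # the nuclide front moves ~R times slower than the water
```

This is exactly the simplification the abstracts question: R is treated as a constant, independent of chemistry, which the more sophisticated models (isotherm, mass-action, surface-complexation) relax.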

  5. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  6. Indoor Air Quality Building Education and Assessment Model Forms

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  7. Assessment and development of implementation models of health ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Assessment and development of implementation models of health-related ... The Contribution of Civil Society Organizations in Achieving Health for All ... Health Information for Maternal and Child Health Planning in Urban Bangladesh.

  8. Route Assessment for Unmanned Aerial Vehicle Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Xixia Sun

    2014-01-01

    An integrated route assessment approach based on the cloud model is proposed in this paper, in which various sources of uncertainty are preserved and modeled by cloud theory. First, a systemic criteria framework incorporating models for scoring subcriteria is developed. Then, the cloud model is introduced to represent linguistic variables, and the survivability probability histogram of each route is converted into normal clouds by cloud transformation, enabling both randomness and fuzziness in the assessment environment to be managed simultaneously. Finally, a new way to measure the similarity between two normal clouds, satisfying reflexivity, symmetry, transitivity, and overlap, is proposed. Experimental results demonstrate that the proposed route assessment approach outperforms a fuzzy-logic-based assessment approach with regard to feasibility, reliability, and consistency with human thinking.
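The normal clouds mentioned in this record are conventionally characterized by an expectation Ex, entropy En, and hyper-entropy He. A minimal sketch of the standard forward normal cloud generator follows; the parameter values are illustrative, and the paper's similarity measure between clouds is not reproduced here:

```python
import random

# Sketch of the standard forward normal cloud generator: each drop x_i is
# drawn from N(Ex, En_i^2), where the per-drop entropy En_i is itself drawn
# from N(En, He^2). The second level of randomness (He) is what lets the
# cloud model capture fuzziness on top of randomness.
def normal_cloud(ex, en, he, n, seed=0):
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        en_i = rng.gauss(en, he)          # hyper-entropy perturbs the entropy
        drops.append(rng.gauss(ex, abs(en_i)))
    return drops

# Illustrative linguistic grade "good" encoded as (Ex=0.7, En=0.1, He=0.01).
drops = normal_cloud(ex=0.7, en=0.1, he=0.01, n=1000)
mean = sum(drops) / len(drops)            # clusters around Ex
```

With He = 0 this degenerates to ordinary Gaussian sampling; a positive He thickens the cloud, which is how linguistic vagueness is represented.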

  9. Model of environmental life cycle assessment for coal mining operations.

    Science.gov (United States)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources, based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment, with the potential to mitigate the impact of the coal industry on the environment. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights into model development, calibration strategies, data acquisition and uncertainty analysis. Due to the limited number of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  11. ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL

    OpenAIRE

    Diah Hari Suryaningrum

    2012-01-01

    This paper aims to propose a new model in assessing individual performance on information technology adoption. The new model to assess individual performance was derived from two different theories: decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to expand these theories, some of their efforts might lack of theoretical assumptions. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien...

  12. Adaptation in integrated assessment modeling: where do we stand?

    OpenAIRE

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We analyze how modelers have chosen to describe adaptation within an integrated framework, and suggest many ways they could improve the treatment of adaptation by considering more of its bottom-up cha...

  13. Semantic modeling of portfolio assessment in e-learning environment

    Directory of Open Access Journals (Sweden)

    Lucila Romero

    2017-01-01

    In a learning environment, a portfolio is used as a tool to keep track of a learner's progress. Particularly in e-learning, continuous assessment allows greater customization and efficiency in the learning process and prevents students from losing interest in their studies. Each student also has his or her own characteristics and learning skills that must be taken into account in order to keep the learner's interest. Personalized monitoring is thus the key to guaranteeing the success of technology-based education. In this context, portfolio assessment emerges as the solution because it is an easy way to allow teachers to organize and personalize assessment according to students' characteristics and needs. A portfolio assessment can contain various types of assessment, such as formative assessment, summative assessment, and hetero- or self-assessment, and can use different instruments such as multiple-choice questions, conceptual maps, and essays, among others. A portfolio assessment thus represents a compilation of all the assessments a student must complete in a course; it documents progress and sets targets. In previous work, a conceptual framework was proposed consisting of an ontology network named AOnet, a semantic tool conceptualizing different types of assessments. Continuing that work, this paper presents a proposal to implement portfolio assessment in e-learning environments. The proposal consists of a semantic model that describes the key components and relations of this domain, to set the basis for developing a tool to generate, manage and perform portfolio assessments.

  14. Adaptation in integrated assessment modeling: where do we stand?

    NARCIS (Netherlands)

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We

  15. Performance and Cognitive Assessment in 3-D Modeling

    Science.gov (United States)

    Fahrer, Nolan E.; Ernst, Jeremy V.; Branoff, Theodore J.; Clark, Aaron C.

    2011-01-01

    The purpose of this study was to investigate identifiable differences between performance and cognitive assessment scores in a 3-D modeling unit of an engineering drafting course curriculum. The study aimed to provide further investigation of the need of skill-based assessments in engineering/technical graphics courses to potentially increase…

  16. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate

  17. Using models in Integrated Ecosystem Assessment of coastal areas

    Science.gov (United States)

    Solidoro, Cosimo; Bandelj, Vinko; Cossarini, Gianpiero; Melaku Canu, Donata; Libralato, Simone

    2014-05-01

    Numerical models can greatly contribute to the integrated ecological assessment of coastal and marine systems. Indeed, models can: i) assist in the identification of efficient sampling strategies; ii) provide spatial interpolation and temporal extrapolation of experimental data, based on the knowledge of process dynamics and causal relationships coded within the model; iii) provide estimates of indicators that are hard to measure. Furthermore, models can indicate the potential effects of implementing alternative management policies. Finally, by providing a synthetic representation of an ideal system based on its essential dynamics, a model returns a picture of the ideal behaviour of a system in the absence of external perturbation, alteration and noise, which can help in the identification of reference behaviour. As an important example, model-based reanalyses of biogeochemical and ecological properties are urgently needed for the estimation of environmental status and the assessment of the efficacy of conservation and environmental policies, also with reference to the enforcement of the European MSFD. However, the use of numerical models, and particularly of ecological models, in environmental management is still far from being the rule, possibly because the benefits that a full integration of modeling and monitoring systems might provide are not appreciated, possibly because of a lack of trust in modeling results, or because many problems still exist in the development, validation and implementation of models. For instance, assessing the validity of model results is a complex process that requires the definition of appropriate indicators, metrics and methodologies, and faces the scarcity of real-time in-situ biogeochemical data. Furthermore, biogeochemical models typically consider dozens of variables which are heavily undersampled. Here we show how the integration of mathematical models and monitoring data can support integrated ecosystem

  18. The Effect of Computer Models as Formative Assessment on Student Understanding of the Nature of Models

    Science.gov (United States)

    Park, Mihwa; Liu, Xiufeng; Smith, Erica; Waight, Noemi

    2017-01-01

    This study reports the effect of computer models as formative assessment on high school students' understanding of the nature of models. Nine high school teachers integrated computer models and associated formative assessments into their yearlong high school chemistry course. A pre-test and post-test of students' understanding of the nature of…

  19. Users guide to REGIONAL-1: a regional assessment model

    International Nuclear Information System (INIS)

    Davis, W.E.; Eadie, W.J.; Powell, D.C.

    1979-09-01

    A guide was prepared to allow a user to run the PNL long-range transport model, REGIONAL 1. REGIONAL 1 is a computer model set up to run atmospheric assessments on a regional basis. The model has the capability of being run in three modes for a single time period. The three modes are: (1) no deposition, (2) dry deposition, (3) wet and dry deposition. The guide provides the physical and mathematical basis used in the model for calculating transport, diffusion, and deposition for all three modes. The guide also includes a program listing, with an explanation of the listing, and an example in the form of a short-term assessment for 48 hours. The purpose of the example is to allow a person with experience in programming and meteorology to operate the assessment model and compare their results with the guide results. This comparison will assure the user that the program is operating properly.

  20. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences made in three projects with the use of computer models from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models serve best the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  1. Accident consequence assessments with different atmospheric dispersion models

    International Nuclear Information System (INIS)

    Panitz, H.J.

    1989-11-01

    An essential aim of the improvements of the new program system UFOMOD for Accident Consequence Assessments (ACAs) was to substitute the straight-line Gaussian plume model conventionally used in ACA models with more realistic atmospheric dispersion models. To identify improved models which can be applied in ACA codes and to quantify the implications of different dispersion models on the results of an ACA, probabilistic comparative calculations with different atmospheric dispersion models have been performed. The study showed that there are trajectory models available which can be applied in ACAs and that they provide more realistic results than straight-line Gaussian models. This led to a completely novel concept of atmospheric dispersion modelling in which two different distance ranges of validity are distinguished: the near range of some tens of kilometres and the adjacent far range, each assigned to its own trajectory model. (orig.)

  2. Specialty Payment Model Opportunities and Assessment: Oncology Model Design Report.

    Science.gov (United States)

    Huckfeldt, Peter J; Chan, Chris; Hirshman, Samuel; Kofner, Aaron; Liu, Jodi L; Mulcahy, Andrew W; Popescu, Ioana; Stevens, Clare; Timbie, Justin W; Hussey, Peter S

    2015-07-15

    This article describes research related to the design of a payment model for specialty oncology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare & Medicaid Services (CMS). Cancer is a common and costly condition. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model for oncology care. Episode-based payment systems can provide flexibility to health care providers to select among the most effective and efficient treatment alternatives, including activities that are not currently reimbursed under Medicare payment policies. However, the model design also needs to ensure that high-quality care is delivered and that beneficial treatments are not withheld from patients. CMS asked MITRE and RAND to conduct analyses to inform design decisions related to an episode-based oncology model for Medicare beneficiaries undergoing chemotherapy treatment for cancer. In particular, this study focuses on analyses of Medicare claims data related to the definition of the initiation of an episode of chemotherapy, patterns of spending during and surrounding episodes of chemotherapy, and attribution of episodes of chemotherapy to physician practices. We found that the time between the primary cancer diagnosis and chemotherapy initiation varied widely across patients, ranging from one day to over seven years, with a median of 2.4 months. The average level of total monthly payments varied considerably across cancers, with the highest spending peak of $9,972 for lymphoma, and peaks of $3,109 for breast cancer and $2,135 for prostate cancer.

  3. A parsimonious dynamic model for river water quality assessment.

    Science.gov (United States)

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types, ranging from detailed physical models to simplified conceptual models, are available. A possible middle ground between detailed and simplified models is parsimonious models, which represent the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on the data available for correct model application. When data are inadequate, it is preferable to focus on a simple river water quality model rather than a detailed one. The study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity one and a quality one. The model employs a river schematisation that considers different stretches according to the geometric characteristics and the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess river water quality, the model employs four state variables: DO, BOD, NH4, and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output with respect to the model inputs and parameters was performed based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
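The channel-and-reservoir scheme this record describes can be sketched as a pure time delay (the linear channel) followed by a cascade of linear reservoirs (dS/dt = I - S/k) that smears the pollution wave. The function name, parameter values and discretization below are illustrative, not taken from the paper:

```python
# Minimal sketch of conceptual routing with linear channels and reservoirs:
# the channel shifts the input wave in time, and each linear reservoir
# (dS/dt = I - S/k, explicit-Euler discretized) disperses it.
def route(inflow, delay_steps, k, n_reservoirs, dt=1.0):
    # linear channel: pure delay of the input hydrograph/pollutograph
    out = [0.0] * delay_steps + list(inflow)
    for _ in range(n_reservoirs):
        storage, routed = 0.0, []
        for q_in in out:
            storage += dt * (q_in - storage / k)  # reservoir mass balance
            routed.append(storage / k)            # linear outflow Q = S/k
        out = routed
    return out

# A unit pulse is delayed by 2 steps, then smeared by two reservoirs (k = 3):
hydro = route([1.0] + [0.0] * 9, delay_steps=2, k=3.0, n_reservoirs=2)
```

The delay controls arrival time and the number of reservoirs and k control dispersion, which is the division of labour between channels and reservoirs described in the abstract.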

  4. A Process Model for Assessing Adolescent Risk for Suicide.

    Science.gov (United States)

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level of risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  5. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  6. EASETECH – A LCA model for assessment of environmental technologies

    DEFF Research Database (Denmark)

    Damgaard, Anders; Baumeister, Hubert; Astrup, Thomas Fruergaard

    2014-01-01

    EASETECH is a new model for the environmental assessment of environmental technologies developed in collaboration between DTU Environment and DTU Compute. EASETECH is based on experience gained in the field of waste management modelling over the last decade and applies the same concepts to systems...

  7. Goodness-of-Fit Assessment of Item Response Theory Models

    Science.gov (United States)

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  8. Review of early assessment models of innovative medical technologies

    DEFF Research Database (Denmark)

    Fasterholdt, Iben; Krahn, Murray D; Kidholm, Kristian

    2017-01-01

    INTRODUCTION: Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models...

  9. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  10. Application of the cognitive therapy model to initial crisis assessment.

    Science.gov (United States)

    Calvert, Patricia; Palmer, Christine

    2003-03-01

    This article provides a background to the development of cognitive therapy and cognitive therapeutic skills with a specific focus on the treatment of a depressive episode. It discusses the utility of cognitive therapeutic strategies to the model of crisis theory and initial crisis assessment currently used by the Community Assessment & Treatment Team of Waitemata District Health Board on the North Shore of Auckland, New Zealand. A brief background to cognitive therapy is provided, followed by a comprehensive example of the use of the Socratic questioning method in guiding collaborative assessment and treatment of suicidality by nurses during the initial crisis assessment.

  11. Assessment of Teacher Perceived Skill in Classroom Assessment Practices Using IRT Models

    Science.gov (United States)

    Koloi-Keaikitse, Setlhomo

    2017-01-01

    The purpose of this study was to assess teacher perceived skill in classroom assessment practices. Data were collected from a sample of (N = 691) teachers selected from government primary, junior secondary, and senior secondary schools in Botswana. Item response theory models were used to identify teacher response on items that measured their…

  12. Model summary report for the safety assessment SR-Site

    International Nuclear Information System (INIS)

    Vahlund, Fredrik; Zetterstroem Evins, Lena; Lindgren, Maria

    2010-12-01

This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. To better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks can be identified, along with the computer codes used. A large number of computer codes are used in the assessment, and their complexity varies considerably: some of the codes are commercial, while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled differ for different types of codes (for instance, because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion of how the requirements are met.

  13. Model summary report for the safety assessment SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik; Zetterstroem Evins, Lena (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Lindgren, Maria (Kemakta Konsult AB, Stockholm (Sweden))

    2010-12-15

This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. To better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks can be identified, along with the computer codes used. A large number of computer codes are used in the assessment, and their complexity varies considerably: some of the codes are commercial, while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled differ for different types of codes (for instance, because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion of how the requirements are met.

  14. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs

  15. Geologic modeling in risk assessment methodology for radioactive waste management

    International Nuclear Information System (INIS)

    Logan, S.E.; Berbano, M.C.

    1977-01-01

Under contract to the U.S. Environmental Protection Agency (EPA), the University of New Mexico is developing a computer-based assessment methodology for evaluating public health and environmental impacts from the disposal of radioactive waste in geologic formations. The methodology incorporates a release or fault tree model, an environmental model, and an economic model. The release model and its application to a model repository in bedded salt are described. Fault trees are constructed to provide the relationships between various geologic and man-caused events which are potential mechanisms for release of radioactive material beyond the immediate environs of the repository. The environmental model includes: 1) the transport to and accumulation at various receptors in the biosphere, 2) pathways from these environmental concentrations, and 3) radiation dose to man. Finally, economic results are used to compare and assess various disposal configurations as a basis for formulating...

  16. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    Directory of Open Access Journals (Sweden)

    Moiz Mumtaz

    2012-01-01

Full Text Available Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR-funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures.

  17. A test-bed modeling study for wave resource assessment

    Science.gov (United States)

    Yang, Z.; Neary, V. S.; Wang, T.; Gunawan, B.; Dallman, A.

    2016-02-01

Hindcasts from phase-averaged wave models are commonly used to estimate the standard statistics used in wave energy resource assessments. However, the research community and the wave energy converter (WEC) industry lack a well-documented and consistent modeling approach for conducting these resource assessments at different phases of WEC project development and at different spatial scales, e.g., from a small-scale pilot study to a large-scale commercial deployment. It is therefore necessary to evaluate current wave model codes, as well as limitations and knowledge gaps in predicting sea states, in order to establish best wave modeling practices and to identify future research needs for improving wave prediction for resource assessment. This paper presents the first phase of an ongoing modeling study to address these concerns. The modeling study is being conducted at a test-bed site off the central Oregon coast using two of the most widely used third-generation wave models, WaveWatchIII and SWAN. A nested-grid modeling approach, with domain dimensions ranging from global to regional scales, was used to provide the wave spectral boundary condition to a local-scale model domain, which spans roughly 60 km by 60 km at a grid resolution of 250-300 m. Model results simulated by WaveWatchIII and SWAN in a structured-grid framework are compared to NOAA wave buoy data for six wave parameters: omnidirectional wave power, significant wave height, energy period, spectral width, direction of maximum directionally resolved wave power, and directionality coefficient. Model performance and computational efficiency are evaluated, and best practices for wave resource assessments are discussed, based on a set of standard error statistics and model run times.

  18. Economic assessment model architecture for AGC/AVLIS selection

    International Nuclear Information System (INIS)

    Hoglund, R.L.

    1984-01-01

The economic assessment model architecture described here provides the flexibility and completeness in economic analysis that the selection between AGC and AVLIS demands. Process models, which are technology-specific, provide the first-order responses of process performance and cost to variations in process parameters. The economic models can be used to test the impacts of alternative deployment scenarios for a technology. Enterprise models provide global figures of merit for evaluating the DOE perspective on the uranium enrichment enterprise, and business analysis models compute the financial parameters from the private investor's viewpoint

  19. Fire models for assessment of nuclear power plant fires

    International Nuclear Information System (INIS)

    Nicolette, V.F.; Nowlen, S.P.

    1989-01-01

This paper reviews the state of the art in available fire models for the assessment of nuclear power plant fires. The advantages and disadvantages of three basic types of fire model (zone, field, and control volume) are discussed, along with Sandia's experience with these models. It is shown that the type of fire model selected to solve a particular problem should be based on the information that is required. Areas of concern that relate to all nuclear power plant fire models are identified. 17 refs., 6 figs

  20. A model for assessing human cognitive reliability in PRA studies

    International Nuclear Information System (INIS)

    Hannaman, G.W.; Spurgin, A.J.; Lukic, Y.

    1985-01-01

This paper summarizes the status of a research project sponsored by EPRI as part of the Probabilistic Risk Assessment (PRA) technology improvement program and conducted by NUS Corporation to develop a model of Human Cognitive Reliability (HCR). The model was synthesized from features identified in a review of existing models. The model development was based on the hypothesis that the key factors affecting crew response times are separable. The inputs to the model are key parameters whose values can be determined by PRA analysts for each accident situation being assessed. The output is a set of curves representing the probability of control room crew non-response as a function of time under different conditions affecting crew performance. The non-response probability is then a contributor to the overall non-success of operating crews in achieving a functional objective identified in the PRA study. Because the available data were sparse, simulator data and some small-scale tests were used to illustrate the calibration of interim HCR model coefficients for different types of cognitive processing. The model can potentially help PRA analysts make human reliability assessments more explicit. It incorporates concepts from psychological models of human cognitive behavior, information from current collections of human reliability data sources, and crew response time data from simulator training exercises

  1. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  2. Modelling the pre-assessment learning effects of assessment: evidence in the validity chain.

    Science.gov (United States)

    Cilliers, Francois J; Schuwirth, Lambert W T; van der Vleuten, Cees P M

    2012-11-01

We previously developed a model of the pre-assessment learning effects of consequential assessment and started to validate it. The model comprises assessment factors, mechanism factors and learning effects. The purpose of this study was to continue the validation process. For stringency, we focused on a subset of assessment factor-learning effect associations that featured least commonly in a baseline qualitative study. Our aims were to determine whether these uncommon associations were operational in a broader but similar population to that in which the model was initially derived. A cross-sectional survey of 361 senior medical students at one medical school was undertaken using a purpose-made questionnaire based on a grounded theory and comprising pairs of written situational tests. In each pair, the manifestation of an assessment factor was varied. The frequencies at which learning effects were selected were compared for each item pair, using an adjusted alpha to assign significance. The frequencies at which mechanism factors were selected were calculated. There were significant differences in the learning effect selected between the two scenarios of an item pair for 13 of this subset of 21 uncommon associations, even with the adjusted alpha applied. For a subset of uncommon associations in the model, the role of most assessment factor-learning effect associations and the mechanism factors involved were supported in a broader but similar population to that in which the model was derived. Although model validation is an ongoing process, these results move the model one step closer to the stage of usefully informing interventions. Results illustrate how factors not typically included in studies of the learning effects of assessment could confound the results of interventions aimed at using assessment to influence learning. © Blackwell Publishing Ltd 2012.

  3. Utility of Social Modeling for Proliferation Assessment - Preliminary Findings

    International Nuclear Information System (INIS)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-01-01

Often the methodologies for assessing proliferation risk focus on the inherent vulnerability of nuclear energy systems and associated safeguards. For example, an accepted approach involves ways to measure the intrinsic and extrinsic barriers to potential proliferation. This paper describes a preliminary investigation into non-traditional uses of social and cultural information to improve proliferation assessment and advance the approach to assessing nuclear material diversion. Proliferation resistance assessments, safeguards assessments and related studies typically produce technical information about the vulnerability of a nuclear energy system to diversion of nuclear material. The purpose of this research project is to find ways to integrate social information with technical information by explicitly considering the role of culture, groups and/or individuals in factors that affect the possibility of proliferation. When final, this work is expected to describe and demonstrate the utility of social science modeling in proliferation and proliferation risk assessments.

  4. Integrated assessment models of climate change. An incomplete overview

    International Nuclear Information System (INIS)

    Dowlatabadi, H.

    1995-01-01

    Integrated assessment is a trendy phrase that has recently entered the vocabulary of folks in Washington, DC and elsewhere. The novelty of the term in policy analysis and policy making circles belies the longevity of this approach in the sciences and past attempts at their application to policy issues. This paper is an attempt at providing an overview of integrated assessment with a special focus on policy motivated integrated assessments of climate change. The first section provides an introduction to integrated assessments in general, followed by a discussion of the bounds to the climate change issue. The next section is devoted to a taxonomy of the policy motivated models. Then the integrated assessment effort at Carnegie Mellon is described briefly. A perspective on the challenges ahead in successful representation of natural and social dynamics in integrated assessments of global climate change is presented in the final section. (Author)

  5. Model of environmental life cycle assessment for coal mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Burchart-Korol, Dorota, E-mail: dburchart@gig.eu; Fugiel, Agata, E-mail: afugiel@gig.eu; Czaplicka-Kolarz, Krystyna, E-mail: kczaplicka@gig.eu; Turek, Marian, E-mail: mturek@gig.eu

    2016-08-15

This paper presents a novel approach to the environmental assessment of coal mining operations, which enables assessment of the factors, both direct and indirect, that affect the environment and are associated with the production of the raw materials and energy used in the processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model to coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with a life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources, based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in the damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from the direct influence of waste processing, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly steel supports. Because the model allows the comparison of environmental impact assessments for various unit processes, it can be used for all hard coal mines, not only in Poland but also worldwide. This development is an important step forward in the study of the impacts of fossil fuels on the environment, with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of

  6. Model of environmental life cycle assessment for coal mining operations

    International Nuclear Information System (INIS)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-01-01

This paper presents a novel approach to the environmental assessment of coal mining operations, which enables assessment of the factors, both direct and indirect, that affect the environment and are associated with the production of the raw materials and energy used in the processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model to coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with a life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources, based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in the damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from the direct influence of waste processing, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly steel supports. Because the model allows the comparison of environmental impact assessments for various unit processes, it can be used for all hard coal mines, not only in Poland but also worldwide. This development is an important step forward in the study of the impacts of fossil fuels on the environment, with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of

  7. Model summary report for the safety assessment SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik

    2006-10-15

This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. To better understand how these codes are related, Assessment Model Flowcharts (AMFs) have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. Of the many computer codes used in the assessment, some are commercial while others were developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined: It must be demonstrated that the code is suitable for its purpose; It must be demonstrated that the code has been properly used; and, It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will differ between codes (for instance, because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met.

  8. Model summary report for the safety assessment SR-Can

    International Nuclear Information System (INIS)

    Vahlund, Fredrik

    2006-10-01

This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. To better understand how these codes are related, Assessment Model Flowcharts (AMFs) have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. Of the many computer codes used in the assessment, some are commercial while others were developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined: It must be demonstrated that the code is suitable for its purpose; It must be demonstrated that the code has been properly used; and, It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will differ between codes (for instance, because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met.

  9. PARALLEL MODELS OF ASSESSMENT: INFANT MENTAL HEALTH AND THERAPEUTIC ASSESSMENT MODELS INTERSECT THROUGH EARLY CHILDHOOD CASE STUDIES.

    Science.gov (United States)

    Gart, Natalie; Zamora, Irina; Williams, Marian E

    2016-07-01

    Therapeutic Assessment (TA; S.E. Finn & M.E. Tonsager, 1997; J.D. Smith, 2010) is a collaborative, semistructured model that encourages self-discovery and meaning-making through the use of assessment as an intervention approach. This model shares core strategies with infant mental health assessment, including close collaboration with parents and caregivers, active participation of the family, a focus on developing new family stories and increasing parents' understanding of their child, and reducing isolation and increasing hope through the assessment process. The intersection of these two theoretical approaches is explored, using case studies of three infants/young children and their families to illustrate the application of TA to infant mental health. The case of an 18-month-old girl whose parents fear that she has bipolar disorder illustrates the core principles of the TA model, highlighting the use of assessment intervention sessions and the clinical approach to preparing assessment feedback. The second case follows an infant with a rare genetic syndrome from ages 2 to 24 months, focusing on the assessor-parent relationship and the importance of a developmental perspective. Finally, assessment of a 3-year-old boy illustrates the development and use of a fable as a tool to provide feedback to a young child about assessment findings and recommendations. © 2016 Michigan Association for Infant Mental Health.

  10. NEW MODEL OF QUALITY ASSESSMENT IN PUBLIC ADMINISTRATION - UPGRADING THE COMMON ASSESSMENT FRAMEWORK (CAF)

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2017-01-01

Full Text Available In our study, we developed a new model of quality assessment in public administration. The Common Assessment Framework (CAF) is frequently used in continental Europe for this purpose. Its use has many benefits; however, we believe its assessment logic is not adequate for public administration. The upgraded version of CAF is conceptually different: instead of the analytical and linear CAF, we get an instrument that measures an organisation as a network of complex processes. The original and upgraded assessment approaches are presented in the paper and compared in a case of self-assessment of a selected public administration organisation. The two approaches produced different, sometimes contradictory results. The upgraded model proved to be logically more consistent, and it produced higher interpretation capacity.

  11. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

Because of human development, water use is increasing in importance, and this worldwide trend is leading to a growing number of user conflicts, with a strong need for assessment tools to measure the impacts on both the ecosystem and the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, as impact assessments include different disciplines. Fish species, especially young fish, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The focus is on fish habitat simulation models, with methods and examples from Norway. Some ideas on integrated modelling tools for impact assessment studies are included. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge of hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost-effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored to a multi-disciplinary study. Model choice should be based on available data and possible data acquisition; available manpower, computer, and software resources; and the needed output and its accuracy. 58 refs

  12. Addressing challenges in single species assessments via a simple state-space assessment model

    DEFF Research Database (Denmark)

    Nielsen, Anders

    Single-species, age-structured fish stock assessment remains the main tool for managing fish stocks. A simple state-space assessment model is presented as an alternative to (semi-)deterministic procedures and to fully parametric statistical catch-at-age models. It offers a solution to some of the key challenges of these models. Compared to the deterministic procedures, it solves a list of problems originating from falsely assuming that age-classified catches are known without errors, and it allows quantification of uncertainties of estimated quantities of interest. Compared to full...
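A minimal sketch of the state-space idea behind such assessment models (this is a toy univariate example, not the model from the abstract): true log-abundance follows a random walk with process noise, observed survey indices add observation error, and a Kalman filter separates the two error sources instead of treating catches as exact.

```python
# Toy state-space model: x_t = x_{t-1} + process noise (variance q),
# y_t = x_t + observation noise (variance r). A Kalman filter recovers
# filtered estimates of the latent state from noisy observations.

def kalman_filter(obs, q=0.05, r=0.1, x0=0.0, p0=1.0):
    """Return filtered state estimates for a random-walk-plus-noise model."""
    x, p = x0, p0
    estimates = []
    for y in obs:
        # predict step: state unchanged, uncertainty grows by process variance
        p = p + q
        # update step: blend prediction and observation via the Kalman gain
        k = p / (p + r)
        x = x + k * (y - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# noisy log-index observations of a (hypothetical) stock
filtered = kalman_filter([2.0, 2.1, 1.7, 1.9, 1.5])
print(filtered)
```

The filtered trajectory is smoother than the raw observations, which is exactly the benefit the abstract claims over procedures that assume error-free catches.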

  13. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent: a complex, multi-step, multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final two years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could support micro-level assessments (e.g., improving individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  14. Combining catchment and instream modelling to assess physical habitat quality

    DEFF Research Database (Denmark)

    Olsen, Martin

    Study objectives: After the implementation of the EU's Water Framework Directive (WFD) in Denmark, ecological impacts from groundwater exploitation on surface waters have to receive additional consideration. Small streams in particular are susceptible to changes in run-off but have received little attention in past studies of run-off impact on the quality of stream physical habitats. This study combined catchment and instream models with instream habitat observations to assess the ecological impacts from groundwater exploitation on a small stream. The main objectives of this study were: • to assess which factors are controlling the run-off conditions in stream Ledreborg, and to what degree; • to assess the run-off reference condition of stream Ledreborg, where intensive groundwater abstraction has taken place for 67 years, using a simple rainfall-run-off model; • to assess how stream run-off affects...

  15. A mathematical model for environmental risk assessment in manufacturing industry

    Institute of Scientific and Technical Information of China (English)

    何莉萍; 徐盛明; 陈大川; 党创寅

    2002-01-01

    Environmentally conscious manufacturing has become an important issue in industry because of market pressure and environmental regulations. An environmental risk assessment model was developed based on the network analytic method and fuzzy set theory. The "interval analysis method" was applied to treat the on-site monitoring data as basic information for assessment. In addition, fuzzy set theory was employed to allow uncertain, interactive and dynamic information to be effectively incorporated into the environmental risk assessment. This model is a simple, practical and effective tool for evaluating the environmental risk of the manufacturing industry and for analyzing the relative impacts of emitted wastes, which are hazardous to both human and ecosystem health. Furthermore, the model is useful for design engineers and decision-makers when designing and selecting processes where the costs, environmental impacts and performance of a product must be taken into consideration.
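One building block of such fuzzy assessments can be sketched briefly (the set name and breakpoints below are invented for illustration, not taken from the paper): a triangular membership function that converts a crisp monitoring reading into a graded degree of belonging to a linguistic risk level.

```python
# Triangular fuzzy membership: support [a, c], full membership at the peak b.
# This is the standard way fuzzy set theory grades a measurement against a
# linguistic category such as "moderate risk".

def tri_membership(x, a, b, c):
    """Degree (0..1) to which x belongs to the fuzzy set defined by (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# e.g. an emission reading of 35 against a hypothetical "moderate risk" set
print(tri_membership(35.0, a=20.0, b=40.0, c=60.0))  # -> 0.75
```

Readings near the peak score close to 1, and the graded values from several such sets can then be aggregated across the assessment network.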

  16. Environmental impact assessments and geological repositories: A model process

    International Nuclear Information System (INIS)

    Webster, S.

    2000-01-01

    In a recent study carried out for the European Commission, the scope and application of environmental impact assessment (EIA) legislation and current EIA practice in European Union Member States and applicant countries of Central and Eastern Europe were investigated, specifically in relation to the geological disposal of radioactive waste. This paper reports the study's investigations into a model approach to EIA in the context of geological repositories, including the role of the assessment in the overall decision processes and public involvement. (author)

  17. Ultrasensitive human thyrotropin (hTSH) immunoradiometric assay (IRMA) set up through identification and minimization of non-specific bindings; Ensaio imunoradiometrico ultra-sensivel de tireotrofina humana (hTSH) obtido mediante a identificacao e minimizacao de ligacoes inespecificas

    Energy Technology Data Exchange (ETDEWEB)

    Peroni, C N

    1994-12-31

    An IRMA of hTSH, based on magnetic solid-phase separation, was studied especially with regard to its non-specific bindings. These were identified as a product of the interaction between an altered form of the radioiodinated anti-hTSH monoclonal antibody (125I-mAb) and the uncoupled magnetizable cellulose particle (matrix). Apparently this form of 125I-mAb is a type of aggregate that can be partly resolved from the main peak on Sephadex G-200 and further minimized via a single pre-incubation with the same matrix. Solid-phase saturation with milk proteins, tracer storage at 4 °C and serum addition during incubation were also found particularly effective in preventing its formation. These findings were used to reproducibly decrease non-specific bindings to values <0.1% (or <70 cpm), thus increasing the signal-to-noise ratio (B60/B0) up to values of 300-500. In this way we obtained hTSH radioassays with functional sensitivities of about 0.05 mIU/L and analytical sensitivities of the order of 0.02 mIU/L, which classify them at least among the best second-generation assays and as excellent indeed for magnetic IRMAs. A more optimistic sensitivity calculation, based on Rodbard's definition, provided values down to 0.008 mIU/L. Such sensitivities, moreover, were obtained in a very reproducible way and over the whole useful tracer life. (author). 83 refs, 13 figs, 25 tabs.
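The arithmetic connecting non-specific binding to signal-to-noise can be sketched with illustrative numbers (the total-counts figure below is assumed, not taken from the paper; the <70 cpm and 300-500 figures are from the abstract):

```python
# Non-specific binding (NSB) sets the noise floor of an IRMA: the lower the
# zero-dose counts, the higher the ratio of maximal binding to background.

def signal_to_noise(b_max_cpm, nsb_cpm):
    """Ratio of counts bound at the top calibrator to counts at zero dose."""
    return b_max_cpm / nsb_cpm

total_added_cpm = 70000   # assumed tracer counts added per tube
nsb_cpm = 70              # the <70 cpm noise floor achieved in the study

print(nsb_cpm / total_added_cpm * 100)      # NSB as a percentage of total counts
print(signal_to_noise(28000, nsb_cpm))      # falls in the reported 300-500 range
```

With NSB held below roughly 0.1% of added counts, even modest specific binding yields the three-figure signal-to-noise ratios the abstract reports.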

  18. Proposing an Environmental Excellence Self-Assessment Model

    DEFF Research Database (Denmark)

    Meulengracht Jensen, Peter; Johansen, John; Wæhrens, Brian Vejrum

    2013-01-01

    This paper presents an Environmental Excellence Self-Assessment (EEA) model based on the structure of the European Foundation for Quality Management Business Excellence Framework. Four theoretical scenarios for deploying the model are presented, as well as managerial implications, suggesting that the EEA model can be used in global organizations to differentiate environmental efforts depending on the maturity stage of the individual sites. Furthermore, the model can be used to support the decision-making process regarding when organizations should embark on more complex environmental efforts...

  19. Report on the model developments in the sectoral assessments

    DEFF Research Database (Denmark)

    Iglesias, Ana; Termansen, Mette; Bouwer, Laurens

    2014-01-01

    The objective of this Deliverable D3.2 is to describe the models developed in BASE, that is, the experimental setup for the sectoral modelling. The model development described in this deliverable will then be implemented in the adaptation and economic analysis in WP6 in order to integrate adaptation into the economic assessments. At the same time, the models will link to the case studies in two ways: first, they use the data in the case studies for model validation, and second, they provide information to inform stakeholders on adaptation strategies. Therefore, Deliverable 3.2 aims to address three main questions: How to address climate adaptation options with the sectoral bottom-up models? - This includes a quantification of the costs of adaptation with the sectoral models, in monetary terms or in other measures of costs. The benefits in this framework will be the avoided damages, therefore a measure...

  20. Predictive assessment of models for dynamic functional connectivity

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Schmidt, Mikkel Nørgaard; Madsen, Kristoffer Hougaard

    2018-01-01

    In neuroimaging, it has become evident that models of dynamic functional connectivity (dFC), which characterize how intrinsic brain organization changes over time, can provide a more detailed representation of brain function than traditional static analyses. Many dFC models in the literature represent functional brain networks as a meta-stable process with a discrete number of states; however, there is a lack of consensus on how to perform model selection and learn the number of states, as well as a lack of understanding of how different modeling assumptions influence the estimated state dynamics. To address these issues, we consider a predictive likelihood approach to model assessment, where models are evaluated based on their predictive performance on held-out test data. Examining several prominent models of dFC (in their probabilistic formulations) we demonstrate our framework...

  1. Agricultural climate impacts assessment for economic modeling and decision support

    Science.gov (United States)

    Thomson, A. M.; Izaurralde, R. C.; Beach, R.; Zhang, X.; Zhao, K.; Monier, E.

    2013-12-01

    A range of approaches can be used in the application of climate change projections to agricultural impacts assessment. Climate projections can be used directly to drive crop models, which in turn can provide inputs for agricultural economic or integrated assessment models. These model applications, and the transfer of information between models, must be guided by the state of the science. But the methodology must also account for the specific needs of stakeholders and the intended use of model results beyond pure scientific inquiry, including meeting the requirements of agencies responsible for designing and assessing policies, programs, and regulations. Here we present the methodology and results of two climate impacts studies that applied climate model projections from CMIP3 and from the EPA Climate Impacts and Risk Analysis (CIRA) project in a crop model (EPIC - Environmental Policy Integrated Climate) in order to generate estimates of changes in crop productivity for use in an agricultural economic model for the United States (FASOM - Forest and Agricultural Sector Optimization Model). The FASOM model is a forward-looking dynamic model of the US forest and agricultural sector used to assess market responses to changing productivity of alternative land uses. The first study, focused on climate change impacts on the USDA crop insurance program, was designed to use available daily climate projections from the CMIP3 archive. The decision to focus on daily data for this application limited the climate model and time period selection significantly; however, for the intended purpose of assessing impacts on crop insurance payments, consideration of extreme event frequency was critical for assessing periodic crop failures. In a second, coordinated impacts study designed to assess the relative difference in climate impacts under a no-mitigation policy and different future climate mitigation scenarios, the stakeholder specifically requested an assessment of a

  2. Consensus-based training and assessment model for general surgery.

    Science.gov (United States)

    Szasz, P; Louridas, M; de Montbrun, S; Harris, K A; Grantcharov, T P

    2016-05-01

    Surgical education is becoming competency-based with the implementation of in-training milestones. Training guidelines should reflect these changes and determine the specific procedures for such milestone assessments. This study aimed to develop a consensus view regarding operative procedures and tasks considered appropriate for junior and senior trainees, and the procedures that can be used as technical milestone assessments for trainee progression in general surgery. A Delphi process was followed where questionnaires were distributed to all 17 Canadian general surgery programme directors. Items were ranked on a 5-point Likert scale, with consensus defined as Cronbach's α of at least 0·70. Items rated 4 or above on the 5-point Likert scale by 80 per cent of the programme directors were included in the models. Two Delphi rounds were completed, with 14 programme directors taking part in round one and 11 in round two. The overall consensus was high (Cronbach's α = 0·98). The training model included 101 unique procedures and tasks, 24 specific to junior trainees, 68 specific to senior trainees, and nine appropriate to all. The assessment model included four procedures. A system of operative procedures and tasks for junior- and senior-level trainees has been developed along with an assessment model for trainee progression. These can be used as milestones in competency-based assessments. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
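The consensus criterion used in the Delphi rounds above (Cronbach's α of at least 0.70 over the programme directors' 5-point Likert ratings) can be sketched as follows; the rating matrix is toy data, not the study's:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals),
# computed over raters (rows) x items (columns). Values >= 0.70 indicate
# acceptable internal consistency, i.e. consensus among raters.

def cronbach_alpha(ratings):
    """ratings: list of raters, each a list of item scores of equal length."""
    k = len(ratings[0])                       # number of items
    def var(xs):                              # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([r[i] for r in ratings]) for i in range(k)]
    total_var = var([sum(r) for r in ratings])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# four hypothetical raters scoring four procedures on a 5-point Likert scale
raters = [[5, 4, 5, 4], [4, 4, 5, 3], [5, 5, 5, 4], [3, 3, 4, 2]]
print(round(cronbach_alpha(raters), 2))   # well above the 0.70 threshold
```

Raters who rank the items similarly inflate the variance of the row totals relative to the per-item variances, which drives α toward 1.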

  3. Assessment of the assessment: Evaluation of the model quality estimates in CASP10

    KAUST Repository

    Kryshtafovych, Andriy

    2013-08-31

    The article presents an assessment of the ability of the thirty-seven model quality assessment (MQA) methods participating in CASP10 to provide an a priori estimation of the quality of structural models, and of the 67 tertiary structure prediction groups to provide confidence estimates for their predicted coordinates. The assessment of MQA predictors is based on the methods used in previous CASPs, such as correlation between the predicted and observed quality of the models (both at the global and local levels), accuracy of methods in distinguishing between good and bad models as well as good and bad regions within them, and ability to identify the best models in the decoy sets. Several numerical evaluations were used in our analysis for the first time, such as comparison of global and local quality predictors with reference (baseline) predictors and a ROC analysis of the predictors' ability to differentiate between the well and poorly modeled regions. For the evaluation of the reliability of self-assessment of the coordinate errors, we used the correlation between the predicted and observed deviations of the coordinates and a ROC analysis of correctly identified errors in the models. A modified two-stage procedure for testing MQA methods in CASP10, whereby a small number of models spanning the whole range of model accuracy was released first, followed by the release of a larger number of models of more uniform quality, allowed a more thorough analysis of the abilities and inabilities of different types of methods. Clustering methods were shown to have an advantage over the single- and quasi-single-model methods on the larger datasets. At the same time, the evaluation revealed that the size of the dataset has a smaller influence on the global quality assessment scores (for both clustering and nonclustering methods) than its diversity. Narrowing the quality range of the assessed models caused a significant decrease in accuracy of ranking for global quality predictors but

  4. Radioactive waste disposal assessment - overview of biosphere processes and models

    International Nuclear Information System (INIS)

    Coughtrey, P.J.

    1992-09-01

    This report provides an overview of biosphere processes and models in the general context of the radiological assessment of radioactive waste disposal as a basis for HMIP's response to biosphere aspects of Nirex's submissions for disposal of radioactive wastes in a purpose-built repository at Sellafield, Cumbria. The overview takes into account published information from the UK as available from Nirex's safety and assessment research programme and HMIP's disposal assessment programme, as well as that available from studies in the UK and elsewhere. (Author)

  5. Testing of an accident consequence assessment model using field data

    International Nuclear Information System (INIS)

    Homma, Toshimitsu; Matsubara, Takeshi; Tomita, Kenichi

    2007-01-01

    This paper presents the results obtained from the application of an accident consequence assessment model, OSCAAR, to the Iput dose reconstruction scenario of BIOMASS and also to the Chernobyl 131I fallout scenario of EMRAS, both organized by the International Atomic Energy Agency. The Iput scenario deals with 137Cs contamination of the catchment basin and agricultural area in the Bryansk Region of Russia, which was heavily contaminated after the Chernobyl accident. This exercise was used to test the chronic exposure pathway models in OSCAAR against actual measurements and to identify the most important sources of uncertainty with respect to each part of the assessment. The OSCAAR chronic exposure pathway models had some limitations, but the refined model, COLINA, almost completely reconstructed the whole 10-year time course of 137Cs activity concentrations in most requested types of agricultural products and natural foodstuffs. The Plavsk scenario provides a good opportunity to test not only the food chain transfer model of 131I but also the method of assessing 131I thyroid burden. OSCAAR showed in general good capabilities for assessing the important 131I exposure pathways. (author)

  6. Confidence assessment. Site-descriptive modelling SDM-Site Laxemar

    International Nuclear Information System (INIS)

    2009-06-01

    The objective of this report is to assess the confidence that can be placed in the Laxemar site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Laxemar). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. Procedures for this assessment have been progressively refined during the course of the site descriptive modelling, and applied to all previous versions of the Forsmark and Laxemar site descriptive models. They include assessment of whether all relevant data have been considered and understood, identification of the main uncertainties and their causes, possible alternative models and their handling, and consistency between disciplines. The assessment then forms the basis for an overall confidence statement. The confidence in the Laxemar site descriptive model, based on the data available at the conclusion of the surface based site investigations, has been assessed by exploring: - Confidence in the site characterization data base, - remaining issues and their handling, - handling of alternatives, - consistency between disciplines and - main reasons for confidence and lack of confidence in the model. Generally, the site investigation database is of high quality, as assured by the quality procedures applied. It is judged that the Laxemar site descriptive model has an overall high level of confidence. Because of the relatively robust geological model that describes the site, the overall confidence in the Laxemar Site Descriptive model is judged to be high, even though details of the spatial variability remain unknown. 
The overall reason for this confidence is the wide spatial distribution of the data and the consistency between

  7. Confidence assessment. Site-descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    2008-12-15

    The objective of this report is to assess the confidence that can be placed in the Laxemar site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Laxemar). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. Procedures for this assessment have been progressively refined during the course of the site descriptive modelling, and applied to all previous versions of the Forsmark and Laxemar site descriptive models. They include assessment of whether all relevant data have been considered and understood, identification of the main uncertainties and their causes, possible alternative models and their handling, and consistency between disciplines. The assessment then forms the basis for an overall confidence statement. The confidence in the Laxemar site descriptive model, based on the data available at the conclusion of the surface based site investigations, has been assessed by exploring: - Confidence in the site characterization data base, - remaining issues and their handling, - handling of alternatives, - consistency between disciplines and - main reasons for confidence and lack of confidence in the model. Generally, the site investigation database is of high quality, as assured by the quality procedures applied. It is judged that the Laxemar site descriptive model has an overall high level of confidence. Because of the relatively robust geological model that describes the site, the overall confidence in the Laxemar Site Descriptive model is judged to be high, even though details of the spatial variability remain unknown. 
The overall reason for this confidence is the wide spatial distribution of the data and the consistency between

  8. Guide for developing conceptual models for ecological risk assessments

    International Nuclear Information System (INIS)

    Suter, G.W., II.

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, media, exposure routes, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent that connections between sources of exposure and receptors can be extended across operable units (OUs). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs

  9. Persistent hemifacial spasm after microvascular decompression: a risk assessment model.

    Science.gov (United States)

    Shah, Aalap; Horowitz, Michael

    2017-06-01

    Microvascular decompression (MVD) for hemifacial spasm (HFS) provides resolution of disabling symptoms such as eyelid twitching and muscle contractions of the entire hemiface. The primary aim of this study was to evaluate the predictive value of patient demographics and spasm characteristics on long-term outcomes, with or without intraoperative lateral spread response (LSR) as an additional variable in a risk assessment model. A retrospective study was undertaken to evaluate the associations of pre-operative patient characteristics, as well as intraoperative LSR and need for a staged procedure, with the presence of persistent or recurrent HFS at the time of hospital discharge and at follow-up. A risk assessment model was constructed with the inclusion of six clinically or statistically significant variables from the univariate analyses. A receiver operating characteristic curve was generated, and the area under the curve was calculated to determine the strength of the predictive model. A risk assessment model was first created consisting of significant pre-operative variables (Model 1) (age >50, female gender, history of botulinum toxin use, platysma muscle involvement). This model demonstrated borderline predictive value for persistent spasm at discharge (AUC .60; p=.045) and fair predictive value at follow-up (AUC .75; p=.001). Intraoperative variables (e.g. LSR persistence) demonstrated little additive value (Model 2) (AUC .67). Patients with a higher risk score (three or greater) demonstrated greater odds of persistent HFS at the time of discharge (OR 1.5 [95%CI 1.16-1.97]; p=.035), as well as greater odds of persistent or recurrent spasm at the time of follow-up (OR 3.0 [95%CI 1.52-5.95]; p=.002). Conclusions: A risk assessment model consisting of pre-operative clinical characteristics is useful in prognosticating HFS persistence at follow-up.
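The AUC statistic reported above has a simple rank interpretation that can be sketched with toy risk scores (the scores below are invented, not the study's data): the AUC is the probability that a randomly chosen positive case gets a higher score than a randomly chosen negative case.

```python
# ROC AUC via the rank (Mann-Whitney) formulation: count, over all
# positive/negative pairs, how often the positive case outranks the negative
# one (ties count half).

def roc_auc(pos_scores, neg_scores):
    """Probability a random positive case scores above a random negative one."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

persistent = [4, 3, 5, 3]   # hypothetical risk scores, spasm persisted
resolved   = [1, 2, 3, 2]   # hypothetical risk scores, spasm resolved
print(roc_auc(persistent, resolved))
```

A value of 0.5 means the score is no better than chance; the study's reported .60 to .75 sit between chance and the strong separation this toy example shows.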

  10. Modeling Composite Assessment Data Using Item Response Theory

    Science.gov (United States)

    Ueckert, Sebastian

    2018-01-01

    Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
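The core building block of the IRT framework the tutorial describes can be sketched briefly (parameter values here are illustrative, not from the article): a two-parameter logistic (2PL) model giving the probability of endorsing an item as a function of a latent severity variable.

```python
import math

# 2PL item response model: probability of a positive item response given the
# latent trait theta, item discrimination a, and item difficulty b.

def p_response(theta, a, b):
    """P(response = 1 | theta) under the two-parameter logistic model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# a more severe subject (theta = 1) endorses a moderate item (b = 0.5)
# more often than an average subject (theta = 0)
print(p_response(0.0, a=1.5, b=0.5))
print(p_response(1.0, a=1.5, b=0.5))
```

Fitting a and b per item, and theta per subject, is what lets a composite score be decomposed into item-level information, which is the pharmacometric appeal the abstract points to.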

  11. Assessing biocomputational modelling in transforming clinical guidelines for osteoporosis management.

    Science.gov (United States)

    Thiel, Rainer; Viceconti, Marco; Stroetmann, Karl

    2011-01-01

    Biocomputational modelling as developed by the European Virtual Physiological Human (VPH) Initiative is the area of ICT most likely to revolutionise the practice of medicine in the longer term. Using the example of osteoporosis management, a socio-economic assessment framework is presented that captures how the transformation of clinical guidelines through VPH models can be evaluated. Applied to the Osteoporotic Virtual Physiological Human Project, a subsequent benefit-cost analysis delivers promising results, both methodologically and substantially.

  12. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    International Nuclear Information System (INIS)

    Klos, Richard

    2008-03-01

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  13. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  14. Mesorad dose assessment model. Volume 1. Technical basis

    International Nuclear Information System (INIS)

    Scherpelz, R.I.; Bander, T.J.; Athey, G.F.; Ramsdell, J.V.

    1986-03-01

    MESORAD is a dose assessment model for emergency response applications. Using release data for as many as 50 radionuclides, the model calculates: (1) external doses resulting from exposure to radiation emitted by radionuclides contained in elevated or deposited material; (2) internal dose commitment resulting from inhalation; and (3) total whole-body doses. External doses from airborne material are calculated using semi-infinite and finite cloud approximations. At each stage in model execution, the appropriate approximation is selected after considering the cloud dimensions. Atmospheric processes are represented in MESORAD by a combination of Lagrangian puff and Gaussian plume dispersion models, a source depletion (deposition velocity) dry deposition model, and a wet deposition model using washout coefficients based on precipitation rates
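The Gaussian plume component mentioned above can be illustrated with the generic textbook formula (this is a standard ground-level expression with total ground reflection, not MESORAD's actual implementation; the release rate and dispersion sigmas below are assumed):

```python
import math

# Ground-level Gaussian plume concentration for a continuous elevated release:
# C = Q / (pi * u * sigma_y * sigma_z) * exp(-y^2 / 2 sigma_y^2)
#                                      * exp(-H^2 / 2 sigma_z^2)
# (the factor includes total reflection at the ground, z = 0).

def plume_conc(Q, u, sigma_y, sigma_z, y, H):
    """Concentration (Bq/m^3) at crosswind offset y (m): release rate Q (Bq/s),
    wind speed u (m/s), dispersion parameters sigma_y, sigma_z (m), release
    height H (m)."""
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y ** 2 / (2 * sigma_y ** 2))
            * math.exp(-H ** 2 / (2 * sigma_z ** 2)))

# hypothetical centerline concentration some distance downwind
print(plume_conc(Q=1e9, u=5.0, sigma_y=80.0, sigma_z=50.0, y=0.0, H=50.0))
```

Air concentrations from a kernel like this are what semi-infinite or finite cloud approximations then convert into external dose, and deposition models into ground contamination.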

  15. Biosphere models for safety assessment of radioactive waste disposal

    International Nuclear Information System (INIS)

    Proehl, G.; Olyslaegers, G.; Zeevaert, T.; Kanyar, B.; Bergstroem, U.; Hallberg, B.; Mobbs, S.; Chen, Q.; Kowe, R.

    2004-01-01

    The aim of the BioMoSA project has been to contribute to confidence building in biosphere models for application in performance assessments of radioactive waste disposal. The detailed objectives of this project are: development and testing of practical biosphere models for application in long-term safety studies of radioactive waste disposal at different European locations; identification of features, events and processes that need to be modelled on a site-specific rather than on a generic basis; comparison of the results and quantification of the variability of site-specific models developed according to the reference biosphere methodology; development of a generic biosphere tool for application in long-term safety studies; comparison of results from site-specific models to those from the generic one; and identification of possibilities and limitations for the application of the generic biosphere model. (orig.)

  16. Immunoradiometric versus radioimmunoassay: a comparison using alpha-fetoprotein as the model analyte

    International Nuclear Information System (INIS)

    Hunter, W.M.; Budd, P.S.

    1981-01-01

    With alpha-fetoprotein as a model, a formal comparison has been made between an inhibition-type radioimmunoassay with radioiodinated antigen, liquid-phase incubation and double-antibody separation (RIA), and variants of the sandwich immunoradiometric assay (IRMA). ¹²⁵I-antibody was prepared (expensively) by labelling the IgG fraction and subsequently undertaking immunopurification to yield a reproducibly high-quality reagent, 70-75% being reactive with antigen. The same antiserum was coupled to Sepharose 4B for use in the IRMA, and separation was by the sucrose-layer procedure (Hunter, 1980), which obviates the need for centrifugation. The principal basis used for the comparison was computer-generated precision profiles (Ekins, 1976), each of which was derived from 10 assays of each kind. (Auth.)

  17. Skill and independence weighting for multi-model assessments

    International Nuclear Information System (INIS)

    Sanderson, Benjamin M.; Wehner, Michael; Knutti, Reto

    2017-01-01

    We present a weighting strategy for use with the CMIP5 multi-model archive in the fourth National Climate Assessment, which considers both skill in the climatological performance of models over North America and the inter-dependency of models arising from common parameterizations or tuning practices. The method exploits information relating to the climatological mean state of a number of projection-relevant variables, as well as metrics representing long-term statistics of weather extremes. The weights, once computed, can be used to compute weighted means and significance information from an ensemble containing multiple initial-condition members from potentially co-dependent models of varying skill. Two parameters in the algorithm determine the degree to which model climatological skill and model uniqueness are rewarded; these parameters are explored and final values are defended for the assessment. The influence of model weighting on projected temperature and precipitation changes is found to be moderate, partly due to a compensating effect between model skill and uniqueness. However, more aggressive skill weighting and weighting by targeted metrics is found to have a more significant effect on inferred ensemble confidence in future patterns of change for a given projection.
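
    The skill-and-uniqueness weighting idea can be sketched as follows. The exponential distance weighting mirrors the general approach, but the distance metrics and the two shape parameters (here d_q for skill, d_u for uniqueness) are illustrative stand-ins, not the values defended in the assessment.

```python
import math

def ensemble_weights(skill_dist, similarity, d_q, d_u):
    """Combine skill and independence into normalized model weights.

    skill_dist[i]: distance of model i's climatology from observations.
    similarity[i][j]: inter-model distance (small = near-duplicates).
    d_q, d_u: radii controlling how aggressively skill and uniqueness
    are rewarded.
    """
    n = len(skill_dist)
    w = []
    for i in range(n):
        skill = math.exp(-(skill_dist[i] / d_q) ** 2)
        # Effective number of near-duplicates of model i (including itself);
        # dividing by it down-weights clusters of co-dependent models.
        dup = sum(math.exp(-(similarity[i][j] / d_u) ** 2) for j in range(n))
        w.append(skill / dup)
    total = sum(w)
    return [x / total for x in w]

# Three models: the first two are near-copies, the third is independent.
dists = [0.5, 0.5, 0.6]
sim = [[0.0, 0.1, 2.0],
       [0.1, 0.0, 2.0],
       [2.0, 2.0, 0.0]]
weights = ensemble_weights(dists, sim, d_q=1.0, d_u=0.5)
```

    In this toy case the independent third model receives the largest weight despite slightly worse skill, because the first two are penalized as duplicates.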

  18. A Corrosion Risk Assessment Model for Underground Piping

    Science.gov (United States)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project: a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model of pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies for the underground pipes at ARC.
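
    A minimal Monte Carlo sketch of the combined-model idea: sample uncertain corrosion rates and wall thicknesses, and count how often the wall margin is consumed within the inspection horizon. The distributions and limits below are invented for illustration; the actual ARC models and data are not given in the abstract.

```python
import random

def simulate_failure_probability(n_trials, seed=42):
    """Monte Carlo estimate of pipe-segment failure probability.

    A segment is counted as failed when corrosion reduces the wall below
    a minimum thickness (a stand-in for the stress-model limit) within
    the inspection horizon. All parameter values are illustrative.
    """
    rng = random.Random(seed)
    horizon_years = 30.0
    min_wall_mm = 3.0
    failures = 0
    for _ in range(n_trials):
        wall_mm = rng.gauss(6.0, 0.5)                    # initial wall thickness
        rate_mm_per_yr = rng.lognormvariate(-2.5, 0.6)   # corrosion rate
        if wall_mm - rate_mm_per_yr * horizon_years < min_wall_mm:
            failures += 1
    return failures / n_trials

p_fail = simulate_failure_probability(10_000)
```

    Repeating the run over distributions of the model parameters themselves (rather than fixed hyperparameters) is what surfaces the model-uncertainty contribution the abstract highlights.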

  19. An analytical model for the assessment of airline expansion strategies

    Directory of Open Access Journals (Sweden)

    Mauricio Emboaba Moreira

    2014-01-01

    Purpose: The purpose of this article is to develop an analytical model to assess airline expansion strategies by combining generic business strategy models with airline business models. Methodology and approach: A number of airline business models are examined, as are Porter's (1983) five industry forces that drive competition, complemented by Nalebuff and Brandenburger's (1996) sixth force, and the basic elements of the general environment in which the expansion process takes place. A system of points and weights is developed to create a score among the 904,736 possible combinations considered. The model's outputs are generic expansion strategies with quantitative assessments for each specific combination of elements inputted. Originality and value: The analytical model developed is original because it combines, for the first time and explicitly, elements of the general environment, the industry environment, airline business models and the generic expansion strategy types. Besides, it creates a system of scores that may be used to drive the decision process toward the choice of a specific strategic expansion path. Research implications: The analytical model may be adapted to industries other than the airline industry by substituting the element "airline business model" with the corresponding elements of the other industries' business models.

  20. A Methodology to Assess Ionospheric Models for GNSS

    Science.gov (United States)

    Rovira-Garcia, Adria; Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Ibánez, Deimos

    2015-04-01

    Testing the accuracy of the ionospheric models used in the Global Navigation Satellite System (GNSS) is a long-standing issue. It is still a challenging problem due to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology to assess any ionospheric model used in satellite-based applications and, in particular, GNSS ionospheric models. The methodology complements other analyses comparing navigation based on different models to correct the code and carrier-phase observations. Specifically, the following ionospheric models are assessed: the operational models broadcast by the Global Positioning System (GPS), Galileo and the European Geostationary Navigation Overlay Service (EGNOS); the post-process Global Ionospheric Maps (GIMs) from different analysis centers belonging to the International GNSS Service (IGS); and, finally, a new GIM computed by the gAGE/UPC research group. The methodology is based on the comparison between the predictions of the ionospheric model and actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated into hardware delays (a receiver constant plus a satellite constant) per data interval, e.g., a day. The condition that these Differential Code Biases (DCBs) are commonly shared throughout the worldwide network of receivers and satellites gives the assessment a global character. This approach generalizes simple tests based on double-differenced Slant Total Electron Contents (STECs) between pairs of satellites and receivers on a much more local scale. The present study was conducted over the entirety of 2014, i.e., during the last solar maximum. The seasonal and latitudinal structures of the results clearly reflect the different strategies used by the different models. On one hand, ionospheric model corrections based on a grid (IGS-GIMs or EGNOS) are shown to be several times better than the models
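
    The separation of model-minus-measurement residuals into per-receiver and per-satellite constants can be illustrated with a toy two-way decomposition on synthetic, noise-free residuals. The real assessment solves an analogous (much larger) system over a global network per data interval; the numbers here are invented.

```python
def separate_biases(residuals):
    """Split residuals r[i][j] (receiver i, satellite j) into a per-receiver
    constant, a per-satellite constant, and a leftover misfit.

    The split is only defined up to a common offset; here it is fixed by
    zero-meaning the satellite constants, a common convention.
    """
    n_rec = len(residuals)
    n_sat = len(residuals[0])
    grand = sum(sum(row) for row in residuals) / (n_rec * n_sat)
    sat_bias = [sum(residuals[i][j] for i in range(n_rec)) / n_rec - grand
                for j in range(n_sat)]
    rec_bias = [sum(residuals[i]) / n_sat for i in range(n_rec)]
    misfit = [[residuals[i][j] - rec_bias[i] - sat_bias[j]
               for j in range(n_sat)] for i in range(n_rec)]
    return rec_bias, sat_bias, misfit

# Synthetic residuals built from known constants (no model error).
true_rec = [1.2, -0.4, 0.7]
true_sat = [0.3, -0.3]          # zero-mean by construction
r = [[a + b for b in true_sat] for a in true_rec]
rec_hat, sat_hat, misfit = separate_biases(r)
```

    Whatever remains in the misfit after removing the two sets of constants is attributed to ionospheric model error, which is the quantity the methodology evaluates.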

  1. Risk assessment and remedial policy evaluation using predictive modeling

    International Nuclear Information System (INIS)

    Linkov, L.; Schell, W.R.

    1996-01-01

    As a result of nuclear industry operation and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the fate of radionuclides in forest compartments after deposition, as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. A risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about a 0.004% risk of fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus, and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in environmental remediation of radionuclides and toxic metals, as well as in dose reconstruction and risk assessment.

  2. Peer Assessment with Online Tools to Improve Student Modeling

    Science.gov (United States)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  3. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol.2

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM), is a microcomputer-based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM, including the methods, models and assumptions used in calculations. IRDAM calculates whole-body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. The radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input), in the text of this document. Two companion volumes provide additional information on IRDAM. The User's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of the equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  4. A critique of recent models for human error rate assessment

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    1988-01-01

    This paper critically reviews two groups of models for assessing human error rates under accident conditions. The first group, which includes the US Nuclear Regulatory Commission (NRC) handbook model and the human cognitive reliability (HCR) model, considers as fundamental the time that is available to the operators to act. The second group, which is represented by the success likelihood index methodology multiattribute utility decomposition (SLIM-MAUD) model, relies on ratings of the human actions with respect to certain qualitative factors and the subsequent derivation of error rates. These models are evaluated with respect to two criteria: the treatment of uncertainties and the internal coherence of the models. In other words, this evaluation focuses primarily on normative aspects of these models. The principal findings are as follows: (1) Both of the time-related models provide human error rates as a function of the available time for action and the prevailing conditions. However, the HCR model ignores the important issue of state-of-knowledge uncertainties, dealing exclusively with stochastic uncertainty, whereas the model presented in the NRC handbook handles both types of uncertainty. (2) SLIM-MAUD provides a highly structured approach for the derivation of human error rates under given conditions. However, the treatment of the weights and ratings in this model is internally inconsistent. (author)

  5. Assessment of the Eu migration experiments and their modelling

    International Nuclear Information System (INIS)

    Klotz, D.

    2001-01-01

    The humic-acid-mediated transport of heavy metals in groundwater was investigated in laboratory experiments using the lanthanide Eu in the form of ¹⁵²Eu³⁺, which serves both as a model heavy metal and as an indicator for assessing the potential hazards of final repositories for radioactive waste.

  6. Application of mixed models for the assessment genotype and ...

    African Journals Online (AJOL)

    Application of mixed models for the assessment genotype and environment interactions in cotton ( Gossypium hirsutum ) cultivars in Mozambique. ... The cultivars ISA 205, STAM 42 and REMU 40 showed superior productivity when they were selected by the Harmonic Mean of Genotypic Values (HMGV) criterion in relation ...

  7. Groundwater Impacts of Radioactive Wastes and Associated Environmental Modeling Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Rui; Zheng, Chunmiao; Liu, Chongxuan

    2012-11-01

    This article provides a review of the major sources of radioactive wastes and their impacts on groundwater contamination. The review discusses the major biogeochemical processes that control the transport and fate of radionuclide contaminants in groundwater, and describes the evolution of mathematical models designed to simulate and assess the transport and transformation of radionuclides in groundwater.

  8. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  9. Confidence Intervals for Assessing Heterogeneity in Generalized Linear Mixed Models

    Science.gov (United States)

    Wagler, Amy E.

    2014-01-01

    Generalized linear mixed models are frequently applied to data with clustered categorical outcomes. The effect of clustering on the response is often difficult to practically assess partly because it is reported on a scale on which comparisons with regression parameters are difficult to make. This article proposes confidence intervals for…

  10. Modeling current climate conditions for forest pest risk assessment

    Science.gov (United States)

    Frank H. Koch; John W. Coulston

    2010-01-01

    Current information on broad-scale climatic conditions is essential for assessing potential distribution of forest pests. At present, sophisticated spatial interpolation approaches such as the Parameter-elevation Regressions on Independent Slopes Model (PRISM) are used to create high-resolution climatic data sets. Unfortunately, these data sets are based on 30-year...

  11. Assessment of the Quality Management Models in Higher Education

    Science.gov (United States)

    Basar, Gulsun; Altinay, Zehra; Dagli, Gokmen; Altinay, Fahriye

    2016-01-01

    This study involves the assessment of the quality management models in Higher Education by explaining the importance of quality in higher education and by examining the higher education quality assurance system practices in other countries. The qualitative study was carried out with the members of the Higher Education Planning, Evaluation,…

  12. Model assessment of protective barrier designs: Part 2

    International Nuclear Information System (INIS)

    Fayer, M.J.

    1987-11-01

    Protective barriers are being considered for use at the Hanford Site to enhance the isolation of radioactive wastes from water, plant, and animal intrusion. This study assesses the effectiveness of protective barriers for isolation of wastes from water. In this report, barrier designs are reviewed and several barrier modeling assumptions are tested. 20 refs., 16 figs., 6 tabs

  13. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  14. Heuristic Model Of The Composite Quality Index Of Environmental Assessment

    Science.gov (United States)

    Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.

    2017-01-01

    The goal of the paper is to present a heuristic model of the composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in linear-quadratic form; it more adequately reflects the assessment preferences of experts and decision-makers.

  15. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  16. A model for assessing Medicago Sativa L. hay quality | Scholtz ...

    African Journals Online (AJOL)

    A study was conducted to identify chemical parameters and/or models for assessing. Medicago sativa L. (L) hay quality, using near infrared reflectance spectroscopy (NIRS) analysis and Cornell Net Carbohydrate and Protein System (CNCPS) milk prediction as a criterion of accuracy. Milk yield (MY) derived from the ...

  17. A Comprehensive Assessment Model for Critical Infrastructure Protection

    Directory of Open Access Journals (Sweden)

    Häyhtiö Markus

    2017-12-01

    International business demands seamless service and IT infrastructure throughout the entire supply chain. However, dependencies between different parts of this vulnerable ecosystem form a fragile web. Assessment of the financial effects of any abnormality in any part of the network is needed in order to protect the network in a financially viable way. The contractual environment between the actors in a supply chain, spanning different business domains and functions, requires a management model that enables network-wide protection of critical infrastructure. In this paper, the authors introduce such a model. It can be used to assess the financial differences between centralized and decentralized protection of critical infrastructure. As an end result of this assessment, business resilience to unknown threats can be improved across the entire supply chain.

  18. Validation study of safety assessment model for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Munakata, Masahiro; Takeda, Seiji; Kimura, Hideo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    The JAERI-AECL collaborative research program has been conducted to validate groundwater flow and radionuclide transport models for safety assessment. JAERI has developed a geostatistical model for radionuclide transport through heterogeneous geological media and verified it using the results of field tracer tests. The simulated tracer plumes reproduce the experimental tracer plumes favourably. A regional groundwater flow and transport model, using site-scale parameters obtained from the tracer tests, has been verified by comparing simulation results with observations of natural environmental tracers. (author)

  19. Model Test Bed for Evaluating Wave Models and Best Practices for Resource Assessment and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Neary, Vincent Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Yang, Zhaoqing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Wang, Taiping [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Gunawan, Budi [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Dallman, Ann Renee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies

    2016-03-01

    A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101 Ed. 1.0 (2015). Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.

  20. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways, resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks', reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  1. Assessing groundwater policy with coupled economic-groundwater hydrologic modeling

    Science.gov (United States)

    Mulligan, Kevin B.; Brown, Casey; Yang, Yi-Chen E.; Ahlfeld, David P.

    2014-03-01

    This study explores groundwater management policies and the effect of modeling assumptions on the projected performance of those policies. The study compares an optimal economic allocation for groundwater use subject to streamflow constraints, achieved by a central planner with perfect foresight, with a uniform tax on groundwater use and a uniform quota on groundwater use. The policies are compared with two modeling approaches, the Optimal Control Model (OCM) and the Multi-Agent System Simulation (MASS). The economic decision models are coupled with a physically based representation of the aquifer using a calibrated MODFLOW groundwater model. The results indicate that uniformly applied policies perform poorly when simulated with more realistic, heterogeneous, myopic, and self-interested agents. In particular, the effects of the physical heterogeneity of the basin and the agents undercut the perceived benefits of policy instruments assessed with simple, single-cell groundwater modeling. This study demonstrates the results of coupling realistic hydrogeology and human behavior models to assess groundwater management policies. The Republican River Basin, which overlies a portion of the Ogallala aquifer in the High Plains of the United States, is used as a case study for this analysis.

  2. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  3. Assessment of health surveys: fitting a multidimensional graded response model.

    Science.gov (United States)

    Depaoli, Sarah; Tiemensma, Jitske; Felt, John M

    The multidimensional graded response model, an item response theory (IRT) model, can be used to improve the assessment of surveys, even when sample sizes are restricted. Typically, health-based survey development utilizes classical statistical techniques (e.g. reliability and factor analysis). In a review of four prominent journals within the field of Health Psychology, we found that IRT-based models were used in less than 10% of the studies examining scale development or assessment. However, implementing IRT-based methods can provide more details about individual survey items, which is useful when determining the final item content of surveys. An example using a quality of life survey for Cushing's syndrome (CushingQoL) highlights the main components for implementing the multidimensional graded response model. Patients with Cushing's syndrome (n = 397) completed the CushingQoL. Results from the multidimensional graded response model supported a 2-subscale scoring process for the survey. All items were deemed as worthy contributors to the survey. The graded response model can accommodate unidimensional or multidimensional scales, be used with relatively lower sample sizes, and is implemented in free software (example code provided in online Appendix). Use of this model can help to improve the quality of health-based scales being developed within the Health Sciences.
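
    The (unidimensional) graded response model underlying the multidimensional version can be sketched in a few lines: each cumulative probability P(X ≥ k) follows a two-parameter logistic curve, and category probabilities are differences of adjacent curves. The discrimination and threshold values below are invented for illustration.

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Category response probabilities under Samejima's graded response model.

    theta: latent trait level, a: item discrimination,
    thresholds: increasing category boundaries b_1 < ... < b_{K-1}.
    """
    def p_star(b):
        # P(X >= category with boundary b): a 2PL curve.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    cum = [1.0] + [p_star(b) for b in thresholds] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]

# Illustrative 4-category survey item answered by a respondent with theta = 0.5.
probs = grm_category_probs(theta=0.5, a=1.5, thresholds=[-1.0, 0.0, 1.2])
```

    Inspecting how steep each item's curves are (the discrimination) and where the thresholds sit is what lets IRT-based development judge individual items when deciding final survey content.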

  4. Modeling human intention formation for human reliability assessment

    International Nuclear Information System (INIS)

    Woods, D.D.; Roth, E.M.; Pople, H. Jr.

    1988-01-01

    This paper describes a dynamic simulation capability for modeling how people form intentions to act in nuclear power plant emergency situations. This modeling tool, the Cognitive Environment Simulation (CES), was developed using techniques from artificial intelligence. It simulates the cognitive processes that determine situation assessment and intention formation. It can be used to investigate analytically what situations and factors lead to intention failures; what actions follow from intention failures (e.g., errors of omission, errors of commission, common-mode errors); the ability to recover from errors or additional machine failures; and the effects of changes in the NPP person-machine system. One application of the CES modeling environment is to enhance the measurement of the human contribution to risk in probabilistic risk assessment studies. (author)

  5. Connecting single-stock assessment models through correlated survival

    DEFF Research Database (Denmark)

    Albertsen, Christoffer Moesgaard; Nielsen, Anders; Thygesen, Uffe Høgsbro

    2017-01-01

    times. We propose a simple alternative. In three case studies each with two stocks, we improve the single-stock models, as measured by Akaike information criterion, by adding correlation in the cohort survival. To limit the number of parameters, the correlations are parameterized through...... the corresponding partial correlations. We consider six models where the partial correlation matrix between stocks follows a band structure ranging from independent assessments to complex correlation structures. Further, a simulation study illustrates the importance of handling correlated data sufficiently...... by investigating the coverage of confidence intervals for estimated fishing mortality. The results presented will allow managers to evaluate stock statuses based on a more accurate evaluation of model output uncertainty. The methods are directly implementable for stocks with an analytical assessment and do...
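    The banded partial-correlation parameterization mentioned above can be sketched using the standard identity between partial correlations and the precision matrix; the dimension, bandwidth, and rho value are illustrative, not taken from the case studies:

```python
import numpy as np

def corr_from_band_partials(n, rho, bandwidth=1):
    """Correlation matrix implied by a banded partial-correlation
    structure: partial correlation rho inside the band, 0 outside.
    Uses the identity partial_ij = -Prec_ij / sqrt(Prec_ii * Prec_jj)."""
    prec = np.eye(n)
    for i in range(n):
        for j in range(n):
            if i != j and abs(i - j) <= bandwidth:
                prec[i, j] = -rho
    cov = np.linalg.inv(prec)       # invert precision to get covariance
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)     # standardize to a correlation matrix

R = corr_from_band_partials(n=4, rho=0.3, bandwidth=1)
```

Note that a band of nonzero partial correlations induces nonzero marginal correlations everywhere, which is why parameterizing through partial correlations keeps the parameter count low.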

  6. Dynamic model based on Bayesian method for energy security assessment

    International Nuclear Information System (INIS)

    Augutis, Juozas; Krikštolaitis, Ričardas; Pečiulytė, Sigita; Žutautaitė, Inga

    2015-01-01

    Highlights: • Methodology for dynamic indicator model construction and forecasting of indicators. • Application of dynamic indicator model for energy system development scenarios. • Expert judgement involvement using Bayesian method. - Abstract: The methodology for the dynamic indicator model construction and forecasting of indicators for the assessment of energy security level is presented in this article. An indicator is a special index, which provides numerical values for factors important to the investigated area. In real life, models of different processes take into account various factors that are time-dependent and dependent on each other. Thus, it is advisable to construct a dynamic model in order to describe these dependences. The energy security indicators are used as factors in the dynamic model. Usually, the values of indicators are obtained from statistical data. The developed dynamic model enables forecasting of indicators' variation, taking into account changes in system configuration. The energy system development is usually based on the construction of a new object. Since the parameters of changes of the new system are not exactly known, information about their influence on indicators cannot be incorporated into the model by deterministic methods. Thus, the dynamic indicator model based on historical data is adjusted by a probabilistic model that captures the influence of new factors on indicators using the Bayesian method
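    The Bayesian adjustment of a historical-data forecast by expert judgement can be illustrated with the simplest conjugate case, a normal-normal precision-weighted update; the numbers are invented and the article's actual model is more elaborate:

```python
def bayes_update_normal(prior_mean, prior_var, obs_mean, obs_var):
    """Conjugate normal-normal update: combine a historical-data forecast
    (prior) with new, expert-elicited information (likelihood) by
    precision weighting."""
    w_prior = 1.0 / prior_var
    w_obs = 1.0 / obs_var
    post_var = 1.0 / (w_prior + w_obs)
    post_mean = post_var * (w_prior * prior_mean + w_obs * obs_mean)
    return post_mean, post_var

# Historical forecast of an indicator vs. expert view of the new system
m, v = bayes_update_normal(prior_mean=0.8, prior_var=0.04,
                           obs_mean=0.6, obs_var=0.04)
# equal precisions -> posterior mean midway (0.7), variance halved (0.02)
```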

  7. An Improved Nested Sampling Algorithm for Model Selection and Assessment

    Science.gov (United States)

    Zeng, X.; Ye, M.; Wu, J.; WANG, D.

    2017-12-01

    A multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, each assigned a weight representing the plausibility of that model. In the Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed the model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. The implementation of NSE comprises searching the parameter space gradually from the low-likelihood region to the high-likelihood region, an evolution carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it is feasible to integrate a more efficient and elaborate sampling algorithm, DREAMzs, into the local sampling step. In addition, to overcome the computational burden of the large number of repeated model executions in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
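    A toy sketch of the nested sampling estimator on a one-dimensional problem with known evidence is given below. The constrained draw uses naive rejection from the prior, which is exactly the local sampling step the record says dominates efficiency and motivates replacing M-H with DREAMzs; all parameter values are illustrative:

```python
import math
import random

def logaddexp(a, b):
    """Numerically stable log(exp(a) + exp(b))."""
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling_logZ(loglike, prior_sample, n_live=50, n_iter=300, seed=1):
    """Minimal nested sampling estimate of the log marginal likelihood.
    The worst live point is repeatedly removed, credited with a shell of
    prior volume, and replaced by a prior draw with higher likelihood."""
    rng = random.Random(seed)
    live = [prior_sample(rng) for _ in range(n_live)]
    logL = [loglike(x) for x in live]
    logZ = -math.inf
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda k: logL[k])
        # shell of prior volume between X_i = exp(-i/N) and X_{i+1}
        logw = -i / n_live + math.log(1.0 - math.exp(-1.0 / n_live))
        logZ = logaddexp(logZ, logL[worst] + logw)
        while True:  # naive rejection: prior draw above the likelihood floor
            x = prior_sample(rng)
            if loglike(x) > logL[worst]:
                live[worst], logL[worst] = x, loglike(x)
                break
    return logZ

# Toy check: uniform prior on [0, 1], likelihood L(x) = 2x, so Z = 1 (logZ = 0)
est = nested_sampling_logZ(lambda x: math.log(2.0 * x + 1e-300),
                           lambda rng: rng.random())
```

In practice the rejection step becomes prohibitively slow as the likelihood floor rises, which is the bottleneck the DREAMzs integration is meant to remove.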

  8. Agent Model Development for Assessing Climate-Induced Geopolitical Instability.

    Energy Technology Data Exchange (ETDEWEB)

    Boslough, Mark B.; Backus, George A.

    2005-12-01

    We present the initial stages of development of new agent-based computational methods to generate and test hypotheses about linkages between environmental change and international instability. This report summarizes the first year's effort of an originally proposed three-year Laboratory Directed Research and Development (LDRD) project. The preliminary work focused on a set of simple agent-based models and benefited from lessons learned in previous related projects and case studies of human response to climate change and environmental scarcity. Our approach was to define a qualitative model using extremely simple cellular agent models akin to Lovelock's Daisyworld and Schelling's segregation model. Such models do not require significant computing resources, and users can modify behavior rules to gain insights. One of the difficulties in agent-based modeling is finding the right balance between model simplicity and real-world representation. Our approach was to keep agent behaviors as simple as possible during the development stage (described herein) and to ground them with a realistic geospatial Earth system model in subsequent years. This work is directed toward incorporating projected climate data--including various CO2 scenarios from the Intergovernmental Panel on Climate Change (IPCC) Third Assessment Report--and ultimately toward coupling a useful agent-based model to a general circulation model.
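    To give a flavour of the "extremely simple cellular agent models" the report cites, here is a one-dimensional Schelling-style sweep; the ring topology, happiness threshold, and swap rule are illustrative simplifications, not the report's model:

```python
import random

def schelling_step(grid, threshold=0.5, rng=None):
    """One sweep of a minimal Schelling-style segregation model on a ring:
    an agent is unhappy if fewer than `threshold` of its two neighbours
    share its type; unhappy agents are paired at random and swap places."""
    rng = rng or random.Random(0)
    n = len(grid)
    unhappy = []
    for i, a in enumerate(grid):
        same = sum(grid[(i + d) % n] == a for d in (-1, 1))
        if same / 2 < threshold:
            unhappy.append(i)
    rng.shuffle(unhappy)
    for i, j in zip(unhappy[::2], unhappy[1::2]):
        grid[i], grid[j] = grid[j], grid[i]  # swap locations
    return grid

# Maximally mixed start: every agent is unhappy and the sweep reshuffles them
state = schelling_step([0, 1] * 10)
```

Iterating such sweeps shows segregation emerging from purely local rules, which is the kind of qualitative insight the report seeks before grounding agents in a geospatial Earth system model.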

  9. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results...... from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified, which could have significant impacts on the results......, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing...

  10. Assessing uncertainty in SRTM elevations for global flood modelling

    Science.gov (United States)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these upon flood prediction. An assessment of SRTM error was carried out by Rodriguez et al. (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models: the floodplain. Therefore, this study undertakes this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LiDAR elevations for three deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LiDAR, perturbations of the 90 m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above the native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using the LISFLOOD-FP hydrodynamic model, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product of this is an inundation map giving the probability of each pixel being flooded, based on the catalogue of DEMs. In a world of increasing computer power but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
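    Drawing one plausible DEM by adding spatially correlated vertical error, as described above, can be sketched with a Gaussian field sampled via Cholesky factorization. The exponential covariance form, the parameter values, and the dense-matrix approach are assumptions for illustration; real SRTM-scale grids would need sparser or spectral methods:

```python
import numpy as np

def perturb_dem(dem, sigma, corr_length, cell_size, rng=None):
    """Add a zero-mean Gaussian error field with exponential covariance
    C(h) = sigma^2 * exp(-h / corr_length) to a small DEM, producing one
    member of a catalogue of plausible DEMs."""
    rng = rng or np.random.default_rng(0)
    ny, nx = dem.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    pts = np.column_stack([yy.ravel(), xx.ravel()]) * cell_size
    h = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    cov = sigma**2 * np.exp(-h / corr_length)
    # jitter keeps the Cholesky factorization numerically stable
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(pts)))
    error = (L @ rng.standard_normal(len(pts))).reshape(ny, nx)
    return dem + error

dem = np.zeros((8, 8))                       # toy flat terrain
sample = perturb_dem(dem, sigma=3.0, corr_length=90.0, cell_size=90.0)
```

Repeating the draw yields the catalogue of DEMs; running the flood model over each member gives the per-pixel flooding probabilities the study reports.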

  11. GEMA3D - landscape modelling for dose assessments

    International Nuclear Information System (INIS)

    Klos, Richard

    2010-08-01

    Concerns have been raised about SKB's interpretation of landscape objects in their radiological assessment models, specifically in relation to the size of the objects represented - leading to excessive volumetric dilution - and to the interpretation of local hydrology - leading to non-conservative hydrologic dilution. Developed from the Generic Ecosystem Modelling Approach, GEMA3D is an attempt to address these issues in a simple radiological assessment landscape model. In GEMA3D, landscape features are modelled as landscape elements (lels) based on a three-compartment structure able to represent both terrestrial and aquatic lels. The area of the lels can be chosen to coincide with the bedrock fracture from which radionuclides are assumed to be released, and the dispersion of radionuclides throughout the landscape can be traced. Results indicate that released contaminants remain localised close to the release location and follow the main flow axis of the surface drainage system. This is true even for relatively weakly sorbing species. An interpretation of the size of landscape elements suitable to represent dilution in the biosphere for radiological assessment purposes is suggested, though the concept remains flexible. For reference purposes, an agricultural area of one hectare is the baseline. The Quaternary deposits (QD) at the Forsmark site are only a few metres thick above the crystalline bedrock in which the planned repository for spent fuel will be constructed. The biosphere model is assumed to be the upper one metre of the QD. A further model has been implemented for advective-dispersive transport in the deeper QD. The effects of chemical zonation have been briefly investigated. The results confirm the importance of retention close to the release point from the bedrock and clearly indicate that there is a need for a better description of the hydrology of the QD on the spatial scales relevant to the lels required for radiological assessments
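    A three-compartment landscape element of the kind GEMA3D describes reduces to first-order transfers between compartments. The sketch below uses explicit Euler and invented transfer rates, not GEMA3D's parameterization:

```python
import numpy as np

def step_compartments(inv, k, dt):
    """One explicit Euler step of a first-order compartment model.
    inv[i]: inventory in compartment i; k[i][j]: transfer rate (per year)
    from compartment i to compartment j."""
    inv = np.asarray(inv, dtype=float)
    k = np.asarray(k, dtype=float)
    outflow = k.sum(axis=1) * inv  # total leaving each compartment
    inflow = k.T @ inv             # total arriving from the others
    return inv + dt * (inflow - outflow)

# Hypothetical chain: soil -> surface water -> sediment, no external sinks
inv = [1.0, 0.0, 0.0]
k = [[0.0, 0.2, 0.0],
     [0.0, 0.0, 0.1],
     [0.0, 0.0, 0.0]]
new_inv = step_compartments(inv, k, dt=0.1)
```

Chaining many such elements along the surface drainage axis is what lets the approach trace dispersion of radionuclides through the landscape.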

  12. GEMA3D - landscape modelling for dose assessments

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard (Aleksandria Sciences (United Kingdom))

    2010-08-15

    Concerns have been raised about SKB's interpretation of landscape objects in their radiological assessment models, specifically in relation to the size of the objects represented - leading to excessive volumetric dilution - and to the interpretation of local hydrology - leading to non-conservative hydrologic dilution. Developed from the Generic Ecosystem Modelling Approach, GEMA3D is an attempt to address these issues in a simple radiological assessment landscape model. In GEMA3D, landscape features are modelled as landscape elements (lels) based on a three-compartment structure able to represent both terrestrial and aquatic lels. The area of the lels can be chosen to coincide with the bedrock fracture from which radionuclides are assumed to be released, and the dispersion of radionuclides throughout the landscape can be traced. Results indicate that released contaminants remain localised close to the release location and follow the main flow axis of the surface drainage system. This is true even for relatively weakly sorbing species. An interpretation of the size of landscape elements suitable to represent dilution in the biosphere for radiological assessment purposes is suggested, though the concept remains flexible. For reference purposes, an agricultural area of one hectare is the baseline. The Quaternary deposits (QD) at the Forsmark site are only a few metres thick above the crystalline bedrock in which the planned repository for spent fuel will be constructed. The biosphere model is assumed to be the upper one metre of the QD. A further model has been implemented for advective-dispersive transport in the deeper QD. The effects of chemical zonation have been briefly investigated. The results confirm the importance of retention close to the release point from the bedrock and clearly indicate that there is a need for a better description of the hydrology of the QD on the spatial scales relevant to the lels required for radiological assessments

  13. The Assessment of Patient Clinical Outcome: Advantages, Models, Features of an Ideal Model

    Directory of Open Access Journals (Sweden)

    Mou’ath Hourani

    2016-06-01

    Full Text Available Background: The assessment of patient clinical outcome focuses on measuring various aspects of the health status of a patient who is under healthcare intervention. Patient clinical outcome assessment is a very significant process in the clinical field, as it allows health care professionals to better understand the effectiveness of their health care programs and thus to enhance health care quality in general. It is thus vital that a high-quality, informative review of current issues regarding the assessment of patient clinical outcome be conducted. Aims & Objectives: 1) summarize the advantages of the assessment of patient clinical outcome; 2) review some of the existing patient clinical outcome assessment models, namely: simulation, Markov, Bayesian belief networks, Bayesian statistics and conventional statistics, and Kaplan-Meier analysis models; and 3) demonstrate the desired features that should be fulfilled by a well-established ideal patient clinical outcome assessment model. Material & Methods: An integrative review of the literature was performed using Google Scholar to explore the field of patient clinical outcome assessment. Conclusion: This paper will directly support researchers, clinicians and health care professionals in their understanding of developments in the domain of the assessment of patient clinical outcome, thus enabling them to propose ideal assessment models.
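    Of the model families the review lists, the Markov model is the easiest to sketch: a cohort distribution over health states is propagated through a transition matrix. The states and probabilities below are invented for illustration, not taken from the paper:

```python
import numpy as np

def markov_cohort(p_transition, start, n_cycles):
    """Markov cohort model for patient outcome assessment: propagate a
    cohort distribution over health states through n_cycles transitions."""
    dist = np.asarray(start, dtype=float)
    P = np.asarray(p_transition, dtype=float)
    history = [dist]
    for _ in range(n_cycles):
        dist = dist @ P
        history.append(dist)
    return np.array(history)

# Hypothetical states: well / ill / dead (absorbing)
P = [[0.85, 0.10, 0.05],
     [0.20, 0.60, 0.20],
     [0.00, 0.00, 1.00]]
traj = markov_cohort(P, start=[1.0, 0.0, 0.0], n_cycles=10)
```

Each row of `traj` is the cohort distribution after one more cycle; summary measures such as expected time spent in each state follow directly from it.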

  15. Avian collision risk models for wind energy impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    Masden, E.A., E-mail: elizabeth.masden@uhi.ac.uk [Environmental Research Institute, North Highland College-UHI, University of the Highlands and Islands, Ormlie Road, Thurso, Caithness KW14 7EE (United Kingdom); Cook, A.S.C.P. [British Trust for Ornithology, The Nunnery, Thetford IP24 2PU (United Kingdom)

    2016-01-15

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, the inclusion of stationary components (e.g. the tower), angle of approach, and uncertainty. Ten models were cited in the literature; all included the probability of a single bird colliding with a wind turbine during passage through the rotor-swept area, and the majority included a measure of the number of birds at risk. Seven of the ten models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculating the probability of collision, which were reused across models. Six of the ten models were deterministic and included the most frequently used models in the UK, with only four including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.
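    The shared structure the review identifies (a per-passage collision probability scaled by the number of birds at risk, discounted by an avoidance rate) can be sketched as follows, with all numbers hypothetical:

```python
def expected_collisions(n_transits, p_collision, avoidance_rate=0.98):
    """Generic CRM bookkeeping: expected collisions =
    (birds passing through the rotor-swept area)
    x (probability a non-avoiding bird is struck)
    x (1 - avoidance rate)."""
    return n_transits * p_collision * (1.0 - avoidance_rate)

est = expected_collisions(n_transits=5000, p_collision=0.1,
                          avoidance_rate=0.98)
# 5000 * 0.1 * 0.02 = 10 expected collisions
```

Because the result scales linearly with `1 - avoidance_rate`, small changes in an assumed avoidance constant dominate the answer, which is one reason the review stresses representing uncertainty rather than using fixed constants.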

  16. Avian collision risk models for wind energy impact assessments

    International Nuclear Information System (INIS)

    Masden, E.A.; Cook, A.S.C.P.

    2016-01-01

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, the inclusion of stationary components (e.g. the tower), angle of approach, and uncertainty. Ten models were cited in the literature; all included the probability of a single bird colliding with a wind turbine during passage through the rotor-swept area, and the majority included a measure of the number of birds at risk. Seven of the ten models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculating the probability of collision, which were reused across models. Six of the ten models were deterministic and included the most frequently used models in the UK, with only four including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.

  17. Modeling risk assessment for nuclear processing plants with LAVA

    International Nuclear Information System (INIS)

    Smith, S.T.; Tisinger, R.M.

    1988-01-01

    Using the Los Alamos Vulnerability and Risk Assessment (LAVA) methodology, the authors developed a model for assessing risks associated with nuclear processing plants. LAVA is a three-part systematic approach to risk assessment. The first part is the mathematical methodology; the second is the general personal computer-based software engine; and the third is the application itself. The methodology provides a framework for creating applications for the software engine to operate upon; all application-specific information is data. Using LAVA, the authors build knowledge-based expert systems to assess risks in applications systems comprising a subject system and a safeguards system. The subject system model is sets of threats, assets, and undesirable outcomes. The safeguards system model is sets of safeguards functions for protecting the assets from the threats by preventing or ameliorating the undesirable outcomes, sets of safeguards subfunctions whose performance determine whether the function is adequate and complete, and sets of issues, appearing as interactive questionnaires, whose measures (in both monetary and linguistic terms) define both the weaknesses in the safeguards system and the potential costs of an undesirable outcome occurring

  18. Radionuclide release rates from spent fuel for performance assessment modeling

    International Nuclear Information System (INIS)

    Curtis, D.B.

    1994-01-01

    In a scenario of aqueous transport from a high-level radioactive waste repository, the concentration of radionuclides in water in contact with the waste constitutes the source term for transport models, and as such represents a fundamental component of all performance assessment models. Many laboratory experiments have been done to characterize release rates and understand processes influencing radionuclide release rates from irradiated nuclear fuel. Natural analogues of these waste forms have been studied to obtain information regarding the long-term stability of potential waste forms in complex natural systems. This information from diverse sources must be brought together to develop and defend methods used to define source terms for performance assessment models. In this manuscript examples of measures of radionuclide release rates from spent nuclear fuel or analogues of nuclear fuel are presented. Each example represents a very different approach to obtaining a numerical measure and each has its limitations. There is no way to obtain an unambiguous measure of this or any parameter used in performance assessment codes for evaluating the effects of processes operative over many millennia. The examples are intended to suggest by example that in the absence of the ability to evaluate accuracy and precision, consistency of a broadly based set of data can be used as circumstantial evidence to defend the choice of parameters used in performance assessments

  19. Training courses on integrated safety assessment modelling for waste repositories

    International Nuclear Information System (INIS)

    Mallants, D.

    2007-01-01

    Near-surface or deep repositories of radioactive waste are being developed and evaluated all over the world. Also, existing repositories for low- and intermediate-level waste often need to be re-evaluated to extend their license or to obtain permission for final closure. The evaluation encompasses both a technical feasibility as well as a safety analysis. The long term safety is usually demonstrated by means of performance or safety assessment. For this purpose computer models are used that calculate the migration of radionuclides from the conditioned radioactive waste, through engineered barriers to the environment (groundwater, surface water, and biosphere). Integrated safety assessment modelling addresses all relevant radionuclide pathways from source to receptor (man), using in combination various computer codes in which the most relevant physical, chemical, mechanical, or even microbiological processes are mathematically described. SCK-CEN organizes training courses in Integrated safety assessment modelling that are intended for individuals who have either a controlling or supervising role within the national radwaste agencies or regulating authorities, or for technical experts that carry out the actual post-closure safety assessment for an existing or new repository. Courses are organised by the Department of Waste and Disposal

  20. Comparison of models used for ecological risk assessment and human health risk assessment

    International Nuclear Information System (INIS)

    Ryti, R.T.; Gallegos, A.F.

    1994-01-01

    Models are used to derive action levels for site screening, or to estimate potential ecological or human health risks posed by potentially hazardous sites. At the Los Alamos National Laboratory (LANL), which is RCRA-regulated, the human-health screening action levels are based on hazardous constituents described in RCRA Subpart S and RESRAD-derived soil guidelines (based on 10 mrem/year) for radiological constituents. Also, an ecological risk screening model was developed for a former firing site, where the primary constituents include depleted uranium, beryllium and lead. Sites that fail the screening models are evaluated with site-specific human risk assessment (using RESRAD and other approaches) and a detailed ecological effect model (ECOTRAN). ECOTRAN is based on pharmacokinetic transport modeling within a multitrophic-level biological-growth dynamics model. ECOTRAN provides detailed temporal records of contaminant concentrations in biota, and annual averages of these body burdens are compared to equivalent site-specific runs of the RESRAD model. The results show that thoughtful interpretation of the results of these models is required before they can be used to evaluate the current risk posed by sites and the benefits of various remedial options. This presentation compares the concentrations in biological media in the RESRAD screening runs to the concentrations in ecological endpoints predicted by the ecological screening model. The assumptions and limitations of these screening models, and the decision process in which these screening models are applied, are discussed

  1. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  2. Student Generated Rubrics: An Assessment Model To Help All Students Succeed. Assessment Bookshelf Series.

    Science.gov (United States)

    Ainsworth, Larry; Christinson, Jan

    The assessment model described in this guide was initially developed by a team of fifth-grade teachers who wrote objectives of integrating social studies and language arts. It helps the teacher guide students to create a task-specific rubric that they use to evaluate their own and peers' work. Teachers review the student evaluations, determine the…

  3. Industrial process system assessment: bridging process engineering and life cycle assessment through multiscale modeling.

    Science.gov (United States)

    The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...

  4. AgMIP: Next Generation Models and Assessments

    Science.gov (United States)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6

  5. Cost Model for Risk Assessment of Company Operation in Audit

    Directory of Open Access Journals (Sweden)

    S. V.

    2017-12-01

    Full Text Available This article explores an approach to assessing the risk of termination of company activities by building a cost model. This model gives auditors information on managers’ understanding of the factors influencing change in the value of assets and liabilities, and on methods to identify it in more effective and reliable ways. Based on this information, the auditor can assess the adequacy of use of the assumption of continuity of company operation by management personnel when preparing financial statements. Financial uncertainty entails real manifestations of factors creating risks of the occurrence of costs and revenue losses due to their manifestations, which in the long run can be a reason for termination of company operation and therefore need to be foreseen in the auditor’s assessment of the adequacy of use of the continuity assumption when preparing financial statements by company management. The purpose of the study is to explore and develop a methodology for the use of cost models to assess the risk of termination of company operation in audit. The issue of a methodology for assessing audit risk through analysis of company valuation methods has not previously been dealt with. The review of methodologies for assessing the risks of termination of company operation in the course of audit gives grounds for the conclusion that the use of cost models can be an effective methodology for the identification and assessment of such risks. The analysis of the above methods gives an understanding of the existing system for company valuation, integrated into the management system, and the consequences of its use, i.e. comparison of the asset price data with the accounting data and the market value of the asset data. Overvalued or undervalued company assets may be a sign of a future sale or liquidation of a company, which may signal a high probability of termination of company operation. A wrong choice or application of valuation methods can be indicative of the risk of non

  6. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  7. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.

    2014-09-16

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.
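
    The pivotal-discrepancy idea in this record can be sketched minimally (covariance model, sites, and data below are invented for illustration): under the model, the whitened residuals z = L^{-1}(y - mu), with L the Cholesky factor of the covariance, are i.i.d. standard normal, so the discrepancy d = z'z follows a chi-square distribution with n degrees of freedom, a pivotal quantity that can be evaluated at each posterior parameter draw.

```python
import math

def cholesky(A):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(A[i][i] - s) if i == j else (A[i][j] - s) / L[j][j]
    return L

def forward_solve(L, b):
    """Solve L z = b for lower-triangular L (the whitening step)."""
    z = []
    for i in range(len(b)):
        z.append((b[i] - sum(L[i][k] * z[k] for k in range(i))) / L[i][i])
    return z

# Exponential covariance at three 1-D sites (hypothetical parameter values).
sites = [0.0, 0.5, 1.0]
C = [[math.exp(-abs(s - t)) for t in sites] for s in sites]
L = cholesky(C)

# Construct a realization y = L z_true so the discrepancy is known exactly.
z_true = [1.0, -1.0, 2.0]
y = [sum(L[i][k] * z_true[k] for k in range(len(z_true))) for i in range(3)]

z = forward_solve(L, y)            # whitened residuals
d = sum(v * v for v in z)          # pivotal discrepancy, ~ chi2(3) under the model
print(round(d, 6))                 # 6.0  (= 1 + 1 + 4)
```

In the paper's setting this evaluation is repeated at covariance parameters drawn from the posterior, and the resulting discrepancies are compared against the known chi-square reference distribution.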

  9. Assessing Local Model Adequacy in Bayesian Hierarchical Models Using the Partitioned Deviance Information Criterion

    Science.gov (United States)

    Wheeler, David C.; Hickson, DeMarc A.; Waller, Lance A.

    2010-01-01

    Many diagnostic tools and goodness-of-fit measures, such as the Akaike information criterion (AIC) and the Bayesian deviance information criterion (DIC), are available to evaluate the overall adequacy of linear regression models. In addition, visually assessing adequacy in models has become an essential part of any regression analysis. In this paper, we focus on a spatial consideration of the local DIC measure for model selection and goodness-of-fit evaluation. We use a partitioning of the DIC into the local DIC, leverage, and deviance residuals to assess local model fit and influence for both individual observations and groups of observations in a Bayesian framework. We use visualization of the local DIC and differences in local DIC between models to assist in model selection and to visualize the global and local impacts of adding covariates or model parameters. We demonstrate the utility of the local DIC in assessing model adequacy using HIV prevalence data from pregnant women in the Butare province of Rwanda during 1989-1993 using a range of linear model specifications, from global effects only to spatially varying coefficient models, and a set of covariates related to sexual behavior. Results of applying the diagnostic visualization approach include more refined model selection and greater understanding of the models as applied to the data. PMID:21243121
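
    The partitioning used in this record can be sketched as follows, assuming a normal likelihood with known variance and invented posterior draws (not the paper's actual HIV-prevalence data): the local DIC for observation i is DIC_i = 2*Dbar_i - D_i(theta_bar), and the local values sum exactly to the global DIC.

```python
import math

def deviance_i(y_i, mu, sigma=1.0):
    """Per-observation deviance: -2 log N(y_i | mu, sigma^2)."""
    return -2.0 * (-0.5 * math.log(2 * math.pi * sigma ** 2)
                   - (y_i - mu) ** 2 / (2 * sigma ** 2))

y = [1.2, 0.8, 2.5, 1.9]            # observations (hypothetical)
draws = [1.4, 1.6, 1.5, 1.7, 1.3]   # posterior draws of mu (hypothetical)
mu_bar = sum(draws) / len(draws)    # posterior mean of the parameter

# Local DIC: posterior-mean deviance per observation plus its local
# effective-parameter penalty, DIC_i = 2 * Dbar_i - D_i(mu_bar).
local_dic = []
for y_i in y:
    dbar_i = sum(deviance_i(y_i, m) for m in draws) / len(draws)
    local_dic.append(2 * dbar_i - deviance_i(y_i, mu_bar))

# Global DIC from the usual formula; it equals the sum of the local pieces.
dbar = sum(sum(deviance_i(y_i, m) for y_i in y) for m in draws) / len(draws)
dic = 2 * dbar - sum(deviance_i(y_i, mu_bar) for y_i in y)
print(abs(dic - sum(local_dic)) < 1e-9)   # True
```

Mapping the local DIC values (and differences between models) over space is what gives the diagnostic visualization described above.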

  10. A multi-model assessment of terrestrial biosphere model data needs

    Science.gov (United States)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure to overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial

  11. Comparative Assessment of Nonlocal Continuum Solvent Models Exhibiting Overscreening

    Directory of Open Access Journals (Sweden)

    Ren Baihua

    2017-01-01

    Full Text Available Nonlocal continua have been proposed to offer a more realistic model for the electrostatic response of solutions such as the electrolyte solvents prominent in biology and electrochemistry. In this work, we review three nonlocal models based on the Landau-Ginzburg framework which have been proposed but not directly compared previously, due to different expressions of the nonlocal constitutive relationship. To understand the relationships between these models and the underlying physical insights from which they are derived, we situate these models into a single, unified Landau-Ginzburg framework. One of the models offers the capacity to interpret how temperature changes affect dielectric response, and we note that the variations with temperature are qualitatively reasonable even though predictions at ambient temperatures are not quantitatively in agreement with experiment. Two of these models correctly reproduce overscreening (oscillations between positive and negative polarization charge densities), and we observe small differences between them when we simulate the potential between parallel plates held at constant potential. These computations require reformulating the two models as coupled systems of local partial differential equations (PDEs), and we use spectral methods to discretize both problems. We propose further assessments to discriminate between the models, particularly in regards to establishing boundary conditions and comparing to explicit-solvent molecular dynamics simulations.

  12. Plasma-safety assessment model and safety analyses of ITER

    International Nuclear Information System (INIS)

    Honda, T.; Okazaki, T.; Bartels, H.-H.; Uckan, N.A.; Sugihara, M.; Seki, Y.

    2001-01-01

    A plasma-safety assessment model has been provided on the basis of the plasma physics database of the International Thermonuclear Experimental Reactor (ITER) to analyze events including plasma behavior. The model was implemented in a safety analysis code (SAFALY), which consists of a 0-D dynamic plasma model and a 1-D thermal behavior model of the in-vessel components. Unusual plasma events of ITER, e.g., overfueling, were calculated using the code and plasma burning is found to be self-bounded by operation limits or passively shut down due to impurity ingress from overheated divertor targets. Sudden transition of divertor plasma might lead to failure of the divertor target because of a sharp increase of the heat flux. However, the effects of the aggravating failure can be safely handled by the confinement boundaries. (author)

  13. Modeling issues associated with production reactor safety assessment

    International Nuclear Information System (INIS)

    Stack, D.W.; Thomas, W.R.

    1990-01-01

    This paper describes several Probabilistic Safety Assessment (PSA) modeling issues that are related to the unique design and operation of the production reactors. The identification of initiating events and determination of a set of success criteria for the production reactors is of concern because of their unique design. The modeling of accident recovery must take into account the unique operation of these reactors. Finally, a more thorough search and evaluation of common-cause events is required to account for combinations of unique design features and operation that might otherwise not be included in the PSA. It is expected that most of these modeling issues also would be encountered when modeling some of the other more unique reactor and nonreactor facilities that are part of the DOE nuclear materials production complex. 9 refs., 2 figs

  14. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    International Nuclear Information System (INIS)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-01-01

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  16. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    Full Text Available We describe results of a multi-year effort to strengthen consideration of the human dimension in endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing the diverse bodies of expertise and information that are necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.
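
    The state-passing pattern behind the meta-model example can be sketched minimally, assuming two invented toy sub-models (the rates and functional forms below are hypothetical, not the actual disease and population models referenced in the record): each sub-model is a stand-alone function, and the meta-model only passes a shared state structure between them at each time step.

```python
# Meta-model sketch: two discipline-specific models coupled only through
# a shared state dictionary. All parameter values are hypothetical.

def disease_model(state):
    """Prevalence rises with host density (toy relationship)."""
    return min(1.0, 0.1 * state["pop"] / 1000.0)

def population_model(state):
    """Population growth slows as disease prevalence increases."""
    return state["pop"] * (1.0 + 0.05 * (1.0 - state["prev"]))

state = {"pop": 500.0, "prev": 0.0}
for _ in range(3):                          # annual time steps
    state["prev"] = disease_model(state)    # hand state to model A
    state["pop"] = population_model(state)  # then to model B
print(round(state["pop"], 1), round(state["prev"], 3))
```

Neither function knows the other exists; only the shared state crosses the boundary, which is the essential feature of the meta-modeling approach described above.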

  17. Modeling Of Construction Noise For Environmental Impact Assessment

    Directory of Open Access Journals (Sweden)

    Mohamed F. Hamoda

    2008-06-01

    Full Text Available This study measured the noise levels generated at different construction sites in reference to the stage of construction and the equipment used, and examined methods to predict such noise in order to assess the environmental impact of noise. It included 33 construction sites in Kuwait and used artificial neural networks (ANNs) for the prediction of noise. A back-propagation neural network (BPNN) model was compared with a general regression neural network (GRNN) model. The results obtained indicated that the mean equivalent noise level was 78.7 dBA, which exceeds the threshold limit. The GRNN model was superior to the BPNN model in its accuracy of predicting construction noise due to its ability to train quickly on sparse data sets. Over 93% of the predictions were within 5% of the observed values. The mean absolute error between the predicted and observed data was only 2 dBA. ANN modeling proved to be a useful technique for the noise predictions required in the assessment of the environmental impact of construction activities.
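
    A GRNN is essentially Gaussian-kernel-weighted regression over the training set, which is why it trains quickly on sparse data. A one-dimensional sketch with invented training data (the study's actual predictors and measurements are not reproduced here):

```python
import math

def grnn_predict(x, x_train, y_train, sigma=0.5):
    """General regression neural network: Gaussian-kernel-weighted
    average of the training targets (Nadaraya-Watson form)."""
    w = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in x_train]
    return sum(wi * yi for wi, yi in zip(w, y_train)) / sum(w)

# Hypothetical training data: number of machines on site vs. noise (dBA).
x_train = [1.0, 2.0, 3.0, 4.0, 5.0]
y_train = [68.0, 72.0, 76.0, 79.0, 83.0]

print(round(grnn_predict(3.0, x_train, y_train), 1))   # close to 76 dBA
print(round(grnn_predict(2.5, x_train, y_train), 1))   # between the neighbours
```

"Training" a GRNN is just storing the patterns; only the smoothing parameter sigma needs tuning, in contrast to the iterative weight updates of a BPNN.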

  18. Empirical assessment of a threshold model for sylvatic plague

    DEFF Research Database (Denmark)

    Davis, Stephen; Leirs, Herwig; Viljugrein, H.

    2007-01-01

    Plague surveillance programmes established in Kazakhstan, Central Asia, during the previous century, have generated large plague archives that have been used to parameterize an abundance threshold model for sylvatic plague in great gerbil (Rhombomys opimus) populations. Here, we assess the model... We examine six hypotheses that could explain the resulting false positive predictions, namely (i) including end-of-outbreak data erroneously lowers the estimated threshold, (ii) too few gerbils were tested, (iii) plague becomes locally extinct, (iv) the abundance of fleas was too low, (v) the climate

  19. Model-based pH monitor for sensor assessment.

    Science.gov (United States)

    van Schagen, Kim; Rietveld, Luuk; Veersma, Alex; Babuska, Robert

    2009-01-01

    Owing to the nature of the treatment processes, monitoring the processes based on individual online measurements is difficult or even impossible. However, the measurements (online and laboratory) can be combined with a priori process knowledge, using mathematical models, to objectively monitor the treatment processes and measurement devices. The pH measurement is commonly used at different stages in the drinking water treatment plant, although it is an unreliable instrument, requiring significant maintenance. It is shown that, using a grey-box model, it is possible to assess the measurement devices effectively, even if detailed information on the specific processes is unknown.
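
    The grey-box monitoring idea (compare each reading against a model prediction and flag persistent residuals) can be sketched as follows; the relation, tolerance, and readings are all invented for illustration and are not the paper's actual model:

```python
# Sketch of model-based sensor assessment: flag a pH sensor when the
# residual between measurement and grey-box prediction exceeds a
# tolerance. Model, tolerance, and readings are hypothetical.

def predicted_ph(base_dose_mg_l):
    """Toy grey-box model: pH rises with caustic dose (invented relation)."""
    return 7.0 + 0.02 * base_dose_mg_l

tolerance = 0.3                         # pH units
doses =    [10.0, 12.0, 11.0, 10.0]     # known process inputs
measured = [7.25, 7.20, 7.65, 7.90]     # later readings drift upward

flags = [abs(m - predicted_ph(d)) > tolerance
         for d, m in zip(doses, measured)]
print(flags)   # [False, False, True, True] -> sensor needs maintenance
```

The key point is that the assessment needs only an approximate (grey-box) process model, not detailed knowledge of the specific treatment steps.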

  20. Risk Assessment of Engineering Project Financing Based on PPP Model

    Directory of Open Access Journals (Sweden)

    Ma Qiuli

    2017-01-01

    Full Text Available At present, project financing channels are limited, urban facilities are in short supply, and the risk assessment and prevention mechanisms of financing should be further improved to reduce the risk of project financing. In view of this, a fuzzy comprehensive evaluation model of project financing risk, combining fuzzy comprehensive evaluation with the analytic hierarchy process, is established. The scientific soundness and effectiveness of the model are verified by the example of the world port project in Luohe city, and it provides a basis and reference for engineering project financing based on the PPP model.
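
    The combined method can be sketched in a few lines, with invented weights and membership degrees (not the paper's actual Luohe figures): AHP supplies a criterion weight vector w, the fuzzy comprehensive evaluation is the composition B = w . R over a membership matrix R, and the risk grade is taken by maximum membership.

```python
# Fuzzy comprehensive evaluation sketch. The AHP-derived weights and the
# membership matrix below are hypothetical.
weights = [0.5, 0.3, 0.2]          # AHP weights: construction, finance, policy risk
grades = ["low", "medium", "high"]

# R[i][j]: degree to which risk factor i belongs to risk grade j.
R = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.3, 0.6]]

# Weighted-average composition: B_j = sum_i w_i * R[i][j].
B = [sum(w * row[j] for w, row in zip(weights, R)) for j in range(len(grades))]
grade = grades[B.index(max(B))]    # maximum-membership principle
print([round(b, 2) for b in B], grade)   # [0.38, 0.36, 0.26] low
```

In practice the weights come from pairwise-comparison matrices checked for consistency, and the memberships from expert scoring of each risk factor.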

  1. Radiological assessments of land disposal options: recent model developments

    International Nuclear Information System (INIS)

    Fearn, H.S.; Pinner, A.V.; Hemming, C.R.

    1984-10-01

    This report describes progress in the development of methodologies and models for assessing the radiological impact of the disposal of low and intermediate level wastes by (i) shallow land burial in simple trenches (land 1), (ii) shallow land burial in engineered facilities (land 2), and (iii) emplacement in mined repositories or existing cavities (land 3/4). In particular the report describes wasteform leaching models, for unconditioned and cemented waste, the role of engineered barriers of a shallow land burial facility in reducing the magnitude of doses arising from groundwater contact and a detailed consideration of the interactions between radioactive carbon and various media. (author)

  2. A Fuzzy Knowledge Representation Model for Student Performance Assessment

    DEFF Research Database (Denmark)

    Badie, Farshad

    Knowledge representation models based on Fuzzy Description Logics (DLs) can provide a foundation for reasoning in intelligent learning environments. While basic DLs are suitable for expressing crisp concepts and binary relationships, Fuzzy DLs are capable of processing degrees of truth/completeness about vague or imprecise information. This paper tackles the issue of representing fuzzy classes using OWL2 in a dataset describing Performance Assessment Results of Students (PARS).

  3. Probabilistic Modeling and Risk Assessment of Cable Icing

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee

    This dissertation addresses the issues related to icing of structures, with special emphasis on bridge cables. Cable-supported bridges in cold climates suffer from ice accreting on the cables, which poses three different undesirable situations. Firstly, the changed shape of the cable due to ice... preliminary framework is modified for assessing the probability of occurrence of in-cloud and precipitation icing and its duration. Different probabilistic models are utilized for the representation of the meteorological variables and their appropriateness is evaluated both through goodness-of-fit tests... are influencing the two icing mechanisms and their duration. The model is found to be more sensitive to changes in the discretization levels of the input variables. Thirdly, the developed operational probabilistic framework for the assessment of the expected number of occurrences of ice/snow accretion on bridge

  4. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  5. A model for assessing social impacts of nuclear technology

    International Nuclear Information System (INIS)

    Suzuki, Atsuyuki; Kiyose, Ryohei

    1981-01-01

    A theoretical framework is given for assessing the social or environmental impacts of nuclear technology. A two-act problem concerning the incentive-penalty system is posed to formulate the principle of ALAP. An observation plan for making a decision on the problem is optimized with Bayesian decision theory. The optimized solution, resting on the amount of incentive or penalty, is compared with an actual or practical plan. Then, by finding the indifference between the two plans, an impact is assessed in monetary terms. As regards the third step, the model does not provide the details, since they are beyond the scope of the description. If there exists an actual plan, it can easily be compared with the results from this theory. If there does not, or while one is being made, its feasibility must be studied by another model or by different approaches. (J.P.N.)

  6. Regional Persistent Organic Pollutants' Environmental Impact Assessment and Control Model

    Directory of Open Access Journals (Sweden)

    Jurgis Staniskis

    2008-10-01

    Full Text Available The sources of formation, environmental distribution and fate of persistent organic pollutants (POPs) are increasingly seen as topics to be addressed and solved at the global scale. Therefore, there are already two international agreements concerning persistent organic pollutants: the Protocol of 1998 to the 1979 Convention on Long-Range Transboundary Air Pollution on Persistent Organic Pollutants (Aarhus Protocol); and the Stockholm Convention on Persistent Organic Pollutants. For the assessment of environmental pollution by POPs, for risk assessment, and for the evaluation of new pollutants as potential candidates to be included in the POPs lists of the Stockholm Convention and/or the Aarhus Protocol, a set of different models are developed or under development. Multimedia models help describe and understand the environmental processes leading to global contamination by POPs and the actual risk to the environment and human health. However, there is a lack of tools based on a systematic and integrated approach to POPs management difficulties in the region.

  7. Model error assessment of burst capacity models for energy pipelines containing surface cracks

    International Nuclear Information System (INIS)

    Yan, Zijian; Zhang, Shenwei; Zhou, Wenxing

    2014-01-01

    This paper develops the probabilistic characteristics of the model errors associated with five well-known burst capacity models/methodologies for pipelines containing longitudinally-oriented external surface cracks, namely the Battelle and CorLAS™ models as well as the failure assessment diagram (FAD) methodologies recommended in BS 7910 (2005), API RP579 (2007) and R6 (Rev 4, Amendment 10). A total of 112 full-scale burst test data for cracked pipes subjected to internal pressure only were collected from the literature. The model error for a given burst capacity model is evaluated based on the ratios of the test to predicted burst pressures for the collected data. Analysis results suggest that the CorLAS™ model is the most accurate of the five models considered and that the Battelle, BS 7910, API RP579 and R6 models are in general conservative; furthermore, the API RP579 and R6 models are markedly more accurate than the Battelle and BS 7910 models. The results will facilitate the development of reliability-based structural integrity management of pipelines. - Highlights: • Model errors for five burst capacity models for pipelines containing surface cracks are characterized. • Basic statistics of the model errors are obtained based on test-to-predicted ratios. • Results will facilitate reliability-based design and assessment of energy pipelines
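
    The model-error characterization described here reduces to the statistics of test-to-predicted burst-pressure ratios. A minimal sketch with invented ratios (not the paper's 112 actual test results):

```python
import statistics

# Hypothetical test-to-predicted burst pressure ratios for one model.
# A mean ratio above 1 indicates the model is conservative on average;
# the coefficient of variation (COV) measures its scatter.
ratios = [1.10, 0.95, 1.25, 1.05, 1.15]

mean_error = statistics.mean(ratios)
cov = statistics.stdev(ratios) / mean_error   # sample COV

print(round(mean_error, 2), round(cov, 3))    # 1.1 0.102
```

In a reliability-based integrity assessment, the mean and COV of this ratio (together with a fitted distribution) enter the limit-state function as a model-uncertainty random variable.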

  8. Exploring harmonization between integrated assessment and capacity expansion models

    Science.gov (United States)

    Iyer, G.; Brown, M.; Cohen, S.; Macknick, J.; Patel, P.; Wise, M. A.; Horing, J.

    2017-12-01

    Forward-looking quantitative models of the electric sector are extensively used to provide science-based strategic decision support to national, international and private-sector entities. Given that these models are used to inform a wide range of stakeholders and influence policy decisions, it is vital to examine how the models' underlying data and structure influence their outcomes. We conduct several experiments harmonizing key model characteristics between ReEDS—an electric-sector-only model—and GCAM—an integrated assessment model—to understand how different degrees of harmonization impact model outcomes. ReEDS has high spatial, temporal, and process detail but lacks electricity demand elasticity and endogenous representations of other economic sectors, while GCAM has internally consistent representations of energy (including the electric sector), agriculture, and land-use systems but relatively aggregate representations of the factors influencing electric sector investments. We vary the degree of harmonization in electricity demand, fuel prices, technology costs and performance, and variable renewable energy resource characteristics. We then identify the prominent sources of divergence in key outputs (electricity capacity, generation, and price) across the models and study how convergence between the models can be improved with permutations of harmonized characteristics. The remaining inconsistencies help to establish how differences in the models' underlying data, construction, perspective, and methodology play into each model's outcome. There are three broad contributions of this work. First, our study provides a framework to link models with similar scope but different resolutions. Second, our work provides insight into how the harmonization of assumptions contributes to a unified and robust portrayal of the US electricity sector under various potential futures. Finally, our study enhances the understanding of the influence of structural uncertainty

  9. Modeling marine surface microplastic transport to assess optimal removal locations

    OpenAIRE

    Sherman, Peter; Van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal of assessing the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the ...

  10. A maturity model to assess organisational readiness for change

    OpenAIRE

    Zephir, Olivier; Minel, Stéphanie; Chapotot, Emilie

    2011-01-01

    International audience; The presented model, which was developed in a European project, allows project management teams to assess the organisational maturity to integrate new practices under structural or technological change. Maturity for change is defined here as workforce capability to operate effectively in transformed processes. This methodology is intended to tackle organisational readiness to fulfil business objectives through technological and structural improvements. The tool integrate...

  11. Melodie: A global risk assessment model for radioactive waste repositories

    International Nuclear Information System (INIS)

    Lewi, J.; Assouline, M.; Bareau, J.; Raimbault, P.

    1987-03-01

    The Institute of Protection and Nuclear Safety (IPSN), part of the French Atomic Energy Commission (C.E.A.), has been developing since 1984, in collaboration with various groups inside and outside the C.E.A., a computer model for risk assessment of nuclear waste repositories in deep geological formations. The main characteristics of the submodels, the data-processing structure and some examples of applications are presented.

  12. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    Full Text Available The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to the economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative, and holistic approach to sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods—LCA, LCC, and SLCA; stakeholder analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent-based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to the development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between social and natural sciences as well as between critical and problem-solving research.

  13. Efficiency assessment models of higher education institution staff activity

    Directory of Open Access Journals (Sweden)

    K. A. Dyusekeyev

    2016-01-01

    Full Text Available The paper substantiates the necessity of improving the university staff incentive system under conditions of competition in the field of higher education, and the need to develop a separate model for evaluating the effectiveness of department heads. The authors analysed methods for assessing the production function of units, and the advantage of applying such methods to assess the effectiveness of bordering economic structures in the field of higher education is shown. The choice of the data envelopment analysis (DEA) method to solve the problem is justified. A model for evaluating the activity of university departments on the basis of the DEA methodology has been developed. On the basis of staff pay systems operating in universities in Russia, Kazakhstan and other countries, the structure of the criteria system for evaluating university staff activity has been designed. To clarify and specify the efficiency criteria for departments' activity, a strategic map has been developed that allowed the input and output parameters of the model to be determined. Use of the DEA methodology takes into account a large number of input and output parameters, increases the objectivity of the assessment by excluding experts, and yields interim data to identify the strengths and weaknesses of the evaluated object.
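
    The DEA approach described above can be sketched with the classic input-oriented CCR multiplier model, solved as a linear program. The three "departments", their single input (staff) and single output (graduates) are invented purely for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0 (multiplier form).
    X: (n_units, n_inputs), Y: (n_units, n_outputs).
    Maximize u.y0 subject to v.x0 = 1 and u.yj - v.xj <= 0 for all j."""
    n, m = X.shape
    _, s = Y.shape
    # decision variables z = [u (s output weights), v (m input weights)]
    c = np.concatenate([-Y[j0], np.zeros(m)])             # linprog minimizes
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]  # v.x0 = 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                             # u.yj - v.xj <= 0
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# three hypothetical departments: input = staff count, output = graduates
X = np.array([[10.0], [20.0], [30.0]])
Y = np.array([[100.0], [150.0], [300.0]])
scores = [dea_ccr_efficiency(X, Y, j) for j in range(3)]  # units 1 and 3 efficient
```

    A score of 1 marks a department on the efficient frontier; lower scores quantify how far a unit falls below its peers at the same scale.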

  14. Ensemble atmospheric dispersion modeling for emergency response consequence assessments

    International Nuclear Information System (INIS)

    Addis, R.P.; Buckley, R.L.

    2003-01-01

    Full text: Prognostic atmospheric dispersion models are used to generate consequence assessments, which assist decision-makers in the event of a release from a nuclear facility. Differences in the forecast wind fields generated by various meteorological agencies, differences in the transport and diffusion models themselves, as well as differences in the way these models treat the release source term, all may result in differences in the simulated plumes. This talk will address the U.S. participation in the European ENSEMBLE project, and present a perspective on how ensemble techniques may be used to enable atmospheric modelers to provide decision-makers with a more realistic understanding of how both the atmosphere and the models behave. Meteorological forecasts generated by numerical models from national and multinational meteorological agencies provide individual realizations of three-dimensional, time-dependent atmospheric wind fields. These wind fields may be used to drive atmospheric dispersion (transport and diffusion) models, or they may be used to initiate other, finer resolution meteorological models, which in turn drive dispersion models. Many modeling agencies now utilize ensemble-modeling techniques to determine how sensitive the prognostic fields are to minor perturbations in the model parameters. However, the European Union programs RTMOD and ENSEMBLE are the first projects to utilize a Web-based ensemble approach to interpret the output from atmospheric dispersion models. The ensembles produced are different from those generated by meteorological forecasting centers in that they are ensembles of dispersion model outputs from many different atmospheric transport and diffusion models utilizing prognostic atmospheric fields from several different forecast centers. 
As such, they enable a decision-maker to consider the uncertainty in the plume transport and growth as a result of the differences in the forecast wind fields as well as the differences in the

  15. Individual-based model for radiation risk assessment

    Science.gov (United States)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, a respective mathematical model is used as well. This approach is applied to describe the effects of low-level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on the mortality dynamics of the former in the absence of radiation, are utilized. The life span probability and life span shortening predicted by the model agree with the corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the aftereffects of the Chernobyl catastrophe. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  16. Modelling Tradescantia fluminensis to assess long term survival

    Directory of Open Access Journals (Sweden)

    Alex James

    2015-06-01

    Full Text Available We present a simple Poisson process model for the growth of Tradescantia fluminensis, an invasive plant species that inhibits the regeneration of native forest remnants in New Zealand. The model was parameterised with data derived from field experiments in New Zealand and then verified with independent data. The model gave good predictions, which showed that its underlying assumptions are sound. However, this simple model had less predictive power for outputs based on variance, suggesting that some assumptions were lacking. Therefore, we extended the model to include higher variability between plants, thereby improving its predictions. This high-variance model suggests that control measures that promote node death at the base of the plant or restrict the main stem growth rate will be more effective than those that reduce the number of branching events. The extended model forms a good basis for assessing the efficacy of various forms of control of this weed, including the recently-released leaf-feeding tradescantia leaf beetle (Neolema ogloblini).
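
    A minimal version of such a Poisson process growth model can be simulated directly: nodes appear on the plant as Poisson events, and each new node may start a side branch. The rates below are illustrative placeholders, not the values fitted from the New Zealand field data:

```python
import random

def simulate_growth(t_end, node_rate=1.0, branch_prob=0.1, seed=42):
    """Sketch of a Poisson-process plant growth model: nodes arrive on each
    stem at rate `node_rate`; each new node starts a side branch with
    probability `branch_prob`. Returns (total nodes, total stems)."""
    rng = random.Random(seed)
    stems, nodes, t = 1, 0, 0.0
    while True:
        # superposition: next node anywhere on the plant at rate stems * node_rate
        t += rng.expovariate(stems * node_rate)
        if t > t_end:
            break
        nodes += 1
        if rng.random() < branch_prob:
            stems += 1
    return nodes, stems

nodes, stems = simulate_growth(20.0)
```

    Repeating the simulation over many seeds gives the between-plant variance that the abstract notes a single-rate model tends to understate.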

  17. An integrated model for the assessment of global water resources – Part 2: Applications and assessments

    Directory of Open Access Journals (Sweden)

    N. Hanasaki

    2008-07-01

    Full Text Available To assess global water resources from the perspective of subannual variation in water availability and water use, an integrated water resources model was developed. In a companion report, we presented the global meteorological forcing input used to drive the model and six modules, namely, the land surface hydrology module, the river routing module, the crop growth module, the reservoir operation module, the environmental flow requirement module, and the anthropogenic withdrawal module. Here, we present the results of the model application and global water resources assessments. First, the timing and volume of simulated agriculture water use were examined because agricultural use composes approximately 85% of total consumptive water withdrawal in the world. The estimated crop calendar showed good agreement with earlier reports for wheat, maize, and rice in major countries of production. In major countries, the error in the planting date was ±1 mo, but there were some exceptional cases. The estimated irrigation water withdrawal also showed fair agreement with country statistics, but tended to be underestimated in countries in the Asian monsoon region. The results indicate the validity of the model and the input meteorological forcing because site-specific parameter tuning was not used in the series of simulations. Finally, global water resources were assessed on a subannual basis using a newly devised index. This index located water-stressed regions that were undetected in earlier studies. These regions, which are indicated by a gap in the subannual distribution of water availability and water use, include the Sahel, the Asian monsoon region, and southern Africa. The simulation results show that the reservoir operations of major reservoirs (>1 km3 and the allocation of environmental flow requirements can alter the population under high water stress by approximately −11% to +5% globally. The integrated model is applicable to

  18. Confidence assessment. Site descriptive modelling SDM-Site Forsmark

    International Nuclear Information System (INIS)

    2008-09-01

    The objective of this report is to assess the confidence that can be placed in the Forsmark site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Forsmark). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations, or more usefully by explorations underground made during construction of the repository. The confidence in the Forsmark site descriptive model, based on the data available at the conclusion of the surface-based site investigations, has been assessed by exploring: Confidence in the site characterisation data base; Key remaining issues and their handling; Handling of alternative models; Consistency between disciplines; and, Main reasons for confidence and lack of confidence in the model. It is generally found that the key aspects of importance for safety assessment and repository engineering of the Forsmark site descriptive model are associated with a high degree of confidence. Because of the robust geological model that describes the site, the overall confidence in the Forsmark site descriptive model is judged to be high. While some aspects have lower confidence, this lack of confidence is handled by providing wider uncertainty ranges, bounding estimates and/or alternative models. Most, but not all, of the low-confidence aspects have little impact on repository engineering design or on long-term safety. Poor precision in the measured data is judged to have limited impact on uncertainties in the site descriptive model, with the exceptions of inaccuracy in determining the position of some boreholes at depth in 3-D space, as well as the poor precision of the orientation of BIPS images in some boreholes, and the poor precision of stress data determined by overcoring at the locations where the pre

  19. Sustainable BECCS pathways evaluated by an integrated assessment model

    Science.gov (United States)

    Kato, E.

    2017-12-01

    Negative emissions technologies, particularly Bioenergy with Carbon Capture and Storage (BECCS), are key components of mitigation strategies in the ambitious future socioeconomic scenarios analysed by integrated assessment models. Generally, scenarios aiming to keep the mean global temperature rise below 2°C above pre-industrial levels would require net negative carbon emissions at the end of the 21st century. Also, in the context of the Paris Agreement, which acknowledges "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", RD&D for negative emissions technologies in this decade has a crucial role in enabling early deployment of the technology. Because of the potentially extensive use of land and water required for producing the bioenergy feedstock needed to achieve the anticipated level of gross negative emissions, research on how to develop sustainable BECCS scenarios is needed. Here, we present BECCS deployment scenarios that consider an economically viable flow of the bioenergy system, including power generation and conversion processes to liquid and gaseous fuels for transportation and heat, with consideration of sustainable global biomass use. In the modelling process, detailed bioenergy representations, i.e. various feedstocks and conversion technologies with and without CCS, are implemented in the integrated assessment (IA) model GRAPE (Global Relationship Assessment to Protect the Environment). Also, to overcome a general discrepancy in assumed future agricultural yields between 'top-down' IA models and 'bottom-up' estimates, which would crucially affect the land-use pattern, we applied yield changes of food and energy crops consistent with process-based biophysical crop models in consideration of changing climate conditions. 
    Using the framework, economically viable strategies for implementing a sustainable bioenergy and BECCS flow are evaluated in the scenarios targeting to keep global average

  20. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    Science.gov (United States)

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  1. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) Benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk. (2) Directly estimating the exposure of the environment, biota and humans to provide information to complement measurements, or where measurements are not available or are limited. (3) To identify the key processes and chemical and/or environmental parameters that determine the exposure; thereby allowing the effective prioritization of research or measurements to improve the risk profile. (4) Predicting future time trends including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. 
Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application
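
    Benchmarking strategy (1) above reduces to simple arithmetic once the ratios are in hand: the candidate's exposure-per-emission ratio is compared to that of a known POP, then scaled by relative emissions and relative toxicity. The numbers below are hypothetical, for illustration only:

```python
def relative_risk(exp_per_emis_cand, exp_per_emis_ref,
                  rel_emissions, rel_toxicity):
    """Benchmark a candidate chemical against a reference POP:
    relative risk = (exposure/emission ratio vs. reference)
                    x relative emissions x relative toxicity.
    All inputs are dimensionless ratios (candidate / reference POP)."""
    return (exp_per_emis_cand / exp_per_emis_ref
            * rel_emissions * rel_toxicity)

# hypothetical candidate: delivers 0.5x the exposure per unit emitted,
# is emitted at 2x the rate, and is 0.8x as toxic as the reference POP
rr = relative_risk(0.5, 1.0, 2.0, 0.8)  # 0.5 * 2.0 * 0.8 = 0.8
```

    A value below 1 indicates lower relative risk than the reference POP under these (illustrative) ratios; the exposure-per-emission terms would come from a fate and exposure model run for both chemicals.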

  2. Modeling exposure to persistent chemicals in hazard and risk assessment.

    Science.gov (United States)

    Cowan-Ellsberry, Christina E; McLachlan, Michael S; Arnot, Jon A; Macleod, Matthew; McKone, Thomas E; Wania, Frank

    2009-10-01

    Fate and exposure modeling has not, thus far, been explicitly used in the risk profile documents prepared for evaluating the significant adverse effect of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of persistent organic pollutants (POP) and persistent, bioaccumulative, and toxic (PBT) chemicals in the environment. The goal of this publication is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include 1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; 2) directly estimating the exposure of the environment, biota, and humans to provide information to complement measurements or where measurements are not available or are limited; 3) to identify the key processes and chemical or environmental parameters that determine the exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and 4) forecasting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model the risk profile developer must evaluate how appropriate an existing model is for a specific setting and

  3. Surrogacy assessment using principal stratification and a Gaussian copula model.

    Science.gov (United States)

    Conlon, Asc; Taylor, Jmg; Elliott, M R

    2017-02-01

    In clinical trials, a surrogate outcome (S) can be measured before the outcome of interest (T) and may provide early information regarding the treatment (Z) effect on T. Many methods of surrogacy validation rely on models for the conditional distribution of T given Z and S. However, S is a post-randomization variable, and unobserved, simultaneous predictors of S and T may exist, resulting in a non-causal interpretation. Frangakis and Rubin developed the concept of principal surrogacy, stratifying on the joint distribution of the surrogate marker under treatment and control to assess the association between the causal effects of treatment on the marker and the causal effects of treatment on the clinical outcome. Working within the principal surrogacy framework, we address the scenario of an ordinal categorical variable as a surrogate for a censored failure time true endpoint. A Gaussian copula model is used to model the joint distribution of the potential outcomes of T, given the potential outcomes of S. Because the proposed model cannot be fully identified from the data, we use a Bayesian estimation approach with prior distributions consistent with reasonable assumptions in the surrogacy assessment setting. The method is applied to data from a colorectal cancer clinical trial, previously analyzed by Burzykowski et al.
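
    The Gaussian copula construction can be sketched generically: correlated normals are mapped through the normal CDF to uniforms (which carry the dependence), then through the inverse CDFs of the chosen marginals. The marginals and correlation below are illustrative stand-ins (an ordinal-like surrogate, a failure-time-like outcome), not the fitted trial model:

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(n, rho, marginal_s, marginal_t, seed=0):
    """Draw (S, T) pairs whose dependence is a Gaussian copula with
    correlation rho and whose marginals are arbitrary frozen scipy.stats
    distributions. A generic sketch of the model class."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)        # uniform marginals, copula dependence kept
    s = marginal_s.ppf(u[:, 0])  # ordinal surrogate via a discrete marginal
    t = marginal_t.ppf(u[:, 1])  # continuous failure-time outcome
    return s, t

s, t = gaussian_copula_sample(5000, 0.7,
                              stats.binom(3, 0.5),      # 4-level ordinal-like S
                              stats.weibull_min(1.5))   # failure-time-like T
```

    The copula cleanly separates the dependence structure (rho) from the marginal choices, which is what lets the model pair an ordinal surrogate with a censored failure-time endpoint.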

  4. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
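
    A Monte Carlo PRA of the kind described can be sketched in a few lines: many trials of a fixed mission timeline, with later days depending on earlier outcomes. The event and mitigation probabilities below are placeholder values for illustration, not IMM parameters:

```python
import random

def run_pra(n_trials=100_000, p_event=0.02, p_mitigation=0.9, seed=1):
    """Monte Carlo sketch of a PRA: on each of 10 mission days a medical
    event may occur with probability p_event; a mitigation then succeeds
    with probability p_mitigation. An unmitigated event ends the trial as
    a bad outcome (later days depend on earlier ones, dPRA-style).
    Returns the estimated probability of a bad outcome."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(n_trials):
        for _day in range(10):
            if rng.random() < p_event and rng.random() >= p_mitigation:
                bad += 1
                break  # timeline progression stops after a bad outcome
    return bad / n_trials

risk = run_pra()  # approx. 1 - (1 - 0.02 * 0.1) ** 10, i.e. about 2%
```

    The static flavor of PRA would draw each day independently; the dynamic flavor shown here lets an event on one day alter the remainder of the timeline, which is the distinction the abstract emphasizes.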

  5. Improving treatment outcome assessment in a mouse tuberculosis model.

    Science.gov (United States)

    Mourik, Bas C; Svensson, Robin J; de Knegt, Gerjo J; Bax, Hannelore I; Verbon, Annelies; Simonsson, Ulrika S H; de Steenwinkel, Jurriaan E M

    2018-04-09

    Preclinical treatment outcome evaluation of tuberculosis (TB) occurs primarily in mice. Current designs compare relapse rates of different regimens at selected time points, but lack information about the correlation between treatment length and treatment outcome, which is required to efficiently estimate a regimen's treatment-shortening potential. Therefore, we developed a new approach. BALB/c mice were infected with a Mycobacterium tuberculosis Beijing genotype strain and were treated with rifapentine-pyrazinamide-isoniazid-ethambutol (RpZHE), rifampicin-pyrazinamide-moxifloxacin-ethambutol (RZME) or rifampicin-pyrazinamide-moxifloxacin-isoniazid (RZMH). Treatment outcome was assessed in n = 3 mice after 9 different treatment lengths between 2-6 months. Next, we created a mathematical model that best fitted the observational data and used this for inter-regimen comparison. The observed data were best described by a sigmoidal Emax model, in preference to linear or conventional Emax models. Estimating regimen-specific parameters showed significantly higher curative potentials for RZME and RpZHE compared to RZMH. In conclusion, we provide a new design for treatment outcome evaluation in a mouse TB model, which (i) provides accurate tools for assessment of the relationship between treatment length and predicted cure, (ii) allows for efficient comparison between regimens and (iii) adheres to the reduction and refinement principles of laboratory animal use.
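
    A sigmoidal Emax relationship between treatment length and cure fraction can be fitted with a standard least-squares routine. The relapse data below are invented for illustration, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid_emax(t, emax, t50, gamma):
    """Sigmoidal Emax model: predicted cure fraction after treatment
    length t; t50 is the length giving half-maximal cure, gamma the
    steepness (gamma = 1 recovers the conventional Emax model)."""
    return emax * t**gamma / (t50**gamma + t**gamma)

# hypothetical data: months of treatment vs. fraction of mice cured
t_obs = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0])
cure = np.array([0.05, 0.10, 0.30, 0.55, 0.80, 0.90, 0.95, 0.97, 0.99])

popt, _ = curve_fit(sigmoid_emax, t_obs, cure, p0=[0.95, 3.5, 4.0],
                    bounds=([0, 0, 0], [1, 10, 20]))
emax, t50, gamma = popt
```

    Comparing fitted (emax, t50, gamma) across regimens is what allows the inter-regimen comparison the abstract describes: a lower t50 indicates greater treatment-shortening potential.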

  6. Arc-related porphyry molybdenum deposit model: Chapter D in Mineral deposit models for resource assessment

    Science.gov (United States)

    Taylor, Ryan D.; Hammarstrom, Jane M.; Piatak, Nadine M.; Seal, Robert R.

    2012-01-01

    This report provides a descriptive model for arc-related porphyry molybdenum deposits. Presented within are geological, geochemical, and mineralogical characteristics that differentiate this deposit type from porphyry copper and alkali-feldspar rhyolite-granite porphyry molybdenum deposits. The U.S. Geological Survey's effort to update existing mineral deposit models spurred this research, which is intended to supplement previously published models for this deposit type that help guide mineral-resource and mineral-environmental assessments.

  7. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
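
    As one concrete instance of the fitting techniques surveyed above, a lognormal input distribution can be fitted by maximum likelihood and checked with a goodness-of-fit test. The sample here is synthetic, standing in for a performance-assessment input parameter; it is not Yucca Mountain data:

```python
import numpy as np
from scipy import stats

# synthetic sample for a positive, skewed input parameter
# (the lognormal choice and its parameters are illustrative only)
rng = np.random.default_rng(7)
sample = rng.lognormal(mean=-2.0, sigma=0.5, size=500)

# (a) maximum likelihood fit of a parametric model (location fixed at 0)
shape, loc, scale = stats.lognorm.fit(sample, floc=0)
mu_hat, sigma_hat = np.log(scale), shape  # back to log-space parameters

# (b) goodness of fit: Kolmogorov-Smirnov test against the fitted CDF
ks = stats.kstest(sample, 'lognorm', args=(shape, loc, scale))
```

    The same pattern (fit, then test) applies to any of the parametric families discussed; when data are too sparse for this, the report's subjective-assessment and Bayesian-updating routes take over.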

  8. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  9. Evaluating intersectoral collaboration: a model for assessment by service users

    Directory of Open Access Journals (Sweden)

    Bengt Ahgren

    2009-02-01

    Full Text Available Introduction: DELTA was launched as a project in 1997 to improve intersectoral collaboration in the rehabilitation field. In 2005 DELTA was transformed into a local association for financial co-ordination between the institutions involved. Based on a study of the DELTA service users, the purpose of this article is to develop and to validate a model that can be used to assess the integration of welfare services from the perspective of the service users. Theory: The foundation of integration is a well-functioning structure of integration. Without such structural conditions, it is difficult to develop a process of integration that combines the resources and competences of the collaborating organisations to create services advantageous for the service users. In this way, both the structure and the process will contribute to the outcome of integration. Method: The study was carried out as a retrospective cross-sectional survey during two weeks, including all the current service users of DELTA. The questionnaire contained 32 questions, which were derived from the theoretical framework and research on service users, capturing perceptions of integration structure, process and outcome. Ordinal scales and open questions were used for the assessment. Results: The survey had a response rate of 82% and no serious biases of the results were detected. The study shows that the users of the rehabilitation services perceived the services as well integrated, relevant and adapted to their needs. The assessment model was tested for reliability and validity and a few modifications were suggested. Some key measurement themes were derived from the study. Conclusion: The model developed in this study is an important step towards an assessment of service integration from the perspective of the service users. It needs to be further refined, however, before it can be used in other evaluations of collaboration in the provision of integrated welfare services.

  10. Are revised models better models? A skill score assessment of regional interannual variability

    Science.gov (United States)

    Sperber, Kenneth R.; Participating AMIP Modelling Groups

    1999-05-01

    Various skill scores are used to assess the performance of revised models relative to their original configurations. The interannual variability of all-India, Sahel and Nordeste rainfall and summer monsoon windshear is examined in integrations performed under the experimental design of the Atmospheric Model Intercomparison Project. For the indices considered, the revised models exhibit greater fidelity at simulating the observed interannual variability. Interannual variability of all-India rainfall is better simulated by models that have a more realistic rainfall climatology in the vicinity of India, indicating the beneficial effect of reducing systematic model error.
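As an illustration, the kind of skill comparison described above can be sketched as a correlation of interannual anomalies between a model index and observations. The rainfall values below are invented for demonstration; the AMIP study's actual indices and skill scores are not reproduced here.

```python
import numpy as np

def anomaly_correlation(model, obs):
    """Correlation of interannual anomalies: a simple skill score for
    comparing a model index (e.g. all-India rainfall) with observations."""
    m = np.asarray(model, dtype=float)
    o = np.asarray(obs, dtype=float)
    ma, oa = m - m.mean(), o - o.mean()
    return float(np.sum(ma * oa) / np.sqrt(np.sum(ma ** 2) * np.sum(oa ** 2)))

# invented all-India rainfall indices (mm) for six monsoon seasons
obs      = [850, 910, 780, 905, 840, 760]
original = [870, 880, 820, 860, 850, 800]   # hypothetical original model
revised  = [855, 905, 790, 900, 845, 770]   # hypothetical revised model

skill_original = anomaly_correlation(original, obs)
skill_revised  = anomaly_correlation(revised, obs)   # higher: greater fidelity
```

With these invented series the revised configuration tracks the observed year-to-year anomalies more closely and therefore scores higher, mirroring the study's finding that revised models exhibit greater fidelity.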

  11. Habitat hydraulic models - a tool for Danish stream quality assessment?

    DEFF Research Database (Denmark)

    Olsen, Martin

    and hydromorphological and chemical characteristics has to be enlightened (EUROPA, 2005). This study links catchment hydrology, stream discharge and physical habitat in a small Danish stream, the stream Ledreborg, and discusses the utility of habitat hydraulic models in relation to the present criteria and methods used......).  Hydromorphological conditions in the stream are measured through field study, using a habitat mapping approach and modelled using a habitat hydraulic model (RHYHABSIM). Using RHYHABSIM and both "site-specific" and general HSI's, Weighted Usable Area (WUA) for the trout population at different discharges is assessed...... and differences between simulated WUA using "site-specific" and general habitat preferences are discussed. In RHYHABSIM it is possible to use two different approaches to investigate the hydromorphological conditions in a river, the habitat mapping approach used in this project and the representative reach...

  12. Permafrost Degradation Risk Zone Assessment using Simulation Models

    DEFF Research Database (Denmark)

    Daanen, R.P.; Ingeman-Nielsen, Thomas; Marchenko, S.

    2011-01-01

    In this proof-of-concept study we focus on linking large scale climate and permafrost simulations to small scale engineering projects by bridging the gap between climate and permafrost sciences on the one hand and on the other technical recommendation for adaptation of planned infrastructures...... to climate change in a region generally underlain by permafrost. We present the current and future state of permafrost in Greenland as modelled numerically with the GIPL model driven by HIRHAM climate projections up to 2080. We develop a concept called Permafrost Thaw Potential (PTP), defined...... as the potential active layer increase due to climate warming and surface alterations. PTP is then used in a simple risk assessment procedure useful for engineering applications. The modelling shows that climate warming will result in continuing wide-spread permafrost warming and degradation in Greenland...

  13. Application of a leakage model to assess exfiltration from sewers.

    Science.gov (United States)

    Karpf, C; Krebs, P

    2005-01-01

    The exfiltration of wastewater from sewer systems in urban areas causes a deterioration of soil and possibly groundwater quality. Besides the simulation of transport and degradation processes in the unsaturated zone and in the aquifer, the analysis of the potential impact requires the estimation of the quantity and temporal variation of wastewater exfiltration. Exfiltration can be assessed by applying a leakage model. This hydrological approach was originally developed to simulate the interactions between groundwater and surface water; it was adapted to allow modelling of the interactions between groundwater and the sewer system. To approximate the exfiltration-specific model parameters, infiltration-specific parameters were used as a basis. Scenario analysis of exfiltration in the City of Dresden from 1997 to 1999 and during the flood event in August 2002 shows the variation and the extent of exfiltration rates.
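The hydrological leakage approach described above can be sketched as a Darcy-type flux proportional to the head difference between the sewer and the groundwater. The function, parameter names and values below are hypothetical illustrations, not the calibrated Dresden model.

```python
def exfiltration_rate(h_sewer, h_groundwater, leakage_factor, wetted_area):
    """Darcy-type leakage flux (m^3/s); positive = exfiltration to the soil.
    leakage_factor [1/s] lumps the conductivity and thickness of the
    colmation layer; all values here are illustrative, not calibrated."""
    head_diff = h_sewer - h_groundwater   # m
    if head_diff <= 0:
        # groundwater stands above the sewer water level:
        # infiltration regime, not covered by this sketch
        return 0.0
    return leakage_factor * wetted_area * head_diff

# hypothetical pipe reach: water level 0.2 m above groundwater, 15 m^2 wetted area
q = exfiltration_rate(h_sewer=102.3, h_groundwater=102.1,
                      leakage_factor=1e-6, wetted_area=15.0)
```

The same relation, with the sign reversed, is what makes infiltration-specific parameters a usable starting point for estimating the exfiltration-specific ones.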

  14. Assessment of realizability constraints in v2-f turbulence models

    International Nuclear Information System (INIS)

    Sveningsson, A.; Davidson, L.

    2004-01-01

    The use of the realizability constraint in v2-f turbulence models is assessed by computing a stator vane passage flow. In this flow the stagnation region is large and it is shown that the time scale bound suggested by [Int. J. Heat Fluid Flow 17 (1995) 89] is well suited to prevent unphysical growth of turbulence kinetic energy. However, this constraint causes numerical instabilities when used in the equation for the relaxation parameter, f. It is also shown that the standard use of the realizability constraint in the v2-f model is inconsistent and some modifications are suggested. These changes of the v2-f model are examined and shown to have negligible effect on the overall performance of the v2-f model. In this work two different versions of the v2-f model are investigated and the results obtained are compared with experimental data. The model in a form similar to that originally suggested by Durbin (e.g. [AIAA J. 33 (1995) 659]) produced the overall best agreement with stator vane heat transfer data.

  15. ACCURACY ASSESSMENT OF RECENT GLOBAL OCEAN TIDE MODELS AROUND ANTARCTICA

    Directory of Open Access Journals (Sweden)

    J. Lei

    2017-09-01

    Full Text Available Due to the coverage limitation of T/P-series altimeters, the lack of bathymetric data under large ice shelves, and the inaccurate definitions of coastlines and grounding lines, the accuracy of ocean tide models around Antarctica is poorer than that in deep oceans. Using tidal measurements from tide gauges, gravimetric data and GPS records, the accuracy of seven state-of-the-art global ocean tide models (DTU10, EOT11a, GOT4.8, FES2012, FES2014, HAMTIDE12, TPXO8) is assessed, as well as that of the most widely used conventional model, FES2004. Four regions (Antarctic Peninsula region, Amery ice shelf region, Filchner-Ronne ice shelf region and Ross ice shelf region) are reported separately. The standard deviations of the eight main constituents between the selected models are large in polar regions, especially under the big ice shelves, suggesting that the uncertainty in these regions remains large. Comparisons with in situ tidal measurements show that the most accurate model is TPXO8, and all models show their worst performance in the Weddell Sea and Filchner-Ronne ice shelf regions. The accuracy of tidal predictions around Antarctica is gradually improving.
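The inter-model spread of a tidal constituent, of the kind summarised above, can be sketched by taking the standard deviation of the models' complex amplitudes, so that amplitude and phase disagreement both contribute. The amplitude and phase values below are invented for illustration and are not taken from the paper.

```python
import numpy as np

def constituent_spread(amplitudes_cm, phases_deg):
    """Standard deviation (cm) of one tidal constituent across models,
    computed on the complex amplitudes A*exp(i*phase)."""
    z = np.asarray(amplitudes_cm) * np.exp(1j * np.deg2rad(phases_deg))
    return float(np.std(z))   # np.std of a complex array is real-valued

# invented M2 amplitude/phase at one sub-ice-shelf point for five models
spread_cm = constituent_spread([98.0, 104.0, 91.0, 110.0, 99.5],
                               [41.0, 44.0, 39.0, 47.0, 42.0])
```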

  16. Task-based dermal exposure models for regulatory risk assessment.

    Science.gov (United States)

    Warren, Nicholas D; Marquart, Hans; Christopher, Yvette; Laitinen, Juha; VAN Hemmen, Joop J

    2006-07-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of new measurements of dermal exposure together with detailed contextual information. This article describes the development of a set of generic task-based models capable of predicting potential dermal exposure to both solids and liquids in a wide range of situations. To facilitate modelling of the wide variety of dermal exposure situations, six separate models were made for groupings of exposure scenarios called Dermal Exposure Operation units (DEO units). These task-based groupings cluster exposure scenarios with regard to the expected routes of dermal exposure and the expected influence of exposure determinants. Within these groupings linear mixed effect models were used to estimate the influence of various exposure determinants and to estimate components of variance. The models predict median potential dermal exposure rates for the hands and the rest of the body from the values of relevant exposure determinants. These rates are expressed as mg or µl of product per minute. Using these median potential dermal exposure rates and an accompanying geometric standard deviation allows a range of exposure percentiles to be calculated.
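Turning a median rate and a geometric standard deviation into exposure percentiles, as the last sentence describes, follows directly from a lognormal assumption: the p-th percentile is the median multiplied by the GSD raised to the standard normal quantile z_p. A minimal sketch with hypothetical numbers:

```python
from statistics import NormalDist

def exposure_percentile(median, gsd, p):
    """p-th percentile of a lognormal exposure distribution specified by
    its median and geometric standard deviation: median * GSD**z_p."""
    z = NormalDist().inv_cdf(p)
    return median * gsd ** z

# hypothetical task: median hand-exposure rate 2.0 mg/min with GSD 3.0
p50 = exposure_percentile(2.0, 3.0, 0.50)   # equals the median
p95 = exposure_percentile(2.0, 3.0, 0.95)   # upper-tail rate for risk assessment
```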

  17. Dose Assessment Model for Chronic Atmospheric Releases of Tritium

    International Nuclear Information System (INIS)

    Shen Huifang; Yao Rentai

    2010-01-01

    An improved dose assessment model for chronic atmospheric releases of tritium was proposed. The proposed model explicitly considered two chemical forms of tritium. It was based on a conservative assumption for the transfer of tritiated water (HTO) from air to the concentrations of HTO and organically bound tritium (OBT) in vegetable and animal products. The concentration of tritium in plant products was calculated by treating leafy and non-leafy plants separately, with the contribution to each plant type from tritium in soil also taken into account. In calculating the concentration of HTO in animal products, the average water fraction of the animal products and the weighted average tritium concentration of ingested water were considered, based on the fraction of water supplied by each source: skin absorption, inhalation, drinking water and food. In calculating the annual doses, the ingestion doses were considered together with the contributions of inhalation and skin absorption. Concentrations in foodstuffs and annual adult doses calculated with the specific activity model, the NEWTRI model and the model proposed in this paper were compared. The results indicate that the proposed model can accurately predict tritium doses through the food chain from chronic atmospheric releases. (authors)
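The water-intake-weighted HTO concentration in animal products described above can be sketched as a simple weighted average over intake pathways. The pathway fractions and concentrations below are hypothetical illustrations, not values from the proposed model.

```python
def animal_hto_concentration(sources, water_fraction):
    """Weighted HTO concentration (Bq/L) in an animal product.
    sources: {pathway: (fraction_of_water_intake, HTO_conc_Bq_per_L)}
    water_fraction: mass fraction of water in the product."""
    total = sum(f for f, _ in sources.values())
    assert abs(total - 1.0) < 1e-6, "intake fractions must sum to 1"
    weighted = sum(f * c for f, c in sources.values())
    return water_fraction * weighted

# hypothetical intake pattern for dairy cattle (all numbers illustrative)
milk_hto = animal_hto_concentration(
    {"drinking water":   (0.55, 40.0),
     "feed moisture":    (0.35, 25.0),
     "inhalation + skin": (0.10, 60.0)},
    water_fraction=0.87,
)
```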

  18. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  19. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
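A minimal sketch of the copula idea, assuming a Gaussian copula with exponential failure-time margins (the paper's actual parameterisation and its R/WinBUGS implementation are not reproduced here): correlated normals are mapped to dependent uniforms via the normal CDF, then to failure times via the inverse exponential CDF.

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def sample_dependent_failure_times(n, rho, rate_a, rate_b, seed=0):
    """Draw n pairs of exponential failure times whose dependence comes
    from a Gaussian copula with correlation rho. Rates are illustrative
    values for two redundant components, not estimated parameters."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n, 2))
    # correlate the normals (Cholesky factor of [[1, rho], [rho, 1]])
    z[:, 1] = rho * z[:, 0] + sqrt(1.0 - rho ** 2) * z[:, 1]
    u = np.vectorize(phi)(z)                 # dependent uniforms
    t_a = -np.log(1.0 - u[:, 0]) / rate_a    # inverse-CDF to exponentials
    t_b = -np.log(1.0 - u[:, 1]) / rate_b
    return t_a, t_b

t_a, t_b = sample_dependent_failure_times(20000, rho=0.8, rate_a=1e-3, rate_b=1e-3)
```

Each margin keeps its own exponential distribution; only the joint behaviour is shaped by the copula, which is exactly what makes the construction attractive for dependent failures in a simulation risk model.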

  20. Accuracy Assessment of Recent Global Ocean Tide Models around Antarctica

    Science.gov (United States)

    Lei, J.; Li, F.; Zhang, S.; Ke, H.; Zhang, Q.; Li, W.

    2017-09-01

    Due to the coverage limitation of T/P-series altimeters, the lack of bathymetric data under large ice shelves, and the inaccurate definitions of coastlines and grounding lines, the accuracy of ocean tide models around Antarctica is poorer than that in deep oceans. Using tidal measurements from tide gauges, gravimetric data and GPS records, the accuracy of seven state-of-the-art global ocean tide models (DTU10, EOT11a, GOT4.8, FES2012, FES2014, HAMTIDE12, TPXO8) is assessed, as well as that of the most widely used conventional model, FES2004. Four regions (Antarctic Peninsula region, Amery ice shelf region, Filchner-Ronne ice shelf region and Ross ice shelf region) are reported separately. The standard deviations of the eight main constituents between the selected models are large in polar regions, especially under the big ice shelves, suggesting that the uncertainty in these regions remains large. Comparisons with in situ tidal measurements show that the most accurate model is TPXO8, and all models show their worst performance in the Weddell Sea and Filchner-Ronne ice shelf regions. The accuracy of tidal predictions around Antarctica is gradually improving.

  1. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
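The discrete-event idea, storage times emerging from queues and ordering rules rather than being sampled independently, can be sketched with a toy order-up-to shelf model. All parameter values and the replenishment rule are hypothetical, not the article's lettuce supply chain.

```python
import random
from collections import deque

def simulate_shelf_times(days, max_daily_demand, order_up_to, seed=1):
    """Toy discrete-event sketch of a retail shelf with daily order-up-to
    replenishment and first-in-first-out picking. Returns the storage time
    (days) of each unit sold: the quantity a QMRA would feed into microbial
    growth models instead of an independently sampled storage time."""
    rng = random.Random(seed)
    shelf = deque()                 # arrival day of each unit, oldest first
    storage_times = []
    for day in range(days):
        # replenishment: top the shelf up to the target level with fresh stock
        shelf.extend([day] * (order_up_to - len(shelf)))
        # uniform daily demand (a stand-in for a real demand distribution)
        demand = min(len(shelf), rng.randint(0, max_daily_demand))
        for _ in range(demand):
            storage_times.append(day - shelf.popleft())   # FIFO pick
    return storage_times

times = simulate_shelf_times(days=365, max_daily_demand=10, order_up_to=20)
```

The resulting storage-time distribution is skewed by the ordering rule itself, which is why the tails differ from what independent sampling would give.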

  2. Literature Review and Assessment of Plant and Animal Transfer Factors Used in Performance Assessment Modeling

    International Nuclear Information System (INIS)

    Robertson, David E.; Cataldo, Dominic A.; Napier, Bruce A.; Krupka, Kenneth M.; Sasser, Lyle B.

    2003-01-01

    A literature review and assessment was conducted by Pacific Northwest National Laboratory (PNNL) to update information on plant and animal radionuclide transfer factors used in performance-assessment modeling. A group of 15 radionuclides was included in this review and assessment. The review is composed of four main sections, not including the Introduction. Section 2.0 provides a review of the critically important issue of physicochemical speciation and geochemistry of the radionuclides in natural soil-water systems as it relates to the bioavailability of the radionuclides. Section 3.0 provides an updated review of the parameters of importance in the uptake of radionuclides by plants, including root uptake via the soil-groundwater system and foliar uptake due to overhead irrigation. Section 3.0 also provides a compilation of concentration ratios (CRs) for soil-to-plant uptake for the 15 selected radionuclides. Section 4.0 provides an updated review on radionuclide uptake data for animal products related to absorption, homeostatic control, approach to equilibration, chemical and physical form, diet, and age. Compiled transfer coefficients are provided for cow's milk, sheep's milk, goat's milk, beef, goat meat, pork, poultry, and eggs. Section 5.0 discusses the use of transfer coefficients in soil, plant, and animal modeling using regulatory models for evaluating radioactive waste disposal or decommissioned sites. Each section makes specific suggestions for future research in its area.

  3. Literature Review and Assessment of Plant and Animal Transfer Factors Used in Performance Assessment Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, David E.; Cataldo, Dominic A.; Napier, Bruce A.; Krupka, Kenneth M.; Sasser, Lyle B.

    2003-07-20

    A literature review and assessment was conducted by Pacific Northwest National Laboratory (PNNL) to update information on plant and animal radionuclide transfer factors used in performance-assessment modeling. A group of 15 radionuclides was included in this review and assessment. The review is composed of four main sections, not including the Introduction. Section 2.0 provides a review of the critically important issue of physicochemical speciation and geochemistry of the radionuclides in natural soil-water systems as it relates to the bioavailability of the radionuclides. Section 3.0 provides an updated review of the parameters of importance in the uptake of radionuclides by plants, including root uptake via the soil-groundwater system and foliar uptake due to overhead irrigation. Section 3.0 also provides a compilation of concentration ratios (CRs) for soil-to-plant uptake for the 15 selected radionuclides. Section 4.0 provides an updated review on radionuclide uptake data for animal products related to absorption, homeostatic control, approach to equilibration, chemical and physical form, diet, and age. Compiled transfer coefficients are provided for cow’s milk, sheep’s milk, goat’s milk, beef, goat meat, pork, poultry, and eggs. Section 5.0 discusses the use of transfer coefficients in soil, plant, and animal modeling using regulatory models for evaluating radioactive waste disposal or decommissioned sites. Each section makes specific suggestions for future research in its area.

  4. A prediction model for assessing residential radon concentration in Switzerland

    International Nuclear Information System (INIS)

    Hauri, Dimitri D.; Huss, Anke; Zimmermann, Frank; Kuehni, Claudia E.; Röösli, Martin

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, 80% randomly selected measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², the Akaike's information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (50th, 50th–90th and 90th percentile) and compared with measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40–111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69–215 Bq/m³) in the medium category, and 219 Bq/m³ (108–427 Bq/m³) in the highest category. Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% overall variability (adjusted R²). In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be
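The structure of such a multivariable log-linear model, additive category effects on the log scale exponentiated to a geometric-mean prediction, can be sketched as follows. The coefficients are invented for illustration; the study's fitted values are not reproduced here, and only three of its six predictors are shown.

```python
import math

# invented coefficients for an illustrative log-linear radon model:
# ln(C) = intercept + sum of category effects
COEF = {
    "intercept": math.log(80.0),                                  # baseline 80 Bq/m^3
    "tectonic_unit": {"molasse": 0.0, "alpine": 0.6, "jura": 0.9},
    "construction_year": {"pre1970": 0.3, "1970_1990": 0.1, "post1990": 0.0},
    "floor": {"basement": 0.4, "ground": 0.2, "upper": 0.0},
}

def predict_radon(tectonic_unit, construction_year, floor):
    """Geometric-mean radon prediction (Bq/m^3) for one dwelling."""
    log_c = (COEF["intercept"]
             + COEF["tectonic_unit"][tectonic_unit]
             + COEF["construction_year"][construction_year]
             + COEF["floor"][floor])
    return math.exp(log_c)

low  = predict_radon("molasse", "post1990", "upper")   # baseline dwelling
high = predict_radon("jura", "pre1970", "basement")    # effects add on the log scale
```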

  5. Assessing global vegetation activity using spatio-temporal Bayesian modelling

    Science.gov (United States)

    Mulder, Vera L.; van Eck, Christel M.; Friedlingstein, Pierre; Regnier, Pierre A. G.

    2016-04-01

    This work demonstrates the potential of modelling vegetation activity using a hierarchical Bayesian spatio-temporal model. This approach allows modelling changes in vegetation and climate simultaneously in space and time. Changes in vegetation activity, such as phenology, are modelled as a dynamic process depending on climate variability in both space and time. Additionally, differences in observed vegetation status can be attributed to other abiotic ecosystem properties, e.g. soil and terrain properties. Although these properties do not change in time, they do change in space and may provide valuable information in addition to the climate dynamics. The spatio-temporal Bayesian models were calibrated at a regional scale because local trends in space and time are better captured by the model at that scale. The regional subsets were defined according to the SREX segmentation defined by the IPCC. Each region is considered relatively homogeneous in terms of large-scale climate and biomes, while still capturing small-scale (grid-cell level) variability. Modelling within these regions is hence expected to be less uncertain, owing to the absence of these large-scale patterns, than a global approach. This overall modelling approach allows the comparison of model behavior between the different regions and may provide insights into the main dynamic processes driving the interaction between vegetation and climate within each region. The data employed in this study encompass global datasets of soil properties (SoilGrids), terrain properties (Global Relief Model based on SRTM DEM and ETOPO), monthly time series of satellite-derived vegetation indices (GIMMS NDVI3g) and climate variables (Princeton Meteorological Forcing Dataset). The findings demonstrated the potential of a spatio-temporal Bayesian modelling approach for assessing vegetation dynamics at a regional scale.
The observed interrelationships of the employed data and the different spatial and temporal trends support

  6. Assessing climate change impact by integrated hydrological modelling

    Science.gov (United States)

    Lajer Hojberg, Anker; Jørgen Henriksen, Hans; Olsen, Martin; der Keur Peter, van; Seaby, Lauren Paige; Troldborg, Lars; Sonnenborg, Torben; Refsgaard, Jens Christian

    2013-04-01

    showed some unexpected results, where climate models predicting the largest increase in net precipitation did not result in the largest increase in groundwater heads. This was found to be the result of different initial conditions (1990 - 2010) for the various climate models. In some areas a combination of a high initial groundwater head and an increase in precipitation towards 2021 - 2050 resulted in a groundwater head rise that reached the drainage or the surface water system. This will increase the exchange from the groundwater to the surface water system, but reduce the rise in groundwater heads. An alternative climate model with a lower initial head can thus predict a higher increase in the groundwater head, although the increase in precipitation is lower. This illustrates an extra dimension in the uncertainty assessment, namely the climate models' capability of simulating the current climatic conditions in a way that can reproduce the observed hydrological response. Højberg, AL, Troldborg, L, Stisen, S, et al. (2012) Stakeholder driven update and improvement of a national water resources model - http://www.sciencedirect.com/science/article/pii/S1364815212002423 Seaby, LP, Refsgaard, JC, Sonnenborg, TO, et al. (2012) Assessment of robustness and significance of climate change signals for an ensemble of distribution-based scaled climate projections (submitted) Journal of Hydrology Stisen, S, Højberg, AL, Troldborg, L et al., (2012): On the importance of appropriate rain-gauge catch correction for hydrological modelling at mid to high latitudes - http://www.hydrol-earth-syst-sci.net/16/4157/2012/

  7. Development of good modelling practice for physiologically based pharmacokinetic models for use in risk assessment: The first steps

    Science.gov (United States)

    The increasing use of tissue dosimetry estimated using pharmacokinetic models in chemical risk assessments in multiple countries necessitates the need to develop internationally recognized good modelling practices. These practices would facilitate sharing of models and model eva...

  8. Psychometric model for safety culture assessment in nuclear research facilities

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Andrade, D.A.; Mesquita, R.N. de

    2017-01-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric quality. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and on the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in the organization’s safety culture. A significant number of instruments to assess safety culture, based on psychometric models that evaluate safety climate through questionnaires and supported by reliability and validity evidence, have been published in the health and safety-at-work fields. However, few safety culture assessment instruments with these characteristics (reliability and validity) are available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by reliability analysis and validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil, which comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas for improvement in the assessed organization. Good evidence of reliability with Cronbach's alpha

  9. Psychometric model for safety culture assessment in nuclear research facilities

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, C.S. do, E-mail: claudio.souza@ctmsp.mar.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 São Paulo, SP (Brazil); Andrade, D.A., E-mail: delvonei@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: rnavarro@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil)

    2017-04-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric quality. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and on the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in the organization’s safety culture. A significant number of instruments to assess safety culture, based on psychometric models that evaluate safety climate through questionnaires and supported by reliability and validity evidence, have been published in the health and safety-at-work fields. However, few safety culture assessment instruments with these characteristics (reliability and validity) are available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by reliability analysis and validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil, which comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas for improvement in the assessed organization. Good evidence of reliability with Cronbach's alpha

  10. Potential of 3D City Models to assess flood vulnerability

    Science.gov (United States)

    Schröter, Kai; Bochow, Mathias; Schüttig, Martin; Nagel, Claus; Ross, Lutz; Kreibich, Heidi

    2016-04-01

    Vulnerability, as the product of exposure and susceptibility, is a key factor of the flood risk equation. Furthermore, the estimation of flood loss is very sensitive to the choice of the vulnerability model. Still, in contrast to elaborate hazard simulations, vulnerability is often considered in a simplified manner concerning the spatial resolution and geo-location of exposed objects as well as the susceptibility of these objects at risk. Usually, area specific potential flood loss is quantified on the level of aggregated land-use classes, and both hazard intensity and resistance characteristics of affected objects are represented in highly simplified terms. We investigate the potential of 3D City Models and spatial features derived from remote sensing data to improve the differentiation of vulnerability in flood risk assessment. 3D City Models are based on CityGML, an application scheme of the Geography Markup Language (GML), which represents the 3D geometry, 3D topology, semantics and appearance of objects on different levels of detail. As such, 3D City Models offer detailed spatial information which is useful to describe the exposure and to characterize the susceptibility of residential buildings at risk. This information is further consolidated with spatial features of the building stock derived from remote sensing data. Using this database a spatially detailed flood vulnerability model is developed by means of data-mining. Empirical flood damage data are used to derive and to validate flood susceptibility models for individual objects. We present first results from a prototype application in the city of Dresden, Germany. The vulnerability modeling based on 3D City Models and remote sensing data is compared i) to the generally accepted good engineering practice based on area specific loss potential and ii) to a highly detailed representation of flood vulnerability based on a building typology using urban structure types. Comparisons are drawn in terms of

  11. Assessing women's lacrosse head impacts using finite element modelling.

    Science.gov (United States)

    Clark, J Michio; Hoshizaki, T Blaine; Gilchrist, Michael D

    2018-04-01

Recent studies have assessed the ability of helmets to reduce peak linear and rotational acceleration for women's lacrosse head impacts. However, such measures have had low correlation with injury. Maximum principal strain, derived from the full loading curves, provides better injury prediction than peak linear and rotational acceleration, especially in compliant impacts, which create low-magnitude accelerations but long impact durations. The purpose of this study was to assess head and helmet impacts in women's lacrosse using finite element modelling. Linear and rotational acceleration loading curves from women's lacrosse impacts to a helmeted and an unhelmeted Hybrid III headform were input into the University College Dublin Brain Trauma Model. The finite element model was used to calculate maximum principal strain in the cerebrum. The results demonstrated that, for unhelmeted impacts, falls and ball impacts produce higher maximum principal strain values than stick and shoulder collisions. The strain values for falls and ball impacts were found to be within the range of concussion and traumatic brain injury. The results also showed that men's lacrosse helmets reduced maximum principal strain for follow-through slashing, falls and ball impacts. These findings are novel and demonstrate that for high-risk events, maximum principal strain can be reduced by the use of helmets if the rules of the sport do not effectively manage such situations. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. A Zebrafish Heart Failure Model for Assessing Therapeutic Agents.

    Science.gov (United States)

    Zhu, Xiao-Yu; Wu, Si-Qi; Guo, Sheng-Ya; Yang, Hua; Xia, Bo; Li, Ping; Li, Chun-Qi

    2018-03-20

Heart failure is a leading cause of death, and the development of effective and safe therapeutic agents for heart failure has proven challenging. In this study, taking advantage of larval zebrafish, we developed a zebrafish heart failure model for drug screening and efficacy assessment. Zebrafish at 2 dpf (days postfertilization) were treated with verapamil at a concentration of 200 μM for 30 min, which was determined to be the optimum condition for model development. Tested drugs were administered into zebrafish either by direct soaking or by circulation microinjection. After treatment, zebrafish were randomly selected and subjected to either visual observation and image acquisition or video recording under a Zebralab Blood Flow System. The therapeutic effects of drugs on zebrafish heart failure were quantified by calculating the efficiency of heart dilatation, venous congestion, cardiac output, and blood flow dynamics. All 8 human heart failure therapeutic drugs (LCZ696, digoxin, irbesartan, metoprolol, qiliqiangxin capsule, enalapril, shenmai injection, and hydrochlorothiazide) showed significant preventive and therapeutic effects on zebrafish heart failure (p < 0.05). The zebrafish heart failure model developed and validated in this study could be used for in vivo heart failure studies and for rapid screening and efficacy assessment of preventive and therapeutic drugs.

  13. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics that identify adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, and multiplicative gross error. Other metrics, such as root mean square difference and rank correlation, were also explored but removed when their information was found to be generally duplicative of other metrics. While equal weights are applied here, the weights could be altered to favor preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process but were found useful in an independent context and will be briefly reported.
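The weighted tally-and-consolidation idea in this record can be sketched as follows: score each model on several metrics, rank the models per metric, and sum the weighted ranks. The particular metric set, equal weights, and model names below are illustrative assumptions, not the record's exact configuration.

```python
import numpy as np

def rank_models(observations, forecasts, weights=None):
    """Rank competing models against observations by a weighted tally of
    per-metric ranks. `forecasts` maps model name -> prediction array;
    returns model names, best first."""
    obs = np.asarray(observations, dtype=float)
    # Each metric is oriented so that lower is better.
    metrics = {
        "mae": lambda f: np.mean(np.abs(f - obs)),
        "abs_bias": lambda f: abs(np.mean(f - obs)),
        "neg_corr": lambda f: -np.corrcoef(f, obs)[0, 1],  # negated correlation
    }
    if weights is None:                      # equal weights by default
        weights = {name: 1.0 for name in metrics}
    names = list(forecasts)
    tally = dict.fromkeys(names, 0.0)
    for mname, mfunc in metrics.items():
        scores = [mfunc(np.asarray(forecasts[n], dtype=float)) for n in names]
        for rank, idx in enumerate(np.argsort(scores), start=1):
            tally[names[idx]] += weights[mname] * rank   # rank 1 = best on metric
    return sorted(names, key=tally.get)                  # lowest tally wins overall
```

Altering the `weights` dictionary reproduces the record's suggestion that preferred metrics could be emphasized without changing the consolidation machinery.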

  14. An ethical assessment model for digital disease detection technologies.

    Science.gov (United States)

    Denecke, Kerstin

    2017-09-20

    Digital epidemiology, also referred to as digital disease detection (DDD), successfully provided methods and strategies for using information technology to support infectious disease monitoring and surveillance or understand attitudes and concerns about infectious diseases. However, Internet-based research and social media usage in epidemiology and healthcare pose new technical, functional and formal challenges. The focus of this paper is on the ethical issues to be considered when integrating digital epidemiology with existing practices. Taking existing ethical guidelines and the results from the EU project M-Eco and SORMAS as starting point, we develop an ethical assessment model aiming at providing support in identifying relevant ethical concerns in future DDD projects. The assessment model has four dimensions: user, application area, data source and methodology. The model supports in becoming aware, identifying and describing the ethical dimensions of DDD technology or use case and in identifying the ethical issues on the technology use from different perspectives. It can be applied in an interdisciplinary meeting to collect different viewpoints on a DDD system even before the implementation starts and aims at triggering discussions and finding solutions for risks that might not be acceptable even in the development phase. From the answers, ethical issues concerning confidence, privacy, data and patient security or justice may be judged and weighted.

  15. Accuracy of virtual models in the assessment of maxillary defects

    International Nuclear Information System (INIS)

    Kamburoglu, Kivanc; Kursun, Sebnem; Kilic, Cenk; Eozen, Tuncer

    2015-01-01

This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields of view (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm³ (FOV60); 2) 80 X 80 mm FOV, 0.160 mm³ (FOV80); and 3) 100 X 100 mm FOV, 0.250 mm³ (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicone impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning silicone models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance was set to p=0.05. A comparison of the results obtained by the observers and methods revealed the p values to be smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements when compared to the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.

  16. Accuracy of virtual models in the assessment of maxillary defects

    Energy Technology Data Exchange (ETDEWEB)

Kamburoglu, Kivanc [Dept. of Dentomaxillofacial Radiology, Faculty of Dentistry, Ankara University, Ankara (Turkey); Kursun, Sebnem [Division of Dentomaxillofacial Radiology, Ministry of Health, Oral and Dental Health Center, Bolu (Turkey); Kilic, Cenk; Eozen, Tuncer [Gulhane Military Medical Academy, Ankara (Turkey)]

    2015-03-15

This study aimed to assess the reliability of measurements performed on three-dimensional (3D) virtual models of maxillary defects obtained using cone-beam computed tomography (CBCT) and 3D optical scanning. Mechanical cavities simulating maxillary defects were prepared on the hard palate of nine cadavers. Images were obtained using a CBCT unit at three different fields of view (FOVs) and voxel sizes: 1) 60 X 60 mm FOV, 0.125 mm³ (FOV60); 2) 80 X 80 mm FOV, 0.160 mm³ (FOV80); and 3) 100 X 100 mm FOV, 0.250 mm³ (FOV100). Superimposition of the images was performed using software called VRMesh Design. Automated volume measurements were conducted, and differences between surfaces were demonstrated. Silicone impressions obtained from the defects were also scanned with a 3D optical scanner. Virtual models obtained using VRMesh Design were compared with impressions obtained by scanning silicone models. Gold standard volumes of the impression models were then compared with CBCT and 3D scanner measurements. Further, the general linear model was used, and the significance was set to p=0.05. A comparison of the results obtained by the observers and methods revealed the p values to be smaller than 0.05, suggesting that the measurement variations were caused by both methods and observers along with the different cadaver specimens used. Further, the 3D scanner measurements were closer to the gold standard measurements when compared to the CBCT measurements. In the assessment of artificially created maxillary defects, the 3D scanner measurements were more accurate than the CBCT measurements.

  17. Forensic DNA phenotyping: Developing a model privacy impact assessment.

    Science.gov (United States)

    Scudder, Nathan; McNevin, Dennis; Kelty, Sally F; Walsh, Simon J; Robertson, James

    2018-05-01

    Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a particularly useful investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT; Webster, Mort [MIT

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  19. Bioprocesses: Modelling needs for process evaluation and sustainability assessment

    DEFF Research Database (Denmark)

    Jiménez-Gonzaléz, Concepcion; Woodley, John

    2010-01-01

The next generation of process engineers will face a new set of challenges, with the need to devise new bioprocesses, with high selectivity for pharmaceutical manufacture, and for lower value chemicals manufacture based on renewable feedstocks. In this paper the current and predicted future roles of process system engineering and life cycle inventory and assessment in the design, development and improvement of sustainable bioprocesses are explored. The existing process systems engineering software tools will prove essential to assist this work. However, the existing tools will also require further development such that they can also be used to evaluate processes against sustainability metrics, as well as economics as an integral part of assessments. Finally, property models will also be required based on compounds not currently present in existing databases. It is clear that many new opportunities...

  20. Assessing the limitations of the Banister model in monitoring training

    Science.gov (United States)

    Hellard, Philippe; Avalos, Marta; Lacoste, Lucien; Barale, Frédéric; Chatard, Jean-Claude; Millet, Grégoire P.

    2006-01-01

The aim of this study was to carry out a statistical analysis of the Banister model to verify how useful it is in monitoring the training programmes of elite swimmers. The accuracy, the ill-conditioning and the stability of this model were thus investigated. Training loads of nine elite swimmers, measured over one season, were related to performances with the Banister model. Firstly, to assess accuracy, the 95% bootstrap confidence interval (95% CI) of parameter estimates and modelled performances were calculated. Secondly, to study ill-conditioning, the correlation matrix of parameter estimates was computed. Finally, to analyse stability, iterative computation was performed with the same data but minus one performance, chosen randomly. Performances were significantly related to training loads in all subjects (R² = 0.79 ± 0.13, P < 0.05) and the estimation procedure seemed to be stable. Nevertheless, the 95% CI of the parameters most useful for monitoring training were wide: τa = 38 (17, 59), τf = 19 (6, 32), tn = 19 (7, 35), tg = 43 (25, 61). Furthermore, some parameters were highly correlated, making their interpretation worthless. The study suggested possible ways to deal with these problems and reviewed alternative methods to model the training-performance relationship. PMID:16608765
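The Banister model under assessment here is a fitness-fatigue impulse-response model: predicted performance on a given day is a baseline plus a slow positive "fitness" response and a fast negative "fatigue" response to all past training loads. A minimal sketch follows; the parameter names follow the usual convention, and the values used in any example run are illustrative, not the swimmers' fitted values from the study.

```python
import numpy as np

def banister_performance(loads, p0, ka, tau_a, kf, tau_f):
    """Predicted performance for each day of a daily training-load series.

    p(n) = p0 + ka * sum_{i<n} w(i) * exp(-(n-i)/tau_a)
              - kf * sum_{i<n} w(i) * exp(-(n-i)/tau_f)

    p0: baseline performance; ka/kf: fitness and fatigue gains;
    tau_a/tau_f: fitness and fatigue decay time constants (days)."""
    loads = np.asarray(loads, dtype=float)
    perf = np.zeros(len(loads))
    for n in range(len(loads)):
        lag = n - np.arange(n)                       # days since each past session
        fitness = np.sum(loads[:n] * np.exp(-lag / tau_a))
        fatigue = np.sum(loads[:n] * np.exp(-lag / tau_f))
        perf[n] = p0 + ka * fitness - kf * fatigue
    return perf
```

With kf > ka and tau_f < tau_a, the model reproduces the familiar pattern the study relies on: short-term performance loss under sustained load, followed by recovery as the faster fatigue component decays, which is exactly why the confidence intervals on the time constants matter for monitoring.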

  1. Korean risk assessment model for breast cancer risk prediction.

    Science.gov (United States)

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K

    2013-01-01

We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risk produced using the modified Gail model which applied Korean incidence and mortality data and the parameter estimators from the original Gail model with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and National Cancer Center (NCC) cohort. The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risk for the cases than for the controls (p < 0.05). These findings suggest that the KoBCRAT is an appropriate tool for predicting breast cancer risk in Korean women, especially urban women.

  2. A transportable system of models for natural resource damage assessment

    International Nuclear Information System (INIS)

    Reed, M.; French, D.

    1992-01-01

    A system of computer models has been developed for assessment of natural resource economic damages resulting from spills of oil and hazardous materials in marine and fresh water environments. Under USA federal legislation, the results of the model system are presumed correct in damage litigation proceedings. The model can address a wide range of spatial and temporal scales. The equations describing the motion of both pollutants and biota are solved in three dimensions. The model can simulate continuous releases of a contaminant, with representation of complex coastal boundaries, variable bathymetry, multiple shoreline types, and spatially variable ecosystem habitats. A graphic user interface provides easy control of the system in addition to the ability to display elements of the underlying geographical information system data base. The model is implemented on a personal computer and on a UNIX workstation. The structure of the system is such that transport to new geographic regions can be accomplished relatively easily, requiring only the development of the appropriate physical, toxicological, biological, and economic data sets. Applications are currently in progress for USA inland and coastal waters, the Adriatic Sea, the Strait of Sicily, the Gulf of Suez, and the Baltic Sea. 4 refs., 2 figs

  3. Assessment of a Low-Cost Ultrasound Pericardiocentesis Model

    Directory of Open Access Journals (Sweden)

    Marco Campo dell'Orto

    2013-01-01

Introduction. The use of ultrasound during resuscitation is emphasized in the latest European resuscitation council guidelines of 2013 to identify treatable conditions such as pericardial tamponade. The recommended standard treatment of tamponade in various guidelines is pericardiocentesis. As ultrasound guidance lowers the complication rates and increases the patient's safety, pericardiocentesis should be performed under ultrasound guidance. Acute care physicians therefore need to train for emergency pericardiocentesis. Methods. We describe in detail a pericardiocentesis ultrasound model, using materials at a cost of about 60 euros. During training courses of focused echocardiography (n=67), participants tested the phantom and completed a 16-item questionnaire, assessing the model using a visual analogue scale (VAS). Results. Eleven of fourteen questions were answered with a mean VAS score higher than 60% and thus regarded as showing the strengths of the model. Unrealistic outer appearance and heart shape were rated as weaknesses of the model. A total mean VAS score of all questions of 63% showed that participants gained confidence for further interventions. Conclusions. Our low-cost pericardiocentesis model, which can be easily constructed, may serve as an effective training tool of ultrasound-guided pericardiocentesis for acute and critical care physicians.

  4. Assessment of a Low-Cost Ultrasound Pericardiocentesis Model

    Science.gov (United States)

    Campo dell'Orto, Marco; Hempel, Dorothea; Starzetz, Agnieszka; Seibel, Armin; Hannemann, Ulf; Walcher, Felix; Breitkreutz, Raoul

    2013-01-01

Introduction. The use of ultrasound during resuscitation is emphasized in the latest European resuscitation council guidelines of 2013 to identify treatable conditions such as pericardial tamponade. The recommended standard treatment of tamponade in various guidelines is pericardiocentesis. As ultrasound guidance lowers the complication rates and increases the patient's safety, pericardiocentesis should be performed under ultrasound guidance. Acute care physicians therefore need to train for emergency pericardiocentesis. Methods. We describe in detail a pericardiocentesis ultrasound model, using materials at a cost of about 60 euros. During training courses of focused echocardiography (n = 67), participants tested the phantom and completed a 16-item questionnaire, assessing the model using a visual analogue scale (VAS). Results. Eleven of fourteen questions were answered with a mean VAS score higher than 60% and thus regarded as showing the strengths of the model. Unrealistic outer appearance and heart shape were rated as weaknesses of the model. A total mean VAS score of all questions of 63% showed that participants gained confidence for further interventions. Conclusions. Our low-cost pericardiocentesis model, which can be easily constructed, may serve as an effective training tool of ultrasound-guided pericardiocentesis for acute and critical care physicians. PMID:24288616

  5. Energy-based numerical models for assessment of soil liquefaction

    Directory of Open Access Journals (Sweden)

    Amir Hossein Alavi

    2012-07-01

This study presents promising variants of genetic programming (GP), namely linear genetic programming (LGP) and multi expression programming (MEP), to evaluate the liquefaction resistance of sandy soils. Generalized LGP- and MEP-based relationships were developed between the strain energy density required to trigger liquefaction (capacity energy) and the factors affecting the liquefaction characteristics of sands. The correlations were established based on well-established and widely dispersed experimental results obtained from the literature. To verify the applicability of the derived models, they were employed to estimate the capacity energy values of parts of the test results that were not included in the analysis. The external validation of the models was verified using statistical criteria recommended by researchers. Sensitivity and parametric analyses were performed for further verification of the correlations. The results indicate that the proposed correlations are effectively capable of capturing the liquefaction resistance of a number of sandy soils. The developed correlations provide a significantly better prediction performance than the models found in the literature. Furthermore, the best LGP and MEP models outperform the optimal traditional GP model. The verification phases confirm the efficiency of the derived correlations for their general application to the assessment of the strain energy at the onset of liquefaction.

  6. An integrated urban drainage system model for assessing renovation scheme.

    Science.gov (United States)

    Dong, X; Zeng, S; Chen, J; Zhao, D

    2012-01-01

Due to sustained economic growth in China over the last three decades, urbanization has been on a rapidly expanding track. In recent years, regional industrial relocations were also accelerated across the country from the east coast to the west inland. These changes have led to a large-scale redesign of urban infrastructures, including the drainage system. To help the reconstructed infrastructures towards better sustainability, a tool is required for assessing the efficiency and environmental performance of different renovation schemes. This paper developed an integrated dynamic modeling tool, which consisted of three models describing the sewer, the wastewater treatment plant (WWTP) and the receiving water body, respectively. Three auxiliary modules were also incorporated to conceptualize the model, calibrate the simulations, and analyze the results. The developed integrated modeling tool was applied to a case study in Shenzhen City, which is one of the most dynamic cities in China and faces considerable challenges from environmental degradation. The renovation scheme proposed to improve the environmental performance of Shenzhen City's urban drainage system was modeled and evaluated. The simulation results yielded suggestions for further improvement of the renovation scheme.

  7. Assessing the Validity of the Simplified Potential Energy Clock Model for Modeling Glass-Ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grillet, Anne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Strong, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dai, Steve Xunhu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Glass-ceramic seals may be the future of hermetic connectors at Sandia National Laboratories. They have been shown capable of surviving higher temperatures and pressures than amorphous glass seals. More advanced finite-element material models are required to enable model-based design and provide evidence that the hermetic connectors can meet design requirements. Glass-ceramics are composite materials with both crystalline and amorphous phases. The latter gives rise to (non-linearly) viscoelastic behavior. Given their complex microstructures, glass-ceramics may be thermorheologically complex, a behavior outside the scope of currently implemented constitutive models at Sandia. However, it was desired to assess if the Simplified Potential Energy Clock (SPEC) model is capable of capturing the material response. Available data for SL 16.8 glass-ceramic was used to calibrate the SPEC model. Model accuracy was assessed by comparing model predictions with shear moduli temperature dependence and high temperature 3-point bend creep data. It is shown that the model can predict the temperature dependence of the shear moduli and 3- point bend creep data. Analysis of the results is presented. Suggestions for future experiments and model development are presented. Though further calibration is likely necessary, SPEC has been shown capable of modeling glass-ceramic behavior in the glass transition region but requires further analysis below the transition region.

  8. Assessment of RANS CFD modelling for pressurised thermal shock analysis

    International Nuclear Information System (INIS)

Sander M Willemsen; Ed MJ Komen

    2005-01-01

The most severe Pressurised Thermal Shock (PTS) scenario is a cold water Emergency Core Coolant (ECC) injection into the cold leg during a LOCA. The injected ECC water mixes with the hot fluid present in the cold leg and flows towards the downcomer where further mixing takes place. When the cold mixture comes into contact with the Reactor Pressure Vessel (RPV) wall, it may lead to large temperature gradients and consequently to high stresses in the RPV wall. Knowledge of these thermal loads is important for RPV remnant life assessments. The existing thermal-hydraulic system codes currently applied for this purpose are based on one-dimensional approximations and can, therefore, not predict the complex three-dimensional flows occurring during ECC injection. Computational Fluid Dynamics (CFD) can be applied to predict these phenomena, with the ultimate benefit of improved remnant RPV life assessment. The present paper presents an assessment of various Reynolds Averaged Navier Stokes (RANS) CFD approaches for modeling the complex mixing phenomena occurring during ECC injection. This assessment has been performed by comparing the numerical results obtained using advanced turbulence models available in the CFX 5.6 CFD code in combination with a hybrid meshing strategy with experimental results of the Upper Plenum Test Facility (UPTF). The UPTF was a full-scale 'simulation' of the primary system of the four loop 1300 MWe Siemens/KWU Pressurised Water Reactor at Grafenrheinfeld. The test vessel upper plenum internals, downcomer and primary coolant piping were replicas of the reference plant, while other components, such as core, coolant pump and steam generators were replaced by simulators. From the extensive test programme, a single-phase fluid-fluid mixing experiment in the cold leg and downcomer was selected. Prediction of the mixing and stratification is assessed by comparison with the measured temperature profiles at several locations.

  9. Simplified Predictive Models for CO2 Sequestration Performance Assessment

    Science.gov (United States)

    Mishra, Srikanta; RaviGanesh, Priya; Schuetter, Jared; Mooney, Douglas; He, Jincong; Durlofsky, Louis

    2014-05-01

    We present results from an ongoing research project that seeks to develop and validate a portfolio of simplified modeling approaches that will enable rapid feasibility and risk assessment for CO2 sequestration in deep saline formation. The overall research goal is to provide tools for predicting: (a) injection well and formation pressure buildup, and (b) lateral and vertical CO2 plume migration. Simplified modeling approaches that are being developed in this research fall under three categories: (1) Simplified physics-based modeling (SPM), where only the most relevant physical processes are modeled, (2) Statistical-learning based modeling (SLM), where the simulator is replaced with a "response surface", and (3) Reduced-order method based modeling (RMM), where mathematical approximations reduce the computational burden. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. In the first category (SPM), we use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of fractional-flow curve, variance of layer permeability values, and the nature of vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. In the second category (SLM), we develop statistical "proxy models" using the simulation domain described previously with two different approaches: (a) classical Box-Behnken experimental design with a quadratic response surface fit, and (b) maximin Latin Hypercube sampling (LHS) based design with a Kriging metamodel fit using a quadratic trend and Gaussian correlation structure. For roughly the same number of
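The statistical-learning (SLM) workflow in this record can be sketched end to end: draw a space-filling Latin hypercube design over the uncertain parameters, run the simulator at the design points, and fit a quadratic response surface that stands in for the expensive simulator. The two-parameter toy simulator below is an assumption for illustration only; the study's actual designs (Box-Behnken, Kriging metamodels) and its compositional simulator are not reproduced here.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """LHS on [0, 1)^n_dims: one sample per equal-probability stratum per dimension,
    with strata randomly permuted in each dimension."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])
    return u

def quad_features(X):
    """Full quadratic basis in two variables: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

def fit_proxy(X, y):
    """Least-squares quadratic response surface; returns a fast predictor that
    replaces the simulator at new parameter settings."""
    coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    return lambda Xnew: quad_features(np.atleast_2d(Xnew)) @ coef
```

Once fitted, the returned predictor evaluates in microseconds, which is what makes design-based proxy models attractive for rapid feasibility and risk screening.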

  10. Assessment of the five-factor model of personality.

    Science.gov (United States)

    Widiger, T A; Trull, T J

    1997-04-01

The five-factor model (FFM) of personality is obtaining construct validation, recognition, and practical consideration across a broad domain of fields, including clinical psychology, industrial-organizational psychology, and health psychology. As a result, an array of instruments has been developed, and existing instruments are being modified, to assess the FFM. In this article, we present an overview and critique of five such instruments (the Goldberg Big Five Markers, the revised NEO Personality Inventory, the Interpersonal Adjective Scales-Big Five, the Personality Psychopathology-Five, and the Hogan Personality Inventory), focusing in particular on their representation of the lexical FFM and their practical application.

  11. Human Factor Modelling in the Risk Assessment of Port Manoeuvers

    Directory of Open Access Journals (Sweden)

    Teresa Abramowicz-Gerigk

    2015-09-01

    The documentation of human factor influence on scenario development in maritime accidents, compared with expert methods, is commonly used as a basis in the process of setting up safety regulations and instructions. New accidents and near misses show the necessity of further studies to determine the influence of the human factor on both risk acceptance criteria and the development of risk control options for manoeuvres in restricted waters. The paper presents a model of human error probability proposed for assessing ship masters' and marine pilots' erroneous decisions and their influence on the risk of port manoeuvres.

  12. Modelling and performance assessment of an antenna-control system

    Science.gov (United States)

    Burrows, C. R.

    1982-03-01

    An assessment is made of a surveillance-radar control system designed to provide a sector-search capability and continuous control of antenna speed without unwanted torque-reaction on the supporting mast. These objectives are attained by utilizing regenerative braking, and control is exercised through Perbury CVTs. A detailed analysis of the system is given. The models derived for the Perbury CVTs supplement the qualitative data contained in earlier papers. Some results from a computer simulation are presented. Although the paper is concerned with a particular problem, the analysis of the CVTs, and the concept of using energy transfer to control large inertial loads, are of more general interest.

  13. Predicting Performance on MOOC Assessments using Multi-Regression Models

    OpenAIRE

    Ren, Zhiyun; Rangwala, Huzefa; Johri, Aditya

    2016-01-01

    The past few years have seen the rapid growth of data mining approaches for the analysis of data obtained from Massive Open Online Courses (MOOCs). The objective of this study is to develop approaches to predict the score a student may achieve on a given grade-related assessment based on information considered as prior performance or prior activity in the course. We develop a personalized linear multiple regression (PLMR) model to predict the grade for a student, prior to attempt...

  14. Usage models in reliability assessment of software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P.; Pulkkinen, U. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland)

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.).

  15. Assessing ecological sustainability in urban planning - EcoBalance model

    Energy Technology Data Exchange (ETDEWEB)

    Wahlgren, I., Email: irmeli.wahlgren@vtt.fi

    2012-06-15

    Urban planning solutions and decisions have large-scale significance for ecological sustainability (eco-efficiency): the consumption of energy and other natural resources, the production of greenhouse gas and other emissions, and the costs caused by urban form. Climate change brings new and growing challenges for urban planning. The EcoBalance model was developed to assess the sustainability of urban form and has been applied at various planning levels: regional plans, local master plans and detailed plans. The EcoBalance model estimates the total consumption of energy and other natural resources, the production of emissions and wastes, and the costs caused directly and indirectly by urban form on a life cycle basis. The results of the case studies provide information about the ecological impacts of various solutions in urban development. (orig.)

  16. Operation quality assessment model for video conference system

    Science.gov (United States)

    Du, Bangshi; Qi, Feng; Shao, Sujie; Wang, Ying; Li, Weijian

    2018-01-01

    Video conference systems have become an important support platform for smart grid operation and management, and their operation quality is of growing concern to grid enterprises. First, an evaluation indicator system covering network, business and operation-maintenance aspects was established on the basis of the video conference system's operation statistics. Then, an operation quality assessment model combining a genetic algorithm with a regularized BP neural network was proposed, which outputs the operation quality level of the system within a time period and provides company managers with optimization advice. The simulation results show that the proposed evaluation model offers the advantages of fast convergence and high prediction accuracy in contrast with a regularized BP neural network alone, and that its generalization ability is superior to LM-BP and Bayesian BP neural networks.

  17. Considerations on assessment of different time depending models adequacy

    International Nuclear Information System (INIS)

    Constantinescu, C.

    2015-01-01

    The operating period of nuclear power plants can be prolonged if it can be shown that their safety has remained at a high level; for this, it is necessary to estimate how ageing systems, structures and components (SSCs) influence NPP reliability and safety. To emphasize the ageing aspects, the case study presented in this paper assesses different time-dependent models for the rate of occurrence of failures, with the goal of obtaining the best-fitting model. A sensitivity analysis of the impact of burn-in failures was performed to improve the result of the goodness-of-fit test. Based on the analysis results, a conclusion about the existence or absence of an ageing trend could be drawn. A sensitivity analysis regarding the reliability parameters was performed, and the results were used to observe the impact on the time-dependent rate of occurrence of failures. (authors)

  18. Assessing policies towards sustainable transport in Europe: an integrated model

    International Nuclear Information System (INIS)

    Zachariadis, Theodoros

    2005-01-01

    A transport simulation and forecast model is presented, which is designed for the assessment of policy options aiming to achieve sustainability in transportation. Starting from a simulation of the economic behaviour of consumers and producers within a microeconomic optimisation framework and the resulting calculation of the modal split, the allocation of the vehicle stock into vintages and technological groups is modelled. In a third step, a technology-oriented algorithm, which incorporates the relevant state-of-the-art knowledge in Europe, calculates emissions of air pollutants and greenhouse gases as well as appropriate indicators for traffic congestion, noise and road accidents. The paper outlines the methodology and the basic data sources used in connection with work done so far in Europe, presents the outlook according to a 'reference case' run for the 15 current European Union Member States up to 2030, displays aggregate results from a number of alternative scenarios and outlines elements of future work.

  19. Lysimeter data as input to performance assessment models

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.

    1998-01-01

    The Field Lysimeter Investigations: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste forms in a disposal environment. Waste forms fabricated using ion-exchange resins from EPICOR-II prefilters employed in the cleanup of the Three Mile Island (TMI) Nuclear Power Station are being tested to develop a low-level waste data base and to obtain information on the survivability of waste forms in a disposal environment. The program includes reviewing radionuclide releases from those waste forms in the first 7 years of sampling and examining the relationship between code input parameters and lysimeter data. Also, lysimeter data are applied to performance assessment source term models, and initial results from the use of data in two models are presented.

  20. Usage models in reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Pulkkinen, U.; Korhonen, J.

    1997-04-01

    This volume in the OHA-project report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in the OHA-project report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. In this report the issues related to the statistical testing and especially automated test case generation are considered. The goal is to find an efficient method for building usage models for the generation of statistically significant set of test cases and to gather practical experiences from this method by applying it in a case study. The scope of the study also includes the tool support for the method, as the models may grow quite large and complex. (32 refs., 30 figs.)

  1. Assessment of tropospheric delay mapping function models in Egypt: Using PTD database model

    Science.gov (United States)

    Abdelfatah, M. A.; Mousa, Ashraf E.; El-Fiky, Gamal S.

    2018-06-01

    For space geodetic measurements, estimates of tropospheric delays are highly correlated with site coordinates and receiver clock biases. Thus, it is important to use the most accurate tropospheric delay models to reduce errors in the estimates of the other parameters. Both the zenith delay value and the mapping function should be assigned correctly to reduce such errors. Several mapping function models can treat the troposphere slant delay, but the recent models have not been evaluated for Egyptian local climate conditions; an assessment of these models is needed to choose the most suitable one. The goal of this paper is to test which global mapping functions provide the highest consistency with precise troposphere delay (PTD) mapping functions. The PTD model is derived from radiosonde data using ray tracing and is considered in this paper as the true value. The PTD mapping functions were compared with three recent total mapping function models and three separate dry and wet mapping function models. The results indicate that the models are very close up to a zenith angle of 80°. The Saastamoinen and 1/cos z models lag behind in accuracy, and the Niell model performs better than the VMF model. The model of Black and Eisner is a good model. The results also indicate that the geometric range error has an insignificant effect on slant delay and that the azimuthally anti-symmetric fluctuation is about 1%.
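
    The contrast between the crude 1/cos z mapping and the continued-fraction form used by Niell- and VMF-style models can be illustrated numerically; the slant delay is the zenith delay times m(z), and the coefficients below are illustrative placeholders, not values from the paper:

```python
import math

def mf_cosecant(z_deg):
    """Simplest mapping function: m(z) = 1 / cos(z)."""
    return 1.0 / math.cos(math.radians(z_deg))

def mf_continued_fraction(z_deg, a=1.27e-3, b=2.9e-3, c=6.2e-2):
    """Three-term continued-fraction mapping function (Marini/Herring form),
    the shape behind Niell- and VMF-type models; a, b, c are illustrative."""
    cz = math.cos(math.radians(z_deg))
    norm = 1.0 + a / (1.0 + b / (1.0 + c))  # normalizes m(0) to exactly 1
    return norm / (cz + a / (cz + b / (cz + c)))

# The two mappings agree closely at moderate zenith angles and diverge
# as z approaches 80 degrees:
for z in (30.0, 60.0, 80.0):
    print(z, round(mf_cosecant(z), 3), round(mf_continued_fraction(z), 3))
```

Multiplying either m(z) by a zenith delay gives the slant delay that would enter the least-squares adjustment.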

  2. Erosion Assessment Modeling Using the Sateec Gis Model on the Prislop Catchment

    Directory of Open Access Journals (Sweden)

    Damian Gheorghe

    2014-05-01

    The Sediment Assessment Tool for Effective Erosion Control (SATEEC) acts as an extension for ArcView GIS 3, with easy-to-use commands. The erosion assessment is divided into two modules: the Universal Soil Loss Equation (USLE) for sheet/rill erosion and nLS/USPED modeling for gully-head erosion. The SATEEC erosion modules can be successfully implemented for areas where sheet, rill and gully erosion occurs, such as the Prislop Catchment. The enhanced SATEEC system does not require experienced GIS users to operate it and is therefore suitable for local authorities and/or students not so familiar with erosion modeling.
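
    The sheet/rill module rests on the USLE, which is simple enough to state directly; a minimal per-cell sketch (the factor values below are hypothetical, not Prislop data):

```python
def usle_soil_loss(r, k, ls, c, p):
    """Annual soil loss A per the Universal Soil Loss Equation,
    A = R * K * LS * C * P, the sheet/rill erosion module used by SATEEC."""
    return r * k * ls * c * p

# Illustrative factor values for one raster cell:
a = usle_soil_loss(r=1200.0,  # rainfall erosivity
                   k=0.03,    # soil erodibility
                   ls=1.5,    # slope length-steepness factor
                   c=0.2,     # cover-management factor
                   p=1.0)     # support practice factor
print(round(a, 2))  # 10.8
```

In SATEEC the same product is evaluated on every grid cell of the catchment raster and then routed with a sediment delivery ratio.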

  3. Assessment of bullet effectiveness based on a human vulnerability model.

    Science.gov (United States)

    Liu, Susu; Xu, C; Wen, Y; Li, G; Zhou, J

    2017-12-25

    Penetrating wounds from explosively propelled fragments and bullets are the most common causes of combat injury. There is a requirement to assess the potential effectiveness of bullets penetrating human tissues in order to optimise preventive measures and wound trauma management. An advanced voxel model based on the Chinese Visible Human data was built. A digital human vulnerability model was established in combination with wound reconstruction and vulnerability assessment rules, in which wound penetration profiles were obtained by recreating the penetration of projectiles into ballistic gelatin. An effectiveness evaluation method of bullet penetration using the Abbreviated Injury Scale (AIS) was developed and solved using the Monte Carlo sampling method. The effectiveness of rifle bullets was demonstrated to increase with increasing velocity in the range of 300-700 m/s. When imparting the same energy, the effectiveness of the 5.56 mm bullet was higher than the 7.62 mm bullet in this model. The superimposition of simulant penetration profiles produced from ballistic gelatin simulant has been used to predict wound tracts in damaged tissues. The authors recognise that determining clinical effectiveness based on the AIS scores alone without verification of outcome by review of clinical hospital records means that this technique should be seen more as a manner of comparing the effectiveness of bullets than an injury prediction model. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
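
    The Monte Carlo evaluation step can be sketched in a few lines; the zone names, hit probabilities and AIS scores below are hypothetical, not values from the voxel model:

```python
import random

def mean_severity(zone_probs, zone_ais, n=50_000, seed=1):
    """Monte Carlo estimate of expected injury severity: draw a hit zone from
    the hit-probability weights, then look up that zone's AIS score."""
    rng = random.Random(seed)
    zones = list(zone_probs)
    weights = [zone_probs[z] for z in zones]
    total = 0
    for _ in range(n):
        hit = rng.choices(zones, weights=weights)[0]  # sample one hit location
        total += zone_ais[hit]
    return total / n

score = mean_severity({"thorax": 0.3, "abdomen": 0.2, "limb": 0.5},
                      {"thorax": 4, "abdomen": 3, "limb": 2})
print(round(score, 1))
```

Comparing this expectation across bullet types and impact velocities gives the kind of relative-effectiveness ranking the paper reports.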

  4. Quality assessment in higher education using the SERVQUAL model

    Directory of Open Access Journals (Sweden)

    Sabina Đonlagić

    2015-01-01

    The economy in Bosnia and Herzegovina is striving towards growth and increased employment, and empirical studies worldwide have shown that higher education contributes to the socio-economic development of a country. Universities are important for the generation, preservation and dissemination of knowledge in order to contribute to a country's socio-economic benefits. Higher education institutions are being pressured to improve the value of their activities, and providing quality higher education services to students should be taken seriously. In this paper we address the emerging demand for quality in higher education: institutions should assess the quality of their services and establish methods for improving it, and quality assurance activities should be integrated into the management process. This paper addresses the issue of service quality measurement in higher education institutions. The most frequently used model in this context is the SERVQUAL model, which measures quality from the students' point of view, since students are considered one of the most important stakeholders of a higher education institution. The main objective of this research is to provide empirical evidence that the adapted SERVQUAL model can be used in higher education and to identify the service quality gap based on its application at one institution of higher education (a Faculty of Economics in Bosnia and Herzegovina). Furthermore, the results of the gap analysis using the SERVQUAL methodology indicate the areas in which improvement is necessary in order to enhance service quality.
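
    The gap analysis at the core of SERVQUAL is simply perception minus expectation per dimension; a minimal sketch with hypothetical mean ratings on a 1-7 scale:

```python
def servqual_gaps(expectations, perceptions):
    """Per-dimension SERVQUAL gap score: perception minus expectation.
    Negative gaps mark dimensions where service falls short of expectations."""
    return {dim: perceptions[dim] - expectations[dim] for dim in expectations}

# Hypothetical survey means for the five classic SERVQUAL dimensions:
gaps = servqual_gaps(
    expectations={"tangibles": 5.8, "reliability": 6.2, "responsiveness": 6.0,
                  "assurance": 6.1, "empathy": 5.9},
    perceptions={"tangibles": 5.5, "reliability": 5.1, "responsiveness": 5.6,
                 "assurance": 5.9, "empathy": 5.2},
)
worst = min(gaps, key=gaps.get)  # dimension with the largest shortfall
print(worst)  # reliability
```

Ranking the gaps in this way is what lets an institution prioritize where service improvement is most needed.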

  5. SCORING ASSESSMENT AND FORECASTING MODELS BANKRUPTCY RISK OF COMPANIES

    Directory of Open Access Journals (Sweden)

    SUSU Stefanita

    2014-07-01

    Bankruptcy risk has been the subject of many research studies that aim at identifying the time of bankruptcy, the factors that contribute to this state, and the indicators that best express this orientation towards bankruptcy. The threats to enterprises require that managers continually know the economic and financial situation and the vulnerable areas with development potential, and that they identify and properly manage the threats that would prevent achieving the targets. The methods known in the literature for assessing and evaluating bankruptcy risk include static, functional, strategic, scoring and nonfinancial models. This article addresses the internationally known Altman and Conan-Holder models, as well as the Robu-Mironiuc model, developed at national level by two professors from prestigious universities in our country. Those models are applied to data from the profit and loss account and balance sheet of the Turism Covasna company, over which the bankruptcy risk analysis is performed. The results of the analysis are interpreted while trying to formulate solutions for the economic and financial viability of the entity.
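
    Of the models named, the original Altman Z-score has a closed form that is easy to sketch; the input ratios below are hypothetical, not the Turism Covasna figures:

```python
def altman_z(x1, x2, x3, x4, x5):
    """Original Altman (1968) Z-score:
    Z = 1.2*X1 + 1.4*X2 + 3.3*X3 + 0.6*X4 + 1.0*X5
    X1 = working capital / total assets
    X2 = retained earnings / total assets
    X3 = EBIT / total assets
    X4 = market value of equity / total liabilities
    X5 = sales / total assets"""
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

z = altman_z(0.2, 0.2, 0.1, 1.0, 1.0)
print(round(z, 2))  # 2.45 -- inside Altman's "grey zone" (1.81 < Z < 2.99)
```

In the classic formulation, Z above 2.99 signals a safe firm and Z below 1.81 signals distress; values in between fall in the grey zone.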

  6. Biosphere model for assessing doses from nuclear waste disposal

    International Nuclear Information System (INIS)

    Zach, R.; Amiro, B.D.; Davis, P.A.; Sheppard, S.C.; Szekeley, J.G.

    1994-01-01

    The biosphere model, BIOTRAC, for predicting long-term nuclide concentrations and radiological doses from Canada's nuclear fuel waste disposal concept of a vault deep in plutonic rock of the Canadian Shield is presented. This generic, boreal-zone biosphere model is based on scenario analysis and systems variability analysis using Monte Carlo simulation techniques. Conservatism is used to bridge uncertainties, even though this creates a small amount of extra nuclide mass. Environmental change over the very long assessment period is mainly handled through distributed parameter values. The dose receptors are a critical group of humans and four generic non-human target organisms. BIOTRAC includes six integrated submodels, and it interfaces smoothly with a geosphere model; this interface includes a bedrock well. The geosphere model defines the discharge zones of deep groundwater where nuclides released from the vault enter the biosphere occupied by the dose receptors. The size of one of these zones is reduced when water is withdrawn from the bedrock well. Sensitivity analysis indicates that 129I is by far the most important radionuclide. Results also show that bedrock-well water leads to higher doses to man than lake water, but the former doses decrease with the size of the critical group. Under comparable circumstances, doses to the non-human biota are greater than those to man.

  7. Spatially Informed Plant PRA Models for Security Assessment

    International Nuclear Information System (INIS)

    Wheeler, Timothy A.; Thomas, Willard; Thornsbury, Eric

    2006-01-01

    Traditional risk models can be adapted to evaluate plant response for situations where plant systems and structures are intentionally damaged, such as from sabotage or terrorism. This paper describes a process by which traditional risk models can be spatially informed to analyze the effects of compound and widespread harsh environments through the use of 'damage footprints'. A 'damage footprint' is a spatial map of regions of the plant (zones) where equipment could be physically destroyed or disabled as a direct consequence of an intentional act. The use of 'damage footprints' requires that the basic events from the traditional probabilistic risk assessment (PRA) be spatially transformed so that the failure of individual components can be linked to the destruction of or damage to specific spatial zones within the plant. Given the nature of intentional acts, extensive modifications must be made to the risk models to account for the special nature of the 'initiating events' associated with deliberate adversary actions. Intentional acts might produce harsh environments that in turn could subject components and structures to one or more insults, such as structural, fire, flood, and/or vibration and shock damage. Furthermore, the potential for widespread damage from some of these insults requires an approach that addresses the impacts of these potentially severe insults even when they occur in locations distant from the actual physical location of a component or structure modeled in the traditional PRA. (authors)

  8. Intrinsic ethics regarding integrated assessment models for climate management.

    Science.gov (United States)

    Schienke, Erich W; Baum, Seth D; Tuana, Nancy; Davis, Kenneth J; Keller, Klaus

    2011-09-01

    In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the currently typically adopted approach in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.

  9. The Use of Logistic Model in RUL Assessment

    Science.gov (United States)

    Gumiński, R.; Radkowski, S.

    2017-12-01

    The paper takes on the issue of assessing remaining useful life (RUL). The goal of the paper is to develop a method that enables the use of diagnostic information in the task of reducing the uncertainty related to technical risk. Predicting the RUL of a system is a very important task for maintenance strategy. In the literature, the RUL of an engineering system is defined as the first future time instant at which thresholds on conditions (safety, operational quality, maintenance cost, etc.) are violated. Knowledge of the RUL offers the possibility of planning testing and repair activities, and building models of damage development is important in this task. In the presented work, a logistic function is used to model fatigue crack development. Modeling all phases of damage development at once is very difficult; modeling each phase separately, especially when on-line diagnostic information is included, is more effective. Particular attention was paid to the possibility of forecasting the occurrence of fatigue damage by analysing the structure of a vibroacoustic signal.
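
    As a concrete illustration of the approach, a logistic damage curve can be inverted to give an RUL estimate; all parameter values below are hypothetical, not the paper's:

```python
import math

def crack_size(t, a_max=10.0, k=0.15, t0=60.0):
    """Logistic model of fatigue-crack growth:
    a(t) = a_max / (1 + exp(-k * (t - t0)))
    with crack length a in mm and t in thousands of load cycles."""
    return a_max / (1.0 + math.exp(-k * (t - t0)))

def rul(t_now, a_threshold=8.0, a_max=10.0, k=0.15, t0=60.0):
    """Remaining useful life: time until the crack first reaches a_threshold,
    obtained by inverting the logistic curve in closed form."""
    t_fail = t0 - math.log(a_max / a_threshold - 1.0) / k
    return max(0.0, t_fail - t_now)

print(round(rul(50.0), 1))  # 19.2
```

In practice t0 and k would be re-estimated from on-line vibroacoustic diagnostics as the crack develops, tightening the RUL estimate over time.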

  10. Assessing the Hydrogeomorphic Effects of Environmental Flows using Hydrodynamic Modeling.

    Science.gov (United States)

    Gregory, Angela; Morrison, Ryan R; Stone, Mark

    2018-04-13

    Water managers are increasingly using environmental flows (e-flows) as a tool to improve ecological conditions downstream from impoundments. Recent studies have called for e-flow approaches that explicitly consider impacts on hydrogeomorphic processes when developing management alternatives. Process-based approaches are particularly relevant in river systems that have been highly modified and where water supplies are over-allocated. One-dimensional (1D) and two-dimensional (2D) hydrodynamic models can be used to resolve hydrogeomorphic processes at different spatial and temporal scales to support the development, testing, and refinement of e-flow hypotheses. Thus, the objective of this paper is to demonstrate the use of hydrodynamic models as a tool for assisting stakeholders in targeting and assessing environmental flows within a decision-making framework. We present a case study of e-flows on the Rio Chama in northern New Mexico, USA, where 1D and 2D hydrodynamic modeling was used within a collaborative process to implement an e-flow experiment. A specific goal of the e-flow process was to improve spawning habitat for brown trout by flushing fine sediments from gravel features. The results revealed that the 2D hydrodynamic model provided much greater insight with respect to hydrodynamic and sediment transport processes, which led to a reduction in the recommended e-flow discharge. The results suggest that 2D hydrodynamic models can be useful tools for improving process understanding, developing e-flow recommendations, and supporting adaptive management even when limited or no data are available for model calibration and validation.

  11. When less is more: Psychometric properties of Norwegian short-forms of the Ambivalent Sexism Scales (ASI and AMI) and the Illinois Rape Myth Acceptance (IRMA) Scale.

    Science.gov (United States)

    Bendixen, Mons; Kennair, Leif Edward Ottesen

    2017-12-01

    This paper reports on the development and the psychometric properties of short forms of Ambivalent Sexism Scales toward women (ASI; Glick & Fiske, 1996) and men (AMI; Glick & Fiske, 1999), and a scale measuring rape stereotypes (IRMA; McMahon & Farmer, 2011). The short form AMI/ASI were applied for examining gender and educational differences in university students (N = 512) and in high school students (N = 1381), and for predicting individual differences in rape stereotypes in the latter. The short forms demonstrated good to excellent psychometric properties across samples of emerging adults. Relative to female students, male students reported markedly more hostility toward women and more stereotypical beliefs about rape. Despite sampling from a highly gender egalitarian and secular culture, these gender differences are on a par with those reported internationally. Rape stereotypes were predicted by sexism in high school students. Additional predictors were educational program, relationship status, and acceptance of derogatory sexual slurs. The paper questions the validity of separate constructs for benevolent sexism toward women versus men. The short form versions of the scales may substitute for the original versions in future research, and help prevent attrition while measuring the same constructs. © 2017 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  12. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G. Saulnier and W. Statham

    2006-04-16

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following analogous characteristics as compared to the Yucca Mountain repository site: (1) Analogous source--UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--fractured, welded, and altered rhyolitic ash-flow tuffs; (3) Analogous climate--Semiarid to arid; (4) Analogous setting--Volcanic tuffs overlie carbonate rocks; (5) Analogous geochemistry--Oxidizing conditions; and (6) Analogous hydrogeology--The ore deposit lies in the unsaturated zone above the water table.

  13. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    International Nuclear Information System (INIS)

    G. Saulnier; W. Statham

    2006-01-01

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following analogous characteristics as compared to the Yucca Mountain repository site: (1) Analogous source--UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--fractured, welded, and altered rhyolitic ash-flow tuffs; (3) Analogous climate--Semiarid to arid; (4) Analogous setting--Volcanic tuffs overlie carbonate rocks; (5) Analogous geochemistry--Oxidizing conditions; and (6) Analogous hydrogeology--The ore deposit lies in the unsaturated zone above the water table.

  14. Analytical Modeling for Underground Risk Assessment in Smart Cities

    Directory of Open Access Journals (Sweden)

    Israr Ullah

    2018-06-01

    In the developed world, underground facilities are increasing day by day, as they are considered an improved utilization of available space in smart cities. Typical facilities include underground railway lines, electricity lines, parking lots, water supply systems, sewerage networks, etc. Besides their utility, these facilities also pose serious threats to citizens and property. To preempt the accidental loss of precious human lives and property, a real-time monitoring system is highly desirable for conducting risk assessment on a continuous basis and reporting any abnormality in time. In this paper, we present an analytical formulation to model system behavior for risk analysis and assessment based on various risk-contributing factors. Based on the proposed analytical model, we have evaluated three approximation techniques for computing the final risk index: (a) simple linear approximation based on multiple linear regression analysis; (b) a hierarchical fuzzy-logic-based technique in which related risk factors are combined in a tree-like structure; and (c) a hybrid approximation approach which is a combination of (a) and (b). Experimental results show that simple linear approximation fails to accurately estimate the final risk index compared to the hierarchical fuzzy-logic-based system, which provides an efficient method for monitoring and forecasting critical issues in underground facilities and may assist in maintenance efficiency as well. Estimation results based on the hybrid approach also fail to accurately estimate the final risk index; however, the hybrid scheme reveals some interesting and detailed information by performing automatic clustering based on the location risk index.
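
    The structural difference between approximations (a) and (b) can be sketched schematically, with a plain weighted tree standing in for the fuzzy aggregation; all weights and factor scores below are hypothetical:

```python
def linear_risk_index(factors, weights):
    """(a) Flat linear approximation: risk index as a weighted sum of
    normalized risk-factor scores, as if fitted by multiple regression."""
    return sum(w * x for w, x in zip(weights, factors))

def hierarchical_risk_index(tree):
    """(b) Hierarchical combination: related factors are aggregated bottom-up
    in a tree; each node is a (weight, value-or-subtree) pair."""
    total = 0.0
    for weight, node in tree:
        value = hierarchical_risk_index(node) if isinstance(node, list) else node
        total += weight * value
    return total

# Hypothetical sensor scores in [0, 1]:
flat = linear_risk_index([0.8, 0.4, 0.6], [0.5, 0.3, 0.2])
tree = hierarchical_risk_index([
    (0.6, [(0.7, 0.8), (0.3, 0.4)]),   # structural sub-index
    (0.4, 0.6),                        # environmental factor
])
print(round(flat, 2), round(tree, 2))  # 0.64 0.65
```

The tree form keeps related factors grouped, which is also what enables the per-location clustering the hybrid scheme reports.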

  15. Pluripotent stem cells: An in vitro model for nanotoxicity assessments.

    Science.gov (United States)

    Handral, Harish K; Tong, Huei Jinn; Islam, Intekhab; Sriram, Gopu; Rosa, Vinicus; Cao, Tong

    2016-10-01

The advent of technology has led to an established range of engineered nanoparticles that are used in diverse applications, such as cell-cell interactions, cell-material interactions, medical therapies and the targeted modulation of cellular processes. The exponential increase in the utilization of nanomaterials and the growing number of associated criticisms have highlighted the potential risks of nanomaterials to human health and the ecosystem. The existing in vivo and in vitro platforms show limitations, with fluctuations being observed in the results of toxicity assessments. Pluripotent stem cells (PSCs) are a viable source of cells that are capable of developing into the specialized cells of the human body. PSCs can be efficiently used to screen new biomaterials/drugs and are potential candidates for studying impairments of biophysical morphology at both the cellular and tissue levels during interactions with nanomaterials and for diagnosing toxicity. Three-dimensional in vitro models obtained using PSC-derived cells would provide a realistic, patient-specific platform for toxicity assessments and drug screening applications. The current review focuses on PSCs as an alternative in vitro platform for assessing the hazardous effects of nanomaterials on health systems and highlights the importance of PSC-derived in vitro platforms. Copyright © 2016 John Wiley & Sons, Ltd.

  16. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    Science.gov (United States)

    Rider, W. J.

    2012-12-01

The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  17. EXPERT MODEL OF LAND SUITABILITY ASSESSMENT FOR CROPS

    Directory of Open Access Journals (Sweden)

    Boris Đurđević

    2010-12-01

Full Text Available A total of 17404 soil samples (2003-2009) were analysed in eastern Croatia. The largest number of soil samples belongs to Osijek-Baranya county, which, together with both eastern sugar beet factories (Osijek and Županja), conducts soil fertility control (~4200 samples/yr). The computer model for suitability assessment for crops, supported by GIS, proved to be fast, efficient and sufficiently reliable given the number of analyzed soil samples. It allows the visualization of the agricultural area and prediction of its production properties for the purposes of analysis, planning and rationalization of agricultural production. With more precise data about the soil (soil, climate) and a reliable Digital Soil Map of Croatia, the model could be acceptable not only for evaluating the suitability for growing different crops but also for estimating their need for fertilizer, necessary machinery and repair measures (liming, organic matter input), with the aim of eliminating or reducing the effects of limiting factors in primary agricultural production. Assessment of the relative suitability of soil by the computer model for crop production and the geostatistical kriging method in Osijek-Baranya county showed: (1) average soil suitability is 60.06 percent; (2) kriging predicted that 51751 ha (17.16%) are of limited suitability (N1) for growing crops, whereas (a) 86142 ha (28.57%) of land are marginally suitable (S3), (b) 132789 ha (44.04%) are moderately suitable (S2) and (c) 30772 ha (10.28%) are of excellent fertility (S1). The large volume of eastern Croatian land data showed that the computer-geostatistical model for determining soil suitability for growing crops is automated, fast and simple to use, suitable for implementation in GIS, automatically downloading the necessary suitability indicators from the input bases (land, analytical and climate) as well as data from the digital soil maps, and able to: (a) visualize the suitability for soil tillage, (b) predict the
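
The kriging interpolation referred to above can be illustrated with a minimal ordinary-kriging sketch. The spherical variogram parameters, sample points and suitability values below are invented for illustration and are not the study's data.

```python
import numpy as np

def spherical_variogram(h, nugget=0.0, sill=1.0, rng=500.0):
    """Spherical variogram model with illustrative parameters."""
    h = np.minimum(h, rng)
    return nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

def ordinary_kriging(xy, z, target):
    """Estimate z at `target` from samples (xy, z) via ordinary kriging."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_variogram(d)
    A[n, n] = 0.0                      # Lagrange multiplier row/column
    b = np.ones(n + 1)
    b[:n] = spherical_variogram(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)[:n]      # kriging weights (sum to 1)
    return float(w @ z)

# Four invented sample locations with suitability scores:
xy = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
z = np.array([50.0, 60.0, 70.0, 80.0])
est = ordinary_kriging(xy, z, np.array([1.0, 1.0]))
```

At the centre of a symmetric sample layout the weights are equal, so the estimate reduces to the sample mean; over a real map the same system is solved at every grid node.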

  18. Developing a Model for Assessing Public Culture Indicators at Universities

    Directory of Open Access Journals (Sweden)

    Meisam Latifi

    2015-06-01

Full Text Available The present study aims to develop a model for assessing public culture at universities and to evaluate its indicators at public universities in Mashhad. The research follows an exploratory mixed approach: the research strategies in the qualitative and quantitative sections are thematic network analysis and the descriptive-survey method, respectively. In the qualitative section, document analysis and semi-structured interviews with cultural experts are used as research tools, with purposive sampling. In the quantitative section, a questionnaire developed from the findings of the qualitative section is used as the research tool. The research population of the quantitative section consists of all students admitted to public universities in Mashhad between 2009 and 2012; the sample size was calculated according to Cochran's formula, and stratified sampling was used to select the sample. The qualitative section led to the identification of 44 basic themes, referred to as micro indicators. These themes were clustered into similar groups, from which 10 organizing themes were identified and recognized as macro indicators. In the next phase, the importance factor of each indicator was determined using the AHP method. The quantitative assessment of the indicators at public universities in Mashhad shows that the overall cultural index declines during the years the student attends the university. Additionally, the highest correlation exists between national identity and revolutionary identity. The only negative correlations are observed between family and two indicators, social capital and cultural consumption. The results of the present study can be used to assess the state of public culture among university students and can also serve as a basis for cultural planning.
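
The AHP weighting step mentioned above reduces a pairwise-comparison matrix of judgments to a priority (importance) vector. The sketch below uses the row geometric-mean approximation with an invented 3x3 judgment matrix, not the study's ten indicators.

```python
from math import prod

def ahp_weights(M):
    """Priority vector from a pairwise-comparison matrix (geometric-mean method)."""
    n = len(M)
    gm = [prod(row) ** (1.0 / n) for row in M]   # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]               # normalize to sum to 1

# Illustrative 3x3 matrix on the Saaty 1-9 scale (M[i][j] = importance of i over j):
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(M)
```

The geometric-mean method closely approximates the principal-eigenvector weights for reasonably consistent matrices and needs no linear-algebra machinery.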

  19. Model of affective assessment of primary school students

    Directory of Open Access Journals (Sweden)

    Amir Syamsudin

    2016-06-01

Full Text Available This study aims to develop an instrument of affective assessment to measure the social competence of elementary school students in the learning process in schools. The study used Borg & Gall's development model, modified into five phases: needs analysis; development of a draft product by experts; development of the affective assessment instrument; try-out of the instrument by primary-education teachers in Yogyakarta; and dissemination and implementation of the developed instrument. The subjects were elementary school students whose schools implemented Curriculum 2013 in the 2013/2014 academic year. The validity and reliability of each construct of the affective instrument were established using the PLS SEM Wrap PLS 3.0 analysis program. The study finds the following results. First, the constructs of Honesty, Discipline, Responsibility, Decency, Care, and Self-Confidence in the limited, main, and extended testing are supported by empirical data. Second, the validity of these constructs in the limited, main, and extended testing meets the criterion of a loading factor above 0.70 for each indicator and a cross-loading factor below 0.50. Third, their reliability in the limited, main, and extended testing meets the criterion of scores above 0.70 for both composite reliability and Cronbach's alpha. Fourth, of the 53 indicators defined before the research, 10 were rejected in the limited testing, four in the main testing, and one in the extended testing.

  20. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows a cut-off parameter to be varied to select the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa (κ): 0
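
Combining conflicting tool predictions with a tunable cut-off, as described above, can be sketched as a naive-Bayes-style consensus. The per-tool sensitivities and specificities below are invented placeholders, not the performance figures of Toxtree, Lazar, the OECD Toolbox or the Danish QSAR system.

```python
def consensus_posterior(preds, sens, spec, prior=0.5):
    """P(active | tool predictions), assuming tools err independently."""
    p_pos, p_neg = prior, 1.0 - prior
    for y, se, sp in zip(preds, sens, spec):
        p_pos *= se if y else (1.0 - se)      # likelihood if truly positive
        p_neg *= (1.0 - sp) if y else sp      # likelihood if truly negative
    return p_pos / (p_pos + p_neg)

def classify(preds, sens, spec, cutoff=0.5):
    # Lowering the cut-off favours sensitivity (fewer false negatives),
    # the direction regulators prefer.
    return consensus_posterior(preds, sens, spec) >= cutoff

sens = [0.80, 0.75, 0.70, 0.85]   # invented per-tool sensitivities
spec = [0.70, 0.80, 0.75, 0.65]   # invented per-tool specificities
p = consensus_posterior([1, 1, 0, 1], sens, spec)   # three of four tools say "positive"
```

Three positive votes out of four push the posterior well above 0.5, while an all-negative vote pushes it well below; the cut-off then maps the posterior to a binary call.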

  1. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  2. The MARINA model (Model to Assess River Inputs of Nutrients to seAs)

    NARCIS (Netherlands)

    Strokal, Maryna; Kroeze, Carolien; Wang, Mengru; Bai, Zhaohai; Ma, Lin

    2016-01-01

    Chinese agriculture has been developing fast towards industrial food production systems that discharge nutrient-rich wastewater into rivers. As a result, nutrient export by rivers has been increasing, resulting in coastal water pollution. We developed a Model to Assess River Inputs of Nutrients

  3. Environmental assessment of amine-based carbon capture: Scenario modelling with life cycle assessment (LCA)

    Energy Technology Data Exchange (ETDEWEB)

    Brekke, Andreas; Askham, Cecilia; Modahl, Ingunn Saur; Vold, Bjoern Ivar; Johnsen, Fredrik Moltu

    2012-07-01

This report contains a first attempt at introducing the environmental impacts associated with amines and their derivatives into a life cycle assessment (LCA) of gas power production with carbon capture, and at comparing these with the other environmental impacts associated with the production system. The report aims to identify data gaps and methodological challenges connected both to modelling the toxicity of amines and derivatives and to the weighting of environmental impacts. A scenario-based modelling exercise was performed on a theoretical gas power plant with carbon capture, where emission levels of nitrosamines were varied from zero (gas power without CCS) to a worst-case level (outside the probable range of actual carbon capture facilities). Because of extensive research and development on solvents and emissions from carbon capture facilities in recent years, the data used in the exercise may be outdated and the results should therefore not be taken at face value. The results from the exercise showed that, according to USEtox, emissions of nitrosamines are less important than emissions of formaldehyde with regard to toxicity related to the operation of (i.e. both inputs to and outputs from) a carbon capture facility. If characterisation factors for emissions of metals are included, these outweigh all other toxic emissions in the study. None of the most recent weighting methods in LCA includes characterisation factors for nitrosamines, which are therefore not part of the environmental ranking. These results show that the EDecIDe project has an important role to play in developing LCA methodology useful for assessing the environmental performance of amine-based carbon capture in particular and CCS in general. The EDecIDe project will examine the toxicity models used in LCA in more detail, specifically USEtox. The applicability of the LCA compartment models and site-specificity issues for a Norwegian/Arctic situation will be explored. This applies to the environmental compartments

  4. Models for dose assessments. Modules for various biosphere types

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, U.; Nordlinder, S.; Aggeryd, I. [Studsvik Eco and Safety AB, Nykoeping (Sweden)

    1999-12-01

    The main objective of this study was to provide a basis for illustrations of yearly dose rates to the most exposed individual from hypothetical leakages of radionuclides from a deep bedrock repository for spent nuclear fuel and other radioactive waste. The results of this study will be used in the safety assessment SR 97 and in a study on the design and long-term safety for a repository planned to contain long-lived low and intermediate level waste. The repositories will be designed to isolate the radionuclides for several hundred thousands of years. In the SR 97 study, however, hypothetical scenarios for leakage are postulated. Radionuclides are hence assumed to be transported in the geosphere by groundwater, and probably discharge into the biosphere. This may occur in several types of ecosystems. A number of categories of such ecosystems were identified, and turnover of radionuclides was modelled separately for each ecosystem. Previous studies had focused on generic models for wells, lakes and coastal areas. These models were, in this study, developed further to use site-specific data. In addition, flows of groundwater, containing radionuclides, to agricultural land and peat bogs were considered. All these categories are referred to as modules in this report. The forest ecosystems were not included, due to a general lack of knowledge of biospheric processes in connection with discharge of groundwater in forested areas. Examples of each type of module were run with the assumption of a continuous annual release into the biosphere of 1 Bq for each radionuclide during 10 000 years. The results are presented as ecosystem specific dose conversion factors (EDFs) for each nuclide at the year 10 000, assuming stationary ecosystems and prevailing living conditions and habits. All calculations were performed with uncertainty analyses included. Simplifications and assumptions in the modelling of biospheric processes are discussed. The use of modules may be seen as a step
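
The module calculations above assume a constant annual release of 1 Bq over 10 000 years. A single first-order compartment gives the flavour of the turnover modelling; one compartment and the removal rate below are illustrative simplifications of the ecosystem-specific modules, not the SR 97 parameterisation.

```python
import math

def compartment_activity(release_rate, removal_rate, t):
    """Activity (Bq) at time t for a constant input and first-order removal.

    Solves dA/dt = release_rate - removal_rate * A with A(0) = 0.
    """
    return (release_rate / removal_rate) * (1.0 - math.exp(-removal_rate * t))

# 1 Bq/yr into a compartment with an illustrative 0.05 /yr removal rate:
A = compartment_activity(1.0, 0.05, 10_000.0)
# After 10 000 years the compartment is effectively at steady state,
# A ~= release_rate / removal_rate = 20 Bq.
```

An ecosystem-specific dose conversion factor (EDF) would then be applied to this steady activity to obtain the annual dose to the most exposed individual.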

  5. Models for dose assessments. Modules for various biosphere types

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.; Aggeryd, I.

    1999-12-01

    The main objective of this study was to provide a basis for illustrations of yearly dose rates to the most exposed individual from hypothetical leakages of radionuclides from a deep bedrock repository for spent nuclear fuel and other radioactive waste. The results of this study will be used in the safety assessment SR 97 and in a study on the design and long-term safety for a repository planned to contain long-lived low and intermediate level waste. The repositories will be designed to isolate the radionuclides for several hundred thousands of years. In the SR 97 study, however, hypothetical scenarios for leakage are postulated. Radionuclides are hence assumed to be transported in the geosphere by groundwater, and probably discharge into the biosphere. This may occur in several types of ecosystems. A number of categories of such ecosystems were identified, and turnover of radionuclides was modelled separately for each ecosystem. Previous studies had focused on generic models for wells, lakes and coastal areas. These models were, in this study, developed further to use site-specific data. In addition, flows of groundwater, containing radionuclides, to agricultural land and peat bogs were considered. All these categories are referred to as modules in this report. The forest ecosystems were not included, due to a general lack of knowledge of biospheric processes in connection with discharge of groundwater in forested areas. Examples of each type of module were run with the assumption of a continuous annual release into the biosphere of 1 Bq for each radionuclide during 10 000 years. The results are presented as ecosystem specific dose conversion factors (EDFs) for each nuclide at the year 10 000, assuming stationary ecosystems and prevailing living conditions and habits. All calculations were performed with uncertainty analyses included. Simplifications and assumptions in the modelling of biospheric processes are discussed. The use of modules may be seen as a step

  6. Comparative assessment of condensation models for horizontal tubes

    International Nuclear Information System (INIS)

    Schaffrath, A.; Kruessenberg, A.K.; Lischke, W.; Gocht, U.; Fjodorow, A.

    1999-01-01

The condensation in horizontal tubes plays an important role e.g. for the determination of the operation mode of horizontal steam generators of VVER reactors or passive safety systems for the next generation of nuclear power plants. Two different approaches (HOTKON and KONWAR) for modeling this process have been developed by Forschungszentrum Juelich (FZJ) and the University for Applied Sciences Zittau/Goerlitz (HTWS) and implemented into the 1D thermohydraulic code ATHLET, which is developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH for the analysis of anticipated and abnormal transients in light water reactors. Although the improvements of the condensation models were developed for different applications (VVER steam generators, emergency condenser of the SWR1000) with strongly different operating conditions (e.g. the temperature difference over the tube wall in HORUS is up to 30 K and in NOKO up to 250 K; the heat flux density in HORUS is up to 40 kW/m² and in NOKO up to 1 GW/m²), both models are now compared and assessed by Forschungszentrum Rossendorf FZR e.V. Therefore, post-test calculations of selected HORUS experiments were performed with ATHLET/KONWAR and compared to existing ATHLET and ATHLET/HOTKON calculations of HTWS. It can be seen that the calculations with the extension KONWAR as well as HOTKON significantly improve the agreement between computational and experimental data. (orig.)

  7. A hierarchical network modeling method for railway tunnels safety assessment

    Science.gov (United States)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Liu, Xumin

    2017-02-01

Using network theory to model risk-related knowledge on accidents is regarded as potentially very helpful in risk management. A large amount of defect detection data for railway tunnels is collected every autumn in China, and it is extremely important to discover the regularities hidden in this database. In this paper, based on network theories and using data mining techniques, a new method is proposed for mining risk-related regularities to support risk management in railway tunnel projects. A hierarchical network (HN) model which takes into account the tunnel structures, tunnel defects, potential failures and accidents is established. An improved Apriori algorithm is designed to rapidly and effectively mine correlations between tunnel structures and tunnel defects. An algorithm is then presented to mine the risk-related regularities table (RRT) from the frequent patterns. Finally, a safety assessment method is proposed that considers actual defects and the possible risks of defects obtained from the RRT. This method can not only generate quantitative risk results but also reveal the key defects and critical risks of defects. This paper further develops accident-causation network modeling methods and can provide guidance for specific maintenance measures.
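
The improved Apriori step above builds on plain Apriori frequent-itemset mining, which can be sketched as follows. The inspection "transactions" pairing structure types with defect types are invented examples, not Chinese tunnel data.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Levelwise Apriori: return {itemset: support} for all frequent itemsets."""
    n = len(transactions)
    freq = {}
    candidates = list({frozenset([i]) for t in transactions for i in t})
    while candidates:
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        level = {c: k / n for c, k in counts.items() if k / n >= min_support}
        freq.update(level)
        # Apriori pruning: only unions of frequent sets can be frequent.
        keys = list(level)
        candidates = list({a | b for a, b in combinations(keys, 2)
                           if len(a | b) == len(a) + 1})
    return freq

# Invented inspection records (structure type + observed defect types):
tx = [frozenset(t) for t in (
    {"lining", "crack"},
    {"lining", "crack", "leak"},
    {"portal", "leak"},
    {"lining", "crack"},
)]
freq = frequent_itemsets(tx, min_support=0.5)
```

Frequent pairs such as {lining, crack} are the raw material from which association rules (and, in the paper, the RRT) are derived.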

  8. Assessing Mental Models of Emergencies Through Two Knowledge Elicitation Tasks.

    Science.gov (United States)

    Whitmer, Daphne E; Sims, Valerie K; Torres, Michael E

    2017-05-01

The goals of this study were to assess the risk identification aspect of mental models using standard elicitation methods and how university campus alerts were related to these mental models. People fail to follow protective action recommendations in emergency warnings. Past research has yet to examine cognitive processes that influence emergency decision-making. Study 1 examined 2 years of emergency alerts distributed by a large southeastern university. In Study 2, participants listed emergencies in a thought-listing task. Study 3 measured participants' time to decide if a situation was an emergency. The university distributed the most alerts about an armed person, theft, and fire. In Study 2, participants most frequently listed fire, car accident, heart attack, and theft. In Study 3, participants quickly decided a bomb, murder, fire, tornado, and rape were emergencies. They most slowly decided that a suspicious package and identity theft were emergencies. Recent interaction with warnings was only somewhat related to participants' mental models of emergencies. Risk identification precedes decision-making and applying protective actions. Examining these characteristics of people's mental representations of emergencies is fundamental to further understand why some emergency warnings go ignored. Someone must believe a situation is serious enough to categorize it as an emergency before taking the protective action recommendations in an emergency warning. Present-day research must continue to examine the problem of people ignoring warning communication, as there are important cognitive factors that had not been explored until the present research.

  9. Model-driven Privacy Assessment in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Neureiter, Christian [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-09

In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications, and therefore privacy impacts have to be taken into account. For an effective smart grid, well-integrated solutions are crucial, and to achieve a high degree of customer acceptance, privacy should be considered already at system design time. To assist system engineers in the early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of the qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.

  10. Toxicological risk assessment of complex mixtures through the Wtox model

    Directory of Open Access Journals (Sweden)

    William Gerson Matias

    2015-01-01

Full Text Available Mathematical models are important tools for environmental management and risk assessment. Predictions about the toxicity of chemical mixtures must be enhanced due to the complexity of effects that can be caused to living species. In this work, environmental risk was assessed addressing the need to study the relationship between the organism and xenobiotics. Therefore, five toxicological endpoints were applied through the WTox Model, and with this methodology we obtained the risk classification of potentially toxic substances. Acute and chronic toxicity, cytotoxicity and genotoxicity were observed in the organisms Daphnia magna, Vibrio fischeri and Oreochromis niloticus. A case study was conducted with solid wastes from the textile, metal-mechanic, and pulp and paper industries. The results have shown that several industrial wastes induced mortality, reproductive effects, micronucleus formation and increases in the rate of lipid peroxidation and DNA methylation in the organisms tested. These results, analyzed together through the WTox Model, allowed the classification of the environmental risk of industrial wastes. The evaluation showed that the toxicological environmental risk of the samples analyzed can be classified as significant or critical.

  11. Distributional aspects of emissions in climate change integrated assessment models

    International Nuclear Information System (INIS)

    Cantore, Nicola

    2011-01-01

The recent failure of the Copenhagen negotiations shows that concrete actions are needed to create the conditions for consensus over global emission reduction policies. A wide coalition of countries in international climate change agreements could be facilitated by rich and poor countries' perceived fairness of the abatement sharing at the international level. In this paper I use two popular climate change integrated assessment models to investigate the path of future inequality in the emissions distribution and to decompose its components and sources. Results prove to be consistent with previous empirical studies, are robust to model comparison, and show that gaps in GDP across world regions will still play a crucial role in explaining different countries' contributions to global warming. - Research highlights: → I implement a scenario analysis with two global climate change models. → I analyse inequality in the distribution of emissions. → I decompose emissions inequality components. → I find that GDP per capita is the main Kaya identity source of emissions inequality. → Current rich countries will mostly remain responsible for emissions inequality.
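
The Kaya identity behind the decomposition highlighted above factors emissions as population x GDP per capita x energy intensity x carbon intensity. The toy numbers below are invented and only show how an income gap alone translates one-for-one into an emissions gap.

```python
def kaya_emissions(pop, gdp_pc, energy_intensity, carbon_intensity):
    """Kaya identity: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)."""
    return pop * gdp_pc * energy_intensity * carbon_intensity

# Two invented regions identical except for GDP per capita:
rich = kaya_emissions(1e8, 40_000.0, 5e-3, 0.06)
poor = kaya_emissions(1e8, 4_000.0, 5e-3, 0.06)
# A tenfold income gap yields a tenfold emissions gap when the
# intensity terms are equal, which is the paper's point about GDP
# being the main Kaya source of emissions inequality.
```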

  12. Precast concrete unit assessment through GPR survey and FDTD modelling

    Science.gov (United States)

    Campo, Davide

    2017-04-01

Precast concrete elements are widely used in United Kingdom house building, offering ease of assembly and added value such as structural integrity and sound and thermal insulation; the most common concrete components include walls, beams, floors, panels, lintels, stairs, etc. Failure to follow the manufacturer's instructions during assembly, however, may induce cracking and short- or long-term loss of bearing capacity. GPR is a well-established non-destructive technique employed in the assessment of structural elements because of its real-time imaging, speed of data collection and ability to discriminate fine structural details. In this work, GPR has been used to investigate two different precast elements: precast reinforced concrete planks constituting the roof slab of a school, and precast wood-cement blocks with pre-fitted insulation material used to build a perimeter wall of a private building. Visible cracks affected both constructions. For the assessment surveys, a GSSI 2.0 GHz GPR antenna was used because of the high resolution required and the small size of the antenna case (155 by 90 by 105 mm), enabling scanning up to 45 mm from any obstruction. Finite Difference Time Domain (FDTD) numerical modelling was also performed to build a scenario of the expected GPR signal response, both for preliminary real-time interpretation and to help resolve uncertainties due to complex reflection patterns: simulated radargrams were built using Reflex software v. 8.2, reproducing the GPR pulse used for the surveys in terms of wavelet, nominal frequency, sampling frequency and time window. Model geometries were derived from the design drawings available for both the planks and the blocks; the electromagnetic properties of the materials (concrete, reinforcing bars, air-filled voids, insulation and wood-cement) were inferred both from values reported in the literature and from a preliminary interpretation of radargrams where internal layer interfaces were clearly recognizable and
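
FDTD forward modelling of the kind described above advances staggered electric and magnetic fields in time. The minimal free-space 1D sketch below (Gaussian soft source, Courant number 0.5) only illustrates the update scheme; it is not the Reflex v. 8.2 configuration or the material model used in the study.

```python
import math

n = 200                      # grid cells
ez = [0.0] * n               # electric field on integer cells
hy = [0.0] * n               # magnetic field on staggered half-cells

for step in range(150):
    # Update E from the curl of H (Courant number 0.5 folded into the 0.5):
    for k in range(1, n):
        ez[k] += 0.5 * (hy[k - 1] - hy[k])
    # Soft Gaussian source injected at the grid centre:
    ez[100] += math.exp(-((step - 40) / 12.0) ** 2)
    # Update H from the curl of E:
    for k in range(n - 1):
        hy[k] += 0.5 * (ez[k] - ez[k + 1])
```

A real GPR simulation adds material permittivity and conductivity per cell (concrete, rebar, voids) and absorbing boundaries; the leapfrog structure stays the same.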

  13. Model for assessing alpha doses for a Reference Japanese Man

    International Nuclear Information System (INIS)

    Kawamura, Hisao

    1993-01-01

In view of the development of the nuclear fuel cycle in this country, it is urgently important to establish dose assessment models and related human and environmental parameters for long-lived radionuclides. In the current program, intake and body content of actinides (Pu, Th, U) and related alpha-emitting nuclides (Ra and daughters) have been studied, as well as physiological aspects of Reference Japanese Man as the basic model of man for dosimetry. The ultimate object is to examine the applicability to members of the public of the existing models, particularly those recommended by the ICRP for workers. The result of an interlaboratory intercomparison of 239Pu + 240Pu determination, including our result, was published. Alpha-spectrometric determinations of 226Ra in bone yielded a representative bone concentration level in Tokyo and a Ra-Ca O.R. (bone-diet) which appear consistent with the literature values for Sapporo and Kyoto obtained by Ohno using a Rn emanation method. Specific effective energies for alpha radiation from 226Ra and daughters were calculated using the ICRP dosimetric model for bone, incorporating the masses of source and target organs of Reference Japanese Man. Reference Japanese data covering adults, adolescents, children and infants of both sexes were extensively and intensively studied by Tanaka as part of the activities of the ICRP Task Group on Reference Man Revision. Normal data for the physical measurements, the mass and dimensions of internal organs and body surfaces, and some of the body composition were analysed in view of the nutritional data of the Japanese population. Some of the above works are to be continued. (author)
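
A specific effective energy of the kind computed above is, at its simplest, the energy deposited per nuclear transformation, weighted for radiation quality and divided by the target mass. The sketch below assumes full local absorption of the alpha energy and an illustrative 1 kg target mass; the ICRP radiation weighting factor of 20 for alpha particles is standard, but the real calculation distributes energy between source and target regions of the bone model.

```python
MEV_TO_J = 1.602e-13   # joules per MeV

def see_sv_per_transformation(alpha_energy_mev, target_mass_kg, w_r=20.0):
    """Quality-weighted energy per decay, fully absorbed, per unit target mass."""
    return alpha_energy_mev * MEV_TO_J * w_r / target_mass_kg

# The main 4.78 MeV alpha of 226Ra absorbed in an illustrative 1 kg target:
s = see_sv_per_transformation(4.78, 1.0)
# s is of order 1.5e-11 Sv per transformation under these assumptions.
```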

  14. Status of thermalhydraulic modelling and assessment: Open issues

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F. [CEA, Grenoble (France)

    1997-07-01

    This paper presents the status of the physical modelling in present codes used for Nuclear Reactor Thermalhydraulics (TRAC, RELAP 5, CATHARE, ATHLET, ...) and attempts to list the unresolved or partially resolved issues. First, the capabilities and limitations of present codes are presented. They are mainly known from a synthesis of the assessment calculations performed for both separate effect tests and integral effect tests. It is also instructive to list all the assumptions and simplifications which were made in establishing the system of equations and the constitutive relations. Many of the present limitations are associated with physical situations where these assumptions are not valid. Then, recommendations are proposed to extend the capabilities of these codes.

  15. Model quality assessment using distance constraints from alignments

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Karplus, Kevin

    2008-01-01

    that model which is closest to the true structure. In this article, we present a new approach for addressing the MQA problem. It is based on distance constraints extracted from alignments to templates of known structure, and is implemented in the Undertaker program for protein structure prediction. One novel ... feature is that we extract noncontact constraints as well as contact constraints. We describe how the distance constraint extraction is done and we show how the constraints can be used to address the MQA problem. We have compared our method on CASP7 targets and the results show that our method is at least comparable ... with the best MQA methods that were assessed at CASP7. We also propose a new evaluation measure, Kendall's tau, which is more interpretable than conventional measures used for evaluating MQA methods (Pearson's r and Spearman's rho). We show clear examples where Kendall's tau agrees much more with our intuition ...
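The appeal of Kendall's tau as an evaluation measure is that it counts pairwise ranking agreements directly: (concordant − discordant) pairs over all pairs. A minimal sketch, using made-up quality scores rather than CASP7 data:

```python
from itertools import combinations

def kendall_tau(xs, ys):
    """Kendall's tau: (concordant - discordant) pairs over all pairs.
    Interpretable as how often two items are ranked in the same order."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(xs) * (len(xs) - 1) // 2
    return (concordant - discordant) / n_pairs

# Hypothetical example: true model quality scores vs. an MQA predictor's scores
true_quality = [0.82, 0.78, 0.75, 0.60, 0.55, 0.40, 0.35, 0.20]
predicted    = [0.90, 0.70, 0.72, 0.65, 0.50, 0.45, 0.10, 0.15]
print(round(kendall_tau(true_quality, predicted), 3))  # 2 of 28 pairs discordant -> 0.857
```

Unlike Pearson's r, the value does not depend on the scale of the scores, only on their ordering.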

  16. Status of thermalhydraulic modelling and assessment: Open issues

    International Nuclear Information System (INIS)

    Bestion, D.; Barre, F.

    1997-01-01

    This paper presents the status of the physical modelling in present codes used for Nuclear Reactor Thermalhydraulics (TRAC, RELAP 5, CATHARE, ATHLET, ...) and attempts to list the unresolved or partially resolved issues. First, the capabilities and limitations of present codes are presented. They are mainly known from a synthesis of the assessment calculations performed for both separate effect tests and integral effect tests. It is also instructive to list all the assumptions and simplifications which were made in establishing the system of equations and the constitutive relations. Many of the present limitations are associated with physical situations where these assumptions are not valid. Then, recommendations are proposed to extend the capabilities of these codes

  17. Modelling Global Land Use and Social Implications in the Sustainability Assessment of Biofuels

    DEFF Research Database (Denmark)

    Kløverpris, Jesper; Wenzel, Henrik

    2007-01-01

    Cross-fertilising environmental, economic and geographical modelling to improve the environmental assessment of biofuels.

  18. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  19. Development of a model to assess orthostatic responses

    Science.gov (United States)

    Rubin, Marilyn

    1993-01-01

    A major change for crewmembers during weightlessness in microgravity is the redistribution of body fluids from the legs into the abdomen, thorax, and head. The fluids continue to be sequestered in these areas throughout the flight. Upon reentry into gravity at landing, these same body fluids are displaced again to their normal locations, not without hazardous incidence to the crewmembers: upon landing, crewmembers are subject to orthostasis, that is, blood flowing into the legs reduces the blood supply to the brain and may cause the crewmember to faint. The purpose of this study was to develop a model for testing the orthostatic responses of the blood pressure regulating mechanisms of the cardiovascular system when challenged to maintain blood pressure to the brain. To accomplish this, subjects' responses were assessed as they proceeded from the supine position through progressive head-up tilt positions at 30 deg, 60 deg, and 90 deg angles. A convenience sample consisted of 21 subjects, females (N=11) and males (N=10), selected from a list of potential subjects available through the NASA subject screening office. The methodology included all non-invasive measurements of blood pressure, heart rate, echocardiograms, cardiac output, cardiac stroke volume, fluid shifts in the thorax, ventricular ejection and velocity times, and skin blood perfusion. A Fisher statistical analysis was performed on all data with the significance level at .05. Significant differences were demonstrated in many instances of changes of posture for all variables. Based on the significance of the findings of this study, this model for assessing orthostatic responses provides an adequate challenge to the blood pressure regulatory systems. While individuals may use different adaptations to incremental changes in gravity, the subjects, in aggregate, demonstrated significant adaptive cardiovascular changes to the orthostatic challenges presented to them.

  20. Low-level radioactive waste performance assessments: Source term modeling

    International Nuclear Information System (INIS)

    Icenhour, A.S.; Godbee, H.W.; Miller, L.F.

    1995-01-01

    Low-level radioactive wastes (LLW) generated by government and commercial operations need to be isolated from the environment for at least 300 to 500 yr. Most existing sites for the storage or disposal of LLW employ the shallow-land burial approach. However, the U.S. Department of Energy currently emphasizes the use of engineered systems (e.g., packaging, concrete and metal barriers, and water collection systems). Future commercial LLW disposal sites may include such systems to mitigate radionuclide transport through the biosphere. Performance assessments must be conducted for LLW disposal facilities. These studies include comprehensive evaluations of radionuclide migration from the waste package, through the vadose zone, and within the water table. Atmospheric transport mechanisms are also studied. Figure 1 illustrates the performance assessment process. Estimates of the release of radionuclides from the waste packages (i.e., source terms) are used for subsequent hydrogeologic calculations required by a performance assessment. Computer models are typically used to describe the complex interactions of water with LLW and to determine the transport of radionuclides. Several commonly used computer programs for evaluating source terms include GWSCREEN, BLT (Breach-Leach-Transport), DUST (Disposal Unit Source Term), BARRIER (Ref. 5), as well as SOURCE1 and SOURCE2 (which are used in this study). The SOURCE1 and SOURCE2 codes were prepared by Rogers and Associates Engineering Corporation for the Oak Ridge National Laboratory (ORNL). SOURCE1 is designed for tumulus-type facilities, and SOURCE2 is tailored for silo, well-in-silo, and trench-type disposal facilities. This paper focuses on the source term for ORNL disposal facilities, and it describes improved computational methods for determining radionuclide transport from waste packages

  1. Assessing Mediational Models: Testing and Interval Estimation for Indirect Effects.

    Science.gov (United States)

    Biesanz, Jeremy C; Falk, Carl F; Savalei, Victoria

    2010-08-06

    Theoretical models specifying indirect or mediated effects are common in the social sciences. An indirect effect exists when an independent variable's influence on the dependent variable is mediated through an intervening variable. Classic approaches to assessing such mediational hypotheses (Baron & Kenny, 1986; Sobel, 1982) have in recent years been supplemented by computationally intensive methods such as bootstrapping, distribution-of-the-product methods, and hierarchical Bayesian Markov chain Monte Carlo (MCMC) methods. These different approaches for assessing mediation are illustrated using data from Dunn, Biesanz, Human, and Finn (2007). However, little is known about how these methods perform relative to each other, particularly in more challenging situations, such as with data that are incomplete and/or nonnormal. This article presents an extensive Monte Carlo simulation evaluating a host of approaches for assessing mediation. We examine Type I error rates, power, and coverage. We study normal and nonnormal data as well as complete and incomplete data. In addition, we adapt a method, recently proposed in the statistical literature, that does not rely on confidence intervals (CIs) to test the null hypothesis of no indirect effect. The results suggest that the new inferential method, the partial posterior p value, slightly outperforms existing ones in terms of maintaining Type I error rates while maximizing power, especially with incomplete data. Among confidence interval approaches, the bias-corrected accelerated (BCa) bootstrapping approach often has inflated Type I error rates and inconsistent coverage and is not recommended; in contrast, the bootstrapped percentile confidence interval and the hierarchical Bayesian MCMC method perform best overall, maintaining Type I error rates, exhibiting reasonable power, and producing stable and accurate coverage rates.
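The bootstrapped percentile CI favoured in this record can be sketched in a few lines: estimate the indirect effect as the product of the a-path and b-path coefficients, then resample cases to get a percentile interval. This is an illustrative sketch with simulated data (true indirect effect 0.5 × 0.4 = 0.2), not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated mediation data: X -> M -> Y
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # a-path = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # b-path = 0.4, direct effect = 0.2

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate a*b from two OLS fits."""
    a = np.polyfit(x, m, 1)[0]  # slope of M on X
    # partial slope of Y on M, controlling for X
    b = np.linalg.lstsq(np.column_stack([m, x, np.ones_like(x)]), y, rcond=None)[0][0]
    return a * b

# Percentile bootstrap CI for the indirect effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```

The BCa variant adjusts the percentile cut points for bias and skew; the plain percentile interval above is the one the simulation study found to maintain Type I error rates well.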

  2. Model analysis: Representing and assessing the dynamics of student learning

    Directory of Open Access Journals (Sweden)

    Lei Bao

    2006-02-01

    Decades of education research have shown that students can simultaneously possess alternate knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice tests such as the Force Concept Inventory and the Force-Motion Concept Evaluation provide instructors with tools to probe their students' conceptual knowledge of physics. However, many existing quantitative analysis methods focus on the binary question of whether a student answers a question correctly or not. This greatly limits the capacity of standardized multiple-choice tests to assess students' alternative knowledge. In addition, the context dependence issue, which suggests that a student may apply the correct knowledge in some situations and revert to using alternative types of knowledge in others, is often treated as random noise in current analyses. In this paper, we present model analysis, which applies qualitative research to establish a quantitative representation framework. With this method, students' alternative knowledge and the probabilities that students use such knowledge in a range of equivalent contexts can be quantitatively assessed. This provides a way to analyze research-based multiple-choice questions, which can generate much richer information than is available from score-based analysis.
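The core of model analysis, representing each student's use of alternative knowledge as a model-state vector and aggregating the vectors into a class density matrix, can be sketched as follows. The response counts are made up for illustration, and the construction follows the general model-analysis recipe rather than any specific published dataset:

```python
import numpy as np

# Each row: how often one student used each of 3 "models" (1 correct,
# 2 alternative) across 10 equivalent questions.
counts = np.array([
    [8, 2, 0],
    [5, 4, 1],
    [3, 6, 1],
    [9, 1, 0],
])

# Model-state vector per student: square roots of usage probabilities,
# so each vector has unit norm.
u = np.sqrt(counts / counts.sum(axis=1, keepdims=True))

# Class density matrix: average of the outer products u_k u_k^T.
D = (u[:, :, None] * u[:, None, :]).mean(axis=0)

# Its largest eigenvalue indicates how consistently the class uses a
# single dominant model (1.0 = everyone uses the same model mix).
eigvals = np.linalg.eigvalsh(D)
print(f"trace = {np.trace(D):.2f}, largest eigenvalue = {eigvals[-1]:.2f}")
```

The off-diagonal elements of D capture the mixing of models that score-based analysis discards as noise.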

  3. Strategic Assessment Model and Its Application:a Case Study

    Institute of Scientific and Technical Information of China (English)

    ZHU Xiu-wen; CAO Meng-xia; ZHU Ning; ZUO Ming-jian

    2001-01-01

    Accurate and effective assessment of the strategic alternatives of an organization directly affects the decision-making and execution of its development strategy. In evaluating strategic alternatives, relevant elements from both the internal and external environments of an organization must be considered. In this paper we use a strategic assessment model (SAM) to evaluate the strategic alternatives of an air-conditioning company. Strategic objectives and alternatives of the company are developed through analysis of the competitive environment, key competitors and internal conditions. The environment factors are classified into internal, task, and general opportunities and threats. Analytical hierarchy process, subjective probabilities, the entropy concept, and utility theory are used to enhance the decision-maker's ability to evaluate strategic alternatives. The evaluation results show that the most effective strategic alternative for the company is to reduce the types of products, concentrate its effort on producing window-type and cupboard-type air-conditioners, enlarge the production scale, and pre-empt the market. The company has made great progress by implementing this alternative. We conclude that SAM is an appropriate tool for evaluating strategic alternatives.
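The analytic hierarchy process step mentioned above derives priority weights from a reciprocal pairwise-comparison matrix via its principal eigenvector. A minimal sketch with illustrative judgments (the matrix values are assumptions, not the company's actual comparisons):

```python
import numpy as np

# Reciprocal pairwise-comparison matrix for three criteria:
# A[i, j] is how strongly criterion i is preferred to criterion j
# (Saaty's 1-9 scale; A[j, i] = 1 / A[i, j]).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights: principal (largest-eigenvalue) eigenvector, normalized.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()
print(np.round(w, 3))
```

Consistency of the judgments is usually checked afterwards via the consistency ratio derived from the principal eigenvalue.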

  4. AUTHENTIC SELF-ASSESSMENT MODEL FOR DEVELOPING EMPLOYABILITY SKILLS OF STUDENTS IN HIGHER VOCATIONAL EDUCATION

    Directory of Open Access Journals (Sweden)

    I Made Suarta

    2015-06-01

    AUTHENTIC SELF-ASSESSMENT MODEL FOR DEVELOPING EMPLOYABILITY SKILLS OF STUDENTS IN HIGHER VOCATIONAL EDUCATION Abstract The purpose of this research is to develop assessment tools to evaluate achievement of employability skills which are integrated in the learning of database applications. The assessment model developed is a combination of self-assessment and authentic assessment, proposed as a model of authentic self-assessment. The steps of developing the authentic self-assessment model include: identifying the standards, selecting an authentic task, identifying the criteria for the task, and creating the rubric. The resulting assessment tools include: (1) a problem solving skills assessment model, (2) a self-management skills assessment model, and (3) a database applications competence assessment model. This model can be used to assess cognitive, affective, and psychomotor achievement. The results indicate that achievement of problem solving and self-management ability was in the good category, and competence in designing conceptual and logical databases was in the high category. This model also meets the basic principles of assessment, i.e. validity, reliability, focus on competencies, comprehensiveness, objectivity, and the principle of educating. Keywords: authentic assessment, self-assessment, problem solving skills, self-management skills, vocational education

  5. Modelling future impacts of air pollution using the multi-scale UK Integrated Assessment Model (UKIAM).

    Science.gov (United States)

    Oxley, Tim; Dore, Anthony J; ApSimon, Helen; Hall, Jane; Kryza, Maciej

    2013-11-01

    Integrated assessment modelling has evolved to support policy development in relation to air pollutants and greenhouse gases by providing integrated simulation tools able to produce quick and realistic representations of emission scenarios and their environmental impacts without the need to re-run complex atmospheric dispersion models. The UK Integrated Assessment Model (UKIAM) has been developed to investigate strategies for reducing UK emissions by bringing together information on projected UK emissions of SO2, NOx, NH3, PM10 and PM2.5, atmospheric dispersion, criteria for protection of ecosystems, urban air quality and human health, and data on potential abatement measures to reduce emissions, which may subsequently be linked to associated analyses of costs and benefits. We describe the multi-scale model structure ranging from continental to roadside, UK emission sources, atmospheric dispersion of emissions, implementation of abatement measures, integration with European-scale modelling, and environmental impacts. The model generates outputs from a national perspective which are used to evaluate alternative strategies in relation to emissions, deposition patterns, air quality metrics and ecosystem critical load exceedance. We present a selection of scenarios in relation to the 2020 Business-As-Usual projections and identify potential further reductions beyond those currently being planned.

  6. A new model of Ishikawa diagram for quality assessment

    Science.gov (United States)

    Liliana, Luca

    2016-11-01

    The paper presents the results of a study concerning the use of the Ishikawa diagram in analyzing the causes that determine errors in the evaluation of parts precision in the machine construction field. The studied problem was "errors in the evaluation of parts precision", and this constitutes the head of the Ishikawa diagram skeleton. All the possible main and secondary causes that could generate the studied problem were identified. The best-known Ishikawa models are 4M, 5M and 6M, the initials standing for: materials, methods, man, machines, mother nature, measurement. The paper shows the potential causes of the studied problem, which were first grouped in three categories, as follows: causes that lead to errors in assessing dimensional accuracy, causes that determine errors in the evaluation of shape and position abnormalities, and causes of errors in roughness evaluation. We took into account the main components of parts precision in the machine construction field. For each of the three categories, potential secondary causes were distributed among the M groups (man, methods, machines, materials, environment). We opted for a new model of Ishikawa diagram, resulting from the composition of three fish skeletons corresponding to the main categories of parts accuracy.

  7. A Remote Sensing-Derived Corn Yield Assessment Model

    Science.gov (United States)

    Shrestha, Ranjay Man

    be further associated with the actual yield. Utilizing satellite remote sensing products, such as daily NDVI derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 250 m pixel size, crop yield estimation can be performed at a very fine spatial resolution. This study therefore examined the potential of these daily NDVI products for agricultural studies and crop yield assessment. A regression-based approach was proposed to estimate annual corn yield from changes in the MODIS daily NDVI time series. The relationship between daily NDVI and corn yield was well defined and established, and as changes in corn phenology and yield are directly reflected by changes in NDVI within the growing season, these two entities were combined to develop a relational model. The model was trained using 15 years (2000-2014) of historical NDVI and county-level corn yield data for four major corn producing states: Kansas, Nebraska, Iowa, and Indiana, representing four climatic regions (South, West North Central, East North Central, and Central, respectively) within the U.S. Corn Belt. The model's goodness of fit was high (R² > 0.81). Using 2015 yield data for validation, an average accuracy of 92% demonstrated the performance of the model in estimating corn yield at the county level. Besides providing county-level corn yield estimates, the derived model was also accurate enough to estimate yield at finer spatial resolution (field level). The model's assessment accuracy was evaluated using randomly selected field-level corn yields within the study area for 2014, 2015, and 2016. A total of over 120 plot-level corn yield records were used for validation, and the overall average accuracy was 87%, which statistically justified the model's capability to estimate plot-level corn yield.
    Additionally, the proposed model was applied to impact estimation by examining the changes in corn yield
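The regression idea in this record, relating a seasonal summary of the NDVI time series to reported yield, can be sketched with ordinary least squares. The training pairs below are made-up illustrative values, not the study's MODIS or USDA data:

```python
import numpy as np

# Hypothetical county-level training data: seasonal NDVI integral (summed
# daily NDVI over the growing season) vs. reported corn yield (bu/acre).
ndvi_integral = np.array([78.0, 85.0, 92.0, 101.0, 110.0, 118.0])
yield_bu_acre = np.array([120.0, 135.0, 150.0, 168.0, 182.0, 199.0])

# Fit the regression-based yield model: yield = b0 + b1 * NDVI_integral
b1, b0 = np.polyfit(ndvi_integral, yield_bu_acre, 1)

# Goodness of fit (coefficient of determination)
pred = b0 + b1 * ndvi_integral
ss_res = np.sum((yield_bu_acre - pred) ** 2)
ss_tot = np.sum((yield_bu_acre - yield_bu_acre.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"yield = {b0:.1f} + {b1:.2f} * NDVI_integral, R^2 = {r2:.3f}")
```

Once fitted at county level, the same coefficients can be applied to per-pixel NDVI integrals to produce field-level estimates, which is the downscaling step the record describes.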

  8. A Model for Assessing the Gender Aspect in Economic Policy

    Directory of Open Access Journals (Sweden)

    Ona Rakauskienė

    2015-06-01

    The purpose of the research is to develop a conceptual model for assessing the impact of the gender aspect on economic policy at the macro- and microeconomic levels. The research methodology is based on analysing scientific approaches to the gender aspect in economics and gender-responsive budgeting, as well as determining the impact of the gender aspect on GDP, foreign trade, the state budget and the labour market. The major findings are, first, the main idea of a conceptual model proposing that a socio-economic picture of society can be accepted as complete only when, alongside the public and private sectors, it includes the care/reproductive sector, which is dominated by women and creates added value in the form of educated human resources; second, that macroeconomics is not neutral in terms of gender equality. Gender asymmetry is manifested not only at the microeconomic level (the labour market and business) but also at the macroeconomic level (GDP, the state budget and foreign trade), which has a negative impact on economic growth and state budget revenues. In this regard, economic decisions must be made according to the principles of gender equality in order to achieve gender equality in economics, and the gender aspect has to be implemented at the macroeconomic level as well.

  9. Modeling marine surface microplastic transport to assess optimal removal locations

    Science.gov (United States)

    Sherman, Peter; van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal to assess the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the impact on ecosystems, using plankton growth as a proxy. The simulations show that the optimal removal locations are primarily located off the coast of China and in the Indonesian Archipelago for both scenarios. Our estimates show that 31% of the modeled microplastic mass can be removed by 2025 using 29 plastic collectors operating at a 45% capture efficiency from these locations, compared to only 17% when the 29 plastic collectors are moored in the North Pacific garbage patch, between Hawaii and California. The overlap of ocean surface microplastics and phytoplankton growth can be reduced by 46% at our proposed locations, while sinks in the North Pacific can only reduce the overlap by 14%. These results are an indication that oceanic plastic removal might be more effective in removing a greater microplastic mass and in reducing potential harm to marine life when closer to shore than inside the plastic accumulation zones in the centers of the gyres.

  10. Revenue Risk Modelling and Assessment on BOT Highway Project

    Science.gov (United States)

    Novianti, T.; Setyawan, H. Y.

    2018-01-01

    An infrastructure project delivered through a public-private partnership under a BOT (Build-Operate-Transfer) arrangement, such as a highway, is risky. Assessment of risk factors is therefore essential, as the project has a concession period and is influenced by macroeconomic factors. In this study, pre-construction risks of a highway were examined by using a Delphi method to create a space for offline expert discussions, a fault tree analysis to map the intuition of experts and to create a model from the underlying risk events, and fuzzy logic to interpret the linguistic data of the risk models. The losses of revenue due to tariff risk, traffic volume risk, force majeure, and non-revenue events were then measured. The results showed that the loss of revenue caused by the tariff risk was 10.5% of the normal total revenue. The loss of revenue caused by the risk of traffic volume was 21.0% of total revenue. The loss of revenue caused by force majeure was 12.2% of normal income. The loss of income caused by non-revenue events was 6.9% of normal revenue. It was also found that traffic volume was the major risk of a highway project because it relates to customer preferences.
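The fuzzy-logic step, mapping linguistic risk ratings onto numeric grades, is commonly done with triangular membership functions and centroid defuzzification. The term definitions and membership degrees below are assumptions for illustration, not the study's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Assumed linguistic terms for "revenue loss" on a 0..1 scale
terms = {
    "low":    lambda x: tri(x, -0.01, 0.0, 0.4),   # peak at 0
    "medium": lambda x: tri(x, 0.2, 0.5, 0.8),
    "high":   lambda x: tri(x, 0.6, 1.0, 1.01),    # peak at 1
}

def defuzzify(memberships, step=0.001):
    """Centroid of the aggregated (clipped, max-combined) membership on [0, 1]."""
    xs = [i * step for i in range(int(1 / step) + 1)]
    agg = [max(min(mu, terms[t](x)) for t, mu in memberships.items()) for x in xs]
    total = sum(agg)
    return sum(x * a for x, a in zip(xs, agg)) / total if total else 0.0

# Expert judgment expressed as degrees of membership in each term
crisp = defuzzify({"low": 0.2, "medium": 0.7, "high": 0.1})
print(round(crisp, 3))
```

The crisp output can then be read back against the same terms to report a linguistic risk level alongside the numeric loss estimate.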

  11. The modelling and assessment of whale-watching impacts

    Science.gov (United States)

    New, Leslie; Hall, Ailsa J.; Harcourt, Robert; Kaufman, Greg; Parsons, E.C.M.; Pearson, Heidi C.; Cosentino, A. Mel; Schick, Robert S

    2015-01-01

    In recent years there has been significant interest in modelling cumulative effects and the population consequences of individual changes in cetacean behaviour and physiology due to disturbance. One potential source of disturbance that has garnered particular interest is whale-watching. Though perceived as ‘green’ or eco-friendly tourism, there is evidence that whale-watching can result in statistically significant and biologically meaningful changes in cetacean behaviour, raising the question of whether whale-watching is in fact a long-term sustainable activity. However, an assessment of the impacts of whale-watching on cetaceans requires an understanding of the potential behavioural and physiological effects, data to effectively address the question, and suitable modelling techniques. Here, we review the current state of knowledge on the viability of long-term whale-watching, as well as logistical limitations and potential opportunities. We conclude that an integrated, coordinated approach will be needed to further understanding of the possible effects of whale-watching on cetaceans.

  12. Mentalized affectivity: A new model and assessment of emotion regulation.

    Directory of Open Access Journals (Sweden)

    David M Greenberg

    Here we introduce a new assessment of emotion regulation called the Mentalized Affectivity Scale (MAS). A large online adult sample (N = 2,840) completed the 60-item MAS along with a battery of psychological measures. Results revealed a robust three-component structure underlying mentalized affectivity, which we labeled: Identifying emotions (the ability to identify emotions and to reflect on the factors that influence them); Processing emotions (the ability to modulate and distinguish complex emotions); and Expressing emotions (the tendency to express emotions outwardly or inwardly). Hierarchical modeling suggested that Processing emotions delineates from Identifying them, and Expressing emotions delineates from Processing them. We then showed how these components are associated with personality traits, well-being, trauma, and 18 different psychological disorders (including mood, neurological, and personality disorders). Notably, those with anxiety, mood, and personality disorders showed a profile of high Identifying and low Processing compared to controls. Further, results showed how mentalized affectivity scores varied across psychological treatment modalities and years spent in therapy. Taken together, the model of mentalized affectivity advances prior theory and research on emotion regulation, and the MAS is a useful and reliable instrument that can be used in both clinical and non-clinical settings in psychology, psychiatry, and neuroscience.

  13. Modeling marine surface microplastic transport to assess optimal removal locations

    International Nuclear Information System (INIS)

    Sherman, Peter; Van Sebille, Erik

    2016-01-01

    Marine plastic pollution is an ever-increasing problem that demands immediate mitigation and reduction plans. Here, a model based on satellite-tracked buoy observations and scaled to a large data set of observations on microplastic from surface trawls was used to simulate the transport of plastics floating on the ocean surface from 2015 to 2025, with the goal to assess the optimal marine microplastic removal locations for two scenarios: removing the most surface microplastic and reducing the impact on ecosystems, using plankton growth as a proxy. The simulations show that the optimal removal locations are primarily located off the coast of China and in the Indonesian Archipelago for both scenarios. Our estimates show that 31% of the modeled microplastic mass can be removed by 2025 using 29 plastic collectors operating at a 45% capture efficiency from these locations, compared to only 17% when the 29 plastic collectors are moored in the North Pacific garbage patch, between Hawaii and California. The overlap of ocean surface microplastics and phytoplankton growth can be reduced by 46% at our proposed locations, while sinks in the North Pacific can only reduce the overlap by 14%. These results are an indication that oceanic plastic removal might be more effective in removing a greater microplastic mass and in reducing potential harm to marine life when closer to shore than inside the plastic accumulation zones in the centers of the gyres. (letter)

  14. A model for radiological dose assessment in an urban environment

    International Nuclear Information System (INIS)

    Hwang, Won Tae; Kim, Eun Han; Jeong, Hyo Joon; Suh, Kyung Suk; Han, Moon Hee

    2007-01-01

    A model for radiological dose assessment in an urban environment, METRO-K, has been developed. Characteristics of the model are as follows: 1) the mathematical structures are simple (i.e. simplified input parameters) and easy to understand, since results are obtained by analytical methods using experimental and empirical data; 2) a complex urban environment can easily be composed using only 5 types of basic surfaces; 3) various remediation measures can be applied to different surfaces by evaluating the exposure dose contributed by each contaminated surface. Exposure doses contributed by each contaminated surface at a particular receptor location were evaluated using a data library of kerma values as a function of gamma energy and contaminated surface. The kerma data library was prepared for 7 representative types of Korean urban building by extending the data given for 4 representative types of European urban buildings. The initial input data are the daily radionuclide concentration in air and precipitation, and the fraction of each chemical form. The final outputs are the absorbed dose rate in air contributed by the basic surfaces as a function of time following radionuclide deposition, and the exposure dose rate contributed by the various surfaces constituting the urban environment at a particular receptor location. For a contamination scenario in an apartment built-up area, exposure dose rates showed a distinct difference depending on the surrounding environment as well as on the receptor location.
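The surface-by-surface bookkeeping described above amounts, for a single gamma energy and time, to a weighted sum over contaminated surfaces. A toy sketch of that summation (all kerma coefficients, activities, and surface names are hypothetical; the actual model resolves gamma energy and time dependence via its kerma library):

```python
# Dose rate at a receptor = sum over contaminated surfaces of
# (kerma per unit deposited activity for that surface type) * (surface activity).
kerma_per_unit_activity = {   # nGy/h per Bq/m^2, hypothetical values
    "roof": 0.8e-3,
    "wall": 0.5e-3,
    "road": 1.2e-3,
    "lawn": 1.0e-3,
    "tree": 0.3e-3,
}
surface_activity = {          # Bq/m^2 deposited on each surface, hypothetical
    "roof": 4000.0, "wall": 1500.0, "road": 8000.0, "lawn": 6000.0, "tree": 2000.0,
}

dose_rate = sum(kerma_per_unit_activity[s] * a for s, a in surface_activity.items())
print(f"{dose_rate:.2f} nGy/h")
```

Because each surface's contribution is tracked separately, the effect of a remediation measure (e.g. washing roads) can be evaluated by reducing that one surface's activity and re-summing.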

  15. Hydrodynamic and Ecological Assessment of Nearshore Restoration: A Modeling Study

    International Nuclear Information System (INIS)

    Yang, Zhaoqing; Sobocinski, Kathryn L.; Heatwole, Danelle W.; Khangaonkar, Tarang; Thom, Ronald M.; Fuller, Roger

    2010-01-01

    Along the Pacific Northwest coast, much of the estuarine habitat has been diked over the last century for agricultural land use, residential and commercial development, and transportation corridors. As a result, many ecological processes and functions have been disrupted. To protect coastal habitats that are vital to aquatic species, many restoration projects are currently underway to restore estuarine and coastal ecosystems through dike breaches, setbacks, and removals. Information on physical processes and hydrodynamic conditions is critical for assessing the success of restoration actions. Restoration of a 160-acre property at the mouth of the Stillaguamish River in Puget Sound has been proposed. The goal is to restore native tidal habitats and estuary-scale ecological processes by removing the dike. In this study, a three-dimensional hydrodynamic model was developed for the Stillaguamish River estuary to simulate estuarine processes. The model was calibrated to observed tide, current, and salinity data for existing conditions and applied to simulate the hydrodynamic responses to two restoration alternatives. Responses were evaluated at the scale of the restoration footprint. Model output was combined with biophysical data to predict habitat responses at the site. Results showed that the proposed dike removal would result in the desired tidal flushing and in conditions that would support four habitat types on the restoration footprint. At the estuary scale, restoration would substantially increase the proportion of area flushed with freshwater (< 5 ppt) at flood tide. Potential implications of predicted changes in salinity and flow dynamics are discussed relative to the distribution of tidal marsh habitat.

  16. Mixture modeling methods for the assessment of normal and abnormal personality, part II: longitudinal models.

    Science.gov (United States)

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Studying personality and its pathology as it changes, develops, or remains stable over time offers exciting insight into the nature of individual differences. Researchers interested in examining personal characteristics over time have a number of time-honored analytic approaches at their disposal. In recent years there have also been considerable advances in person-oriented analytic approaches, particularly longitudinal mixture models. In this methodological primer we focus on mixture modeling approaches to the study of normative and individual change in the form of growth mixture models and ipsative change in the form of latent transition analysis. We describe the conceptual underpinnings of each of these models, outline approaches for their implementation, and provide accessible examples for researchers studying personality and its assessment.

  17. Towards an Integrated Model for Developing Sustainable Assessment Skills

    Science.gov (United States)

    Fastre, Greet M. J.; van der Klink, Marcel R.; Sluijsmans, Dominique; van Merrienboer, Jeroen J. G.

    2013-01-01

    One of the goals of current education is to ensure that graduates can act as independent lifelong learners. Graduates need to be able to assess their own learning and interpret assessment results. The central question in this article is how to acquire sustainable assessment skills, enabling students to assess their performance and learning…

  18. Dynamic Assessment and Its Implications for RTI Models

    Science.gov (United States)

    Wagner, Richard K.; Compton, Donald L.

    2011-01-01

    Dynamic assessment refers to assessment that combines elements of instruction for the purpose of learning something about an individual that cannot be learned as easily or at all from conventional assessment. The origins of dynamic assessment can be traced to Thorndike (1924), Rey (1934), and Vygotsky (1962), who shared three basic assumptions.…

  19. An Assessment of Mean Areal Precipitation Methods on Simulated Stream Flow: A SWAT Model Performance Assessment

    Directory of Open Access Journals (Sweden)

    Sean Zeiger

    2017-06-01

Full Text Available Accurate mean areal precipitation (MAP) estimates are essential input forcings for hydrologic models. However, the selection of the most accurate method to estimate MAP can be daunting because there are numerous methods to choose from (e.g., proximate gauge, direct weighted average, surface-fitting, and remotely sensed methods). Multiple methods (n = 19) were used to estimate MAP with precipitation data from 11 distributed monitoring sites, and 4 remotely sensed data sets. Each method was validated against the hydrologic model simulated stream flow using the Soil and Water Assessment Tool (SWAT). SWAT was validated using a split-site method and the observed stream flow data from five nested-scale gauging sites in a mixed-land-use watershed of the central USA. Cross-validation results showed the error associated with surface-fitting and remotely sensed methods ranging from −4.5 to −5.1%, and −9.8 to −14.7%, respectively. Split-site validation results showed percent bias (PBIAS) values that ranged from −4.5 to −160%. Second-order polynomial functions especially overestimated precipitation and subsequent stream flow simulations (PBIAS = −160% in the headwaters). The results indicated that using an inverse-distance weighted, linear polynomial interpolation or multiquadric function method to estimate MAP may improve SWAT model simulations. Collectively, the results highlight the importance of spatially distributed observed hydroclimate data for precipitation and subsequent stream flow estimations. The MAP methods demonstrated in the current work can be used to reduce hydrologic model uncertainty caused by watershed physiographic differences.
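The inverse-distance weighted MAP estimate and the PBIAS validation statistic used in the record above reduce to short calculations; the gauge layout and values below are illustrative, not the study's data:

```python
import numpy as np

def idw_map(gauge_xy, gauge_p, cell_xy, power=2.0):
    """Inverse-distance-weighted mean areal precipitation for one grid cell."""
    d = np.linalg.norm(gauge_xy - cell_xy, axis=1)
    if np.any(d == 0):                       # cell coincides with a gauge
        return float(gauge_p[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * gauge_p) / np.sum(w))

def pbias(observed, simulated):
    """Percent bias; negative values indicate overestimation."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return float(100.0 * np.sum(observed - simulated) / np.sum(observed))

# Three hypothetical gauges around a cell at the origin
gauges = np.array([[0.0, 1.0], [1.0, 0.0], [-1.0, -1.0]])
precip = np.array([10.0, 12.0, 8.0])
map_est = idw_map(gauges, precip, np.array([0.0, 0.0]))   # 10.4
```

The distance exponent (`power=2.0`) is a common default; the study compared many such weighting choices against simulated stream flow.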

  20. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
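In linear models with a-path moderation, the index for moderated mediation is the product of the treatment-by-moderator interaction coefficient in the mediator model and the mediator's coefficient in the outcome model. A minimal sketch with simulated data and ordinary least squares (the data-generating values are arbitrary, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.binomial(1, 0.5, n).astype(float)             # randomized treatment
W = rng.normal(size=n)                                 # moderator
M = 0.5*X + 0.3*W + 0.4*X*W + 0.5*rng.normal(size=n)  # a-path moderated (true a3 = 0.4)
Y = 0.2*X + 0.6*M + 0.5*rng.normal(size=n)            # true b = 0.6

def ols(y, cols):
    """Least-squares coefficients with an intercept prepended."""
    Z = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta

a = ols(M, [X, W, X*W])      # mediator model: a[3] estimates the interaction a3
b = ols(Y, [X, M, W, X*W])   # outcome model:  b[2] estimates b
index_mod_med = a[3] * b[2]  # index of moderated mediation (true value 0.24)
```

The paper's point is that this product can remain unbiased under confounding scenarios that would bias the mediated effect itself.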

  1. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties for each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.
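The Partial Credit Model used in the record above assigns each response category a probability built from cumulative person-minus-step differences. A minimal sketch of the category probability function (the step difficulties below are made-up values, not WHO-DAS II estimates):

```python
import numpy as np

def pcm_probs(theta, deltas):
    """Partial Credit Model category probabilities for one item.

    theta  : person trait level (e.g. disability severity)
    deltas : step difficulties delta_1..delta_m, giving m+1 categories
    """
    # Cumulative sums of (theta - delta_j); category 0 contributes 0
    steps = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    expn = np.exp(steps - steps.max())   # subtract max for numerical stability
    return expn / expn.sum()

# Person at theta = 0 on a 4-category item with symmetric steps
p = pcm_probs(theta=0.0, deltas=[-1.0, 0.0, 1.0])
```

With symmetric steps around theta, the extreme categories get equal (and smaller) probabilities than the middle ones, which is the invariant-ordering behaviour the Mokken analysis in the paper probes.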

  2. Two agricultural production data libraries for risk assessment models

    International Nuclear Information System (INIS)

    Baes, C.F. III; Shor, R.W.; Sharp, R.D.; Sjoreen, A.L.

    1985-01-01

Two data libraries based on the 1974 US Census of Agriculture are described. The data packages (AGDATC and AGDATG) are available from the Radiation Shielding Information Center (RSIC), Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831. Agricultural production and land-use information by county (AGDATC) or by 1/2 by 1/2 degree longitude-latitude grid cell (AGDATG) provide geographical resolution of the data. The libraries were designed for use in risk assessment models that simulate the transport of radionuclides from sources of airborne release through food chains to man. However, they are also suitable for use in the assessment of other airborne pollutants that can affect man through a food ingestion pathway, such as effluents from synfuels or coal-fired power plants. The principal significance of the data libraries is that they provide default location-specific food-chain transport parameters when site-specific information is unavailable. Plant food categories in the data libraries include leafy vegetables, vegetables and fruits exposed to direct deposition of airborne pollutants, vegetables and fruits protected from direct deposition, and grains. Livestock feeds are also tabulated in four categories: pasture, grain, hay, and silage. Pasture was estimated by a material balance of cattle and sheep inventories, forage feed requirements, and reported harvested forage. Cattle (Bos spp.), sheep (Ovis aries), goat (Capra hircus), hog (Sus scrofa), chicken (Gallus domesticus), and turkey (Meleagris gallopavo) inventories or sales are also tabulated in the data libraries and can be used to provide estimates of meat, eggs, and milk production. Honey production also is given. Population, irrigation, and meteorological information are also listed.
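The pasture material balance described above (herd forage requirement minus reported harvested forage) reduces to a one-line calculation; the figures, units, and the zero floor below are illustrative assumptions, not values from the data libraries:

```python
def pasture_estimate(animal_units, forage_req_per_au, harvested_forage):
    """Material-balance sketch of pasture consumption.

    animal_units       : cattle/sheep inventory (animal units)
    forage_req_per_au  : annual forage requirement per animal unit
    harvested_forage   : reported harvested forage (same units)
    Returns forage assumed to come from pasture, floored at zero.
    """
    return max(animal_units * forage_req_per_au - harvested_forage, 0.0)
```

For example, 100 animal units needing 4.0 tonnes each against 150 tonnes harvested implies 250 tonnes grazed from pasture.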

  3. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G.J. Saulnier Jr; W. Statham

    2006-03-10

The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site. (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: Semiarid to arid; (4) Analogous geochemistry: Oxidizing conditions; and (5) Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration. Three boreholes were drilled at the Nopal I mine site in 2003 and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory analysis.

  4. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    International Nuclear Information System (INIS)

    G.J. Saulnier Jr; W. Statham

    2006-01-01

The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following characteristics as compared to the Yucca Mountain repository site. (1) Analogous source: UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geologic setting: fractured, welded, and altered rhyolitic ash flow tuffs overlying carbonate rocks; (3) Analogous climate: Semiarid to arid; (4) Analogous geochemistry: Oxidizing conditions; and (5) Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table. The Nopal I deposit is approximately 8 ± 0.5 million years old and has been exposed to oxidizing conditions during the last 3.2 to 3.4 million years. The Pena Blanca Natural Analogue Model considers that the uranium oxide and uranium silicates in the ore deposit were originally analogous to uranium-oxide spent nuclear fuel. The Pena Blanca site has been characterized using field and laboratory investigations of its fault and fracture distribution, mineralogy, fracture fillings, seepage into the mine adits, regional hydrology, and mineralization that shows the extent of radionuclide migration.
Three boreholes were drilled at the Nopal I mine site in 2003 and these boreholes have provided samples for lithologic characterization, water-level measurements, and water samples for laboratory analysis

  5. A generic hydroeconomic model to assess future water scarcity

    Science.gov (United States)

    Neverre, Noémie; Dumas, Patrice

    2015-04-01

We developed a generic hydroeconomic model able to confront future water supply and demand on a large scale, taking into account man-made reservoirs. The assessment is done at the scale of river basins, using only globally available data; the methodology can thus be generalized. On the supply side, we evaluate the impacts of climate change on water resources. The available quantity of water at each site is computed using the following information: runoff is taken from the outputs of the CNRM climate model (Dubois et al., 2010), reservoirs are located using Aquastat, and the sub-basin flow-accumulation area of each reservoir is determined based on a Digital Elevation Model (HYDRO1k). On the demand side, agricultural and domestic demands are projected in terms of both quantity and economic value. For the agricultural sector, globally available data on irrigated areas and crops are combined in order to determine the localization of irrigated crops. Then, crop irrigation requirements are computed for the different stages of the growing season using the Allen (1998) method with Hargreaves potential evapotranspiration. The economic value of irrigation water is based on a yield comparison between rainfed and irrigated crops. Potential irrigated and rainfed yields are taken from LPJmL (Bondeau et al., 2007), or from FAOSTAT by making simple assumptions on yield ratios. For the domestic sector, we project the combined effects of demographic growth, economic development and water cost evolution on future demands. The method consists in building three-block inverse demand functions whose volume limits evolve with the level of GDP per capita. The value of water along the demand curve is determined from price-elasticity, price and demand data from the literature, using the point-expansion method, and from water cost data. Then projected demands are confronted with future water availability.
Operating rules of the reservoirs and water allocation between demands are based on
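The three-block inverse demand idea above can be sketched as a step function in which the marginal value of water falls across blocks whose limits expand with income; every number here (block limits, marginal values, the income exponent) is a hypothetical placeholder, not a value from the study:

```python
def block_water_value(volume, gdp_per_capita,
                      base_limits=(50.0, 120.0), values=(3.0, 1.0, 0.3)):
    """Three-block inverse demand: marginal value (currency/m3) of the last
    unit of domestic water demanded (m3/person/yr), with block limits that
    grow with GDP per capita.  All constants are illustrative placeholders.
    """
    scale = (gdp_per_capita / 10_000.0) ** 0.3   # hypothetical income effect on block size
    limit1, limit2 = (b * scale for b in base_limits)
    if volume <= limit1:
        return values[0]   # essential uses: high marginal value
    if volume <= limit2:
        return values[1]   # ordinary uses
    return values[2]       # discretionary uses: low marginal value
```

At higher GDP per capita the first block widens, so the same volume can move from the "ordinary" to the "essential" block, mirroring the study's income-dependent block limits.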

  6. Concepts, methods and models to assess environmental impact

    International Nuclear Information System (INIS)

    Pentreath, R.J.

    2002-01-01

    individual sites, is also planned in Canada. A somewhat conceptually different approach is that of an attempt to develop a hierarchical system for environmental protection based on a narrowly defined set of Reference Fauna and Flora analogous to that of Reference Man - consisting of defined dose models, data sets to estimate exposures, and data on biological effects, to provide a set of 'derived consideration levels' of dose-effect relationships for individual fauna and flora that could be used to help decision making (along with other relevant biological information) in different circumstances. Research work is also underway to produce systematic frameworks - also using a 'reference fauna and flora approach' - for assessing environmental impact in specific geographic areas, such as European and Arctic ecosystems. (author)

  7. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    Science.gov (United States)

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  8. Tsunami Risk Assessment Modelling in Chabahar Port, Iran

    Science.gov (United States)

    Delavar, M. R.; Mohammadi, H.; Sharifi, M. A.; Pirooz, M. D.

    2017-09-01

The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 in Makran Coast in the North of Oman Sea. This destructive tsunami killed over 4,000 people in Southern Pakistan and India, caused great loss of life and devastation along the coasts of Western India, Iran and Oman. According to the report of "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation of Chabahar port was 367 m toward the dry land, which had a height of 3.6 meters above sea level. In addition, the maximum inundation at Pasni (Pakistan) reached 3 km from the coastline. For the two beaches of Gujarat (India) and Oman the maximum run-up height was 3 m above sea level. In this paper, we first use Makran 1945 seismic parameters to simulate the tsunami in generation, propagation and inundation phases. The effect of the tsunami on Chabahar port is simulated using the ComMIT model, which is based on the Method of Splitting Tsunami (MOST). In this process the results are compared with documented eyewitness accounts and researchers' reports for calibration and validation of the results. Next, we used the model to perform a risk assessment for Chabahar port in the south of Iran with the worst-case tsunami scenario. The simulated results showed that the tsunami waves will reach the Chabahar coastline 11 minutes after generation and 9 minutes later, over 9.4 km2 of the dry land will be flooded with maximum wave amplitude reaching up to 30 meters.
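The MOST model underlying ComMIT solves the nonlinear shallow-water equations; at leading order, tsunami arrival times like the 11 minutes reported above follow from the long-wave speed sqrt(g·h) integrated along the path. A rough kinematic sketch (the depth profile below is hypothetical, not Makran bathymetry):

```python
import math

def travel_time_minutes(depths_m, segment_lengths_m):
    """Rough tsunami arrival-time estimate: long waves travel at sqrt(g*h),
    so sum segment_length / sqrt(g*depth) over a piecewise-constant path.
    This is only leading-order kinematics, not a shallow-water solver.
    """
    g = 9.81
    seconds = sum(L / math.sqrt(g * h)
                  for h, L in zip(depths_m, segment_lengths_m))
    return seconds / 60.0

# Hypothetical 100 km path shoaling from 3000 m to 50 m depth
t = travel_time_minutes([3000, 1000, 200, 50], [40e3, 30e3, 20e3, 10e3])
```

Note how the shallow final segments dominate the total: the wave slows dramatically as it shoals, which is also when its amplitude grows.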

  9. TSUNAMI RISK ASSESSMENT MODELLING IN CHABAHAR PORT, IRAN

    Directory of Open Access Journals (Sweden)

    M. R. Delavar

    2017-09-01

Full Text Available The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 in Makran Coast in the North of Oman Sea. This destructive tsunami killed over 4,000 people in Southern Pakistan and India, caused great loss of life and devastation along the coasts of Western India, Iran and Oman. According to the report of "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation of Chabahar port was 367 m toward the dry land, which had a height of 3.6 meters above sea level. In addition, the maximum inundation at Pasni (Pakistan) reached 3 km from the coastline. For the two beaches of Gujarat (India) and Oman the maximum run-up height was 3 m above sea level. In this paper, we first use Makran 1945 seismic parameters to simulate the tsunami in generation, propagation and inundation phases. The effect of the tsunami on Chabahar port is simulated using the ComMIT model, which is based on the Method of Splitting Tsunami (MOST). In this process the results are compared with documented eyewitness accounts and researchers' reports for calibration and validation of the results. Next, we used the model to perform a risk assessment for Chabahar port in the south of Iran with the worst-case tsunami scenario. The simulated results showed that the tsunami waves will reach the Chabahar coastline 11 minutes after generation and 9 minutes later, over 9.4 km2 of the dry land will be flooded with maximum wave amplitude reaching up to 30 meters.

  10. Implications of model uncertainty for the practice of risk assessment

    International Nuclear Information System (INIS)

    Laskey, K.B.

    1994-01-01

    A model is a representation of a system that can be used to answer questions about the system's behavior. The term model uncertainty refers to problems in which there is no generally agreed upon, validated model that can be used as a surrogate for the system itself. Model uncertainty affects both the methodology appropriate for building models and how models should be used. This paper discusses representations of model uncertainty, methodologies for exercising and interpreting models in the presence of model uncertainty, and the appropriate use of fallible models for policy making

  11. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    Awadalla, N.G.; Eaton, S.C.F.

    1996-01-01

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  12. An Introduction to the Partial Credit Model for Developing Nursing Assessments.

    Science.gov (United States)

    Fox, Christine

    1999-01-01

    Demonstrates how the partial credit model, a variation of the Rasch Measurement Model, can be used to develop performance-based assessments for nursing education. Applies the model using the Practical Knowledge Inventory for Nurses. (SK)

  13. Risk assessment to an integrated planning model for UST programs

    International Nuclear Information System (INIS)

    Ferguson, K.W.

    1993-01-01

The US Postal Service maintains the largest civilian fleet in the United States, totaling approximately 180,000 vehicles. To support the fleet's daily energy requirements, the Postal Service also operates one of the largest networks of underground storage tanks, nearly 7,500 nationwide. A program to apply risk assessment to planning, budget development and other management actions was implemented during September 1989. Working closely with a consultant, the Postal Service developed regulatory and environmental risk criteria and weighting factors for a ranking model. The primary objective was to identify relative risks for each underground tank at individual facilities. Relative risks at each facility were central to prioritizing scheduled improvements to the tank network. The survey was conducted on 302 underground tanks in the Northeast Region of the US. An environmental and regulatory risk score was computed for each UST. By ranking the tanks according to their risk score, tanks were classified into management action categories including, but not limited to, underground tank testing, retrofit, repair, replacement and closure.
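A weighted ranking model of the kind described above is just a weighted sum of criterion ratings followed by a sort; the factor names, ratings, and weights below are invented for illustration and are not the Postal Service's actual criteria:

```python
def risk_score(criteria, weights):
    """Weighted relative-risk score for one underground storage tank.

    criteria : dict of factor -> rating (here on an assumed 0-10 scale)
    weights  : dict of factor -> weighting factor (here summing to 1)
    """
    return sum(weights[k] * criteria[k] for k in weights)

# Two hypothetical tanks rated on four made-up factors
tank_a = {"age": 9, "leak_history": 2, "soil_permeability": 6, "distance_to_well": 3}
tank_b = {"age": 4, "leak_history": 8, "soil_permeability": 5, "distance_to_well": 9}
w = {"age": 0.3, "leak_history": 0.4, "soil_permeability": 0.2, "distance_to_well": 0.1}

# Rank tanks by descending score to drive management actions
ranked = sorted([("A", risk_score(tank_a, w)), ("B", risk_score(tank_b, w))],
                key=lambda t: t[1], reverse=True)
```

The ranked list is then cut into action categories (testing, retrofit, repair, replacement, closure) by score thresholds set by management.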

  14. PFI redux? Assessing a new model for financing hospitals.

    Science.gov (United States)

    Hellowell, Mark

    2013-11-01

    There is a growing need for investments in hospital facilities to improve the efficiency and quality of health services. In recent years, publicly financed hospital organisations in many countries have utilised private finance arrangements, variously called private finance initiatives (PFIs), public-private partnerships (PPPs) or P3s, to address their capital requirements. However, such projects have become more difficult to implement since the onset of the global financial crisis, which has led to a reduction in the supply of debt capital and an increase in its price. In December 2012, the government of the United Kingdom outlined a comprehensive set of reforms to the private finance model in order to revive this important source of capital for hospital investments. This article provides a critical assessment of the 'Private Finance 2' reforms, focusing on their likely impact on the supply and cost of capital. It concludes that constraints in supply are likely to continue, in part due to regulatory constraints facing both commercial banks and institutional investors, while the cost of capital is likely to increase, at least in the short term. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. Using urban forest assessment tools to model bird habitat potential