Sample records for assessment IOA analysis

  1. Validation of the Indicators of Abuse (IOA) Screen.

    Reis, Myrna; Nahmiash, Daphne


Reports on the validity of the Indicators of Abuse (IOA) Screen, used by social-services-agency practitioners as an abuse screening tool. An abuse-indicator model evolving from the IOA suggests three main types of abuse signals: caregivers' personal problems/issues, caregivers' interpersonal problems, and care receivers' social-support shortages…

  2. Urban energy consumption: Different insights from energy flow analysis, input–output analysis and ecological network analysis

Highlights: • Urban energy consumption was assessed from three different perspectives. • A new concept called controlled energy was developed from network analysis. • Embodied energy and controlled energy consumption of Beijing were compared. • The integration of all three perspectives will elucidate sustainable energy use. - Abstract: Energy consumption has always been a central issue for sustainable urban assessment and planning. Different forms of energy analysis can provide various insights for energy policy making. This paper brought together three approaches for energy consumption accounting, i.e., energy flow analysis (EFA), input–output analysis (IOA) and ecological network analysis (ENA), and compared their different perspectives and the policy implications for urban energy use. Beijing was used to exemplify the different energy analysis processes, and the 42 economic sectors of the city were aggregated into seven components. It was determined that EFA quantifies both the primary and final energy consumption of the urban components by tracking the different types of fuel used by the urban economy. IOA accounts for the embodied energy consumption (direct and indirect) used to produce goods and services in the city, whereas the control analysis of ENA quantifies the specific embodied energy that is regulated by the activities within the city's boundary. The network control analysis can also be applied to determining which economic sectors drive the energy consumption and to what extent these sectors are dependent on each other for energy. So-called "controlled energy" is a new concept that adds to the analysis of urban energy consumption, indicating the adjustable energy consumed by sectors. The integration of insights from all three accounting perspectives furthers our understanding of sustainable energy use in cities.

  3. Analysis and assessment

The ultimate objective is to predict potential health costs to man accruing from the effluents or by-products of any energy system or mix of systems, but the establishment of reliable prediction equations first requires a baseline analysis of those preexisting and essentially uncontrolled factors known to have significant influence on patterns of mortality. These factors are the cultural, social, economic, and demographic traits of a defined local or regional population. Thus, the immediate objective is the rigorous statistical definition of consistent relationships that may exist among the above traits and between them and selected causes of death, especially those causes that may have interpretive value for the detection of environmental pollutants.

  4. Integrated Operations Architecture Technology Assessment Study


As part of NASA's Integrated Operations Architecture (IOA) Baseline, NASA will consolidate all communications operations, including ground-based, near-earth, and deep-space communications, into a single integrated network. This network will make maximum use of commercial equipment, services and standards. It will be an Internet Protocol (IP) based network. This study supports technology development planning for the IOA. The technical problems that may arise when LEO mission spacecraft interoperate with commercial satellite services were investigated. Commercial technology and services that could support the IOA were surveyed, and gaps in the capability of existing technology and techniques were identified. Recommendations were made on which gaps should be closed by means of NASA research and development funding. Several findings emerged from the interoperability assessment: in the NASA mission set, there is a preponderance of small, inexpensive, low data rate science missions; proposed commercial satellite communications services could potentially provide TDRSS-like data relay functions; and IP and related protocols, such as TCP, require augmentation to operate in the mobile networking environment required by the space-to-ground portion of the IOA. Five case studies were performed in the technology assessment. Each case represented a realistic implementation of the near-earth portion of the IOA. The cases included the use of frequencies at L-band, Ka-band and the optical spectrum. The cases also represented both space relay architectures and direct-to-ground architectures.
Some of the main recommendations resulting from the case studies are: select an architecture for the LEO/MEO communications network; pursue the development of a Ka-band space-qualified transmitter (and possibly a receiver), and a low-cost Ka-band ground terminal for a direct-to-ground network; pursue the development of an Inmarsat (L-band) space-qualified transceiver to implement a global, low…

  5. Assessment of Thorium Analysis Methods

An assessment of thorium analytical methods for power fuel mixtures, comprising titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC) and neutron activation, was carried out. It can be concluded that the analytical methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis and UV-VIS spectrometry, whereas the methods with low accuracy (standard deviation 3-10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulation samples can be analyzed by titrimetry, polarography and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, in general analytical methods without molecule reactions are better than those involving molecule reactions (author). 19 refs., 1 tab.

  6. Change point analysis and assessment

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm


The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches to studying events, processes and change, namely change-point analysis, event-history analysis, critical-incident technique and sequence analysis…

  7. Multifractal analysis for nutritional assessment.

    Youngja Park

The concept of multifractality is currently used to describe self-similar and complex scaling properties observed in numerous biological signals. Fractals are geometric objects or dynamic variations which exhibit some degree of similarity (irregularity) to the original object across a wide range of scales. This approach determines the irregularity of a biologic signal as an indicator of adaptability, the capability to respond to unpredictable stress, and health. In the present work, we propose the application of multifractal analysis of wavelet-transformed proton nuclear magnetic resonance (1H NMR) spectra of plasma to determine nutritional insufficiency. For validation of this method on 1H NMR signals of human plasma, the standard deviation from a classical statistical approach and the Hurst exponent (H), left slope and partition function from multifractal analysis were extracted from 1H NMR spectra to test whether multifractal indices could discriminate healthy subjects from unhealthy, intensive care unit patients. After validation, the multifractal approach was applied to spectra of plasma from a modified crossover study of sulfur amino acid insufficiency and tested for associations with blood lipids. The results showed that standard deviation and H, but not left slope, were significantly different for sulfur amino acid sufficiency and insufficiency. Quadratic discriminant analysis of H, left slope and the partition function showed 78% overall classification accuracy according to sulfur amino acid status. Triglycerides and apolipoprotein C3 were significantly correlated with a multifractal model containing H, left slope, and standard deviation, and cholesterol and high-sensitivity C-reactive protein were significantly correlated to H. In conclusion, multifractal analysis of 1H NMR spectra provides a new approach to characterize nutritional status.
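Record 7 above uses fractal indices such as the Hurst exponent H to quantify signal irregularity. As an illustrative sketch only, and not the authors' wavelet-based multifractal pipeline, a Hurst exponent can be estimated for any 1-D signal by rescaled-range (R/S) analysis; the function name and parameters below are invented for demonstration:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent H of a 1-D signal via rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_per_chunk = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the chunk mean
            r = dev.max() - dev.min()               # range of the cumulative deviations
            s = chunk.std()
            if s > 0:
                rs_per_chunk.append(r / s)
        if rs_per_chunk:
            sizes.append(size)
            rs_vals.append(np.mean(rs_per_chunk))
        size *= 2
    # H is the slope of log(mean R/S) versus log(window size)
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)   # uncorrelated noise: estimate near 0.5
walk = np.cumsum(white)             # integrated (random-walk) signal: estimate near 1
print(hurst_rs(white), hurst_rs(walk))
```

Uncorrelated noise lands near H = 0.5, while a persistent, integrated signal pushes the estimate toward 1; this contrast in irregularity is the kind of information the multifractal indices in the record exploit.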

  8. Assessing Analysis and Reasoning in Bioethics

    Pearce, Roger S.


Developing critical thinking is a perceived weakness in current education. Analysis and reasoning are core skills in bioethics, making bioethics a useful vehicle for addressing this weakness. Assessment is widely considered to be the most influential factor in learning (Brown and Glasner, 1999), and this piece describes how analysis and reasoning in…

  9. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


A supplemental hazard analysis was conducted and a quantitative risk assessment was performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  10. Quality Assessment of Urinary Stone Analysis

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel;


The aim of the present study was to assess the quality of urinary stone analysis of laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of the quality of the laboratory in the present study was based on the… fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference… and chemical analysis.

  11. Safety analysis and risk assessment handbook

This Safety Analysis and Risk Assessment Handbook (SARAH) provides guidance to the safety analyst at the Rocky Flats Environmental Technology Site (RFETS) in the preparation of safety analyses and risk assessments. Although the older guidance (the Rocky Flats Risk Assessment Guide) continues to be used for updating the Final Safety Analysis Reports developed in the mid-1980s, this new guidance is used with all new authorization basis documents. With the mission change at RFETS came the need to establish new authorization basis documents for its facilities, whose functions had changed. The methodology and databases for performing the evaluations that support the new authorization basis documents had to be standardized, to avoid the use of different approaches and/or databases for similar accidents in different facilities. This handbook presents this new standardized approach. The handbook begins with a discussion of the requirements of the different types of authorization basis documents and how to choose the one appropriate for the facility to be evaluated. It then walks the analyst through the process of identifying all the potential hazards in the facility, classifying them, and choosing the ones that need to be analyzed further. It then discusses the methods for evaluating accident initiation and progression and covers the basic steps in a safety analysis, including consequence and frequency binning and risk ranking. The handbook lays out standardized approaches for determining the source terms of the various accidents (including airborne release fractions, leakpath factors, etc.), the atmospheric dispersion factors appropriate for Rocky Flats, and the methods for radiological and chemical consequence assessments. The radiological assessments use a radiological "template", a spreadsheet that incorporates the standard values of parameters, whereas the chemical assessments use the standard codes ARCHIE and ALOHA.

  12. Spatial interaction analysis in probabilistic risk assessment

In several probabilistic risk assessments (PRAs), it has been shown that accident scenarios involving ''external events'', such as fires and floods, can make an important contribution to the frequency of core damage and radionuclide release. These events belong to the broader category of common cause events, and an important issue in their evaluation is whether a complete set of scenarios has been considered. In this article, a systematic scoping method is described for identifying and ranking scenarios involving environmental hazards that originate within plant boundaries and for determining the scope of the subsequent detailed external event analysis. This method is also known as spatial interaction analysis. It was developed as part of the Seabrook Station Probabilistic Safety Assessment and has since been improved and applied to two other PRAs.

  13. Dynamic analysis and assessment for sustainable development


The assessment of sustainable development is crucial for constituting sustainable development strategies. Assessment methods that exist so far usually only use an indicator system for making sustainable judgement. These indicators rarely reflect dynamic characteristics. However, sustainable development is influenced by changes in the social-economic system and in the eco-environmental system at different times. Besides the spatial character, sustainable development has a temporal character that cannot be neglected; therefore the research system should also be dynamic. This paper focuses on this dynamic trait, so that the assessment results obtained provide more information for judgements in decision-making processes. Firstly the dynamic characteristics of sustainable development are analyzed, which point to a track of sustainable development that is an upward undulating curve. According to the dynamic character and the development rules of a social, economic and ecological system, a flexible assessment approach that is based on tendency analysis, restrictive conditions and a feedback system is then proposed for sustainable development.

  14. Office of Integrated Assessment and Policy Analysis

The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated to evaluate impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impacts. It is planned that an integrated assessment team will examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environmental control costs/benefits, institutional structures, and facility siting. This examination will assess the methodologies used, determine the general applicability of such studies, and present in a logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined.

  15. Assessment of right atrial function analysis

To assess the potential utility of right atrial function analysis in cardiac disease, reservoir function, pump function, and right atrial peak emptying rate (RAPER) were compared in 10 normal subjects, 32 patients with coronary artery disease, and 4 patients with primary pulmonary hypertension. Right atrial volume curves were obtained using a cardiac radionuclide method with Kr-81m. In normal subjects, the reservoir function index was 0.41±0.05 and the pump function index was 0.25±0.05. Both groups of patients had decreased reservoir function and increased pump function. Pump function tended to decrease with an increase in right ventricular end-diastolic pressure. RAPER correlated well with right ventricular peak filling rate, probably reflecting right ventricular diastolic function. Analysis of right atrial function seems to be of value in evaluating factors regulating right ventricular contraction, diastolic function, and cardiac output. (Namekawa, K)

  16. Multicriteria analysis in hazards assessment in Libya

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol


Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have proved that these problems can be solved through planning studies and detailed information about the prone areas. Determining the time, location and size of the problem is important for decision makers in planning and management activities. It is important to know the risk represented by those hazards and take actions to protect against them. Multicriteria analysis methods (analytic hierarchy process, pairwise comparison and ranking) are used to analyse which is the most dangerous hazard facing Libya. The multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternative(s) should be preferred. Regarding our problem of environmental risk assessment, the result will be a ranking or categorisation of hazards with regard to their risk level.

  17. Hanford Safety Analysis and Risk Assessment Handbook


    EVANS, C B


The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  18. Geochemical and Geochronologic Investigations of Zircon-hosted Melt Inclusions in Rhyolites from the Mesoproterozoic Pea Ridge IOA-REE Deposit, St. Francois Mountains, Missouri

    Watts, K. E.; Mercer, C. N.; Vazquez, J. A.


Silicic volcanic and plutonic rocks of an eroded Mesoproterozoic caldera complex were intruded and replaced by iron ore, and cross-cut by REE-enriched breccia pipes (~12% total REO) to form the Pea Ridge iron-oxide-apatite-REE (IOA-REE) deposit. Igneous activity, iron ore formation, and REE mineralization overlapped in space and time; however, the source of REEs and other metals (Fe, Cu, Au) integral to these economically important deposits remains unclear. Melt inclusions (MI) hosted in refractory zircon phenocrysts are used to constrain magmatic components and processes in the formation of the Pea Ridge deposit. Homogenized (1.4 kbar, 1000°C, 1 hr) MI in zircons from rhyolites ~600 ft (PR-91) and ~1200 ft (PR-12) laterally from the ore body were analyzed for major elements by EPMA and for volatiles and trace elements (H2O, S, F, Cl, REEs, Rb, Sr, Y, Zr, Nb, U, Th) by SHRIMP-RG. Metals (including Cu, Au) will be measured in an upcoming SHRIMP-RG session. U-Pb ages, Ti and REE were determined by SHRIMP-RG for a subset of zircon spots adjacent to MI (1458 ± 18 Ma (PR-12); 1480 ± 45 Ma (PR-91)). MI glasses range from fresh and homogeneous dacite-rhyolite (65-75 wt% SiO2) to heterogeneous, patchy mixtures of K-spar and quartz (PR-12, 91), and more rarely mica, albite and/or anorthoclase (PR-91). MI are commonly attached to monazite and xenotime, particularly along re-entrants and zircon rims (PR-91). Fresh dacite-rhyolite glasses (PR-12) have moderate H2O (~2-2.5 wt%), Rb/Sr ratios (~8) and U (~5-7 ppm), and negative (chondrite-normalized) Eu anomalies (Eu ~0.4-0.7 ppm) (typical of rhyolites), whereas HREEs (Tb, Ho, Tm) are elevated (~2-3 ppm). Patchy K-spar and quartz inclusions (PR-12, 91) have flat LREE patterns, and positive anomalies in Tb, Ho, and Tm. One K-spar inclusion (PR-91) has a ~5-50 fold increase in HREEs (Tb, Dy, Ho, Er, Tm) and U (35 ppm) relative to other MI. U-Pb and REE analyses of its zircon host are not unusual (1484 ± 21 Ma); its irregular shape…

  19. Qualitative Analysis for Maintenance Process Assessment

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor


    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  20. Multi Criteria Analysis for bioenergy systems assessments

Sustainable bioenergy systems are, by definition, embedded in social, economic, and environmental contexts and depend on support of many stakeholders with different perspectives. The resulting complexity constitutes a major barrier to the implementation of bioenergy projects. The goal of this paper is to evaluate the potential of Multi Criteria Analysis (MCA) to facilitate the design and implementation of sustainable bioenergy projects. Four MCA tools (Super Decisions, DecideIT, Decision Lab, NAIADE) are reviewed for their suitability to assess sustainability of bioenergy systems with a special focus on multi-stakeholder inclusion. The MCA tools are applied using data from a multi-stakeholder bioenergy case study in Uganda. Although contributing to only a part of a comprehensive decision process, MCA can assist in overcoming implementation barriers by (i) structuring the problem, (ii) assisting in the identification of the least robust and/or most uncertain components in bioenergy systems and (iii) integrating stakeholders into the decision process. Applying the four MCA tools to a Ugandan case study resulted in a large variability in outcomes. However, social criteria were consistently identified by all tools as being decisive in making a bioelectricity project viable.
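Record 20's MCA tools (Super Decisions, DecideIT, Decision Lab, NAIADE) are dedicated packages, but the core mechanic they share, weighting criteria and ranking alternatives, can be sketched minimally. All alternatives, criteria, weights and scores below are invented for illustration and are not taken from the Ugandan case study:

```python
# Minimal weighted-sum multi-criteria ranking; the alternatives, criteria,
# weights and scores here are invented for illustration only.
alternatives = {
    # scores per criterion, each normalised to [0, 1] (higher is better)
    "biogas plant":  {"cost": 0.6, "social acceptance": 0.8, "emissions": 0.7},
    "wood gasifier": {"cost": 0.8, "social acceptance": 0.5, "emissions": 0.6},
    "diesel genset": {"cost": 0.9, "social acceptance": 0.4, "emissions": 0.2},
}
weights = {"cost": 0.3, "social acceptance": 0.4, "emissions": 0.3}  # stakeholder-derived weights

def mca_rank(alts, w):
    """Rank alternatives by the weighted sum of their criterion scores."""
    totals = {name: sum(w[c] * s for c, s in scores.items())
              for name, scores in alts.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for name, score in mca_rank(alternatives, weights):
    print(f"{name}: {score:.2f}")
```

Real MCA tools differ mainly in how the weights are elicited from stakeholders and how uncertainty in the scores is propagated; the weighted sum is only the simplest aggregation rule.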

  1. Seismic vulnerability assessments in risk analysis

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander


The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for development of vulnerability functions of different elements at risk: empirical, analytical and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on buildings behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states to buildings and constructions in the case of earthquakes according to the OSR-97B (return period T=1,000 years) within big cities and towns, they were divided into unit sites and their coordinates were presented as dots located in the centers of unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and Krasnodar area include two elements: percent of different damage states for settlements with fewer than 1,000 inhabitants and vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipe line systems located in the highly active seismic zones in…

  2. Improved reliability analysis method based on the failure assessment diagram

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng


    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
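Record 2 overlays a 2-D kernel density estimate on the failure assessment diagram to visualize where the Monte Carlo assessment points cluster. A minimal sketch of that idea follows, assuming synthetic normally distributed (Lr, Kr) assessment points and a simplified R6 Option-1 style boundary; none of the numbers come from the cited pipe assessment:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic Monte Carlo assessment points (Lr = load ratio, Kr = fracture ratio);
# the distributions are invented, not taken from the cited pipe assessment.
lr = rng.normal(0.6, 0.08, 2000)
kr = rng.normal(0.5, 0.10, 2000)

def kde2d(px, py, qx, qy, bw=0.05):
    """Naive 2-D Gaussian kernel density estimate at a query point (qx, qy)."""
    u = ((qx - px) ** 2 + (qy - py) ** 2) / (2 * bw ** 2)
    return np.exp(-u).sum() / (len(px) * 2 * np.pi * bw ** 2)

def fad_boundary(lr_vals):
    """Simplified R6 Option-1 style FAD curve; points above it are potentially unsafe."""
    lr_vals = np.asarray(lr_vals, dtype=float)
    return (1 - 0.14 * lr_vals ** 2) * (0.3 + 0.7 * np.exp(-0.65 * lr_vals ** 6))

outside = kr > fad_boundary(lr)
p_failure = outside.mean()          # crude failure-probability estimate
peak = kde2d(lr, kr, 0.6, 0.5)      # point density near the cloud centre
edge = kde2d(lr, kr, 1.2, 1.0)      # point density far from the cloud
print(p_failure, peak, edge)
```

Evaluating the KDE on a grid and contouring it over the FAD reproduces the kind of density plot the record describes; the paper's contribution is exactly that visualization plus the interval sensitivity analysis, neither of which this sketch attempts in full.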

  3. A Content Analysis of Intimate Partner Violence Assessments

    Hays, Danica G.; Emelianchik, Kelly


    With approximately 30% of individuals of various cultural identities experiencing intimate partner violence (IPV) in their lifetimes, it is imperative that professional counselors engage in effective assessment practices and be aware of the limitations of available IPV assessments. A content analysis of 38 IPV assessments was conducted, yielding…

  4. Data Analysis and Next Generation Assessments

    Pon, Kathy


    For the last decade, much of the work of California school administrators has been shaped by the accountability of the No Child Left Behind Act. Now as they stand at the precipice of Common Core Standards and next generation assessments, it is important to reflect on the proficiency educators have attained in using data to improve instruction and…

  5. Non-human biota dose assessment. Sensitivity analysis and knowledge quality assessment

    This report provides a summary of a programme of work, commissioned within the BIOPROTA collaborative forum, to assess the quantitative and qualitative elements of uncertainty associated with biota dose assessment of potential impacts of long-term releases from geological disposal facilities (GDF). Quantitative and qualitative aspects of uncertainty were determined through sensitivity and knowledge quality assessments, respectively. Both assessments focused on default assessment parameters within the ERICA assessment approach. The sensitivity analysis was conducted within the EIKOS sensitivity analysis software tool and was run in both generic and test case modes. The knowledge quality assessment involved development of a questionnaire around the ERICA assessment approach, which was distributed to a range of experts in the fields of non-human biota dose assessment and radioactive waste disposal assessments. Combined, these assessments enabled critical model features and parameters that are both sensitive (i.e. have a large influence on model output) and of low knowledge quality to be identified for each of the three test cases. The output of this project is intended to provide information on those parameters that may need to be considered in more detail for prospective site-specific biota dose assessments for GDFs. Such information should help users to enhance the quality of their assessments and build greater confidence in the results. (orig.)

  6. Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff that conduct analyse...

  7. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    Trausan-Matu, Stefan


    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  8. Material Analysis for a Fire Assessment.

    Brown, Alexander; Nemer, Martin B.


This report consolidates technical information on several materials and material classes for a fire assessment. The materials include three polymeric materials, wood, and hydraulic oil. The polymers are polystyrene, polyurethane, and melamine-formaldehyde foams. Samples of two of the specific materials were tested for their behavior in a fire-like environment. Test data and the methods used to test the materials are presented. Much of the remaining data are taken from a literature survey. This report serves as a reference source of properties necessary to predict the behavior of these materials in a fire.

  9. Assessment and Planning Using Portfolio Analysis

    Roberts, Laura B.


    Portfolio analysis is a simple yet powerful management tool. Programs and activities are placed on a grid with mission along one axis and financial return on the other. The four boxes of the grid (low mission, low return; high mission, low return; high return, low mission; high return, high mission) help managers identify which programs might be…
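The grid placement described above can be illustrated with a minimal Python sketch; the normalized scoring scale, the 0.5 threshold, and the example programs are hypothetical assumptions, not taken from the article.

```python
# Sketch of the mission/financial-return portfolio grid described above.
# Thresholds and example programs are hypothetical illustrations.

def quadrant(mission_score, return_score, threshold=0.5):
    """Place a program in one of the four portfolio-grid boxes.

    Scores are assumed to be normalized to [0, 1]; 'threshold' splits
    low from high on each axis.
    """
    mission = "high mission" if mission_score >= threshold else "low mission"
    ret = "high return" if return_score >= threshold else "low return"
    return f"{mission}, {ret}"

programs = {
    "school tours": (0.9, 0.2),   # mission-rich, loses money
    "gift shop": (0.2, 0.8),      # profitable, little mission content
}
grid = {name: quadrant(m, r) for name, (m, r) in programs.items()}
```

Managers could then read the grid to decide, for example, which "high return, low mission" activities subsidize the "high mission, low return" ones.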

  10. Environmental risk assessment in GMO analysis.

    Pirondini, Andrea; Marmiroli, Nelson


Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture to express traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage across countries is related to differing positions on the labelling of GMO products: based on the principle of substantial equivalence, or rather on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislative requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:21384330

  11. Uncertainty analysis in integrated assessment: the users' perspective

    Gabbert, S.G.M.; Ittersum, van M.K.; Kroeze, C.; Stalpers, S.I.P.; Ewert, F.; Alkan Olsson, J.


Integrated Assessment (IA) models aim at providing information and decision support for complex problems. This paper argues that uncertainty analysis in IA models should be user-driven in order to strengthen science–policy interaction. We suggest an approach to uncertainty analysis that starts with…


    Bojan Krstic, Milica Tasic, Vladimir Ivanovic


Lifecycle analysis is one of the techniques for assessing the impact of an enterprise on the environment, by monitoring the environmental effects of a product along its lifecycle. Since the cycle can be seen in stages (extraction of raw materials, raw-material processing, final product production, product use and end of use of the product), the analysis can be applied to all or only some parts of the aforementioned cycle, hence the different variants of this technique. The analysis itself is defi...

  13. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis


    ... AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis AGENCY... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis Docket, EPA/DC, EPA West, Room... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis is available in the...

  14. 78 FR 27235 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis


    ... AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis AGENCY..., ``Technical Guidance for Assessing Environmental Justice in Regulatory Analysis.'' The purpose of this... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis Docket, EPA/DC, EPA West,...

  15. Metallic Mineral Resources Assessment and Analysis System Design


    This paper presents the aim and the design structure of the metallic mineral resources assessment and analysis system. This system adopts an integrated technique of data warehouse composed of affairs-processing layer and analysis-application layer. The affairs-processing layer includes multiform databases (such as geological database, geophysical database, geochemical database),while the analysis application layer includes data warehouse, online analysis processing and data mining. This paper also presents in detail the data warehouse of the present system and the appropriate spatial analysis methods and models. Finally, this paper presents the prospect of the system.

  16. Accuracy Assessment and Analysis for GPT2

    YAO Yibin


GPT (global pressure and temperature) is a global empirical model commonly used to provide temperature and pressure for the determination of tropospheric delay. GPT has some weaknesses, which have been addressed by a new empirical model named GPT2. GPT2 not only improves the accuracy of temperature and pressure, but also provides specific humidity, water vapor pressure, mapping function coefficients and other tropospheric parameters; however, no accuracy analysis of GPT2 had been made until now. In this paper, high-precision meteorological data from ECMWF and NOAA were used to test and analyze the accuracy of the temperature, pressure and water vapor pressure given by GPT2. Testing results show that the mean bias of temperature is -0.59℃ and the average RMS is 3.82℃; the absolute values of the average bias of pressure and of water vapor pressure are less than 1 mb; GPT2 pressure has an average RMS of 7 mb, and water vapor pressure no more than 3 mb. Accuracy differs across latitudes, and all parameters show obvious seasonality. In conclusion, the GPT2 model has high accuracy and stability on a global scale.
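The bias and RMS statistics reported above can be sketched in a few lines of Python; the sample temperatures below are invented for illustration and are not ECMWF or GPT2 values.

```python
import math

# Minimal sketch of the accuracy statistics reported above: bias (mean
# error) and RMS error of model values against reference observations.
# The sample numbers below are invented for illustration only.

def bias_and_rms(model, reference):
    errors = [m - r for m, r in zip(model, reference)]
    bias = sum(errors) / len(errors)
    rms = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rms

model_temp = [14.2, 20.1, 9.8, 25.3]   # e.g. model temperatures (degC)
obs_temp = [15.0, 19.5, 11.0, 25.0]    # e.g. reanalysis reference values

b, r = bias_and_rms(model_temp, obs_temp)
```

A negative bias, as in the paper's -0.59℃ result, means the model runs cold on average, while the RMS captures the typical size of the error regardless of sign.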

  17. Analysis of assessment tools used in engineering degree programs

    Martínez Martínez, María del Rosario; Olmedo Torre, Noelia; Amante García, Beatriz; Farrerons Vidal, Óscar; Cadenato Matia, Ana María


This work presents an analysis of the assessment tools used by professors at the Universitat Politècnica de Catalunya to assess the generic competencies introduced in the Bachelor's Degrees in Engineering. In order to conduct this study, a survey was designed and administered anonymously to a sample of the professors most receptive to educational innovation at their own university. In total, 80 professors responded to this survey, of whom 26% turned out to be members of the un...

  18. Assessment of structural analysis technology for elastic shell collapse problems

    Knight, N. F., Jr.; Macy, S. C.; Mccleary, S. L.


    The prediction of the ultimate load carrying capability for compressively loaded shell structures is a challenging nonlinear analysis problem. Selected areas of finite element technology research and nonlinear solution technology are assessed. Herein, a finite element analysis procedure is applied to four shell collapse problems which have been used by computational structural mechanics researchers in the past. This assessment will focus on a number of different shell element formulations and on different approaches used to account for geometric nonlinearities. The results presented confirm that these aspects of nonlinear shell analysis can have a significant effect on the predicted nonlinear structural response. All analyses were performed using the CSM Testbed software system which allowed a convenient assessment of different element formulations with a consistent approach to solving the discretized nonlinear equations.

  19. Comparative analysis of model assessment in community detection

    Kawamoto, Tatsuro


    Bayesian cluster inference with a flexible generative model allows us to detect various types of structures. However, it has problems stemming from computational complexity and difficulties in model assessment. We consider the stochastic block model with restricted hyperparameter space, which is known to correspond to modularity maximization. We show that it not only reduces computational complexity, but is also beneficial for model assessment. Using various criteria, we conduct a comparative analysis of the model assessments, and analyze whether each criterion tends to overfit or underfit. We also show that the learning of hyperparameters leads to qualitative differences in Bethe free energy and cross-validation errors.

  20. No-Reference Video Quality Assessment using MPEG Analysis

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

We present a method for No-Reference (NR) Video Quality Assessment (VQA) for decoded video without access to the bitstream. This is achieved by extracting and pooling features from a NR image quality assessment method used frame by frame. We also present methods to identify the video coding and estimate the video coding parameters for MPEG-2 and H.264/AVC, which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods…

  1. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

Fracture Mechanics has found profound usage in the design of components and in assessing the fitness for purpose and residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo Simulation is one such procedure for the analysis of fracture probability. This paper deals with uncertainty analysis using Monte Carlo Simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
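A minimal sketch of the Monte Carlo idea described above, assuming normally distributed load and resistance; the distributions and parameter values are hypothetical illustrations and do not come from the R6 procedure.

```python
import random

# Illustrative Monte Carlo estimate of fracture probability: sample
# scattered stress and strength values and count the cases where the
# applied stress exceeds the material resistance. The normal
# distributions and their parameters are invented for this sketch.

def fracture_probability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        stress = rng.gauss(300.0, 30.0)     # applied stress, MPa
        strength = rng.gauss(450.0, 40.0)   # material resistance, MPa
        if stress > strength:
            failures += 1
    return failures / n_samples

p_f = fracture_probability()
```

With the margin averaging 150 MPa against a combined scatter of 50 MPa, the estimate lands near the tail probability of a three-sigma event, i.e. on the order of 10^-3.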

  2. FENCH-analysis of electricity generation greenhouse gas emissions from solar and wind power in Germany

The assessment of energy supply systems with regard to their influence on climate change requires not only the quantification of direct emissions caused by the operation of a power plant. It also has to take into account indirect emissions resulting from, e.g., construction and dismantling of the power plant. Processes like manufacturing the materials for building the plant, the transportation of components and the construction and maintenance of the power plant are included. A tool to determine and assess the energy and mass flows is Life Cycle Analysis (LCA), which allows the assessment of environmental impacts related to a product or service. In this paper a FENCH (Full Energy Chain) analysis, based on an LCA of electricity production from wind and solar power plants under operating conditions typical for applications in Germany, is presented. The FENCH analysis is based on two methods, Process Chain Analysis (PCA) and Input-Output Analysis (IOA), which are illustrated by the example of electricity generation from a wind power plant. The calculated results are shown for the cumulated (indirect and direct) Greenhouse Gas (GHG) emissions for electricity production from wind and solar power plants. A comparison of the results with electricity production from a coal-fired power plant is performed. Finally, 1 kWh of electricity from renewable energy has to be compared with 1 kWh from a fossil energy carrier, because the benefits of 1 kWh of electricity from various types of power plants differ: electricity from wind energy depends on the meteorological conditions, while electricity from a fossil-fired power plant is able to follow the power requirements of the consumers nearly all the time. The GHG emissions are presented with this difference in benefits taken into account. (author)

  3. Hanford safety analysis and risk assessment handbook (SARAH)

The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 1, 2, and 3 U.S. Department of Energy (DOE) nuclear facilities. SARAH describes currently acceptable methodology for development of a Documented Safety Analysis (DSA) and derivation of technical safety requirements (TSR) based on 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements,'' and provides data to ensure consistency in approach

  4. Background, Assessment and Analysis of the Gender Issues in Pakistan

    Moheyuddin, Ghulam


This paper describes the assessment of gender issues in Pakistan and a review and analysis of the major sectors exhibiting gender inequalities. Before continuing to the detailed analysis of the gender issues in Pakistan, it gives a bird's-eye view of the socio-economic, political and cultural background of Pakistan. The paper explains the areas of critical gender inequality in Pakistan and reviews the various gender indicators in Pakistan. It also discusses the current policies and the program...

  5. Intuitive Analysis of Variance-- A Formative Assessment Approach

    Trumpower, David


    This article describes an assessment activity that can show students how much they intuitively understand about statistics, but also alert them to common misunderstandings. How the activity can be used formatively to help improve students' conceptual understanding of analysis of variance is discussed. (Contains 1 figure and 1 table.)

  6. Assessing Group Interaction with Social Language Network Analysis

    Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.

    In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.

  7. Data management and statistical analysis for environmental assessment

Data management and statistical analysis for environmental assessment are important issues on the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system that was developed is described; it provides efficient data storage as well as visualization tools that may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities

  8. Development and Assessment of Best Estimate Integrated Safety Analysis Code

The integrated safety analysis code MARS3.0 has been developed and assessed through a verification and validation (V&V) procedure. An integrated safety analysis system has been established by coupling with a severe accident code and utilizing the MARS subchannel capability. The coupled containment module has also been improved. Indigenous thermal-hydraulic models for MARS3.0 have been developed through the implementation of a multidimensional two-phase flow model, APR1400, SMART safety issue models and new reactor models. Development of a droplet field model has also been attempted and implemented in a trial version. A full-scope assessment has been carried out for the system analysis module and the 3D vessel module. The code has also been assessed through participation in international cooperation programs. The experimental data needed for code assessment have been collected and maintained through a web-based data bank program. A 3D GUI (graphical user interface) has been developed for MARS users. A MARS users group has been organized, and it currently consists of 22 domestic organizations, including research, industrial and regulatory organizations and universities

  9. Web-Based Instruction and Learning: Analysis and Needs Assessment

    Grabowski, Barbara; McCarthy, Marianne; Koszalka, Tiffany


An analysis and needs assessment was conducted to identify kindergarten through grade 14 (K-14) customer needs with regard to using the World Wide Web (WWW) for instruction and to identify obstacles K-14 teachers face in utilizing NASA Learning Technologies products in the classroom. The needs assessment was conducted as part of the Dryden Learning Technologies Project, which is a collaboration between Dryden Flight Research Center (DFRC), Edwards, California and The Pennsylvania State University (PSU), University Park, Pennsylvania. The overall project is a multiyear effort to conduct research in the development of teacher training and tools for Web-based science, mathematics and technology instruction and learning.

  10. Human reliability analysis methods for probabilistic safety assessment

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effect on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new development in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  11. Life Cycle Exergy Analysis of Wind Energy Systems : Assessing and improving life cycle analysis methodology

    Davidsson, Simon


    Wind power capacity is currently growing fast around the world. At the same time different forms of life cycle analysis are becoming common for measuring the environmental impact of wind energy systems. This thesis identifies several problems with current methods for assessing the environmental impact of wind energy and suggests improvements that will make these assessments more robust. The use of the exergy concept combined with life cycle analysis has been proposed by several researchers ov...

  12. Modeling the Assessment of Agricultural Enterprises Headcount Analysis

    Tatyana Viatkina


The modern procedures for assessing enterprises' labour resources have been analyzed. An algorithm for calculating the enterprise performance-potential efficiency ratio, and an assessment of the performance potential of the enterprise based on quantitative and qualitative characteristics, have been provided. A model for assessing the effectiveness of labour management of an enterprise, branch or region, subject to such factors as motivation, labour expenses, staff rotation and qualifications, has been proposed. The proposed model covers the assessment of the effectiveness of labour management of an enterprise, branch or region subject to baselines. A situation where all inequalities are met means that the strategy is being implemented effectively. Otherwise, the company should take additional measures to improve the indexes specifying the deployment of staff, its motivation, turnover, qualifications and labour expenses. Application of the considered provisions and concepts, together with the applied tools, makes it possible to model elements of effective utilization of the performance potential of any agricultural enterprise. The proposed procedure for the assessment of agricultural enterprises' headcount analysis is an applied one and can be used when developing a strategy for adequate assessment and looking for new ways to improve the utilization of labour resources in the agricultural sector.

  13. System of gait analysis based on ground reaction force assessment

    František Vaverka


Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated, and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of the temporal variables of the extremes of the Fx(t), Fy(t) and Fz(t) functions, the durations of the braking and propulsive phases, the duration of the double-support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, the low time requirements allowing rapid analysis of a large number of trials in a short time, and the comparability of the variables obtained during different research measurements.
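One of the symmetry indices mentioned above can be sketched as follows; the percentage-difference formula and the sample peak values are illustrative assumptions, not the exact definition used by the authors' MATLAB script.

```python
# Sketch of a left/right symmetry index comparing peak vertical ground
# reaction force (Fz) between legs. The formula (percentage difference
# of the two peaks relative to their mean) and the sample peaks are
# invented for illustration.

def symmetry_index(peak_left, peak_right):
    """Percentage asymmetry; 0 means perfectly symmetric."""
    mean_peak = (peak_left + peak_right) / 2.0
    return abs(peak_left - peak_right) / mean_peak * 100.0

# Peak Fz in multiples of body weight for each leg (invented values).
si = symmetry_index(1.10, 1.04)
```

The same pattern extends naturally to the other extracted variables (phase durations, impulses), which is how a standardized set such as the 78 variables above can be assembled per trial.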

  14. A hybrid input–output multi-objective model to assess economic–energy–environment trade-offs in Brazil

    A multi-objective linear programming (MOLP) model based on a hybrid Input–Output (IO) framework is presented. This model aims at assessing the trade-offs between economic, energy, environmental (E3) and social objectives in the Brazilian economic system. This combination of multi-objective models with Input–Output Analysis (IOA) plays a supplementary role in understanding the interactions between the economic and energy systems, and the corresponding impacts on the environment, offering a consistent framework for assessing the effects of distinct policies on these systems. Firstly, the System of National Accounts (SNA) is reorganized to include the National Energy Balance, creating a hybrid IO framework that is extended to assess Greenhouse Gas (GHG) emissions and the employment level. The objective functions considered are the maximization of GDP (gross domestic product) and employment levels, as well as the minimization of energy consumption and GHG emissions. An interactive method enabling a progressive and selective search of non-dominated solutions with distinct characteristics and underlying trade-offs is utilized. Illustrative results indicate that the maximization of GDP and the employment levels lead to an increase of both energy consumption and GHG emissions, while the minimization of either GHG emissions or energy consumption cause negative impacts on GDP and employment. - Highlights: • A hybrid Input–Output multi-objective model is applied to the Brazilian economy. • Objective functions are GDP, employment level, energy consumption and GHG emissions. • Interactive search process identifies trade-offs between the competing objectives. • Positive correlations between GDP growth and employment. • Positive correlations between energy consumption and GHG emissions
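The Input-Output Analysis (IOA) core that hybrid models like the one above build on is the Leontief system x = (I - A)^-1 d, which maps final demand d to total sectoral output x. A minimal Python sketch, with an invented two-sector coefficient matrix rather than real Brazilian SNA data:

```python
# Minimal sketch of the Leontief input-output step underlying IOA:
# solve (I - A) x = d for total output x given final demand d.
# The two-sector technical-coefficient matrix A and demand d below
# are toy values invented for illustration.

def leontief_output(A, d):
    """Solve (I - A) x = d by Gaussian elimination (small dense case)."""
    n = len(A)
    # Build the augmented matrix (I - A) | d.
    M = [[(1.0 if i == j else 0.0) - A[i][j] for j in range(n)] + [d[i]]
         for i in range(n)]
    for col in range(n):
        # Partial pivoting for numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for row in range(n):
            if row != col:
                factor = M[row][col] / M[col][col]
                M[row] = [a - factor * b for a, b in zip(M[row], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

A = [[0.2, 0.3],   # inter-sector technical coefficients (toy values)
     [0.1, 0.4]]
d = [100.0, 50.0]  # final demand per sector
x = leontief_output(A, d)
```

Multiplying x by sectoral energy or emission intensities then yields the embodied energy and GHG totals that the MOLP objective functions trade off against GDP and employment.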

  15. No-Reference Video Quality Assessment using Codec Analysis

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari


A no-reference video quality assessment (VQA) method is presented for videos distorted by H.264/AVC and MPEG-2. The assessment is performed without access to the bit-stream. Instead we analyze and estimate coefficients based on decoded pixels. The approach involves distinguishing between the two types of videos, estimating the level of quantization used in the I-frames, and exploiting this information to assess the video quality. In order to do this for H.264/AVC, the distribution of the DCT-coefficients after intra-prediction and deblocking is modeled. To obtain VQA features for H.264/AVC, we propose a novel estimation method of the quantization in H.264/AVC videos without bitstream access, which can also be used for Peak Signal-to-Noise Ratio (PSNR) estimation. The results from the MPEG-2 and H.264/AVC analysis are mapped to a perceptual measure of video quality by Support Vector Regression…

  16. Quantitative risk assessment using the capacity-demand analysis

The hydroelectric industry's recognition of the importance of avoiding unexpected failures, or forced outages, led to the development of probabilistic, or risk-based, methods in order to attempt to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision-making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but does move it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive, economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts to be determined, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
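A hedged sketch of the capacity-demand idea: if capacity C and demand D are modeled as independent normal variables, the failure probability P(D > C) follows from the distribution of the margin M = C - D. The means and standard deviations below are invented, not from the cited study.

```python
import math

# Capacity-demand reliability sketch assuming independent normal
# capacity C and demand D. The margin M = C - D is then normal with
# mean mu_c - mu_d and variance sd_c^2 + sd_d^2, so the failure
# probability is Phi(-beta) where beta is the reliability index.
# All parameter values here are invented for illustration.

def failure_probability(mu_c, sd_c, mu_d, sd_d):
    mu_m = mu_c - mu_d
    sd_m = math.sqrt(sd_c ** 2 + sd_d ** 2)
    beta = mu_m / sd_m                       # reliability index
    # Standard normal CDF at -beta, via the complementary error function.
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

p_f = failure_probability(mu_c=500.0, sd_c=50.0, mu_d=300.0, sd_d=40.0)
```

This moves the judgment exactly as the abstract describes: the engineer estimates the variability of capacity and demand, and the failure probability falls out of the arithmetic.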

  17. Assessment of Transport Projects: Risk Analysis and Decision Support

    Salling, Kim Bang


The subject of this thesis is risk analysis and decision support in the context of transport infrastructure assessment. During my research I have observed a tendency in studies assessing transport projects to overlook the substantial amount of uncertainty within the decision-making process… Monte Carlo simulation, being the technique behind the quantitative risk analysis of CBA-DK. The informed decision support is dealt with by a set of resulting accumulated descending graphs (ADG) which make it possible for decision-makers to come to terms with their risk aversion given a specific… transport projects, namely by moving from point estimates to interval results. The main focus of this Ph.D. study has been to develop a valid, flexible and functional decision support tool in which risk-oriented aspects of project evaluation are implemented. Throughout the study six papers have been produced…

  18. Total life cycle management - assessment tool an exploratory analysis

    Young, Brad de


It is essential for the Marine Corps to ensure the successful supply, movement and maintenance of an armed force in peacetime and combat. Integral to an effective, long-term logistics plan is the ability to accurately forecast future requirements to sustain materiel readiness. The Total Life Cycle Management Assessment Tool (TLCM-AT) is a simulation tool combining operations, maintenance, and logistics. This exploratory analysis gives insight into the factors used by TLCM-AT beyond the tool's emb...

  19. Safety analysis, risk assessment, and risk acceptance criteria

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude

  20. Assessment of residual stress using thermoelastic stress analysis

    Robinson, Andrew Ferrand


    The work described in this thesis considers the application of thermoelastic stress analysis (TSA) to the assessment of residual stresses in metallic materials. Residual stresses exist within almost all engineering components and structures. They are an unavoidable consequence of manufacturing processes and may cause the premature and catastrophic failure of a component when coupled with in-service stresses. Alternatively, beneficial residual stress may be introduced to enhance th...

  1. Analysis of the judicial file: assessing the validity of testimony

    Scott, M. Teresa; Manzanero, Antonio L.


    Under the holistic approach to the assessment of testimony (HELPT), this paper describes a protocol for the analysis of all of the information that can be extracted from a judicial file, drawing on heuristic principles and the psychology of testimony. The aim is to provide a systematization for expert reports about the topics that could be explored in a file, extracting the maximum unbiased information to establish the relevant hypotheses of the case and evaluate possible factors ...

  2. Site Characterization and Analysis Penetrometer System (SCAPS): Assessing Site Contamination

    ECT Team, Purdue


    While a number of techniques exist for the remediation of contaminated soils, one of the largest problems is often the initial site assessment. It can be a difficult, expensive and time-consuming process to determine the exact extent of site contamination. The U.S. Army Engineer Waterways Experiment Station (WES) under the sponsorship of the U.S. Army Environmental Center (AEC) initiated the development of the Site Characterization and Analysis Penetrometer System (SCAPS) Research, Developmen...

  3. Assessment report on NRP sub-theme 'Risk Analysis'

    An overview and assessment are presented of the three research projects carried out under NRP funding that concern risk-related topics: (1) The risks of nonlinear climate changes, (2) Socio-economic and policy aspects of changes in incidence and intensity of extreme (weather) events, and (3) Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy strategies. 1 tab., 6 refs

  4. Scenario analysis in spatial impact assessment:a methodological approach

    Torrieri, F.; Nijkamp, P.


    This paper introduces the concept of Spatial or Territorial Impact Assessment as a new tool for balanced urban or regional planning from a long-term sustainability perspective. It then argues that modern scenario methods may be a useful complement to pro-active and future oriented urban or regional strategic thinking. A cognitive interactive model for scenario analysis is next presented and its advantages are outlined.

  5. Life cycle analysis for the assessment of environmental impacts

    The paper presents the structure of a model and a database devoted to the life-cycle analysis of industrial products for the assessment of environmental impacts. The data cover a large variety of industrial sectors; the whole life-cycle of the products has to be considered when the environmental impacts are calculated. The author considers that the data format could be standardized with a view to exchanging data between different studies and to enhancing the quality of the studies. (author)

  6. Modular risk analysis for assessing multiple waste sites

    Human-health impacts, especially to the surrounding public, are extremely difficult to assess at installations that contain multiple waste sites and a variety of mixed-waste constituents (e.g., organic, inorganic, and radioactive). These assessments must address different constituents, multiple waste sites, multiple release patterns, different transport pathways (i.e., groundwater, surface water, air, and overland soil), different receptor types and locations, various times of interest, population distributions, land-use patterns, baseline assessments, a variety of exposure scenarios, etc. Although the process is complex, two of the most important difficulties to overcome are associated with (1) establishing an approach that allows for modifying the source term, transport, or exposure component as an individual module without having to re-evaluate the entire installation-wide assessment (i.e., all modules simultaneously), and (2) displaying and communicating the results in an understandable and usable manner to interested parties. An integrated, physics-based, compartmentalized approach, which is coupled to a Geographical Information System (GIS), captures the regional health impacts associated with multiple waste sites (e.g., hundreds to thousands of waste sites) at locations within and surrounding the installation. Utilizing a modular/GIS-based approach overcomes difficulties in (1) analyzing a wide variety of scenarios for multiple waste sites, and (2) communicating results from a complex human-health-impact analysis by capturing the essence of the assessment in a relatively elegant manner, so the meaning of the results can be quickly conveyed to all who review them.


  7. Critical assessment of automated flow cytometry data analysis techniques

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.


    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  8. NASA Langley Systems Analysis & Concepts Directorate Technology Assessment/Portfolio Analysis

    Cavanaugh, Stephen; Chytka, Trina; Arcara, Phil; Jones, Sharon; Stanley, Doug; Wilhite, Alan W.


    Systems analysis develops and documents candidate missions and architectures, associated system concepts, enabling capabilities and investment strategies to achieve NASA's strategic objectives. The technology assessment process connects the missions and architectures to the investment strategies. In order to successfully implement a technology assessment, there is a need to collect, manipulate, analyze, document, and disseminate technology-related information. Information must be collected and organized on the wide variety of potentially applicable technologies, including: previous research results, key technical parameters and characteristics, technology readiness levels, relationships to other technologies, costs, and potential barriers and risks. This information must be manipulated to facilitate planning and documentation. An assessment is included of the programmatic and technical risks associated with each technology task as well as potential risk mitigation plans. Risks are assessed and tracked in terms of the likelihood of the risk occurring and the consequences of the risk if it does occur. The risk assessments take into account cost, schedule, and technical risk dimensions. Assessment data must be simplified for presentation to decision makers. The Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center has a wealth of experience in performing technology assessment and portfolio analysis, as this has been a business line since 1978.

  9. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    Michael Möderl; Wolfgang Rauch


    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under investigated threat scenarios. Thereby parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of events in the past. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.

  10. Climate Change Scientific Assessment and Policy Analysis. Scientific Assessment of Solar Induced Climate Change

    The programme Scientific Assessment and Policy Analysis is commissioned by the Dutch Ministry of Housing, Spatial Planning, and the Environment (VROM) and has the following objectives: Collection and evaluation of relevant scientific information for policy development and decision-making in the field of climate change; Analysis of resolutions and decisions in the framework of international climate negotiations and their implications. The programme is concerned with analyses and assessments intended for a balanced evaluation of the state of the art knowledge for underpinning policy choices. These analyses and assessment activities are carried out within several months to about a year, depending on the complexity and the urgency of the policy issue. Assessment teams organised to handle the various topics consist of the best Dutch experts in their fields. Teams work on incidental and additionally financed activities, as opposed to the regular, structurally financed activities of the climate research consortium. The work should reflect the current state of science on the relevant topic. In this report an assessment on the following topics is presented: (1) Reconstructions of solar variability, especially with respect to those parameters which are relevant for climate change; (2) Reconstructions of proxies of solar variability, e.g. cosmogenic isotopes; (3) Reconstructions of global as well as regional climate, with respect to temperature, precipitation and circulation; (4) Physical understanding of the mechanisms which play a role in the solar terrestrial link. We focus on the Holocene with emphasis on the last centuries because of data availability, to avoid confusing climate responses to orbital changes with those due to solar activity and because of the relevance for human induced climate change as compared to the role of the variable sun in the 20th century

  11. Biological dosimetry: chromosomal aberration analysis for dose assessment

    In view of the growing importance of chromosomal aberration analysis as a biological dosimeter, the present report provides a concise summary of the scientific background of the subject and a comprehensive source of information at the technical level. After a review of the basic principles of radiation dosimetry and radiation biology, basic information on the biology of lymphocytes, the structure of chromosomes and the classification of chromosomal aberrations is presented. This is followed by a presentation of techniques for collecting blood, storing, transporting, culturing, making chromosomal preparations and scoring of aberrations. The physical and statistical parameters involved in dose assessment are discussed and examples of actual dose assessments taken from the scientific literature are given

  12. Social and ethical analysis in health technology assessment.

    Tantivess, Sripen


    This paper presents a review of the domestic and international literature on the assessment of the social and ethical implications of health technologies. It gives an overview of the key concepts, principles, and approaches that should be taken into account when conducting a social and ethical analysis within health technology assessment (HTA). Although there is growing consensus among healthcare experts that the social and ethical ramifications of a given technology should be examined before its adoption, the demand for this kind of analysis among policy-makers around the world, including in Thailand, has so far been lacking. Currently decision-makers mainly base technology adoption decisions on evidence of clinical effectiveness, value for money, and budget impact, while social and ethical aspects have been neglected. Despite the recognized importance of considering equity, justice, and social issues when making decisions regarding health resource allocation, the absence of internationally-accepted principles and methodologies, among other factors, hinders research in these areas. Given that developing internationally agreed standards takes time, it has been recommended that priority be given to defining processes that are justifiable, transparent, and contestable. A discussion of the current situation in Thailand concerning social and ethical analysis of health technologies is also presented. PMID:24964703

  13. Model analysis: Representing and assessing the dynamics of student learning

    Edward F. Redish


    Decades of education research have shown that students can simultaneously possess alternate knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice tests such as the Force Concept Inventory and the Force and Motion Conceptual Evaluation provide instructors with tools to probe their students' conceptual knowledge of physics. However, many existing quantitative analysis methods often focus on the binary question of whether a student answers a question correctly or not. This greatly limits the capacity of the standardized multiple-choice tests for assessing students' alternative knowledge. In addition, the context-dependence issue, which suggests that a student may apply the correct knowledge in some situations and revert to alternative types of knowledge in others, is often treated as random noise in current analyses. In this paper, we present model analysis, which applies qualitative research to establish a quantitative representation framework. With this method, students' alternative knowledge and the probabilities for students to use such knowledge in a range of equivalent contexts can be quantitatively assessed. This provides a way to analyze research-based multiple-choice questions, which can generate much richer information than is available from score-based analysis.
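
    The core idea of model analysis (representing each student as a vector over candidate knowledge models and aggregating into a class "density matrix" whose eigenvalues summarize the class state) can be illustrated as below. This is a simplified reading of the method, not the authors' code, and the model-use probabilities are invented.

```python
import numpy as np

# Each row: fraction of questions on which a (hypothetical) student used
# model 1 (correct physics), model 2 (common alternative), model 3 (other).
P = np.array([
    [0.8, 0.2, 0.0],
    [0.5, 0.5, 0.0],
    [0.2, 0.6, 0.2],
])

U = np.sqrt(P)              # per-student model state vectors (unit norm)
D = U.T @ U / len(U)        # class model density matrix (trace = 1)
eigvals, eigvecs = np.linalg.eigh(D)
# The dominant eigenvalue/eigenvector pair summarizes the mixed class state.
print(np.round(eigvals[::-1], 3))
```

    A large dominant eigenvalue indicates the class behaves like a single consistent model state; comparable eigenvalues indicate genuinely mixed model use.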

  14. Development and assessment of best estimate integrated safety analysis code

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)


    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  16. The advanced scenario analysis for performance assessment of geological disposal

    First of all, with regard to the FEP information data on the Engineered Barrier System (EBS) developed by JNC, description level and content of the FEPs have been examined from various angles on the basis of the latest research information. Each content of the FEP data has been classified and modified by means of integrating descriptive items, checking detail levels and correlations with other FEPs, collating with the H12 report, and adding technical information after the H12 report. Secondly, the scenario-modeling process has been studied. The study has been conducted by evaluating representation of the repository system, definition of FEP properties, and process interactions based on the concept of the interaction matrix (RES format), which represents influences between physicochemical characteristics of the repository, followed by an experimental development of the actual RES interaction matrix based on the H12 report, as an exercise to improve the transparency, traceability and comprehensibility of the scenario analysis process. Lastly, in relation to the geological disposal system, assessment techniques have been examined for more practical scenario analysis of particularly strong perturbations. Possible conceptual models have been proposed for each of these scenarios: seismic activity, faulting, and dike intrusion. As a result of these researches, a future direction for advanced scenario analysis for performance assessment has been indicated, and associated issues to be discussed have been clarified. (author)

  17. A Monte Carlo based spent fuel analysis safeguards strategy assessment

    Fensin, Michael L [Los Alamos National Laboratory; Tobin, Stephen J [Los Alamos National Laboratory; Swinhoe, Martyn T [Los Alamos National Laboratory; Menlove, Howard O [Los Alamos National Laboratory; Sandoval, Nathan P [Los Alamos National Laboratory


    assessment process, the techniques employed to automate the coupled facets of the assessment process, and the standard burnup/enrichment/cooling time dependent spent fuel assembly library. We also clearly define the diversion scenarios that will be analyzed during the standardized assessments. Though this study is currently limited to generic PWR assemblies, it is expected that the results of the assessment will yield an adequate spent fuel analysis strategy knowledge that will help the down-select process for other reactor types.

  18. Statistical analysis applied to safety culture self-assessment

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA mean-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
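
    Most of the tests named above are available in SciPy. The sketch below runs a Kolmogorov-Smirnov normality check, Student's t-test, and one-way ANOVA on hypothetical Likert-scale survey scores; the group sizes and scores are invented, and the LSD post-hoc test is omitted because SciPy has no built-in implementation of it.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical 5-point Likert scores for one survey item, split by group.
group_a = rng.integers(1, 6, size=40)   # scores 1..5
group_b = rng.integers(2, 6, size=35)   # scores 2..5

# Kolmogorov-Smirnov test of the standardized scores against a normal
# distribution (a common, if rough, normality check for survey data).
ks_stat, ks_p = stats.kstest(stats.zscore(group_a), "norm")

# Student's t-test (Welch variant) and one-way ANOVA for group means.
t_stat, t_p = stats.ttest_ind(group_a, group_b, equal_var=False)
f_stat, f_p = stats.f_oneway(group_a, group_b)

print(f"KS p={ks_p:.3f}, t-test p={t_p:.3f}, ANOVA p={f_p:.3f}")
```

    With only two groups the ANOVA F-test and the (equal-variance) t-test are equivalent; the ANOVA becomes useful once the survey is split into three or more respondent groups.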

  19. Comparison of two software versions for assessment of body-composition analysis by DXA

    Vozarova, B; Wang, J; Weyer, C;


    To compare two software versions provided by the Lunar Corporation for assessment of body-composition analysis by DXA.

  20. 7 CFR 2.71 - Director, Office of Risk Assessment and Cost-Benefit Analysis.


    ... Chief Economist § 2.71 Director, Office of Risk Assessment and Cost-Benefit Analysis. (a) Delegations..., Office of Risk Assessment and Cost-Benefit Analysis: (1) Responsible for assessing the risks to human... 7 Agriculture 1 2010-01-01 2010-01-01 false Director, Office of Risk Assessment and...

  1. Supporting analysis and assessments quality metrics: Utility market sector

    Ohi, J. [National Renewable Energy Lab., Golden, CO (United States)


    In FY96, NREL was asked to coordinate all analysis tasks so that in FY97 these tasks will be part of an integrated analysis agenda that will begin to define a 5-15 year R&D roadmap and portfolio for the DOE Hydrogen Program. The purpose of the Supporting Analysis and Assessments task at NREL is to provide this coordination and conduct specific analysis tasks. One of these tasks is to prepare the Quality Metrics (QM) for the Program as part of the overall QM effort at DOE/EERE. The Hydrogen Program is one of 39 program planning units conducting QM, a process begun in FY94 to assess the benefits and costs of DOE/EERE programs. The purpose of QM is to inform decision-making during the budget formulation process by describing the expected outcomes of programs during the budget request process. QM is expected to establish a first step toward merit-based budget formulation and allow DOE/EERE to get "the most bang for its (R&D) buck." In FY96, NREL coordinated a QM team that prepared a preliminary QM for the utility market sector. In the electricity supply sector, the QM analysis shows hydrogen fuel cells capturing 5% (or 22 GW) of the total market of 390 GW of new capacity additions through 2020. Hydrogen consumption in the utility sector increases from 0.009 Quads in 2005 to 0.4 Quads in 2020. Hydrogen fuel cells are projected to displace over 0.6 Quads of primary energy in 2020. In future work, NREL will assess the market for decentralized, on-site generation; develop cost credits for distributed generation benefits (such as deferral of transmission and distribution investments and uninterruptible power service), for by-products such as heat and potable water, and for environmental benefits (reduction of criteria air pollutants and greenhouse gas emissions); compete different fuel cell technologies against each other for market share; and begin to address economic benefits, especially employment.

  2. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimate. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to lack of high quality data referred to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
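
    The three IOA pillars named in the abstract (product sales, stock and lifespan profiles) can be linked in a simple sales-lifespan model: units sold in year s are discarded in year t with a probability given by a lifespan distribution evaluated at age t − s. The sketch below uses a discretized Weibull lifespan; the sales figures and Weibull parameters are invented for illustration and are not from the paper's Dutch dataset.

```python
import math

# Units of a hypothetical appliance sold per year.
sales = {2010: 100, 2011: 120, 2012: 140, 2013: 150}

def weibull_cdf(t, shape=2.0, scale=8.0):
    """Weibull lifespan CDF: probability a unit is discarded by age t."""
    return 1 - math.exp(-((t / scale) ** shape)) if t > 0 else 0.0

def discard_prob(age):
    """Probability a unit is discarded at integer age (discretized Weibull)."""
    return weibull_cdf(age) - weibull_cdf(age - 1)

def waste_generated(year):
    """E-waste arising in a given year, summed over historical sales cohorts."""
    return sum(units * discard_prob(year - sold)
               for sold, units in sales.items() if sold < year)

print(round(waste_generated(2014), 2))
```

    The multivariate point of the paper is that sales, stock and lifespan data constrain each other through exactly this kind of relationship, so inconsistent datasets can be reconciled before estimation.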

  3. Developing an assessment scale for character. An exploratory factorial analysis

    Ionescu, D.


    Developing a character assessment scale is the author's more distal goal. In this paper I aim to present a sequence of this psychometric process, namely exploring the factorial structure of a character assessment scale. In order to achieve this aim, we first explored the psychological factors relevant for a moral character. We also explored the moral standards that are valued in the main life contexts of an individual: family, workplace, close relationships and public context. These theoretical endeavors were important for the item-writing process, as they provided the content of the scale. Furthermore, the item development phase was empirically supported through some piloting studies, which highlighted the direction of the scale to assess instances of moral character failure, generically recognized as proofs of a bad character. The present paper focuses on the results obtained after performing an exploratory factor analysis on a sample of 300 participants. The results suggest that the 21-item scale best fits a four-factor structure that cumulatively explains 42.45% of the variance. The factors are: evilness, ill-tempered behavior, dishonesty, and upstartness. The scale reveals the moral profile of an individual in all four life contexts.

  4. Risk assessment of groundwater pollution using sensitivity analysis and a worst-case scenario analysis

    Huysmans, Marijke; Madarasz, Tamas; Dassargues, Alain


    This paper illustrates how sensitivity analysis and a worst-case scenario analysis can be useful tools in risk assessment of groundwater pollution. The approach is applied to a study area in Hungary with several known groundwater pollution sources and nearby drinking water production wells. The main concern is whether the contamination sources threaten the drinking water wells of the area. A groundwater flow and transport model is set up to answer this question. Due to limited data availabili...

  5. Assessing microstructures of pyrrhotites in basalts by multifractal analysis

    S. Xie


    Understanding and describing spatial arrangements of mineral particles and determining the mineral distribution structure are important to model the rock-forming process. Geometric properties of individual mineral particles can be estimated from thin sections, and different models have been proposed to quantify the spatial complexity of mineral arrangement. The Gejiu tin-polymetallic ore-forming district, located in Yunnan province, southwestern China, is chosen as the study area. The aim of this paper is to apply fractal and multifractal analysis to quantify distribution patterns of pyrrhotite particles from twenty-eight binary images obtained from seven basalt segments and then to discern the possible petrological formation environments of the basalts based on concentrations of trace elements. The areas and perimeters of pyrrhotite particles were measured for each image. Perimeter-area fractal analysis shows that the perimeter and area of pyrrhotite particles follow a power-law relationship, which implies the scale-invariance of the shapes of the pyrrhotites. Furthermore, the spatial variation of the pyrrhotite particles in space was characterized by multifractal analysis using the method of moments. The results show that the average values of the perimeter-area exponent (D_AP), the widths of the multifractal spectra (Δ(D(0)−D(2)) and Δ(D(qmin)−D(qmax))) and the multifractality index (τ″(1)) for the pyrrhotite particles reach their minimum in the second basalt segment, which implies that the spatial arrangement of pyrrhotite particles in Segment 2 is less heterogeneous. Geochemical trace element analysis results distinguish the second basalt segment sample from other basalt samples. In this aspect, the fractal and multifractal analysis may provide new insights into the quantitative assessment of mineral microstructures which may be closely associated with the petrogenesis as shown by the
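
    The perimeter-area power law P ∝ A^(D_AP/2) can be estimated with an ordinary least-squares fit in log-log space. The sketch below does this on synthetic data: the areas and the exponent are invented for illustration, so the fit simply recovers the exponent that generated the data.

```python
import math

# Synthetic particle data following an exact power law P = 3 * A**0.65,
# i.e. log(P) = log(3) + (D_AP / 2) * log(A) with D_AP = 1.3.
areas      = [10, 40, 90, 160, 250]
perimeters = [3.0 * a ** 0.65 for a in areas]

# Least-squares slope of log(P) versus log(A).
xs = [math.log(a) for a in areas]
ys = [math.log(p) for p in perimeters]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

d_ap = 2 * slope            # perimeter-area exponent D_AP
print(round(d_ap, 3))       # recovers 2 * 0.65 = 1.3 for this synthetic data
```

    On real thin-section measurements the points scatter about the fitted line, and the goodness of fit itself indicates how well the scale-invariance assumption holds.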

  6. A comparison of integrated safety analysis and probabilistic risk assessment

    The U.S. Nuclear Regulatory Commission conducted a comparison of two standard tools for risk informing the regulatory process, namely, the Probabilistic Risk Assessment (PRA) and the Integrated Safety Analysis (ISA). PRA is a calculation of risk metrics, such as Large Early Release Frequency (LERF), and has been used to assess the safety of all commercial power reactors. ISA is an analysis required for fuel cycle facilities (FCFs) licensed to possess potentially critical quantities of special nuclear material. A PRA is usually more detailed and uses more refined models and data than an ISA, in order to obtain reasonable quantitative estimates of risk. PRA is considered fully quantitative, while most ISAs are typically only partially quantitative. The extension of PRA methodology to augment or supplant ISAs in FCFs has long been considered. However, fuel cycle facilities have a wide variety of possible accident consequences, rather than a few surrogates like LERF or core damage as used for reactors. It has been noted that a fuel cycle PRA could be used to better focus attention on the most risk-significant structures, systems, components, and operator actions. ISA and PRA both identify accident sequences; however, their treatment is quite different. ISA's identify accidents that lead to high or intermediate consequences, as defined in 10 Code of Federal Regulations (CFR) 70, and develop a set of Items Relied on For Safety (IROFS) to assure adherence to performance criteria. PRAs identify potential accident scenarios and estimate their frequency and consequences to obtain risk metrics. It is acceptable for ISAs to provide bounding evaluations of accident consequences and likelihoods in order to establish acceptable safety; but PRA applications usually require a reasonable quantitative estimate, and often obtain metrics of uncertainty. This paper provides the background, features, and methodology associated with the PRA and ISA. The differences between the

  7. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P


    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, a screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at ...). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: lack of mastery of the tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  8. Analysis of complete logical structures in system reliability assessment

    The application field of the fault-tree techniques has been explored in order to assess whether the AND-OR structures covered all possible actual binary systems. This resulted in the identification of various situations requiring the complete AND-OR-NOT structures for their analysis. We do not use the term non-coherent for such cases, since the monotonicity or not of a structure function is not a characteristic of a system, but of the particular top event being examined. The report presents different examples of complete fault-trees, which can be examined according to different degrees of approximation. In fact, the exact analysis for the determination of the smallest irredundant bases is very time consuming and actually necessary only in some particular cases (multi-state systems, incidental situations). Therefore, together with the exact procedure, the report shows two different methods of logical analysis that permit the reduction of complete fault-trees to AND-OR structures. Moreover, it discusses the problems concerning the evaluation of the probability distribution of the time to first top event occurrence, once the hypothesis of structure function monotonicity is removed.

  9. Time-dependent reliability analysis and condition assessment of structures

    Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)


    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.
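
    The time-dependent reliability framework described above can be illustrated with a minimal Monte Carlo sketch, in which a structure's resistance degrades linearly over its service life while it faces a random annual peak load. The distributions, degradation rate and load parameters below are illustrative assumptions, not values from the report.

```python
import random

random.seed(42)

def failure_probability(years, trials=100_000):
    """Monte Carlo estimate of cumulative failure probability for a
    structure whose resistance degrades linearly with age.
    All distributions and rates here are illustrative assumptions."""
    failures = 0
    for _ in range(trials):
        r0 = random.lognormvariate(0.0, 0.1) * 100.0  # initial resistance (assumed)
        degradation = 0.005                            # 0.5% strength loss/year (assumed)
        failed = False
        for t in range(1, years + 1):
            resistance = r0 * (1.0 - degradation * t)
            load = random.gauss(60.0, 10.0)            # annual peak load (assumed)
            if load > resistance:
                failed = True
                break
        failures += failed
    return failures / trials

# Cumulative failure probability grows with service life as strength degrades.
print(failure_probability(10), failure_probability(40))
```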

  10. A Comparative Analysis of Privacy Impact Assessment in Six Countries

    David Wright


    Full Text Available The European Commission is revising the EU's data protection framework. One of the changes concerns privacy impact assessment (PIA). This paper argues that the European Commission and the EU Member States should draw on the experience of other countries that have adopted PIA policies and methodologies to construct their own frameworks. There are similarities and differences in the approaches of Australia, Canada, Ireland, New Zealand, the UK and the US, the countries with the most experience in PIA. Each has its strong points, but also shortcomings. Audits have identified some of the latter in the case of Canada. This paper provides a comparative analysis of the six countries to identify some of the best elements that could be used to improve Article 33 of the European Commission's proposed Data Protection Regulation.

  11. New challenges on uncertainty propagation assessment of flood risk analysis

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés


    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience on the propagation of uncertainties along the flood risk assessment chain. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding the extent to which uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much a proper flood risk analysis may vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic form. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of traditional analysis.
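
    The propagation step can be sketched without specialized PCT libraries by plain Monte Carlo sampling, the simplest of the methods named above. The rainfall distribution, the toy depth and damage relations and the 10% model error below are all hypothetical assumptions, chosen only to show how input uncertainty travels through the chain from rainfall to damage.

```python
import random

random.seed(0)

def propagate(trials=50_000):
    """Propagate input uncertainty through a toy flood-risk chain:
    rainfall -> flood depth -> damage. All relations are illustrative."""
    damages = []
    for _ in range(trials):
        rainfall = random.gauss(100.0, 15.0)      # mm, gauge uncertainty (assumed)
        depth = max(0.0, 0.02 * rainfall - 1.0)   # m, toy hydraulic relation
        depth *= random.gauss(1.0, 0.10)          # hydraulic model error (assumed 10%)
        damage = 1000.0 * min(depth, 2.0)         # toy depth-damage curve, capped
        damages.append(damage)
    mean = sum(damages) / trials
    var = sum((d - mean) ** 2 for d in damages) / trials
    return mean, var ** 0.5

# The spread of the output distribution quantifies the propagated uncertainty.
mean_damage, sd_damage = propagate()
```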

  12. Cyber threat impact assessment and analysis for space vehicle architectures

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.


    This paper covers research into an assessment of potential impacts and techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers and/or supply chain threats, can be controlled in a manner to cause physical damage to the space platforms. Similar attacks on Earth-borne cyber physical systems include the Shamoon, Duqu, Flame and Stuxnet exploits. These have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools, for example STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model that has been developed for performing the impact assessment. The paper walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.

  13. Potential Improvements in Human Reliability Analysis for Fire Risk Assessments

    The results of numerous fire risk assessments (FRA) and the experience gained from actual fire events have shown that fire can be a significant contributor to nuclear power plant (NPP) risk. However, on the basis of reviews of the FRAs performed for the Individual Plant External Events Examination (IPEEE) program in the U.S. and on recent research performed by U.S. Nuclear Regulatory Commission (NRC) to support increased use of risk information in regulatory decision making [e.g., Ref. 1, 2], it has become clear that improved modelling and quantification of human performance during fire events requires a better treatment of the special environment and response context produced by fires. This paper describes fire-related factors that have been identified as potentially impacting human performance, discusses to what extent such factors were modelled in the IPEEE FRAs, discusses prioritization of the factors likely to be most important to a realistic assessment of plant safety, and discusses which factors are likely to need additional research and development in order to allow adequate modelling in the human reliability analysis (HRA) portions of FRAs. The determination of which factors need to be modelled and the improvement of HRA related approaches for modelling such factors are critical aspects of the NRC's plan to improve FRA methods, tools, and data and to update a number of existing FRAs. (authors)

  14. Nutritional assessment and eating habits analysis in young adults.

    Nieradko-Iwanicka, Barbara; Borzecki, Andrzej


    Good eating habits are an essential part of a healthy lifestyle and help prevent civilisation diseases. BMI and eating-plan analysis are useful in an individual's nutritional assessment. The aim of the study was to assess nutritional status and eating habits in young adults. The average BMI was 23.63 kg/m2 in the interviewed men and 20.6 kg/m2 in the women. The average caloric value of the daily eating plans was 2943 kcal in men and 2272 kcal in women. Four people were on diets, but none had a BMI over 25 kg/m2. No subjects suffered from food allergies or gastrointestinal diseases. Only one male did sports (weight-lifting) regularly. The majority of the students ate at lunchtime at the university cafeteria or prepared meals themselves. The eating plans varied greatly: most were based on the Eating Guide Pyramid and consisted of three balanced meals during the daytime, but there were also single cases where students mostly ate high-calorie meals at night. PMID:16146124

  15. Assessment of water quality of Buna River using microbiological analysis



    Full Text Available The Buna River is situated near Shkodra town, between the hill of Rozafa castle and Taraboshi Mountain. It is the only emissary of Shkodra Lake. The Buna River is exposed to different sources of pollution related to urban pollution, sewerage discharge, agricultural activity, and climate change, which are associated with increases in water levels, erosion and floods. This research assesses the quality of water in the Buna River based on microbiological and physical-chemical analysis. Samples were taken at three different points during the years 2013-2014. The analysis provides data on heterotrophic and fecal coliforms (general characteristics, counts, and their role as indicators of water pollution) as well as on the pH, conductivity and temperature of the water. Microbiological contamination tests show relatively high water contamination, especially at the first sampling point, where the Buna River begins. The high-level presence of these microorganisms indicates that the water quality of the river is poor according to standards, presenting a health risk for all the organisms that inhabit the fresh waters of the Buna River.

  16. Integrating multicriteria evaluation and stakeholders analysis for assessing hydropower projects

    The use of hydroelectric potential and the protection of the river ecosystem are two contrasting aspects that arise in the management of the same resource, generating conflicts between different stakeholders. The purpose of the paper is to develop a multi-level decision-making tool able to support energy planning, with specific reference to the construction of hydropower plants in mountain areas. Starting from a real-world problem concerning the basin of the Sesia Valley (Italy), an evaluation framework based on the combined use of Multicriteria Evaluation and Stakeholders Analysis is proposed in the study. The results of the work show that the methodology is able to support participatory decisions through a traceable and transparent multi-stakeholder assessment process, to highlight the important elements of the decision problem and to support the definition of future design guidelines. - Highlights: • The paper concerns a multi-level decision-making tool able to support energy planning. • The evaluation framework is based on the use of AHP and Stakeholders Analysis. • Hydropower projects in the Sesia Valley (Italy) are evaluated and ranked in the study. • Environmental, economic, technical and sociopolitical criteria have been considered. • 42 stakeholder groups have been included in the evaluation

  17. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.


    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on counts of inhabitants or inhabitant density, applied over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units with a predominance of rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are problematic when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and following three main steps: i) the landslide susceptibility assessment based on statistical models independently validated; ii) the evaluation of population distribution (absolute and density) for different administrative territorial units (Parishes and BGRI - the basic statistical unit in the Portuguese Census); and iii) the dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs more than two times depending on the application of the traditional approach or the dasymetric
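
    The building areal-weighting step described above can be sketched in a few lines: a census unit's population is redistributed to its buildings in proportion to footprint area. The parish size and building areas below are hypothetical.

```python
def dasymetric_population(unit_population, building_areas):
    """Disaggregate a census unit's population to its buildings in
    proportion to building footprint area (simple areal weighting).
    A minimal sketch of the dasymetric step, not the study's full workflow."""
    total_area = sum(building_areas.values())
    if total_area == 0:
        return {b: 0.0 for b in building_areas}
    return {b: unit_population * area / total_area
            for b, area in building_areas.items()}

# A hypothetical parish of 300 inhabitants with three buildings:
pop = dasymetric_population(300, {"A": 100.0, "B": 150.0, "C": 50.0})
# pop == {"A": 100.0, "B": 150.0, "C": 50.0}; totals are preserved.
```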

  18. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.


    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities and extensive land drainage practices significantly change hydromorphological behaviour and alter sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics but typically on a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data and qualitative hysteresis metrics enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km2) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, and are broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable and high-permeability supporting arable. Suspended sediment concentration (using calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), numeric hysteresis index, and potential hydro-meteorological controls on sediment transport including precipitation amount, duration, intensity, stream flow and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index. Results showed the numeric
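
    A simple mid-discharge hysteresis index of the kind used in such event databases can be sketched as follows: sediment concentration is interpolated on the rising and falling limbs at the discharge halfway between event minimum and peak, and the ratio of the two indicates hysteresis direction. This is a generic Lawler-type index, not necessarily the exact metric of the study, and the event data below are invented.

```python
def hysteresis_index(discharge, sediment):
    """Mid-discharge hysteresis index: compare sediment concentration on
    the rising and falling limbs at the discharge halfway between event
    minimum and peak. Positive values indicate clockwise hysteresis."""
    peak = discharge.index(max(discharge))
    q_mid = (min(discharge) + max(discharge)) / 2.0

    def conc_at(qs, cs):
        # Linear interpolation of concentration at q_mid along one limb.
        for (q1, c1), (q2, c2) in zip(zip(qs, cs), zip(qs[1:], cs[1:])):
            lo, hi = sorted((q1, q2))
            if lo <= q_mid <= hi and q1 != q2:
                return c1 + (c2 - c1) * (q_mid - q1) / (q2 - q1)
        return None

    c_rise = conc_at(discharge[:peak + 1], sediment[:peak + 1])
    c_fall = conc_at(discharge[peak:], sediment[peak:])
    if c_rise is None or c_fall is None or c_fall == 0:
        return None
    return c_rise / c_fall - 1.0

# Invented clockwise event: higher concentrations on the rising limb.
q = [1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0]
c = [10.0, 40.0, 60.0, 50.0, 30.0, 20.0, 10.0]
```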

  19. Imminent Cardiac Risk Assessment via Optical Intravascular Biochemical Analysis

    Wetzel, D.; Wetzel, L; Wetzel, M; Lodder, R


    Heart disease is by far the biggest killer in the United States, and type II diabetes, which affects 8% of the U.S. population, is on the rise. In many cases, the acute coronary syndrome and/or sudden cardiac death occurs without warning. Atherosclerosis has known behavioral, genetic and dietary risk factors. However, our laboratory studies with animal models and human post-mortem tissue using FT-IR microspectroscopy reveal the chemical microstructure within arteries and in the arterial walls themselves. These include spectra obtained from the aortas of ApoE-/- knockout mice on sucrose and normal diets showing lipid deposition in the former case. Also pre-aneurysm chemical images of knockout mouse aorta walls, and spectra of plaque excised from a living human patient are shown for comparison. In keeping with the theme of the SPEC 2008 conference, Spectroscopic Diagnosis of Disease, this paper describes the background and potential value of a new catheter-based system to provide in vivo biochemical analysis of plaque in human coronary arteries. We report the following: (1) results of FT-IR microspectroscopy on animal models of vascular disease to illustrate the localized chemical distinctions between pathological and normal tissue, (2) current diagnostic techniques used for risk assessment of patients with potential unstable coronary syndromes, and (3) the advantages and limitations of each of these techniques, illustrated with patient care histories, related in the first person, by the physician coauthors. Note that the physician comments clarify the contribution of each diagnostic technique to imminent cardiac risk assessment in a clinical setting, leading to the appreciation of what localized intravascular chemical analysis can contribute as an add-on diagnostic tool. The quality of medical imaging has improved dramatically since the turn of the century.
Among clinical non-invasive diagnostic tools, laboratory tests of body fluids, EKG, and physical examination are

  20. An improved rank assessment method for weibull analysis of reliability data

    Weibull analysis has been applied widely in reliability data analysis. Rank assessment is one of the key steps in Weibull analysis and a principal source of error. An improved median rank function obtained by genetic algorithms is presented to reduce the errors of rank assessment. (authors)
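
    A common concrete choice of median rank function is Bernard's approximation, which the sketch below combines with a least-squares fit on linearized Weibull coordinates. The failure times are invented, and the genetic-algorithm refinement proposed by the authors is not reproduced here.

```python
import math

def median_ranks(n):
    """Bernard's approximation to the median rank of failure i of n,
    the standard plotting position used in Weibull analysis."""
    return [(i - 0.3) / (n + 0.4) for i in range(1, n + 1)]

def weibull_fit(times):
    """Least-squares Weibull fit on linearized coordinates:
    ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)."""
    times = sorted(times)
    n = len(times)
    xs = [math.log(t) for t in times]
    ys = [math.log(-math.log(1.0 - f)) for f in median_ranks(n)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    beta = num / den                      # shape parameter
    eta = math.exp(xbar - ybar / beta)    # scale parameter
    return beta, eta

# Invented failure times (hours) for five units:
beta, eta = weibull_fit([120.0, 190.0, 250.0, 300.0, 420.0])
```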

  1. Analysis of existing risk assessments, and list of suggestions

    Heimsch, Laura


    The scope of this project was to analyse risk assessments made at CERN, extracting crucial information about the different methodologies used and the profiles of the people who make the risk assessments, and determining whether a risk matrix was used and whether the acceptable level of risk was defined. The second step of the project was to trigger discussion inside HSE about risk assessment by suggesting a risk matrix and a risk assessment template.

  2. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Edwards, Michelle


    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  3. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    Alha, Katariina


    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  4. Flood Risk Analysis and Flood Potential Losses Assessment


    The heavy floods in the Taihu Basin showed an increasing trend in recent years. In this work, a typical area in the northern Taihu Basin was selected for flood risk analysis and potential flood losses assessment. Human activities have a strong impact on the study area's flood situation (as affected by the polders built, deforestation, population increase, urbanization, etc.), and have made water levels higher, flood durations shorter, and flood peaks sharper. Five years of different flood return periods [(1970), 5 (1962), 10 (1987), 20 (1954), 50 (1991)] were used to calculate the potential flood risk area and its losses. The potential flood risk map, economic losses, and flood-impacted population were also calculated. The study's main conclusions are: 1) Human activities have strongly changed the natural flood situation in the study area, increasing runoff and flooding; 2) The flood risk area is closely related to the precipitation center; 3) Polder construction has successfully protected land from flood, shortened the flood duration, and elevated water levels in rivers outside the polders; 4) Economic and social development have caused flood losses to increase in recent years.

  5. RWMC Performance Assessment/Composite Analysis Monitoring Report - FY-2002

    US DOE Order 435.1, Radioactive Waste Management, Chapter IV and the associated implementation manual and guidance require monitoring of low-level radioactive waste (LLW) disposal facilities. The Performance Assessment/Composite Analysis (PA/CA) Monitoring program was developed and implemented to meet this requirement. This report represents the results of PA/CA monitoring projects that are available as of September 2002. The technical basis for the PA/CA program is provided in the PA/CA Monitoring Program document and a program description document (PDD) serves as the quality assurance project plan for implementing the PM program. Subsurface monitoring, air pathway surveillance, and subsidence monitoring/control are required to comply with DOE Order 435.1, Chapter IV. Subsidence monitoring/control and air pathway surveillance are performed entirely by other INEEL programs - their work is summarized herein. Subsurface monitoring includes near-field (source) monitoring of buried activated beryllium and steel, monitoring of groundwater in the vadose zone, and monitoring of the Snake River Plain Aquifer. Most of the required subsurface monitoring information presented in this report was gathered from the results of ongoing INEEL monitoring programs. This report also presents results for several new monitoring efforts that have been initiated to characterize any migration of radionuclides in surface sediment near the waste

  6. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    The aim of this thesis is to propose an approach to model parameter and model uncertainties affecting the results of risk indicators used in the applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory is proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data available to the PRA model. The second step propagates parameter uncertainties through the risk model to obtain the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and secondly to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)
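
    The first step, representing a parameter by belief and plausibility functions, can be sketched with a small Dempster-Shafer mass function. The three-state frame and the mass values below are hypothetical.

```python
def belief_plausibility(masses, event):
    """Belief and plausibility of an event under a Dempster-Shafer mass
    function. `masses` maps focal sets (frozensets) to their mass.
    Bel(A) sums masses of subsets of A; Pl(A) sums masses of focal sets
    that intersect A, so Bel(A) <= Pl(A) brackets the true probability."""
    bel = sum(m for s, m in masses.items() if s <= event)
    pl = sum(m for s, m in masses.items() if s & event)
    return bel, pl

# Hypothetical mass function over failure-rate states {low, med, high}:
m = {frozenset({"low"}): 0.5,
     frozenset({"low", "med"}): 0.3,
     frozenset({"low", "med", "high"}): 0.2}
bel, pl = belief_plausibility(m, frozenset({"low", "med"}))
# bel = 0.8 (masses of {low} and {low, med}); pl = 1.0 (all sets intersect).
```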

  7. Assessment of gene set analysis methods based on microarray data.

    Alavi-Majd, Hamid; Khodakarim, Soheila; Zayeri, Farid; Rezaei-Tavirani, Mostafa; Tabatabaei, Seyyed Mohammad; Heydarpour-Meymeh, Maryam


    Gene set analysis (GSA) incorporates biological information into statistical knowledge to identify gene sets differentially expressed between two or more phenotypes. It allows us to gain insight into the functional working mechanism of cells beyond the detection of differentially expressed gene sets. In order to evaluate the competence of GSA approaches, three self-contained GSA approaches with different statistical methods were chosen (Category, Globaltest and Hotelling's T(2)), and their power to identify expression differences was assayed via simulation and real microarray data. Category does not take the correlation structure into account, while the other two deal with correlations. R and Bioconductor were used to perform these methods, and venous thromboembolism and acute lymphoblastic leukemia microarray data were applied. The results of the three GSAs showed that the competence of these methods depends on the distribution of gene expression in a dataset. It is very important to assess the distribution of gene expression data before choosing a GSA method to identify gene sets differentially expressed between phenotypes. On the other hand, assessment of common genes among significant gene sets indicated significant agreement between the results of GSA and the findings of biologists. PMID:24012817
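
    Hotelling's T(2), one of the three methods compared, reduces for a two-gene set to the classic two-sample statistic sketched below in pure Python. The expression values are invented, and a real GSA would operate on full expression matrices with many more genes per set.

```python
def hotelling_t2(group_a, group_b):
    """Two-sample Hotelling's T^2 for a gene set of two genes: a
    multivariate test of differential expression between phenotypes.
    Pure-Python sketch restricted to p = 2 dimensions."""
    def mean(rows):
        n = len(rows)
        return [sum(r[j] for r in rows) / n for j in range(2)]

    def pooled_cov(a, b, ma, mb):
        # Pooled within-group covariance with n_a + n_b - 2 d.o.f.
        n = len(a) + len(b) - 2
        s = [[0.0, 0.0], [0.0, 0.0]]
        for rows, m in ((a, ma), (b, mb)):
            for r in rows:
                d = [r[0] - m[0], r[1] - m[1]]
                for i in range(2):
                    for j in range(2):
                        s[i][j] += d[i] * d[j]
        return [[s[i][j] / n for j in range(2)] for i in range(2)]

    ma, mb = mean(group_a), mean(group_b)
    s = pooled_cov(group_a, group_b, ma, mb)
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    d = [ma[0] - mb[0], ma[1] - mb[1]]
    na, nb = len(group_a), len(group_b)
    quad = sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))
    return (na * nb) / (na + nb) * quad

# Invented two-gene expression values for two phenotypes:
t2 = hotelling_t2(
    [[1.0, 2.0], [2.0, 2.5], [1.5, 3.0], [2.5, 3.5]],   # phenotype A
    [[5.0, 6.0], [6.0, 6.3], [5.5, 7.0], [6.5, 7.4]],   # phenotype B
)
# A large t2 flags the gene set as differentially expressed.
```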

  8. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Khoshaim, Heba Bakr; Rashid, Saima


    Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic…

  9. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    Dillstroem, Peter; Andersson, Magnus; Sattari-Far, Iradj; Weilin Zang (Inspecta Technology AB, Stockholm (Sweden))


    The main purpose of this work is to investigate the significance of the residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures such as the ASME XI code and the R6-method is believed to be conservative for ductile materials. This is because of the general approach not to account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses in general is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here. This is done by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied here by using the finite element method (FEM): (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe; (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe; (iii) 3D modelling of an X-joint weld in a thick-walled pipe. Each crack configuration is analysed for two load cases: (1) only primary (mechanical) loading is applied to the model; (2) both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed. 
A new

  10. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    The main purpose of this work is to investigate the significance of the residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures such as the ASME XI code and the R6-method is believed to be conservative for ductile materials. This is because of the general approach not to account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses in general, though, is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here. This is done by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied here by using the finite element method (FEM). (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe. (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe. (iii) 3D modelling of a X-joint weld in a thick-walled pipe. t. Each crack configuration is analysed for two load cases; (1) Only primary (mechanical) loading is applied to the model, (2) Both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed. 


    Sang Putu Kaler Surata


    Abstract: Social Network Analysis for Assessing Social Capital in Biosecurity Ecoliteracy. Biosecurity ecoliteracy (BEL) is a view of literacy that applies ecological concepts to promote in-depth understanding, critical reflection, creative thinking, self-consciousness, communication and social skills in analyzing and managing issues around plant health, animal health, and the risks associated with the environment. We used social network analysis (SNA) to evaluate two distinct forms of social capital in BEL: social cohesion and network structure. The study employed cooperative learning in BEL with 30 undergraduate teacher-training students. Data were then analyzed using UCINET software. We found a tendency for social cohesion to increase after students participated in BEL. This was supported by several SNA measures (density, closeness and degree), whose values at the end of BEL differed statistically from those at the beginning. The social structure map (sociogram) after BEL showed that students were much more likely to cluster in groups than in the sociogram before BEL. Thus BEL, through cooperative learning, was able to promote social capital. In addition, SNA proved a useful tool for evaluating the levels of social capital achieved in BEL in the form of network cohesion and network structure.
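The density and degree measures reported in this record can be reproduced with common network-analysis tooling. A minimal sketch follows; the study used UCINET, so networkx and the toy before/after edge lists here are purely illustrative assumptions:

```python
import networkx as nx

# Hypothetical student-interaction edges before and after the intervention.
before = nx.Graph([(1, 2), (2, 3), (4, 5)])
after = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 5), (5, 1), (2, 5)])

for label, g in [("before", before), ("after", after)]:
    density = nx.density(g)  # fraction of possible ties actually present
    avg_degree = sum(dict(g.degree()).values()) / g.number_of_nodes()
    closeness = nx.closeness_centrality(g)  # per-node closeness scores
    print(label, round(density, 2), round(avg_degree, 2))
```

With these toy graphs, density rises from 0.3 to 0.6 and average degree from 1.2 to 2.4, the kind of before/after shift the record describes.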

  12. Modeling the Assessment of Agricultural Enterprises Headcount Analysis

    Tatyana Viatkina


    The modern procedures for assessing enterprises' labour resources have been analyzed. An algorithm is provided for calculating the enterprise performance-potential efficiency ratio and for assessing the performance potential of the enterprise based on quantitative and qualitative characteristics. The model for assessing the effectiveness of labour management of an enterprise, branch or region, subject to such factors as motivation, labour expenses, staff rotation and qualifications, has be...

  13. HPLC analysis and safety assessment of coumarin in foods.

    Sproll, Constanze; Ruge, Winfried; Andlauer, Claudia; Godelmann, Rolf; Lachenmeier, Dirk W


    Coumarin is a component of natural flavourings including cassia, which is widely used in foods and pastries. The toxicity of coumarin has raised some concerns, and food safety authorities have set a maximum limit of 2 mg/kg for foods and beverages in general, and a maximum level of 10 mg/l for alcoholic beverages. An efficient method for routine analysis of coumarin is liquid chromatography with diode array detection. The optimal sample preparation for foods containing cinnamon was investigated and found to be cold extraction of a 15 g sample with 50 mL of methanol (80%, v/v) for 30 min using magnetic stirring. In the foods under investigation, appreciable amounts of coumarin were found in bakery products and breakfast cereals (mean 9 mg/kg), with the highest concentrations up to 88 mg/kg in certain cookies flavoured with cinnamon. Other foods such as liqueurs, vodka, mulled wine, and milk products did not have coumarin concentrations above the maximum level. The safety assessment of coumarin-containing foods, in the context of governmental food controls, is complicated, as a toxicological basis for the maximum limits appears to be missing. The limits were derived at a time when a genotoxic mechanism was assumed. However, this has since been disproven in more recent studies. Our exposure data on coumarin in bakery products show that there is still a need for a continued regulation of coumarin in foods. A toxicological re-evaluation of coumarin with the aim to derive scientifically founded maximum limits should be conducted with priority. PMID:26003373

  14. Sampling and Analysis for Assessment of Body Burdens

    A review of sampling criteria and techniques and of sample processing methods for indirect assessment of body burdens is presented. The text is limited to the more recent developments in the field of bioassay and to the nuclides which cannot be readily determined in the body directly. A selected bibliography is included. The planning of a bioassay programme should emphasize the detection of high or unusual exposures and the concentrated study of these cases when detected. This procedure gives the maximum amount of data for the dosimetry of individuals at risk and also adds to our scientific background for an understanding of internal emitters. Only a minimum of effort should be spent on sampling individuals having had negligible exposure. The chemical separation procedures required for bioassay also fall into two categories. The first is the rapid method, possibly of low accuracy, used for detection. The second is the more accurate method required for study of the individual after detection of the exposure. Excretion, whether exponential or a power function, drops off rapidly. It is necessary to locate the exposure in time before any evaluation can be made, even before deciding if the exposure is significant. One approach is frequent sampling and analysis by a quick screening technique. More commonly, samples are collected at longer intervals and an arbitrary level of re-sampling is set to assist in the detection of real exposures. It is probable that too much bioassay effort has gone into measurements on individuals at low risk and not enough on those at higher risk. The development of bioassay procedures for overcoming this problem has begun, and this paper emphasizes this facet of sampling and sample processing. (author)
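The record's point that the exposure must be located in time before any evaluation can be made is easy to make concrete. Assuming a hypothetical power-function excretion model e(t) = A·t^(−b), the fraction of intake excreted per day on day t after exposure, the intake implied by a fixed bioassay measurement changes by orders of magnitude with the assumed exposure date (A and b below are illustrative, not nuclide-specific values):

```python
# Hypothetical power-function excretion model: e(t) = A * t**(-b),
# the fraction of intake excreted per day on day t after exposure.
A, b = 0.05, 1.2

def implied_intake(measured_bq_per_day, days_since_exposure):
    """Intake consistent with a measured daily excretion at a given time."""
    return measured_bq_per_day / (A * days_since_exposure ** (-b))

# The same 5 Bq/day urine measurement implies very different intakes
# depending on when the exposure is assumed to have occurred:
for t in (1, 10, 100):
    print(t, round(implied_intake(5.0, t), 1))
```

At t = 1 day the implied intake is 100 Bq; at t = 100 days it is over two hundred times larger, which is why a quick screening result cannot be dosimetrically evaluated without dating the exposure.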

  15. Assessment of academic departments efficiency using data envelopment analysis

    Salah R. Agha


    Purpose: In this age of knowledge economy, universities play an important role in the development of a country. As government subsidies to universities have been decreasing, more efficient use of resources becomes important for university administrators. This study evaluates the relative technical efficiencies of academic departments at the Islamic University in Gaza (IUG) during the years 2004-2006. Design/methodology/approach: This study applies Data Envelopment Analysis (DEA) to assess the relative technical efficiency of the academic departments. The inputs are operating expenses, credit hours and training resources, while the outputs are number of graduates, promotions and public service activities. The potential improvements and super efficiency are computed for inefficient and efficient departments respectively. Further, multiple linear regression is used to develop a relationship between super efficiency and input and output variables. Findings: Results show that the average efficiency score is 68.5% and that there are 10 efficient departments out of the 30 studied. It is noted that departments in the faculty of science, engineering and information technology have to greatly reduce their laboratory expenses. The department of economics and finance was found to have the highest super efficiency score among the efficient departments. Finally, it was found that promotions have the greatest contribution to the super efficiency scores, while public service activities come next. Research limitations/implications: The paper focuses only on academic departments at a single university. Further, DEA is deterministic in nature. Practical implications: The findings offer insights on the inputs and outputs that significantly contribute to efficiencies so that inefficient departments can focus on these factors. Originality/value: Prior studies have used only one type of DEA (BCC) and they did not explicitly answer the question posed by the inefficient…
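The DEA computation this record describes reduces to one small linear program per department. The sketch below is the classic input-oriented CCR envelopment form solved with scipy; note the study itself used BCC and super-efficiency models, and the single input/output and the three toy departments here are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: inputs (m x n units), Y: outputs (s x n units).
    Solves: min theta s.t. sum_j lam_j x_j <= theta x_o, sum_j lam_j y_j >= y_o."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                      # minimize theta (first variable)
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, o]          # input rows: sum lam_j x_j - theta x_o <= 0
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y               # output rows: -sum lam_j y_j <= -y_o
    b_ub[m:] = -Y[:, o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

# Hypothetical data: 1 input (expenses), 1 output (graduates), 3 departments.
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[1.0, 2.0, 3.0]])
print([round(dea_ccr_efficiency(X, Y, o), 3) for o in range(3)])  # → [1.0, 1.0, 0.75]
```

Departments 0 and 1 lie on the efficient frontier (output/input ratio 0.5); department 2's ratio of 0.375 gives it an efficiency of 0.75, i.e. it could produce the same output with 75% of its input.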

  16. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Zimmermann, V.


    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important condition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by process-knowledge-based linkage structures, can be generated and also processed in connection with the knowledge on types of problems, areas of analysis and procedures to deal with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge, and the correspondingly processed complexity of models, are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  17. Life Cycle Assessment Software for Product and Process Sustainability Analysis

    Vervaeke, Marina


    In recent years, life cycle assessment (LCA), a methodology for assessment of environmental impacts of products and services, has become increasingly important. This methodology is applied by decision makers in industry and policy, product developers, environmental managers, and other non-LCA specialists working on environmental issues in a wide…


    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  19. Defining and Assessing Public Health Functions: A Global Analysis.

    Martin-Moreno, Jose M; Harris, Meggan; Jakubowski, Elke; Kluge, Hans


    Given the broad scope and intersectoral nature of public health structures and practices, there are inherent difficulties in defining which services fall under the public health remit and in assessing their capacity and performance. The aim of this study is to analyze how public health functions and practice have been defined and operationalized in different countries and regions around the world, with a specific focus on assessment tools that have been developed to evaluate the performance of essential public health functions, services, and operations. Our review has identified nearly 100 countries that have carried out assessments, using diverse analytical and methodological approaches. The assessment processes have evolved quite differently according to administrative arrangements and resource availability, but some key contextual factors emerge that seem to favor policy-oriented follow-up. These include local ownership of the assessment process, policymakers' commitment to reform, and expert technical advice for implementation. PMID:26789385

  20. Teacher Candidates Exposure to Formative Assessment in Educational Psychology Textbooks: A Content Analysis

    Wininger, Steven R.; Norman, Antony D.


    The purpose of this article is to define formative assessment, outline what is known about the prevalence of formative assessment implementation in the classroom, establish the importance of formative assessment with regards to student motivation and achievement, and present the results of a content analysis of current educational psychology…

  1. Assessing the Validity of Discourse Analysis: Transdisciplinary Convergence

    Jaipal-Jamani, Kamini


    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to…

  2. Radiological assessment. A textbook on environmental dose analysis

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides

  3. Radiological assessment. A textbook on environmental dose analysis

    Till, J.E.; Meyer, H.R. (eds.)


    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  4. Fuzzy sensitivity analysis for reliability assessment of building structures

    Kala, Zdeněk


    The mathematical concept of fuzzy sensitivity analysis, which studies the effects of the fuzziness of input fuzzy numbers on the fuzziness of the output fuzzy number, is described in the article. The output fuzzy number is evaluated using Zadeh's general extension principle. The contribution of stochastic and fuzzy uncertainty in reliability analysis tasks of building structures is discussed. The algorithm of fuzzy sensitivity analysis is an alternative to stochastic sensitivity analysis in tasks in which input and output variables are considered as fuzzy numbers.
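The extension-principle evaluation this record mentions is commonly implemented via alpha-cuts: each alpha-level interval of the output is obtained from the corresponding intervals of the inputs. A minimal sketch for the product of two triangular fuzzy numbers follows; the triangular inputs (a fuzzy load and a fuzzy model factor) are invented, and evaluating only the corners of the input box is a shortcut that is valid here because the product of positive numbers is monotone in each argument:

```python
def alpha_cut(tri, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (l, m, u)."""
    l, m, u = tri
    return (l + alpha * (m - l), u - alpha * (u - m))

def extension_principle(f, tri_a, tri_b, alphas):
    """Alpha-cuts of f(A, B) per Zadeh's extension principle, evaluating f
    only at the corners of each input alpha-cut box (sufficient for f
    monotone in each argument; a coarse sketch, not a general optimizer)."""
    cuts = []
    for alpha in alphas:
        a_lo, a_hi = alpha_cut(tri_a, alpha)
        b_lo, b_hi = alpha_cut(tri_b, alpha)
        vals = [f(a, b) for a in (a_lo, a_hi) for b in (b_lo, b_hi)]
        cuts.append((min(vals), max(vals)))
    return cuts

# Hypothetical fuzzy inputs (l, m, u): a load and a model factor.
load = (8.0, 10.0, 12.0)
factor = (0.9, 1.0, 1.1)
for alpha, cut in zip([0.0, 0.5, 1.0],
                      extension_principle(lambda a, b: a * b,
                                          load, factor, [0.0, 0.5, 1.0])):
    print(alpha, cut)
```

The output fuzzy number narrows from the support [7.2, 13.2] at alpha = 0 to the single modal value 10.0 at alpha = 1, which is exactly the "fuzziness of the output" the sensitivity analysis then attributes to the input fuzziness.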

  5. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and post-processor code USAMO. The submodels in the waste disposal system were described and coded with the specification of the exercise. Besides the results required for the exercise, further additional uncertainty and sensitivity analyses were performed and the details of these are also included. (author)
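The sampling-and-post-processing workflow the record describes (PREP for Monte Carlo sampling, USAMO for uncertainty and sensitivity analysis) can be illustrated in miniature. The distributions and the toy dose expression below are invented for the sketch and are not the PSACOIN Level 0 submodels:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical sampled input parameters of a simple release-and-dilution model.
leach_rate = rng.lognormal(mean=-7.0, sigma=0.5, size=n)
retardation = rng.uniform(1.0, 10.0, size=n)
dilution = rng.lognormal(mean=2.0, sigma=0.3, size=n)

# Toy performance measure: release attenuated by retardation and dilution.
dose = leach_rate / (retardation * dilution)

# Uncertainty analysis: percentiles of the output distribution.
p5, p50, p95 = np.percentile(dose, [5, 50, 95])

# Sensitivity analysis: rank correlation of each input with the output.
def rank_corr(x, y):
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

for name, x in [("leach_rate", leach_rate),
                ("retardation", retardation),
                ("dilution", dilution)]:
    print(name, round(rank_corr(x, dose), 2))
```

The percentile spread summarizes output uncertainty, and the rank correlations identify which inputs drive it (here the leach rate correlates positively with dose, the attenuating parameters negatively), the same two questions the PSACOIN exercises address.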


    Mohammad Hamad Allaymoun


    This paper presents research on using rhetorical structures for assessing collaborative processes in Computer-Supported Collaborative Learning (CSCL) chats. For this purpose, the ideas of Bakhtin's dialogism theory and Trausan-Matu's polyphonic model are used, starting from the identification of threads of repeated words in chats. Cue phrases and their usage in linking the identified threads are also considered. The results are presented in statistical tables and graphics that ease the understanding of the collaborative process, helping teachers to analyze and assess students' collaborative chats. It also allows students to know and understand the interactions and how they contribute to the conversation.
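The thread-identification step described here, finding words repeated across chat turns, can be sketched in a few lines. The four-turn chat and the stopword list below are invented for illustration and are far simpler than the paper's actual processing:

```python
from collections import defaultdict

# Hypothetical chat: (turn number, speaker, utterance).
chat = [
    (1, "Ana", "we could use a stack for the parser"),
    (2, "Ben", "a stack works fine push tokens then pop"),
    (3, "Ana", "but the stack may overflow"),
    (4, "Cruz", "therefore a queue is safer here"),
]

# Illustrative stopword list; content words repeated across turns form threads.
stopwords = {"we", "a", "the", "for", "then", "but", "may", "is",
             "here", "could", "use", "works", "fine"}

threads = defaultdict(list)
for turn, speaker, text in chat:
    for word in set(text.lower().split()):
        if word not in stopwords:
            threads[word].append(turn)

repeated = {w: turns for w, turns in threads.items() if len(turns) > 1}
print(repeated)  # → {'stack': [1, 2, 3]}
```

The word "stack" forms a thread spanning turns 1-3; cue phrases such as "therefore" in turn 4 would then be used, as the record notes, to link threads into the larger rhetorical structure.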

  7. No-Reference Video Quality Assessment using MPEG Analysis

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari


    … estimate the video coding parameters for MPEG-2 and H.264/AVC, which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is performed without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods for … MPEG-2 and H.264/AVC.

  8. Global Gene Expression Analysis for the Assessment of Nanobiomaterials.

    Hanagata, Nobutaka


    Using global gene expression analysis, the effects of biomaterials and nanomaterials can be analyzed at the genetic level. Even though information obtained from global gene expression analysis can be useful for the evaluation and design of biomaterials and nanomaterials, its use for these purposes is not widespread. This is due to the difficulties involved in data analysis. Because the expression data of about 20,000 genes can be obtained at once with global gene expression analysis, the data must be analyzed using bioinformatics. A method of bioinformatic analysis called gene ontology can estimate the kinds of changes on cell functions caused by genes whose expression level is changed by biomaterials and nanomaterials. Also, by applying a statistical analysis technique called hierarchical clustering to global gene expression data between a variety of biomaterials, the effects of the properties of materials on cell functions can be estimated. In this chapter, these theories of analysis and examples of applications to nanomaterials and biomaterials are described. Furthermore, global microRNA analysis, a method that has gained attention in recent years, and its application to nanomaterials are introduced. PMID:26201278
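The hierarchical-clustering step this record describes can be sketched with scipy. The toy expression matrix below, two groups of "materials" with shifted expression profiles, stands in for real microarray data (which would have on the order of 20,000 genes per sample):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Hypothetical expression matrix: 6 materials x 50 genes. Two groups of
# materials whose mean expression levels differ.
group_a = rng.normal(0.0, 1.0, size=(3, 50))
group_b = rng.normal(3.0, 1.0, size=(3, 50))
expr = np.vstack([group_a, group_b])

# Average-linkage hierarchical clustering on Euclidean distance,
# then cut the dendrogram into two clusters.
Z = linkage(expr, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

The first three rows and the last three rows fall into separate clusters, mirroring how the effects of different material properties on expression profiles are estimated by grouping materials with similar global responses.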

  9. Assessing SRI fund performance research : best practices in empirical analysis

    Chegut, Andrea; Schenk, H.; Scholtens, B.


    We review the socially responsible investment (SRI) mutual fund performance literature to provide best practices in SRI performance attribution analysis. Based on meta-ethnography and content analysis, five themes in this literature require specific attention: data quality, social responsibility ver…

  10. Environmental Impact Assessment for Socio-Economic Analysis of Chemicals

    Calow, Peter; Biddinger, G; Hennes, C

    This report describes the requirements for, and illustrates the application of, a methodology for a socio-economic analysis (SEA), especially as it might be adopted in the framework of REACH.


    Danijela Rabar


    In this paper, the regional efficiency of Croatian counties is measured over a three-year period (2005-2007) using Data Envelopment Analysis (DEA). The set of inputs and outputs consists of seven socioeconomic indicators. The analysis is carried out using models with the assumption of variable returns-to-scale. DEA identifies efficient counties as benchmark members and inefficient counties, which are analyzed in detail to determine the sources and amounts of their inefficiency. To enable proper monitoring of development dynamics, window analysis is applied. Based on the results, guidelines for implementing the improvements necessary to achieve efficiency are given. The analysis reveals great disparities among counties. In order to alleviate naturally, historically and politically conditioned unequal county positions, over which economic policy makers do not have total control, a categorical approach is introduced as an extension to the basic DEA models. This approach, combined with window analysis, changes the relations among efficiency scores in favor of continental counties.

  12. Analysis of the most widely used Building Environmental Assessment methods

    Building Environmental Assessment (BEA) is a term used for several methods for environmental assessment of the building environment. Generally, Life Cycle Assessment (LCA) is an important foundation and part of the BEA method, but current BEA methods form more comprehensive tools than LCA. Indicators and weight assignments are the two most important factors characterizing BEA. From the comparison of the three most widely used BEA methods, EcoHomes (BREEAM for residential buildings), LEED-NC and GBTool, it can be seen that BEA methods are shifting from ecological, indicator-based scientific systems to more integrated systems covering ecological, social and economic categories. Being relatively new methods, current BEA systems are far from perfect and are under continuous development. The further development of BEA methods will focus more on non-ecological indicators and how to promote implementation. Most BEA methods are developed based on regional regulations and LCA methods, but they do not attempt to replace these regulations. On the contrary, they try to extend implementation by incentive programmes. There are several ways to enhance BEA in the future: expand the studied scope from design levels to whole life-cycle levels of constructions, enhance international cooperation, accelerate legislation and standardize and develop user-oriented assessment systems

  13. Repeater Analysis for Combining Information from Different Assessments

    Haberman, Shelby; Yao, Lili


    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  14. No-Reference Video Quality Assessment by HEVC Codec Analysis

    Huang, Xin; Søgaard, Jacob; Forchhammer, Søren


    … transform coefficients, estimates the distortion, and assesses the video quality. The proposed scheme generates VQA features based on Intra coded frames, and then maps features using an Elastic Net to predict subjective video quality. A set of HEVC coded 4K UHD sequences is tested. Results show that the …

  15. Facet Analysis of the Client Needs Assessment Instrument.

    Dancer, L. Suzanne; Stanley, Lawrence R.

    The structure of the revised Client Needs Assessment Instrument (CNAI) is examined. In 1978-79, the Texas Department of Human Resources (DHR) developed the CNAI to provide an index of applicants' and clients' capacity for self-care by measuring the respondents' levels of functioning in: (1) physical health; (2) daily living activities; (3) mental…

  16. Using Empirical Article Analysis to Assess Research Methods Courses

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer


    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  17. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte; Sørensen, Poul


    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  18. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.; Sørensen, P.


    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  19. Windfarm generation assessment for reliability analysis of power systems

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.; Sørensen, Poul Ejnar


    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  20. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    Finch, Holmes; Monahan, Patrick


    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
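The parallel-analysis logic underlying MPA can be illustrated in its classic (Horn) form: retain factors whose observed eigenvalues exceed those obtained from comparable random data. The article's bootstrap MPA instead resamples from a fitted IRT model to build the reference distribution, so the sketch below, with an invented one-factor dataset, shows only the thresholding idea:

```python
import numpy as np

rng = np.random.default_rng(1)

def eigenvalues_of_corr(data):
    """Descending eigenvalues of the correlation matrix of (n x p) data."""
    return np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

# Hypothetical test data: 500 examinees, 10 items driven by one latent factor.
n, p = 500, 10
latent = rng.normal(size=(n, 1))
data = 0.7 * latent + rng.normal(size=(n, p))

observed = eigenvalues_of_corr(data)

# Reference eigenvalues from random (factorless) data of the same shape.
n_reps = 200
random_eigs = np.array([eigenvalues_of_corr(rng.normal(size=(n, p)))
                        for _ in range(n_reps)])
threshold = np.percentile(random_eigs, 95, axis=0)

# Number of observed eigenvalues exceeding the 95th-percentile reference.
n_factors = int(np.sum(observed > threshold))
print(n_factors)  # → 1
```

Only the first observed eigenvalue clears the random-data threshold, so the procedure correctly recovers the single latent dimension; the bootstrap generalization replaces the normal reference data with data simulated from the estimated item-response model.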

  1. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    Lee, Youn Myoung; Jung, Jong Tae; Kang, Chul Hyung (and others)


    A modeling study and analysis has been carried out to provide technical support for the safety and performance assessment of the low- and intermediate-level waste (LILW) repository, partially needed for the radiological environmental impact reporting that is essential for the licenses for construction and operation of the LILW repository. Throughout this study, work has been carried out in areas essential for technical support of the safety and performance assessment of the LILW repository and its licensing: gas generation and migration in and around the repository; risk analysis and environmental impact during transportation of LILW; biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure; and regional and global groundwater modeling and analysis.

  2. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    A modeling study and analysis has been carried out to provide technical support for the safety and performance assessment of the low- and intermediate-level waste (LILW) repository, partially needed for the radiological environmental impact reporting that is essential for the licenses for construction and operation of the LILW repository. Throughout this study, work has been carried out in areas essential for technical support of the safety and performance assessment of the LILW repository and its licensing: gas generation and migration in and around the repository; risk analysis and environmental impact during transportation of LILW; biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure; and regional and global groundwater modeling and analysis.

  3. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)


    A large-scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process-related risk faced by KOC and included root cause management-system-related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network, with the scope divided into three major categories: (1) integrity management; (2) operations; (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost-benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess for bottlenecks, to perform surge and erosion analysis, and to identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using this broad, three-pronged approach to their overall risk assessment, KOC achieved a thorough, root-cause-analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)


    Goran Karanovic


    Full Text Available The lack of capital market development means that calculating the value of companies in small markets, such as the Croatian market, is carried out primarily through the analysis of financial statements. This lack of market development is evident from unrealistic and unobjective corporate values, the result of too small a volume of securities trading in financial markets. Financial analysis is the basic method for estimating company value, and represents the foundation for an objective determination of the cash flow components that will be discounted. Through analysis, investors try to answer questions such as: the status of assets, liabilities and capital; the dynamics of the business; the level of solvency and liquidity; the utilization of fixed assets; the contribution of fixed assets to total income; company profitability rates; and investment in the company. Investors use financial analysis only as a basis and as a tool to predict the potential for creating new business value.
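The discounting of cash-flow components mentioned in the abstract can be sketched as a minimal present-value computation; the cash flows and discount rate below are hypothetical figures, not taken from the record:

```python
def discounted_value(cash_flows, rate):
    """Present value of a stream of projected cash flows.

    cash_flows: expected cash flows for years 1..n (derived, in the
                record's setting, from financial-statement analysis)
    rate: discount rate (e.g. cost of capital) as a decimal
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical three-year projection discounted at 10%
flows = [100.0, 110.0, 121.0]
value = discounted_value(flows, 0.10)
```

Each cash flow grows 10% per year here, so every discounted term equals 100/1.1, making the arithmetic easy to check by hand.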

  5. Transboundary diagnostic analysis. Vol. 2. Background and environmental assessment


    The Transboundary Diagnostic Analysis (TDA) quantifies and ranks water-related environmental transboundary issues and their causes according to the severity of environmental and/or socio-economic impacts. The three main issues in the BOBLME are: overexploitation of marine living resources; degradation of mangroves, coral reefs and seagrasses; and pollution and water quality. Volume 2 contains background material that sets out the bio-physical and socio-economic characteristics of the BOBLME; an analysi...

  6. Analysis of online quizzes as a teaching and assessment tool

    Lorenzo Salas-Morera


    Full Text Available This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the quizzes' effectiveness on student performance when used not only as an isolated assessment tool, but also when integrated into a combined strategy that supports the overall programming of the subject. The results obtained during the five years of experimentation using online quizzes show that such quizzes have a proven positive influence on students' academic performance. Furthermore, surveys conducted at the end of each course revealed the high value students accord to the use of online quizzes in course instruction.


    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.


    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  8. Paediatric neuropsychological assessment: an analysis of parents' perspectives

    Stark, Daniel; Thomas, Sophie; Dawson, Dave; Talbot, Emily; Bennett, Emily; Starza-Smith, Arleta


    Purpose: Modern healthcare services are commonly based on shared models of care, in which a strong emphasis is placed upon the views of those in receipt of services. The purpose of this paper is to examine the parents' experiences of their child's neuropsychological assessment. Design/methodology/approach: This was a mixed-methodology study employing both quantitative and qualitative measures. Findings: The questionnaire measure indicated a high overall level of satisfaction. Qualitative anal...

  9. An assessment and analysis of dietary practices of Irish jockeys

    O'Loughlin, Gillian


    Background: Horse racing is a weight category sport in which jockeys must chronically maintain a low body mass to compete, over a protracted season. The need to relentlessly align body mass with racing limits appears to encourage the use of short-term and potentially dangerous acute weight loss strategies. The purpose of this study was to investigate and assess the dietary habits of Irish Jockeys using established methods as well as incorporating novel sensing technologies. Methods: The ...

  10. Vulnerability of assessing water resources by the improved set pair analysis

    Yang Xiao-Hua


    Full Text Available Climate change has tremendously altered hydrological processes under global warming, and there are many uncertainties in assessing water resources vulnerability. To assess water resources vulnerability rationally under climate change, an improved set pair analysis model is established, in which set pair analysis theory is introduced and the weights are determined by the analytic hierarchy process method. The index systems and criteria of water resources vulnerability assessment in terms of the water cycle, socio-economy, and ecological environment are established based on an analysis of sensitivity and adaptability. The improved set pair analysis model is used to assess water resource vulnerability in Ningxia with twelve indexes under four future climate scenarios. The certain and uncertain information quantity of water resource vulnerability is calculated by connection numbers in the improved set pair analysis model. Results show that Ningxia has higher vulnerability under climate change scenarios. Compared with the fuzzy assessment model and the artificial neural network model, the improved set pair analysis model can take full advantage of certain and uncertain knowledge, and of subjective and objective information. The improved set pair analysis is an extension of the vulnerability assessment model of water resources systems.
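The connection number at the heart of set pair analysis can be sketched in a few lines. This is a minimal illustration, not the study's model: the discrepancy coefficient i, the contrary coefficient j, and the indicator counts are all illustrative assumptions.

```python
def connection_number(same, uncertain, opposite, i=0.5, j=-1.0):
    """Connection degree mu = a + b*i + c*j of set pair analysis.

    a, b, c are the identity, discrepancy and contrary proportions
    (a + b + c = 1); i in [-1, 1] and j = -1 are conventional choices,
    assumed here for illustration.
    """
    total = same + uncertain + opposite
    a, b, c = same / total, uncertain / total, opposite / total
    return a + b * i + c * j

# Hypothetical indicator tally: 6 identical, 3 uncertain, 1 contrary
mu = connection_number(6, 3, 1)  # 0.6 + 0.3*0.5 - 0.1 = 0.65
```

In a full assessment the per-index connection numbers would be combined with AHP-derived weights, as the abstract describes.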

  11. Tiger Team Assessments seventeen through thirty-five: A summary and analysis

    This report provides a summary and analysis of the Department of Energy's (DOE's) 19 Tiger Team Assessments that were conducted from October 1990 to July 1992. The sites are listed in the box below, along with their respective program offices and assessment completion dates. This analysis relied solely on the information contained in the Tiger Team Assessment Reports. The findings and concerns documented by the Tiger Teams provide a database of information about the then-current ES&H programs and practices. Program Secretarial Officers (PSOs) and field managers may use this information, along with other sources (such as the Corrective Action Plans, Progress Assessments, and Self-Assessments), to address the ES&H deficiencies found, prioritize and plan appropriate corrective actions, measure progress toward solving the problems, strengthen and transfer knowledge about areas where site performance exemplified the ES&H mindset, and so forth. Further analyses may be suggested by the analysis presented in this report.

  12. Quantitative assessment of human motion using video motion analysis

    Probe, John D.


    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  13. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    Guo, Zhenyu; Haimes, Yacov Y


    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures. PMID:27575259

  14. Manage Stakeholders approach for analysis and risk assessment in the implementation of innovative projects

    СУХОНОС, Марія Костянтинівна; Угоднікова, Олена Ігорівна


    The problem of innovation project risk management, notably Manage Stakeholders risks, is considered in this article. A methodology for the analysis and assessment of Manage Stakeholders risks in innovation projects is suggested.

  15. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  16. Evaluation of auto-assessment method for C-D analysis based on support vector machine

    Contrast-Detail (C-D) analysis is one of the visual quality assessment methods in medical imaging, and many auto-assessment methods for C-D analysis have been developed in recent years. However, for these auto-assessment methods, the effects of nonlinear image processing are not clear. We have therefore developed an auto-assessment method for C-D analysis using a support vector machine (SVM), and have evaluated its performance on images processed with a noise reduction method. The feature indexes used in the SVM were the normalized cross correlation (NCC) coefficient on each signal between the noise-free and noisy image, the contrast to noise ratio (CNR) on each signal, the radius of each signal, and the Student's t-test statistic for the mean difference between the signal and background pixel values. The results showed that the auto-assessment method for C-D analysis using the Student's t-test statistic agreed well with the visual assessment for the non-processed images, but disagreed for the images processed with the noise reduction method. Our results also showed that the auto-assessment method for C-D analysis by the SVM built on NCC and CNR agreed well with the visual assessment for both the non-processed and the noise-reduced images. The auto-assessment method for C-D analysis by the SVM is therefore expected to be robust to nonlinear image processing. (author)
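The two feature indexes the record's SVM relies on, NCC and CNR, are standard quantities and can be sketched as follows; the patch values in the example are illustrative, and the SVM training itself is omitted:

```python
import math

def ncc(a, b):
    """Normalized cross-correlation between two equally sized pixel patches
    (e.g. the same signal region in the noise-free and the noisy image)."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den

def cnr(signal_mean, background_mean, background_sd):
    """Contrast-to-noise ratio of a signal against its background."""
    return abs(signal_mean - background_mean) / background_sd
```

Two patches related by a positive linear transform have NCC exactly 1, which makes the function easy to sanity-check.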

  17. Model Analysis Assessing the dynamics of student learning

    Bao, Lei; Redish, Edward F.


    In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models. As a result they can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.

  18. Using the statistical analysis method to assess the landslide susceptibility

    Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting


    This study assessed the landslide susceptibility in the Jing-Shan River upstream watershed, central Taiwan. The landslide inventories during typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, Morakot in 2009, and the 0719 rainfall event in 2011, which were established by the Taiwan Central Geological Survey, were used as landslide data. This study assessed the landslide susceptibility using different statistical methods: logistic regression, the instability index method and a support vector machine (SVM). After evaluation, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature and average rainfall were chosen as the landslide factors. The validity of the three established models was further examined using the receiver operating characteristic (ROC) curve. Logistic regression showed that terrain roughness and slope roughness had a stronger impact on the susceptibility value, whereas the instability index method showed that terrain roughness and lithology did. The instability index method may, however, underestimate susceptibility near the river side. In addition, the instability index method raises a potential issue with the number of factor classes: increasing the number of classes may cause an excessive variation coefficient of the factor, while decreasing it may place a large range of nearby cells in the same susceptibility level. Finally, the ROC curve was used to discriminate among the three models. SVM proved preferable to the other methods in the assessment of landslide susceptibility, and its performance was close to that of logistic regression in recognizing the medium-high and high susceptibility levels.
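As a minimal illustration of the logistic-regression approach to susceptibility mapping, the probability for one cell is the logistic function of a weighted sum of factor values; the coefficients and factor values below are hypothetical, not the fitted values from the study:

```python
import math

def susceptibility(factors, weights, intercept):
    """Logistic-regression landslide susceptibility for one map cell:
    P = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).

    factors: normalized factor values (e.g. terrain roughness, slope)
    weights, intercept: hypothetical fitted coefficients
    """
    z = intercept + sum(w * x for w, x in zip(weights, factors))
    return 1.0 / (1.0 + math.exp(-z))

# Two illustrative factors: terrain roughness 0.8, slope roughness 0.6
p = susceptibility([0.8, 0.6], [2.0, 1.5], -2.0)
```

In practice the coefficients are fitted from the landslide inventory, and the resulting per-cell probabilities are thresholded via the ROC curve as the abstract describes.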

  19. Numerical analysis and geotechnical assessment of mine scale model

    Khanal Manoj; Adhikary Deepak; Balusu Rao


    Various numerical methods are available to model, simulate, analyse and interpret the results; however a major task is to select a reliable and intended tool to perform a realistic assessment of any problem. For a model to be a representative of the realistic mining scenario, a verified tool must be chosen to perform an assessment of mine roof support requirement and address the geotechnical risks associated with longwall mining. The dependable tools provide a safe working environment, increased production, efficient management of resources and reduce environmental impacts of mining. Although various methods, for example, analytical, experimental and empirical are being adopted in mining, in recent days numerical tools are becoming popular due to the advancement in computer hardware and numerical methods. Empirical rules based on past experiences do provide a general guide, however due to the heterogeneous nature of mine geology (i.e., none of the mine sites are identical), numerical simulations of mine site specific conditions would lend better insights into some underlying issues. The paper highlights the use of a continuum mechanics based tool in coal mining with a mine scale model. The continuum modelling can provide close to accurate stress fields and deformation. The paper describes the use of existing mine data to calibrate and validate the model parameters, which then are used to assess geotechnical issues related with installing a new high capacity longwall mine at the mine site. A variety of parameters, for example, chock convergences, caveability of overlying sandstones, abutment and vertical stresses have been estimated.

  20. Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)

    Rockwell International, as operating contractor at the Rocky Flats plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, one of them being the Building 707 Final Safety Analysis Report, June 1987 (707FSAR), and a Plant Safety Analysis Report. The Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85) documents the methodologies that were used for those FSARs. Resources available for preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort.

  1. Teacher Analysis of Student Knowledge (TASK): A Measure of Learning Trajectory-Oriented Formative Assessment

    Supovitz, Jonathan; Ebby, Caroline B.; Sirinides, Philip


    This interactive electronic report provides an overview of an innovative instrument developed by researchers at the Consortium for Policy Research in Education (CPRE) to authentically measure teachers' formative assessment practices in mathematics. The Teacher Analysis of Student Knowledge, or TASK, instrument assesses mathematics…

  2. Literary translation and quality assessment analysis – its significance in translation training

    Rodríguez, Beatriz Maria


    This paper aims to highlight the role of translation quality assessment in translation training so as to develop students’ translation competence and skills to face translation problems. An analysis to assess literary translation quality is proposed before proceeding to discuss its pedagogical implementation.

  3. A Risk-Analysis Approach to Implementing Web-Based Assessment

    Ricketts, Chris; Zakrzewski, Stan


    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  4. Cost-effectiveness analysis of computer-based assessment

    Pauline Loewenberger


    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  5. Modelling requirements for future assessments based on FEP analysis

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those, which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  6. The Analysis of Ease of Doing Business Assessment Methods

    Mindaugas Samoška


    Full Text Available The study deals with ease of doing business assessment models. The analysed models describe conditions for doing business in a certain country that is being ranked and evaluated. However, an obvious need for methodological improvement emerges when the five best-known models are analysed comparatively. Different data aggregation principles (quantitative and qualitative methods of aggregation) yield different results, despite the factors evaluated in the models being quite similar. After analysing all five methods, we offer criticism of them and insights for possible improvement. Article in Lithuanian

  7. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Heba Bakr Khoshaim


    Full Text Available Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic years 2013–2014 and 2014–2015. Using the data from 206 students, the researchers analyzed 54 exam questions with regard to the complexity level, the difficulty coefficient and the discrimination coefficient. Findings indicated that the complexity level correlated with the difficulty coefficient for only one of three semesters. In addition, the correlation between the discrimination coefficient and the difficulty coefficient was found to be statistically significant in all three semesters. The results suggest that all three exams were acceptable; however, further attention should be given to the complexity level of questions used in mathematical tests, and questions of moderate difficulty better classify students' performance.
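The difficulty and discrimination coefficients analyzed in the record are standard item-analysis statistics and can be sketched as follows; the upper-lower (27%) discrimination index is one common convention, assumed here rather than taken from the paper:

```python
def item_difficulty(responses):
    """Difficulty coefficient: proportion of students answering the item
    correctly (responses coded 1 = correct, 0 = incorrect)."""
    return sum(responses) / len(responses)

def item_discrimination(responses, totals, top_frac=0.27):
    """Discrimination via the upper-lower index: difficulty of the item
    in the top `top_frac` of students (by total exam score) minus its
    difficulty in the bottom `top_frac`."""
    ranked = [r for _, r in sorted(zip(totals, responses),
                                   key=lambda pair: pair[0])]
    k = max(1, int(len(ranked) * top_frac))
    return item_difficulty(ranked[-k:]) - item_difficulty(ranked[:k])
```

A well-discriminating item is answered correctly far more often by high scorers than by low scorers, giving an index near 1.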

  8. Pesticide Flow Analysis to Assess Human Exposure in Greenhouse Flower Production in Colombia

    Binder, Claudia R.; Camilo Lesmes-Fabian


    Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. This approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry ...

  9. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Zhi fang Zhou; Tian Xiao; Da yuan Li


    Purpose: Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods only emphasize one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose ...

  10. Vulnerability of assessing water resources by the improved set pair analysis

    Yang Xiao-Hua; He Jun; Di Cong-Li; Li Jian-Qiang


    Climate change has tremendously changed the hydrological processes with global warming. There are many uncertainties in assessing water resources vulnerability. To assess the water resources vulnerability rationally under climate change, an improved set pair analysis model is established, in which set pair analysis theory is introduced and the weights are determined by the analytic hierarchy process method. The index systems and criteria of water resources ...

  11. Fear Assessment: Cost-Benefit Analysis and the Pricing of Fear and Anxiety

    Adler, Matthew D.


    "Risk assessment" is now a common feature of regulatory practice, but "fear assessment" is not.In particular, environmental, health and safety agencies such as EPA, FDA, OSHA, NHTSA, and CPSC, commonly count death, illness and injury as "costs" for purposes of cost-benefit analysis, but almost never incorporate fear, anxiety or other welfare-reducing mental states into the analysis.This is puzzling, since fear and anxiety are welfare setbacks, and since the very hazards regulated by these age...

  12. Concentration Analysis: A Quantitative Assessment of Student States.

    Bao, Lei; Redish, Edward F.


    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)
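The concentration measure introduced in this record can be sketched as follows. This is a minimal reading of the Bao-Redish concentration factor, normalized so that an even spread of responses over the m choices gives 0 and unanimity gives 1; verify against the paper before reuse.

```python
import math

def concentration(counts):
    """Concentration factor C for one multiple-choice item.

    counts: number of students selecting each of the m choices.
    C = sqrt(m)/(sqrt(m)-1) * (sqrt(sum n_i^2)/N - 1/sqrt(m)),
    where N is the total number of students. C = 0 for an even
    spread, C = 1 when every student picks the same choice.
    """
    m = len(counts)
    n = sum(counts)
    root = math.sqrt(sum(c * c for c in counts))
    return (math.sqrt(m) / (math.sqrt(m) - 1)) * (root / n - 1 / math.sqrt(m))
```

The two limiting cases make the normalization easy to check: all responses on one choice gives exactly 1, and a uniform spread gives exactly 0.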

  13. Dynamic Assessment of Functional Lipidomic Analysis in Human Urine.

    Rockwell, Hannah E; Gao, Fei; Chen, Emily Y; McDaniel, Justice; Sarangarajan, Rangaprasad; Narain, Niven R; Kiebish, Michael A


    The development of enabling mass spectrometry platforms for the quantification of diverse lipid species in human urine is of paramount importance for understanding metabolic homeostasis in normal and pathophysiological conditions. Urine represents a non-invasive biofluid that can capture distinct differences in an individual's physiological status. However, currently there is a lack of quantitative workflows to engage in high throughput lipidomic analysis. This study describes the development of a MS/MS(ALL) shotgun lipidomic workflow and a micro liquid chromatography-high resolution tandem mass spectrometry (LC-MS/MS) workflow for urine structural and mediator lipid analysis, respectively. This workflow was deployed to understand biofluid sample handling and collection, extraction efficiency, and natural human variation over time. Utilization of 0.5 mL of urine for structural lipidomic analysis resulted in reproducible quantification of more than 600 lipid molecular species from over 20 lipid classes. Analysis of 1 mL of urine routinely quantified in excess of 55 mediator lipid metabolites comprised of octadecanoids, eicosanoids, and docosanoids generated by lipoxygenase, cyclooxygenase, and cytochrome P450 activities. In summary, the high-throughput functional lipidomics workflow described in this study demonstrates an impressive robustness and reproducibility that can be utilized for population health and precision medicine applications. PMID:27038173

  14. Technical quality assessment of an optoelectronic system for movement analysis

    Optoelectronic systems (OS) are widely used in gait analysis to evaluate the motor performance of healthy subjects and patients. The accuracy of marker trajectory reconstruction depends on several aspects: the number of cameras, the dimension and position of the calibration volume, and the chosen calibration procedure. In this paper we propose a methodology to evaluate the effects of the mentioned sources of error on the reconstruction of marker trajectories. The novel contribution of the present work consists in the dimension of the tested calibration volumes, which is comparable with the ones normally used in gait analysis; in addition, to simulate trajectories during clinical gait analysis, we provide non-default paths for markers as inputs. Several calibration procedures are implemented and the same trial is processed with each calibration file, also considering different camera configurations. The RMSEs between the measured trajectories and the optimal ones are calculated for each comparison. To investigate the significant differences between the computed indices, an ANOVA analysis is implemented. The RMSE is sensitive to variations of the considered calibration volume and the camera configuration, and it is always below 43 mm.
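The RMSE index used above to compare measured and optimal trajectories can be sketched as follows; the per-coordinate formulation and the sample values are illustrative assumptions:

```python
import math

def rmse(measured, reference):
    """Root-mean-square error between reconstructed marker coordinates
    and the reference (optimal) trajectory, sampled point by point."""
    n = len(measured)
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reference)) / n)

# Hypothetical 1-D marker coordinates (mm) over three frames
err = rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
```

For full 3-D trajectories the same formula is applied to the Euclidean distances between corresponding marker positions at each frame.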

  15. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars


    intermediates and 9 experienced bronchoscopy operators performed 3 procedures each on a bronchoscopy simulator. The Microsoft Kinect system was used to automatically measure the total deviation of the scope from a perfectly straight, vertical line. Results: The low-cost motion analysis system could measure the...

  16. Biogas upgrading technologies: Energetic analysis and environmental impact assessment

    Yajing Xu; Ying Huang; Bin Wu; Xiangping Zhang; Suojiang Zhang


    Biogas upgrading, the removal of CO2 and other trace components from raw biogas, is a necessary step before the biogas can be used as a vehicle fuel or supplied to the natural gas grid. In this work, three technologies for biogas upgrading, i.e., pressurized water scrubbing (PWS), monoethanolamine aqueous scrubbing (MAS) and ionic liquid scrubbing (ILS), are studied and assessed in terms of their energy consumption and environmental impacts using process simulation and the green degree method. A non-random two-liquid and Henry's law property method for a CO2 separation system with the ionic liquid 1-butyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide ([bmim][Tf2N]) is established and verified with experimental data. The assessment results indicate that the specific energy consumption of ILS and PWS is almost the same and much less than that of MAS. A high purity CO2 product can be obtained by the MAS and ILS methods, whereas no pure CO2 is recovered with PWS. On the environmental side, ILS has the highest green degree production value, while MAS and PWS produce serious environmental impacts.

  17. Forest ecosystem health assessment and analysis in China

    XIAO Fengjin; OUYANG Hua; ZHANG Qiang; FU Bojie; ZHANG Zhicheng


    Based on surveying data from more than 300 forest sample plots, forestry statistical data, remote sensing information from the NOAA AVHRR database and daily meteorological data from 300 stations, we selected vigor, organization and resilience as the indicators to assess large-scale forest ecosystem health in China, and analyzed the spatial pattern of forest ecosystem health and its influencing factors. The assessment indicated that forest ecosystem health shows a decreasing trend along latitude and longitude gradients. The healthiest forests are mainly distributed in natural forests, tropical rainforests and seasonal rainforests, followed by the northeast national forest zone, the subtropical forest zone and the southwest forest zone, while the unhealthy forests are mainly located in the warm temperate zone and the Xinjiang-Mongolia forest zone. The coefficient of correlation between the Forest Ecosystem Health Index (FEHI) and annual average precipitation was 0.58 (p<0.01), while that between the FEHI and annual mean temperature was 0.49 (p<0.01), which indicates that precipitation and temperature both affect the pattern of the FEHI, with precipitation's effect being the stronger. We also measured the correlation coefficients between the FEHI and NPP, biodiversity and resistance, which were 0.64, 0.76 and 0.81 (p<0.01) respectively. The order of effect on forest ecosystem health was vigor, organization and resistance.
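The correlation coefficients reported above are ordinary Pearson coefficients. A minimal stdlib sketch of the computation, with invented sample values standing in for the FEHI and NPP series:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly linearly related toy records give r = 1:
fehi = [0.2, 0.4, 0.6, 0.8]   # hypothetical health-index values
npp  = [1.0, 2.0, 3.0, 4.0]   # hypothetical productivity values
print(pearson_r(fehi, npp))   # -> 1.0 (up to floating-point rounding)
```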

  18. Analysis and Pollution Assessment of Heavy Metal in Soil, Perlis

    The concentrations of 5 heavy metals (Cu, Cr, Ni, Cd, Pb) were studied in soils around Perlis to assess the distribution of heavy metal contamination due to industrialization, urbanization and agricultural activities. Soil samples were collected at a depth of 0-15 cm at eighteen stations around Perlis. The soil samples (2 mm) were obtained in duplicate and subjected to hot-block digestion, and the total metal concentration was determined via ICP-MS. Overall concentrations of Cu, Cr, Ni, Cd and Pb in the soil samples ranged from 0.38-240.59, 0.642-3.921, 0.689-2.398, 0-0.63 and 0.39-27.47 mg/kg respectively. The heavy metal concentrations in the soil display the following decreasing trend: Cu > Pb > Cr > Ni > Cd. From these results, it was found that heavy metal levels in soil near the centralized Chuping industrial area reach maximum values compared with other locations in Perlis. The pollution index revealed that only 11 % of the Cu samples and 6 % of the Cd samples were classed as heavily contaminated, while 6 % of all samples were moderately contaminated with Cu and Pb, and the other elements showed low contamination. The combined heavy metal concentrations and assessment indicate that industrial activities and traffic emissions represent the most important sources for Cu, Cd and Pb, whereas Cr and Ni derive mainly from natural sources. Increasing anthropogenic influences on the environment, especially pollution loadings, have caused negative changes in natural ecosystems and decreased biodiversity. (author)
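A common form of the pollution index mentioned above is the single-factor index, the measured concentration divided by a background or limit value, binned into contamination classes. A sketch of that calculation; the background value for Cu and the class thresholds are illustrative assumptions, not the study's actual reference values:

```python
def pollution_index(concentration, background):
    """Single-factor pollution index: measured over background concentration."""
    return concentration / background

def contamination_class(pi):
    """A frequently used four-level scale (thresholds illustrative)."""
    if pi <= 1:
        return "low"
    if pi <= 3:
        return "moderate"
    if pi <= 6:
        return "considerable"
    return "heavy"

# The study's maximum Cu value against an assumed 30 mg/kg background:
pi = pollution_index(240.59, 30.0)
print(round(pi, 2), contamination_class(pi))  # 8.02 heavy
```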

  19. Promises and pitfalls in environmentally extended input–output analysis for China: A survey of the literature

    As the world's largest developing economy, China plays a key role in global climate change and other environmental impacts of international concern. Environmentally extended input–output analysis (EE-IOA) is an important and insightful tool seeing widespread use in studying large-scale environmental impacts in China: calculating and analyzing greenhouse gas emissions, carbon and water footprints, pollution, and embodied energy. This paper surveys the published articles regarding EE-IOA for China in peer-reviewed journals and provides a comprehensive and quantitative overview of the body of literature, examining the research impact, the environmental issues addressed, and the data utilized. The paper further includes a discussion of the shortcomings in official Chinese data and of potential means to move beyond its inherent limitations. - Highlights: • Articles published in 2012–2013 more than doubled those published between 1995 and 2011. • CO2 and energy are the most common topics, frequently associated with trade. • Data from the National Bureau of Statistics is widely used but seen as flawed. • Climate change, water supply, and food security drive the future of the literature.
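The core EE-IOA calculation multiplies sector emission intensities by the Leontief inverse to capture both direct and indirect (embodied) impacts of final demand, e = f · (I − A)⁻¹ · y. A two-sector sketch in plain Python; all coefficients, demand figures and intensities are invented for illustration:

```python
def embodied_emissions(A, y, f):
    """Total emissions embodied in final demand y, given the 2x2
    direct-requirements matrix A and direct emission intensities f
    (emissions per unit output): e = f . (I - A)^-1 . y"""
    # Invert I - A explicitly for the 2x2 case.
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    # Total output x = (I - A)^-1 . y
    x = [inv[0][0] * y[0] + inv[0][1] * y[1],
         inv[1][0] * y[0] + inv[1][1] * y[1]]
    return f[0] * x[0] + f[1] * x[1]

A = [[0.1, 0.2],   # intermediate inputs per unit of each sector's output
     [0.3, 0.1]]
y = [100.0, 50.0]  # final demand by sector
f = [0.5, 1.2]     # direct CO2 per unit of sectoral output
print(embodied_emissions(A, y, f))
```

The printed total exceeds the direct emissions of final demand alone, because the Leontief inverse pulls in the upstream production chains.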

  20. Assessment of modern methods of human factor reliability analysis in PSA studies

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each HRA method; History and prospects of the use of the methods). (P.A.)

  1. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.


    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. Field offices need a standardized, scientifically sound methodology for local climate analysis (such as trend, composite, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice at the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. The LCAT focus is on the local scale (as opposed to the national and global scales of CPC products). The LCAT will: - Improve professional competency of local office staff and expertise in providing local information to their users; LCAT will improve the quality of local climate services. - Ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor; LCAT will allow improvement of CPC climate products. - Allow testing of local climate variables beyond temperature averages and precipitation totals, such as climatology of

  2. Sensitivity analysis for Probabilistic Tsunami Hazard Assessment (PTHA)

    Spada, M.; Basili, R.; Selva, J.; Lorito, S.; Sorensen, M. B.; Zonker, J.; Babeyko, A. Y.; Romano, F.; Piatanesi, A.; Tiberti, M.


    In modern societies, probabilistic hazard assessment of natural disasters is commonly used by decision makers for designing regulatory standards and, more generally, for prioritizing risk mitigation efforts. Systematic formalization of Probabilistic Tsunami Hazard Assessment (PTHA) has started only in recent years, mainly following the giant tsunami disaster of Sumatra in 2004. Typically, PTHA for earthquake sources exploits the long-standing practices developed in probabilistic seismic hazard assessment (PSHA), even though important differences are evident. In PTHA, for example, it is known that far-field sources are more important and that physical models for tsunami propagation are needed for the highly non-isotropic propagation of tsunami waves. However, considering the high impact that PTHA may have on societies, an important effort to quantify the effect of specific assumptions should be performed. Indeed, specific standard hypotheses made in PSHA may prove inappropriate for PTHA, since tsunami waves are sensitive to different aspects of sources (e.g. fault geometry, scaling laws, slip distribution) and propagate differently. In addition, the necessity of running an explicit calculation of wave propagation for every possible event (tsunami scenario) forces analysts to find strategies for diminishing the computational burden. In this work, we test the sensitivity of hazard results with respect to several assumptions that are peculiar to PTHA and others that are commonly accepted in PSHA. Our case study is located in the central Mediterranean Sea and considers the Western Hellenic Arc as the earthquake source, with Crete and Eastern Sicily as near-field and far-field target coasts, respectively.
Our suite of sensitivity tests includes: a) comparison of random seismicity distribution within area sources as opposed to systematically distributed ruptures on fault sources; b) effects of statistical and physical parameters (a- and b-value, Mc, Mmax, scaling laws

  3. Regional hazard analysis for use in vulnerability and risk assessment

    Maris, Fotis; Kitikidou, Kyriaki; Paparrizos, Spyridon; Karagiorgos, Konstantinos; Potouridis, Simeon; Fuchs, Sven


    A method for supporting an operational regional risk and vulnerability analysis for hydrological hazards is suggested and applied in the Island of Cyprus. The method aggregates the output of a hydrological flow model forced by observed temperatures and precipitations, with observed discharge data. A scheme supported by observed discharge is applied for model calibration. A comparison of different calibration schemes indicated that the same model parameters can be used for the entire country. In addition, it was demonstrated that, for operational purposes, it is sufficient to rely on a few stations. Model parameters were adjusted to account for land use and thus for vulnerability of elements at risk by comparing observed and simulated flow patterns, using all components of the hydrological model. The results can be used for regional risk and vulnerability analysis in order to increase the resilience of the affected population.

  4. Assessment of pore size distribution using image analysis

    Doktor, Tomáš; Kytýř, Daniel; Valach, Jaroslav; Jiroušek, Ondřej

    Trieste: Italian Group of Fracture, 2010 - (Iacoviello, F.; Cosmi, F.), s. 155-157 ISBN 978-88-95940-30-4. [Youth Symposium on Experimental Solid Mechanics /9./. Trieste (IT), 07.07.2010-10.07.2010] R&D Projects: GA ČR(CZ) GAP105/10/2305 Institutional research plan: CEZ:AV0Z20710524 Keywords : pore size distribution * image analysis * micro-CT Subject RIV: JJ - Other Materials


    Hertel, Thomas W.; Keeney, Roman; Valenzuela, Ernesto


    This paper presents a validation experiment of a global CGE trade model widely used for analysis of trade liberalization. We focus on the ability of the model to reproduce price volatility in wheat markets. The literature on model validation is reviewed with an eye towards designing an appropriate methodology for validating large scale CGE models. The validation experiment results indicate that in its current form, the GTAP-AGR model is incapable of reproducing wheat market price volatility a...

  6. Use of Risk Analysis Frameworks in Urban Flood Assessments

    Arnbjerg-Nielsen, Karsten; Madsen, Henrik


    In the period 1960–1990, rapid urban development took place all over Europe, and notably in Denmark urban sprawl occurred around many cities. Favorable economic conditions ensured that the urbanization continued, although at a lower rate, until recently. However, from 1990 to the present an increase in extreme precipitation has been observed, corresponding to an increase of design levels of at least 30 %. Analysis of climate change model output has given clear evidence that further increases in ...

  7. The Analysis and Assessment of the Credit Risk

    Mirea Marioara; Aivaz Kamer Ainur


    The commercial banks' main operation is the granting of credits, which occupies the first place among their total investments. Any bank assumes risks to a certain extent when granting credits, and certainly all banks generally incur losses when some debtors fail to comply with their obligations. The level of the assumed risks and the losses can be minimized if the credit operations are organized and managed in a professional manner. The paper grasps the moment of the analysis process preceding the...

  8. Assessment of quality control approaches for metagenomic data analysis

    Zhou, Qian; Su, Xiaoquan; Ning, Kang


    Currently there is an explosive increase in next-generation sequencing (NGS) projects and related datasets, which have to be processed by Quality Control (QC) procedures before they can be utilized for omics analysis. A QC procedure usually includes identification and filtration of sequencing artifacts such as low-quality reads and contaminating reads, which would significantly affect and sometimes mislead downstream analysis. Quality control of NGS data for microbial communities is especially challenging. In this work, we have evaluated and compared the performance and effects of various QC pipelines on different types of metagenomic NGS data and from different angles, based on which general principles for using QC pipelines were proposed. Results based on both simulated and real metagenomic datasets have shown that: firstly, QC-Chain is superior in its ability to identify contamination in metagenomic NGS datasets of different complexities, with high sensitivity and specificity. Secondly, the high-performance computing engine enabled QC-Chain to achieve a significant reduction in processing time compared to other pipelines based on serial computing. Thirdly, QC-Chain could outperform other tools in benefiting downstream metagenomic data analysis.

  9. An assessment of unstructured grid technology for timely CFD analysis

    Kinard, Tom A.; Schabowski, Deanne M.


    An assessment of two unstructured methods is presented in this paper. A tetrahedral unstructured method, USM3D, developed at NASA Langley Research Center, is compared to a Cartesian unstructured method, SPLITFLOW, developed at Lockheed Fort Worth Company. USM3D is an upwind finite volume solver that accepts grids generated primarily from the Vgrid grid generator. SPLITFLOW combines an unstructured grid generator with an implicit flow solver in one package. Both methods are exercised on three test cases: a wing, a wing body, and a fully expanded nozzle. The results for the first two runs are included here and compared to the structured grid method TEAM and to available test data. For each test case, the setup procedures are described, including any difficulties that were encountered. Detailed descriptions of the solvers are not included in this paper.

  10. Fuel assembly assessment from CVD image analysis: A feasibility study

    The Swedish Nuclear Inspectorate commissioned a feasibility study of automatic assessment of fuel assemblies from images obtained with the digital Cerenkov viewing device currently in development. The goal is to assist the IAEA inspectors in evaluating the fuel since they typically have only a few seconds to inspect an assembly. We report results here in two main areas: Investigation of basic image processing and recognition techniques needed to enhance the images and find the assembly in the image; Study of the properties of the distributions of light from the assemblies to determine whether they provide unique signatures for different burn-up and cooling times for real fuel or indicate presence of non-fuel. 8 refs, 27 figs

  11. Wind resource assessment and siting analysis in Italy

    Recently, the wind power industry has matured; consequently, in many countries a large number of wind energy projects have been planned, and many of them are already realized and running. As such, there is a direct necessity to identify a sizeable number of wind power plant sites. Choosing the right sites to match specific Wind Energy Conversion Systems (WECS) is also needed to harness this clean energy from the points of view of industrial viability and project financing. As a pre-requisite to installing a wind turbine at a particular site, it is necessary to have knowledge of the theoretically available wind energy at the site, as well as of the practicability of the design in matching the characteristics of the WECS. In this paper, ENEA (Italian National Agency for New Technology, Energy and Environment) wind siting and resource assessment activities, currently on-going in different regions of Italy, are described, along with the present status and future prospects of the wind power industry.

  12. Assessment and analysis components of physical fitness of students

    Kashuba V.A.


    Full Text Available The components of students' physical fitness are assessed, and the internal and external factors affecting students' quality of life are analyzed. The study involved more than 200 students. It was found that students represent a category of people with elevated risk factors, which include nervous and mental tension and constant violations of diet, work and leisure regimes; their way of life shows a lack of care for their own health. It is noted that the existing approaches to promoting students' physical fitness are inefficient and require the development and implementation of new contemporary theoretical foundations and practical approaches to the problem of increasing students' activity. It is argued that the forms, methods and learning tools currently used in the practice of higher education do not fully ensure the implementation of approaches to promoting students' physical fitness and do not meet the requirements for the preparation of the modern health professional.

  13. Image analysis for dental bone quality assessment using CBCT imaging

    Suprijanto; Epsilawati, L.; Hajarini, M. S.; Juliastuti, E.; Susanti, H.


    Cone beam computerized tomography (CBCT) is one of the X-ray imaging modalities applied in dentistry. This modality can visualize the oral region in 3D and at high resolution. The CBCT jaw image carries potential information for the assessment of bone quality that is often used for pre-operative implant planning. We propose a comparison method based on the normalized histogram (NH) of the region of the inter-dental septum and premolar teeth. Furthermore, the NH characteristics of normal and abnormal bone conditions are compared and analyzed. Four test parameters are proposed, i.e. the difference between teeth and bone average intensity (s), the ratio between bone and teeth average intensity (n) of the NH, the difference between teeth and bone peak values (Δp) of the NH, and the ratio between teeth and bone NH ranges (r). The results showed that n, s, and Δp have potential to be classification parameters of dental calcium density.
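Two of the parameters defined in this record, the teeth/bone intensity difference s and ratio n, reduce to arithmetic on mean region intensities. A toy sketch of those definitions only; the pixel values and region segmentation are invented, and this is not the authors' implementation:

```python
def mean_intensity(region):
    """Average grayscale intensity of a list of pixel values."""
    return sum(region) / len(region)

def histogram_parameters(teeth, bone):
    """Two of the paper's test parameters from mean region intensities:
    s = teeth minus bone average intensity, n = bone over teeth ratio."""
    mt, mb = mean_intensity(teeth), mean_intensity(bone)
    return {"s": mt - mb, "n": mb / mt}

teeth = [200, 210, 205, 215]   # toy pixel intensities for the teeth region
bone = [120, 130, 125, 125]    # toy pixel intensities for the bone region
print(histogram_parameters(teeth, bone))
```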

  14. Game theoretic analysis of environmental impact assessment system in China

    CHENG Hongguang; QI Ye; PU Xiao; GONG Li


    An environmental impact assessment (EIA) system has been established in China since 1973. In present EIA cases, there are four participants in general: governments, enterprises, EIA organizations and the public. The public is held responsible for both social costs and social duties, and supervises the social costs produced by enterprises discharging pollutants in EIA. However, public participation is mostly deputized to governments, which severely weakens the independence of the public as a participant in EIA. In this paper, EIA refers to the different attitudes of the participants, whose optional strategies may be described by a proper game model. According to the deficiencies in EIA, a three-sided (governments, enterprises, and EIA organizations) dynamic iterative game model of many phases is established, drawing on iterative game theory, dynamic games of incomplete information, and perfect Bayesian equilibrium theory, to analyze the reciprocity relations among governments, EIA organizations and enterprises. The results show that in the short run, economic benefit preponderates over social benefit: neither governments nor enterprises want to undertake EIA and reveal social costs, and since EIA organizations' income comes from enterprises, collusions are built between them to vindicate economic benefit. In the long run, social benefit losses caused by environmental pollution must be recuperated sooner or later, and environmental deterioration will influence the achievement of economic benefit, so both governments and enterprises are certain to pursue high social benefit and will be willing to undertake EIA, which helps increase private benefit. EIA organizations will make fair assessments when their economic benefits are ensured. At present, the public, as silent victims, cannot take an actual part in EIA. The EIA system must be improved to break the present three-sided equilibrium, bringing the public into the equilibrium to exert public supervision.

  15. An Assessment of Image Analysis of Longitudinal Bone Changes

    This study was performed to assess analyzing methods developed to detect longitudinal bone changes clinically and quantitatively. In a preliminary experiment, the accuracy of the conversion of Cu-equivalent values to the mass of HA was examined. For the main experiment, 15 intraoral radiograms taken immediately and at the 1st, 2nd, 4th, and 6th week after implantation of the mixture into the extraction sites of 3 cases were used. We took the radiograms with a copper step wedge as a test object and an HA phantom. X-ray exposure was standardized by using a Rinn XCP device customized directly to the individual dentition with a resin bite block. The images, input into a computer by a Quick scanner, were digitized and analyzed with the NIH Image program, and the stability of the copper-equivalent transformation and the usefulness of two analyzing methods, by ROI and by Reslice, were examined. The results obtained were as follows: 1. On the Cu-equivalent images, the coefficient of variation in the measurement of the Cu-Eq. value of an ROI ranged from 0.05 to 0.24 and showed high reproducibility. 2. All results obtained from resliced contiguous images were coincident with those obtained from the assessment by ROI and formation of a plot profile. 3. On the stacked and resliced image at the line of interest, we could analyze the longitudinal changes at several portions directly and quantitatively by plot profile and qualitatively by surface plot. 4. The implant area showed marked resorption until 2 weeks after implantation and a significant increase in Cu-Eq. value at the 6th week (P<0.01), and the periapical area showed an increase in Cu-Eq. value at the 6th week compared to the post-operative value.

  16. Alternative model for assessment administration and analysis: An example from the E-CLASS

    Wilcox, Bethany R; Hobbs, Robert D; Aiken, John M; Welch, Nathan M; Lewandowski, H J


    The primary model for dissemination of conceptual and attitudinal assessments that has been used within the physics education research (PER) community is to create a high quality, validated assessment, make it available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model also provides a greater degree of support for both researchers and instructors. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof-of-concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges t...

  17. Analysis of uranium ore concentrates for origin assessment

    In this study the most important analytical methodologies are presented for the nuclear forensic investigation of uranium ore concentrates (yellow cakes). These methodologies allow the measurement of characteristic parameters which may be inherited from the source material or the process. By combining the various techniques (e.g. infrared spectrometry, impurity content, rare-earth pattern and U, Sr and Pb isotope ratio analysis by mass spectrometry), the possible provenances of an illicit material can be narrowed down to a few options and its declared origin can be verified. The methodologies serve for nuclear forensic investigations as well as for nuclear safeguards, checking the consistency of information. (orig.)

  18. High strength bolt failure analysis and integrity assessment. Lessons learned

    Isolated failures have occurred in high-strength bolting used in pressurized water reactor (PWR) component support applications. The U.S. nuclear industry's component support bolting failure experience is described in this paper, focusing on materials specified intentionally as ''ultra-high-strength'' (minimum specified yield strength greater than 1034 MPa). The analysis and investigation of fabrication-induced problems with a bolt made from Carpenter Technology Alloy ''Custom 455'' (ASTM A 564 XM-16), a proprietary material, are detailed, and the measures taken to assure the integrity of these bolts during operation are discussed. Lessons learned to preclude future problems are presented as conclusions.

  19. Assessment of the Prony's method for BWR stability analysis

    Highlights: → This paper describes a method to determine the degree of stability of a BWR. → Performance comparison between Prony's and common AR techniques is presented. → Benchmark data and actual BWR transient data are used for comparison. → DR and f results are presented and discussed. → Prony's method is shown to be a robust technique for BWR stability. - Abstract: It is known that Boiling Water Reactors are susceptible to power oscillations in regions of high power and low coolant flow in the power-flow operational map. It is possible to fall into one of these instability regions during reactor startup, since power and coolant flow are both being increased, but not proportionally. Another possibility for falling into those areas is a trip of the recirculation pumps. Stability monitoring in such cases can be difficult, because the amount or quality of power signal data required for calculation of the key stability parameters may not be enough to provide reliable results in an adequate time range. In this work, Prony's method is presented as a complementary alternative to determine the degree of stability of a BWR through time series data. This analysis method can provide information about the decay ratio and oscillation frequency from power signals obtained during transient events. However, few applications to Boiling Water Reactor operation have been reported so far to establish the scope of using such analysis for actual transient events. This work first presents a comparison of decay ratio and oscillation frequency results obtained by Prony's method with those obtained by the participants of the Forsmark 1 and 2 Boiling Water Reactor Stability Benchmark using diverse techniques. Then, a comparison of decay ratio and oscillation frequency results is performed for four real BWR transient event data sets, using Prony's method and two other techniques based on autoregressive modeling. The four
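For a single dominant damped oscillation, Prony's method reduces to fitting a second-order linear prediction to the signal and reading the decay ratio and frequency off the characteristic roots. A minimal noiseless sketch of that idea; the synthetic signal parameters are invented, and real stability monitoring would use a higher model order and a least-squares fit over many samples:

```python
import cmath
import math

def prony_dr_freq(y, dt):
    """Second-order Prony fit: y[n] = a1*y[n-1] + a2*y[n-2].
    Returns (decay_ratio, frequency_hz) of the dominant oscillation."""
    # Solve the 2x2 linear-prediction system from four samples (Cramer's rule).
    det = y[1] * y[1] - y[0] * y[2]
    a1 = (y[2] * y[1] - y[3] * y[0]) / det
    a2 = (y[1] * y[3] - y[2] * y[2]) / det
    # Characteristic root z of z^2 - a1*z - a2 = 0 (upper half-plane root).
    z = (a1 + cmath.sqrt(a1 * a1 + 4 * a2)) / 2
    r, theta = abs(z), abs(cmath.phase(z))
    dr = r ** (2 * math.pi / theta)   # amplitude ratio over one full period
    freq = theta / (2 * math.pi * dt)
    return dr, freq

# Synthetic power signal with decay ratio e^-1 ~ 0.368 at 0.5 Hz:
dt, sigma, omega = 0.1, 0.5, math.pi
y = [math.exp(-sigma * n * dt) * math.cos(omega * n * dt) for n in range(4)]
dr, freq = prony_dr_freq(y, dt)
print(dr, freq)  # ~0.368, 0.5
```

A decay ratio below 1 indicates a damped (stable) oscillation; values approaching 1 flag the instability regions the abstract describes.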

  20. Assessing Canadian Bank Branch Operating Efficiency Using Data Envelopment Analysis

    Yang, Zijiang


    In today's economy and society, performance analyses in the service industries attract more and more attention. This paper presents an evaluation of 240 branches of one large Canadian bank in the Greater Toronto Area using Data Envelopment Analysis (DEA). Special emphasis was placed on how to present the DEA results to management so as to provide more guidance to them on what to manage and how to accomplish the changes. Finally, the potential management uses of the DEA results were presented. All the findings are discussed in the context of the Canadian banking market.

  1. Using Benefit-Cost Analysis to Assess Child Abuse Prevention and Intervention Programs.

    Plotnick, Robert D.; Deppman, Laurie


    Presents a case for using benefit-cost analysis to structure evaluations of child-abuse prevention and intervention programs. Presents the basic concept of benefit-cost analysis, its application in the context of assessing these types of child welfare programs, and limitations on its application to social service programs. (Author)
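The core of the benefit-cost analysis advocated here is discounting program benefits and costs to present value and comparing them. A minimal sketch; the discount rate and all cash flows are invented figures, not estimates from any actual program evaluation:

```python
def npv(flows, rate):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def benefit_cost_ratio(benefits, costs, rate=0.03):
    """Ratio of discounted benefits to discounted costs; > 1 favors the program."""
    return npv(benefits, rate) / npv(costs, rate)

# A hypothetical prevention program costing 100 up front that averts 30 per
# year of later child-welfare expenditures over four years:
benefits = [0, 30, 30, 30, 30]
costs = [100, 0, 0, 0, 0]
print(round(benefit_cost_ratio(benefits, costs), 2))  # -> 1.12
```

As the paper's limitations section suggests, the hard part is not this arithmetic but monetizing the benefits of averted abuse in the first place.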

  2. 77 FR 48107 - Workshop on Performance Assessments of Near-Surface Disposal Facilities: FEPs Analysis, Scenario...


    ...-Surface Disposal Facilities: FEPs Analysis, Scenario and Conceptual Model Development, and Code Selection... Radioactive Waste.'' These regulations were published in the Federal Register on December 27, 1982 (47 FR... on three aspects of a performance assessment: (1) Features, Events, and Processes (FEPs) analysis,...

  3. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complete models, such as those concerning the geologic disposal of radioactive waste. The study, which has run in parallel with the development of a code package (PROPER) for computer-assisted analysis of function, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors, like the number of input parameters, the capacity of the model and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these, the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. In the other basic method, the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)
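The first approach described, direct simulation of the model by sampling, can be sketched as straightforward Monte Carlo propagation. The toy model and input distributions below are invented for illustration and have nothing to do with the PROPER package itself:

```python
import random
import statistics

def propagate(model, samplers, n=10000, seed=1):
    """Direct Monte Carlo: sample each uncertain input, run the model,
    and summarize the output distribution by its mean and std deviation."""
    rng = random.Random(seed)
    outs = [model(*(s(rng) for s in samplers)) for _ in range(n)]
    return statistics.mean(outs), statistics.stdev(outs)

# A toy two-parameter model (e.g. release rate = k * c), both inputs uncertain:
model = lambda k, c: k * c
samplers = [lambda r: r.uniform(0.5, 1.5),        # rate constant
            lambda r: r.lognormvariate(0, 0.25)]  # concentration
mean, sd = propagate(model, samplers)
print(mean, sd)
```

The second approach in the abstract would replace `model` with a cheap fitted response surface before the sampling step.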

  4. Application of importance analysis probabilistic safety assessment results of Tehran Research Reactor

    The application of probabilistic safety assessment to evaluate the safety of hazardous facilities is only fulfilled when the results have been processed meaningfully. The purpose of importance analysis is to identify the major contributors to core damage frequency, which may include accident initiators, system failures, component failures and human errors. In this paper, the Fussell-Vesely measure of importance was applied to the results of the probabilistic safety assessment study of the Tehran Research Reactor. The analysis was done using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software.
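The Fussell-Vesely measure used here is the fraction of the top-event (e.g. core damage) frequency contributed by minimal cut sets containing a given basic event. A small sketch using the rare-event approximation; the cut sets and basic-event probabilities are invented, not taken from the Tehran Research Reactor study:

```python
def fussell_vesely(cut_sets, probs, event):
    """FV importance: probability from minimal cut sets containing `event`,
    over the total (rare-event approximation: sum of cut-set products)."""
    def cs_prob(cs):
        p = 1.0
        for e in cs:
            p *= probs[e]
        return p
    total = sum(cs_prob(cs) for cs in cut_sets)
    with_event = sum(cs_prob(cs) for cs in cut_sets if event in cs)
    return with_event / total

# Minimal cut sets of a toy fault tree (all probabilities hypothetical):
probs = {"pump_fail": 1e-3, "valve_stuck": 5e-4, "operator_err": 1e-2}
cut_sets = [{"pump_fail", "operator_err"}, {"valve_stuck"}]
print(fussell_vesely(cut_sets, probs, "pump_fail"))
```

Events with FV importance near 1 dominate the risk and are the natural targets for design or procedure improvements.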

  5. The analysis of financial statements as approach to the assessment of financial stability of the enterprise

    Y.E. Bezborodova


    Full Text Available This article considers several approaches to assessing the financial stability of enterprises through the analysis of financial statements. In a market economy, an enterprise's financial condition is the foundation of its stable position. Analysis of the financial condition of the enterprise is one of the most important elements of a management system, since such analysis reveals problem areas in the enterprise's activity through an assessment of financial stability, solvency and liquidity, and helps to define ways of resolving them.

  6. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package



    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  7. Alternative model for administration and analysis of research-based assessments

    Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.


    Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.

  8. Assessment of sperm quality using monoclonal antibodies and proteomic analysis

    Čapková, Jana; Kubátová, Alena; Margaryan, Hasmik; Pěknicová, Jana

    Praha: Biotechnologický ústav v.v AVČR, 2011 - (Pěknicová, J.). s. 63-63 [XVII. symposium českých reprodukčních imunologů s mezinárodní účastí. 26.05.2011-29.05.2011, Žďár nad Sázavou] R&D Projects: GA ČR(CZ) GA523/09/1793; GA ČR(CZ) GA523/08/H064; GA MŠk(CZ) 1M06011; GA MZd(CZ) NS10009 Institutional research plan: CEZ:AV0Z50520701 Keywords : sperm parameters * proteomic analysis * 2D PAGE * mass spectrometry Subject RIV: CE - Biochemistry

  9. In-field analysis and assessment of nuclear material

    Los Alamos National Laboratory has actively developed and implemented a number of instruments to monitor, detect, and analyze nuclear materials in the field. Many of these technologies, developed under existing US Department of Energy programs, can also be used to effectively interdict nuclear materials smuggled across or within national borders. In particular, two instruments are suitable for immediate implementation: the NAVI-2, a hand-held gamma-ray and neutron system for the detection and rapid identification of radioactive materials, and the portable mass spectrometer for the rapid analysis of minute quantities of radioactive materials. Both instruments provide not only critical information about the characteristics of the nuclear material for law-enforcement agencies and national authorities but also supply health and safety information for personnel handling the suspect materials

  10. Analysis And Assessment Of The Security Method Against Incidental Contamination In The Collective Water Supply System

    Szpak Dawid; Tchórzewska-Cieślak Barbara


    The paper presents the main types of incidental contamination of surface water and the security methods against incidental contamination of water sources. An analysis and assessment of the collective water supply system (CWSS) protection against incidental contamination were conducted. Failure Mode and Effects Analysis (FMEA) was used. The FMEA method allows analysis of the product or process, identification of weak points, and implementation of corrections and new solutions for eliminating the sou...
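
    The core of an FMEA worksheet is the Risk Priority Number, RPN = severity x occurrence x detection, which ranks failure modes for corrective action. A minimal sketch with hypothetical failure modes and scores (scale 1-10), not the values from the study:

```python
# Illustrative FMEA worksheet for incidental contamination events in a
# collective water supply system (scores are hypothetical).
failure_modes = [
    # (description, severity, occurrence, detection)
    ("chemical spill upstream of intake", 9, 3, 4),
    ("microbial contamination of source", 8, 4, 3),
    ("treatment-stage dosing failure", 6, 2, 2),
]

def rpn(severity, occurrence, detection):
    # Risk Priority Number: the standard FMEA ranking metric S x O x D.
    return severity * occurrence * detection

# Rank failure modes from highest to lowest risk priority.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN={rpn(s, o, d):3d}  {desc}")
```

    Modes at the top of the ranking are the weak points for which corrections and new solutions would be implemented first.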

  11. Assessment of Water Quality Parameters by Using the Multidimensional Scaling Analysis

    Suheyla Yerel; Huseyin Ankara


    The surface water quality parameters of the western Black Sea region of Turkey were assessed using multidimensional scaling analysis. The technique was applied to the surface water quality parameters obtained from five monitoring stations. Multidimensional scaling analysis showed that Cl-, SO42-, Na+ and BOD5 are the most important parameters causing differences among the monitoring stations. These results indicate that domestic waste and organic pollution affected the surfa...

  12. Rorschach assessment of traumatized refugees: an exploratory factor analysis.

    Opaas, Marianne; Hartmann, Ellen


    Fifty-one multitraumatized mental health patients with refugee backgrounds completed the Rorschach (Meyer & Viglione, 2008), the Harvard Trauma Questionnaire and Hopkins Symptom Checklist-25 (Mollica, McDonald, Massagli, & Silove, 2004), and the World Health Organization Quality of Life-BREF questionnaire (WHOQOL Group, 1998) before the start of treatment. The purpose was to gain more in-depth knowledge of an understudied patient group and to provide a prospective basis for later analyses of treatment outcome. Factor analysis of trauma-related Rorschach variables yielded 2 components explaining 60% of the variance; the first was interpreted as trauma-related flooding versus constriction and the second as adequate versus impaired reality testing. Component 1 correlated positively with self-reported reexperiencing symptoms of posttraumatic stress (r = .32, p < .05). Component 2 correlated positively with self-reported quality of life in the physical, psychological, and social relationships domains (r = .34, .32, and .35, p < .05), and negatively with anxiety (r = -.33, p < .05). Each component also correlated significantly with resources such as work experience, education, and language skills. PMID:23570250

  13. X-ray quality assessment by MTF analysis

    In a previous study a lucite phantom with several physical elements embedded, such as lead gratings with varying line widths, etc., was exposed at 200 X-ray installations in Bavaria, Federal Republic of Germany, by a conventional diagnostic standard technique. One of the parameters investigated was the local resolution achieved, determined visually by examination of the grating images. The same radiographs have now been used in a retrospective comparative study, based upon a quantitative analysis of TV images of the films. The films were positioned on a commercial illuminator screen and looked at by a TV camera through a simple magnifying optical system. The video signals were digitised, resulting in a pixel distance of 50 μm. In a first approach the edges of the broad frames of the lead gratings were adjusted vertically and the normalised sum of all 256 TV scanning lines taken as the edge function f(x). This was differentiated and suitably Fourier-transformed, delivering the modulation transfer function (MTF). The MTF can be analysed in several ways to describe quantitatively the maximum local resolution achievable. Correlation of some frequency measurements with the visually determined line resolution is generally good. (author)
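
    The processing chain described (edge function, differentiation, Fourier transform) can be sketched as follows. The edge profile here is synthetic, standing in for the normalised sum of the TV scanning lines, not one of the grating radiographs:

```python
import math, cmath

# Synthetic edge-spread function f(x): a blurred step edge, standing in for
# the normalised sum of the 256 TV scanning lines across a lead-grating edge.
n = 64
edge = [0.5 * (1 + math.erf((i - n / 2) / 3.0)) for i in range(n)]

# Line-spread function: numerical derivative of the edge function.
lsf = [edge[i + 1] - edge[i] for i in range(n - 1)]

# MTF: magnitude of the Fourier transform of the LSF, normalised at zero frequency.
def dft_mag(x, k):
    return abs(sum(x[j] * cmath.exp(-2j * math.pi * k * j / len(x)) for j in range(len(x))))

mtf = [dft_mag(lsf, k) / dft_mag(lsf, 0) for k in range(len(lsf) // 2)]
print("MTF at first few frequencies:", [round(v, 3) for v in mtf[:4]])
```

    The frequency at which the MTF drops below a chosen threshold (e.g. 10%) then serves as a quantitative measure of the maximum local resolution.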

  14. Digital image analysis outperforms manual biomarker assessment in breast cancer.

    Stålhammar, Gustav; Fuentes Martinez, Nelson; Lippert, Michael; Tobin, Nicholas P; Mølholm, Ida; Kis, Lorand; Rosin, Gustaf; Rantalainen, Mattias; Pedersen, Lars; Bergh, Jonas; Grunkin, Michael; Hartman, Johan


    In the spectrum of breast cancers, categorization according to the four gene expression-based subtypes 'Luminal A,' 'Luminal B,' 'HER2-enriched,' and 'Basal-like' is the method of choice for prognostic and predictive value. As gene expression assays are not yet universally available, routine immunohistochemical stains act as surrogate markers for these subtypes. Thus, congruence of surrogate markers and gene expression tests is of utmost importance. In this study, 3 cohorts of primary breast cancer specimens (total n=436) with up to 28 years of survival data were scored for Ki67, ER, PR, and HER2 status manually and by digital image analysis (DIA). The results were then compared for sensitivity and specificity for the Luminal B subtype, concordance to PAM50 assays in subtype classification and prognostic power. The DIA system used was the Visiopharm Integrator System. DIA outperformed manual scoring in terms of sensitivity and specificity for the Luminal B subtype, widely considered the most challenging distinction in surrogate subclassification, and produced slightly better concordance and Cohen's κ agreement with PAM50 gene expression assays. Manual biomarker scores and DIA essentially matched each other for Cox regression hazard ratios for all-cause mortality. When the Nottingham combined histologic grade (Elston-Ellis) was used as a prognostic surrogate, stronger Spearman's rank-order correlations were produced by DIA. Prognostic value of Ki67 scores in terms of likelihood ratio χ² (LR χ²) was higher for DIA, which also added significantly more prognostic information to the manual scores (LR Δχ²). In conclusion, the system for DIA evaluated here was in most aspects a superior alternative to manual biomarker scoring. It also has the potential to reduce time consumption for pathologists, as many of the steps in the workflow are either automatic or feasible to manage without pathological expertise. PMID:26916072

  15. FEBEX II Project Post-mortem analysis EDZ assessment

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.


    variations in the propagation velocities of acoustic waves. A cylindrical block of granite 38.8 cm in diameter and 40 cm high was analysed along 2D transversal sections in six radial directions. Different inverse tomographic strategies were used to analyse the measured data, which showed no evidence of the existence of an EDZ in the FEBEX gallery. However, a preferential direction of wave propagation, similar to the maximum compression direction of the stress tensor, did appear. As for the in situ investigations, the hydraulic connectivity of the drift was assessed at eleven locations in the heated area, including granite matrix and lamprophyre dykes, and at six locations in undisturbed zones. In the granite matrix area, a pulse test using pressurised air with stepwise pressure increases was conducted to determine the gas entry pressure. In the fractured area, a constant-flow-rate gas injection test was conducted. Only two locations with higher permeability were detected: one in a natural fracture in the lamprophyre dyke and the other at the interface between lamprophyre and granite. Where numerical investigations are concerned, several analyses of the FEBEX in situ experiment were carried out to determine whether the generation of a potential EDZ in the surrounding rock was possible. Stresses were calculated with a 1D fully coupled thermo-hydromechanical model and with 2D and 3D thermo-mechanical models. Results compared with the available data on the compressive strength of the Grimsel granite show that, in the worst case studied, the state of stresses induced by the excavation and heating phases remains far below the critical curve. (Author)

  16. Interconnectivity among Assessments from Rating Agencies: Using Cluster and Correlation Analysis

    Jaroslav Krejčíř


    Full Text Available The aim of this paper is to determine whether there is a dependency among the assessments of the leading rating agencies. Rating agencies are an important part of the global economy, and great attention has been paid to their activities since the financial crisis of 2007, of which credit rating agencies were identified as one of the main causes. This paper focuses on the existence of mutual interconnectivity among the assessments of three leading rating agencies. The method used for this determination is based on cluster analysis, followed by correlation analysis and a test of independence. The credit rating assessments of Greece and Spain were chosen for the determination of this mutual interconnectivity because these countries are among the most discussed euro-area countries. A significant dependence among the assessments of the different rating agencies was demonstrated.
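
    The correlation-and-independence-test step can be sketched as follows. The numeric rating encodings below are invented for illustration (higher = better grade), and the independence test shown is the usual t-test on Pearson's r; the paper's cluster analysis step is omitted:

```python
import math

# Hypothetical numeric encodings of one country's sovereign ratings over time
# from two agencies; not actual agency data.
agency_a = [9, 9, 8, 7, 5, 4, 3, 3]
agency_b = [9, 8, 8, 6, 5, 5, 3, 2]

def pearson(x, y):
    # Sample Pearson correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(agency_a, agency_b)
# t statistic for the test of independence (H0: rho = 0), df = n - 2.
t = r * math.sqrt((len(agency_a) - 2) / (1 - r ** 2))
print(f"r = {r:.3f}, t = {t:.2f}")
```

    A t value well beyond the critical value for n - 2 degrees of freedom rejects independence, i.e. demonstrates dependence between the two agencies' assessments.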

  17. An Analysis of the Cumulative Uncertainty Associated with a Quantitative Consequence Assessment of a Major Accident



    The task of the article is to quantify the uncertainty of the possible results of the accident consequence assessment of a chemical production plant and to provide some description of potential problems, with literature references and examples, to help avoid the erroneous use of available formulas. Based on the numbers presented in the article we may conclude that the main source of uncertainty in the consequence analysis of chemical accident assessment is surprisingly not only the dispers...

  18. An Analysis Of Tensile Test Results to Assess the Innovation Risk for an Additive Manufacturing Technology

    Adamczak Stanisław; Bochnia Jerzy; Kaczmarska Bożena


    The aim of this study was to assess the innovation risk for an additive manufacturing process. The analysis was based on the results of static tensile tests obtained for specimens made of photocured resin. The assessment involved analyzing the measurement uncertainty by applying the FMEA method. The structure of the causes and effects of the discrepancies was illustrated using the Ishikawa diagram. The risk priority numbers were calculated. The uncertainty of the tensile test measurement was ...

  19. A multiple-imputation based approach to sensitivity analysis and effectiveness assessment in longitudinal clinical trials

    Teshome Ayele, Birhanu; Lipkovich, Ilya; Molenberghs, Geert; Mallinckrodt, Craig H


    It is important to understand the effects of a drug as actually taken (effectiveness) and when taken as directed (efficacy). The primary objective of this investigation was to assess the statistical performance of a method referred to as placebo multiple imputation (pMI) as an estimator of effectiveness and as a worst reasonable case sensitivity analysis in assessing efficacy. The pMI method assumes the statistical behavior of placebo- and drug-treated patients after dropout is the statistica...

  20. Sensitivity analysis for the EPIK vulnerability assessment in a local karstic aquifer

    Gogu, Radu Constantin; Dassargues, Alain


    Applying the EPIK parametric method, a vulnerability assessment has been made for a small karstic groundwater system in southern Belgium. The aquifer is a karstified limestone of Devonian age. A map of intrinsic vulnerability of the aquifer and of the local water-supply system shows three vulnerability areas. A parameter-balance study and a sensitivity analysis were performed to evaluate the influence of single parameters on aquifer-vulnerability assessment using the EPIK method. This approac...

  1. Solar PV rural electrification and energy poverty assessment in Ghana: A principal component analysis

    Obeng, G. Y.; Evers, Hans-Dieter; Akuffo, F. O.; Braimah, I.; Brew-Hammond, A.


    The relationship between solar photovoltaic (PV) rural electrification and energy poverty was assessed using social, economic and environmental indicator-based questionnaires in 96 solar-electrified and 113 non-electrified households in rural Ghana. The purpose was to assess energy-poverty status of households with and without solar PV systems, and to determine the factors that explain energy-poverty in off-grid rural households. Principal component analysis (PCA) was used to construct energy...
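
    A minimal sketch of how PCA compresses several correlated indicators into a single composite index, as is commonly done for poverty scoring. The household indicators and data below are synthetic, not the Ghana survey data:

```python
import random

random.seed(7)

# Hypothetical standardized household indicators: correlated by construction
# through a latent "energy access" factor.
def household():
    base = random.gauss(0, 1)  # latent factor
    return [base + random.gauss(0, 0.3) for _ in range(3)]

data = [household() for _ in range(200)]
n, p = len(data), 3

# Covariance matrix of the centered data.
means = [sum(row[j] for row in data) / n for j in range(p)]
centered = [[row[j] - means[j] for j in range(p)] for row in data]
cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(p)] for i in range(p)]

# Leading principal component via power iteration.
v = [1.0, 1.0, 1.0]
for _ in range(100):
    w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

# The first-PC score of each household can serve as a composite index.
print("loadings:", [round(x, 3) for x in v])
```

    With strongly correlated indicators the loadings come out roughly equal, so the first component behaves like a weighted average capturing overall energy access.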

  2. Quantitative Assessment of Flame Stability Through Image Processing and Spectral Analysis

    Sun, Duo; Lu, Gang; Zhou, Hao; Yan, Yong; Liu, Shi


    This paper experimentally investigates two generalized methods, i.e., a simple universal index and oscillation frequency, for the quantitative assessment of flame stability at fossil-fuel-fired furnaces. The index is proposed to assess the stability of flame in terms of its color, geometry, and luminance. It is designed by combining up to seven characteristic parameters extracted from flame images. The oscillation frequency is derived from the spectral analysis of flame radiation signals. The...
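
    The oscillation-frequency part of such a method reduces to locating the dominant peak of the flame signal's magnitude spectrum. A sketch on a synthetic signal; the sampling rate and component frequencies are invented, not furnace data:

```python
import math, cmath

# Synthetic flame radiation signal: a 12 Hz oscillation plus a weaker 3 Hz
# component, sampled at 200 Hz.
fs, n = 200, 400
signal = [math.sin(2 * math.pi * 12 * t / fs) + 0.2 * math.sin(2 * math.pi * 3 * t / fs)
          for t in range(n)]

# Dominant oscillation frequency: peak of the DFT magnitude spectrum
# (excluding the DC bin).
def dft_mag(x, k):
    return abs(sum(x[j] * cmath.exp(-2j * math.pi * k * j / len(x)) for j in range(len(x))))

mags = [dft_mag(signal, k) for k in range(1, n // 2)]
peak_bin = mags.index(max(mags)) + 1
osc_freq = peak_bin * fs / n
print(f"oscillation frequency: {osc_freq:.1f} Hz")  # -> 12.0 Hz
```

    A drifting or broadening spectral peak over successive windows would then indicate deteriorating flame stability.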

  3. Application of inelastic neutron scattering and prompt neutron activation analysis in coal quality assessment

    The basic principles of determining the ash content of coal from measurements of quantities proportional to the effective proton number are assessed. The principle of coal quality assessment using inelastic neutron scattering and prompt neutron activation analysis is then discussed, with respect both to the theoretical relations between the measured values and coal quality attributes and to practical laboratory measurements of coal sample quality by the said methods. (author)

  4. English Language Assessment in the Colleges of Applied Sciences in Oman: Thematic Document Analysis

    Fatma Al Hajri


    Proficiency in English and how it is measured have become central issues in higher education research, as English is increasingly used as a medium of instruction and a criterion for admission to education. This study evaluated the English language assessment in the Foundation Programme at the Colleges of Applied Sciences in Oman. It used thematic analysis to study 118 documents on language assessment. Three main findings were reported: compatibility between what was ta...

  5. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    Guo, Ziyuan


    Objective audio quality assessment is an interdisciplinary research area that combines audiology and machine learning. Although much work has been done on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm based on an auditory model that simulates the human auditory system. The model relies on spectro-temporal modulation analysis of the spectrogram, which has been proven to be ...

  6. Comparison of digital image analysis versus visual assessment to assess survivin expression as an independent predictor of survival for patients with clear cell renal cell carcinoma

    Parker, Alexander S.; Lohse, Christine M.; Leibovich, Bradley C.; Cheville, John C; Sheinin, Yuri M.; Kwon, Eugene D.


    We previously used quantitative digital image analysis to report that high immunohistochemical tumor expression levels of survivin independently predict poor outcome among patients with clear cell renal cell carcinoma. However, given the cumbersome and costly nature of digital image analysis, we evaluated simple visual assessment as an alternative to digital image analysis for assessing survivin as a predictor of clear cell renal cell carcinoma patient outcomes. We identified 310 patients tre...

  7. Establishment of a Risk Assessment Framework for Analysis of the Spread of Highly Pathogenic Avian Influenza

    LI Jing; WANG Jing-fei; WU Chun-yan; YANG Yan-tao; JI Zeng-tao; WANG Hong-bin


    To evaluate the risk of highly pathogenic avian influenza (HPAI) in mainland China, a risk assessment framework was built. Risk factors were determined by analyzing the epidemic data using the brainstorming method; the analytic hierarchy process was designed to weight the risk factors, and integrated multicriteria analysis was used to evaluate the final result. The completed framework included the risk factor system, data standards for risk factors, weights of risk factors, and integrated assessment methods. This risk assessment framework can be used to quantitatively analyze the outbreak and spread of HPAI in mainland China.

  8. Uncertainty and Sensitivity Analysis in Performance Assessment for the Waste Isolation Pilot Plant

    Helton, J.C.


    The Waste Isolation Pilot Plant (WIPP) is under development by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. This development has been supported by a sequence of performance assessments (PAs) carried out by Sandia National Laboratories (SNL) to assess what is known about the WIPP and to provide guidance for future DOE research and development activities. Uncertainty and sensitivity analysis procedures based on Latin hypercube sampling and regression techniques play an important role in these PAs by providing an assessment of the uncertainty in important analysis outcomes and identifying the sources of this uncertainty. Performance assessments for the WIPP are conceptually and computationally interesting due to regulatory requirements to assess and display the effects of both stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, where stochastic uncertainty arises from the possible disruptions that could occur over the 10,000 yr regulatory period associated with the WIPP and subjective uncertainty arises from an inability to unambiguously characterize the many models and associated parameters required in a PA for the WIPP. The interplay between uncertainty analysis, sensitivity analysis, stochastic uncertainty and subjective uncertainty is discussed and illustrated in the context of a recent PA carried out by SNL to support an application by the DOE to the U.S. Environmental Protection Agency for the certification of the WIPP for the disposal of TRU waste.
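
    Latin hypercube sampling, the sampling scheme named above, stratifies each input's range so that even a small sample covers every equal-probability stratum exactly once. A minimal sketch restricted to uniform [0, 1) variables:

```python
import random

random.seed(1)

def latin_hypercube(n_samples, n_vars):
    # For each variable, draw one point in each of n_samples equal-probability
    # strata, then randomly permute the strata so variables are paired at random.
    columns = []
    for _ in range(n_vars):
        col = [(i + random.random()) / n_samples for i in range(n_samples)]
        random.shuffle(col)
        columns.append(col)
    return [[columns[v][s] for v in range(n_vars)] for s in range(n_samples)]

lhs = latin_hypercube(10, 3)
# Every variable has exactly one sample in each stratum [0, 0.1), [0.1, 0.2), ...
for v in range(3):
    print(sorted(int(row[v] * 10) for row in lhs))
```

    Regression of the model outputs on such a sample then yields the sensitivity coefficients used to identify the sources of uncertainty. For non-uniform inputs, each column would be mapped through the inverse CDF of the variable's distribution.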

  9. Assessment

    Geoff Brindley


    Introduction TERMINOLOGY AND KEY CONCEPTS The term assessment refers to a variety of ways of collecting information on a learner's language ability or achievement. Although testing and assessment are often used interchangeably, the latter is an umbrella term encompassing measurement instruments administered on a ‘one-off’ basis, such as tests, as well as qualitative methods of monitoring and recording student learning, such as observation, simulations or project work. Assessment is also distinguished from evaluation, which is concerned with the overall language programme and not just with what individual students have learnt. Proficiency assessment refers to the assessment of general language abilities acquired by the learner independent of a course of study. This kind of assessment is often done through the administration of standardised commercial language-proficiency tests. On the other hand, assessment of achievement aims to establish what a student has learned in relation to a particular course or curriculum (and is thus frequently carried out by the teacher). Achievement assessment may be based either on the specific content of the course or on the course objectives (Hughes 1989).

  10. Human Reliability Analysis in Frame of Probabilistic Safety Assessment Projects in Czech Republic

    Human reliability analysis has proved to be a very important part of probabilistic safety analysis all over the world. It has also been an integral part of both Level-1 probabilistic safety studies developed in the Czech Republic - the Nuclear Power Plant Dukovany Probabilistic Safety Assessment and the Nuclear Power Plant Temelin Probabilistic Safety Assessment - and most of their subsequent applications. The methodology used for human reliability analysis in these studies is described in the first part of the paper. In general, the methodology is based on the well-known and most frequently used methods, the Technique for Human Error Rate Prediction (THERP) and ASEP. The up-to-date decision tree method is used to address procedure-driven operator interventions during the plant response to an initiating event. Some interesting results of the human reliability analysis performed for Nuclear Power Plant Dukovany are described in the second part of the paper. The recommendations resulting from the analysis led to the standardization of some operator actions that had until then been non-standard, and to the development of procedures for them. Generally, the procedures were found to be deficient from several points of view, which contributed to the decision to develop entirely new emergency procedures for Nuclear Power Plant Dukovany. The human reliability analysis projects under way or planned for the near future are described in the final part of the paper.

  11. Creep-fatigue defect assessment test and analysis of high temperature structure

    Creep-fatigue damage evaluation and defect assessment are among the key elements in ascertaining the structural integrity of high temperature structures. In this study, a creep-fatigue test of a geometrically nonlinear structure containing through-wall defects was performed to examine the structural integrity of the defective structure and to validate the inelastic analysis code NONSTA. The creep-fatigue damage was examined with a portable zoom microscope, and the replication technique allowed observation of the structure surface. After 400 cycles of testing, no apparent creep-fatigue damage was observed except at the defect front, where creep-fatigue crack initiation was observed. The commercial finite element analysis software packages ANSYS and ABAQUS were used for the corresponding structural analysis. Both elastic analysis and inelastic analysis using the NONSTA code were performed with the temperature profile collected from the test, and the strain results of the analyses agree well with those from the test. The creep-fatigue damage was assessed per ASME-NH using the analysis results, and the creep-fatigue crack initiation was assessed per RCC-MR A16. The results show good agreement between analysis and test

  12. The role of uncertainty analysis in dose reconstruction and risk assessment

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  13. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological) and normal and accidental modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis and safety assessment documentation. After reviewing the facility's functions and the national and international requirements, the criteria for safety level assessment are set. As a result of the 2nd stage, the actual parameters of the facility necessary for the safety analysis are obtained. The methodology is selected on the basis of the comparability of the results with those of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described. A radiological hazard categorisation of the facilities is presented. A qualitative hazards and operability analysis is applied. The resulting list of events is subjected to a prioritization procedure based on the method of 'criticality analysis', so that an estimate of the risk is given for each event. Events whose risk falls on the boundary of acceptability, or is unacceptable, are subjected to the next steps of the analysis. As a result, lists of scenarios for PSA and possible design scenarios are established. PSA logical modeling and quantitative calculations of accident sequences are presented

  14. An analysis of assessment outcomes from eight years' operation of the Australian border weed risk assessment system.

    Weber, Jason; Dane Panetta, F; Virtue, John; Pheloung, Paul


    The majority of Australian weeds are exotic plant species that were intentionally introduced for a variety of horticultural and agricultural purposes. A border weed risk assessment system (WRA) was implemented in 1997 in order to reduce the high economic costs and massive environmental damage associated with introducing serious weeds. We review the behaviour of this system with regard to eight years of data collected from the assessment of species proposed for importation or held within genetic resource centres in Australia. From a taxonomic perspective, species from the Chenopodiaceae and Poaceae were most likely to be rejected and those from the Arecaceae and Flacourtiaceae were most likely to be accepted. Dendrogram analysis and classification and regression tree (TREE) models were also used to analyse the data. The latter revealed that a small subset of the 35 variables assessed was highly associated with the outcome of the original assessment. The TREE model examining all of the data contained just five variables: unintentional human dispersal, congeneric weed, weed elsewhere, tolerates or benefits from mutilation, cultivation or fire, and reproduction by vegetative propagation. It gave the same outcome as the full WRA model for 71% of species. Weed elsewhere was not the first splitting variable in this model, indicating that the WRA has a capacity for capturing species that have no history of weediness. A reduced TREE model (in which human-mediated variables had been removed) contained four variables: broad climate suitability, reproduction in less than or equal to 1 year, self-fertilisation, and tolerates and benefits from mutilation, cultivation or fire. It yielded the same outcome as the full WRA model for 65% of species. Data inconsistencies and the relative importance of questions are discussed, with some recommendations made for improving the use of the system. PMID:18339471
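
    The TREE (classification and regression tree) models referred to above choose, at each node, the single variable and threshold that best purify the outcome classes. A toy sketch of that first-split search using Gini impurity, with invented screening data rather than the WRA records:

```python
# Toy screening data: (weed elsewhere?, climate match score, outcome).
# Purely illustrative values.
data = [
    (1, 0.9, "reject"), (1, 0.7, "reject"), (1, 0.3, "reject"),
    (0, 0.8, "reject"), (0, 0.6, "accept"), (0, 0.4, "accept"),
    (0, 0.2, "accept"), (0, 0.1, "accept"),
]

def gini(rows):
    # Gini impurity of a set of rows for the binary outcome.
    if not rows:
        return 0.0
    p = sum(1 for r in rows if r[2] == "reject") / len(rows)
    return 2 * p * (1 - p)

def best_split(rows):
    # Exhaustively try axis-aligned splits, as a CART-style tree does,
    # keeping the one with the lowest weighted child impurity.
    best = None
    for var in (0, 1):
        for thr in sorted({r[var] for r in rows}):
            left = [r for r in rows if r[var] <= thr]
            right = [r for r in rows if r[var] > thr]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
            if best is None or score < best[0]:
                best = (score, var, thr)
    return best

score, var, thr = best_split(data)
print(f"first split: variable {var} <= {thr} (weighted impurity {score:.3f})")
```

    A full tree recurses on each child until the nodes are pure or too small; the variables chosen near the root are the ones most associated with the assessment outcome.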

  15. Data uncertainty analysis for safety assessment of HLW disposal by the Monte Carlo simulation

    Based on the conceptual model of the Reference Case, which is defined as the baseline for the various cases in the safety assessment of the H12 report, a new probabilistic simulation code that allows rapid evaluation of the effect of data uncertainty has been developed. Using this code, probabilistic simulation was performed by the Monte Carlo method, confirming the conservativeness and sufficiency of the safety assessment in the H12 report, which had been performed deterministically. In order to identify the important parameters, this study includes an analysis of the sensitivity structure between the inputs and the output. Cluster analysis, followed by multiple regression analysis for each cluster, was applied in this analysis. As a result, the transmissivity was found to have a strong influence on the uncertainty of the system performance. Furthermore, this approach was confirmed to identify both globally sensitive parameters and locally sensitive parameters that strongly influence subsets of the simulation results. (author)
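The core of such an analysis can be sketched as follows, assuming a purely illustrative two-parameter release model and invented lognormal distributions (not the H12 code or data): parameter samples are propagated through the model and influence is then ranked by correlating each input with the output.

```python
# Minimal Monte Carlo sketch: propagate parameter uncertainty through
# a hypothetical transfer model, then rank influence by correlating
# each (log) input with the (log) output. All distributions and the
# model itself are illustrative stand-ins.
import math
import random

random.seed(42)
N = 5000
samples = []
for _ in range(N):
    transmissivity = random.lognormvariate(mu=-23.0, sigma=1.5)  # m^2/s (assumed)
    sorption_kd = random.lognormvariate(mu=-4.0, sigma=0.5)      # m^3/kg (assumed)
    # Toy performance measure: release grows with transmissivity and
    # falls with sorption (purely illustrative).
    release = transmissivity / (1.0 + 100.0 * sorption_kd)
    samples.append((math.log(transmissivity), math.log(sorption_kd),
                    math.log(release)))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (vx * vy)

t, k, r = zip(*samples)
print("transmissivity vs output:", round(pearson(t, r), 2))
print("sorption Kd    vs output:", round(pearson(k, r), 2))
```

With the assumed spreads, transmissivity dominates the output uncertainty, mirroring the paper's finding that transmissivity was the strongest driver.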

  16. Degradation Assessment and Fault Diagnosis for Roller Bearing Based on AR Model and Fuzzy Cluster Analysis

    Lingli Jiang


    Full Text Available This paper proposes a new approach combining an autoregressive (AR) model and fuzzy cluster analysis for bearing fault diagnosis and degradation assessment. The AR model is an effective approach for extracting fault features, but is generally applied to stationary signals, whereas the fault vibration signals of a roller bearing are non-stationary and non-Gaussian. To address this problem, the parameters of the AR model are estimated based on higher-order cumulants. The AR parameters are then taken as the feature vectors, and fuzzy cluster analysis is applied to perform classification and pattern recognition. Experimental results show that the proposed method can identify various types and severities of bearing faults. This study is significant for non-stationary and non-Gaussian signal analysis, fault diagnosis and degradation assessment.
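The feature-extraction step can be sketched as below. Note the simplifications: the paper estimates AR parameters from higher-order cumulants, while this sketch uses ordinary autocorrelation (Yule-Walker) for brevity, and the "vibration" segments are synthetic sine-plus-noise stand-ins, not bearing data.

```python
# Sketch of AR-coefficient fault features: fit an AR(2) model to each
# signal segment via the 2x2 Yule-Walker equations, then compare the
# resulting feature vectors. Signals are synthetic stand-ins.
import math
import random

def autocorr(x, lag):
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    den = sum((v - m) ** 2 for v in x)
    return num / den

def ar2_coeffs(x):
    """Solve the 2x2 Yule-Walker system for AR(2) parameters."""
    r1, r2 = autocorr(x, 1), autocorr(x, 2)
    det = 1 - r1 * r1
    a1 = (r1 - r1 * r2) / det
    a2 = (r2 - r1 * r1) / det
    return a1, a2

random.seed(0)
def segment(freq_hz, fs=1000, n=1000):
    """Synthetic 'vibration' segment: sine at freq_hz plus noise."""
    return [math.sin(2 * math.pi * freq_hz * i / fs) + 0.1 * random.gauss(0, 1)
            for i in range(n)]

healthy = ar2_coeffs(segment(50))
faulty = ar2_coeffs(segment(120))
# The two conditions yield separated feature vectors, which is what
# lets a clustering step (fuzzy c-means in the paper) tell them apart.
dist = math.dist(healthy, faulty)
print(round(dist, 3))
```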

  17. 76 FR 67764 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...


    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...-xxxx, Revision 0, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and... at (301) 492-3446. FOR FURTHER INFORMATION CONTACT: Song-Hua Shen, Division of Risk Analysis,...

  18. Assessing Heterogeneity for Factor Analysis Model with Continuous and Ordinal Outcomes

    Ye-Mao Xia; Jian-Wei Gou


    Factor analysis models with continuous and ordinal responses are a useful tool for assessing relations between the latent variables and mixed observed responses. These models have been successfully applied to many different fields, including behavioral, educational, and social-psychological sciences. However, within the Bayesian analysis framework, most developments are constrained within parametric families, of which the particular distributions are specified for the parameters of interest. ...

  19. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat; Gani, Rafiqul


    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes: process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady-state process simulation is performed to generate a base case design of the bioethanol conversion process using cassava rhizome as a feedstock. The sustainability analysis is performed to analyze the relevant indicators in sustainability metrics, to define design/retro...

  20. Non-linear finite element assessment analysis of a modern heritage structure

    S. Sorace; Terenzi, G


    A synthesis of a non-linear finite element structural assessment enquiry carried out on a monumental modern heritage building is reported in this paper. The study includes a buckling analysis of the slender steel beams constituting a mushroom-type roof, and an "integral" seismic pushover analysis of the supporting R/C columns. The computational solutions obtained for the steel roof beams are compared to the results derived from a calculation of the critical stress of beam panels, and the glob...

  1. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    After radioactive sea debris had been found on beaches near the BNFL, Sellafield, plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  2. The ICR142 NGS validation series: a resource for orthogonal assessment of NGS analysis

    Elise Ruark; Anthony Renwick; Matthew Clarke; Katie Snape; Emma Ramsay; Anna Elliott; Sandra Hanks; Ann Strydom; Sheila Seal; Nazneen Rahman


    To provide a useful community resource for orthogonal assessment of NGS analysis software, we present the ICR142 NGS validation series. The dataset includes high-quality exome sequence data from 142 samples together with Sanger sequence data at 730 sites; 409 sites with variants and 321 sites at which variants were called by an NGS analysis tool, but no variant is present in the corresponding Sanger sequence. The dataset includes 286 indel variants and 275 negative indel sites, and thus the I...

  3. Assessing collective defensive performances in football: A Qualitative Comparative Analysis of central back pairs

    Kaufmann, David


    Ahead of the World Cup in Brazil, the crucial question for the Swiss national coach is which central back pair to nominate for the starting eleven. A fuzzy set Qualitative Comparative Analysis assesses the defensive performances of different Swiss central back pairs during the World Cup campaign (2011–2014). This analysis advises Ottmar Hitzfeld to nominate Steve von Bergen and Johan Djourou as the starting central back pair. The alternative with a substantially weaker empirical validity ...
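The central quantity in fuzzy-set QCA is set-theoretic consistency, consistency(X => Y) = sum(min(x_i, y_i)) / sum(x_i), which measures the degree to which membership in a causal condition implies membership in the outcome. A minimal sketch with invented membership scores (not the study's data):

```python
# Sketch of the fsQCA consistency measure. Membership scores below
# are hypothetical, not the study's calibrated data.

def consistency(cause, outcome):
    """Degree to which cause-set membership implies outcome membership."""
    num = sum(min(x, y) for x, y in zip(cause, outcome))
    return num / sum(cause)

# Hypothetical fuzzy memberships across five matches: degree to which
# a given pairing played (cause) and the defence was solid (outcome).
pairing_played = [0.9, 0.8, 0.6, 1.0, 0.7]
solid_defence = [0.8, 0.9, 0.4, 1.0, 0.6]

print(round(consistency(pairing_played, solid_defence), 3))
```

For these scores the consistency evaluates to 0.9; in fsQCA practice, values of roughly 0.8 and above are usually taken to support a sufficiency claim.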

  4. Risk assessment of inhalation exposure for airborne toxic metals using instrumental neutron activation analysis

    In order to study the effects of air pollution, about 1,300 samples of airborne particulate matter (APM) were collected at suburban and industrial sites, in Daejeon, Korea from 1998 to 2006. The concentrations of carcinogenic (As and Cr) and non-carcinogenic metals (Al, Mn, and Zn) were determined by using instrumental neutron activation analysis (INAA). These long-term metal concentration data were applied to a risk assessment of inhalation exposure using Monte Carlo analysis (MCA). (author)
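The Monte Carlo step can be sketched as follows. For an inhaled carcinogen, excess lifetime cancer risk is commonly estimated as concentration times an inhalation unit risk (IUR), with the concentration drawn from a fitted distribution. Everything below is an illustrative assumption: the lognormal parameters are invented, and the IUR of 4.3e-3 per µg/m³ is used only as a plausible arsenic-scale constant, not a value from the study.

```python
# Sketch of Monte Carlo inhalation risk assessment: sample the metal
# concentration from an assumed lognormal distribution, convert each
# sample to excess lifetime cancer risk via an inhalation unit risk,
# and summarize the risk distribution. All inputs are illustrative.
import math
import random

random.seed(7)

def mc_cancer_risk(n=20000, iur=4.3e-3):
    """Risk = C (ug/m3) * IUR (per ug/m3); returns mean risk and the
    fraction of samples above a 1e-6 screening threshold."""
    total = 0.0
    exceed = 0
    for _ in range(n):
        conc = random.lognormvariate(math.log(2e-3), 0.6)  # ug/m3, assumed
        risk = conc * iur
        total += risk
        if risk > 1e-6:
            exceed += 1
    return total / n, exceed / n

mean_risk, frac_above = mc_cancer_risk()
print(f"mean risk {mean_risk:.2e}, P(risk > 1e-6) = {frac_above:.2f}")
```

Reporting the full distribution (or exceedance probability) rather than a single point estimate is the main advantage MCA offers over a deterministic calculation.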

  5. Assessing Low-Carbon Development in Nigeria : An Analysis of Four Sectors

    Cervigni, Raffaello; Rogers, John Allen; Dvorak, Irina


    The Federal Government of Nigeria (FGN) and the World Bank have agreed to carry out a Climate Change Assessment (CCA) within the framework of the Bank's Country Partnership Strategy (CPS) for Nigeria (2010-13). The CCA includes an analysis of options for low-carbon development in selected sectors, including power, oil and gas, transport, and agriculture. The goal of the low-carbon analysis...

  6. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: correct association of distributions with parameters, importance and qualification of expert opinions, generation of samples according to sample size, and study of the relationships between system variables and the system response. A series of statistical-mathematical techniques is recommended throughout the development of the analysis methodology, as well as different graphical visualizations for controlling the study. (author)
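One standard technique for studying the relationship between sampled inputs and the system response is rank (Spearman) correlation, which is robust to the skewed distributions typical of failure rates. The sketch below is an illustrative stand-in, not the paper's method: a toy two-parameter risk model with invented lognormal failure-rate distributions.

```python
# Sketch of input-output sensitivity ranking with Spearman rank
# correlation on Monte Carlo samples. The risk model and the
# failure-rate distributions are invented for illustration.
import random

def rank(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(xs, ys):
    """Pearson correlation of the rank vectors (no tie handling;
    fine for continuous samples)."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    mx = (n - 1) / 2
    cov = sum((a - mx) * (b - mx) for a, b in zip(rx, ry))
    var = sum((a - mx) ** 2 for a in rx)
    return cov / var

random.seed(3)
lam_pump = [random.lognormvariate(-9, 1.0) for _ in range(2000)]   # assumed
lam_valve = [random.lognormvariate(-11, 0.3) for _ in range(2000)]  # assumed
# Toy system response: dominated by the more uncertain pump rate.
response = [p + v for p, v in zip(lam_pump, lam_valve)]

print("pump :", round(spearman(lam_pump, response), 2))
print("valve:", round(spearman(lam_valve, response), 2))
```

The variable with the highest rank correlation is the natural candidate for refined data collection or expert elicitation.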

  7. Impact Factor 2.0 : Applying Social Network Analysis to Scientific Impact Assessment

    Hoffmann, Christian Pieter; Lutz, Christoph; Meckel, Miriam


    Social media are becoming increasingly popular in scientific communication. A range of platforms are geared specifically towards the academic community. Proponents of the altmetrics approach point out that these new media allow for new avenues of scientific impact assessment. Traditional impact measures based on bibliographic analysis have long been criticized for overlooking the relational dynamic of scientific impact. We therefore propose an application of social network analysis to researc...


    A technique to rapidly assess phytoplankton dynamics is being evaluated for its utility in the Great Lakes. Comparison to traditional microscopic techniques and to more recent in-situ FluoroProbe technology will allow us to determine if HPLC pigment analysis can provide unique a...

  9. Method of synchronization assessment of rhythms in regulatory systems for signal analysis in real time

    Borovkova E.I.; Ishbulatov Yu.M.; Mironov S.A.


    A method is proposed for quantitative assessment of the phase synchronization of 0.1 Hz oscillations in autonomic cardiovascular control by photoplethysmogram analysis in real time. The efficiency of the method is demonstrated by comparison with results obtained using a previously developed method.
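A common index for this kind of phase synchronization is the phase-locking value, PLV = |mean(exp(i*(phi1 - phi2)))|; this is an illustrative stand-in, not necessarily the authors' index. Phases below are simulated directly; in practice they would be extracted from the signal band-pass filtered around 0.1 Hz (e.g. via the Hilbert transform).

```python
# Sketch of a phase-locking value for two 0.1 Hz rhythms. A value
# near 1 means the phase difference is nearly constant (synchronized);
# a drifting phase difference pushes the value toward 0. Phases are
# simulated, not extracted from real photoplethysmogram data.
import cmath
import math
import random

random.seed(5)

def plv(phi1, phi2):
    """Phase-locking value in [0, 1]; 1 = perfect synchronization."""
    s = sum(cmath.exp(1j * (a - b)) for a, b in zip(phi1, phi2))
    return abs(s) / len(phi1)

t = [i * 0.1 for i in range(600)]             # 60 s at 10 Hz
base = [2 * math.pi * 0.1 * ti for ti in t]   # 0.1 Hz rhythm
locked = [b + 0.3 * random.gauss(0, 1) for b in base]            # synchronized
drift = [b + 2 * math.pi * 0.03 * ti for b, ti in zip(base, t)]  # detuned

print("locked:", round(plv(base, locked), 2))
print("drift :", round(plv(base, drift), 2))
```

For real-time use the same sum is maintained over a sliding window, so the index updates with each new sample.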

  10. Method of synchronization assessment of rhythms in regulatory systems for signal analysis in real time

    Borovkova E.I.


    Full Text Available A method is proposed for quantitative assessment of the phase synchronization of 0.1 Hz oscillations in autonomic cardiovascular control by photoplethysmogram analysis in real time. The efficiency of the method is demonstrated by comparison with results obtained using a previously developed method.