WorldWideScience

Sample records for assessment ioa analysis

  1. Validation of the Indicators of Abuse (IOA) Screen.

    Science.gov (United States)

    Reis, Myrna; Nahmiash, Daphne

    1998-01-01

    Reports on the validity of the Indicators of Abuse (IOA) Screen, used by social-services-agency practitioners as an abuse screening tool. An abuse-indicator model evolving from the IOA suggests three main types of abuse signals: caregivers' personal problems/issues, caregivers' interpersonal problems, and care receivers' social-support shortages…

  2. Urban energy consumption: Different insights from energy flow analysis, input–output analysis and ecological network analysis

    International Nuclear Information System (INIS)

    Highlights: • Urban energy consumption was assessed from three different perspectives. • A new concept called controlled energy was developed from network analysis. • Embodied energy and controlled energy consumption of Beijing were compared. • The integration of all three perspectives will elucidate sustainable energy use. - Abstract: Energy consumption has always been a central issue for sustainable urban assessment and planning. Different forms of energy analysis can provide various insights for energy policy making. This paper brought together three approaches to energy consumption accounting, i.e., energy flow analysis (EFA), input–output analysis (IOA) and ecological network analysis (ENA), and compared their different perspectives and the policy implications for urban energy use. Beijing was used to exemplify the different energy analysis processes, and the 42 economic sectors of the city were aggregated into seven components. It was determined that EFA quantifies both the primary and final energy consumption of the urban components by tracking the different types of fuel used by the urban economy. IOA accounts for the embodied energy consumption (direct and indirect) used to produce goods and services in the city, whereas the control analysis of ENA quantifies the specific embodied energy that is regulated by the activities within the city's boundary. The network control analysis can also be applied to determine which economic sectors drive the energy consumption and to what extent these sectors are dependent on each other for energy. So-called "controlled energy" is a new concept that adds to the analysis of urban energy consumption, indicating the adjustable energy consumed by sectors. The integration of insights from all three accounting perspectives furthers our understanding of sustainable energy use in cities.
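
    The embodied-energy accounting that IOA performs can be illustrated with a toy Leontief model. The sector count, technical coefficients, final demands and energy intensities below are invented for the sketch and are not the Beijing data used in the paper.

```python
import numpy as np

# Hypothetical 3-sector economy (illustrative values only; the paper
# aggregates Beijing into seven components).
A = np.array([            # technical coefficients: input from i per unit output of j
    [0.10, 0.20, 0.05],
    [0.15, 0.05, 0.10],
    [0.05, 0.10, 0.15],
])
f = np.array([100.0, 50.0, 80.0])      # final demand per sector
e_direct = np.array([0.8, 0.3, 0.5])   # direct energy intensity per unit output

# Leontief inverse: total (direct + indirect) output per unit of final demand
L = np.linalg.inv(np.eye(3) - A)

# Embodied energy intensity per sector, and embodied energy driven by each
# sector's final demand
e_embodied = e_direct @ L
E_by_sector = e_embodied * f

print(e_embodied)        # each embodied intensity exceeds its direct intensity
print(E_by_sector.sum())
```

    Because the Leontief inverse equals I + A + A² + …, every embodied intensity is at least as large as the corresponding direct intensity; the difference is the indirect energy traced through inter-sector purchases.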

  3. Analysis and assessment

    International Nuclear Information System (INIS)

    The ultimate objective is to predict potential health costs to man accruing from the effluents or by-products of any energy system or mix of systems, but the establishment of reliable prediction equations first requires a baseline analysis of those preexisting and essentially uncontrolled factors known to have significant influence on patterns of mortality. These factors are the cultural, social, economic, and demographic traits of a defined local or regional population. Thus, the immediate objective is the rigorous statistical definition of consistent relationships that may exist among the above traits and between them and selected causes of death, especially those causes that may have interpretive value for the detection of environmental pollutants.

  4. Integrated Operations Architecture Technology Assessment Study

    Science.gov (United States)

    2001-01-01

    As part of NASA's Integrated Operations Architecture (IOA) Baseline, NASA will consolidate all communications operations, including ground-based, near-earth, and deep-space communications, into a single integrated network. This network will make maximum use of commercial equipment, services and standards. It will be an Internet Protocol (IP) based network. This study supports technology development planning for the IOA. The technical problems that may arise when LEO mission spacecraft interoperate with commercial satellite services were investigated. Commercial technology and services that could support the IOA were surveyed, and gaps in the capability of existing technology and techniques were identified. Recommendations were made on which gaps should be closed by means of NASA research and development funding. Several findings emerged from the interoperability assessment: in the NASA mission set, there is a preponderance of small, inexpensive, low-data-rate science missions; proposed commercial satellite communications services could potentially provide TDRSS-like data relay functions; and IP and related protocols, such as TCP, require augmentation to operate in the mobile networking environment required by the space-to-ground portion of the IOA. Five case studies were performed in the technology assessment. Each case represented a realistic implementation of the near-earth portion of the IOA. The cases included the use of frequencies at L-band, Ka-band and the optical spectrum. The cases also represented both space relay architectures and direct-to-ground architectures.
    Some of the main recommendations resulting from the case studies are: select an architecture for the LEO/MEO communications network; pursue the development of a Ka-band space-qualified transmitter (and possibly a receiver) and a low-cost Ka-band ground terminal for a direct-to-ground network; and pursue the development of an Inmarsat (L-band) space-qualified transceiver to implement a global, low…

  5. Assessment of Thorium Analysis Methods

    International Nuclear Information System (INIS)

    An assessment of thorium analytical methods for power-fuel mixtures, consisting of titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC) and neutron activation, was carried out. It can be concluded that the analytical methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis and UV-VIS spectrometry, whereas the methods with low accuracy (standard deviation 3-10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulation samples can be analyzed by titrimetry, polarography and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, analytical methods without molecule reactions are in general better than those involving molecule reactions (author). 19 refs., 1 tab.

  6. Change point analysis and assessment

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm

    2011-01-01

    The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches to studying events, processes and change, namely change-point analysis, event-history analysis, critical-incident technique and sequence analysis…
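
    As a minimal illustration of the change-point analysis the framework draws on, the sketch below locates a single mean shift by minimising the pooled sum of squared deviations over all candidate split points. The function name and data are invented for the example.

```python
# Minimal single change-point sketch: try every split [0,k) vs [k,n) and keep
# the split that minimises the total within-segment sum of squares.
def change_point(series):
    best_k, best_cost = None, float("inf")
    for k in range(1, len(series)):
        left, right = series[:k], series[k:]
        cost = sum((x - sum(left) / len(left)) ** 2 for x in left) \
             + sum((x - sum(right) / len(right)) ** 2 for x in right)
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

data = [1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.9, 5.1]
print(change_point(data))  # → 4 (the mean shifts between index 3 and 4)
```

    Real change-point methods extend this idea to multiple changes and penalised model selection; the exhaustive single-split search is only the simplest instance.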

  7. Multifractal analysis for nutritional assessment.

    Directory of Open Access Journals (Sweden)

    Youngja Park

    Full Text Available The concept of multifractality is currently used to describe self-similar and complex scaling properties observed in numerous biological signals. Fractals are geometric objects or dynamic variations which exhibit some degree of similarity (irregularity) to the original object in a wide range of scales. This approach determines the irregularity of a biologic signal as an indicator of adaptability, the capability to respond to unpredictable stress, and health. In the present work, we propose the application of multifractal analysis of wavelet-transformed proton nuclear magnetic resonance (1H NMR) spectra of plasma to determine nutritional insufficiency. For validation of this method on 1H NMR signals of human plasma, the standard deviation from the classical statistical approach and the Hurst exponent (H), left slope and partition function from multifractal analysis were extracted from 1H NMR spectra to test whether multifractal indices could discriminate healthy subjects from unhealthy, intensive care unit patients. After validation, the multifractal approach was applied to spectra of plasma from a modified crossover study of sulfur amino acid insufficiency and tested for associations with blood lipids. The results showed that standard deviation and H, but not left slope, were significantly different for sulfur amino acid sufficiency and insufficiency. Quadratic discriminant analysis of H, left slope and the partition function showed 78% overall classification accuracy according to sulfur amino acid status. Triglycerides and apolipoprotein C3 were significantly correlated with a multifractal model containing H, left slope, and standard deviation, and cholesterol and high-sensitivity C-reactive protein were significantly correlated with H. In conclusion, multifractal analysis of 1H NMR spectra provides a new approach to characterize nutritional status.
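
    The Hurst exponent H used above as a multifractal index can be estimated, in its simplest monofractal form, from how the spread of signal increments grows with lag: std(x[t+lag] − x[t]) ~ lag**H for self-affine signals. The function name and the synthetic test signal below are invented for the sketch; this is not the wavelet-based estimator of the paper.

```python
import math
import random

def hurst(ts, max_lag=20):
    """Estimate H from the log-log slope of increment spread versus lag."""
    lags = range(2, max_lag)
    spreads = []
    for lag in lags:
        diffs = [ts[i + lag] - ts[i] for i in range(len(ts) - lag)]
        mean = sum(diffs) / len(diffs)
        spreads.append(math.sqrt(sum((d - mean) ** 2 for d in diffs) / len(diffs)))
    # least-squares slope of log(spread) against log(lag)
    xs = [math.log(l) for l in lags]
    ys = [math.log(s) for s in spreads]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

random.seed(0)
walk = [0.0]
for _ in range(5000):
    walk.append(walk[-1] + random.gauss(0, 1))
print(hurst(walk))  # close to 0.5 for an uncorrelated random walk
```

    Persistent signals give H > 0.5 and anti-persistent signals H < 0.5; multifractal analysis generalises this single exponent to a spectrum of local exponents.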

  8. Assessing Analysis and Reasoning in Bioethics

    Science.gov (United States)

    Pearce, Roger S.

    2008-01-01

    Developing critical thinking is a perceived weakness in current education. Analysis and reasoning are core skills in bioethics making bioethics a useful vehicle to address this weakness. Assessment is widely considered to be the most influential factor on learning (Brown and Glasner, 1999) and this piece describes how analysis and reasoning in…

  9. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  10. Quality Assessment of Urinary Stone Analysis

    DEFF Research Database (Denmark)

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel;

    2016-01-01

    The aim of the present study was to assess the quality of urinary stone analysis of laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of the quality of the laboratories in the present study was based on the… fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference… and chemical analysis…

  11. Safety analysis and risk assessment handbook

    International Nuclear Information System (INIS)

    This Safety Analysis and Risk Assessment Handbook (SARAH) provides guidance to the safety analyst at the Rocky Flats Environmental Technology Site (RFETS) in the preparation of safety analyses and risk assessments. Although the older guidance (the Rocky Flats Risk Assessment Guide) continues to be used for updating the Final Safety Analysis Reports developed in the mid-1980s, this new guidance is used with all new authorization basis documents. With the mission change at RFETS came the need to establish new authorization basis documents for its facilities, whose functions had changed. The methodology and databases for performing the evaluations that support the new authorization basis documents had to be standardized, to avoid the use of different approaches and/or databases for similar accidents in different facilities. This handbook presents this new standardized approach. The handbook begins with a discussion of the requirements of the different types of authorization basis documents and how to choose the one appropriate for the facility to be evaluated. It then walks the analyst through the process of identifying all the potential hazards in the facility, classifying them, and choosing the ones that need to be analyzed further. It then discusses the methods for evaluating accident initiation and progression and covers the basic steps in a safety analysis, including consequence and frequency binning and risk ranking. The handbook lays out standardized approaches for determining the source terms of the various accidents (including airborne release fractions, leakpath factors, etc.), the atmospheric dispersion factors appropriate for Rocky Flats, and the methods for radiological and chemical consequence assessments. The radiological assessments use a radiological "template", a spreadsheet that incorporates the standard values of parameters, whereas the chemical assessments use the standard codes ARCHIE and ALOHA.

  12. Spatial interaction analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    In several probabilistic risk assessments (PRA), it has been shown that accident scenarios involving ''external events'', such as fires and floods, can make an important contribution to the frequency of core damage and radionuclide release. These events belong to the broader category of common cause events, and an important issue in the evaluation of these events is whether a complete set of scenarios has been considered. In this article, a systematic scoping method is described for identifying and ranking scenarios involving environmental hazards that originate within plant boundaries and for determining the scope of the subsequent detailed external event analysis. This method is also known as spatial interaction analysis. It was developed as part of the Seabrook Station Probabilistic Safety Assessment and has since been improved and applied to two other PRAs.

  13. Dynamic analysis and assessment for sustainable development

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The assessment of sustainable development is crucial for constituting sustainable development strategies. Assessment methods that exist so far usually only use an indicator system for making sustainability judgements. These indicators rarely reflect dynamic characteristics. However, sustainable development is influenced by changes in the social-economic system and in the eco-environmental system at different times. Besides its spatial character, sustainable development has a temporal character that cannot be neglected; therefore the research system should also be dynamic. This paper focuses on this dynamic trait, so that the assessment results obtained provide more information for judgements in decision-making processes. First, the dynamic characteristics of sustainable development are analyzed, which show that the track of sustainable development is an upward undulating curve. According to the dynamic character and the development rules of a social, economic and ecological system, a flexible assessment approach based on tendency analysis, restrictive conditions and a feedback system is then proposed for sustainable development.

  14. Office of Integrated Assessment and Policy Analysis

    International Nuclear Information System (INIS)

    The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated on an evaluation of impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impacts. It is planned that an integrated assessment team will examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environmental control costs/benefits, institutional structures, and facility siting. This examination would assess the methodologies used, determine the general applicability of such studies, and present in a logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined.

  15. Assessment of right atrial function analysis

    International Nuclear Information System (INIS)

    To assess the potential utility of right atrial function analysis in cardiac disease, reservoir function, pump function, and right atrial peak emptying rate (RAPER) were compared in 10 normal subjects, 32 patients with coronary artery disease, and 4 patients with primary pulmonary hypertension. Right atrial volume curves were obtained using a cardiac radionuclide method with Kr-81m. In normal subjects, the reservoir function index was 0.41±0.05 and the pump function index was 0.25±0.05. Both groups of patients had decreased reservoir function and increased pump function. Pump function tended to decrease with an increase of right ventricular end-diastolic pressure. RAPER correlated well with right ventricular peak filling rate, probably reflecting right ventricular diastolic function. Analysis of right atrial function seemed to be of value in evaluating the factors regulating right ventricular contraction, diastolic function, and cardiac output. (Namekawa, K)

  16. Multicriteria analysis in hazards assessment in Libya

    Science.gov (United States)

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol

    2012-11-01

    Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have proved that these problems can be solved through planning studies and detailed information about the prone areas. Determining the time, location and size of the problem is important for decision makers in planning and management activities. It is important to know the risk represented by those hazards and to take actions to protect against them. Multicriteria analysis methods - the Analytic hierarchy process, Pairwise comparison and the Ranking method - are used to analyse which hazard facing Libya is the most dangerous. Multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternative(s) should be preferred. Regarding our problem of environmental risk assessment, the result will be a ranking or categorisation of hazards with regard to their risk level.
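
    The Analytic hierarchy process step can be sketched as follows: priority weights are taken from the principal eigenvector of a pairwise comparison matrix, and Saaty's consistency ratio checks whether the judgments are coherent. The matrix values below are invented Saaty-scale judgments for three hypothetical hazards, not the study's data.

```python
import numpy as np

# Pairwise comparison matrix: entry (i, j) says how much more important
# hazard i is than hazard j on the 1-9 Saaty scale (illustrative values).
P = np.array([
    [1.0,   3.0, 5.0],
    [1/3,   1.0, 2.0],
    [1/5,   1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)               # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority weights, normalised to 1

lam_max = eigvals.real[k]
CI = (lam_max - 3) / (3 - 1)              # consistency index for n = 3
CR = CI / 0.58                            # Saaty random index RI = 0.58 for n = 3

print(np.round(w, 3))  # ranking of the three hazards by priority weight
print(CR < 0.1)        # judgments acceptably consistent if CR < 0.1
```

    A CR above 0.1 is conventionally taken to mean the pairwise judgments should be revisited before the ranking is trusted.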

  17. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    Energy Technology Data Exchange (ETDEWEB)

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  18. Geochemical and Geochronologic Investigations of Zircon-hosted Melt Inclusions in Rhyolites from the Mesoproterozoic Pea Ridge IOA-REE Deposit, St. Francois Mountains, Missouri

    Science.gov (United States)

    Watts, K. E.; Mercer, C. N.; Vazquez, J. A.

    2015-12-01

    Silicic volcanic and plutonic rocks of an eroded Mesoproterozoic caldera complex were intruded and replaced by iron ore, and cross-cut by REE-enriched breccia pipes (~12% total REO) to form the Pea Ridge iron-oxide-apatite-REE (IOA-REE) deposit. Igneous activity, iron ore formation, and REE mineralization overlapped in space and time, however the source of REEs and other metals (Fe, Cu, Au) integral to these economically important deposits remains unclear. Melt inclusions (MI) hosted in refractory zircon phenocrysts are used to constrain magmatic components and processes in the formation of the Pea Ridge deposit. Homogenized (1.4 kbar, 1000°C, 1 hr) MI in zircons from rhyolites ~600 ft (PR-91) and ~1200 ft (PR-12) laterally from the ore body were analyzed for major elements by EPMA and volatiles and trace elements (H2O, S, F, Cl, REEs, Rb, Sr, Y, Zr, Nb, U, Th) by SHRIMP-RG. Metals (including Cu, Au) will be measured in an upcoming SHRIMP-RG session. U-Pb ages, Ti and REE were determined by SHRIMP-RG for a subset of zircon spots adjacent to MI (1458 ± 18 Ma (PR-12); 1480 ± 45 Ma (PR-91)). MI glasses range from fresh and homogeneous dacite-rhyolite (65-75 wt% SiO2) to heterogeneous, patchy mixtures of K-spar and quartz (PR-12, 91), and more rarely mica, albite and/or anorthoclase (PR-91). MI are commonly attached to monazite and xenotime, particularly along re-entrants and zircon rims (PR-91). Fresh dacite-rhyolite glasses (PR-12) have moderate H2O (~2-2.5 wt%), Rb/Sr ratios (~8) and U (~5-7 ppm), and negative (chondrite-normalized) Eu anomalies (Eu ~0.4-0.7 ppm) (typical of rhyolites), whereas HREEs (Tb, Ho, Tm) are elevated (~2-3 ppm). Patchy K-spar and quartz inclusions (PR-12, 91) have flat LREE patterns, and positive anomalies in Tb, Ho, and Tm. One K-spar inclusion (PR-91) has a ~5-50 fold increase in HREEs (Tb, Dy, Ho, Er, Tm) and U (35 ppm) relative to other MI. U-Pb and REE analyses of its zircon host are not unusual (1484 ± 21 Ma); its irregular shape…

  19. Qualitative Analysis for Maintenance Process Assessment

    Science.gov (United States)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  20. Multi Criteria Analysis for bioenergy systems assessments

    International Nuclear Information System (INIS)

    Sustainable bioenergy systems are, by definition, embedded in social, economic, and environmental contexts and depend on the support of many stakeholders with different perspectives. The resulting complexity constitutes a major barrier to the implementation of bioenergy projects. The goal of this paper is to evaluate the potential of Multi Criteria Analysis (MCA) to facilitate the design and implementation of sustainable bioenergy projects. Four MCA tools (Super Decisions, DecideIT, Decision Lab, NAIADE) are reviewed for their suitability to assess the sustainability of bioenergy systems, with a special focus on multi-stakeholder inclusion. The MCA tools are applied using data from a multi-stakeholder bioenergy case study in Uganda. Although contributing to only a part of a comprehensive decision process, MCA can assist in overcoming implementation barriers by (i) structuring the problem, (ii) assisting in the identification of the least robust and/or most uncertain components in bioenergy systems and (iii) integrating stakeholders into the decision process. Applying the four MCA tools to a Ugandan case study resulted in a large variability in outcomes. However, social criteria were consistently identified by all tools as being decisive in making a bioelectricity project viable.

  1. Seismic vulnerability assessments in risk analysis

    Science.gov (United States)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for the development of vulnerability functions of different elements at risk: empirical, analytical and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on buildings' behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states to buildings and constructions in the case of earthquakes according to the OSR-97B (return period T=1,000 years) within big cities and towns, they were divided into unit sites and their coordinates were presented as dots located in the centers of unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: percent of different damage states for settlements with fewer than 1,000 inhabitants, and vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in…

  2. Improved reliability analysis method based on the failure assessment diagram

    Science.gov (United States)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
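
    The 2D kernel density estimation step described above can be sketched directly: a plain Gaussian KDE is evaluated over hypothetical Monte Carlo assessment points (Lr, Kr) on the failure assessment diagram. The sampling distributions, bandwidth and evaluation points below are illustrative, not the paper's pipe data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Monte Carlo assessment points on the FAD: load ratio Lr and
# fracture ratio Kr drawn from invented distributions for the sketch.
Lr = rng.normal(0.6, 0.05, 2000)
Kr = rng.normal(0.5, 0.08, 2000)
pts = np.column_stack([Lr, Kr])

def kde2d(points, grid, bandwidth=0.05):
    """Plain 2D Gaussian kernel density estimate evaluated at grid points."""
    d = grid[:, None, :] - points[None, :, :]          # pairwise offsets
    sq = (d ** 2).sum(axis=2) / (2 * bandwidth ** 2)
    return np.exp(-sq).sum(axis=1) / (len(points) * 2 * np.pi * bandwidth ** 2)

# Density near the cloud centre versus a point far from it
dens = kde2d(pts, np.array([[0.6, 0.5], [1.0, 1.0]]))
print(dens[0] > dens[1])  # density peaks where assessment points concentrate
```

    Contouring this density over the FAD, as the paper proposes, makes visible where the probabilistic assessment points actually cluster relative to the failure assessment line.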

  3. A Content Analysis of Intimate Partner Violence Assessments

    Science.gov (United States)

    Hays, Danica G.; Emelianchik, Kelly

    2009-01-01

    With approximately 30% of individuals of various cultural identities experiencing intimate partner violence (IPV) in their lifetimes, it is imperative that professional counselors engage in effective assessment practices and be aware of the limitations of available IPV assessments. A content analysis of 38 IPV assessments was conducted, yielding…

  4. Data Analysis and Next Generation Assessments

    Science.gov (United States)

    Pon, Kathy

    2013-01-01

    For the last decade, much of the work of California school administrators has been shaped by the accountability of the No Child Left Behind Act. Now as they stand at the precipice of Common Core Standards and next generation assessments, it is important to reflect on the proficiency educators have attained in using data to improve instruction and…

  5. Non-human biota dose assessment. Sensitivity analysis and knowledge quality assessment

    International Nuclear Information System (INIS)

    This report provides a summary of a programme of work, commissioned within the BIOPROTA collaborative forum, to assess the quantitative and qualitative elements of uncertainty associated with biota dose assessment of potential impacts of long-term releases from geological disposal facilities (GDF). Quantitative and qualitative aspects of uncertainty were determined through sensitivity and knowledge quality assessments, respectively. Both assessments focused on default assessment parameters within the ERICA assessment approach. The sensitivity analysis was conducted within the EIKOS sensitivity analysis software tool and was run in both generic and test case modes. The knowledge quality assessment involved development of a questionnaire around the ERICA assessment approach, which was distributed to a range of experts in the fields of non-human biota dose assessment and radioactive waste disposal assessments. Combined, these assessments enabled critical model features and parameters that are both sensitive (i.e. have a large influence on model output) and of low knowledge quality to be identified for each of the three test cases. The output of this project is intended to provide information on those parameters that may need to be considered in more detail for prospective site-specific biota dose assessments for GDFs. Such information should help users to enhance the quality of their assessments and build greater confidence in the results. (orig.)

  6. Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff that conduct analyse...

  7. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  8. Material Analysis for a Fire Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alexander; Nemer, Martin B.

    2014-08-01

This report consolidates technical information on several materials and material classes for a fire assessment. The materials include three polymeric materials, wood, and hydraulic oil. The polymers are polystyrene, polyurethane, and melamine-formaldehyde foams. Samples of two of the specific materials were tested for their behavior in a fire-like environment. Test data and the methods used to test the materials are presented. Much of the remaining data are taken from a literature survey. This report serves as a reference source of properties necessary to predict the behavior of these materials in a fire.

  9. Assessment and Planning Using Portfolio Analysis

    Science.gov (United States)

    Roberts, Laura B.

    2010-01-01

    Portfolio analysis is a simple yet powerful management tool. Programs and activities are placed on a grid with mission along one axis and financial return on the other. The four boxes of the grid (low mission, low return; high mission, low return; high return, low mission; high return, high mission) help managers identify which programs might be…

  10. Environmental risk assessment in GMO analysis.

    Science.gov (United States)

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage across countries is related to their different positions concerning labelling of GMO products: based on the principle of substantial equivalence, or rather on the precautionary principle. The paper provides an overview of how the risks associated with release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:21384330

  11. Uncertainty analysis in integrated assessment: the users' perspective

    NARCIS (Netherlands)

    Gabbert, S.G.M.; Ittersum, van M.K.; Kroeze, C.; Stalpers, S.I.P.; Ewert, F.; Alkan Olsson, J.

    2010-01-01

    Integrated Assessment (IA) models aim at providing information- and decision-support to complex problems. This paper argues that uncertainty analysis in IA models should be user-driven in order to strengthen science–policy interaction. We suggest an approach to uncertainty analysis that starts with

  12. LIFECYCLE ANALYSIS AS THE CORPORATE ENVIRONMENTAL RESPONSIBILITY ASSESSMENT TECHNIQUE

    OpenAIRE

    Bojan Krstic, Milica Tasic, Vladimir Ivanovic

    2015-01-01

    Lifecycle analysis is one of the techniques for assessing the impact of enterprise on the environment, by monitoring environmental effects of the product along its lifecycle. Since the cycle can be seen in stages (extraction of raw materials, raw materials processing, final product production, product use and end of use of the product), the analysis can be applied to all or only some parts of the aforementioned cycle, hence the different variants of this technique. The analysis itself is defi...

  13. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    2013-07-01

    ... AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis AGENCY... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis Docket, EPA/DC, EPA West, Room... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis is available in the...

  14. 78 FR 27235 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    2013-05-09

    ... AGENCY Technical Guidance for Assessing Environmental Justice in Regulatory Analysis AGENCY..., ``Technical Guidance for Assessing Environmental Justice in Regulatory Analysis.'' The purpose of this... Technical Guidance for Assessing Environmental Justice in Regulatory Analysis Docket, EPA/DC, EPA West,...

  15. Metallic Mineral Resources Assessment and Analysis System Design

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

This paper presents the aim and the design structure of the metallic mineral resources assessment and analysis system. This system adopts an integrated data warehouse technique composed of an affairs-processing layer and an analysis-application layer. The affairs-processing layer includes multiform databases (such as geological, geophysical and geochemical databases), while the analysis-application layer includes the data warehouse, online analysis processing and data mining. This paper also presents in detail the data warehouse of the present system and the appropriate spatial analysis methods and models. Finally, this paper presents the prospect of the system.

  16. Accuracy Assessment and Analysis for GPT2

    Directory of Open Access Journals (Sweden)

    YAO Yibin

    2015-07-01

Full Text Available GPT (Global Pressure and Temperature) is a global empirical model commonly used to provide temperature and pressure for the determination of tropospheric delay. GPT has some weaknesses, which have been addressed by a new empirical model named GPT2. GPT2 not only improves the accuracy of temperature and pressure, but also provides specific humidity, water vapor pressure, mapping function coefficients and other tropospheric parameters; however, no accuracy analysis of GPT2 had been made until now. In this paper, high-precision meteorological data from ECMWF and NOAA were used to test and analyze the accuracy of the temperature, pressure and water vapor pressure given by GPT2. Testing results show that the mean bias of temperature is -0.59℃ and the average RMS is 3.82℃; the absolute values of the average biases of pressure and water vapor pressure are less than 1 mb; GPT2 pressure has an average RMS of 7 mb, and water vapor pressure of no more than 3 mb. Accuracy differs with latitude, and all parameters show obvious seasonality. In conclusion, the GPT2 model has high accuracy and stability on a global scale.
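The bias and RMS statistics quoted above are straightforward to compute; the sketch below shows the standard definitions, with made-up temperature values rather than the paper's ECMWF/NOAA data:

```python
import numpy as np

def bias_and_rms(model, reference):
    """Mean bias and RMS error of model values against reference data."""
    diff = np.asarray(model, dtype=float) - np.asarray(reference, dtype=float)
    return diff.mean(), np.sqrt((diff ** 2).mean())

# Illustrative temperatures in degrees Celsius (not real GPT2 output)
model_t = [14.2, 9.8, 21.5, 3.1]
ref_t = [15.0, 10.5, 20.9, 4.0]
bias, rms = bias_and_rms(model_t, ref_t)  # negative bias: model runs cold here
```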

  17. Analysis of assessment tools used in engineering degree programs

    OpenAIRE

    Martínez Martínez, María del Rosario; Olmedo Torre, Noelia; Amante García, Beatriz; Farrerons Vidal, Óscar; Cadenato Matia, Ana María

    2014-01-01

    This work presents an analysis of the assessment tools used by professors at the Universitat Politécnica de Catalunya to assess the generic competencies introduced in the Bachelor’s Degrees in Engineering. In order to conduct this study, a survey was designed and administered anonymously to a sample of the professors most receptive to educational innovation at their own university. All total, 80 professors responded to this survey, of whom 26% turned out to be members of the un...

  18. Assessment of structural analysis technology for elastic shell collapse problems

    Science.gov (United States)

    Knight, N. F., Jr.; Macy, S. C.; Mccleary, S. L.

    1989-01-01

    The prediction of the ultimate load carrying capability for compressively loaded shell structures is a challenging nonlinear analysis problem. Selected areas of finite element technology research and nonlinear solution technology are assessed. Herein, a finite element analysis procedure is applied to four shell collapse problems which have been used by computational structural mechanics researchers in the past. This assessment will focus on a number of different shell element formulations and on different approaches used to account for geometric nonlinearities. The results presented confirm that these aspects of nonlinear shell analysis can have a significant effect on the predicted nonlinear structural response. All analyses were performed using the CSM Testbed software system which allowed a convenient assessment of different element formulations with a consistent approach to solving the discretized nonlinear equations.

  19. Comparative analysis of model assessment in community detection

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Bayesian cluster inference with a flexible generative model allows us to detect various types of structures. However, it has problems stemming from computational complexity and difficulties in model assessment. We consider the stochastic block model with restricted hyperparameter space, which is known to correspond to modularity maximization. We show that it not only reduces computational complexity, but is also beneficial for model assessment. Using various criteria, we conduct a comparative analysis of the model assessments, and analyze whether each criterion tends to overfit or underfit. We also show that the learning of hyperparameters leads to qualitative differences in Bethe free energy and cross-validation errors.

  20. No-Reference Video Quality Assessment using MPEG Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

We present a method for No-Reference (NR) Video Quality Assessment (VQA) for decoded video without access to the bitstream. This is achieved by extracting and pooling features from a NR image quality assessment method used frame by frame. We also present methods to identify the video coding and estimate the video coding parameters for MPEG-2 and H.264/AVC, which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods for...

  1. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Fracture Mechanics has found a profound usage in the area of design of components and assessing fitness for purpose/residual life estimation of an operating component. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo Simulation is one such procedure towards the analysis of fracture probability. This paper deals with uncertainty analysis using the Monte Carlo Simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
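A minimal Monte Carlo sketch of such an R6-based fracture probability estimate is shown below. The input distributions and the simplified through-crack stress intensity solution are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical input distributions (SI units)
a = rng.lognormal(np.log(0.02), 0.25, n)   # crack depth [m]
stress = rng.normal(200e6, 15e6, n)        # applied stress [Pa]
kic = rng.normal(60e6, 6e6, n)             # fracture toughness [Pa*sqrt(m)]
sy = rng.normal(300e6, 20e6, n)            # yield stress [Pa]

# Simplified through-crack stress intensity factor: K = stress*sqrt(pi*a)
k_applied = stress * np.sqrt(np.pi * a)

# Coordinates of each sampled assessment point in the FAD
lr = stress / sy
kr = k_applied / kic

# R6 Option 1 assessment line, with an illustrative Lr cut-off
f_line = (1 - 0.14 * lr**2) * (0.3 + 0.7 * np.exp(-0.65 * lr**6))
failed = (kr > f_line) | (lr > 1.15)

pf = failed.mean()  # estimated fracture probability
```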

  2. FENCH-analysis of electricity generation greenhouse gas emissions from solar and wind power in Germany

    International Nuclear Information System (INIS)

    The assessment of energy supply systems with regard to the influence on climate change requires not only the quantification of direct emissions caused by the operation of a power plant. It also has to take into account indirect emissions resulting from e.g. construction and dismounting of the power plant. Processes like manufacturing the materials for building the plant, the transportation of components and the construction and maintenance of the power plant are included. A tool to determine and assess the energy and mass flows is the Life Cycle Analysis (LCA) which allows the assessment of environmental impacts related to a product or service. In this paper a FENCH (Full Energy Chain)-analysis based on a LCA of electricity production from wind and solar power plants under operation conditions typical for application its Germany is presented. The FENCH-analysis is based on two methods, Process Chain Analysis (PCA) and Input-Output-Analysis (IOA) which are illustrated by the example of an electricity generation from a wind power plant. The calculated results are shown for the cumulated (indirect and direct) Greenhouse-Gas (GHG)-emissions for an electricity production from wind and solar power plants. A comparison of the results to the electricity production from a coal fired power plant is performed. At last a comparison of 1 kWh electricity from renewable energy to 1 kWh from fossil energy carrier has to be done, because the benefits of 1 kWh electricity from various types of power plants are different. Electricity from wind energy depends on the meteorological conditions while electricity from a fossil fired power plant is able to follow the power requirements of the consumers nearly all the time. By considering the comparison of the different benefit provided the GHG-Emissions are presented. (author)

  3. Hanford safety analysis and risk assessment handbook (SARAH)

    International Nuclear Information System (INIS)

The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 1, 2, and 3 U.S. Department of Energy (DOE) nuclear facilities. SARAH describes currently acceptable methodology for development of a Documented Safety Analysis (DSA) and derivation of technical safety requirements (TSR) based on 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements,'' and provides data to ensure consistency in approach

  4. Background, Assessment and Analysis of the Gender Issues in Pakistan

    OpenAIRE

    Moheyuddin, Ghulam

    2005-01-01

This paper describes an assessment of gender issues in Pakistan, with a review and analysis of the major sectors exhibiting gender inequalities. Before continuing to the detailed analysis of the gender issues in Pakistan, it gives a bird's-eye view of the socio-economic, political and cultural background of the country. The paper explains the areas of critical gender inequality in Pakistan and reviews various gender indicators. It also discusses the current policies and the program...

  5. Intuitive Analysis of Variance-- A Formative Assessment Approach

    Science.gov (United States)

    Trumpower, David

    2013-01-01

    This article describes an assessment activity that can show students how much they intuitively understand about statistics, but also alert them to common misunderstandings. How the activity can be used formatively to help improve students' conceptual understanding of analysis of variance is discussed. (Contains 1 figure and 1 table.)

  6. Assessing Group Interaction with Social Language Network Analysis

    Science.gov (United States)

    Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.

    In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.

  7. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Data management and statistical analysis for environmental assessment are important issues on the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system developed is described which provides efficient data storage as well as visualization tools which may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities

  8. Development and Assessment of Best Estimate Integrated Safety Analysis Code

    International Nuclear Information System (INIS)

The integrated safety analysis code MARS3.0 has been developed and assessed through a verification and validation (V&V) procedure. An integrated safety analysis system has been established through coupling with a severe accident code and by utilizing the MARS subchannel capability. The coupled containment module has also been improved. Indigenous thermal-hydraulic models for MARS3.0 have been developed through the implementation of a multidimensional two-phase flow model, along with APR1400, SMART safety issue and new reactor models. Development of a droplet field model has also been attempted and implemented in a trial version. A full-scope assessment has been carried out for the system analysis module and the 3D vessel module. The code has also been assessed through participation in international cooperation programs. The experimental data needed for code assessment have been collected and maintained through a web-based data bank program. A 3D GUI (graphic user interface) has been developed for MARS users. A MARS users group has been organized; it currently consists of 22 domestic organizations, including research, industrial and regulatory organizations and universities

  9. Web-Based Instruction and Learning: Analysis and Needs Assessment

    Science.gov (United States)

    Grabowski, Barbara; McCarthy, Marianne; Koszalka, Tiffany

    1998-01-01

An analysis and needs assessment was conducted to identify kindergarten through grade 14 (K-14) customer needs with regard to using the World Wide Web (WWW) for instruction and to identify obstacles K-14 teachers face in utilizing NASA Learning Technologies products in the classroom. The needs assessment was conducted as part of the Dryden Learning Technologies Project, which is a collaboration between Dryden Flight Research Center (DFRC), Edwards, California, and The Pennsylvania State University (PSU), University Park, Pennsylvania. The overall project is a multiyear effort to conduct research in the development of teacher training and tools for Web-based science, mathematics and technology instruction and learning.

  10. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effect on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new development in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  11. Life Cycle Exergy Analysis of Wind Energy Systems : Assessing and improving life cycle analysis methodology

    OpenAIRE

    Davidsson, Simon

    2011-01-01

    Wind power capacity is currently growing fast around the world. At the same time different forms of life cycle analysis are becoming common for measuring the environmental impact of wind energy systems. This thesis identifies several problems with current methods for assessing the environmental impact of wind energy and suggests improvements that will make these assessments more robust. The use of the exergy concept combined with life cycle analysis has been proposed by several researchers ov...

  12. Modeling the Assessment of Agricultural Enterprises Headcount Analysis

    Directory of Open Access Journals (Sweden)

    Tatyana Viatkina

    2014-07-01

Full Text Available Modern procedures for analyzing an enterprise's labour resources have been reviewed. An algorithm for calculating the enterprise performance-potential efficiency ratio, and an assessment of the enterprise's performance potential based on quantitative and qualitative characteristics, are provided. A model is proposed for assessing the effectiveness of labour management of an enterprise, branch or region subject to factors such as motivation, labour expenses, staff rotation and qualifications. The proposed model evaluates the effectiveness of labour management of an enterprise, branch or region against baselines. If all inequalities are met, the strategy is being implemented effectively; otherwise, the company should take additional measures to improve the indexes characterizing deployment of staff, its motivation, turnover, qualifications and labour expenses. Applying the considered provisions and concepts together with the applied tools makes it possible to model elements of the effective utilization of the performance potential of any agricultural enterprise. The proposed procedure for headcount analysis of agricultural enterprises is an applied one and can be used when developing a strategy for adequate assessment and when looking for new ways to improve the utilization of labour resources in the agricultural sector.

  13. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of the temporal variables of the extremes of the Fx(t), Fy(t) and Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.

  14. A hybrid input–output multi-objective model to assess economic–energy–environment trade-offs in Brazil

    International Nuclear Information System (INIS)

    A multi-objective linear programming (MOLP) model based on a hybrid Input–Output (IO) framework is presented. This model aims at assessing the trade-offs between economic, energy, environmental (E3) and social objectives in the Brazilian economic system. This combination of multi-objective models with Input–Output Analysis (IOA) plays a supplementary role in understanding the interactions between the economic and energy systems, and the corresponding impacts on the environment, offering a consistent framework for assessing the effects of distinct policies on these systems. Firstly, the System of National Accounts (SNA) is reorganized to include the National Energy Balance, creating a hybrid IO framework that is extended to assess Greenhouse Gas (GHG) emissions and the employment level. The objective functions considered are the maximization of GDP (gross domestic product) and employment levels, as well as the minimization of energy consumption and GHG emissions. An interactive method enabling a progressive and selective search of non-dominated solutions with distinct characteristics and underlying trade-offs is utilized. Illustrative results indicate that the maximization of GDP and the employment levels lead to an increase of both energy consumption and GHG emissions, while the minimization of either GHG emissions or energy consumption cause negative impacts on GDP and employment. - Highlights: • A hybrid Input–Output multi-objective model is applied to the Brazilian economy. • Objective functions are GDP, employment level, energy consumption and GHG emissions. • Interactive search process identifies trade-offs between the competing objectives. • Positive correlations between GDP growth and employment. • Positive correlations between energy consumption and GHG emissions
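One common way to operate such a model is to scalarize the objectives. The toy sketch below uses a weighted sum over a two-sector Leontief system with entirely hypothetical coefficients (the actual model uses an interactive MOLP method, not a fixed weighting):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical two-sector hybrid IO data (per unit of sector activity x)
value_added = np.array([0.4, 0.3])   # GDP contribution
energy_use = np.array([2.0, 5.0])    # energy consumption
ghg = np.array([1.0, 3.0])           # GHG emissions
a_tech = np.array([[0.1, 0.2],       # Leontief technical coefficients
                   [0.3, 0.1]])
final_demand = np.array([10.0, 5.0])

# Weighted-sum scalarization of "maximize GDP, minimize GHG"
w_gdp, w_ghg = 1.0, 0.2
c = -w_gdp * value_added + w_ghg * ghg   # linprog minimizes

# Constraints: limited energy supply; (I - A) x >= final demand
i_minus_a = np.eye(2) - a_tech
res = linprog(
    c,
    A_ub=np.vstack([energy_use, -i_minus_a]),
    b_ub=np.concatenate([[100.0], -final_demand]),
    bounds=(0, None),
)
x = res.x  # optimal sector activity levels for this weighting
```

Sweeping the weights (or, as in the paper, interacting with the decision maker) traces out the non-dominated trade-offs between GDP, energy and emissions.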

  15. No-Reference Video Quality Assessment using Codec Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2015-01-01

A no-reference video quality assessment (VQA) method is presented for videos distorted by H.264/AVC and MPEG-2. The assessment is performed without access to the bit-stream. Instead we analyze and estimate coefficients based on decoded pixels. The approach involves distinguishing between the two types of videos, estimating the level of quantization used in the I-frames, and exploiting this information to assess the video quality. In order to do this for H.264/AVC, the distribution of the DCT-coefficients after intra-prediction and deblocking is modeled. To obtain VQA features for H.264/AVC, we propose a novel estimation method of the quantization in H.264/AVC videos without bitstream access, which can also be used for Peak Signal-to-Noise Ratio (PSNR) estimation. The results from the MPEG-2 and H.264/AVC analysis are mapped to a perceptual measure of video quality by Support Vector Regression...

  16. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

The hydroelectric industry's recognition of the importance of avoiding unexpected failure, or forced outages, led to the development of probabilistic, or risk-based, methods in order to attempt to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but does move it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive, economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts to be determined, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
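In its simplest form, the capacity-demand idea reduces to the probability that demand exceeds capacity. A sketch with assumed independent normal distributions (illustrative numbers only, not from the cited study):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical capacity (resistance) and demand (load), e.g. in MPa
mu_c, sd_c = 250.0, 20.0
mu_d, sd_d = 180.0, 25.0

# Safety margin M = C - D is normal for independent normal C and D
mu_m = mu_c - mu_d
sd_m = np.hypot(sd_c, sd_d)

beta = mu_m / sd_m      # reliability index
pf = norm.cdf(-beta)    # probability of failure, P(M < 0)

# Monte Carlo cross-check of the closed-form result
rng = np.random.default_rng(1)
samples = 1_000_000
pf_mc = np.mean(rng.normal(mu_c, sd_c, samples) <
                rng.normal(mu_d, sd_d, samples))
```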

  17. Assessment of Transport Projects: Risk Analysis and Decision Support

    DEFF Research Database (Denmark)

    Salling, Kim Bang

    2008-01-01

The subject of this thesis is risk analysis and decision support in the context of transport infrastructure assessment. During my research I have observed a tendency in studies assessing transport projects to overlook the substantial uncertainties within the decision-making process...... Monte Carlo simulation, being the technique behind the quantitative risk analysis of CBA-DK. The informed decision support is dealt with by a set of resulting accumulated descending graphs (ADG) which make it possible for decision-makers to come to terms with their risk aversion given a specific...... transport projects, namely by moving from point estimates to interval results. The main focus of this Ph.D. study has been to develop a valid, flexible and functional decision support tool in which risk-oriented aspects of project evaluation are implemented. Throughout the study six papers have been produced...
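The accumulated descending graph (ADG) mentioned in the record is, in essence, an exceedance curve: for each outcome level x, the share of simulated outcomes at or above x. A hedged sketch, with a purely hypothetical benefit-cost-ratio simulation standing in for the CBA-DK model:

```python
import random

def accumulated_descending(samples, thresholds):
    """For each threshold x, the fraction of simulated outcomes >= x --
    the 'accumulated descending graph' used to communicate risk."""
    n = len(samples)
    return [sum(s >= x for s in samples) / n for x in thresholds]

rng = random.Random(42)
# Hypothetical benefit-cost ratios: uncertain benefits over uncertain
# costs (clipped to avoid division blow-ups); parameters are invented.
bcr = [rng.gauss(1.3, 0.3) / max(rng.gauss(1.0, 0.15), 0.5)
       for _ in range(20_000)]

curve = accumulated_descending(bcr, thresholds=[0.5, 1.0, 1.5])
```

Reading the curve at a threshold of 1.0 gives the simulated probability that the project at least breaks even, which is the interval-result reading the thesis argues for over a single point estimate.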

  18. Total life cycle management - assessment tool an exploratory analysis

    OpenAIRE

    Young, Brad de

    2008-01-01

It is essential for the Marine Corps to ensure the successful supply, movement and maintenance of an armed force in peacetime and combat. Integral to an effective, long-term logistics plan is the ability to accurately forecast future requirements to sustain materiel readiness. The Total Life Cycle Management Assessment Tool (TLCM-AT) is a simulation tool combining operations, maintenance, and logistics. This exploratory analysis gives insight into the factors used by TLCM-AT beyond the tool's emb...

  19. Safety analysis, risk assessment, and risk acceptance criteria

    International Nuclear Information System (INIS)

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude

  20. Assessment of residual stress using thermoelastic stress analysis

    OpenAIRE

    Robinson, Andrew Ferrand

    2011-01-01

    The work described in this thesis considers the application of thermoelastic stress analysis (TSA) to the assessment of residual stresses in metallic materials. Residual stresses exist within almost all engineering components and structures. They are an unavoidable consequence of manufacturing processes and may cause the premature and catastrophic failure of a component when coupled with in-service stresses. Alternatively, beneficial residual stress may be introduced to enhance th...

  1. Analysis of the judicial file: assessing the validity of testimony

    OpenAIRE

    Scott, M. Teresa; Antonio L. Manzanero

    2015-01-01

Under the holistic approach to the evaluation of testimony (HELPT), this paper describes a protocol for the analysis of all the information that can be extracted from a judicial file, based on knowledge of heuristic principles and the psychology of testimony. The aim is to provide a systematization for expert reports of the topics that could be explored in a file, extracting the maximum unbiased information to establish the relevant hypotheses of the case and evaluate possible factors ...

  2. Site Characterization and Analysis Penetrometer System (SCAPS): Assessing Site Contamination

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    While a number of techniques exist for the remediation of contaminated soils, one of the largest problems is often the initial site assessment. It can be a difficult, expensive and time-consuming process to determine the exact extent of site contamination. The U.S. Army Engineer Waterways Experiment Station (WES) under the sponsorship of the U.S. Army Environmental Center (AEC) initiated the development of the Site Characterization and Analysis Penetrometer System (SCAPS) Research, Developmen...

  3. Assessment report on NRP sub-theme 'Risk Analysis'

    International Nuclear Information System (INIS)

    An overview and assessment are presented of the three research projects carried out under NRP funding that concern risk-related topics: (1) The risks of nonlinear climate changes, (2) Socio-economic and policy aspects of changes in incidence and intensity of extreme (weather) events, and (3) Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy strategies. 1 tab., 6 refs

  4. Scenario analysis in spatial impact assessment:a methodological approach

    OpenAIRE

    Torrieri, F.; Nijkamp, P.

    2009-01-01

    This paper introduces the concept of Spatial or Territorial Impact Assessment as a new tool for balanced urban or regional planning from a long-term sustainability perspective. It then argues that modern scenario methods may be a useful complement to pro-active and future oriented urban or regional strategic thinking. A cognitive interactive model for scenario analysis is next presented and its advantages are outlined.

  5. Life cycle analysis for the assessment of environmental impacts

    International Nuclear Information System (INIS)

The paper presents the structure of a model and a database devoted to the life-cycle analysis of industrial products for the assessment of environmental impacts. The data cover a large variety of industrial sectors; the whole life cycle of the products has to be considered when the environmental impacts are calculated. The author suggests that the data format could be standardized so that data can be exchanged between different studies and the quality of the studies improved. (author)

  6. Modular risk analysis for assessing multiple waste sites

    International Nuclear Information System (INIS)

Human-health impacts, especially to the surrounding public, are extremely difficult to assess at installations that contain multiple waste sites and a variety of mixed-waste constituents (e.g., organic, inorganic, and radioactive). These assessments must address different constituents, multiple waste sites, multiple release patterns, different transport pathways (i.e., groundwater, surface water, air, and overland soil), different receptor types and locations, various times of interest, population distributions, land-use patterns, baseline assessments, a variety of exposure scenarios, etc. Although the process is complex, two of the most important difficulties to overcome are associated with (1) establishing an approach that allows for modifying the source term, transport, or exposure component as an individual module without having to re-evaluate the entire installation-wide assessment (i.e., all modules simultaneously), and (2) displaying and communicating the results in an understandable and usable manner to interested parties. An integrated, physics-based, compartmentalized approach, which is coupled to a Geographical Information System (GIS), captures the regional health impacts associated with multiple waste sites (e.g., hundreds to thousands of waste sites) at locations within and surrounding the installation. Utilizing a modular/GIS-based approach overcomes difficulties in (1) analyzing a wide variety of scenarios for multiple waste sites, and (2) communicating results from a complex human-health-impact analysis by capturing the essence of the assessment in a relatively elegant manner, so the meaning of the results can be quickly conveyed to all who review them

  7. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  8. NASA Langley Systems Analysis & Concepts Directorate Technology Assessment/Portfolio Analysis

    Science.gov (United States)

    Cavanaugh, Stephen; Chytka, Trina; Arcara, Phil; Jones, Sharon; Stanley, Doug; Wilhite, Alan W.

    2006-01-01

Systems analysis develops and documents candidate missions and architectures, associated system concepts, enabling capabilities and investment strategies to achieve NASA's strategic objectives. The technology assessment process connects the missions and architectures to the investment strategies. In order to successfully implement a technology assessment, there is a need to collect, manipulate, analyze, document, and disseminate technology-related information. Information must be collected and organized on the wide variety of potentially applicable technologies, including: previous research results, key technical parameters and characteristics, technology readiness levels, relationships to other technologies, costs, and potential barriers and risks. This information must be manipulated to facilitate planning and documentation. An assessment is included of the programmatic and technical risks associated with each technology task as well as potential risk mitigation plans. Risks are assessed and tracked in terms of likelihood of the risk occurring and consequences of the risk if it does occur. The risk assessments take into account cost, schedule, and technical risk dimensions. Assessment data must be simplified for presentation to decision makers. The Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center has a wealth of experience in performing technology assessment and portfolio analysis, as this has been a business line since 1978.

  9. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

Michael Möderl; Wolfgang Rauch

    2011-01-01

The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios. Thereby parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.
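The merge of vulnerability and hazard maps described above is, at its simplest, a cell-wise product of two raster grids. A minimal sketch; the grids, their size and their values are illustrative, not taken from the case study:

```python
def risk_map(vulnerability, hazard):
    """Cell-wise risk = vulnerability x hazard for two equally sized
    raster grids (values assumed normalised to [0, 1])."""
    return [
        [round(v * h, 3) for v, h in zip(v_row, h_row)]
        for v_row, h_row in zip(vulnerability, hazard)
    ]

# Illustrative 2x3 grids, e.g. pipe-failure sensitivity vs. flood hazard
vuln = [[0.2, 0.8, 0.5],
        [0.1, 0.9, 0.4]]
haz  = [[0.5, 0.5, 1.0],
        [0.0, 1.0, 0.3]]

risk = risk_map(vuln, haz)
```

Cells that score high on both layers (here the 0.9-vulnerability cell under full hazard) are the candidate zones for preventive measures.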

  10. Climate Change Scientific Assessment and Policy Analysis. Scientific Assessment of Solar Induced Climate Change

    International Nuclear Information System (INIS)

The programme Scientific Assessment and Policy Analysis is commissioned by the Dutch Ministry of Housing, Spatial Planning and the Environment (VROM) and has the following objectives: collection and evaluation of relevant scientific information for policy development and decision-making in the field of climate change; and analysis of resolutions and decisions in the framework of international climate negotiations and their implications. The programme is concerned with analyses and assessments intended for a balanced evaluation of state-of-the-art knowledge for underpinning policy choices. These analysis and assessment activities are carried out within several months to about a year, depending on the complexity and the urgency of the policy issue. Assessment teams organised to handle the various topics consist of the best Dutch experts in their fields. Teams work on incidental and additionally financed activities, as opposed to the regular, structurally financed activities of the climate research consortium. The work should reflect the current state of science on the relevant topic. In this report an assessment of the following topics is presented: (1) reconstructions of solar variability, especially with respect to those parameters which are relevant for climate change; (2) reconstructions of proxies of solar variability, e.g. cosmogenic isotopes; (3) reconstructions of global as well as regional climate, with respect to temperature, precipitation and circulation; (4) physical understanding of the mechanisms which play a role in the solar-terrestrial link. We focus on the Holocene, with emphasis on the last centuries, because of data availability, to avoid confusing climate responses to orbital changes with those due to solar activity, and because of the relevance for human-induced climate change as compared to the role of the variable sun in the 20th century

  11. Biological dosimetry: chromosomal aberration analysis for dose assessment

    International Nuclear Information System (INIS)

In view of the growing importance of chromosomal aberration analysis as a biological dosimeter, the present report provides a concise summary of the scientific background of the subject and a comprehensive source of information at the technical level. After a review of the basic principles of radiation dosimetry and radiation biology, basic information on the biology of lymphocytes, the structure of chromosomes and the classification of chromosomal aberrations is presented. This is followed by a presentation of techniques for collecting blood, storing, transporting, culturing, making chromosomal preparations and scoring of aberrations. The physical and statistical parameters involved in dose assessment are discussed and examples of actual dose assessments taken from the scientific literature are given
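In this kind of dose assessment, the scored aberration yield is commonly converted to dose by inverting a linear-quadratic calibration curve Y = c + αD + βD². A sketch of that inversion; the coefficients below are assumed for illustration, not a validated calibration:

```python
import math

def dose_from_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Invert the linear-quadratic calibration curve
    Y = c + alpha*D + beta*D**2 (aberrations per cell vs. dose in Gy)
    by solving the quadratic for the positive root D."""
    if y <= c:
        return 0.0
    return (-alpha + math.sqrt(alpha**2 + 4 * beta * (y - c))) / (2 * beta)

# Round trip: the yield produced by a 2 Gy exposure maps back to 2 Gy
y_2gy = 0.001 + 0.02 * 2 + 0.06 * 2**2
dose = dose_from_yield(y_2gy)
```

A full assessment would also propagate the statistical uncertainty of the scored yield (typically Poisson) into a confidence interval on the dose.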

  12. Social and ethical analysis in health technology assessment.

    Science.gov (United States)

    Tantivess, Sripen

    2014-05-01

    This paper presents a review of the domestic and international literature on the assessment of the social and ethical implications of health technologies. It gives an overview of the key concepts, principles, and approaches that should be taken into account when conducting a social and ethical analysis within health technology assessment (HTA). Although there is growing consensus among healthcare experts that the social and ethical ramifications of a given technology should be examined before its adoption, the demand for this kind of analysis among policy-makers around the world, including in Thailand, has so far been lacking. Currently decision-makers mainly base technology adoption decisions using evidence on clinical effectiveness, value for money, and budget impact, while social and ethical aspects have been neglected. Despite the recognized importance of considering equity, justice, and social issues when making decisions regarding health resource allocation, the absence of internationally-accepted principles and methodologies, among other factors, hinders research in these areas. Given that developing internationally agreed standards takes time, it has been recommended that priority be given to defining processes that are justifiable, transparent, and contestable. A discussion of the current situation in Thailand concerning social and ethical analysis of health technologies is also presented. PMID:24964703

  13. Model analysis: Representing and assessing the dynamics of student learning

    Directory of Open Access Journals (Sweden)

    Edward F. Redish

    2006-02-01

Decades of education research have shown that students can simultaneously possess alternate knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice tests such as the Force Concept Inventory and the Force and Motion Conceptual Evaluation provide instructors with tools to probe their students' conceptual knowledge of physics. However, many existing quantitative analysis methods focus on the binary question of whether a student answers a question correctly or not. This greatly limits the capacity of standardized multiple-choice tests to assess students' alternative knowledge. In addition, the context dependence issue, which suggests that a student may apply the correct knowledge in some situations and revert to alternative types of knowledge in others, is often treated as random noise in current analyses. In this paper, we present a model analysis, which applies qualitative research to establish a quantitative representation framework. With this method, students' alternative knowledge and the probabilities for students to use such knowledge in a range of equivalent contexts can be quantitatively assessed. This provides a way to analyze research-based multiple-choice questions, which can generate much richer information than what is available from score-based analysis.
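The model analysis described above can be sketched as a small class-level "model density matrix" computation. The class data below are invented, and restricting to two models (one correct, one alternative) is a simplification of the general method:

```python
import math

def model_density_matrix(student_counts, k):
    """Sketch of a two-model model analysis: each student answers k
    questions, and counts[m] records how often model m was used.  The
    student vector has components u_m = sqrt(counts[m] / k); the class
    density matrix is the average outer product u u^T.  Its diagonal
    gives the class-level probability of each model, and the
    off-diagonal terms the degree of mixed model use."""
    n = len(student_counts)
    d = [[0.0, 0.0], [0.0, 0.0]]
    for counts in student_counts:
        u = [math.sqrt(c / k) for c in counts]
        for i in range(2):
            for j in range(2):
                d[i][j] += u[i] * u[j] / n
    return d

# Hypothetical class: 3 students, 10 questions each,
# (correct-model uses, alternative-model uses) per student
students = [(8, 2), (5, 5), (10, 0)]
D = model_density_matrix(students, k=10)
```

The trace of D is 1 by construction, and D[0][0] versus D[1][1] summarizes how consistently the class deploys the correct model across contexts, which is exactly the information a raw score discards.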

  14. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published

  15. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  16. The advanced scenario analysis for performance assessment of geological disposal

    International Nuclear Information System (INIS)

    First of all, with regard to the FEP information data on the Engineered Barrier System (EBS) developed by JNC, description level and content of the FEPs have been examined from various angles on the basis of the latest research information. Each content of the FEP data has been classified and modified by means of integrating descriptive items, checking detail levels and correlations with other FEPs, collating with the H12 report, and adding technical information after H12 report. Secondly, scenario-modeling process has been studied. The study has been conducted by evaluating representation of the repository system, definition of FEP properties, and process interactions based on the concept of the interaction matrix (RES format) which represents influences between physicochemical characteristics of the repository, followed by an experimental development of the actual RES interaction matrix based on the H12 report as the examination to improve the transparency, traceability and comprehensibility of the scenario analysis process. Lastly, in relation to the geological disposal system, assessment techniques have been examined for more practical scenario analysis on particularly strong perturbations. Possible conceptual models have been proposed for each of these scenarios; seismic, faulting, and dike intrusion. As a result of these researches, a future direction for advanced scenario analysis on performance assessment has been indicated, as well as associated issues to be discussed have been clarified. (author)

  17. A Monte Carlo based spent fuel analysis safeguards strategy assessment

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael L [Los Alamos National Laboratory; Tobin, Stephen J [Los Alamos National Laboratory; Swinhoe, Martyn T [Los Alamos National Laboratory; Menlove, Howard O [Los Alamos National Laboratory; Sandoval, Nathan P [Los Alamos National Laboratory

    2009-01-01

    assessment process, the techniques employed to automate the coupled facets of the assessment process, and the standard burnup/enrichment/cooling time dependent spent fuel assembly library. We also clearly define the diversion scenarios that will be analyzed during the standardized assessments. Though this study is currently limited to generic PWR assemblies, it is expected that the results of the assessment will yield an adequate spent fuel analysis strategy knowledge that will help the down-select process for other reactor types.

  18. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
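As an illustration of the kind of means-comparison test listed in the record, here is a minimal Welch's t statistic for two hypothetical groups of survey scores. The data are invented, and a full analysis would also report degrees of freedom and a p-value:

```python
import math
import statistics as st

def welch_t(a, b):
    """Welch's two-sample t statistic for comparing mean survey scores
    of two respondent groups, without assuming equal variances."""
    var_a, var_b = st.variance(a), st.variance(b)
    se = math.sqrt(var_a / len(a) + var_b / len(b))
    return (st.mean(a) - st.mean(b)) / se

# Hypothetical 5-point Likert responses from two departments
operations = [4, 5, 4, 4, 5, 3, 4, 5]
admin      = [3, 3, 4, 2, 3, 3, 4, 2]

t = welch_t(operations, admin)
```

A large |t| flags a group-level difference in safety-culture perception worth following up; in practice one would use a statistics package that also handles the ANOVA and post-hoc steps.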

  19. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C;

    2001-01-01

To compare two software versions provided by the Lunar Corporation for assessment of body-composition analysis by DXA.

  20. 7 CFR 2.71 - Director, Office of Risk Assessment and Cost-Benefit Analysis.

    Science.gov (United States)

    2010-01-01

... Chief Economist § 2.71 Director, Office of Risk Assessment and Cost-Benefit Analysis. (a) Delegations..., Office of Risk Assessment and Cost-Benefit Analysis: (1) Responsible for assessing the risks to human...

  1. Supporting analysis and assessments quality metrics: Utility market sector

    Energy Technology Data Exchange (ETDEWEB)

    Ohi, J. [National Renewable Energy Lab., Golden, CO (United States)

    1996-10-01

In FY96, NREL was asked to coordinate all analysis tasks so that in FY97 these tasks will be part of an integrated analysis agenda that will begin to define a 5-15 year R&D roadmap and portfolio for the DOE Hydrogen Program. The purpose of the Supporting Analysis and Assessments task at NREL is to provide this coordination and conduct specific analysis tasks. One of these tasks is to prepare the Quality Metrics (QM) for the Program as part of the overall QM effort at DOE/EERE. The Hydrogen Program is one of 39 program planning units conducting QM, a process begun in FY94 to assess the benefits and costs of DOE/EERE programs. The purpose of QM is to inform decision-making during the budget formulation process by describing the expected outcomes of programs during the budget request process. QM is expected to establish a first step toward merit-based budget formulation and allow DOE/EERE to get the "most bang for its (R&D) buck." In FY96, NREL coordinated a QM team that prepared a preliminary QM for the utility market sector. In the electricity supply sector, the QM analysis shows hydrogen fuel cells capturing 5% (or 22 GW) of the total market of 390 GW of new capacity additions through 2020. Hydrogen consumption in the utility sector increases from 0.009 Quads in 2005 to 0.4 Quads in 2020. Hydrogen fuel cells are projected to displace over 0.6 Quads of primary energy in 2020. In future work, NREL will assess the market for decentralized, on-site generation; develop cost credits for distributed generation benefits (such as deferral of transmission and distribution investments and uninterruptible power service), for by-products such as heat and potable water, and for environmental benefits (reduction of criteria air pollutants and greenhouse gas emissions); compete different fuel cell technologies against each other for market share; and begin to address economic benefits, especially employment.

  2. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    International Nuclear Information System (INIS)

Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data related to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
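The sales-stock-lifespan link at the heart of the IOA method can be sketched as a discrete convolution of past sales with a lifespan distribution. The figures below are illustrative, not the Dutch case-study data:

```python
def waste_generated(sales, lifespan_pmf):
    """e-waste arising per year: units sold a years earlier, weighted by
    the probability that a unit is discarded at age a.
    waste[t] = sum over ages a of sales[t - a] * pmf[a]."""
    horizon = len(sales) + len(lifespan_pmf) - 1
    waste = [0.0] * horizon
    for t, sold in enumerate(sales):
        for age, p in enumerate(lifespan_pmf):
            waste[t + age] += sold * p
    return waste

# Hypothetical: units put on market over 4 years; lifespans of 0-3 years
sales = [100, 120, 150, 130]
pmf = [0.0, 0.2, 0.5, 0.3]   # sums to 1: every unit is eventually discarded

waste = waste_generated(sales, pmf)
```

Because the pmf sums to one, total waste over the full horizon equals total sales, a useful consistency check; the paper's time-varying lifespan refinement would replace the single pmf with one per sales cohort.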

  3. Developing an assessment scale for character. An exploratory factorial analysis

    Directory of Open Access Journals (Sweden)

    Ionescu, D.

    2015-07-01

Developing a character assessment scale is the more distal goal of the author. In this paper I aim to present a sequence of this psychometric process, namely exploring the factorial structure of a character assessment scale. To achieve this aim, we first explored the psychological factors relevant to a moral character. We also explored the moral standards that are valued in the main life contexts of an individual: family, workplace, close relationships and the public context. These theoretical endeavors were important for the item-writing process, as they provided the content of the scale. Furthermore, the item development phase was empirically supported through piloting studies, which highlighted the direction of the scale: to assess instances of moral character failure, generically recognized as proofs of a bad character. The present paper focuses on the results obtained after performing an exploratory factor analysis on a sample of 300 participants. The results suggest that the 21-item scale best fits a four-factor structure that cumulatively explains 42.45% of the variance. The factors are: evilness, ill-tempered behavior, dishonesty, and upstartness. The scale reveals the moral profile of an individual in all four life contexts.

  4. Risk assessment of groundwater pollution using sensitivity analysis and a worst-case scenario analysis

    OpenAIRE

    Huysmans, Marijke; Madarasz, Tamas; Dassargues, Alain

    2006-01-01

    This paper illustrates how sensitivity analysis and a worst-case scenario analysis can be useful tools in risk assessment of groundwater pollution. The approach is applied to a study area in Hungary with several known groundwater pollution sources and nearby drinking water production wells. The main concern is whether the contamination sources threaten the drinking water wells of the area. A groundwater flow and transport model is set up to answer this question. Due to limited data availabili...

  5. Assessing microstructures of pyrrhotites in basalts by multifractal analysis

    Directory of Open Access Journals (Sweden)

    S. Xie

    2010-07-01

    Full Text Available Understanding and describing spatial arrangements of mineral particles and determining the mineral distribution structure are important to model the rock-forming process. Geometric properties of individual mineral particles can be estimated from thin sections, and different models have been proposed to quantify the spatial complexity of mineral arrangement. The Gejiu tin-polymetallic ore-forming district, located in Yunnan province, southwestern China, is chosen as the study area. The aim of this paper is to apply fractal and multifractal analysis to quantify distribution patterns of pyrrhotite particles from twenty-eight binary images obtained from seven basalt segments and then to discern the possible petrological formation environments of the basalts based on concentrations of trace elements. The areas and perimeters of pyrrhotite particles were measured for each image. Perimeter-area fractal analysis shows that the perimeter and area of pyrrhotite particles follow a power-law relationship, which implies the scale-invariance of the shapes of the pyrrhotites. Furthermore, the spatial variation of the pyrrhotite particles in space was characterized by multifractal analysis using the method of moments. The results show that the average values of the area-perimeter exponent (DAP), the width of the multifractal spectra (Δ(D(0)−D(2)) and Δ(D(qmin)−D(qmax))), and the multifractality index (τ″(1)) for the pyrrhotite particles reach their minimum in the second basalt segment, which implies that the spatial arrangement of pyrrhotite particles in Segment 2 is less heterogeneous. Geochemical trace element analysis results distinguish the second basalt segment sample from other basalt samples. In this aspect, the fractal and multifractal analysis may provide new insights into the quantitative assessment of mineral microstructures which may be closely associated with the petrogenesis as shown by the
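The perimeter-area power law implies P ∝ A^(DAP/2), so DAP is twice the slope of a log-log regression of perimeter on area. A sketch with synthetic particle measurements (the exponent 1.3 and the noise level are assumptions, not the basalt results):

```python
import numpy as np

# Synthetic particle measurements obeying P ~ A**(d_ap/2) with noise.
rng = np.random.default_rng(1)
area = 10 ** rng.uniform(1, 4, size=200)          # areas over 3 decades
d_ap_true = 1.3                                   # assumed true exponent
perimeter = area ** (d_ap_true / 2) * rng.lognormal(0, 0.05, size=200)

# DAP is twice the slope of log(P) versus log(A).
slope, _ = np.polyfit(np.log(area), np.log(perimeter), 1)
d_ap = 2 * slope
print(round(d_ap, 2))
```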

  6. A comparison of integrated safety analysis and probabilistic risk assessment

    International Nuclear Information System (INIS)

    The U.S. Nuclear Regulatory Commission conducted a comparison of two standard tools for risk informing the regulatory process, namely, the Probabilistic Risk Assessment (PRA) and the Integrated Safety Analysis (ISA). PRA is a calculation of risk metrics, such as Large Early Release Frequency (LERF), and has been used to assess the safety of all commercial power reactors. ISA is an analysis required for fuel cycle facilities (FCFs) licensed to possess potentially critical quantities of special nuclear material. A PRA is usually more detailed and uses more refined models and data than an ISA, in order to obtain reasonable quantitative estimates of risk. PRA is considered fully quantitative, while most ISAs are typically only partially quantitative. The extension of PRA methodology to augment or supplant ISAs in FCFs has long been considered. However, fuel cycle facilities have a wide variety of possible accident consequences, rather than a few surrogates like LERF or core damage as used for reactors. It has been noted that a fuel cycle PRA could be used to better focus attention on the most risk-significant structures, systems, components, and operator actions. ISA and PRA both identify accident sequences; however, their treatment is quite different. ISAs identify accidents that lead to high or intermediate consequences, as defined in 10 Code of Federal Regulations (CFR) 70, and develop a set of Items Relied on For Safety (IROFS) to assure adherence to performance criteria. PRAs identify potential accident scenarios and estimate their frequency and consequences to obtain risk metrics. It is acceptable for ISAs to provide bounding evaluations of accident consequences and likelihoods in order to establish acceptable safety; but PRA applications usually require a reasonable quantitative estimate, and often obtain metrics of uncertainty. This paper provides the background, features, and methodology associated with the PRA and ISA. The differences between the

  7. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  8. Analysis of complete logical structures in system reliability assessment

    International Nuclear Information System (INIS)

    The application field of the fault-tree techniques has been explored in order to assess whether the AND-OR structures covered all possible actual binary systems. This resulted in the identification of various situations requiring the complete AND-OR-NOT structures for their analysis. We do not use the term non-coherent for such cases, since the monotonicity or not of a structure function is not a characteristic of a system, but of the particular top event being examined. The report presents different examples of complete fault-trees, which can be examined according to different degrees of approximation. In fact, the exact analysis for the determination of the smallest irredundant bases is very time consuming and actually necessary only in some particular cases (multi-state systems, incidental situations). Therefore, together with the exact procedure, the report shows two different methods of logical analysis that permit the reduction of complete fault-trees to AND-OR structures. Moreover, it discusses the problems concerning the evaluation of the probability distribution of the time to first top event occurrence, once the hypothesis of structure function monotonicity is removed
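A complete (AND-OR-NOT) structure function can be non-monotone for a given top event, as the report notes. A brute-force sketch with a hypothetical three-component tree, where the top event depends on one component *working*:

```python
from itertools import product

# Hypothetical complete structure function: the top event occurs when
# component A fails AND component B works, OR when component C fails.
# True = failed. Non-monotone in B, hence not coherent for this top event.
def top_event(a, b, c):
    return (a and not b) or c

# Enumerate all component states that produce the top event -- a brute-force
# stand-in for finding the smallest irredundant bases, feasible only for
# small trees.
failure_states = [s for s in product([False, True], repeat=3) if top_event(*s)]
print(len(failure_states))
```

Note that repairing B (True → False) can *cause* the top event here, which is exactly the behavior the AND-OR-only formalism cannot express.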

  9. Time-dependent reliability analysis and condition assessment of structures

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)

    1997-01-01

    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.
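A time-dependent reliability computation of the kind described can be sketched with a normally distributed strength whose mean degrades linearly under a fixed load; all parameter values below are illustrative assumptions, not plant data:

```python
import math

# Reliability R(t) = P(strength(t) > load), with mean strength degrading
# linearly at `rate` per year and fixed strength standard deviation.
def reliability(t, r0=100.0, rate=0.5, sigma=10.0, load=60.0):
    mean_t = r0 - rate * t                      # degraded mean strength
    z = (mean_t - load) / sigma                 # safety margin in std units
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF

for t in (0, 20, 40, 60):
    print(t, round(reliability(t), 3))
```

The decision-relevant quantity is how fast R(t) falls below a target over the remaining service life, which is what condition assessment aims to bound.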

  10. A Comparative Analysis of Privacy Impact Assessment in Six Countries

    Directory of Open Access Journals (Sweden)

    David Wright

    2013-02-01

    Full Text Available The European Commission is revising the EU’s data protection framework. One of the changes concerns privacy impact assessment (PIA). This paper argues that the European Commission and the EU Member States should draw on the experience of other countries that have adopted PIA policies and methodologies to construct their own framework. There are similarities and differences in the approaches of Australia, Canada, Ireland, New Zealand, the UK and the US, the countries with the most experience in PIA. Each has its strong points, but also shortcomings. Audits have identified some of the latter in the case of Canada. This paper provides a comparative analysis of the six countries to identify some of the best elements that could be used to improve Article 33 in the European Commission’s proposed Data Protection Regulation.

  11. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work; in particular, understanding how far uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much they can alter a proper analysis of flood risk. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic expression. In order to account for the total uncertainty and understand which factors contribute most to it, we used the method of Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Method application results have better robustness than traditional analysis
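Of the uncertainty-propagation methods named (PCT, Method of Moments, Monte Carlo), Monte Carlo is the simplest to sketch: sample the uncertain inputs, push each sample through the model chain, and read the spread off the output. The rainfall-depth-damage chain and all distributions below are illustrative assumptions, not calibrated values:

```python
import numpy as np

# Monte Carlo uncertainty propagation through a toy flood-damage chain.
rng = np.random.default_rng(42)
n = 100_000

rainfall = rng.lognormal(mean=4.0, sigma=0.3, size=n)   # mm, uncertain input
roughness = rng.normal(0.035, 0.005, size=n)            # Manning n, uncertain
exposure = rng.normal(1.0, 0.1, size=n)                 # damage scaling factor

# Toy chain: rainfall -> flood depth -> damage; each step propagates spread.
depth = 0.01 * rainfall * (roughness / 0.035)
damage = exposure * depth ** 1.5

print(f"mean={damage.mean():.3f}, std={damage.std():.3f}")
```

PCT would replace the brute-force sampling with a polynomial expansion of the output in the random inputs, trading sample count for the cost of building the expansion.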

  12. Cyber threat impact assessment and analysis for space vehicle architectures

    Science.gov (United States)

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.

    2014-06-01

    This paper covers research into an assessment of potential impacts and techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers and/or supply chain threats, can be controlled in a manner to cause physical damage to the space platforms. Similar attacks on Earth-borne cyber physical systems include the Shamoon, Duqu, Flame and Stuxnet exploits. These have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools, for example STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model that has been developed for performing the impact assessment. The paper walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.

  13. Potential Improvements in Human Reliability Analysis for Fire Risk Assessments

    International Nuclear Information System (INIS)

    The results of numerous fire risk assessments (FRA) and the experience gained from actual fire events have shown that fire can be a significant contributor to nuclear power plant (NPP) risk. However, on the basis of reviews of the FRAs performed for the Individual Plant External Events Examination (IPEEE) program in the U.S. and on recent research performed by U.S. Nuclear Regulatory Commission (NRC) to support increased use of risk information in regulatory decision making [e.g., Ref. 1, 2], it has become clear that improved modelling and quantification of human performance during fire events requires a better treatment of the special environment and response context produced by fires. This paper describes fire-related factors that have been identified as potentially impacting human performance, discusses to what extent such factors were modelled in the IPEEE FRAs, discusses prioritization of the factors likely to be most important to a realistic assessment of plant safety, and discusses which factors are likely to need additional research and development in order to allow adequate modelling in the human reliability analysis (HRA) portions of FRAs. The determination of which factors need to be modelled and the improvement of HRA related approaches for modelling such factors are critical aspects of the NRC's plan to improve FRA methods, tools, and data and to update a number of existing FRAs. (authors)

  14. Nutritional assessment and eating habits analysis in young adults.

    Science.gov (United States)

    Nieradko-Iwanicka, Barbara; Borzecki, Andrzej

    2004-01-01

    Good eating habits are an essential part of a healthy lifestyle and help prevent civilisation diseases. BMI and eating-plan analysis are useful in an individual's nutritional assessment. The aim of the study was to assess nutritional status and eating habits in young adults. The average BMI was 23.63 kg/m2 in the interviewed men and 20.6 kg/m2 in the women. The caloric value of the daily eating plans averaged 2943 kcal in men and 2272 kcal in women. Four people were on diets, but none of them had a BMI over 25 kg/m2. No one suffered from food allergies or gastrointestinal diseases. Only one male did sports (weight-lifting) regularly. The majority of the students ate at lunchtime at the university cafeteria or prepared meals themselves. The eating plans varied greatly: most were based on the Eating Guide Pyramid and consisted of three balanced meals during the daytime; there were also single cases of students who mostly ate high-calorie meals at night. PMID:16146124
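The BMI values above follow the standard definition, weight in kilograms divided by the square of height in metres:

```python
# BMI = weight (kg) / height (m)**2; 18.5-25 kg/m2 is the usual normal range.
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

# Hypothetical subject, not from the study:
print(round(bmi(75, 1.78), 2))  # → 23.67, close to the men's reported average
```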

  15. Assessment of water quality of Buna River using microbiological analysis

    Directory of Open Access Journals (Sweden)

    ANILË MEDHA

    2014-06-01

    Full Text Available The Buna River is situated near Shkodra town, between the hill of Rozafa castle and Taraboshi Mountain. It is the only emissary of the Shkodra Lake. Buna River is exposed to different sources of pollution related to urban pollution, sewerage discharge, agricultural activity, and climate change which are associated with an increase in water levels, erosion and floods. This research assesses the quality of water in Buna River, based on the microbiological and physical-chemical analysis. Samples were taken at three different points during years 2013-2014. The analysis will stress out data about heterotrophic and fecal coliform general characteristics, figures, and the role as indicators of water pollution and also information about PH, conductibility and the temperature of water. Microbiological contamination tests show relatively large water contamination, especially in the first sample point where Buna River begins. The high level presence of these microorganisms indicates that the water quality of the river is bad according to standards, presenting a risk to health for all the organisms that inhabit the sweet waters of Buna River.

  16. Integrating multicriteria evaluation and stakeholders analysis for assessing hydropower projects

    International Nuclear Information System (INIS)

    The use of hydroelectric potential and the protection of the river ecosystem are two contrasting aspects that arise in the management of the same resource, generating conflicts between different stakeholders. The purpose of the paper is to develop a multi-level decision-making tool able to support energy planning, with specific reference to the construction of hydropower plants in mountain areas. Starting from a real-world problem concerning the basin of the Sesia Valley (Italy), an evaluation framework based on the combined use of Multicriteria Evaluation and Stakeholders Analysis is proposed in the study. The results of the work show that the methodology is able to support participatory decisions through a traceable and transparent multi-stakeholder assessment process, to highlight the important elements of the decision problem and to support the definition of future design guidelines. - Highlights: • The paper concerns a multi-level decision-making tool able to support energy planning. • The evaluation framework is based on the use of AHP and Stakeholders Analysis. • Hydropower projects in the Sesia Valley (Italy) are evaluated and ranked in the study. • Environmental, economic, technical and sociopolitical criteria have been considered. • 42 stakeholder groups have been included in the evaluation
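Where the Multicriteria Evaluation uses AHP (as the highlights indicate), criterion weights come from the principal eigenvector of a pairwise comparison matrix. The 3x3 judgment matrix below is illustrative, not from the Sesia Valley study:

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three criteria
# (e.g. environmental vs economic vs technical); a[i][j] = how much more
# important criterion i is than criterion j on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights: normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()
print(np.round(w, 3))
```

In a full AHP application one would also check the consistency ratio of the judgment matrix before accepting the weights.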

  17. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on a count of inhabitants or on inhabitant density over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units with a predominance of rural occupation. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are not the ideal setting when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population by building). The study is developed in the Region North of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI, the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs more than two-fold depending on the application of the traditional approach or the dasymetric
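Building areal weighting, the dasymetric step described, redistributes each census unit's population to its buildings in proportion to footprint area. A sketch with toy numbers (not the Lisbon data):

```python
# One census unit with 1000 inhabitants and three buildings whose footprint
# areas (m2) are hypothetical; population is split proportionally to area.
census_population = 1000.0
buildings = {"B1": 120.0, "B2": 300.0, "B3": 80.0}

total_area = sum(buildings.values())
population_by_building = {
    b: census_population * area / total_area for b, area in buildings.items()
}
print(population_by_building)
```

The building-level counts can then be intersected with the susceptibility map directly, instead of assuming the population is spread evenly over the whole administrative unit.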

  18. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    Science.gov (United States)

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.

    2016-04-01

    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities and extensive land drainage practices significantly change hydromorphological behaviour and alter sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics but typically on a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data and qualitative hysteresis metrics enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km2) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, and are broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable and high-permeability supporting arable. Suspended sediment concentration (using calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), numeric hysteresis index, and potential hydro-meteorological controls on sediment transport including precipitation amount, duration, intensity, stream flow and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index. Results showed the numeric
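A numeric hysteresis index of the kind used here can be sketched by comparing suspended sediment concentration on the rising versus falling limb at mid-range discharge (positive for clockwise loops, where sediment peaks before flow); the event data below are synthetic:

```python
import numpy as np

# Synthetic single-event series: discharge rises to a peak then recedes.
q   = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 4.0, 3.0, 2.0, 1.0])   # discharge
ssc = np.array([10., 40., 60., 55., 50., 30., 20., 15., 12.])   # sediment

peak = int(np.argmax(q))
q_mid = (q.min() + q.max()) / 2

# Interpolate SSC at mid-discharge on each limb (falling limb reversed so
# that np.interp sees increasing x values).
ssc_rise = np.interp(q_mid, q[:peak + 1], ssc[:peak + 1])
ssc_fall = np.interp(q_mid, q[peak:][::-1], ssc[peak:][::-1])

hi = (ssc_rise - ssc_fall) / max(ssc_rise, ssc_fall)
print(round(hi, 2))  # → 0.67, a clockwise loop
```

Computed per event over a long turbidity record, such an index lets seasonal shifts in connectivity be tracked statistically rather than from a handful of hand-picked events.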

  19. Imminent Cardiac Risk Assessment via Optical Intravascular Biochemical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, D.; Wetzel, L; Wetzel, M; Lodder, R

    2009-01-01

    Heart disease is by far the biggest killer in the United States, and type II diabetes, which affects 8% of the U.S. population, is on the rise. In many cases, the acute coronary syndrome and/or sudden cardiac death occurs without warning. Atherosclerosis has known behavioral, genetic and dietary risk factors. However, our laboratory studies with animal models and human post-mortem tissue using FT-IR microspectroscopy reveal the chemical microstructure within arteries and in the arterial walls themselves. These include spectra obtained from the aortas of ApoE-/- knockout mice on sucrose and normal diets showing lipid deposition in the former case. Pre-aneurysm chemical images of knockout mouse aorta walls and spectra of plaque excised from a living human patient are also shown for comparison. In keeping with the theme of the SPEC 2008 conference, Spectroscopic Diagnosis of Disease, this paper describes the background and potential value of a new catheter-based system to provide in vivo biochemical analysis of plaque in human coronary arteries. We report the following: (1) results of FT-IR microspectroscopy on animal models of vascular disease to illustrate the localized chemical distinctions between pathological and normal tissue; (2) current diagnostic techniques used for risk assessment of patients with potential unstable coronary syndromes; and (3) the advantages and limitations of each of these techniques illustrated with patient care histories, related in the first person by the physician coauthors. Note that the physician comments clarify the contribution of each diagnostic technique to imminent cardiac risk assessment in a clinical setting, leading to an appreciation of what localized intravascular chemical analysis can contribute as an add-on diagnostic tool. The quality of medical imaging has improved dramatically since the turn of the century. Among clinical non-invasive diagnostic tools, laboratory tests of body fluids, EKG, and physical examination are

  20. An improved rank assessment method for weibull analysis of reliability data

    International Nuclear Information System (INIS)

    Weibull analysis has been applied widely in reliability data analysis. Rank assessment is one of the key steps in Weibull analysis, and it is also a source of the original errors. An improved median rank function obtained by genetic algorithms is presented to reduce the errors of rank assessment. (authors)
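The conventional baseline that such improvements target is Benard's median rank approximation, (i − 0.3)/(n + 0.4), used as the plotting position before fitting the linearized Weibull CDF. A sketch with assumed failure times (not data from the paper):

```python
import math

# Benard's approximation to the median rank of the i-th ordered failure
# out of n -- the standard starting point that refined rank-assessment
# methods aim to improve upon.
def median_rank(i, n):
    return (i - 0.3) / (n + 0.4)

# Hypothetical ordered failure times, then a least-squares fit on the
# linearized Weibull CDF: ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta).
times = [12.0, 25.0, 41.0, 60.0, 95.0]
n = len(times)
x = [math.log(t) for t in times]
y = [math.log(-math.log(1 - median_rank(i, n))) for i in range(1, n + 1)]

xbar, ybar = sum(x) / n, sum(y) / n
beta = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
       sum((xi - xbar) ** 2 for xi in x)
print(round(beta, 2))  # estimated Weibull shape parameter
```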

  1. Analysis of existing risk assessments, and list of suggestions

    CERN Document Server

    Heimsch, Laura

    2016-01-01

    The scope of this project was to analyse risk assessments made at CERN and extracting some crucial information about the different methodologies used, profiles of people who make the risk assessments, and gathering information of whether the risk matrix was used and if the acceptable level of risk was defined. Second step of the project was to trigger discussion inside HSE about risk assessment by suggesting a risk matrix and a risk assessment template.

  2. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  3. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    Science.gov (United States)

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  4. Flood Risk Analysis and Flood Potential Losses Assessment

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The heavy floods in the Taihu Basin showed an increasing trend in recent years. In this work, a typical area in the northern Taihu Basin was selected for flood risk analysis and potential flood losses assessment. Human activities have a strong impact on the study area's flood situation (as affected by the polders built, deforestation, population increase, urbanization, etc.), and have made water levels higher, flood durations shorter, and flood peaks sharper. Five years of different flood return periods [(1970), 5 (1962), 10 (1987), 20 (1954), 50 (1991)] were used to calculate the potential flood risk area and its losses. The potential flood risk map, economic losses, and flood-impacted population were also calculated. The study's main conclusions are: 1) Human activities have strongly changed the natural flood situation in the study area, increasing runoff and flooding; 2) The flood risk area is closely related with the precipitation center; 3) Polder construction has successfully protected land from flood, shortened the flood duration, and elevated water levels in rivers outside the polders; 4) Economic and social development have caused flood losses to increase in recent years.
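The return periods cited relate to annual exceedance probability by p = 1/T, which also gives the chance of experiencing at least one such flood over a planning horizon. A small sketch (the 30-year horizon is an arbitrary example, not from the study):

```python
# Return period T (years) and annual exceedance probability p satisfy p = 1/T.
# Probability of at least one T-year flood over n years: 1 - (1 - 1/T)**n.
def prob_at_least_one(T, n):
    return 1 - (1 - 1 / T) ** n

print(round(prob_at_least_one(50, 30), 2))  # 50-year flood over 30 years
```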

  5. RWMC Performance Assessment/Composite Analysis Monitoring Report - FY-2002

    International Nuclear Information System (INIS)

    US DOE Order 435.1, Radioactive Waste Management, Chapter IV and the associated implementation manual and guidance require monitoring of low-level radioactive waste (LLW) disposal facilities. The Performance Assessment/Composite Analysis (PA/CA) Monitoring program was developed and implemented to meet this requirement. This report represents the results of PA/CA monitoring projects that are available as of September 2002. The technical basis for the PA/CA program is provided in the PA/CA Monitoring Program document and a program description document (PDD) serves as the quality assurance project plan for implementing the PM program. Subsurface monitoring, air pathway surveillance, and subsidence monitoring/control are required to comply with DOE Order 435.1, Chapter IV. Subsidence monitoring/control and air pathway surveillance are performed entirely by other INEEL programs - their work is summarized herein. Subsurface monitoring includes near-field (source) monitoring of buried activated beryllium and steel, monitoring of groundwater in the vadose zone, and monitoring of the Snake River Plain Aquifer. Most of the required subsurface monitoring information presented in this report was gathered from the results of ongoing INEEL monitoring programs. This report also presents results for several new monitoring efforts that have been initiated to characterize any migration of radionuclides in surface sediment near the waste

  6. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    The aim of this thesis is to propose an approach to model the parameter and model uncertainties affecting the results of risk indicators used in applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory is proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data available for the PRA model. The second step propagates parameter uncertainties through the risk model to lay out the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended, firstly, to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and, secondly, to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)
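The belief and plausibility functions of the first step can be sketched with a basic mass assignment over subsets of a frame of discernment; the frame and the numbers below are hypothetical, not from the thesis:

```python
# Minimal Dempster-Shafer sketch: belief and plausibility of a hypothesis
# from a basic mass assignment over subsets of the frame {low, high}.
masses = {
    frozenset({"low"}): 0.5,
    frozenset({"high"}): 0.2,
    frozenset({"low", "high"}): 0.3,  # mass on the whole frame = ignorance
}

def belief(h):
    """Total mass of subsets fully contained in h (lower bound)."""
    return sum(m for s, m in masses.items() if s <= h)

def plausibility(h):
    """Total mass of subsets intersecting h (upper bound)."""
    return sum(m for s, m in masses.items() if s & h)

h = frozenset({"low"})
print(belief(h), plausibility(h))
```

The gap between belief and plausibility (0.5 to 0.8 here) is exactly the epistemic-ignorance interval that a single probability value cannot express.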

  7. Assessment of gene set analysis methods based on microarray data.

    Science.gov (United States)

    Alavi-Majd, Hamid; Khodakarim, Soheila; Zayeri, Farid; Rezaei-Tavirani, Mostafa; Tabatabaei, Seyyed Mohammad; Heydarpour-Meymeh, Maryam

    2014-01-25

    Gene set analysis (GSA) incorporates biological information into statistical knowledge to identify gene sets differentially expressed between two or more phenotypes. It allows us to gain insight into the functional working mechanism of cells beyond the detection of differentially expressed gene sets. In order to evaluate the competence of GSA approaches, three self-contained GSA approaches with different statistical methods were chosen (Category, Globaltest and Hotelling's T²), and their power to identify differentially expressed gene sets was assayed on simulated and real microarray data. Category does not take the correlation structure into account, while the other two deal with correlations. The methods were run in R and Bioconductor, and applied to venous thromboembolism and acute lymphoblastic leukemia microarray data. The results of the three GSAs showed that the competence of these methods depends on the distribution of gene expression in a dataset. It is therefore very important to assay the distribution of gene expression data before choosing a GSA method to identify gene sets differentially expressed between phenotypes. On the other hand, assessment of common genes among significant gene sets indicated significant agreement between the results of GSA and the findings of biologists. PMID:24012817
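As a rough sketch of the Hotelling's T² flavour of self-contained GSA, the two-sample statistic for a single gene set can be computed as below; the simulated expression matrix, set size and mean shift are illustrative assumptions, not the paper's data:

```python
import numpy as np
from scipy.stats import f

# Hedged sketch: two-sample Hotelling's T^2 test for one gene set.
# Real use would slice the rows/columns of a normalized expression matrix.

rng = np.random.default_rng(0)
p = 5                    # genes in the set (invented)
n1, n2 = 20, 20          # samples per phenotype (invented)
x1 = rng.normal(0.0, 1.0, size=(n1, p))
x2 = rng.normal(0.8, 1.0, size=(n2, p))   # shifted mean: the set is differentially expressed

def hotelling_t2(a, b):
    n1, n2 = len(a), len(b)
    diff = a.mean(axis=0) - b.mean(axis=0)
    # Pooled covariance across the two phenotypes.
    s = ((n1 - 1) * np.cov(a, rowvar=False)
         + (n2 - 1) * np.cov(b, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s, diff)
    # Under multivariate normality, a scaled T^2 follows an F distribution.
    k = a.shape[1]
    f_stat = (n1 + n2 - k - 1) / ((n1 + n2 - 2) * k) * t2
    p_val = f.sf(f_stat, k, n1 + n2 - k - 1)
    return t2, p_val

t2, p_val = hotelling_t2(x1, x2)
print(f"T2 = {t2:.2f}, p = {p_val:.3g}")
```

Unlike a per-gene t-test, the pooled covariance matrix lets correlated genes in the set share evidence, which is why the abstract contrasts this method with the correlation-blind Category approach.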

  8. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Science.gov (United States)

    Khoshaim, Heba Bakr; Rashid, Saima

    2016-01-01

    Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic…

  9. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Andersson, Magnus; Sattari-Far, Iradj; Weilin Zang (Inspecta Technology AB, Stockholm (Sweden))

    2009-06-15

    The main purpose of this work is to investigate the significance of residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures, such as the ASME XI code and the R6-method, is believed to be conservative for ductile materials, because these procedures generally do not account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses altogether is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit-load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here, by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied using the finite element method (FEM): (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe; (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe; (iii) 3D modelling of an X-joint weld in a thick-walled pipe. Each crack configuration is analysed for two load cases: (1) only primary (mechanical) loading is applied to the model; (2) both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed.
A new

  10. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    International Nuclear Information System (INIS)

    The main purpose of this work is to investigate the significance of residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures, such as the ASME XI code and the R6-method, is believed to be conservative for ductile materials, because these procedures generally do not account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses altogether is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit-load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here, by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied using the finite element method (FEM): (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe; (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe; (iii) 3D modelling of an X-joint weld in a thick-walled pipe. Each crack configuration is analysed for two load cases: (1) only primary (mechanical) loading is applied to the model; (2) both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed.
A new

  11. SOCIAL NETWORK ANALYSIS FOR ASSESSING SOCIAL CAPITAL IN BIOSECURITY ECOLITERACY

    Directory of Open Access Journals (Sweden)

    Sang Putu Kaler Surata

    2016-02-01

    Abstract: Social Network Analysis for Assessing Social Capital in Biosecurity Ecoliteracy. Biosecurity ecoliteracy (BEL) is a view of literacy that applies ecological concepts to promote in-depth understanding, critical reflection, creative thinking, self-consciousness, and communication and social skills in analyzing and managing issues around plant health/life, animal health/life and the risks associated with the environment. We used social network analysis (SNA) to evaluate two distinct forms of social capital of BEL: social cohesion and network structure. This study was executed by employing cooperative learning in BEL with 30 undergraduate teacher-training students. Data were then analyzed using the UCINET software. We found a tendency for social cohesion to increase after students participated in BEL. This was supported by several SNA measures (density, closeness and degree), whose values at the end differed statistically from those at the beginning of BEL. The social structure map (sociogram) after BEL showed that students were much more likely to cluster in groups than in the sociogram before BEL. Thus BEL, through cooperative learning, was able to promote social capital. In addition, SNA proved a useful tool for evaluating the achievement levels of social capital of BEL in the form of network cohesion and network structure.

  12. Modeling the Assessment of Agricultural Enterprises Headcount Analysis

    OpenAIRE

    Tatyana Viatkina

    2014-01-01

    Modern procedures for assessing the labour resources of enterprises have been analyzed. An algorithm is provided for calculating the enterprise performance-potential efficiency ratio and for assessing the performance potential of the enterprise based on quantitative and qualitative characteristics. The model for assessing the effectiveness of labour management of an enterprise, branch or region, subject to such factors as motivation, labour expenses, staff rotation and its qualifications, has be...

  13. HPLC analysis and safety assessment of coumarin in foods.

    Science.gov (United States)

    Sproll, Constanze; Ruge, Winfried; Andlauer, Claudia; Godelmann, Rolf; Lachenmeier, Dirk W

    2008-07-15

    Coumarin is a component of natural flavourings including cassia, which is widely used in foods and pastries. The toxicity of coumarin has raised some concerns and food safety authorities have set a maximum limit of 2 mg/kg for foods and beverages in general, and a maximum level of 10 mg/l for alcoholic beverages. An efficient method for routine analysis of coumarin is liquid chromatography with diode array detection. The optimal sample preparation for foods containing cinnamon was investigated and found to be cold extraction of a 15 g sample with 50 mL of methanol (80%, v/v) for 30 min using magnetic stirring. In the foods under investigation, appreciable amounts of coumarin were found in bakery products and breakfast cereals (mean 9 mg/kg), with the highest concentrations, up to 88 mg/kg, in certain cookies flavoured with cinnamon. Other foods such as liqueurs, vodka, mulled wine, and milk products did not have coumarin concentrations above the maximum level. The safety assessment of coumarin-containing foods, in the context of governmental food controls, is complicated as a toxicological basis for the maximum limits appears to be missing. The limits were derived at a time when a genotoxic mechanism was assumed. However, this has since been disproven in more recent studies. Our exposure data on coumarin in bakery products show that there is still a need for a continued regulation of coumarin in foods. A toxicological re-evaluation of coumarin with the aim to derive scientifically founded maximum limits should be conducted with priority. PMID:26003373

  14. Sampling and Analysis for Assessment of Body Burdens

    International Nuclear Information System (INIS)

    A review of sampling criteria and techniques and of sample processing methods for indirect assessment of body burdens is presented. The text is limited to the more recent developments in the field of bioassay and to the nuclides which cannot be readily determined in the body directly. A selected bibliography is included. The planning of a bioassay programme should emphasize the detection of high or unusual exposures and the concentrated study of these cases when detected. This procedure gives the maximum amount of data for the dosimetry of individuals at risk and also adds to our scientific background for an understanding of internal emitters. Only a minimum of effort should be spent on sampling individuals having had negligible exposure. The chemical separation procedures required for bioassay also fall into two categories. The first is the rapid method, possibly of low accuracy, used for detection. The second is the more accurate method required for study of the individual after detection of the exposure. Excretion, whether exponential or a power function, drops off rapidly. It is necessary to locate the exposure in time before any evaluation can be made, even before deciding if the exposure is significant. One approach is frequent sampling and analysis by a quick screening technique. More commonly, samples are collected at longer intervals and an arbitrary level of re-sampling is set to assist in the detection of real exposures. It is probable that too much bioassay effort has gone into measurements on individuals at low risk and not enough on those at higher risk. The development of bioassay procedures for overcoming this problem has begun, and this paper emphasizes this facet of sampling and sample processing. (author)

  15. Assessment of academic departments efficiency using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Salah R. Agha

    2011-07-01

    Purpose: In this age of knowledge economy, universities play an important role in the development of a country. As government subsidies to universities have been decreasing, more efficient use of resources becomes important for university administrators. This study evaluates the relative technical efficiencies of academic departments at the Islamic University in Gaza (IUG) during the years 2004-2006. Design/methodology/approach: This study applies Data Envelopment Analysis (DEA) to assess the relative technical efficiency of the academic departments. The inputs are operating expenses, credit hours and training resources, while the outputs are number of graduates, promotions and public service activities. The potential improvements and super efficiency are computed for inefficient and efficient departments respectively. Further, multiple linear regression is used to develop a relationship between super efficiency and input and output variables. Findings: Results show that the average efficiency score is 68.5% and that there are 10 efficient departments out of the 30 studied. It is noted that departments in the faculty of science, engineering and information technology have to greatly reduce their laboratory expenses. The department of economics and finance was found to have the highest super efficiency score among the efficient departments. Finally, it was found that promotions have the greatest contribution to the super efficiency scores, while public service activities come next. Research limitations/implications: The paper focuses only on academic departments at a single university. Further, DEA is deterministic in nature. Practical implications: The findings offer insights on the inputs and outputs that significantly contribute to efficiencies so that inefficient departments can focus on these factors. Originality/value: Prior studies have used only one type of DEA (BCC) and they did not explicitly answer the question posed by the inefficient
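The envelopment form of DEA that underlies such a study reduces to one small linear program per department (decision-making unit). A minimal input-oriented sketch, with an invented four-department dataset rather than the IUG figures, might look like:

```python
import numpy as np
from scipy.optimize import linprog

# Hedged sketch: input-oriented DEA efficiency scores via linear programming.
# Toy data for 4 hypothetical departments (not the IUG data):
# inputs = operating expenses, credit hours; output = number of graduates.
X = np.array([[20., 300.], [30., 280.], [25., 350.], [40., 400.]])  # one row per DMU
Y = np.array([[60.], [50.], [80.], [70.]])

def dea_score(k, X, Y, vrs=True):
    """Input-oriented efficiency of DMU k; vrs=True adds the BCC convexity row."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision vector: [theta, lambda_1..lambda_n]; minimize theta.
    c = np.concatenate(([1.0], np.zeros(n)))
    # Inputs: sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack([-X[k].reshape(-1, 1), X.T])
    # Outputs: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[k]])
    A_eq = b_eq = None
    if vrs:  # BCC: reference set is a convex combination of observed DMUs
        A_eq = np.concatenate(([0.0], np.ones(n))).reshape(1, -1)
        b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # theta in (0, 1]; 1 means the DMU lies on the frontier

scores = [dea_score(k, X, Y) for k in range(len(X))]
```

A score below 1 says the department could, in principle, produce its outputs with that fraction of each input, using the efficient departments as its benchmark.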

  16. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V.

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important precondition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by process-knowledge-based linkage structures, can be generated and processed in connection with knowledge on types of problems, areas of analysis and procedures for dealing with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge, and the correspondingly processed complexity of models, are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  17. Life Cycle Assessment Software for Product and Process Sustainability Analysis

    Science.gov (United States)

    Vervaeke, Marina

    2012-01-01

    In recent years, life cycle assessment (LCA), a methodology for assessment of environmental impacts of products and services, has become increasingly important. This methodology is applied by decision makers in industry and policy, product developers, environmental managers, and other non-LCA specialists working on environmental issues in a wide…

  18. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    Science.gov (United States)

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  19. Defining and Assessing Public Health Functions: A Global Analysis.

    Science.gov (United States)

    Martin-Moreno, Jose M; Harris, Meggan; Jakubowski, Elke; Kluge, Hans

    2016-01-01

    Given the broad scope and intersectoral nature of public health structures and practices, there are inherent difficulties in defining which services fall under the public health remit and in assessing their capacity and performance. The aim of this study is to analyze how public health functions and practice have been defined and operationalized in different countries and regions around the world, with a specific focus on assessment tools that have been developed to evaluate the performance of essential public health functions, services, and operations. Our review has identified nearly 100 countries that have carried out assessments, using diverse analytical and methodological approaches. The assessment processes have evolved quite differently according to administrative arrangements and resource availability, but some key contextual factors emerge that seem to favor policy-oriented follow-up. These include local ownership of the assessment process, policymakers' commitment to reform, and expert technical advice for implementation. PMID:26789385

  20. Teacher Candidates Exposure to Formative Assessment in Educational Psychology Textbooks: A Content Analysis

    Science.gov (United States)

    Wininger, Steven R.; Norman, Antony D.

    2005-01-01

    The purpose of this article is to define formative assessment, outline what is known about the prevalence of formative assessment implementation in the classroom, establish the importance of formative assessment with regards to student motivation and achievement, and present the results of a content analysis of current educational psychology…

  1. Assessing the Validity of Discourse Analysis: Transdisciplinary Convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-01-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to…

  2. Radiological assessment. A textbook on environmental dose analysis

    International Nuclear Information System (INIS)

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides

  3. Radiological assessment. A textbook on environmental dose analysis

    Energy Technology Data Exchange (ETDEWEB)

    Till, J.E.; Meyer, H.R. (eds.)

    1983-09-01

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  4. Fuzzy sensitivity analysis for reliability assessment of building structures

    Science.gov (United States)

    Kala, Zdeněk

    2016-06-01

    The mathematical concept of fuzzy sensitivity analysis, which studies the effects of the fuzziness of input fuzzy numbers on the fuzziness of the output fuzzy number, is described in the article. The output fuzzy number is evaluated using Zadeh's general extension principle. The contribution of stochastic and fuzzy uncertainty in reliability analysis tasks of building structures is discussed. The algorithm of fuzzy sensitivity analysis is an alternative to stochastic sensitivity analysis in tasks in which input and output variables are considered as fuzzy numbers.
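In the common discretized form of Zadeh's extension principle, each fuzzy input is cut at a ladder of membership levels and the resulting intervals are propagated through the response function. A minimal sketch for a monotone toy model (the load/span numbers are invented, not taken from the article):

```python
# Hedged sketch: alpha-cut propagation of triangular fuzzy numbers through a
# monotone response function, a discretized form of Zadeh's extension principle.

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def propagate(f, fuzzy_inputs, levels=11):
    """Output alpha-cuts of f; valid as written for monotone-increasing f,
    where interval endpoints map to interval endpoints."""
    cuts = []
    for i in range(levels):
        alpha = i / (levels - 1)
        los, his = zip(*(alpha_cut(t, alpha) for t in fuzzy_inputs))
        cuts.append((alpha, (f(*los), f(*his))))
    return cuts

def load_effect(load, span):
    # Toy structural response: midspan moment M = q * L^2 / 8.
    return load * span**2 / 8.0

load = (9.0, 10.0, 12.0)   # kN/m, triangular fuzzy number (invented)
span = (5.8, 6.0, 6.1)     # m, triangular fuzzy number (invented)

cuts = propagate(load_effect, [load, span])
# Fuzziness of the output = width of the alpha = 0 cut (the support).
lo0, hi0 = cuts[0][1]
width = hi0 - lo0
```

Repeating the propagation with one input collapsed to its modal (crisp) value and comparing the resulting support widths gives exactly the sensitivity measure the abstract describes: the effect of each input's fuzziness on the output's fuzziness.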

  5. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and post-processor code USAMO. The submodels in the waste disposal system were described and coded with the specification of the exercise. Besides the results required for the exercise, further additional uncertainty and sensitivity analyses were performed and the details of these are also included. (author)
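The sampling-and-post-processing workflow described above can be sketched as below; the toy release model, parameter names and distributions are illustrative assumptions, not the PSACOIN Level 0 specification:

```python
import numpy as np
from scipy.stats import spearmanr

# Hedged sketch: Monte Carlo sampling of uncertain parameters, propagation
# through a deliberately simple release-and-transport submodel, then
# rank-correlation sensitivity measures. All distributions are invented.

rng = np.random.default_rng(42)
n = 10_000
leach_rate  = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)   # 1/yr
retardation = rng.uniform(10.0, 100.0, size=n)                      # sorption factor
dilution    = rng.lognormal(mean=np.log(1e6), sigma=0.3, size=n)    # m^3/yr

inventory = 1e12  # Bq, taken as fixed here
# Toy indicator: steady annual concentration reaching the biosphere.
concentration = inventory * leach_rate / (retardation * dilution)

# Uncertainty analysis: summary percentiles of the output distribution.
p05, p50, p95 = np.percentile(concentration, [5, 50, 95])

# Sensitivity analysis: Spearman rank correlation of each input with the output.
sensitivities = {
    name: spearmanr(x, concentration)[0]
    for name, x in [("leach_rate", leach_rate),
                    ("retardation", retardation),
                    ("dilution", dilution)]
}
```

The sign and magnitude of each rank correlation indicate which parameter uncertainties dominate the spread of the output indicator, which is the kind of additional sensitivity result the report describes.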

  6. RHETORICAL STRUCTURE ANALYSIS FOR ASSESSING COLLABORATIVE PROCESSES IN CSCL

    Directory of Open Access Journals (Sweden)

    Mohammad Hamad Allaymoun

    2015-12-01

    This paper presents research on using rhetorical structures for assessing collaborative processes in Computer-Supported Collaborative Learning (CSCL) chats. For this purpose, the ideas of Bakhtin's dialogism theory and Trausan-Matu's polyphonic model are used, starting from the identification of threads of repeated words in chats. Cue phrases and their usage in linking the identified threads are also considered. The results are presented in statistical tables and graphics that ease the understanding of the collaborative process, helping teachers to analyze and assess students' collaborative chats. They also allow students to see the interactions and how each contributes to the conversation.
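The thread-identification step described above can be sketched as follows; the tiny chat log and stop-word list are invented, and stemming and cue-phrase handling are omitted:

```python
import re
from collections import defaultdict

# Hedged sketch: find content words repeated across utterances of a chat log
# and the turns each "thread" spans. The log below is a toy example.

chat = [
    (1, "Alice", "I think inheritance fits this design"),
    (2, "Bob",   "But inheritance couples the classes tightly"),
    (3, "Carol", "Composition avoids that coupling"),
    (4, "Alice", "So composition plus interfaces, then"),
]

STOP = {"i", "the", "but", "so", "this", "that", "plus", "then", "fits", "and"}

occurrences = defaultdict(list)
for turn, speaker, text in chat:
    for word in re.findall(r"[a-z]+", text.lower()):
        if word not in STOP:
            occurrences[word].append(turn)

# A thread = a word repeated in at least two different turns.
threads = {w: turns for w, turns in occurrences.items() if len(set(turns)) >= 2}
```

Here `threads` recovers "inheritance" spanning turns 1-2 and "composition" spanning turns 3-4; note that without stemming, "couples" and "coupling" do not join into one thread, which is one reason the full method also relies on cue phrases to link threads.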

  7. No-Reference Video Quality Assessment using MPEG Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2013-01-01

    estimate the video coding parameters for MPEG-2 and H.264/AVC which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is done without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods for MPEG-2 and H.264/AVC.

  8. Global Gene Expression Analysis for the Assessment of Nanobiomaterials.

    Science.gov (United States)

    Hanagata, Nobutaka

    2015-01-01

    Using global gene expression analysis, the effects of biomaterials and nanomaterials can be analyzed at the genetic level. Even though the information obtained from global gene expression analysis can be useful for the evaluation and design of biomaterials and nanomaterials, its use for these purposes is not widespread, owing to the difficulties involved in data analysis. Because the expression data of about 20,000 genes can be obtained at once with global gene expression analysis, the data must be analyzed using bioinformatics. A bioinformatic method called gene ontology analysis can estimate the kinds of changes in cell function caused by the genes whose expression levels are changed by biomaterials and nanomaterials. Also, by applying a statistical technique called hierarchical clustering to global gene expression data across a variety of biomaterials, the effects of material properties on cell functions can be estimated. In this chapter, these theories of analysis and examples of applications to nanomaterials and biomaterials are described. Furthermore, global microRNA analysis, a method that has gained attention in recent years, and its application to nanomaterials are introduced. PMID:26201278
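The hierarchical-clustering step mentioned above can be sketched as follows; the simulated (materials × genes) profiles stand in for real normalized expression data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hedged sketch: group hypothetical materials by their global expression
# profiles. Real input would be a (materials x ~20,000 genes) matrix.

rng = np.random.default_rng(1)
base_a = rng.normal(0, 1, 200)          # profile shared by one material family
base_b = rng.normal(0, 1, 200)          # profile of a second family
profiles = np.vstack([
    base_a + rng.normal(0, 0.1, 200),   # material 1
    base_a + rng.normal(0, 0.1, 200),   # material 2 (similar to 1)
    base_b + rng.normal(0, 0.1, 200),   # material 3
    base_b + rng.normal(0, 0.1, 200),   # material 4 (similar to 3)
])

# Correlation distance with average linkage is a common choice for expression data.
z = linkage(profiles, method="average", metric="correlation")
labels = fcluster(z, t=2, criterion="maxclust")
```

Materials that land in the same cluster have similar transcriptome-wide responses, so shared material properties within a cluster become candidates for the property driving the cellular effect.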

  9. Assessing SRI fund performance research : best practices in empirical analysis

    NARCIS (Netherlands)

    Chegut, Andrea; Schenk, H.; Scholtens, B.

    2011-01-01

    We review the socially responsible investment (SRI) mutual fund performance literature to provide best practices in SRI performance attribution analysis. Based on meta-ethnography and content analysis, five themes in this literature require specific attention: data quality, social responsibility ver

  10. Environmental Impact Assessment for Socio-Economic Analysis of Chemicals

    DEFF Research Database (Denmark)

    Calow, Peter; Biddinger, G; Hennes, C;

    This report describes the requirements for, and illustrates the application of, a methodology for a socio-economic analysis (SEA), especially as it might be adopted in the framework of REACH.

  11. ASSESSMENT OF REGIONAL EFFICIENCY IN CROATIA USING DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Danijela Rabar

    2013-02-01

    In this paper, the regional efficiency of Croatian counties is measured over a three-year period (2005-2007) using Data Envelopment Analysis (DEA). The set of inputs and outputs consists of seven socioeconomic indicators. The analysis is carried out using models with the assumption of variable returns to scale. DEA identifies efficient counties as benchmark members and inefficient counties, which are analyzed in detail to determine the sources and amounts of their inefficiency. To enable proper monitoring of development dynamics, window analysis is applied. Based on the results, guidelines for implementing the improvements necessary to achieve efficiency are given. The analysis reveals great disparities among counties. In order to alleviate the naturally, historically and politically conditioned inequalities in county positions, over which economic policy makers do not have total control, a categorical approach is introduced as an extension to the basic DEA models. This approach, combined with window analysis, changes the relations among efficiency scores in favor of the continental counties.

  12. Analysis of the most widely used Building Environmental Assessment methods

    International Nuclear Information System (INIS)

    Building Environmental Assessment (BEA) is a term used for several methods for environmental assessment of the building environment. Generally, Life Cycle Assessment (LCA) is an important foundation and part of the BEA method, but current BEA methods form more comprehensive tools than LCA. Indicators and weight assignments are the two most important factors characterizing BEA. From the comparison of the three most widely used BEA methods, EcoHomes (BREEAM for residential buildings), LEED-NC and GBTool, it can be seen that BEA methods are shifting from ecological, indicator-based scientific systems to more integrated systems covering ecological, social and economic categories. Being relatively new methods, current BEA systems are far from perfect and are under continuous development. The further development of BEA methods will focus more on non-ecological indicators and how to promote implementation. Most BEA methods are developed based on regional regulations and LCA methods, but they do not attempt to replace these regulations. On the contrary, they try to extend implementation by incentive programmes. There are several ways to enhance BEA in the future: expand the studied scope from design levels to whole life-cycle levels of constructions, enhance international cooperation, accelerate legislation and standardize and develop user-oriented assessment systems

  13. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  14. No-Reference Video Quality Assessment by HEVC Codec Analysis

    DEFF Research Database (Denmark)

    Huang, Xin; Søgaard, Jacob; Forchhammer, Søren

    2015-01-01

    transform coefficients, estimates the distortion, and assesses the video quality. The proposed scheme generates VQA features based on Intra coded frames, and then maps features using an Elastic Net to predict subjective video quality. A set of HEVC coded 4K UHD sequences are tested. Results show that the...

  15. Facet Analysis of the Client Needs Assessment Instrument.

    Science.gov (United States)

    Dancer, L. Suzanne; Stanley, Lawrence R.

    The structure of the revised Client Needs Assessment Instrument (CNAI) is examined. In 1978-79, the Texas Department of Human Resources (DHR) developed the CNAI to provide an index of applicants' and clients' capacity for self-care by measuring the respondents' levels of functioning in: (1) physical health; (2) daily living activities; (3) mental…

  16. Using Empirical Article Analysis to Assess Research Methods Courses

    Science.gov (United States)

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  17. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte; Sørensen, Poul

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  18. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.; Sørensen, P.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  19. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.; Sørensen, Poul Ejnar

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a...

  20. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    Science.gov (United States)

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
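The core of a parametric bootstrap, as in the MPA generalization above, can be sketched in a few lines: assume a null model, simulate many datasets from it, recompute the test statistic on each, and locate the observed statistic in the simulated distribution. The sketch below is a generic Python illustration, not the authors' Marginal Maximum Likelihood factor-analysis implementation; the toy null model and statistic are hypothetical.

```python
import random

def parametric_bootstrap_pvalue(observed_stat, simulate_null, n_boot=1000, seed=0):
    """Generic parametric bootstrap: simulate the test statistic under a
    fitted null model and locate the observed statistic in that distribution."""
    rng = random.Random(seed)
    null_stats = [simulate_null(rng) for _ in range(n_boot)]
    # Add-one correction keeps the p-value strictly positive
    exceed = sum(1 for s in null_stats if s >= observed_stat)
    return (exceed + 1) / (n_boot + 1)

# Hypothetical null model: the statistic is the maximum of 5 unit-normal draws
def simulate_max_of_normals(rng):
    return max(rng.gauss(0.0, 1.0) for _ in range(5))

p_value = parametric_bootstrap_pvalue(4.0, simulate_max_of_normals)
```

A small p-value indicates the observed statistic is unlikely under the null model, which in the MPA setting would suggest additional dimensionality.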

  1. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youn Myoung; Jung, Jong Tae; Kang, Chul Hyung (and others)

    2008-04-15

    A modeling study and analysis providing technical support for the safety and performance assessment of the low- and intermediate-level waste (LILW) repository, partially needed for the radiological environmental impact report that is essential for licensing the construction and operation of the repository, has been fulfilled. Throughout this study, technical support for the safety and performance assessment of the LILW repository and its licensing has been carried out in such essential areas as gas generation and migration in and around the repository, risk analysis and environmental impact during transportation of LILW, biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure, as well as regional and global groundwater modeling and analysis.

  2. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    International Nuclear Information System (INIS)

    A modeling study and analysis providing technical support for the safety and performance assessment of the low- and intermediate-level waste (LILW) repository, partially needed for the radiological environmental impact report that is essential for licensing the construction and operation of the repository, has been fulfilled. Throughout this study, technical support for the safety and performance assessment of the LILW repository and its licensing has been carried out in such essential areas as gas generation and migration in and around the repository, risk analysis and environmental impact during transportation of LILW, biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure, as well as regional and global groundwater modeling and analysis

  3. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process related risk faced by KOC and included root cause management system related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network with the scope divided into three major categories: (1) integrity management, (2) operations, and (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess for bottlenecks, surge and erosion analysis, and to identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad and three-pronged approach to their overall risk assessment, KOC achieved a thorough, root cause analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  4. FINANCIAL ANALYSIS FUNDAMENT FOR ASSESSMENT THE COMPANY'S VALUE

    Directory of Open Access Journals (Sweden)

    Goran Karanovic

    2010-06-01

    Full Text Available The lack of capital market development means that calculating the value of companies in small markets, such as the Croatian market, is carried out primarily through the analysis of financial statements. The lack of market development is evident from unrealistic and unobjective corporate values, a result of the too-small volume of securities trading in financial markets. Primary financial analysis is the basic method for estimating company value and represents the foundation for an objective determination of the cash flow components that will be discounted. Through analysis, investors are trying to answer questions such as: the status of assets, liabilities and capital; the dynamics of the business enterprise; the level of solvency and liquidity; the utilization of fixed assets; the contribution of fixed assets to total income; company profitability rates; and investment in the company. Investors use financial analysis only as a basis and as a tool to predict the potential for creating new business value.

  5. Transboundary diagnostic analysis. Vol. 2. Background and environmental assessment

    OpenAIRE

    2012-01-01

    The Transboundary Diagnostic Analysis (TDA) quantifies and ranks water-related environmental transboundary issues and their causes according to the severity of environmental and/or socio-economic impacts. The three main issues in the BOBLME are: overexploitation of marine living resources; degradation of mangroves, coral reefs and seagrasses; and pollution and water quality. Volume 2 contains background material that sets out the bio-physical and socio-economic characteristics of the BOBLME; an analysi...

  6. Analysis of online quizzes as a teaching and assessment tool

    Directory of Open Access Journals (Sweden)

    Lorenzo Salas-Morera

    2012-03-01

    Full Text Available This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the effectiveness of quizzes on student performance when used not only as an isolated assessment tool, but also when integrated into a combined strategy that supports the overall programming of the subject. The results obtained during the five years of experimentation with online quizzes show that such quizzes have a proven positive influence on students' academic performance. Furthermore, surveys conducted at the end of each course revealed the high value students accord to the use of online quizzes in course instruction.

  7. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    OpenAIRE

    Aghaeepour, Nima; Finak, Greg; ,; Hoos, Holger; Mosmann, Tim R; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  8. Paediatric neuropsychological assessment: an analysis of parents' perspectives

    OpenAIRE

    Stark, Daniel; Thomas, Sophie; Dawson, Dave; Talbot, Emily; Bennett, Emily; Starza-Smith, Arleta

    2014-01-01

    Purpose: Modern healthcare services are commonly based on shared models of care, in which a strong emphasis is placed upon the views of those in receipt of services. The purpose of this paper is to examine the parents' experiences of their child's neuropsychological assessment. Design/methodology/approach: This was a mixed-methodology study employing both quantitative and qualitative measures. Findings: The questionnaire measure indicated a high overall level of satisfaction. Qualitative anal...

  9. An assessment and analysis of dietary practices of Irish jockeys

    OpenAIRE

    O'Loughlin, Gillian

    2014-01-01

    Background: Horse racing is a weight category sport in which jockeys must chronically maintain a low body mass to compete, over a protracted season. The need to relentlessly align body mass with racing limits appears to encourage the use of short-term and potentially dangerous acute weight loss strategies. The purpose of this study was to investigate and assess the dietary habits of Irish Jockeys using established methods as well as incorporating novel sensing technologies. Methods: The ...

  10. Vulnerability of assessing water resources by the improved set pair analysis

    Directory of Open Access Journals (Sweden)

    Yang Xiao-Hua

    2014-01-01

    Full Text Available Climate change, with global warming, has tremendously changed hydrological processes. There are many uncertainties in assessing water resources vulnerability. To assess water resources vulnerability rationally under climate change, an improved set pair analysis model is established, in which set pair analysis theory is introduced and the weights are determined by the analytic hierarchy process method. The index systems and criteria of water resources vulnerability assessment in terms of the water cycle, socio-economy and ecological environment are established based on an analysis of sensibility and adaptability. The improved set pair analysis model is used to assess water resource vulnerability in Ningxia with twelve indexes under four future climate scenarios. The certain and uncertain information quantity of water resource vulnerability is calculated by connection numbers in the improved set pair analysis model. Results show that Ningxia has higher vulnerability under the climate change scenarios. Compared with the fuzzy assessment model and the artificial neural network model, the improved set pair analysis model can take full advantage of certain and uncertain knowledge, and of subjective and objective information. The improved set pair analysis is an extension to the vulnerability assessment model of water resources systems.
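The connection number at the heart of set pair analysis expresses an assessment as mu = a/n + (b/n)i + (c/n)j, where a, b and c count the identical, discrepant and contrary items in a set pair of n items. A minimal illustration follows; it is not the paper's twelve-index Ningxia model, and the scores and thresholds are hypothetical.

```python
def connection_number(values, identical, contrary):
    """Set pair analysis connection degree mu = a/n + (b/n)*i + (c/n)*j:
    a counts items identical to the criterion (certain agreement),
    c counts items contrary to it (certain disagreement),
    and b = n - a - c counts the discrepant (uncertain) items."""
    n = len(values)
    a = sum(1 for v in values if identical(v))
    c = sum(1 for v in values if contrary(v))
    b = n - a - c
    return a / n, b / n, c / n  # coefficients of 1, i, j

# Hypothetical indicator scores classified against two thresholds
scores = [0.9, 0.8, 0.75, 0.5, 0.45, 0.4, 0.3, 0.2, 0.15, 0.1]
a_frac, b_frac, c_frac = connection_number(
    scores,
    identical=lambda v: v >= 0.7,  # meets the criterion
    contrary=lambda v: v < 0.3,    # clearly fails the criterion
)
```

The b/n term carries the uncertain information that distinguishes set pair analysis from a purely crisp classification.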

  11. Tiger Team Assessments seventeen through thirty-five: A summary and analysis

    International Nuclear Information System (INIS)

    This report provides a summary and analysis of the Department of Energy's (DOE's) 19 Tiger Team Assessments that were conducted from October 1990 to July 1992. The sites are listed in the box below, along with their respective program offices and assessment completion dates. This analysis relied solely on the information contained in the Tiger Team Assessment Reports. The findings and concerns documented by the Tiger Teams provide a database of information about the then-current ES&H programs and practices. Program Secretarial Officers (PSOs) and field managers may use this information, along with other sources (such as the Corrective Action Plans, Progress Assessments, and Self-Assessments), to address the ES&H deficiencies found, prioritize and plan appropriate corrective actions, measure progress toward solving the problems, strengthen and transfer knowledge about areas where site performance exemplified the ES&H mindset, and so forth. Further analyses may be suggested by the analysis presented in this report

  12. Quantitative assessment of human motion using video motion analysis

    Science.gov (United States)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  13. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    Science.gov (United States)

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures. PMID:27575259

  14. Manage Stakeholders approach for analysis and risk assessment in the implementation of innovative projects

    OpenAIRE

    СУХОНОС, Марія Костянтинівна; Угоднікова, Олена Ігорівна

    2012-01-01

    The problem of innovation project risk management, notably Manage Stakeholders risks, is considered in this article. A methodology for the analysis and assessment of Manage Stakeholders risks in innovation projects is suggested

  15. Verification and validation of the decision analysis model for assessment of TWRS waste treatment strategies

    International Nuclear Information System (INIS)

    This document is the verification and validation final report for the Decision Analysis Model for Assessment of Tank Waste Remediation System Waste Treatment Strategies. This model is also known as the INSIGHT Model

  16. Evaluation of auto-assessment method for C-D analysis based on support vector machine

    International Nuclear Information System (INIS)

    Contrast-Detail (C-D) analysis is one of the visual quality assessment methods in medical imaging, and many auto-assessment methods for C-D analysis have been developed in recent years. However, for the auto-assessment method for C-D analysis, the effects of nonlinear image processing are not clear. So, we have made an auto-assessment method for C-D analysis using a support vector machine (SVM), and have evaluated its performance for the images processed with a noise reduction method. The feature indexes used in the SVM were the normalized cross correlation (NCC) coefficient on each signal between the noise-free and noised image, the contrast to noise ratio (CNR) on each signal, the radius of each signal, and the Student's t-test statistic for the mean difference between the signal and background pixel values. The results showed that the auto-assessment method for C-D analysis by using Student's t-test statistic agreed well with the visual assessment for the non-processed images, but disagreed for the images processed with the noise reduction method. Our results also showed that the auto-assessment method for C-D analysis by the SVM made of NCC and CNR agreed well with the visual assessment for the non-processed and noise-reduced images. Therefore, the auto-assessment method for C-D analysis by the SVM will be expected to have the robustness for the non-linear image processing. (author)
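Two of the feature indexes named above, the normalized cross-correlation (NCC) coefficient and the contrast-to-noise ratio (CNR), are straightforward to compute. The sketch below is a minimal illustration with made-up pixel values, and makes no claim to match the authors' exact region definitions or SVM setup.

```python
import math

def ncc(ref, img):
    """Normalized cross-correlation between a noise-free reference
    signal region and the corresponding region of the noised image."""
    mr = sum(ref) / len(ref)
    mi = sum(img) / len(img)
    num = sum((r - mr) * (x - mi) for r, x in zip(ref, img))
    den = math.sqrt(sum((r - mr) ** 2 for r in ref) *
                    sum((x - mi) ** 2 for x in img))
    return num / den

def cnr(signal_pixels, background_pixels):
    """Contrast-to-noise ratio of a signal region against the background,
    using the sample standard deviation of the background."""
    ms = sum(signal_pixels) / len(signal_pixels)
    mb = sum(background_pixels) / len(background_pixels)
    sd = math.sqrt(sum((b - mb) ** 2 for b in background_pixels)
                   / (len(background_pixels) - 1))
    return (ms - mb) / sd

# Hypothetical 1-D pixel profiles standing in for image regions
r = ncc(ref=[10, 12, 14, 16, 18], img=[11, 12, 13, 17, 17])
c = cnr(signal_pixels=[120, 118, 122],
        background_pixels=[100, 98, 102, 101, 99])
```

In the scheme described above, features like these (per signal) would be fed into an SVM alongside the signal radius and the t-test statistic.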

  17. Model Analysis Assessing the dynamics of student learning

    CERN Document Server

    Bao, L; Bao, Lei; Redish, Edward F.

    2002-01-01

    In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models. As a result, they can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple-choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.

  18. Using the statistical analysis method to assess the landslide susceptibility

    Science.gov (United States)

    Chan, Hsun-Chuan; Chen, Bo-An; Wen, Yo-Ting

    2015-04-01

    This study assessed the landslide susceptibility in the Jing-Shan River upstream watershed, central Taiwan. The landslide inventories during typhoons Toraji in 2001, Mindulle in 2004, Kalmaegi and Sinlaku in 2008, Morakot in 2009, and the 0719 rainfall event in 2011, which were established by the Taiwan Central Geological Survey, were used as landslide data. This study aims to assess the landslide susceptibility using different statistical methods, including logistic regression, the instability index method and the support vector machine (SVM). After evaluation, elevation, slope, slope aspect, lithology, terrain roughness, slope roughness, plan curvature, profile curvature, total curvature and average rainfall were chosen as the landslide factors. The validity of the three established models was further examined by the receiver operating characteristic (ROC) curve. The results of logistic regression showed that terrain roughness and slope roughness had a stronger impact on the susceptibility value, while the instability index method showed that terrain roughness and lithology had a stronger impact. The use of the instability index method may lead to underestimation near the river side, and it also raises a potential issue concerning the number of factor classes: increasing the number of classes may cause an excessive variation coefficient of the factor, while decreasing it may place a large range of nearby cells in the same susceptibility level. Finally, the ROC curve was used to discriminate among the three models. SVM is preferred over the other methods for assessing landslide susceptibility; moreover, SVM is suggested to perform close to logistic regression in recognizing the medium-high and high susceptibility levels.
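The receiver operating characteristic comparison used to discriminate among such models reduces to a single number, the area under the ROC curve (AUC): the probability that a randomly chosen landslide cell receives a higher susceptibility score than a randomly chosen stable cell. A minimal sketch with hypothetical validation data, not the study's actual inventory:

```python
def roc_auc(labels, scores):
    """AUC via the rank (Mann-Whitney) formulation: the fraction of
    (positive, negative) pairs where the positive is scored higher,
    with ties counted as 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical validation cells: 1 = observed landslide, 0 = stable
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1]  # model susceptibility values
auc = roc_auc(labels, scores)
```

An AUC of 0.5 corresponds to random guessing; values approaching 1.0 indicate a model that separates landslide from stable cells well.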

  19. Numerical analysis and geotechnical assessment of mine scale model

    Institute of Scientific and Technical Information of China (English)

    Khanal Manoj; Adhikary Deepak; Balusu Rao

    2012-01-01

    Various numerical methods are available to model, simulate, analyse and interpret the results; however, a major task is to select a reliable and intended tool to perform a realistic assessment of any problem. For a model to be representative of the realistic mining scenario, a verified tool must be chosen to perform an assessment of mine roof support requirements and address the geotechnical risks associated with longwall mining. Dependable tools provide a safe working environment, increased production, efficient management of resources and reduced environmental impacts of mining. Although various methods, for example analytical, experimental and empirical, are being adopted in mining, in recent days numerical tools are becoming popular due to the advancement in computer hardware and numerical methods. Empirical rules based on past experiences do provide a general guide; however, due to the heterogeneous nature of mine geology (i.e., none of the mine sites are identical), numerical simulations of mine site specific conditions would lend better insights into some underlying issues. The paper highlights the use of a continuum mechanics based tool in coal mining with a mine scale model. Continuum modelling can provide close to accurate stress fields and deformation. The paper describes the use of existing mine data to calibrate and validate the model parameters, which then are used to assess geotechnical issues related to installing a new high capacity longwall mine at the mine site. A variety of parameters, for example chock convergences, caveability of overlying sandstones, and abutment and vertical stresses, have been estimated.

  20. Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)

    International Nuclear Information System (INIS)

    Rockwell International, as operating contractor at the Rocky Flats plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, one of them being the Building 707 Final Safety Analysis Report, June 87 (707FSAR), and a Plant Safety Analysis Report. The Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85) documents the methodologies that were used for those FSARs. Resources available for preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort

  1. Teacher Analysis of Student Knowledge (TASK): A Measure of Learning Trajectory-Oriented Formative Assessment

    Science.gov (United States)

    Supovitz, Jonathan; Ebby, Caroline B.; Sirinides, Philip

    2013-01-01

    This interactive electronic report provides an overview of an innovative new instrument developed by researchers at the Consortium for Policy Research in Education (CPRE) to authentically measure teachers' formative assessment practices in mathematics. The Teacher Analysis of Student Knowledge, or TASK, instrument assesses mathematics…

  2. Literary translation and quality assessment analysis – its significance in translation training

    OpenAIRE

    Rodríguez, Beatriz Maria

    2004-01-01

    This paper aims to highlight the role of translation quality assessment in translation training so as to develop students’ translation competence and skills to face translation problems. An analysis to assess literary translation quality is proposed before proceeding to discuss its pedagogical implementation.

  3. A Risk-Analysis Approach to Implementing Web-Based Assessment

    Science.gov (United States)

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  4. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  5. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those that may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  6. The Analysis of Ease of Doing Business Assessment Methods

    Directory of Open Access Journals (Sweden)

    Mindaugas Samoška

    2011-07-01

    Full Text Available The study deals with ease of doing business assessment models. The analysed models describe the conditions for doing business in the country being ranked and evaluated. However, a comparative analysis of the five best-known models reveals an obvious need for methodological improvement: although the factors evaluated are quite similar across models, different data aggregation principles (quantitative and qualitative methods of aggregation) lead to different results. After analysing all five methods, we offer criticism of them and insights for possible improvement. Article in Lithuanian

  7. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Directory of Open Access Journals (Sweden)

    Heba Bakr Khoshaim

    2016-01-01

    Full Text Available Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied the questions used in the final exams during the academic years 2013–2014 and 2014–2015. Using data from 206 students, the researchers analyzed 54 exam questions with regard to the complexity level, the difficulty coefficient and the discrimination coefficient. Findings indicated that the complexity level correlated with the difficulty coefficient for only one of the three semesters. In addition, the correlation between the discrimination coefficient and the difficulty coefficient was found to be statistically significant in all three semesters. The results suggest that all three exams were acceptable; however, further attention should be given to the complexity level of questions used in mathematical tests, and questions of moderate difficulty are better at classifying students' performance.
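    For a dichotomously scored item, the two coefficients analysed in this record can be computed roughly as follows. This is a minimal sketch assuming the classical definitions (proportion-correct difficulty, upper-minus-lower-group discrimination); the paper does not reproduce its exact formulas, and all numbers below are invented for illustration.

    ```python
    def difficulty(responses):
        """Difficulty coefficient: proportion of students answering the item correctly.
        `responses` is a list of 0/1 scores for one item."""
        return sum(responses) / len(responses)

    def discrimination(responses, totals, fraction=0.27):
        """Discrimination via upper-lower groups: item difficulty in the top
        `fraction` of students (ranked by exam total) minus that in the bottom."""
        ranked = [r for _, r in sorted(zip(totals, responses), key=lambda p: p[0])]
        k = max(1, int(len(ranked) * fraction))
        lower, upper = ranked[:k], ranked[-k:]
        return sum(upper) / k - sum(lower) / k

    item = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]              # illustrative item scores
    totals = [55, 60, 30, 58, 25, 70, 65, 28, 62, 59]  # illustrative exam totals
    print(difficulty(item))               # → 0.7
    print(discrimination(item, totals))   # → 1.0
    ```

    With this scheme a difficulty near 0.5 and a clearly positive discrimination are usually taken as signs of a well-functioning item.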

  8. Pesticide Flow Analysis to Assess Human Exposure in Greenhouse Flower Production in Colombia

    OpenAIRE

    Binder, Claudia R.; Camilo Lesmes-Fabian

    2013-01-01

    Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. This approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry ...

  9. An integrated factor analysis model for product eco-design based on full life cycle assessment

    OpenAIRE

    Zhi fang Zhou; Tian Xiao; Da yuan Li

    2016-01-01

    Purpose: Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods only emphasize one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose ...

  10. Vulnerability of assessing water resources by the improved set pair analysis

    OpenAIRE

    Yang Xiao-Hua; He Jun; Di Cong-Li; Li Jian-Qiang

    2014-01-01

    Global warming has brought tremendous changes to hydrological processes, and there are many uncertainties in assessing water resources vulnerability. To assess water resources vulnerability rationally under climate change, an improved set pair analysis model is established, in which set pair analysis theory is introduced and the weights are determined by the analytic hierarchy process method. The index systems and criteria of water resources ...

  11. Fear Assessment: Cost-Benefit Analysis and the Pricing of Fear and Anxiety

    OpenAIRE

    Adler, Matthew D.

    2003-01-01

    "Risk assessment" is now a common feature of regulatory practice, but "fear assessment" is not.In particular, environmental, health and safety agencies such as EPA, FDA, OSHA, NHTSA, and CPSC, commonly count death, illness and injury as "costs" for purposes of cost-benefit analysis, but almost never incorporate fear, anxiety or other welfare-reducing mental states into the analysis.This is puzzling, since fear and anxiety are welfare setbacks, and since the very hazards regulated by these age...

  12. Concentration Analysis: A Quantitative Assessment of Student States.

    Science.gov (United States)

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)
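    The concentration measure introduced by Bao and Redish maps the distribution of responses to an m-choice item onto [0, 1]: C = 0 for a uniform (random) spread and C = 1 when every student selects the same option. A minimal sketch of the published formula, C = (√m/(√m−1))·(√(Σnᵢ²)/N − 1/√m), with invented response counts:

    ```python
    import math

    def concentration(counts):
        """Bao-Redish concentration factor for one multiple-choice item.
        `counts[i]` is the number of students who chose option i of m options."""
        m = len(counts)
        total = sum(counts)
        return (math.sqrt(m) / (math.sqrt(m) - 1)) * (
            math.sqrt(sum(c * c for c in counts)) / total - 1 / math.sqrt(m)
        )

    print(concentration([20, 0, 0, 0, 0]))  # ~1: everyone picks the same option
    print(concentration([4, 4, 4, 4, 4]))   # ~0: responses spread uniformly
    ```

    Intermediate values flag items where students cluster on a few distractors, which is exactly the pattern the method is designed to surface.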

  13. Dynamic Assessment of Functional Lipidomic Analysis in Human Urine.

    Science.gov (United States)

    Rockwell, Hannah E; Gao, Fei; Chen, Emily Y; McDaniel, Justice; Sarangarajan, Rangaprasad; Narain, Niven R; Kiebish, Michael A

    2016-07-01

    The development of enabling mass spectrometry platforms for the quantification of diverse lipid species in human urine is of paramount importance for understanding metabolic homeostasis in normal and pathophysiological conditions. Urine represents a non-invasive biofluid that can capture distinct differences in an individual's physiological status. However, currently there is a lack of quantitative workflows to engage in high throughput lipidomic analysis. This study describes the development of a MS/MS(ALL) shotgun lipidomic workflow and a micro liquid chromatography-high resolution tandem mass spectrometry (LC-MS/MS) workflow for urine structural and mediator lipid analysis, respectively. This workflow was deployed to understand biofluid sample handling and collection, extraction efficiency, and natural human variation over time. Utilization of 0.5 mL of urine for structural lipidomic analysis resulted in reproducible quantification of more than 600 lipid molecular species from over 20 lipid classes. Analysis of 1 mL of urine routinely quantified in excess of 55 mediator lipid metabolites comprised of octadecanoids, eicosanoids, and docosanoids generated by lipoxygenase, cyclooxygenase, and cytochrome P450 activities. In summary, the high-throughput functional lipidomics workflow described in this study demonstrates an impressive robustness and reproducibility that can be utilized for population health and precision medicine applications. PMID:27038173

  14. Technical quality assessment of an optoelectronic system for movement analysis

    International Nuclear Information System (INIS)

    Optoelectronic systems (OS) are widely used in gait analysis to evaluate the motor performance of healthy subjects and patients. The accuracy of marker trajectory reconstruction depends on several aspects: the number of cameras, the dimension and position of the calibration volume, and the chosen calibration procedure. In this paper we propose a methodology to evaluate the effects of these sources of error on the reconstruction of marker trajectories. The novel contribution of the present work is that the dimensions of the tested calibration volumes are comparable with those normally used in gait analysis; in addition, to simulate trajectories during clinical gait analysis, we provide non-default marker paths as inputs. Several calibration procedures are implemented and the same trial is processed with each calibration file, also considering different camera configurations. The RMSEs between the measured trajectories and the optimal ones are calculated for each comparison. To investigate significant differences between the computed indices, an ANOVA is performed. The RMSE is sensitive to variations in the calibration volume and the camera configuration and is always below 43 mm.
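    The RMSE criterion used above reduces to a simple computation over corresponding 3D points. A sketch (not the authors' implementation; coordinates are invented) comparing a measured marker path against a reference path:

    ```python
    import math

    def trajectory_rmse(measured, reference):
        """Root-mean-square error between a measured 3D marker trajectory and
        its reference, each given as a sequence of (x, y, z) points in mm."""
        assert len(measured) == len(reference)
        sq_errors = [
            (mx - rx) ** 2 + (my - ry) ** 2 + (mz - rz) ** 2
            for (mx, my, mz), (rx, ry, rz) in zip(measured, reference)
        ]
        return math.sqrt(sum(sq_errors) / len(sq_errors))

    ref = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (20.0, 0.0, 0.0)]   # ideal straight path
    meas = [(0.5, 0.0, 0.0), (10.5, 0.0, 0.0), (20.5, 0.0, 0.0)]  # reconstructed path
    print(trajectory_rmse(meas, ref))  # → 0.5
    ```

    Repeating this over every calibration file and camera configuration yields the per-condition indices that the ANOVA then compares.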

  15. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    DEFF Research Database (Denmark)

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars;

    2015-01-01

    intermediates and 9 experienced bronchoscopy operators performed 3 procedures each on a bronchoscopy simulator. The Microsoft Kinect system was used to automatically measure the total deviation of the scope from a perfectly straight, vertical line. Results: The low-cost motion analysis system could measure the...

  16. Biogas upgrading technologies:Energetic analysis and environmental impact assessment

    Institute of Scientific and Technical Information of China (English)

    Yajing Xu; Ying Huang; Bin Wu; Xiangping Zhang; Suojiang Zhang

    2015-01-01

    Biogas upgrading, i.e. removing CO2 and other trace components from raw biogas, is a necessary step before the biogas can be used as a vehicle fuel or supplied to the natural gas grid. In this work, three technologies for biogas upgrading, i.e. pressurized water scrubbing (PWS), monoethanolamine aqueous scrubbing (MAS) and ionic liquid scrubbing (ILS), are studied and assessed in terms of their energy consumption and environmental impacts using process simulation and the green degree method. A non-random two-liquid and Henry's law property method for a CO2 separation system with the ionic liquid 1-butyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide ([bmim][Tf2N]) is established and verified against experimental data. The assessment results indicate that the specific energy consumption of ILS and PWS is almost the same and much less than that of MAS. High-purity CO2 product can be obtained by the MAS and ILS methods, whereas no pure CO2 is recovered with PWS. From the environmental perspective, ILS has the highest green degree production value, while MAS and PWS cause serious environmental impacts.

  17. Forest ecosystem health assessment and analysis in China

    Institute of Scientific and Technical Information of China (English)

    XIAO Fengjin; OUYANG Hua; ZHANG Qiang; FU Bojie; ZHANG Zhicheng

    2004-01-01

    Based on surveying data from more than 300 forest sample plots, forestry statistical data, remote sensing information from the NOAA AVHRR database and daily meteorological data from 300 stations, we selected vigor, organization and resilience as indicators to assess large-scale forest ecosystem health in China, and analyzed the spatial pattern of forest ecosystem health and its influencing factors. The assessment indicated that forest ecosystem health shows a decreasing trend along both latitude and longitude gradients. The healthy forests are mainly distributed in natural forests, tropical rainforests and seasonal rainforests, followed by the northeast national forest zone, the subtropical forest zone and the southwest forest zone, while the unhealthy forests were mainly located in the warm temperate zone and the Xinjiang-Mongolia forest zone. The coefficient of correlation between the Forest Ecosystem Health Index (FEHI) and annual average precipitation was 0.58 (p<0.01), while that between the FEHI and annual mean temperature was 0.49 (p<0.01), indicating that precipitation and temperature both affect the pattern of the FEHI, with precipitation having the stronger effect. We also measured the correlation coefficients between the FEHI and NPP, biodiversity and resistance, which were 0.64, 0.76 and 0.81 (p<0.01) respectively. The order of effect on forest ecosystem health was vigor, organization and resistance.

  18. Analysis and Pollution Assessment of Heavy Metal in Soil, Perlis

    International Nuclear Information System (INIS)

    Concentrations of 5 heavy metals (Cu, Cr, Ni, Cd, Pb) were studied in soils around Perlis to assess the distribution of heavy metal contamination due to industrialization, urbanization and agricultural activities. Soil samples were collected at a depth of 0-15 cm at eighteen stations around Perlis. The soil samples (2 mm) were collected in duplicate and subjected to hot-block digestion, and the total metal concentration was determined via ICP-MS. Overall concentrations of Cu, Cr, Ni, Cd and Pb in the soil samples ranged from 0.38-240.59, 0.642-3.921, 0.689-2.398, 0-0.63 and 0.39-27.47 mg/kg respectively. The heavy metal concentrations in the soil display the following decreasing trend: Cu > Pb > Cr > Ni > Cd. From these results, it was found that heavy metal levels in soil near the centralized Chuping industrial area reach the maximum values for Perlis. The pollution index revealed that only 11 % of Cu samples and 6 % of Cd samples were classed as heavily contaminated; Cu and Pb each showed moderate contamination in 6 % of all samples, and the other elements showed low contamination. The combined heavy metal concentrations and pollution assessment indicate that industrial activities and traffic emissions represent the most important sources for Cu, Cd and Pb, whereas Cr and Ni come mainly from natural sources. Increasing anthropogenic influences on the environment, especially pollution loadings, have caused negative changes in natural ecosystems and decreased biodiversity. (author)
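    Pollution indices of this kind are commonly computed as the single-factor index P = C/S, the ratio of the measured concentration to a background or reference value, with class boundaries applied to P. The reference values and class boundaries in this sketch are illustrative assumptions, not the ones used in the study:

    ```python
    # Single-factor pollution index: P = C / S, where C is the measured
    # concentration and S a background/reference value. Both the reference
    # values and the class boundaries below are illustrative assumptions.
    REFERENCE = {"Cu": 30.0, "Pb": 35.0, "Cr": 60.0, "Ni": 40.0, "Cd": 0.3}  # mg/kg

    def pollution_index(metal, concentration):
        return concentration / REFERENCE[metal]

    def pollution_class(p):
        if p <= 1.0:
            return "low"
        if p <= 3.0:
            return "moderate"
        return "heavy"

    # Maximum concentrations reported in the abstract
    for metal, c in {"Cu": 240.59, "Pb": 27.47, "Cd": 0.63}.items():
        p = pollution_index(metal, c)
        print(f"{metal}: P = {p:.2f} ({pollution_class(p)})")
    ```

    Under these assumed references, the maximum Cu value falls in the heavy class while the maximum Pb value stays low, broadly consistent with the pattern the abstract describes.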

  19. Promises and pitfalls in environmentally extended input–output analysis for China: A survey of the literature

    International Nuclear Information System (INIS)

    As the world's largest developing economy, China plays a key role in global climate change and other environmental impacts of international concern. Environmentally extended input–output analysis (EE-IOA) is an important and insightful tool seeing widespread use in studying large-scale environmental impacts in China: calculating and analyzing greenhouse gas emissions, carbon and water footprints, pollution, and embodied energy. This paper surveys the published articles regarding EE-IOA for China in peer-reviewed journals and provides a comprehensive and quantitative overview of the body of literature, examining the research impact, environmental issues addressed, and data utilized. The paper further includes a discussion of the shortcomings in official Chinese data and of the potential means to move beyond its inherent limitations. - Highlights: • Articles in 2012–2013 more than doubled that published between 1995 and 2011. • CO2 and energy are the most common topics, frequently associated with trade. • Data from the National Bureau of Statistics is widely used but seen as flawed. • Climate change, water supply, and food security drive the future of the literature
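    At the core of EE-IOA is the Leontief demand-pull model: the emissions embodied in a final-demand vector y are e = f (I − A)⁻¹ y, where A is the technical-coefficient matrix and f the vector of direct emission intensities. A two-sector toy sketch, with all numbers invented for illustration:

    ```python
    # Environmentally extended input-output: embodied emissions e = f (I - A)^(-1) y.
    # Two-sector toy example; coefficients, intensities and demand are invented.

    A = [[0.2, 0.3],   # technical coefficients: inputs per unit of sectoral output
         [0.1, 0.4]]
    f = [1.5, 0.8]     # direct emission intensity per unit of output
    y = [100.0, 50.0]  # final demand by sector

    def leontief_inverse_2x2(A):
        """(I - A)^(-1) for a 2x2 coefficient matrix via the closed-form inverse."""
        a, b = 1 - A[0][0], -A[0][1]
        c, d = -A[1][0], 1 - A[1][1]
        det = a * d - b * c
        return [[d / det, -b / det], [-c / det, a / det]]

    L = leontief_inverse_2x2(A)
    x = [L[0][0] * y[0] + L[0][1] * y[1],   # total output required to satisfy y
         L[1][0] * y[0] + L[1][1] * y[1]]
    embodied = f[0] * x[0] + f[1] * x[1]    # direct plus indirect emissions
    print(round(embodied, 2))  # → 338.89
    ```

    Real studies use the same algebra with hundreds of sectors (and a general matrix inverse), which is why the quality of the underlying input-output tables matters so much for the literature surveyed here.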

  20. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  1. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    Science.gov (United States)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. Field offices need a standardized, scientifically sound methodology for local climate analysis (such as trend, composite, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice to the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. LCAT focuses on the local scale (as opposed to the national and global scales of CPC products). The LCAT will: -Improve the professional competency of local office staff and their expertise in providing local information to their users; LCAT will improve the quality of local climate services -Ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor; LCAT will allow improvement of CPC climate products -Allow testing of local climate variables beyond temperature averages and precipitation totals, such as climatology of

  2. Sensitivity analysis for Probabilistic Tsunami Hazard Assessment (PTHA)

    Science.gov (United States)

    Spada, M.; Basili, R.; Selva, J.; Lorito, S.; Sorensen, M. B.; Zonker, J.; Babeyko, A. Y.; Romano, F.; Piatanesi, A.; Tiberti, M.

    2012-12-01

    In modern societies, probabilistic hazard assessment of natural disasters is commonly used by decision makers for designing regulatory standards and, more generally, for prioritizing risk mitigation efforts. Systematic formalization of Probabilistic Tsunami Hazard Assessment (PTHA) has started only in recent years, mainly following the giant tsunami disaster of Sumatra in 2004. Typically, PTHA for earthquake sources exploits the long-standing practices developed in probabilistic seismic hazard assessment (PSHA), even though important differences are evident. In PTHA, for example, it is known that far-field sources are more important and that physical models for tsunami propagation are needed for the highly non-isotropic propagation of tsunami waves. However, considering the high impact that PTHA may have on societies, an important effort to quantify the effect of specific assumptions should be performed. Indeed, specific standard hypotheses made in PSHA may prove inappropriate for PTHA, since tsunami waves are sensitive to different aspects of sources (e.g. fault geometry, scaling laws, slip distribution) and propagate differently. In addition, the necessity of running an explicit calculation of wave propagation for every possible event (tsunami scenario) forces analysts to finding strategies for diminishing the computational burden. In this work, we test the sensitivity of hazard results with respect to several assumptions that are peculiar of PTHA and others that are commonly accepted in PSHA. Our case study is located in the central Mediterranean Sea and considers the Western Hellenic Arc as the earthquake source with Crete and Eastern Sicily as near-field and far-field target coasts, respectively. 
Our suite of sensitivity tests includes: a) comparison of random seismicity distribution within area sources as opposed to systematically distributed ruptures on fault sources; b) effects of statistical and physical parameters (a- and b-value, Mc, Mmax, scaling laws

  3. Regional hazard analysis for use in vulnerability and risk assessment

    Science.gov (United States)

    Maris, Fotis; Kitikidou, Kyriaki; Paparrizos, Spyridon; Karagiorgos, Konstantinos; Potouridis, Simeon; Fuchs, Sven

    2014-05-01

    A method for supporting an operational regional risk and vulnerability analysis for hydrological hazards is suggested and applied in the Island of Cyprus. The method aggregates the output of a hydrological flow model forced by observed temperatures and precipitations, with observed discharge data. A scheme supported by observed discharge is applied for model calibration. A comparison of different calibration schemes indicated that the same model parameters can be used for the entire country. In addition, it was demonstrated that, for operational purposes, it is sufficient to rely on a few stations. Model parameters were adjusted to account for land use and thus for vulnerability of elements at risk by comparing observed and simulated flow patterns, using all components of the hydrological model. The results can be used for regional risk and vulnerability analysis in order to increase the resilience of the affected population.

  4. Assessment of pore size distribution using image analysis

    Czech Academy of Sciences Publication Activity Database

    Doktor, Tomáš; Kytýř, Daniel; Valach, Jaroslav; Jiroušek, Ondřej

    Trieste: Italian Group of Fracture, 2010 - (Iacoviello, F.; Cosmi, F.), s. 155-157 ISBN 978-88-95940-30-4. [Youth Symposium on Experimental Solid Mechanics /9./. Trieste (IT), 07.07.2010-10.07.2010] R&D Projects: GA ČR(CZ) GAP105/10/2305 Institutional research plan: CEZ:AV0Z20710524 Keywords : pore size distribution * image analysis * micro-CT Subject RIV: JJ - Other Materials

  5. GLOBAL ANALYSIS OF AGRICULTURAL TRADE LIBERALIZATION: ASSESSING MODEL VALIDITY

    OpenAIRE

    Hertel, Thomas W.; Keeney, Roman; Valenzuela, Ernesto

    2004-01-01

    This paper presents a validation experiment of a global CGE trade model widely used for analysis of trade liberalization. We focus on the ability of the model to reproduce price volatility in wheat markets. The literature on model validation is reviewed with an eye towards designing an appropriate methodology for validating large scale CGE models. The validation experiment results indicate that in its current form, the GTAP-AGR model is incapable of reproducing wheat market price volatility a...

  6. Use Of Risk Analysis Frameworks In Urban Flood Assessments

    OpenAIRE

    Arnbjerg-Nielsen, Karsten; Madsen, Henrik

    2011-01-01

    In the period 1960–1990 rapid urban development took place all over Europe, and notably in Denmark urban sprawl occurred around many cities. Favorable economic conditions ensured that the urbanization continued, although at a lower rate, until recently. However, from 1990 to the present an increase in extreme precipitation has been observed, corresponding to an increase in design levels of at least 30 %. Analysis of climate change model output has given clear evidence that further increases in ...

  7. The Analysis and Assessment of the Credit Risk

    OpenAIRE

    Mirea Marioara; Aivaz Kamer Ainur

    2011-01-01

    The commercial banks' main operation is the granting of credits, which ranks first among their total investments. Any bank assumes risks to a certain extent when granting credits, and certainly all banks generally incur losses when some debtors fail to comply with their obligations. The level of assumed risks and the losses can be minimized if the credit operations are organized and managed in a professional manner. The paper grasps the moment of the analysis process preceding the...

  8. Assessment of quality control approaches for metagenomic data analysis

    Science.gov (United States)

    Zhou, Qian; Su, Xiaoquan; Ning, Kang

    2014-11-01

    Currently there is an explosive increase in next-generation sequencing (NGS) projects and related datasets, which have to be processed by quality control (QC) procedures before they can be utilized for omics analysis. A QC procedure usually includes identification and filtration of sequencing artifacts such as low-quality reads and contaminating reads, which would significantly affect and sometimes mislead downstream analysis. Quality control of NGS data for microbial communities is especially challenging. In this work, we have evaluated and compared the performance and effects of various QC pipelines on different types of metagenomic NGS data and from different angles, based on which general principles for using QC pipelines are proposed. Results based on both simulated and real metagenomic datasets have shown that: firstly, QC-Chain is superior in its ability to identify contamination in metagenomic NGS datasets of different complexities, with high sensitivity and specificity. Secondly, its high-performance computing engine enabled QC-Chain to achieve a significant reduction in processing time compared to other pipelines based on serial computing. Thirdly, QC-Chain could outperform other tools in benefiting downstream metagenomic data analysis.

  9. An assessment of unstructured grid technology for timely CFD analysis

    Science.gov (United States)

    Kinard, Tom A.; Schabowski, Deanne M.

    1995-01-01

    An assessment of two unstructured methods is presented in this paper. A tetrahedral unstructured method, USM3D, developed at NASA Langley Research Center, is compared to a Cartesian unstructured method, SPLITFLOW, developed at Lockheed Fort Worth Company. USM3D is an upwind finite volume solver that accepts grids generated primarily by the Vgrid grid generator. SPLITFLOW combines an unstructured grid generator with an implicit flow solver in one package. Both methods are exercised on three test cases: a wing, a wing body, and a fully expanded nozzle. The results for the first two runs are included here and compared to the structured grid method TEAM and to available test data. For each test case, the setup procedure is described, including any difficulties that were encountered. Detailed descriptions of the solvers are not included in this paper.

  10. Fuel assembly assessment from CVD image analysis: A feasibility study

    International Nuclear Information System (INIS)

    The Swedish Nuclear Inspectorate commissioned a feasibility study of automatic assessment of fuel assemblies from images obtained with the digital Cerenkov viewing device currently in development. The goal is to assist the IAEA inspectors in evaluating the fuel since they typically have only a few seconds to inspect an assembly. We report results here in two main areas: Investigation of basic image processing and recognition techniques needed to enhance the images and find the assembly in the image; Study of the properties of the distributions of light from the assemblies to determine whether they provide unique signatures for different burn-up and cooling times for real fuel or indicate presence of non-fuel. 8 refs, 27 figs

  11. Wind resource assessment and siting analysis in Italy

    International Nuclear Information System (INIS)

    Recently, the wind power industry has matured; consequently, many countries have planned a large number of wind energy applications, many of which are already built and running. There is therefore a direct need to identify a sizeable number of wind power plant sites. Choosing the right sites to match specific Wind Energy Conversion Systems (WECS) is also needed to harness this clean energy from the points of view of industrial viability and project financing. As a prerequisite to installing a wind turbine at a particular site, it is necessary to know the theoretical wind energy available at the site, as well as the practicability of the design in matching the characteristics of the WECS. This paper presents ENEA (Italian National Agency for New Technology, Energy and Environment) wind siting and resource assessment activities, currently ongoing in different regions of Italy, along with the present status and future prospects of the wind power industry.

  12. Assessment and analysis components of physical fitness of students

    Directory of Open Access Journals (Sweden)

    Kashuba V.A.

    2012-08-01

    Full Text Available Components of students' physical fitness are assessed, and the internal and external factors affecting students' quality of life are analyzed. The study involved more than 200 students. It was found that students represent a category of people with elevated risk factors, which include nervous and mental tension and constant violations of the regimes of food, work and leisure; their way of life also shows a lack of care for their own health. It is noted that the existing approaches to promoting students' physical fitness are inefficient and require the development and implementation of brand-new contemporary theoretical foundations and practical approaches to the problem of increasing students' activity. It is argued that the forms, methods and learning tools used today in the practice of higher education do not fully ensure the implementation of approaches to promoting students' physical fitness and do not meet the requirements for the preparation of the modern health professional.

  13. Image analysis for dental bone quality assessment using CBCT imaging

    Science.gov (United States)

    Suprijanto; Epsilawati, L.; Hajarini, M. S.; Juliastuti, E.; Susanti, H.

    2016-03-01

    Cone beam computerized tomography (CBCT) is one of the X-ray imaging modalities applied in dentistry. It can visualize the oral region in 3D at high resolution. The CBCT jaw image carries potential information for the assessment of bone quality, which is often used for pre-operative implant planning. We propose a comparison method based on the normalized histogram (NH) of the inter-dental septum and premolar teeth regions. The NH characteristics of normal and abnormal bone conditions are then compared and analyzed. Four test parameters are proposed: the difference between the teeth and bone average intensities (s), the ratio between the bone and teeth average intensities (n), the difference between the teeth and bone peak values of the NH (Δp), and the ratio between the teeth and bone NH ranges (r). The results showed that n, s, and Δp have the potential to serve as classification parameters of dental calcium density.
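    The four parameters can be computed from grey-level samples of the two regions. The definitions below (region means, NH peak values, NH ranges) are our reading of the abstract, not the paper's exact formulas, and the grey levels are invented:

    ```python
    # Sketch of the four test parameters s, n, dp, r from the abstract, computed
    # from grey-level samples of the teeth and inter-dental bone regions.
    # The precise definitions in the paper may differ; this is an assumption.

    def normalized_histogram(values, bins=256):
        h = [0] * bins
        for v in values:
            h[v] += 1
        total = len(values)
        return [count / total for count in h]  # bins sum to 1

    def bone_quality_params(teeth, bone):
        mean_t = sum(teeth) / len(teeth)
        mean_b = sum(bone) / len(bone)
        ht, hb = normalized_histogram(teeth), normalized_histogram(bone)
        s = mean_t - mean_b                # difference of average intensities
        n = mean_b / mean_t                # ratio of bone to teeth average intensity
        dp = max(ht) - max(hb)             # difference of NH peak values
        r = (max(teeth) - min(teeth)) / (max(bone) - min(bone))  # ratio of NH ranges
        return s, n, dp, r

    teeth = [200, 210, 220, 210, 200, 215]  # illustrative grey levels
    bone = [120, 130, 140, 120, 135, 125]
    print(bone_quality_params(teeth, bone))
    ```

    Dense teeth enamel should dominate the bone grey levels, so for a normal jaw one expects s to be clearly positive and n well below 1.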

  14. Game theoretic analysis of environmental impact assessment system in China

    Institute of Scientific and Technical Information of China (English)

    CHENG Hongguang; QI Ye; PU Xiao; GONG Li

    2007-01-01

    An environmental impact assessment (EIA) system has been established in China since 1973. In present EIA cases, there are four participants in general: governments, enterprises, EIA organizations and the public. The public is held responsible for both social costs and social duties, and supervises the social costs produced by enterprises discharging pollutants in EIA. However, public participation is mostly deputized by governments, which severely weakens the independence of the public as a participant in EIA. In this paper, EIA refers to the different attitudes of the participants, whose optional strategies may be described by a proper game model. In view of the shortcomings of EIA, a multi-phase dynamic iterative game among three parties (governments, enterprises, and EIA organizations) is established, drawing on iterative game theory, dynamic games of incomplete information, and perfect Bayesian equilibrium theory, to analyze the reciprocal relations among governments, EIA organizations and enterprises. The results show that in the short run, economic benefit is preponderant over social benefit: governments and enterprises both do not want EIA to reveal social costs, EIA organizations' income comes from enterprises, and collusions are built between them to vindicate economic benefit. In the long run, social benefit losses caused by environmental pollution must be recuperated sooner or later, and environmental deterioration will influence the achievement of economic benefit, so both governments and enterprises will eventually pursue high social benefit and be willing to take EIA, which helps increase private benefit. EIA organizations will make fair assessments when their economic benefit is ensured. At present, the public, as silent victims, cannot take actual part in EIA. The EIA system must be improved to break the present three-sided equilibrium and bring the public into the equilibrium so as to exert public supervision.

  15. An Assessment of Image Analysis of Longitudinal Bone Changes

    International Nuclear Information System (INIS)

This study was performed to assess analyzing methods developed to detect longitudinal bone changes clinically and quantitatively. In a preliminary experiment, the accuracy of converting Cu-equivalent (Cu-Eq) values to the mass of HA was examined. For the main experiment, 15 intraoral radiographs, taken immediately and at 1, 2, 4, and 6 weeks after implantation of the mixture into the extraction sites of 3 cases, were used. The radiographs were taken with a copper step wedge as a test object and an HA phantom. X-ray exposure was standardized by using a Rinn XCP device customized directly to the individual dentition with a resin bite block. The images, input into a computer with a Quick scanner, were digitized and analyzed with the NIH Image program, and the stability of the copper-equivalent transformation and the usefulness of two analyzing methods, by ROI and by reslice, were examined. The results obtained were as follows: 1. On the Cu-equivalent images, the coefficient of variation in the measurement of the Cu-Eq value of an ROI ranged from 0.05 to 0.24, showing high reproducibility. 2. All results obtained from resliced contiguous images coincided with those obtained from assessment by ROI and formation of plot profiles. 3. On the image stacked and resliced along the line of interest, longitudinal changes at several locations could be analyzed directly and quantitatively by plot profile and qualitatively by surface plot. 4. The implant area showed marked resorption until 2 weeks after implantation and a significant increase in Cu-Eq value at the 6th week (P<0.01), and the periapical area showed an increase in Cu-Eq value at the 6th week compared with the immediate post-operative value.
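
The copper-equivalent conversion at the heart of the method can be sketched as a calibration against the step wedge imaged on the same film: a measured gray value is interpolated between the gray values of adjacent wedge steps. The step thicknesses and gray values below are hypothetical, not the study's data:

```python
# Sketch: converting a measured gray value to a copper-equivalent (Cu-Eq)
# thickness by linear interpolation against a copper step wedge imaged on the
# same film.  Step thicknesses and gray values are hypothetical.
wedge = [  # (Cu thickness in mm, mean gray value in the step's ROI)
    (0.0, 210.0), (0.1, 180.0), (0.2, 155.0), (0.3, 135.0), (0.4, 120.0),
]

def cu_equivalent(gray):
    """Interpolate a gray value to mm of copper (gray decreases with thickness)."""
    for (t0, g0), (t1, g1) in zip(wedge, wedge[1:]):
        if g1 <= gray <= g0:
            return t0 + (t1 - t0) * (g0 - gray) / (g0 - g1)
    raise ValueError("gray value outside the wedge calibration range")

print(round(cu_equivalent(167.5), 3))  # midway between the 0.1 and 0.2 mm steps
```

Repeating the measurement over an ROI and dividing the standard deviation by the mean gives the coefficient of variation reported in result 1.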

  16. Alternative model for assessment administration and analysis: An example from the E-CLASS

    CERN Document Server

    Wilcox, Bethany R; Hobbs, Robert D; Aiken, John M; Welch, Nathan M; Lewandowski, H J

    2016-01-01

    The primary model for dissemination of conceptual and attitudinal assessments that has been used within the physics education research (PER) community is to create a high quality, validated assessment, make it available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model also provides a greater degree of support for both researchers and instructors. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof-of-concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges t...

  17. Analysis of uranium ore concentrates for origin assessment

    International Nuclear Information System (INIS)

In this study the most important analytical methodologies for the nuclear forensic investigation of uranium ore concentrates (yellow cakes) are presented. These methodologies allow the measurement of characteristic parameters that may be inherited from the source material or from the process. By combining the various techniques (e.g. infrared spectrometry, impurity content, rare-earth patterns, and U, Sr and Pb isotope-ratio analysis by mass spectrometry), the possible provenances of an illicit material can be narrowed down to a few options and its declared origin can be verified. The methodologies serve nuclear forensic investigations as well as nuclear safeguards, checking the consistency of information. (orig.)

  18. High strength bolt failure analysis and integrity assessment. Lessons learned

    International Nuclear Information System (INIS)

Isolated failures have occurred in high-strength bolting used in pressurized water reactor (PWR) component support applications. The U.S. nuclear industry's component support bolting failure experience is described in this paper, focusing on materials intentionally specified as ''ultra-high-strength'' (minimum specified yield strength greater than 1034 MPa). The analysis and investigation of fabrication-induced problems with a bolt made from Carpenter Technology Alloy ''Custom 455'' (ASTM A 564 XM-16), a proprietary material, are detailed, and the measures taken to assure the integrity of these bolts during operation are discussed. Lessons learned to preclude future problems are presented as conclusions

  19. Assessment of the Prony's method for BWR stability analysis

    International Nuclear Information System (INIS)

Highlights: → This paper describes a method to determine the degree of stability of a BWR. → Performance comparison between Prony's and common AR techniques is presented. → Benchmark data and actual BWR transient data are used for comparison. → DR and f results are presented and discussed. → The Prony's method is shown to be a robust technique for BWR stability. - Abstract: It is known that Boiling Water Reactors are susceptible to power oscillations in regions of high power and low coolant flow on the power-flow operational map. It is possible to fall into one of these instability regions during reactor startup, since power and coolant flow are both being increased, but not proportionally. Another way of entering those regions is a trip of the recirculation pumps. Stability monitoring in such cases can be difficult, because the amount or quality of power-signal data required for calculating the key stability parameters may not be enough to provide reliable results in an adequate time range. In this work, Prony's method is presented as a complementary alternative for determining the degree of stability of a BWR from time-series data. This analysis method can provide the decay ratio and oscillation frequency from power signals obtained during transient events. However, few applications to Boiling Water Reactor operation have so far been reported and supported well enough to establish the scope of such analysis for actual transient events. This work first compares the decay ratio and oscillation frequency obtained by Prony's method with the results obtained by the participants of the Forsmark 1 and 2 Boiling Water Reactor Stability Benchmark using diverse techniques. Then decay ratio and oscillation frequency results are compared for four real BWR transient event data sets, using Prony's method and two other techniques based on autoregressive modeling.
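
The decay-ratio and frequency estimation can be sketched as an order-2 Prony fit (linear prediction) on a synthetic power signal; the sampling rate and signal parameters below are illustrative, not benchmark values. The dominant pole z = r·e^(iθ) of the fitted recursion gives the decay ratio r^(2π/θ) (amplitude ratio over one oscillation period) and the frequency θ/(2πΔt):

```python
import cmath, math

def prony2(y, dt):
    """Fit y[n] = a1*y[n-1] + a2*y[n-2] (order-2 Prony / linear prediction)
    and return (decay_ratio, frequency_hz) of the dominant oscillation."""
    # Least-squares normal equations for (a1, a2) over all prediction equations.
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(y)):
        x1, x2 = y[n-1], y[n-2]
        s11 += x1*x1; s12 += x1*x2; s22 += x2*x2
        b1 += x1*y[n]; b2 += x2*y[n]
    det = s11*s22 - s12*s12
    a1 = (b1*s22 - b2*s12) / det
    a2 = (s11*b2 - s12*b1) / det
    # Poles are the roots of z^2 - a1*z - a2 = 0; take the upper-half-plane one.
    z = (a1 + cmath.sqrt(a1*a1 + 4*a2)) / 2
    r, theta = abs(z), abs(cmath.phase(z))
    return r ** (2*math.pi/theta), theta / (2*math.pi*dt)

# Synthetic power signal: a 0.5 Hz oscillation whose amplitude shrinks by a
# factor 0.8 per cycle (decay ratio 0.8), sampled at 25 Hz.
dt, f, dr = 0.04, 0.5, 0.8
sigma = math.log(dr) * f                      # continuous-time damping rate
y = [math.exp(sigma*n*dt) * math.cos(2*math.pi*f*n*dt) for n in range(500)]
est_dr, est_f = prony2(y, dt)
print(round(est_dr, 3), round(est_f, 3))      # recovers 0.8 and 0.5
```

Real plant signals are noisy, so practical implementations use higher model orders and keep only the least-damped oscillatory pole pair; this noise-free sketch shows the mechanics only.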

  20. Assessing Canadian Bank Branch Operating Efficiency Using Data Envelopment Analysis

    Science.gov (United States)

    Yang, Zijiang

    2009-10-01

In today's economy and society, performance analysis in the service industries attracts more and more attention. This paper presents an evaluation of 240 branches of one large Canadian bank in the Greater Toronto Area using Data Envelopment Analysis (DEA). Special emphasis was placed on how to present the DEA results to management so as to provide more guidance on what to manage and how to accomplish the changes. Finally, the potential management uses of the DEA results are presented. All findings are discussed in the context of the Canadian banking market.
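
In the single-input, single-output special case, a branch's CCR efficiency reduces to its output/input ratio divided by the best ratio observed, which allows a compact sketch; the general multi-input, multi-output case requires solving one linear program per branch. The branch figures below are hypothetical:

```python
# Minimal DEA sketch for the single-input, single-output case, where CCR
# efficiency is each unit's output/input ratio scaled by the best ratio.
# Branch figures are hypothetical, not data from the paper.
branches = {  # name: (input: staff FTEs, output: transactions per day)
    "A": (10, 900), "B": (8, 840), "C": (12, 600), "D": (5, 500),
}

def dea_efficiency(units):
    best = max(out / inp for inp, out in units.values())
    return {name: (out / inp) / best for name, (inp, out) in units.items()}

for name, eff in sorted(dea_efficiency(branches).items()):
    print(f"branch {name}: efficiency {eff:.2f}")   # B defines the frontier
```

A score of 1.0 places the branch on the efficient frontier; scores below 1.0 quantify how much input could be saved at the observed output level, which is the kind of actionable guidance the paper emphasizes presenting to management.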

  1. Using Benefit-Cost Analysis to Assess Child Abuse Prevention and Intervention Programs.

    Science.gov (United States)

    Plotnick, Robert D.; Deppman, Laurie

    1999-01-01

    Presents a case for using benefit-cost analysis to structure evaluations of child-abuse prevention and intervention programs. Presents the basic concept of benefit-cost analysis, its application in the context of assessing these types of child welfare programs, and limitations on its application to social service programs. (Author)
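
The core benefit-cost computation the authors advocate can be sketched as discounting yearly benefit and cost streams to present value and forming their ratio. The streams, discount rate, and benefit categories below are hypothetical illustrations, not program data:

```python
# Sketch of a benefit-cost ratio: discount yearly benefit and cost streams to
# present value, then divide.  All figures are hypothetical.
def present_value(stream, rate):
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream))

benefits = [0, 40_000, 60_000, 80_000]        # e.g. avoided downstream costs per year
costs    = [100_000, 10_000, 10_000, 10_000]  # program delivery per year
rate = 0.03                                   # assumed social discount rate

bcr = present_value(benefits, rate) / present_value(costs, rate)
print(f"benefit-cost ratio: {bcr:.2f}")  # > 1 means discounted benefits exceed costs
```

As the abstract's limitations suggest, the hard part in social programs is not this arithmetic but valuing the benefit stream itself.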

  2. 77 FR 48107 - Workshop on Performance Assessments of Near-Surface Disposal Facilities: FEPs Analysis, Scenario...

    Science.gov (United States)

    2012-08-13

    ...-Surface Disposal Facilities: FEPs Analysis, Scenario and Conceptual Model Development, and Code Selection... Radioactive Waste.'' These regulations were published in the Federal Register on December 27, 1982 (47 FR... on three aspects of a performance assessment: (1) Features, Events, and Processes (FEPs) analysis,...

  3. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    International Nuclear Information System (INIS)

The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complete models, such as those concerning geologic disposal of radioactive waste. The study, which has run parallel with the development of a code package (PROPER) for computer-assisted analysis, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors, like the number of input parameters, the capacity of the model and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these, the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. In the other, the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)
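
The two basic approaches can be contrasted on a toy model: direct Monte Carlo simulation of the model versus a first-order response surface fitted at the nominal point, whose variance can then be propagated analytically. The model and parameter distributions below are illustrative only:

```python
import random, math, statistics

random.seed(1)

def model(x1, x2):
    """Toy performance measure standing in for an expensive repository model."""
    return x1 * math.exp(0.5 * x2)

mu1, s1 = 2.0, 0.2
mu2, s2 = 1.0, 0.1

# Approach 1: direct simulation -- sample inputs, run the model each time.
ys = [model(random.gauss(mu1, s1), random.gauss(mu2, s2)) for _ in range(20000)]
mc_std = statistics.stdev(ys)

# Approach 2: replace the model by a first-order response surface at the
# nominal point (central-difference gradients) and propagate variance.
h = 1e-5
g1 = (model(mu1 + h, mu2) - model(mu1 - h, mu2)) / (2*h)
g2 = (model(mu1, mu2 + h) - model(mu1, mu2 - h)) / (2*h)
rs_std = math.sqrt((g1*s1)**2 + (g2*s2)**2)

print(round(mc_std, 3), round(rs_std, 3))  # the two estimates roughly agree
```

The response surface needs only a handful of model runs but inherits the linearization error; direct sampling is unbiased but costs one model run per sample, which is exactly the trade-off the report's choice of method turns on.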

  4. Application of importance analysis probabilistic safety assessment results of Tehran Research Reactor

    International Nuclear Information System (INIS)

The application of probabilistic safety assessment to evaluate the safety of hazardous facilities is only fulfilled when the results are processed meaningfully. The purpose of importance analysis is to identify the major contributors to core damage frequency, which may include accident initiators, system failures, component failures and human errors. In this paper, the Fussell-Vesely measure of importance was applied to the results of the probabilistic safety assessment study of the Tehran Research Reactor. The analysis was done using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software
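
Under the rare-event approximation, the Fussell-Vesely importance of a basic event is the fraction of total core damage frequency contributed by minimal cut sets containing that event. A minimal sketch with hypothetical basic events and cut sets (not Tehran Research Reactor data):

```python
# Fussell-Vesely importance from minimal cut sets, rare-event approximation.
# Basic events, probabilities, and cut sets are hypothetical.
basic_events = {"pump_fails": 1e-3, "valve_fails": 5e-4,
                "operator_error": 1e-2, "power_loss": 2e-4}

minimal_cut_sets = [
    {"pump_fails", "operator_error"},
    {"valve_fails"},
    {"power_loss", "operator_error"},
]

def cut_set_freq(cs):
    p = 1.0
    for ev in cs:
        p *= basic_events[ev]
    return p

total = sum(cut_set_freq(cs) for cs in minimal_cut_sets)  # rare-event CDF

def fussell_vesely(event):
    """Fraction of total risk carried by cut sets containing the event."""
    return sum(cut_set_freq(cs) for cs in minimal_cut_sets if event in cs) / total

for ev in basic_events:
    print(f"{ev}: FV = {fussell_vesely(ev):.3f}")
```

Events with high FV values are the major contributors the abstract refers to; tools such as SAPHIRE perform this same computation over the full cut-set list of the plant model.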

  5. The analysis of financial statements as approach to the assessment of financial stability of the enterprise

    Directory of Open Access Journals (Sweden)

    Y.E. Bezborodova

    2013-04-01

Full Text Available In this article, some approaches to assessing the financial stability of enterprises through the analysis of financial statements are considered. Under market conditions, an enterprise's financial condition is the basis of its stable position. The analysis of an enterprise's financial condition is one of the most important elements of a control system, as it reveals problem areas in the enterprise's activity through the assessment of financial stability, solvency and liquidity, and helps define ways of resolving them.
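
A minimal sketch of the ratio analysis such an assessment rests on, computing standard liquidity and solvency indicators from statement items; the balance-sheet figures and rule-of-thumb thresholds below are hypothetical illustrations:

```python
# Ratio analysis sketch on hypothetical balance-sheet figures:
# liquidity (current and quick ratios) and solvency (debt-to-equity).
statement = {
    "current_assets": 420_000, "inventory": 150_000,
    "current_liabilities": 260_000, "total_liabilities": 610_000,
    "equity": 540_000,
}

current_ratio = statement["current_assets"] / statement["current_liabilities"]
quick_ratio = ((statement["current_assets"] - statement["inventory"])
               / statement["current_liabilities"])
debt_to_equity = statement["total_liabilities"] / statement["equity"]

print(f"current ratio  {current_ratio:.2f}  (liquidity)")
print(f"quick ratio    {quick_ratio:.2f}  (liquidity excluding inventory)")
print(f"debt/equity    {debt_to_equity:.2f}  (solvency; lower is more stable)")
```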

  6. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  7. Alternative model for administration and analysis of research-based assessments

    Science.gov (United States)

    Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.

    2016-06-01

    Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.

  8. Assessment of sperm quality using monoclonal antibodies and proteomic analysis

    Czech Academy of Sciences Publication Activity Database

    Čapková, Jana; Kubátová, Alena; Margaryan, Hasmik; Pěknicová, Jana

    Praha: Biotechnologický ústav v.v AVČR, 2011 - (Pěknicová, J.). s. 63-63 [XVII. symposium českých reprodukčních imunologů s mezinárodní účastí. 26.05.2011-29.05.2011, Žďár nad Sázavou] R&D Projects: GA ČR(CZ) GA523/09/1793; GA ČR(CZ) GA523/08/H064; GA MŠk(CZ) 1M06011; GA MZd(CZ) NS10009 Institutional research plan: CEZ:AV0Z50520701 Keywords : sperm parameters * proteomic analysis * 2D PAGE * mass spectrometry Subject RIV: CE - Biochemistry

  9. In-field analysis and assessment of nuclear material

    International Nuclear Information System (INIS)

    Los Alamos National Laboratory has actively developed and implemented a number of instruments to monitor, detect, and analyze nuclear materials in the field. Many of these technologies, developed under existing US Department of Energy programs, can also be used to effectively interdict nuclear materials smuggled across or within national borders. In particular, two instruments are suitable for immediate implementation: the NAVI-2, a hand-held gamma-ray and neutron system for the detection and rapid identification of radioactive materials, and the portable mass spectrometer for the rapid analysis of minute quantities of radioactive materials. Both instruments provide not only critical information about the characteristics of the nuclear material for law-enforcement agencies and national authorities but also supply health and safety information for personnel handling the suspect materials

  10. Analysis And Assessment Of The Security Method Against Incidental Contamination In The Collective Water Supply System

    OpenAIRE

    Szpak Dawid; Tchórzewska – Cieślak Barbara

    2015-01-01

The paper presents the main types of incidental contamination of surface water and methods of securing water sources against incidental contamination. An analysis and assessment of the protection of the collective water supply system (CWSS) against incidental contamination was conducted. Failure Mode and Effects Analysis (FMEA) was used. The FMEA method allows analysis of the product or process, identification of weak points, and implementation of corrections and new solutions for eliminating the sou...
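
The FMEA prioritization step can be sketched as computing a risk priority number, RPN = severity × occurrence × detection, on 1-10 scales and ranking failure modes by it. The failure modes and scores below are illustrative, not the paper's:

```python
# FMEA sketch for contamination hazards in a collective water supply system:
# RPN = severity x occurrence x detectability (1-10 scales).  The failure
# modes and scores below are hypothetical.
failure_modes = [
    # (failure mode, severity, occurrence, detectability)
    ("industrial spill upstream of intake", 9, 3, 4),
    ("agricultural runoff after heavy rain", 6, 6, 5),
    ("pipeline break drawing in groundwater", 7, 2, 3),
]

def rpn(sev, occ, det):
    return sev * occ * det

ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):3d}  {name}")
```

The highest-RPN modes are the weak points for which corrections and new protective solutions are proposed first.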

  11. Assessment of Water Quality Parameters by Using the Multidimensional Scaling Analysis

    OpenAIRE

    Suheyla Yerel; Huseyin Ankara

    2010-01-01

The surface water quality parameters of the western Black Sea region of Turkey were assessed by using multidimensional scaling analysis. This technique was applied to the surface water quality parameters obtained from five monitoring stations. Multidimensional scaling analysis showed that Cl-, SO42-, Na+ and BOD5 are the most important parameters causing differences among the monitoring stations. These results indicate effects of domestic waste and organic pollution on the surfa...

  12. Rorschach assessment of traumatized refugees: an exploratory factor analysis.

    Science.gov (United States)

    Opaas, Marianne; Hartmann, Ellen

    2013-01-01

    Fifty-one multitraumatized mental health patients with refugee backgrounds completed the Rorschach (Meyer & Viglione, 2008), Harvard Trauma Questionnaire, and Hopkins Symptom Checklist-25 (Mollica, McDonald, Massagli, & Silove, 2004), and the World Health Organization Quality of Life-BREF questionnaire (WHOQOL Group, 1998) before the start of treatment. The purpose was to gain more in-depth knowledge of an understudied patient group and to provide a prospective basis for later analyses of treatment outcome. Factor analysis of trauma-related Rorschach variables gave 2 components explaining 60% of the variance; the first was interpreted as trauma-related flooding versus constriction and the second as adequate versus impaired reality testing. Component 1 correlated positively with self-reported reexperiencing symptoms of posttraumatic stress (r = .32, p < .05). Component 2 correlated positively with self-reported quality of life in the physical, psychological, and social relationships domains (r = .34, .32, and .35, p < .05), and negatively with anxiety (r = -.33, p < .05). Each component also correlated significantly with resources like work experience, education, and language skills. PMID:23570250

  13. X-ray quality assessment by MTF analysis

    International Nuclear Information System (INIS)

In a previous study a lucite phantom with several physical elements embedded, such as lead gratings with varying line widths, was exposed at 200 X-ray installations in Bavaria, Federal Republic of Germany, by a conventional diagnostic standard technique. One of the parameters investigated was the local resolution achieved, determined visually by examination of the grating images. The same radiographs have now been used in a retrospective comparative study based upon a quantitative analysis of TV images of the films. The films were positioned on a commercial illuminator screen and viewed by a TV camera through a simple magnifying optical system. The video signals were digitised, resulting in a pixel distance of 50 μm. In a first approach, the edges of the broad frames of the lead gratings were aligned vertically and the normalised sum of all 256 TV scanning lines was taken as the edge function f(x). This was differentiated and suitably Fourier-transformed, yielding the modulation transfer function (MTF). The MTF can be analysed in several ways to describe quantitatively the maximum local resolution achievable. Correlation of some frequency measurements with the visually determined line resolution is generally good. (author)
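
The MTF computation described above (differentiate the edge function, then Fourier-transform) can be sketched on a synthetic Gaussian-blurred edge. The blur width is an assumption; for a Gaussian the result can be checked against the closed form exp(-2(πfσ)²):

```python
import math

# Edge-spread function (ESF) -> line-spread function (LSF) -> MTF.
# The Gaussian-blurred edge is synthetic; real data would come from the
# digitised film.  sigma_mm is an assumed blur of the imaging chain.
pixel_mm = 0.05                      # 50 um pixel spacing, as in the study
sigma_mm = 0.1

def esf(x_mm):                       # Gaussian-blurred ideal edge
    return 0.5 * (1 + math.erf(x_mm / (sigma_mm * math.sqrt(2))))

n = 256
edge = [esf((i - n//2) * pixel_mm) for i in range(n)]
lsf = [edge[i+1] - edge[i] for i in range(n - 1)]   # numerical derivative

def mtf(freq_lpmm):
    """|Fourier transform of the LSF|, normalised to 1 at zero frequency."""
    re = sum(v * math.cos(2*math.pi*freq_lpmm * i*pixel_mm) for i, v in enumerate(lsf))
    im = sum(v * math.sin(2*math.pi*freq_lpmm * i*pixel_mm) for i, v in enumerate(lsf))
    return math.hypot(re, im) / sum(lsf)

# For a Gaussian LSF the analytic MTF is exp(-2*(pi*f*sigma)^2); compare at 1 lp/mm.
f = 1.0
print(round(mtf(f), 3), round(math.exp(-2 * (math.pi*f*sigma_mm)**2), 3))
```

The limiting resolution can then be read off as the frequency where the MTF falls below a chosen threshold (e.g. 10%), which is what gets correlated with the visually determined line resolution.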

  14. Digital image analysis outperforms manual biomarker assessment in breast cancer.

    Science.gov (United States)

    Stålhammar, Gustav; Fuentes Martinez, Nelson; Lippert, Michael; Tobin, Nicholas P; Mølholm, Ida; Kis, Lorand; Rosin, Gustaf; Rantalainen, Mattias; Pedersen, Lars; Bergh, Jonas; Grunkin, Michael; Hartman, Johan

    2016-04-01

    In the spectrum of breast cancers, categorization according to the four gene expression-based subtypes 'Luminal A,' 'Luminal B,' 'HER2-enriched,' and 'Basal-like' is the method of choice for prognostic and predictive value. As gene expression assays are not yet universally available, routine immunohistochemical stains act as surrogate markers for these subtypes. Thus, congruence of surrogate markers and gene expression tests is of utmost importance. In this study, 3 cohorts of primary breast cancer specimens (total n=436) with up to 28 years of survival data were scored for Ki67, ER, PR, and HER2 status manually and by digital image analysis (DIA). The results were then compared for sensitivity and specificity for the Luminal B subtype, concordance to PAM50 assays in subtype classification and prognostic power. The DIA system used was the Visiopharm Integrator System. DIA outperformed manual scoring in terms of sensitivity and specificity for the Luminal B subtype, widely considered the most challenging distinction in surrogate subclassification, and produced slightly better concordance and Cohen's κ agreement with PAM50 gene expression assays. Manual biomarker scores and DIA essentially matched each other for Cox regression hazard ratios for all-cause mortality. When the Nottingham combined histologic grade (Elston-Ellis) was used as a prognostic surrogate, stronger Spearman's rank-order correlations were produced by DIA. Prognostic value of Ki67 scores in terms of likelihood ratio χ(2) (LR χ(2)) was higher for DIA that also added significantly more prognostic information to the manual scores (LR-Δχ(2)). In conclusion, the system for DIA evaluated here was in most aspects a superior alternative to manual biomarker scoring. It also has the potential to reduce time consumption for pathologists, as many of the steps in the workflow are either automatic or feasible to manage without pathological expertise. PMID:26916072
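
Cohen's κ, used above to quantify subtype agreement with the PAM50 assays, corrects the observed agreement for the agreement expected by chance from each rater's marginal frequencies. A self-contained sketch with hypothetical subtype labels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two categorical raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[c]/n * pb[c]/n for c in pa)   # chance agreement
    return (observed - expected) / (1 - expected)

# Hypothetical subtype calls for 8 tumors (manual scoring vs digital image analysis).
manual = ["LumA", "LumB", "LumA", "Basal", "LumB", "HER2", "LumA", "LumB"]
dia    = ["LumA", "LumB", "LumB", "Basal", "LumB", "HER2", "LumA", "LumA"]
print(round(cohens_kappa(manual, dia), 3))   # 6/8 raw agreement, kappa ~0.64
```

Values near 1 indicate agreement well beyond chance; in the study, DIA produced a slightly higher κ against PAM50 than manual scoring did.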

  15. FEBEX II Project Post-mortem analysis EDZ assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-07-01

variations in the propagation velocities of acoustic waves. A cylindrical block of granite 38.8 cm in diameter and 40 cm high was analysed along 2D transversal sections in six radial directions. Different inverse tomographic strategies were used to analyse the measured data, which showed no evidence of the existence of an EDZ in the FEBEX gallery. However, a preferential direction of wave propagation, similar to the maximum compression direction of the stress tensor, was observed. As for in situ investigations, the hydraulic connectivity of the drift was assessed at eleven locations in the heated area, including the granite matrix and lamprophyre dykes, and at six locations in undisturbed zones. In the granite matrix area, pulse tests using pressurised air with stepwise pressure increases were conducted to determine the gas entry pressure. In the fractured area, a constant-flow-rate gas injection test was conducted. Only two locations with higher permeability were detected: one in a natural fracture in the lamprophyre dyke and the other at the interface between lamprophyre and granite. As far as numerical investigations are concerned, several analyses of the FEBEX in situ experiment were carried out to determine whether the generation of a potential EDZ in the surrounding rock was possible. Stresses were calculated by a 1D fully coupled thermo-hydromechanical model and by 2D and 3D thermo-mechanical models. Results compared with the available data on the compressive strength of the Grimsel granite show that in the worst case studied, the state of stresses induced by the excavation and heating phases remains far below the critical curve. (Author)

  16. Interconnectivity among Assessments from Rating Agencies: Using Cluster and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jaroslav Krejčíř

    2014-09-01

Full Text Available The aim of this paper is to determine whether there is a dependency among leading rating agencies' assessments. Rating agencies are an important part of the global economy, and great attention has been paid to their activities since the financial crisis of 2007, one of whose main causes was identified as the credit rating agencies. This paper focuses on the existence of mutual interconnectivity among the assessments of three leading rating agencies. The method used for this determination is based on cluster analysis, followed by correlation analysis and a test of independence. The credit rating assessments of Greece and Spain were chosen for the determination of this mutual interconnectivity, as these countries are the most discussed euro-area countries. A significant dependence among the assessments from different rating agencies has been demonstrated.

  17. An Analysis of the Cumulative Uncertainty Associated with a Quantitative Consequence Assessment of a Major Accident

    OpenAIRE

    JIRSA PAVEL

    2005-01-01

The task of the article is to quantify the uncertainty of the possible results of an accident consequence assessment for a chemical production plant and to describe potential problems, with literature references and examples, to help avoid the erroneous use of available formulas. Based on the numbers presented in the article, we may conclude that the main source of uncertainty in the consequence analysis of a chemical accident assessment is surprisingly not only the dispers...

  18. An Analysis Of Tensile Test Results to Assess the Innovation Risk for an Additive Manufacturing Technology

    OpenAIRE

    Adamczak Stanisław; Bochnia Jerzy; Kaczmarska Bożena

    2015-01-01

    The aim of this study was to assess the innovation risk for an additive manufacturing process. The analysis was based on the results of static tensile tests obtained for specimens made of photocured resin. The assessment involved analyzing the measurement uncertainty by applying the FMEA method. The structure of the causes and effects of the discrepancies was illustrated using the Ishikawa diagram. The risk priority numbers were calculated. The uncertainty of the tensile test measurement was ...

  19. A multiple-imputation based approach to sensitivity analysis and effectiveness assessment in longitudinal clinical trials

    OpenAIRE

    Teshome Ayele, Birhanu; Lipkovich, Ilya; Molenberghs, Geert; Mallinckrodt, Craig H

    2014-01-01

    It is important to understand the effects of a drug as actually taken (effectiveness) and when taken as directed (efficacy). The primary objective of this investigation was to assess the statistical performance of a method referred to as placebo multiple imputation (pMI) as an estimator of effectiveness and as a worst reasonable case sensitivity analysis in assessing efficacy. The pMI method assumes the statistical behavior of placebo- and drug-treated patients after dropout is the statistica...

  20. Sensitivity analysis for the EPIK vulnerability assessment in a local karstic aquifer

    OpenAIRE

    Gogu, Radu Constantin; Dassargues, Alain

    2000-01-01

    Applying the EPIK parametric method, a vulnerability assessment has been made for a small karstic groundwater system in southern Belgium. The aquifer is a karstified limestone of Devonian age. A map of intrinsic vulnerability of the aquifer and of the local water-supply system shows three vulnerability areas. A parameter-balance study and a sensitivity analysis were performed to evaluate the influence of single parameters on aquifer-vulnerability assessment using the EPIK method. This approac...

  1. Solar PV rural electrification and energy poverty assessment in Ghana: A principal component analysis

    OpenAIRE

    Obeng, G. Y.; Evers, Hans-Dieter; F. O. Akuffo; Braimah, I.; Brew-Hammond, A.

    2007-01-01

    The relationship between solar photovoltaic (PV) rural electrification and energy poverty was assessed using social, economic and environmental indicator-based questionnaires in 96 solar-electrified and 113 non-electrified households in rural Ghana. The purpose was to assess energy-poverty status of households with and without solar PV systems, and to determine the factors that explain energy-poverty in off-grid rural households. Principal component analysis (PCA) was used to construct energy...
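
The PCA construction of a composite index can be sketched with a power iteration for the first principal component of the standardized indicators, whose scores then serve as an energy-access measure. The household indicators and values below are hypothetical:

```python
import math

# Hypothetical per-household energy indicators; PCA's first component of the
# standardized columns is used as a composite energy-access score.
data = [  # (kWh/month, hours of light per day, appliances owned)
    [12, 4, 1], [30, 6, 2], [45, 8, 3], [5, 2, 0], [60, 10, 4], [25, 5, 2],
]

def standardize(cols):
    out = []
    for col in cols:
        m = sum(col) / len(col)
        s = math.sqrt(sum((v - m)**2 for v in col) / len(col))
        out.append([(v - m) / s for v in col])
    return out

cols = standardize(list(map(list, zip(*data))))
k = len(cols)
cov = [[sum(a*b for a, b in zip(cols[i], cols[j])) / len(data) for j in range(k)]
       for i in range(k)]

w = [1.0] * k                        # power iteration for the first eigenvector
for _ in range(100):
    w = [sum(cov[i][j] * w[j] for j in range(k)) for i in range(k)]
    norm = math.sqrt(sum(v*v for v in w))
    w = [v / norm for v in w]

scores = [sum(wi * cols[i][h] for i, wi in enumerate(w)) for h in range(len(data))]
print([round(s, 2) for s in scores])  # composite score per household
```

Because the three indicators move together, the first component loads positively on all of them, and the lowest-scoring household is the most energy-poor under this index.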

  2. Quantitative Assessment of Flame Stability Through Image Processing and Spectral Analysis

    OpenAIRE

    Sun, Duo; Lu, Gang; Zhou, Hao; Yan, Yong; Liu, Shi

    2015-01-01

    This paper experimentally investigates two generalized methods, i.e., a simple universal index and oscillation frequency, for the quantitative assessment of flame stability at fossil-fuel-fired furnaces. The index is proposed to assess the stability of flame in terms of its color, geometry, and luminance. It is designed by combining up to seven characteristic parameters extracted from flame images. The oscillation frequency is derived from the spectral analysis of flame radiation signals. The...

  3. Application of inelastic neutron scattering and prompt neutron activation analysis in coal quality assessment

    International Nuclear Information System (INIS)

The basic principles of determining the ash content of coal from measurements of values proportional to the effective proton number are assessed. The principle of coal quality assessment using the methods of inelastic neutron scattering and prompt neutron activation analysis is discussed, with respect both to the theoretical relations between the measured values and coal quality attributes and to practical laboratory measurements of coal sample quality by the said methods. (author)

  4. English Language Assessment in the Colleges of Applied Sciences in Oman: Thematic Document Analysis

    OpenAIRE

    Fatma Al Hajri

    2014-01-01

    Proficiency in English language and how it is measured have become central issues in higher education research as the English language is increasingly used as a medium of instruction and a criterion for admission to education. This study evaluated the English language assessment in the foundation Programme at the Colleges of Applied sciences in Oman. It used thematic analysis in studying 118 documents on language assessment. Three main findings were reported: compatibility between what was ta...

  5. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    OpenAIRE

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been made on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm, which is based on an auditory model that simulates human auditory system. The auditory model is based on spectro-temporal modulation analysis of spectrogram, which has been proven to be ...

  6. Comparison of digital image analysis versus visual assessment to assess survivin expression as an independent predictor of survival for patients with clear cell renal cell carcinoma

    OpenAIRE

    Parker, Alexander S.; Lohse, Christine M.; Leibovich, Bradley C.; Cheville, John C; Sheinin, Yuri M.; Kwon, Eugene D.

    2008-01-01

    We previously used quantitative digital image analysis to report that high immunohistochemical tumor expression levels of survivin independently predict poor outcome among patients with clear cell renal cell carcinoma. However, given the cumbersome and costly nature of digital image analysis, we evaluated simple visual assessment as an alternative to digital image analysis for assessing survivin as a predictor of clear cell renal cell carcinoma patient outcomes. We identified 310 patients tre...

  7. Establishment of a Risk Assessment Framework for Analysis of the Spread of Highly Pathogenic Avian Influenza

    Institute of Scientific and Technical Information of China (English)

    LI Jing; WANG Jing-fei; WU Chun-yan; YANG Yan-tao; JI Zeng-tao; WANG Hong-bin

    2007-01-01

    To evaluate the risk of highly pathogenic avian influenza (HPAI) in mainland China, a risk assessment framework was built. Risk factors were determined by analyzing the epidemic data using the brainstorming method; the analytic hierarchy process was designed to weigh risk factors, and the integrated multicriteria analysis was used to evaluate the final result. The completed framework included the risk factor system, data standards for risk factors, weights of risk factors, and integrated assessment methods. This risk assessment framework can be used to quantitatively analyze the outbreak and spread of HPAI in mainland China.

  8. Uncertainty and Sensitivity Analysis in Performance Assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.

    1998-12-17

    The Waste Isolation Pilot Plant (WIPP) is under development by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. This development has been supported by a sequence of performance assessments (PAs) carried out by Sandia National Laboratories (SNL) to assess what is known about the WIPP and to provide guidance for future DOE research and development activities. Uncertainty and sensitivity analysis procedures based on Latin hypercube sampling and regression techniques play an important role in these PAs by providing an assessment of the uncertainty in important analysis outcomes and identifying the sources of this uncertainty. Performance assessments for the WIPP are conceptually and computationally interesting due to regulatory requirements to assess and display the effects of both stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, where stochastic uncertainty arises from the possible disruptions that could occur over the 10,000 yr regulatory period associated with the WIPP and subjective uncertainty arises from an inability to unambiguously characterize the many models and associated parameters required in a PA for the WIPP. The interplay between uncertainty analysis, sensitivity analysis, stochastic uncertainty and subjective uncertainty is discussed and illustrated in the context of a recent PA carried out by SNL to support an application by the DOE to the U.S. Environmental Protection Agency for the certification of the WIPP for the disposal of TRU waste.
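
    The sampling-plus-regression approach described above can be sketched in a few lines. The toy model, parameter ranges and sample size below are assumptions for illustration, not the WIPP models:

```python
# Hedged sketch: Latin hypercube sampling plus standardized regression
# coefficients (SRCs), in the spirit of the PA methodology described.
# The toy model y = 3*a + 0.5*b + noise is an invented example.
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=200)                             # LHS design on [0,1)^2
x = qmc.scale(u, l_bounds=[0, 0], u_bounds=[10, 10])  # map to parameter ranges

a, b = x[:, 0], x[:, 1]
y = 3.0 * a + 0.5 * b + rng.normal(0, 0.1, size=a.size)  # toy model output

# Regress standardized output on standardized inputs; |SRC| ranks
# the contribution of each uncertain parameter to output uncertainty.
xs = (x - x.mean(axis=0)) / x.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(xs, ys, rcond=None)
```

    Here `src[0]` dominates, correctly flagging the first parameter as the main source of uncertainty in the toy model.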

  9. Assessment

    Institute of Scientific and Technical Information of China (English)

    Geoff Brindley

    2005-01-01

    Introduction: Terminology and Key Concepts. The term assessment refers to a variety of ways of collecting information on a learner's language ability or achievement. Although testing and assessment are often used interchangeably, the latter is an umbrella term encompassing measurement instruments administered on a ‘one-off’ basis, such as tests, as well as qualitative methods of monitoring and recording student learning, such as observation, simulations or project work. Assessment is also distinguished from evaluation, which is concerned with the overall language programme and not just with what individual students have learnt. Proficiency assessment refers to the assessment of general language abilities acquired by the learner independent of a course of study. This kind of assessment is often done through the administration of standardised commercial language-proficiency tests. On the other hand, assessment of achievement aims to establish what a student has learned in relation to a particular course or curriculum (and is thus frequently carried out by the teacher). Achievement assessment may be based either on the specific content of the course or on the course objectives (Hughes 1989).

  10. Human Reliability Analysis in Frame of Probabilistic Safety Assessment Projects in Czech Republic

    International Nuclear Information System (INIS)

    Human reliability analysis has proved to be a very important part of probabilistic safety analysis all over the world. It has also been an integral part of both Level-1 probabilistic safety studies developed in the Czech Republic - the Nuclear Power Plant Dukovany Probabilistic Safety Assessment and the Nuclear Power Plant Temelin Probabilistic Safety Assessment - and most of their consequent applications. The methodology used for human reliability analysis in these studies is described in the first part of the paper. In general, the methodology is based on the well-known and most frequently used methods Technique for Human Error Rate Prediction and ASEP. The up-to-date decision tree method is used to address procedure-driven operator interventions during the plant response to an initiating event. Some interesting results of the human reliability analysis performed for Nuclear Power Plant Dukovany are described in the second part of the paper. The recommendations resulting from the analysis led to the standardization of some, up to that time, non-standard operator actions and to the development of procedures for them. Generally, the procedures were found to be deficient from several points of view, which contributed to the decision to develop entirely new emergency procedures for Nuclear Power Plant Dukovany. The human reliability analysis projects ongoing or planned for the near future are described in the final part of the paper. Keywords: safety analysis; risk assessment; reliability; nuclear power plants; human factors; errors; Czech Republic; operators; emergencies.

  11. Creep-fatigue defect assessment test and analysis of high temperature structure

    International Nuclear Information System (INIS)

    Creep-fatigue damage evaluation and defect assessment are key elements in ascertaining the structural integrity of high temperature structures. In this study, a creep-fatigue test on a geometrically nonlinear structure containing through-wall defects was performed to examine the structural integrity of the defective structure and to validate the inelastic analysis code NONSTA. The creep-fatigue damage was examined with a portable zoom microscope, and replication techniques allowed observation of the structure's surface. After 400 cycles of testing, no apparent creep-fatigue damage was observed except at the defect front, where creep-fatigue crack initiation was found. The commercial finite element analysis packages ANSYS and ABAQUS were used for the corresponding structural analysis. Both elastic analysis and inelastic analysis using the NONSTA code were performed with the temperature profile collected from the test, and the strain results of the analyses agree well with those from the test. The creep-fatigue damage was assessed per ASME-NH utilizing the analysis results, and the creep-fatigue crack initiation was assessed per RCC-MR A16. The results show good agreement between analysis and test.

  12. The role of uncertainty analysis in dose reconstruction and risk assessment

    International Nuclear Information System (INIS)

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  13. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    International Nuclear Information System (INIS)

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological) and normal and accidental modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis and safety assessment documentation. After a review of the facility's functions and the national and international requirements, the criteria for safety level assessment are set. The second stage yields the actual parameters of the facility necessary for the safety analysis. The methodology is selected on the basis of the comparability of the results with those of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described. A radiological hazard categorisation of the facilities is presented. Qualitative hazards and operability analysis is applied. The resulting list of events is subjected to a prioritization procedure based on 'criticality analysis', so that an estimate of the risk is given for each event. Events whose risk falls on the boundary of acceptability, or is unacceptable, are subjected to the next steps of the analysis. As a result, lists of scenarios for PSA and possible design scenarios are established. PSA logical modeling and quantitative calculations of accident sequences are presented.

  14. An analysis of assessment outcomes from eight years' operation of the Australian border weed risk assessment system.

    Science.gov (United States)

    Weber, Jason; Dane Panetta, F; Virtue, John; Pheloung, Paul

    2009-02-01

    The majority of Australian weeds are exotic plant species that were intentionally introduced for a variety of horticultural and agricultural purposes. A border weed risk assessment system (WRA) was implemented in 1997 in order to reduce the high economic costs and massive environmental damage associated with introducing serious weeds. We review the behaviour of this system with regard to eight years of data collected from the assessment of species proposed for importation or held within genetic resource centres in Australia. From a taxonomic perspective, species from the Chenopodiaceae and Poaceae were most likely to be rejected and those from the Arecaceae and Flacourtiaceae were most likely to be accepted. Dendrogram analysis and classification and regression tree (TREE) models were also used to analyse the data. The latter revealed that a small subset of the 35 variables assessed was highly associated with the outcome of the original assessment. The TREE model examining all of the data contained just five variables: unintentional human dispersal, congeneric weed, weed elsewhere, tolerates or benefits from mutilation, cultivation or fire, and reproduction by vegetative propagation. It gave the same outcome as the full WRA model for 71% of species. Weed elsewhere was not the first splitting variable in this model, indicating that the WRA has a capacity for capturing species that have no history of weediness. A reduced TREE model (in which human-mediated variables had been removed) contained four variables: broad climate suitability, reproduction in less than or equal to 1 year, self-fertilisation, and tolerates and benefits from mutilation, cultivation or fire. It yielded the same outcome as the full WRA model for 65% of species. Data inconsistencies and the relative importance of questions are discussed, with some recommendations made for improving the use of the system. PMID:18339471
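
    As a rough illustration of how such a TREE surrogate approximates a screening outcome, the sketch below fits a shallow classification tree to synthetic trait data. The trait names and the decision rule are invented for the example, not taken from the WRA data:

```python
# Hedged sketch: a shallow classification tree like the TREE surrogate
# models described. All data below are synthetic, not the actual
# Australian border assessment records.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 400
# Five binary traits, loosely named after the paper's splitting variables.
X = rng.integers(0, 2, size=(n, 5))
# Assumed screening rule for the toy data:
# reject (1) if "weed elsewhere" or ("congeneric weed" and "fire tolerance").
y = ((X[:, 2] == 1) | ((X[:, 1] == 1) & (X[:, 3] == 1))).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
agreement = tree.score(X, y)   # fraction of species matching the full rule
```

    With a rule this simple the shallow tree reproduces the outcome almost perfectly; the paper's 71% figure reflects the noisier, higher-dimensional real data.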

  15. Data uncertainty analysis for safety assessment of HLW disposal by the Monte Carlo simulation

    International Nuclear Information System (INIS)

    Based on the conceptual model of the Reference Case, defined as the baseline for the various cases in the safety assessment of the H12 report, a new probabilistic simulation code allowing rapid evaluation of the effect of data uncertainty has been developed. Using this code, probabilistic simulation was performed by the Monte Carlo method, and the conservativeness and sufficiency of the safety assessment in the H12 report, which had been performed deterministically, were confirmed. In order to identify the important parameters, this study includes an analysis of the sensitivity structure between the inputs and the output. Cluster analysis, followed by multiple regression analysis for each cluster, was applied. As a result, the transmissivity was found to have a strong influence on the uncertainty of the system performance. Furthermore, this approach was confirmed to identify both globally sensitive parameters and locally sensitive parameters that strongly influence parts of the simulation result space. (author)
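
    The two-step sensitivity-structure analysis (cluster the Monte Carlo realisations, then regress within each cluster) might be sketched as follows, on synthetic data with an assumed dominant transmissivity term:

```python
# Hedged sketch of cluster-then-regress sensitivity analysis.
# All data are synthetic; the dominance of log-transmissivity is assumed
# for the toy model only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 300
log_T = rng.normal(-8, 1, n)      # log transmissivity (assumed dominant input)
other = rng.normal(0, 1, n)       # a second, weaker input
dose = 2.0 * log_T + 0.2 * other + rng.normal(0, 0.1, n)  # toy output

X = np.column_stack([log_T, other])
# Step 1: cluster the realisations in the output space.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    dose.reshape(-1, 1))

# Step 2: multiple regression within each cluster; the coefficient
# magnitudes expose the locally sensitive parameters.
coefs = [LinearRegression().fit(X[labels == k], dose[labels == k]).coef_
         for k in (0, 1)]
```

    In both clusters the coefficient on `log_T` dominates, mirroring the paper's finding that transmissivity drives the output uncertainty.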

  16. Degradation Assessment and Fault Diagnosis for Roller Bearing Based on AR Model and Fuzzy Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Lingli Jiang

    2011-01-01

    Full Text Available This paper proposes a new approach combining an autoregressive (AR) model and fuzzy cluster analysis for bearing fault diagnosis and degradation assessment. The AR model is an effective approach for extracting fault features, and is generally applied to stationary signals. However, the fault vibration signals of a roller bearing are non-stationary and non-Gaussian. To address this problem, the parameters of the AR model are estimated based on higher-order cumulants. The AR parameters are then taken as the feature vectors, and fuzzy cluster analysis is applied to perform classification and pattern recognition. Experimental results show that the proposed method can be used to identify various types and severities of bearing faults. This study is significant for non-stationary and non-Gaussian signal analysis, fault diagnosis and degradation assessment.
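
    A minimal sketch of the feature-extraction idea, assuming a least-squares AR fit on synthetic signals and plain k-means as a stand-in for the paper's fuzzy cluster analysis (and for its cumulant-based estimator):

```python
# Hedged sketch: AR coefficients as fault features, then clustering.
# Synthetic AR(1)-style signals stand in for bearing vibration data;
# k-means stands in for fuzzy c-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

def ar_features(x, order=4):
    """Least-squares AR(order) coefficient estimate of a 1-D signal."""
    X = np.column_stack([x[order - i - 1 : len(x) - i - 1] for i in range(order)])
    coef, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return coef

def simulate(a1, n=500):
    """Toy AR(1) signal with pole a1 (two 'fault conditions' below)."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a1 * x[t - 1] + rng.normal()
    return x

signals = [simulate(0.9) for _ in range(10)] + [simulate(-0.5) for _ in range(10)]
feats = np.array([ar_features(s) for s in signals])   # feature vectors
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
```

    The two signal classes land in separate clusters, which is the classification step the paper performs with fuzzy membership degrees instead of hard labels.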

  17. 76 FR 67764 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Science.gov (United States)

    2011-11-02

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...-xxxx, Revision 0, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and... at (301) 492-3446. FOR FURTHER INFORMATION CONTACT: Song-Hua Shen, Division of Risk Analysis,...

  18. Assessing Heterogeneity for Factor Analysis Model with Continuous and Ordinal Outcomes

    OpenAIRE

    Ye-Mao Xia; Jian-Wei Gou

    2016-01-01

    Factor analysis models with continuous and ordinal responses are a useful tool for assessing relations between the latent variables and mixed observed responses. These models have been successfully applied to many different fields, including behavioral, educational, and social-psychological sciences. However, within the Bayesian analysis framework, most developments are constrained within parametric families, of which the particular distributions are specified for the parameters of interest. ...

  19. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    OpenAIRE

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat; Gani, Rafiqul

    2013-01-01

    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes: process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady state process simulation is performed to generate a base case design of the bioethanol conversion process using cassava rhizome as a feedstock. The sustainability analysis is performed to analyze the relevant indicators in sustainability metrics, to define design/retro...

  20. Non-linear finite element assessment analysis of a modern heritage structure

    OpenAIRE

    S. Sorace; Terenzi, G

    2011-01-01

    A synthesis of a non-linear finite element structural assessment enquiry carried out on a monumental modern heritage building is reported in this paper. The study includes a buckling analysis of the slender steel beams constituting a mushroom-type roof, and an "integral" seismic pushover analysis of the supporting R/C columns. The computational solutions obtained for the steel roof beams are compared to the results derived from a calculation of the critical stress of beam panels, and the glob...

  1. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    International Nuclear Information System (INIS)

    After radioactive sea debris had been found on beaches near the BNFL Sellafield plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  2. The ICR142 NGS validation series: a resource for orthogonal assessment of NGS analysis

    OpenAIRE

    Elise Ruark; Anthony Renwick; Matthew Clarke; Katie Snape; Emma Ramsay; Anna Elliott; Sandra Hanks; Ann Strydom; Sheila Seal; Nazneen Rahman

    2016-01-01

    To provide a useful community resource for orthogonal assessment of NGS analysis software, we present the ICR142 NGS validation series. The dataset includes high-quality exome sequence data from 142 samples together with Sanger sequence data at 730 sites; 409 sites with variants and 321 sites at which variants were called by an NGS analysis tool, but no variant is present in the corresponding Sanger sequence. The dataset includes 286 indel variants and 275 negative indel sites, and thus the I...

  3. Assessing collective defensive performances in football: A Qualitative Comparative Analysis of central back pairs

    OpenAIRE

    Kaufmann, David

    2014-01-01

    Ahead of the World Cup in Brazil the crucial question for the Swiss national coach is the nomination of the starting eleven central back pair. A fuzzy set Qualitative Comparative Analysis assesses the defensive performances of different Swiss central back pairs during the World Cup campaign (2011 – 2014). This analysis advises Ottmar Hitzfeld to nominate Steve von Bergen and Johan Djourou as the starting eleven central back pair. The alternative with a substantially weaker empirical validity ...

  4. Risk assessment of inhalation exposure for airborne toxic metals using instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    In order to study the effects of air pollution, about 1,300 samples of airborne particulate matter (APM) were collected at suburban and industrial sites, in Daejeon, Korea from 1998 to 2006. The concentrations of carcinogenic (As and Cr) and non-carcinogenic metals (Al, Mn, and Zn) were determined by using instrumental neutron activation analysis (INAA). These long-term metal concentration data were applied to a risk assessment of inhalation exposure using Monte Carlo analysis (MCA). (author)

  5. Assessing Low-Carbon Development in Nigeria : An Analysis of Four Sectors

    OpenAIRE

    Cervigni, Raffaello; Rogers, John Allen; Dvorak, Irina

    2013-01-01

    The Federal Government of Nigeria (FGN) and the World Bank have agreed to carry out a Climate Change Assessment (CCA) within the framework of the Bank's Country Partnership Strategy (CPS) for Nigeria (2010-13). The CCA includes an analysis of options for low-carbon development in selected sectors, including power, oil and gas, transport, and agriculture. The goal of the low-carbon analysis...

  6. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a Level-1 probabilistic safety assessment. The work covers: the correct association of distributions with parameters, the importance and qualification of expert opinions, the generation of samples according to sample size, and the study of the relationships among system variables and the system response. A series of statistical-mathematical techniques is recommended for the development of the analysis methodology, as well as different graphical visualizations for controlling the study. (author)

  7. Impact Factor 2.0 : Applying Social Network Analysis to Scientific Impact Assessment

    OpenAIRE

    Hoffmann, Christian Pieter; Lutz, Christoph; Meckel, Miriam

    2014-01-01

    Social media are becoming increasingly popular in scientific communication. A range of platforms are geared specifically towards the academic community. Proponents of the altmetrics approach point out that these new media allow for new avenues of scientific impact assessment. Traditional impact measures based on bibliographic analysis have long been criticized for overlooking the relational dynamic of scientific impact. We therefore propose an application of social network analysis to researc...

  8. PHYTOPLANKTON PIGMENT ANALYSIS BY HPLC FOR ASSESSING COMMUNITY COMPOSITION IN THE LAURENTIAN GREAT LAKES

    Science.gov (United States)

    A technique to rapidly assess phytoplankton dynamics is being evaluated for its utility in the Great Lakes. Comparison to traditional microscopic techniques and to more recent in-situ FluoroProbe technology will allow us to determine if HPLC pigment analysis can provide unique a...

  9. Method of synchronization assessment of rhythms in regulatory systems for signal analysis in real time

    OpenAIRE

    Borovkova E.l.; Ishbulatov Yu.M.; Mironov S.A.

    2014-01-01

    A method is proposed for quantitative assessment of the phase synchronization of 0.1 Hz oscillations in autonomic cardiovascular control by photoplethysmogram analysis in real time. The efficiency of the method is shown in the comparison with the results obtained by the previously developed method.

  10. Method of synchronization assessment of rhythms in regulatory systems for signal analysis in real time

    Directory of Open Access Journals (Sweden)

    Borovkova E.l.

    2014-09-01

    Full Text Available A method is proposed for quantitative assessment of the phase synchronization of 0.1 Hz oscillations in autonomic cardiovascular control by photoplethysmogram analysis in real time. The efficiency of the method is shown in the comparison with the results obtained by the previously developed method.
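
    One common way to quantify the phase synchronization of two slow oscillations (an assumption here; the record does not specify the exact index used) is a Hilbert-transform phase-locking value:

```python
# Hedged sketch: phase synchronization of two 0.1 Hz oscillations via
# the Hilbert transform and a phase-locking value (PLV). The signals
# are synthetic sinusoids, not photoplethysmogram data.
import numpy as np
from scipy.signal import hilbert

fs = 10.0                        # sampling rate, Hz (assumed)
t = np.arange(0, 300, 1 / fs)    # 5 minutes of signal
rng = np.random.default_rng(3)
s1 = np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.normal(size=t.size)
s2 = np.sin(2 * np.pi * 0.1 * t + 0.4) + 0.1 * rng.normal(size=t.size)

phi1 = np.angle(hilbert(s1))     # instantaneous phases
phi2 = np.angle(hilbert(s2))
# PLV: 1 = perfect phase locking, 0 = no locking.
plv = np.abs(np.mean(np.exp(1j * (phi1 - phi2))))
```

    For these phase-locked test signals the PLV is close to 1; in a real-time setting the same statistic would be computed over a sliding window.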

  11. Spatial point pattern analysis of aerial survey data to assess clustering in wildlife distributions

    Science.gov (United States)

    Khaemba, Wilson Mwale

    Assessing clustering in wildlife populations is crucial for understanding their dynamics. This assessment is made difficult for data obtained through aerial surveys because the shape and size of the sampling units (strip transects) result in poor data supports, which generally hampers spatial analysis of these data. The problem may be solved by having more detailed data in which the exact locations of observed animal groups are recorded. Such data, obtainable through GPS technology, are amenable to spatial analysis, allowing spatial point pattern analysis to be used to assess observed spatial patterns relative to environmental factors like vegetation. Distance measures like the G-statistic and the K-function classify such patterns as clustered, regular or completely random, while independence between species is assessed through a multivariate extension of the K-function. Quantification of clustering is carried out using spatial regression. The techniques are illustrated with field data on three ungulates observed in an ecosystem in Kenya. Results indicate a relation between species' spatial distribution and their dietary requirements, demonstrating the usefulness of spatial point pattern analysis in investigating species' spatial distributions. It also provides a technique for explaining and differentiating the distribution of wildlife species.
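
    The K-function comparison can be sketched naively (ignoring the edge corrections a real analysis would apply), on synthetic point patterns rather than survey data:

```python
# Hedged sketch: a naive Ripley's K estimate for points in the unit
# square, compared with the complete spatial randomness (CSR)
# expectation K(r) = pi * r^2. Edge effects are deliberately ignored.
import numpy as np

def ripley_k(points, r, area=1.0):
    """Naive K(r): area-scaled mean number of ordered pairs within r."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = np.sum((d < r) & (d > 0))
    return area * pairs / (n * (n - 1))

rng = np.random.default_rng(5)
csr = rng.random((500, 2))                       # CSR pattern
clustered = np.concatenate([                     # 10 tight Gaussian clumps
    rng.normal(c, 0.02, size=(50, 2)) for c in rng.random((10, 2))])

r = 0.05
k_csr = ripley_k(csr, r)
k_clustered = ripley_k(clustered, r)
expected = np.pi * r ** 2                        # CSR benchmark
```

    K(r) well above pi*r^2 signals clustering at scale r, which is the classification the abstract describes.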

  12. Assessing Model Fit: Caveats and Recommendations for Confirmatory Factor Analysis and Exploratory Structural Equation Modeling

    Science.gov (United States)

    Perry, John L.; Nicholls, Adam R.; Clough, Peter J.; Crust, Lee

    2015-01-01

    Despite the limitations of overgeneralizing cutoff values for confirmatory factor analysis (CFA; e.g., Marsh, Hau, & Wen, 2004), they are still often employed as golden rules for assessing factorial validity in sport and exercise psychology. The purpose of this study was to investigate the appropriateness of using the CFA approach with these…

  13. Error Ratio Analysis: Alternate Mathematics Assessment for General and Special Educators.

    Science.gov (United States)

    Miller, James H.; Carr, Sonya C.

    1997-01-01

    Eighty-seven elementary students in grades four, five, and six were administered a 30-item multiplication instrument to assess performance in computation across grade levels. An interpretation of student performance using error ratio analysis is provided, and the use of this method with groups of students for instructional decision making is…

  14. Substituted plan analysis in the environmental impact assessment of Yongchuan wastewater treatment project

    Institute of Scientific and Technical Information of China (English)

    FANG Jun-hua

    2006-01-01

    In environmental impact assessment (EIA), the substituted plan mainly refers to the treatment technology and the alternative plant site, and it also includes many kinds of environmental protection measures. This paper makes a detailed analysis of the treatment technology, the alternative plant site, the use of the discharged water and the disposal of the sludge in the Yongchuan wastewater treatment project.

  15. A sensitivity analysis of a radiological assessment model for Arctic waters

    DEFF Research Database (Denmark)

    Nielsen, S.P.

    1998-01-01

    A model based on compartment analysis has been developed to simulate the dispersion of radionuclides in Arctic waters for an assessment of doses to man. The model predicts concentrations of radionuclides in the marine environment and doses to man from a range of exposure pathways. A parameter...

  16. Assessment of Smolt Condition for Travel Time Analysis, 1993-1994 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Schrock, Robin M; Beeman, John W; VanderKooi, Scott P [US Geological Survey, Western Fisheries Research Center, Columbia River Research Laboratory, Cook, WA

    1999-02-01

    The assessment of smolt condition for travel time analysis (ASCTTA) project provided information on the level of smoltification in Columbia River hatchery and wild salmonid stocks to the Fish Passage Center (FPC), for the primary purpose of in-river management of flows.

  17. Designing student peer assessment in higher education: Analysis of written and oral peer feedback

    NARCIS (Netherlands)

    van den Berg, I.; Admiraal, W.; Pilot, A.

    2006-01-01

    Relating it to design features, the present article describes the nature of written and oral peer feedback as it occurred in seven writing courses, each with a different peer assessment (PA) design. Results indicate that

  18. Identifying Students with Learning Disabilities: Composite Profile Analysis Using the Cognitive Assessment System

    Science.gov (United States)

    Huang, Leesa V.; Bardos, Achilles N.; D'Amato, Rik Carl

    2010-01-01

    The detection of cognitive patterns in children with learning disabilities (LD) has been a priority in the identification process. Subtest profile analysis from traditional cognitive assessment has drawn sharp criticism for inaccurate identification and weak connections to educational planning. Therefore, the purpose of this study is to use a new…

  19. Assessing probability of safety criteria exceeding according to probabilistic safety analysis results

    International Nuclear Information System (INIS)

    The paper considers the general task of checking the compliance of probabilistic safety indicators with regulatory criteria. It presents correlations for assessing the probability of exceeding a safety criterion under different distribution laws of the numerical results of the probabilistic safety analysis (PSA). The paper also presents a scale for rating the probability of safety criteria exceedance.

  1. Language Assessment Impacts in China:a Tentative Analysis of TEM8

    Institute of Scientific and Technical Information of China (English)

    Cui; Yingqiong; Cheng; Hongying

    2015-01-01

    The paper aims to present a tentative analysis of the impacts of language assessment on relevant parties. It starts by discussing the connotation of test impact and then analyses the impacts of TEM8 on test takers, teachers and society. The importance of such impacts is also revealed in this paper.

  2. A Proposed New "What if Reliability" Analysis for Assessing the Statistical Significance of Bivariate Relationships

    Science.gov (United States)

    Onwuegbuzie, Anthony J.; Roberts, J. Kyle; Daniel, Larry G.

    2005-01-01

    In this article, the authors (a) illustrate how displaying disattenuated correlation coefficients alongside their unadjusted counterparts will allow researchers to assess the impact of unreliability on bivariate relationships and (b) demonstrate how a proposed new "what if reliability" analysis can complement null hypothesis significance tests of…
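
    The disattenuation step that a "what if reliability" analysis varies is Spearman's classical correction for attenuation: the observed correlation divided by the square root of the product of the two measures' reliabilities. A minimal sketch:

```python
# Hedged sketch of the correction for attenuation underlying the
# proposed "what if reliability" analysis. The example values are
# invented for illustration.
def disattenuate(r_xy, r_xx, r_yy):
    """Estimated true-score correlation given observed r and reliabilities."""
    return r_xy / (r_xx * r_yy) ** 0.5

# e.g. an observed r of .30 with reliabilities .70 and .80
r_adj = disattenuate(0.30, 0.70, 0.80)   # ≈ 0.40
```

    Tabulating `r_adj` across a grid of hypothetical reliabilities shows how sensitive a reported bivariate relationship is to measurement error, which is the point of displaying both coefficients side by side.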

  3. A Critical Examination of the Assessment Analysis Capabilities of OCLC ACAS

    Science.gov (United States)

    Lyons, Lucy E.

    2005-01-01

    Over 500 libraries have employed OCLC's iCAS and its successor Automated Collection Assessment and Analysis Services (ACAS) as bibliometric tools to evaluate monograph collections. This examination of ACAS reveals both its methodological limitations and its feasibility as an indicator of collecting patterns. The results can be used to maximize the…

  4. Combining a building simulation with energy systems analysis to assess the benefits of natural ventilation

    DEFF Research Database (Denmark)

    Oropeza-Perez, Ivan; Østergaard, Poul Alberg; Remmen, Arne

    thermal air flow simulation program - into the energy systems analysis model. Descriptions of the energy systems in two geographical locations, i.e. Mexico and Denmark, are set up as inputs. Then, the assessment is done by calculating the energy impacts as well as environmental benefits in the energy...

  5. Safety assessment of research reactors and preparation of the safety analysis report

    International Nuclear Information System (INIS)

    This Safety Guide presents guidelines, approved by international consensus, for the preparation, review and assessment of safety documentation for research reactors such as the Safety Analysis Report. While the Guide is most applicable to research reactors in the design and construction stage, it is also recommended for use during relicensing or reassessment of existing reactors

  6. MONTHLY VARIATION IN SPERM MOTILITY IN COMMON CARP ASSESSED USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Sperm motility variables from the milt of the common carp Cyprinus carpio were assessed using a computer-assisted sperm analysis (CASA) system across several months (March-August 1992) known to encompass the natural spawning period. Two-year-old pond-raised males obtained each mo...

  7. Handheld tools that 'Informate' Assessment of Student Learning in Science: A Requirements Analysis

    Science.gov (United States)

    Roschelle, Jeremy; Penuel, William R.; Yarnall, Louise; Shechtman, Nicole; Tatar, Deborah

    2005-01-01

    An important challenge faced by many teachers as they involve students in science investigations is measuring (assessing) students' progress. Our detailed requirements analysis in a particular school district led to the idea that what teachers need most are ways to increase the quality of the information they have about what students know and can…

  8. A Bayesian latent group analysis for detecting poor effort in the assessment of malingering

    NARCIS (Netherlands)

    A. Ortega; E.-J. Wagenmakers; M.D. Lee; H.J. Markowitsch; M. Piefke

    2012-01-01

    Despite their theoretical appeal, Bayesian methods for the assessment of poor effort and malingering are still rarely used in neuropsychological research and clinical diagnosis. In this article, we outline a novel and easy-to-use Bayesian latent group analysis of malingering whose goal is to identify…

  9. Assessment of models for pedestrian dynamics with functional principal component analysis

    Science.gov (United States)

    Chraibi, Mohcine; Ensslen, Tim; Gottschalk, Hanno; Saadi, Mohamed; Seyfried, Armin

    2016-06-01

    Many agent-based simulation approaches have been proposed for pedestrian flow. As such models are applied, e.g., in evacuation studies, their quality and reliability are of vital interest. Pedestrian trajectories are functional data, and thus functional principal component analysis is a natural tool for assessing the quality of pedestrian flow models beyond average properties. In this article we conduct functional Principal Component Analysis (PCA) for the trajectories of pedestrians passing through a bottleneck. In this way it is possible to assess the quality of the models not only on the basis of average values but also by considering their fluctuations. We benchmark two agent-based models of pedestrian flow against the experimental data using PCA average and stochastic features. Functional PCA proves to be an efficient tool to detect deviations between simulation and experiment and to assess the quality of pedestrian models.
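    On a common time grid, functional PCA reduces to ordinary PCA of the trajectory matrix. The sketch below uses synthetic one-dimensional trajectories (invented for illustration, not the bottleneck experiments from the article) to show how the dominant variation modes and per-pedestrian scores are extracted:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 50)                  # common time grid
    # 30 synthetic trajectories: a mean path plus two random-amplitude modes
    mean_path = 5.0 * t
    X = mean_path + rng.normal(0, 0.3, (30, 1)) * np.sin(np.pi * t) \
                  + rng.normal(0, 0.1, (30, 1)) * np.sin(2 * np.pi * t)

    Xc = X - X.mean(axis=0)                        # center the functional data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)                # variance share of each mode
    scores = Xc @ Vt.T                             # per-pedestrian mode scores

    print(explained[:2])                           # leading modes dominate
    ```

    Comparing the distributions of `scores` between simulation and experiment is what allows fluctuations, not just averages, to be benchmarked.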

  10. The assessment report of QA program through the analysis of quality trend in 1994

    International Nuclear Information System (INIS)

    The effectiveness and adequacy of the KAERI Quality Assurance Program are assessed through an analysis of quality trends. The assessment shows that the quality assurance system for each project has reached a stage of stabilization, with significant improvement in conformance to QA procedures, in the control of QA records and documents, and in quality-mindedness on the job. However, some problems discovered in this trend analysis, i.e., the efficiency of quality training and the economics of the design verification system, require preventive actions and appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the quality system best suited to our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author)

  11. The assessment report of QA program through the analysis of quality trend in 1994

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yung Se; Hong, Kyung Sik; Park, Sang Pil; Park, Kun Woo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-04-01

    The effectiveness and adequacy of the KAERI Quality Assurance Program are assessed through an analysis of quality trends. The assessment shows that the quality assurance system for each project has reached a stage of stabilization, with significant improvement in conformance to QA procedures, in the control of QA records and documents, and in quality-mindedness on the job. However, some problems discovered in this trend analysis, i.e., the efficiency of quality training and the economics of the design verification system, require preventive actions and appropriate measures. In the future, QA is expected to support the assurance of nuclear safety and the development of advanced technology by making it possible to establish the quality system best suited to our situation, based on the assessment method for the quality assurance program presented in this study. 5 figs., 30 tabs. (Author).

  12. Structural Reliability Assessment by Integrating Sensitivity Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Shao-Fei Jiang

    2014-01-01

    Full Text Available To reduce runtime while ensuring sufficient computational accuracy, this paper proposes a structural reliability assessment method using sensitivity analysis (SA) and a support vector machine (SVM). Sensitivity analysis is first applied to assess the effect of the random variables on the values of the performance function, and small-influence variables are excluded from the input vectors of the SVM. Then, the trained SVM is used to classify the input vectors, which are produced by sampling the remaining variables based on their distributions. Finally, the reliability assessment is implemented with the aid of reliability theory. A 10-bar planar truss is used to validate the feasibility and efficiency of the proposed method, and a performance comparison is made with other existing methods. The results show that the proposed method greatly reduces runtime with little loss of accuracy; furthermore, its accuracy is the highest among the methods employed.
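    The screening step can be illustrated without the SVM itself: the sketch below uses correlation-based sensitivity screening and then substitutes plain Monte Carlo counting for the SVM surrogate. The performance function and its coefficients are hypothetical, not the 10-bar truss of the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def g(x):
        # Hypothetical performance function: failure when g < 0.
        # The third variable is deliberately near-irrelevant.
        return 1.8 - x[:, 0] - 0.5 * x[:, 1] + 0.001 * x[:, 2]

    X = rng.normal(1.0, 0.3, (20000, 3))           # three random variables
    y = g(X)

    # Sensitivity screening: |correlation| of each input with g
    sens = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(3)])
    keep = sens > 0.05                             # reject small-influence variables

    pf = np.mean(y < 0.0)                          # Monte Carlo failure probability
    print(keep, pf)
    ```

    In the paper's method, the retained variables (`keep`) would feed the SVM classifier, which replaces the expensive evaluations of `g` during sampling.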

  13. 2D Monte Carlo analysis of radiological risk assessment for the food intake in Korea

    International Nuclear Information System (INIS)

    Most public health risk assessments assume and combine a series of average, conservative and worst-case values to derive an acceptable point estimate of risk. To improve quality of risk information, insight of uncertainty in the assessments is needed and more emphasis is put on the probabilistic risk assessment. Probabilistic risk assessment studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. In this study, an advanced technique called the two-dimensional Monte Carlo analysis (2D MCA) is applied to estimation of internal doses from intake of radionuclides in foodstuffs and drinking water in Korea. The variables of the risk model along with the parameters of these variables are described in terms of probability density functions (PDFs). In addition, sensitivity analyses were performed to identify important factors to the radiation doses. (author)
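    The nesting that distinguishes 2D MCA, an outer loop sampling *uncertainty* and an inner loop sampling *variability*, can be sketched as follows. All distribution parameters here are invented for illustration, not the Korean foodstuff data:

    ```python
    import random

    random.seed(0)
    # Outer loop: uncertainty about a dose coefficient (assumed lognormal).
    # Inner loop: person-to-person variability in daily intake.
    outer_doses = []
    for _ in range(200):                               # uncertainty dimension
        dose_coeff = random.lognormvariate(-7.0, 0.3)  # hypothetical Sv/Bq
        inner = [random.lognormvariate(3.0, 0.5) * dose_coeff
                 for _ in range(500)]                  # variability dimension
        inner.sort()
        outer_doses.append(inner[int(0.95 * len(inner))])  # 95th percentile

    outer_doses.sort()
    lo, hi = outer_doses[int(0.05 * 200)], outer_doses[int(0.95 * 200)]
    print(lo, hi)   # uncertainty band around the 95th-percentile dose
    ```

    The output is not a single dose estimate but a band: how uncertain the 95th-percentile dose itself is, which is the extra information 2D MCA adds over a one-dimensional simulation.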

  14. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss; Baun, Anders

    7.1.7 Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials. Khara D. Grieger1, Igor Linkov2, Steffen Foss Hansen1, Anders Baun1. 1Technical University of Denmark, Kgs. Lyngby, Denmark; 2Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA. Email: kdg@env.dtu.dk. Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). It is becoming increasingly apparent that approaches aimed at ultimately fulfilling standard, quantitative environmental risk assessment for NM are likely to be not only extremely challenging but also resource- and time-consuming. In response, a number of alternative or complementary frameworks and approaches to…

  15. Analysis And Assessment Of The Security Method Against Incidental Contamination In The Collective Water Supply System

    Directory of Open Access Journals (Sweden)

    Szpak Dawid

    2015-09-01

    Full Text Available The paper presents the main types of incidental contamination of surface water and methods of securing water sources against incidental contamination. An analysis and assessment of the protection of a collective water supply system (CWSS) against incidental contamination was conducted using Failure Mode and Effects Analysis (FMEA). The FMEA method allows analysis of a product or process, identification of weak points, and implementation of corrections and new solutions that eliminate the sources of undesirable events. The developed methodology is illustrated in an application case. It was found that the risk of water contamination in the water-pipe network of the analyzed CWSS caused by incidental contamination of the water source is at a controlled level.
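    FMEA prioritizes failure modes by a risk priority number, RPN = severity × occurrence × detection. A minimal worksheet-style sketch, with invented ratings rather than those of the study, might look like:

    ```python
    # Hypothetical FMEA worksheet for contamination events in a water supply
    # system: each entry is (failure mode, severity, occurrence, detection),
    # each rating on a 1-10 scale.
    failure_modes = [
        ("surface-water chemical spill", 9, 3, 6),
        ("microbial contamination",      7, 5, 4),
        ("pipe-network backflow",        5, 2, 7),
    ]

    # Rank by risk priority number (RPN), highest first.
    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for name, s, o, d in ranked:
        print(f"{name}: RPN = {s * o * d}")
    ```

    Modes at the top of the ranking are the weak points where corrections and new solutions are applied first.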

  16. Development of Probabilistic Uncertainty Analysis Methodology for SRS Performance Assessments Maintenance Plan Activities

    International Nuclear Information System (INIS)

    An initial uncertainty analysis of the Performance Assessment (PA) model of the Savannah River Site (SRS) trench disposal unit was conducted. Selected input data values were varied for both flow and transport analyses to generate input sets called realizations. Outputs of fluxes to the water table and well concentrations were compared to results from the PA. This stage of the uncertainty analysis served as a prototype for future work. The focus was to lay the foundation for a more comprehensive analysis, generate a limited set of output results, and learn about the process and potential problems

  17. Using the Ages and Stages Questionnaire to teach medical students developmental assessment: a descriptive analysis

    Directory of Open Access Journals (Sweden)

    Nicol Pam

    2006-05-01

    Full Text Available Abstract Background After a survey of medical graduates' skills found a lack of confidence in developmental assessment, a program was introduced with the broad aims of increasing medical student confidence and respect for the parents' role in childhood developmental assessment. Research has shown that parents' concerns are as accurate as quality screening tests in assessing development, so the program utilised the Ages and Stages Questionnaire, a parent completed, child development assessment tool. Method To evaluate the program, an interpretative analysis was completed on the students' reports written during the program and a questionnaire was administered to the parents to gain their perception of the experience. As well, student confidence levels in assessing growth and development were measured at the end of the paediatric term. Results Although there was an increase in student confidence in developmental assessment at the end of the term, it was not statistically significant. However the findings indicated that students gained increased understanding of the process and enhanced recognition of the parental role, and the study suggested there was increased confidence in some students. Parents indicated that they thought they should be involved in the teaching of students. Conclusion The ASQ was shown to have been useful in an education program at the level of advanced beginners in developmental assessment.

  18. Benefits and risks of emerging technologies: integrating life cycle assessment and decision analysis to assess lumber treatment alternatives.

    Science.gov (United States)

    Tsang, Michael P; Bates, Matthew E; Madison, Marcus; Linkov, Igor

    2014-10-01

    Assessing the best options among emerging technologies (e.g., new chemicals, nanotechnologies) is complicated because of trade-offs across benefits and risks that are difficult to quantify given limited and fragmented availability of information. This study demonstrates the integration of multicriteria decision analysis (MCDA) and life cycle assessment (LCA) to address technology alternative selection decisions. As a case study, prioritization of six lumber treatment alternatives [micronized copper quaternary (MCQ); alkaline copper quaternary (ACQ); water-borne copper naphthenate (CN); oil-borne copper naphthenate (CNo); water-borne copper quinolate (CQ); and water-borne zinc naphthenate (ZN)] for military use is considered. Multiattribute value theory (MAVT) is used to derive risk and benefit scores. Risk scores are calculated using a cradle-to-gate LCA. Benefit scores are calculated by scoring of cost, durability, and corrosiveness criteria. Three weighting schemes are used, representing Environmental, Military and Balanced stakeholder perspectives. Aggregated scores from all three perspectives show CQ to be the least favorable alternative. MCQ is identified as the most favorable alternative from the Environmental stakeholder perspective. From the Military stakeholder perspective, ZN is determined to be the most favorable alternative, followed closely by MCQ. This type of scoring and ranking of multiple heterogeneous criteria in a systematic and transparent way facilitates better justification of technology selection and regulation. PMID:25209330
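    In its simplest additive form, a MAVT ranking is a weighted sum of normalized criterion values under each stakeholder weighting. The sketch below uses invented scores and weights, not the study's data, to show how different weight schemes can flip the preferred alternative:

    ```python
    # Hypothetical normalized criterion values (higher = better) for three
    # of the lumber treatments, and two stakeholder weighting schemes.
    alternatives = {
        "MCQ": {"env_risk": 0.8, "cost": 0.6, "durability": 0.7},
        "ZN":  {"env_risk": 0.6, "cost": 0.7, "durability": 0.9},
        "CQ":  {"env_risk": 0.3, "cost": 0.5, "durability": 0.4},
    }
    weights = {
        "Environmental": {"env_risk": 0.6, "cost": 0.2, "durability": 0.2},
        "Military":      {"env_risk": 0.2, "cost": 0.3, "durability": 0.5},
    }

    def mavt_score(values, w):
        """Additive multiattribute value: weighted sum over criteria."""
        return sum(w[c] * values[c] for c in w)

    for view, w in weights.items():
        best = max(alternatives, key=lambda a: mavt_score(alternatives[a], w))
        print(view, "->", best)
    ```

    With these made-up numbers the Environmental weighting favors MCQ and the Military weighting favors ZN, illustrating how stakeholder perspective alone can change the top-ranked technology.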

  19. Application of synthetic principal component analysis model to mine area farmland heavy metal pollution assessment

    Institute of Scientific and Technical Information of China (English)

    WANG Cong-lu; WU Chao; WANG Wei-jun

    2008-01-01

    Referring to GB5618-1995 on heavy metal pollution and using the statistical package SPSS, the major pollutants in mine-area farmland heavy metal pollution were identified by variable clustering analysis, and the pollution situation was assessed and classified by synthetic principal component analysis (PCA). The results show that variable clustering analysis is efficient in identifying the principal components of mine-area farmland heavy metal pollution. The synthetic principal component scores of the soil samples, given by synthetic principal component analysis, were sorted and clustered, revealing the data structure of the soil heavy metal contamination, the relationships among samples, and the pollution levels of different soil samples. Assessing and classifying mine-area farmland heavy metal pollution with synthetic component scores reflects the influence of both the major and the compound heavy metal pollutants. The identification and assessment results can provide reference and guidance for proposing control measures for mine-area farmland heavy metal pollution and for focusing on the key treatment regions.
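    Synthetic PCA scoring of soil samples can be sketched as follows: standardize the concentrations, extract principal components, and combine the component scores weighted by their variance contribution rates. The concentrations below are randomly generated stand-ins, not measured data:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical concentrations (rows: soil samples; cols: e.g. Cd, Pb, Zn, Cu)
    X = rng.lognormal(0.0, 0.5, (12, 4))

    Z = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize each metal
    eigval, eigvec = np.linalg.eigh(np.cov(Z.T))   # PCA via covariance matrix
    order = np.argsort(eigval)[::-1]               # sort components by variance
    eigval, eigvec = eigval[order], eigvec[:, order]

    scores = Z @ eigvec                            # component scores per sample
    weights = eigval / eigval.sum()                # variance contribution rates
    synthetic = scores @ weights                   # synthetic PC score per sample
    ranking = np.argsort(synthetic)[::-1]          # candidate "most polluted" first
    print(ranking[:3])
    ```

    Sorting and clustering the `synthetic` scores is what separates the soil samples into pollution classes.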

  20. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    International Nuclear Information System (INIS)

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions of the decision-maker that involve a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking, a structured explicit decision analysis, a heuristic approach of importance of the order of criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to

  1. Prenatal assessment of fetal chromosomal and genetic disorders through maternal plasma DNA analysis.

    Science.gov (United States)

    Liao, Gary J W; Chiu, Rossa W K; Lo, Y M Dennis

    2012-02-01

    The existence of cell free DNA derived from the fetus in the plasma of pregnant women was first demonstrated in 1997. This discovery offered the possibility of non-invasive sampling of fetal genetic material simply through the collection of a maternal blood sample. Such cell free fetal DNA molecules in the maternal circulation have subsequently been shown to originate from the placenta and could be detected from about 7 weeks of gestation. It has been shown that cell free fetal DNA analysis could offer highly accurate assessment of fetal genotype and chromosomal makeup for some applications. Thus, cell free fetal DNA analysis has been incorporated as a part of prenatal screening programs for the prenatal management of sex-linked and sex-associated diseases, rhesus D incompatibility as well as the prenatal detection of Down's syndrome. Cell free fetal DNA analysis may lead to a change in the way prenatal assessments are made. PMID:22198255

  2. Effect of weld residual stress analysis variables on fitness-for-service assessment

    International Nuclear Information System (INIS)

    In this study, the existing residual stress analysis techniques are reviewed within the context of Fitness-For-Service (FFS) assessment. It should be recognized that the detailed residual stress evolution is an extremely complicated phenomenon that typically involves material-specific thermo-mechanical/metallurgical response, welding process physics, and structural interactions within a component being welded. As a result, computational procedures can vary significantly, from highly complicated numerical techniques intended only to elucidate a small part of the process physics to cost-effective procedures that are deemed adequate for capturing some of the important features in a final residual stress distribution. Residual stress estimation techniques for FFS purposes belong to the latter category. With this in mind, both the adequacy of residual stress analysis techniques and the effect of residual stress analysis variables on FFS are assessed based on both literature data and analyses performed in this investigation

  3. Assessment of dietary patterns in nutritional epidemiology: principal component analysis compared with confirmatory factor analysis.

    OpenAIRE

    Varraso, Raphaëlle; Garcia-Aymerich, Judith; Monier, Florent; Le Moual, Nicole; de Batlle, Jordi; Miranda, Gemma; Pison, Christophe,; Romieu, Isabelle; Kauffmann, Francine; Maccario, Jean

    2012-01-01

    BACKGROUND: In the field of nutritional epidemiology, principal component analysis (PCA) has been used to derive patterns, but the robustness of interpretation might be an issue when the sample size is small. The authors proposed the alternative use of confirmatory factor analysis (CFA) to define such patterns. OBJECTIVE: The aim was to compare dietary patterns derived through PCA and CFA used as equivalent approaches in terms of stability and relevance. DESIGN: PCA and CFA were performed in ...

  4. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    Science.gov (United States)

    Brignon, Jean-Marc

    2011-07-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and, more recently, new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). Insofar as REACh and other EU legislation come to apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key to understanding how a nanomaterial "socially" performs in comparison with its alternatives. "Industrial economics" methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCA is increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a "pragmatic regulatory impact analysis" that uses a cost/benefit framework but remains open to other disciplines than economics, and open to the participation of stakeholders in the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is "pragmatic" in the sense that it is driven by the purpose of assessing "what happens" with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality to handle the novelty of

  5. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    International Nuclear Information System (INIS)

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and, more recently, new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). Insofar as REACh and other EU legislation come to apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key to understanding how a nanomaterial 'socially' performs in comparison with its alternatives. 'Industrial economics' methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCA is increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a 'pragmatic regulatory impact analysis' that uses a cost/benefit framework but remains open to other disciplines than economics, and open to the participation of stakeholders in the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is 'pragmatic' in the sense that it is driven by the purpose of assessing 'what happens' with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality to handle the novelty of

  6. Bibliometric analysis of global environmental assessment research in a 20-year period

    International Nuclear Information System (INIS)

    Based on the samples of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. A frequently used keywords analysis found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (i.e., environmental indicator selection, life cycle assessment, etc.), along with new technologies and methods (i.e., the geographic information system and modeling) have been widely applied in the EA research field over the past 20 years. Hot spots such as “biodiversity” and “climate change” have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field
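    The h-index mentioned above is simple to compute: it is the largest h such that h publications each have at least h citations.

    ```python
    def h_index(citations):
        """h-index: largest h such that h papers have >= h citations each."""
        cites = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(cites, start=1):
            if c >= rank:       # this paper still supports an h of `rank`
                h = rank
            else:
                break
        return h

    print(h_index([10, 8, 5, 4, 3]))   # 4
    print(h_index([25, 8, 5, 3, 3]))   # 3
    ```

    As the second example shows, one highly cited paper does not raise the index, which is why it is used as a breadth-sensitive measure of research quality across countries.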

  7. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F.; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change assessed was based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional, and easy to understand and communicate index to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  8. Bibliometric analysis of global environmental assessment research in a 20-year period

    Energy Technology Data Exchange (ETDEWEB)

    Li, Wei, E-mail: weili@bnu.edu.cn; Zhao, Yang

    2015-01-15

    Based on the samples of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. A frequently used keywords analysis found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (i.e., environmental indicator selection, life cycle assessment, etc.), along with new technologies and methods (i.e., the geographic information system and modeling) have been widely applied in the EA research field over the past 20 years. Hot spots such as “biodiversity” and “climate change” have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field.

  9. Sustainability assessment of nuclear power: Discourse analysis of IAEA and IPCC frameworks

    International Nuclear Information System (INIS)

    Highlights: • Sustainability assessments (SAs) are methodologically precarious. • Discourse analysis reveals how the meaning of sustainability is constructed in SAs. • Discourse analysis is applied on the SAs of nuclear power of IAEA and IPCC. • For IAEA ‘sustainable’ equals ‘complying with best international practices’. • The IAEA framework largely inspires IPCC Fifth Assessment Report. - Abstract: Sustainability assessments (SAs) are methodologically precarious. Value-based judgments inevitably play a role in setting the scope of the SA, selecting assessment criteria and indicators, collecting adequate data, and developing and using models of considered systems. Discourse analysis can reveal how the meaning and operationalization of sustainability is constructed in and through SAs. Our discourse-analytical approach investigates how sustainability is channeled from ‘manifest image’ (broad but shallow), to ‘vision’, to ‘policy targets’ (specific and practical). This approach is applied on the SA frameworks used by IAEA and IPCC to assess the sustainability of the nuclear power option. The essentially problematic conclusion is that both SA frameworks are constructed in order to obtain answers that do not conflict with prior commitments adopted by the two institutes. For IAEA ‘sustainable’ equals ‘complying with best international practices and standards’. IPCC wrestles with its mission as a provider of “policy-relevant and yet policy-neutral, never policy-prescriptive” knowledge to decision-makers. IPCC avoids the assessment of different visions on the role of nuclear power in a low-carbon energy future, and skips most literature critical of nuclear power. The IAEA framework largely inspires IPCC AR5

  10. Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    Directory of Open Access Journals (Sweden)

    Parmanto Bambang

    2008-06-01

    Full Text Available Abstract Background Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable the management and analysis of spatial data, but have limitations in analyzing numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for CHA data analysis. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurements included: task completion time, success in answering the tasks, and system satisfaction. Results Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed significantly (α = .01) from SPSS-GIS in both satisfaction and task completion time. Conclusion Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than with SPSS-GIS.

  11. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    Science.gov (United States)

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd. PMID:26928768

  12. Contribution of Radon Exposure to the Risk of Lung Cancer Assessed by Applying a Multifactor Analysis

    International Nuclear Information System (INIS)

    Full text: It is well known that many different carcinogenic risk factors contribute to the development of lung cancer. Smoking, occupational exposure to carcinogens, chronic lung diseases and industrial air pollution are the most significant of them. Indoor radon exposure is considered a weak carcinogenic risk factor in the Urals, Russia. A drawback of traditionally applied monofactor methods of epidemiologic analysis is that different carcinogenic risk factors are analysed separately and their combined effect on public health is not taken into account. With such an approach it is impossible to adequately assess each factor's contribution to the total carcinogenic risk. For this reason it is expedient to apply methods of multifactor analysis in epidemiologic studies. We applied mathematical methods of pattern recognition when analysing the effect of indoor radon exposure on the development of lung cancer in the population of two Ural cities. We assessed the association between radon/thoron exposures and lung cancer using both the BEIR VI model and the above-mentioned methods. The results were significantly different. According to the BEIR VI model the contribution of radon/thoron to the risk of lung cancer varied from 7.2% to 33%, whereas this contribution assessed in the multifactor analysis was only 0.5%. We think that the contribution assessed with the BEIR VI model is overestimated. Our considerable experience in conducting epidemiologic studies using mathematical methods of pattern recognition gives us grounds to state that the assessment of the effect of radon/thoron exposure on lung cancer risk obtained in the multifactor analysis is more adequate and precise. (author)

  13. Application of probabilistic safety assessment in CPR1000 severe accident prevention and mitigation analysis

    International Nuclear Information System (INIS)

    The relationship between probabilistic safety assessment (PSA) and severe accident study was discussed, and how to apply PSA to severe accident prevention and mitigation was elaborated. PSA can identify plant vulnerabilities in severe accident prevention and mitigation, and modifications or improvements addressing these vulnerabilities can then be put forward. PSA can also assess the effectiveness of these actions to support decision-making. Based on a CPR1000 unit severe accident analysis, an example of the process and method of using PSA to enhance severe accident prevention and mitigation capability is set forth. (authors)

  14. Methods and considerations for the analysis and standardization of assessing muscle sympathetic nerve activity in humans.

    Science.gov (United States)

    White, Daniel W; Shoemaker, J Kevin; Raven, Peter B

    2015-12-01

    The technique of microneurography and the assessment of muscle sympathetic nerve activity (MSNA) are used in laboratories throughout the world. The variables used to describe MSNA, and the criteria by which these variables are quantified from the integrated neurogram, vary among studies and laboratories and, therefore, can become confusing to those starting to learn the technique. Therefore, the purpose of this educational review is to discuss guidelines and standards for the assessment of sympathetic nervous activity through the collection and analysis of MSNA. This review will reiterate common practices in the collection of MSNA, but will also introduce considerations for the evaluation and physiological inference using MSNA. PMID:26299824

  15. An Analysis Of Tensile Test Results to Assess the Innovation Risk for an Additive Manufacturing Technology

    Directory of Open Access Journals (Sweden)

    Adamczak Stanisław

    2015-03-01

    Full Text Available The aim of this study was to assess the innovation risk for an additive manufacturing process. The analysis was based on the results of static tensile tests obtained for specimens made of photocured resin. The assessment involved analyzing the measurement uncertainty by applying the FMEA method. The structure of the causes and effects of the discrepancies was illustrated using the Ishikawa diagram. The risk priority numbers were calculated. The uncertainty of the tensile test measurement was determined for three printing orientations. The results suggest that the material used to fabricate the tensile specimens shows clear anisotropy of the properties in relation to the printing direction.
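The risk priority numbers referred to above are the product of FMEA's three ordinal ratings; a minimal sketch (the example ratings are invented, not the paper's values):

```python
def risk_priority_number(severity, occurrence, detection):
    """FMEA RPN = S x O x D, each rated on a 1-10 ordinal scale (max 1000)."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("each rating must be between 1 and 10")
    return severity * occurrence * detection

# Hypothetical discrepancy cause: high severity, moderate occurrence,
# fairly easy to detect
print(risk_priority_number(7, 4, 3))  # → 84
```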

  16. RELAP5/MOD2 overview and developmental assessment results from TMI-1 plant transient analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lin, J.C.; Tsai, C.C.; Ransom, V.H.; Johnsen, G.W.

    1984-01-01

    RELAP5/MOD2 is a new version of the RELAP5 thermal-hydraulic computer code containing improved modeling features that provide a generic capability for pressurized water reactor transient simulation. Objective of this paper is to provide code users with an overview of the code and to report developmental assessment results obtained from a Three Mile Island Unit One plant transient analysis. The assessment shows that the injection of highly subcooled water into a high-pressure primary coolant system does not cause unphysical results or pose a problem for RELAP5/MOD2.

  17. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    Science.gov (United States)

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land use, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. The Urban Flood Simulation Model (UFSM) and the Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, with the flood risk reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing the changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation. PMID:27527202
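The S-shaped return-period–damage (R-D) curve and the resulting risk can be sketched as follows: risk is taken as expected annual damage, i.e. damage integrated over the annual exceedance probability p = 1/T. The logistic parameters below are invented placeholders, not the fitted Pudong curve:

```python
import math

def damage(T, d_max=100.0, T_mid=66.0, k=0.08):
    """Hypothetical S-shaped (logistic) return-period-damage curve."""
    return d_max / (1.0 + math.exp(-k * (T - T_mid)))

def expected_annual_damage(damage_fn, return_periods):
    """Trapezoidal integration of damage over exceedance probability p = 1/T."""
    pts = sorted((1.0 / T, damage_fn(T)) for T in return_periods)
    return sum(0.5 * (d0 + d1) * (p1 - p0)
               for (p0, d0), (p1, d1) in zip(pts, pts[1:]))

T_grid = [2, 5, 10, 20, 50, 66, 100, 200]
risk_without = expected_annual_damage(damage, T_grid)
# A flood control measure would be a second, lower damage curve; the benefit
# is the difference between the two integrals.
```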

  18. Bioimpedance harmonic analysis as a tool to simultaneously assess circulation and nervous control

    International Nuclear Information System (INIS)

    Multicycle harmonic (Fourier) analysis of bioimpedance was employed to simultaneously assess circulation and neural activity in visceral (rat urinary bladder) and somatic (human finger) organs. The informative value of the first cardiac harmonic of the bladder impedance as an index of bladder circulation is demonstrated. The individual reactions of normal and obstructive bladders in response to infusion cystometry were recorded. The potency of multicycle harmonic analysis of bioimpedance to assess sympathetic and parasympathetic neural control in urinary bladder is discussed. In the human finger, bioimpedance harmonic analysis revealed three periodic components at the rate of the heart beat, respiration and Mayer wave (0.1 Hz), which were observed under normal conditions and during blood flow arrest in the hand. The revealed spectrum peaks were explained by the changes in systemic blood pressure and in regional vascular tone resulting from neural vasomotor control. During normal respiration and circulation, two side cardiac peaks were revealed in a bioimpedance amplitude spectrum, whose amplitude reflected the depth of amplitude respiratory modulation of the cardiac output. During normal breathing, the peaks corresponding to the second and third cardiac harmonics were split, reflecting frequency respiratory modulation of the heart rate. Multicycle harmonic analysis of bioimpedance is a novel potent tool to examine the interaction between the respiratory and cardiovascular system and to simultaneously assess regional circulation and neural influences in visceral and somatic organs
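The cardiac, respiratory and Mayer-wave components described above appear as peaks of a discrete Fourier spectrum; a minimal sketch on a synthetic signal (the frequencies and amplitudes are invented for illustration, not recorded bioimpedance):

```python
import numpy as np

fs = 100.0                        # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)      # 60 s record -> 1/60 Hz resolution

# Synthetic "bioimpedance": respiratory (0.25 Hz) + cardiac (1.2 Hz) components
signal = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.4 * np.sin(2 * np.pi * 1.2 * t)

amplitude = np.abs(np.fft.rfft(signal)) / len(t)   # one-sided spectrum
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# The two dominant peaks (DC bin excluded) recover the component frequencies
top = np.argsort(amplitude[1:])[::-1][:2] + 1
peaks = sorted(freqs[top])        # → [0.25, 1.2]
```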

  20. Application of Item Analysis to Assess Multiple-Choice Examinations in the Mississippi Master Cattle Producer Program

    Science.gov (United States)

    Parish, Jane A.; Karisch, Brandi B.

    2013-01-01

    Item analysis can serve as a useful tool in improving multiple-choice questions used in Extension programming. It can identify gaps between instruction and assessment. An item analysis of Mississippi Master Cattle Producer program multiple-choice examination responses was performed to determine the difficulty of individual examinations, assess the…
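Two standard item-analysis statistics, difficulty (proportion correct) and upper-minus-lower discrimination, can be sketched as follows (the response data are invented; the program's actual scoring may differ):

```python
def item_difficulty(item_correct):
    """Proportion answering the item correctly (0 = very hard, 1 = very easy)."""
    return sum(item_correct) / len(item_correct)

def discrimination_index(item_correct, total_scores, frac=0.27):
    """D = p_upper - p_lower over the top and bottom `frac` of examinees."""
    n = max(1, round(frac * len(total_scores)))
    ranked = sorted(range(len(total_scores)),
                    key=lambda i: total_scores[i], reverse=True)
    upper, lower = ranked[:n], ranked[-n:]
    p_upper = sum(item_correct[i] for i in upper) / n
    p_lower = sum(item_correct[i] for i in lower) / n
    return p_upper - p_lower

# Six hypothetical examinees: 1 = item answered correctly, 0 = missed
item = [1, 1, 1, 0, 0, 0]
scores = [48, 45, 40, 22, 20, 15]   # total exam scores
print(item_difficulty(item))               # → 0.5
print(discrimination_index(item, scores))  # → 1.0
```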

  1. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Science.gov (United States)

    2012-02-06

    On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment the draft NUREG, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research.''

  2. Tolerability of risk, safety assessment principles and their implications for probabilistic safety analysis

    International Nuclear Information System (INIS)

    This paper gives a regulatory view of probabilistic safety assessment as seen by the Nuclear Installations Inspectorate (NII) and in the light of the general regulatory risk aims set out in the Health and Safety Executive's (HSE) The tolerability of risk from nuclear power stations (TOR) and in Safety assessment principles for nuclear plants (SAPs), prepared by NII on behalf of the HSE. Both of these publications were revised and republished in 1992. This paper describes the SAPs, together with the historical background, the motivation for review, the effects of the Sizewell and Hinkley Point C public inquiries, changes since the original versions, comparison with international standards and use in assessment. For new plant, probabilistic safety analysis (PSA) is seen as an essential tool in balancing the safety of the design and in demonstrating compliance with TOR and the SAPs. (Author)

  3. Application of data analysis techniques to nuclear reactor systems code accuracy assessment

    International Nuclear Information System (INIS)

    An automated code assessment program (ACAP) has been developed by the authors to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. This software was developed under subcontract to the United States Nuclear Regulatory Commission for use in its NRS code consolidation efforts. In this paper, background on the topic of NRS accuracy and uncertainty assessment is provided which motivates the development of and defines basic software requirements for ACAP. A survey of data analysis techniques was performed, focusing on the applicability of methods in the construction of NRS code-data comparison measures. The results of this review process, which further defined the scope, user interface and process for using ACAP are also summarized. A description of the software package and several sample applications to NRS data sets are provided. Its functionality and ability to provide objective accuracy assessment figures are demonstrated. (author)

  4. Analysis of uncertainties in alpha-particle optical-potential assessment below the Coulomb barrier

    CERN Document Server

    Avrigeanu, V

    2016-01-01

    Background: Recent high-precision measurements of alpha-induced reaction data below the Coulomb barrier have pointed out questions of the alpha-particle optical-model potential (OMP) which are still open within various mass ranges. Purpose: The applicability of a previous optical potential and eventual uncertainties and/or systematic errors of the OMP assessment at low energies can be further considered on this basis. Method: Nuclear model parameters based on the analysis of recent independent data, particularly gamma-ray strength functions, have been involved within statistical model calculations of the (alpha,x) reaction cross sections. Results: The above-mentioned potential provides a consistent description of the recent alpha-induced reaction data with no empirical rescaling factors of the gamma and/or nucleon widths. Conclusions: A suitable assessment of the alpha-particle optical potential below the Coulomb barrier should involve the statistical-model parameters beyond this potential on the basis of a former analysis.

  5. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N; Sprenger, Richard Remko

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies use peptide-centric analytical methods and thus strongly rely on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies, we employed both standard qualitative and data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein … protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for …

  6. Safety Analysis and Risk Assessment Handbook, new guidance to the safety analyst

    International Nuclear Information System (INIS)

    New guidance to the safety analyst has been developed at the Rocky Flats Environmental Technology Site (RFETS) in the form of the new Safety Analysis and Risk Assessment Handbook (SARAH). Although the older guidance (the Rocky Flats Risk Assessment Guide) continues to be used for updating the Final Safety Analysis Reports (FSARs) developed in the mid-1980s, this new guidance is used with all new authorization basis documents. With the RFETS mission change in the early 1990s came the need to establish new authorization basis documents for its facilities, whose missions had changed. The methodology and databases for performing the evaluations that support the new authorization basis documents needed to be standardized, to avoid the use of different approaches and/or databases for similar accidents in different facilities. This paper presents this new standardized approach, the SARAH

  7. Use of simulated patients and reflective video analysis to assess occupational therapy students' preparedness for fieldwork.

    Science.gov (United States)

    Giles, Amanda K; Carson, Nancy E; Breland, Hazel L; Coker-Bolt, Patty; Bowman, Peter J

    2014-01-01

    Educators must determine whether occupational therapy students are adequately prepared for Level II fieldwork once they have successfully completed the didactic portion of their coursework. Although studies have shown that students regard the use of video cameras and simulated patient encounters as useful tools for assessing professional and clinical behaviors, little has been published in the occupational therapy literature regarding the practical application of simulated patients or reflective video analysis. We describe a model for a final Comprehensive Practical Exam that uses both simulated patients and reflective video analysis to assess student preparedness for Level II fieldwork, and we report on student perceptions of these instructional modalities. We provide recommendations for designing, implementing, and evaluating simulated patient experiences in light of existing educational theory. PMID:25397940

  8. Analysis of Parameters Assessment on Laminated Rubber-Metal Spring for Structural Vibration

    Science.gov (United States)

    Salim, M. A.; Putra, A.; Mansor, M. R.; Musthafah, M. T.; Akop, M. Z.; Abdullah, M. A.

    2016-02-01

    This paper presents an analysis of parameter assessment on a laminated rubber-metal spring (LR-MS) for vibrating structures. Three parameters were selected for the assessment: mass, Young's modulus and radius. Natural rubber material was used to develop the LR-MS model. Three analyses based on the selected parameters were then conducted on LR-MS performance: natural frequency, location of the internal resonance frequency, and transmissibility at internal resonance. Results of the analyses were plotted as frequency-domain graphs. The transmissibility of the laminated rubber-metal spring (LR-MS) changes as the parameter values are changed. These observations were compared with theory from the open literature, leading to the conclusion that the selected parameters affect the trends of LR-MS transmissibility.
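For a single-degree-of-freedom viscously damped isolator, the textbook idealization behind such transmissibility curves, T depends on the frequency ratio r = f/f_n and damping ratio ζ; a minimal sketch (the LR-MS itself is a laminated, multi-resonance element, so this is only the baseline model):

```python
import math

def transmissibility(r, zeta):
    """|T| = sqrt((1 + (2*zeta*r)^2) / ((1 - r^2)^2 + (2*zeta*r)^2)),
    with r = f / f_n the frequency ratio and zeta the damping ratio."""
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r ** 2) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

# Amplification near resonance, unity at r = sqrt(2), isolation above it
print(round(transmissibility(1.0, 0.05), 2))           # → 10.05
print(round(transmissibility(math.sqrt(2), 0.05), 2))  # → 1.0
```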

  9. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland;

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations: Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were … indicated relation primarily to German, Russian and American mixtures in Hammerfest, and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics.

  10. Fluctuation analysis-based risk assessment for respiratory virus activity and air pollution associated asthma incidence.

    Science.gov (United States)

    Liao, Chung-Min; Hsieh, Nan-Hung; Chio, Chia-Pin

    2011-08-15

    Asthma is a growing epidemic worldwide. Exacerbations of asthma have been associated with bacterial and viral respiratory tract infections and air pollution. We correlated asthma admission rates with fluctuations in respiratory virus activity and traffic-related air pollution, namely particulate matter with an aerodynamic diameter ≤ 10 μm (PM₁₀), nitrogen dioxide (NO₂), carbon monoxide (CO), sulfur dioxide (SO₂), and ozone (O₃). A probabilistic risk assessment framework was developed based on a detrended fluctuation analysis to predict future respiratory virus and air pollutant associated asthma incidence. Results indicated a strong association between asthma admission rate and influenza (r = 0.80). We concluded that fluctuation analysis based risk assessment provides a novel predictor of asthma incidence. PMID:21663946
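Detrended fluctuation analysis, the core of the framework above, can be sketched in a few lines: integrate the mean-removed series, detrend it in windows of size n, and read the scaling exponent α from the log-log slope of F(n). The white-noise input (α ≈ 0.5) is synthetic, not the admission data:

```python
import numpy as np

def dfa(x, scales):
    """Return fluctuation F(n) for each window size n (linear detrending)."""
    profile = np.cumsum(x - np.mean(x))      # integrated, mean-removed series
    F = []
    for n in scales:
        n_seg = len(profile) // n
        segments = profile[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segments]
        F.append(np.sqrt(np.mean(np.square(resid))))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                # uncorrelated noise
scales = np.array([8, 16, 32, 64, 128])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]   # ≈ 0.5 for white noise
```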

  11. Reliability assessment of generation and transmission systems using fault-tree analysis

    International Nuclear Information System (INIS)

    This paper presents a method that integrates deterministic approach with fault-tree analysis for reliability assessment of a composite system (generation and transmission in power systems). The contingency screening is conducted in the first step. The results are further classified into three clusters in the second step: normal, local trouble and system trouble. The fault-tree analysis is used to assess the reliability of the composite system in the third step. Finally, Risk Reduction Worth is adopted as a measure of importance for identifying the crucial element that has significant impact on the reliability. In this paper, a composite system in Taiwan serves as an example for illustrating the simulation results attained by the proposed method. The simulation results, verified by Siemens PTI PSS/E TPLAN software package, show that the proposed method is applicable for large scale power systems.
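Risk Reduction Worth, the importance measure adopted above, can be sketched with minimal cut sets under the rare-event approximation (the cut sets and unavailabilities below are invented, not the studied system's):

```python
from math import prod

def system_unavailability(cut_sets, q):
    """Rare-event approximation: sum over minimal cut sets of the product
    of basic-event unavailabilities."""
    return sum(prod(q[e] for e in cs) for cs in cut_sets)

def risk_reduction_worth(cut_sets, q, event):
    """RRW = R_base / R(event made perfectly reliable, q = 0)."""
    base = system_unavailability(cut_sets, q)
    perfect = dict(q, **{event: 0.0})
    return base / system_unavailability(cut_sets, perfect)

# Hypothetical composite system: element A alone fails it, or B and C together
cut_sets = [("A",), ("B", "C")]
q = {"A": 1e-3, "B": 1e-2, "C": 1e-2}
print(round(risk_reduction_worth(cut_sets, q, "A"), 6))  # → 11.0, A is crucial
```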

  12. Assessment of valve actuator motor rotor degradation by Fourier analysis of current waveform

    International Nuclear Information System (INIS)

    This paper presents a test report of a motor diagnostic system that uses Fourier Analysis of the motor current waveform to detect broken rotor bars in the motor or defects in the driven equipment. The test was conducted on a valve actuator motor driving a valve actuator that was in turn driving a dynamometer to measure the actuator torque output. The motor was gradually degraded by open circuiting rotor bars. The test confirmed the efficacy of the waveform analysis method for assessing motor rotor degradation and also provided data regarding the change in waveform characteristic as motor rotors are gradually degraded to failure
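Broken rotor bars classically appear as sidebands at f(1 ± 2s) around the supply frequency in the current spectrum; a minimal sketch on a synthetic current waveform (supply frequency, slip and sideband level are invented for illustration):

```python
import numpy as np

f_line, slip = 60.0, 0.03            # supply frequency (Hz), per-unit slip
fs = 2000.0                          # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)         # 10 s capture -> 0.1 Hz resolution

f_lo = f_line * (1 - 2 * slip)       # 56.4 Hz lower sideband
f_hi = f_line * (1 + 2 * slip)       # 63.6 Hz upper sideband
current = (np.sin(2 * np.pi * f_line * t)
           + 0.01 * np.sin(2 * np.pi * f_lo * t)
           + 0.01 * np.sin(2 * np.pi * f_hi * t))

amplitude = np.abs(np.fft.rfft(current)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amp_at(f):
    """Spectrum amplitude at the bin nearest frequency f."""
    return amplitude[np.argmin(np.abs(freqs - f))]

# Sideband-to-fundamental ratio in dB; rising toward 0 dB suggests degradation
ratio_db = 20 * np.log10(amp_at(f_lo) / amp_at(f_line))   # ≈ -40 dB here
```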

  13. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation, and passive RHRS through in-reactor HXs. These features defined the main trends in experimental investigations and verification efforts for the computer codes applied. The paper briefly reviews the performed experimental investigations of the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and results of its verification are given. An assessment of RELAP5/mod3 applicability for accident analysis in integral reactors is presented.

  14. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The appropriateness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to both the training and the test sets. The observed activity of the carboquinone derivatives proved to be normally distributed in every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, sustain a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
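The K-means step of such a generalized cluster analysis can be sketched as plain Lloyd iterations with Euclidean distance (the two-group toy data below stand in for the descriptor space; initial centers are passed explicitly to keep the run deterministic):

```python
import numpy as np

def kmeans(X, k, init, n_iter=100):
    """Lloyd's algorithm; assumes no cluster ever becomes empty."""
    centers = np.asarray(init, dtype=float)
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)              # nearest-center assignment
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):      # converged
            break
        centers = new_centers
    return labels, centers

# Two well-separated groups standing in for training/test compounds
X = np.array([[0.0, 0.1], [0.2, 0.0], [0.1, 0.2],
              [10.0, 9.9], [9.8, 10.1], [10.2, 10.0]])
labels, centers = kmeans(X, 2, init=X[[0, 3]])
```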

  15. Feasibility analysis of widely accepted indicators as key ones in river health assessment

    Institute of Scientific and Technical Information of China (English)

    FENG Yan; KANG Bin; YANG Liping

    2012-01-01

    Index systems for river health assessment are difficult to use in practice because the indicators adopted are complex and highly specialized. In this paper, key indicators suitable for general river health assessment were selected, based on an analysis of 45 assessment index systems comprising 902 variables, drawn from about 150 papers and documents published in 1972-2010. According to the fields covered by the variables, they were divided into four groups: habitat condition, water environment, biotic status and water utilization. After the variables were combined into indicators, the number of times each indicator was adopted and its degree of acceptance in the above systems were calculated, and widely accepted indicators reflecting different aspects of river condition were selected as candidate key indicators. Based on correlation analysis among the candidate key indicators, 8 indicators were finally suggested as the key indicators for assessing river health: coverage rate of riparian vegetation, reserved rate of wetland, river continuity, the changing rate of water flow, the ratio of reaching the water quality standard, fish index of biotic integrity, the ratio of water utilization, and land use.

  16. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: current activities - 295

    International Nuclear Information System (INIS)

    The expert group (EG) on Uncertainty Analysis for Criticality Safety Assessment (UACSA) was established within the OECD/NEA Working Party on Nuclear Criticality Safety in December 2007 to promote exchange of information on related topics; compare methods and software tools for uncertainty analysis; test their performance; and assist in selection/development of safe and efficient methodologies. At the current stage, the work of the group is focused on approaches for validation of criticality calculations. With the diversity of the approaches to validate criticality calculations, a thorough description of each approach and assessment of its performance is useful to the criticality safety community. Developers, existing and potential practitioners as well as reviewers of assessments using those approaches should benefit from this effort. Exercise Phase I was conducted in order to illustrate predictive capabilities of criticality validation approaches, which include similarity assessment, definition of keff bias and bias uncertainty, and selection of benchmarks. The approaches and results of the exercises will be thoroughly documented in a pending state-of-the-art report from the EG. This paper provides an overview of current and future activities for the EG, a summary of the participant-contributed validation approaches, and a synthesis of the results for the exercises. (authors)

  17. Performance Assessment and Optimization of Biomass Steam Turbine Power Plants by Data Envelopment Analysis

    OpenAIRE

    Nattanin Ueasin; Anupong Wongchai; Sakkarin Nonthapot

    2015-01-01

    As rice husk is an abundant natural resource in Thailand, it has been used as a biomass energy resource in steam turbine power plants, in particular by very small power producers (VSPPs). Rice-husk-fired VSPP plants are found in many regions of Thailand; however, their performance efficiency and optimization have never been assessed at any level. This study aimed to fill this gap by adopting the method of data envelopment analysis (DEA) to relatively measure the perf...

  18. Conference Analysis Report of Assessments on Defect and Damage for a High Temperature Structure

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyeong Yeon

    2008-11-15

    This report presents an analysis of state-of-the-art research trends on creep-fatigue damage, defect assessment of high-temperature structures, and the development of heat-resistant materials and their behavior at high temperature, based on papers presented at two international conferences: ASME PVP 2008, held in Chicago in July 2008, and CF-5 (5th International Conference on Creep, Fatigue and Creep-Fatigue), held in Kalpakkam, India in September 2008.

  19. Sustainability Assessment of Solid Waste Management in China: A Decoupling and Decomposition Analysis

    OpenAIRE

    Xingpeng Chen; Jiaxing Pang; Zilong Zhang; Hengji Li

    2014-01-01

    As the largest solid waste (SW) generator in the world, China is facing serious pollution issues induced by increasing quantities of SW. The sustainability assessment of SW management is very important for designing relevant policy for further improving the overall efficiency of solid waste management (SWM). By focusing on industrial solid waste (ISW) and municipal solid waste (MSW), the paper investigated the sustainability performance of SWM by applying decoupling analysis, and further iden...
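Decoupling analysis of the kind applied in the study is often operationalized with a decoupling elasticity (Tapio's formulation): the ratio of the growth rate of waste generation to the growth rate of GDP, with values below about 0.8 usually read as relative decoupling. A minimal sketch with invented figures, not the paper's Chinese SW and GDP data:

```python
def decoupling_elasticity(w0, w1, gdp0, gdp1):
    """Tapio decoupling elasticity e = (ΔW/W0) / (ΔGDP/GDP0)."""
    return ((w1 - w0) / w0) / ((gdp1 - gdp0) / gdp0)

# Hypothetical period: waste grows 6% while GDP grows 10%.
e = decoupling_elasticity(w0=300.0, w1=318.0, gdp0=50.0, gdp1=55.0)
print(f"decoupling elasticity = {e:.2f}")  # < 0.8 -> relative decoupling
```

Decomposition analysis (e.g. LMDI) would then attribute the change in waste to drivers such as scale, structure, and intensity effects; only the elasticity step is shown here.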

  20. Risk assessment and analysis of the M109 family of vehicles Fleet Management Pilot Program

    OpenAIRE

    Hitz, Stephen E

    1997-01-01

    Approved for public release; distribution is unlimited. The purpose of this thesis is to conduct a risk assessment and analysis for the M109 155mm Self-Propelled Howitzer (SPH) Fleet Management Pilot Program. The objective of this program is to reengineer the fleet's logistical support system by outsourcing those functions that make sense and that can be performed more efficiently by private industry. This innovative approach places one contractor, or Fleet Manager, in charge of sustainin...

  1. Authorship Bias in Violence Risk Assessment? A Systematic Review and Meta-Analysis

    OpenAIRE

    Jay P Singh; Martin Grann; Seena Fazel

    2013-01-01

    Various financial and non-financial conflicts of interests have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessmen...

  2. Carbon and Energy Footprints of Prefabricated Industrial Buildings: A Systematic Life Cycle Assessment Analysis

    OpenAIRE

    Emanuele Bonamente; Franco Cotana

    2015-01-01

    A systematic analysis of greenhouse gas emissions (carbon footprint) and primary energy consumption (energy footprint) of prefabricated industrial buildings during their entire life cycle is presented. The life cycle assessment (LCA) study was performed in a cradle-to-grave approach: site-specific data from an Italian company, directly involved in all the phases from raw material manufacturing to in-situ assembly, were used to analyze the impacts as a function of different design choices. F...

  3. Energy flow analysis and assessment of energy saving potentials in a research and development plant

    International Nuclear Information System (INIS)

    This thesis is concerned with an energy flow analysis to assess the energy demand situation at the research and development plant AVC in Austria. The possibilities for using energy resources more effectively are investigated. The economics of electricity distribution are examined, estimates of the energy demand are made, and comparisons with other firms are drawn. The consequences of better scopes of negotiation are worked out. (Suda)

  4. Assessing the Performance of a Classification-Based Vulnerability Analysis Model

    OpenAIRE

    Wang, Tai-Ran; Mousseau, Vincent; Pedroni, Nicola; Zio, Enrico

    2015-01-01

    In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment ...

  5. Evaluating research assessment: metrics-based analysis exposes implicit bias in REF2014 results

    OpenAIRE

    Dix, Alan

    2016-01-01

    The recent UK research assessment exercise, REF2014, attempted to be as fair and transparent as possible. However, Alan Dix, a member of the computing sub-panel, reports how a post-hoc analysis of public domain REF data reveals substantial implicit and emergent bias in terms of discipline sub-areas (theoretical vs applied), institutions (Russell Group vs post-1992), and gender. While metrics are generally recognised as flawed, our human processes may be uniformly worse.

  6. Uncertainty and sensitivity analysis in quantitative pest risk assessments; practical rules for risk assessors

    OpenAIRE

    David Makowski

    2013-01-01

    Quantitative models have several advantages compared to qualitative methods for pest risk assessments (PRA). Quantitative models do not require the definition of categorical ratings and can be used to compute numerical probabilities of entry and establishment, and to quantify spread and impact. These models are powerful tools, but they include several sources of uncertainty that need to be taken into account by risk assessors and communicated to decision makers. Uncertainty analysis (UA) and ...

  7. Designing student peer assessment in higher education: Analysis of written and oral peer feedback

    OpenAIRE

    Van den Berg, I.; Admiraal, W.; Pilot, A.

    2006-01-01

    Relating it to design features, the present article describes the nature of written and oral peer feedback as it occurred in seven writing courses, each with a different PA design. Results indicate that, in general, peer feedback in all courses focused on evaluation, which is only one of the four feedback functions. Feedback on structure was hardly provided. As it turned out, the differences be...

  8. Suggestions to improve oil shale industry water management basing on inventory analysis of life cycle assessment

    International Nuclear Information System (INIS)

    Principles of Life Cycle Assessment (LCA) are applied to the investigation of the Estonian oil shale energy production system. The energy produced at the smallest of Estonia's thermal power plants is studied as a product. A brief description of the inventory analysis and the material balance for this product system is presented. A possible change in the water management of the investigated life cycle is discussed. Implementing these suggestions would diminish the pressure on local water resources in an economically effective way. (author)

  9. Conference Analysis Report of Assessments on Defect and Damage for a High Temperature Structure

    International Nuclear Information System (INIS)

    This report presents an analysis of state-of-the-art research trends on creep-fatigue damage, defect assessment of high-temperature structures, and the development of heat-resistant materials and their behavior at high temperature, based on papers presented at two international conferences: ASME PVP 2008, held in Chicago in July 2008, and CF-5 (5th International Conference on Creep, Fatigue and Creep-Fatigue), held in Kalpakkam, India in September 2008

  10. Analysis of tidal expiratory flow pattern in the assessment of histamine-induced bronchoconstriction.

    OpenAIRE

    Morris, M. J.; Madgwick, R. G.; Lane, D. J.

    1995-01-01

    BACKGROUND--There are times in clinical practice when it would be useful to be able to assess the severity of airways obstruction from tidal breathing. Three indices of airways obstruction derived from analysis of resting tidal expiratory flow have previously been described: (1) Tme/TE = time to reach maximum expiratory flow/expiratory time; (2) Krs = decay constant of exponential fitted to tidal expiratory flow versus time curve; and (3) EV = extrapolated volume--that is, area under the curv...

  11. Assessing the environmental impact of induction motors using manufacturer's data and life cycle analysis

    OpenAIRE

    Torrent Burgués, Marcel; Martínez Piera, Eusebio; Andrada Gascón, Pedro

    2012-01-01

    Herein is reported the development and testing of a life cycle analysis (LCA) procedure for assessing the environmental impact of induction motors. The operating conditions of a given industrial application are defined by the mechanical power required, the operating hours and the service life of the three-phase induction motor involved. Based mainly on manufacturer's data, different three-phase induction motors for various sets of operating conditions, including oversizing, have been selected. To quantify...

  12. Analysis on evaluation ability of nonlinear safety assessment model of coal mines based on artificial neural network

    Institute of Scientific and Technical Information of China (English)

    SHI Shi-liang; LIU Hai-bo; LIU Ai-hua

    2004-01-01

    Based on an analysis of the strengths and shortcomings of the various methods used in the safety assessment of coal mines, and considering the nonlinear features of mine safety sub-systems, this paper establishes a neural network assessment model of mine safety, analyzes the ability of an artificial neural network to evaluate the mine safety state, and lays the theoretical foundation for using artificial neural networks in the systematic optimization of mine safety assessment and for obtaining reasonably accurate safety assessment results.
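A minimal sketch of the forward pass of such a feedforward assessment model, mapping two normalized safety indicators to a score in (0, 1). The network shape, weights, and indicator values are invented for illustration; a real model would be trained on mine safety data rather than hand-set:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fixed weights of a tiny 2-2-1 network (hypothetical values).
W1 = [[1.5, -2.0], [-1.0, 2.5]]   # hidden-layer weights, one row per neuron
b1 = [0.1, -0.2]                  # hidden-layer biases
W2 = [2.0, -1.5]                  # output-layer weights
b2 = 0.05                         # output-layer bias

def safety_score(x):
    """Forward pass: two inputs -> two hidden neurons -> one score."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)

# e.g. normalized gas level 0.3, equipment condition index 0.8
score = safety_score([0.3, 0.8])
print(f"safety score = {score:.3f}")
```

Training (backpropagation over historical assessment cases) is what makes the weights meaningful; the forward pass above only shows how a trained model turns indicators into a nonlinear safety score.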

  13. ARCADO - Adding random case analysis to direct observation in workplace-based formative assessment of general practice registrars

    OpenAIRE

    Ingham, Gerard; Fry, Jennifer; Morgan, Simon; Ward, Bernadette

    2015-01-01

    Background Workplace-based formative assessments using consultation observation are currently conducted during the Australian general practice training program. Assessment reliability is improved by using multiple assessment methods. The aim of this study was to explore experiences of general practice medical educator assessors and registrars (trainees) when adding random case analysis to direct observation (ARCADO) during formative workplace-based assessments. Methods A sample of general pra...

  14. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    We consider here two basic groups of methods for the analysis and assessment of the human factor in NPPs and give some results from the analyses performed. The human factor covers human interaction with the designed equipment and with the working environment, taking into account human capabilities and limits. Within the qualitative methods for human factor analysis, concepts and structural methods for classifying information connected with the human factor are considered. Emphasis is given to the HPES method for human factor analysis in NPPs. Methods for the quantitative assessment of human reliability are then considered. These methods allow probabilities to be assigned to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)
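The quantification step in THERP-style methods can be illustrated by scaling a nominal human error probability (HEP) with performance shaping factors. The nominal value and the factors below are invented for illustration and are not taken from the THERP handbook tables:

```python
# Simplified THERP-style quantification: a nominal human error probability
# for a task is multiplied by performance shaping factors (PSFs).
nominal_hep = 0.003                 # hypothetical nominal HEP, per demand
psf = {
    "high stress": 2.0,            # degrading factor (> 1)
    "poor interface": 1.5,         # degrading factor (> 1)
    "good training": 0.8,          # improving factor (< 1)
}

hep = nominal_hep
for factor in psf.values():
    hep *= factor
print(f"adjusted HEP = {hep:.4f}")
```

In a full HRA the adjusted HEPs would feed the event trees of the PSA; this sketch shows only how structured qualitative judgements (the PSFs) become a quantitative reliability figure.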

  15. Sex Assessment Using the Femur and Tibia in Medieval Skeletal Remains from Ireland: Discriminant Function Analysis.

    Science.gov (United States)

    Novak, Mario

    2016-04-01

    Sex determination based on discriminant function analysis of skeletal measurements is probably the most effective method for the assessment of sex in archaeological and contemporary populations, although it suffers from limitations such as population specificity. In this paper, standards for sex assessment from the femur and tibia in the medieval Irish population are presented. Six femoral and six tibial measurements obtained from 56 male and 45 female skeletons were subjected to discriminant function analysis. Average accuracies obtained by this study range between 87.1 and 97%. The highest level of accuracy (97%) was achieved when using combined variables of the femur and tibia (maximum diameter of femoral head and circumference at tibial nutrient foramen), as well as two variables of the tibia (proximal epiphyseal breadth and circumference at nutrient foramen). Discriminant functions using a single variable provided accuracies between 87.1 and 96%, with the circumference at the level of the tibial nutrient foramen providing the best separation. The high accuracy rates obtained by this research correspond to the data recorded in other studies, confirming the importance of discriminant function analysis in the assessment of sex in both archaeological and forensic contexts. PMID:27301232
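The sectioning-point form of a univariate discriminant function, common in osteometric sex assessment, can be sketched as follows. The femoral head diameters below are invented for illustration and are not the Irish medieval measurements from the study:

```python
# Univariate discriminant with a sectioning point midway between the two
# group means; cases above the point classify as male, below as female.
male_fhd   = [47.1, 48.3, 46.5, 49.0, 47.8]   # femoral head diameter, mm
female_fhd = [41.2, 42.0, 40.7, 42.8, 41.5]   # (hypothetical samples)

mean_m = sum(male_fhd) / len(male_fhd)
mean_f = sum(female_fhd) / len(female_fhd)
sectioning_point = (mean_m + mean_f) / 2

def classify(x):
    return "male" if x > sectioning_point else "female"

# Resubstitution accuracy on the same toy sample.
correct = sum(classify(x) == "male" for x in male_fhd) \
        + sum(classify(x) == "female" for x in female_fhd)
accuracy = correct / (len(male_fhd) + len(female_fhd))
print(f"sectioning point = {sectioning_point:.2f} mm, accuracy = {accuracy:.0%}")
```

A multivariate discriminant function (as used for the combined femur-tibia variables) replaces the single measurement with a weighted sum of measurements, but the cutoff logic is the same, and the population-specificity caveat applies to the weights and sectioning point alike.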

  16. Life Cycle Assessment and Life Cycle Cost Analysis of Magnesia Spinel Brick Production

    Directory of Open Access Journals (Sweden)

    Aysun Özkan

    2016-07-01

    Sustainable use of natural resources in the production of construction materials has become a necessity both in Europe and Turkey. Construction products in Europe should have European Conformity (CE) and an Environmental Product Declaration (EPD), an independently verified and registered document in line with the European standard EN 15804. An EPD certificate can be created by performing a Life Cycle Assessment (LCA) study. In this particular work, an LCA study was carried out for refractory brick production for environmental assessment. In addition to the LCA, Life Cycle Cost (LCC) analysis was also applied for economic assessment. Firstly, a cradle-to-gate LCA was performed for one ton of magnesia spinel refractory brick. The CML IA method included in the licensed SimaPro 8.0.1 software was chosen to calculate impact categories (namely, abiotic depletion, global warming potential, acidification potential, eutrophication potential, human toxicity, ecotoxicity, ozone depletion potential, and photochemical oxidation potential). The LCC analysis was performed by developing a cost model for internal and external cost categories within the software. The results were supported by a sensitivity analysis. According to the results, the production of raw materials and the firing process in magnesia spinel brick production were found to have several negative effects on the environment and were costly.
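The characterization step behind impact categories such as global warming potential is a weighted sum of inventory flows. A minimal sketch with an invented inventory: the GWP100 factors are published IPCC AR5 values, but the flow quantities are hypothetical and not the study's SimaPro data:

```python
# LCIA characterization: each inventory flow (kg per functional unit) is
# multiplied by its characterization factor and the products are summed.
inventory = {"CO2": 950.0, "CH4": 1.2, "N2O": 0.05}   # kg / t brick (invented)
gwp100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}      # kg CO2-eq per kg (AR5)

gwp = sum(qty * gwp100[substance] for substance, qty in inventory.items())
print(f"Global warming potential: {gwp:.2f} kg CO2-eq per tonne of brick")
```

Each of the other categories (acidification, eutrophication, etc.) uses the same pattern with its own factor table; methods such as CML IA are, at their core, curated sets of these factors.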

  17. Scenario analysis for the postclosure assessment of the Canadian concept for nuclear fuel waste disposal

    International Nuclear Information System (INIS)

    AECL Research has developed and evaluated a concept for disposal of Canada's nuclear fuel waste involving deep underground disposal of the waste in intrusive igneous rock of the Canadian Shield. The postclosure assessment of this concept focusses on the effects on human health and the environment due to potential contaminant releases into the biosphere after the disposal vault is closed. Both radiotoxic and chemically toxic contaminants are considered. One of the steps in the postclosure assessment process is scenario analysis. Scenario analysis identifies factors that could affect the performance of the disposal system and groups these factors into scenarios that require detailed quantitative evaluation. This report documents a systematic procedure for scenario analysis that was developed for the postclosure assessment and then applied to the study of a hypothetical disposal system. The application leads to a comprehensive list of factors and a set of scenarios that require further quantitative study. The application also identifies a number of other factors and potential scenarios that would not contribute significantly to environmental and safety impacts for the hypothetical disposal system. (author). 46 refs., 3 tabs., 3 figs., 2 appendices

  18. Quality Assessment of Urinary Stone Analysis: Results of a Multicenter Study of Laboratories in Europe.

    Directory of Open Access Journals (Sweden)

    Roswitha Siener

    After stone removal, accurate analysis of urinary stone composition is the most crucial laboratory diagnostic procedure for treatment and recurrence prevention in the stone-forming patient. The most common techniques for routine analysis of stones are infrared spectroscopy, X-ray diffraction and chemical analysis. The aim of the present study was to assess the quality of urinary stone analysis in laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of laboratory quality in the present study was based on the attainment of 75% of the maximum total points, i.e. 99 points. The methods of stone analysis used were infrared spectroscopy (n = 7), chemical analysis (n = 1) and X-ray diffraction (n = 1). In the present study only 56% of the laboratories, four using infrared spectroscopy and one using X-ray diffraction, fulfilled the quality requirements. According to the current standard, chemical analysis is considered insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference spectra and qualification of the staff for an accurate analysis of stone composition. Regular quality control is essential in carrying out routine stone analysis.

  19. Standardization of domestic human reliability analysis and experience of human reliability analysis in probabilistic safety assessment for NPPs under design

    International Nuclear Information System (INIS)

    This paper introduces the background and development activities of the domestic standardization of procedures and methods for Human Reliability Analysis (HRA), intended to avoid, as far as possible, the intervention of the HRA analyst's subjectivity in Probabilistic Safety Assessment (PSA), and reviews the HRA results for domestic nuclear power plants under design studied by the Korea Atomic Energy Research Institute. We identify the HRA methods used in PSAs for domestic NPPs and discuss the subjectivity of the HRA analyst shown in performing an HRA. We also introduce the PSA guidelines published in the USA and review the HRA results based on them. We propose a system comprising a standard procedure and method for HRA to be developed

  20. Seeking Missing Pieces in Science Concept Assessments: Reevaluating the Brief Electricity and Magnetism Assessment through Rasch Analysis

    Science.gov (United States)

    Ding, Lin

    2014-01-01

    Discipline-based science concept assessments are powerful tools to measure learners' disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses.…

  1. Performance assessment of air quality monitoring networks using principal component analysis and cluster analysis

    International Nuclear Information System (INIS)

    This study aims to evaluate the performance of two statistical methods, principal component analysis and cluster analysis, for the management of the air quality monitoring network of Hong Kong and the reduction of associated expenses. The specific objectives are: (i) to identify city areas with similar air pollution behavior; and (ii) to locate emission sources. The statistical methods were applied to the mass concentrations of sulphur dioxide (SO2), respirable suspended particulates (RSP) and nitrogen dioxide (NO2) collected in the monitoring network of Hong Kong from January 2001 to December 2007. The results demonstrate that, for each pollutant, the monitoring stations are grouped into different classes based on their air pollution behavior. Stations located in nearby areas are characterized by the same specific air pollution characteristics, suggesting that the air quality monitoring system could be managed more effectively: redundant equipment could be transferred to other monitoring stations to allow further enlargement of the monitored area. Additionally, the existence of different air pollution behaviors in the monitoring network is explained by the variability of wind directions across the region. The results imply that the air quality problem in Hong Kong is not only a local problem arising mainly from street-level pollution, but also a regional problem originating from the Pearl River Delta region. (author)
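The idea of grouping stations with similar pollution behavior can be illustrated, in simplified form, by clustering stations on the correlation of their concentration time series. The study itself used PCA and formal cluster analysis; here the station names merely echo Hong Kong's network, and the weekly NO2 values are invented:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical weekly NO2 series (ug/m3): two urban stations move together,
# two roadside stations follow a different pattern.
series = {
    "Central":      [55, 60, 58, 62, 57, 61],
    "Sham Shui Po": [54, 59, 57, 63, 56, 60],
    "Roadside A":   [80, 70, 85, 68, 82, 71],
    "Roadside B":   [78, 72, 84, 69, 80, 73],
}

# Single-link grouping: a station joins a group if it correlates at r >= 0.8
# with any existing member; otherwise it starts a new group.
threshold = 0.8
groups = []
for name, s in series.items():
    for g in groups:
        if any(pearson(s, series[m]) >= threshold for m in g):
            g.append(name)
            break
    else:
        groups.append([name])
print(groups)
```

Stations that land in the same group are candidates for equipment consolidation, which is the management argument the study makes from its PCA/cluster results.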

  2. Component fragility analysis methodology for seismic risk assessment projects. Proven PSA safety document processing and assessment procedures

    International Nuclear Information System (INIS)

    The seismic risk assessment task should be structured as follows: (i) define all reactor unit building structures, components and equipment involved in the creation of an initiating event (IE) induced by a seismic event or contributing to the reliability of the reactor unit response to an IE; (ii) construct and estimate the fragility curves for the building and component groups under (i); (iii) determine the HCLPF for each group of buildings, components or equipment; (iv) determine the nuclear source's seismic resistance (SME) as the minimum HCLPF among the equipment groups in the risk-dominant scenarios; (v) define the group of components, equipment and building structures that limits the risk to the SME value; (vi) based on the fragility levels, identify component groups for which a more detailed fragility analysis is needed; and (vii) recommend groups of equipment or building structures that should be taken into account with respect to seismic risk, i.e. groups that exhibit low seismic resistance (HCLPF) and, at the same time, contribute significantly to the reactor unit's seismic risk (are present in the dominant risk scenarios). (P.A.)

  3. Risk Assessment Method for Offshore Structure Based on Global Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zou Tao

    2012-01-01

    Based on global sensitivity analysis (GSA), this paper proposes a new risk assessment method for offshore structure design. The method first quantifies the significance of all random variables and their parameters; by comparing their degrees of importance, minor factors can be neglected, simplifying the global uncertainty analysis work. Global uncertainty analysis (GUA) is an effective way to study the complexity and randomness of natural events. Since field-measured data and statistical results often carry unavoidable errors and uncertainties that lead to inaccurate prediction and analysis, the risk in the design stage of offshore structures caused by uncertainties in environmental loads, sea level, and marine corrosion must be taken into account. In this paper, the multivariate compound extreme value distribution model (MCEVD) is applied to predict the extreme sea state of wave, current, and wind. The maximum structural stress and deformation of a jacket platform are analyzed and compared under different design standards. The calculation results sufficiently demonstrate the rationality and safety of the new risk assessment method.
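First-order sensitivity indices of the kind used in GSA, S_i = Var(E[Y|X_i]) / Var(Y), can be estimated by a crude nested Monte Carlo loop. The toy response below, a weighted sum of normalized wave, current, and wind loads with invented coefficients, merely stands in for a real structural model:

```python
import random
import statistics as st

# Toy response: "stress" as a weighted sum of standardized load variables.
# Coefficients are invented; a real model would be the structural solver.
COEF = {"wave": 3.0, "current": 1.0, "wind": 0.5}

def model(x):
    return sum(COEF[k] * x[k] for k in COEF)

def first_order_index(target, n_outer=200, n_inner=200, seed=2):
    """Crude estimate of S_i = Var(E[Y | X_i]) / Var(Y) by nested sampling."""
    rng = random.Random(seed)
    cond_means, all_y = [], []
    for _ in range(n_outer):
        xi = rng.gauss(0, 1)              # fixed value of the target input
        ys = []
        for _ in range(n_inner):          # average over the other inputs
            x = {k: (xi if k == target else rng.gauss(0, 1)) for k in COEF}
            ys.append(model(x))
        cond_means.append(st.fmean(ys))
        all_y.extend(ys)
    return st.pvariance(cond_means) / st.pvariance(all_y)

indices = {k: first_order_index(k) for k in COEF}
print(indices)
```

For this linear model the exact indices are a_i^2 / (9 + 1 + 0.25), so wave dominates; in the paper's workflow, inputs with small indices are the "minor factors" dropped before the full uncertainty analysis. Production GSA would use a more efficient estimator (e.g. Sobol/Saltelli sampling) rather than this nested loop.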

  4. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Directory of Open Access Journals (Sweden)

    Zhi fang Zhou

    2016-02-01

    Purpose: Among the methods of comprehensive analysis for a product or an enterprise, traditional standard cost analysis and life cycle assessment methods have defects and deficiencies. For example, some methods emphasize only one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the internal logic relating these two factors. Design/methodology/approach: The model considers the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. The model studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis of the mutual logical relationships of product performance, value, resource consumption, and environmental load to reveal the symptoms and potentials in different dimensions. Originality/value: This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. To conclude, the availability of this evaluation and analysis model is verified using a Chinese SUV manufacturer as an example.

  5. Development of a computer tool to support scenario analysis for safety assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    In the 'H12 Project to Establishing Technical Basis for HLW Disposal in Japan', a systematic approach based on international consensus was adopted to develop the scenarios to be considered in performance assessment. The adequacy of the approach was, in general terms, appreciated through domestic and international peer review. However, it was also suggested that there were issues related to improving the transparency and traceability of the procedure. To achieve this, improvements to the scenario analysis method have been studied. In this study, based on an improved method for the treatment of FEP interactions, a computer tool to support scenario analysis by performance assessment specialists has been developed. The anticipated effects of this tool are to improve the efficiency of complex and time-consuming scenario analysis work and to reduce the possibility of human error in this work. The tool also makes it possible to describe interactions among a vast number of FEPs and the related information as an interaction matrix, and to analyze those interactions from a variety of perspectives. (author)

  6. Development of residual stress analysis procedure for fitness-for-service assessment of welded structure

    International Nuclear Information System (INIS)

    In this study, a state-of-the-art review of existing residual stress analysis techniques and representative solutions is presented in order to develop a residual stress analysis procedure for the fitness-for-service (FFS) assessment of welded structures. Critical issues associated with existing residual stress solutions and their treatment in performing FFS assessments are discussed. It should be recognized that detailed residual stress evolution is an extremely complicated phenomenon that typically involves material-specific thermomechanical/metallurgical response, welding process physics, and structural interactions within the component being welded. As a result, computational procedures can vary significantly, from highly complicated numerical techniques intended only to elucidate a small part of the process physics to cost-effective procedures that are deemed adequate for capturing some of the important features of a final residual stress distribution. Residual stress analysis procedures for FFS purposes belong to the latter category. With this in mind, both residual stress analysis techniques and their adequacy for FFS are assessed based on both literature data and the analyses performed in this investigation

  7. 'Rosatom' sites vulnerability analysis and assessment of their physical protection effectiveness. Methodology and 'tools'

    International Nuclear Information System (INIS)

    Full text: Enhancement of physical protection (PP) efficiency at nuclear sites (NS) of State Corporation (SC) 'Rosatom' is one of its priorities. This issue is reflected in a series of international and Russian documents. PP enhancement at the sites can be achieved through upgrades of both administrative procedures and the technical security system. However, in any case it is first necessary to identify the so-called 'objects of physical protection', that is, to answer the question of what we need to protect, and to identify design basis threats (DBT) and adversary models. Answers to these questions constitute the contents of papers on vulnerability analysis (VA) for nuclear sites. Further, it is necessary to answer the question of to what extent we protect these 'objects of physical protection' and the site as a whole; this is the essence of the assessment of physical protection effectiveness. In the process of effectiveness assessment at specific Rosatom sites, we assess the effectiveness of the existing physical protection system (PPS) and of the proposed options for its upgrade. This also makes it possible to select the optimal option based on a 'cost-efficiency' criterion. Implementation of this work is a mandatory requirement as defined in federal-level documents. In State Corporation 'Rosatom' there are methodologies in place for vulnerability analysis and effectiveness assessment, as well as 'tools' (methods, regulations, computer software) that make it possible to put the above work into practice. Corresponding regulations have been developed and approved by Rosatom senior management. Special software for PPS effectiveness assessment called 'Vega-2', developed by a specialized Rosatom subsidiary, State Enterprise 'Eleron', is designed to assess PPS effectiveness at fixed nuclear sites. It has been implemented at practically all major Rosatom nuclear sites.
As of now, the 'Vega-2' software has been certified and prepared for forwarding to the corporation's nuclear sites so

  8. Authorship bias in violence risk assessment? A systematic review and meta-analysis.

    Science.gov (United States)

    Singh, Jay P; Grann, Martin; Fazel, Seena

    2013-01-01

    Various financial and non-financial conflicts of interest have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessment literature by comparing predictive accuracy outcomes in studies authored by the individuals who designed these instruments with those of independent investigations. A systematic search from 1966 to 2011 was conducted using PsycINFO, EMBASE, MEDLINE, and US National Criminal Justice Reference Service Abstracts to identify predictive validity studies for the nine most commonly used risk assessment tools. Tabular data from 83 studies comprising 104 samples were collected, information on two-thirds of which was received directly from study authors for the review. Random effects subgroup analysis and metaregression were used to explore evidence of an authorship effect. We found a substantial and statistically significant authorship effect. Overall, studies authored by tool designers reported predictive validity findings around two times higher than those of investigations reported by independent authors (DOR=6.22 [95% CI=4.68-8.26] in designers' studies vs. DOR=3.08 [95% CI=2.45-3.88] in independent studies). As there was evidence of an authorship effect, we also examined disclosure rates. None of the 25 studies where tool designers or translators were also study authors published a conflict of interest statement to that effect, despite a number of journals requiring that potential conflicts be disclosed. The field of risk assessment would benefit from routine disclosure and registration of research studies. The extent to which similar conflicts of interest exist in those
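
The subgroup comparison described above rests on pooling study-level log diagnostic odds ratios and contrasting the two pooled estimates. As a rough illustration (not the authors' actual random-effects procedure), the sketch below uses simple inverse-variance fixed-effect pooling on hypothetical DOR values and standard errors:

```python
import math

def pool_log_dor(dors, ses):
    """Inverse-variance (fixed-effect) pooling of log diagnostic odds ratios.

    dors: per-study diagnostic odds ratios; ses: standard errors of log(DOR).
    Returns the pooled DOR and its 95% confidence interval.
    """
    w = [1.0 / se ** 2 for se in ses]           # inverse-variance weights
    est = sum(wi * math.log(d) for wi, d in zip(w, dors)) / sum(w)
    se = math.sqrt(1.0 / sum(w))                # SE of the pooled log-DOR
    return math.exp(est), (math.exp(est - 1.96 * se), math.exp(est + 1.96 * se))

# Hypothetical subgroups: designer-authored vs. independent studies
designer, _ = pool_log_dor([6.0, 6.5], [0.3, 0.4])
independent, _ = pool_log_dor([3.0, 3.2], [0.3, 0.4])
```

A random-effects pool would additionally inflate the standard errors with a between-study variance estimate (e.g. DerSimonian-Laird) before weighting.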

  9. Authorship bias in violence risk assessment? A systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jay P Singh

    Full Text Available Various financial and non-financial conflicts of interest have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessment literature by comparing predictive accuracy outcomes in studies authored by the individuals who designed these instruments with those of independent investigations. A systematic search from 1966 to 2011 was conducted using PsycINFO, EMBASE, MEDLINE, and US National Criminal Justice Reference Service Abstracts to identify predictive validity studies for the nine most commonly used risk assessment tools. Tabular data from 83 studies comprising 104 samples were collected, information on two-thirds of which was received directly from study authors for the review. Random effects subgroup analysis and metaregression were used to explore evidence of an authorship effect. We found a substantial and statistically significant authorship effect. Overall, studies authored by tool designers reported predictive validity findings around two times higher than those of investigations reported by independent authors (DOR=6.22 [95% CI=4.68-8.26] in designers' studies vs. DOR=3.08 [95% CI=2.45-3.88] in independent studies). As there was evidence of an authorship effect, we also examined disclosure rates. None of the 25 studies where tool designers or translators were also study authors published a conflict of interest statement to that effect, despite a number of journals requiring that potential conflicts be disclosed. The field of risk assessment would benefit from routine disclosure and registration of research studies. The extent to which similar conflicts of interest exist

  10. Complex health care interventions: Characteristics relevant for ethical analysis in health technology assessment

    Directory of Open Access Journals (Sweden)

    Lysdahl, Kristin Bakke

    2016-03-01

    Full Text Available Complexity entails methodological challenges in assessing health care interventions. In order to address these challenges, a series of characteristics of complexity have been identified in the Health Technology Assessment (HTA) literature. These characteristics are primarily identified and developed to facilitate effectiveness, safety, and cost-effectiveness analysis. However, ethics is also a constitutive part of HTA, and it does not follow that the conceptions of complexity that appear relevant for effectiveness, safety, and cost-effectiveness analysis are also relevant and directly applicable for ethical analysis in HTA. The objective of this article is therefore to identify and elaborate a set of key characteristics of complex health care interventions relevant for addressing ethical aspects in HTA. We start by investigating the relevance of the characteristics of complex interventions, as defined in the HTA literature. Most aspects of complexity found to be important when assessing effectiveness, safety, and efficiency turn out to be relevant as well when assessing ethical issues of a given health technology. However, the importance and relevance of the complexity characteristics may differ when addressing ethical issues rather than effectiveness. Moreover, the moral challenges of a health care intervention may themselves contribute to the complexity. After identifying and analysing existing conceptions of complexity, we synthesise a set of five key characteristics of complexity for addressing ethical aspects in HTA: (1) multiple and changing perspectives, (2) indeterminate phenomena, (3) uncertain causality, (4) unpredictable outcomes, and (5) ethical complexity. This may serve as an analytic tool in addressing ethical issues in HTA of complex interventions.

  11. Multi-criteria analysis as a tool for sustainability assessment of a waste management model

    International Nuclear Information System (INIS)

    To assess the sustainability of a waste management scenario with energy recovery, it is necessary to carry out an adequate analysis of all influential criteria. The main problem in the analysis is to determine indicators that clearly and fully capture the most important influential factors. A model for assessing the sustainability of waste treatment scenarios, based on the AHP (analytic hierarchy process) method of multi-criteria analysis, is developed. The model provides for an increase in the number of indicators if a selected set of indicators proves insufficient to distinguish between scenarios, and introduces a new criterion for the selection of indicators: the relevance of the indicator for the given waste treatment. The model is verified in a case study of the city of Niš. Four scenarios were selected and examined: a business-as-usual scenario (landfilling of waste) and three scenarios with energy recovery and resource preservation: composting organic waste with recycling of inorganic waste, incineration of waste, and anaerobic digestion of waste. The assessment of the sustainability of the waste treatment scenarios was made in several steps. It is found that the most sustainable scenario is composting of organic and recycling of inorganic waste. - Highlights: • We develop a model to assess sustainability of waste management scenarios with energy recovery. • The methodology we develop is based on the analytic hierarchy process. • We examined the influence of the number of indicators on scenario ranking. • We introduce a new criterion for selection of indicators for waste treatment with energy recovery. • The most sustainable scenario is recycling inorganic and composting organic waste
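
The AHP method at the core of such a model derives criterion weights from a pairwise comparison matrix and checks the judgments for consistency. A minimal sketch, using the common geometric-mean approximation of the principal eigenvector rather than a full eigensolver (the matrix values below are hypothetical, not the study's actual judgments):

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights from a pairwise comparison matrix M
    using the geometric-mean (logarithmic least squares) method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]   # row geometric means
    s = sum(gm)
    return [g / s for g in gm]                        # normalize to sum 1

def consistency_ratio(M, w):
    """Saaty consistency ratio CR = CI / RI, with CI = (lambda_max - n)/(n - 1).
    Judgments are conventionally acceptable when CR < 0.1."""
    n = len(M)
    # lambda_max approximated by averaging (M w)_i / w_i over the rows
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty random indices
    return ci / ri

# Hypothetical 3-criteria comparison (e.g. environmental vs. economic vs. social)
M = [[1, 3, 5], [1 / 3, 1, 3], [1 / 5, 1 / 3, 1]]
w = ahp_weights(M)
```

Scenario scores would then be weighted sums of per-criterion performances using `w`.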

  12. A Comparative Analysis on Assessment of Land Carrying Capacity with Ecological Footprint Analysis and Index System Method.

    Science.gov (United States)

    Qian, Yao; Tang, Lina; Qiu, Quanyi; Xu, Tong; Liao, Jiangfu

    2015-01-01

    Land carrying capacity (LCC) explains whether the local land resources are effectively used to support economic activities and/or the human population. LCC is commonly evaluated with two approaches, namely ecological footprint analysis (EFA) and the index system method (ISM). EFA is helpful for investigating the effects of different land categories, whereas ISM can be used to evaluate the contributions of social, environmental, and economic factors. Here we compared the two LCC-evaluation approaches with data collected from Xiamen City, a typical region undergoing rapid economic growth and urbanization in China. The results show that LCC assessments with EFA and ISM not only complement each other but are also mutually supportive. Both assessments suggest that decreases in arable land and increasingly high energy consumption have major negative effects on LCC and threaten sustainable development for Xiamen City. It is important for local policy makers, planners and designers to reduce ecological deficits by controlling fossil energy consumption, protecting arable land and forest land from conversion into other land types, and slowing down the speed of urbanization, and to promote sustainability by controlling rural-to-urban immigration, increasing the hazard-free treatment rate of household garbage, and raising energy consumption per unit industrial added value. Although EFA seems more appropriate for estimating LCC for a resource-output or self-sufficient region and ISM is more suitable for a resource-input region, both approaches should be employed when performing LCC assessment anywhere in the world. PMID:26121142
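
At its core, ecological footprint analysis converts consumption in each land category into world-average bioproductive area (global hectares) and compares the total with the available biocapacity; a positive ecological deficit signals the kind of unsustainable pressure the study describes. A minimal sketch with hypothetical per-capita figures (category names, yields and equivalence factors are illustrative only):

```python
def ecological_footprint(consumption, yields, eq_factors):
    """Footprint in global hectares (gha): for each land category, the area
    needed at local yield, scaled to world-average bioproductivity by an
    equivalence factor, summed over categories."""
    return sum(consumption[k] / yields[k] * eq_factors[k] for k in consumption)

def ecological_deficit(footprint_gha, biocapacity_gha):
    # positive value: demand exceeds local biocapacity (ecological deficit);
    # negative value: an ecological reserve remains
    return footprint_gha - biocapacity_gha

# Hypothetical per-capita data: consumption (t), yield (t/ha), equivalence factor
ef = ecological_footprint(
    {"cropland": 1.2, "forest": 0.4},
    {"cropland": 3.0, "forest": 1.6},
    {"cropland": 2.5, "forest": 1.3},
)
```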

  13. A Comparative Analysis on Assessment of Land Carrying Capacity with Ecological Footprint Analysis and Index System Method.

    Directory of Open Access Journals (Sweden)

    Yao Qian

    Full Text Available Land carrying capacity (LCC) explains whether the local land resources are effectively used to support economic activities and/or the human population. LCC is commonly evaluated with two approaches, namely ecological footprint analysis (EFA) and the index system method (ISM). EFA is helpful for investigating the effects of different land categories, whereas ISM can be used to evaluate the contributions of social, environmental, and economic factors. Here we compared the two LCC-evaluation approaches with data collected from Xiamen City, a typical region undergoing rapid economic growth and urbanization in China. The results show that LCC assessments with EFA and ISM not only complement each other but are also mutually supportive. Both assessments suggest that decreases in arable land and increasingly high energy consumption have major negative effects on LCC and threaten sustainable development for Xiamen City. It is important for local policy makers, planners and designers to reduce ecological deficits by controlling fossil energy consumption, protecting arable land and forest land from conversion into other land types, and slowing down the speed of urbanization, and to promote sustainability by controlling rural-to-urban immigration, increasing the hazard-free treatment rate of household garbage, and raising energy consumption per unit industrial added value. Although EFA seems more appropriate for estimating LCC for a resource-output or self-sufficient region and ISM is more suitable for a resource-input region, both approaches should be employed when performing LCC assessment anywhere in the world.

  14. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    Science.gov (United States)

    Rachid, G; El Fadel, M

    2013-08-15

    This paper presents a SWOT analysis of SEA systems in the Middle East and North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA systems through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. PMID:23648267

  15. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

    Full Text Available In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index, LSI) approaches is presented. Factors related to the occurrence of landslides, such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA), were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristics (ROC) analysis using an independent (validation) dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at regional scale.
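
For a binary landslide/no-landslide validation set, the ROC evaluation reported above reduces to the probability that a landslide location receives a higher susceptibility index than a non-landslide location (the area under the ROC curve). A minimal rank-based AUC sketch (the scores and labels below are hypothetical):

```python
def roc_auc(scores, labels):
    """AUC as the probability that a randomly chosen positive case (landslide,
    label 1) outranks a randomly chosen negative one (label 0); ties count 0.5.
    Equivalent to the Mann-Whitney U statistic normalized by len(pos)*len(neg)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical susceptibility indices for four validation cells
auc = roc_auc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0])
```

An AUC of 0.5 corresponds to random guessing; the paper's 76% prediction ability corresponds to an AUC of 0.76 on its validation dataset.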

  16. An assessment of turbulence models for linear hydrodynamic stability analysis of strongly swirling jets

    CERN Document Server

    Rukes, Lothar; Oberleithner, Kilian

    2016-01-01

    Linear stability analysis has proven to be a useful tool in the analysis of dominant coherent structures, such as the von Kármán vortex street and the global spiral mode associated with the vortex breakdown of swirling jets. In recent years, linear stability analysis has been applied successfully to turbulent time-mean flows, instead of laminar base-flows, which requires turbulence models that account for the interaction of the turbulent field with the coherent structures. To retain the stability equations of laminar flows, the Boussinesq approximation with a spatially nonuniform but isotropic eddy viscosity is typically employed. In this work we assess the applicability of this concept to turbulent strongly swirling jets, a class of flows that is particularly unsuited for isotropic eddy viscosity models. Indeed we find that unsteady RANS simulations only match with experiments with a Reynolds stress model that accounts for an anisotropic eddy viscosity. However, linear stability anal...

  17. Assessment of energy utilization in Iran’s industrial sector using energy and exergy analysis method

    International Nuclear Information System (INIS)

    The purpose of this study is to assess the quality of energy use in Iran's industrial sector. Exergy analysis has been performed alongside energy analysis in order to gain a deeper and more realistic understanding of the sector's condition. Primary energy utilization in seventeen different industries has been considered for calculation of the exergy and energy efficiencies of each industry, and then of Iran's industrial sector as a whole. The exergy efficiency is much lower than the energy efficiency in all industries and in the industrial sector overall. It is shown that the priorities for efficiency improvement based on exergy analysis differ from those based on energy analysis; this in turn suggests exergy analysis as a proper tool for policy makers. The sources of energy degradation and the mechanisms which cause degradation of energy quality have been identified. Moreover, remedial actions for better utilization of energy quality are proposed. The energy and exergy efficiencies for the entire industrial sector of Iran were approximated as 63% and 42%, respectively. The oil, iron and steel, plastic and cement industries are found to have the highest share in the destruction of the quality of total input energy to the industrial sector. The aluminum industry has the highest exergy efficiency, at 52.5%. Mean entropic temperature is also proposed as a tool for understanding the quality of energy required in each industry and consequently for better quality matching, which leads to better energy quality utilization. - Highlights: ► Exergy is used to assess the quality of energy use in Iran's industrial sector. ► Energy degradation mechanisms have been identified. ► Mean entropic temperature is proposed as a metric for energy quality matching. ► Improvement priorities based on exergy are different from those of energy analysis.
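
The gap between energy and exergy efficiency reported above arises because heat delivered at modest temperature carries far less exergy than the fuel that produced it: the Carnot factor (1 - T0/T) discounts it. A minimal sketch under the simplifying assumption that fuel input is pure exergy (the temperatures and efficiencies below are hypothetical, not the study's data):

```python
def energy_efficiency(useful_out, energy_in):
    """First-law efficiency: useful energy out over energy in."""
    return useful_out / energy_in

def heat_exergy(q, t_service, t0=298.15):
    """Exergy content of heat Q delivered at absolute temperature T,
    relative to a dead-state (ambient) temperature T0, via the Carnot factor."""
    return q * (1.0 - t0 / t_service)

def exergy_efficiency(q_useful, t_service, energy_in, t0=298.15):
    """Second-law efficiency, assuming the fuel input is roughly pure exergy:
    low-temperature process heat scores poorly even at high energy efficiency."""
    return heat_exergy(q_useful, t_service, t0) / energy_in

# Hypothetical boiler: 80% energy efficiency, heat delivered at 400 K
eta = energy_efficiency(80.0, 100.0)       # 0.80
psi = exergy_efficiency(80.0, 400.0, 100.0)  # far lower than 0.80
```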

  18. Assessment of the interchannel mixing model with a subchannel analysis code for BWR and PWR conditions

    International Nuclear Information System (INIS)

    The influence of the interchannel mixing model employed in a traditional subchannel analysis code was investigated in this study, specifically for the analysis of the enthalpy distribution and critical heat flux (CHF) in rod bundles under BWR and PWR conditions. The equal-volume-exchange turbulent mixing and void drift model (EVVD) was incorporated into the COBRA-IV-I code. An optimized model of the void drift coefficient was devised in this study based on an assessment against the two-phase flow distribution data for the General Electric (GE) 9-rod and Ispra 16-rod test bundles. The influence of the subchannel analysis model on the analysis of CHF was examined by evaluating CHF test data in rod bundles representing PWR and BWR conditions. The CHFR margins of typical light water reactor (LWR) cores were evaluated by considering the influence on the local parameter CHF correlation and the hot channel analysis result. It appeared that the interchannel mixing model has an important effect on the analysis of CHFR margin for BWR conditions. (orig.)

  19. Spatial assessment of air quality patterns in Malaysia using multivariate analysis

    Science.gov (United States)

    Dominick, Doreena; Juahir, Hafizan; Latif, Mohd Talib; Zain, Sharifuddin M.; Aris, Ahmad Zaharin

    2012-12-01

    This study aims to investigate possible sources of air pollutants and the spatial patterns within eight selected Malaysian air monitoring stations based on a two-year database (2008-2009). Multivariate analysis was applied to the dataset. It incorporated Hierarchical Agglomerative Cluster Analysis (HACA) to assess the spatial patterns, Principal Component Analysis (PCA) to determine the major sources of air pollution, and Multiple Linear Regression (MLR) to assess the percentage contribution of each air pollutant. The HACA results grouped the eight monitoring stations into three different clusters based on the characteristics of the air pollutants and meteorological parameters. The PCA showed that the major sources of air pollution were emissions from motor vehicles, aircraft, industries and areas of high population density. The MLR analysis demonstrated that the main pollutant contributing to variability in the Air Pollutant Index (API) at all stations was particulate matter with a diameter of less than 10 μm (PM10). Further MLR analysis showed that the main air pollutant influencing the high concentration of PM10 was carbon monoxide (CO), due to combustion processes, particularly those originating from motor vehicles. Meteorological factors such as ambient temperature, wind speed and humidity were also noted to influence the concentration of PM10.
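
The HACA step groups stations by the similarity of their pollutant and meteorological profiles. A naive single-linkage sketch of that idea (the two-dimensional station profiles below are hypothetical; a real analysis would standardize many variables first):

```python
def euclid(a, b):
    """Euclidean distance between two station profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(points, n_clusters):
    """Naive agglomerative clustering: repeatedly merge the two clusters with
    the smallest minimum inter-point distance (single linkage) until only
    n_clusters remain. Returns clusters as lists of point indices."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclid(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)   # j > i, so pop is safe
    return clusters

# Hypothetical (PM10, CO) profiles for four stations: two urban, two rural
groups = single_linkage([(1.0, 1.0), (1.2, 0.9), (5.0, 5.0), (5.1, 4.8)], 2)
```

Standard HACA software typically uses Ward's linkage rather than single linkage; the merge loop is the same shape either way.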

  20. Addendum to the performance assessment analysis for low-level waste disposal in the 200 west area active burial grounds

    Energy Technology Data Exchange (ETDEWEB)

    Wood, M.I., Westinghouse Hanford

    1996-12-20

    An addendum was completed to the performance assessment (PA) analysis for the active 200 West Area low-level solid waste burial grounds. The addendum includes supplemental information developed during the review of the PA analysis, an ALARA analysis, a comparison of PA results with the Hanford Groundwater Protection Strategy, and a justification for the assumption of 500-year deterrence of the inadvertent intruder.

  1. A methodological framework for hydromorphological assessment, analysis and monitoring (IDRAIM) aimed at promoting integrated river management

    Science.gov (United States)

    Rinaldi, M.; Surian, N.; Comiti, F.; Bussettini, M.

    2015-12-01

    A methodological framework for hydromorphological assessment, analysis and monitoring (named IDRAIM) has been developed with the specific aim of supporting the management of river processes by integrating the objectives of ecological quality and flood risk mitigation. The framework builds on existing and up-to-date geomorphological concepts and approaches and has been tested on several Italian streams. The framework includes the following four phases: (1) catchment-wide characterization of the fluvial system; (2) evolutionary trajectory reconstruction and assessment of current river conditions; (3) description of future trends of channel evolution; and (4) identification of management options. The framework provides specific consideration of the temporal context, in terms of reconstructing the trajectory of past channel evolution as a basis for interpreting present river conditions and future trends. A series of specific tools has been developed for the assessment of river conditions, in terms of morphological quality and channel dynamics. These include: the Morphological Quality Index (MQI), the Morphological Dynamics Index (MDI), the Event Dynamics Classification (EDC), and the river morphodynamic corridors (MC and EMC). The monitoring of morphological parameters and indicators, alongside the assessment of future scenarios of channel evolution provides knowledge for the identification, planning and prioritization of actions for enhancing morphological quality and risk mitigation.

  2. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    Directory of Open Access Journals (Sweden)

    J. Fernandez Galarreta

    2014-09-01

    Full Text Available Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. The UAV-derived and OBIA-processed oblique images are also shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we demonstrate the possibility of aggregating the multi-perspective damage information at the building level.

  3. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, public-sector decision makers face important decision problems. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be identified by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. A Geographical Information System coupled with a tool developed to manage risk analysis makes it possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  4. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural-network-based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.
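
The second method can be pictured as fitting a non-linear (logistic-shaped) mapping from objective quality metrics to subjective scores. Below is a minimal sketch that fits such a mapping by stochastic gradient descent on hypothetical data; the 1-5 mean-opinion-score range and the single input metric are assumptions for illustration, not the paper's exact model:

```python
import math

def predict(x, w, b):
    """Map a weighted sum of quality metrics through a logistic function
    onto the 1-5 mean-opinion-score (MOS) range, a common VQA fitting choice."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 + 4.0 / (1.0 + math.exp(-z))

def fit(xs, ys, lr=0.05, epochs=4000):
    """Fit (w, b) to (metric vector, MOS) pairs by per-sample gradient descent
    on squared error."""
    w, b = [0.0] * len(xs[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = predict(x, w, b)
            # chain rule: dL/dz = (p - y) * dp/dz, with dp/dz = 4*sig*(1 - sig)
            sig = (p - 1.0) / 4.0
            g = (p - y) * 4.0 * sig * (1.0 - sig)
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# Hypothetical data: one objective metric, monotonically rising MOS
xs = [[0.0], [1.0], [2.0], [3.0]]
ys = [1.5, 2.5, 3.5, 4.5]
w, b = fit(xs, ys)
```

In practice several HVS-based metrics would feed the weighted sum, and a library optimizer would replace the hand-rolled gradient loop.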

  5. Analysis of medicostatistical data to assess the genetic and teratogenic effects of the Chernobyl accident

    International Nuclear Information System (INIS)

    Analysis of the official medicodemographic statistical data (provided by the Ukrainian Ministry of Health) revealed variations in the mean rates of congenital developmental defects (CDD) before 1987 (1985-1986) and in the period of 1987-1989 in all the areas irrespective of a degree of contamination with radionuclides (i.e. variations are determined by the time factor rather than by the irradiation factor). According to the medical statistical data, the rates of CDD and spontaneous abortions varied within a wide range, making it difficult to assess probable mutagenic and teratogenic effects of the Chernobyl accident. Medicostatistical data on spontaneous abortions understated the actual rates 2-3-fold, therefore they were not recommended for assessment of mutagenic effects of the Chernobyl accident

  6. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators

  7. Nondestructive inspection assessment of eddy current and electrochemical analysis to separate inconel and stainless steel alloys

    Energy Technology Data Exchange (ETDEWEB)

    Moore, D.G.; Sorensen, N.R.

    1998-02-01

    This report presents a nondestructive inspection assessment of eddy current and electrochemical analysis to separate inconel alloys from stainless steel alloys as well as an evaluation of cleaning techniques to remove a thermal oxide layer on aircraft exhaust components. The results of this assessment are presented in terms of how effective each technique classifies a known exhaust material. Results indicate that either inspection technique can separate inconel and stainless steel alloys. Based on the experiments conducted, the electrochemical spot test is the optimum for use by airframe and powerplant mechanics. A spot test procedure is proposed for incorporation into the Federal Aviation Administration Advisory Circular 65-9A Airframe & Powerplant Mechanic - General Handbook. 3 refs., 70 figs., 7 tabs.

  8. Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders

    CERN Document Server

    Baghai-Ravary, Ladan

    2013-01-01

    Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders provides a survey of methods designed to aid clinicians in the diagnosis and monitoring of speech disorders such as dysarthria and dyspraxia, with an emphasis on the signal processing techniques, statistical validity of the results presented in the literature, and the appropriateness of methods that do not require specialized equipment, rigorously controlled recording procedures or highly skilled personnel to interpret results. Such techniques offer the promise of a simple and cost-effective, yet objective, assessment of a range of medical conditions, which would be of great value to clinicians. The ideal scenario would begin with the collection of examples of the clients’ speech, either over the phone or using portable recording devices operated by non-specialist nursing staff. The recordings could then be analyzed initially to aid diagnosis of conditions, and subsequently to monitor the clients’ progress and res...

  9. Information Technology Project Portfolio and Strategy Alignment Assessment Based on Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Marisa Analía Sánchez

    2012-11-01

    Full Text Available Recent research has shown that companies face considerable difficulties in assessing the strategy value contribution of Information Technology (IT) investments. One of the major obstacles to achieving strategy alignment is that organizations find it extremely difficult to link and quantify the benefits of IT investments with strategic goals. The aim of this paper is to define an approach to assess portfolio-strategy alignment. To this end, a formal specification of the Kaplan and Norton Strategy Map is developed utilizing the Unified Modeling Language (UML). The approach uses the Strategy Map as a framework for defining the portfolio value contribution, and Data Envelopment Analysis (DEA) is used as the methodology for measuring the efficiency of project portfolios. DOI: 10.5585/gep.v3i2.66
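The record above names DEA as the efficiency-measurement methodology but gives no formulation. As an illustrative sketch only (not the authors' implementation), the classic input-oriented CCR model in multiplier form can be solved per decision-making unit (DMU) with `scipy.optimize.linprog`; all data and names below are invented for the example:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency for each DMU (multiplier form).

    X: (n_dmu, n_inputs) input matrix, Y: (n_dmu, n_outputs) output matrix.
    For DMU k: maximize u.Y[k] subject to v.X[k] = 1 and
    u.Y[j] - v.X[j] <= 0 for every DMU j, with u, v >= 0.
    """
    n, m = X.shape
    _, s = Y.shape
    scores = []
    for k in range(n):
        # decision variables: [u (s output weights), v (m input weights)]
        c = np.concatenate([-Y[k], np.zeros(m)])   # linprog minimizes, so negate
        A_ub = np.hstack([Y, -X])                  # u.Y_j - v.X_j <= 0
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)  # v.X_k = 1
        b_eq = [1.0]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * (s + m), method="highs")
        scores.append(-res.fun)                    # efficiency in (0, 1]
    return np.array(scores)

# hypothetical example: two project portfolios, one input, one output
X = np.array([[1.0], [2.0]])   # e.g. budget consumed
Y = np.array([[1.0], [1.0]])   # e.g. strategic-goal contribution
scores = dea_ccr_efficiency(X, Y)
```

Here the second portfolio produces the same output with twice the input, so its efficiency is half that of the first.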

  10. Assessment of Static Delamination Propagation Capabilities in Commercial Finite Element Codes Using Benchmark Analysis

    Science.gov (United States)

    Orifici, Adrian C.; Krueger, Ronald

    2010-01-01

    With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc™ and MD Nastran™. Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementations in Marc™ and MD Nastran™ were capable of accurately replicating the benchmark delamination growth results, and that the use of numerical benchmarks offers advantages over benchmarking using experimental and analytical results.

  11. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    Science.gov (United States)

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. Currently used tools that assess the risk of pressure ulcer development rarely evaluate predictive accuracy, especially in older adults. The current study provides a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. The overall predictive validities for pressure ulcer risk, expressed as pooled sensitivity and specificity, fell in a similar range with a moderate accuracy level for all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, the primary cause of this heterogeneity. Results indicate that commonly used screening tools for pressure ulcer risk have limitations regarding validity and accuracy for use with older adults due to heterogeneity among studies. PMID:26337859
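The pooled sensitivity and specificity mentioned above come from combining per-study 2x2 diagnostic counts. A deliberately naive fixed pooling (summing counts across studies) can be sketched as below; the actual meta-analysis would normally use a bivariate random-effects model, and the field names and counts here are invented:

```python
def pooled_accuracy(studies):
    """Naively pool sensitivity and specificity from per-study 2x2 counts.

    studies: list of dicts with keys TP, FN, TN, FP (true/false
    positives/negatives for pressure ulcer prediction).
    Returns (pooled_sensitivity, pooled_specificity).
    """
    tp = sum(s["TP"] for s in studies)
    fn = sum(s["FN"] for s in studies)
    tn = sum(s["TN"] for s in studies)
    fp = sum(s["FP"] for s in studies)
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts from two studies of one risk scale
sens, spec = pooled_accuracy([
    {"TP": 8, "FN": 2, "TN": 9, "FP": 1},
    {"TP": 7, "FN": 3, "TN": 8, "FP": 2},
])
```

Summing counts ignores between-study heterogeneity, which is exactly the limitation (>80% variability) the abstract reports.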

  12. Environmental impact assessment in Colombia: Critical analysis and proposals for improvement

    International Nuclear Information System (INIS)

    The evaluation of Environmental Impact Assessment (EIA) systems is a highly recommended strategy for enhancing their effectiveness and quality. This paper describes an evaluation of EIA in Colombia, using the model and control mechanisms proposed and applied in other countries by Christopher Wood and Ortolano. The evaluation criteria used are based on principles of EIA best practice, such as effectiveness and control features, and they were contrasted with the opinions of a panel of Colombian EIA experts as a means of validating the results of the study. The evaluation found that EIA regulations in Colombia were ineffective because of their limited scope, inadequate administrative support and the absence of effective control mechanisms and public participation. This analysis resulted in a series of recommendations regarding the further development of the EIA system in Colombia with a view to improving its quality and effectiveness.

  13. Assessing explanatory style: the content analysis of verbatim explanations and the Attributional Style Questionnaire.

    Science.gov (United States)

    Schulman, P; Castellon, C; Seligman, M E

    1989-01-01

    We compare two methods of assessing explanatory style: the content analysis of verbatim explanations (CAVE) and the Attributional Style Questionnaire (ASQ). The CAVE technique is a method that allows the researcher to analyze any naturally occurring verbatim material for explanatory style. This technique permits the measurement of various populations that are unwilling or unable to take the ASQ. We administered the ASQ and Beck Depression Inventory (BDI) to 169 undergraduates and content-analyzed the written causes on the ASQ for explanatory style using the CAVE technique. The CAVE technique correlated 0.71 with the ASQ (P < 0.0001, n = 159) and -0.36 with the BDI (P < 0.0001, n = 159). The ASQ correlated -0.51 with the BDI (P < 0.0001, n = 160). Both the CAVE technique and the ASQ appear to be valid devices for assessing explanatory style. PMID:2818415
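The validity figures above are Pearson product-moment correlations between the two instruments' scores. For readers unfamiliar with how such a coefficient is computed, a minimal self-contained sketch (with invented score vectors, not the study's data) is:

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length
    score lists, e.g. CAVE ratings vs. ASQ composite scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# perfectly concordant and perfectly discordant toy scores
r_pos = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
r_neg = pearson_r([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```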

  14. Assessment of ecological risks at former landfill site using TRIAD procedure and multicriteria analysis.

    Science.gov (United States)

    Sorvari, Jaana; Schultz, Eija; Haimi, Jari

    2013-02-01

    Old industrial landfills are important sources of environmental contamination in Europe, including Finland. In this study, we demonstrated the combination of the TRIAD procedure, multicriteria decision analysis (MCDA), and statistical Monte Carlo analysis for assessing the risks to terrestrial biota at a former landfill site contaminated by petroleum hydrocarbons (PHCs) and metals. First, we generated hazard quotients by dividing the concentrations of metals and PHCs in soil by the corresponding risk-based ecological benchmarks. Then we conducted ecotoxicity tests using five plant species, earthworms, and potworms, and determined the abundance and diversity of soil invertebrates from additional samples. We aggregated the results in accordance with the methods used in the TRIAD procedure, rated the assessment methods based on their performance against specific criteria, and weighted the criteria using two alternative weighting techniques to produce performance scores for each method. We faced problems in using the TRIAD procedure; for example, the results from the animal counts had to be excluded from the calculation of integrated risk estimates (IREs) because our reference soil sample showed the lowest biodiversity and abundance of soil animals. In addition, hormesis hampered the use of the results from the ecotoxicity tests. The final probabilistic IREs imply significant risks at all sampling locations. Although linking MCDA with TRIAD provided a useful means to study and consider the performance of the alternative methods in predicting ecological risks, some of the uncertainties involved still remained outside the quantitative analysis. PMID:22762796
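The first TRIAD step described above, forming hazard quotients, is a direct ratio of measured soil concentration to an ecological benchmark. A minimal sketch of that step, with invented contaminant names, concentrations, and benchmark values:

```python
def hazard_quotients(concentrations, benchmarks):
    """Hazard quotient per contaminant: HQ = soil concentration /
    risk-based ecological benchmark (same units). HQ > 1 flags a
    potential ecological risk for that contaminant."""
    return {name: concentrations[name] / benchmarks[name]
            for name in concentrations}

# hypothetical soil data (mg/kg) against hypothetical benchmarks
hq = hazard_quotients(
    {"Zn": 200.0, "Ni": 30.0},   # measured soil concentrations
    {"Zn": 100.0, "Ni": 60.0},   # risk-based benchmarks
)
```

In this toy case zinc exceeds its benchmark (HQ = 2.0) while nickel does not (HQ = 0.5); in the study these quotients form one line of evidence to be weighed against the ecotoxicological and ecological lines.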

  15. Pesticide Flow Analysis to Assess Human Exposure in Greenhouse Flower Production in Colombia

    Directory of Open Access Journals (Sweden)

    Claudia R. Binder

    2013-03-01

    Full Text Available Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. The approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole-body dosimetry method for dermal exposure and the Button personal inhalable aerosol sampler for inhalation exposure, with the tracer uranine as a pesticide surrogate. The case study was a greenhouse rose farm on the Bogota Plateau in Colombia. The approach was applied to estimate exposure to pesticides such as mancozeb, carbendazim, propamocarb hydrochloride, fosetyl, carboxin, thiram, dimethomorph and mandipropamide. We found dermal absorption estimates close to the AOEL reference values for carbendazim, mancozeb, thiram and mandipropamide during the study period. In addition, high values of dermal exposure were found on the forearms, hands, chest and legs of study participants, indicating weaknesses in the overlapping areas of the personal protective equipment parts. These results show how the material flow analysis methodology can be applied in the field of human exposure for early recognition of the dispersion of pesticides, and can support the development of measures to improve operational safety during pesticide management. Furthermore, the model makes it possible to identify the status quo of the health risk faced by workers in the study area.
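The core of the flow-analysis approach above is a transfer coefficient: the fraction of handled pesticide that reaches the worker, which is then converted into a systemic dose for comparison with the AOEL. A hedged sketch of that conversion, with all numbers and names invented (not the paper's measured coefficients):

```python
def dermal_systemic_dose(applied_mass_mg, transfer_coeff,
                         absorption_frac, body_weight_kg):
    """Systemic dose (mg per kg body weight) from one application:
    mass applied x fraction transferred to skin x fraction absorbed
    through skin, normalized by body weight. Comparable to an AOEL
    expressed in mg/kg bw/day if one application per day is assumed."""
    return applied_mass_mg * transfer_coeff * absorption_frac / body_weight_kg

# hypothetical scenario: 1000 mg handled, 1% reaches skin,
# 10% of that is absorbed, 60 kg worker
dose = dermal_systemic_dose(1000.0, 0.01, 0.1, 60.0)
```

The resulting dose (about 0.0017 mg/kg bw here) would then be compared against the pesticide-specific AOEL to judge whether protective measures suffice.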

  16. Current activities and future trends in reliability analysis and probabilistic safety assessment in Hungary

    International Nuclear Information System (INIS)

    In Hungary, reliability analysis (RA) and probabilistic safety assessment (PSA) of nuclear power plants were initiated 3 years ago. First, computer codes for automatic fault tree analysis (CAT, PREP) and numerical evaluation (REMO, KITT1,2) were adapted. Two main case studies were performed: a detailed availability/reliability calculation of diesel sets, and an analysis of safety systems influencing event sequences induced by a large LOCA. Input failure data were taken from publications, and a need for a failure and reliability data bank was revealed. Current and future activities involve: setup of a national data bank for WWER-440 units; a full-scope Level-1 PSA of the PAKS NPP in Hungary; and operational safety assessment of particular problems at the PAKS NPP. In the present article the state of RA and PSA activities in Hungary, as well as the main objectives of ongoing work, are described. A need for international cooperation (for unified data collection on WWER-440 units) and for IAEA support (within Interregional Program INT/9/063) is emphasized. (author)
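The fault-tree evaluation mentioned above (the kind of computation codes such as PREP and KITT perform) reduces, at its simplest, to combining basic-event failure probabilities through AND/OR gates. A minimal sketch under the standard assumption of independent basic events (a simplification, not the cited codes' full kinetic treatment):

```python
def and_gate(probs):
    """Top-event probability of an AND gate: all inputs must fail.
    Assumes statistically independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Top-event probability of an OR gate: at least one input fails.
    Computed via the complement to avoid double counting."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# hypothetical redundant diesel sets, each failing with probability 0.1
both_fail = and_gate([0.1, 0.1])     # redundancy: both must fail
either_fails = or_gate([0.1, 0.1])   # single-train view
```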

  17. Methods for the analysis of ordinal response data in medical image quality assessment.

    Science.gov (United States)

    Keeble, Claire; Baxter, Paul D; Gislason-Lee, Amber J; Treadgold, Laura A; Davies, Andrew G

    2016-07-01

    The assessment of image quality in medical imaging often requires observers to rate images for some metric or detectability task. These subjective results are used in optimization, radiation dose reduction or system comparison studies and may be compared to objective measures from a computer vision algorithm performing the same task. One popular scoring approach is to use a Likert scale, then assign consecutive numbers to the categories. The mean of these response values is then taken and used for comparison with the objective or second subjective response. Agreement is often assessed using correlation coefficients. We highlight a number of weaknesses in this common approach, including inappropriate analyses of ordinal data and the inability to properly account for correlations caused by repeated images or observers. We suggest alternative data collection and analysis techniques such as amendments to the scale and multilevel proportional odds models. We detail the suitability of each approach depending upon the data structure and demonstrate each method using a medical imaging example. Whilst others have raised some of these issues, we evaluated the entire study from data collection to analysis, suggested sources for software and further reading, and provided a checklist plus flowchart for use with any ordinal data. We hope that raised awareness of the limitations of the current approaches will encourage greater method consideration and the utilization of a more appropriate analysis. More accurate comparisons between measures in medical imaging will lead to a more robust contribution to the imaging literature and ultimately improved patient care. PMID:26975497
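The proportional-odds models recommended above operate on cumulative log-odds of the ordinal response rather than on category means. As an illustrative sketch of the quantity such a model fits (not a full model; in practice a package such as statsmodels would estimate it with covariates), the empirical cumulative logits of a set of Likert ratings can be computed directly:

```python
import math

def cumulative_logits(ratings, n_categories):
    """Empirical cumulative log-odds log(P(Y<=k) / (1-P(Y<=k))) for each
    category split k = 1..n_categories-1. These are the quantities a
    proportional-odds model parameterizes; assumes some ratings fall
    above every split (otherwise the odds are infinite)."""
    n = len(ratings)
    logits = []
    for k in range(1, n_categories):
        p = sum(r <= k for r in ratings) / n
        logits.append(math.log(p / (1 - p)))
    return logits

# hypothetical observer ratings on a 3-point quality scale
logits = cumulative_logits([1, 1, 2, 3, 3, 3], n_categories=3)
```

Comparing these logits between, say, two dose levels respects the ordinal scale, whereas averaging the category labels 1-3 would not.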

  18. Dependability Assessment by Static Analysis of Software Important to Nuclear Power Plant Safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab, Chatou (France)

    2014-08-15

    We describe a practical experimentation of safety assessment of safety-critical software used in Nuclear Power Plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Electricite de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Today, new industrial tools, based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown systems software is very significantly improved. In a first part, we present the analysis principles of the tools used in our experimentation. In a second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted for the new analysis tools. In the last part, we present an overview of the results and the limitation of the tools.

  19. Summary report of the advanced scenario analysis for performance assessment of geological disposal

    International Nuclear Information System (INIS)

    First, with regard to the FEP information data on the Engineered Barrier System (EBS) developed by JNC, the description level and content of the FEPs have been examined from various angles on the basis of the latest research information. Each FEP data entry has been classified and modified by integrating descriptive items, checking detail levels and correlations with other FEPs, collating with the H12 report, and adding technical information published after the H12 report. Second, the scenario-modeling process has been studied. The study evaluated the representation of the repository system, the definition of FEP properties, and process interactions based on the interaction matrix (RES format), which represents influences between physicochemical characteristics of the repository, followed by an experimental development of an actual RES interaction matrix based on the H12 report, as an exercise in improving the transparency, traceability and comprehensibility of the scenario analysis process. Lastly, assessment techniques for the geological disposal system have been examined for more practical scenario analysis of particularly strong perturbations. Possible conceptual models have been proposed for each of these scenarios: seismic activity, faulting, and dike intrusion. As a result of this research, a future direction for advanced scenario analysis in performance assessment has been indicated, and the associated issues still to be discussed have been clarified. (author)

  20. Texture analysis for the assessment of structural changes in parotid glands induced by radiotherapy

    International Nuclear Information System (INIS)

    Background and purpose: During radiotherapy (RT) for head-and-neck cancer, parotid glands undergo significant anatomic, functional and structural changes which could constitute pre-clinical signs of an increased risk of xerostomia. Texture analysis is proposed to assess structural changes of parotids induced by RT, and to investigate whether early variations of textural parameters (such as mean intensity and fractal dimension) can predict parotid shrinkage at the end of treatment. Material and methods: Textural parameters and volumes of 42 parotids from 21 patients treated with intensity-modulated RT for nasopharyngeal cancer were extracted from CT images. To identify which parameters changed during RT, a Wilcoxon signed-rank test between textural indices (first and second RT week; first and last RT week) was performed. Discriminant analysis was applied to the variations of these parameters in the first two weeks of RT to assess their power in predicting parotid shrinkage at the end of RT. Results: A significant decrease in mean intensity (1.7 HU and 3.8 HU after the second and last weeks, respectively) and fractal dimension (0.016 and 0.021) was found. Discriminant analysis, based on volume and fractal dimension, was able to predict the final parotid shrinkage (accuracy of 71.4%). Conclusion: Textural features could be used in combination with volume to characterize structural modifications of parotid glands and to predict parotid shrinkage at the end of RT
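One of the textural parameters above, fractal dimension, is commonly estimated by box counting: tiling the image at several box sizes, counting occupied boxes, and taking the slope of the log-log relation. A self-contained sketch of that estimator for a binary image (an illustrative method, not necessarily the exact estimator used in the study):

```python
import numpy as np

def box_counting_dimension(img, scales=(1, 2, 4, 8)):
    """Box-counting (Minkowski) fractal dimension of a 2D binary image:
    slope of log(occupied box count) vs log(1 / box size)."""
    counts = []
    for s in scales:
        h, w = img.shape
        trimmed = img[: h - h % s, : w - w % s]   # tile exactly into s x s boxes
        boxes = trimmed.reshape(trimmed.shape[0] // s, s,
                                trimmed.shape[1] // s, s)
        counts.append(int((boxes.sum(axis=(1, 3)) > 0).sum()))
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)),
                          np.log(counts), 1)
    return slope

# sanity check: a completely filled region is 2-dimensional
dim = box_counting_dimension(np.ones((16, 16)))
```

For CT texture work the input would be a thresholded or edge map of the parotid region of interest; a drop in the estimated dimension over treatment would reflect smoothing of the gland's internal structure.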

  1. Resource efficiency of urban sanitation systems. A comparative assessment using material and energy flow analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meinzinger, Franziska

    2010-07-01

    Within the framework of sustainable development, it is important to find ways of reducing natural resource consumption and to change towards closed-loop management. As in many other spheres, increased resource efficiency has also become an important issue in sanitation. In particular, nutrient recovery for agriculture, increased energy efficiency and the saving of natural water resources can contribute to more resource-efficient sanitation systems. To assess the resource efficiency of alternative developments, a systems perspective is required. The present study applies a combined cost, energy and material flow analysis (ceMFA) as a system analysis method to assess the resource efficiency of urban sanitation systems. This includes the discussion of relevant criteria and assessment methods. The main focus of this thesis is the comparative assessment of different systems, based on two case studies: Hamburg in Germany and Arba Minch in Ethiopia. A range of possible system developments including source separation (e.g. diversion of urine or blackwater) is defined and compared with the current situation as a reference system. The assessment is carried out using computer simulations based on model equations. The model equations integrate not only mass and nutrient flows, but also the energy and cost balances of the different systems. In order to assess the impact of different assumptions and calculation parameters, sensitivity analyses and parameter variations complete the calculations. Based on the simulations, the following general conclusions can be drawn: None of the systems shows an overall benefit with regard to all investigated criteria, namely nutrients, energy, water and costs. Yet, the results of the system analysis can be used as a basis for decision making if a case-related weighting is introduced. The systems show varying potential for the recovery of nutrients from (source-separated) wastewater flows. For the case study of Hamburg up to 29% of the mineral

  2. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    International Nuclear Information System (INIS)

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs

  3. Assessing the effect of data pretreatment procedures for principal components analysis of chromatographic data.

    Science.gov (United States)

    McIlroy, John W; Smith, Ruth Waddell; McGuffin, Victoria L

    2015-12-01

    Following publication of the National Academy of Sciences report "Strengthening Forensic Science in the United States: A Path Forward", there has been increasing interest in the application of multivariate statistical procedures for the evaluation of forensic evidence. However, prior to statistical analysis, variance from sources other than the sample must be minimized through the application of data pretreatment procedures. This is necessary to ensure that subsequent statistical analysis of the data provides meaningful results. The purpose of this work was to evaluate the effect of pretreatment procedures on multivariate statistical analysis of chromatographic data obtained for a reference set of diesel fuels. Diesel was selected due to its chemical complexity and forensic relevance, both for fire debris and environmental forensic applications. Principal components analysis (PCA) was applied to the untreated chromatograms to assess association of replicates and discrimination among the different diesel samples. The chromatograms were then pretreated by sequentially applying the following procedures: background correction, smoothing, retention-time alignment, and normalization. The effect of each procedure on association and discrimination was evaluated based on the association of replicates in the PCA scores plot. For these data, background correction and smoothing offered minimal improvement, whereas alignment and normalization offered the greatest improvement in the association of replicates and discrimination among highly similar samples. Further, prior to pretreatment, the first principal component accounted for only non-sample sources of variance. Following pretreatment, these sources were minimized and the first principal component accounted for significant chemical differences among the diesel samples. These results highlight the need for pretreatment procedures and provide a metric to assess the effect of pretreatment on subsequent multivariate statistical analysis.
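The final pretreatment step above (normalization) followed by PCA can be sketched compactly with NumPy's SVD. This is an illustrative pipeline under simple assumptions (total-area normalization, mean-centering), not the authors' exact processing chain:

```python
import numpy as np

def pca_scores(X, n_components=2, normalize=True):
    """Project chromatograms (rows of X) into PCA score space.

    Pretreatment assumed here: normalize each chromatogram to unit
    total area (removes overall-intensity variance between runs),
    then mean-center each variable before the SVD."""
    X = np.asarray(X, dtype=float)
    if normalize:
        X = X / X.sum(axis=1, keepdims=True)    # total-area normalization
    Xc = X - X.mean(axis=0)                     # mean-center variables
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T             # scores on first PCs

# four hypothetical 3-point "chromatograms"
scores = pca_scores([[1.0, 2.0, 3.0],
                     [2.0, 2.0, 2.0],
                     [3.0, 1.0, 1.0],
                     [1.0, 1.0, 4.0]])
```

In a scores plot, replicate runs of the same fuel should cluster together after pretreatment; the spread between clusters then reflects chemical differences rather than instrumental drift.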

  4. Mercury analysis in hair: Comparability and quality assessment within the transnational COPHES/DEMOCOPHES project.

    Science.gov (United States)

    Esteban, Marta; Schindler, Birgit Karin; Jiménez, José Antonio; Koch, Holger Martin; Angerer, Jürgen; Rosado, Montserrat; Gómez, Silvia; Casteleyn, Ludwine; Kolossa-Gehring, Marike; Becker, Kerstin; Bloemen, Louis; Schoeters, Greet; Den Hond, Elly; Sepai, Ovnair; Exley, Karen; Horvat, Milena; Knudsen, Lisbeth E; Joas, Anke; Joas, Reinhard; Aerts, Dominique; Biot, Pierre; Borošová, Daniela; Davidson, Fred; Dumitrascu, Irina; Fischer, Marc E; Grander, Margaretha; Janasik, Beata; Jones, Kate; Kašparová, Lucie; Larssen, Thorjørn; Naray, Miklos; Nielsen, Flemming; Hohenblum, Philipp; Pinto, Rui; Pirard, Catherine; Plateel, Gregory; Tratnik, Janja Snoj; Wittsiepe, Jürgen; Castaño, Argelia

    2015-08-01

    Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to Human Biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), followed by the corresponding web conferences, were organized between March 2011 and February 2012. ICI/EQUAS used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS revealed this to be a new and effective tool for improving analytical performance and increasing capacity building. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury. PMID:25483984
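The interlaboratory comparability metric quoted above is the percent relative standard deviation (RSD) of the laboratories' reported values. A minimal sketch of that computation, using the sample standard deviation and invented example values:

```python
def relative_std_dev(values):
    """Percent relative standard deviation (RSD = 100 * sd / mean)
    across laboratories, using the sample (n-1) standard deviation."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)   # sample variance
    return 100.0 * var ** 0.5 / mean

# hypothetical mercury results (same unit) reported by three labs
rsd = relative_std_dev([0.9, 1.0, 1.1])
```

An RSD around 10%, as in this toy case, sits inside the 4-14% range the exercises reported for the participating laboratories.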

  5. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    International Nuclear Information System (INIS)

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly based on Logic Trees. The evolution of these sequences is a result of the interactions between the crew and plant; in current PRA methodologies, simplified models of these complex interactions are used. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represents the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing the crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups. These techniques are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied for the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA, in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamic of cues, crew timing in performing procedure steps, situation

  6. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Mercurio, D.

    2011-07-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly based on Logic Trees. The evolution of these sequences is a result of the interactions between the crew and plant; in current PRA methodologies, simplified models of these complex interactions are used. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represents the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing the crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups. These techniques are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied for the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA, in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamic of cues, crew timing in performing procedure steps, situation

  7. Safety assessment technology on the free drop impact and puncture analysis of the cask for radioactive material transport

    International Nuclear Information System (INIS)

    In this study, the regulatory conditions and analysis conditions are analyzed for free drop and puncture impact analysis, in order to develop a safety assessment technology. Impact analysis is performed with the finite element method, one of several analysis methods applicable to the shipping cask. LS-DYNA3D and ABAQUS are suitable for the free drop and puncture impact analysis of the shipping cask. For the analysis model, the KSC-4, a shipping cask for transporting spent nuclear fuel, is investigated. The results from LS-DYNA3D and ABAQUS correspond completely, and the integrity of the shipping cask is verified. Through this study, a reliable safety assessment technology is supplied to the regulatory staff, and efficient and reliable regulatory tasks can be performed using the standard safety assessment technology

  8. Safety assessment technology on the free drop impact and puncture analysis of the cask for radioactive material transport

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dew Hey [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Lee, Young Shin; Ryu, Chung Hyun; Kim, Hyun Su; Lee, Ho Chul; Hong, Song Jin; Choi, Young Jin; Lee, Jae Hyung; Na, Jae Yun [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-03-15

    In this study, the regulatory and analysis conditions for free drop and puncture impact analyses were reviewed to develop a safety assessment technology. The impact analysis was performed with the finite element method, one of several analysis methods applicable to shipping casks. LS-DYNA3D and ABAQUS are suitable for the free drop and puncture impact analyses of a shipping cask. As the analysis model, the KSC-4 shipping cask for transporting spent nuclear fuel was investigated. The results from LS-DYNA3D and ABAQUS were in close agreement, and the integrity of the shipping cask was verified. This study provides regulatory staff with a reliable safety assessment technology, supporting efficient and reliable regulatory work.

  9. Triclosan: A review on systematic risk assessment and control from the perspective of substance flow analysis.

    Science.gov (United States)

    Huang, Chu-Long; Abass, Olusegun K; Yu, Chang-Ping

    2016-10-01

    Triclosan (TCS) is a broad-spectrum antibacterial agent mainly used in pharmaceutical and personal care products. Its increasing use over recent decades has raised its concentration in the environment, with detectable levels commonly found along the food web, from aquatic organisms to humans. To date, information on how to investigate TCS's systematic risk to exposed organisms, including humans, is scarce because systematic data on TCS flows in the anthroposphere are lacking. A more holistic approach to mass-flow balancing is therefore required, so that the systematic risk of TCS in all environmental matrices can be evaluated. From the perspective of Substance Flow Analysis (SFA), this review critically summarizes the current state of knowledge on TCS production, consumption, discharge, occurrence in built and natural environments, its exposure and metabolism in humans, and its negative effects on biota and humans. Recent risk studies have focused mainly on TCS removal efficiencies and metabolism, with less attention to the effect of mass flows from source to fate during risk exposure. Although the data available for TCS SFA are limited, SFA can derive logical systematic information from the limited data currently available for systematic risk assessment and reduction based on mass-flow analysis. In other words, the SFA tool can be used to develop a comprehensive flow chart and indicator system for the risk assessment and reduction of TCS flows in the anthroposphere, bridging knowledge gaps and reducing uncertainties in policy-making on exposure pathways along TCS flow-lines. Finally, specifics of systematic TCS risk assessment via SFA and areas for improving human adaptation to the risks posed by emerging contaminants are identified, and directions for future research are suggested. PMID:27239720

  10. An Application of the Functional Resonance Analysis Method (FRAM) to Risk Assessment of Organisational Change

    Energy Technology Data Exchange (ETDEWEB)

    Hollnagel, Erik [MINES ParisTech Crisis and Risk Research Centre (CRC), Sophia Antipolis Cedex (France)

    2012-11-15

    The objective of this study was to demonstrate an alternative approach to risk assessment of organisational changes, based on the principles of resilience engineering. The approach in question was the Functional Resonance Analysis Method (FRAM). Whereas established approaches focus on risks coming from failure or malfunctioning of components, alone or in combination, resilience engineering focuses on the common functions and processes that provide the basis for both successes and failures. More precisely, resilience engineering proposes that failures represent the flip side of the adaptations necessary to cope with real-world complexity rather than a failure of normal system functions, and that a safety assessment should therefore focus on how functions are carried out rather than on how they may fail. The objective of this study was not to evaluate the current approach to risk assessment used by the organisation in question. The current approach has nevertheless been used as a frame of reference, but in a non-evaluative manner. The author has demonstrated through the selected case that FRAM can be used as an alternative approach to risk assessment of organisational changes. The report provides the reader with details to consider when deciding which analysis approach to use. The choice of approach must reflect the priorities and concerns of the organisation, and the author makes no statement about which approach is better. It is clear that the choice of an analysis approach is not simple to make, and there are many things to take into account, such as the larger working environment, organisational culture, regulatory requirements, etc.

  11. Wavelet transform analysis to assess oscillations in pial artery pulsation at the human cardiac frequency.

    Science.gov (United States)

    Winklewski, P J; Gruszecki, M; Wolf, J; Swierblewska, E; Kunicka, K; Wszedybyl-Winklewska, M; Guminski, W; Zabulewicz, J; Frydrychowski, A F; Bieniaszewski, L; Narkiewicz, K

    2015-05-01

    Pial artery adjustments to changes in blood pressure (BP) may last only seconds in humans. Using a novel method called near-infrared transillumination backscattering sounding (NIR-T/BSS), which allows non-invasive measurement of pial artery pulsation (cc-TQ) in humans, we aimed to assess the relationship between spontaneous oscillations in BP and cc-TQ at frequencies between 0.5 Hz and 5 Hz. We hypothesized that analysis of very short data segments would enable the estimation of changes in the cardiac contribution to the BP vs. cc-TQ relationship during very rapid pial artery adjustments to external stimuli. BP and pial artery oscillations during baseline (70 s and 10 s signals) and in response to maximal breath-hold apnea were studied in eighteen healthy subjects. The cc-TQ was measured using NIR-T/BSS; cerebral blood flow velocity, the pulsatility index and the resistive index were measured using Doppler ultrasound of the left internal carotid artery; heart rate and beat-to-beat systolic and diastolic blood pressure were recorded using a Finometer; end-tidal CO2 was measured using a medical gas analyzer. Wavelet transform analysis was used to assess the relationship between BP and cc-TQ oscillations. The recordings lasting 10 s and representing 10 cycles with a frequency of ~1 Hz provided sufficient accuracy with respect to wavelet coherence and wavelet phase coherence values and yielded results similar to those obtained from approximately 70 cycles (70 s). A slight but significant decrease in wavelet coherence between augmented BP and cc-TQ oscillations was observed by the end of apnea. Wavelet transform analysis can be used to assess the relationship between BP and cc-TQ oscillations at the cardiac frequency using signal intervals as short as 10 s. Apnea slightly decreases the contribution of cardiac activity to BP and cc-TQ oscillations. PMID:25804326

  12. An Application of the Functional Resonance Analysis Method (FRAM) to Risk Assessment of Organisational Change

    International Nuclear Information System (INIS)

    The objective of this study was to demonstrate an alternative approach to risk assessment of organisational changes, based on the principles of resilience engineering. The approach in question was the Functional Resonance Analysis Method (FRAM). Whereas established approaches focus on risks coming from failure or malfunctioning of components, alone or in combination, resilience engineering focuses on the common functions and processes that provide the basis for both successes and failures. More precisely, resilience engineering proposes that failures represent the flip side of the adaptations necessary to cope with real-world complexity rather than a failure of normal system functions, and that a safety assessment should therefore focus on how functions are carried out rather than on how they may fail. The objective of this study was not to evaluate the current approach to risk assessment used by the organisation in question. The current approach has nevertheless been used as a frame of reference, but in a non-evaluative manner. The author has demonstrated through the selected case that FRAM can be used as an alternative approach to risk assessment of organisational changes. The report provides the reader with details to consider when deciding which analysis approach to use. The choice of approach must reflect the priorities and concerns of the organisation, and the author makes no statement about which approach is better. It is clear that the choice of an analysis approach is not simple to make, and there are many things to take into account, such as the larger working environment, organisational culture, regulatory requirements, etc.

  13. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    Science.gov (United States)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe and which system components are most susceptible to failure, and allow evaluation of the combined effects of a severe earthquake on a whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to cascading effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
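
    The loss formulation above (risk as the product of quantified hazard, vulnerability, and exposure) can be sketched in a few lines of Python. The scenario probabilities, damage ratios, and asset values below are illustrative assumptions, not figures from the study:

    ```python
    # Expected annual loss as the sum over scenarios of
    # hazard (event probability) x vulnerability (damage ratio) x exposure (value).
    # All numbers below are illustrative assumptions.

    def expected_loss(scenarios):
        """Sum hazard * vulnerability * exposure over all scenarios."""
        return sum(p * damage_ratio * value for p, damage_ratio, value in scenarios)

    # (annual probability, fraction of exposed value lost, exposed value in $M)
    scenarios = [
        (0.01, 0.30, 500.0),   # frequent moderate shaking
        (0.002, 0.80, 500.0),  # rare severe event with heavy damage
    ]

    loss = expected_loss(scenarios)
    print(round(loss, 2))  # annualized expected loss in $M
    ```

    Summing scenario by scenario in this way is what lets deterministic scenario results and probabilistic hazard estimates feed the same risk ledger.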

  14. Guidebook in using Cost Benefit Analysis and strategic environmental assessment for environmental planning in China

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Environmental planning in China may benefit from greater use of Cost Benefit Analysis (CBA) and Strategic Environmental Assessment (SEA) methodologies. We provide guidance on using these methodologies. Parts I and II present the principles behind the methodologies as well as their theoretical structure. Part III demonstrates the methodologies in action in a range of good-practice examples. The case studies and theoretical expositions are intended to teach by example as well as through an understanding of the principles, and to help planners use the methodologies as correctly as possible.(auth)

  15. Semantic Pattern Analysis for Verbal Fluency Based Assessment of Neurological Disorders

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas R [ORNL; Ainsworth, Keela C [ORNL; Brown, Tyler C [ORNL

    2014-01-01

    In this paper, we present preliminary results of semantic pattern analysis of verbal fluency tests used for assessing cognitive psychological and neuropsychological disorders. We posit that recent advances in semantic reasoning and artificial intelligence can be combined to create a standardized computer-aided diagnosis tool to automatically evaluate and interpret verbal fluency tests. Towards that goal, we derive novel semantic similarity (phonetic, phonemic and conceptual) metrics and present the predictive capability of these metrics on a de-identified dataset of participants with and without neurological disorders.

  16. Three-dimensional modelling and sensitivity analysis for the stability assessment of deep underground repository

    International Nuclear Information System (INIS)

    For the mechanical stability assessment of an underground high-level waste repository, computer simulations using the three-dimensional simulation code FLAC3D were carried out, and important parameters including stress ratio, depth, tunnel size, joint spacing, and joint properties were chosen from a sensitivity analysis of the results. The main effects as well as the interaction effects between the important parameters could be analyzed using a fractional factorial design. In order to analyze the stability of the deposition tunnel and deposition hole in discontinuous rock mass, different models were run under different conditions using 3DEC. From this, the influence of joint distribution and properties, rock properties, and stress ratio could be determined

  17. Forest canopy analysis in the Alpine environment: comparison among assessment methods

    OpenAIRE

    Pastorella F; Paletto A

    2013-01-01

    Forest canopy analysis in the Alpine environment: comparison among assessment methods. Forest canopy is an important ecological feature of forest stands and can be expressed as Leaf Area Index (LAI) or canopy cover percentage. LAI is the ratio between leaf area and ground area (m2 m-2) and it can be measured using an angle of 180°. Instead, the canopy cover is the percentage of forest area occupied by the vertical projection of tree crowns; consequently, LAI expresses the canopy closure rathe...

  18. Developmental assessment of the multidimensional component in RELAP5 for Savannah River Site thermal hydraulic analysis

    International Nuclear Information System (INIS)

    This report documents ten developmental assessment problems which were used to test the multidimensional component in RELAP5/MOD2.5, Version 3w. The problems chosen were a rigid body rotation problem, a pure radial symmetric flow problem, an r-θ symmetric flow problem, a fall problem, a rest problem, a basic one-dimensional flow test problem, a gravity wave problem, a tank draining problem, a flow through the center problem, and coverage analysis using PIXIE. The multidimensional code calculations are compared to analytical solutions and one-dimensional code calculations. The discussion section of each problem contains information relative to the code's ability to simulate these problems

  19. Automatic analysis of CR-39 track detectors for selective assessment of radon and its decay products

    International Nuclear Information System (INIS)

    A system for analyzing data from CR-39 track detectors exposed to radon and its daughter products was developed. The system performs both alpha particle spectroscopic analysis, where for each track the essential geometric parameters are evaluated, and mere counting of tracks; each condition is distinguished by its own chemical etching, microscope and scanning parameters. The spectroscopic technique was applied to the assessment of 210Po embedded in glass and to the discrimination of 222Rn, 218Po and 214Po contributions in passive dosimetry. The counting technique was applied to the determination of indoor radon concentration with passive dosemeters containing CR-39 detectors

  20. Application of PRA (probabilistic risk assessments) methods for fire risk analysis

    International Nuclear Information System (INIS)

    Fire as a contributor to nuclear power plant risk has been evaluated extensively in more than 15 large-scale probabilistic risk assessments (PRA). Since their first application 9 years ago, these studies have shown that fire can be an important contributor to plant risk, an extremely plant-specific conclusion. In this article, the evolution of the application of fire risk analysis to nuclear plants is summarized. Special attention is given to Appendix R, 10CFR50; the two-stage screening approach; multilocation fires; smoke propagation; adverse effects of fire protection systems; effectiveness of fire protection systems; and fires from earthquakes. 15 refs

  1. Body electrical loss analysis (BELA) in the assessment of visceral fat: a demonstration

    OpenAIRE

    Blomqvist Kim H; Lundbom Jesper; Lundbom Nina; Sepponen Raimo E

    2011-01-01

    Abstract Background Body electrical loss analysis (BELA) is a new non-invasive way to assess visceral fat depot size through the use of electromagnetism. BELA has worked well in phantom measurements, but the technology is not yet fully validated. Methods Ten volunteers (5 men and 5 women, age: 22-60 y, BMI: 21-30 kg/m2, waist circumference: 73-108 cm) were measured with the BELA instrument and with cross-sectional magnetic resonance imaging (MRI) at the navel level, navel +5 cm and navel -5 c...

  2. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    A microgrid is a cluster of power generation, consumption and storage systems capable of operating either independently or as part of a macrogrid. The mechanical condition of the power production units, such as the small wind turbines, is considered of crucial importance especially in the case of...... islanded operation. In this paper, the fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA the fault related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5...

  3. Analysis of common cause failures in the Japanese Nuclear Power Plants for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Common cause failure (CCF) is one of the critical risk sources for nuclear power plants. To reflect the operating experience of the Japanese nuclear power plants (NPPs) in probabilistic risk assessment (PRA), CCFs were analyzed and CCF model parameters (alpha factors and MGL parameters) for PRA were estimated using the component failure records in the NUClear Information Archives (NUCIA) for the Japanese nuclear power plants. The methodology follows the standard procedure used by the U.S. Nuclear Regulatory Commission. The results will be released for use in Japanese PRAs after review and validation by the utilities. (author)
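
    The alpha-factor parameters mentioned above have a simple point estimate: alpha_k is the fraction of observed failure events in which exactly k components of a common-cause group failed together. A minimal sketch with invented event counts, not NUCIA data:

    ```python
    # Alpha-factor point estimation for a common-cause group of size m:
    # alpha_k = n_k / sum_j n_j, where n_k is the number of events in which
    # exactly k components failed together. Counts below are illustrative.

    def alpha_factors(event_counts):
        """event_counts[k-1] = number of events with exactly k failed components."""
        total = sum(event_counts)
        return [n / total for n in event_counts]

    # Example for a group of m = 3 redundant components:
    counts = [90, 8, 2]  # 90 single failures, 8 double, 2 triple
    alphas = alpha_factors(counts)
    print([round(a, 3) for a in alphas])  # [0.9, 0.08, 0.02]
    ```

    In practice the estimate is refined with impact-vector weighting and uncertainty bounds, but this ratio is the core of the model.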

  4. Hydrodynamic analysis, performance assessment, and actuator design of a flexible tail propulsor in an artificial alligator

    International Nuclear Information System (INIS)

    The overall objective of this research is to develop analysis tools for determining actuator requirements and assessing viable actuator technology for design of a flexible tail propulsor in an artificial alligator. A simple hydrodynamic model that includes both reactive and resistive forces along the tail is proposed and the calculated mean thrust agrees well with conventional estimates of drag. Using the hydrodynamic model forces as an input, studies are performed for an alligator ranging in size from 1 cm to 2 m at swimming speeds of 0.3–1.8 body lengths per second containing five antagonistic pairs of actuators distributed along the length of the tail. Several smart materials are considered for the actuation system, and preliminary analysis results indicate that the acrylic electroactive polymer and the flexible matrix composite actuators are potential artificial muscle technologies for the system

  5. Human reliability analysis in Wolsong 2/3/4 nuclear power plants probabilistic safety assessment

    International Nuclear Information System (INIS)

    The Level 1 probabilistic safety assessment (PSA) for the Wolsong (WS) 2/3/4 nuclear power plants (NPPs), then in the design stage, was performed using methodologies equivalent to those of PWR PSA. The human reliability analysis (HRA) used the Accident Sequence Evaluation Program (ASEP) HRA procedure and the Technique for Human Error Rate Prediction (THERP). The purpose of this paper is to introduce the procedure and methodology of the HRA in the WS 2/3/4 NPPs PSA. This paper also describes the interim results of the importance analysis for the human actions modeled in the WS 2/3/4 PSA, and the findings and recommendations on the administrative control of the secondary control area from a human factors point of view

  6. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability to anticipate ground shaking from future strong earthquakes before they can appropriately be used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not use empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake.
In addition

  7. Systematic approach to integration of a human reliability analysis into an NPP probabilistic risk assessment

    International Nuclear Information System (INIS)

    This chapter describes the human reliability analysis tasks which were employed in the evaluation of the overall probability of an internal flood sequence and its consequences in terms of disabling vulnerable risk significant equipment. Topics considered include the problem familiarization process, the identification and classification of key human interactions, a human interaction review of potential initiators, a maintenance and operations review, human interaction identification, quantification model selection, the definition of operator-induced sequences, the quantification of specific human interactions, skill- and rule-based interactions, knowledge-based interactions, and the incorporation of human interaction-related events into the event tree structure. It is concluded that an integrated approach to the analysis of human interaction within the context of a Probabilistic Risk Assessment (PRA) is feasible

  8. Feature-based analysis for quality assessment of x-ray computed tomography measurements

    International Nuclear Information System (INIS)

    This paper presents an approach to assess the quality of the data extracted with computed tomography (CT) measuring systems to perform geometrical evaluations. The approach consists in analyzing the error features introduced by the CT measuring system during the extraction operation. The analysis of the features is performed qualitatively (using graphical analysis tools) and/or quantitatively (by means of the root-mean-square deviation parameter of the error features). The approach was used to analyze four sets of measurements performed with an industrial x-ray cone beam CT measuring system. Three test parts were used in the experiments: a high accuracy manufacturing multi-wave standard, a calibrated step cylinder and a calibrated production part. The results demonstrate the usefulness of the approach to gain knowledge on CT measuring processes and improve the quality of CT geometrical evaluations. Advantages and limitations of the approach are discussed. (paper)
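
    The quantitative branch of the approach, summarizing an error feature by its root-mean-square deviation, reduces to a one-line formula. The deviation values below are invented placeholders, not measurements from the paper:

    ```python
    import math

    # RMS of the deviations between CT-extracted geometry and a reference,
    # used here as the single quantitative summary of an error feature.
    def rms_deviation(errors):
        return math.sqrt(sum(e * e for e in errors) / len(errors))

    deviations_mm = [0.012, -0.008, 0.015, -0.011, 0.004]
    print(round(rms_deviation(deviations_mm), 4))  # 0.0107 (mm)
    ```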

  9. Association rule analysis for the assessment of the risk of coronary heart events.

    Science.gov (United States)

    Karaolis, M; Moutiris, J A; Papaconstantinou, L; Pattichis, C S

    2009-01-01

    Although significant progress has been made in the diagnosis and treatment of coronary heart disease (CHD), further investigation is still needed. The objective of this study was to develop a data mining system using association analysis based on the apriori algorithm for the assessment of heart-event-related risk factors. The events investigated were: myocardial infarction (MI), percutaneous coronary intervention (PCI), and coronary artery bypass graft surgery (CABG). A total of 369 cases were collected from the Paphos CHD Survey, most of them with more than one event. The most important risk factors, as extracted from the association rule analysis, were: sex (male), smoking, high-density lipoprotein, glucose, family history, and history of hypertension. Most of these risk factors were also extracted by our group in a previous study using the C4.5 decision tree algorithm, and by other investigators. Further investigation with larger data sets is still needed to verify these findings. PMID:19965088
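
    The association analysis described can be sketched as a minimal apriori-style pass: frequent itemsets filtered by minimum support, then rules filtered by minimum confidence. The patient records and thresholds below are invented placeholders, not data from the Paphos CHD Survey:

    ```python
    from itertools import combinations

    def frequent_itemsets(transactions, min_support):
        """All itemsets whose support (fraction of transactions) meets the threshold."""
        counts = {}
        for t in transactions:
            for r in range(1, len(t) + 1):
                for combo in combinations(sorted(t), r):
                    counts[combo] = counts.get(combo, 0) + 1
        n = len(transactions)
        return {s: c / n for s, c in counts.items() if c / n >= min_support}

    def rules(freq, min_conf):
        """Rules lhs -> rhs with confidence support(itemset) / support(lhs)."""
        out = []
        for itemset, supp in freq.items():
            for r in range(1, len(itemset)):
                for lhs in combinations(itemset, r):
                    if lhs in freq and supp / freq[lhs] >= min_conf:
                        rhs = tuple(x for x in itemset if x not in lhs)
                        out.append((lhs, rhs, supp / freq[lhs]))
        return out

    records = [
        {"male", "smoking", "MI"},
        {"male", "smoking", "hypertension", "MI"},
        {"male", "glucose", "PCI"},
        {"smoking", "hypertension", "MI"},
    ]
    freq = frequent_itemsets(records, min_support=0.5)
    for lhs, rhs, conf in rules(freq, min_conf=0.75):
        print(lhs, "->", rhs, round(conf, 2))
    ```

    Enumerating every subset of each record, as here, is exponential in record size; the real apriori algorithm prunes candidates level by level, which matters for records with many attributes.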

  10. Compost maturity assessment using physicochemical, solid-state spectroscopy, and plant bioassay analysis.

    Science.gov (United States)

    Kumar, D Senthil; Kumar, P Satheesh; Rajendran, N M; Anbuganapathi, G

    2013-11-27

    The vermicompost produced from flower waste inoculated with biofertilizers was subjected to compost maturity tests: (i) physicochemical methods (pH, OC, TN, C:N); (ii) solid-state spectroscopic analysis (FTIR and (13)C CPMAS NMR); and (iii) a plant bioassay (germination index). The pH of the vermicompost shifted toward neutral, and the reduced C:N ratio shows the breakdown of complex organic materials into simple minerals, indicating greater maturity of the experimental vermicompost product than the control. The increased aliphatic fraction incorporated from the flower residues may be due to the synthesis of alkyl, O-alkyl, and COO groups by the microbes present in the gut of the earthworm. Plant bioassays are considered the most conventional assessment in compost maturity analysis, and accordingly the effect of vermicompost maturity on the germination index of Vigna mungo is shown. PMID:24191667
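
    The germination index (GI) used in the plant bioassay is commonly computed as relative seed germination times relative root elongation against a water control; the exact protocol in this study may differ, and the counts and lengths below are invented:

    ```python
    # GI = (germinated_sample / germinated_control)
    #    * (root_length_sample / root_length_control) * 100
    # A GI around 80 or more is conventionally read as mature, non-phytotoxic compost.

    def germination_index(germ_sample, germ_control, root_sample, root_control):
        return (germ_sample / germ_control) * (root_sample / root_control) * 100.0

    # e.g. Vigna mungo seeds in vermicompost extract vs. a distilled-water control
    gi = germination_index(germ_sample=18, germ_control=20,
                           root_sample=42.0, root_control=40.0)
    print(round(gi, 1))  # 94.5
    ```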

  11. Evolution and Implementation of the NASA Robotic Conjunction Assessment Risk Analysis Concept of Operations

    Science.gov (United States)

    Newman, Lauri K.; Frigm, Ryan C.; Duncan, Matthew G.; Hejduk, Matthew D.

    2014-01-01

    Reacting to potential on-orbit collision risk in an operational environment requires timely and accurate communication and exchange of data, information, and analysis to ensure informed decision-making for safety of flight and responsible use of the shared space environment. To accomplish this mission, it is imperative that all stakeholders effectively manage resources: devoting necessary and potentially intensive resource commitment to responding to high-risk conjunction events and preventing unnecessary expenditure of resources on events of low collision risk. After 10 years of operational experience, the NASA Robotic Conjunction Assessment Risk Analysis (CARA) is modifying its Concept of Operations (CONOPS) to ensure this alignment of collision risk and resource management. This evolution manifests itself in the approach to characterizing, reporting, and refining of collision risk. Implementation of this updated CONOPS is expected to have a demonstrated improvement on the efficacy of JSpOC, CARA, and owner/operator resources.

  12. The tsunami probabilistic risk assessment of nuclear power plant (3). Outline of tsunami fragility analysis

    International Nuclear Information System (INIS)

    The Tsunami Probabilistic Risk Assessment (PRA) standard was issued in February 2012 by the Standard Committee of the Atomic Energy Society of Japan (AESJ). This article details the tsunami fragility analysis, which calculates the damage probability of buildings and structures contributing to core damage and consists of five evaluation steps: (1) selection of the evaluated element and damage mode, (2) selection of the evaluation procedure, (3) evaluation of actual stiffness, (4) evaluation of actual response, and (5) evaluation of fragility (damage probability and others). As an application example of the standard, calculation results of a tsunami fragility analysis investigation by the tsunami PRA subcommittee of the AESJ were shown, reflecting the latest knowledge of damage states caused by wave force and other loads exerted by the tsunami from the 'off the Pacific Coast of Tohoku Earthquake'. (T. Tanaka)
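
    Step (5), the fragility evaluation, is commonly expressed as a lognormal curve: the damage probability at a given tsunami load is Phi(ln(demand / median capacity) / beta). A sketch under that common assumption, with illustrative numbers not taken from the standard:

    ```python
    import math

    def normal_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def fragility(demand, median_capacity, beta):
        """Lognormal fragility: P(damage) at the given load (e.g. inundation depth)."""
        return normal_cdf(math.log(demand / median_capacity) / beta)

    # Illustrative: median capacity 10 m inundation depth, log-std beta = 0.4
    for depth in (4.0, 8.0, 12.0):
        print(depth, round(fragility(depth, 10.0, 0.4), 3))
    ```

    The median capacity and beta are where the "actual stiffness" and "actual response" evaluations of steps (3) and (4) enter the calculation.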

  13. Assessment of diversity indices for the characterization of the soil prokaryotic community by metagenomic analysis

    Science.gov (United States)

    Chernov, T. I.; Tkhakakhova, A. K.; Kutovaya, O. V.

    2015-04-01

    The diversity indices used in ecology for assessing the metagenomes of soil prokaryotic communities at different phylogenetic levels were compared. The following indices were considered: the number of detected taxa and the Shannon, Menhinick, Margalef, Simpson, Chao1, and ACE indices. The diversity analysis of the prokaryotic communities in the upper horizons of a typical chernozem (Haplic Chernozem (Pachic)), a dark chestnut soil (Haplic Kastanozem (Chromic)), and an extremely arid desert soil (Endosalic Calcisol (Yermic)) was based on the analysis of 16S rRNA genes. The Menhinick, Margalef, Chao1, and ACE indices gave similar results for the classification of the communities according to their diversity levels; the Simpson index gave good results only for the high-level taxa (phyla); the best results were obtained with the Shannon index. In general, all the indices used showed a decrease in the diversity of the soil prokaryotes in the following sequence: chernozem > dark chestnut soil > extremely arid desert soil.
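
    The indices compared above are all simple functions of the per-taxon abundance vector: richness S, Shannon H = -sum p_i ln p_i, Simpson 1 - sum p_i^2, Menhinick S / sqrt(N), Margalef (S - 1) / ln N, and Chao1 S + F1^2 / (2 F2), with F1 singletons and F2 doubletons. A sketch with an invented count vector, not the 16S rRNA data:

    ```python
    import math

    def diversity_indices(counts):
        """Common diversity indices from a vector of per-taxon counts."""
        n = sum(counts)                        # total reads/individuals
        s = sum(1 for c in counts if c > 0)    # observed taxa (richness)
        p = [c / n for c in counts if c > 0]   # relative abundances
        f1 = sum(1 for c in counts if c == 1)  # singletons
        f2 = sum(1 for c in counts if c == 2)  # doubletons
        return {
            "richness": s,
            "shannon": -sum(pi * math.log(pi) for pi in p),
            "simpson": 1 - sum(pi * pi for pi in p),
            "menhinick": s / math.sqrt(n),
            "margalef": (s - 1) / math.log(n),
            "chao1": (s + (f1 * f1) / (2 * f2)) if f2 else s,
        }

    idx = diversity_indices([50, 30, 10, 5, 2, 1, 1, 1])
    print({k: round(v, 3) for k, v in idx.items()})
    ```

    Chao1's reliance on singleton and doubleton counts is why it (like ACE) reacts strongly to rare taxa, whereas Simpson is dominated by the most abundant ones.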

  14. Y-12 site-specific earthquake response analysis and soil liquefaction assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, S.B.; Hunt, R.J.; Manrod, W.E. III

    1995-09-29

    A site-specific earthquake response analysis and soil liquefaction assessment were performed for the Oak Ridge Y-12 Plant. The main purpose of these studies was to use the results of the analyses to evaluate the safety of the performance category (PC) -1, -2, and -3 facilities against natural phenomena seismic hazards. Earthquake response was determined for seven (7) one-dimensional soil columns (Fig. 12) using two horizontal components of the PC-3 design basis 2000-year seismic event. The computer program SHAKE91 (Ref. 7) was used to calculate the absolute response accelerations on top of ground (soil/weathered shale) and rock outcrop. The SHAKE program has been validated for horizontal response calculations at periods of less than 2.0 seconds at several sites and is consequently widely accepted in geotechnical earthquake engineering for site response analysis.

  15. On sustainability assessment of technical systems. Experience from systems analysis with the ORWARE and Ecoeffect tools

    Energy Technology Data Exchange (ETDEWEB)

    Assefa, Getachew [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Chemical Engineering

    2006-06-15

    Engineering research and development work is undergoing a reorientation, albeit at a slow pace, from focusing on specific parts of different systems to a broader systems-level perspective. This reorientation should be further developed and enhanced with the aim of organizing and structuring our technical systems to meet sustainability requirements in the face of global ecological threats that have far-reaching social and economic implications, which can no longer be captured using conventional research approaches. Until a list of universally acceptable, clear, and measurable indicators of sustainable development is developed, work on sustainability metrics should continue to evolve as a relative measure of the ecological, economic, and social performance of human activities in general, and technical systems in particular. This can be done by comparing the relative performance of alternative technologies providing the same well-defined function or service, or by characterizing technologies that enjoy different levels of societal priority using relevant performance indicators. In both cases, concepts and methods of industrial ecology play a vital role. This thesis concerns the development and application of a systematic approach to assessing the performance of technical systems from the perspectives of systems analysis, sustainability, sustainability assessment, and industrial ecology. The systematic approach developed and characterized in this thesis advocates a simultaneous assessment of the ecological, economic, and social dimensions of the performance of technologies, so as to avoid sub-optimization and problem shifting between dimensions. It gives a holistic picture by taking a life-cycle perspective of all important aspects. The systematic assessment of technical systems provides an even-handed assessment resulting in cumulative knowledge. A modular structure of the approach makes it flexible enough in terms of comparing a number of

  16. Use of Multivariate Analysis to Assess the Nutritional Condition of Fish Larvae From Nucleic Acids and Protein Content

    OpenAIRE

    Cunha, Isabel; Saborido-Rey, Fran; Planas, Miguel

    2003-01-01

    The nutritional condition of turbot larvae (Scophthalmus maximus) was assessed by a multivariate analysis with DNA, RNA, and protein content as input variables. Special attention was given to the time that feeding began and to the timing and duration of starvation. The combination of the principal components analysis and the stepwise discriminant analysis, both techniques of multivariate analysis, made it possible to allocate the larvae into groups that were defined and identified based on si...

  17. Application of Fuzzy Set Theory for Uncertainty Analysis in the Probabilistic Safety Assessment of Nuclear Power Plants

    International Nuclear Information System (INIS)

    The paper discusses the application of fuzzy set theory for uncertainty analysis in the NPP probabilistic safety assessment as an alternative to statistical methods. Results obtained with the Monte Carlo method and fuzzy set theory to assess the probability and uncertainty of failure of the safety function performed by the passive emergency core cooling system are compared
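    The contrast drawn in this record can be illustrated by propagating two uncertain failure probabilities through a 1-out-of-2 redundancy model. This is a sketch under assumed triangular fuzzy numbers with illustrative values, not the system analysed in the record: the alpha-cut endpoints bound the result at each membership level, while Monte Carlo yields a sampled distribution.

```python
import random

def unavailability(p1, p2):
    # A 1-out-of-2 redundant system is unavailable only if both trains fail
    return p1 * p2

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

# Illustrative triangular fuzzy failure probabilities (low, mode, high)
P1 = (1e-3, 2e-3, 4e-3)
P2 = (2e-3, 3e-3, 6e-3)

# Fuzzy propagation: the product is increasing in both arguments, so the
# image of each alpha-cut is given by its endpoints.
fuzzy_bounds = {}
for alpha in (0.0, 0.5, 1.0):
    lo1, hi1 = alpha_cut(P1, alpha)
    lo2, hi2 = alpha_cut(P2, alpha)
    fuzzy_bounds[alpha] = (unavailability(lo1, lo2), unavailability(hi1, hi2))

# Monte Carlo counterpart: sample the same triangular shapes as distributions.
random.seed(0)
mc = [unavailability(random.triangular(P1[0], P1[2], P1[1]),
                     random.triangular(P2[0], P2[2], P2[1]))
      for _ in range(20000)]
mc_mean = sum(mc) / len(mc)
```

    The alpha = 1 cut collapses to the modal estimate, and the alpha = 0 cut spans the full support; the Monte Carlo mean falls inside those bounds but carries distributional shape that the fuzzy intervals do not.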

  18. The Assessment of a Tutoring Program to Meet CAS Standards Using a SWOT Analysis and Action Plan

    Science.gov (United States)

    Fullmer, Patricia

    2009-01-01

    This article summarizes the use of SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and subsequent action planning as a tool of self-assessment to meet CAS (Council for the Advancement of Standards in Higher Education) requirements for systematic assessment. The use of the evaluation results to devise improvements to increase the…

  19. Left ventricular synchrony assessed by phase analysis of gated myocardial perfusion SPECT imaging in healthy subjects

    International Nuclear Information System (INIS)

    Objective: To investigate the value of Cedars-Sinai quantitative gated SPECT (QGS) phase analysis for left ventricular synchrony assessment in healthy subjects. Methods: Seventy-four healthy subjects (41 males, 33 females, average age: (60±13) years) underwent both rest and exercise 99Tcm-MIBI G-MPI. QGS software was used to analyze the reconstructed rest gated SPECT images automatically, and the parameters of left ventricular synchrony, including phase bandwidth (BW) and phase standard deviation (SD), were obtained. The influences of gender and age (age<60 years, n=36; age≥60 years, n=38) on left ventricular systolic synchrony were analyzed. The phase angle for original segmental contraction was measured to determine the onset of ventricular contraction using a 17-segment model. Forty healthy subjects were selected by a simple random sampling method to evaluate the intra-observer and inter-observer repeatability of the QGS phase analysis software. The two-sample t test and linear correlation analysis were used to analyze the data. Results: The BW and SD of the left ventricle in healthy subjects were (37.22±11.71)° and (11.84±5.39)°, respectively. Comparisons between males and females for BW and SD yielded no statistical significance (BW: (36.00±9.70)°, (38.73±13.84)°; SD: (11.88±5.56)°, (11.79±5.26)°; t=0.96 and -0.07, both P>0.05); whereas the older subjects (age≥60 years) had larger BW than the younger ones (age<60 years; (39.95±12.65)°, (34.33±10.00)°; t=-2.11, P<0.05), and no statistical significance was shown for SD between the two age groups ((11.18±4.31)°, (12.54±6.33)°; t=1.08, P>0.05). Of the 74 subjects, the mechanical activation started from the ventricular base to the apex in 54 subjects (73%), and from the apex to the base in only 20 subjects (27%). High repeatability of phase analysis was observed for both intra-observer and inter-observer measurements (r=0.867-0.906, all P<0.001). Conclusions: Good left ventricular segmental synchrony is shown in healthy
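    The two synchrony parameters reported above can be approximated from a set of segmental phase angles. This is a schematic reconstruction, not the proprietary QGS algorithm: SD taken as the standard deviation of the phase samples, and BW as the narrowest window containing 95% of them.

```python
def phase_sd(angles):
    """Standard deviation of phase angles (degrees) over the sampled myocardium."""
    n = len(angles)
    mean = sum(angles) / n
    return (sum((a - mean) ** 2 for a in angles) / n) ** 0.5

def phase_bandwidth(angles, fraction=0.95):
    """Narrowest phase window (degrees) containing the given fraction of samples."""
    s = sorted(angles)
    k = max(1, round(fraction * len(s)))
    # slide a window of k consecutive sorted angles and keep the tightest span
    return min(s[i + k - 1] - s[i] for i in range(len(s) - k + 1))
```

    A dyssynchronous ventricle spreads its onset-of-contraction angles over a wider window, which raises both parameters relative to the healthy values quoted above.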

  20. Review and analysis of parameters for assessing transport of environmentally released radionuclides through agriculture

    Energy Technology Data Exchange (ETDEWEB)

    Baes, C.F. III; Sharp, R.D.; Sjoreen, A.L.; Shor, R.W.

    1984-09-01

    Most of the default parameters incorporated into the TERRA computer code are documented, including a literature review and systematic analysis of the element-specific transfer parameters Bv, Br, Fm, Ff, and Kd. This review and analysis suggests default values which are consistent with the modeling approaches taken in TERRA and may be acceptable for most assessment applications of the computer code. However, particular applications of the code and additional analysis of elemental transport may require alternative default values. Use of the values reported herein in other computer codes simulating terrestrial transport is not advised without careful interpretation of the limitations and scope of these analyses. An approach to the determination of vegetation-specific interception fractions is also discussed. The limitations of this approach are many, and its use indicates the need for analysis of deposition, interception, and weathering processes. Judgement must be exercised in interpreting the plant surface concentrations generated. Finally, the location-specific agricultural, climatological, and population parameters in the default SITE data base are documented. These parameters are intended as alternatives to the average values currently used. Indeed, areas in the United States where intensive crop, milk, or beef production occurs will be reflected in the parameter values, as will areas where little agricultural activity occurs. However, the original information sources contained some small errors, and the interpolation and conversion methods used will add more. Parameters used in TERRA that are not discussed herein are discussed in the companion report to this one, ORNL-5785. In the companion report the models employed in TERRA and the coding of TERRA are discussed. These reports together provide documentation of the TERRA code and its use in assessments. 96 references, 78 figures, 21 tables.

  1. Review and analysis of parameters for assessing transport of environmentally released radionuclides through agriculture

    International Nuclear Information System (INIS)

    Most of the default parameters incorporated into the TERRA computer code are documented, including a literature review and systematic analysis of the element-specific transfer parameters Bv, Br, Fm, Ff, and Kd. This review and analysis suggests default values which are consistent with the modeling approaches taken in TERRA and may be acceptable for most assessment applications of the computer code. However, particular applications of the code and additional analysis of elemental transport may require alternative default values. Use of the values reported herein in other computer codes simulating terrestrial transport is not advised without careful interpretation of the limitations and scope of these analyses. An approach to the determination of vegetation-specific interception fractions is also discussed. The limitations of this approach are many, and its use indicates the need for analysis of deposition, interception, and weathering processes. Judgement must be exercised in interpreting the plant surface concentrations generated. Finally, the location-specific agricultural, climatological, and population parameters in the default SITE data base are documented. These parameters are intended as alternatives to the average values currently used. Indeed, areas in the United States where intensive crop, milk, or beef production occurs will be reflected in the parameter values, as will areas where little agricultural activity occurs. However, the original information sources contained some small errors, and the interpolation and conversion methods used will add more. Parameters used in TERRA that are not discussed herein are discussed in the companion report to this one, ORNL-5785. In the companion report the models employed in TERRA and the coding of TERRA are discussed. These reports together provide documentation of the TERRA code and its use in assessments. 96 references, 78 figures, 21 tables

  2. Quality in environmental science for policy: Assessing uncertainty as a component of policy analysis

    International Nuclear Information System (INIS)

    The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete notions of uncertainty leading to different and incompatible uncertainty classifications. One of the most salient shortcomings of present-day practice is that most of these classifications focus on quantifying uncertainty while ignoring the qualitative aspects that tend to be decisive in the interface between science and policy. Consequently, the current practices of uncertainty analysis contribute to increasing the perceived precision of scientific knowledge, but do not adequately address its lack of socio-political relevance. The 'positivistic' uncertainty analysis models (like those that dominate the fields of climate change modelling and nuclear or chemical risk assessment) have little social relevance, as they do not influence negotiations between stakeholders. From the perspective of the science-policy interface, the current practices of uncertainty analysis are incomplete and incorrectly focused. We argue that although scientific knowledge produced and used in a context of political decision-making embodies traditional scientific characteristics, it also holds additional properties linked to its influence on social, political, and economic relations. Therefore, the significance of uncertainty cannot be assessed based on quality criteria that refer to the scientific content only; uncertainty must also include quality criteria specific to the properties and roles of this scientific knowledge within political, social, and economic contexts and processes. We propose a conceptual framework designed to account for such substantive, contextual, and procedural criteria of knowledge quality. At the same time, the proposed framework includes and synthesizes the various

  3. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    International Nuclear Information System (INIS)

    Academic literature has been continuously growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence the need for quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at the macroscopic (network architecture), mesoscopic (subgraph) and microscopic (node) levels in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that SEA is a multidisciplinary subject; the SEABN belongs to the class of real small-world networks with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors, while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals

  4. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    Energy Technology Data Exchange (ETDEWEB)

    Caschili, Simone, E-mail: s.caschili@ucl.ac.uk [UCL QASER Lab, University College London, Gower Street, London WC1E 6BT (United Kingdom); De Montis, Andrea; Ganciu, Amedeo; Ledda, Antonio; Barra, Mario [Dipartimento di Agraria, University of Sassari, viale Italia, 39, 07100 Sassari (Italy)

    2014-07-01

    Academic literature has been continuously growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence the need for quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at the macroscopic (network architecture), mesoscopic (subgraph) and microscopic (node) levels in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that SEA is a multidisciplinary subject; the SEABN belongs to the class of real small-world networks with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors, while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals.
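    The core of such a bibliographic network is simple to express: papers are nodes, and a directed edge runs from a citing paper to a cited one. A minimal sketch with hypothetical paper IDs (not the actual SEABN data), computing a node-level (microscopic) measure and a network-level (macroscopic) one:

```python
from collections import defaultdict

# Toy citation list: (citing_paper, cited_paper) pairs with made-up IDs
citations = [("P2", "P1"), ("P3", "P1"), ("P3", "P2"), ("P4", "P1"),
             ("P4", "P3"), ("P5", "P2"), ("P5", "P4")]

in_deg = defaultdict(int)   # times a paper is cited
out_deg = defaultdict(int)  # references a paper makes
nodes = set()
for citing, cited in citations:
    in_deg[cited] += 1
    out_deg[citing] += 1
    nodes.update((citing, cited))

# Microscopic view: the most cited publication in the toy network
most_cited = max(nodes, key=lambda n: in_deg[n])
# Macroscopic view: density of the directed network
density = len(citations) / (len(nodes) * (len(nodes) - 1))
```

    On real data the same degree counts, computed over thousands of records, identify the field's most cited publications and most prolific authors.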

  5. Using Habitat Equivalency Analysis to Assess the Cost Effectiveness of Restoration Outcomes in Four Institutional Contexts

    Science.gov (United States)

    Scemama, Pierre; Levrel, Harold

    2016-01-01

    At the national level, with a fixed amount of resources available for public investment in the restoration of biodiversity, it is difficult to prioritize alternative restoration projects. One way to do this is to assess the level of ecosystem services delivered by these projects and to compare them with their costs. The challenge is to derive a common unit of measurement for ecosystem services in order to compare projects which are carried out in different institutional contexts having different goals (application of environmental laws, management of natural reserves, etc.). This paper assesses the use of habitat equivalency analysis (HEA) as a tool to evaluate ecosystem services provided by restoration projects developed in different institutional contexts. This tool was initially developed to quantify the level of ecosystem services required to compensate for non-market impacts coming from accidental pollution in the US. In this paper, HEA is used to assess the cost effectiveness of several restoration projects in relation to different environmental policies, using case studies based in France. Four case studies were used: the creation of a market for wetlands, public acceptance of a port development project, the rehabilitation of marshes to mitigate nitrate loading to the sea, and the restoration of streams in a protected area. Our main conclusion is that HEA can provide a simple tool to clarify the objectives of restoration projects, to compare the cost and effectiveness of these projects, and to carry out trade-offs, without requiring significant amounts of human or technical resources.

  6. Literature review and analysis of the application of health outcome assessment instruments in Chinese medicine

    Institute of Scientific and Technical Information of China (English)

    Feng-bin Liu; Zheng-kun Hou; Yun-ying Yang; Pei-wu Li; Qian-wen Li; Nelson Xie; Jing-wei Li

    2013-01-01

    OBJECTIVE: To evaluate the application of health assessment instruments in Chinese medicine. METHODS: According to a pre-defined search strategy, a comprehensive literature search for all articles published in China National Knowledge Infrastructure databases was conducted. The resulting articles that met the defined inclusion and exclusion criteria were used for analysis. RESULTS: A total of 97 instruments for health outcome assessment in Chinese medicine have been used in fundamental and theoretical research, and 14 of these were also used in 29 clinical trials that were randomized controlled trials, or descriptive or cross-sectional studies. In 2152 Chinese medicine-based studies that used instruments in their methodology, more than 150 questionnaires were identified. Among the identified questionnaires, 51 were used in more than 10 articles (0.5%). Most of these instruments were developed in Western countries and few studies (4%) used the instrument as the primary evidence for their conclusions. CONCLUSION: Usage of instruments for health outcome assessment in Chinese medicine is increasing rapidly; however, current limitations include selection rationale, result interpretation and standardization, which must be addressed accordingly.

  7. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    International Nuclear Information System (INIS)

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called ''MELCOR Verification, Benchmarking, and Applications,'' whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR

  8. Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.

    Science.gov (United States)

    Bastani, Mehrad; Celik, Nurcin

    2015-10-01

    Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. Here, we first review the related standard industrial codes for the major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of major risks is conducted based on the data collected, using Bayesian data analysis and predictive methods. The risks estimated in this study for the period of 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results have shown that the injury rates among refuse collectors for both musculoskeletal and dermal injuries have decreased from 88 and 15 to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed for the injury rates among recycling workers, for whom musculoskeletal and dermal injuries have increased from 13 and 4 to 14 and 6 injuries per 1000 workers, respectively. Lastly, a linear regression model has been proposed to identify the major drivers of the high numbers of musculoskeletal and dermal injuries. PMID:26219294
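    The Bayesian machinery behind such a rate assessment can be as simple as a conjugate Gamma-Poisson update: with a Gamma(a, b) prior on the injury rate and k observed injuries over an exposure of n units (here, thousands of worker-years), the posterior is Gamma(a + k, b + n). A sketch with illustrative numbers, not the study's actual data:

```python
# Conjugate update: Gamma(a, b) prior + Poisson likelihood -> Gamma(a + k, b + n)
def posterior(a, b, k, n):
    """Posterior Gamma parameters after k events over exposure n."""
    return a + k, b + n

def gamma_mean(a, b):
    """Mean of a Gamma(a, b) distribution (shape a, rate b)."""
    return a / b

# Weakly informative prior, then an update with hypothetical counts:
# 16 musculoskeletal injuries observed per 1000 workers (n = 1 unit of exposure)
a0, b0 = 1.0, 0.1
a1, b1 = posterior(a0, b0, k=16, n=1.0)
rate_est = gamma_mean(a1, b1)   # posterior point estimate of the injury rate
```

    The posterior mean shrinks the raw rate toward the prior, which is what makes the approach robust for the small worker populations typical of these data.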

  9. Efficiency assessment of coal mine safety input by data envelopment analysis

    Institute of Scientific and Technical Information of China (English)

    TONG Lei; DING Ri-jia

    2008-01-01

    In recent years improper allocation of safety input has prevailed in coal mines in China, which has resulted in frequent accidents in coal mining operations. A comprehensive assessment of the input efficiency of coal mine safety should lead to improved efficiency in the use of funds and management resources. This helps government and enterprise managers better understand how safety inputs are used and optimize the allocation of resources. A study of the efficiency assessment of coal mine safety input was conducted in this paper. A C2R model with a non-Archimedean infinitesimal vector based on output is established after consideration of the input characteristics and the model properties. An assessment of an operating mine was done using a specific set of input and output criteria. It was found that the safety input was efficient in 2002 and 2005 and weakly efficient in 2003. However, the efficiency was relatively low in both 2001 and 2004. The safety input resources can be optimized and adjusted by means of projection theory. Such analysis shows that, on average in 2001 and 2004, 45% of the expended funds could have been saved. Likewise, 10% of the safety management and technical staff could have been eliminated, and working hours devoted to safety could have been reduced by 12%. These conditions could have given the same results.
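    In the simplest single-input, single-output case, the CCR (C2R) efficiency of each decision-making unit reduces to its output/input ratio normalized by the best ratio observed; the paper's multi-input model requires a linear program per unit, but the idea is the same. A sketch with hypothetical figures, not the mine's actual data:

```python
# Each mine-year as a decision-making unit: (safety input, safety output index).
# The numbers are illustrative only.
units = {"2001": (5.0, 60.0), "2002": (4.0, 80.0), "2003": (4.5, 85.0),
         "2004": (6.0, 70.0), "2005": (3.5, 75.0)}

# Single-input/single-output DEA: efficiency = own ratio / best observed ratio
ratios = {year: out / inp for year, (inp, out) in units.items()}
best = max(ratios.values())
efficiency = {year: r / best for year, r in ratios.items()}  # 1.0 = efficient
```

    Units scoring below 1.0 are dominated; the projection onto the efficient frontier (best ratio times own input) quantifies how much output the same input could have produced, which is the logic behind the savings figures quoted in the abstract.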

  10. Parkinson's disease assessment based on gait analysis using an innovative RGB-D camera system.

    Science.gov (United States)

    Rocha, Ana Patrícia; Choupina, Hugo; Fernandes, José Maria; Rosas, Maria José; Vaz, Rui; Silva Cunha, João Paulo

    2014-01-01

    Movement-related diseases, such as Parkinson's disease (PD), progressively affect motor function, often leading to severe motor impairment and dramatic loss of the patients' quality of life. Human motion analysis techniques can be very useful for supporting the clinical assessment of this type of disease. In this contribution, we present an RGB-D camera (Microsoft Kinect) system and its evaluation for PD assessment. Based on skeleton data extracted from the gait of three PD patients treated with deep brain stimulation and three control subjects, several gait parameters were computed and analyzed, with the aim of discriminating between non-PD and PD subjects, as well as between two PD states (stimulator ON and OFF). We verified that among the several quantitative gait parameters, the variance of the center shoulder velocity presented the highest discriminative power to distinguish between non-PD, PD ON and PD OFF states (p = 0.004). Furthermore, we have shown that our low-cost portable system can be easily mounted in any hospital environment for evaluating patients' gait. These results demonstrate the potential of using an RGB-D camera as a PD assessment tool. PMID:25570653

  11. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia

    Science.gov (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.

    2016-06-01

    To support the implementation of the IWRM-Indonesia process in a water scarce and sanitation poor region of Central Java (Indonesia), sustainability assessments of several technology options of water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz association. In the case of water supply, the assessment was based on the life-cycle analysis and life-cycle-costing approach. In the sanitation sector, the focus was set on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied in general to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences which influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.

  12. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    International Nuclear Information System (INIS)

    This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). A Level I PSA delineates the accident sequences leading to core damage and estimates their frequencies. It has been used directly for assessing and modifying system safety and reliability, as a key and basic part of PSA. A Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report consists of six major procedural steps for a Level I PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSAs for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. This report would also be useful for managers or regulatory persons involved in risk-informed regulation, and for conducting PSAs for other industries

  13. Data collection and analysis in support of risk assessment for hydroelectric stations

    Energy Technology Data Exchange (ETDEWEB)

    Vo, T.V.; Mitts, T.M.; Phan, H.K.; Blackburn, T.R.; Casazza, L.O.

    1995-10-01

    This project provides the U.S. Army Corps of Engineers with a risk analysis that evaluates the non-routine closure of water flow through the turbines of powerhouses along the Columbia and Snake Rivers. The project is divided into four phases. Phase 1 efforts collected and analyzed relevant plant failure data for hydroelectric generating stations in the United States and Canada. Results from the Phase 1 efforts will be used to assess the risk (probability times consequences) associated with non-routine shutdown of hydroelectric stations, which will be performed in the remaining phases of the project. Results of this project may be used to provide policy recommendations regarding the operation and maintenance of hydroelectric stations. The methodology used to complete Phase 1 of the project is composed of data collection and analysis activities. Data collection included performing site visits, conducting a data survey of hydroelectric stations, conducting an expert panel workshop, and reviewing and tabulating failure data from generic sources. Data analysis included estimating failure rates obtained from the survey data, the expert judgment elicitation process, and generic data, and combining these failure rates to produce final failure rate parameters. This paper summarizes the data collection and analysis, results, and discussion for the Phase 1 efforts.

  14. The use of caffeine to assess high dose exposures to ionising radiation by dicentric analysis

    International Nuclear Information System (INIS)

    Dicentric analysis is considered the 'gold standard' method for biological dosimetry. However, because heavily damaged cells suffer radiation-induced mitotic delay or fail to reach mitosis, the analysis of dicentrics is restricted to doses up to 4-5 Gy. For higher doses, analysis by the premature chromosome condensation technique has been proposed. Here, a preliminary study is presented in which an alternative method to analyse dicentrics after high dose exposures to ionising radiation (IR) is evaluated. The method is based on the effect of caffeine in preventing the G2/M checkpoint, allowing damaged cells to reach mitosis. The results obtained indicate that co-treatment with Colcemid and caffeine significantly increases the mitotic index, and hence allows a more feasible analysis of dicentrics. Moreover, in the dose range analysed, from 0 to 15 Gy, the dicentric cell distribution followed the Poisson distribution, and a simulated partial-body exposure was clearly detected. Overall, the results presented here suggest that caffeine has great potential for dose assessment after high dose exposure to IR. (authors)

  15. CYCLIC RECURRENCE ASSESSMENT OF GRAIN YIELD TIME SERIES USING PHASE ANALYSIS INSTRUMENTS

    Directory of Open Access Journals (Sweden)

    Temirov A. A.

    2016-01-01

    Full Text Available An algorithm of phase analysis, an instrument from the toolkit of nonlinear dynamics, is used in the current article to study the cyclic recurrence of time series. The existing classical econometric methods for estimating cyclic recurrence were developed for random systems whose dynamics follow the normal distribution. However, there also exist non-random systems characterized by trends and by periodic and non-periodic cycles called quasicycles. The process of identifying quasicycles is illustrated on the time series of all-grain yields in Russia over the last 119 years. The phase portrait of this time series is constructed in two-dimensional space. As a result, the phase portrait decomposes into 22 mostly unstable quasicycles whose totality forms a strange attractor. Quasicycles have quantitative (length) and qualitative (configuration) characteristics, whose combination defines a very important property called trend-stability. Phase analysis is a powerful form of time series analysis for assessing cyclic recurrence and serves as a tool for pre-forecasting analysis. The mathematical apparatus of fuzzy sets is also used in this article, and an algorithm for forming fuzzy sets of quasicycle lengths is presented. Quasicycle statistics are presented in tables, geometric patterns and in the form of fuzzy sets
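A two-dimensional phase portrait of the kind the article describes can be sketched by plotting consecutive pairs (y_t, y_{t+1}) of the series; quasicycles appear as loops of consecutive points returning near their starting region. The synthetic series below merely stands in for the 119-year grain-yield data, which are not reproduced here.

```python
import numpy as np

def phase_portrait(series):
    """Return the (y_t, y_{t+1}) points of a 2-D phase portrait."""
    y = np.asarray(series, dtype=float)
    return np.column_stack([y[:-1], y[1:]])

# Synthetic stand-in for a 119-year yield series: trend + ~6-year cycle + noise
rng = np.random.default_rng(0)
t = np.arange(119)
series = 10 + 0.02 * t + np.sin(2 * np.pi * t / 6) + 0.3 * rng.standard_normal(119)

pts = phase_portrait(series)
print(pts.shape)  # (118, 2): one point per consecutive pair
```

Segmenting the resulting point sequence into loops (quasicycles) and measuring their lengths and configurations would follow as in the article.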

  16. Assessing Heterogeneity for Factor Analysis Model with Continuous and Ordinal Outcomes

    Directory of Open Access Journals (Sweden)

    Ye-Mao Xia

    2016-01-01

    Full Text Available Factor analysis models with continuous and ordinal responses are a useful tool for assessing relations between latent variables and mixed observed responses. These models have been successfully applied in many different fields, including the behavioral, educational, and social-psychological sciences. However, within the Bayesian analysis framework, most developments are constrained to parametric families, in which particular distributions are specified for the parameters of interest. This leads to difficulty in dealing with outliers and/or distributional deviations. In this paper, we propose Bayesian semiparametric modeling for the factor analysis model with continuous and ordinal variables. A truncated stick-breaking prior is used to model the distributions of the intercept and/or covariance structural parameters. Bayesian posterior analysis is carried out through a simulation-based method: a blocked Gibbs sampler is implemented to draw observations from the complicated posterior. For model selection, the logarithm of the pseudomarginal likelihood is developed to compare the competing models. Empirical results are presented to illustrate the application of the methodology.
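The truncated stick-breaking construction mentioned in the abstract is standard: with truncation level K, draw v_k ~ Beta(1, alpha) for k < K, set v_K = 1, and let each weight be the stick fraction times what remains, so the K weights sum to one. A minimal sketch (illustrative, not the paper's sampler):

```python
import numpy as np

def stick_breaking_weights(alpha, K, rng):
    """Draw K weights from a truncated stick-breaking prior."""
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0  # truncation: the last stick takes all remaining mass
    # remaining[k] = prod_{j<k} (1 - v_j), the stick left before break k
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

rng = np.random.default_rng(42)
w = stick_breaking_weights(alpha=2.0, K=20, rng=rng)
print(w.sum())  # 1.0 up to floating-point error
```

Smaller alpha concentrates mass on the first few sticks; larger alpha spreads it out, which is what lets the prior adapt to distributional deviations.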

  17. a Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  18. Assessing the likely effectiveness of multispecies management for imperiled desert fishes with niche overlap analysis.

    Science.gov (United States)

    Laub, Brian G; Budy, Phaedra

    2015-08-01

    A critical decision in species conservation is whether to target individual species or a complex of ecologically similar species. Management of multispecies complexes is likely to be most effective when species share similar distributions, threats, and response to threats. We used niche overlap analysis to assess ecological similarity of 3 sensitive desert fish species currently managed as an ecological complex. We measured the amount of shared distribution of multiple habitat and life history parameters between each pair of species. Habitat use and multiple life history parameters, including maximum body length, spawning temperature, and longevity, differed significantly among the 3 species. The differences in habitat use and life history parameters among the species suggest they are likely to respond differently to similar threats and that most management actions will not benefit all 3 species equally. Habitat restoration, frequency of stream dewatering, non-native species control, and management efforts in tributaries versus main stem rivers are all likely to impact each of the species differently. Our results demonstrate that niche overlap analysis provides a powerful tool for assessing the likely effectiveness of multispecies versus single-species conservation plans. PMID:25627117
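One common way to quantify the "amount of shared distribution" between two species on a niche axis is Schoener's D over binned utilisation frequencies: D = 1 - 0.5 * Σ|p1 - p2|, where 1 means identical use and 0 no overlap. The article's analysis is richer than this, and the counts below are invented; the sketch only illustrates the core overlap idea.

```python
import numpy as np

def schoeners_d(counts1, counts2):
    """Schoener's D niche overlap between two binned utilisation distributions."""
    p1 = np.asarray(counts1, dtype=float); p1 /= p1.sum()
    p2 = np.asarray(counts2, dtype=float); p2 /= p2.sum()
    return 1.0 - 0.5 * np.abs(p1 - p2).sum()

# Hypothetical habitat-use counts over four depth classes for two species
print(schoeners_d([10, 30, 40, 20], [12, 28, 38, 22]))  # high overlap
print(schoeners_d([50, 50, 0, 0], [0, 0, 50, 50]))      # no overlap: 0.0
```

Low overlap on key axes (habitat, spawning temperature, body size) is exactly the situation in which a single multispecies plan is unlikely to serve all species equally.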

  19. Multivariate analysis of ATR-FTIR spectra for assessment of oil shale organic geochemical properties

    Science.gov (United States)

    Washburn, Kathryn E.; Birdwell, Justin E.

    2013-01-01

    In this study, attenuated total reflectance (ATR) Fourier transform infrared spectroscopy (FTIR) was coupled with partial least squares regression (PLSR) analysis to relate spectral data to parameters from total organic carbon (TOC) analysis and programmed pyrolysis to assess the feasibility of developing predictive models to estimate important organic geochemical parameters. The advantage of ATR-FTIR over traditional analytical methods is that source rocks can be analyzed in the laboratory or field in seconds, facilitating more rapid and thorough screening than would be possible using other tools. ATR-FTIR spectra, TOC concentrations and Rock–Eval parameters were measured for a set of oil shales from deposits around the world and several pyrolyzed oil shale samples. PLSR models were developed to predict the measured geochemical parameters from infrared spectra. Application of the resulting models to a set of test spectra excluded from the training set generated accurate predictions of TOC and most Rock–Eval parameters. The critical region of the infrared spectrum for assessing S1, S2, Hydrogen Index and TOC consisted of aliphatic organic moieties (2800–3000 cm−1) and the models generated a better correlation with measured values of TOC and S2 than did integrated aliphatic peak areas. The results suggest that combining ATR-FTIR with PLSR is a reliable approach for estimating useful geochemical parameters of oil shales that is faster and requires less sample preparation than current screening methods.
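The PLSR step can be sketched with a textbook NIPALS-style PLS1 (single response) fit: extract latent components from the centered spectra, deflate, and assemble a regression vector. This is not the authors' implementation (a library such as scikit-learn's PLSRegression would normally be used), and the "spectra" below are synthetic stand-ins for ATR-FTIR data with TOC as the response.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """NIPALS-style PLS1: returns regression vector and centering terms."""
    X0, y0 = X.mean(axis=0), y.mean()
    Xc, yc = X - X0, y - y0
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                    # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                       # scores
        tt = t @ t
        p = Xc.T @ t / tt                # X loadings
        qk = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p)         # deflate
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # regression vector
    return B, X0, y0

def pls1_predict(model, X):
    B, X0, y0 = model
    return (X - X0) @ B + y0

# Synthetic "spectra": 30 samples x 8 wavenumber channels, noiseless linear TOC
rng = np.random.default_rng(1)
X = rng.standard_normal((30, 8))
y = X @ rng.standard_normal(8) + 3.0
model = pls1_fit(X, y, n_components=8)
print(np.abs(pls1_predict(model, X) - y).max())  # ~0: full-rank exact fit
```

In practice the component count is chosen by cross-validation, and test-set spectra (as in the article) give the honest measure of predictive accuracy.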

  20. Assessing the likely effectiveness of multispecies management for imperiled desert fishes with niche overlap analysis

    Science.gov (United States)

    Laub, P; Budy, Phaedra

    2015-01-01

    A critical decision in species conservation is whether to target individual species or a complex of ecologically similar species. Management of multispecies complexes is likely to be most effective when species share similar distributions, threats, and response to threats. We used niche overlap analysis to assess ecological similarity of 3 sensitive desert fish species currently managed as an ecological complex. We measured the amount of shared distribution of multiple habitat and life history parameters between each pair of species. Habitat use and multiple life history parameters, including maximum body length, spawning temperature, and longevity, differed significantly among the 3 species. The differences in habitat use and life history parameters among the species suggest they are likely to respond differently to similar threats and that most management actions will not benefit all 3 species equally. Habitat restoration, frequency of stream dewatering, non-native species control, and management efforts in tributaries versus main stem rivers are all likely to impact each of the species differently. Our results demonstrate that niche overlap analysis provides a powerful tool for assessing the likely effectiveness of multispecies versus single-species conservation plans.

  1. The benefits of integrating cost-benefit analysis and risk assessment

    International Nuclear Information System (INIS)

    It has increasingly been recognized that knowledge of risks in the absence of benefits and costs cannot dictate appropriate public policy choices. Recent evidence of this recognition includes the proposed EPA Risk Assessment and Cost-Benefit Analysis Act of 1995, a number of legislative changes in Canada and the US, and the increasing demand for field studies combining measures of impacts, risks, costs and benefits. Failure to consider relative environmental and human health risks, benefits, and costs in making public policy decisions has resulted in allocating scarce resources away from areas offering the highest levels of risk reduction and improvements in health and safety. The authors discuss the implications of not taking costs and benefits into account in addressing environmental risks, drawing on examples from both Canada and the US. The authors also present the results of their recent field work demonstrating the advantages of considering costs and benefits in making public policy and site remediation decisions, including a study on the benefits and costs of prevention, remediation and monitoring techniques applied to groundwater contamination; the benefits and costs of banning the use of chlorine; and the benefits and costs of Canada's concept of disposing of high-level nuclear waste. The authors conclude that a properly conducted Cost-Benefit Analysis can provide critical input to a Risk Assessment and can ensure that risk management decisions are efficient, cost-effective and maximize improvement to environmental and human health

  2. Assessing Credit Default using Logistic Regression and Multiple Discriminant Analysis: Empirical Evidence from Bosnia and Herzegovina

    Directory of Open Access Journals (Sweden)

    Deni Memić

    2015-01-01

    Full Text Available This article aims to assess credit default prediction on the banking market in Bosnia and Herzegovina nationwide, as well as in its constitutional entities (Federation of Bosnia and Herzegovina and Republika Srpska). The ability to classify companies into different predefined groups, or to find an appropriate tool that would replace human assessment in classifying companies into good and bad buckets, has long been one of the main interests of risk management researchers. We investigated the possibility and accuracy of default prediction using the traditional statistical methods of logistic regression (logit) and multiple discriminant analysis (MDA) and compared their predictive abilities. The results show that the created models have high predictive ability. For the logit models, some variables are more influential on default prediction than others. Return on assets (ROA) is statistically significant in all four periods prior to default, with very high regression coefficients, i.e., a high impact on the model's ability to predict default. Similar results are obtained for the MDA models. It is also found that predictive ability differs between logistic regression and multiple discriminant analysis.
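A logit default model of the kind assessed here expresses default probability as a logistic function of financial ratios. The sketch below fits such a model by plain gradient ascent on the log-likelihood, with ROA as the single illustrative covariate on synthetic firms (the article's data and covariate set are richer, and statsmodels/scikit-learn would normally do the fitting).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))  # clip avoids overflow

def fit_logit(X, y, lr=0.5, n_iter=2000):
    """Gradient ascent on the logistic log-likelihood (intercept included)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        beta += lr * Xb.T @ (y - sigmoid(Xb @ beta)) / len(y)
    return beta

def predict_default(beta, X, threshold=0.5):
    Xb = np.column_stack([np.ones(len(X)), X])
    return (sigmoid(Xb @ beta) >= threshold).astype(int)

# Hypothetical firms: low ROA tends to precede default (label 1)
rng = np.random.default_rng(7)
roa = rng.normal(5.0, 5.0, size=200)                       # ROA in percent
default = (roa + rng.normal(0.0, 1.0, 200) < 3.0).astype(int)
roa_std = ((roa - roa.mean()) / roa.std()).reshape(-1, 1)  # standardize
beta = fit_logit(roa_std, default)
accuracy = (predict_default(beta, roa_std) == default).mean()
print(f"in-sample accuracy: {accuracy:.2f}")
```

The fitted slope on ROA comes out negative, mirroring the article's finding that higher profitability lowers predicted default probability.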

  3. Application of reference materials for quality assessment in neutron activation analysis

    International Nuclear Information System (INIS)

    It is generally accepted that an analytical procedure can be regarded as an information production system yielding information on the composition of the analyzed sample. Thus, information theory can be useful, and the quantities characterizing the information properties of an analytical method may be applied not only as evaluation criteria but also as objective functions in optimization. The usability of information theory is demonstrated using the example of neutron activation analysis. Both the precision and bias of NAA results are taken into account, together with the possible use of reference materials for quality assessment. The influence of the above-mentioned parameters on information properties such as the information gain and profitability of NAA results is discussed in detail. It is shown that information theory is especially useful in choosing suitable reference materials for the quality assessment of routine analytical procedures, not only with respect to the matrix and analyte concentration in the sample but also to the concentrations and uncertainties of certified values in the CRM used. In extreme trace analysis, CRMs with relatively large uncertainties and very low certified concentrations can still yield a rather high information gain. (author) 14 refs.; 9 figs

  4. Phenomic Assessment of Genetic Buffering by Kinetic Analysis of Cell Arrays

    Science.gov (United States)

    Rodgers, John; Guo, Jingyu; Hartman, John L.

    2016-01-01

    Summary Quantitative high-throughput cell array phenotyping (Q-HTCP) is applied to the genomic collection of yeast gene deletion mutants for systematic, comprehensive assessment of the contribution of genes and gene combinations to any phenotype of interest (phenomic analysis). Interacting gene networks influence every phenotype. Genetic buffering refers to how gene interaction networks stabilize or destabilize a phenotype. Like genomics, phenomics varies in its resolution, with a trade-off between allocating a greater number of measurements per sample to enhance quantification of the phenotype and increasing the number of different samples by obtaining fewer measurements per sample. The Q-HTCP protocol we describe assesses 50,000–70,000 cultures per experiment by obtaining kinetic growth curves from time series imaging of agar cell arrays. This approach was developed for the yeast gene deletion strains, but it could also be applied to other microbial mutant arrays grown on solid agar media. The methods we describe are for creation and maintenance of frozen stocks, liquid source array preparation, agar destination plate printing, image scanning, image analysis, curve fitting and evaluation of gene interaction. PMID:25213246

  5. Assessment of hydrocephalus in children based on digital image processing and analysis

    Directory of Open Access Journals (Sweden)

    Fabijańska Anna

    2014-06-01

    Full Text Available Hydrocephalus is a pathological condition of the central nervous system which often affects neonates and young children. It manifests itself as an abnormal accumulation of cerebrospinal fluid within the ventricular system of the brain, with subsequent progression. One of the most important diagnostic methods for identifying hydrocephalus is Computed Tomography (CT), on whose scans the enlarged ventricular system is clearly visible. However, assessment of disease progress usually relies on the radiologist's judgment and manual measurements, which are subjective, cumbersome and of limited accuracy. This paper therefore addresses the problem of semi-automatic assessment of hydrocephalus using image processing and analysis algorithms. In particular, automated determination of popular indices of disease progress is considered. Algorithms for the detection, semi-automatic segmentation and numerical description of the lesion are proposed. Specifically, disease progress is determined using shape analysis algorithms. Numerical results provided by the introduced methods are presented and compared with those calculated manually by a radiologist and a trained operator. The comparison confirms the correctness of the introduced approach.
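The abstract does not name the indices it automates, but one widely used hydrocephalus index that fits this description is the Evans index: the maximal width of the frontal horns of the lateral ventricles divided by the maximal inner skull diameter on an axial slice. The sketch below computes it from hypothetical binary masks standing in for the paper's automatic segmentations; naming the Evans index here is an assumption.

```python
import numpy as np

def max_width(mask):
    """Widest horizontal extent (in pixels) over all rows of a binary mask."""
    widths = []
    for row in mask:
        cols = np.flatnonzero(row)
        if cols.size:
            widths.append(cols[-1] - cols[0] + 1)
    return max(widths) if widths else 0

def evans_index(ventricle_mask, skull_mask):
    return max_width(ventricle_mask) / max_width(skull_mask)

# Toy axial slice: ventricles span 6 px, inner skull spans 20 px
skull = np.zeros((10, 24), dtype=bool); skull[2:9, 2:22] = True  # width 20
vent = np.zeros((10, 24), dtype=bool);  vent[4:6, 9:15] = True   # width 6
print(evans_index(vent, skull))  # 0.3; values above ~0.3 suggest enlargement
```

In a real pipeline the two masks would come from the segmentation stage, and widths would be converted to millimetres using the CT pixel spacing.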

  6. Application of exploratory factor analysis to assess fish consumption in a university community

    Directory of Open Access Journals (Sweden)

    Erika da Silva Maciel

    2013-03-01

    Full Text Available The objective of this research was to use the technique of Exploratory Factor Analysis (EFA) to adapt a tool for the assessment of fish consumption and the characteristics involved in this process. Data were collected during a campaign to encourage fish consumption in Brazil, with the voluntary participation of members of a university community. An assessment instrument consisting of multiple-choice questions and a five-point Likert scale was designed and used to measure the importance of certain attributes that influence the choice and consumption of fish. The sample was composed of 224 individuals, the majority of them women (65.6%). With regard to the frequency of fish consumption, 37.67% of the volunteers interviewed said they consume the product two or three times a month, and 29.6% once a week. EFA was used to group the variables; extraction was performed using principal components and rotation using the Quartimax method. The results show clusters in two main constructs, quality and consumption, with Cronbach's alpha coefficients of 0.75 and 0.69, respectively, indicating good internal consistency.
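The internal-consistency statistic reported for the two constructs is Cronbach's alpha: k/(k-1) * (1 - Σ item variances / variance of the item sum). A minimal sketch, with made-up Likert scores rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 6 respondents x 4 items of one construct, 5-point scale
scores = np.array([
    [4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
    [2, 2, 3, 2], [4, 4, 4, 5], [3, 2, 3, 3],
])
print(round(cronbach_alpha(scores), 2))
```

Values around 0.7 or above, like the study's 0.75 and 0.69, are conventionally read as acceptable internal consistency.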

  7. Air pollution assessment in two Moroccan cities using instrumental neutron activation analysis on bio-accumulators

    International Nuclear Information System (INIS)

    Full text: Biomonitoring is an appropriate tool for air pollution assessment studies. In this work, lichens and barks were used as bio-accumulators at several sites in two Moroccan cities (Rabat and Mohammadia). Their specific ability to absorb and accumulate heavy metals and toxic elements from the air, together with their longevity and resistance to environmental stresses, makes these bioindicators suitable for this kind of study. Instrumental Neutron Activation Analysis (INAA) is universally accepted as one of the most reliable analytical tools for trace and ultra-trace element determination. Its use in studies of trace-element atmospheric pollution has been, and still is, extensive, as demonstrated by several specific works and detailed reviews. In this work, a preliminary investigation employing lichens, barks and INAA was carried out to evaluate the trace element distribution in six different areas of the cities of Rabat and Mohammadia, characterised by the presence of many industries and heavy traffic. Samples were irradiated with thermal neutrons in a nuclear reactor and the induced activity was counted using high-resolution germanium-lithium detectors. More than 30 elements were determined using two modes: short irradiation (1 minute) and long irradiation (17 hours). Accuracy and quality control were assessed using the reference standard material IAEA-336; the uncertainty was less than 1% for major elements and about 5 to 10% for trace elements.

  8. Extended Cost-Effectiveness Analysis for Health Policy Assessment: A Tutorial.

    Science.gov (United States)

    Verguet, Stéphane; Kim, Jane J; Jamison, Dean T

    2016-09-01

    Health policy instruments such as the public financing of health technologies (e.g., new drugs, vaccines) entail consequences in multiple domains. Fundamentally, public health policies aim to increase the uptake of effective and efficient interventions and thereby produce greater health benefits (e.g., premature mortality and morbidity averted). In addition, public health policies can provide non-health benefits, beyond the sole well-being of populations and beyond the health sector. For instance, public policies such as social and health insurance programs can prevent illness-related impoverishment and provide financial risk protection. Furthermore, public policies can improve the distribution of health in the population and promote the equalization of health among individuals. Extended cost-effectiveness analysis was developed for health policy assessment, specifically to evaluate the health and financial consequences of public policies in four domains: (1) the health gains; (2) the financial risk protection benefits; (3) the total costs to the policy makers; and (4) the distributional benefits. Here, we present a tutorial that describes both the intent of extended cost-effectiveness analysis and the key steps that allow its easy implementation for health policy assessment. PMID:27374172

  9. A new automatic image analysis method for assessing estrogen receptors' status in breast tissue specimens.

    Science.gov (United States)

    Mouelhi, Aymen; Sayadi, Mounir; Fnaiech, Farhat; Mrad, Karima; Ben Romdhane, Khaled

    2013-12-01

    Manual assessment of estrogen receptor (ER) status from breast tissue microscopy images is a subjective, time-consuming and error-prone process. Automatic image analysis methods offer the possibility of obtaining consistent, objective and rapid diagnoses of histopathology specimens. In breast cancer biopsies immunohistochemically (IHC) stained for ER, cancer cell nuclei show a large variety of characteristics that pose various difficulties for traditional image analysis methods. In this paper, we propose a new automatic method to perform both segmentation and classification of breast cell nuclei in order to provide quantitative assessment and uniform indicators of IHC staining that will help pathologists in their diagnosis. First, a color geometric active contour model incorporating a spatial fuzzy clustering algorithm is proposed to detect the contours of all cell nuclei in the image. Second, overlapping and touching nuclei are separated using an improved watershed algorithm based on a concave vertex graph. Finally, to identify positively and negatively stained nuclei, all the segmented nuclei are classified into five categories according to their staining intensity and morphological features using a trained multilayer neural network combined with Fisher's linear discriminant preprocessing. The proposed method is tested on a large dataset containing several breast tissue images with different levels of malignancy. The experimental results show high agreement between the results of the method and the ground truth from the pathologist panel. Furthermore, a comparative study versus existing techniques is presented to demonstrate the efficiency and superiority of the proposed method. PMID:24290943

  10. Bioimpedance Harmonic Analysis as a Diagnostic Tool to Assess Regional Circulation and Neural Activity

    Science.gov (United States)

    Mudraya, I. S.; Revenko, S. V.; Khodyreva, L. A.; Markosyan, T. G.; Dudareva, A. A.; Ibragimov, A. R.; Romich, V. V.; Kirpatovsky, V. I.

    2013-04-01

    The novel technique, based on harmonic analysis of bioimpedance microvariations with an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess the neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M for Mayer wave) and the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity, identified according to the M and R peaks, was found to be presumably sympathetic in pelvic pain patients and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder as well as in the penile spectrum characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted to the norm in the cases of efficient therapy. Bioimpedance harmonic analysis seems to be a potent tool to assess regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.
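The spectral decomposition underlying the method can be sketched with a plain FFT: the Mayer (~0.1 Hz), respiratory and cardiac components of a bioimpedance record appear as distinct peaks. The signal below is synthetic, and the amplitudes and frequencies are illustrative only.

```python
import numpy as np

fs = 20.0                      # sampling rate, Hz
t = np.arange(0, 120, 1 / fs)  # a 2-minute record
signal = (0.5 * np.sin(2 * np.pi * 0.10 * t)    # Mayer (M) wave
          + 0.3 * np.sin(2 * np.pi * 0.25 * t)  # respiratory (R) rhythm
          + 0.2 * np.sin(2 * np.pi * 1.20 * t)) # cardiac (C) rhythm

# One-sided amplitude spectrum; 2/N scaling recovers sine amplitudes
spectrum = np.abs(np.fft.rfft(signal)) / len(signal) * 2
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
peak_freq = freqs[spectrum.argmax()]
print(f"dominant peak at {peak_freq:.2f} Hz")  # the 0.10 Hz M wave
```

Tracking the heights of the M, R and C peaks (and their harmonics) before and after therapy is then a matter of reading the spectrum at those frequencies.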

  11. Bioimpedance Harmonic Analysis as a Diagnostic Tool to Assess Regional Circulation and Neural Activity

    International Nuclear Information System (INIS)

    The novel technique, based on harmonic analysis of bioimpedance microvariations with an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess the neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M for Mayer wave) and the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity, identified according to the M and R peaks, was found to be presumably sympathetic in pelvic pain patients and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder as well as in the penile spectrum characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted to the norm in the cases of efficient therapy. Bioimpedance harmonic analysis seems to be a potent tool to assess regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.

  12. Phenomic assessment of genetic buffering by kinetic analysis of cell arrays.

    Science.gov (United States)

    Rodgers, John; Guo, Jingyu; Hartman, John L

    2014-01-01

    Quantitative high-throughput cell array phenotyping (Q-HTCP) is applied to the genomic collection of yeast gene deletion mutants for systematic, comprehensive assessment of the contribution of genes and gene combinations to any phenotype of interest (phenomic analysis). Interacting gene networks influence every phenotype. Genetic buffering refers to how gene interaction networks stabilize or destabilize a phenotype. Like genomics, phenomics varies in its resolution with there being a trade-off allocating a greater number of measurements per sample to enhance quantification of the phenotype vs. increasing the number of different samples by obtaining fewer measurements per sample. The Q-HTCP protocol we describe assesses 50,000-70,000 cultures per experiment by obtaining kinetic growth curves from time series imaging of agar cell arrays. This approach was developed for the yeast gene deletion strains, but it could be applied as well to other microbial mutant arrays grown on solid agar media. The methods we describe are for creation and maintenance of frozen stocks, liquid source array preparation, agar destination plate printing, image scanning, image analysis, curve fitting, and evaluation of gene interaction. PMID:25213246
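The kinetic summaries Q-HTCP extracts from each culture's growth curve can be sketched as carrying capacity, maximum slope (a growth-rate proxy) and the time at which that slope occurs. A synthetic logistic curve stands in for the image-derived time series below; the authors fit a parametric growth model rather than using finite differences.

```python
import numpy as np

def growth_summary(t, od):
    """Carrying capacity, maximum slope, and time of maximum slope."""
    slopes = np.gradient(od, t)
    i = slopes.argmax()
    return {"K": od.max(), "max_rate": slopes[i], "t_mid": t[i]}

# Synthetic logistic growth curve: one OD reading every 30 min for 48 h
t = np.linspace(0, 48, 97)
K_true, r, t0 = 1.2, 0.35, 18.0
od = K_true / (1.0 + np.exp(-r * (t - t0)))
s = growth_summary(t, od)
print(s)  # K near 1.2, max slope near r*K/4, midpoint near 18 h
```

Comparing such parameters between a deletion strain and its reference across conditions is what turns the growth curves into gene-interaction measurements.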

  13. Probabilistic fragility analysis: A tool for assessing design rules of RC buildings

    Institute of Scientific and Technical Information of China (English)

    Nikos D Lagarost

    2008-01-01

    In this work, fragility analysis is performed to assess two groups of reinforced concrete structures. The first group is composed of buildings that implement three common design practices, namely fully infilled frames, a weak ground story, and short columns, applied during the design process of a reinforced concrete building. The structures of the second group vary according to the value of the behavior factor used to define the seismic forces specified in design procedures. Most seismic design codes belong to the class of prescriptive procedures, in which a structure is considered safe if certain constraints are fulfilled. Prescriptive design procedures express the ability of the structure to absorb energy through inelastic deformation using the behavior factor. The basic objective of this work is to assess both groups of structures with reference to the limit-state probability of exceedance. Thus, four limit-state fragility curves are developed on the basis of nonlinear static analysis for both groups of structures. Moreover, the 95% confidence intervals of the fragility curves are also calculated, taking into account two types of random variables that influence structural capacity and seismic demand.
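The standard functional form behind limit-state fragility curves is lognormal: P(exceed limit state | IM = x) = Phi(ln(x/theta)/beta), with median capacity theta and dispersion beta. The sketch below evaluates that form with invented parameter values; the article derives its curves from nonlinear static analysis rather than assuming them.

```python
from math import erf, log, sqrt

def fragility(x, theta, beta):
    """Lognormal fragility: P(exceeding the limit state) at intensity x."""
    # Standard normal CDF via erf
    return 0.5 * (1.0 + erf(log(x / theta) / (beta * sqrt(2.0))))

theta, beta = 0.45, 0.5  # hypothetical median capacity (PGA, g) and dispersion
for pga in (0.1, 0.45, 0.9):
    print(f"PGA {pga:.2f} g -> P(exceed) = {fragility(pga, theta, beta):.3f}")
```

By construction the curve passes through 0.5 at the median capacity, and a larger beta flattens it, which is how capacity and demand uncertainty widen the confidence band.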

  14. Gait analysis, bone and muscle density assessment for patients undergoing total hip arthroplasty

    Directory of Open Access Journals (Sweden)

    Benedikt Magnússon

    2012-12-01

    Full Text Available Total hip arthroplasty (THA is performed with or without the use of bone cement. Facing the lack of reliable clinical guidelines on decision making whether a patient should receive THA with or without bone cement, a joint clinical and engineering approach is proposed here with the objective to assess patient recovery developing monitoring techniques based on gait analysis, measurements of bone mineral density and structural and functional changes of quadriceps muscles. A clinical trial was conducted with 36 volunteer patients that were undergoing THA surgery for the first time: 18 receiving cemented implant and 18 receiving non-cemented implant. The patients are scanned with Computer Tomographic (CT modality prior-, immediately- and 12 months post-surgery. The CT data are further processed to segment muscles and bones for calculating bone mineral density (BMD. Quadriceps muscle density Hounsfield (HU based value is calculated from the segmented file on healthy and operated leg before and after THA surgery. Furthermore clinical assessment is performed using gait analysis technologies such as a sensing carpet, wireless electrodes and video. Patients undergo these measurements prior-, 6 weeks post - and 52 weeks post-surgery. The preliminary results indicate computational tools and methods that are able to quantitatively analyze patient’s condition pre and post-surgery: The spatial parameters such as step length and stride length increase 6 weeks post op in the patient group receiving cemented implant while the angle in the toe in/out parameter decrease in both patient groups.

  15. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    Science.gov (United States)

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    Multi-criteria analysis gives researchers, designers and decision-makers the opportunity to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed for their sustainability performance in producing recycled wastewater, following a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation, and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called the simple multi-attribute rating technique exploiting ranks (SMARTER) was employed for assigning weights to the selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate, attaining the highest composite value for sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results. PMID:27054724
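    The SMARTER technique mentioned above derives indicator weights from rank order alone (rank-order centroid weights), which are then combined with normalized indicator scores into a composite index. A sketch under that assumption, with hypothetical scores and ranking:

```python
def roc_weights(n):
    """Rank-order centroid weights used by SMARTER:
    w_i = (1/n) * sum_{j=i}^{n} 1/j for rank i = 1..n (rank 1 = most important)."""
    return [sum(1.0 / j for j in range(i, n + 1)) / n for i in range(1, n + 1)]

def composite_index(scores, weights):
    """Weighted sum of normalized indicator scores (each in 0..1)."""
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical: three indicator groups ranked economic > environmental > social
w = roc_weights(3)
print([round(x, 4) for x in w])                      # [0.6111, 0.2778, 0.1111]
print(round(composite_index([0.8, 0.6, 0.9], w), 3)) # 0.756
```

    The weights always sum to one, so technologies can be compared directly on the composite value.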

  16. Rapid ecotoxicological assessment of heavy metal combined polluted soil using canonical analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Su-hua; ZHOU Qi-xing; SUN Tie-heng; LI Pei-jun

    2003-01-01

    Quick, simple to perform, and cheap biomarkers were combined in a rapid assessment approach to measure the effects of the metal pollutants Cu, Cd, Pb and Zn in meadow burozem on wheat. Analysis of the orthogonal design showed that zinc was a significant factor: both the inhibition rate of shoot mass and that of root elongation were affected by zinc (P < 0.05 and P < 0.01, respectively). The first toxicity canonical variable (TOXI), formed from the toxicity data set, explained 49% of the total variance in the toxicity data set; the first biological canonical variable (BIOL) explained 42% of the total variation in the biological data set. The correlation between the first canonical variables TOXI and BIOL (the canonical correlation) was 0.94 (P < 0.0001). It is therefore reliable and feasible to assess the toxicity of soil co-polluted by heavy metals using canonical analysis. The toxicity to the plant community of soil co-polluted by heavy metals was estimated by comparing IC50 values, the concentration needed to cause a 50% decrease in growth rate compared with no metal addition. Environmental quality standards for soils prescribe that none of the tested concentrations of heavy metals in soil should cause hazard or pollution, whereas the results indicated that soils in the second grade cause more or less than 50% inhibition of wheat growth. The environmental quality standard for soils could therefore be modified to include other features.
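    An IC50 of the kind compared above can be estimated by linear interpolation between the two dose points that bracket 50% inhibition. The dose-response values below are hypothetical, for illustration only:

```python
def ic50(concs, inhibition):
    """Linearly interpolate the concentration giving 50% inhibition.
    concs must be ascending; inhibition values are in percent."""
    points = list(zip(concs, inhibition))
    for (c0, i0), (c1, i1) in zip(points, points[1:]):
        if i0 <= 50.0 <= i1:
            return c0 + (50.0 - i0) * (c1 - c0) / (i1 - i0)
    raise ValueError("50% inhibition not bracketed by the data")

# Assumed Zn additions (mg/kg) vs. wheat root-elongation inhibition (%)
print(round(ic50([50, 100, 200, 400], [12.0, 31.0, 58.0, 84.0]), 1))  # 170.4
```

    Real dose-response data are usually fitted with a sigmoidal model instead, but interpolation shows the idea.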

  17. ASSESSMENT OF OIL PALM PLANTATION AND TROPICAL PEAT SWAMP FOREST WATER QUALITY BY MULTIVARIATE STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Seca Gandaseca

    2014-01-01

    This study reports the spatio-temporal changes in river and canal water quality at peat swamp forest and oil palm plantation sites in Sarawak, Malaysia. To investigate temporal changes, 192 water samples were collected at four stations of BatangIgan, an oil palm plantation site of Sarawak, during July-November 2009 and April-July 2010. Nine water quality parameters were analysed: Electrical Conductivity (EC), pH, Turbidity (TER), Dissolved Oxygen (DO), Temperature (TEMP), Chemical Oxygen Demand (COD), five-day Biochemical Oxygen Demand (BOD5), ammonia-Nitrogen (NH3-N) and Total Suspended Solids (TSS). To investigate spatial changes, 432 water samples were collected from six different sites, including BatangIgan, during June-August 2010, and six water quality parameters (pH, DO, COD, BOD5, NH3-N and TSS) were analysed for spatial variations. The parameters contributing most to the spatio-temporal variations were identified by statistical techniques: Hierarchical Agglomerative Cluster Analysis (HACA), Factor Analysis/Principal Components Analysis (FA/PCA) and Discriminant Function Analysis (DFA). HACA identified three classes of sites (Relatively Unimpaired, Impaired and Less Impaired Regions) on the basis of similarity in physicochemical characteristics and pollutant levels between the sampling sites. DFA produced the best results for identifying the main variables: it separated three parameters for temporal analysis (EC, TER, COD) and identified three parameters for spatial analysis (pH, NH3-N and BOD5). The results signify that the parameters identified by the statistical analyses were responsible for the water quality change and suggest agricultural and oil palm plantation activities as a possible source of pollutants. The results point to a dire need for proper watershed management measures to restore the water quality of this tributary.

  18. Assessment of Student Skills for Critiquing Published Primary Scientific Literature Using a Primary Trait Analysis Scale

    Directory of Open Access Journals (Sweden)

    Manuel F. Varela

    2009-12-01

    Instructor evaluation of progressive student skills in the analysis of primary literature is critical for the development of these skills in young scientists. Students in a senior- or graduate-level one-semester course in Immunology at a Masters-level comprehensive university were assessed for their abilities (primary traits) to recognize and evaluate the following elements of a scientific paper: Hypothesis and Rationale, Significance, Methods, Results, Critical Thinking and Analysis, and Conclusions. We tested the hypotheses that average recognition scores vary among elements and that scores change with time differently by trait. Recognition scores (scaled 1 to 5) and differences in scores were analyzed using analysis of variance (ANOVA), regression, and analysis of covariance (ANCOVA) (n = 10 papers over 103 days). By multiple comparisons testing, we found that recognition scores statistically fell into two groups: high scores (for Hypothesis and Rationale, Significance, Methods, and Conclusions) and low scores (for Results and Critical Thinking and Analysis). Recognition scores only changed significantly with time (increased) for Hypothesis and Rationale and Results. ANCOVA showed that the changes in recognition scores for these elements did not differ significantly in slope (F1,16 = 0.254, P = 0.621), but the Results trait was significantly lower in elevation (F1,17 = 12.456, P = 0.003). Thus, students improved with similar trajectories, but started and ended with lower Results scores. We conclude that students have the greatest difficulty evaluating Results and critically evaluating scientific validity. Our findings show extant student skills, and the significant increase in some traits shows learning. This study demonstrates that students start with variable recognition skills and that student skills may be learned at differential rates.
Faculty can use these findings or the primary trait analysis scoring scale to focus on specific paper elements for which

  19. Assessment of genetic stability in micropropagules of Jatropha curcas genotypes by RAPD and AFLP analysis

    KAUST Repository

    Sharma, Sweta K.

    2011-07-01

    Jatropha curcas (Euphorbiaceae), a drought-resistant non-edible oil-yielding plant, has acquired significant importance as an alternative renewable energy source. Low and inconsistent yields found in field plantations prompted the identification of high-yielding clones and their large-scale multiplication by vegetative propagation to obtain true-to-type plants. In the current investigation, plantlets of J. curcas generated by axillary bud proliferation (micropropagation) using nodal segments obtained from selected high-yielding genotypes were assessed for their genetic stability using Randomly Amplified Polymorphic DNA (RAPD) and Amplified Fragment Length Polymorphism (AFLP) analyses. For RAPD analysis, 21 out of 52 arbitrary decamer primers screened gave clear reproducible bands. In the micropropagated plantlets obtained from the 2nd sub-culture, 4 out of a total of 177 bands scored were polymorphic, but in the 8th and 16th sub-cultures (culture cycles) no polymorphisms were detected. AFLP analysis revealed 0.63%, 0% and 0% polymorphism in the 2nd, 8th and 16th generations, respectively. When different genotypes, viz. IC 56557 16, IC 56557 34 and IC 56557 13, were assessed by AFLP, 0%, 0.31% and 0.47% polymorphisms were found, respectively, indicating a difference in genetic stability among the genotypes. To the best of our knowledge this is the first report on the assessment of genetic stability of micropropagated plantlets in J. curcas; it suggests that axillary shoot proliferation can safely be used as an efficient micropropagation method for mass propagation of J. curcas. © 2011 Elsevier B.V.
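    The polymorphism percentages reported above are simply the share of scored marker bands that are polymorphic; for example, the 2nd sub-culture RAPD figures (4 of 177 bands) from the abstract:

```python
def polymorphism_pct(polymorphic_bands, total_bands):
    """Percentage of scored marker bands that are polymorphic."""
    return 100.0 * polymorphic_bands / total_bands

# RAPD, 2nd sub-culture: 4 polymorphic out of 177 scored bands (from the abstract)
print(round(polymorphism_pct(4, 177), 2))  # 2.26
```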

  20. Applications of life cycle assessment and cost analysis in health care waste management

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Sebastiao Roberto, E-mail: soares@ens.ufsc.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Finotti, Alexandra Rodrigues, E-mail: finotti@ens.ufsc.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Prudencio da Silva, Vamilson, E-mail: vamilson@epagri.sc.gov.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); EPAGRI, Rod. Admar Gonzaga 1347, Itacorubi, Florianopolis, Santa Catarina 88034-901 (Brazil); Alvarenga, Rodrigo A.F., E-mail: alvarenga.raf@gmail.com [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Ghent University, Department of Sustainable Organic Chemistry and Technology, Coupure Links 653/9000 Gent (Belgium)

    2013-01-15

    Highlights: ► Three Health Care Waste (HCW) scenarios were assessed through environmental and cost analysis. ► HCW treatment using microwave oven had the lowest environmental impacts and costs in comparison with autoclave and lime. ► Lime had the worst environmental and economic results for HCW treatment, in comparison with autoclave and microwave. - Abstract: The establishment of rules to manage Health Care Waste (HCW) is a challenge for the public sector. Regulatory agencies must ensure the safety of waste management alternatives for two very different profiles of generators: (1) hospitals, which concentrate the production of HCW and (2) small establishments, such as clinics, pharmacies and other sources, that generate dispersed quantities of HCW and are scattered throughout the city. To assist in developing sector regulations for the small generators, we evaluated three management scenarios using decision-making tools. They consisted of a disinfection technique (microwave, autoclave and lime) followed by landfilling, where transportation was also included. The microwave, autoclave and lime techniques were tested at the laboratory to establish the operating parameters to ensure their efficiency in disinfection. Using a life cycle assessment (LCA) and cost analysis, the decision-making tools aimed to determine the technique with the best environmental performance. This consisted of evaluating the eco-efficiency of each scenario. Based on the life cycle assessment, microwaving had the lowest environmental impact (12.64 Pt) followed by autoclaving (48.46 Pt). The cost analyses indicated values of US$ 0.12 kg⁻¹ for the waste treated with microwaves, US$ 1.10 kg⁻¹ for the waste treated by the autoclave and US$ 1.53 kg⁻¹ for the waste treated with lime. The microwave disinfection presented the best eco-efficiency performance among those studied and provided a feasible alternative to subsidize the formulation of the policy for small generators of HCW.

  1. Assessing a new gene expression analysis technique for radiation biodosimetry applications

    International Nuclear Information System (INIS)

    The response to any radiation accident or incident involving actual or potential ionising radiation exposure requires accurate and rapid assessment of the doses received by individuals. The techniques available today for biodosimetry purposes are not fully adapted to rapid high-throughput measurements of exposures in large numbers of individuals. A recently emerging technique is based on gene expression analysis, as a number of genes are radiation responsive in a dose-dependent manner. The present work aimed to assess a new technique which allows the detection of the level of expression of up to 800 genes without the need for enzymatic reactions. To do so, human peripheral blood was exposed ex vivo to x-ray doses ranging from 5 mGy to 4 Gy, and the transcriptional expression of five radiation-responsive genes (PHPT1, PUMA, CCNG1, DDB2 and MDM2) was studied by both the nCounter Digital Analyzer and Multiplex Quantitative Real-Time Polymerase Chain Reaction (MQRT-PCR) as the benchmark technology. Results from both techniques showed good correlation for all genes, with R² values ranging between 0.8160 and 0.9754. The reproducibility of the nCounter Digital Analyzer was also assessed in independent biological replicates and proved to be good. Although the slopes of the correlation of results obtained by the two techniques suggest that MQRT-PCR is more sensitive than the nCounter Digital Analyzer, the nCounter Digital Analyzer provides sensitive and reliable data on modifications in gene expression in human blood exposed to radiation without enzymatic amplification of RNA prior to analysis.
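    A cross-platform agreement check such as the R² values reported above can be computed from paired expression measurements with an ordinary least-squares fit. The fold-change data below are hypothetical, for illustration only:

```python
def r_squared(x, y):
    """Coefficient of determination (R^2) of a least-squares line of y on x,
    used here to compare expression values from two measurement techniques."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical fold changes for one gene: MQRT-PCR (x) vs. nCounter (y)
pcr = [1.0, 1.8, 2.9, 5.2, 8.1]
ncounter = [1.1, 1.7, 3.1, 4.8, 8.4]
print(round(r_squared(pcr, ncounter), 4))
```

    R² close to 1 indicates the two platforms rank and scale the doses consistently; the fitted slope, not R², is what speaks to relative sensitivity.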

  2. Applications of life cycle assessment and cost analysis in health care waste management

    International Nuclear Information System (INIS)

    Highlights: ► Three Health Care Waste (HCW) scenarios were assessed through environmental and cost analysis. ► HCW treatment using microwave oven had the lowest environmental impacts and costs in comparison with autoclave and lime. ► Lime had the worst environmental and economic results for HCW treatment, in comparison with autoclave and microwave. - Abstract: The establishment of rules to manage Health Care Waste (HCW) is a challenge for the public sector. Regulatory agencies must ensure the safety of waste management alternatives for two very different profiles of generators: (1) hospitals, which concentrate the production of HCW and (2) small establishments, such as clinics, pharmacies and other sources, that generate dispersed quantities of HCW and are scattered throughout the city. To assist in developing sector regulations for the small generators, we evaluated three management scenarios using decision-making tools. They consisted of a disinfection technique (microwave, autoclave and lime) followed by landfilling, where transportation was also included. The microwave, autoclave and lime techniques were tested at the laboratory to establish the operating parameters to ensure their efficiency in disinfection. Using a life cycle assessment (LCA) and cost analysis, the decision-making tools aimed to determine the technique with the best environmental performance. This consisted of evaluating the eco-efficiency of each scenario. Based on the life cycle assessment, microwaving had the lowest environmental impact (12.64 Pt) followed by autoclaving (48.46 Pt). The cost analyses indicated values of US$ 0.12 kg−1 for the waste treated with microwaves, US$ 1.10 kg−1 for the waste treated by the autoclave and US$ 1.53 kg−1 for the waste treated with lime. The microwave disinfection presented the best eco-efficiency performance among those studied and provided a feasible alternative to subsidize the formulation of the policy for small generators of HCW.

  3. A Support Analysis Framework for mass movement damage assessment: applications to case studies in Calabria (Italy)

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2009-03-01

    The analysis of data describing damage caused by mass movements in Calabria (Italy) allowed the organisation of the Support Analysis Framework (SAF), a spreadsheet that converts damage descriptions into numerical indices expressing direct, indirect, and intangible damage.

    The SAF assesses damage indices of past mass movements and the potential outcomes of dormant phenomena re-activations. It is based on the effects on damaged elements and is independent of both physical and geometric phenomenon characteristics.

    SAF sections that assess direct damage encompass several lines, each describing an element characterised by a value fixed on a relative arbitrary scale. The levels of loss are classified as: L4: complete; L3: high; L2: medium; or L1: low. For a generic line l, the SAF multiplies the value of a damaged element by its level of loss, obtaining dl, the contribution of the line to the damage.

    Indirect damage is appraised by two sections accounting for: (a) actions aiming to overcome emergency situations and (b) actions aiming to restore pre-movement conditions. The level of loss depends on the number of people involved (a) or the cost of actions (b).

    For intangible damage, the level of loss depends on the number of people involved.

    We examined three phenomena, assessing damage using the SAF and SAFL, customised versions of SAF based on the elements actually present in the analysed municipalities that consider the values of elements in the community framework. We show that in less populated, inland, and affluent municipalities, the impact of mass movements is greater than in coastal areas.

    The SAF can be useful to sort groups of phenomena according to their probable future damage, supplying results significant either for insurance companies or for local authorities involved in both disaster management and planning of defensive measures.
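    The direct-damage section described above sums, line by line, the relative value of each damaged element multiplied by its level of loss. A sketch assuming a hypothetical numeric mapping for levels L1-L4 (the record describes the levels only qualitatively) and invented element values:

```python
# Assumed numeric mapping for the levels of loss L1 (low) .. L4 (complete)
LOSS = {"L1": 0.25, "L2": 0.5, "L3": 0.75, "L4": 1.0}

def direct_damage(elements):
    """Sum of (element value x level of loss) over the damaged elements,
    one SAF line per element; values are on a relative arbitrary scale."""
    return sum(value * LOSS[level] for _, value, level in elements)

# Hypothetical damaged elements: (name, relative value, level of loss)
damaged = [("road section", 4, "L3"), ("private house", 6, "L2"), ("aqueduct", 3, "L4")]
print(direct_damage(damaged))  # 9.0
```

    With such an index, phenomena can be ranked by probable future damage as the record suggests.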

  4. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    Energy Technology Data Exchange (ETDEWEB)

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L. [Pacific Northwest Lab., Richland, WA (United States)

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed and found a statistically significant factor-of-two bias on average.
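    The first bias test compares the mean ASEP HEP with the observed failure fraction from the examinations. From the figures quoted above (45 failures among 4071 critical tasks), with the factor-of-two relationship shown purely as an illustration:

```python
def failure_fraction(failed, total):
    """Observed fraction of critical tasks failed."""
    return failed / total

observed = failure_fraction(45, 4071)  # from the examination data above
print(round(observed, 5))              # 0.01105

# A factor-of-two conservative bias, as the report found on average, would
# correspond to a mean ASEP HEP about twice the observed rate (illustrative):
mean_asep_hep = 2 * observed
print(round(mean_asep_hep, 5))
```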

  5. Confusion assessment method: a systematic review and meta-analysis of diagnostic accuracy

    Directory of Open Access Journals (Sweden)

    Shi Q

    2013-09-01

    Qiyun Shi,1,2 Laura Warren,3 Gustavo Saposnik,2 Joy C MacDermid1 1Health and Rehabilitation Sciences, Western University, London, Ontario, Canada; 2Stroke Outcomes Research Center, Department of Medicine, St Michael's Hospital, University of Toronto, Toronto, Ontario, Canada; 3Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada Background: Delirium is common in the early stages of hospitalization for a variety of acute and chronic diseases. Objectives: To evaluate the diagnostic accuracy of two delirium screening tools, the Confusion Assessment Method (CAM) and the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU). Methods: We searched MEDLINE, EMBASE, and PsychInfo for relevant articles published in English up to March 2013. We compared the two screening tools against Diagnostic and Statistical Manual of Mental Disorders IV criteria. Two reviewers independently assessed studies to determine their eligibility, validity, and quality. Sensitivity and specificity were calculated using a bivariate model. Results: Twenty-two studies (n = 2,442 patients) met the inclusion criteria. All studies demonstrated that these two scales can be administered within ten minutes by trained clinical or research staff. The pooled sensitivity and specificity were 82% (95% confidence interval [CI]: 69%–91%) and 99% (95% CI: 87%–100%) for CAM, and 81% (95% CI: 57%–93%) and 98% (95% CI: 86%–100%) for CAM-ICU, respectively. Conclusion: Both CAM and CAM-ICU are validated instruments for the diagnosis of delirium in a variety of medical settings. However, both present higher specificity than sensitivity. Therefore, the use of these tools should not replace clinical judgment. Keywords: confusion assessment method, diagnostic accuracy, delirium, systematic review, meta-analysis

  6. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    International Nuclear Information System (INIS)

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed and found a statistically significant factor-of-two bias on average.

  7. An assessment of thermal behavior of the DUPIC fuel bundle by subchannel analysis

    International Nuclear Information System (INIS)

    Thermal behavior of the standard DUPIC fuel has been assessed. The DUPIC fuel bundle was modeled for subchannel analysis using the ASSERT-IV code developed by AECL. From the calculated mixture enthalpy, equilibrium quality and void fraction distributions of the DUPIC fuel bundle, it is found that the net buoyancy effect is pronounced in the central region of the DUPIC fuel bundle when compared with the standard CANDU fuel bundle. It is also found that the central region of the DUPIC fuel bundle can be cooled more efficiently than that of the standard fuel bundle. Based upon the subchannel modeling used in this study, the location of minimum CHFR in the DUPIC fuel bundle was found to be very similar to that of the standard fuel. From the calculated mixture enthalpy distribution at the exit of the fuel channel, it is found that the mixture enthalpy and void fraction can be highest in the peripheral region of the DUPIC fuel bundle. On the other hand, the enthalpy and the void fraction were found to be highest in the central region of the standard CANDU fuel bundle at the exit of the fuel channel. Since the transverse interchange model between subchannels is important for the behavior of these variables, more effort is needed to validate this model. To investigate the influence of thermal-hydraulic parameter variations in the DUPIC fuel bundle, four different values of the channel flow rate were used in the subchannel analysis, and the effect of channel flow reduction on the thermal-hydraulic parameters is presented. This study shows that subchannel analysis is very useful in assessing the thermal behavior of fuel bundles in CANDU reactors. (author). 12 refs., 3 tabs., 17 figs

  8. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  9. Unpacking High and Low Efficacy Teachers' Task Analysis and Competence Assessment in Teaching Low-Achieving Students in Secondary Schools

    Science.gov (United States)

    Wang, Li-Yi; Jen-Yi, Li; Tan, Liang-See; Tan, Irene; Lim, Xue-Fang; Wu, Bing Sheng

    2016-01-01

    This study adopted a pragmatic qualitative research design to unpack high and low efficacy teachers' task analysis and competence assessment in the context of teaching low-achieving students. Nine secondary school English and Science teachers were recruited and interviewed. Results of thematic analysis show that helping students perform well in…

  10. A Multi-Dimensional Comparative Assessment Methodology for Policy Analysis: A Multi-Country Study of the Agricultural Sector

    OpenAIRE

    Nijkamp, Peter; Vindigni, Gabriella

    1999-01-01

    This paper offers an overview of assessment methods for physical planning, with a particular focus on the agricultural sector. An attempt is made to link multi-criteria analysis to meta-analysis by applying rough set theory as a framework for comparative study. An empirical application on the explanation of productivity differences in OECD countries is used to illustrate the potential of this approach.

  11. Morphological characterization and viability assessment of Trichoderma reesei by image analysis.

    Science.gov (United States)

    Lecault, Véronique; Patel, Nilesh; Thibault, Jules

    2007-01-01

    The production of cellulase from the filamentous fungus Trichoderma reesei is a critical step in the industrial process leading to cellulose ethanol. As a result of the lack of quantitative analysis tools, the intimate relationship that exists between the morphological and physiological states of the microorganism, the shear field in the bioreactor, and the process performance is not yet fully understood. A semiautomatic image analysis protocol was developed to characterize the mycelium morphology and to estimate its percentage viability during the fermentation process based on four morphological types (unbranched, branched, entangled, and clumped microorganisms). Pictures taken under bright field microscopy combined with images of fluorescein diacetate stained fungi were used to assess the morphological parameters and the percentage viability of microorganisms simultaneously. The method was tested during the course of fed-batch fermentation in a reciprocating plate bioreactor. The use of the image analysis protocol was found to be successful in quantifying the variations in the morphology and the viability of T. reesei throughout the fermentation. PMID:17373824

  12. Safety Analysis in Design and Assessment of the Physical Protection of the OKG NPP

    International Nuclear Information System (INIS)

    OKG AB operates a three-unit nuclear power plant in southern Sweden. As a result of recent developments in the legislation regarding physical protection of nuclear facilities, OKG has upgraded its protection against antagonistic actions. The new legislation includes requirements both on specific protective measures and on the performance of the physical protection as a whole. In short, the performance-related requirements state that sufficient measures shall be implemented to protect against antagonistic actions, as defined by the regulator in the “Design Basis Threat” (DBT). Historically, physical protection and nuclear safety have been managed largely as separate issues with different, sometimes contradictory, objectives. Now, insights from the work with the security upgrade have emphasized that physical protection needs to be regarded as an important part of the Defence-In-Depth (DiD) against nuclear accidents. Specifically, OKG has developed new DBT-based analysis methods, which may be characterized as probabilistically informed deterministic analysis, conforming to a format similar to the one used for conventional internal-events analysis. The result is a powerful tool for design and assessment of the performance of the protection against antagonistic actions from a nuclear safety perspective. (author)

  13. Economic analysis and assessment of syngas production using a modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei; Columbus, Eugene P.

    2011-08-10

    Economic analysis and modeling are essential for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in C++ and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a 60 Nm3 h-1 capacity bio-gasifier. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranked highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters except loan life. The annual costs of equipment, labor, feedstock, waste treatment, and utilities varied linearly with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for economic analysis and assessment of syngas production using a modeling approach.
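    The parametric cost structure described above (annualized capital plus annual operating cost, divided by annual syngas output) can be sketched as follows. This is a minimal illustration with hypothetical figures; the paper's actual C++ model, cost data and parameter values are not reproduced here.

    ```python
    def annualized_capital(capital: float, rate: float, loan_years: int) -> float:
        """Convert a one-time capital cost to an equivalent annual payment
        using the standard capital recovery factor."""
        crf = rate * (1 + rate) ** loan_years / ((1 + rate) ** loan_years - 1)
        return capital * crf

    def syngas_unit_cost(capital: float, rate: float, loan_years: int,
                         operating: float, output_nm3: float) -> float:
        """Unit cost ($/Nm3) = (annualized capital + annual operating cost) / annual output."""
        return (annualized_capital(capital, rate, loan_years) + operating) / output_nm3

    # Hypothetical inputs, for illustration only:
    cost = syngas_unit_cost(capital=100000.0, rate=0.05, loan_years=10,
                            operating=20000.0, output_nm3=50000.0)
    ```

    With these assumed figures the unit cost comes out around $0.66/Nm3; the sensitivity ranking reported in the abstract would be obtained by perturbing each input in turn and recording the change in unit cost.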

  14. Development and assessment of a subchannel analysis code system for SMART core design

    International Nuclear Information System (INIS)

    A subchannel code system has been developed for the thermal-hydraulic analysis of the SMART core, and the applicability and accuracy of the code have been assessed against various experimental data for rod bundles. MATRA is a subchannel analysis code that calculates the enthalpy and flow distribution in fuel assemblies and reactor cores for both steady-state and transient conditions. MATRA was developed to run on an IBM PC or HP workstation based on the existing CDC CYBER mainframe version of COBRA-IV-I, and has been given an improved structure and code functions for a more convenient user environment. Improvements to various models enhance the convergence and accuracy of the code, including the numerical solution scheme for the crossflow, the void fraction model, and the lateral transport model. A turbulent mixing model considering the void drift phenomenon was devised using two-phase mixing test data under PWR and BWR conditions. The MATRA/SR-1 CHF correlation system was developed from local conditions of rod-bundle CHF data calculated by MATRA. Optimized 1/8-core lumping models were developed for the analysis of the thermal margins of the SMART core at steady-state and transient conditions

  15. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment

    Science.gov (United States)

    Boel, Annekatrien; Steyaert, Woutert; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-01-01

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in-vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome. PMID:27461955

  16. Finger microvascular responses to deep inspiratory gasp assessed and quantified using wavelet analysis

    International Nuclear Information System (INIS)

    The physiological changes following a deep inspiratory gasp (DIG) manoeuvre have been described in the literature. However, the lack of a reliable signal processing technique to visualize and quantify these physiological changes has so far limited the applicability of the test in the clinical setting. The main aim of this study was to assess the feasibility of using wavelet analysis to quantify the pulse arrival time (PAT) and its changes during the DIG manoeuvre. Vascular responses were extracted from cardiac (electrocardiogram, ECG) and peripheral pulse (photoplethysmography, PPG) waveforms. Wavelet analysis characterized the cardiovascular responses of healthy adult subjects in time-frequency space, and the ECG–PPG inter-relationship in terms of the PAT. PAT showed a characteristic biphasic response to gasp, with an increase of 59 ± 59 ms (p = 0.001) relative to the maximum value reached during quiet breathing, and a decrease of −38 ± 55 ms (p < 0.01) relative to the minimum value during quiet breathing. The response measures were repeatable. This pilot study has successfully shown the feasibility of using wavelet analysis to characterize the cardiovascular waveforms and quantify their changes with DIG. (paper)
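    As a sketch of the PAT measure itself, the delay between each ECG R-peak and the arrival of the corresponding pulse wave at the finger, the pairing step can be written as below. This assumes beat fiducial points (R-peak and PPG pulse-foot times, in seconds) have already been detected; the wavelet-based characterization used in the study is not reproduced.

    ```python
    def pulse_arrival_times(r_peaks, ppg_feet):
        """Pair each ECG R-peak with the first PPG pulse foot that follows it
        and return the delays (PAT values, in the same time unit as the inputs)."""
        pats = []
        j = 0
        for r in r_peaks:
            # advance to the first pulse foot occurring after this R-peak
            while j < len(ppg_feet) and ppg_feet[j] <= r:
                j += 1
            if j < len(ppg_feet):
                pats.append(ppg_feet[j] - r)
        return pats
    ```

    Tracking how these per-beat PAT values rise and fall around the gasp gives the biphasic response the abstract quantifies.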

  18. Assessment of Forest Conservation Value Using a Species Distribution Model and Object-based Image Analysis

    Science.gov (United States)

    Jin, Y.; Lee, D. K.; Jeong, S. G.

    2015-12-01

    The ecological and social values of forests have recently been highlighted. Assessments of the biodiversity of forests, as well as their other ecological values, play an important role in regional and national conservation planning, and the preservation of habitats is linked to the protection of biodiversity. For mapping habitats, species distribution models (SDMs) are used to predict the suitable habitat of significant species, and such distribution modeling is increasingly being used in conservation science. However, pixel-based analysis does not contain contextual or topological information, so a continuous-field view that better reflects the real world is required to provide more accurate habitat predictions. Here we analyze and compare, at different scales, habitats of the yellow-throated marten (Martes flavigula), a top predator and umbrella species in South Korea. The object scale, in which a group of pixels with similar spatial and spectral characteristics is treated as a unit, and the pixel scale were both used for the SDM. Our analysis at different scales suggests that object-scale analysis provides a superior representation of continuous habitat, and thus will be useful in forest conservation planning as well as for species habitat monitoring.

  19. Development of PIRT and assessment matrix for verification and validation of sodium fire analysis codes

    International Nuclear Information System (INIS)

    The thermodynamic consequences of liquid sodium leak and fire accidents are among the important issues to be evaluated when considering the safety of a fast reactor plant building. The authors are therefore initiating a systematic verification and validation (V and V) activity to assure and demonstrate the reliability of numerical simulation tools for sodium fire analysis. The V and V activity is in progress, focusing mainly on the already developed sodium fire analysis codes SPHINCS and AQUA-SF. The events to be evaluated are hypothetical sodium spray, pool, or combined fire accidents and the subsequent thermodynamic behaviour postulated in a plant building. The present paper describes how a 'Phenomena Identification and Ranking Table' (PIRT) is first developed to clarify the important validation points in the sodium fire analysis codes, and how an 'Assessment Matrix' is proposed that summarizes both separate-effect tests and integral-effect tests for validating the computational models or the whole code for important phenomena. Furthermore, the paper presents a practical validation against a separate-effect test, in which the spray droplet combustion model of SPHINCS and AQUA-SF predicts the burned amount of a falling sodium droplet with an error mostly below 30%. (author)

  20. GIS-based analysis of drinking-water supply structures: a module for microbial risk assessment.

    Science.gov (United States)

    Kistemann, T; Herbst, S; Dangendorf, F; Exner, M

    2001-05-01

    Water-related infections constitute an important health impact world-wide. A set of tools serving for Microbial Risk Assessment (MRA) of waterborne diseases should comprise the entire drinking-water management system and take into account the Hazard Analysis and Critical Control Point (HACCP) concept which provides specific Critical Control Points (CCPs) reflecting each step of drinking-water provision. A Geographical Information System (GIS) study concerning water-supply structure (WSS) was conducted in the Rhein-Berg District (North Rhine-Westphalia, Germany). As a result, suitability of the existing water databases HYGRIS (hydrological basis geo-information system) and TEIS (drinking-water recording and information system) for the development of a WSS-GIS module could be demonstrated. Spatial patterns within the integrated raw and drinking-water data can easily be uncovered by GIS-specific options. The application of WSS-GIS allows a rapid visualization and analysis of drinking-water supply structure and offers huge advantages concerning microbial monitoring of raw and drinking water as well as recognition and investigation of incidents and outbreaks. Increasing requests regarding health protection and health reporting, demands for a better outbreak management and water-related health impacts of global climate change are major challenges of future water management to be tackled with methods including spatial analysis. GIS is assumed to be a very useful tool to meet these requirements. PMID:11434210

  1. Quantitative uncertainty analysis of Life Cycle Assessment for algal biofuel production.

    Science.gov (United States)

    Sills, Deborah L; Paramita, Vidia; Franke, Michael J; Johnson, Michael C; Akabas, Tal M; Greene, Charles H; Tester, Jefferson W

    2013-01-15

    As a result of algae's promise as a renewable energy feedstock, numerous studies have used Life Cycle Assessment (LCA) to quantify the environmental performance of algal biofuels, yet there is no consensus of results among them. Our work, motivated by the lack of comprehensive uncertainty analysis in previous studies, uses a Monte Carlo approach to estimate ranges of expected values of LCA metrics by incorporating parameter variability with empirically specified distribution functions. Results show that large uncertainties exist at virtually all steps of the biofuel production process. Although our findings agree with a number of earlier studies on matters such as the need for wet lipid extraction, nutrients recovered from waste streams, and high energy coproducts, the ranges of reported LCA metrics show that uncertainty analysis is crucial for developing technologies, such as algal biofuels. In addition, the ranges of energy return on (energy) invested (EROI) values resulting from our analysis help explain the high variability in EROI values from earlier studies. Reporting results from LCA models as ranges, and not single values, will more reliably inform industry and policy makers on expected energetic and environmental performance of biofuels produced from microalgae. PMID:23237457
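    The Monte Carlo approach described, propagating parameter variability through the model and reporting LCA metrics as ranges rather than single values, can be illustrated with a toy EROI calculation. The distribution shapes and figures below are invented for illustration; the study fits empirically specified distributions to many process parameters.

    ```python
    import random

    def eroi_samples(n, out_mean, out_sd, in_mean, in_sd, seed=0):
        """Draw n Monte Carlo samples of EROI = energy output / energy invested,
        with both quantities sampled from (assumed) normal distributions."""
        rng = random.Random(seed)
        samples = []
        while len(samples) < n:
            e_out = rng.gauss(out_mean, out_sd)
            e_in = rng.gauss(in_mean, in_sd)
            if e_in > 0:  # discard non-physical draws
                samples.append(e_out / e_in)
        return samples

    # Summarize the resulting distribution as an interval, not a point estimate:
    vals = sorted(eroi_samples(5000, out_mean=10.0, out_sd=2.0, in_mean=5.0, in_sd=1.0))
    low, high = vals[len(vals) // 20], vals[-(len(vals) // 20)]  # ~90% interval
    ```

    Reporting the interval (low, high) rather than a single EROI value is precisely the range-based reporting the abstract advocates.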

  2. Seismic fragility assessment of concrete gravity dams using nonlinear dynamic analysis with massed foundation

    Energy Technology Data Exchange (ETDEWEB)

    Ghaemian, M.; MirzahosseinKashani, S. [Sharif Univ. of Technology, Tehran (Iran, Islamic Republic of). Dept. of Civil Engineering

    2010-07-01

    A significant concern for dam owners is the maintenance of concrete gravity dams in good condition as infrastructure; for example, these dams should be able to continue to operate after a disaster such as an earthquake. However, many of these dams are older structures, and some are located near faults, so there are concerns regarding their performance under seismic loads. This paper presented seismic fragility curves for concrete gravity dams obtained using nonlinear dynamic analysis and a continuum crack propagation model (the smeared crack model), with the Pine Flat dam used for the calculations. Specifically, the paper presented the fragility analysis and probabilistic safety assessment as well as the structural modeling of dam behaviour using the smeared crack model. The finite element model of the dam was also presented, and the results of the nonlinear dynamic analysis were discussed. It was concluded that a seismic fragility curve based on the length of a crack at the base yields a higher probability than a seismic fragility curve based on the areas of cracked elements. Seismic fragility curves based on the areas of cracked elements should therefore be the more realistic approach, especially when they account for areas of damaged elements at the neck. 10 refs., 5 tabs., 5 figs.

  3. Life cycle assessment of fossil and biomass power generation chains. An analysis carried out for ALSTOM Power Services

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Ch.

    2008-12-15

    This final report issued by the Technology Assessment Department of the Paul Scherrer Institute (PSI) reports on the results of an analysis carried out on behalf of the Alstom Power Services company. Fossil and biomass chains as well as co-combustion power plants are assessed. The general objective of this analysis is an evaluation of specific as well as overall environmental burdens resulting from these different options for electricity production. The results obtained for fuel chains including hard coal, lignite, wood, natural gas and synthetic natural gas are discussed. An overall comparison is made and the conclusions drawn from the results of the analysis are presented.

  4. Expert assessments and content analysis of crew communication during ISS missions

    Science.gov (United States)

    Yusupova, Anna

    During the last seven years, we have analyzed the communication patterns between ISS crewmembers and mission control personnel and identified a number of different communication styles between these two groups (Gushin et al, 2005). In this paper, we report on an external validity check that compares our findings with those of another study using the same research material. For many years the group of psychologists at the Medical Center of Space Flight Control (TCUMOKO) at the Institute for Biomedical Problems (IBMP) in Moscow has been analyzing audio communication sessions of Russian space crews with ground-based Mission Control under long-duration spaceflight conditions. We compared, week by week, the texts of the standard weekly monitoring reports made by the TsUP psychological group and the audio communication of space crews with mission control centers. Expert assessments of the crewmembers' psychological state are made by IBMP psychoneurologists on the basis of daily schedule fulfillment, video and audio materials, and psychophysiological data from on board. The second approach was based on crew-ground communication analysis. To the two populations of messages we applied two corresponding schemas of content analysis. All statements made in communication sessions and weekly reports were divided into three groups in terms of their communication function (Lomov, 1981): 1) informative function (e.g., demands for information, requests, professional slang); 2) socio-regulatory function (e.g., rational consent or discord, operational complaint, refusal to cooperate); and 3) affective (emotional) function (e.g., encouragement, sympathy, emotional consent or discord). The number of statements in the audio communication sessions correlated with the corresponding functions (informative, regulatory, affective) of communication in the weekly monitoring reports made by the experts. Crewmembers' verbal behavior thus expresses their psycho-emotional state as formulated by the experts.

  5. Digital Cartographic Models as Analysis Support in Multicriterial Assessment of Vulnerable Flood Risk Elements

    Science.gov (United States)

    Nichersu, Iulian; Mierla, Marian; Trifanov, Cristian; Nichersu, Iuliana; Marin, Eugenia; Sela, Florentina

    2014-05-01

    In the last 20 years there has been an increase in the frequency of extreme weather and hydrological events. This increase raises the need to research the risk posed by extreme events with a large impact on the environment. This paper presents a method for analysing the elements vulnerable to the risk of an extreme hydrological event, more precisely a flood, making use of a LiDAR point cloud. The risk concept has two main components: hazard (represented by the frequency of occurrence and the intensity of the flood) and vulnerability (represented by the elements vulnerable to the flood). The area studied in the present paper is situated in the south-east of Europe (Romania, Danube Delta). The digital cartographic models were produced using the LiDAR data obtained within the CARTODD project. The high-resolution digital cartographic models consist of three components: a digital terrain model (DTM), a digital elevation model (DEM) and elevation classes (EC). To complete the information of the three models, orthophotos in the visible (VIS) and infrared (IR) spectrum were also used. The digital terrain model gives information on the altitude of the terrain and, indirectly, on the flood hazard, given the high resolution of the final product. The digital elevation model supplies information on the terrain surface plus the altitude of each object on it, and helps to derive the third model, the elevation classes model. We present here three categories of application of point cloud analysis in flood risk assessment: assessment of buildings, of endangered species mentioned in Annex 1 of the European Habitats Directive, and of morphologic/habitat damage. Pilot case studies of these applications are the town of Sulina; endangered species such as Osmoderma eremita, Vipera ursini and Spermophilus citellus; and the Sireasa Polder. For the town of Sulina, the man-made vulnerable elements were assessed.

  6. Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marten, Alex; Kopp, Robert E.; Shouse, Kate C.; Griffiths, Charles; Hodson, Elke L.; Kopits, Elizabeth; Mignone, Bryan K.; Moore, Chris; Newbold, Steve; Waldhoff, Stephanie T.; Wolverton, Ann

    2013-04-01

    to updating the estimates regularly as modeling capabilities and scientific and economic knowledge improves. To help foster further improvements in estimating the SCC, the U.S. Environmental Protection Agency and the U.S. Department of Energy hosted a pair of workshops on “Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis.” The first focused on conceptual and methodological issues related to integrated assessment modeling and the second brought together natural and social scientists to explore methods for improving damage assessment for multiple sectors. These two workshops provide the basis for the 13 papers in this special issue.

  7. In-Depth Analysis of Selected Topics Related to the Quality Assessment of E-Commerce Systems

    Science.gov (United States)

    Stefani, Antonia; Stavrinoudis, Dimitris; Xenos, Michalis

    This paper provides an in-depth analysis of selected important topics related to the quality assessment of e-commerce systems. It briefly introduces to the reader a quality assessment model based on Bayesian Networks and presents in detail the practical application of this model, highlighting practical issues related to the involvement of human subjects, conflict resolution, and calibration of the measurement instruments. Furthermore, the paper presents the application process of the model for the quality assessment of various e-commerce systems; it also discusses in detail how particular features (data) of the assessed e-commerce systems can be identified and, using the described automated assessment process, lead to higher abstraction information (desiderata) regarding the quality of the assessed e-commerce systems.

  8. Problems and prospects of modern methods of business analysis in the process of assessment of solvency of borrowers

    Directory of Open Access Journals (Sweden)

    Aptekar Saveliy S.

    2013-03-01

    The goal of the article is a comparative analysis of modern methods of business analysis in the process of assessing the solvency of borrowers of Ukrainian commercial banks, and a study of the prospects and problems of the use of these methods in the credit process. Based on the results of the study, the article systematises and examines the conduct of the credit process in Ukrainian commercial banks. The study makes clear that it is impossible to obtain a single assessment of a borrower's solvency by generalising numerical and non-numerical data; beyond the information represented in numbers, the judgment of qualified analysts is required for a justified assessment of solvency. Improving approaches to assessing the solvency of borrowers, and adapting existing foreign experience in this field to the specific features of Ukrainian borrowers, are important tasks for the Ukrainian banking system. Prospects for further studies in this direction lie in establishing the importance of business analysis and its key role in assessing the solvency of borrowers as a main instrument for minimising credit risk. Improvement of this sphere of analytical work in Ukrainian banks should be carried out in the following main directions: study and analysis of qualitative indicators of business activity; analysis of the main sections of the business plan; expansion of the set of financial-analysis indicators used to obtain information; analysis of possible sources of repayment of loan liabilities; and active use of analysis of the cash flows of an enterprise.

  9. Development of a quantitative morphological assessment of toxicant-treated zebrafish larvae using brightfield imaging and high-content analysis.

    Science.gov (United States)

    Deal, Samantha; Wambaugh, John; Judson, Richard; Mosher, Shad; Radio, Nick; Houck, Keith; Padilla, Stephanie

    2016-09-01

    One of the rate-limiting procedures in a developmental zebrafish screen is the morphological assessment of each larva. Most researchers opt for a time-consuming, structured visual assessment by trained human observer(s). The present studies were designed to develop a more objective, accurate and rapid method for screening zebrafish for dysmorphology. Instead of the very detailed human assessment, we have developed the computational malformation index, which combines the use of high-content imaging with a very brief human visual assessment. Each larva was quickly assessed by a human observer (basic visual assessment), killed, fixed and assessed for dysmorphology with the Zebratox V4 BioApplication using the Cellomics® ArrayScan® V(TI) high-content image analysis platform. The basic visual assessment adds in-life parameters, and the high-content analysis assesses each individual larva for various features (total area, width, spine length, head-tail length, length-width ratio, perimeter-area ratio). In developing the computational malformation index, a training set of hundreds of embryos treated with hundreds of chemicals was visually assessed using the basic or detailed method. In the second phase, we assessed both the stability of these high-content measurements and their performance using a test set of zebrafish treated with a dose range of two reference chemicals (trans-retinoic acid or cadmium). We found the measures were stable for at least 1 week, and comparison of these automated measures to detailed visual inspection of the larvae showed excellent congruence. Our computational malformation index provides an objective means of rapid phenotypic brightfield assessment of individual larvae in a developmental zebrafish assay. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26924781
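    The per-larva features listed above can be sketched as simple derived quantities. Feature names follow the abstract; the weighting that combines them with the visual score into the malformation index is not specified there, so none is shown, and the example values are invented.

    ```python
    def shape_features(total_area, width, spine_length, head_tail_length, perimeter):
        """Per-larva shape descriptors named in the abstract, computed from
        measurements extracted by the high-content imaging platform."""
        return {
            "total_area": total_area,
            "width": width,
            "spine_length": spine_length,
            "head_tail_length": head_tail_length,
            "length_width_ratio": head_tail_length / width,
            "perimeter_area_ratio": perimeter / total_area,
        }

    # Hypothetical measurements for one larva (arbitrary units):
    features = shape_features(total_area=50.0, width=2.0, spine_length=9.0,
                              head_tail_length=10.0, perimeter=30.0)
    ```

    Computing such ratios per larva is what makes the assessment objective: the same measurements always yield the same descriptor values, independent of the observer.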

  10. Advances in exergy analysis: a novel assessment of the Extended Exergy Accounting method

    International Nuclear Information System (INIS)

    Highlights: • General overview of exergy-based methods for system analysis is presented. • Taxonomy for exergy-based methods classification is proposed. • Theoretical foundations and details of Extended Exergy Accounting are described. - Abstract: Objective: This paper presents a theoretical reassessment of the Extended Exergy Accounting method (EEA in the following), a comprehensive exergy-based analytical paradigm for the evaluation of the total equivalent primary resource consumption in a generic system. Our intent in this paper was to rigorously review the EEA theory and to highlight its double “valence” as a resource quantifier and to clarify its operative potential. On the one side, EEA can be properly regarded as a general “costing” theory based on a proper knowledge of the cumulative exergy consumption of different supply chains, economic systems and labour market: it is indeed the only method that translates externalities (capital, labour and environmental remediation) into cumulative exergetic costs and thus allows for their rigorous inclusion in a comprehensive resource cost assessment. Indeed, the extended exergy cost eec reflects both the thermodynamic “efficiency” of the production chain and the “hidden” resource costs for the society as a whole. From another, perhaps even more innovative, perspective, EEA can be viewed as a space and time dependent methodology since economic and labour costs can only be included in the Extended Exergy balance via their exergy equivalents (via two rigorously defined postulates). Since the equivalent exergy cost of the externalities depends both on the type of society and on the time window of the analysis, the extended exergy cost eec reflects in a very real sense both the thermodynamic “efficiency” of the machinery and the “conversion efficiency” of the specific society within which the analysis is performed. We argue that these two intrinsic features of the EEA method provide both

  11. Assessment of neural network, frequency ratio and regression models for landslide susceptibility analysis

    Science.gov (United States)

    Pradhan, B.; Buchroithner, M. F.; Mansor, S.

    2009-04-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island in Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Nine landslide-related factors were extracted from the spatial database, and the neural network, frequency ratio and logistic regression coefficients of each factor were computed. Landslide susceptibility maps were drawn for the study area using the neural network, frequency ratio and logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the frequency ratio model provides higher prediction accuracy than the ANN and logistic regression models.
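    Of the three models compared, the frequency ratio is the simplest to state: for each class of a causative factor, the share of landslide cells falling in that class divided by the share of all cells in it. A minimal sketch, with class names and cell counts invented for illustration:

    ```python
    def frequency_ratio(landslide_counts, class_counts):
        """Frequency ratio per factor class:
        FR = (landslide cells in class / all landslide cells)
           / (cells in class / all cells).
        FR > 1 marks classes where landslides are over-represented."""
        total_ls = sum(landslide_counts.values())
        total = sum(class_counts.values())
        return {cls: (landslide_counts.get(cls, 0) / total_ls)
                     / (class_counts[cls] / total)
                for cls in class_counts}

    # Hypothetical slope-angle classes (cell counts):
    fr = frequency_ratio({"steep": 8, "flat": 2}, {"steep": 20, "flat": 80})
    ```

    In the standard FR workflow, a cell's susceptibility index is then the sum of the FR values of the classes it falls in, taken across all the causative factors.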

  12. Assessment of air quality of two metropolitan cities in Pakistan: elemental analysis using INAA and AAS

    International Nuclear Information System (INIS)

    Instrumental Neutron Activation Analysis (INAA) and Atomic Absorption Spectrometry (AAS) were used for the assessment of the air quality of two cities in Pakistan. In all, 26 elements were determined in suspended particulate matter (SPM) and soil samples. The SPM levels from different locations of Rawalpindi and Lahore indicate unhealthy to hazardous air quality, far above the World Health Organization (WHO) guidelines. Meteorological conditions and the nature of the soil, which contains clay components, mainly contribute to the inventory of the SPM elements of these cities. The calculated enrichment factors (EF) also indicate the dominance of the soil components. Contributions of inorganic elements due to heavy traffic with automotive exhaust and other commercial activities in this area have been identified. The areas of Qurtaba Chowk and Bank Square in Lahore in particular showed high EF for lead, reflecting the burning of automotive fuel and road dust. The accuracy and precision of the work were maintained through the concurrent use of IAEA Reference Materials. (orig.)
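
    The enrichment factor used in such studies is conventionally the concentration ratio of an element X to a crustal reference element (commonly Al or Fe) in the aerosol sample, normalized by the same ratio in average crustal material; EF values far above 1 point to a non-crustal (e.g. anthropogenic) source. A minimal sketch with illustrative concentrations, not the study's measurements:

    ```python
    def enrichment_factor(x_sample, ref_sample, x_crust, ref_crust):
        """EF = (X/ref in aerosol sample) / (X/ref in average crust).
        EF near 1 suggests a crustal (soil) origin; EF >> 1 suggests
        an anthropogenic source such as vehicle exhaust."""
        return (x_sample / ref_sample) / (x_crust / ref_crust)

    # Illustrative values only: Pb and Al concentrations in sample and crust.
    ef_pb = enrichment_factor(100.0, 50000.0, 20.0, 80000.0)
    print(ef_pb)  # well above 1: enriched relative to crustal composition
    ```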

  13. Initial assessment of facial nerve paralysis based on motion analysis using an optical flow method.

    Science.gov (United States)

    Samsudin, Wan Syahirah W; Sundaraj, Kenneth; Ahmad, Amirozi; Salleh, Hasriah

    2016-03-14

    An initial assessment method is proposed that can classify facial nerve paralysis and categorize its severity into one of six levels according to the House-Brackmann (HB) system, based on facial landmark motion computed with an Optical Flow (OF) algorithm. The desired landmarks were obtained from video recordings of 5 normal and 3 Bell's Palsy subjects and tracked using the Kanade-Lucas-Tomasi (KLT) method. A new scoring system based on motion analysis using area measurement is proposed. This scoring system uses the individual scores from the facial exercises and grades the paralysis according to the HB system. The proposed method has obtained promising results and may play a pivotal role in improved rehabilitation programs for patients. PMID:26578273
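
    The area-based scoring idea can be sketched in a few lines, assuming the landmarks have already been tracked (e.g. with KLT): the shoelace formula gives the area enclosed by a set of landmark positions, and the ratio between the affected and healthy sides of the face yields a simple symmetry score. Function names and data below are illustrative, not the authors' implementation:

    ```python
    import numpy as np

    def polygon_area(points):
        """Shoelace area of the polygon formed by an (N, 2) landmark array."""
        x, y = points[:, 0], points[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

    def symmetry_score(healthy_pts, affected_pts):
        """Area ratio: ~1.0 for symmetric motion, near 0 for a paralysed side."""
        return polygon_area(affected_pts) / polygon_area(healthy_pts)

    # Toy landmark envelopes: the affected side moves through half the area.
    healthy  = np.array([[0, 0], [1, 0], [1, 1.0], [0, 1.0]], dtype=float)
    affected = np.array([[0, 0], [1, 0], [1, 0.5], [0, 0.5]], dtype=float)
    print(symmetry_score(healthy, affected))
    ```

    A per-exercise score of this kind could then be mapped onto discrete HB grades.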

  14. Assessment of Smolt Condition for Travel Time Analysis Project, 1987-1997 Project Review.

    Energy Technology Data Exchange (ETDEWEB)

    Schrock, Robin M.; Hans, Karen M.; Beeman, John W. [US Geological Survey, Western Fisheries Research Center, Columbia River Research Laboratory, Cook, WA

    1997-12-01

    The Assessment of Smolt Condition for Travel Time Analysis Project (Bonneville Power Administration Project 87-401) monitored attributes of salmonid smolt physiology in the Columbia and Snake River basins from 1987 to 1997, under the Northwest Power Planning Council Fish and Wildlife Program, in cooperation with the Smolt Monitoring Program of the Fish Passage Center. The primary goal of the project was to investigate the physiological development of juvenile salmonids in relation to migration rates. The assumption was made that the level of smolt development, interacting with environmental factors such as flow, would be reflected in travel times. The Fish Passage Center applied the physiological measurements of smolt condition to Water Budget management, to regulate flows so as to decrease travel time and increase survival.

  15. Multifractal analysis of surface EMG signals for assessing muscle fatigue during static contractions

    Institute of Scientific and Technical Information of China (English)

    WANG Gang; REN Xiao-mei; LI Lei; WANG Zhi-zhong

    2007-01-01

    This study assessed muscle fatigue during static contractions using multifractal analysis and found that surface electromyographic (SEMG) signals exhibit multifractality during a static contraction. By applying the method of direct determination of the f(α) singularity spectrum, the area of the multifractal spectrum of the SEMG signals was computed. The results showed that the spectrum area increased significantly during muscle fatigue; the area could therefore be used as an indicator of muscle fatigue. Compared with the median frequency (MDF), the most popular indicator of muscle fatigue, the spectrum area presented here showed higher sensitivity during a static contraction. The singularity spectrum area is therefore considered a more effective indicator of muscle fatigue than the MDF.
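
    The MDF baseline against which the spectrum area is compared is straightforward to compute: it is the frequency at which the cumulative power spectrum reaches half of the total power. A minimal sketch (the study's windowing and the multifractal machinery are omitted):

    ```python
    import numpy as np

    def median_frequency(signal, fs):
        """Frequency splitting the power spectrum into halves of equal power.
        During fatigue the SEMG spectrum shifts downward, lowering the MDF."""
        power = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        cumulative = np.cumsum(power)
        return freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)]

    fs = 1000.0
    t = np.arange(1000) / fs
    tone = np.sin(2 * np.pi * 50.0 * t)  # pure 50 Hz test tone
    print(median_frequency(tone, fs))    # ~50 Hz for this single tone
    ```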

  16. Analysis report for WIPP colloid model constraints and performance assessment parameters

    Energy Technology Data Exchange (ETDEWEB)

    Mariner, Paul E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sassani, David Carl [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    An analysis of the Waste Isolation Pilot Plant (WIPP) colloid model constraints and parameter values was performed. The focus of this work was primarily on intrinsic colloids, mineral fragment colloids, and humic substance colloids, with a lesser focus on microbial colloids. Comments by the US Environmental Protection Agency (EPA) concerning intrinsic Th(IV) colloids and Mg-Cl-OH mineral fragment colloids were addressed in detail, assumptions and data used to constrain colloid model calculations were evaluated, and inconsistencies between data and model parameter values were identified. This work resulted in a list of specific conclusions regarding model integrity, model conservatism, and opportunities for improvement related to each of the four colloid types included in the WIPP performance assessment.

  17. Computational psycholinguistic analysis and its application in psychological assessment of college students

    Directory of Open Access Journals (Sweden)

    Kučera Dalibor

    2015-06-01

    The paper deals with the issue of computational psycholinguistic analysis (CPA) and its experimental application in basic psychological and pedagogical assessment. CPA is a new method which may potentially provide interesting, psychologically relevant information about the author of a particular text, regardless of the text's factual (semantic) content and without the need to obtain additional materials. As part of our QPA-FPT research we studied the link between the linguistic form of texts written by Czech college students and their personality characteristics obtained from a psychodiagnostic test battery. The article also discusses the basis of the method, opportunities for practical application and potential use within psychological and pedagogical disciplines.

  18. DECISION ANALYSIS AND TECHNOLOGY ASSESSMENTS FOR METAL AND MASONRY DECONTAMINATION TECHNOLOGIES

    International Nuclear Information System (INIS)

    The purpose of this investigation was to conduct a comparative analysis of innovative technologies for the non-aggressive removal of coatings from metal and masonry surfaces and the aggressive removal of one-quarter to one-inch thickness of surface from structural masonry. The technologies tested should be capable of being used in nuclear facilities. Innovative decontamination technologies are being evaluated under standard, non-nuclear conditions at the FIU-HCET technology assessment site in Miami, Florida. This study is being performed to support the OST, the Deactivation and Decommissioning (D and D) Focus Area, and the environmental restoration of DOE facilities throughout the DOE complex by providing objective evaluations of currently available decontamination technologies

  19. Assessment of oil weathering by gas chromatography-mass spectrometry, time warping and principal component analysis

    DEFF Research Database (Denmark)

    Malmquist, Linus M.V.; Olsen, Rasmus R.; Hansen, Asger B.;

    2007-01-01

    Detailed characterization and understanding of oil weathering at the molecular level is an essential part of tiered approaches for forensic oil spill identification, for risk assessment of terrestrial and marine oil spills, and for evaluating the effects of bioremediation initiatives. Here, a chemometric-based method is applied to data from two in vitro experiments in order to distinguish the effects of evaporation and dissolution processes on oil composition. The potential of the method for obtaining detailed chemical information on the effects of evaporation and dissolution processes, for determining weathering state and for distinguishing between various weathering processes is investigated and discussed. The method is based on comprehensive and objective chromatographic data processing followed by principal component analysis (PCA) of concatenated sections of gas chromatography–mass spectrometry...
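
    The PCA step at the end of such a pipeline reduces to a few lines of linear algebra: mean-center the (samples × variables) data matrix and project it onto the leading right singular vectors. A generic sketch only; it omits the authors' chromatographic preprocessing (baseline handling, time warping, concatenation of chromatogram sections):

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Project the mean-centered rows of X onto the first principal axes."""
        Xc = X - X.mean(axis=0)
        # Rows of Vt are the principal axes, ordered by explained variance.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    rng = np.random.default_rng(0)
    X = rng.normal(size=(10, 40))  # e.g. 10 oil samples x 40 signal variables
    scores = pca_scores(X)
    print(scores.shape)  # one 2-D score point per sample
    ```

    Samples that cluster or drift apart in the score plot then indicate differing weathering states.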

  20. Human factors assessment in PRA using Task Analysis Linked Evaluation Technique (TALENT)

    International Nuclear Information System (INIS)

    Thirty years ago the US military and the US aviation industry, and more recently, in response to the US Three Mile Island and USSR Chernobyl accidents, the US commercial nuclear power industry, acknowledged that human error, as an immediate precursor and as a latent or indirect influence in the form of training, maintainability, in-service test and surveillance programs, is a primary contributor to unreliability and risk in complex high-reliability systems. A 1985 Nuclear Regulatory Commission (NRC) study of Licensee Event Reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Despite the magnitude and nature of the human error cited in that study, there has been limited attention to personnel-centered issues, especially person-to-person issues involving group processes, management and organizational environment. The paper discusses NRC integration and applications research with respect to the Task Analysis Linked Evaluation Technique (TALENT) in risk assessment applications.