WorldWideScience

Sample records for assessment ioa analysis

  1. Validation of the Indicators of Abuse (IOA) Screen.

    Science.gov (United States)

    Reis, Myrna; Nahmiash, Daphne

    1998-01-01

    Reports on the validity of the Indicators of Abuse (IOA) Screen, used by social-services-agency practitioners as an abuse screening tool. An abuse-indicator model evolving from the IOA suggests three main types of abuse signals: caregivers' personal problems/issues, caregivers' interpersonal problems, and care receivers' social-support shortages…

  2. Linkage mapping of the locus for inherited ovine arthrogryposis (IOA) to sheep chromosome 5.

    Science.gov (United States)

    Murphy, Angela M; MacHugh, David E; Park, Stephen D E; Scraggs, Erik; Haley, Chris S; Lynn, David J; Boland, Maurice P; Doherty, Michael L

    2007-01-01

    Arthrogryposis is a congenital malformation affecting the limbs of newborn animals and infants. Previous work has demonstrated that inherited ovine arthrogryposis (IOA) has an autosomal recessive mode of inheritance. Two affected homozygous recessive (art/art) Suffolk rams were used as founders for a backcross pedigree of half-sib families segregating the IOA trait. A genome scan was performed using 187 microsatellite genetic markers and all backcross animals were phenotyped at birth for the presence and severity of arthrogryposis. Pairwise LOD scores of 1.86, 1.35, and 1.32 were detected for three microsatellites, BM741, JAZ, and RM006, that are located on sheep Chr 5 (OAR5). Additional markers in the region were identified from the genetic linkage map of BTA7 and by in silico analyses of the draft bovine genome sequence, three of which were informative. Interval mapping of all autosomes produced an F value of 21.97 (p < 0.01) for a causative locus in the region of OAR5 previously flagged by pairwise linkage analysis. Inspection of the orthologous region of HSA5 highlighted a previously fine-mapped locus for human arthrogryposis multiplex congenita neurogenic type (AMCN). A survey of the HSA5 genome sequence identified plausible candidate genes for both IOA and human AMCN.
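    The pairwise LOD scores reported above compare the likelihood of linkage at a recombination fraction θ against free recombination (θ = 0.5). A minimal sketch of the calculation for a phase-known backcross, using hypothetical recombinant counts rather than the study's marker data:

```python
import math

def lod_score(recombinants: int, non_recombinants: int, theta: float) -> float:
    """Phase-known pairwise LOD: log10 of L(theta) / L(theta = 0.5)."""
    n = recombinants + non_recombinants
    # Log-likelihood under linkage at recombination fraction theta
    log_l_theta = (recombinants * math.log10(theta)
                   + non_recombinants * math.log10(1 - theta))
    # Log-likelihood under free recombination (theta = 0.5)
    log_l_null = n * math.log10(0.5)
    return log_l_theta - log_l_null

# Hypothetical counts: 20 scored meioses, 4 recombinants between a marker
# and the trait locus; scan theta for the maximum LOD.
best = max((lod_score(4, 16, t / 100), t / 100) for t in range(1, 50))
print(f"max LOD {best[0]:.2f} at theta = {best[1]:.2f}")
```

    As expected, the LOD peaks at the observed recombination fraction (4/20 = 0.2); a score above 3 is the conventional evidence threshold for linkage.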

  3. Urban energy consumption: Different insights from energy flow analysis, input–output analysis and ecological network analysis

    International Nuclear Information System (INIS)

    Highlights: • Urban energy consumption was assessed from three different perspectives. • A new concept called controlled energy was developed from network analysis. • Embodied energy and controlled energy consumption of Beijing were compared. • The integration of all three perspectives will elucidate sustainable energy use. - Abstract: Energy consumption has always been a central issue for sustainable urban assessment and planning. Different forms of energy analysis can provide various insights for energy policy making. This paper brought together three approaches for energy consumption accounting, i.e., energy flow analysis (EFA), input–output analysis (IOA) and ecological network analysis (ENA), and compared their different perspectives and the policy implications for urban energy use. Beijing was used to exemplify the different energy analysis processes, and the 42 economic sectors of the city were aggregated into seven components. It was determined that EFA quantifies both the primary and final energy consumption of the urban components by tracking the different types of fuel used by the urban economy. IOA accounts for the embodied energy consumption (direct and indirect) used to produce goods and services in the city, whereas the control analysis of ENA quantifies the specific embodied energy that is regulated by the activities within the city’s boundary. The network control analysis can also be applied to determining which economic sectors drive the energy consumption and to what extent these sectors are dependent on each other for energy. So-called “controlled energy” is a new concept that adds to the analysis of urban energy consumption, indicating the adjustable energy consumed by sectors. The integration of insights from all three accounting perspectives furthers our understanding of sustainable energy use in cities.
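    The embodied (direct plus indirect) energy accounting that IOA performs rests on the Leontief inverse. A minimal sketch with a hypothetical three-sector economy (illustrative values, not the Beijing data or its 42-sector table):

```python
import numpy as np

# Hypothetical technical coefficient matrix A (inter-sector purchases per
# unit output), direct energy intensities e, and final demand y.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.02]])
e = np.array([2.0, 0.5, 1.2])       # direct energy use per unit output
y = np.array([100.0, 80.0, 60.0])   # final demand per sector

# The Leontief inverse (I - A)^-1 gives total (direct + indirect) output
# requirements; epsilon is the embodied energy intensity of each sector.
L = np.linalg.inv(np.eye(3) - A)
epsilon = e @ L
embodied = epsilon * y              # embodied energy in each sector's final demand

print(np.round(epsilon, 3), round(float(embodied.sum()), 1))
```

    By construction, total embodied energy in final demand equals the direct energy used across the gross output that supports it, which is the consistency check IOA-based accounts rely on.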

  4. Assessment of Thorium Analysis Methods

    International Nuclear Information System (INIS)

    An assessment of thorium analytical methods for power-fuel mixtures was carried out, covering titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC) and neutron activation analysis. It can be concluded that the methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis and UV-VIS spectrometry, whereas the methods with low accuracy (standard deviation 3-10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulated samples can be analyzed by titrimetry, polarography and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, analytical methods without molecular reactions are in general better than those involving molecular reactions (author). 19 refs., 1 tab

  5. Multifractal analysis for nutritional assessment.

    Directory of Open Access Journals (Sweden)

    Youngja Park

    The concept of multifractality is currently used to describe self-similar and complex scaling properties observed in numerous biological signals. Fractals are geometric objects or dynamic variations which exhibit some degree of similarity (irregularity) to the original object over a wide range of scales. This approach treats the irregularity of a biologic signal as an indicator of adaptability, the capability to respond to unpredictable stress, and health. In the present work, we propose the application of multifractal analysis of wavelet-transformed proton nuclear magnetic resonance (¹H NMR) spectra of plasma to determine nutritional insufficiency. To validate this method on ¹H NMR signals of human plasma, the standard deviation from a classical statistical approach, and the Hurst exponent (H), left slope and partition function from multifractal analysis, were extracted from ¹H NMR spectra to test whether multifractal indices could discriminate healthy subjects from unhealthy, intensive care unit patients. After validation, the multifractal approach was applied to spectra of plasma from a modified crossover study of sulfur amino acid insufficiency and tested for associations with blood lipids. The results showed that standard deviation and H, but not left slope, were significantly different for sulfur amino acid sufficiency and insufficiency. Quadratic discriminant analysis of H, left slope and the partition function showed 78% overall classification accuracy according to sulfur amino acid status. Triglycerides and apolipoprotein C3 were significantly correlated with a multifractal model containing H, left slope, and standard deviation, and cholesterol and high-sensitivity C-reactive protein were significantly correlated with H. In conclusion, multifractal analysis of ¹H NMR spectra provides a new approach to characterizing nutritional status.
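    Of the indices mentioned, the Hurst exponent H is the most commonly computed. A simple rescaled-range (R/S) estimator on synthetic signals, shown purely as an illustration (the study itself used wavelet-based multifractal analysis of NMR spectra):

```python
import numpy as np

def hurst_rs(x: np.ndarray, min_win: int = 8) -> float:
    """Estimate the Hurst exponent H by rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    n = len(x)
    sizes, rs_vals = [], []
    win = min_win
    while win <= n // 2:
        rs = []
        for start in range(0, n - win + 1, win):
            seg = x[start:start + win]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviation from mean
            r = dev.max() - dev.min()           # range of the deviation
            s = seg.std()                       # segment standard deviation
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(win)
            rs_vals.append(np.mean(rs))
        win *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return float(slope)

rng = np.random.default_rng(0)
h_noise = hurst_rs(rng.standard_normal(4096))   # uncorrelated noise: H near 0.5
print(round(h_noise, 2))
```

    Uncorrelated noise yields H near 0.5, while persistent (trending) signals push H toward 1; departures from 0.5 are what mark a signal as irregular or adaptive in this framework.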

  6. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  7. Quality Assessment of Urinary Stone Analysis

    DEFF Research Database (Denmark)

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel;

    2016-01-01

    The aim of the present study was to assess the quality of urinary stone analysis of laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of the quality of the laboratory in the present study was based on the attainment…, fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference… and chemical analysis.

  8. Safety analysis and risk assessment handbook

    International Nuclear Information System (INIS)

    This Safety Analysis and Risk Assessment Handbook (SARAH) provides guidance to the safety analyst at the Rocky Flats Environmental Technology Site (RFETS) in the preparation of safety analyses and risk assessments. Although the older guidance (the Rocky Flats Risk Assessment Guide) continues to be used for updating the Final Safety Analysis Reports developed in the mid-1980s, this new guidance is used with all new authorization basis documents. With the mission change at RFETS came the need to establish new authorization basis documents for its facilities, whose functions had changed. The methodology and databases for performing the evaluations that support the new authorization basis documents had to be standardized, to avoid the use of different approaches and/or databases for similar accidents in different facilities. This handbook presents this new standardized approach. The handbook begins with a discussion of the requirements of the different types of authorization basis documents and how to choose the one appropriate for the facility to be evaluated. It then walks the analyst through the process of identifying all the potential hazards in the facility, classifying them, and choosing the ones that need to be analyzed further. It then discusses the methods for evaluating accident initiation and progression and covers the basic steps in a safety analysis, including consequence and frequency binning and risk ranking. The handbook lays out standardized approaches for determining the source terms of the various accidents (including airborne release fractions, leakpath factors, etc.), the atmospheric dispersion factors appropriate for Rocky Flats, and the methods for radiological and chemical consequence assessments. The radiological assessments use a radiological "template," a spreadsheet that incorporates the standard values of parameters, whereas the chemical assessments use the standard codes ARCHIE and ALOHA.

  9. Using Covariance Analysis to Assess Pointing Performance

    Science.gov (United States)

    Bayard, David; Kang, Bryan

    2009-01-01

    A Pointing Covariance Analysis Tool (PCAT) has been developed for evaluating the expected performance of the pointing control system for NASA's Space Interferometry Mission (SIM). The SIM pointing control system is very complex, consisting of multiple feedback and feedforward loops, and operating with multiple latencies and data rates. The SIM pointing problem is particularly challenging due to the effects of thermomechanical drifts in concert with the long camera exposures needed to image dim stars. Other pointing error sources include sensor noises, mechanical vibrations, and errors in the feedforward signals. PCAT models the effects of finite camera exposures and all other error sources using linear system elements. This allows the pointing analysis to be performed using linear covariance analysis. PCAT propagates the error covariance using a Lyapunov equation associated with time-varying discrete and continuous-time system matrices. Unlike Monte Carlo analysis, which could involve thousands of computational runs for a single assessment, the PCAT analysis performs the same assessment in a single run. This capability facilitates the analysis of parametric studies, design trades, and "what-if" scenarios for quickly evaluating and optimizing the control system architecture and design.
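    The covariance propagation PCAT performs can be illustrated with the discrete Lyapunov recursion P_{k+1} = A P Aᵀ + Q, iterated once per time step in place of thousands of Monte Carlo runs. A toy two-state (angle, rate) model with assumed noise levels, not the actual SIM dynamics:

```python
import numpy as np

# Minimal sketch: pointing error state [angle, rate] with an assumed
# process noise covariance Q; values are illustrative only.
dt = 0.1
A = np.array([[1.0, dt],
              [0.0, 1.0]])           # constant-rate drift dynamics
Q = np.diag([1e-8, 1e-6])            # process noise covariance (assumed)

P = np.zeros((2, 2))                 # start with zero uncertainty
for _ in range(1000):                # discrete Lyapunov recursion
    P = A @ P @ A.T + Q

sigma_angle = float(np.sqrt(P[0, 0]))
print(f"1-sigma pointing error after 1000 steps: {sigma_angle:.2e} rad")
```

    A single pass of this recursion yields the full error covariance at each step, which is exactly the economy over Monte Carlo sampling that the abstract describes.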

  10. Vulnerability assessment using two complementary analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the resulting hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weakness of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weakness, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  11. Office of Integrated Assessment and Policy Analysis

    International Nuclear Information System (INIS)

    The mission of the Office of Integrated Assessments and Policy Analysis (OIAPA) is to examine current and future policies related to the development and use of energy technologies. The principal ongoing research activity to date has focused on the impacts of several energy sources, including coal, oil shale, solar, and geothermal, from the standpoint of the Resource Conservation and Recovery Act. An additional project has recently been initiated on an evaluation of impacts associated with the implementation of the Toxic Substances Control Act. The impacts of the Resource Conservation and Recovery Act and the Toxic Substances Control Act on energy supply constitute the principal research focus of OIAPA for the near term. From these studies a research approach will be developed to identify certain common elements in the regulatory evaluation cycle as a means of evaluating subsequent environmental, health, and socioeconomic impact. It is planned that an integrated assessment team examine studies completed or underway on the following aspects of major regulations: health, risk assessment, testing protocols, environment control cost/benefits, institutional structures, and facility siting. This examination would assess the methodologies used, determine the general applicability of such studies, and present in a logical form information that appears to have broad general application. A suggested action plan for the State of Tennessee on radioactive and hazardous waste management is outlined.

  12. Dynamic analysis and assessment for sustainable development

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The assessment of sustainable development is crucial for constituting sustainable development strategies. Assessment methods that exist so far usually only use an indicator system for making sustainable judgements. These indicators rarely reflect dynamic characteristics. However, sustainable development is influenced by changes in the social-economic system and in the eco-environmental system at different times. Besides the spatial character, sustainable development has a temporal character that cannot be neglected; therefore the research system should also be dynamic. This paper focuses on this dynamic trait, so that the assessment results obtained provide more information for judgements in decision-making processes. Firstly the dynamic characteristics of sustainable development are analyzed, which point to a track of sustainable development that is an upward undulating curve. According to the dynamic character and the development rules of a social, economic and ecological system, a flexible assessment approach that is based on tendency analysis, restrictive conditions and a feedback system is then proposed for sustainable development.

  13. Multicriteria analysis in hazards assessment in Libya

    Science.gov (United States)

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol

    2012-11-01

    Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have shown that these problems can be addressed through planning studies and detailed information about the prone areas. Determining the time, location and size of the problem is important for decision makers' planning and management activities. It is important to know the risk represented by those hazards and to take actions to protect against them. Multicriteria analysis methods (the analytic hierarchy process, pairwise comparison, and the ranking method) are used to analyse which hazard facing Libya is the most dangerous. A multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternative(s) should be preferred. For our problem of environmental risk assessment, the result is a ranking or categorisation of hazards with regard to their risk level.
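    Of the multicriteria methods named, the analytic hierarchy process derives a priority vector from a pairwise comparison matrix via its principal eigenvector, with a consistency ratio to check the judgments. A sketch using hypothetical judgments on Saaty's 1-9 scale, not the paper's hazard data:

```python
import numpy as np

# Hypothetical pairwise comparisons of three hazards (A, B, C):
# C[i, j] states how much more important hazard i is than hazard j.
C = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priority vector = principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(C)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3.
# A consistency ratio CR below 0.1 is conventionally acceptable.
lam = eigvals[k].real
ci = (lam - 3) / 2
cr = ci / 0.58
print(np.round(w, 3), round(cr, 3))
```

    The resulting weights rank the three hazards, and the small consistency ratio confirms the (hypothetical) judgments are not self-contradictory.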

  14. Geochemical and Geochronologic Investigations of Zircon-hosted Melt Inclusions in Rhyolites from the Mesoproterozoic Pea Ridge IOA-REE Deposit, St. Francois Mountains, Missouri

    Science.gov (United States)

    Watts, K. E.; Mercer, C. N.; Vazquez, J. A.

    2015-12-01

    Silicic volcanic and plutonic rocks of an eroded Mesoproterozoic caldera complex were intruded and replaced by iron ore, and cross-cut by REE-enriched breccia pipes (~12% total REO), to form the Pea Ridge iron-oxide-apatite-REE (IOA-REE) deposit. Igneous activity, iron ore formation, and REE mineralization overlapped in space and time; however, the source of REEs and other metals (Fe, Cu, Au) integral to these economically important deposits remains unclear. Melt inclusions (MI) hosted in refractory zircon phenocrysts are used to constrain magmatic components and processes in the formation of the Pea Ridge deposit. Homogenized (1.4 kbar, 1000°C, 1 hr) MI in zircons from rhyolites ~600 ft (PR-91) and ~1200 ft (PR-12) laterally from the ore body were analyzed for major elements by EPMA and for volatiles and trace elements (H2O, S, F, Cl, REEs, Rb, Sr, Y, Zr, Nb, U, Th) by SHRIMP-RG. Metals (including Cu, Au) will be measured in an upcoming SHRIMP-RG session. U-Pb ages, Ti and REE were determined by SHRIMP-RG for a subset of zircon spots adjacent to MI (1458 ± 18 Ma (PR-12); 1480 ± 45 Ma (PR-91)). MI glasses range from fresh and homogeneous dacite-rhyolite (65-75 wt% SiO2) to heterogeneous, patchy mixtures of K-spar and quartz (PR-12, 91), and more rarely mica, albite and/or anorthoclase (PR-91). MI are commonly attached to monazite and xenotime, particularly along re-entrants and zircon rims (PR-91). Fresh dacite-rhyolite glasses (PR-12) have moderate H2O (~2-2.5 wt%), Rb/Sr ratios (~8) and U (~5-7 ppm), and negative (chondrite-normalized) Eu anomalies (Eu ~0.4-0.7 ppm) (typical of rhyolites), whereas HREEs (Tb, Ho, Tm) are elevated (~2-3 ppm). Patchy K-spar and quartz inclusions (PR-12, 91) have flat LREE patterns, and positive anomalies in Tb, Ho, and Tm. One K-spar inclusion (PR-91) has a ~5-50 fold increase in HREEs (Tb, Dy, Ho, Er, Tm) and U (35 ppm) relative to other MI. U-Pb and REE analyses of its zircon host are not unusual (1484 ± 21 Ma); its irregular shape

  15. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    Energy Technology Data Exchange (ETDEWEB)

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes the methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites are also subject to the hazard analysis methodologies of this standard.

  16. Qualitative Analysis for Maintenance Process Assessment

    Science.gov (United States)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  17. Acoustic analysis assessment in speech pathology detection

    Directory of Open Access Journals (Sweden)

    Panek Daria

    2015-09-01

    Automatic detection of voice pathologies enables non-invasive, low-cost and objective assessment of the presence of disorders, as well as accelerating and improving the process of diagnosis and the clinical treatment given to patients. In this work, a vector made up of 28 acoustic parameters is evaluated using principal component analysis (PCA), kernel principal component analysis (kPCA) and an auto-associative neural network (NLPCA) in four kinds of pathology detection (hyperfunctional dysphonia, functional dysphonia, laryngitis, vocal cord paralysis) using the a, i and u vowels spoken at a high, low and normal pitch. The results indicate that the kPCA and NLPCA methods can be considered a step towards pathology detection of the vocal folds, and show that such an approach provides acceptable results for this purpose, with the best efficiency levels of around 100%. The study brings together the most commonly used approaches to speech signal processing and compares the machine learning methods for determining the health status of the patient.
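    The PCA step applied to the 28-parameter acoustic vector can be sketched with synthetic data (the study's actual features, and its kernel and neural variants, are not reproduced here):

```python
import numpy as np

# Illustrative PCA sketch: a synthetic 28-feature matrix driven by a few
# latent factors, reduced to the components retaining 95% of the variance.
rng = np.random.default_rng(0)
n_samples, n_features = 200, 28
latent = rng.standard_normal((n_samples, 3))      # 3 underlying factors
mixing = rng.standard_normal((3, n_features))
X = latent @ mixing + 0.1 * rng.standard_normal((n_samples, n_features))

Xc = X - X.mean(axis=0)                           # centre the features
U, s, Vt = np.linalg.svd(Xc, full_matrices=False) # principal directions in Vt
var_ratio = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(var_ratio), 0.95)) + 1
scores = Xc @ Vt[:k].T                            # reduced representation
print(k, scores.shape)
```

    The reduced scores, rather than the raw 28 parameters, would then feed the downstream classifier, which is the role PCA plays in the pipeline above.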

  18. A Content Analysis of Intimate Partner Violence Assessments

    Science.gov (United States)

    Hays, Danica G.; Emelianchik, Kelly

    2009-01-01

    With approximately 30% of individuals of various cultural identities experiencing intimate partner violence (IPV) in their lifetimes, it is imperative that professional counselors engage in effective assessment practices and be aware of the limitations of available IPV assessments. A content analysis of 38 IPV assessments was conducted, yielding…

  19. Non-human biota dose assessment. Sensitivity analysis and knowledge quality assessment

    International Nuclear Information System (INIS)

    This report provides a summary of a programme of work, commissioned within the BIOPROTA collaborative forum, to assess the quantitative and qualitative elements of uncertainty associated with biota dose assessment of potential impacts of long-term releases from geological disposal facilities (GDF). Quantitative and qualitative aspects of uncertainty were determined through sensitivity and knowledge quality assessments, respectively. Both assessments focused on default assessment parameters within the ERICA assessment approach. The sensitivity analysis was conducted within the EIKOS sensitivity analysis software tool and was run in both generic and test case modes. The knowledge quality assessment involved development of a questionnaire around the ERICA assessment approach, which was distributed to a range of experts in the fields of non-human biota dose assessment and radioactive waste disposal assessments. Combined, these assessments enabled critical model features and parameters that are both sensitive (i.e. have a large influence on model output) and of low knowledge quality to be identified for each of the three test cases. The output of this project is intended to provide information on those parameters that may need to be considered in more detail for prospective site-specific biota dose assessments for GDFs. Such information should help users to enhance the quality of their assessments and build greater confidence in the results. (orig.)

  20. Data Analysis and Next Generation Assessments

    Science.gov (United States)

    Pon, Kathy

    2013-01-01

    For the last decade, much of the work of California school administrators has been shaped by the accountability of the No Child Left Behind Act. Now as they stand at the precipice of Common Core Standards and next generation assessments, it is important to reflect on the proficiency educators have attained in using data to improve instruction and…

  1. Emerging frontier technologies for food safety analysis and risk assessment

    Institute of Scientific and Technical Information of China (English)

    DONG Yi-yang; LIU Jia-hui; WANG Sai; CHEN Qi-long; GUO Tian-yang; ZHANG Li-ya; JIN Yong; SU Hai-jia; TAN Tian-wei

    2015-01-01

    Access to secure and safe food is a basic human necessity and essential for a sustainable world. Performing high-end food safety analysis and risk assessment with state-of-the-art technologies is therefore of utmost importance. With applications exemplified by microfluidic immunoassay, aptasensors, direct analysis in real time, high-resolution mass spectrometry, benchmark dose and chemical-specific adjustment factor, this review presents frontier food safety analysis and risk assessment technologies, from which both food quality and public health will undoubtedly benefit in the foreseeable future.

  2. Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff that conduct analyse...

  3. Analysis and Assessment of Computer-Supported Collaborative Learning Conversations

    NARCIS (Netherlands)

    Trausan-Matu, Stefan

    2008-01-01

    Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.

  4. Material Analysis for a Fire Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alexander; Nemer, Martin B.

    2014-08-01

    This report consolidates technical information on several materials and material classes for a fire assessment. The materials include three polymeric materials, wood, and hydraulic oil. The polymers are polystyrene, polyurethane, and melamine-formaldehyde foams. Samples of two of the specific materials were tested for their behavior in a fire-like environment. Test data and the methods used to test the materials are presented. Much of the remaining data are taken from a literature survey. This report serves as a reference source of properties necessary to predict the behavior of these materials in a fire.

  5. Dimensionality Assessment of Ordered Polytomous Items With Parallel Analysis

    NARCIS (Netherlands)

    Timmerman, Marieke E.; Lorenzo-Seva, Urbano

    2011-01-01

    Parallel analysis (PA) is an often-recommended approach for assessment of the dimensionality of a variable set. PA is known in different variants, which may yield different dimensionality indications. In this article, the authors considered the most appropriate PA procedure to assess the number of c
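    Horn's parallel analysis, the basic variant of the PA procedures discussed, retains components whose observed eigenvalues exceed those obtained from random data of the same size. A sketch on synthetic two-factor data (illustrative only; the article's ordered-polytomous refinements are not reproduced):

```python
import numpy as np

def parallel_analysis(data: np.ndarray, n_sims: int = 200, seed: int = 0) -> int:
    """Horn's parallel analysis: count components whose eigenvalues exceed
    the mean eigenvalues of correlation matrices of random normal data."""
    n, p = data.shape
    rng = np.random.default_rng(seed)
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_sims):
        r = rng.standard_normal((n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    rand /= n_sims
    return int(np.sum(obs > rand))

# Hypothetical example: 6 variables, two blocks each driven by one factor.
rng = np.random.default_rng(1)
f = rng.standard_normal((500, 2))
loadings = np.array([[0.9, 0.8, 0.7, 0.0, 0.0, 0.0],
                     [0.0, 0.0, 0.0, 0.9, 0.8, 0.7]])
x = f @ loadings + 0.5 * rng.standard_normal((500, 6))
print(parallel_analysis(x))
```

    With a clear two-factor structure the procedure retains exactly two components; variants differ mainly in how the random reference data and comparison percentiles are chosen, which is the choice the article examines.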

  6. Assessment and Planning Using Portfolio Analysis

    Science.gov (United States)

    Roberts, Laura B.

    2010-01-01

    Portfolio analysis is a simple yet powerful management tool. Programs and activities are placed on a grid with mission along one axis and financial return on the other. The four boxes of the grid (low mission, low return; high mission, low return; high return, low mission; high return, high mission) help managers identify which programs might be…

  7. Environmental risk assessment in GMO analysis.

    Science.gov (United States)

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage across countries is related to the different positions concerning the labelling of GMO products: based on the principle of substantial equivalence, or rather on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:21384330

  9. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  10. Uncertainty analysis in integrated assessment: the users' perspective

    NARCIS (Netherlands)

    Gabbert, S.G.M.; Ittersum, van M.K.; Kroeze, C.; Stalpers, S.I.P.; Ewert, F.; Alkan Olsson, J.

    2010-01-01

    Integrated Assessment (IA) models aim at providing information- and decision-support to complex problems. This paper argues that uncertainty analysis in IA models should be user-driven in order to strengthen science–policy interaction. We suggest an approach to uncertainty analysis that starts with

  11. TEXTS SENTIMENT-ANALYSIS APPLICATION FOR PUBLIC OPINION ASSESSMENT

    Directory of Open Access Journals (Sweden)

    I. A. Bessmertny

    2015-01-01

    The paper describes an approach to assessing the emotional tonality of natural-language texts based on special dictionaries. A method for the automatic assessment of public opinion by means of sentiment analysis of the reviews and discussions that accompany published Web documents is proposed. The method is based on word statistics in the documents. A pilot version of a software system implementing sentiment analysis of Russian natural-language text on a linear assessment scale is developed. Syntactic analysis and word lemmatization are used to identify terms more accurately. The tonality dictionaries are provided in an editable format and are open for extension. A program system implementing sentiment analysis of Russian texts based on open tonality dictionaries is presented for the first time.
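
The dictionary-based scoring can be sketched as follows. The tiny lexicon and the plain averaging on a linear scale are illustrative assumptions; the authors' system additionally applies syntactic analysis and lemmatization, which this sketch omits:

```python
# Tiny illustrative tonality lexicon (a real system would load open,
# editable dictionaries and lemmatize words first)
TONALITY = {"excellent": 2.0, "good": 1.0, "bad": -1.0, "terrible": -2.0}

def sentiment_score(text, lexicon=TONALITY):
    """Average tonality of the lexicon words found in the text,
    on a linear scale (negative values = negative opinion)."""
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    hits = [lexicon[w] for w in words if w in lexicon]
    return sum(hits) / len(hits) if hits else 0.0
```

Averaging over dictionary hits keeps the score on the same linear scale regardless of document length.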

  12. Metallic Mineral Resources Assessment and Analysis System Design

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper presents the aim and the design structure of the metallic mineral resources assessment and analysis system. The system adopts an integrated data-warehouse technique composed of an affairs-processing layer and an analysis-application layer. The affairs-processing layer includes multiple databases (such as geological, geophysical and geochemical databases), while the analysis-application layer includes the data warehouse, online analytical processing and data mining. The paper also presents in detail the data warehouse of the present system and the appropriate spatial analysis methods and models. Finally, the paper presents the prospects of the system.

  13. Accuracy Assessment and Analysis for GPT2

    Directory of Open Access Journals (Sweden)

    YAO Yibin

    2015-07-01

    GPT (global pressure and temperature) is a global empirical model usually used to provide temperature and pressure for the determination of tropospheric delay. GPT has some weaknesses, which have been addressed in a new empirical model named GPT2, which not only improves the accuracy of temperature and pressure but also provides specific humidity, water vapor pressure, mapping function coefficients and other tropospheric parameters; however, no accuracy analysis of GPT2 had been made until now. In this paper, high-precision meteorological data from ECMWF and NOAA were used to test and analyze the accuracy of the temperature, pressure and water vapor pressure given by GPT2. Testing results show that the mean bias of temperature is -0.59℃ and the average RMS is 3.82℃; the absolute values of the average bias of pressure and water vapor pressure are less than 1 mb; GPT2 pressure has an average RMS of 7 mb, and water vapor pressure no more than 3 mb. Accuracy differs across latitudes, and all parameters show obvious seasonality. In conclusion, the GPT2 model has high accuracy and stability on a global scale.
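
The bias and RMS statistics reported above are straightforward to compute; a minimal sketch against a reference series (the numbers in the test are hypothetical, not ECMWF or NOAA data):

```python
import numpy as np

def bias_and_rms(model, reference):
    """Mean bias and root-mean-square error of model values
    against a reference series of the same length."""
    d = np.asarray(model, dtype=float) - np.asarray(reference, dtype=float)
    return d.mean(), np.sqrt(np.mean(d ** 2))
```

In the paper's setting, `model` would hold GPT2 values and `reference` the collocated reanalysis or radiosonde values.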

  14. Comparative analysis of model assessment in community detection

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Bayesian cluster inference with a flexible generative model allows us to detect various types of structures. However, it has problems stemming from computational complexity and difficulties in model assessment. We consider the stochastic block model with restricted hyperparameter space, which is known to correspond to modularity maximization. We show that it not only reduces computational complexity, but is also beneficial for model assessment. Using various criteria, we conduct a comparative analysis of the model assessments, and analyze whether each criterion tends to overfit or underfit. We also show that the learning of hyperparameters leads to qualitative differences in Bethe free energy and cross-validation errors.

  15. Hazard Assessment of a Nitration Plant using Fault Tree Analysis

    Directory of Open Access Journals (Sweden)

    C. Rajagopal

    1994-10-01

    Hazard assessment techniques, namely, fault tree analysis and safety analysis, have been applied to the nitration section of a plant producing explosives in the defence sector. Critical components and operations, the failure of which could lead to the occurrence of an unwanted event, have been identified and their effects quantitatively assessed. Some remedial measures have been suggested to minimise potential hazards and the effect of incorporating these measures on the system safety has been examined by means of specific case studies.

  16. Background, Assessment and Analysis of the Gender Issues in Pakistan

    OpenAIRE

    Moheyuddin, Ghulam

    2005-01-01

    This paper describes an assessment of gender issues in Pakistan, with a review and analysis of the major sectors exhibiting gender inequalities. Before continuing to the detailed analysis of gender issues in Pakistan, it gives a bird's-eye view of the socio-economic, political and cultural background of Pakistan. The paper explains the areas of critical gender inequality in Pakistan and reviews various gender indicators for the country. It also discusses the current policies and the program...

  17. Intuitive Analysis of Variance-- A Formative Assessment Approach

    Science.gov (United States)

    Trumpower, David

    2013-01-01

    This article describes an assessment activity that can show students how much they intuitively understand about statistics, but also alert them to common misunderstandings. How the activity can be used formatively to help improve students' conceptual understanding of analysis of variance is discussed. (Contains 1 figure and 1 table.)

  18. Using conversation analysis to assess and treat people with aphasia.

    Science.gov (United States)

    Beeke, Suzanne; Maxim, Jane; Wilkinson, Ray

    2007-05-01

    This article gives an overview of the application to aphasia of conversation analysis (CA), a qualitative methodology for the analysis of recorded, naturally occurring talk produced in everyday human interaction. CA, like pragmatics, considers language use in context, but it differs from other analytical frameworks because the clinician is not making interpretations about how an aspect of language should be coded or judging whether an utterance is successful or adequate in terms of communication. We first outline the CA methodology before discussing its application to the assessment of aphasia, principally through the use of two published assessment tools. We then move on to illustrate applications of CA in the field of aphasia therapy by discussing two single case study interventions. Key conversation behaviors are illustrated with transcripts from interactions recorded by the person with aphasia and the person's habitual conversation partner in the home environment. Finally, we explore the implications of using CA as a tool for assessment and treatment in aphasia.

  19. Assessing Group Interaction with Social Language Network Analysis

    Science.gov (United States)

    Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.

    In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.

  20. Assessing environmental performance by combining life cycle assessment, multi-criteria analysis and environmental performance indicators

    NARCIS (Netherlands)

    Hermann, B.G.; Kroeze, C.; Jawjit, W.

    2007-01-01

    We present a new analytical tool, called COMPLIMENT, which can be used to provide detailed information on the overall environmental impact of a business. COMPLIMENT integrates parts of tools such as life cycle assessment, multi-criteria analysis and environmental performance indicators. It avoids di

  1. A hybrid input–output multi-objective model to assess economic–energy–environment trade-offs in Brazil

    International Nuclear Information System (INIS)

    A multi-objective linear programming (MOLP) model based on a hybrid Input–Output (IO) framework is presented. This model aims at assessing the trade-offs between economic, energy, environmental (E3) and social objectives in the Brazilian economic system. This combination of multi-objective models with Input–Output Analysis (IOA) plays a supplementary role in understanding the interactions between the economic and energy systems, and the corresponding impacts on the environment, offering a consistent framework for assessing the effects of distinct policies on these systems. Firstly, the System of National Accounts (SNA) is reorganized to include the National Energy Balance, creating a hybrid IO framework that is extended to assess Greenhouse Gas (GHG) emissions and the employment level. The objective functions considered are the maximization of GDP (gross domestic product) and employment levels, as well as the minimization of energy consumption and GHG emissions. An interactive method enabling a progressive and selective search of non-dominated solutions with distinct characteristics and underlying trade-offs is utilized. Illustrative results indicate that the maximization of GDP and the employment levels lead to an increase of both energy consumption and GHG emissions, while the minimization of either GHG emissions or energy consumption cause negative impacts on GDP and employment. - Highlights: • A hybrid Input–Output multi-objective model is applied to the Brazilian economy. • Objective functions are GDP, employment level, energy consumption and GHG emissions. • Interactive search process identifies trade-offs between the competing objectives. • Positive correlations between GDP growth and employment. • Positive correlations between energy consumption and GHG emissions
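
At the core of the hybrid IO framework is the Leontief quantity relation x = (I - A)⁻¹d, to which intensity vectors (GHG emissions, employment) are applied as satellite accounts. A toy two-sector sketch with hypothetical coefficients follows; the MOLP layer on top of this accounting core is omitted:

```python
import numpy as np

# Hypothetical two-sector technical-coefficient matrix A and final demand d
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
d = np.array([100.0, 50.0])

# Leontief quantity model: total sectoral output needed for final demand
x = np.linalg.solve(np.eye(2) - A, d)

# Satellite account: hypothetical GHG-emission intensities per unit output
e = np.array([0.5, 1.2])
total_emissions = e @ x
```

Employment or value-added totals follow the same pattern, with their own intensity vectors in place of `e`; the MOLP model then trades such totals off against each other.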

  2. Thermal analysis in quality assessment of rapeseed oils

    Energy Technology Data Exchange (ETDEWEB)

    Wesolowski, Marek; Erecinska, Joanna [Department of Analytical Chemistry, Medical University of Gdansk, Al. Gen. J. Hallera 107, PL 80-416 Gdansk (Poland)

    1998-12-07

    The evaluation of the applicability of thermoanalytical methods to the assessment of the quality of refined rapeseed oils was performed. Density, refractive index, and saponification, iodine and acid numbers of rapeseed oils were determined as part of the study. By correlating the data obtained with the temperatures of initial, final and successive mass losses determined from the thermogravimetric curves, strong relations were observed. The possibility of a practical utilization of regression equations for the assessment of the quality of refined rapeseed oils was indicated. The results of principal component analysis indicate that thermogravimetric techniques are very useful in defining the quality of rapeseed oils compared with chemical analyses

  3. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effect on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new development in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  4. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in the following directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets visual markers defining the individual points of interest. The positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated, and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of the temporal variables of the extremes of the Fx(t), Fy(t) and Fz(t) functions, the durations of the braking and propulsive phases, the duration of the double-support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, the low time requirements allowing rapid analysis of a large number of trials in a short time, and the comparability of the variables obtained during different research measurements.
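
One of the indices of symmetry mentioned in the Results can be sketched as follows. The percentage formulation (left-right difference over the mean of the two sides) is a common choice and an assumption here, since the abstract does not give the formula used:

```python
def symmetry_index(x_left, x_right):
    """Symmetry index in percent (0 = perfect symmetry), computed from
    the same GRF variable measured on the left and right leg.
    Formula assumed: 100 * (left - right) / mean(left, right)."""
    mean = 0.5 * (x_left + x_right)
    return 100.0 * (x_left - x_right) / mean
```

For example, a peak vertical force of 550 N on the left leg against 450 N on the right yields an asymmetry of 20%.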

  5. Life Cycle Exergy Analysis of Wind Energy Systems : Assessing and improving life cycle analysis methodology

    OpenAIRE

    Davidsson, Simon

    2011-01-01

    Wind power capacity is currently growing fast around the world. At the same time different forms of life cycle analysis are becoming common for measuring the environmental impact of wind energy systems. This thesis identifies several problems with current methods for assessing the environmental impact of wind energy and suggests improvements that will make these assessments more robust. The use of the exergy concept combined with life cycle analysis has been proposed by several researchers ov...

  6. Assessment of residual stress using thermoelastic stress analysis

    OpenAIRE

    Robinson, Andrew Ferrand

    2011-01-01

    The work described in this thesis considers the application of thermoelastic stress analysis (TSA) to the assessment of residual stresses in metallic materials. Residual stresses exist within almost all engineering components and structures. They are an unavoidable consequence of manufacturing processes and may cause the premature and catastrophic failure of a component when coupled with in-service stresses. Alternatively, beneficial residual stress may be introduced to enhance th...

  7. Site Characterization and Analysis Penetrometer System (SCAPS): Assessing Site Contamination

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    While a number of techniques exist for the remediation of contaminated soils, one of the largest problems is often the initial site assessment. It can be a difficult, expensive and time-consuming process to determine the exact extent of site contamination. The U.S. Army Engineer Waterways Experiment Station (WES) under the sponsorship of the U.S. Army Environmental Center (AEC) initiated the development of the Site Characterization and Analysis Penetrometer System (SCAPS) Research, Developmen...

  8. A Protocol for the Global Sensitivity Analysis of Impact Assessment Models in Life Cycle Assessment.

    Science.gov (United States)

    Cucurachi, S; Borgonovo, E; Heijungs, R

    2016-02-01

    The life cycle assessment (LCA) framework has established itself as the leading tool for the assessment of the environmental impact of products. Several works have established the need to integrate the LCA and risk analysis methodologies, given their several common aspects. One way to reach such integration is to guarantee that uncertainties in LCA modeling are carefully treated. It has been claimed that more attention should be paid to quantifying the uncertainties present in the various phases of LCA. Though the topic has been attracting increasing attention from practitioners and experts in LCA, there is still a lack of understanding and a limited use of the available statistical tools. In this work, we introduce a protocol to conduct global sensitivity analysis in LCA. The article focuses on life cycle impact assessment (LCIA), and particularly on the relevance of global techniques for the development of trustable impact assessment models. We use a novel characterization model developed for the quantification of the impacts of noise on humans as a test case. We show that global SA is fundamental to guarantee that the modeler has a complete understanding of: (i) the structure of the model and (ii) the importance of uncertain model inputs and the interaction among them.
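
A standard global SA technique is the variance-based (Sobol) first-order index, which apportions output variance to individual inputs. The pick-freeze Monte Carlo sketch below is a generic illustration of the idea, not the specific protocol proposed in the article:

```python
import numpy as np

def first_order_sobol(f, d, n=100_000, seed=0):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices
    for a vectorized model f over d independent U(0,1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = f(A), f(B)
    var = fA.var()
    S = np.empty(d)
    for i in range(d):
        BAi = B.copy()
        BAi[:, i] = A[:, i]          # share only input i with sample A
        # Cov(f(A), f(BAi)) isolates the variance contribution of input i
        S[i] = (np.mean(fA * f(BAi)) - fA.mean() * fB.mean()) / var
    return S
```

For the additive test model f(x) = 2x₁ + x₂ the indices are analytically 0.8 and 0.2, and they sum to 1 because there are no interactions.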

  9. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  10. Climate Change Scientific Assessment and Policy Analysis. Scientific Assessment of Solar Induced Climate Change

    International Nuclear Information System (INIS)

    The programme Scientific Assessment and Policy Analysis is commissioned by the Dutch Ministry of Housing, Spatial Planning, and the Environment (VROM) and has the following objectives: Collection and evaluation of relevant scientific information for policy development and decision-making in the field of climate change; Analysis of resolutions and decisions in the framework of international climate negotiations and their implications. The programme is concerned with analyses and assessments intended for a balanced evaluation of the state of the art knowledge for underpinning policy choices. These analyses and assessment activities are carried out within several months to about a year, depending on the complexity and the urgency of the policy issue. Assessment teams organised to handle the various topics consist of the best Dutch experts in their fields. Teams work on incidental and additionally financed activities, as opposed to the regular, structurally financed activities of the climate research consortium. The work should reflect the current state of science on the relevant topic. In this report an assessment on the following topics is presented: (1) Reconstructions of solar variability, especially with respect to those parameters which are relevant for climate change; (2) Reconstructions of proxies of solar variability, e.g. cosmogenic isotopes; (3) Reconstructions of global as well as regional climate, with respect to temperature, precipitation and circulation; (4) Physical understanding of the mechanisms which play a role in the solar terrestrial link. We focus on the Holocene with emphasis on the last centuries because of data availability, to avoid confusing climate responses to orbital changes with those due to solar activity and because of the relevance for human induced climate change as compared to the role of the variable sun in the 20th century

  11. NASA Langley Systems Analysis & Concepts Directorate Technology Assessment/Portfolio Analysis

    Science.gov (United States)

    Cavanaugh, Stephen; Chytka, Trina; Arcara, Phil; Jones, Sharon; Stanley, Doug; Wilhite, Alan W.

    2006-01-01

    Systems analysis develops and documents candidate missions and architectures, associated system concepts, enabling capabilities and investment strategies to achieve NASA's strategic objectives. The technology assessment process connects the missions and architectures to the investment strategies. In order to successfully implement a technology assessment, there is a need to collect, manipulate, analyze, document, and disseminate technology-related information. Information must be collected and organized on the wide variety of potentially applicable technologies, including: previous research results, key technical parameters and characteristics, technology readiness levels, relationships to other technologies, costs, and potential barriers and risks. This information must be manipulated to facilitate planning and documentation. An assessment is included of the programmatic and technical risks associated with each technology task as well as potential risk mitigation plans. Risks are assessed and tracked in terms of likelihood of the risk occurring and consequences of the risk if it does occur. The risk assessments take into account cost, schedule, and technical risk dimensions. Assessment data must be simplified for presentation to decision makers. The Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center has a wealth of experience in performing Technology Assessment and Portfolio Analysis as this has been a business line since 1978.

  12. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    Michael Möderl; Wolfgang Rauch

    2011-01-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios. Thereby, parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data for the same threat scenario derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is likewise applicable to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.

  13. Quantitative Computed Tomography and image analysis for advanced muscle assessment

    Directory of Open Access Journals (Sweden)

    Kyle Joseph Edmunds

    2016-06-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types with specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volume, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration.
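
The association of HU ranges with tissue types can be sketched as a simple voxel classifier plus a composition summary. The window boundaries below are illustrative assumptions, not the exact cutoffs used in the cited studies:

```python
# Illustrative HU windows (assumed for the sketch, not the studies' cutoffs)
HU_WINDOWS = {
    "fat": (-200, -10),
    "loose connective / atrophic muscle": (-9, 40),
    "normal muscle": (41, 200),
}

def classify_hu(hu):
    """Assign a voxel's Hounsfield Unit value to a tissue class."""
    for tissue, (lo, hi) in HU_WINDOWS.items():
        if lo <= hu <= hi:
            return tissue
    return "other"

def composition(hu_values):
    """Fraction of voxels per tissue class within a muscle volume."""
    counts = {}
    for hu in hu_values:
        t = classify_hu(hu)
        counts[t] = counts.get(t, 0) + 1
    n = len(hu_values)
    return {t: c / n for t, c in counts.items()}
```

Applied to all voxels segmented within a muscle, `composition` yields the per-tissue percentages that the review reports alongside average HU values.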

  14. Biological dosimetry: chromosomal aberration analysis for dose assessment

    International Nuclear Information System (INIS)

    In view of the growing importance of chromosomal aberration analysis as a biological dosimeter, the present report provides a concise summary of the scientific background of the subject and a comprehensive source of information at the technical level. After a review of the basic principles of radiation dosimetry and radiation biology, basic information on the biology of lymphocytes, the structure of chromosomes and the classification of chromosomal aberrations is presented. This is followed by a presentation of techniques for collecting blood, storing, transporting, culturing, making chromosomal preparations and scoring aberrations. The physical and statistical parameters involved in dose assessment are discussed, and examples of actual dose assessments taken from the scientific literature are given.

  15. Model analysis: Representing and assessing the dynamics of student learning

    Directory of Open Access Journals (Sweden)

    Edward F. Redish

    2006-02-01

    Full Text Available Decades of education research have shown that students can simultaneously possess alternate knowledge frameworks and that the development and use of such knowledge are context dependent. As a result of extensive qualitative research, standardized multiple-choice instruments such as the Force Concept Inventory and the Force-Motion Concept Evaluation give instructors tools to probe their students’ conceptual knowledge of physics. However, many existing quantitative analysis methods focus on the binary question of whether a student answers a question correctly or not, which greatly limits the capacity of standardized multiple-choice tests to assess students’ alternative knowledge. In addition, context dependence, whereby a student may apply the correct knowledge in some situations and revert to using alternative types of knowledge in others, is often treated as random noise in current analyses. In this paper, we present model analysis, which builds on qualitative research to establish a quantitative representation framework. With this method, students’ alternative knowledge and the probabilities that students use such knowledge in a range of equivalent contexts can be quantitatively assessed. This provides a way to analyze research-based multiple-choice questions that generates much richer information than score-based analysis.
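A minimal sketch of the bookkeeping behind this kind of model analysis, on hypothetical data: each student's answers across equivalent questions become a probability vector over candidate mental models, and the class is summarized by a density matrix whose diagonal gives the average probability of each model being used. The response counts and the three-model breakdown are invented for illustration:

```python
import numpy as np

# Toy data: per student, counts of answers matching model 1 (correct),
# model 2 (a common alternative), and a null model, over 6 equivalent items.
answers = np.array([
    [5, 1, 0],
    [3, 3, 0],
    [1, 4, 1],
    [6, 0, 0],
])

def model_density_matrix(counts):
    """Build per-student model state vectors and average them into a class density matrix."""
    p = counts / counts.sum(axis=1, keepdims=True)   # per-student model probabilities
    u = np.sqrt(p)                                   # model "state" vectors
    outer = u[:, :, None] * u[:, None, :]            # one outer product per student
    return outer.mean(axis=0)                        # class density matrix

D = model_density_matrix(answers)
# Diagonal entries of D estimate the class-average probability of each model;
# off-diagonal entries reflect mixing (context-dependent model use).
```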

  16. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  17. Development and assessment of best estimate integrated safety analysis code

    International Nuclear Information System (INIS)

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published

  18. Social and ethical analysis in health technology assessment.

    Science.gov (United States)

    Tantivess, Sripen

    2014-05-01

    This paper presents a review of the domestic and international literature on the assessment of the social and ethical implications of health technologies. It gives an overview of the key concepts, principles, and approaches that should be taken into account when conducting a social and ethical analysis within health technology assessment (HTA). Although there is growing consensus among healthcare experts that the social and ethical ramifications of a given technology should be examined before its adoption, demand for this kind of analysis among policy-makers around the world, including in Thailand, has so far been lacking. Currently, decision-makers base technology adoption decisions mainly on evidence of clinical effectiveness, value for money, and budget impact, while social and ethical aspects have been neglected. Despite the recognized importance of considering equity, justice, and social issues when making decisions regarding health resource allocation, the absence of internationally accepted principles and methodologies, among other factors, hinders research in these areas. Given that developing internationally agreed standards takes time, it has been recommended that priority be given to defining processes that are justifiable, transparent, and contestable. A discussion of the current situation in Thailand concerning social and ethical analysis of health technologies is also presented. PMID:24964703

  19. A Monte Carlo based spent fuel analysis safeguards strategy assessment

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael L [Los Alamos National Laboratory; Tobin, Stephen J [Los Alamos National Laboratory; Swinhoe, Martyn T [Los Alamos National Laboratory; Menlove, Howard O [Los Alamos National Laboratory; Sandoval, Nathan P [Los Alamos National Laboratory

    2009-01-01

    assessment process, the techniques employed to automate the coupled facets of the assessment process, and the standard burnup/enrichment/cooling time dependent spent fuel assembly library. We also clearly define the diversion scenarios that will be analyzed during the standardized assessments. Though this study is currently limited to generic PWR assemblies, it is expected that the results of the assessment will yield an adequate spent fuel analysis strategy knowledge that will help the down-select process for other reactor types.

  20. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, Frequency statistics, Kolmogorov-Smirnov non-parametric test, Student (T-test) and ANOVA means comparison tests and LSD post-hoc multiple comparison test, are discussed. (author)
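The tests named above can be sketched with scipy on hypothetical Likert-scale survey scores (the data and group labels are invented; survey validation and the LSD post-hoc test are omitted for brevity):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical 1-5 Likert responses from three departments (assumed data)
ops = rng.integers(3, 6, 40)     # operations
maint = rng.integers(2, 5, 40)   # maintenance
admin = rng.integers(3, 6, 40)   # administration

# Distribution check: Kolmogorov-Smirnov against a fitted normal
ks = stats.kstest(ops, "norm", args=(ops.mean(), ops.std(ddof=1)))

# Two-group mean comparison: Student's t-test
t_res = stats.ttest_ind(ops, maint)

# Three-group mean comparison: one-way ANOVA
f_res = stats.f_oneway(ops, maint, admin)
```

In practice, an LSD (least significant difference) post-hoc comparison would follow a significant ANOVA result to locate which pairs of groups differ.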

  1. TXRF analysis of soils and sediments to assess environmental contamination.

    Science.gov (United States)

    Bilo, Fabjola; Borgese, Laura; Cazzago, Davide; Zacco, Annalisa; Bontempi, Elza; Guarneri, Rita; Bernardello, Marco; Attuati, Silvia; Lazo, Pranvera; Depero, Laura E

    2014-12-01

    Total reflection x-ray fluorescence spectroscopy (TXRF) is proposed for the elemental chemical analysis of crustal environmental samples, such as sediments and soils. A comparative study of TXRF with respect to flame atomic absorption spectroscopy and inductively coupled plasma optical emission spectroscopy was performed. Microwave acid digestion and suspension preparation methods are evaluated. Good agreement was found among the results obtained with the different spectroscopic techniques and sample preparation methods for Cr, Mn, Fe, Ni, Cu, and Zn. We demonstrated that TXRF is suitable for the assessment of environmental contamination phenomena, although the errors for Pb, As, V, and Ba are considerable.

  2. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    International Nuclear Information System (INIS)

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which consequently increases the reliability of e-waste estimates compared to an approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions. This shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
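The lifespan pillar of such IOA models can be illustrated with a toy sales-lifespan convolution: annual sales are spread over future discard years by a Weibull lifespan distribution, a common building block of e-waste estimation. The Weibull parameters and sales figures below are invented for illustration, not taken from the Dutch case study:

```python
import numpy as np

def ewaste_generated(sales, shape, scale, horizon):
    """Units discarded per year: annual sales convolved with a Weibull lifespan pdf."""
    years = np.arange(1, horizon + 1)
    cdf = 1 - np.exp(-(years / scale) ** shape)      # probability of discard by age t
    pdf = np.diff(np.concatenate(([0.0], cdf)))      # discard probability in year t
    waste = np.zeros(len(sales) + horizon)
    for t, s in enumerate(sales):
        waste[t + 1 : t + 1 + horizon] += s * pdf    # cohort sold in year t is discarded later
    return waste

sales = np.array([100, 120, 150, 160, 170])          # hypothetical annual sales (kt)
waste = ewaste_generated(sales, shape=2.0, scale=7.0, horizon=20)
```

The paper's multivariate method goes further by jointly reconciling sales, stock and lifespan data rather than treating sales and lifespan as given.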

  3. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feng, E-mail: fwang@unu.edu [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, encompassing a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to a lack of high-quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool, which consequently increases the reliability of e-waste estimates compared to an approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions. This shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e

  4. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C;

    2001-01-01

    To compare two software versions provided by Lunar Co. for assessment of body-composition analysis by DXA.

  5. 7 CFR 2.71 - Director, Office of Risk Assessment and Cost-Benefit Analysis.

    Science.gov (United States)

    2010-01-01

    ... Chief Economist § 2.71 Director, Office of Risk Assessment and Cost-Benefit Analysis. (a) Delegations..., Office of Risk Assessment and Cost-Benefit Analysis: (1) Responsible for assessing the risks to human... 7 Agriculture 1 2010-01-01 2010-01-01 false Director, Office of Risk Assessment and...

  6. Supporting analysis and assessments quality metrics: Utility market sector

    Energy Technology Data Exchange (ETDEWEB)

    Ohi, J. [National Renewable Energy Lab., Golden, CO (United States)

    1996-10-01

    In FY96, NREL was asked to coordinate all analysis tasks so that in FY97 these tasks will be part of an integrated analysis agenda that will begin to define a 5-15 year R&D roadmap and portfolio for the DOE Hydrogen Program. The purpose of the Supporting Analysis and Assessments task at NREL is to provide this coordination and conduct specific analysis tasks. One of these tasks is to prepare the Quality Metrics (QM) for the Program as part of the overall QM effort at DOE/EERE. The Hydrogen Program is one of 39 program planning units conducting QM, a process begun in FY94 to assess the benefits and costs of DOE/EERE programs. The purpose of QM is to inform decision-making during the budget formulation process by describing the expected outcomes of programs during the budget request process. QM is expected to establish a first step toward merit-based budget formulation and allow DOE/EERE to get the "most bang for its (R&D) buck." In FY96, NREL coordinated a QM team that prepared a preliminary QM for the utility market sector. In the electricity supply sector, the QM analysis shows hydrogen fuel cells capturing 5% (or 22 GW) of the total market of 390 GW of new capacity additions through 2020. Hydrogen consumption in the utility sector increases from 0.009 Quads in 2005 to 0.4 Quads in 2020. Hydrogen fuel cells are projected to displace over 0.6 Quads of primary energy in 2020. In future work, NREL will assess the market for decentralized, on-site generation; develop cost credits for distributed generation benefits (such as deferral of transmission and distribution investments and uninterruptible power service), for by-products such as heat and potable water, and for environmental benefits (reduction of criteria air pollutants and greenhouse gas emissions); compete different fuel cell technologies against each other for market share; and begin to address economic benefits, especially employment.

  7. EFFECT OF IOA AND R-6G ON VIABILITY AND REGENERATIVE CAPACITY OF PROTOPLASTS FROM THREE LEGUME FORAGES

    Institute of Scientific and Technical Information of China (English)

    李玉珠; 师尚礼; 陶茸

    2012-01-01

    A study was conducted to investigate correlations between the viability and regenerative capacity of protoplasts isolated from callus of three legume forages (Lotus corniculatus L. cv. Leon, Melilotoides ruthenica (L.) Sojak and Medicago sativa L. cv. Qingshui) after treatment with the metabolic complement inhibitors iodoacetamide (IOA) and rhodamine 6G (R-6G). Results showed that the viability and plating rate of protoplasts from the three species were markedly reduced by 10-min treatments with IOA (3-10 mmol/L) and R-6G (40-70 μg/ml), and both were significantly (p<0.01) negatively correlated with inhibitor concentration. All treated protoplasts regenerated small aggregated cell clusters by day 7 of culture; after 30 to 40 days of culture at the critical inhibitor concentrations, the protoplasts lost the ability to form calli. IOA inhibited protoplast viability more strongly, while R-6G affected the plating rate more markedly and caused the protoplasts to emit red fluorescence, which aids identification of parental protoplasts during cell fusion. The suitable inhibitors and critical concentrations for the three legume forages were: 5 mmol/L IOA or 50 μg/ml R-6G for L. corniculatus; 7 mmol/L IOA or 70 μg/ml R-6G for M. ruthenica; and 3 mmol/L IOA or 40 μg/ml R-6G for M. sativa.

  8. Using miscue analysis to assess comprehension in deaf college readers.

    Science.gov (United States)

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading difficulties and for planning instruction. To our knowledge, miscue analysis has not been used with older, college-age deaf students who might also be having difficulty decoding and understanding text at the word level. The goal of this study was to determine whether such an analysis would be helpful in identifying the source of college students' reading comprehension difficulties. After analyzing the miscues of 10 college-age readers and the results of other comprehension-related tasks, we concluded that comprehension of basic grade school-level passages depended on the ability to recognize and comprehend key words and phrases in these texts. We also concluded that these diagnostic procedures provided useful information about the reading abilities and strategies of each reader that had implications for designing more effective interventions.

  9. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  10. Time-dependent reliability analysis and condition assessment of structures

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)

    1997-01-01

    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.
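A minimal Monte Carlo illustration of the time-dependent reliability idea: initial strength is sampled from a lognormal distribution, degrades linearly in time, and is compared against Gumbel-distributed annual maximum loads. All distributions and parameter values here are invented for illustration, not drawn from the cited work:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000     # simulated structures
years = 40     # service life considered

# Assumed lognormal initial resistance and uncertain yearly degradation rate
R0 = rng.lognormal(mean=np.log(100), sigma=0.1, size=n)
rate = rng.uniform(0.002, 0.006, size=n)          # fractional strength loss per year

# Assumed Gumbel annual-maximum load effect
loads = rng.gumbel(loc=50, scale=5, size=(years, n))

# Resistance trajectory per structure, year by year
resistance = R0[None, :] * (1 - rate[None, :] * np.arange(1, years + 1)[:, None])

failed = (loads >= resistance).any(axis=0)        # failure in any service year
pf = failed.mean()                                # cumulative failure probability
```

The time dependence enters through the shrinking resistance: the same annual load distribution becomes progressively more likely to exceed capacity as the structure ages.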

  11. The October 1973 NASA mission model analysis and economic assessment

    Science.gov (United States)

    1974-01-01

    Results are presented of the 1973 NASA Mission Model Analysis. The purpose was to obtain an economic assessment of using the Shuttle to accommodate the payloads and requirements as identified by the NASA Program Offices and the DoD. The 1973 Payload Model represents a baseline candidate set of future payloads which can be used as a reference base for planning purposes. The cost of implementing these payload programs utilizing the capabilities of the shuttle system is analyzed and compared with the cost of conducting the same payload effort using expendable launch vehicles. There is a net benefit of 14.1 billion dollars as a result of using the shuttle during the 12-year period as compared to using an expendable launch vehicle fleet.

  12. Origin assessment of EV olive oils by esterified sterols analysis.

    Science.gov (United States)

    Giacalone, Rosa; Giuliano, Salvatore; Gulotta, Eleonora; Monfreda, Maria; Presti, Giovanni

    2015-12-01

    In this study, extra virgin olive oils of Italian and non-Italian origin (from Spain, Tunisia and blends of EU origin) were differentiated by GC-FID analysis of sterols and esterified sterols followed by chemometric tools. PCA highlighted the high significance of esterified sterols for characterising extra virgin olive oils in relation to their origin. SIMCA provided a sensitivity and specificity of 94.39% and 91.59%, respectively; furthermore, an external set of 54 extra virgin olive oils bearing a designation of Italian origin on the labelling was tested by SIMCA. Prediction results were also compared with organoleptic assessment. Finally, the poor correlation found between ethyl esters and esterified sterols suggests, pending further investigation, that esterified sterols may prove promising in studies of geographical discrimination: they appear to be independent of the factors causing the formation of ethyl esters and related to olive oil production.
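As a sketch of the chemometric step, the following runs a PCA (by SVD on the mean-centred data matrix) on synthetic "sterol profiles" of two hypothetical origin groups. The data are invented, and the paper's SIMCA class-modelling step is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical sterol concentrations (columns = compounds) for two origins
italy = rng.normal([10.0, 5.0, 2.0, 1.0], 0.3, size=(30, 4))
other = rng.normal([8.0, 6.0, 3.0, 1.5], 0.3, size=(30, 4))
X = np.vstack([italy, other])

# PCA via SVD of the mean-centred matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                       # sample coordinates on the principal components
explained = s**2 / np.sum(s**2)          # fraction of variance per component

# With well-separated groups, PC1 captures the origin difference
pc1_italy, pc1_other = scores[:30, 0], scores[30:, 0]
```

A score plot of PC1 vs PC2 would show two clusters, which is the kind of separation the paper reports for esterified sterols.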

  13. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience on the propagation of uncertainties along the flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate throughout the process, from inundation studies through to risk analysis, and how much a proper flood risk analysis may vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic form. To account for the total uncertainty and to understand which factors contribute most to it, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying this method are more robust than those of traditional analyses.
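A brute-force Monte Carlo sketch of uncertainty propagation through a deliberately simplified rainfall-to-damage chain (every distribution, coefficient and the depth-damage curve below are invented for illustration; PCT would instead build a polynomial surrogate of the same chain):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Assumed input uncertainties in a toy flood-risk chain
rainfall = rng.normal(100, 15, n)                 # design rainfall depth (mm)
runoff_c = rng.uniform(0.4, 0.6, n)               # runoff coefficient

# Crude hydrologic/hydraulic step: rainfall -> flood depth (m)
depth = 0.01 * runoff_c * rainfall

# Crude damage step: quadratic depth-damage curve, capped at full loss
damage = 1e6 * np.clip(depth / 2.0, 0.0, 1.0) ** 2

mean_d, std_d = damage.mean(), damage.std()
cv = std_d / mean_d                               # propagated relative uncertainty
```

Note how modest input uncertainties (15% on rainfall, about 12% on the runoff coefficient) amplify through the nonlinear damage curve into a much larger relative uncertainty on damage, which is exactly the propagation effect the abstract is concerned with.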

  14. Assessing farming eco-efficiency: a Data Envelopment Analysis approach.

    Science.gov (United States)

    Picazo-Tadeo, Andrés J; Gómez-Limón, José A; Reig-Martínez, Ernest

    2011-04-01

    This paper assesses farming eco-efficiency using Data Envelopment Analysis (DEA) techniques. Eco-efficiency scores at both farm and environmental pressure-specific levels are computed for a sample of Spanish farmers operating in the rain-fed agricultural system of Campos County. The determinants of eco-efficiency are then studied using truncated regression and bootstrapping techniques. We contribute to previous literature in this field of research by including information on slacks in the assessment of the potential environmental pressure reductions in a DEA framework. Our results reveal that farmers are quite eco-inefficient, with very few differences emerging among specific environmental pressures. Moreover, eco-inefficiency is closely related to technical inefficiencies in the management of inputs. Regarding the determinants of eco-efficiency, farmers benefiting from agri-environmental programs as well as those with university education are found to be more eco-efficient. Concerning the policy implications of these results, public expenditure in agricultural extension and farmer training could be of some help to promote integration between farming and the environment. Furthermore, Common Agricultural Policy agri-environmental programs are an effective policy to improve eco-efficiency, although some doubts arise regarding their cost-benefit balance.
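As a sketch of the computation behind a DEA efficiency score, the following solves the standard input-oriented CCR linear program with scipy.optimize.linprog for a few hypothetical farms. The farm data and variable names are invented, and the paper's actual model (environmental pressures as inputs, slack information, truncated regression) is richer than this:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of unit o.

    X: (n, m) inputs, Y: (n, p) outputs. Decision variables: [theta, lambda_1..lambda_n].
    Minimise theta s.t. X^T lambda <= theta * x_o, Y^T lambda >= y_o, lambda >= 0.
    """
    n, m = X.shape
    p = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                       # minimise theta
    A_in = np.hstack([-X[o][:, None], X.T])          # inputs: X^T lam - theta*x_o <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((p, 1)), -Y.T])      # outputs: -Y^T lam <= -y_o
    b_out = -Y[o]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Hypothetical farms: inputs (fertiliser, water), single output (yield)
X = np.array([[2.0, 3.0],
              [4.0, 2.0],
              [4.0, 6.0]])   # farm 2 uses twice farm 0's inputs for the same yield
Y = np.array([[10.0], [10.0], [10.0]])
eff = [dea_ccr_input(X, Y, o) for o in range(len(X))]
```

Here the first two farms lie on the efficient frontier (score 1.0), while the third could radially shrink its inputs by half (score 0.5), which is the kind of potential pressure reduction the paper quantifies.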

  15. Cyber threat impact assessment and analysis for space vehicle architectures

    Science.gov (United States)

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.

    2014-06-01

    This paper covers research into an assessment of potential impacts and of techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers and/or supply chain threats, can be controlled in a manner that causes physical damage to the space platforms. Comparable attacks on Earth-based cyber-physical systems include the Shamoon, Duqu, Flame and Stuxnet exploits, which have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools such as STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model that has been developed for performing the impact assessment, and walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.

  16. Analysis of environmental impact assessment (EIA) system in Turkey.

    Science.gov (United States)

    Coşkun, Aynur Aydın; Turker, Ozhan

    2011-04-01

    The Environmental Impact Assessment (EIA) system, which embodies the "prevention principle" of environmental law, is an important tool for environmental protection. This tool has particular importance for Turkey, a developing country, and it entered Turkish law in 1983 with the Environmental Law. The EIA Regulation, which sets out the application principles, became effective in 1993. Because Turkey is a candidate for the European Union (EU), the EIA Regulation has been revised in line with the EU compliance procedure, and its latest version became valid in 2008. This study examines the EIA system in Turkey in order to evaluate the efficiency of the procedure and its level of success. In the introduction, the general EIA concept, its importance, and some basic notions are discussed. Following that, the legislation that builds the EIA system is analyzed, starting from the 1982 Turkish Constitution, and the legislative rules are explained in terms of the basic steps of the EIA procedure. To shed light on the application in practice, the final EIA decisions issued to date, their results, and their distribution across industries are assessed. In the final part of the study, a SWOT analysis identifies the weaknesses, strengths, opportunities, and threats of the EIA system in Turkey.

  17. Bridge health assessment system with fatigue analysis algorithm

    Science.gov (United States)

    Wang, Xuan; Wang, M. L.; Zhao, Yang

    2005-05-01

    A modern bridge is so complicated a system that it is difficult to analyze with conventional mathematical tools. Rational bridge monitoring requires good knowledge of the actual condition of the various structural components. Fatigue analysis of concrete bridges is one of the most important problems: concrete bridges often undergo fatigue deterioration, starting with cracking and ending with large holes through the web. There is a need for an efficient health assessment system for fatigue evaluation and prediction of the remaining life. This information has clear economic consequences, as deficient bridges must be repaired or closed. The goal of this research is to provide a practical expert system for bridge health evaluation and to improve the understanding of bridge behavior in service. Efforts to develop a functional bridge monitoring system have mainly been concentrated upon successful implementation of experience-based machine learning. The reliability of the techniques adopted for damage assessment is also important for bridge monitoring systems. By applying the system to an in-service PC bridge, it has been verified that this fuzzy logic expert system is effective and reliable for bridge health evaluation.
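
A minimal sketch of the kind of fuzzy-logic scoring such an expert system might apply (the membership ranges, rules, and output scores below are hypothetical, not those of the deployed system):

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def bridge_condition(crack_mm, deflection_mm):
    """Tiny Mamdani-style fuzzy rating on a 0-10 health scale
    (10 = sound). Rules and ranges are illustrative assumptions."""
    crack_severe = tri(crack_mm, 0.5, 2.0, 10.0)
    crack_minor = tri(crack_mm, -1.0, 0.0, 1.0)
    defl_high = tri(deflection_mm, 10.0, 30.0, 100.0)
    defl_low = tri(deflection_mm, -1.0, 0.0, 20.0)
    # rule firing strengths mapped to singleton scores, then a
    # weighted average defuzzifies the result
    rules = [
        (min(crack_minor, defl_low), 9.0),    # sound
        (max(crack_severe, defl_high), 3.0),  # deteriorated
    ]
    num = sum(w * s for w, s in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 5.0
```

A real system would add many more input variables (strain histories, traffic loads) and rules, but the inference pattern is the same.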

  18. Assessing computer waste generation in Chile using material flow analysis.

    Science.gov (United States)

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
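
The core of a material flow analysis of this kind is a delay model: units sold in year t become waste k years later according to a lifetime distribution. A minimal sketch (the sales figures and lifetime probabilities in the test are placeholders, not the Chilean data):

```python
def waste_from_sales(sales, lifetime_dist):
    """Delay model for e-waste generation: units sold in year t become
    waste in year t+k with probability lifetime_dist[k].
    Returns generated waste per year over the extended horizon."""
    horizon = len(sales) + len(lifetime_dist)
    waste = [0.0] * horizon
    for t, units in enumerate(sales):
        for k, p in enumerate(lifetime_dist):
            waste[t + k] += units * p
    return waste
```

When the lifetime distribution sums to one, total waste equals total sales, which is a useful mass-balance check; storage and re-use can be modelled by stretching the distribution's tail.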

  19. Integrating multicriteria evaluation and stakeholders analysis for assessing hydropower projects

    International Nuclear Information System (INIS)

    The use of hydroelectric potential and the protection of the river ecosystem are two contrasting aspects that arise in the management of the same resource, generating conflicts between different stakeholders. The purpose of the paper is to develop a multi-level decision-making tool able to support energy planning, with specific reference to the construction of hydropower plants in mountain areas. Starting from a real-world problem concerning the basin of the Sesia Valley (Italy), an evaluation framework based on the combined use of Multicriteria Evaluation and Stakeholders Analysis is proposed in the study. The results of the work show that the methodology is able to support participatory decisions through a traceable and transparent multi-stakeholder assessment process, to highlight the important elements of the decision problem and to support the definition of future design guidelines. - Highlights: • The paper concerns a multi-level decision-making tool able to support energy planning. • The evaluation framework is based on the use of AHP and Stakeholders Analysis. • Hydropower projects in the Sesia Valley (Italy) are evaluated and ranked in the study. • Environmental, economic, technical and sociopolitical criteria have been considered. • 42 stakeholder groups have been included in the evaluation
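
Since the highlights indicate the Multicriteria Evaluation uses AHP, criterion weights are derived from a pairwise-comparison matrix. A common approximation (one of several; the eigenvector method is the classical alternative) is the row geometric-mean method:

```python
import math

def ahp_weights(M):
    """Approximate AHP priority vector from a reciprocal
    pairwise-comparison matrix using row geometric means."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]
```

For a two-criterion matrix where the first criterion is judged three times as important as the second, this yields weights of 0.75 and 0.25; stakeholder-specific weight sets can then be compared across the 42 groups.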

  20. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on counts of inhabitants or on inhabitant density, applied over statistical or administrative terrain units such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units where rural occupation predominates. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on Census data. These drawbacks are far from ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of Census data and therefore give a more realistic representation of the population distribution. This work therefore aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population per building). The study is developed in the Region North of Lisbon using 2011 population data and following three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI - the basic statistical unit in the Portuguese Census); and iii) dasymetric population cartography based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs more than two times depending on the application of the traditional approach or the dasymetric
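
The building-based areal weighting step can be sketched in a few lines: a census unit's population is redistributed over its buildings in proportion to footprint area (the population and areas below are placeholders):

```python
def dasymetric_population(unit_pop, building_areas):
    """Areal-weighting dasymetric step: distribute a census unit's
    population over its buildings in proportion to footprint area."""
    total = sum(building_areas)
    return [unit_pop * a / total for a in building_areas]
```

The per-building populations can then be intersected with the landslide susceptibility map at its native resolution instead of averaging exposure over a whole parish.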

  1. Imminent Cardiac Risk Assessment via Optical Intravascular Biochemical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, D.; Wetzel, L; Wetzel, M; Lodder, R

    2009-01-01

    Heart disease is by far the biggest killer in the United States, and type II diabetes, which affects 8% of the U.S. population, is on the rise. In many cases, the acute coronary syndrome and/or sudden cardiac death occurs without warning. Atherosclerosis has known behavioral, genetic and dietary risk factors. However, our laboratory studies with animal models and human post-mortem tissue using FT-IR microspectroscopy reveal the chemical microstructure within arteries and in the arterial walls themselves. These include spectra obtained from the aortas of ApoE-/- knockout mice on sucrose and normal diets showing lipid deposition in the former case. Also pre-aneurysm chemical images of knockout mouse aorta walls, and spectra of plaque excised from a living human patient are shown for comparison. In keeping with the theme of the SPEC 2008 conference, Spectroscopic Diagnosis of Disease, this paper describes the background and potential value of a new catheter-based system to provide in vivo biochemical analysis of plaque in human coronary arteries. We report the following: (1) results of FT-IR microspectroscopy on animal models of vascular disease to illustrate the localized chemical distinctions between pathological and normal tissue, (2) current diagnostic techniques used for risk assessment of patients with potential unstable coronary syndromes, and (3) the advantages and limitations of each of these techniques illustrated with patient care histories, related in the first person, by the physician coauthors. Note that the physician comments clarify the contribution of each diagnostic technique to imminent cardiac risk assessment in a clinical setting, leading to the appreciation of what localized intravascular chemical analysis can contribute as an add-on diagnostic tool. The quality of medical imaging has improved dramatically since the turn of the century. 
Among clinical non-invasive diagnostic tools, laboratory tests of body fluids, EKG, and physical examination are

  2. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    Science.gov (United States)

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.

    2016-04-01

    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities and extensive land drainage practices significantly change hydromorphological behaviour and alter sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics but typically on a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data and qualitative hysteresis metrics enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km2) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, and are broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable and high-permeability supporting arable. Suspended sediment concentration (using calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), numeric hysteresis index, and potential hydro-meteorological controls on sediment transport including precipitation amount, duration, intensity, stream flow and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index. Results showed the numeric
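
One simple way to compute a numeric hysteresis index of the kind used in such event databases (this particular formulation is an illustrative assumption, not necessarily the study's exact metric) is to normalise both series, split the event at peak discharge, and compare interpolated sediment concentrations on the rising versus falling limb at mid discharge:

```python
def hysteresis_index(q, ssc):
    """Simple loop metric for one event: normalise discharge (q) and
    suspended sediment concentration (ssc) to [0, 1], split at peak
    discharge, and difference the interpolated SSC of the rising and
    falling limbs at mid discharge. Positive => clockwise loop."""
    qmin, qmax = min(q), max(q)
    smin, smax = min(ssc), max(ssc)
    qn = [(v - qmin) / (qmax - qmin) for v in q]
    sn = [(v - smin) / (smax - smin) for v in ssc]
    peak = qn.index(max(qn))

    def interp(xs, ys, x0):
        pts = sorted(zip(xs, ys))
        for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
            if x1 <= x0 <= x2:
                return y1 + (y2 - y1) * (x0 - x1) / (x2 - x1 or 1.0)
        return pts[-1][1]

    rise = interp(qn[:peak + 1], sn[:peak + 1], 0.5)
    fall = interp(qn[peak:], sn[peak:], 0.5)
    return rise - fall
```

A clockwise loop (sediment peaking before discharge, typical of near-channel sources) gives a positive index; an anti-clockwise loop (distal or delayed sources) gives a negative one, so tracking the index over many events reveals seasonal shifts in connectivity.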

  3. Concepts of Causality in Psychopathology: Applications in Clinical Assessment, Clinical Case Formulation and Functional Analysis

    NARCIS (Netherlands)

    Haynes, S.H.; O'Brien, W.H.; Kaholokula, J.K.; Witteman, C.L.M.

    2012-01-01

    This paper discusses and integrates concepts of causality in psychopathology, clinical assessment, clinical case formulation and the functional analysis. We propose that identifying causal variables, relations and mechanisms in psychopathology and clinical assessment can lead to more powerful and e

  4. Analysis of existing risk assessments, and list of suggestions

    CERN Document Server

    Heimsch, Laura

    2016-01-01

    The scope of this project was to analyse risk assessments made at CERN, to extract key information about the methodologies used and the profiles of the people who perform the assessments, and to determine whether a risk matrix was used and whether an acceptable level of risk was defined. The second step of the project was to stimulate discussion within HSE about risk assessment by proposing a risk matrix and a risk assessment template.

  5. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  6. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    Science.gov (United States)

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  7. Flood Risk Analysis and Flood Potential Losses Assessment

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The heavy floods in the Taihu Basin have shown an increasing trend in recent years. In this work, a typical area in the northern Taihu Basin was selected for flood risk analysis and potential flood losses assessment. Human activities have a strong impact on the study area's flood situation (as affected by the polders built, deforestation, population increase, urbanization, etc.), and have made water levels higher, flood durations shorter, and flood peaks sharper. Five years of different flood return periods [(1970), 5 (1962), 10 (1987), 20 (1954), 50 (1991)] were used to calculate the potential flood risk area and its losses. The potential flood risk map, economic losses, and flood-impacted population were also calculated. The study's main conclusions are: 1) Human activities have strongly changed the natural flood situation in the study area, increasing runoff and flooding; 2) The flood risk area is closely related to the precipitation center; 3) Polder construction has successfully protected land from floods, shortened the flood duration, and elevated water levels in rivers outside the polders; 4) Economic and social development have caused flood losses to increase in recent years.
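
Losses estimated for a set of return periods are commonly combined into an expected annual damage by integrating damage over exceedance probability (p = 1/T). A minimal sketch with placeholder damage figures, not the study's values:

```python
def expected_annual_damage(return_periods, damages):
    """Estimate expected annual damage by trapezoidal integration of
    damage over exceedance probability p = 1/T."""
    pairs = sorted(zip((1.0 / t for t in return_periods), damages))
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d1 + d2) * (p2 - p1)
    return ead
```

Extending the integration toward p = 0 with an assumed extreme-event damage, or toward the no-damage threshold, refines the estimate; the trapezoids above cover only the surveyed return periods.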

  8. Soft Mathematical Aggregation in Safety Assessment and Decision Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, J. Arlin

    1999-06-10

    This paper improves on some of the limitations of conventional safety assessment and decision analysis methods. It develops a top-down mathematical method for expressing imprecise individual metrics as possibilistic or fuzzy numbers and shows how they may be combined (aggregated) into an overall metric, also portraying the inherent uncertainty. Both positively contributing and negatively contributing factors are included. Metrics are weighted according to significance of the attribute and evaluated as to contribution toward the attribute. Aggregation is performed using exponential combination of the metrics, since the accumulating effect of such factors responds less and less to additional factors. This is termed soft mathematical aggregation. Dependence among the contributing factors is accounted for by incorporating subjective metrics on overlap of the factors and by correspondingly reducing the overall contribution of these combinations to the overall aggregation. Decisions corresponding to the meaningfulness of the results are facilitated in several ways. First, the results are compared to a soft threshold provided by a sigmoid function. Second, information is provided on input "Importance" and "Sensitivity," in order to know where to place emphasis on controls that may be necessary. Third, trends in inputs and outputs are tracked in order to add important information to the decision process. The methodology has been implemented in software.
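
The exponential ("soft") aggregation and sigmoid threshold described above can be sketched as follows; the specific saturation form and threshold parameters are illustrative assumptions, not the paper's formulas:

```python
import math

def soft_aggregate(metrics, weights):
    """Exponential ('soft') combination of weighted metrics: each
    additional weighted factor contributes less as the aggregate
    saturates toward 1, mimicking diminishing marginal effect."""
    s = sum(w * m for m, w in zip(metrics, weights))
    return 1.0 - math.exp(-s)

def sigmoid_threshold(x, x0=0.5, k=10.0):
    """Soft pass/fail threshold on the aggregate score: values well
    above x0 approach 1, values well below approach 0."""
    return 1.0 / (1.0 + math.exp(-k * (x - x0)))
```

Adding a second unit-weight metric raises the aggregate by less than the first did, which is the diminishing-response behaviour the paper motivates; the sigmoid then grades the aggregate softly instead of applying a hard cutoff.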

  9. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    The aim of this thesis is to propose an approach to modelling the parameter and model uncertainties affecting the results of risk indicators used in applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory is proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step models input parameter uncertainties by belief and plausibility functions according to the data of the PRA model. The second step propagates parameter uncertainties through the risk model to lay out the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with the information needed for decision making under (parametric and model) uncertainty and secondly to identify the input parameters that contribute significant uncertainty to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)
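
The early steps (representing parameter uncertainty by belief and plausibility functions and combining evidence) rest on Dempster-Shafer theory. A minimal sketch of Dempster's rule of combination over frozenset focal elements, with toy mass functions in the test:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset focal
    elements to masses) with Dempster's rule, normalising out the
    conflicting mass."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def belief(m, hypothesis):
    """Sum of mass committed to subsets of the hypothesis."""
    return sum(v for k, v in m.items() if k <= hypothesis)

def plausibility(m, hypothesis):
    """Sum of mass not contradicting the hypothesis."""
    return sum(v for k, v in m.items() if k & hypothesis)
```

The [belief, plausibility] interval for each risk indicator is what separates this representation from a single probability value.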

  10. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Science.gov (United States)

    Khoshaim, Heba Bakr; Rashid, Saima

    2016-01-01

    Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic…

  11. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Andersson, Magnus; Sattari-Far, Iradj; Weilin Zang (Inspecta Technology AB, Stockholm (Sweden))

    2009-06-15

    The main purpose of this work is to investigate the significance of the residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures such as the ASME XI code and the R6-method is believed to be conservative for ductile materials. This is because of the general approach not to account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses in general, though, is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here. This is done by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied here by using the finite element method (FEM): (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe; (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe; (iii) 3D modelling of an X-joint weld in a thick-walled pipe. Each crack configuration is analysed for two load cases: (1) only primary (mechanical) loading is applied to the model; (2) both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed. 
A new
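
For context, fracture assessment procedures such as R6 compare an assessment point (Lr, Kr) against a failure assessment curve. A sketch using the general Option-1-style curve form (the material-dependent cut-off Lr_max is omitted, and the inputs in the test are placeholders, not values from this study):

```python
import math

def fad_option1(lr):
    """R6 Option-1 style failure assessment curve f(Lr); the
    material-dependent plastic-collapse cut-off is not applied here."""
    return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

def acceptable(k_i, k_mat, load, limit_load):
    """A defect is acceptable if the assessment point (Lr, Kr) lies
    inside the failure assessment diagram."""
    lr = load / limit_load   # proximity to plastic collapse
    kr = k_i / k_mat         # proximity to fracture
    return kr <= fad_option1(lr)
```

Residual stresses enter such an assessment through the stress intensity factor Kr; the study's point is that for highly ductile, limit-load-controlled cases their contribution to the driving force diminishes.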


  13. Assessment of academic departments efficiency using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Salah R. Agha

    2011-07-01

    Purpose: In this age of knowledge economy, universities play an important role in the development of a country. As government subsidies to universities have been decreasing, more efficient use of resources becomes important for university administrators. This study evaluates the relative technical efficiencies of academic departments at the Islamic University in Gaza (IUG) during the years 2004-2006. Design/methodology/approach: This study applies Data Envelopment Analysis (DEA) to assess the relative technical efficiency of the academic departments. The inputs are operating expenses, credit hours and training resources, while the outputs are number of graduates, promotions and public service activities. The potential improvements and super efficiency are computed for inefficient and efficient departments respectively. Further, multiple linear regression is used to develop a relationship between super efficiency and input and output variables. Findings: Results show that the average efficiency score is 68.5% and that there are 10 efficient departments out of the 30 studied. It is noted that departments in the faculty of science, engineering and information technology have to greatly reduce their laboratory expenses. The department of economics and finance was found to have the highest super efficiency score among the efficient departments. Finally, it was found that promotions have the greatest contribution to the super efficiency scores while public services activities come next. Research limitations/implications: The paper focuses only on academic departments at a single university. Further, DEA is deterministic in nature. Practical implications: The findings offer insights on the inputs and outputs that significantly contribute to efficiencies so that inefficient departments can focus on these factors. Originality/value: Prior studies have used only one type of DEA (BCC) and they did not explicitly answer the question posed by the inefficient
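
Full CCR DEA requires solving a linear program per department; in the degenerate single-input, single-output case it reduces to scaling each unit's output/input ratio by the best observed ratio, which is enough to illustrate the idea (the figures in the test are placeholders):

```python
def dea_ratio_efficiency(inputs, outputs):
    """Single-input, single-output special case of CCR DEA: each
    decision-making unit's efficiency is its output/input ratio
    relative to the best ratio. The multi-factor case instead solves
    one linear program per unit to choose the most favourable weights."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

Units scoring 1.0 define the efficient frontier; the gap to 1.0 for the others corresponds to the "potential improvements" reported in the study.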

  14. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V.

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important precondition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by linkage structures based on process knowledge, can be generated and also processed in connection with knowledge on types of problems, areas of analysis and procedures for dealing with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge, and the correspondingly processed complexity of models, are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  15. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  16. Life Cycle Assessment Software for Product and Process Sustainability Analysis

    Science.gov (United States)

    Vervaeke, Marina

    2012-01-01

    In recent years, life cycle assessment (LCA), a methodology for assessment of environmental impacts of products and services, has become increasingly important. This methodology is applied by decision makers in industry and policy, product developers, environmental managers, and other non-LCA specialists working on environmental issues in a wide…

  17. Defining and Assessing Public Health Functions: A Global Analysis.

    Science.gov (United States)

    Martin-Moreno, Jose M; Harris, Meggan; Jakubowski, Elke; Kluge, Hans

    2016-01-01

    Given the broad scope and intersectoral nature of public health structures and practices, there are inherent difficulties in defining which services fall under the public health remit and in assessing their capacity and performance. The aim of this study is to analyze how public health functions and practice have been defined and operationalized in different countries and regions around the world, with a specific focus on assessment tools that have been developed to evaluate the performance of essential public health functions, services, and operations. Our review has identified nearly 100 countries that have carried out assessments, using diverse analytical and methodological approaches. The assessment processes have evolved quite differently according to administrative arrangements and resource availability, but some key contextual factors emerge that seem to favor policy-oriented follow-up. These include local ownership of the assessment process, policymakers' commitment to reform, and expert technical advice for implementation. PMID:26789385

  18. Coronary plaque composition as assessed by greyscale intravascular ultrasound and radiofrequency spectral data analysis

    NARCIS (Netherlands)

    N. Gonzalo (Nieves); H.M. Garcia-Garcia (Hector); J.M.R. Ligthart (Jürgen); G.A. Rodriguez-Granillo (Gaston); E. Meliga (Emanuele); Y. Onuma (Yoshinobu); J.C.H. Schuurbiers (Johan); N. Bruining (Nico); P.W.J.C. Serruys (Patrick)

    2008-01-01

    textabstractObjectives: (i) To explore the relation between greyscale intravascular ultrasound (IVUS) plaque qualitative classification and IVUS radiofrequency data (RFD) analysis tissue types; (ii) to evaluate if plaque composition as assessed by RFD analysis can be predicted by visual assessment o

  19. Assessing the validity of discourse analysis: transdisciplinary convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  20. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission-accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
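The systematic-ranging scan described in this record can be illustrated with a minimal sketch: weigh each point of a (range, range-rate) grid and normalize into a distribution over possible orbits. The grid bounds, resolutions, and the toy scoring function are assumptions for illustration, not Scout's actual orbit-fitting code.

```python
import math

def systematic_ranging(score_fn, r_min=1e-3, r_max=10.0, n_r=40,
                       rdot_max=50.0, n_rdot=40):
    """Scan a grid in topocentric range (au) and range rate (km/s).

    score_fn(r, rdot) must return a non-negative likelihood-like weight
    for the orbit implied by that (range, range-rate) pair; here it is a
    caller-supplied toy, not a real fit to astrometric observations.
    Returns the grid points with normalized weights.
    """
    grid, total = [], 0.0
    for i in range(n_r):
        # log-spaced range samples: near-Earth distances need fine resolution
        r = r_min * (r_max / r_min) ** (i / (n_r - 1))
        for j in range(n_rdot):
            rdot = -rdot_max + 2 * rdot_max * j / (n_rdot - 1)
            w = score_fn(r, rdot)
            grid.append((r, rdot, w))
            total += w
    return [(r, rdot, w / total) for r, rdot, w in grid]

# Toy score favoring small ranges and small radial velocities.
cloud = systematic_ranging(lambda r, rdot: math.exp(-r) * math.exp(-(rdot / 10.0) ** 2))
# Probability mass inside 0.01 au acts as a crude "close object" score.
p_close = sum(w for r, _, w in cloud if r < 0.01)
```

Ranking NEOCP objects then amounts to comparing such derived probabilities (impact likelihood, NEO score, and so on) across objects.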

  1. Assessing the Validity of Discourse Analysis: Transdisciplinary Convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-01-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to…

  2. Radiological assessment. A textbook on environmental dose analysis

    International Nuclear Information System (INIS)

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides

  3. Radiological assessment. A textbook on environmental dose analysis

    Energy Technology Data Exchange (ETDEWEB)

    Till, J.E.; Meyer, H.R. (eds.)

    1983-09-01

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Up to this time there has been available no comprehensive book describing, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and Medical and Industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man; a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  4. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte;

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a significant role in this assessment and different models have been created for it, but a representation which includes all of them has not been developed yet. This paper deals with this issue. First, a list of nine influencing Factors is presented and discussed. Secondly, these Factors are included in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned Factor influences the assessment, and why and when they should be included in the model.
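The sequential Monte Carlo evaluation mentioned above can be sketched with a simple two-state (up/down) turbine model stepped chronologically through a year. The turbine count, the failure and repair times (`mttf`, `mttr`), and the hourly step are invented for illustration; none of the paper's nine influencing Factors (wind speed correlation, wake effects, etc.) are modeled here.

```python
import random

def simulate_windfarm_availability(n_turbines=10, hours=8760,
                                   mttf=1900.0, mttr=80.0, seed=1):
    """Sequential (chronological) Monte Carlo of a windfarm's available
    capacity using an exponential two-state up/down model per turbine.
    Returns the hourly count of available turbines for one simulated year."""
    rng = random.Random(seed)
    lam, mu = 1.0 / mttf, 1.0 / mttr   # hourly failure / repair rates
    up = [True] * n_turbines
    available = []
    for _ in range(hours):
        for k in range(n_turbines):
            if up[k]:
                if rng.random() < lam:   # turbine fails this hour
                    up[k] = False
            elif rng.random() < mu:      # repair completes this hour
                up[k] = True
        available.append(sum(up))
    return available

avail = simulate_windfarm_availability()
mean_avail = sum(avail) / len(avail)   # expected turbines on line
```

In a full reliability study the hourly availability series would be combined with a wind speed time series and load model to estimate indices such as loss-of-load expectation.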

  5. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.;

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays a significant role in this assessment and different models have been created for it, but a representation which includes all of them has not been developed yet. This paper deals with this issue. First, a list of nine influencing Factors is presented and discussed. Secondly, these Factors are included in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned Factor influences the assessment, and why and when they should be included in the model.

  6. Fuzzy sensitivity analysis for reliability assessment of building structures

    Science.gov (United States)

    Kala, Zdeněk

    2016-06-01

    The mathematical concept of fuzzy sensitivity analysis, which studies the effects of the fuzziness of input fuzzy numbers on the fuzziness of the output fuzzy number, is described in the article. The output fuzzy number is evaluated using Zadeh's general extension principle. The contribution of stochastic and fuzzy uncertainty in reliability analysis tasks of building structures is discussed. The algorithm of fuzzy sensitivity analysis is an alternative to stochastic sensitivity analysis in tasks in which input and output variables are considered as fuzzy numbers.
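Zadeh's general extension principle can be approximated numerically by propagating α-cut intervals of the input fuzzy numbers through the response function. The triangular fuzzy numbers and the example function below are illustrative assumptions, not the article's actual structural reliability model.

```python
def alpha_cut(tri, alpha):
    """alpha-cut interval of a triangular fuzzy number (a, m, b)."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def propagate(f, fuzzy_inputs, n_levels=11, n_grid=20):
    """Output alpha-cuts of f under the extension principle, approximated
    by sampling each input alpha-cut interval on a grid (adequate for
    continuous f; exact here because the example f is monotone)."""
    cuts = []
    for i in range(n_levels):
        alpha = i / (n_levels - 1)
        intervals = [alpha_cut(t, alpha) for t in fuzzy_inputs]
        lo, hi = None, None

        def scan(idx, point):
            # brute-force the Cartesian grid over all input intervals
            nonlocal lo, hi
            if idx == len(intervals):
                y = f(point)
                lo = y if lo is None else min(lo, y)
                hi = y if hi is None else max(hi, y)
                return
            a, b = intervals[idx]
            for j in range(n_grid):
                scan(idx + 1, point + [a + (b - a) * j / (n_grid - 1)])

        scan(0, [])
        cuts.append((alpha, lo, hi))
    return cuts

# Hypothetical bending resistance R = f_y * W with fuzzy yield stress
# (MPa) and fuzzy section modulus factor.
cuts = propagate(lambda x: x[0] * x[1],
                 [(200.0, 235.0, 270.0), (0.9, 1.0, 1.1)])
```

The width of the output interval at each α level, compared as each input's fuzziness is varied in turn, gives the fuzzy sensitivity measure the abstract describes.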

  7. RHETORICAL STRUCTURE ANALYSIS FOR ASSESSING COLLABORATIVE PROCESSES IN CSCL

    Directory of Open Access Journals (Sweden)

    Mohammad Hamad Allaymoun

    2015-12-01

    Full Text Available This paper presents research on using rhetorical structures for assessing collaborative processes in Computer-Supported Collaborative Learning (CSCL) chats. For this purpose, the ideas of Bakhtin's dialogism theory and Trausan-Matu's polyphonic model are used, starting from the identification of threads of repeated words in chats. Cue phrases and their usage in linking the identified threads are also considered. The results are presented in statistical tables and graphics that ease the understanding of the collaborative process, helping teachers to analyze and assess students' collaborative chats. It also allows students to know and understand the interactions and how they contribute to the conversation.

  8. No-Reference Video Quality Assessment using MPEG Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2013-01-01

    and estimate the video coding parameters for MPEG-2 and H.264/AVC which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods for MPEG-2 and H.264/AVC.

  9. Methods of Assessing Replicability in Canonical Correlation Analysis (CCA).

    Science.gov (United States)

    King, Jason E.

    Theoretical hypotheses generated from data analysis of a single sample should not be advanced until the replicability issue is treated. At least one of three questions usually arises when evaluating the invariance of results obtained from a canonical correlation analysis (CCA): (1) "Will an effect occur in subsequent studies?"; (2) "Will the size…

  10. Environmental Impact Assessment for Socio-Economic Analysis of Chemicals

    DEFF Research Database (Denmark)

    Calow, Peter; Biddinger, G; Hennes, C;

    This report describes the requirements for, and illustrates the application of, a methodology for a socio-economic analysis (SEA), especially as it might be adopted in the framework of REACH.

  11. Assessing SRI fund performance research : best practices in empirical analysis

    NARCIS (Netherlands)

    Chegut, Andrea; Schenk, H.; Scholtens, B.

    2011-01-01

    We review the socially responsible investment (SRI) mutual fund performance literature to provide best practices in SRI performance attribution analysis. Based on meta-ethnography and content analysis, five themes in this literature require specific attention: data quality, social responsibility ver

  12. ASSESSMENT OF REGIONAL EFFICIENCY IN CROATIA USING DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Danijela Rabar

    2013-02-01

    Full Text Available In this paper, the regional efficiency of Croatian counties is measured over a three-year period (2005-2007) using Data Envelopment Analysis (DEA). The set of inputs and outputs consists of seven socioeconomic indicators. The analysis is carried out using models with the assumption of variable returns-to-scale. DEA identifies efficient counties as benchmark members and inefficient counties, which are analyzed in detail to determine the sources and the amounts of their inefficiency. To enable proper monitoring of development dynamics, window analysis is applied. Based on the results, guidelines for implementing the improvements necessary to achieve efficiency are given. The analysis reveals great disparities among counties. In order to alleviate the naturally, historically and politically conditioned unequal positions of counties, over which economic policy makers do not have total control, a categorical approach is introduced as an extension to the basic DEA models. This approach, combined with window analysis, changes the relations among efficiency scores in favor of continental counties.

  13. Using Empirical Article Analysis to Assess Research Methods Courses

    Science.gov (United States)

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  14. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  15. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    Science.gov (United States)

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
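The parametric-bootstrap logic underlying the proposed test can be sketched generically: simulate data from the fitted null model, recompute the statistic on each replicate, and compare with the observed value. In the MPA setting the statistic would come from a nonlinear factor analysis of item responses simulated under a unidimensional IRT model; here both the null model and the statistic are toy stand-ins.

```python
import random

def parametric_bootstrap_pvalue(observed_stat, simulate_null, statistic,
                                n_boot=500, seed=7):
    """Generic parametric bootstrap: the p-value is the share of
    null-simulated statistics at least as extreme as the observed one."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_boot):
        if statistic(simulate_null(rng)) >= observed_stat:
            exceed += 1
    return exceed / n_boot

# Toy example: is an observed sample variance of 0.09 larger than
# expected under a null of 100 i.i.d. uniform(0, 1) draws?
def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

p = parametric_bootstrap_pvalue(
    observed_stat=0.09,
    simulate_null=lambda rng: [rng.random() for _ in range(100)],
    statistic=var,
)
```

Swapping in the real null simulator and eigenvalue-based statistic turns this skeleton into a dimensionality test of the kind the article proposes.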

  16. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    International Nuclear Information System (INIS)

    A modeling study and analysis providing technical support for the safety and performance assessment of the low- and intermediate-level radioactive waste (LILW) repository, partially needed for the radiological environmental impact report that is essential for the construction and operation licenses of the LILW repository, has been carried out. Throughout this study, the essential areas of technical support for the safety and performance assessment of the LILW repository and its licensing have been addressed: gas generation and migration in and around the repository; risk analysis and environmental impact during transportation of LILW; biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure; and regional and global groundwater modeling and analysis

  17. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youn Myoung; Jung, Jong Tae; Kang, Chul Hyung (and others)

    2008-04-15

    A modeling study and analysis providing technical support for the safety and performance assessment of the low- and intermediate-level radioactive waste (LILW) repository, partially needed for the radiological environmental impact report that is essential for the construction and operation licenses of the LILW repository, has been carried out. Throughout this study, the essential areas of technical support for the safety and performance assessment of the LILW repository and its licensing have been addressed: gas generation and migration in and around the repository; risk analysis and environmental impact during transportation of LILW; biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure; and regional and global groundwater modeling and analysis.

  18. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process related risk faced by KOC and included root cause management system related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network with the scope divided into three major categories: (1) integrity management; (2) operations; (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess for bottlenecks, surge and erosion analysis, and to identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad and three-pronged approach to their overall risk assessment, KOC achieved a thorough, root cause analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  19. Review of assessment methods discount rate in investment analysis

    Directory of Open Access Journals (Sweden)

    Yamaletdinova Guzel Hamidullovna

    2011-08-01

    Full Text Available The article examines current methods of calculating the discount rate in investment analysis and business valuation, and analyzes the key problems of applying the various techniques in the context of the Russian economy.

  20. Transboundary diagnostic analysis. Vol. 2. Background and environmental assessment

    OpenAIRE

    2012-01-01

    The Transboundary Diagnostic Analysis (TDA) quantifies and ranks water-related environmental transboundary issues and their causes according to the severity of environmental and/or socio-economic impacts. The three main issues in BOBLME are: overexploitation of marine living resources; degradation of mangroves, coral reefs and seagrasses; pollution and water quality. Volume 2 contains background material that sets out the bio-physical and socio-economic characteristics of the BOBLME; an analysi...

  1. Analysis of online quizzes as a teaching and assessment tool

    Directory of Open Access Journals (Sweden)

    Lorenzo Salas-Morera

    2012-03-01

    Full Text Available This article deals with the integrated use of online quizzes as a teaching and assessment tool in the general program of the subject Proyectos in the third course of Ingeniero Técnico en Informática de Gestión over five consecutive years. The research undertaken aimed to test the effectiveness of quizzes on student performance when used not only as an isolated assessment tool but also when integrated into a combined strategy supporting the overall programming of the subject. The results obtained during the five years of experimentation show that online quizzes have a proven positive influence on students' academic performance. Furthermore, surveys conducted at the end of each course revealed the high value students accord to the use of online quizzes in course instruction.

  2. Assessment of Transport Projects: Risk Analysis and Decision Support

    DEFF Research Database (Denmark)

    Salling, Kim Bang

    2008-01-01

    . Even though vast amounts of money are spent upon preliminary models, environmental investigations, public hearings, etc., the resulting outcome is given by point estimates, i.e. in terms of net present values or benefit-cost rates. This thesis highlights the perspective of risks when assessing...... distributions. This selection process has been conducted among others by literature studies, conference and seminar attendances and substantial amount of tests within CBA-DK. Currently, the model is made up by five different distributions further divided into two groups of non-parametric and parametric...... functions. New research proved that specifically two impacts stood out in transport project assessment, namely, travel time savings and construction costs. The final concern of this study has been the fitting of distributions, e.g. by the use of data from major databases developed in which Optimism Bias...

  3. An assessment and analysis of dietary practices of Irish jockeys

    OpenAIRE

    O'Loughlin, Gillian

    2014-01-01

    Background: Horse racing is a weight category sport in which jockeys must chronically maintain a low body mass to compete, over a protracted season. The need to relentlessly align body mass with racing limits appears to encourage the use of short-term and potentially dangerous acute weight loss strategies. The purpose of this study was to investigate and assess the dietary habits of Irish Jockeys using established methods as well as incorporating novel sensing technologies. Methods: The ...

  4. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    OpenAIRE

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  5. Paediatric neuropsychological assessment: an analysis of parents' perspectives

    OpenAIRE

    Stark, Daniel; Thomas, Sophie; Dawson, Dave; Talbot, Emily; Bennett, Emily; Starza-Smith, Arleta

    2014-01-01

    Purpose: Modern healthcare services are commonly based on shared models of care, in which a strong emphasis is placed upon the views of those in receipt of services. The purpose of this paper is to examine the parents' experiences of their child's neuropsychological assessment. Design/methodology/approach: This was a mixed-methodology study employing both quantitative and qualitative measures. Findings: The questionnaire measure indicated a high overall level of satisfaction. Qualitative anal...

  6. Tiger Team Assessments seventeen through thirty-five: A summary and analysis

    International Nuclear Information System (INIS)

    This report provides a summary and analysis of the Department of Energy's (DOE's) 19 Tiger Team Assessments that were conducted from October 1990 to July 1992. The sites are listed in the box below, along with their respective program offices and assessment completion dates. This analysis relied solely on the information contained in the Tiger Team Assessment Reports. The findings and concerns documented by the Tiger Teams provide a database of information about the then-current ES&H programs and practices. Program Secretarial Officers (PSOs) and field managers may use this information, along with other sources (such as the Corrective Action Plans, Progress Assessments, and Self-Assessments), to address the ES&H deficiencies found, prioritize and plan appropriate corrective actions, measure progress toward solving the problems, strengthen and transfer knowledge about areas where site performance exemplified the ES&H mindset, and so forth. Further analyses may be suggested by the analysis presented in this report

  7. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    Science.gov (United States)

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures. PMID:27575259

  8. Quantitative assessment of human motion using video motion analysis

    Science.gov (United States)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  9. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Directory of Open Access Journals (Sweden)

    Heba Bakr Khoshaim

    2016-01-01

    Full Text Available Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic years 2013–2014 and 2014–2015. Using data from 206 students, the researchers analyzed 54 exam questions with regard to the complexity level, the difficulty coefficient and the discrimination coefficient. Findings indicated that the complexity level correlated with the difficulty coefficient for only one of three semesters. In addition, the correlation between the discrimination coefficient and the difficulty coefficient was found to be statistically significant in all three semesters. The results suggest that all three exams were acceptable; however, further attention should be given to the complexity level of questions used in mathematical tests, as moderate-difficulty questions are better at classifying students' performance.
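The difficulty and discrimination coefficients analyzed in this record can be computed as follows. This sketch uses the classical proportion-correct difficulty index and an upper-lower (27%) discrimination index; the authors' exact formulas may differ, and the example scores are invented.

```python
def difficulty(item_responses):
    """Classical difficulty index: proportion answering the item correctly."""
    return sum(item_responses) / len(item_responses)

def discrimination(totals, item_responses, frac=0.27):
    """Upper-lower discrimination index: item difficulty among the top
    27% of examinees (by total exam score) minus that among the bottom 27%."""
    order = sorted(range(len(totals)), key=lambda i: totals[i])
    k = max(1, int(len(totals) * frac))
    low = [item_responses[i] for i in order[:k]]
    high = [item_responses[i] for i in order[-k:]]
    return difficulty(high) - difficulty(low)

# Hypothetical data: 10 students' total exam scores and their 0/1
# results on one question.
totals = [55, 60, 62, 70, 71, 75, 80, 85, 90, 95]
item = [0, 0, 1, 0, 1, 1, 1, 1, 1, 1]
p = difficulty(item)               # 0.7: moderately easy item
d = discrimination(totals, item)   # positive: stronger students do better
```

Items with moderate difficulty (roughly 0.3-0.7) and high positive discrimination are the ones the study flags as best at separating student performance levels.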

  10. Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, B.; Fisher, C.; Zigler, G.; Clark, R.A. [Science and Engineering Associates, Inc., Albuquerque, NM (United States)

    1990-11-09

    FSARs. Rockwell International, as operating contractor at the Rocky Flats plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, one of them being the Building 707 Final Safety Analysis Report, June 87 (707FSAR) and a Plant Safety Analysis Report. Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85) documents the methodologies that were used for those FSARs. Resources available for preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort.

  11. Cost-effectiveness analysis: adding value to assessment of animal health, welfare and production.

    Science.gov (United States)

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been extensively used in economic assessments in fields related to animal health, namely in human health where it provides a decision-making framework for choices about the allocation of healthcare resources. Conversely, in animal health, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation stages to assess alternative courses of action in real time.
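The core CEA computation is the incremental cost-effectiveness ratio: extra cost per extra unit of natural effect, such as cases averted, rather than a monetary benefit as in cost-benefit analysis. The programme names and figures below are invented for illustration.

```python
def icer(cost_new, effect_new, cost_base, effect_base):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of natural effect (e.g. cases averted)."""
    d_effect = effect_new - effect_base
    if d_effect == 0:
        raise ValueError("no incremental effect; ICER is undefined")
    return (cost_new - cost_base) / d_effect

# Hypothetical animal-health surveillance programmes:
# annual cost (USD) and zoonotic cases averted.
baseline = {"cost": 120_000, "effect": 300}
option_a = {"cost": 150_000, "effect": 450}

ratio = icer(option_a["cost"], option_a["effect"],
             baseline["cost"], baseline["effect"])
# 30,000 extra dollars buys 150 extra cases averted: 200 USD per case.
```

Computed during implementation, such ratios let decision makers compare alternative courses of action in real time, as the abstract suggests.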

  12. Manage Stakeholders approach for analysis and risk assessment in the implementation of innovative projects

    OpenAIRE

    СУХОНОС, Марія Костянтинівна; Угоднікова, Олена Ігорівна

    2012-01-01

The problem of innovation project risk management, notably Manage Stakeholders risks, is considered in this article. A methodology for the analysis and assessment of Manage Stakeholders risks in innovation projects is suggested.

  13. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    2013-07-01

    From the Federal Register Online via the Government Publishing Office: ENVIRONMENTAL PROTECTION AGENCY, Technical Guidance for Assessing Environmental Justice in Regulatory Analysis... (contact: Office of Policy, National Center for Environmental Economics, Mail code 1809T, Environmental Protection...)

  14. 78 FR 27235 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    2013-05-09

    From the Federal Register Online via the Government Publishing Office: ENVIRONMENTAL PROTECTION AGENCY, Technical Guidance for Assessing Environmental Justice in Regulatory Analysis... (contact: Office of Policy, National Center for Environmental Economics, Mail code 1809T, Environmental Protection...)

  15. Model Analysis Assessing the dynamics of student learning

    CERN Document Server

    Bao, L; Bao, Lei; Redish, Edward F.

    2002-01-01

    In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models and, as a result, can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple-choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.

  16. Numerical analysis and geotechnical assessment of mine scale model

    Institute of Scientific and Technical Information of China (English)

    Khanal Manoj; Adhikary Deepak; Balusu Rao

    2012-01-01

    Various numerical methods are available to model, simulate, analyse and interpret results; however, a major task is to select a reliable tool to perform a realistic assessment of any problem. For a model to be representative of a realistic mining scenario, a verified tool must be chosen to assess mine roof support requirements and address the geotechnical risks associated with longwall mining. Dependable tools provide a safe working environment, increased production, efficient management of resources and reduced environmental impacts of mining. Although various methods, for example analytical, experimental and empirical, are adopted in mining, numerical tools have recently become popular due to advances in computer hardware and numerical methods. Empirical rules based on past experience do provide a general guide; however, due to the heterogeneous nature of mine geology (i.e., no two mine sites are identical), numerical simulations of mine-site-specific conditions lend better insight into the underlying issues. The paper highlights the use of a continuum-mechanics-based tool in coal mining with a mine scale model. Continuum modelling can provide close-to-accurate stress fields and deformation. The paper describes the use of existing mine data to calibrate and validate the model parameters, which are then used to assess geotechnical issues related to installing a new high-capacity longwall mine at the mine site. A variety of parameters, for example chock convergence, caveability of overlying sandstones, and abutment and vertical stresses, have been estimated.

  17. Promises and pitfalls in environmentally extended input–output analysis for China: A survey of the literature

    International Nuclear Information System (INIS)

    As the world's largest developing economy, China plays a key role in global climate change and other environmental impacts of international concern. Environmentally extended input–output analysis (EE-IOA) is an important and insightful tool seeing widespread use in studying large-scale environmental impacts in China: calculating and analyzing greenhouse gas emissions, carbon and water footprints, pollution, and embodied energy. This paper surveys the published articles regarding EE-IOA for China in peer-reviewed journals and provides a comprehensive and quantitative overview of the body of literature, examining the research impact, environmental issues addressed, and data utilized. The paper further includes a discussion of the shortcomings in official Chinese data and of the potential means to move beyond its inherent limitations. - Highlights: • Articles in 2012–2013 more than doubled that published between 1995 and 2011. • CO2 and energy are the most common topics, frequently associated with trade. • Data from the National Bureau of Statistics is widely used but seen as flawed. • Climate change, water supply, and food security drive the future of the literature

  18. An Analysis of State Autism Educational Assessment Practices and Requirements.

    Science.gov (United States)

    Barton, Erin E; Harris, Bryn; Leech, Nancy; Stiff, Lillian; Choi, Gounah; Joel, Tiffany

    2016-03-01

    States differ in the procedures and criteria used to identify ASD. These differences are likely to impact the prevalence and age of identification for children with ASD. The purpose of the current study was to examine the specific state variations in ASD identification and eligibility criteria requirements. We examined variations by state in autism assessment practices and the proportion of children eligible for special education services under the autism category. Overall, our findings suggest that ASD identification practices vary across states, but most states use federal guidelines, at least in part, to set their requirements. Implications and recommendations for policy and practice are discussed.

  19. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  20. The Analysis of Ease of Doing Business Assessment Methods

    Directory of Open Access Journals (Sweden)

    Mindaugas Samoška

    2011-07-01

    The study deals with ease of doing business assessment models. The analysed models describe conditions for doing business in a certain country that is being ranked and evaluated. However, an obvious need for improvement in methodology emerges when analysing the five best-known models in a comparative way: different data aggregation principles (quantitative and qualitative methods of aggregation) yield different results, despite the factors evaluated in the models being quite similar. After analysing all five methods we offer criticism of them and insights for possible improvement. Article in Lithuanian

  1. A Risk-Analysis Approach to Implementing Web-Based Assessment

    Science.gov (United States)

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  2. Literary translation and quality assessment analysis – its significance in translation training

    OpenAIRE

    Rodríguez, Beatriz Maria

    2004-01-01

    This paper aims to highlight the role of translation quality assessment in translation training so as to develop students’ translation competence and skills to face translation problems. An analysis to assess literary translation quality is proposed before proceeding to discuss its pedagogical implementation.

  3. Exploratory Mokken Scale Analysis as a Dimensionality Assessment Tool: Why Scalability Does Not Imply Unidimensionality

    Science.gov (United States)

    Smits, Iris A. M.; Timmerman, Marieke E.; Meijer, Rob R.

    2012-01-01

    The assessment of the number of dimensions and the dimensionality structure of questionnaire data is important in scale evaluation. In this study, the authors evaluate two dimensionality assessment procedures in the context of Mokken scale analysis (MSA), using a so-called fixed lowerbound. The comparative simulation study, covering various…

  4. Pesticide Flow Analysis to Assess Human Exposure in Greenhouse Flower Production in Colombia

    OpenAIRE

    Binder, Claudia R.; Camilo Lesmes-Fabian

    2013-01-01

    Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. This approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry ...

  5. Fear Assessment: Cost-Benefit Analysis and the Pricing of Fear and Anxiety

    OpenAIRE

    Adler, Matthew D.

    2003-01-01

    "Risk assessment" is now a common feature of regulatory practice, but "fear assessment" is not. In particular, environmental, health and safety agencies such as EPA, FDA, OSHA, NHTSA, and CPSC commonly count death, illness and injury as "costs" for purposes of cost-benefit analysis, but almost never incorporate fear, anxiety or other welfare-reducing mental states into the analysis. This is puzzling, since fear and anxiety are welfare setbacks, and since the very hazards regulated by these age...

  6. No-Reference Video Quality Assessment using Codec Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2015-01-01

    ... propose a novel estimation method of the quantization in H.264/AVC videos without bitstream access, which can also be used for Peak Signal-to-Noise Ratio (PSNR) estimation. The results from the MPEG-2 and H.264/AVC analysis are mapped to a perceptual measure of video quality by Support Vector Regression...

  7. Technical quality assessment of an optoelectronic system for movement analysis

    Science.gov (United States)

    Di Marco, R.; Rossi, S.; Patanè, F.; Cappa, P.

    2015-02-01

    The Optoelectronic Systems (OS) are largely used in gait analysis to evaluate the motor performance of healthy subjects and patients. The accuracy of marker trajectory reconstruction depends on several aspects: the number of cameras, the dimension and position of the calibration volume, and the chosen calibration procedure. In this paper we propose a methodology to evaluate the effects of the mentioned sources of error on the reconstruction of marker trajectories. The novel contribution of the present work consists in the dimension of the tested calibration volumes, which is comparable with the ones normally used in gait analysis; in addition, to simulate trajectories during clinical gait analysis, we provide non-default paths for markers as inputs. Several calibration procedures are implemented and the same trial is processed with each calibration file, also considering different camera configurations. The RMSEs between the measured trajectories and the optimal ones are calculated for each comparison. To investigate significant differences between the computed indices, an ANOVA analysis is implemented. The RMSE is sensitive to variations of the considered calibration volume and the camera configurations, and is always below 43 mm.
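The per-marker RMSE described above reduces to a short computation; a minimal sketch (the array shapes, the synthetic straight-line path, and the noise level are illustrative assumptions, not the paper's data):

```python
import numpy as np

def trajectory_rmse(measured, reference):
    """Root-mean-square error between a reconstructed marker trajectory
    and its reference path, both given as (n_frames, 3) arrays in mm."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Per-frame Euclidean distance between measured and reference positions
    errors = np.linalg.norm(measured - reference, axis=1)
    return np.sqrt(np.mean(errors ** 2))

# Synthetic example: a straight-line path with small reconstruction noise
rng = np.random.default_rng(0)
reference = np.linspace([0.0, 0.0, 0.0], [100.0, 0.0, 0.0], 200)
measured = reference + rng.normal(scale=1.0, size=reference.shape)
print(trajectory_rmse(measured, reference))  # on the order of sqrt(3) ~ 1.7 mm here
```

With per-axis noise of 1 mm, the expected RMSE is close to the square root of 3, which makes the function easy to sanity-check before applying it to real capture data.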

  8. Using Rasch Analysis to Identify Uncharacteristic Responses to Undergraduate Assessments

    Science.gov (United States)

    Edwards, Antony; Alcock, Lara

    2010-01-01

    Rasch Analysis is a statistical technique that is commonly used to analyse both test data and Likert survey data, to construct and evaluate question item banks, and to evaluate change in longitudinal studies. In this article, we introduce the dichotomous Rasch model, briefly discussing its assumptions. Then, using data collected in an…

  9. Dynamic Assessment of Functional Lipidomic Analysis in Human Urine.

    Science.gov (United States)

    Rockwell, Hannah E; Gao, Fei; Chen, Emily Y; McDaniel, Justice; Sarangarajan, Rangaprasad; Narain, Niven R; Kiebish, Michael A

    2016-07-01

    The development of enabling mass spectrometry platforms for the quantification of diverse lipid species in human urine is of paramount importance for understanding metabolic homeostasis in normal and pathophysiological conditions. Urine represents a non-invasive biofluid that can capture distinct differences in an individual's physiological status. However, currently there is a lack of quantitative workflows to engage in high throughput lipidomic analysis. This study describes the development of a MS/MS(ALL) shotgun lipidomic workflow and a micro liquid chromatography-high resolution tandem mass spectrometry (LC-MS/MS) workflow for urine structural and mediator lipid analysis, respectively. This workflow was deployed to understand biofluid sample handling and collection, extraction efficiency, and natural human variation over time. Utilization of 0.5 mL of urine for structural lipidomic analysis resulted in reproducible quantification of more than 600 lipid molecular species from over 20 lipid classes. Analysis of 1 mL of urine routinely quantified in excess of 55 mediator lipid metabolites comprised of octadecanoids, eicosanoids, and docosanoids generated by lipoxygenase, cyclooxygenase, and cytochrome P450 activities. In summary, the high-throughput functional lipidomics workflow described in this study demonstrates an impressive robustness and reproducibility that can be utilized for population health and precision medicine applications. PMID:27038173

  10. Concentration Analysis: A Quantitative Assessment of Student States.

    Science.gov (United States)

    Bao, Lei; Redish, Edward F.

    2001-01-01

    Explains that multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. Introduces a new method, concentration analysis, to measure how students' responses on multiple-choice questions are distributed. (Contains 18 references.) (Author/YDS)
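For context, the concentration measure referenced above maps the response distribution on an m-choice question to a value between 0 (responses spread evenly over all choices) and 1 (every student picks the same choice). A hedged sketch, using the normalization commonly given for this factor rather than anything stated in the abstract itself:

```python
import math
from collections import Counter

def concentration_factor(responses, m):
    """Concentration factor C in [0, 1] for one multiple-choice question:
    0 when answers are spread evenly over all m choices, 1 when every
    student selects the same choice."""
    counts = Counter(responses)
    N = len(responses)
    # Euclidean norm of the per-choice response counts
    norm = math.sqrt(sum(k * k for k in counts.values()))
    return (math.sqrt(m) / (math.sqrt(m) - 1)) * (norm / N - 1 / math.sqrt(m))

# All students agree -> C close to 1; even spread over 5 choices -> C close to 0
print(concentration_factor(['A'] * 30, m=5))         # close to 1.0
print(concentration_factor(list('ABCDE') * 6, m=5))  # close to 0.0
```

The two extreme cases shown in the usage lines are the natural checks on the normalization: perfect agreement and a perfectly uniform spread.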

  11. Technical quality assessment of an optoelectronic system for movement analysis

    International Nuclear Information System (INIS)

    The Optoelectronic Systems (OS) are largely used in gait analysis to evaluate the motor performance of healthy subjects and patients. The accuracy of marker trajectory reconstruction depends on several aspects: the number of cameras, the dimension and position of the calibration volume, and the chosen calibration procedure. In this paper we propose a methodology to evaluate the effects of the mentioned sources of error on the reconstruction of marker trajectories. The novel contribution of the present work consists in the dimension of the tested calibration volumes, which is comparable with the ones normally used in gait analysis; in addition, to simulate trajectories during clinical gait analysis, we provide non-default paths for markers as inputs. Several calibration procedures are implemented and the same trial is processed with each calibration file, also considering different camera configurations. The RMSEs between the measured trajectories and the optimal ones are calculated for each comparison. To investigate significant differences between the computed indices, an ANOVA analysis is implemented. The RMSE is sensitive to variations of the considered calibration volume and the camera configurations, and is always below 43 mm.

  12. Biogas upgrading technologies: Energetic analysis and environmental impact assessment

    Institute of Scientific and Technical Information of China (English)

    Yajing Xu; Ying Huang; Bin Wu; Xiangping Zhang; Suojiang Zhang

    2015-01-01

    Biogas upgrading, removing CO2 and other trace components from raw biogas, is a necessary step before the biogas can be used as a vehicle fuel or supplied to the natural gas grid. In this work, three biogas upgrading technologies, i.e., pressurized water scrubbing (PWS), monoethanolamine aqueous scrubbing (MAS) and ionic liquid scrubbing (ILS), are studied and assessed in terms of their energy consumption and environmental impacts using process simulation and the green degree method. A non-random two-liquid and Henry's law property method for a CO2 separation system with the ionic liquid 1-butyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide ([bmim][Tf2N]) is established and verified against experimental data. The assessment results indicate that the specific energy consumption of ILS and PWS is almost the same and much less than that of MAS. A high-purity CO2 product can be obtained by the MAS and ILS methods, whereas no pure CO2 is recovered with PWS. On the environmental side, ILS has the highest green degree production value, while MAS and PWS produce serious environmental impacts.

  13. Forest ecosystem health assessment and analysis in China

    Institute of Scientific and Technical Information of China (English)

    XIAO Fengjin; OUYANG Hua; ZHANG Qiang; FU Bojie; ZHANG Zhicheng

    2004-01-01

    Based on surveying data from more than 300 forest sample plots, forestry statistical data, remote sensing information from the NOAA AVHRR database and daily meteorological data from 300 stations, we selected vigor, organization and resilience as indicators to assess large-scale forest ecosystem health in China, and analyzed the spatial pattern of forest ecosystem health and its influencing factors. The assessment indicated that forest ecosystem health decreases along latitude and longitude gradients. Healthy forests are mainly distributed in natural forests, tropical rainforests and seasonal rainforests, followed by the northeast national forest zone, the subtropical forest zone and the southwest forest zone, while unhealthy forests are mainly located in the warm temperate zone and the Xinjiang-Mongolia forest zone. The coefficient of correlation between the Forest Ecosystem Health Index (FEHI) and annual average precipitation was 0.58 (p<0.01), while that between the FEHI and annual mean temperature was 0.49 (p<0.01), indicating that precipitation and temperature both shape the pattern of the FEHI, with precipitation's effect the stronger of the two. We also measured the correlation coefficients between the FEHI and NPP, biodiversity and resistance, which were 0.64, 0.76 and 0.81 (p<0.01), respectively. The order of effect on forest ecosystem health was vigor, organization and resistance.

  14. Analysis and Pollution Assessment of Heavy Metal in Soil, Perlis

    International Nuclear Information System (INIS)

    The concentrations of 5 heavy metals (Cu, Cr, Ni, Cd, Pb) were studied in soils around Perlis to assess the distribution of heavy metal contamination due to industrialization, urbanization and agricultural activities. Soil samples were collected at a depth of 0-15 cm at eighteen stations around Perlis. The soil samples (2 mm) were obtained in duplicate and subjected to hot block digestion, and total metal concentrations were determined via ICP-MS. Overall concentrations of Cu, Cr, Ni, Cd and Pb in the soil samples ranged from 0.38-240.59, 0.642-3.921, 0.689-2.398, 0-0.63 and 0.39-27.47 mg/kg, respectively. The heavy metal concentrations in the soil display the following decreasing trend: Cu > Pb > Cr > Ni > Cd. From these results, levels of heavy metals in soil near the centralized Chuping industrial area were the highest among the locations sampled in Perlis. The pollution index revealed that only 11% of Cu samples and 6% of Cd samples were classed as heavily contaminated; Cu and Pb each showed 6% of samples moderately contaminated, while the other elements showed low contamination. The combined heavy metal concentrations and pollution assessment indicate that industrial activities and traffic emissions are the most important sources for Cu, Cd and Pb, whereas Cr and Ni come mainly from natural sources. Increasing anthropogenic influences on the environment, especially pollution loadings, have caused negative changes in natural ecosystems and decreased biodiversity. (author)
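A single-factor pollution index of the kind invoked above is simply the ratio of a measured concentration to a reference value. A rough sketch; the background value and the class boundaries below are illustrative assumptions, as the abstract does not state which scheme was used:

```python
def pollution_index(measured, background):
    """Single-factor pollution index: ratio of a measured concentration
    to a reference/background value (both in mg/kg)."""
    return measured / background

def contamination_class(pi):
    # Illustrative class boundaries -- actual thresholds vary by study
    if pi <= 1:
        return "low"
    elif pi <= 3:
        return "moderate"
    return "heavy"

# Hypothetical Cu readings against an assumed background of 30 mg/kg
for cu in (0.38, 55.0, 240.59):
    pi = pollution_index(cu, background=30.0)
    print(f"Cu {cu:7.2f} mg/kg -> PI {pi:5.2f} ({contamination_class(pi)})")
```

With these assumed thresholds, the lowest and highest Cu values from the reported range land in the "low" and "heavy" classes respectively, which is the kind of spread the abstract's 11% "heavily contaminated" figure implies.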

  15. Banff fibrosis study: multicenter visual assessment and computerized analysis of interstitial fibrosis in kidney biopsies.

    Science.gov (United States)

    Farris, A B; Chan, S; Climenhaga, J; Adam, B; Bellamy, C O C; Serón, D; Colvin, R B; Reeve, J; Mengel, M

    2014-04-01

    Increasing interstitial fibrosis (IF) in native and kidney transplant biopsies is associated with functional decline and serves as a clinical trial end point. A Banff 2009 Conference survey revealed a range of IF assessment practices. Observers from multiple centers were asked to assess 30 renal biopsies with a range of IF, quantitating IF with two approaches on trichrome and Periodic acid-Schiff (PAS) slides and with computer-assisted quantification of collagen III immunohistochemistry (C-IHC), as well as assessing percent cortical tubular atrophy (TA%) and the Banff total cortical inflammation score (ti-score). C-IHC using whole-slide scans was performed. C-IHC assessment showed a higher correlation with organ function (r = -0.48) than did visual assessments (r = -0.32 to -0.42); computerized and visual C-IHC assessments also correlated (r = 0.64-0.66). Visual assessment of trichrome and C-IHC showed better correlations with organ function and C-IHC than did PAS, TA% and ti-score. However, visual assessment of IF, independent of approach, was variable among observers, and differences in correlations with organ function were not statistically significant among C-IHC image analysis and visual assessment methods. C-IHC image analysis correlated among three centers (r > 0.90), suggesting that whole-slide image analysis could potentially accomplish standardized IF assessment in multicenter settings. PMID:24712330

  16. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  17. THE MANAGEMENT OF TRANSFUSION SERVICES, ANALYSIS AND ASSESSMENT

    Science.gov (United States)

    Begic, Dzenana; Mujicic, Ermina; Coric, Jozo; Zunic, Lejla

    2016-01-01

    Introduction: The hospital blood bank (HBB) needs to timely provide adequate amounts of blood and blood products for surgeries. For various surgical programs, assessments are made of the average number of blood doses needed per surgery. By using two types of requisitions, BT/AB (blood type/antibody) and BT/AB/MT (blood type/antibody/match test), for pretransfusion immunohaematological testing, the General Hospital “Prim. Dr. Abdulah Nakas” achieves more rational consumption of blood and blood derivatives and financial savings through a reduced number of match tests (MT). Goal: To determine the total number of pre-operative requisitions (BT/AB and BT/AB/MT) for blood and blood products at the surgical departments of the General Hospital “Prim. Dr. Abdulah Nakas” in the period June 1, 2014 – December 31, 2014, to analyze the consumption/return of blood in reserve in relation to the surgical disciplines and the total savings in MT, and to conduct an MSBOS (Maximum Surgical Blood Ordering Schedule) assessment. Results: The total number of preoperative requisitions for blood and blood products in the surgical wards was 927, of which 623 requests (67.2%) were tested by BT/AB, while 304 (32.8%) were tested by BT/AB/MT. In total, 617 units of blood and blood products were transfused; 275 units were not transfused. The probability of transfusion for surgery was 51.3%, highest for surgical intensive care (70.4%) and lowest for the department of general surgery (37.2%). The indicator of efficient resource management was best at the delivery ward (0.89), with an overall value of 0.69 for the surgical wards. On average, 2.1 units of blood were required per surgery. By using the two types of requisitions for pretransfusion immunohaematological testing (BT/AB and BT/AB/MT), more rational use of MT is achieved: in the 623 BT/AB requests only 61 MT were performed. The average number of blood units issued in accordance with these requirements is 0
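Two of the standard MSBOS-style indicators behind figures like those above can be sketched directly; the numbers in the usage lines are hypothetical, not the hospital's data:

```python
def ct_ratio(crossmatched_units, transfused_units):
    """Crossmatch-to-transfusion (C/T) ratio: units crossmatched per unit
    actually transfused. Values near 1 indicate efficient ordering; in the
    MSBOS literature, ratios above roughly 2.5 are usually read as
    over-ordering."""
    return crossmatched_units / transfused_units

def transfusion_probability(patients_transfused, patients_crossmatched):
    """%T: the share of crossmatched patients who were actually transfused."""
    return 100.0 * patients_transfused / patients_crossmatched

# Hypothetical ward figures for illustration
print(ct_ratio(300, 200))                           # 1.5
print(round(transfusion_probability(154, 300), 1))  # 51.3
```

Restricting match tests to the BT/AB/MT requisition type, as described in the abstract, is precisely what pushes the C/T ratio down toward 1.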

  18. Sensitivity analysis for Probabilistic Tsunami Hazard Assessment (PTHA)

    Science.gov (United States)

    Spada, M.; Basili, R.; Selva, J.; Lorito, S.; Sorensen, M. B.; Zonker, J.; Babeyko, A. Y.; Romano, F.; Piatanesi, A.; Tiberti, M.

    2012-12-01

    In modern societies, probabilistic hazard assessment of natural disasters is commonly used by decision makers for designing regulatory standards and, more generally, for prioritizing risk mitigation efforts. Systematic formalization of Probabilistic Tsunami Hazard Assessment (PTHA) has started only in recent years, mainly following the giant tsunami disaster of Sumatra in 2004. Typically, PTHA for earthquake sources exploits the long-standing practices developed in probabilistic seismic hazard assessment (PSHA), even though important differences are evident. In PTHA, for example, it is known that far-field sources are more important and that physical models for tsunami propagation are needed for the highly non-isotropic propagation of tsunami waves. However, considering the high impact that PTHA may have on societies, an important effort to quantify the effect of specific assumptions should be performed. Indeed, specific standard hypotheses made in PSHA may prove inappropriate for PTHA, since tsunami waves are sensitive to different aspects of sources (e.g. fault geometry, scaling laws, slip distribution) and propagate differently. In addition, the necessity of running an explicit calculation of wave propagation for every possible event (tsunami scenario) forces analysts to finding strategies for diminishing the computational burden. In this work, we test the sensitivity of hazard results with respect to several assumptions that are peculiar of PTHA and others that are commonly accepted in PSHA. Our case study is located in the central Mediterranean Sea and considers the Western Hellenic Arc as the earthquake source with Crete and Eastern Sicily as near-field and far-field target coasts, respectively. 
Our suite of sensitivity tests includes: a) comparison of random seismicity distribution within area sources as opposed to systematically distributed ruptures on fault sources; b) effects of statistical and physical parameters (a- and b-value, Mc, Mmax, scaling laws

  19. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    DEFF Research Database (Denmark)

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars;

    2015-01-01

    Background: Flexible bronchoscopy should be performed with a correct posture and a straight scope to optimize bronchoscopy performance and at the same time minimize the risk of work-related injuries and endoscope damage. Objectives: We aimed to test whether an automatic motion analysis system could be used to explore if there is a correlation between scope movements and the operator's level of experience. Our hypothesis was that experienced bronchoscopists move less and keep the flexible scope straighter than less-experienced bronchoscopists while performing procedures. Methods: Eleven novices, 9 intermediates and 9 experienced bronchoscopy operators performed 3 procedures each on a bronchoscopy simulator. The Microsoft Kinect system was used to automatically measure the total deviation of the scope from a perfectly straight, vertical line. Results: The low-cost motion analysis system could measure...

  20. The Analysis and Assessment of the Credit Risk

    OpenAIRE

    Mirea Marioara; Aivaz Kamer Ainur

    2011-01-01

    The commercial banks' main operation is the granting of credits, which occupies first place among their total investments. Any bank assumes risks to a certain extent when granting credits, and certainly all banks generally incur losses when some debtors fail to comply with their obligations. The level of the assumed risks and the losses can be minimized if the credit operations are organized and managed in a professional manner. The paper grasps the moment of the analysis process preceding the...

  1. Image analysis for dental bone quality assessment using CBCT imaging

    Science.gov (United States)

    Suprijanto; Epsilawati, L.; Hajarini, M. S.; Juliastuti, E.; Susanti, H.

    2016-03-01

    Cone beam computed tomography (CBCT) is an X-ray imaging modality applied in dentistry; it can visualize the oral region in 3D at high resolution. CBCT jaw images carry potential information for the assessment of bone quality, which is often used for pre-operative implant planning. We propose a comparison method based on the normalized histogram (NH) of the region of the inter-dental septum and premolar teeth, and the NH characteristics of normal and abnormal bone conditions are compared and analyzed. Four test parameters are proposed: the difference between teeth and bone average intensity (s), the ratio between bone and teeth average intensity (n), the difference between the teeth and bone peak values of the NH (Δp), and the ratio between the teeth and bone ranges of the NH (r). The results showed that n, s, and Δp have potential as classification parameters of dental calcium density.
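As a rough illustration of the four parameters, a sketch in which the histogram settings and the exact parameter definitions are assumptions reconstructed from the abstract's wording (the parameter names s, n, Δp, r are the paper's; everything else here is illustrative):

```python
import numpy as np

def nh_parameters(teeth, bone, bins=256, value_range=(0, 4095)):
    """Four comparison parameters between teeth and bone intensity samples,
    following the normalized-histogram idea: s, n, delta_p, r."""
    teeth = np.asarray(teeth, dtype=float)
    bone = np.asarray(bone, dtype=float)
    ht, _ = np.histogram(teeth, bins=bins, range=value_range, density=True)
    hb, _ = np.histogram(bone, bins=bins, range=value_range, density=True)
    s = teeth.mean() - bone.mean()   # difference of average intensities
    n = bone.mean() / teeth.mean()   # ratio of average intensities
    delta_p = ht.max() - hb.max()    # difference of normalized-histogram peaks
    r = np.ptp(teeth) / np.ptp(bone) # ratio of intensity ranges
    return s, n, delta_p, r

# Synthetic intensity samples standing in for segmented CBCT regions
rng = np.random.default_rng(1)
teeth = rng.normal(3000, 100, 5000).clip(0, 4095)
bone = rng.normal(1500, 300, 5000).clip(0, 4095)
s, n, dp, r = nh_parameters(teeth, bone)
print(f"s={s:.0f}  n={n:.2f}  dp={dp:.4f}  r={r:.2f}")
```

With the synthetic data, teeth are brighter and more tightly distributed than bone, so s is large and positive, n sits near 0.5, and the teeth histogram has the higher peak.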

  2. An assessment of unstructured grid technology for timely CFD analysis

    Science.gov (United States)

    Kinard, Tom A.; Schabowski, Deanne M.

    1995-01-01

    An assessment of two unstructured methods is presented in this paper. A tetrahedral unstructured method, USM3D, developed at NASA Langley Research Center, is compared to a Cartesian unstructured method, SPLITFLOW, developed at Lockheed Fort Worth Company. USM3D is an upwind finite volume solver that accepts grids generated primarily by the Vgrid grid generator. SPLITFLOW combines an unstructured grid generator with an implicit flow solver in one package. Both methods are exercised on three test cases: a wing, a wing-body, and a fully expanded nozzle. The results for the first two are included here and compared to the structured grid method TEAM and to available test data. For each test case, the set-up procedure is described, including any difficulties that were encountered. Detailed descriptions of the solvers are not included in this paper.

  3. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Reed, J.K.

    1999-10-20

A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and ''expert'' data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques with a GIS presentation that together force direct, interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) development of a groundwater quality baseline prior to remediation startup, (2) targeting of constituents for removal from the RCRA GWPS, (3) targeting of constituents for removal from the UIC permit, (4) targeting of constituents for reduced monitoring, (5) targeting of monitoring wells not producing representative samples, (6) reduction in statistical evaluation, and (7) identification of contamination from other facilities.

  4. Game theoretic analysis of environmental impact assessment system in China

    Institute of Scientific and Technical Information of China (English)

    CHENG Hongguang; QI Ye; PU Xiao; GONG Li

    2007-01-01

The environmental impact assessment (EIA) system has been established in China since 1973. In present EIA cases, there are four participants in general: governments, enterprises, EIA organizations, and the public. The public is held responsible for both social costs and social duties, and supervises the social costs produced by enterprises discharging pollutants. However, public participation is mostly deputized by governments, which severely weakens the independence of the public as a participant in EIA. In this paper, EIA refers to the different attitudes of the participants, whose optional strategies may be described by a proper game model. According to the shortcomings of the present EIA system, a three-sided (governments, enterprises, and EIA organizations) multi-phase dynamic iterative game model is established, drawing on iterative game theory, dynamic games of incomplete information, and perfect Bayesian equilibrium theory, to analyze the reciprocal relations among governments, EIA organizations, and enterprises. The results show that in the short run, economic benefit preponderates over social benefit: neither governments nor enterprises want EIA to reveal social costs, and because EIA organizations' income comes from enterprises, collusions are built between them to protect economic benefit. In the long run, social benefit losses caused by environmental pollution must be recuperated sooner or later, and environmental deterioration will impair the achievement of economic benefit, so both governments and enterprises will pursue high social benefit and be willing to undertake EIA, which helps increase private benefit. EIA organizations will make fair assessments when their economic benefits are ensured. At present, the public, as silent victims, cannot take an actual part in EIA. The EIA system must be improved to break the present three-sided equilibrium and bring the public into the equilibrium to exert public supervision.

  5. Pilot analysis of the Motivation Assessment for Team Readiness, Integration, and Collaboration (MATRICx) using Rasch analysis.

    Science.gov (United States)

    Mallinson, Trudy; Lotrecchiano, Gaetano R; Schwartz, Lisa S; Furniss, Jeremy; Leblanc-Beaudoin, Tommy; Lazar, Danielle; Falk-Krzesinski, Holly J

    2016-10-01

    Healthcare services and the production of healthcare knowledge are increasingly dependent on highly functioning, multidisciplinary teams, requiring greater awareness of individuals' readiness to collaborate in translational science teams. Yet, there is no comprehensive tool of individual motivations and threats to collaboration that can guide preparation of individuals for work on well-functioning teams. This prospective pilot study evaluated the preliminary psychometric properties of the Motivation Assessment for Team Readiness, Integration, and Collaboration (MATRICx). We examined 55 items of the MATRICx in a sample of 125 faculty, students and researchers, using contemporary psychometric methods (Rasch analysis). We found that the motivator and threat items formed separate constructs relative to collaboration readiness. Further, respondents who identified themselves as inexperienced at working on collaborative projects defined the motivation construct differently from experienced respondents. These results are consistent with differences in strategic alliances described in the literature-for example, inexperienced respondents reflected features of cooperation and coordination, such as concern with sharing information and compatibility of goals. In contrast, the more experienced respondents were concerned with issues that reflected a collective purpose, more typical of collaborative alliances. While these different types of alliances are usually described as representing varying aspects along a continuum, our findings suggest that collaboration might be better thought of as a qualitatively different state than cooperation or coordination. These results need to be replicated in larger samples, but the findings have implications for the development and design of educational interventions that aim to ready scientists and clinicians for greater interdisciplinary work.
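For readers unfamiliar with Rasch analysis, the dichotomous Rasch model at the core of such psychometric work expresses the probability of endorsing an item as a logistic function of the gap between person ability and item difficulty. A minimal sketch of that model (an illustration only, not the study's actual MATRICx scoring procedure):

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: probability that a person of ability
    `theta` endorses (or passes) an item of difficulty `b`, both
    expressed on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_raw_score(theta, difficulties):
    """Expected total score of a person across a set of items."""
    return sum(rasch_probability(theta, b) for b in difficulties)
```

When ability equals difficulty the endorsement probability is exactly 0.5; items whose empirical response patterns depart from this model (as fit statistics reveal) are candidates for forming a separate construct, which is the kind of evidence behind the motivator/threat separation reported above.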

  7. Alternative model for assessment administration and analysis: An example from the E-CLASS

    CERN Document Server

    Wilcox, Bethany R; Hobbs, Robert D; Aiken, John M; Welch, Nathan M; Lewandowski, H J

    2016-01-01

The primary model for dissemination of conceptual and attitudinal assessments that has been used within the physics education research (PER) community is to create a high-quality, validated assessment, make it available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model also provides a greater degree of support for both researchers and instructors. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system.

  8. Assessing Canadian Bank Branch Operating Efficiency Using Data Envelopment Analysis

    Science.gov (United States)

    Yang, Zijiang

    2009-10-01

In today's economy and society, performance analysis in the service industries attracts more and more attention. This paper presents an evaluation of 240 branches of one big Canadian bank in the Greater Toronto Area using Data Envelopment Analysis (DEA). Special emphasis was placed on how to present the DEA results to management so as to give them more guidance on what to manage and how to accomplish the changes. Finally, the potential management uses of the DEA results are presented. All the findings are discussed in the context of the Canadian banking market.
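For intuition, DEA scores each branch (decision-making unit) relative to an efficient frontier. The general multi-input, multi-output CCR model solves one linear program per unit, but the single-input, single-output special case reduces to a ratio comparison, sketched below (an illustration of the idea, not the paper's model):

```python
def dea_efficiency(inputs, outputs):
    """Relative efficiency of each unit in the single-input,
    single-output special case of DEA: each unit's output/input ratio
    divided by the best observed ratio, so frontier units score 1.0.
    (The general multi-input, multi-output CCR model instead solves
    one linear program per unit.)"""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

A branch scoring 0.5 produces half the output per unit of input of the best-performing branch, which is the kind of concrete, actionable number that makes DEA results presentable to management.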

  9. Use of Risk Analysis Frameworks in Urban Flood Assessments

    DEFF Research Database (Denmark)

    Arnbjerg-Nielsen, Karsten; Madsen, Henrik

    , primarily insurance companies and mortgage providers, but also politicians and media are highly interested. Presently two very different approaches are being followed in both research and practice. One is the introduction of risk analysis and risk management tools to provide professionals and politicians...... and accepting losses that are outweighed by benefits to society as a whole. Another, very different approach is to apply more stakeholder driven approaches, much in the line of Integrated Water Resources Management. The key difference is that it is recognized that the costs and benefits of both existing...... are being carried out in real-life applications combining researchers, practitioners, and NGOs....

  10. Assessment of the Prony's method for BWR stability analysis

    International Nuclear Information System (INIS)

Highlights: → This paper describes a method to determine the degree of stability of a BWR. → A performance comparison between Prony's method and common AR techniques is presented. → Benchmark data and actual BWR transient data are used for the comparison. → DR and f results are presented and discussed. → Prony's method is shown to be a robust technique for BWR stability analysis. - Abstract: It is known that Boiling Water Reactors are susceptible to power oscillations in regions of high power and low coolant flow on the power-flow operational map. It is possible to fall into one of these instability regions during reactor startup, since power and coolant flow are increased but not proportionally. Another way of entering those regions is a trip of the recirculation pumps. Stability monitoring in such cases can be difficult, because the amount or quality of power signal data required to calculate the key stability parameters may not be enough to provide reliable results in an adequate time range. In this work, Prony's method is presented as a complementary alternative for determining the degree of stability of a BWR from time series data. This analysis method can provide the decay ratio and oscillation frequency from power signals obtained during transient events. However, few applications to Boiling Water Reactor operation have been reported so far, so the scope of such analysis for actual transient events remains to be established. This work first presents a comparison of the decay ratio and oscillation frequency results obtained by Prony's method with those obtained by the participants of the Forsmark 1 and 2 Boiling Water Reactor Stability Benchmark using diverse techniques. Then, a comparison of decay ratio and oscillation frequency results is performed for four real BWR transient event data sets, using Prony's method and two other techniques based on autoregressive modeling. The four
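A common way to realize Prony-type stability estimation is to fit a linear-prediction model to the sampled power signal, take the roots of its characteristic polynomial as discrete-time poles, and derive the decay ratio and frequency from the dominant oscillatory pole. The sketch below follows that general recipe; the model order and pole-selection rule are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def prony_stability(signal, dt, order=8):
    """Estimate the decay ratio (DR) and oscillation frequency of the
    dominant oscillatory mode of a sampled signal.

    Linear-prediction variant of Prony's method: fit AR coefficients by
    least squares, take the roots of the characteristic polynomial as
    discrete-time poles, and keep the largest oscillatory pole.  DR is
    the factor by which the oscillation amplitude changes over one
    period (DR < 1 means a damped, stable oscillation).
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    p = order
    # Linear prediction: x[n] ≈ a1*x[n-1] + ... + ap*x[n-p]
    A = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    poles = np.roots(np.concatenate(([1.0], -a)))
    # Keep oscillatory poles (nonzero imaginary part); take the dominant one
    osc = poles[np.abs(poles.imag) > 1e-8]
    z = osc[np.argmax(np.abs(osc))]
    theta = abs(np.angle(z))                # radians per sample
    dr = np.abs(z) ** (2 * np.pi / theta)   # amplitude ratio over one period
    freq = theta / (2 * np.pi * dt)         # oscillation frequency in Hz
    return dr, freq
```

For a pole z = r·e^{iθ}, one oscillation period spans 2π/θ samples, so the amplitude ratio over one period is r^{2π/θ}; real transient data would additionally need order selection and noise handling.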

  11. How to assess the Efficiency and "Uncertainty" of Global Sensitivity Analysis?

    Science.gov (United States)

    Haghnegahdar, Amin; Razavi, Saman

    2016-04-01

Sensitivity analysis (SA) is an important paradigm for understanding model behavior, characterizing uncertainty, improving model calibration, etc. Conventional "global" SA (GSA) approaches are rooted in different philosophies, resulting in different and sometimes conflicting and/or counter-intuitive assessments of sensitivity. Moreover, most global sensitivity techniques are highly computationally demanding if they are to generate robust and stable sensitivity metrics over the entire model response surface. Accordingly, a novel sensitivity analysis method called Variogram Analysis of Response Surfaces (VARS) is introduced to overcome these issues. VARS uses the variogram concept to efficiently provide a comprehensive assessment of global sensitivity across a range of scales within the parameter space. Based on the VARS principles, in this study we present innovative ideas for assessing (1) the efficiency of GSA algorithms and (2) the level of confidence we can assign to a sensitivity assessment. We use multiple hydrological models of differing complexity to illustrate the new ideas.
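The core quantity in a variogram-based GSA is the directional variogram of the model response along each parameter, γ(h) = 0.5·E[(y(x + h·e_i) − y(x))²]. A minimal Monte Carlo estimator of that quantity (an illustration of the concept, not the VARS implementation):

```python
import numpy as np

def variogram_1d(model, dim, index, h_values, lo=0.0, hi=1.0, n=200, seed=0):
    """Monte Carlo estimate of the directional variogram of `model`
    along parameter `index`:
        gamma(h) = 0.5 * E[(model(x + h*e_i) - model(x))**2]
    with base points x drawn uniformly from the hypercube [lo, hi]^dim
    (the perturbed coordinate is drawn so x + h*e_i stays in bounds)."""
    rng = np.random.default_rng(seed)
    e = np.zeros(dim)
    e[index] = 1.0
    gammas = []
    for h in h_values:
        x = rng.uniform(lo, hi, size=(n, dim))
        x[:, index] = rng.uniform(lo, hi - h, size=n)  # room for the +h step
        diffs = np.array([model(xi + h * e) - model(xi) for xi in x])
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(gammas)
```

A steeply rising γ(h) at small h flags a parameter the response is highly sensitive to; comparing γ across parameters and across a range of h values gives the multi-scale picture of sensitivity that VARS exploits.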

  12. Using Benefit-Cost Analysis to Assess Child Abuse Prevention and Intervention Programs.

    Science.gov (United States)

    Plotnick, Robert D.; Deppman, Laurie

    1999-01-01

    Presents a case for using benefit-cost analysis to structure evaluations of child-abuse prevention and intervention programs. Presents the basic concept of benefit-cost analysis, its application in the context of assessing these types of child welfare programs, and limitations on its application to social service programs. (Author)

  13. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    Science.gov (United States)

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…

  14. Environmental endocrine disruption: an effects assessment and analysis.

    Science.gov (United States)

    Crisp, T M; Clegg, E D; Cooper, R L; Wood, W P; Anderson, D G; Baetcke, K P; Hoffmann, J L; Morrow, M S; Rodier, D J; Schaeffer, J E; Touart, L W; Zeeman, M G; Patel, Y M

    1998-02-01

    This report is an overview of the current state of the science relative to environmental endocrine disruption in humans, laboratory testing, and wildlife species. Background information is presented on the field of endocrinology, the nature of hormones, and potential sites for endocrine disruption, with specific examples of chemicals affecting these sites. An attempt is made to present objectively the issue of endocrine disruption, consider working hypotheses, offer opposing viewpoints, analyze the available information, and provide a reasonable assessment of the problem. Emphasis is placed on disruption of central nervous system--pituitary integration of hormonal and sexual behavioral activity, female and male reproductive system development and function, and thyroid function. In addition, the potential role of environmental endocrine disruption in the induction of breast, testicular, and prostate cancers, as well as endometriosis, is evaluated. The interrelationship of the endocrine and immune system is documented. With respect to endocrine-related ecological effects, specific case examples from the peer-reviewed literature of marine invertebrates and representatives of the five classes of vertebrates are presented and discussed. The report identifies some data gaps in our understanding of the environmental endocrine disruption issue and recommends a few research needs. Finally, the report states the U.S. Environmental Protection Agency Science Policy Council's interim position on endocrine disruption and lists some of the ongoing activities to deal with this matter. PMID:9539004

  15. Environmental assessment of the Tung cultivation through life cycle analysis

    Directory of Open Access Journals (Sweden)

    Marcelo Bernál

    2014-01-01

Over the past few decades, conventional agriculture has faced serious crises caused by numerous factors, including poor soil management and the excessive application of pesticides. Thus, alternative production systems have been developed, including agroforestry systems, especially those that produce both energy and food. The objective of this study was to evaluate the environmental impact of the cultivation of Aleurites fordii Hemsl. (Tung) using the Life Cycle Assessment method with the SimaPro 7.3.2 software. The results revealed that in family farms that use less mechanization to harvest crops, the primary category of environmental impact was land use, which included the removal of animal and vegetable species and ecosystem changes. The full impact of this category was 1741.21 m2yr PDF (potentially disappeared fraction). Subsequently, prognostics were established for the reduction of such impacts, and we conclude that Tung has a high potential for agricultural installation with high responsibility to the environment. Keywords: environmental factors, Aleurites fordii Hemsl., Life Cycle Management, Tung.

  16. Alternative model for administration and analysis of research-based assessments

    Science.gov (United States)

    Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.

    2016-06-01

    Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.

  17. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  18. Application of importance analysis probabilistic safety assessment results of Tehran Research Reactor

    International Nuclear Information System (INIS)

The application of probabilistic safety assessment to evaluate the safety of hazardous facilities is only worthwhile when the results are processed meaningfully. The purpose of importance analysis is to identify major contributors to core damage frequency, which may include accident initiators, system failures, component failures, and human errors. In this paper, the Fussell-Vesely measure of importance was applied to the results of the probabilistic safety assessment study of the Tehran Research Reactor. The analysis was performed using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software.
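The Fussell-Vesely importance of a basic event is commonly approximated as the fraction of total risk contributed by the minimal cut sets that contain the event. A small sketch under the rare-event approximation (illustrative only, not the SAPHIRE implementation):

```python
def fussell_vesely(cut_sets, probs):
    """Fussell-Vesely importance of each basic event under the
    rare-event approximation: the fraction of total risk (e.g. core
    damage frequency) contributed by minimal cut sets containing it.

    cut_sets -- iterable of sets of basic-event names
    probs    -- dict mapping basic-event name to failure probability
    """
    def cs_prob(cs):
        # Probability of a minimal cut set, assuming independent events
        p = 1.0
        for event in cs:
            p *= probs[event]
        return p

    total = sum(cs_prob(cs) for cs in cut_sets)
    return {event: sum(cs_prob(cs) for cs in cut_sets if event in cs) / total
            for event in probs}
```

Events with FV importance near 1 dominate the risk and are the natural priorities for design or maintenance improvements, which is exactly the screening role importance analysis plays in a PSA.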

  19. Jelly pineapple syneresis assessment via univariate and multivariate analysis

    Directory of Open Access Journals (Sweden)

    Carlos Alberto da Silva Ledo

    2010-09-01

This study evaluated pineapple jelly to analyze the occurrence of syneresis by univariate and multivariate analysis. Pineapple jelly has a low pectin concentration; therefore, high-methoxyl pectin was added at concentrations of 0.50%, 0.75%, and 1.00%, corresponding to slow, medium, and fast gel formation. The pH, acidity, Brix, and syneresis of the jelly were measured. The highest pectin concentration showed a decrease in the release of water (syneresis). This result showed that 1.00% pectin in the jelly is necessary to form the gel and obtain a suitable texture.

  20. Seismic Assessment of an RC Building Using Pushover Analysis

    Directory of Open Access Journals (Sweden)

    R. A. Hakim

    2014-08-01

Current research and observations indicate that parts of the Kingdom of Saudi Arabia are regions of low to moderate seismicity. Many buildings were designed only for gravity loads and were poorly detailed to accommodate lateral loads. This study aims to investigate building performance in resisting expected seismic loading. Two 3D frames were investigated using pushover analysis according to ATC-40. One was designed according to a design practice that considers only the gravity load, and the other was designed according to the Saudi Building Code (SBC-301). Results showed that the building designed considering only the gravity load was inadequate. On the other hand, the building designed according to SBC-301 satisfies the Immediate Occupancy (IO) acceptance criteria of ATC-40.

  1. In-field analysis and assessment of nuclear material

    International Nuclear Information System (INIS)

    Los Alamos National Laboratory has actively developed and implemented a number of instruments to monitor, detect, and analyze nuclear materials in the field. Many of these technologies, developed under existing US Department of Energy programs, can also be used to effectively interdict nuclear materials smuggled across or within national borders. In particular, two instruments are suitable for immediate implementation: the NAVI-2, a hand-held gamma-ray and neutron system for the detection and rapid identification of radioactive materials, and the portable mass spectrometer for the rapid analysis of minute quantities of radioactive materials. Both instruments provide not only critical information about the characteristics of the nuclear material for law-enforcement agencies and national authorities but also supply health and safety information for personnel handling the suspect materials

  2. Analysis And Assessment Of The Security Method Against Incidental Contamination In The Collective Water Supply System

    OpenAIRE

    Szpak Dawid; Tchórzewska – Cieślak Barbara

    2015-01-01

The paper presents the main types of incidental contamination of surface water and methods of securing water sources against incidental contamination. An analysis and assessment of the protection of the collective water supply system (CWSS) against incidental contamination was conducted using Failure Mode and Effects Analysis (FMEA). The FMEA method allows analysis of the product or process, identification of weak points, and implementation of corrections and new solutions for eliminating the sou...

  3. Assessment of Water Quality Parameters by Using the Multidimensional Scaling Analysis

    OpenAIRE

    Suheyla Yerel; Huseyin Ankara

    2010-01-01

The surface water quality parameters of the western Black Sea region of Turkey were assessed by using multidimensional scaling analysis. This technique was applied to the surface water quality parameters obtained from five monitoring stations. Multidimensional scaling analysis showed that Cl-, SO42-, Na+ and BOD5 are the most important parameters causing differences among the monitoring stations. These results indicate effects of domestic waste and organic pollution on the surfa...
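Classical (Torgerson) multidimensional scaling, one standard form of the technique used here, embeds the stations in a low-dimensional space from their pairwise dissimilarities by double-centering the squared distance matrix. A minimal sketch (assuming Euclidean dissimilarities; the study's exact MDS variant is not specified):

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) multidimensional scaling: given an n x n
    matrix of pairwise distances D, recover k-dimensional coordinates
    by double-centering the squared distances and taking the top-k
    eigenpairs of the resulting Gram matrix."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, v = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # pick the top-k eigenvalues
    scale = np.sqrt(np.clip(w[idx], 0.0, None))
    return v[:, idx] * scale                 # n x k embedding
```

Stations that plot far apart in the resulting configuration differ most in the underlying water quality parameters, which is how the influential parameters (Cl-, SO42-, Na+, BOD5) can be identified.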

  4. Rorschach assessment of traumatized refugees: an exploratory factor analysis.

    Science.gov (United States)

    Opaas, Marianne; Hartmann, Ellen

    2013-01-01

Fifty-one multitraumatized mental health patients with refugee backgrounds completed the Rorschach (Meyer & Viglione, 2008), the Harvard Trauma Questionnaire and Hopkins Symptom Checklist-25 (Mollica, McDonald, Massagli, & Silove, 2004), and the World Health Organization Quality of Life-BREF questionnaire (WHOQOL Group, 1998) before the start of treatment. The purpose was to gain more in-depth knowledge of an understudied patient group and to provide a prospective basis for later analyses of treatment outcome. Factor analysis of trauma-related Rorschach variables gave 2 components explaining 60% of the variance; the first was interpreted as trauma-related flooding versus constriction and the second as adequate versus impaired reality testing. Component 1 correlated positively with self-reported reexperiencing symptoms of posttraumatic stress (r = .32, p < .05). Component 2 correlated positively with self-reported quality of life in the physical, psychological, and social relationships domains (r = .34, .32, and .35, p < .05), and negatively with anxiety (r = -.33, p < .05). Each component also correlated significantly with resources such as work experience, education, and language skills. PMID:23570250

  5. Assessment

    Institute of Scientific and Technical Information of China (English)

    Geoff Brindley

    2005-01-01

Introduction TERMINOLOGY AND KEY CONCEPTS The term assessment refers to a variety of ways of collecting information on a learner's language ability or achievement. Although testing and assessment are often used interchangeably, the latter is an umbrella term encompassing measurement instruments administered on a 'one-off' basis, such as tests, as well as qualitative methods of monitoring and recording student learning, such as observation, simulations, or project work. Assessment is also distinguished from evaluation, which is concerned with the overall language programme and not just with what individual students have learnt. Proficiency assessment refers to the assessment of general language abilities acquired by the learner independent of a course of study. This kind of assessment is often done through the administration of standardised commercial language-proficiency tests. On the other hand, assessment of achievement aims to establish what a student has learned in relation to a particular course or curriculum (and is thus frequently carried out by the teacher). Achievement assessment may be based either on the specific content of the course or on the course objectives (Hughes 1989).

  6. Digital image analysis outperforms manual biomarker assessment in breast cancer.

    Science.gov (United States)

    Stålhammar, Gustav; Fuentes Martinez, Nelson; Lippert, Michael; Tobin, Nicholas P; Mølholm, Ida; Kis, Lorand; Rosin, Gustaf; Rantalainen, Mattias; Pedersen, Lars; Bergh, Jonas; Grunkin, Michael; Hartman, Johan

    2016-04-01

In the spectrum of breast cancers, categorization according to the four gene expression-based subtypes 'Luminal A,' 'Luminal B,' 'HER2-enriched,' and 'Basal-like' is the method of choice for prognostic and predictive value. As gene expression assays are not yet universally available, routine immunohistochemical stains act as surrogate markers for these subtypes. Thus, congruence of surrogate markers and gene expression tests is of utmost importance. In this study, 3 cohorts of primary breast cancer specimens (total n=436) with up to 28 years of survival data were scored for Ki67, ER, PR, and HER2 status manually and by digital image analysis (DIA). The results were then compared for sensitivity and specificity for the Luminal B subtype, concordance with PAM50 assays in subtype classification, and prognostic power. The DIA system used was the Visiopharm Integrator System. DIA outperformed manual scoring in terms of sensitivity and specificity for the Luminal B subtype, widely considered the most challenging distinction in surrogate subclassification, and produced slightly better concordance and Cohen's κ agreement with PAM50 gene expression assays. Manual biomarker scores and DIA essentially matched each other for Cox regression hazard ratios for all-cause mortality. When the Nottingham combined histologic grade (Elston-Ellis) was used as a prognostic surrogate, stronger Spearman's rank-order correlations were produced by DIA. The prognostic value of Ki67 scores in terms of likelihood ratio χ² (LR χ²) was higher for DIA, which also added significantly more prognostic information to the manual scores (LR Δχ²). In conclusion, the system for DIA evaluated here was in most aspects a superior alternative to manual biomarker scoring. It also has the potential to reduce time consumption for pathologists, as many of the steps in the workflow are either automatic or feasible to manage without pathological expertise. PMID:26916072

  7. FEBEX II Project Post-mortem analysis EDZ assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-07-01

    variations in the propagation velocities of acoustic waves. A cylindrical block of granite 38.8 cm in diameter and 40 cm high has been analysed along 2D transversal sections in six radial directions. Different inverse tomographic strategies have been used to analyse the measured data, which showed no evidence of an EDZ in the FEBEX gallery. However, a preferential direction in the wave propagation similar to the maximum compression direction of the stress tensor has appeared. As for in situ investigations, the hydraulic connectivity of the drift has been assessed at eleven locations in the heated area, including granite matrix and lamprophyre dykes, and at six locations in undisturbed zones. In the granite matrix area, a pulse test using pressurised air with stepwise pressure increases was conducted to determine the gas entry pressure. In the fractured area, a constant-flow-rate gas injection test was conducted. Only two locations with higher permeability were detected; one in a natural fracture in the lamprophyre dyke and the other in the interface between lamprophyre and granite. Where numerical investigations are concerned, several analyses of the FEBEX in situ experiment were carried out to determine whether the generation of an EDZ in the surrounding rock was possible. Stresses have been calculated by a 1D fully coupled thermo-hydromechanical model and by 2D and 3D thermo-mechanical models. Results compared with the available data on the compressive strength of the Grimsel granite show that in the worst case studied, the state of stresses induced by the excavation and the heating phases remains far below the critical curve. (Author)

  8. Assessment of bone formation capacity using in vivo transplantation assays: procedure and tissue analysis

    DEFF Research Database (Denmark)

    Abdallah, Basem; Ditzel, Nicholas; Kassem, Moustapha

    2008-01-01

    In vivo assessment of bone formation (osteogenesis) potential by isolated cells is an important method for analysis of cells and factors controlling bone formation. Currently, cell implantation mixed with hydroxyapatite/tricalcium phosphate in an open system (subcutaneous implantation) in immunodeficient mice is the standard method for in vivo assessment of the bone formation capacity of a particular cell type. The method is easy to perform and provides reproducible results. Assessment of the donor origin of tissue formation is possible, especially in the case of human-to-mouse transplantation... transplantation methods in testing bone formation potential of human mesenchymal stem cells...

  9. Interconnectivity among Assessments from Rating Agencies: Using Cluster and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jaroslav Krejčíř

    2014-09-01

    Full Text Available The aim of this paper is to determine whether there is a dependency among the assessments of leading rating agencies. Rating agencies are an important part of the global economy. Great attention has been paid to their activities since 2007, when the financial crisis began; credit rating agencies were identified as one of its main causes. This paper focuses on the existence of mutual interconnectivity among the assessments of three leading rating agencies. The method used for this determination is based on cluster analysis, followed by correlation analysis and a test of independence. Credit rating assessments of Greece and Spain were chosen for the determination of this mutual interconnectivity because these countries are among the most discussed euro-area countries. A significant dependence among the assessments from different rating agencies has been demonstrated.
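    The record above names cluster analysis followed by correlation analysis and a test of independence. As a rough sketch of the correlation step only (the series below are invented ordinal encodings of ratings, not the agencies' actual assessments of Greece or Spain), Spearman rank correlation between agencies could be computed as:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Hypothetical ordinal encodings of sovereign ratings (higher = better),
    # one series per agency; values are illustrative, not real rating data.
    moodys = np.array([16, 15, 13, 10, 6, 4, 3, 3])
    sp     = np.array([16, 14, 12,  9, 6, 5, 3, 2])
    fitch  = np.array([15, 15, 12, 10, 7, 4, 4, 3])

    # Pairwise Spearman rank correlation as a simple dependence measure
    rho_ms, p_ms = spearmanr(moodys, sp)
    rho_mf, p_mf = spearmanr(moodys, fitch)
    print(f"Moody's vs S&P:   rho={rho_ms:.3f} (p={p_ms:.4f})")
    print(f"Moody's vs Fitch: rho={rho_mf:.3f} (p={p_mf:.4f})")
    ```

    In the paper's workflow, cluster analysis would first group periods or entities with similar rating trajectories before the dependence of the agencies' assessments is tested.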

  10. A multiple-imputation based approach to sensitivity analysis and effectiveness assessment in longitudinal clinical trials

    OpenAIRE

    Teshome Ayele, Birhanu; Lipkovich, Ilya; Molenberghs, Geert; Mallinckrodt, Craig H

    2014-01-01

    It is important to understand the effects of a drug as actually taken (effectiveness) and when taken as directed (efficacy). The primary objective of this investigation was to assess the statistical performance of a method referred to as placebo multiple imputation (pMI) as an estimator of effectiveness and as a worst reasonable case sensitivity analysis in assessing efficacy. The pMI method assumes the statistical behavior of placebo- and drug-treated patients after dropout is the statistica...

  11. An Analysis Of Tensile Test Results to Assess the Innovation Risk for an Additive Manufacturing Technology

    OpenAIRE

    Adamczak Stanisław; Bochnia Jerzy; Kaczmarska Bożena

    2015-01-01

    The aim of this study was to assess the innovation risk for an additive manufacturing process. The analysis was based on the results of static tensile tests obtained for specimens made of photocured resin. The assessment involved analyzing the measurement uncertainty by applying the FMEA method. The structure of the causes and effects of the discrepancies was illustrated using the Ishikawa diagram. The risk priority numbers were calculated. The uncertainty of the tensile test measurement was ...

  12. Quantitative Assessment of Flame Stability Through Image Processing and Spectral Analysis

    OpenAIRE

    Sun, Duo; Lu, Gang; Zhou, Hao; Yan, Yong; Liu, Shi

    2015-01-01

    This paper experimentally investigates two generalized methods, i.e., a simple universal index and oscillation frequency, for the quantitative assessment of flame stability at fossil-fuel-fired furnaces. The index is proposed to assess the stability of flame in terms of its color, geometry, and luminance. It is designed by combining up to seven characteristic parameters extracted from flame images. The oscillation frequency is derived from the spectral analysis of flame radiation signals. The...

  13. Application of inelastic neutron scattering and prompt neutron activation analysis in coal quality assessment

    International Nuclear Information System (INIS)

    The basic principles are assessed of the determination of ash content in coal based on the measurement of values proportional to the effective proton number. Discussed is the principle of coal quality assessment using the method of inelastic neutron scattering and prompt neutron activation analysis. This is done with respect both to theoretical relations between measured values and coal quality attributes and to practical laboratory measurements of coal sample quality by the said methods. (author)

  14. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    OpenAIRE

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been done on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm, which is based on an auditory model that simulates the human auditory system. The auditory model is based on spectro-temporal modulation analysis of the spectrogram, which has been proven to be ...

  15. Endogenous allergens and compositional analysis in the allergenicity assessment of genetically modified plants.

    Science.gov (United States)

    Fernandez, A; Mills, E N C; Lovik, M; Spoek, A; Germini, A; Mikalsen, A; Wal, J M

    2013-12-01

    Allergenicity assessment of genetically modified (GM) plants is one of the key pillars in the safety assessment process of these products. As part of this evaluation, one of the concerns is to assess that unintended effects (e.g. over-expression of endogenous allergens) relevant for the food safety have not occurred due to the genetic modification. Novel technologies are now available and could be used as complementary and/or alternative methods to those based on human sera for the assessment of endogenous allergenicity. In view of these developments and as a step forward in the allergenicity assessment of GM plants, it is recommended that known endogenous allergens are included in the compositional analysis as additional parameters to be measured.

  16. Establishment of a Risk Assessment Framework for Analysis of the Spread of Highly Pathogenic Avian Influenza

    Institute of Scientific and Technical Information of China (English)

    LI Jing; WANG Jing-fei; WU Chun-yan; YANG Yan-tao; JI Zeng-tao; WANG Hong-bin

    2007-01-01

    To evaluate the risk of highly pathogenic avian influenza (HPAI) in mainland China, a risk assessment framework was built. Risk factors were determined by analyzing the epidemic data using the brainstorming method; the analytic hierarchy process was designed to weigh risk factors, and the integrated multicriteria analysis was used to evaluate the final result. The completed framework included the risk factor system, data standards for risk factors, weights of risk factors, and integrated assessment methods. This risk assessment framework can be used to quantitatively analyze the outbreak and spread of HPAI in mainland China.
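    The analytic hierarchy process mentioned in this record derives factor weights from a pairwise-comparison matrix. A minimal sketch, with an invented 3×3 comparison matrix rather than the study's actual HPAI risk factors:

    ```python
    import numpy as np

    # Hypothetical pairwise-comparison matrix for three risk factors
    # (Saaty 1-9 scale): A[i, j] = importance of factor i relative to factor j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # AHP weights = normalized principal eigenvector of A
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    print("factor weights:", np.round(w, 3))

    # Consistency index and ratio (random index RI = 0.58 for n = 3);
    # CR < 0.1 is the usual acceptance threshold for the judgments.
    ci = (vals.real[k] - 3) / (3 - 1)
    print(f"consistency ratio: {ci / 0.58:.3f}")
    ```

    The resulting weights would then feed the integrated multicriteria aggregation step described above.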

  17. Uncertainty and Sensitivity Analysis in Performance Assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.

    1998-12-17

    The Waste Isolation Pilot Plant (WIPP) is under development by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. This development has been supported by a sequence of performance assessments (PAs) carried out by Sandia National Laboratories (SNL) to assess what is known about the WIPP and to provide guidance for future DOE research and development activities. Uncertainty and sensitivity analysis procedures based on Latin hypercube sampling and regression techniques play an important role in these PAs by providing an assessment of the uncertainty in important analysis outcomes and identifying the sources of this uncertainty. Performance assessments for the WIPP are conceptually and computationally interesting due to regulatory requirements to assess and display the effects of both stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, where stochastic uncertainty arises from the possible disruptions that could occur over the 10,000 yr regulatory period associated with the WIPP and subjective uncertainty arises from an inability to unambiguously characterize the many models and associated parameters required in a PA for the WIPP. The interplay among uncertainty analysis, sensitivity analysis, stochastic uncertainty and subjective uncertainty is discussed and illustrated in the context of a recent PA carried out by SNL to support an application by the DOE to the U.S. Environmental Protection Agency for the certification of the WIPP for the disposal of TRU waste.
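    As a toy illustration of the sampling-and-regression machinery this record describes (not the WIPP PA codes themselves), Latin hypercube sampling followed by rank regression on a stand-in model can be sketched as:

    ```python
    import numpy as np
    from scipy.stats import qmc, rankdata

    # Latin hypercube sample of 3 uncertain input parameters on [0, 1)
    sampler = qmc.LatinHypercube(d=3, seed=7)
    X = sampler.random(n=100)

    # Stand-in model: output depends strongly on x0, weakly on x1, not on x2
    y = 5.0 * X[:, 0] + 0.5 * X[:, 1] + 0.0 * X[:, 2]

    # Rank-transform inputs and output, then regress: the coefficients play
    # the role of sensitivity measures identifying dominant uncertainty sources.
    Xr = np.column_stack([rankdata(X[:, j]) for j in range(3)])
    yr = rankdata(y)
    design = np.column_stack([np.ones(100), Xr])
    coef, *_ = np.linalg.lstsq(design, yr, rcond=None)
    print("rank-regression coefficients:", np.round(coef[1:], 3))
    ```

    The stratification property of the Latin hypercube (exactly one sample per equal-probability bin in each dimension) is what lets a modest sample size cover the input space.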

  18. An analysis of assessment outcomes from eight years' operation of the Australian border weed risk assessment system.

    Science.gov (United States)

    Weber, Jason; Dane Panetta, F; Virtue, John; Pheloung, Paul

    2009-02-01

    The majority of Australian weeds are exotic plant species that were intentionally introduced for a variety of horticultural and agricultural purposes. A border weed risk assessment system (WRA) was implemented in 1997 in order to reduce the high economic costs and massive environmental damage associated with introducing serious weeds. We review the behaviour of this system with regard to eight years of data collected from the assessment of species proposed for importation or held within genetic resource centres in Australia. From a taxonomic perspective, species from the Chenopodiaceae and Poaceae were most likely to be rejected and those from the Arecaceae and Flacourtiaceae were most likely to be accepted. Dendrogram analysis and classification and regression tree (TREE) models were also used to analyse the data. The latter revealed that a small subset of the 35 variables assessed was highly associated with the outcome of the original assessment. The TREE model examining all of the data contained just five variables: unintentional human dispersal, congeneric weed, weed elsewhere, tolerates or benefits from mutilation, cultivation or fire, and reproduction by vegetative propagation. It gave the same outcome as the full WRA model for 71% of species. Weed elsewhere was not the first splitting variable in this model, indicating that the WRA has a capacity for capturing species that have no history of weediness. A reduced TREE model (in which human-mediated variables had been removed) contained four variables: broad climate suitability, reproduction in less than or equal to 1 year, self-fertilisation, and tolerates and benefits from mutilation, cultivation or fire. It yielded the same outcome as the full WRA model for 65% of species. Data inconsistencies and the relative importance of questions are discussed, with some recommendations made for improving the use of the system. PMID:18339471
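    A classification and regression tree of the kind this record describes can be fitted with standard tools. The sketch below uses invented binary traits loosely named after the WRA variables and a toy decision rule, not the study's dataset:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 200
    # Hypothetical binary traits mirroring some of the WRA variables named above
    weed_elsewhere  = rng.integers(0, 2, n)
    congeneric_weed = rng.integers(0, 2, n)
    veg_propagation = rng.integers(0, 2, n)
    X = np.column_stack([weed_elsewhere, congeneric_weed, veg_propagation])
    # Toy outcome: reject when a weedy history co-occurs with another risk trait
    y = (weed_elsewhere & (congeneric_weed | veg_propagation)).astype(int)

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=[
        "weed_elsewhere", "congeneric_weed", "veg_propagation"]))
    ```

    Reading the printed splits top-down shows which variables dominate the decision, which is how the study compared its TREE models against the full 35-question WRA.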

  19. The role of uncertainty analysis in dose reconstruction and risk assessment

    International Nuclear Information System (INIS)

    Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab

  20. Degradation Assessment and Fault Diagnosis for Roller Bearing Based on AR Model and Fuzzy Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Lingli Jiang

    2011-01-01

    Full Text Available This paper proposes a new approach combining an autoregressive (AR) model and fuzzy cluster analysis for bearing fault diagnosis and degradation assessment. The AR model is an effective approach to extract the fault feature, and is generally applied to stationary signals. However, the fault vibration signals of a roller bearing are non-stationary and non-Gaussian. Aiming at this problem, the set of parameters of the AR model is estimated based on higher-order cumulants. Consequently, the AR parameters are taken as the feature vectors, and fuzzy cluster analysis is applied to perform classification and pattern recognition. Experimental results show that the proposed method can be used to identify various types and severities of fault bearings. This study is significant for non-stationary and non-Gaussian signal analysis, fault diagnosis and degradation assessment.
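    The record estimates AR parameters from higher-order cumulants; as a simpler, hypothetical illustration of the idea of using AR coefficients as feature vectors, ordinary Yule-Walker estimation on a simulated AR(2) signal looks like:

    ```python
    import numpy as np

    def yule_walker(x, order):
        """Estimate AR(order) coefficients from the sample autocovariance."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        n = len(x)
        r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
        R = np.array([[r[abs(i - j)] for j in range(order)]
                      for i in range(order)])
        return np.linalg.solve(R, r[1:])

    rng = np.random.default_rng(1)
    # Simulate a known AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
    true_a = np.array([0.6, -0.3])
    e = rng.standard_normal(5000)
    x = np.zeros(5000)
    for t in range(2, 5000):
        x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + e[t]

    a_hat = yule_walker(x, order=2)
    print("estimated AR coefficients:", np.round(a_hat, 3))
    ```

    In the paper's pipeline, such coefficient vectors (estimated per vibration segment, via cumulants rather than autocovariances) become the inputs to fuzzy clustering for fault-severity classification.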

  1. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    This work presents a methodology for sensitivity and uncertainty analysis, applicable to a level-I probabilistic safety assessment. The work covers: correct association of distributions to parameters, importance and qualification of expert opinions, generation of samples according to sample sizes, and study of the relationships between system variables and system response. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)

  2. Assessing collective defensive performances in football: A Qualitative Comparative Analysis of central back pairs

    OpenAIRE

    Kaufmann, David

    2014-01-01

    Ahead of the World Cup in Brazil the crucial question for the Swiss national coach is the nomination of the starting eleven central back pair. A fuzzy set Qualitative Comparative Analysis assesses the defensive performances of different Swiss central back pairs during the World Cup campaign (2011 – 2014). This analysis advises Ottmar Hitzfeld to nominate Steve von Bergen and Johan Djourou as the starting eleven central back pair. The alternative with a substantially weaker empirical validity ...

  3. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    OpenAIRE

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat; Gani, Rafiqul

    2013-01-01

    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes: process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady state process simulation is performed to generate a base case design of the bioethanol conversion process using cassava rhizome as a feedstock. The sustainability analysis is performed to analyze the relevant indicators in sustainability metrics, to define design/retro...

  4. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    International Nuclear Information System (INIS)

    After radioactive sea debris had been found on beaches near the BNFL, Sellafield, plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  5. The ICR142 NGS validation series: a resource for orthogonal assessment of NGS analysis

    OpenAIRE

    Elise Ruark; Anthony Renwick; Matthew Clarke; Katie Snape; Emma Ramsay; Anna Elliott; Sandra Hanks; Ann Strydom; Sheila Seal; Nazneen Rahman

    2016-01-01

    To provide a useful community resource for orthogonal assessment of NGS analysis software, we present the ICR142 NGS validation series. The dataset includes high-quality exome sequence data from 142 samples together with Sanger sequence data at 730 sites; 409 sites with variants and 321 sites at which variants were called by an NGS analysis tool, but no variant is present in the corresponding Sanger sequence. The dataset includes 286 indel variants and 275 negative indel sites, and thus the I...

  6. Non-linear finite element assessment analysis of a modern heritage structure

    OpenAIRE

    S. Sorace; Terenzi, G

    2011-01-01

    A synthesis of a non-linear finite element structural assessment enquiry carried out on a monumental modern heritage building is reported in this paper. The study includes a buckling analysis of the slender steel beams constituting a mushroom-type roof, and an "integral" seismic pushover analysis of the supporting R/C columns. The computational solutions obtained for the steel roof beams are compared to the results derived from a calculation of the critical stress of beam panels, and the glob...

  7. Assessing Low-Carbon Development in Nigeria : An Analysis of Four Sectors

    OpenAIRE

    Cervigni, Raffaello; Rogers, John Allen; Dvorak, Irina

    2013-01-01

    The Federal Government of Nigeria (FGN) and the World Bank have agreed to carry out a Climate Change Assessment (CCA) within the framework of the Bank's Country Partnership Strategy (CPS) for Nigeria (2010-13). The CCA includes an analysis of options for low-carbon development in selected sectors, including power, oil and gas, transport, and agriculture. The goal of the low-carbon analysis...

  8. Elusive Critical Elements of Transformative Risk Assessment Practice and Interpretation: Is Alternatives Analysis the Next Step?

    Science.gov (United States)

    Francis, Royce A

    2015-11-01

    This article argues that "game-changing" approaches to risk analysis must focus on "democratizing" risk analysis in the same way that information technologies have democratized access to, and production of, knowledge. This argument is motivated by the author's reading of Goble and Bier's analysis, "Risk Assessment Can Be a Game-Changing Information Technology-But Too Often It Isn't" (Risk Analysis, 2013; 33: 1942-1951), in which living risk assessments are shown to be "game changing" in probabilistic risk analysis. In this author's opinion, Goble and Bier's article focuses on living risk assessment's potential for transforming risk analysis from the perspective of risk professionals-yet, the game-changing nature of information technologies has typically achieved a much broader reach. Specifically, information technologies change who has access to, and who can produce, information. From this perspective, the author argues that risk assessment is not a game-changing technology in the same way as the printing press or the Internet because transformative information technologies reduce the cost of production of, and access to, privileged knowledge bases. The author argues that risk analysis does not reduce these costs. The author applies Goble and Bier's metaphor to the chemical risk analysis context, and in doing so proposes key features that transformative risk analysis technology should possess. The author also discusses the challenges and opportunities facing risk analysis in this context. These key features include: clarity in information structure and problem representation, economical information dissemination, increased transparency to nonspecialists, democratized manufacture and transmission of knowledge, and democratic ownership, control, and interpretation of knowledge. The chemical safety decision-making context illustrates the impact of changing the way information is produced and accessed in the risk context. Ultimately, the author concludes that although...

  9. Method of synchronization assessment of rhythms in regulatory systems for signal analysis in real time

    OpenAIRE

    Borovkova E.l.; Ishbulatov Yu.M.; Mironov S.A.

    2014-01-01

    A method is proposed for quantitative assessment of the phase synchronization of 0.1 Hz oscillations in autonomic cardiovascular control by photoplethysmogram analysis in real time. The efficiency of the method is shown in the comparison with the results obtained by the previously developed method.

  10. Method of synchronization assessment of rhythms in regulatory systems for signal analysis in real time

    Directory of Open Access Journals (Sweden)

    Borovkova E.l.

    2014-09-01

    Full Text Available A method is proposed for quantitative assessment of the phase synchronization of 0.1 Hz oscillations in autonomic cardiovascular control by photoplethysmogram analysis in real time. The efficiency of the method is shown in the comparison with the results obtained by the previously developed method.
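    One common way to quantify phase synchronization of two 0.1 Hz oscillations (illustrative only; the abstracts above do not specify the authors' exact estimator) is the Hilbert-transform phase-locking value:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 50.0                       # sampling rate, Hz
    t = np.arange(0, 120, 1 / fs)   # two minutes of signal
    rng = np.random.default_rng(2)

    # Two noisy 0.1 Hz oscillations with a fixed phase offset (synchronized case);
    # these stand in for, e.g., photoplethysmogram-derived control signals.
    s1 = np.sin(2 * np.pi * 0.1 * t) + 0.3 * rng.standard_normal(t.size)
    s2 = np.sin(2 * np.pi * 0.1 * t + 0.8) + 0.3 * rng.standard_normal(t.size)

    # Instantaneous phases via the analytic signal
    phi1 = np.angle(hilbert(s1))
    phi2 = np.angle(hilbert(s2))

    # Phase-locking value: |mean of exp(i*dphi)|; 1 = perfect locking, 0 = none
    plv = np.abs(np.mean(np.exp(1j * (phi1 - phi2))))
    print(f"phase-locking value: {plv:.3f}")
    ```

    In practice the signals would first be band-pass filtered around 0.1 Hz, and the statistic computed in a sliding window to obtain a real-time synchronization index.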

  11. Assessment of Smolt Condition for Travel Time Analysis, 1993-1994 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Schrock, Robin M; Beeman, John W; VanderKooi, Scott P [US Geological Survey, Western Fisheries Research Center, Columbia River Research Laboratory, Cook, WA

    1999-02-01

    The assessment of smolt condition for travel time analysis (ASCTTA) project provided information on the level of smoltification in Columbia River hatchery and wild salmonid stocks to the Fish Passage Center (FPC), for the primary purpose of in-river management of flows.

  12. Integrating Life-cycle Assessment into Transport Cost-benefit Analysis

    DEFF Research Database (Denmark)

    Manzo, Stefano; Salling, Kim Bang

    2016-01-01

    Traditional transport Cost-Benefit Analysis (CBA) commonly ignores the indirect environmental impacts of an infrastructure project deriving from the overall life-cycle of the different project components. Such indirect impacts are instead of key importance in order to assess the long-term sustain...

  13. Computational Psycholinguistic Analysis and Its Application in Psychological Assessment of College Students

    Science.gov (United States)

    Kucera, Dalibor; Havigerová, Jana M.

    2015-01-01

    The paper deals with the issue of computational psycholinguistic analysis (CPA) and its experimental application in basic psychological and pedagogical assessment. CPA is a new method which may potentially provide interesting, psychologically relevant information about the author of a particular text, regardless of the text's factual (semantic)…

  14. Identifying Students with Learning Disabilities: Composite Profile Analysis Using the Cognitive Assessment System

    Science.gov (United States)

    Huang, Leesa V.; Bardos, Achilles N.; D'Amato, Rik Carl

    2010-01-01

    The detection of cognitive patterns in children with learning disabilities (LD) has been a priority in the identification process. Subtest profile analysis from traditional cognitive assessment has drawn sharp criticism for inaccurate identification and weak connections to educational planning. Therefore, the purpose of this study is to use a new…

  15. Designing student peer assessment in higher education: Analysis of written and oral peer feedback

    NARCIS (Netherlands)

    van den Berg, I.; Admiraal, W.; Pilot, A.

    2006-01-01

    Designing student peer assessment in higher education: analysis of written and oral peer feedback Relating it to design features, the present article describes the nature of written and oral peer feedback as it occurred in seven writing courses, each with a different PA design. Results indicate that

  16. Language Assessment Impacts in China: A Tentative Analysis of TEM8

    Institute of Scientific and Technical Information of China (English)

    Cui Yingqiong; Cheng Hongying

    2015-01-01

    The paper aims to present a tentative analysis of the impacts of language assessment on relevant parties. It starts by discussing the connotation of test impact and then analyses the impacts of TEM8 on test takers, teachers and society at large. The importance of such impacts is also revealed in this paper.

  17. Assessing Model Fit: Caveats and Recommendations for Confirmatory Factor Analysis and Exploratory Structural Equation Modeling

    Science.gov (United States)

    Perry, John L.; Nicholls, Adam R.; Clough, Peter J.; Crust, Lee

    2015-01-01

    Despite the limitations of overgeneralizing cutoff values for confirmatory factor analysis (CFA; e.g., Marsh, Hau, & Wen, 2004), they are still often employed as golden rules for assessing factorial validity in sport and exercise psychology. The purpose of this study was to investigate the appropriateness of using the CFA approach with these…

  18. Combining a building simulation with energy systems analysis to assess the benefits of natural ventilation

    DEFF Research Database (Denmark)

    Oropeza-Perez, Ivan; Østergaard, Poul Alberg; Remmen, Arne

    2013-01-01

    a thermal air flow simulation program - into the energy systems analysis model. Descriptions of the energy systems in two geographical locations, i.e. Mexico and Denmark, are set up as inputs. Then, the assessment is done by calculating the energy impacts as well as environmental benefits in the energy...

  19. A sensitivity analysis of a radiological assessment model for Arctic waters

    DEFF Research Database (Denmark)

    Nielsen, S.P.

    1998-01-01

    A model based on compartment analysis has been developed to simulate the dispersion of radionuclides in Arctic waters for an assessment of doses to man. The model predicts concentrations of radionuclides in the marine environment and doses to man from a range of exposure pathways. A parameter...

  20. Cluster Analysis of Assessment in Anatomy and Physiology for Health Science Undergraduates

    Science.gov (United States)

    Brown, Stephen; White, Sue; Power, Nicola

    2016-01-01

    Academic content common to health science programs is often taught to a mixed group of students; however, content assessment may be consistent for each discipline. This study used a retrospective cluster analysis on such a group, first to identify high and low achieving students, and second, to determine the distribution of students within…

  1. Further potentials in the joint implementation of life cycle assessment and data envelopment analysis.

    Science.gov (United States)

    Iribarren, Diego; Vázquez-Rowe, Ian; Moreira, María Teresa; Feijoo, Gumersindo

    2010-10-15

    The combined application of Life Cycle Assessment and Data Envelopment Analysis has been recently proposed to provide a tool for the comprehensive assessment of the environmental and operational performance of multiple similar entities. Among the acknowledged advantages of LCA+DEA methodology, eco-efficiency verification and avoidance of average inventories are usually highlighted. However, given the novelty of LCA+DEA methods, a high number of additional potentials remain unexplored. In this sense, there are some features that are worth detailing given their wide interest to enhance LCA performance. Emphasis is laid on the improved interpretation of LCA results through the complementary use of DEA with respect to: (i) super-efficiency analysis to facilitate the selection of reference performers, (ii) inter- and intra-assessments of multiple data sets within any specific sector for benchmarking and trend analysis purposes, (iii) integration of an economic dimension in order to enrich sustainability assessments, and (iv) window analysis to evaluate environmental impact efficiency over a certain period of time. Furthermore, the capability of LCA+DEA methodology to be generally implemented in a wide range of scenarios is discussed. These further potentials are explained and demonstrated via the presentation of brief case studies based on real data sets.
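    The DEA side of LCA+DEA reduces to one linear program per decision-making unit. A minimal input-oriented CCR sketch with invented single-input, single-output data (real LCA+DEA studies use multiple inventory flows as inputs):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """Input-oriented CCR efficiency of DMU o (columns of X/Y are DMUs)."""
        m, n = X.shape          # m inputs, n DMUs
        s, _ = Y.shape          # s outputs
        # Decision vector: [theta, lambda_1..lambda_n]; minimize theta
        c = np.r_[1.0, np.zeros(n)]
        # X @ lam <= theta * x_o   ->   -theta * x_o + X @ lam <= 0
        A1 = np.hstack([-X[:, [o]], X])
        b1 = np.zeros(m)
        # Y @ lam >= y_o           ->   -Y @ lam <= -y_o
        A2 = np.hstack([np.zeros((s, 1)), -Y])
        b2 = -Y[:, o]
        res = linprog(c, A_ub=np.vstack([A1, A2]), b_ub=np.r_[b1, b2],
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    # Toy data: 4 plants, 1 input (e.g. energy use), 1 output (e.g. product)
    X = np.array([[2.0, 4.0, 4.0, 8.0]])   # inputs
    Y = np.array([[1.0, 2.0, 1.0, 4.0]])   # outputs
    effs = [ccr_efficiency(X, Y, o) for o in range(4)]
    print(np.round(effs, 3))
    ```

    An efficiency of 1 marks a reference performer; window analysis, as proposed in point (iv), would repeat this computation over successive time periods.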

  2. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  3. Substituted plan analysis in the environmental impact assessment of Yongchuan wastewater treatment project

    Institute of Scientific and Technical Information of China (English)

    FANG Jun-hua

    2006-01-01

    In environmental impact assessment (EIA), the substituted plan mainly refers to the treatment technology and the alternative plant site, and it also covers many kinds of environmental protection measures. This paper makes a detailed analysis of the treatment technology, the alternative plant site, the use of the discharged water and the disposal of sludge in the Yongchuan wastewater treatment project.

  4. MONTHLY VARIATION IN SPERM MOTILITY IN COMMON CARP ASSESSED USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Sperm motility variables from the milt of the common carp Cyprinus carpio were assessed using a computer-assisted sperm analysis (CASA) system across several months (March-August 1992) known to encompass the natural spawning period. Two-year-old pond-raised males obtained each mo...

  5. A Bayesian latent group analysis for detecting poor effort in the assessment of malingering

    NARCIS (Netherlands)

    A. Ortega; E.-J. Wagenmakers; M.D. Lee; H.J. Markowitsch; M. Piefke

    2012-01-01

    Despite their theoretical appeal, Bayesian methods for the assessment of poor effort and malingering are still rarely used in neuropsychological research and clinical diagnosis. In this article, we outline a novel and easy-to-use Bayesian latent group analysis of malingering whose goal is to identif

  6. Structural Reliability Assessment by Integrating Sensitivity Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Shao-Fei Jiang

    2014-01-01

    To reduce runtime while ensuring sufficient computational accuracy, this paper proposes a structural reliability assessment method based on sensitivity analysis (SA) and a support vector machine (SVM). Sensitivity analysis is first applied to assess the effect of the random variables on the value of the performance function, and low-influence variables are excluded from the SVM input vectors. The trained SVM is then used to classify input vectors produced by sampling the remaining variables according to their distributions. Finally, the reliability assessment is carried out with the aid of reliability theory. A 10-bar planar truss is used to validate the feasibility and efficiency of the proposed method, and its performance is compared with that of existing methods. The results show that the proposed method greatly reduces runtime with little loss of accuracy; indeed, its accuracy is the highest among the methods compared.
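The two building blocks of the approach, a sensitivity screen followed by a reliability estimate, can be sketched as follows. The limit-state function, distributions, and numbers are hypothetical, and the SVM surrogate of the paper is replaced here by direct Monte Carlo classification for brevity.

```python
import random

random.seed(42)

# Hypothetical limit-state function for a single bar: g > 0 is safe.
# x = (yield stress, applied load, cross-section area); units are illustrative.
def g(ys, load, area):
    return ys * area - load

MEANS = (250.0, 100.0, 0.6)
SDS   = (25.0, 15.0, 0.01)

def sensitivity():
    # One-at-a-time screen: swing each input by +/- one standard deviation
    # and record the resulting change in g; small swings flag variables
    # that could be dropped from the surrogate's input vector.
    swings = []
    for i in range(3):
        hi = list(MEANS); hi[i] = MEANS[i] + SDS[i]
        lo = list(MEANS); lo[i] = MEANS[i] - SDS[i]
        swings.append(abs(g(*hi) - g(*lo)))
    return swings

def pf(n=20000):
    # Crude Monte Carlo estimate of the failure probability P(g <= 0);
    # the paper classifies these samples with a trained SVM instead.
    fails = 0
    for _ in range(n):
        x = [random.gauss(m, s) for m, s in zip(MEANS, SDS)]
        if g(*x) <= 0:
            fails += 1
    return fails / n

sens = sensitivity()
p_fail = pf()
```

Here the area variable produces the smallest swing, so it would be the candidate for exclusion before training the surrogate.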

  7. 2D Monte Carlo analysis of radiological risk assessment for the food intake in Korea

    International Nuclear Information System (INIS)

    Most public health risk assessments assume and combine a series of average, conservative, and worst-case values to derive an acceptable point estimate of risk. To improve the quality of risk information, insight into the uncertainty of the assessments is needed, and more emphasis is being put on probabilistic risk assessment. Probabilistic risk assessment studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. In this study, an advanced technique called two-dimensional Monte Carlo analysis (2D MCA) is applied to the estimation of internal doses from the intake of radionuclides in foodstuffs and drinking water in Korea. The variables of the risk model, along with the parameters of these variables, are described in terms of probability density functions (PDFs). In addition, sensitivity analyses were performed to identify the factors important to the radiation doses. (author)
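The defining feature of 2D MCA is the nested sampling loop: the outer loop samples the *uncertain* parameters, the inner loop samples the quantities that are *variable* across the population, yielding a distribution of dose distributions. The distributions and parameter values below are hypothetical placeholders, not the Korean assessment's inputs.

```python
import random

random.seed(1)

def dose_percentiles(n_outer=200, n_inner=500):
    """Return an uncertainty band (5th, 95th) on the population 95th-percentile dose."""
    p95s = []
    for _ in range(n_outer):
        # outer loop = epistemic uncertainty: e.g. a dose coefficient (Sv/Bq)
        coeff = random.lognormvariate(-20.0, 0.3)
        doses = []
        for _ in range(n_inner):
            # inner loop = inter-individual variability: annual intake (Bq)
            intake = random.lognormvariate(5.0, 0.5)
            doses.append(coeff * intake)
        doses.sort()
        p95s.append(doses[int(0.95 * n_inner)])   # variability percentile
    p95s.sort()
    # spread of that percentile across outer samples = uncertainty band
    return p95s[int(0.05 * n_outer)], p95s[int(0.95 * n_outer)]

lo, hi = dose_percentiles()
```

Reporting the band (`lo`, `hi`) rather than a single value is what distinguishes the 2D analysis from an ordinary one-dimensional Monte Carlo run.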

  8. Benefits and risks of emerging technologies: integrating life cycle assessment and decision analysis to assess lumber treatment alternatives.

    Science.gov (United States)

    Tsang, Michael P; Bates, Matthew E; Madison, Marcus; Linkov, Igor

    2014-10-01

    Assessing the best options among emerging technologies (e.g., new chemicals, nanotechnologies) is complicated because of trade-offs across benefits and risks that are difficult to quantify given the limited and fragmented availability of information. This study demonstrates the integration of multicriteria decision analysis (MCDA) and life cycle assessment (LCA) to address technology alternative selection decisions. As a case study, the prioritization of six lumber treatment alternatives [micronized copper quaternary (MCQ); alkaline copper quaternary (ACQ); water-borne copper naphthenate (CN); oil-borne copper naphthenate (CNo); water-borne copper quinolate (CQ); and water-borne zinc naphthenate (ZN)] for military use is considered. Multiattribute value theory (MAVT) is used to derive risk and benefit scores. Risk scores are calculated using a cradle-to-gate LCA. Benefit scores are calculated by scoring cost, durability, and corrosiveness criteria. Three weighting schemes are used, representing Environmental, Military, and Balanced stakeholder perspectives. Aggregated scores from all three perspectives show CQ to be the least favorable alternative. MCQ is identified as the most favorable alternative from the Environmental stakeholder perspective. From the Military stakeholder perspective, ZN is determined to be the most favorable alternative, followed closely by MCQ. This type of scoring and ranking of multiple heterogeneous criteria in a systematic and transparent way facilitates better justification of technology selection and regulation. PMID:25209330
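The MAVT aggregation step can be sketched as a weighted additive value model: each alternative receives normalised criterion scores in [0, 1], and stakeholder perspectives differ only in the weights applied to those criteria. The scores and weights below are invented for illustration and are not the study's data.

```python
scores = {           # alternative -> (risk, cost, durability, corrosion), higher = better
    "MCQ": (0.9, 0.8, 0.7, 0.8),
    "ACQ": (0.6, 0.7, 0.7, 0.6),
    "ZN":  (0.7, 0.9, 0.8, 0.9),
}

weights = {          # perspective -> criterion weights (sum to 1)
    "Environmental": (0.6, 0.1, 0.2, 0.1),
    "Military":      (0.2, 0.4, 0.2, 0.2),
    "Balanced":      (0.25, 0.25, 0.25, 0.25),
}

def rank(perspective):
    # additive value: v(a) = sum_i w_i * s_i(a); return alternatives best-first
    w = weights[perspective]
    value = {a: sum(wi * si for wi, si in zip(w, s))
             for a, s in scores.items()}
    return sorted(value, key=value.get, reverse=True)
```

With these illustrative numbers, the Environmental weighting favours the risk-dominant MCQ while the Military weighting, which stresses cost, puts ZN first, mirroring how a change of weights alone can reorder the alternatives.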

  9. Development of Probabilistic Uncertainty Analysis Methodology for SRS Performance Assessments Maintenance Plan Activities

    International Nuclear Information System (INIS)

    An initial uncertainty analysis of the Performance Assessment (PA) model of the Savannah River Site (SRS) trench disposal unit was conducted. Selected input data values were varied for both flow and transport analyses to generate input sets called realizations. Outputs of fluxes to the water table and well concentrations were compared to results from the PA. This stage of the uncertainty analysis served as a prototype for future work; the focus was to lay the foundation for a more comprehensive analysis, generate a limited set of output results, and learn about the process and potential problems.

  10. Analysis And Assessment Of The Security Method Against Incidental Contamination In The Collective Water Supply System

    Directory of Open Access Journals (Sweden)

    Szpak Dawid

    2015-09-01

    The paper presents the main types of incidental contamination of surface water and methods of securing water sources against incidental contamination. An analysis and assessment of the protection of a collective water supply system (CWSS) against incidental contamination was conducted using Failure Mode and Effects Analysis (FMEA). The FMEA method allows analysis of a product or process, identification of weak points, and implementation of corrections and new solutions that eliminate the sources of undesirable events. The developed methodology is illustrated with an application case. It was found that the risk of contamination of the water-pipe network of the analyzed CWSS caused by incidental contamination of the water source is at a controlled level.
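The core FMEA calculation is compact enough to show directly: each failure mode is rated (conventionally on a 1-10 scale) for severity (S), occurrence (O), and detection difficulty (D), and the risk priority number RPN = S x O x D ranks the weak points. The failure modes and ratings below are hypothetical examples, not the paper's case study values.

```python
failure_modes = {
    # failure mode -> (severity, occurrence, detection difficulty), each 1-10
    "source contamination reaches intake": (9, 4, 6),
    "monitoring sensor fails":             (7, 3, 4),
    "operator misses alarm":               (8, 2, 3),
}

def rpn_ranking(modes):
    # RPN = S * O * D; higher RPN = higher priority for corrective action
    rpn = {name: s * o * d for name, (s, o, d) in modes.items()}
    return sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)

ranking = rpn_ranking(failure_modes)
```

Corrective measures are then targeted at the top of the ranking, and the RPNs are recomputed after the measures are in place to verify that the risk has dropped to a controlled level.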

  11. Application of synthetic principal component analysis model to mine area farmland heavy metal pollution assessment

    Institute of Scientific and Technical Information of China (English)

    WANG Cong-lu; WU Chao; WANG Wei-jun

    2008-01-01

    Referring to standard GB5618-1995 on heavy metal pollution and using the statistical package SPSS, the major pollutants responsible for heavy metal pollution of mine-area farmland were identified by variable clustering analysis, and the pollution situation was assessed and classified by synthetic principal component analysis (PCA). The results show that variable clustering analysis efficiently identifies the principal components of mine-area farmland heavy metal pollution. The soil samples were sorted and clustered by the synthetic principal component scores given by the analysis, revealing the data structure of soil heavy metal contamination, the relationships among soil samples, and the pollution levels of different samples. The assessment and classification results based on synthetic component scores reflect the influence of both the major and the compound heavy metal pollutants. These identification and assessment results can provide reference and guidance for proposing control measures for mine-area farmland heavy metal pollution and for focusing on the key treatment regions.

  12. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    Science.gov (United States)

    Brignon, Jean-Marc

    2011-07-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials, is the comparison with alternatives in socio-economic scenarios, which is key for the understanding of how a nanomaterial "socially" performs in comparison with its alternatives. "Industrial economics" methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about economic competitiveness implications of regulating nanotechnologies, SEA and Life Cycle Analysis (LCA) can complement each other : Socio-Economic LCA are increasingly seen as a complete assessment tool for nanotechnologies, but the perspective between Social LCA and SEA are different and the respective merits and limitations of both approaches should be kept in mind. SEA is a "pragmatic regulatory impact analysis", that uses a cost/benefit framework analysis but remains open to other disciplines than economy, and open to the participation of stakeholders for the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is "pragmatic" in the sense that it is driven by the purpose to assess "what happens" with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptative, which is a key quality to handle the novelty of

  13. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    International Nuclear Information System (INIS)

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials, is the comparison with alternatives in socio-economic scenarios, which is key for the understanding of how a nanomaterial 'socially' performs in comparison with its alternatives. 'Industrial economics' methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about economic competitiveness implications of regulating nanotechnologies, SEA and Life Cycle Analysis (LCA) can complement each other : Socio-Economic LCA are increasingly seen as a complete assessment tool for nanotechnologies, but the perspective between Social LCA and SEA are different and the respective merits and limitations of both approaches should be kept in mind. SEA is a 'pragmatic regulatory impact analysis', that uses a cost/benefit framework analysis but remains open to other disciplines than economy, and open to the participation of stakeholders for the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is 'pragmatic' in the sense that it is driven by the purpose to assess 'what happens' with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptative, which is a key quality to handle the novelty of

  14. Assessment of dietary patterns in nutritional epidemiology: principal component analysis compared with confirmatory factor analysis.

    OpenAIRE

    Varraso, Raphaëlle; Garcia-Aymerich, Judith; Monier, Florent; Le Moual, Nicole; de Batlle, Jordi; Miranda, Gemma; Pison, Christophe,; Romieu, Isabelle; Kauffmann, Francine; Maccario, Jean

    2012-01-01

    BACKGROUND: In the field of nutritional epidemiology, principal component analysis (PCA) has been used to derive patterns, but the robustness of interpretation might be an issue when the sample size is small. The authors proposed the alternative use of confirmatory factor analysis (CFA) to define such patterns. OBJECTIVE: The aim was to compare dietary patterns derived through PCA and CFA used as equivalent approaches in terms of stability and relevance. DESIGN: PCA and CFA were performed in ...

  15. Bibliometric analysis of global environmental assessment research in a 20-year period

    International Nuclear Information System (INIS)

    Based on the samples of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. A frequently used keywords analysis found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (i.e., environmental indicator selection, life cycle assessment, etc.), along with new technologies and methods (i.e., the geographic information system and modeling) have been widely applied in the EA research field over the past 20 years. Hot spots such as “biodiversity” and “climate change” have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field
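The h-index used above to compare research quality across countries has a simple definition: a set of papers has index h if h of them have at least h citations each. A minimal implementation (with made-up citation counts in the test data) looks like this:

```python
def h_index(citations):
    """Largest h such that h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:      # the i-th most-cited paper still has >= i citations
            h = i
        else:
            break
    return h
```

Applied per country over its EA publications, the index rewards a body of work that is both sizable and consistently cited, rather than a single highly cited outlier.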

  16. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F.; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change assessed was based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional, and easy to understand and communicate index to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  17. Bibliometric analysis of global environmental assessment research in a 20-year period

    Energy Technology Data Exchange (ETDEWEB)

    Li, Wei, E-mail: weili@bnu.edu.cn; Zhao, Yang

    2015-01-15

    Based on the samples of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. A frequently used keywords analysis found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (i.e., environmental indicator selection, life cycle assessment, etc.), along with new technologies and methods (i.e., the geographic information system and modeling) have been widely applied in the EA research field over the past 20 years. Hot spots such as “biodiversity” and “climate change” have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field.

  18. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change assessed was based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional, and easy to understand and communicate index to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  19. Urban water metabolism efficiency assessment: integrated analysis of available and virtual water.

    Science.gov (United States)

    Huang, Chu-Long; Vause, Jonathan; Ma, Hwong-Wen; Yu, Chang-Ping

    2013-05-01

    Resolving the complex environmental problems of water pollution and shortage which occur during urbanization requires the systematic assessment of urban water metabolism efficiency (WME). While previous research has tended to focus on either available or virtual water metabolism, here we argue that the systematic problems arising during urbanization require an integrated assessment of available and virtual WME, using an indicator system based on material flow analysis (MFA) results. Future research should focus on the following areas: 1) analysis of available and virtual water flow patterns and processes through urban districts in different urbanization phases in years with varying amounts of rainfall, and their environmental effects; 2) based on the optimization of social, economic and environmental benefits, establishment of an indicator system for urban WME assessment using MFA results; 3) integrated assessment of available and virtual WME in districts with different urbanization levels, to facilitate study of the interactions between the natural and social water cycles; 4) analysis of mechanisms driving differences in WME between districts with different urbanization levels, and the selection of dominant social and economic driving indicators, especially those impacting water resource consumption. Combinations of these driving indicators could then be used to design efficient water resource metabolism solutions, and integrated management policies for reduced water consumption.

  20. Sustainability assessment of nuclear power: Discourse analysis of IAEA and IPCC frameworks

    International Nuclear Information System (INIS)

    Highlights: • Sustainability assessments (SAs) are methodologically precarious. • Discourse analysis reveals how the meaning of sustainability is constructed in SAs. • Discourse analysis is applied on the SAs of nuclear power of IAEA and IPCC. • For IAEA ‘sustainable’ equals ‘complying with best international practices’. • The IAEA framework largely inspires IPCC Fifth Assessment Report. - Abstract: Sustainability assessments (SAs) are methodologically precarious. Value-based judgments inevitably play a role in setting the scope of the SA, selecting assessment criteria and indicators, collecting adequate data, and developing and using models of considered systems. Discourse analysis can reveal how the meaning and operationalization of sustainability is constructed in and through SAs. Our discourse-analytical approach investigates how sustainability is channeled from ‘manifest image’ (broad but shallow), to ‘vision’, to ‘policy targets’ (specific and practical). This approach is applied on the SA frameworks used by IAEA and IPCC to assess the sustainability of the nuclear power option. The essentially problematic conclusion is that both SA frameworks are constructed in order to obtain answers that do not conflict with prior commitments adopted by the two institutes. For IAEA ‘sustainable’ equals ‘complying with best international practices and standards’. IPCC wrestles with its mission as a provider of “policy-relevant and yet policy-neutral, never policy-prescriptive” knowledge to decision-makers. IPCC avoids the assessment of different visions on the role of nuclear power in a low-carbon energy future, and skips most literature critical of nuclear power. The IAEA framework largely inspires IPCC AR5

  1. Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    Directory of Open Access Journals (Sweden)

    Parmanto Bambang

    2008-06-01

    Background: Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable the management and analysis of spatial data, but their traditional database architecture limits the analysis of numerical data. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate the querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods: SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for CHA data analysis. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurements included task completion time, success in answering the tasks, and system satisfaction. Results: Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed statistically significantly (α = .01) from SPSS-GIS for satisfaction and time. Conclusion: Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than the

  2. Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.

    Science.gov (United States)

    Bender, Ralf; Beckmann, Lars; Lange, Stefan

    2016-07-01

    The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples showing that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular with regard to the analysis of adverse events, and differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd. PMID:26928768
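The distortion from unequal follow-up can be made concrete with a small numerical sketch (all counts hypothetical): the naive proportion of patients with an event can favour the arm whose observation simply stopped earlier, whereas an exposure-adjusted incidence rate per patient-time corrects part of the bias, though, as the paper notes, full survival-time methods are what is actually required.

```python
arms = {
    # arm -> (patients with an adverse event, patients total, total patient-years)
    "drug":    (30, 200, 80.0),    # many early discontinuations -> little exposure
    "control": (40, 200, 190.0),   # nearly complete follow-up
}

def naive_proportion(arm):
    # crude contingency-table summary: events / patients, ignoring follow-up
    ev, n, _ = arms[arm]
    return ev / n

def incidence_per_100py(arm):
    # exposure-adjusted rate: events per 100 patient-years of observation
    ev, _, py = arms[arm]
    return 100.0 * ev / py
```

Here the drug arm looks safer by naive proportion (15% vs 20%) but markedly worse once exposure is accounted for, which is precisely the kind of reversal that makes censored adverse-event data uninterpretable without time-adjusted methods.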

  3. An Analysis Of Tensile Test Results to Assess the Innovation Risk for an Additive Manufacturing Technology

    Directory of Open Access Journals (Sweden)

    Adamczak Stanisław

    2015-03-01

    The aim of this study was to assess the innovation risk for an additive manufacturing process. The analysis was based on the results of static tensile tests obtained for specimens made of photocured resin. The assessment involved analyzing the measurement uncertainty by applying the FMEA method. The structure of the causes and effects of the discrepancies was illustrated using an Ishikawa diagram, and the risk priority numbers were calculated. The uncertainty of the tensile test measurement was determined for three printing orientations. The results suggest that the material used to fabricate the tensile specimens shows clear anisotropy of its properties in relation to the printing direction.
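The uncertainty determination for one printing orientation can be sketched with a standard Type A evaluation: the standard uncertainty of the mean of repeated tensile measurements is the sample standard deviation divided by the square root of the number of specimens. The strength values below are invented for illustration, not the study's measurements.

```python
import math
import statistics

# Hypothetical tensile strengths (MPa) for specimens printed in one orientation.
strengths = [41.2, 40.8, 42.0, 41.5, 40.9, 41.7]

def mean_and_uncertainty(x):
    m = statistics.mean(x)
    s = statistics.stdev(x)        # sample standard deviation
    u = s / math.sqrt(len(x))      # Type A standard uncertainty of the mean
    return m, u

m, u = mean_and_uncertainty(strengths)
```

Repeating the calculation for each of the three printing orientations and comparing the means against their uncertainties is what reveals whether the apparent anisotropy is larger than the measurement scatter.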

  4. How does scientific risk assessment of GM crops fit within the wider risk analysis?

    Science.gov (United States)

    Johnson, Katy L; Raybould, Alan F; Hudson, Malcolm D; Poppy, Guy M

    2007-01-01

    The debate concerning genetically modified crops illustrates confusion between the role of scientists and that of wider society in regulatory decision making. We identify two fundamental misunderstandings, which, if rectified, would allow progress with confidence. First, scientific risk assessment needs to test well-defined hypotheses, not simply collect data. Second, risk assessments need to be placed in the wider context of risk analysis to enable the wider 'non-scientific' questions to be considered in regulatory decision making. Such integration and understanding is urgently required because the challenges to regulation will escalate as scientific progress advances.

  5. RELAP5/MOD2 overview and developmental assessment results from TMI-1 plant transient analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lin, J.C.; Tsai, C.C.; Ransom, V.H.; Johnsen, G.W.

    1984-01-01

    RELAP5/MOD2 is a new version of the RELAP5 thermal-hydraulic computer code containing improved modeling features that provide a generic capability for pressurized water reactor transient simulation. The objective of this paper is to provide code users with an overview of the code and to report developmental assessment results obtained from a Three Mile Island Unit One plant transient analysis. The assessment shows that the injection of highly subcooled water into a high-pressure primary coolant system does not cause unphysical results or pose a problem for RELAP5/MOD2.

  6. Application of probabilistic safety assessment in CPR1000 severe accident prevention and mitigation analysis

    International Nuclear Information System (INIS)

    The relationship between probabilistic safety assessment (PSA) and severe accident studies is discussed, along with how to apply PSA to severe accident prevention and mitigation. PSA can find a plant's vulnerabilities in severe accident prevention and mitigation, so that modifications or improvements addressing these vulnerabilities can be put forward, and PSA can also assess the effectiveness of these actions for decision-making. Using a CPR1000 unit severe accident analysis, an example is set forth of the process and method of applying PSA to enhance the ability to deal with severe accident prevention and mitigation. (authors)

  7. Rasch analysis in the development of a rating scale for assessment of mobility after stroke

    DEFF Research Database (Denmark)

    Engberg, A; Garde, B; Kreiner, S

    1995-01-01

    The study describes the development of a rating scale for assessment of mobility after stroke. It was based on 74 first-stroke patients, 40 men and 34 women, each assessed three times during rehabilitation. Their median age was 69 years, and they represented all degrees of severity of paresis. Co...... in the 10-item subscores; 3) the score sum is independent of age, side of hemiparesis, and gender of the patient. Latent trait analysis (Rasch) was found to be an ideal model for statistical investigation of these properties....

  8. Structural integrity assessment by using finite element analysis based on damage mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Chang Sik; Kim, Nak Hyun; Kim, Yun Jae [Korea University, Seoul (Korea, Republic of)

    2009-07-01

    This paper introduces structural integrity assessment using finite element (FE) analysis based on damage mechanics. Several FE damage models, such as the GTN model, have been proposed to date. These damage models have their own advantages and disadvantages, so it is important to select the proper damage model for the integrity assessment of the structure of interest. In this paper, several selected damage models are applied to simulate the fracture behaviour of structures with various geometries, and the FE results are compared with the experimental results. These models are implemented in the general-purpose FE program ABAQUS via user-defined subroutines.

  9. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas.

    Science.gov (United States)

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. The Urban Flood Simulation Model (UFSM) and the Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, with the flood risk reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation. PMID:27527202
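The expected annual damage behind such an assessment can be sketched by integrating an S-shaped return period-damage (R-D) curve over annual exceedance probability. All parameters below are illustrative, not the Pudong values:

```python
import math

def damage(T, T_mid, k):
    """S-shaped R-D curve: damage fraction as a logistic function of log return period T."""
    return 1.0 / (1.0 + math.exp(-k * (math.log(T) - math.log(T_mid))))

def expected_annual_damage(dam, T_max=1000.0, n=100000):
    """Integrate damage over annual exceedance probability p = 1/T (midpoint rule)."""
    p_min, total = 1.0 / T_max, 0.0
    dp = (1.0 - p_min) / n
    for i in range(n):
        p = p_min + (i + 0.5) * dp
        total += dam(1.0 / p) * dp
    return total

# Flood control works shift the damage onset toward rarer events (hypothetical shift)
risk_without = expected_annual_damage(lambda T: damage(T, 10.0, 2.0))
risk_with = expected_annual_damage(lambda T: damage(T, 66.0, 2.0))
reduction = 100.0 * (risk_without - risk_with) / risk_without
print(f"risk reduced by {reduction:.1f}%")
```

The benefit of a control measure is then read off as the difference between the two expected annual damages.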

  10. Bioimpedance harmonic analysis as a tool to simultaneously assess circulation and nervous control

    International Nuclear Information System (INIS)

    Multicycle harmonic (Fourier) analysis of bioimpedance was employed to simultaneously assess circulation and neural activity in visceral (rat urinary bladder) and somatic (human finger) organs. The informative value of the first cardiac harmonic of the bladder impedance as an index of bladder circulation is demonstrated. The individual reactions of normal and obstructive bladders in response to infusion cystometry were recorded. The potency of multicycle harmonic analysis of bioimpedance to assess sympathetic and parasympathetic neural control in urinary bladder is discussed. In the human finger, bioimpedance harmonic analysis revealed three periodic components at the rate of the heart beat, respiration and Mayer wave (0.1 Hz), which were observed under normal conditions and during blood flow arrest in the hand. The revealed spectrum peaks were explained by the changes in systemic blood pressure and in regional vascular tone resulting from neural vasomotor control. During normal respiration and circulation, two side cardiac peaks were revealed in a bioimpedance amplitude spectrum, whose amplitude reflected the depth of amplitude respiratory modulation of the cardiac output. During normal breathing, the peaks corresponding to the second and third cardiac harmonics were split, reflecting frequency respiratory modulation of the heart rate. Multicycle harmonic analysis of bioimpedance is a novel potent tool to examine the interaction between the respiratory and cardiovascular system and to simultaneously assess regional circulation and neural influences in visceral and somatic organs
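The side cardiac peaks described above arise because respiration amplitude-modulates the cardiac impedance oscillation, so the spectrum shows the carrier flanked by peaks at the cardiac rate plus or minus the respiratory rate. A small synthetic sketch (rates and modulation depth are illustrative):

```python
import numpy as np

fs = 100.0                         # sampling rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)
f_card, f_resp = 1.2, 0.25         # cardiac and respiratory rates, Hz (illustrative)

# Cardiac impedance component, amplitude-modulated by respiration
z = (1.0 + 0.3 * np.cos(2 * np.pi * f_resp * t)) * np.cos(2 * np.pi * f_card * t)

amp = np.abs(np.fft.rfft(z)) / len(t)      # one-sided amplitude spectrum (A/2 per tone)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)

def peak(f):
    """Spectral amplitude at the bin closest to frequency f."""
    return amp[np.argmin(np.abs(freqs - f))]

# Carrier at f_card; side peaks at f_card +/- f_resp encode the modulation depth
print(peak(f_card), peak(f_card - f_resp), peak(f_card + f_resp))
```

The ratio of a side peak to the carrier recovers half the modulation depth (0.15 here), which is how the spectrum quantifies respiratory modulation of cardiac output.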

  12. A multicriteria decision analysis model and risk assessment framework for carbon capture and storage.

    Science.gov (United States)

    Humphries Choptiany, John Michael; Pelot, Ronald

    2014-09-01

    Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions.
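The core of such a model, weighted utilities with Monte Carlo propagation of criteria uncertainty, can be sketched as follows. The sites, weights and distributions are invented for illustration, not taken from the article:

```python
import random

weights = {"cost": 0.3, "leakage_risk": 0.5, "social": 0.2}   # criterion weights, sum to 1
sites = {  # criterion utility ~ Normal(mean, sd), clipped to [0, 1]
    "A": {"cost": (0.60, 0.10), "leakage_risk": (0.70, 0.15), "social": (0.50, 0.05)},
    "B": {"cost": (0.80, 0.10), "leakage_risk": (0.50, 0.20), "social": (0.60, 0.05)},
}

def simulate(site, n=20000, seed=1):
    """Monte Carlo distribution of the weighted total utility for one option."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        total = 0.0
        for crit, w in weights.items():
            mu, sd = site[crit]
            total += w * min(1.0, max(0.0, rng.gauss(mu, sd)))
        out.append(total)
    return out

for name in sites:
    s = simulate(sites[name])
    print(name, round(sum(s) / len(s), 3))
```

Comparing the resulting score distributions (not just their means) is what lets a decisionmaker see the tradeoff between expected performance and uncertainty; sensitivity analysis then varies the weights.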

  13. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis

    Directory of Open Access Journals (Sweden)

    Ágatha Nogueira Previdelli

    2016-09-01

    Full Text Available The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents’ dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components were estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains were not presented in any pattern. High intakes of sodium, fats and sugars were observed in hypothesis-driven analysis with low total scores for Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits that will be important in order to drive public health programs, and improve their efficiency to monitor and evaluate the dietary patterns of populations.

  14. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis.

    Science.gov (United States)

    Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria

    2016-01-01

    The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components were estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains were not presented in any pattern. High intakes of sodium, fats and sugars were observed in hypothesis-driven analysis with low total scores for Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits that will be important in order to drive public health programs, and improve their efficiency to monitor and evaluate the dietary patterns of populations. PMID:27669289

  16. 77 FR 5857 - Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...

    Science.gov (United States)

    2012-02-06

    ... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...: On November 2, 2011 (76 FR 67764), the U.S. Nuclear Regulatory Commission (NRC) published for public comment Draft NUREG, ``Common- Cause Failure Analysis in Event and Condition Assessment: Guidance...

  17. Application of Item Analysis to Assess Multiple-Choice Examinations in the Mississippi Master Cattle Producer Program

    Science.gov (United States)

    Parish, Jane A.; Karisch, Brandi B.

    2013-01-01

    Item analysis can serve as a useful tool in improving multiple-choice questions used in Extension programming. It can identify gaps between instruction and assessment. An item analysis of Mississippi Master Cattle Producer program multiple-choice examination responses was performed to determine the difficulty of individual examinations, assess the…

  18. Risk Assessment of Repetitive Movements in Olive Growing: Analysis of Annual Exposure Level Assessment Models with the OCRA Checklist.

    Science.gov (United States)

    Proto, A R; Zimbalatti, G

    2015-10-01

    In Italy, one of the main agricultural crops is represented by the cultivation of olive trees. Olive cultivation characterizes the Italian agricultural landscape and national agricultural economics. Italy is the world's second largest producer of olive oil. Because olive cultivation requires the largest labor force in southern Italy, the aim of this research was to assess the risk of biomechanical overload of the workers' upper limbs. The objective, therefore, was to determine the level of risk that workers are exposed to in each phase of the production process. In Calabria, the second most important region in Italy for both the production of olive oil and cultivated area, there are 113,907 olive farms (83% of all farms) and 250,000 workers. To evaluate the risk of repetitive movements, all of the work tasks performed by workers on 100 farms in Calabria were analyzed. A total of 430 workers were interviewed over the four-year research period. To evaluate the level of exposure to repetitive movements, the OCRA (occupational repetitive actions) checklist was adopted. This checklist was the primary analytical tool during the preliminary risk assessment and in a given working situation. The analysis suggested by the OCRA checklist starts with pre-assigned scores (increasing in value with intensification of risk) for each of four main risk factors and additional factors. Between 2010 and 2013, surveys were conducted using the OCRA checklist with the aim of verifying musculoskeletal risks. The results obtained from the study of 430 workers allowed us to identify the level of exposure to risk. This analysis was conducted in the workplace to examine in detail the repetitive movements performed by the workers. The research was divided into two phases: first to provide preliminary information on the different tasks performed in olive growing, and second to assign a percentage to each task of the total hours worked in a year. Based on the results, this method could well
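The checklist logic described here, pre-assigned scores for the main risk factors summed and scaled by an exposure-duration multiplier, can be sketched as below. The factor values and risk bands are illustrative simplifications, not the official OCRA checklist tables:

```python
def ocra_checklist_score(recovery, frequency, force, posture, additional, duration_mult):
    """Sum the pre-assigned risk-factor scores and scale by the duration multiplier."""
    return (recovery + frequency + force + posture + additional) * duration_mult

def risk_band(score):
    """Illustrative banding: the higher the checklist score, the higher the risk."""
    if score <= 7.5:
        return "acceptable"
    if score <= 11.0:
        return "borderline"
    return "risk present"

# Hypothetical olive-harvesting task as scored by an analyst
score = ocra_checklist_score(recovery=4, frequency=3, force=2, posture=4,
                             additional=1, duration_mult=0.925)
print(round(score, 2), risk_band(score))   # → 12.95 risk present
```

Annual exposure then enters through the duration multiplier, which is how a task's share of total hours worked per year changes its final score.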

  19. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    D thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes......, and by comparing the model results with an epidemiological study of the same event, the concept of using hydrological models to estimate water quality – and thereby estimate risk – was improved. Another urban flooding risk assessment used average measured concentrations of pathogens in wastewater as inputs......, but also when wading through a flooded area. The results in this thesis have brought microbial risk assessments one step closer to more uniform and repeatable risk analysis by using actual and relevant measured data and hydrodynamic water quality models to estimate the risk from flooding caused...

  20. Application of data analysis techniques to nuclear reactor systems code accuracy assessment

    International Nuclear Information System (INIS)

    An automated code assessment program (ACAP) has been developed by the authors to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. This software was developed under subcontract to the United States Nuclear Regulatory Commission for use in its NRS code consolidation efforts. In this paper, background on the topic of NRS accuracy and uncertainty assessment is provided which motivates the development of and defines basic software requirements for ACAP. A survey of data analysis techniques was performed, focusing on the applicability of methods in the construction of NRS code-data comparison measures. The results of this review process, which further defined the scope, user interface and process for using ACAP are also summarized. A description of the software package and several sample applications to NRS data sets are provided. Its functionality and ability to provide objective accuracy assessment figures are demonstrated. (author)
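Code-data comparison measures of the kind ACAP aggregates can be as simple as residual statistics between the code prediction and the measurement. A generic sketch (these are common accuracy measures, not ACAP's actual figure of merit; the data are invented):

```python
import math

def comparison_measures(code, data):
    """Mean error, RMSE and range-normalized RMSE of code-data residuals."""
    n = len(data)
    resid = [c - d for c, d in zip(code, data)]
    mean_err = sum(resid) / n
    rmse = math.sqrt(sum(r * r for r in resid) / n)
    nrmse = rmse / (max(data) - min(data))   # normalize by the data range
    return mean_err, rmse, nrmse

# Illustrative transient: code-predicted vs measured pressures (MPa)
code = [1.02, 1.10, 0.95, 0.80, 0.60]
data = [1.00, 1.05, 1.00, 0.75, 0.55]
me, rmse, nrmse = comparison_measures(code, data)
print(round(me, 3), round(rmse, 3), round(nrmse, 3))
```

An automated tool's value is in applying a fixed battery of such measures uniformly across many code-data pairs, so accuracy figures are objective and comparable.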

  1. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    DEFF Research Database (Denmark)

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat;

    2013-01-01

    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes: process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady state process simulation is performed to generate a base case design...... of the bioethanol conversion process using cassava rhizome as a feedstock. The sustainability analysis is performed to analyze the relevant indicators in sustainability metrics and to define design/retrofit targets for process improvements. Economic analysis is performed to evaluate the profitability of the process........ Also, simultaneously with sustainability analysis, the life cycle impact on the environment associated with bioethanol production is assessed. Finally, candidate alternative designs are generated and compared with the base case design in terms of LCA, economics, waste, energy usage and environmental impact...

  2. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland;

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations; Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic...... pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were...... indicated relation primarily to German, Russian and American mixtures in Hammerfest; and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics....
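The PCA step behind such a source assessment is standardization of the concentration matrix followed by an eigen-decomposition; samples that load on the same components are candidates for a common source. A minimal numpy sketch on invented sediment data (the Cu/Pb/Zn/TBT values are not the study's measurements):

```python
import numpy as np

# Rows: sediment samples; columns: Cu, Pb, Zn, TBT concentrations (illustrative)
X = np.array([
    [30.0, 40.0, 100.0,  2.0],
    [35.0, 45.0, 110.0,  2.5],
    [80.0, 90.0, 300.0, 50.0],
    [85.0, 95.0, 310.0, 55.0],
    [32.0, 42.0, 105.0, 40.0],
])

Z = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each variable
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()               # variance explained per component
scores = Z @ Vt.T                             # sample coordinates (for score plots)
print(explained.round(2))
```

Here the heavy metals co-vary (one dominant component) while TBT behaves partly independently, the kind of pattern that points to a separate point source.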

  3. Use of simulated patients and reflective video analysis to assess occupational therapy students' preparedness for fieldwork.

    Science.gov (United States)

    Giles, Amanda K; Carson, Nancy E; Breland, Hazel L; Coker-Bolt, Patty; Bowman, Peter J

    2014-01-01

    Educators must determine whether occupational therapy students are adequately prepared for Level II fieldwork once they have successfully completed the didactic portion of their coursework. Although studies have shown that students regard the use of video cameras and simulated patient encounters as useful tools for assessing professional and clinical behaviors, little has been published in the occupational therapy literature regarding the practical application of simulated patients or reflective video analysis. We describe a model for a final Comprehensive Practical Exam that uses both simulated patients and reflective video analysis to assess student preparedness for Level II fieldwork, and we report on student perceptions of these instructional modalities. We provide recommendations for designing, implementing, and evaluating simulated patient experiences in light of existing educational theory. PMID:25397940

  4. Structured Assessment Approach: a microcomputer-based insider-vulnerability analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess the vulnerability of safeguards systems to insiders in a staged manner. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions, and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. For example, the SAA input, which is a text-like data file, is easily readable and can provide documentation of facility safeguards and assumptions used for the analysis.

  5. Reliability assessment of generation and transmission systems using fault-tree analysis

    International Nuclear Information System (INIS)

    This paper presents a method that integrates deterministic approach with fault-tree analysis for reliability assessment of a composite system (generation and transmission in power systems). The contingency screening is conducted in the first step. The results are further classified into three clusters in the second step: normal, local trouble and system trouble. The fault-tree analysis is used to assess the reliability of the composite system in the third step. Finally, Risk Reduction Worth is adopted as a measure of importance for identifying the crucial element that has significant impact on the reliability. In this paper, a composite system in Taiwan serves as an example for illustrating the simulation results attained by the proposed method. The simulation results, verified by Siemens PTI PSS/E TPLAN software package, show that the proposed method is applicable for large scale power systems.
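The Risk Reduction Worth measure can be sketched from minimal cut sets under the rare-event approximation: RRW for a component is the factor by which the top-event probability falls if that component is made perfectly reliable. The tree and probabilities below are invented for illustration, not the Taiwan system's data:

```python
def top_probability(cut_sets, q):
    """Rare-event approximation: top-event probability as the sum of
    minimal-cut-set probabilities (products of basic-event probabilities)."""
    total = 0.0
    for cs in cut_sets:
        p = 1.0
        for event in cs:
            p *= q[event]
        total += p
    return total

def risk_reduction_worth(cut_sets, q, event):
    """RRW: Q_top / Q_top(event never fails)."""
    q_perfect = dict(q)
    q_perfect[event] = 0.0
    return top_probability(cut_sets, q) / top_probability(cut_sets, q_perfect)

# Hypothetical composite system: two generators (G1, G2) and two lines (L1, L2)
cut_sets = [("G1", "L1"), ("G2", "L2")]
q = {"G1": 1e-2, "G2": 2e-2, "L1": 5e-2, "L2": 5e-2}
for e in sorted(q):
    print(e, round(risk_reduction_worth(cut_sets, q, e), 2))
```

Ranking components by RRW then flags the element whose improvement would cut system risk the most, which is exactly the importance screening the paper applies.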

  6. Assessment of synthetic winds through spectral modelling, rainflow count analysis and statistics of increments

    Science.gov (United States)

    Beyer, Hans Georg; Chougule, Abhijit

    2016-04-01

    While the wind energy industry is growing rapidly and the siting of wind turbines onshore as well as offshore is increasing, many wind engineering model tools have been developed for assessing the loads on wind turbines due to varying wind speeds. For proper wind turbine design and performance analysis, it is important to have an accurate representation of the incoming wind field. To ease the analysis, tools for the generation of synthetic wind fields have been developed, e.g. the widely used TurbSim procedure. We analyse the respective synthetic data sets, on the one hand, in view of the similarity of the spectral characteristics of measured and synthetic sets. In addition, second-order characteristics with direct relevance to load assessment, as given by the statistics of increments and rainflow count results, are inspected.
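The statistics of increments mentioned above compare the distribution of wind-speed changes over a given lag; intermittent measured wind typically shows heavy-tailed (positive excess kurtosis) increments at short lags, which a Gaussian synthetic field cannot reproduce. A sketch on a stand-in Gaussian series (a real assessment would run this on both the measured and the synthetic series):

```python
import numpy as np

def increment_stats(u, lags):
    """Std and excess kurtosis of increments u(t+lag) - u(t) for each lag."""
    out = {}
    for lag in lags:
        d = u[lag:] - u[:-lag]
        m, s = d.mean(), d.std()
        out[lag] = (s, ((d - m) ** 4).mean() / s**4 - 3.0)
    return out

rng = np.random.default_rng(0)
u = rng.normal(8.0, 1.0, 100_000)       # stand-in series; no real wind data here
for lag, (std, exkurt) in increment_stats(u, [1, 10, 100]).items():
    print(lag, round(std, 3), round(exkurt, 3))
```

For this uncorrelated Gaussian stand-in the excess kurtosis stays near zero at every lag; a departure from zero in synthetic-versus-measured comparisons flags missing intermittency relevant to fatigue loads.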

  7. Approach to proliferation risk assessment based on multiple objective analysis framework

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, A.; Kuptsov, I. [Obninsk Institute for Nuclear Power Engineering of NNRU MEPhI (Russian Federation); Studgorodok 1, Obninsk, Kaluga region, 249030 (Russian Federation)

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the taking into account of the specifics features of the national nuclear infrastructure, and possible proliferation strategies (motivations, intentions, and capabilities). 3 examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (high enriched uranium)production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  8. Fluctuation analysis-based risk assessment for respiratory virus activity and air pollution associated asthma incidence.

    Science.gov (United States)

    Liao, Chung-Min; Hsieh, Nan-Hung; Chio, Chia-Pin

    2011-08-15

    Asthma is a growing epidemic worldwide. Exacerbations of asthma have been associated with bacterial and viral respiratory tract infections and air pollution. We correlated the asthma admission rates with fluctuations in respiratory virus activity and traffic-related air pollution, namely particulate matter with an aerodynamic diameter ≤ 10 μm (PM₁₀), nitrogen dioxide (NO₂), carbon monoxide (CO), sulfur dioxide (SO₂), and ozone (O₃). A probabilistic risk assessment framework was developed based on a detrended fluctuation analysis to predict future respiratory virus and air pollutant associated asthma incidence. Results indicated a strong association between asthma admission rate and influenza (r=0.80, pinfluenza to below 0.9. We concluded that fluctuation analysis based risk assessment provides a novel predictor of asthma incidence. PMID:21663946
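A detrended fluctuation analysis of the kind used here can be sketched in a few lines: integrate the series, remove a linear trend window by window, and read the scaling exponent off the log-log slope of the RMS fluctuation. White noise should give an exponent near 0.5, while long-range-correlated series (such as persistent admission-rate fluctuations) give larger values:

```python
import numpy as np

def dfa(x, scales):
    """DFA: RMS fluctuation F(n) of the integrated, window-wise detrended profile."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        rms = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear fit
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.normal(size=4096)                      # uncorrelated noise for the demo
scales = np.array([8, 16, 32, 64, 128])
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(round(alpha, 2))                         # expect alpha close to 0.5
```

Running the same estimator on admission, virus-activity and pollutant series is what allows their fluctuation structures to be compared on a common scale.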

  9. Analysis of uncertainties in alpha-particle optical-potential assessment below the Coulomb barrier

    CERN Document Server

    Avrigeanu, V

    2016-01-01

    Background: Recent high-precision measurements of alpha-induced reaction data below the Coulomb barrier have pointed out questions of the alpha-particle optical-model potential (OMP) which are yet open within various mass ranges. Purpose: The applicability of a previous optical potential and eventual uncertainties and/or systematic errors of the OMP assessment at low energies can be further considered on this basis. Method: Nuclear model parameters based on the analysis of recent independent data, particularly gamma-ray strength functions, have been involved within statistical model calculation of the (alpha,x) reaction cross sections. Results: The above-mentioned potential provides a consistent description of the recent alpha-induced reaction data with no empirical rescaling factors of the and/or nucleon widths. Conclusions: A suitable assessment of alpha-particle optical potential below the Coulomb barrier should involve the statistical-model parameters beyond this potential on the basis of a former analysi...

  10. Analysis of uncertainties in α -particle optical-potential assessment below the Coulomb barrier

    Science.gov (United States)

    Avrigeanu, V.; Avrigeanu, M.

    2016-08-01

    Background: Recent high-precision measurements of α -induced reaction data below the Coulomb barrier have pointed out questions about the α -particle optical-model potential (OMP) which are still unanswered within various mass ranges. Purpose: The applicability of previous optical potential and eventual uncertainties and/or systematic errors of the OMP assessment at low energies can be further considered on this basis. Method: Nuclear model parameters based on the analysis of recent independent data, particularly γ -ray strength functions, have been involved within statistical model calculation of the (α ,x ) reaction cross sections. Results: The above-mentioned potential provides a consistent description of the recent α -induced reaction data with no empirical rescaling factors of the γ and/or nucleon widths. Conclusions: A suitable assessment of α -particle optical potential below the Coulomb barrier should involve the statistical-model parameters beyond this potential on the basis of a former analysis of independent data.

  11. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation, and passive RHRS through in-reactor HXs. These features defined the main trends in experimental investigations and verification efforts for the computer codes applied. The paper briefly reviews the performed experimental investigation of the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. The assessment of RELAP5/mod3 applicability for accident analysis in integral reactors is presented.

  12. ASSESSMENT OF PLASTIC FLOWS AND STOCKS IN SERBIA USING MATERIAL FLOW ANALYSIS

    Directory of Open Access Journals (Sweden)

    Goran Vujić

    2010-01-01

    Full Text Available Material flow analysis (MFA) was used to assess the flows and stocks of plastic materials that are annually produced, consumed, imported, exported, collected, recycled, and disposed of in landfills in Serbia. The analysis revealed that approximately 269,000 tons of plastic materials are disposed of directly in uncontrolled landfills in Serbia without any pretreatment, and that significant amounts of these materials have already accumulated in the landfills. The substantial amounts of landfilled plastics represent not only a loss of valuable resources but also a serious threat to the environment and human health; if the trend of direct plastic landfilling continues, Serbia will face grave consequences.
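
    At its core, an MFA of this kind is a mass balance over a system node: annual inputs must equal outputs plus the change in stock. The sketch below illustrates that balance with hypothetical placeholder figures, not the Serbian data from the study.

    ```python
    # Minimal material flow analysis (MFA) sketch: a one-node mass balance
    # for plastics. All figures are hypothetical placeholders (kt/yr),
    # not the values reported for Serbia.

    def mass_balance(production, imports, exports, recycled, landfilled):
        """Annual stock change = inputs - outputs (kilotonnes per year)."""
        inputs = production + imports
        outputs = exports + recycled + landfilled
        return inputs - outputs  # accumulates in the use phase / landfills

    # Hypothetical flows in kilotonnes per year.
    delta_stock = mass_balance(production=300, imports=150, exports=50,
                               recycled=40, landfilled=269)
    print(delta_stock)  # → 91  (net annual accumulation)
    ```

    A positive balance indicates material accumulating as stock, which is how landfill build-up is tracked over successive years.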

  13. ANALYSIS OF ENVIRONMENTAL FRAGILITY USING MULTI-CRITERIA ANALYSIS (MCE FOR INTEGRATED LANDSCAPE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Abimael Cereda Junior

    2014-01-01

    Full Text Available Geographic Information Systems have brought greater possibilities to the representation and interpretation of the landscape, as well as to its integrated analysis. However, this approach does not dispense with technical and methodological substantiation in the computational universe. This work is grounded in ecodynamics and in the empirical analysis of natural and anthropogenic environmental fragility, and aims to propose and present an integrated paradigm of multi-criteria analysis and a fuzzy-logic model of environmental fragility, taking as a case study the basin of Monjolinho Stream in São Carlos-SP. The use of this methodology allowed a reduction of subjective influences on the decision criteria, whose factors could be given cartographic expression while respecting the complexity of the integrated landscape.
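
    The fuzzy multi-criteria overlay described above can be sketched as follows: each criterion is rescaled to a [0, 1] fuzzy membership and the memberships are combined with analyst-chosen weights into a fragility score per map cell. The criteria, break points, and weights below are illustrative assumptions, not the values used for the Monjolinho basin.

    ```python
    # Hedged sketch of a fuzzy weighted overlay for environmental fragility.
    # Criteria, break points and weights are assumed for illustration only.

    def linear_membership(x, lo, hi):
        """Fuzzy 'fragile' membership: 0 below lo, 1 above hi, linear between."""
        if x <= lo:
            return 0.0
        if x >= hi:
            return 1.0
        return (x - lo) / (hi - lo)

    def fragility(slope_pct, erodibility, weights=(0.6, 0.4)):
        m_slope = linear_membership(slope_pct, 2, 30)      # assumed break points
        m_soil = linear_membership(erodibility, 0.1, 0.5)  # assumed break points
        return weights[0] * m_slope + weights[1] * m_soil

    # A steep cell on erodible soil scores higher than a flat, stable one.
    print(fragility(25, 0.45) > fragility(3, 0.15))  # → True
    ```

    Unlike a crisp reclassification, the fuzzy memberships avoid hard thresholds, which is one way the method reduces the subjectivism of the decision criteria.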

  14. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The properness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. The assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance, using cross-validation with a v-fold of 10, were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test set. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, sustain a proper assignment of compounds to the training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
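
    The core check described above can be sketched in a few lines: cluster a single "observed activity" variable with K-means and verify that every cluster contains compounds from both the training and the test set. The data below are synthetic stand-ins, not the carboquinone measurements.

    ```python
    # Hedged sketch: 1-D K-means followed by a train/test mixing check.
    # Synthetic data; a deterministic initialisation keeps the run reproducible.

    def kmeans_1d(values, iters=50):
        # Deterministic initialisation: minimum, mean and maximum of the data.
        centers = [min(values), sum(values) / len(values), max(values)]
        assign = []
        for _ in range(iters):
            assign = [min(range(len(centers)), key=lambda j: (v - centers[j]) ** 2)
                      for v in values]
            for j in range(len(centers)):
                members = [v for v, a in zip(values, assign) if a == j]
                if members:
                    centers[j] = sum(members) / len(members)
        return assign

    # Synthetic (activity, subset) pairs grouped around three activity modes.
    data = [(0.10, "train"), (0.20, "test"), (0.15, "train"),
            (1.00, "train"), (1.10, "test"), (0.95, "test"),
            (2.00, "train"), (2.10, "test"), (1.90, "train")]
    labels = kmeans_1d([v for v, _ in data])
    for j in set(labels):
        subsets = {s for (v, s), a in zip(data, labels) if a == j}
        assert subsets == {"train", "test"}  # a proper random split mixes both
    print("every cluster contains train and test compounds")
    ```

    If any cluster contained only training (or only test) compounds, the split would be covering different regions of activity space, which is exactly the failure the clustering check is designed to catch.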

  15. Feasibility analysis of widely accepted indicators as key ones in river health assessment

    Institute of Scientific and Technical Information of China (English)

    FENG Yan; KANG Bin; YANG Liping

    2012-01-01

    Index systems for river health assessment are difficult to use in practice because of the complex, specialised indicators they adopt. In this paper, key indicators that can be applied to river health assessment in general were selected, based on an analysis of 45 assessment index systems with 902 variables drawn from around 150 papers and documents published in 1972-2010. According to the fields covered by the variables, they were divided into four groups: habitat condition, water environment, biotic status and water utilization. After the variables were combined into indicators, the number of adoptions and the degree of acceptance of each indicator in the above systems were calculated, and widely accepted indicators reflecting different aspects of river condition were selected as candidate key indicators. Following correlation analysis among the candidate key indicators, 8 indicators were finally suggested as the key indicators for assessing river health: coverage rate of riparian vegetation, reserved rate of wetland, river continuity, the changing rate of water flow, the ratio of reaching the water quality standard, fish index of biotic integrity, the ratio of water utilization, and land use.
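
    The final correlation screen can be sketched as a simple filter: compute pairwise Pearson correlations among candidate indicators and drop one of each highly correlated pair, keeping indicators that carry independent information. Indicator names echo the paper, but the values and the 0.9 cutoff are illustrative assumptions.

    ```python
    # Hedged sketch of a correlation-based indicator screen (synthetic data).
    import math

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    candidates = {
        "riparian_vegetation_cover": [40, 55, 60, 72, 80],
        "wetland_reserved_rate":     [20, 27, 30, 36, 40],  # tracks the first
        "water_quality_attainment":  [65, 50, 70, 55, 60],
    }

    kept = []
    for name, series in candidates.items():
        # Keep an indicator only if it is not redundant with one already kept.
        if all(abs(pearson(series, candidates[k])) < 0.9 for k in kept):
            kept.append(name)
    print(kept)  # → ['riparian_vegetation_cover', 'water_quality_attainment']
    ```

    Here the wetland series nearly duplicates the vegetation series, so it is dropped; in the study the analogous screen reduced the candidate list to the 8 key indicators.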

  16. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    of islanded operation. In this paper, the fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA the fault related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5...... element model where dynamic eccentricity and bearing outer race defect are simulated under varying fault severity and electric loading conditions....

  17. Assessing the Performance of a Classification-Based Vulnerability Analysis Model

    OpenAIRE

    Wang, Tai-Ran; Mousseau, Vincent; Pedroni, Nicola; Zio, Enrico

    2015-01-01

    In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment ...

  18. Risk assessment and analysis of the M109 family of vehicles Fleet Management Pilot Program

    OpenAIRE

    Hitz, Stephen E

    1997-01-01

    Approved for public release; distribution is unlimited. The purpose of this thesis is to conduct a risk assessment and analysis for the M109 155mm Self Propelled Howitzer (SPH) Fleet Management Pilot Program. The objective of this program is to reengineer the fleet's logistical support system by outsourcing those functions which make sense and that can be performed more efficiently by private industry. This innovative approach places one contractor, or Fleet Manager, in charge of sustainin...

  19. Conference Analysis Report of Assessments on Defect and Damage for a High Temperature Structure

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyeong Yeon

    2008-11-15

    This report presents an analysis of state-of-the-art research trends in creep-fatigue damage, defect assessment of high-temperature structures, and the development of heat-resistant materials and their behavior at high temperature, based on the papers presented at two international conferences: ASME PVP 2008, held in Chicago in July 2008, and CF-5 (5th International Conference on Creep, Fatigue and Creep-Fatigue), held in Kalpakkam, India in September 2008.

  20. Authorship Bias in Violence Risk Assessment? A Systematic Review and Meta-Analysis

    OpenAIRE

    Jay P Singh; Martin Grann; Seena Fazel

    2013-01-01

    Various financial and non-financial conflicts of interests have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessmen...

  1. Analysis of tidal expiratory flow pattern in the assessment of histamine-induced bronchoconstriction.

    OpenAIRE

    Morris, M. J.; Madgwick, R. G.; Lane, D. J.

    1995-01-01

    BACKGROUND--There are times in clinical practice when it would be useful to be able to assess the severity of airways obstruction from tidal breathing. Three indices of airways obstruction derived from analysis of resting tidal expiratory flow have previously been described: (1) Tme/TE = time to reach maximum expiratory flow/expiratory time; (2) Krs = decay constant of exponential fitted to tidal expiratory flow versus time curve; and (3) EV = extrapolated volume--that is, area under the curv...

  2. Analysis on evaluation ability of nonlinear safety assessment model of coal mines based on artificial neural network

    Institute of Scientific and Technical Information of China (English)

    SHI Shi-liang; LIU Hai-bo; LIU Ai-hua

    2004-01-01

    Based on an integrated analysis of the merits and shortcomings of the various methods used in the safety assessment of coal mines, and taking into account the nonlinear features of mine safety sub-systems, this paper establishes a neural network assessment model of mine safety, analyzes the ability of an artificial neural network to evaluate the mine safety state, and lays the theoretical foundation for using artificial neural networks in the systematic optimization of mine safety assessment and for obtaining reasonably accurate safety assessment results.
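
    The kind of nonlinear assessment model described above can be sketched as a tiny feed-forward network mapping normalised safety sub-system scores to a [0, 1] safety index. The weights below are illustrative constants chosen for the sketch; a real model would be trained on historical mine data.

    ```python
    # Hedged sketch: fixed-weight 3-input, 2-hidden-unit neural network.
    # Inputs are assumed normalised scores for ventilation, gas control
    # and roof support (all names illustrative).
    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def safety_index(x, W_hidden, w_out, b_hidden, b_out):
        hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(W_hidden, b_hidden)]
        return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

    # Illustrative weights (not trained values).
    W_hidden = [[0.8, 0.5, 0.7], [-0.3, 0.9, 0.4]]
    b_hidden = [0.1, -0.2]
    w_out, b_out = [1.2, 0.9], -0.5

    good_state = safety_index([0.9, 0.8, 0.9], W_hidden, w_out, b_hidden, b_out)
    poor_state = safety_index([0.2, 0.1, 0.3], W_hidden, w_out, b_hidden, b_out)
    print(good_state > poor_state)  # → True: higher index for the safer state
    ```

    The tanh hidden layer is what gives the model the nonlinear response the abstract emphasises; a linear scoring scheme could not capture interactions between sub-systems.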

  3. Seeking Missing Pieces in Science Concept Assessments: Reevaluating the Brief Electricity and Magnetism Assessment through Rasch Analysis

    Science.gov (United States)

    Ding, Lin

    2014-01-01

    Discipline-based science concept assessments are powerful tools to measure learners' disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses.…

  4. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    We consider here two basic groups of methods for the analysis and assessment of the human factor in the NPP area, and also give some results from performed analyses. The human factor is the human interaction with the designed equipment and the working environment, taking into account human capabilities and limits. Within the frame of qualitative methods for human factor analysis, concepts and structural methods for classifying information connected with the human factor are considered. Emphasis is given to the HPES method for human factor analysis in NPPs. Methods for quantitative assessment of human reliability are then considered. These methods allow probabilities to be assigned to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  5. Sex Assessment Using the Femur and Tibia in Medieval Skeletal Remains from Ireland: Discriminant Function Analysis.

    Science.gov (United States)

    Novak, Mario

    2016-04-01

    Sex determination based on discriminant function analysis of skeletal measurements is probably the most effective method for the assessment of sex in archaeological and contemporary populations for various reasons, but it also suffers from limitations such as population specificity. In this paper, standards for sex assessment from the femur and tibia in the medieval Irish population are presented. Six femoral and six tibial measurements obtained from 56 male and 45 female skeletons were subjected to discriminant function analysis. Average accuracies obtained by this study range between 87.1% and 97%. The highest level of accuracy (97%) was achieved when using combined variables of the femur and tibia (maximum diameter of the femoral head and circumference at the tibial nutrient foramen), as well as two variables of the tibia (proximal epiphyseal breadth and circumference at the nutrient foramen). Discriminant functions using a single variable provided accuracies between 87.1% and 96%, with the circumference at the level of the tibial nutrient foramen providing the best separation. The high accuracy rates obtained by this research correspond to the data recorded in other studies, confirming the value of discriminant function analysis for the assessment of sex in both archaeological and forensic contexts. PMID:27301232
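
    For a single variable, discriminant sexing reduces to a sectioning point: the midpoint of the male and female group means, with a bone classified by which side of it a measurement falls. The sketch below uses synthetic femoral head diameters, not the medieval Irish measurements.

    ```python
    # Hedged sketch of univariate discriminant sexing via a sectioning point.
    # Measurements (mm) are synthetic placeholders.

    def sectioning_point(males, females):
        return (sum(males) / len(males) + sum(females) / len(females)) / 2

    # Hypothetical maximum femoral head diameters (mm).
    males   = [47.1, 48.3, 49.0, 46.5, 48.8]
    females = [41.2, 42.0, 40.8, 42.5, 41.7]

    cut = sectioning_point(males, females)

    def classify(d):
        return "male" if d > cut else "female"

    correct = (sum(classify(d) == "male" for d in males)
               + sum(classify(d) == "female" for d in females))
    accuracy = correct / (len(males) + len(females))
    print(round(cut, 2), accuracy)  # full separation on this toy sample
    ```

    Cross-validated accuracy on real series is lower than on the derivation sample, which is why population-specific standards such as those in this paper matter.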

  6. Life Cycle Assessment and Life Cycle Cost Analysis of Magnesia Spinel Brick Production

    Directory of Open Access Journals (Sweden)

    Aysun Özkan

    2016-07-01

    Full Text Available Sustainable use of natural resources in the production of construction materials has become a necessity both in Europe and Turkey. Construction products in Europe should have European Conformity (CE) and an Environmental Product Declaration (EPD), an independently verified and registered document in line with the European standard EN 15804. An EPD certificate can be created by performing a Life Cycle Assessment (LCA) study. In this particular work, an LCA study was carried out for a refractory brick production for environmental assessment. In addition to the LCA, the Life Cycle Cost (LCC) analysis was also applied for economic assessment. Firstly, a cradle-to-gate LCA was performed for one ton of magnesia spinel refractory brick. The CML IA method included in the licensed SimaPro 8.0.1 software was chosen to calculate impact categories (namely, abiotic depletion, global warming potential, acidification potential, eutrophication potential, human toxicity, ecotoxicity, ozone depletion potential, and photochemical oxidation potential). The LCC analysis was performed by developing a cost model for internal and external cost categories within the software. The results were supported by a sensitivity analysis. According to the results, the production of raw materials and the firing process in the magnesia spinel brick production were found to have several negative effects on the environment and were costly.
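
    The characterization step at the heart of any LCA impact calculation multiplies each inventory emission by a characterization factor and sums the results into an impact score. The sketch below does this for global warming potential; the inventory figures are illustrative, not the magnesia spinel brick data, while the factors are the widely used IPCC AR5 GWP100 values.

    ```python
    # Hedged sketch of LCA characterization for global warming potential.
    # Inventory figures are hypothetical; factors are IPCC AR5 GWP100 values.

    GWP100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}  # kg CO2-eq per kg

    def gwp(inventory):
        """Sum of emission mass times characterization factor (kg CO2-eq)."""
        return sum(mass * GWP100[gas] for gas, mass in inventory.items())

    # Hypothetical emissions per tonne of brick (kg).
    inventory = {"CO2": 950.0, "CH4": 1.2, "N2O": 0.05}
    print(round(gwp(inventory), 2))  # → 996.85 kg CO2-eq
    ```

    Each impact category in the CML method (acidification, eutrophication, and so on) follows the same pattern with its own set of factors, which is why a firing process dominated by fuel combustion shows up strongly in the GWP score.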

  7. Scenario analysis for the postclosure assessment of the Canadian concept for nuclear fuel waste disposal

    International Nuclear Information System (INIS)

    AECL Research has developed and evaluated a concept for disposal of Canada's nuclear fuel waste involving deep underground disposal of the waste in intrusive igneous rock of the Canadian Shield. The postclosure assessment of this concept focusses on the effects on human health and the environment due to potential contaminant releases into the biosphere after the disposal vault is closed. Both radiotoxic and chemically toxic contaminants are considered. One of the steps in the postclosure assessment process is scenario analysis. Scenario analysis identifies factors that could affect the performance of the disposal system and groups these factors into scenarios that require detailed quantitative evaluation. This report documents a systematic procedure for scenario analysis that was developed for the postclosure assessment and then applied to the study of a hypothetical disposal system. The application leads to a comprehensive list of factors and a set of scenarios that require further quantitative study. The application also identifies a number of other factors and potential scenarios that would not contribute significantly to environmental and safety impacts for the hypothetical disposal system. (author). 46 refs., 3 tabs., 3 figs., 2 appendices

  8. Component fragility analysis methodology for seismic risk assessment projects. Proven PSA safety document processing and assessment procedures

    International Nuclear Information System (INIS)

    The seismic risk assessment task should be structured as follows: (i) define all reactor unit building structures, components and equipment involved in the creation of an initiating event (IE) induced by a seismic event or contributing to the reliability of the reactor unit response to an IE; (ii) construct an estimate of the fragility curves for the building and component groups under (i); (iii) determine the HCLPF for each group of buildings, components or equipment; (iv) determine the nuclear source's seismic resistance (SME) as the minimum HCLPF over the groups of equipment in the risk-dominant scenarios; (v) define the group of components, equipment and building structures that limits the risk to the SME value; (vi) based on the fragility levels, identify component groups for which a more detailed fragility analysis is needed; and (vii) recommend groups of equipment or building structures that should be taken into account with respect to the seismic risk, i.e. such groups as exhibit a low seismic resistance (HCLPF) and, at the same time, contribute significantly to the reactor unit's seismic risk (are present in the dominant risk scenarios). (P.A.)
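
    Steps (iv) and (v) reduce to a simple minimum over the risk-dominant groups: the SME is the lowest HCLPF among them, and the group attaining it is the risk-limiting one. Group names and HCLPF values (in units of g) below are illustrative placeholders.

    ```python
    # Hedged sketch of steps (iv)-(v): SME as the minimum HCLPF over the
    # equipment groups in the risk-dominant scenarios. All values assumed.

    hclpf = {"reactor_building": 0.45, "diesel_generators": 0.22,
             "service_water_pumps": 0.30, "control_room_panels": 0.28}
    dominant = ["diesel_generators", "service_water_pumps",
                "control_room_panels"]  # groups in dominant risk scenarios

    sme = min(hclpf[g] for g in dominant)
    limiting = min(dominant, key=lambda g: hclpf[g])
    print(sme, limiting)  # → 0.22 diesel_generators
    ```

    Note that the reactor building, despite being listed, does not set the SME here because it does not appear in the dominant scenarios; only risk-relevant groups enter the minimum.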

  9. Quality Assessment of Urinary Stone Analysis: Results of a Multicenter Study of Laboratories in Europe.

    Directory of Open Access Journals (Sweden)

    Roswitha Siener

    Full Text Available After stone removal, accurate analysis of urinary stone composition is the most crucial laboratory diagnostic procedure for the treatment and recurrence prevention in the stone-forming patient. The most common techniques for routine analysis of stones are infrared spectroscopy, X-ray diffraction and chemical analysis. The aim of the present study was to assess the quality of urinary stone analysis of laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of the quality of the laboratory in the present study was based on the attainment of 75% of the maximum total points, i.e. 99 points. The methods of stone analysis used were infrared spectroscopy (n = 7), chemical analysis (n = 1) and X-ray diffraction (n = 1). In the present study only 56% of the laboratories, four using infrared spectroscopy and one using X-ray diffraction, fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference spectra and qualification of the staff for an accurate analysis of stone composition. Regular quality control is essential in carrying out routine stone analysis.

  10. Assessing the influence of Environmental Impact Assessments on science and policy: an analysis of the Three Gorges Project.

    Science.gov (United States)

    Tullos, Desiree

    2009-07-01

    The need to understand and minimize negative environmental outcomes associated with large dams has both contributed to and benefited from the introduction and subsequent improvements in the Environmental Impact Assessment (EIA) process. However, several limitations in the EIA process remain, including those associated with the uncertainty and significance of impact projections. These limitations are directly related to the feedback between science and policy, with information gaps in scientific understanding discovered through the EIA process contributing valuable recommendations on critical focus areas for prioritizing and funding research within the fields of ecological conservation and river engineering. This paper presents an analysis of the EIA process for the Three Gorges Project (TGP) in China as a case study for evaluating this feedback between the EIA and science and policy. For one of the best-studied public development projects in the world, this paper presents an investigation into whether patterns exist between the scientific interest (via number of publications) in environmental impacts and (a) the identification of impacts as uncertain or priority by the EIA, (b) decisions or political events associated with the dam, and (c) impact type. This analysis includes the compilation of literature on TGP, characterization of ecosystem interactions and responses to TGP through a hierarchy of impacts, coding of EIA impacts as "uncertain" impacts that require additional study and "priority" impacts that have particularly high significance, mapping of an event chronology to relate policies, institutional changes, and decisions about TGP as "events" that could influence the focus and intensity of scientific investigation, and analysis of the number of publications by impact type and order within the impact hierarchy. From these analyses, it appears that the availability and consistency of scientific information limit the accuracy of environmental impact

  11. Predictive capacity of risk assessment scales and clinical judgment for pressure ulcers: a meta-analysis.

    Science.gov (United States)

    García-Fernández, Francisco Pedro; Pancorbo-Hidalgo, Pedro L; Agreda, J Javier Soldevilla

    2014-01-01

    A systematic review with meta-analysis was completed to determine the capacity of risk assessment scales and nurses' clinical judgment to predict pressure ulcer (PU) development. Electronic databases were searched for prospective studies on the validity and predictive capacity of PU risk assessment scales published between 1962 and 2010 in English, Spanish, Portuguese, Korean, German, and Greek. We excluded gray literature sources, integrative review articles, and retrospective or cross-sectional studies. The methodological quality of the studies was assessed according to the guidelines of the Critical Appraisal Skills Program. Predictive capacity was measured as relative risk (RR) with 95% confidence intervals. When 2 or more valid original studies were found, a meta-analysis was conducted using a random-effects model and sensitivity analysis. We identified 57 studies, 31 of which included a validation study. We also retrieved 4 studies that tested clinical judgment as a risk prediction factor. Meta-analysis produced the following pooled predictive capacity indicators: Braden (RR = 4.26); Norton (RR = 3.69); Waterlow (RR = 2.66); Cubbin-Jackson (RR = 8.63); EMINA (RR = 6.17); Pressure Sore Predictor Scale (RR = 21.4); and clinical judgment (RR = 1.89). Pooled analysis of 11 studies found adequate risk prediction capacity in various clinical settings; the Braden, Norton, EMINA (mEntal state, Mobility, Incontinence, Nutrition, Activity), Waterlow, and Cubbin-Jackson scales showed the highest predictive capacity. The clinical judgment of nurses was found to have inadequate predictive capacity when used alone, and should be used in combination with a validated scale.
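
    Random-effects pooling of relative risks of the kind reported above is typically done by inverse-variance weighting of log RRs with a DerSimonian-Laird between-study adjustment; the sketch below shows that computation. The study RRs and variances are illustrative inputs, not the scales' actual validation data.

    ```python
    # Hedged sketch of DerSimonian-Laird random-effects pooling of log RRs.
    # Input RRs and log-RR variances are illustrative.
    import math

    def pooled_rr(rrs, var_log_rr):
        logs = [math.log(r) for r in rrs]
        w = [1.0 / v for v in var_log_rr]
        fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
        # Between-study heterogeneity (DerSimonian-Laird tau^2).
        q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(rrs) - 1)) / c)
        wr = [1.0 / (v + tau2) for v in var_log_rr]
        return math.exp(sum(wi * li for wi, li in zip(wr, logs)) / sum(wr))

    # With equal variances the pooled RR is the geometric mean of the RRs.
    print(round(pooled_rr([2.0, 8.0], [0.5, 0.5]), 3))  # → 4.0
    ```

    Pooling on the log scale keeps the RR's multiplicative nature intact; the tau-squared term widens the weights when studies disagree more than sampling error alone would explain.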

  12. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Hongbin; Wang Chunyan; Qi Yao [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); University of Chinese Academy of Sciences, Beijing 100039 (China); Song Fengrui, E-mail: songfr@ciac.jl.cn [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); Liu Zhiqiang; Liu Shuying [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China)

    2012-11-08

    Highlights: • DART MS combined with PCA and HCA was used to rapidly identify markers of Radix Aconiti. • The DART MS behavior of six aconitine-type alkaloids was investigated. • Chemical markers were recognized between the qualified and unqualified samples. • DART MS was shown to be an effective tool for quality control of Radix Aconiti Preparata. - Abstract: This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a widely used traditional herbal medicine. In the method, samples prepared with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach for Radix Aconiti Preparata is based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure safety and efficacy in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples at different processing times. The results showed that well-processed Radix Aconiti Preparata, unqualified processed samples and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination between the qualified and unqualified samples were mainly monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in
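
    The hierarchical clustering step used above can be sketched as single-linkage agglomeration: samples described by marker intensities are merged pairwise by smallest inter-cluster distance until the desired number of clusters remains. The sample coordinates below are synthetic, not DART MS intensities.

    ```python
    # Hedged sketch of single-linkage hierarchical cluster analysis (HCA)
    # on synthetic 2-D "marker intensity" points.
    import math

    def single_linkage(points, n_clusters):
        clusters = [[p] for p in points]
        while len(clusters) > n_clusters:
            # Merge the pair of clusters with the smallest minimum
            # inter-point distance (single linkage).
            i, j = min(((a, b) for a in range(len(clusters))
                        for b in range(a + 1, len(clusters))),
                       key=lambda ab: min(math.dist(p, q)
                                          for p in clusters[ab[0]]
                                          for q in clusters[ab[1]]))
            clusters[i].extend(clusters.pop(j))
        return clusters

    # Two synthetic groups: processed-like (low toxin markers) and raw-like.
    samples = [(1.0, 4.0), (1.2, 4.2), (0.9, 3.9), (5.0, 1.0), (5.2, 1.1)]
    groups = single_linkage(samples, 2)
    print(sorted(len(g) for g in groups))  # → [2, 3]
    ```

    Cutting the dendrogram at two clusters separates the two synthetic groups cleanly, mirroring how the study's HCA separated qualified from unqualified and raw samples.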

  13. Standardization of domestic human reliability analysis and experience of human reliability analysis in probabilistic safety assessment for NPPs under design

    International Nuclear Information System (INIS)

    This paper introduces the background and development activities of domestic standardization of procedure and method for Human Reliability Analysis (HRA) to avoid the intervention of subjectivity by HRA analyst in Probabilistic Safety Assessment (PSA) as possible, and the review of the HRA results for domestic nuclear power plants under design studied by Korea Atomic Energy Research Institute. We identify the HRA methods used for PSA for domestic NPPs and discuss the subjectivity of HRA analyst shown in performing a HRA. Also, we introduce the PSA guidelines published in USA and review the HRA results based on them. We propose the system of a standard procedure and method for HRA to be developed

  14. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Directory of Open Access Journals (Sweden)

    Zhi fang Zhou

    2016-02-01

    Full Text Available Purpose: Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods only emphasize one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the relevant internal logic between these two factors. Design/methodology/approach: The model considers the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. The model studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis on the mutual logical relationship of product performance, value, resource consumption, and environmental load to reveal the symptoms and potentials in different dimensions. Originality/value: This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. To conclude, it verifies the availability of this evaluation and analysis model using a Chinese SUV manufacturer as an example.

  15. Risk Assessment Method for Offshore Structure Based on Global Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zou Tao

    2012-01-01

    Full Text Available Based on global sensitivity analysis (GSA), this paper proposes a new risk assessment method for offshore structure design. The method first quantifies the significance of all random variables and their parameters; by comparing their degrees of importance, minor factors can be neglected, which simplifies the subsequent global uncertainty analysis. Global uncertainty analysis (GUA) is an effective way to study the complexity and randomness of natural events. Since field-measured data and statistical results often carry inevitable errors and uncertainties that lead to inaccurate prediction and analysis, the design-stage risk of offshore structures caused by uncertainties in environmental loads, sea level, and marine corrosion must be taken into account. In this paper, the multivariate compound extreme value distribution model (MCEVD) is applied to predict the extreme sea state of wave, current, and wind. The maximum structural stress and deformation of a Jacket platform are analyzed and compared with different design standards. The calculation results sufficiently demonstrate the new risk assessment method's rationality and security.
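
    Variance-based GSA of the kind used above ranks inputs by first-order Sobol indices: the share of the output variance explained by each input alone. The sketch below estimates them on a grid for a toy linear response with independent uniform inputs; the model is illustrative, not the Jacket platform analysis.

    ```python
    # Hedged sketch of first-order Sobol indices for a toy response
    # Y = 4*X1 + X2 with independent uniform inputs on [0, 1].

    def model(x1, x2):
        return 4.0 * x1 + x2

    n = 101
    grid = [i / (n - 1) for i in range(n)]
    ys = [model(a, b) for a in grid for b in grid]
    mean_y = sum(ys) / len(ys)
    var_y = sum((y - mean_y) ** 2 for y in ys) / len(ys)

    def first_order(which):
        # Var over x_i of the conditional mean E[Y | X_i], divided by Var(Y).
        cond = []
        for a in grid:
            vals = ([model(a, b) for b in grid] if which == 1
                    else [model(b, a) for b in grid])
            cond.append(sum(vals) / len(vals))
        m = sum(cond) / len(cond)
        return (sum((c - m) ** 2 for c in cond) / len(cond)) / var_y

    s1, s2 = first_order(1), first_order(2)
    print(round(s1, 3), round(s2, 3))  # → 0.941 0.059
    ```

    With X1 explaining about 94% of the variance, X2 is the kind of minor factor the method allows the analyst to fix at its nominal value, shrinking the uncertainty analysis.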

  16. Using multi-criteria decision analysis to assess the vulnerability of drinking water utilities.

    Science.gov (United States)

    Joerin, Florent; Cool, Geneviève; Rodriguez, Manuel J; Gignac, Marc; Bouchard, Christian

    2010-07-01

    Outbreaks of microbiological waterborne disease have increased governmental concern regarding the importance of drinking water safety. Considering the multi-barrier approach to safe drinking water may improve management decisions to reduce contamination risks. However, the application of this approach must consider numerous and diverse kinds of information simultaneously. This makes it difficult for authorities to apply the approach to decision making. For this reason, multi-criteria decision analysis can be helpful in applying the multi-barrier approach to vulnerability assessment. The goal of this study is to propose an approach based on a multi-criteria analysis method in order to rank drinking water systems (DWUs) based on their vulnerability to microbiological contamination. This approach is illustrated with an application carried out on 28 DWUs supplied by groundwater in the Province of Québec, Canada. The multi-criteria analysis method chosen is measuring attractiveness by a categorical based evaluation technique methodology allowing the assessment of a microbiological vulnerability indicator (MVI) for each DWU. Results are presented on a scale ranking DWUs from less vulnerable to most vulnerable to contamination. MVI results are tested using a sensitivity analysis on barrier weights and they are also compared with historical data on contamination at the utilities. The investigation demonstrates that MVI provides a good representation of the vulnerability of DWUs to microbiological contamination.
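
    The aggregation behind an MVI-style indicator can be sketched as weighted scoring: each utility receives barrier scores on a 0-100 attractiveness scale, which are combined with weights into a single vulnerability value and ranked. The barriers, weights, and scores below are illustrative assumptions, not the MACBETH model fitted to the 28 Quebec utilities.

    ```python
    # Hedged sketch of multi-criteria vulnerability scoring and ranking.
    # Barrier names, weights and scores are illustrative.

    weights = {"source_protection": 0.4, "treatment": 0.35, "monitoring": 0.25}

    utilities = {
        "DWU-A": {"source_protection": 80, "treatment": 90, "monitoring": 70},
        "DWU-B": {"source_protection": 40, "treatment": 55, "monitoring": 60},
        "DWU-C": {"source_protection": 65, "treatment": 45, "monitoring": 85},
    }

    def mvi(scores):
        # Higher barrier scores mean better protection, hence lower vulnerability.
        return 100 - sum(weights[b] * s for b, s in scores.items())

    ranking = sorted(utilities, key=lambda u: mvi(utilities[u]), reverse=True)
    print(ranking)  # most vulnerable first → ['DWU-B', 'DWU-C', 'DWU-A']
    ```

    In the study the barrier weights were elicited with the MACBETH technique rather than assumed, and the sensitivity analysis on those weights checks how stable the ranking is.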

  17. Assessing the Goodness of Fit of Phylogenetic Comparative Methods: A Meta-Analysis and Simulation Study.

    Directory of Open Access Journals (Sweden)

    Dwueng-Chwuan Jhwueng

    Full Text Available Phylogenetic comparative methods (PCMs) have been applied widely in analyzing data from related species but their fit to data is rarely assessed. Can one determine whether any particular comparative method is typically more appropriate than others by examining comparative data sets? I conducted a meta-analysis of 122 phylogenetic data sets found by searching all papers in JEB, Blackwell Synergy and JSTOR published in 2002-2005 for the purpose of assessing the fit of PCMs. The number of species in these data sets ranged from 9 to 117. I used the Akaike information criterion to compare PCMs, and then fit PCMs to bivariate data sets through REML analysis. Correlation estimates between two traits and bootstrapped confidence intervals of correlations from each model were also compared. For phylogenies of less than one hundred taxa, the Independent Contrast method and the independent, non-phylogenetic models provide the best fit. For bivariate analysis, correlations from different PCMs are qualitatively similar, so that actual correlations from real data seem to be robust to the PCM chosen for the analysis. Therefore, researchers might apply the PCM they believe best describes the evolutionary mechanisms underlying their data.
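
    The model comparison step rests on the Akaike information criterion, AIC = 2k − 2 ln L (lower is better). A minimal sketch with hypothetical REML log-likelihoods and parameter counts, not values from the meta-analysis:

```python
def aic(log_likelihood, k):
    # Akaike information criterion: 2k - 2 ln L, lower is better
    return 2 * k - 2 * log_likelihood

# Hypothetical REML fits of three candidate PCMs to one data set
fits = {
    "non-phylogenetic":      (-120.4, 2),
    "independent contrasts": (-118.9, 2),
    "Ornstein-Uhlenbeck":    (-118.5, 3),  # extra parameter costs 2 AIC units
}

scores = {name: aic(ll, k) for name, (ll, k) in fits.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 1))
```

    Note how the Ornstein-Uhlenbeck fit has the highest likelihood but loses to independent contrasts once its extra parameter is penalized.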

  18. 'Rosatom' sites vulnerability analysis and assessment of their physical protection effectiveness. Methodology and 'tools'

    International Nuclear Information System (INIS)

    Full text: Enhancement of physical protection (PP) efficiency at nuclear sites (NS) of State Corporation (SC) 'Rosatom' is one of the priorities. This issue is reflected in a series of international and Russian documents. PP enhancement at the sites can be achieved through upgrades of both administrative procedures and the technical security system. However, in any case it is first necessary to identify the so-called 'objects of physical protection', that is, answer the question of what we need to protect, and to identify design basis threats (DBT) and adversary models. Answers to these questions constitute the contents of papers on vulnerability analysis (VA) for nuclear sites. Further, it is necessary to answer the question of to what extent we protect these 'objects of physical protection' and the site as a whole; this is the essence of assessment of physical protection effectiveness. In the process of effectiveness assessment at specific Rosatom sites, we assess the effectiveness of the existing physical protection system (PPS) and the proposed options for its upgrades. This also makes it possible to select the optimal option based on a 'cost-efficiency' criterion. Implementation of this work is a mandatory requirement as defined in federal-level documents. In State Corporation 'Rosatom' there are methodologies in place for vulnerability analysis and effectiveness assessment, as well as 'tools' (methods, regulations, computer software) that make it possible to put the above work into practice. There are corresponding regulations developed and approved by the Rosatom senior management. Special software for PPS effectiveness assessment called 'Vega-2', developed by a Rosatom specialized subsidiary, State Enterprise 'Eleron', is designed to assess PPS effectiveness at fixed nuclear sites. It was implemented at practically all the major Rosatom nuclear sites. As of now, this 'Vega-2' software has been certified and prepared for forwarding to the corporation's nuclear sites so

  19. Authorship bias in violence risk assessment? A systematic review and meta-analysis.

    Science.gov (United States)

    Singh, Jay P; Grann, Martin; Fazel, Seena

    2013-01-01

    Various financial and non-financial conflicts of interest have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessment literature by comparing predictive accuracy outcomes in studies where the individuals who designed these instruments were study authors with those of independent investigations. A systematic search from 1966 to 2011 was conducted using PsycINFO, EMBASE, MEDLINE, and US National Criminal Justice Reference Service Abstracts to identify predictive validity studies for the nine most commonly used risk assessment tools. Tabular data from 83 studies comprising 104 samples were collected, information on two-thirds of which was received directly from study authors for the review. Random-effects subgroup analysis and metaregression were used to explore evidence of an authorship effect. We found a substantial and statistically significant authorship effect. Overall, studies authored by tool designers reported predictive validity findings around two times higher than those of investigations reported by independent authors (DOR=6.22 [95% CI=4.68-8.26] in designers' studies vs. DOR=3.08 [95% CI=2.45-3.88] in independent studies). As there was evidence of an authorship effect, we also examined disclosure rates. None of the 25 studies where tool designers or translators were also study authors published a conflict of interest statement to that effect, despite a number of journals requiring that potential conflicts be disclosed. The field of risk assessment would benefit from routine disclosure and registration of research studies. The extent to which similar conflicts of interest exist in those
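
    The effect size the review compares is the diagnostic odds ratio (DOR). A minimal sketch of how a DOR is computed from one 2×2 outcome table; the counts are invented, and the 0.5 continuity correction is a common convention rather than necessarily the review's choice:

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    # DOR = (TP * TN) / (FP * FN); a 0.5 continuity correction is a
    # common convention when any cell is zero (an assumption here)
    if 0 in (tp, fp, fn, tn):
        tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
    return (tp * tn) / (fp * fn)

# Illustrative 2x2 tables (violent outcome vs. high-risk classification)
designer_study = diagnostic_odds_ratio(tp=40, fp=20, fn=10, tn=30)
independent_study = diagnostic_odds_ratio(tp=30, fp=30, fn=20, tn=20)
print(designer_study, independent_study)  # 6.0 1.0
```

    A subgroup meta-analysis then pools log-DORs within the designer-authored and independent groups and compares the pooled estimates, as in the DOR=6.22 vs. DOR=3.08 contrast above.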

  20. Authorship bias in violence risk assessment? A systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jay P Singh

    Full Text Available Various financial and non-financial conflicts of interest have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessment literature by comparing predictive accuracy outcomes in studies where the individuals who designed these instruments were study authors with those of independent investigations. A systematic search from 1966 to 2011 was conducted using PsycINFO, EMBASE, MEDLINE, and US National Criminal Justice Reference Service Abstracts to identify predictive validity studies for the nine most commonly used risk assessment tools. Tabular data from 83 studies comprising 104 samples were collected, information on two-thirds of which was received directly from study authors for the review. Random-effects subgroup analysis and metaregression were used to explore evidence of an authorship effect. We found a substantial and statistically significant authorship effect. Overall, studies authored by tool designers reported predictive validity findings around two times higher than those of investigations reported by independent authors (DOR=6.22 [95% CI=4.68-8.26] in designers' studies vs. DOR=3.08 [95% CI=2.45-3.88] in independent studies). As there was evidence of an authorship effect, we also examined disclosure rates. None of the 25 studies where tool designers or translators were also study authors published a conflict of interest statement to that effect, despite a number of journals requiring that potential conflicts be disclosed. The field of risk assessment would benefit from routine disclosure and registration of research studies. The extent to which similar conflicts of interest exist

  1. Complex health care interventions: Characteristics relevant for ethical analysis in health technology assessment

    Directory of Open Access Journals (Sweden)

    Lysdahl, Kristin Bakke

    2016-03-01

    Full Text Available Complexity entails methodological challenges in assessing health care interventions. In order to address these challenges, a series of characteristics of complexity have been identified in the Health Technology Assessment (HTA) literature. These characteristics are primarily identified and developed to facilitate effectiveness, safety, and cost-effectiveness analysis. However, ethics is also a constitutive part of HTA, and it is not a given that the conceptions of complexity that appear relevant for effectiveness, safety, and cost-effectiveness analysis are also relevant and directly applicable for ethical analysis in HTA. The objective of this article is therefore to identify and elaborate a set of key characteristics of complex health care interventions relevant for addressing ethical aspects in HTA. We start by investigating the relevance of the characteristics of complex interventions, as defined in the HTA literature. Most aspects of complexity found to be important when assessing effectiveness, safety, and efficiency turn out also to be relevant when assessing ethical issues of a given health technology. However, the importance and relevance of the complexity characteristics may differ when addressing ethical issues rather than effectiveness. Moreover, the moral challenges of a health care intervention may themselves contribute to the complexity. After identifying and analysing existing conceptions of complexity, we synthesise a set of five key characteristics of complexity for addressing ethical aspects in HTA: (1) multiple and changing perspectives, (2) indeterminate phenomena, (3) uncertain causality, (4) unpredictable outcomes, and (5) ethical complexity. This may serve as an analytic tool in addressing ethical issues in HTA of complex interventions.

  2. An assessment of turbulence models for linear hydrodynamic stability analysis of strongly swirling jets

    CERN Document Server

    Rukes, Lothar; Oberleithner, Kilian

    2016-01-01

    Linear stability analysis has proven to be a useful tool in the analysis of dominant coherent structures, such as the von Kármán vortex street and the global spiral mode associated with the vortex breakdown of swirling jets. In recent years, linear stability analysis has been applied successfully to turbulent time-mean flows, instead of laminar base flows, which requires turbulence models that account for the interaction of the turbulent field with the coherent structures. To retain the stability equations of laminar flows, the Boussinesq approximation with a spatially nonuniform but isotropic eddy viscosity is typically employed. In this work we assess the applicability of this concept to turbulent strongly swirling jets, a class of flows that is particularly unsuited for isotropic eddy viscosity models. Indeed, we find that unsteady RANS simulations only match experiments with a Reynolds stress model that accounts for an anisotropic eddy viscosity. However, linear stability anal...

  3. Risk Analysis and Assessment of Overtopping Concerning Sea Dikes in the Case of Storm Surge

    Institute of Scientific and Technical Information of China (English)

    王莉萍; 黄桂玲; 陈正寿; 梁丙臣; 刘桂林

    2014-01-01

    Risk analysis and assessment relating to coastal structures has been one of the hot topics in the area of coastal protection recently. In this paper, from the three aspects of joint return period of multiple loads, dike failure rate, and dike continuous risk prevention respectively, three new risk analysis methods concerning overtopping of sea dikes are developed. It is worth noting that the factors of storm surge which lead to overtopping are also considered in the three methods. In order to verify and estimate the effectiveness and reliability of the newly developed methods, quantified mutual information is adopted. By means of case testing, it can be found that different prior variables might be selected individually, according to the requirements of the specific engineering application or the dominance of loads. Based on the selection of prior variables, the corresponding risk analysis method can be successfully applied to practical engineering.

  4. Use of image analysis to assess color response on plants caused by herbicide application

    DEFF Research Database (Denmark)

    Asif, Ali; Streibig, Jens Carl; Duus, Joachim;

    2013-01-01

    by herbicides. The range of color components of green and nongreen parts of the plants and soil in Hue, Saturation, and Brightness (HSB) color space were used for segmentation. The canopy color changes of barley, winter wheat, red fescue, and brome fescue caused by doses of a glyphosate and diflufenican mixture, cycloxydim, diquat dibromide, and fluazifop-p-butyl were described with a log-logistic dose–response model, and the relationship between visual inspection and image analysis was calculated at the effective doses that cause 50% and 90% response (ED50 and ED90, respectively). The ranges of HSB components for the green and nongreen parts of the plants and soil were different. The relative potencies were not significantly different from one, indicating that visual and image analysis estimations were about the same. The comparison results suggest that image analysis can be used to assess color changes of plants...
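
    The dose–response step can be sketched with the four-parameter log-logistic function commonly used for herbicide assays. The parameter values below are illustrative, not fitted values from the study:

```python
def log_logistic(dose, b, c, d, e):
    # Four-parameter log-logistic: c = lower limit, d = upper limit,
    # e = ED50 (inflection dose), b = slope around e
    return c + (d - c) / (1.0 + (dose / e) ** b)

# Hypothetical parameters: response falls from 100% (untreated) toward 0%
b, c, d, e = 2.0, 0.0, 100.0, 50.0

def effective_dose(p):
    # Dose at which the response has dropped to level p (on the c..d scale)
    return e * ((d - c) / (p - c) - 1.0) ** (1.0 / b)

print(round(log_logistic(50.0, b, c, d, e), 1))  # 50.0: half response at the ED50
print(round(effective_dose(10.0), 1))            # ED90: response down to 10%
```

    Relative potency, as compared in the abstract, is the ratio of two such effective doses (e.g. ED50 from visual scoring over ED50 from image analysis); a ratio not significantly different from one means the two assessments agree.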

  5. Analysis of research publications that relate to bioterrorism and risk assessment.

    Science.gov (United States)

    Barker, Gary C

    2013-09-01

    Research relating to bioterrorism and its associated risks is interdisciplinary and is performed with a wide variety of objectives. Although published reports of this research have appeared only in the past decade, there has been a steady increase in their number and a continuous diversification of sources, content, and document types. In this analysis, we explored a large set of published reports, identified from accessible indices using simple search techniques, and tried to rationalize the patterns and connectivity of the research subjects rather than the detailed content. The analysis is based on a connectivity network representation built from author-assigned keywords. Network analysis reveals a strong relationship between research aimed at bioterrorism risks and research identified with public health. Additionally, the network identifies clusters of keywords centered on emergency preparedness and food safety issues. The network structure includes a large amount of meta-information that can be used for assessment and planning of research activity and for framing specific research interests.
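
    A keyword co-occurrence network of the kind described can be built directly from author-assigned keyword lists. The papers and keywords below are invented placeholders, not the study's corpus:

```python
from collections import Counter
from itertools import combinations

# Hypothetical author-assigned keyword lists from indexed papers
papers = [
    ["bioterrorism", "risk assessment", "public health"],
    ["bioterrorism", "food safety", "risk assessment"],
    ["emergency preparedness", "public health", "bioterrorism"],
    ["food safety", "risk assessment"],
]

# Edge weight = number of papers in which two keywords co-occur
edges = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        edges[(a, b)] += 1

# Weighted degree: total co-occurrence count per keyword
degree = Counter()
for (a, b), w in edges.items():
    degree[a] += w
    degree[b] += w

print(degree.most_common(2))
```

    Clusters such as the emergency-preparedness and food-safety groupings mentioned above correspond to densely connected subsets of this weighted graph.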

  6. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    Science.gov (United States)

    Rachid, G; El Fadel, M

    2013-08-15

    This paper presents a SWOT analysis of SEA systems in the Middle East and North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA systems through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision-making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. PMID:23648267

  7. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    Science.gov (United States)

    Rachid, G; El Fadel, M

    2013-08-15

    This paper presents a SWOT analysis of SEA systems in the Middle East and North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA systems through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision-making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region.

  8. A Comparative Analysis on Assessment of Land Carrying Capacity with Ecological Footprint Analysis and Index System Method.

    Directory of Open Access Journals (Sweden)

    Yao Qian

    Full Text Available Land carrying capacity (LCC) explains whether local land resources are effectively used to support economic activities and/or the human population. LCC is commonly evaluated with two approaches, namely ecological footprint analysis (EFA) and the index system method (ISM). EFA is helpful for investigating the effects of different land categories, whereas ISM can be used to evaluate the contributions of social, environmental, and economic factors. Here we compared the two LCC-evaluation approaches with data collected from Xiamen City, a typical region of rapid economic growth and urbanization in China. The results show that LCC assessments with EFA and ISM not only complement each other but are also mutually supportive. Both assessments suggest that decreases in arable land and increasingly high energy consumption have major negative effects on LCC and threaten sustainable development for Xiamen City. It is important for local policy makers, planners and designers to reduce ecological deficits by controlling fossil energy consumption, protecting arable land and forest land from being converted into other land types, and slowing down the speed of urbanization, and to promote sustainability by controlling rural-to-urban immigration, increasing the hazard-free treatment rate of household garbage, and raising energy consumption per unit industrial added value. Although EFA seems more appropriate for estimating LCC for a resource-output or self-sufficient region and ISM is more suitable for a resource-input region, both approaches should be employed when performing LCC assessment anywhere in the world.

  9. [Objective assessment of facial paralysis using infrared thermography and formal concept analysis].

    Science.gov (United States)

    Liu, Xu-Long; Hong, Wen-Xue; Liu, Jie-Min

    2014-04-01

    This paper presented a novel approach to objective assessment of facial nerve paralysis based on infrared thermography and formal concept analysis. Sixty-five patients with facial nerve paralysis on one side were included in the study. The facial temperature distribution images of these 65 patients were captured by infrared thermography every five days during a one-month period. First, the facial thermal images were pre-processed to identify six potential regions of bilateral symmetry by using image segmentation techniques. Then, the temperature differences between the left and right sides of the facial regions were extracted and analyzed. Finally, the authors explored the relationships between the statistical averages of those temperature differences and the House-Brackmann score for objective assessment of the degree of nerve damage in facial nerve paralysis by using formal concept analysis. The results showed that the facial temperature distribution of patients with facial nerve paralysis exhibited a contralateral asymmetry, and the bilateral temperature differences of the facial regions were greater than 0.2 degrees C, whereas in normal healthy individuals these temperature differences were less than 0.2 degrees C. The Spearman correlation coefficient between the bilateral temperature differences of the facial regions and the degree of facial nerve damage was an average of 0.508, which was statistically significant. If the temperature differences of the bilaterally symmetric facial regions were greater than 0.2 degrees C and all were less than 0.5 degrees C, facial nerve paralysis could be determined as mild to moderate; if one of the temperature differences of bilateral symmetry was greater than 0.5 degrees C, facial nerve paralysis could be determined as severe. In conclusion, this paper presents an automated technique for the computerized analysis of thermal images to objectively assess facial nerve related thermal dysfunction by using formal concept analysis theory, which may benefit the clinical diagnosis and
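
    The reported 0.2 °C and 0.5 °C thresholds suggest a simple decision rule, reconstructed here as a hypothetical sketch; the actual study used formal concept analysis rather than this hard-coded rule:

```python
def classify_facial_palsy(delta_temps):
    """Hypothetical rule reconstructed from the thresholds in the abstract.

    delta_temps: bilateral temperature differences (degrees C) for the
    six symmetric facial regions.
    """
    if all(dt < 0.2 for dt in delta_temps):
        return "normal"          # all differences below the asymmetry threshold
    if any(dt > 0.5 for dt in delta_temps):
        return "severe"          # at least one region strongly asymmetric
    return "mild-to-moderate"    # asymmetric, but all differences <= 0.5 C

print(classify_facial_palsy([0.10, 0.15, 0.05, 0.08, 0.12, 0.18]))  # normal
print(classify_facial_palsy([0.30, 0.25, 0.40, 0.22, 0.35, 0.28]))  # mild-to-moderate
print(classify_facial_palsy([0.30, 0.60, 0.40, 0.22, 0.35, 0.28]))  # severe
```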

  10. Taylor Dispersion Analysis as a promising tool for assessment of peptide-peptide interactions.

    Science.gov (United States)

    Høgstedt, Ulrich B; Schwach, Grégoire; van de Weert, Marco; Østergaard, Jesper

    2016-10-10

    Protein-protein and peptide-peptide (self-)interactions are of key importance in understanding the physicochemical behavior of proteins and peptides in solution. However, due to the small size of peptide molecules, characterization of these interactions is more challenging than for proteins. In this work, we show that protein-protein and peptide-peptide interactions (PPIs) can advantageously be investigated by measurement of the diffusion coefficient using Taylor Dispersion Analysis. Through comparison to Dynamic Light Scattering, it was shown that Taylor Dispersion Analysis is well suited for the characterization of protein-protein interactions in solutions of α-lactalbumin and human serum albumin. The peptide-peptide interactions of three selected peptides were then investigated in a concentration range spanning from 0.5 mg/ml up to 80 mg/ml using Taylor Dispersion Analysis. These determinations indicated that multibody interactions significantly affect the PPIs at concentration levels above 25 mg/ml for the two charged peptides. Relative viscosity measurements, performed using the capillary-based setup applied for Taylor Dispersion Analysis, showed that the viscosity of the peptide solutions increased with concentration. Our results indicate that a viscosity difference between run buffer and sample in Taylor Dispersion Analysis may result in overestimation of the measured diffusion coefficient. Thus, Taylor Dispersion Analysis provides a practical, but as yet primarily qualitative, approach to assessment of the colloidal stability of both peptide and protein formulations.
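
    In Taylor Dispersion Analysis the diffusion coefficient follows from the Taylor–Aris relation D = Rc²·tR / (24·σt²), and a hydrodynamic radius can then be derived via Stokes–Einstein. The capillary dimensions and taylorgram numbers below are illustrative assumptions, not the paper's data:

```python
import math

def diffusion_coefficient(radius_m, t_r_s, sigma2_s2):
    # Taylor-Aris relation for an open capillary:
    # D = Rc^2 * tR / (24 * sigma_t^2), with tR the mean residence time
    # and sigma_t^2 the temporal variance of the taylorgram peak
    return radius_m**2 * t_r_s / (24.0 * sigma2_s2)

def hydrodynamic_radius(D, temp_k=298.15, viscosity=8.9e-4):
    # Stokes-Einstein; viscosity of water at 25 C assumed
    k_B = 1.380649e-23
    return k_B * temp_k / (6.0 * math.pi * viscosity * D)

# Illustrative taylorgram: 75 um i.d. capillary, 300 s residence time
D = diffusion_coefficient(radius_m=37.5e-6, t_r_s=300.0, sigma2_s2=175.0)
print(f"D = {D:.2e} m^2/s, Rh = {hydrodynamic_radius(D) * 1e9:.1f} nm")
```

    The viscosity caveat in the abstract maps onto the `viscosity` argument: if the sample is more viscous than the run buffer, using the buffer viscosity in Stokes–Einstein biases the apparent size.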

  11. Addendum to the performance assessment analysis for low-level waste disposal in the 200 west area active burial grounds

    Energy Technology Data Exchange (ETDEWEB)

    Wood, M.I., Westinghouse Hanford

    1996-12-20

    An addendum was completed to the performance assessment (PA) analysis for the active 200 West Area low-level solid waste burial grounds. The addendum includes supplemental information developed during the review of the PA analysis, an ALARA analysis, a comparison of PA results with the Hanford Groundwater Protection Strategy, and a justification for the assumption of 500 year deterrence to the inadvertent intruder.

  12. A methodological framework for hydromorphological assessment, analysis and monitoring (IDRAIM) aimed at promoting integrated river management

    Science.gov (United States)

    Rinaldi, M.; Surian, N.; Comiti, F.; Bussettini, M.

    2015-12-01

    A methodological framework for hydromorphological assessment, analysis and monitoring (named IDRAIM) has been developed with the specific aim of supporting the management of river processes by integrating the objectives of ecological quality and flood risk mitigation. The framework builds on existing and up-to-date geomorphological concepts and approaches and has been tested on several Italian streams. The framework includes the following four phases: (1) catchment-wide characterization of the fluvial system; (2) evolutionary trajectory reconstruction and assessment of current river conditions; (3) description of future trends of channel evolution; and (4) identification of management options. The framework provides specific consideration of the temporal context, in terms of reconstructing the trajectory of past channel evolution as a basis for interpreting present river conditions and future trends. A series of specific tools has been developed for the assessment of river conditions, in terms of morphological quality and channel dynamics. These include: the Morphological Quality Index (MQI), the Morphological Dynamics Index (MDI), the Event Dynamics Classification (EDC), and the river morphodynamic corridors (MC and EMC). The monitoring of morphological parameters and indicators, alongside the assessment of future scenarios of channel evolution provides knowledge for the identification, planning and prioritization of actions for enhancing morphological quality and risk mitigation.

  13. Fast eutrophication assessment for stormwater wet detention ponds via fuzzy probit regression analysis under uncertainty.

    Science.gov (United States)

    Tahsin, Subrina; Chang, Ni-Bin

    2016-02-01

    Stormwater wet detention ponds have been a commonly employed best management practice for stormwater management throughout the world for many years. In the past, trophic state index values have been used to evaluate seasonal changes in water quality and rank lakes within a region or between several regions; yet, to date, there is no similar index for stormwater wet detention ponds. This study aimed to develop a new multivariate trophic state index (MTSI) suitable for conducting a rapid eutrophication assessment of stormwater wet detention ponds under uncertainty with respect to three typical physical and chemical properties. Six stormwater wet detention ponds in Florida were selected for demonstration of the new MTSI with respect to total phosphorus (TP), total nitrogen (TN), and Secchi disk depth (SDD) as cognitive assessment metrics to sense eutrophication potential collectively and inform the environmental impact holistically. Because multiple endogenous variables (i.e., TN, TP, and SDD) are involved in the eutrophication assessment simultaneously under uncertainty, fuzzy synthetic evaluation was applied first to standardize and synchronize the sources of uncertainty in the decision analysis. The ordered probit regression model was then formulated for assessment based on the concept of the MTSI with the inputs from the fuzzy synthetic evaluation. The results indicate that severe eutrophication is present during fall, which might be due to frequent heavy summer storm events contributing high-nutrient inputs to these six ponds. PMID:26733470
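
    The lake-oriented trophic state index the study generalizes is, classically, Carlson's (1977) TSI, computed from Secchi depth or total phosphorus on a common 0-100 scale. A sketch of two of Carlson's formulas (as background; the paper's MTSI is a different, multivariate construction):

```python
import math

def tsi_sd(secchi_m):
    # Carlson (1977) trophic state index from Secchi depth (metres)
    return 60.0 - 14.41 * math.log(secchi_m)

def tsi_tp(tp_ug_per_l):
    # Carlson (1977) trophic state index from total phosphorus (ug/L)
    return 14.42 * math.log(tp_ug_per_l) + 4.15

# A pond with 1 m Secchi depth and 48 ug/L TP: both metrics land near
# TSI = 60, i.e. the eutrophic range, by Carlson's calibration
print(round(tsi_sd(1.0), 1), round(tsi_tp(48.0), 1))
```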

  14. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    Directory of Open Access Journals (Sweden)

    J. Fernandez Galarreta

    2014-09-01

    Full Text Available Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also, the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at the building level.

  15. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of those natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other hand: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a
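
    Swiss-style object risk calculations of the kind the BUWAL framework describes typically multiply a scenario probability by spatial and temporal exposure, vulnerability, and object value. The function and numbers below are an illustrative reconstruction, not the framework's exact formulation:

```python
def object_risk(p_scenario, p_spatial, p_temporal, vulnerability, value):
    # Expected annual damage for one object under one hazard scenario:
    # scenario frequency x probability the object is hit x probability
    # it is occupied/present x expected damage degree x object value
    return p_scenario * p_spatial * p_temporal * vulnerability * value

# Hypothetical road section: 1/100-yr rockfall scenario, always in the
# runout path, occupied 5% of the time, 50% damage degree, 2.0 M CHF value
r = object_risk(1 / 100, 1.0, 0.05, 0.5, 2_000_000)
print(round(r, 2))  # expected damage in CHF per year
```

    Summing such terms over all objects and scenarios along a corridor gives the network-level risk that the GIS tool surveys in space and time.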

  16. Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders

    CERN Document Server

    Baghai-Ravary, Ladan

    2013-01-01

    Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders provides a survey of methods designed to aid clinicians in the diagnosis and monitoring of speech disorders such as dysarthria and dyspraxia, with an emphasis on the signal processing techniques, statistical validity of the results presented in the literature, and the appropriateness of methods that do not require specialized equipment, rigorously controlled recording procedures or highly skilled personnel to interpret results. Such techniques offer the promise of a simple and cost-effective, yet objective, assessment of a range of medical conditions, which would be of great value to clinicians. The ideal scenario would begin with the collection of examples of the clients’ speech, either over the phone or using portable recording devices operated by non-specialist nursing staff. The recordings could then be analyzed initially to aid diagnosis of conditions, and subsequently to monitor the clients’ progress and res...

  17. Analysis of medicostatistical data to assess the genetic and teratogenic effects of the Chernobyl accident

    International Nuclear Information System (INIS)

    Analysis of the official medicodemographic statistical data (provided by the Ukrainian Ministry of Health) revealed variations in the mean rates of congenital developmental defects (CDD) before 1987 (1985-1986) and in the period 1987-1989 in all areas, irrespective of the degree of contamination with radionuclides (i.e. the variations are determined by the time factor rather than by the irradiation factor). According to the medical statistical data, the rates of CDD and spontaneous abortions varied within a wide range, making it difficult to assess probable mutagenic and teratogenic effects of the Chernobyl accident. Medicostatistical data on spontaneous abortions understated the actual rates 2-3-fold, and were therefore not recommended for assessment of the mutagenic effects of the Chernobyl accident

  18. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    Science.gov (United States)

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. Currently used tools for assessing the risk of pressure ulcer development have rarely been evaluated for predictive accuracy, especially in older adults. The current study provides a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. The overall predictive validities for pressure ulcer risk, expressed as pooled sensitivity and specificity, fell within a similar range with a moderate level of accuracy for all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, the primary cause of this heterogeneity. The results indicate that commonly used screening tools for pressure ulcer risk have limited validity and accuracy for use with older adults due to heterogeneity among studies.
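    In the simplest (naive fixed-effect) form, pooled sensitivity and specificity are just ratios of aggregated 2x2 counts; the review above would in practice use more sophisticated bivariate random-effects pooling. The counts below are purely illustrative, not the meta-analysis data:

```python
# Each study contributes a 2x2 table: (TP, FN, FP, TN).
# TP/FN: patients who developed an ulcer, flagged / missed by the tool.
# FP/TN: patients who did not, flagged / cleared by the tool.
studies = [
    (30, 10, 20, 140),
    (45, 15, 25, 215),
    (25, 5, 30, 140),
]

tp = sum(s[0] for s in studies)
fn = sum(s[1] for s in studies)
fp = sum(s[2] for s in studies)
tn = sum(s[3] for s in studies)

pooled_sensitivity = tp / (tp + fn)   # P(test positive | ulcer develops)
pooled_specificity = tn / (tn + fp)   # P(test negative | no ulcer)
```

Note that naive pooling ignores between-study variance, which is exactly the heterogeneity problem the abstract highlights.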

  19. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators

  20. The Structured Assessment Approach: A microcomputer-based insider-vulnerability analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess the vulnerability of safeguards systems to insiders in a staged manner. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions, and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. This microcomputer-based approach to the SAA is discussed in this paper.

  1. Assessing the Queuing Process Using Data Envelopment Analysis: an Application in Health Centres.

    Science.gov (United States)

    Safdar, Komal A; Emrouznejad, Ali; Dey, Prasanta K

    2016-01-01

    Queuing is one of the most important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most studies of patient flow systems have aimed at improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at the outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor and pharmacy. The DEA results indicated that waiting times and other related queuing variables need considerable minimisation at both stages.
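    The abstract does not give the DEA formulation used; as an illustrative sketch, an input-oriented CCR efficiency score can be computed for each decision-making unit with a small linear program. The toy data, the scipy-based solver and all variable names below are assumptions, not taken from the study:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency for each decision-making unit (DMU).
    X: (n_dmu, n_inputs) matrix, e.g. staff hours, mean waiting time.
    Y: (n_dmu, n_outputs) matrix, e.g. patients served.
    Returns efficiency scores in (0, 1]; 1 means on the efficient frontier."""
    n, m = X.shape
    _, s = Y.shape
    scores = []
    for o in range(n):
        # decision variables: [u_1..u_s (output weights), v_1..v_m (input weights)]
        # maximise u . y_o, i.e. minimise -u . y_o
        c = np.concatenate([-Y[o], np.zeros(m)])
        # feasibility: u . y_j - v . x_j <= 0 for every DMU j
        A_ub = np.hstack([Y, -X])
        b_ub = np.zeros(n)
        # normalisation: v . x_o = 1
        A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m))
        scores.append(-res.fun)
    return np.array(scores)

# hypothetical clinic stages: 1 input (staff hours), 1 output (patients seen)
X = np.array([[8.0], [10.0], [8.0], [12.0]])
Y = np.array([[40.0], [40.0], [32.0], [36.0]])
eff = dea_ccr_efficiency(X, Y)
```

With a single input and output this reduces to each unit's output/input ratio relative to the best ratio, which makes the scores easy to sanity-check by hand.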

  2. Neural image analysis in the process of quality assessment: domestic pig oocytes

    Science.gov (United States)

    Boniecki, P.; Przybył, J.; Kuzimska, T.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.

    2014-04-01

    Numerous scientific and research centres explore questions related to the quality classification of animal oocytes. This research is important, particularly in the context of improving the breeding value of farm animals. Of special importance are methods that stimulate the normal development of a larger number of fertilised animal oocytes under extracorporeal conditions. Growing interest in techniques of assisted reproduction has led to a search for new, increasingly effective methods for the quality assessment of mammalian gametes and embryos. Progress in the in vitro production of animal embryos in fact depends on the proper classification of the obtained oocytes. The aim of this paper was to develop an original method for the quality assessment of oocytes, performed on the basis of their graphical presentation in the form of digital microscopic images. The classification process was implemented on the basis of information coded in the form of microphotographic pictures of domestic pig oocytes, using modern methods of neural image analysis.

  3. Information Technology Project Portfolio and Strategy Alignment Assessment Based on Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Marisa Analía Sánchez

    2012-11-01

    Recent research has shown that companies face considerable difficulties in assessing the strategic value contribution of Information Technology (IT) investments. One of the major obstacles to achieving strategy alignment is that organizations find it extremely difficult to link and quantify the benefits of IT investments with strategic goals. The aim of this paper is to define an approach to assess portfolio-strategy alignment. To this end, a formal specification of the Kaplan and Norton Strategy Map is developed using the Unified Modeling Language (UML). The approach uses the Strategy Map as a framework for defining the portfolio value contribution, and Data Envelopment Analysis (DEA) is used as the methodology for measuring the efficiency of project portfolios. DOI: 10.5585/gep.v3i2.66

  4. Environmental impact assessment in Colombia: Critical analysis and proposals for improvement

    International Nuclear Information System (INIS)

    The evaluation of Environmental Impact Assessment (EIA) systems is a highly recommended strategy for enhancing their effectiveness and quality. This paper describes an evaluation of EIA in Colombia, using the model and the control mechanisms proposed and applied in other countries by Christopher Wood and Ortolano. The evaluation criteria used are based on principles of EIA best practice, such as effectiveness and control features, and they were contrasted with the opinions of a panel of Colombian EIA experts as a means of validating the results of the study. The evaluation found that EIA regulations in Colombia are ineffective because of their limited scope, inadequate administrative support and the absence of effective control mechanisms and public participation. The analysis resulted in a series of recommendations regarding the further development of the EIA system in Colombia with a view to improving its quality and effectiveness.

  5. Nondestructive inspection assessment of eddy current and electrochemical analysis to separate Inconel and stainless steel alloys

    Energy Technology Data Exchange (ETDEWEB)

    Moore, D.G.; Sorensen, N.R.

    1998-02-01

    This report presents a nondestructive inspection assessment of eddy current and electrochemical analysis to separate Inconel alloys from stainless steel alloys, as well as an evaluation of cleaning techniques to remove the thermal oxide layer on aircraft exhaust components. The results of this assessment are presented in terms of how effectively each technique classifies a known exhaust material. Results indicate that either inspection technique can separate Inconel and stainless steel alloys. Based on the experiments conducted, the electrochemical spot test is the optimal choice for use by airframe and powerplant mechanics. A spot test procedure is proposed for incorporation into the Federal Aviation Administration Advisory Circular 65-9A Airframe & Powerplant Mechanic - General Handbook. 3 refs., 70 figs., 7 tabs.

  6. Assessment of Static Delamination Propagation Capabilities in Commercial Finite Element Codes Using Benchmark Analysis

    Science.gov (United States)

    Orifici, Adrian C.; Krueger, Ronald

    2010-01-01

    With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc™ and MD Nastran™. Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously, and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementation in Marc™ and MD Nastran™ was capable of accurately replicating the benchmark delamination growth results, and that the use of the numerical benchmarks offers advantages over benchmarking using experimental and analytical results.

  7. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and combining artificial intelligence solutions with advanced statistical analysis. This combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural-network-based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  8. Assessing explanatory style: the content analysis of verbatim explanations and the Attributional Style Questionnaire.

    Science.gov (United States)

    Schulman, P; Castellon, C; Seligman, M E

    1989-01-01

    We compare two methods of assessing explanatory style: the content analysis of verbatim explanations (CAVE) and the Attributional Style Questionnaire (ASQ). The CAVE technique is a method that allows the researcher to analyze any naturally occurring verbatim material for explanatory style. This technique permits the measurement of various populations that are unwilling or unable to take the ASQ. We administered the ASQ and Beck Depression Inventory (BDI) to 169 undergraduates and content-analyzed the written causes on the ASQ for explanatory style by the CAVE technique. The CAVE technique correlated 0.71 with the ASQ (p < 0.0001, n = 159) and -0.36 with the BDI (p < 0.0001, n = 159). The ASQ correlated -0.51 with the BDI (p < 0.0001, n = 160). Both the CAVE technique and the ASQ appear to be valid devices for assessing explanatory style. PMID:2818415

  9. A framework for techno-economic & environmental sustainability analysis by risk assessment for conceptual process evaluation

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Carvalho, Ana; Gernaey, Krist;

    2016-01-01

    The need to achieve a sustainable process performance has become increasingly important in order to keep a competitive advantage in the global markets. Development of comprehensive and systematic methods to accomplish this goal is the subject of this work. To this end, a multi-level framework...... for techno-economic and environmental sustainability analysis through risk assessment is proposed for the early-stage design and screening of conceptual process alternatives. The alternatives within the design space are analyzed following the framework’s work-flow, which targets the following: (i) quantify...... for the quantitative and qualitative assessment of sustainability at the decision-support level. Through the application of appropriate methods in a hierarchical manner, this tool leads to the identification of the potentially best and more sustainable solutions. Furthermore, the application of the framework...

  10. Assessment of ecological risks at former landfill site using TRIAD procedure and multicriteria analysis.

    Science.gov (United States)

    Sorvari, Jaana; Schultz, Eija; Haimi, Jari

    2013-02-01

    Old industrial landfills are important sources of environmental contamination in Europe, including Finland. In this study, we demonstrated the combination of the TRIAD procedure, multicriteria decision analysis (MCDA), and statistical Monte Carlo analysis for assessing the risks to terrestrial biota at a former landfill site contaminated by petroleum hydrocarbons (PHCs) and metals. First, we generated hazard quotients by dividing the concentrations of metals and PHCs in soil by the corresponding risk-based ecological benchmarks. Then we conducted ecotoxicity tests using five plant species, earthworms, and potworms, and determined the abundance and diversity of soil invertebrates from additional samples. We aggregated the results in accordance with the methods used in the TRIAD procedure, rated the assessment methods based on their performance against specific criteria, and weighted the criteria using two alternative weighting techniques to produce performance scores for each method. We faced problems in using the TRIAD procedure; for example, the results from the animal counts had to be excluded from the calculation of integrated risk estimates (IREs) because our reference soil sample showed the lowest biodiversity and abundance of soil animals. In addition, hormesis hampered the use of the results from the ecotoxicity tests. The final probabilistic IREs imply significant risks at all sampling locations. Although linking MCDA with TRIAD provided a useful means to study and consider the performance of the alternative methods in predicting ecological risks, some of the uncertainties involved remained outside the quantitative analysis. PMID:22762796
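    The first step described above, hazard quotients as measured concentration divided by a risk-based benchmark, can be sketched as follows. All concentrations and benchmark values here are purely illustrative, not the study's data:

```python
# Hypothetical soil concentrations (mg/kg) and risk-based ecological benchmarks
concentrations = {"zinc": 420.0, "lead": 150.0, "PHC_C10_C21": 900.0}
benchmarks = {"zinc": 200.0, "lead": 300.0, "PHC_C10_C21": 300.0}

# Hazard quotient: measured concentration divided by its benchmark
hq = {k: concentrations[k] / benchmarks[k] for k in concentrations}

# HQ > 1 flags a potential ecological risk for that contaminant
exceeding = sorted(k for k, v in hq.items() if v > 1.0)
```

In the TRIAD procedure such chemistry-line quotients are then aggregated with the ecotoxicity and ecology lines of evidence rather than used in isolation.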

  11. [Multi-component quantitative analysis combined with chromatographic fingerprint for quality assessment of Onosma hookeri].

    Science.gov (United States)

    Aga, Er-bu; Nie, Li-juan; Dongzhi, Zhuo-ma; Wang, Ju-le

    2015-11-01

    A method for the simultaneous determination of shikonin, acetyl shikonin and β,β'-dimethylpropene shikonin in Onosma hookeri, combined with a chromatographic fingerprint, was established by HPLC-DAD on an Agilent Zorbax SB column with gradient elution of acetonitrile and water at 0.8 mL x min(-1), 30 degrees C. The quality assessment was conducted by comparing the content differences of the three naphthoquinone constituents, in combination with chromatographic fingerprint analysis and systems cluster analysis, among 7 batches of radix O. hookeri. The content of the three naphthoquinone constituents varied widely across the 7 batches. The similarity value of the fingerprints of samples 5, 6 and 7 was above 0.99, samples 2 and 3 above 0.97, samples 3 and 4 above 0.90, and the other samples larger than 0.8, which was consistent with the content of the three naphthoquinone constituents. The 7 samples were roughly divided into 4 categories. These results indicate that the sourcing of this medicine is complex and its quality rather uneven. The established HPLC fingerprints and the quantitative analysis method can be used efficiently for quality assessment of O. hookeri.

  12. Methods for the analysis of ordinal response data in medical image quality assessment.

    Science.gov (United States)

    Keeble, Claire; Baxter, Paul D; Gislason-Lee, Amber J; Treadgold, Laura A; Davies, Andrew G

    2016-07-01

    The assessment of image quality in medical imaging often requires observers to rate images for some metric or detectability task. These subjective results are used in optimization, radiation dose reduction or system comparison studies and may be compared to objective measures from a computer vision algorithm performing the same task. One popular scoring approach is to use a Likert scale, then assign consecutive numbers to the categories. The mean of these response values is then taken and used for comparison with the objective or second subjective response. Agreement is often assessed using correlation coefficients. We highlight a number of weaknesses in this common approach, including inappropriate analyses of ordinal data and the inability to properly account for correlations caused by repeated images or observers. We suggest alternative data collection and analysis techniques such as amendments to the scale and multilevel proportional odds models. We detail the suitability of each approach depending upon the data structure and demonstrate each method using a medical imaging example. Whilst others have raised some of these issues, we evaluated the entire study from data collection to analysis, suggested sources for software and further reading, and provided a checklist plus flowchart for use with any ordinal data. We hope that raised awareness of the limitations of the current approaches will encourage greater method consideration and the utilization of a more appropriate analysis. More accurate comparisons between measures in medical imaging will lead to a more robust contribution to the imaging literature and ultimately improved patient care. PMID:26975497
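    The paper recommends multilevel proportional-odds models, which are not reproduced here. As a lighter-weight illustration of the same point, the snippet below (with invented 5-point Likert ratings) contrasts the common mean-of-categories practice with rank-based statistics that respect the ordinal scale:

```python
from scipy import stats

# Hypothetical 5-point Likert ratings (1 = very poor ... 5 = excellent)
# given by two observers to the same set of images.
obs_a = [2, 3, 3, 4, 4, 4, 5, 5]
obs_b = [1, 2, 3, 3, 4, 4, 4, 5]

# Common but questionable practice: treat category labels as numbers and average
mean_a = sum(obs_a) / len(obs_a)
mean_b = sum(obs_b) / len(obs_b)

# Rank-based alternatives respect the ordinal scale:
rho, _ = stats.spearmanr(obs_a, obs_b)        # monotonic association
u_stat, p = stats.mannwhitneyu(obs_a, obs_b)  # shift between rating distributions
```

The rank-based measures make no assumption that the gap between "poor" and "fair" equals the gap between "good" and "excellent", which is exactly the assumption the naive mean imposes.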

  13. Pesticide Flow Analysis to Assess Human Exposure in Greenhouse Flower Production in Colombia

    Directory of Open Access Journals (Sweden)

    Claudia R. Binder

    2013-03-01

    Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. This approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry method for dermal exposure and the button personal inhalable aerosol sampler for inhalation exposure, using the tracer uranine as a pesticide surrogate. The case study was a greenhouse rose farm in the Bogota Plateau in Colombia. The approach was applied to estimate the exposure to pesticides such as mancozeb, carbendazim, propamocarb hydrochloride, fosetyl, carboxin, thiram, dimethomorph and mandipropamide. We found dermal absorption estimations close to the AOEL reference values for the pesticides carbendazim, mancozeb, thiram and mandipropamide during the study period. In addition, high values of dermal exposure were found on the forearms, hands, chest and legs of study participants, indicating weaknesses in the overlapping areas of the personal protective equipment parts. These results show how the material flow analysis methodology can be applied in the field of human exposure for early recognition of the dispersion of pesticides and support the development of measures to improve operational safety during pesticide management. Furthermore, the model makes it possible to identify the status quo of the health risk faced by workers in the study area.

  14. Pesticide flow analysis to assess human exposure in greenhouse flower production in Colombia.

    Science.gov (United States)

    Lesmes-Fabian, Camilo; Binder, Claudia R

    2013-03-25

    Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. This approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry method for dermal exposure and the button personal inhalable aerosol sampler for inhalation exposure, using the tracer uranine as a pesticide surrogate. The case study was a greenhouse rose farm in the Bogota Plateau in Colombia. The approach was applied to estimate the exposure to pesticides such as mancozeb, carbendazim, propamocarb hydrochloride, fosetyl, carboxin, thiram, dimethomorph and mandipropamide. We found dermal absorption estimations close to the AOEL reference values for the pesticides carbendazim, mancozeb, thiram and mandipropamide during the study period. In addition, high values of dermal exposure were found on the forearms, hands, chest and legs of study participants, indicating weaknesses in the overlapping areas of the personal protective equipment parts. These results show how the material flow analysis methodology can be applied in the field of human exposure for early recognition of the dispersion of pesticides and support the development of measures to improve operational safety during pesticide management. Furthermore, the model makes it possible to identify the status quo of the health risk faced by workers in the study area.
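    The core of the material-flow approach above is multiplying an applied pesticide mass by measured transfer coefficients per body region. A minimal sketch, in which the applied mass, coefficients and AOEL value are all invented for illustration rather than taken from the study:

```python
# Pesticide applied per work day (mg); illustrative value
applied_mg = 500.0

# Hypothetical transfer coefficients: fraction of applied mass reaching
# the skin of each body region (in the study these came from whole-body dosimetry)
transfer = {"hands": 0.004, "forearms": 0.003, "chest": 0.002, "legs": 0.003}

# Dermal exposure per region, and the daily total
exposure = {region: applied_mg * tc for region, tc in transfer.items()}
total_dermal = sum(exposure.values())

# Compare against a hypothetical acceptable operator exposure level (AOEL)
aoel_mg = 5.0
exceeds_aoel = total_dermal > aoel_mg
```

Per-region results like these are what allow the study to point at specific weaknesses, such as the overlap areas of protective equipment, rather than just a single aggregate dose.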

  15. Texture analysis for the assessment of structural changes in parotid glands induced by radiotherapy

    International Nuclear Information System (INIS)

    Background and purpose: During radiotherapy (RT) for head-and-neck cancer, parotid glands undergo significant anatomic, functional and structural changes which could represent pre-clinical signs of an increased risk of xerostomia. Texture analysis is proposed to assess the structural changes of parotids induced by RT, and to investigate whether early variations of textural parameters (such as mean intensity and fractal dimension) can predict parotid shrinkage at the end of treatment. Material and methods: Textural parameters and volumes of 42 parotids from 21 patients treated with intensity-modulated RT for nasopharyngeal cancer were extracted from CT images. To identify which parameters changed during RT, a Wilcoxon signed-rank test was performed between textural indices (first and second RT week; first and last RT week). Discriminant analysis was applied to the variations of these parameters over the first two weeks of RT to assess their power in predicting parotid shrinkage at the end of RT. Results: A significant decrease in mean intensity (1.7 HU and 3.8 HU after the second and last weeks, respectively) and fractal dimension (0.016 and 0.021) was found. Discriminant analysis, based on volume and fractal dimension, was able to predict the final parotid shrinkage (accuracy of 71.4%). Conclusion: Textural features could be used in combination with volume to characterize structural modifications of the parotid glands and to predict parotid shrinkage at the end of RT
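    The paper's exact fractal-dimension estimator for grey-level CT texture is not given here; a common textbook illustration of the concept is box counting on a binary mask, where occupied boxes are counted at several box sizes and the dimension is the slope of log(count) against log(1/size). The implementation below is such a sketch, not the study's method:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary 2-D mask by box counting:
    count occupied s-by-s boxes for several box sizes s, then fit
    log(count) against log(1/s)."""
    counts = []
    for s in sizes:
        h, w = mask.shape
        # trim so the grid of s-by-s boxes tiles the image exactly
        trimmed = mask[: h - h % s, : w - w % s]
        # reshape into s-by-s blocks and flag the occupied ones
        blocks = trimmed.reshape(trimmed.shape[0] // s, s,
                                 trimmed.shape[1] // s, s)
        occupied = blocks.any(axis=(1, 3))
        counts.append(occupied.sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity checks: a filled square has dimension 2, a straight line dimension 1
filled = np.ones((64, 64), dtype=bool)
dim_filled = box_counting_dimension(filled)

line = np.zeros((64, 64), dtype=bool)
line[0, :] = True
dim_line = box_counting_dimension(line)
```

For grey-level texture, estimators such as differential box counting treat intensity as a surface height instead of thresholding to a mask, but the log-log slope idea is the same.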

  16. Resource efficiency of urban sanitation systems. A comparative assessment using material and energy flow analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meinzinger, Franziska

    2010-07-01

    Within the framework of sustainable development, it is important to find ways of reducing natural resource consumption and to move towards closed-loop management. As in many other spheres, increased resource efficiency has become an important issue in sanitation. In particular, nutrient recovery for agriculture, increased energy efficiency and the saving of natural water resources can contribute to more resource-efficient sanitation systems. Assessing the resource efficiency of alternative developments requires a systems perspective. The present study applies a combined cost, energy and material flow analysis (ceMFA) as a system analysis method to assess the resource efficiency of urban sanitation systems. This includes the discussion of relevant criteria and assessment methods. The main focus of this thesis is the comparative assessment of different systems, based on two case studies: Hamburg in Germany and Arba Minch in Ethiopia. A range of possible system developments, including source separation (e.g. diversion of urine or blackwater), is defined and compared with the current situation as a reference system. The assessment is carried out using computer simulations based on model equations. The model equations integrate not only mass and nutrient flows but also the energy and cost balances of the different systems. In order to assess the impact of different assumptions and calculation parameters, sensitivity analyses and parameter variations complete the calculations. Based on the simulations, the following general conclusions can be drawn: none of the systems shows an overall benefit with regard to all investigated criteria, namely nutrients, energy, water and costs. Yet the results of the system analysis can be used as a basis for decision making if a case-related weighting is introduced. The systems show varying potential for the recovery of nutrients from (source separated) wastewater flows. For the case study of Hamburg up to 29% of the mineral

  17. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    International Nuclear Information System (INIS)

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs

  18. Assessing the effect of data pretreatment procedures for principal components analysis of chromatographic data.

    Science.gov (United States)

    McIlroy, John W; Smith, Ruth Waddell; McGuffin, Victoria L

    2015-12-01

    Following publication of the National Academy of Sciences report "Strengthening Forensic Science in the United States: A Path Forward", there has been increasing interest in the application of multivariate statistical procedures for the evaluation of forensic evidence. However, prior to statistical analysis, variance from sources other than the sample must be minimized through application of data pretreatment procedures. This is necessary to ensure that subsequent statistical analysis of the data provides meaningful results. The purpose of this work was to evaluate the effect of pretreatment procedures on multivariate statistical analysis of chromatographic data obtained for a reference set of diesel fuels. Diesel was selected due to its chemical complexity and forensic relevance, both for fire debris and environmental forensic applications. Principal components analysis (PCA) was applied to the untreated chromatograms to assess association of replicates and discrimination among the different diesel samples. The chromatograms were then pretreated by sequentially applying the following procedures: background correction, smoothing, retention-time alignment, and normalization. The effect of each procedure on association and discrimination was evaluated based on the association of replicates in the PCA scores plot. For these data, background correction and smoothing offered minimal improvement, whereas alignment and normalization offered the greatest improvement in the association of replicates and discrimination among highly similar samples. Further, prior to pretreatment, the first principal component accounted for only non-sample sources of variance. Following pretreatment, these sources were minimized and the first principal component accounted for significant chemical differences among the diesel samples. These results highlight the need for pretreatment procedures and provide a metric to assess the effect of pretreatment on subsequent multivariate statistical
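    The pretreatment sequence described above (background correction, smoothing, retention-time alignment, normalization, then PCA) can be sketched end to end on synthetic data. Everything below is an illustrative stand-in: the toy "chromatograms", the linear baseline fit, moving-average smoothing and peak-maximum alignment are simple proxies for the study's actual procedures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "chromatograms": one Gaussian peak per sample on a sloping
# baseline, with a random retention-time shift and amplitude scaling.
t = np.linspace(0, 10, 500)

def chromatogram(shift, scale):
    peak = scale * np.exp(-((t - 5.0 - shift) ** 2) / 0.1)
    baseline = 0.05 * t                      # background drift
    return peak + baseline + rng.normal(0, 0.01, t.size)

X = np.array([chromatogram(rng.uniform(-0.2, 0.2), rng.uniform(0.8, 1.2))
              for _ in range(6)])

def debaseline(x):                  # 1. background correction: remove linear fit
    coef = np.polyfit(t, x, 1)
    return x - np.polyval(coef, t)

def smooth(x, w=5):                 # 2. smoothing: simple moving average
    return np.convolve(x, np.ones(w) / w, mode="same")

def align(x, target):               # 3. alignment: put the peak maximum at a common index
    return np.roll(x, target - int(np.argmax(x)))

def normalise(x):                   # 4. normalization: unit total intensity
    return x / np.abs(x).sum()

target = int(np.argmax(X[0]))
Xp = np.array([normalise(align(smooth(debaseline(x)), target)) for x in X])

# PCA via SVD on the mean-centred, pretreated data
Xc = Xp - Xp.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / (S**2).sum()   # variance fraction per principal component
scores = U * S                    # sample coordinates in the PCA scores plot
```

Running PCA before and after the pretreatment steps reproduces the paper's point: without them, the leading component tends to track baseline and shift artifacts rather than chemical differences between samples.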

  19. Assessing the effect of data pretreatment procedures for principal components analysis of chromatographic data.

    Science.gov (United States)

    McIlroy, John W; Smith, Ruth Waddell; McGuffin, Victoria L

    2015-12-01

    Following publication of the National Academy of Sciences report "Strengthening Forensic Science in the United States: A Path Forward", there has been increasing interest in the application of multivariate statistical procedures for the evaluation of forensic evidence. However, prior to statistical analysis, variance from sources other than the sample must be minimized through application of data pretreatment procedures. This is necessary to ensure that subsequent statistical analysis of the data provides meaningful results. The purpose of this work was to evaluate the effect of pretreatment procedures on multivariate statistical analysis of chromatographic data obtained for a reference set of diesel fuels. Diesel was selected due to its chemical complexity and forensic relevance, both for fire debris and environmental forensic applications. Principal components analysis (PCA) was applied to the untreated chromatograms to assess association of replicates and discrimination among the different diesel samples. The chromatograms were then pretreated by sequentially applying the following procedures: background correction, smoothing, retention-time alignment, and normalization. The effect of each procedure on association and discrimination was evaluated based on the association of replicates in the PCA scores plot. For these data, background correction and smoothing offered minimal improvement, whereas alignment and normalization offered the greatest improvement in the association of replicates and discrimination among highly similar samples. Further, prior to pretreatment, the first principal component accounted for only non-sample sources of variance. Following pretreatment, these sources were minimized and the first principal component accounted for significant chemical differences among the diesel samples. These results highlight the need for pretreatment procedures and provide a metric to assess the effect of pretreatment on subsequent multivariate statistical analysis.
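    A minimal sketch of two of the pretreatment steps described above (total-area normalization and mean-centering) followed by PCA via SVD. This uses numpy only and synthetic data; background correction, smoothing, and retention-time alignment are omitted for brevity.

```python
import numpy as np

def pretreat(chromatograms):
    """Normalize each chromatogram to unit total area, then
    mean-center across samples (two of the steps discussed above)."""
    X = np.asarray(chromatograms, dtype=float)
    X = X / X.sum(axis=1, keepdims=True)  # total-area normalization
    return X - X.mean(axis=0)             # mean-centering

def pca_scores(X, n_components=2):
    """Project pretreated data onto its first principal components via SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :n_components] * s[:n_components]

# Toy data: 4 "chromatograms" of 6 retention-time points each
rng = np.random.default_rng(0)
raw = rng.random((4, 6)) * 100
scores = pca_scores(pretreat(raw))
print(scores.shape)  # (4, 2)
```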

  20. Mercury analysis in hair: Comparability and quality assessment within the transnational COPHES/DEMOCOPHES project.

    Science.gov (United States)

    Esteban, Marta; Schindler, Birgit Karin; Jiménez, José Antonio; Koch, Holger Martin; Angerer, Jürgen; Rosado, Montserrat; Gómez, Silvia; Casteleyn, Ludwine; Kolossa-Gehring, Marike; Becker, Kerstin; Bloemen, Louis; Schoeters, Greet; Den Hond, Elly; Sepai, Ovnair; Exley, Karen; Horvat, Milena; Knudsen, Lisbeth E; Joas, Anke; Joas, Reinhard; Aerts, Dominique; Biot, Pierre; Borošová, Daniela; Davidson, Fred; Dumitrascu, Irina; Fischer, Marc E; Grander, Margaretha; Janasik, Beata; Jones, Kate; Kašparová, Lucie; Larssen, Thorjørn; Naray, Miklos; Nielsen, Flemming; Hohenblum, Philipp; Pinto, Rui; Pirard, Catherine; Plateel, Gregory; Tratnik, Janja Snoj; Wittsiepe, Jürgen; Castaño, Argelia

    2015-08-01

    Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to Human Biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers and four external quality-assessment exercises (ICI/EQUAS), followed by the corresponding web conferences, were organized between March 2011 and February 2012. ICI/EQUAS used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. Web conferences after each ICI/EQUAS revealed this to be a new and effective tool for improving analytical performance and increasing capacity building. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury. PMID:25483984
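    The relative standard deviations reported for the ICI/EQUAS exercises follow from the standard percent-RSD formula; a minimal sketch with hypothetical laboratory results (the values below are not from the study):

```python
def relative_std_dev(values):
    """Percent relative standard deviation (RSD), the comparability
    metric reported for the interlaboratory exercises."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return 100.0 * var ** 0.5 / mean

# Hypothetical results from five laboratories for one hair sample
labs = [0.52, 0.49, 0.55, 0.50, 0.53]
print(round(relative_std_dev(labs), 2))  # 4.61
```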

  1. The Development and Implementation of an Instrument to Assess Students’ Data Analysis Skills in Molecular Biology

    Directory of Open Access Journals (Sweden)

    Brian J. Rybarczyk

    2014-03-01

    Full Text Available Developing visual literacy skills is an important component of scientific literacy in undergraduate science education. Comprehension, analysis, and interpretation are components of visual literacy that describe data analysis skills important for learning in the biological sciences. The Molecular Biology Data Analysis Test (MBDAT) was developed to measure students' data analysis skills connected with scientific reasoning when analyzing and interpreting scientific data generated from experimental research. The skills assessed included basic skills, such as identifying patterns and trends in data and connecting data to the method that generated them, and advanced skills, such as distinguishing positive and negative controls, synthesizing conclusions, determining whether data support a hypothesis, and predicting alternative or next-step experiments. Construct and content validity were established, and the calculated statistical parameters demonstrate that the MBDAT is valid and reliable for measuring students' data analysis skills in molecular and cell biology contexts. The instrument also measures students' perceived confidence in their data interpretation abilities. As scientific research continues to evolve in complexity, interpretation of scientific information in visual formats will continue to be an important component of scientific literacy. Thus science education will need to support and assess students' development of these skills as part of students' scientific training.

  2. Safety assessment technology on the free drop impact and puncture analysis of the cask for radioactive material transport

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dew Hey [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Lee, Young Shin; Ryu, Chung Hyun; Kim, Hyun Su; Lee, Ho Chul; Hong, Song Jin; Choi, Young Jin; Lee, Jae Hyung; Na, Jae Yun [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-03-15

    In this study, the regulatory and analysis conditions for free drop and puncture impact analyses are examined to develop a safety assessment technology. Impact analysis is performed with the finite element method, one of several analysis methods for shipping casks. LS-DYNA3D and ABAQUS are suitable for free drop and puncture impact analysis of shipping casks. For the analysis model, the KSC-4 shipping cask, used to transport spent nuclear fuel, is investigated. The results from LS-DYNA3D and ABAQUS correspond completely, and the integrity of the shipping cask is verified. This study provides regulatory staff with a reliable safety assessment technology, enabling efficient and reliable regulatory work based on a standard safety assessment technology.

  3. Safety assessment technology on the free drop impact and puncture analysis of the cask for radioactive material transport

    International Nuclear Information System (INIS)

    In this study, the regulatory and analysis conditions for free drop and puncture impact analyses are examined to develop a safety assessment technology. Impact analysis is performed with the finite element method, one of several analysis methods for shipping casks. LS-DYNA3D and ABAQUS are suitable for free drop and puncture impact analysis of shipping casks. For the analysis model, the KSC-4 shipping cask, used to transport spent nuclear fuel, is investigated. The results from LS-DYNA3D and ABAQUS correspond completely, and the integrity of the shipping cask is verified. This study provides regulatory staff with a reliable safety assessment technology, enabling efficient and reliable regulatory work based on a standard safety assessment technology.

  4. OVERVIEW ON BNL ASSESSMENT OF SEISMIC ANALYSIS METHODS FOR DEEPLY EMBEDDED NPP STRUCTURES.

    Energy Technology Data Exchange (ETDEWEB)

    XU,J.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H.

    2007-04-01

    A study was performed by Brookhaven National Laboratory (BNL) under the sponsorship of the U.S. Nuclear Regulatory Commission (USNRC), to determine the applicability of established soil-structure interaction (SSI) analysis methods and computer programs to deeply embedded and/or buried (DEB) nuclear power plant (NPP) structures. This paper provides an overview of the BNL study, including a description and discussion of analyses performed to assess the relative performance of various SSI analysis methods typically applied to NPP structures, as well as the importance of interface modeling for DEB structures. There are four main elements contained in the BNL study: (1) Review and evaluation of existing seismic design practice, (2) Assessment of simplified vs. detailed methods for SSI in-structure response spectrum analysis of DEB structures, (3) Assessment of methods for computing seismic induced earth pressures on DEB structures, and (4) Development of the criteria for benchmark problems which could be used for validating computer programs for computing seismic responses of DEB NPP structures. The BNL study concluded that the equivalent linear SSI methods, including both simplified and detailed approaches, can be extended to DEB structures and produce acceptable SSI response calculations, provided that the SSI response induced by the ground motion is well within the linear regime or the non-linear effect is not anticipated to control the SSI response parameters. The BNL study also revealed that the response calculation is sensitive to the modeling assumptions made for the soil/structure interface and the application of a particular material model for the soil.

  5. Triclosan: A review on systematic risk assessment and control from the perspective of substance flow analysis.

    Science.gov (United States)

    Huang, Chu-Long; Abass, Olusegun K; Yu, Chang-Ping

    2016-10-01

    Triclosan (TCS) is a broad spectrum antibacterial agent mainly used in Pharmaceutical and Personal Care Products. Its increasing use over recent decades has raised its concentration in the environment, with commonly detectable levels found along the food web, from aquatic organisms to humans in the ecosystem. To date, there is a shortage of information on how to investigate TCS's systematic risk to exposed organisms including humans, due to the paucity of systematic information on TCS flows in the anthroposphere. Therefore, a more holistic approach to mass flow balancing is required, such that the systematic risk of TCS in all environmental matrices is evaluated. From the perspective of Substance Flow Analysis (SFA), this review critically summarizes the current state of knowledge on TCS production, consumption, discharge, occurrence in built and natural environments, its exposure and metabolism in humans, and also the negative effects of TCS on biota and humans. Recent risk concerns have mainly focused on TCS removal efficiencies and metabolism, but less attention is given to the effect of mass flows from source to fate during risk exposure. Although the data available for TCS SFA are limited, SFA can derive logical systematic information from them for systematic risk assessment and reduction based on mass flow analysis. In other words, the SFA tool can be used to develop a comprehensive flow chart and indicator system for the risk assessment and reduction of TCS flows in the anthroposphere, thereby bridging knowledge gaps to streamline uncertainties related to policy-making on exposure pathways within TCS flow-lines. In the final analysis, specifics on systematic TCS risk assessment via SFA, and areas of improvement on human adaptation to risks posed by emerging contaminants, are identified and directions for future research are suggested. PMID:27239720

  6. An Application of the Functional Resonance Analysis Method (FRAM) to Risk Assessment of Organisational Change

    International Nuclear Information System (INIS)

    The objective of this study was to demonstrate an alternative approach to risk assessment of organisational changes, based on the principles of resilience engineering. The approach in question was the Functional Resonance Analysis Method (FRAM). Whereas established approaches focus on risks coming from failure or malfunctioning of components, alone or in combination, resilience engineering focuses on the common functions and processes that provide the basis for both successes and failures. Resilience engineering more precisely proposes that failures represent the flip side of the adaptations necessary to cope with real-world complexity rather than a failure of normal system functions, and that a safety assessment therefore should focus on how functions are carried out rather than on how they may fail. The objective of this study was not to evaluate the current approach to risk assessment used by the organisation in question. The current approach has nevertheless been used as a frame of reference, but in a non-evaluative manner. The author has demonstrated through the selected case that FRAM can be used as an alternative approach to risk assessment of organisational changes. The report provides the reader with details to consider when making a decision on what analysis approach to use. The choice of which approach to use must reflect priorities and concerns of the organisation, and the author makes no statement about which approach is better. It is clear that the choice of an analysis approach is not so simple to make and there are many things to take into account, such as the larger working environment, organisational culture, regulatory requirements, etc.

  7. An Application of the Functional Resonance Analysis Method (FRAM) to Risk Assessment of Organisational Change

    Energy Technology Data Exchange (ETDEWEB)

    Hollnagel, Erik [MINES ParisTech Crisis and Risk Research Centre (CRC), Sophia Antipolis Cedex (France)

    2012-11-15

    The objective of this study was to demonstrate an alternative approach to risk assessment of organisational changes, based on the principles of resilience engineering. The approach in question was the Functional Resonance Analysis Method (FRAM). Whereas established approaches focus on risks coming from failure or malfunctioning of components, alone or in combination, resilience engineering focuses on the common functions and processes that provide the basis for both successes and failures. Resilience engineering more precisely proposes that failures represent the flip side of the adaptations necessary to cope with real-world complexity rather than a failure of normal system functions, and that a safety assessment therefore should focus on how functions are carried out rather than on how they may fail. The objective of this study was not to evaluate the current approach to risk assessment used by the organisation in question. The current approach has nevertheless been used as a frame of reference, but in a non-evaluative manner. The author has demonstrated through the selected case that FRAM can be used as an alternative approach to risk assessment of organisational changes. The report provides the reader with details to consider when making a decision on what analysis approach to use. The choice of which approach to use must reflect priorities and concerns of the organisation, and the author makes no statement about which approach is better. It is clear that the choice of an analysis approach is not so simple to make and there are many things to take into account, such as the larger working environment, organisational culture, regulatory requirements, etc.

  8. Wavelet transform analysis to assess oscillations in pial artery pulsation at the human cardiac frequency.

    Science.gov (United States)

    Winklewski, P J; Gruszecki, M; Wolf, J; Swierblewska, E; Kunicka, K; Wszedybyl-Winklewska, M; Guminski, W; Zabulewicz, J; Frydrychowski, A F; Bieniaszewski, L; Narkiewicz, K

    2015-05-01

    Pial artery adjustments to changes in blood pressure (BP) may last only seconds in humans. Using a novel method called near-infrared transillumination backscattering sounding (NIR-T/BSS) that allows for the non-invasive measurement of pial artery pulsation (cc-TQ) in humans, we aimed to assess the relationship between spontaneous oscillations in BP and cc-TQ at frequencies between 0.5 Hz and 5 Hz. We hypothesized that analysis of very short data segments would enable the estimation of changes in the cardiac contribution to the BP vs. cc-TQ relationship during very rapid pial artery adjustments to external stimuli. BP and pial artery oscillations during baseline (70 s and 10 s signals) and the response to maximal breath-hold apnea were studied in eighteen healthy subjects. The cc-TQ was measured using NIR-T/BSS; cerebral blood flow velocity, the pulsatility index and the resistive index were measured using Doppler ultrasound of the left internal carotid artery; heart rate and beat-to-beat systolic and diastolic blood pressure were recorded using a Finometer; end-tidal CO2 was measured using a medical gas analyzer. Wavelet transform analysis was used to assess the relationship between BP and cc-TQ oscillations. The recordings lasting 10 s and representing 10 cycles with a frequency of ~1 Hz provided sufficient accuracy with respect to wavelet coherence and wavelet phase coherence values and yielded similar results to those obtained from approximately 70 cycles (70 s). A slight but significant decrease in wavelet coherence between augmented BP and cc-TQ oscillations was observed by the end of apnea. Wavelet transform analysis can be used to assess the relationship between BP and cc-TQ oscillations at cardiac frequency using signal intervals as short as 10 s. Apnea slightly decreases the contribution of cardiac activity to BP and cc-TQ oscillations. PMID:25804326

  9. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    Science.gov (United States)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. 
Economic collapse may ensue if damaged workplaces, disruption of utilities, and
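    The risk formulation above (expected loss as the product of quantified hazard, vulnerability, and exposure) reduces to a one-line computation; the figures below are purely hypothetical.

```python
def expected_loss(hazard_prob, vulnerability, exposure):
    """Risk (expected loss) = hazard probability x vulnerability
    (mean damage ratio) x exposure (value at risk)."""
    return hazard_prob * vulnerability * exposure

# Hypothetical: 50% chance of damaging shaking over the planning horizon,
# 25% mean damage ratio for the building class, $4M replacement value
print(expected_loss(0.5, 0.25, 4_000_000))  # 500000.0
```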

  10. Using computerized text analysis to assess communication within an Italian type 1 diabetes Facebook group

    Directory of Open Access Journals (Sweden)

    Alda Troncone

    2015-11-01

    Full Text Available The purpose of this study was to assess messages posted by mothers of children with type 1 diabetes in the Italian Facebook group “Mamme e diabete” using computerized text analysis. The data suggest that these mothers use online discussion boards as a place to seek and provide information to better manage the disease’s daily demands—especially those tasks linked to insulin correction and administration, control of food intake, and bureaucratic duties, as well as to seek and give encouragement and to share experiences regarding diabetes and related impact on their life. The implications of these findings for the management of diabetes are discussed.

  11. Guidebook in using Cost Benefit Analysis and strategic environmental assessment for environmental planning in China

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Environmental planning in China may benefit from greater use of Cost Benefit Analysis (CBA) and Strategic Environmental Assessment (SEA) methodologies. We provide guidance on using these methodologies. Parts I and II present the principles behind the methodologies as well as their theoretical structure. Part III demonstrates the methodologies in action in a range of good practice examples. The case studies and theoretical expositions are intended to teach by way of example as well as by understanding of the principles, and to help planners use the methodologies as correctly as possible.(auth)

  12. Body electrical loss analysis (BELA) in the assessment of visceral fat: a demonstration

    OpenAIRE

    Blomqvist Kim H; Lundbom Jesper; Lundbom Nina; Sepponen Raimo E

    2011-01-01

    Abstract Background Body electrical loss analysis (BELA) is a new non-invasive way to assess visceral fat depot size through the use of electromagnetism. BELA has worked well in phantom measurements, but the technology is not yet fully validated. Methods Ten volunteers (5 men and 5 women, age: 22-60 y, BMI: 21-30 kg/m2, waist circumference: 73-108 cm) were measured with the BELA instrument and with cross-sectional magnetic resonance imaging (MRI) at the navel level, navel +5 cm and navel -5 c...

  13. Developmental assessment of the multidimensional component in RELAP5 for Savannah River Site thermal hydraulic analysis

    International Nuclear Information System (INIS)

    This report documents ten developmental assessment problems which were used to test the multidimensional component in RELAP5/MOD2.5, Version 3w. The problems chosen were a rigid body rotation problem, a pure radial symmetric flow problem, an r-θ symmetric flow problem, a fall problem, a rest problem, a basic one-dimensional flow test problem, a gravity wave problem, a tank draining problem, a flow through the center problem, and coverage analysis using PIXIE. The multidimensional code calculations are compared to analytical solutions and one-dimensional code calculations. The discussion section of each problem contains information relative to the code's ability to simulate these problems

  14. A sensitivity analysis of a radiological assessment model for Arctic waters

    DEFF Research Database (Denmark)

    Nielsen, S.P.

    1998-01-01

    A model based on compartment analysis has been developed to simulate the dispersion of radionuclides in Arctic waters for an assessment of doses to man. The model predicts concentrations of radionuclides in the marine environment and doses to man from a range of exposure pathways. A parameter sensitivity ... scavenging, water-sediment interaction, biological uptake, ice transport and fish migration. Two independent evaluations of the release of radioactivity from dumped nuclear waste in the Kara Sea have been used as source terms for the dose calculations.
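    A compartment model of this kind reduces to a linear system of first-order transfer equations. The sketch below integrates a two-compartment toy system with forward Euler; all rate constants are illustrative, not taken from the assessment.

```python
import numpy as np

def simulate(C0, K, t_end, dt=0.01):
    """Integrate dC/dt = K @ C with forward Euler, where C holds
    compartment activities and K the inter-compartment transfer rates."""
    C = np.array(C0, dtype=float)
    for _ in range(int(t_end / dt)):
        C = C + dt * (K @ C)
    return C

# Two water compartments exchanging activity, each also losing
# activity to sediment at rate ks (all rates in 1/y, illustrative)
k12, k21, ks = 0.5, 0.2, 0.05
K = np.array([[-(k12 + ks), k21],
              [k12, -(k21 + ks)]])
print(simulate([1.0, 0.0], K, t_end=10.0))
```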

  15. [Assessment of ultraviolet radiation penetration into human skin. I. Theoretical analysis].

    Science.gov (United States)

    Cader, A; Jankowski, J

    1995-01-01

    This is one of two articles under the same title, "Assessment of ultraviolet radiation penetrating into human skin", which present part of broader studies in this area. They aim to identify biophysical aspects of the effects of ultraviolet radiation on human skin. In order to characterise such parameters as UV reflectance from the skin surface and UV absorption and dispersion coefficients, it is necessary to develop appropriate methods. In Part I, "Theoretical analysis", theoretical principles for interpreting measurements of radiation dispersed in different geometrical configurations are presented. They can serve as a basis for estimating the values of UV linear absorption and dispersion coefficients in skin tissues.

  16. Alignment Content Analysis of NAEP 2009 Reading Assessment Analysis Based on Method of Surveys of Enacted Curriculum

    Science.gov (United States)

    Blank, Rolf K.; Smithson, John

    2010-01-01

    Beginning in summer 2009, the complete set of NAEP student assessment items for the grades 4 and 8 Science and Reading 2009 assessments was analyzed for comparison to the National Assessment of Educational Progress (NAEP) Item Specifications, which are based on the NAEP Assessment Frameworks for these subjects (National Assessment Governing Board,…

  17. The ICR142 NGS validation series: a resource for orthogonal assessment of NGS analysis.

    Science.gov (United States)

    Ruark, Elise; Renwick, Anthony; Clarke, Matthew; Snape, Katie; Ramsay, Emma; Elliott, Anna; Hanks, Sandra; Strydom, Ann; Seal, Sheila; Rahman, Nazneen

    2016-01-01

    To provide a useful community resource for orthogonal assessment of NGS analysis software, we present the ICR142 NGS validation series. The dataset includes high-quality exome sequence data from 142 samples together with Sanger sequence data at 730 sites; 409 sites with variants and 321 sites at which variants were called by an NGS analysis tool, but no variant is present in the corresponding Sanger sequence. The dataset includes 286 indel variants and 275 negative indel sites, and thus the ICR142 validation dataset is of particular utility in evaluating indel calling performance. The FASTQ files and Sanger sequence results can be accessed in the European Genome-phenome Archive under the accession number EGAS00001001332.
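    Orthogonal validation of this kind boils down to tallying a caller's verdicts against the Sanger truth set. A minimal sketch (site names and verdicts below are hypothetical, not from ICR142):

```python
def concordance(calls, truth):
    """Compare an NGS caller's per-site verdicts against orthogonal
    Sanger results; both are dicts of site -> True if a variant is present."""
    tp = sum(1 for s, v in calls.items() if v and truth.get(s))
    fp = sum(1 for s, v in calls.items() if v and not truth.get(s))
    fn = sum(1 for s, present in truth.items() if present and not calls.get(s))
    return {"TP": tp, "FP": fp, "FN": fn}

truth = {"site1": True, "site2": False, "site3": True}   # Sanger results
calls = {"site1": True, "site2": True, "site3": False}   # NGS caller output
print(concordance(calls, truth))  # {'TP': 1, 'FP': 1, 'FN': 1}
```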

  18. Assessment of gait symmetry for Talus Valgus children based on experimental kinematic analysis

    Science.gov (United States)

    Toth-Tascau, Mirela; Pasca, Oana; Vigaru, Cosmina; Rusu, Lucian

    2013-10-01

    The general purpose of this study was to assess gait symmetry in Talus Valgus deformity based on experimental kinematic analysis. As this foot condition generally occurs in children, the study focused on two five-year-old children, one healthy, serving as the control subject, and the other with bilateral Talus Valgus deformity. Kinematic experimental analysis was conducted using the Zebris CMS-HS Measuring System. Bilateral symmetry was analyzed using two methods: an index of symmetry (SI) calculated for spatio-temporal parameters (stance phase, swing phase, and step length) and a kinematic parameter (maximum value of the dorsiflexion-plantar flexion angle in the ankle joint), and an unpaired t-test comparing the mean values of the dorsiflexion-plantar flexion angle in the ankle joint for the left and right sides. The study demonstrated good bilateral symmetry in the control subject and quantified the asymmetry in the subject with Talus Valgus deformity.
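    The index of symmetry (SI) is a simple ratio of the left-right difference to the bilateral mean. The sketch below uses one common formulation; the abstract does not specify which variant the authors used, and the step lengths are hypothetical.

```python
def symmetry_index(left, right):
    """Symmetry index (SI) for one gait parameter, in percent;
    0% indicates perfect bilateral symmetry."""
    return 100.0 * abs(left - right) / (0.5 * (left + right))

# Hypothetical left/right step lengths (cm)
print(round(symmetry_index(42.0, 40.0), 2))  # 4.88
```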

  19. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of maps performances is an essential step also in scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics based models is recognized as major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have been already developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assess performances of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveforms modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition

  20. Application of finite element analysis for assessing biocompatibility of intra-arterial catheters and probes.

    Science.gov (United States)

    Bedingham, W; Neavin, T D

    1991-01-01

    A commercial finite element modeling program (FIDAP) was adapted to compute the fluid dynamics of laminar blood flow around an intra-arterial catheter and/or sensor probe. The model provided an accurate transient solution to the Navier-Stokes equations under pulsatile blood flow conditions. To simulate the compliance in the catheter tubing set, a second order convolution integral was incorporated into the boundary conditions. The saline drip rate and catheter compliance could be specified, and the bulk blood flow, blood pressure, and heart rate were varied to simulate specific patient conditions. Analysis of the transient solution was used to assess probable sites for thrombus activation and deposition. The transient velocity and pressure fields identified regions of separated flow and recirculation. The computed shear rates and stresses were used to predict hemolysis, platelet activation, and thrombus formation. Analysis of particle paths provided an estimate of residence times and thrombus deposition sites.

  1. Evolution and Implementation of the NASA Robotic Conjunction Assessment Risk Analysis Concept of Operations

    Science.gov (United States)

    Newman, Lauri K.; Frigm, Ryan C.; Duncan, Matthew G.; Hejduk, Matthew D.

    2014-01-01

    Reacting to potential on-orbit collision risk in an operational environment requires timely and accurate communication and exchange of data, information, and analysis to ensure informed decision-making for safety of flight and responsible use of the shared space environment. To accomplish this mission, it is imperative that all stakeholders manage resources effectively: devoting the necessary, and potentially intensive, resources to responding to high-risk conjunction events while preventing unnecessary expenditure on events of low collision risk. After 10 years of operational experience, the NASA Robotic Conjunction Assessment Risk Analysis (CARA) team is modifying its Concept of Operations (CONOPS) to ensure this alignment of collision risk and resource management. This evolution manifests itself in the approach to characterizing, reporting, and refining collision risk. Implementation of the updated CONOPS is expected to demonstrably improve the use of JSpOC, CARA, and owner/operator resources.

  2. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    Energy Technology Data Exchange (ETDEWEB)

    Benner, W. Henry (Danville, CA); Krauss, Ronald M. (Berkeley, CA); Blanche, Patricia J. (Berkeley, CA)

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  3. Human reliability analysis in Wolsong 2/3/4 nuclear power plants probabilistic safety assessment

    International Nuclear Information System (INIS)

    The Level 1 probabilistic safety assessment (PSA) for the Wolsong (WS) 2/3/4 nuclear power plants (NPPs), performed at the design stage, uses methodologies equivalent to those of PWR PSA. The Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure and the Technique for Human Error Rate Prediction (THERP) are used in the HRA of the WS 2/3/4 NPPs PSA. The purpose of this paper is to introduce the procedure and methodology of the HRA in the WS 2/3/4 NPPs PSA. The paper also describes the interim results of the importance analysis for the human actions modeled in the WS 2/3/4 PSA, and the findings and recommendations concerning administrative control of the secondary control area from a human factors point of view
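    A core piece of the THERP technique mentioned above is its dependence adjustment: the conditional probability of a second human error given the failure of a preceding, dependent action. The five dependence levels and their formulas below are standard THERP; the basic human error probability (HEP) value is a hypothetical illustration, not taken from the Wolsong study.

    ```python
    # THERP dependence model: conditional HEP given failure of the preceding task.
    def conditional_hep(p, level):
        """Conditional human error probability for basic HEP p at a dependence level."""
        formulas = {
            "ZD": lambda p: p,                  # zero dependence
            "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
            "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
            "HD": lambda p: (1 + p) / 2,        # high dependence
            "CD": lambda p: 1.0,                # complete dependence
        }
        return formulas[level](p)

    p = 0.003  # hypothetical basic HEP for a post-diagnosis operator action
    for lvl in ("ZD", "LD", "MD", "HD", "CD"):
        print(lvl, round(conditional_hep(p, lvl), 4))
    ```

    Even a small basic HEP climbs toward 0.5 under high dependence, which is why dependence assignment dominates many HRA importance results.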

  4. Compost maturity assessment using physicochemical, solid-state spectroscopy, and plant bioassay analysis.

    Science.gov (United States)

    Kumar, D Senthil; Kumar, P Satheesh; Rajendran, N M; Anbuganapathi, G

    2013-11-27

    The vermicompost produced from flower waste inoculated with biofertilizers was subjected to compost maturity tests: (i) physicochemical methods (pH, OC, TN, C:N); (ii) solid-state spectroscopic analysis (FTIR and (13)C CPMAS NMR); and (iii) a plant bioassay (germination index). The pH of the vermicompost decreased toward neutral, and the C:N ratio results show the reduction of complex organic materials into simple minerals, indicating greater maturity of the experimental vermicompost product than the control. The increased aliphatic portion incorporated with flower residues may be due to the synthesis of alkyl, O-alkyl, and COO groups by the microbes present in the gut of the earthworm. Plant bioassays are considered the most conventional assessment of compost maturity, and accordingly, the effect of vermicompost maturity on the germination index of Vigna mungo is shown. PMID:24191667
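    The germination index (GI) used in such bioassays is conventionally computed as the product of relative seed germination and relative root elongation. The formula is standard; the Vigna mungo counts below are hypothetical, not the paper's data.

    ```python
    def germination_index(germ_sample, germ_control, root_sample, root_control):
        """GI (%) = relative seed germination (%) x relative root elongation (%) / 100."""
        rsg = 100.0 * germ_sample / germ_control     # relative seed germination
        rre = 100.0 * root_sample / root_control     # relative root elongation
        return rsg * rre / 100.0

    # Hypothetical counts: 18/20 seeds germinated in compost extract vs 20/20 in
    # water; mean root lengths 4.5 cm vs 5.0 cm.
    gi = germination_index(18, 20, 4.5, 5.0)
    print(round(gi, 1))  # → 81.0
    ```

    A GI of 80% or more is a widely used rule of thumb for a phytotoxin-free, mature compost.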

  5. Hydrodynamic analysis, performance assessment, and actuator design of a flexible tail propulsor in an artificial alligator

    International Nuclear Information System (INIS)

    The overall objective of this research is to develop analysis tools for determining actuator requirements and assessing viable actuator technology for design of a flexible tail propulsor in an artificial alligator. A simple hydrodynamic model that includes both reactive and resistive forces along the tail is proposed and the calculated mean thrust agrees well with conventional estimates of drag. Using the hydrodynamic model forces as an input, studies are performed for an alligator ranging in size from 1 cm to 2 m at swimming speeds of 0.3–1.8 body lengths per second containing five antagonistic pairs of actuators distributed along the length of the tail. Several smart materials are considered for the actuation system, and preliminary analysis results indicate that the acrylic electroactive polymer and the flexible matrix composite actuators are potential artificial muscle technologies for the system
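    The reactive (added-mass) part of such a simple tail-propulsion model can be sketched in the spirit of Lighthill's elongated-body theory, evaluated at the tail tip for a travelling-wave lateral motion h(x,t) = A sin(kx − ωt). This is a hedged illustration: the formula is standard elongated-body theory, and all dimensions below are assumed, not taken from the paper.

    ```python
    import numpy as np

    rho = 1000.0          # water density, kg/m^3
    d = 0.05              # tail depth at the tip, m (assumed)
    A = 0.02              # tail-beat amplitude, m (assumed)
    f = 2.0               # tail-beat frequency, Hz (assumed)
    wavelength = 0.5      # body-wave wavelength, m (assumed)
    U = 0.3               # swimming speed, m/s (assumed)

    w = 2 * np.pi * f
    k = 2 * np.pi / wavelength
    V = w / k             # body-wave phase speed

    m_a = rho * np.pi * d**2 / 4          # added mass per unit length at the tip
    # Time-averaged reactive thrust at the tip: T = (m_a/2)(<h_t^2> - U^2 <h_x^2>),
    # with <cos^2> = 1/2 for the travelling wave, giving:
    T_mean = 0.25 * m_a * A**2 * k**2 * (V**2 - U**2)
    print(T_mean)
    ```

    The sign of (V² − U²) makes the familiar point that thrust requires the body wave to travel backward faster than the animal swims forward.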

  6. Assessment of computational issues associated with analysis of high-lift systems

    Science.gov (United States)

    Balasubramanian, R.; Jones, Kenneth M.; Waggoner, Edgar G.

    1992-01-01

    Thin-layer Navier-Stokes calculations for wing-fuselage configurations from subsonic to hypersonic flow regimes are now possible. However, efficient, accurate solutions for using these codes for two- and three-dimensional high-lift systems have yet to be realized. A brief overview of salient experimental and computational research is presented. An assessment of the state of the art relative to high-lift system analysis and an identification of issues related to grid generation and flow physics which are crucial for computational success in this area are also provided. Research in support of the high-lift elements of NASA's High Speed Research and Advanced Subsonic Transport Programs which addresses some of the computational issues is presented. Finally, fruitful areas of concentrated research are identified to accelerate overall progress for high-lift system analysis and design.

  7. The tsunami probabilistic risk assessment of nuclear power plant (3). Outline of tsunami fragility analysis

    International Nuclear Information System (INIS)

    The Tsunami Probabilistic Risk Assessment (PRA) standard was issued in February 2012 by the Standard Committee of the Atomic Energy Society of Japan (AESJ). This article details the tsunami fragility analysis, which calculates the damage probability of buildings and structures contributing to core damage and consists of five evaluation steps: (1) selection of the evaluated elements and damage modes, (2) selection of the evaluation procedure, (3) evaluation of actual stiffness, (4) evaluation of actual response, and (5) evaluation of fragility (damage probability and others). As an application example of the standard, calculation results of a tsunami fragility analysis investigation by the tsunami PRA subcommittee of AESJ were shown, reflecting the latest knowledge of damage states caused by wave forces and other tsunami actions from the 2011 off the Pacific Coast of Tohoku Earthquake. (T. Tanaka)
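    Step (5), the fragility evaluation itself, is commonly expressed as a lognormal distribution of capacity: the damage probability at demand a is P_f(a) = Φ(ln(a/A_m)/β), with median capacity A_m and composite log-standard deviation β. The functional form is standard in seismic and tsunami PRA; the median and β values below are hypothetical.

    ```python
    import math

    def fragility(a, median=6.0, beta=0.4):
        """Lognormal damage probability at tsunami demand a (same units as median)."""
        z = math.log(a / median) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF

    for h in (3.0, 6.0, 9.0):
        print(f"{h:.0f} m inundation -> P_f = {fragility(h):.3f}")
    ```

    By construction the curve passes through 0.5 at the median capacity, here an assumed 6 m inundation height.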

  8. On sustainability assessment of technical systems. Experience from systems analysis with the ORWARE and Ecoeffect tools

    Energy Technology Data Exchange (ETDEWEB)

    Assefa, Getachew [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Chemical Engineering

    2006-06-15

    Engineering research and development work is undergoing a reorientation, albeit at a slow pace, from focusing on specific parts of different systems to a broader systems-level perspective. This reorientation should be further developed and enhanced with the aim of organizing and structuring our technical systems to meet sustainability requirements in the face of global ecological threats that have far-reaching social and economic implications, and that can no longer be captured using a conventional research approach. Until a list of universally acceptable, clear, and measurable indicators of sustainable development is developed, work with sustainability metrics should continue to evolve as a relative measure of the ecological, economic, and social performance of human activities in general, and technical systems in particular. This work can be done by comparing the relative performance of alternative technologies providing the same well-defined function or service, or by characterizing technologies that enjoy different levels of societal priority using relevant performance indicators. In both cases, concepts and methods of industrial ecology play a vital role. This thesis is about the development and application of a systematic approach for the assessment of the performance of technical systems from the perspectives of systems analysis, sustainability, sustainability assessment, and industrial ecology. The systematic approach developed and characterized in this thesis advocates simultaneous assessment of the ecological, economic, and social dimensions of the performance of technologies, to avoid sub-optimization and problem shifting between dimensions. It gives a holistic picture by taking a life cycle perspective of all important aspects. The systematic assessment of technical systems provides an even-handed assessment resulting in cumulative knowledge. A modular structure of the approach makes it flexible enough in terms of comparing a number of

  9. Left ventricular synchrony assessed by phase analysis of gated myocardial perfusion SPECT imaging in healthy subjects

    International Nuclear Information System (INIS)

    Objective: To investigate the value of Cedars-Sinai quantitative gated SPECT (QGS) phase analysis for left ventricular synchrony assessment in healthy subjects. Methods: Seventy-four healthy subjects (41 males, 33 females; average age (60±13) years) underwent both rest and exercise 99Tcm-MIBI G-MPI. QGS software was used to analyze the reconstructed rest gated SPECT images automatically, and the left ventricular synchrony parameters phase bandwidth (BW) and phase standard deviation (SD) were obtained. The influences of gender and age (age<60 years, n=36; age≥60 years, n=38) on left ventricular systolic synchrony were analyzed. The phase angle of original segmental contraction was measured to determine the onset of ventricular contraction using the 17-segment model. Forty healthy subjects were selected by simple random sampling to evaluate the intra-observer and inter-observer repeatability of the QGS phase analysis software. Two-sample t test and linear correlation analysis were used to analyze the data. Results: The left ventricular BW and SD in healthy subjects were (37.22±11.71)° and (11.84±5.39)°, respectively. Comparisons between males and females for BW and SD yielded no statistical significance (BW: (36.00±9.70)°, (38.73±13.84)°; SD: (11.88±5.56)°, (11.79±5.26)°; t=0.96 and -0.07, both P>0.05), whereas the older subjects (age≥60 years) had larger BW than the younger ones (age<60 years; (39.95±12.65)°, (34.33±10.00)°; t=-2.11, P<0.05), and no statistical significance was shown for SD between the two age groups ((11.18±4.31)°, (12.54±6.33)°; t=1.08, P>0.05). Of the 74 subjects, mechanical activation started from the ventricular base to the apex in 54 subjects (73%), and from apex to base in only 20 subjects (27%). High repeatability of phase analysis was observed for both intra-observer and inter-observer measurements (r=0.867-0.906, all P<0.001). Conclusions: Good left ventricular segmental synchrony is shown in healthy
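    The reported gender comparison can be re-derived from the published summary statistics alone, assuming a pooled-variance two-sample t test (the abstract does not state which variant was used). Using the BW means, SDs, and group sizes given above:

    ```python
    import math

    def pooled_t(mean1, sd1, n1, mean2, sd2, n2):
        """Two-sample t statistic with pooled variance, from summary statistics."""
        sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
        se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
        return (mean1 - mean2) / se

    # Phase bandwidth (BW): males (n=41) vs females (n=33), values from the abstract.
    t = pooled_t(36.00, 9.70, 41, 38.73, 13.84, 33)
    print(round(t, 2))
    ```

    The magnitude comes out close to the reported t=0.96 (the sign simply reflects the group ordering), and with 72 degrees of freedom the difference is far from significant, consistent with P>0.05.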

  10. Climate change impact and adaptation research requires integrated assessment and farming systems analysis: a case study in the Netherlands

    NARCIS (Netherlands)

    Reidsma, P.; Wolf, J.; Kanellopoulos, A.; Schaap, B.F.; Mandryk, M.; Verhagen, J.; Ittersum, van M.K.

    2015-01-01

    Rather than on crop modelling only, climate change impact assessments in agriculture need to be based on integrated assessment and farming systems analysis, and account for adaptation at different levels. With a case study for Flevoland, the Netherlands, we illustrate that (1) crop models cannot acc

  11. The Assessment of a Tutoring Program to Meet CAS Standards Using a SWOT Analysis and Action Plan

    Science.gov (United States)

    Fullmer, Patricia

    2009-01-01

    This article summarizes the use of SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and subsequent action planning as a tool of self-assessment to meet CAS (Council for the Advancement of Standards in Higher Education) requirements for systematic assessment. The use of the evaluation results to devise improvements to increase the…

  12. Review and analysis of parameters for assessing transport of environmentally released radionuclides through agriculture

    International Nuclear Information System (INIS)

    Most of the default parameters incorporated into the TERRA computer code are documented, including a literature review and systematic analysis of the element-specific transfer parameters B/sub v/, B/sub r/, F/sub m/, F/sub f/, and K/sub d/. This review and analysis suggests default values which are consistent with the modeling approaches taken in TERRA and may be acceptable for most assessment applications of the computer code. However, particular applications of the code and additional analysis of elemental transport may require alternative default values. Use of the values reported herein in other computer codes simulating terrestrial transport is not advised without careful interpretation of the limitations and scope of these analyses. An approach to the determination of vegetation-specific interception fractions is also discussed. The limitations of this approach are many, and its use indicates the need for further analysis of deposition, interception, and weathering processes. Judgement must be exercised in interpretation of the plant surface concentrations generated. Finally, the location-specific agricultural, climatological, and population parameters in the default SITE data base are documented. These parameters are intended as alternatives to the average values currently used. Indeed, areas in the United States where intensive crop, milk, or beef production occurs will be reflected in the parameter values, as will areas where little agricultural activity occurs. However, the original information sources contained some small errors, and the interpolation and conversion methods used will add more. Parameters used in TERRA not discussed herein are discussed in the companion report to this one, ORNL-5785. In the companion report the models employed in TERRA and the coding of TERRA are discussed. Together these reports provide documentation of the TERRA code and its use in assessments. 96 references, 78 figures, 21 tables
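    Transfer parameters of this kind are chained multiplicatively along an ingestion pathway: a soil-to-plant concentration ratio (B_v) gives the vegetation concentration, and a feed-to-milk transfer coefficient (F_m, in d/L) converts a cow's daily activity intake into a milk concentration. The structure is standard; every numeric value below is a hypothetical placeholder, not a TERRA default.

    ```python
    def plant_conc(c_soil, b_v):
        """Dry-mass vegetation concentration (Bq/kg) from soil concentration."""
        return b_v * c_soil

    def milk_conc(daily_intake_bq, f_m):
        """Milk concentration (Bq/L) from daily feed intake (Bq/d) via F_m (d/L)."""
        return f_m * daily_intake_bq

    c_soil = 100.0     # Bq/kg soil (hypothetical)
    b_v = 0.03         # soil-to-plant concentration ratio (hypothetical)
    feed_rate = 16.0   # kg dry feed per day eaten by a dairy cow (hypothetical)
    f_m = 5e-3         # feed-to-milk transfer coefficient, d/L (hypothetical)

    c_plant = plant_conc(c_soil, b_v)             # vegetation concentration
    c_milk = milk_conc(c_plant * feed_rate, f_m)  # milk concentration
    print(c_plant, c_milk)
    ```

    Because the pathway is a product of factors, an error in any single default value propagates linearly into the dose estimate, which is why the review's caution about element-specific defaults matters.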

  13. Review and analysis of parameters for assessing transport of environmentally released radionuclides through agriculture

    Energy Technology Data Exchange (ETDEWEB)

    Baes, C.F. III; Sharp, R.D.; Sjoreen, A.L.; Shor, R.W.

    1984-09-01

    Most of the default parameters incorporated into the TERRA computer code are documented, including a literature review and systematic analysis of the element-specific transfer parameters B/sub v/, B/sub r/, F/sub m/, F/sub f/, and K/sub d/. This review and analysis suggests default values which are consistent with the modeling approaches taken in TERRA and may be acceptable for most assessment applications of the computer code. However, particular applications of the code and additional analysis of elemental transport may require alternative default values. Use of the values reported herein in other computer codes simulating terrestrial transport is not advised without careful interpretation of the limitations and scope of these analyses. An approach to the determination of vegetation-specific interception fractions is also discussed. The limitations of this approach are many, and its use indicates the need for further analysis of deposition, interception, and weathering processes. Judgement must be exercised in interpretation of the plant surface concentrations generated. Finally, the location-specific agricultural, climatological, and population parameters in the default SITE data base are documented. These parameters are intended as alternatives to the average values currently used. Indeed, areas in the United States where intensive crop, milk, or beef production occurs will be reflected in the parameter values, as will areas where little agricultural activity occurs. However, the original information sources contained some small errors, and the interpolation and conversion methods used will add more. Parameters used in TERRA not discussed herein are discussed in the companion report to this one, ORNL-5785. In the companion report the models employed in TERRA and the coding of TERRA are discussed. Together these reports provide documentation of the TERRA code and its use in assessments. 96 references, 78 figures, 21 tables.

  14. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    Energy Technology Data Exchange (ETDEWEB)

    Caschili, Simone, E-mail: s.caschili@ucl.ac.uk [UCL QASER Lab, University College London, Gower Street, London WC1E 6BT (United Kingdom); De Montis, Andrea; Ganciu, Amedeo; Ledda, Antonio; Barra, Mario [Dipartimento di Agraria, University of Sassari, viale Italia, 39, 07100 Sassari (Italy)

    2014-07-01

    Academic literature has been growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence the need for quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at the macroscopic (network architecture), mesoscopic (subgraph) and microscopic (node) levels in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that SEA is a multidisciplinary subject; the SEABN belongs to the class of real small-world networks, with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors, while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals.
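    The SEABN construction rule, linking two publications when one cites the other, yields a directed graph whose in-degree at the node (microscopic) level identifies the most-cited documents. A minimal sketch with an invented five-paper citation list:

    ```python
    # paper -> list of papers it cites (hypothetical IDs, not SEABN data)
    citations = {
        "A": ["C", "D"],
        "B": ["C"],
        "C": ["E"],
        "D": ["C", "E"],
        "E": [],
    }

    # In-degree = number of incoming citation links = times cited within the corpus.
    in_degree = {p: 0 for p in citations}
    for source, cited in citations.items():
        for target in cited:
            in_degree[target] += 1

    most_cited = max(in_degree, key=in_degree.get)
    print(most_cited, in_degree[most_cited])   # → C 3
    ```

    The macroscopic and mesoscopic analyses in the article operate on the same graph, looking at degree distributions and subgraph structure rather than single nodes.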

  15. Integration of Gis-analysis and Atmospheric Modelling For Nuclear Risk and Vulnerability Assessment

    Science.gov (United States)

    Rigina, O.; Baklanov, A.; Mahura, A.

    The paper is devoted to the problems of residential radiation risk and territorial vulnerability with respect to nuclear sites in Europe. The study suggests two approaches, based on an integration of GIS analysis and atmospheric modelling, to calculate radiation risk/vulnerability. First, modelling simulations were done for a number of case studies, based on real data such as reactor core inventory and estimations from known accidents, for a number of typical meteorological conditions and different accident scenarios. Using these simulations and a population database as input data, the GIS analysis then reveals the administrative units at highest risk with respect to the mean individual and collective doses received by the population. Second, two alternative methods were suggested to assess the probabilistic risk to the population in case of a severe accident at the Kola and Leningrad NPPs (as examples) based on socio-geophysical factors: proximity to the accident site, population density and presence of critical groups, and the probabilities of wind trajectories and precipitation. The latter two probabilities were calculated by atmospheric trajectory models and statistical methods over many years. The GIS analysis was done for the Nordic countries as an example. GIS-based spatial analyses integrated with mathematical modelling allow the development of a common methodological approach for complex assessment of regional vulnerability and residential radiation risk, by merging together the separate aspects: modelling of consequences, probabilistic analysis of atmospheric flows, dose estimation, etc. The approach was capable of creating risk/vulnerability maps of the Nordic countries and of revealing the most vulnerable provinces with respect to radiation risk sites.
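    The combination described, atmospheric-transport probabilities weighting GIS population and dose data per administrative unit, can be sketched as a simple probability-weighted collective-dose index. This is an illustration of the bookkeeping only; the unit names, probabilities, populations, and doses below are invented.

    ```python
    # (name, P(trajectory reaches unit), P(precipitation), population, dose per event in Sv)
    units = [
        ("Province 1", 0.12, 0.30, 250_000, 2e-4),
        ("Province 2", 0.05, 0.45, 900_000, 1e-4),
        ("Province 3", 0.20, 0.10, 120_000, 3e-4),
    ]

    def collective_risk(p_traj, p_precip, pop, dose):
        """Probability-weighted collective dose (person-Sv) as a relative risk index."""
        return p_traj * p_precip * pop * dose

    ranked = sorted(units, key=lambda u: collective_risk(*u[1:]), reverse=True)
    for name, *rest in ranked:
        print(name, f"{collective_risk(*rest):.3f}")
    ```

    Mapping this index back onto the administrative polygons is what produces the risk/vulnerability maps described in the abstract.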

  16. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    International Nuclear Information System (INIS)

    Academic literature has been continuously growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence, the need to dispose of quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at macroscopic (network architecture), mesoscopic (sub graph) and microscopic levels (node) in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that the SEA is a multidisciplinary subject; the SEABN belongs to the class of real small world networks with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals

  17. Structural Performance Assessment Based on Statistical and Wavelet Analysis of Acceleration Measurements of a Building during an Earthquake

    OpenAIRE

    Mosbeh R. Kaloop; Jong Wan Hu; Mohamed A. Sayed; Jiyoung Seong

    2016-01-01

    This study introduces the analysis of a structural health monitoring (SHM) system based on acceleration measurements during an earthquake. The SHM system is applied to assess the performance of the administration building of Seoul National University of Education, South Korea. Statistical and wavelet analysis methods are applied to investigate and assess the performance of the building during an earthquake shaking that took place on March 31, 2014. The results indicate that (...

  18. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia

    Science.gov (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.

    2016-06-01

    To support the implementation of the IWRM-Indonesia process in a water scarce and sanitation poor region of Central Java (Indonesia), sustainability assessments of several technology options of water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz association. In the case of water supply, the assessment was based on the life-cycle analysis and life-cycle-costing approach. In the sanitation sector, the focus was set on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied in general to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences which influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.
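    The life-cycle-costing side of such an assessment reduces to comparing options by net present cost over a common service life. A minimal sketch with two hypothetical water-supply options; all costs, lifetimes, and the discount rate are invented placeholders, not the study's figures.

    ```python
    def life_cycle_cost(capex, annual_opex, years, rate):
        """Net present cost: capital plus discounted operation and maintenance."""
        return capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))

    # Two hypothetical options over a 20-year horizon at a 5% discount rate.
    piped_network = life_cycle_cost(capex=900_000, annual_opex=35_000, years=20, rate=0.05)
    rain_harvest = life_cycle_cost(capex=400_000, annual_opex=60_000, years=20, rate=0.05)
    print(round(piped_network), round(rain_harvest))
    ```

    The comparison illustrates the usual trade-off such tools expose: a capital-heavy option can lose to an operations-heavy one, or vice versa, depending on the discount rate and horizon chosen.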

  19. Parkinson's disease assessment based on gait analysis using an innovative RGB-D camera system.

    Science.gov (United States)

    Rocha, Ana Patrícia; Choupina, Hugo; Fernandes, José Maria; Rosas, Maria José; Vaz, Rui; Silva Cunha, João Paulo

    2014-01-01

    Movement-related diseases, such as Parkinson's disease (PD), progressively affect motor function, often leading to severe motor impairment and a dramatic loss of the patients' quality of life. Human motion analysis techniques can be very useful to support the clinical assessment of this type of disease. In this contribution, we present an RGB-D camera (Microsoft Kinect) system and its evaluation for PD assessment. Based on skeleton data extracted from the gait of three PD patients treated with deep brain stimulation and three control subjects, several gait parameters were computed and analyzed, with the aim of discriminating between non-PD and PD subjects, as well as between two PD states (stimulator ON and OFF). We verified that among the several quantitative gait parameters, the variance of the center shoulder velocity presented the highest discriminative power to distinguish between non-PD, PD ON and PD OFF states (p = 0.004). Furthermore, we have shown that our low-cost portable system can be easily mounted in any hospital environment for evaluating patients' gait. These results demonstrate the potential of using an RGB-D camera as a PD assessment tool. PMID:25570653
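    The discriminative parameter reported above, the variance of the center-shoulder velocity, can be computed from a Kinect skeleton track by finite-differencing the joint positions. A sketch on a synthetic walking trajectory (the frame rate and trajectory are assumptions, not patient data):

    ```python
    import numpy as np

    def velocity_variance(positions, fps=30.0):
        """Variance of a joint's speed given its (N, 3) position track in metres."""
        v = np.diff(positions, axis=0) * fps          # frame-to-frame velocity, m/s
        speed = np.linalg.norm(v, axis=1)
        return float(np.var(speed))

    # Synthetic centre-shoulder track: forward progression + sway + bob + noise.
    rng = np.random.default_rng(0)
    t = np.arange(0, 5, 1 / 30.0)
    walk = np.column_stack([0.8 * t,                        # forward progression
                            0.02 * np.sin(2 * np.pi * t),   # lateral sway
                            1.4 + 0.01 * np.sin(4 * np.pi * t)])  # vertical bob
    walk += rng.normal(scale=1e-3, size=walk.shape)         # sensor noise
    var_speed = velocity_variance(walk)
    print(var_speed)
    ```

    In the study's setting, a higher variance of this speed signal corresponds to less smooth, more irregular gait.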

  20. Using Habitat Equivalency Analysis to Assess the Cost Effectiveness of Restoration Outcomes in Four Institutional Contexts

    Science.gov (United States)

    Scemama, Pierre; Levrel, Harold

    2016-01-01

    At the national level, with a fixed amount of resources available for public investment in the restoration of biodiversity, it is difficult to prioritize alternative restoration projects. One way to do this is to assess the level of ecosystem services delivered by these projects and to compare them with their costs. The challenge is to derive a common unit of measurement for ecosystem services in order to compare projects which are carried out in different institutional contexts having different goals (application of environmental laws, management of natural reserves, etc.). This paper assesses the use of habitat equivalency analysis (HEA) as a tool to evaluate ecosystem services provided by restoration projects developed in different institutional contexts. This tool was initially developed to quantify the level of ecosystem services required to compensate for non-market impacts coming from accidental pollution in the US. In this paper, HEA is used to assess the cost effectiveness of several restoration projects in relation to different environmental policies, using case studies based in France. Four case studies were used: the creation of a market for wetlands, public acceptance of a port development project, the rehabilitation of marshes to mitigate nitrate loading to the sea, and the restoration of streams in a protected area. Our main conclusion is that HEA can provide a simple tool to clarify the objectives of restoration projects, to compare the cost and effectiveness of these projects, and to carry out trade-offs, without requiring significant amounts of human or technical resources.
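    The HEA bookkeeping behind such comparisons expresses a project's ecosystem-service gain in discounted service-acre-years (DSAYs), which can then be divided into project cost to give a comparable cost-effectiveness ratio. The structure is standard HEA; the service trajectory, area, cost, and discount rate below are invented.

    ```python
    def dsays(area_acres, service_gain_by_year, rate=0.03):
        """Discounted service-acre-years delivered by a restoration project."""
        return sum(area_acres * s / (1 + rate) ** t
                   for t, s in enumerate(service_gain_by_year, start=1))

    # 10 acres ramping from 10% to 60% service uplift over five years, then constant.
    trajectory = [0.1, 0.2, 0.4, 0.5, 0.6] + [0.6] * 15
    benefit = dsays(10, trajectory)
    cost_per_dsay = 250_000 / benefit     # hypothetical project cost in EUR
    print(round(benefit, 1), round(cost_per_dsay))
    ```

    Because all four case studies can be expressed in the same DSAY unit, their cost-per-DSAY ratios become directly comparable across institutional contexts, which is the paper's central point.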

  1. Efficiency assessment of coal mine safety input by data envelopment analysis

    Institute of Scientific and Technical Information of China (English)

    TONG Lei; DING Ri-jia

    2008-01-01

    In recent years improper allocation of safety input has prevailed in coal mines in China, which has resulted in frequent accidents in coal mining operations. A comprehensive assessment of the input efficiency of coal mine safety should lead to improved efficiency in the use of funds and management resources. This helps government and enterprise managers better understand how safety inputs are used and optimize the allocation of resources. A study of the efficiency assessment of coal mine safety input was conducted in this paper. A C2R model with a non-Archimedean infinitesimal vector based on output is established after consideration of the input characteristics and the model properties. An assessment of an operating mine was done using a specific set of input and output criteria. It is found that the safety input was efficient in 2002 and 2005 and weakly efficient in 2003. However, the efficiency was relatively low in both 2001 and 2004. The safety input resources can be optimized and adjusted by means of projection theory. Such analysis shows that, on average in 2001 and 2004, 45% of the expended funds could have been saved. Likewise, 10% of the safety management and technical staff could have been eliminated, and working hours devoted to safety could have been reduced by 12%, while achieving the same results.
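    The C2R (CCR) efficiency score underlying such a DEA study is the optimum of a small linear program per decision-making unit (DMU). A hedged sketch of the basic input-oriented model, without the paper's non-Archimedean infinitesimal extension; the three mine-year input/output columns are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 4.0, 3.0]])   # inputs per DMU, e.g. safety funds (invented)
    Y = np.array([[2.0, 2.0, 3.0]])   # outputs per DMU, e.g. safe-production measure

    def ccr_efficiency(X, Y, j):
        """theta for DMU j: min theta s.t. X@lam <= theta*x_j, Y@lam >= y_j, lam >= 0."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.zeros(1 + n); c[0] = 1.0                  # variables [theta, lam_1..lam_n]
        A_in = np.hstack([-X[:, [j]], X])                # X@lam - theta*x_j <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y])        # -Y@lam <= -y_j
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[:, j]]),
                      bounds=[(0, None)] * (1 + n))
        return res.fun

    for j in range(3):
        print(f"DMU {j}: theta = {ccr_efficiency(X, Y, j):.3f}")
    ```

    A score of 1 marks an efficient DMU; a score below 1 gives the proportional input reduction that projection onto the efficient frontier would allow, which is exactly the "45% of funds could have been saved" style of result above.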

  2. Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.

    Science.gov (United States)

    Bastani, Mehrad; Celik, Nurcin

    2015-10-01

    Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. We first review the standard industrial codes related to the major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of major risks is conducted on the collected data using Bayesian data analysis and predictive methods. The risks estimated in this study for the period 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results show that the musculoskeletal and dermal injury rates among refuse collectors have decreased from 88 and 15 to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed for recycling workers, whose musculoskeletal and dermal injury rates have increased from 13 and 4 to 14 and 6 injuries per 1000 workers, respectively. Lastly, a linear regression model is proposed to identify the major contributors to the high numbers of musculoskeletal and dermal injuries. PMID:26219294
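    A common Bayesian treatment of such injury rates (the abstract does not specify the study's exact model) is a conjugate Gamma prior on the Poisson rate per 1000 worker-years, updated with observed counts. The conjugate update is standard; the prior parameters and new counts below are invented, not the study's data.

    ```python
    def gamma_poisson_posterior(a0, b0, injuries, exposure_ky):
        """Gamma(a0, b0) prior on a Poisson rate, exposure in 1000 worker-years.

        Returns the posterior (a, b) and the posterior mean rate per 1000 worker-years.
        """
        a, b = a0 + injuries, b0 + exposure_ky
        return a, b, a / b

    # Prior roughly centred on the historical 13 injuries per 1000 recycling workers.
    a0, b0 = 13.0, 1.0
    # Hypothetical new data: 70 musculoskeletal injuries over 5000 worker-years.
    a, b, mean_rate = gamma_poisson_posterior(a0, b0, injuries=70, exposure_ky=5.0)
    print(round(mean_rate, 1))   # → 13.8
    ```

    The posterior mean shrinks the raw new-data rate (70/5 = 14) toward the historical prior, which is precisely the behaviour that makes Bayesian estimates stable for the small counts typical of occupational-injury data.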

  3. Literature review and analysis of the application of health outcome assessment instruments in Chinese medicine

    Institute of Scientific and Technical Information of China (English)

    Feng-bin Liu; Zheng-kun Hou; Yun-ying Yang; Pei-wu Li; Qian-wen Li; Nelson Xie; Jing-wei Li

    2013-01-01

    OBJECTIVE: To evaluate the application of health assessment instruments in Chinese medicine. METHODS: According to a pre-defined search strategy, a comprehensive literature search for all articles published in China National Knowledge Infrastructure databases was conducted. The resulting articles that met the defined inclusion and exclusion criteria were used for analysis. RESULTS: A total of 97 instruments for health outcome assessment in Chinese medicine have been used in fundamental and theoretical research, and 14 of these were also used in 29 clinical trials that were randomized controlled trials, or descriptive or cross-sectional studies. In 2 152 Chinese medicine-based studies that used instruments in their methodology, more than 150 questionnaires were identified. Among the identified questionnaires, 51 were used in more than 10 articles (0.5%). Most of these instruments were developed in Western countries, and few studies (4%) used the instrument as the primary evidence for their conclusions. CONCLUSION: Usage of instruments for health outcome assessment in Chinese medicine is increasing rapidly; however, current limitations include selection rationale, result interpretation and standardization, which must be addressed accordingly.

  4. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia

    Science.gov (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.

    2016-09-01

    To support the implementation of the IWRM-Indonesia process in a water-scarce and sanitation-poor region of Central Java (Indonesia), sustainability assessments of several technology options for water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz Association. In the case of water supply, the assessment was based on a life-cycle analysis and life-cycle-costing approach. In the sanitation sector, the focus was on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied generally to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences, which influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.

  5. Parkinson's disease assessment based on gait analysis using an innovative RGB-D camera system.

    Science.gov (United States)

    Rocha, Ana Patrícia; Choupina, Hugo; Fernandes, José Maria; Rosas, Maria José; Vaz, Rui; Silva Cunha, João Paulo

    2014-01-01

    Movement-related diseases, such as Parkinson's disease (PD), progressively affect motor function, often leading to severe motor impairment and a dramatic loss of the patients' quality of life. Human motion analysis techniques can be very useful in supporting the clinical assessment of this type of disease. In this contribution, we present an RGB-D camera (Microsoft Kinect) system and its evaluation for PD assessment. Based on skeleton data extracted from the gait of three PD patients treated with deep brain stimulation and three control subjects, several gait parameters were computed and analyzed with the aim of discriminating between non-PD and PD subjects, as well as between two PD states (stimulator ON and OFF). We verified that, among the several quantitative gait parameters, the variance of the center-shoulder velocity presented the highest discriminative power to distinguish between non-PD, PD ON and PD OFF states (p = 0.004). Furthermore, we have shown that our low-cost portable system can be easily mounted in any hospital environment for evaluating patients' gait. These results demonstrate the potential of using an RGB-D camera as a PD assessment tool.
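The discriminative parameter named here, the variance of a joint's velocity, can be derived directly from skeleton positions sampled at a fixed frame rate. The sketch below assumes 3-D joint coordinates such as Kinect's center-shoulder joint; the positions and frame rate are invented for illustration and this is not the authors' processing pipeline.

```python
# Deriving a gait parameter (variance of one joint's speed) from
# per-frame 3-D joint positions. Illustrative data only.
import math

def speeds(positions, fps):
    """Frame-to-frame speed of one joint from (x, y, z) positions."""
    out = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        d = math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2)
        out.append(d * fps)  # distance per frame times frames per second
    return out

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

# Three toy frames of a "center shoulder" joint at 30 fps:
pos = [(0.0, 1.4, 2.0), (0.01, 1.4, 2.0), (0.03, 1.4, 2.0)]
var_speed = variance(speeds(pos, fps=30))
```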

  6. Statistical analysis of data from limiting dilution cloning to assess monoclonality in generating manufacturing cell lines.

    Science.gov (United States)

    Quiroz, Jorge; Tsao, Yung-Shyeng

    2016-07-01

    Assurance of monoclonality of recombinant cell lines is a critical issue in gaining regulatory approval in a biological license application (BLA). Among the requirements of regulatory agencies are proper documentation and appropriate statistical analysis to demonstrate monoclonality. In some cases, one round of cloning may be sufficient to demonstrate monoclonality. In this article, we propose the use of confidence intervals for assessing monoclonality for limiting dilution cloning in the generation of recombinant manufacturing cell lines based on a single round. The use of confidence intervals instead of point estimates allows practitioners to account for the uncertainty present in the data when assessing whether an estimated level of monoclonality is consistent with regulatory requirements. In other cases, one round may not be sufficient and two consecutive rounds are required to assess monoclonality. When two consecutive subclonings are required, we improve on the present methodology by reducing the infinite series proposed by Coller and Coller (Hybridoma 1983;2:91-96) to a simpler series. The proposed simpler series provides more accurate and reliable results, reduces the level of computation, and can be easily implemented in any spreadsheet program such as Microsoft Excel. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1061-1068, 2016.
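The textbook Poisson reasoning behind limiting-dilution cloning can be sketched in a few lines: from the fraction of wells showing growth one back-calculates the seeding density, and from that the probability that a growing well arose from exactly one cell. This is the standard point-estimate calculation only, not the authors' confidence-interval method or their simplified series.

```python
# Poisson model for limiting-dilution cloning (point estimates only).
import math

def seeding_density(fraction_growing):
    """lambda = -ln(P(no growth)) under Poisson seeding."""
    return -math.log(1.0 - fraction_growing)

def p_monoclonal(fraction_growing):
    """P(exactly one cell | at least one cell) at that seeding density."""
    lam = seeding_density(fraction_growing)
    return lam * math.exp(-lam) / (1.0 - math.exp(-lam))

# If 30% of wells show growth, most growing wells are single-cell derived:
p = p_monoclonal(0.3)
```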

  7. CYCLIC RECURRENCE ASSESSMENT OF GRAIN YIELD TIME SERIES USING PHASE ANALYSIS INSTRUMENTS

    Directory of Open Access Journals (Sweden)

    Temirov A. A.

    2016-01-01

    Full Text Available An algorithm of phase analysis, an instrument from the methods of nonlinear dynamics used to study the cyclic recurrence of time series, is considered in this article. The existing classical econometric methods for estimating cyclic recurrence were developed for random systems whose dynamics match the normal distribution. However, there also exist non-random systems characterized by trends and by periodic and non-periodic cycles called quasicycles. The process of identifying quasicycles is illustrated on the time series of all grain yields in Russia over the last 119 years. The phase portrait of this time series is plotted in two-dimensional space. As a result, the phase portrait consists of 22 frequently unstable quasicycles whose totality forms a strange attractor. Quasicycles have quantitative (length) and qualitative (configuration) characteristics, whose combination defines a very important property called trend-stability. Phase analysis is a powerful form of time series analysis for assessing cyclic recurrence and is a tool for pre-forecasting analysis. The mathematical apparatus of fuzzy sets is also used in this article, and an algorithm for forming fuzzy sets of quasicycle lengths is presented. Quasicycle statistics are presented in tables, in geometric patterns and in the form of fuzzy sets.
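The simplest phase-portrait construction for a scalar time series, plotting x(t) against x(t+1), is one common way to expose quasicycles in two-dimensional space. The sketch below shows only that construction with an invented series, not the article's 119-year grain-yield data or its quasicycle-segmentation algorithm.

```python
# Delay-coordinate phase portrait of a scalar time series:
# the trajectory of points (x_t, x_{t+delay}).

def phase_portrait(series, delay=1):
    """Return (x_t, x_{t+delay}) pairs tracing the phase trajectory."""
    return list(zip(series, series[delay:]))

# Toy "yield" series (illustrative numbers, arbitrary units):
yields = [12.0, 14.5, 13.1, 10.8, 12.9, 15.2]
points = phase_portrait(yields)  # five points in the phase plane
```

Closed or nearly closed loops traced by consecutive points are the quasicycles; their length (number of points per loop) and configuration are the characteristics the article analyzes.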

  8. The Statistical Analysis and Assessment of the Solvency of Forest Enterprises

    Directory of Open Access Journals (Sweden)

    Vyniatynska Liudmila V.

    2016-05-01

    Full Text Available The aim of the article is to conduct a statistical analysis of the solvency of forest enterprises through a system of statistical indicators using the sampling method (the sampling is based on the criterion of forest cover percentage of the regions of Ukraine). The financial statements of forest enterprises, which form the information and analytical basis for such analysis, were used to analyze and evaluate the level of solvency of forestry in Ukraine for 2009-2015. With the help of the developed recommended values, the results of the statistical analysis of the forest enterprises' solvency under conditions of self-financing and commercial consideration were summarized and systematized. Using this methodology, conducted on a conceptual framework that is relevant and meets current needs, a system of statistical indicators was calculated that makes it possible to assess the level of solvency of forest enterprises and identify the reasons for its low level.

  9. Assessing Heterogeneity for Factor Analysis Model with Continuous and Ordinal Outcomes

    Directory of Open Access Journals (Sweden)

    Ye-Mao Xia

    2016-01-01

    Full Text Available Factor analysis models with continuous and ordinal responses are a useful tool for assessing relations between latent variables and mixed observed responses. These models have been successfully applied to many different fields, including the behavioral, educational, and social-psychological sciences. However, within the Bayesian analysis framework, most developments are constrained within parametric families, in which particular distributions are specified for the parameters of interest. This leads to difficulty in dealing with outliers and/or distribution deviations. In this paper, we propose a Bayesian semiparametric modeling approach for the factor analysis model with continuous and ordinal variables. A truncated stick-breaking prior is used to model the distributions of the intercept and/or covariance structural parameters. Bayesian posterior analysis is carried out through simulation-based methods, with a blocked Gibbs sampler implemented to draw observations from the complicated posterior. For model selection, the logarithm of the pseudomarginal likelihood is developed to compare the competing models. Empirical results are presented to illustrate the application of the methodology.
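The truncated stick-breaking construction mentioned here is straightforward to sample: each weight is a Beta draw times the stick mass remaining, and the final stick absorbs the leftover mass. This is a generic sketch of the construction only, under the common Beta(1, alpha) choice, not the paper's full prior specification.

```python
# Truncated stick-breaking weights: w_k = v_k * prod_{j<k} (1 - v_j),
# with v_k ~ Beta(1, alpha) and the last stick taking the remainder.
import random

def stick_breaking_weights(alpha, truncation, seed=0):
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(truncation - 1):
        v = rng.betavariate(1.0, alpha)
        weights.append(v * remaining)
        remaining *= 1.0 - v
    weights.append(remaining)  # truncation: absorb the leftover mass
    return weights

w = stick_breaking_weights(alpha=2.0, truncation=10)
```

Larger alpha spreads mass over more sticks; the truncation level trades fidelity to the infinite construction against computation, which is what makes the blocked Gibbs sampler tractable.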

  10. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and
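One of the fully automated assessment ideas described, cross-correlating the source and target acoustic signals, reduces at its core to finding the lag that maximises their correlation. The sketch below shows only that core idea on toy envelopes; the thesis's actual systems are far richer, and the signals and function names here are illustrative.

```python
# Estimating the delay between two signals by the lag that maximises
# their (unnormalised) cross-correlation. Toy envelopes only.

def best_lag(source, target, max_lag):
    """Non-negative lag (in samples) at which target best matches source."""
    def corr(lag):
        pairs = [(s, target[i + lag]) for i, s in enumerate(source)
                 if 0 <= i + lag < len(target)]
        return sum(s * t for s, t in pairs)
    return max(range(0, max_lag + 1), key=corr)

src = [0, 0, 1, 4, 1, 0, 0, 0]
tgt = [0, 0, 0, 0, 1, 4, 1, 0]  # the same burst, two samples later
lag = best_lag(src, tgt, max_lag=4)
```

A short estimated lag corresponds to the high degree of simultaneity associated with proficient translation, while a long lag reflects the delay penalised in the assessment.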

  11. Assessing the likely effectiveness of multispecies management for imperiled desert fishes with niche overlap analysis.

    Science.gov (United States)

    Laub, Brian G; Budy, Phaedra

    2015-08-01

    A critical decision in species conservation is whether to target individual species or a complex of ecologically similar species. Management of multispecies complexes is likely to be most effective when species share similar distributions, threats, and response to threats. We used niche overlap analysis to assess ecological similarity of 3 sensitive desert fish species currently managed as an ecological complex. We measured the amount of shared distribution of multiple habitat and life history parameters between each pair of species. Habitat use and multiple life history parameters, including maximum body length, spawning temperature, and longevity, differed significantly among the 3 species. The differences in habitat use and life history parameters among the species suggest they are likely to respond differently to similar threats and that most management actions will not benefit all 3 species equally. Habitat restoration, frequency of stream dewatering, non-native species control, and management efforts in tributaries versus main stem rivers are all likely to impact each of the species differently. Our results demonstrate that niche overlap analysis provides a powerful tool for assessing the likely effectiveness of multispecies versus single-species conservation plans. PMID:25627117
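The amount of shared distribution between two species' use of a resource axis is commonly summarised by Schoener's D, a standard niche-overlap statistic. The sketch below shows that statistic on invented habitat-use proportions; the paper's actual overlap measure and data may differ.

```python
# Schoener's D niche-overlap index over discretised habitat categories:
# D = 1 - 0.5 * sum(|p1_i - p2_i|); 1 = identical use, 0 = disjoint.

def schoeners_d(p1, p2):
    return 1.0 - 0.5 * sum(abs(a - b) for a, b in zip(p1, p2))

# Invented proportional use of three habitat types by two species:
species_a = [0.6, 0.3, 0.1]
species_b = [0.2, 0.3, 0.5]
overlap = schoeners_d(species_a, species_b)
```

Low pairwise overlap across habitat and life-history axes is exactly the situation in which, as the authors conclude, a single multispecies management action is unlikely to benefit all species equally.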

  12. Multivariate analysis of ATR-FTIR spectra for assessment of oil shale organic geochemical properties

    Science.gov (United States)

    Washburn, Kathryn E.; Birdwell, Justin E.

    2013-01-01

    In this study, attenuated total reflectance (ATR) Fourier transform infrared spectroscopy (FTIR) was coupled with partial least squares regression (PLSR) analysis to relate spectral data to parameters from total organic carbon (TOC) analysis and programmed pyrolysis to assess the feasibility of developing predictive models to estimate important organic geochemical parameters. The advantage of ATR-FTIR over traditional analytical methods is that source rocks can be analyzed in the laboratory or field in seconds, facilitating more rapid and thorough screening than would be possible using other tools. ATR-FTIR spectra, TOC concentrations and Rock–Eval parameters were measured for a set of oil shales from deposits around the world and several pyrolyzed oil shale samples. PLSR models were developed to predict the measured geochemical parameters from infrared spectra. Application of the resulting models to a set of test spectra excluded from the training set generated accurate predictions of TOC and most Rock–Eval parameters. The critical region of the infrared spectrum for assessing S1, S2, Hydrogen Index and TOC consisted of aliphatic organic moieties (2800–3000 cm−1) and the models generated a better correlation with measured values of TOC and S2 than did integrated aliphatic peak areas. The results suggest that combining ATR-FTIR with PLSR is a reliable approach for estimating useful geochemical parameters of oil shales that is faster and requires less sample preparation than current screening methods.

  13. Assessing Credit Default using Logistic Regression and Multiple Discriminant Analysis: Empirical Evidence from Bosnia and Herzegovina

    Directory of Open Access Journals (Sweden)

    Deni Memić

    2015-01-01

    Full Text Available This article aims to assess credit default prediction on the banking market in Bosnia and Herzegovina nationwide as well as in its constitutional entities (the Federation of Bosnia and Herzegovina and Republika Srpska). The ability to classify companies into different predefined groups, or to find an appropriate tool that would replace human assessment in classifying companies into good and bad buckets, has long been one of the main interests of risk management researchers. We investigated the possibility and accuracy of default prediction using the traditional statistical methods of logistic regression (logit) and multiple discriminant analysis (MDA) and compared their predictive abilities. The results show that the created models have high predictive ability. For the logit models, some variables are more influential on default prediction than others. Return on assets (ROA) is statistically significant in all four periods prior to default, having very high regression coefficients, i.e. a high impact on the model's ability to predict default. Similar results are obtained for the MDA models. It is also found that predictive ability differs between logistic regression and multiple discriminant analysis.
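How a fitted logit model turns a financial ratio such as ROA into a default probability can be sketched in one line. The coefficients below are invented placeholders, not the estimates from the Bosnian banking data; only the functional form is standard.

```python
# Logistic (logit) scoring: P(default) = 1 / (1 + exp(-(b0 + b1 * ROA))).
# Coefficients are hypothetical; a negative ROA coefficient means more
# profitable firms are predicted less likely to default.
import math

def default_probability(roa, intercept=-1.0, beta_roa=-8.0):
    z = intercept + beta_roa * roa
    return 1.0 / (1.0 + math.exp(-z))

p_weak = default_probability(roa=-0.05)   # loss-making firm
p_strong = default_probability(roa=0.10)  # profitable firm
```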

  14. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    Science.gov (United States)

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    The multi-criteria analysis gives the opportunity to researchers, designers and decision-makers to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called simple multi-attribute rating technique exploiting ranks was employed for assigning weights to selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric, for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate technology, attaining the highest composite value regarding the sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results. PMID:27054724
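The rank-based weighting behind the SMARTER method is usually implemented with rank order centroid (ROC) weights, and the final metric is then a weighted sum of normalised indicator scores. The sketch below shows that standard calculation with invented scores; it is not the paper's actual indicator set or weights.

```python
# Rank order centroid weights (as used in SMARTER) and a composite
# sustainability index as a weighted sum of normalised scores.

def roc_weights(n):
    """ROC weights for n criteria ranked 1 (most important) to n."""
    return [sum(1.0 / i for i in range(k, n + 1)) / n
            for k in range(1, n + 1)]

def composite_index(scores, weights):
    return sum(s * w for s, w in zip(scores, weights))

w = roc_weights(3)                       # approx. [0.611, 0.278, 0.111]
index = composite_index([0.8, 0.6, 0.9], w)
```

Because ROC weights need only the rank order of the criteria, stakeholders never have to agree on exact numeric weights, which is the appeal of the participatory approach described.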

  15. Phenomic assessment of genetic buffering by kinetic analysis of cell arrays.

    Science.gov (United States)

    Rodgers, John; Guo, Jingyu; Hartman, John L

    2014-01-01

    Quantitative high-throughput cell array phenotyping (Q-HTCP) is applied to the genomic collection of yeast gene deletion mutants for systematic, comprehensive assessment of the contribution of genes and gene combinations to any phenotype of interest (phenomic analysis). Interacting gene networks influence every phenotype; genetic buffering refers to how gene interaction networks stabilize or destabilize a phenotype. Like genomics, phenomics varies in its resolution, with a trade-off between allocating more measurements per sample to enhance quantification of the phenotype and increasing the number of different samples by obtaining fewer measurements per sample. The Q-HTCP protocol we describe assesses 50,000-70,000 cultures per experiment by obtaining kinetic growth curves from time-series imaging of agar cell arrays. This approach was developed for the yeast gene deletion strains, but it could be applied as well to other microbial mutant arrays grown on solid agar media. The methods we describe cover creation and maintenance of frozen stocks, liquid source array preparation, agar destination plate printing, image scanning, image analysis, curve fitting, and evaluation of gene interaction. PMID:25213246
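The kinetic growth curves obtained from time-series imaging are typically summarised by fitting a sigmoidal model; a common choice is the logistic form with a carrying capacity, a rate, and a half-maximal time. The sketch below evaluates such a model with invented parameters; it is a generic illustration, not the protocol's specific curve-fitting procedure.

```python
# Logistic growth model commonly fitted to culture growth curves:
# G(t) = K / (1 + exp(-r * (t - l))), with carrying capacity K,
# rate r, and time of half-maximal growth l. Values are illustrative.
import math

def logistic_growth(t, K, r, l):
    return K / (1.0 + math.exp(-r * (t - l)))

# Evaluate a toy curve every 12 h over 48 h:
curve = [logistic_growth(t, K=1.0, r=0.5, l=24.0) for t in range(0, 49, 12)]
```

The fitted K, r, and l per culture are the kind of per-strain summaries that can then be compared across the array to score gene interactions.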

  16. Application of exploratory factor analysis to assess fish consumption in a university community

    Directory of Open Access Journals (Sweden)

    Erika da Silva Maciel

    2013-03-01

    Full Text Available The objective of this research was to use the technique of Exploratory Factor Analysis (EFA) to adapt a tool for the assessment of fish consumption and the characteristics involved in this process. Data were collected during a campaign to encourage fish consumption in Brazil, with the voluntary participation of members of a university community. An assessment instrument consisting of multiple-choice questions and a five-point Likert scale was designed and used to measure the importance of certain attributes that influence the choice and consumption of fish. The study sample was composed of 224 individuals, the majority of whom were women (65.6%). With regard to the frequency of fish consumption, 37.67% of the volunteers interviewed said they consume the product two or three times a month, and 29.6% once a week. EFA was used to group the variables; extraction was performed using principal components and rotation using the Quartimax method. The results show clusters in two main constructs, quality and consumption, with Cronbach's alpha coefficients of 0.75 and 0.69, respectively, indicating good internal consistency.
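The internal-consistency statistic reported for the two constructs, Cronbach's alpha, is a simple function of per-item variances and the variance of respondents' totals. The toy responses below are invented solely to show the calculation; they are not the study's data.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(totals)).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of responses per item, respondents in the same order."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1.0 - item_var / variance(totals))

# Three 5-point Likert items answered by four toy respondents:
items = [[4, 5, 3, 4], [4, 4, 3, 5], [5, 5, 2, 4]]
alpha = cronbach_alpha(items)
```

Values around 0.7 or above, like the 0.75 and 0.69 reported, are conventionally read as acceptable internal consistency for exploratory work.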

  17. Air pollution assessment in two Moroccan cities using instrumental neutron activation analysis on bio-accumulators

    International Nuclear Information System (INIS)

    Full text: Biomonitoring is an appropriate tool for air pollution assessment studies. In this work, lichens and barks were used as bio-accumulators at several sites in two Moroccan cities (Rabat and Mohammadia). Their specific ability to absorb and accumulate heavy metals and toxic elements from the air, together with their longevity and resistance to environmental stresses, makes these bioindicators suitable for this kind of study. Instrumental Neutron Activation Analysis (INAA) is universally accepted as one of the most reliable analytical tools for trace and ultra-trace element determination. Its use in trace-element atmospheric pollution studies has been and still is extensive, as demonstrated by several specific works and detailed reviews. In this work, a preliminary investigation employing lichens, barks and INAA was carried out to evaluate the trace element distribution in six different areas of the cities of Rabat and Mohammadia, characterised by the presence of many industries and heavy traffic. Samples were irradiated with thermal neutrons in a nuclear reactor and the induced activity was counted using high-resolution germanium-lithium detectors. More than 30 elements were determined using two modes: short irradiation (1 minute) and long irradiation (17 hours). Accuracy and quality control were assessed using the reference standard material IAEA-336; the deviation was less than 1% for major elements and about 5 to 10% for trace elements.

  18. Rapid ecotoxicological assessment of heavy metal combined polluted soil using canonical analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Su-hua; ZHOU Qi-xing; SUN Tie-heng; LI Pei-jun

    2003-01-01

    Quick, simple to perform, and cheap biomarkers were combined in a rapid assessment approach to measure the effects of the metal pollutants Cu, Cd, Pb and Zn in meadow burozem on wheat. Analysis of the orthogonal design showed a significant zinc factor, indicating that both the inhibition rate of shoot mass and that of root elongation were affected by zinc (P < 0.05 and P < 0.01, respectively). The first toxicity canonical variable (TOXI), formed from the toxicity data set, explained 49% of the total variance in the toxicity data set; the first biological canonical variable (BIOL) explained 42% of the total variation in the biological data set. The correlation between the first canonical variables TOXI and BIOL (the canonical correlation) was 0.94 (P < 0.0001). It is therefore reliable and feasible to use this approach to assess the toxicity of soil combined-polluted by heavy metals using canonical analysis. The toxicity of such soil to the plant community was estimated by comparing IC50 values, i.e. the concentration needed to cause a 50% decrease in growth rate compared with no metal addition. The environmental quality standard for soils prescribes that the tested concentrations of heavy metals in soil should ultimately cause no hazard or pollution, whereas the results indicated that soils of the second grade caused inhibition rates of wheat growth of around 50%. The environmental quality standard for soils could therefore be modified to include other features.

  20. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
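The idea of producing a Pc distribution rather than a single value can be illustrated by Monte Carlo: sample the uncertain inputs and recompute Pc for each draw. The sketch below uses a deliberately simplified one-dimensional Pc surrogate and invented distributions; it is a toy stand-in for the CARA method, not the method itself.

```python
# Monte Carlo propagation of input uncertainty into a Pc distribution.
# toy_pc is a 1-D surrogate (mass of N(miss, sigma^2) within +/- radius);
# all distributions and numbers are illustrative.
import math
import random

def toy_pc(miss, radius, sigma=50.0):
    z1 = (radius - miss) / (sigma * math.sqrt(2.0))
    z2 = (-radius - miss) / (sigma * math.sqrt(2.0))
    return 0.5 * (math.erf(z1) - math.erf(z2))

def pc_distribution(n, seed=1):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        miss = rng.gauss(120.0, 20.0)    # uncertain miss distance (m)
        radius = rng.uniform(5.0, 15.0)  # uncertain hard-body radius (m)
        samples.append(toy_pc(miss, radius))
    return samples

pcs = pc_distribution(2000)
```

Percentiles of such a sample (rather than a single nominal Pc) are what allow an event to be judged against a risk threshold with its input uncertainty made explicit.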

  1. Assessment of hydrocephalus in children based on digital image processing and analysis

    Directory of Open Access Journals (Sweden)

    Fabijańska Anna

    2014-06-01

    Full Text Available Hydrocephalus is a pathological condition of the central nervous system which often affects neonates and young children. It manifests itself as an abnormal accumulation of cerebrospinal fluid within the ventricular system of the brain, with subsequent progression. One of the most important diagnostic methods for identifying hydrocephalus is Computer Tomography (CT), on whose scans the enlarged ventricular system is clearly visible. However, the assessment of disease progress usually relies on the radiologist's judgment and manual measurements, which are subjective, cumbersome and of limited accuracy. This paper therefore addresses the problem of semi-automatic assessment of hydrocephalus using image processing and analysis algorithms, in particular the automated determination of popular indices of disease progress. Algorithms for the detection, semi-automatic segmentation and numerical description of the lesion are proposed; specifically, the disease progress is determined using shape analysis algorithms. Numerical results provided by the introduced methods are presented and compared with those calculated manually by a radiologist and a trained operator. The comparison confirms the correctness of the introduced approach.
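One of the popular indices of ventricular enlargement that such systems commonly automate is the Evans index, the ratio of the maximal frontal-horn width to the maximal inner skull width on the same CT slice. Whether this particular index is among those used in the paper is an assumption; the measurements below are invented.

```python
# Evans index: frontal-horn width / inner skull width on one CT slice.
# Values above roughly 0.3 are conventionally taken to suggest
# ventricular enlargement. Measurements here are illustrative.

def evans_index(frontal_horn_width_mm, inner_skull_width_mm):
    return frontal_horn_width_mm / inner_skull_width_mm

ei = evans_index(45.0, 120.0)  # well above the ~0.3 convention
```

Automating the two underlying width measurements from segmented scans is exactly what removes the subjectivity of manual calliper placement.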

  2. Frequency Domain Analysis for Assessing Fluid Responsiveness by Using Instantaneous Pulse Rate Variability

    Directory of Open Access Journals (Sweden)

    Pei-Chen Lin

    2016-02-01

    Full Text Available In the ICU, fluid therapy is a conventional strategy for patients in shock. However, only half of ICU patients respond well to fluid therapy, and fluid loading in non-responsive patients delays definitive therapy. Prediction of fluid responsiveness (FR) has therefore become an intense topic in clinical practice. Most conventional FR prediction methods are based on time-domain analysis and have limited ability to indicate FR. This study proposes a method that predicts FR based on frequency-domain analysis, named instantaneous pulse rate variability (iPRV). iPRV provides a new indication in the very-high-frequency (VHF) range (0.4-0.8 Hz) of the spectrum for peripheral responses. Twenty-six healthy subjects participated in this study, and the photoplethysmography signal was recorded at supine baseline, during head-up tilt (HUT), and during passive leg raising (PLR), which induces variation in venous return and supports quantitative assessment of FR individually. The results showed that the spectral power of the VHF band decreased during HUT (573.96±756.36 ms2 at baseline; 348.00±434.92 ms2 during HUT) and increased during PLR (573.96±756.36 ms2 at baseline; 718.92±973.70 ms2 during PLR), reflecting the compensatory regulation of venous return and FR. This study provides an effective indicator for assessing FR in the frequency domain and has the potential to be a reliable system in the ICU.
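Band-limited spectral power of the kind reported for the VHF range can be computed from an evenly resampled series with a plain DFT, summing the power of the bins that fall inside the band. The sketch below does this on a synthetic 0.6 Hz oscillation; it is a generic illustration of band power, not the study's iPRV estimation pipeline.

```python
# Spectral power in a frequency band via a direct DFT over the bins
# whose frequencies fall in [f_lo, f_hi]. Synthetic signal only.
import math

def band_power(x, fs, f_lo, f_hi):
    n = len(x)
    power = 0.0
    for k in range(1, n // 2 + 1):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            power += (re * re + im * im) / n
    return power

fs = 4.0  # assumed 4 Hz resampling rate
sig = [math.sin(2 * math.pi * 0.6 * i / fs) for i in range(256)]
vhf = band_power(sig, fs, 0.4, 0.8)   # 0.6 Hz content lands here
lf = band_power(sig, fs, 0.04, 0.15)  # nearly empty for this signal
```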

  3. Probabilistic fragility analysis: A tool for assessing design rules of RC buildings

    Institute of Scientific and Technical Information of China (English)

    Nikos D Lagarost

    2008-01-01

    In this work, fragility analysis is performed to assess two groups of reinforced concrete structures. The first group of structures is composed of buildings that implement three common design practices; namely, fully infilled, weak ground story and short columns. The three design practices are applied during the design process of a reinforced concrete building. The structures of the second group vary according to the value of the behavioral factors used to define the seismic forces as specified in design procedures. Most seismic design codes belong to the class of prescriptive procedures where if certain constraints are fulfilled, the structure is considered safe. Prescriptive design procedures express the ability of the structure to absorb energy through inelastic deformation using the behavior factor. The basic objective of this work is to assess both groups of structures with reference to the limit-state probability of exceedance. Thus, four limit state fragility curves are developed on the basis of nonlinear static analysis for both groups of structures. Moreover, the 95% confidence intervals of the fragility curves are also calculated, taking into account two types of random variables that influence structural capacity and seismic demand.

  4. Assessing the likely effectiveness of multispecies management for imperiled desert fishes with niche overlap analysis

    Science.gov (United States)

    Laub, P; Budy, Phaedra

    2015-01-01

    A critical decision in species conservation is whether to target individual species or a complex of ecologically similar species. Management of multispecies complexes is likely to be most effective when species share similar distributions, threats, and response to threats. We used niche overlap analysis to assess ecological similarity of 3 sensitive desert fish species currently managed as an ecological complex. We measured the amount of shared distribution of multiple habitat and life history parameters between each pair of species. Habitat use and multiple life history parameters, including maximum body length, spawning temperature, and longevity, differed significantly among the 3 species. The differences in habitat use and life history parameters among the species suggest they are likely to respond differently to similar threats and that most management actions will not benefit all 3 species equally. Habitat restoration, frequency of stream dewatering, non-native species control, and management efforts in tributaries versus main stem rivers are all likely to impact each of the species differently. Our results demonstrate that niche overlap analysis provides a powerful tool for assessing the likely effectiveness of multispecies versus single-species conservation plans.
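Pairwise niche overlap of the sort measured in this record can be quantified, for example, with Schoener's D over shared habitat-use distributions. The index choice is an assumption (the abstract does not name the specific metric), and the proportions below are invented for illustration:

```python
import numpy as np

def schoener_d(p, q):
    """Schoener's D niche-overlap index: 1 - 0.5 * sum|p_i - q_i|.
    p and q are proportional use of the same habitat categories."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 1.0 - 0.5 * np.abs(p / p.sum() - q / q.sum()).sum()

# hypothetical proportional habitat use across four habitat types
species_a = [0.50, 0.30, 0.15, 0.05]
species_b = [0.45, 0.35, 0.15, 0.05]  # similar to species_a
species_c = [0.05, 0.15, 0.30, 0.50]  # habitat use reversed

d_ab = schoener_d(species_a, species_b)
d_ac = schoener_d(species_a, species_c)
print(round(d_ab, 3), round(d_ac, 3))  # high overlap vs low overlap
```

Low pairwise overlap, as between `species_a` and `species_c` here, would argue against managing the pair as a single ecological complex.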

  5. Extended Cost-Effectiveness Analysis for Health Policy Assessment: A Tutorial.

    Science.gov (United States)

    Verguet, Stéphane; Kim, Jane J; Jamison, Dean T

    2016-09-01

    Health policy instruments such as the public financing of health technologies (e.g., new drugs, vaccines) have consequences in multiple domains. Fundamentally, public health policies aim to increase the uptake of effective and efficient interventions and thereby to deliver greater health benefits (e.g., premature mortality and morbidity averted). In addition, public health policies can provide non-health benefits beyond the sole well-being of populations and beyond the health sector. For instance, public policies such as social and health insurance programs can prevent illness-related impoverishment and procure financial risk protection. Furthermore, public policies can improve the distribution of health in the population and promote the equalization of health among individuals. Extended cost-effectiveness analysis was developed for health policy assessment, specifically to evaluate the health and financial consequences of public policies in four domains: (1) the health gains; (2) the financial risk protection benefits; (3) the total costs to the policy makers; and (4) the distributional benefits. Here, we present a tutorial that describes both the intent of extended cost-effectiveness analysis and its key steps, to allow easy implementation for health policy assessment. PMID:27374172
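The four domains enumerated in this record can be sketched as a toy per-income-quintile calculation. Every number below (populations, coverage levels, costs, risks) is an illustrative assumption, not data from the tutorial:

```python
import numpy as np

quintiles = ["Q1 (poorest)", "Q2", "Q3", "Q4", "Q5 (richest)"]
pop = np.full(5, 200_000)                          # population per quintile
baseline_coverage = np.array([0.2, 0.3, 0.5, 0.7, 0.9])
policy_coverage = np.full(5, 0.9)                  # public financing raises uptake
deaths_per_uncovered = 0.004                       # risk averted by the intervention
oop_cost = 50.0                                    # out-of-pocket price without the policy
unit_cost = 60.0                                   # government cost per person covered

newly_covered = pop * (policy_coverage - baseline_coverage)
deaths_averted = newly_covered * deaths_per_uncovered   # (1) health gains
oop_averted = newly_covered * oop_cost                  # (2) financial risk protection
govt_cost = pop * policy_coverage * unit_cost           # (3) cost to the policy maker

# (4) distributional benefits: gains concentrate in the poorest quintiles
print(deaths_averted.astype(int), oop_averted.astype(int), int(govt_cost.sum()))
```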

  6. Gait analysis, bone and muscle density assessment for patients undergoing total hip arthroplasty

    Directory of Open Access Journals (Sweden)

    Benedikt Magnússon

    2012-12-01

    Full Text Available Total hip arthroplasty (THA) is performed with or without the use of bone cement. Given the lack of reliable clinical guidelines for deciding whether a patient should receive THA with or without bone cement, a joint clinical and engineering approach is proposed here, with the objective of assessing patient recovery by developing monitoring techniques based on gait analysis, measurements of bone mineral density, and structural and functional changes of the quadriceps muscles. A clinical trial was conducted with 36 volunteer patients undergoing THA surgery for the first time: 18 receiving a cemented implant and 18 receiving a non-cemented implant. The patients are scanned with computed tomography (CT) prior to, immediately after, and 12 months after surgery. The CT data are further processed to segment muscles and bones for calculating bone mineral density (BMD). A Hounsfield unit (HU)-based quadriceps muscle density value is calculated from the segmented file for the healthy and operated leg before and after THA surgery. Furthermore, clinical assessment is performed using gait analysis technologies such as a sensing carpet, wireless electrodes, and video. Patients undergo these measurements prior to, 6 weeks after, and 52 weeks after surgery. The preliminary results indicate computational tools and methods that are able to quantitatively analyze the patient’s condition pre- and post-surgery: spatial parameters such as step length and stride length increase 6 weeks post-op in the patient group receiving the cemented implant, while the angle in the toe-in/out parameter decreases in both patient groups.

  7. Bioimpedance Harmonic Analysis as a Diagnostic Tool to Assess Regional Circulation and Neural Activity

    Science.gov (United States)

    Mudraya, I. S.; Revenko, S. V.; Khodyreva, L. A.; Markosyan, T. G.; Dudareva, A. A.; Ibragimov, A. R.; Romich, V. V.; Kirpatovsky, V. I.

    2013-04-01

    The novel technique based on harmonic analysis of bioimpedance microvariations, with an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess the neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M, for Mayer wave) and at the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity, identified according to the M and R peaks, was found to be presumably sympathetic in pelvic pain patients and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder as well as in the penile spectrum characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted towards the norm in the cases of efficient therapy. Bioimpedance harmonic analysis thus seems to be a potent tool to assess regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.
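Extracting the M, R, and C spectral peaks from an impedance record might look like the following sketch. The sampling rate, rhythm frequencies other than 0.1 Hz, amplitudes, and the synthetic signal itself are all assumptions, not the authors' hardware output:

```python
import numpy as np

fs = 20.0                        # impedance sampling rate, Hz (assumption)
t = np.arange(0, 120, 1 / fs)    # two minutes of data
# synthetic bioimpedance microvariations: Mayer (0.1 Hz), respiratory (0.25 Hz),
# and cardiac (1.2 Hz) rhythms with illustrative amplitudes
z = (0.5 * np.sin(2 * np.pi * 0.1 * t)
     + 0.3 * np.sin(2 * np.pi * 0.25 * t)
     + 0.2 * np.sin(2 * np.pi * 1.2 * t))

spec = np.abs(np.fft.rfft(z)) / t.size       # normalized magnitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def peak_near(f0, tol=0.05):
    """Largest spectral magnitude within tol Hz of the expected rhythm f0."""
    m = (freqs > f0 - tol) & (freqs < f0 + tol)
    return spec[m].max()

m_peak, r_peak, c_peak = peak_near(0.1), peak_near(0.25), peak_near(1.2)
print(m_peak > r_peak > c_peak)  # amplitudes recovered in the injected order
```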

  8. Bioimpedance Harmonic Analysis as a Diagnostic Tool to Assess Regional Circulation and Neural Activity

    International Nuclear Information System (INIS)

    The novel technique based on harmonic analysis of bioimpedance microvariations, with an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess the neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M, for Mayer wave) and at the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity, identified according to the M and R peaks, was found to be presumably sympathetic in pelvic pain patients and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder as well as in the penile spectrum characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted towards the norm in the cases of efficient therapy. Bioimpedance harmonic analysis thus seems to be a potent tool to assess regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.

  9. Microarray analysis reveals the actual specificity of enrichment media used for food safety assessment.

    Science.gov (United States)

    Kostić, Tanja; Stessl, Beatrix; Wagner, Martin; Sessitsch, Angela

    2011-06-01

    Microbial diagnostic microarrays are tools for simultaneous detection and identification of microorganisms in food, clinical, and environmental samples. In comparison to classic methods, microarray-based systems have the potential for high throughput, parallelism, and miniaturization. High specificity and high sensitivity of detection have been demonstrated. A microbial diagnostic microarray for the detection of the most relevant bacterial food- and waterborne pathogens and indicator organisms was developed and thoroughly validated. The microarray platform based on sequence-specific end labeling of oligonucleotides and the phylogenetically robust gyrB marker gene allowed a highly specific (resolution on genus and/or species level) and sensitive (0.1% relative and 10⁴ CFU absolute sensitivity) detection of the target pathogens. In initial challenge studies of the applicability of microarray-based food analysis, we obtained results demonstrating the questionable specificity of standardized culture-dependent microbiological detection methods. Taking into consideration the importance of reliable food safety assessment methods, comprehensive performance assessment is essential. Results demonstrate the potential of this new pathogen diagnostic microarray to evaluate culture-based standard methods in microbiological food analysis.

  10. Assessment of Student Skills for Critiquing Published Primary Scientific Literature Using a Primary Trait Analysis Scale

    Directory of Open Access Journals (Sweden)

    Manuel F. Varela

    2009-12-01

    Full Text Available Instructor evaluation of progressive student skills in the analysis of primary literature is critical for the development of these skills in young scientists. Students in a senior- or graduate-level one-semester course in Immunology at a Masters-level comprehensive university were assessed for abilities (primary traits) to recognize and evaluate the following elements of a scientific paper: Hypothesis and Rationale, Significance, Methods, Results, Critical Thinking and Analysis, and Conclusions. We tested the hypotheses that average recognition scores vary among elements and that scores change with time differently by trait. Recognition scores (scaled 1 to 5) and differences in scores were analyzed using analysis of variance (ANOVA), regression, and analysis of covariance (ANCOVA) (n = 10 papers over 103 days). By multiple comparisons testing, we found that recognition scores statistically fell into two groups: high scores (for Hypothesis and Rationale, Significance, Methods, and Conclusions) and low scores (for Results and Critical Thinking and Analysis). Recognition scores only significantly changed with time (increased) for Hypothesis and Rationale and Results. ANCOVA showed that changes in recognition scores for these elements were not significantly different in slope (F1,16 = 0.254, P = 0.621), but the Results trait was significantly lower in elevation (F1,17 = 12.456, P = 0.003). Thus, students improved with similar trajectories, but started and ended with lower Results scores. We conclude that students have the greatest difficulty evaluating Results and critically evaluating scientific validity. Our findings show extant student skills, and the significant increase in some traits shows learning. This study demonstrates that students start with variable recognition skills and that student skills may be learned at differential rates. 
Faculty can use these findings or the primary trait analysis scoring scale to focus on specific paper elements for which

  11. ASSESSMENT OF OIL PALM PLANTATION AND TROPICAL PEAT SWAMP FOREST WATER QUALITY BY MULTIVARIATE STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Seca Gandaseca

    2014-01-01

    Full Text Available This study reports the spatio-temporal changes in river and canal water quality at peat swamp forest and oil palm plantation sites in Sarawak, Malaysia. To investigate temporal changes, 192 water samples were collected at four stations of BatangIgan, an oil palm plantation site of Sarawak, during July-November 2009 and April-July 2010. Nine water quality parameters were analysed: Electrical Conductivity (EC), pH, Turbidity (TER), Dissolved Oxygen (DO), Temperature (TEMP), Chemical Oxygen Demand (COD), five-day Biochemical Oxygen Demand (BOD5), ammonia-Nitrogen (NH3-N), and Total Suspended Solids (TSS). To investigate spatial changes, 432 water samples were collected from six different sites including BatangIgan during June-August 2010. Six water quality parameters (pH, DO, COD, BOD5, NH3-N and TSS) were analysed to see the spatial variations. The parameters contributing most to the spatio-temporal variations were assessed by statistical techniques such as Hierarchical Agglomerative Cluster Analysis (HACA), Factor Analysis/Principal Components Analysis (FA/PCA), and Discriminant Function Analysis (DFA). HACA identified three different classes of sites: Relatively Unimpaired, Impaired and Less Impaired Regions, on the basis of similarity among different physicochemical characteristics and pollutant levels between the sampling sites. DFA produced the best results for identification of the main variables for temporal analysis, separating the parameters (EC, TER, COD), and identified three parameters for spatial analysis (pH, NH3-N and BOD5). The results signify that the parameters identified by the statistical analyses were responsible for water quality change, and suggest agricultural and oil palm plantation activities as a possible source of pollutants. The results suggest dire need for proper watershed management measures to restore the water quality of this tributary for a
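The PCA step used in water-quality assessments like this one can be sketched with a plain SVD on standardized data. The station values below are synthetic stand-ins for the six spatial parameters, not the Sarawak measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
# toy water-quality matrix: rows = sampling stations, columns = parameters
# (pH, DO, COD, BOD5, NH3-N, TSS); all values are invented for illustration
clean = rng.normal([7.0, 7.5, 20, 2, 0.1, 30], [0.2, 0.5, 5, 0.5, 0.05, 10], (10, 6))
polluted = rng.normal([6.0, 3.0, 80, 12, 1.5, 150], [0.3, 0.8, 15, 3, 0.4, 40], (10, 6))
X = np.vstack([clean, polluted])

# standardize each parameter, then PCA via SVD of the centered matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                     # station scores on the principal components
explained = s**2 / (s**2).sum()       # fraction of variance per component

# PC1 separates impaired from unimpaired stations (group means on opposite sides)
print(round(explained[0], 2), scores[:10, 0].mean() * scores[10:, 0].mean() < 0)
```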

  12. A Support Analysis Framework for mass movement damage assessment: applications to case studies in Calabria (Italy

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2009-03-01

    Full Text Available The analysis of data describing damage caused by mass movements in Calabria (Italy) allowed the organisation of the Support Analysis Framework (SAF), a spreadsheet that converts damage descriptions into numerical indices expressing direct, indirect, and intangible damage.

    The SAF assesses damage indices of past mass movements and the potential outcomes of dormant phenomena re-activations. It is based on the effects on damaged elements and is independent of both physical and geometric phenomenon characteristics.

    SAF sections that assess direct damage encompass several lines, each describing an element characterised by a value fixed on a relative arbitrary scale. The levels of loss are classified as: L4: complete; L3: high; L2: medium; or L1: low. For a generic line l, the SAF multiplies the value of a damaged element by its level of loss, obtaining dl, the contribution of the line to the damage.

    Indirect damage is appraised by two sections accounting for: (a) actions aiming to overcome emergency situations and (b) actions aiming to restore pre-movement conditions. The level of loss depends on the number of people involved (a) or the cost of actions (b).

    For intangible damage, the level of loss depends on the number of people involved.

    We examined three phenomena, assessing damage using the SAF and SAFL, customised versions of SAF based on the elements actually present in the analysed municipalities that consider the values of elements in the community framework. We show that in less populated, inland, and affluent municipalities, the impact of mass movements is greater than in coastal areas.

    The SAF can be useful to sort groups of phenomena according to their probable future damage, supplying results significant either for insurance companies or for local authorities involved in both disaster management and planning of defensive measures.
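The direct-damage section described in this record (the value of each damaged element multiplied by its level of loss, summed over lines) can be sketched as follows. The numeric mapping of loss levels L1-L4 and the element names and values are illustrative assumptions, not the SAF's actual tables:

```python
# Minimal sketch of the SAF direct-damage section: each line holds an element's
# value on a relative arbitrary scale and its level of loss; the line's
# contribution is d_l = value * loss, and the index is the sum over lines.
LOSS = {"L1": 0.25, "L2": 0.5, "L3": 0.75, "L4": 1.0}  # assumed numeric mapping

damaged_elements = [
    # (element, value on relative scale, level of loss) -- hypothetical entries
    ("municipal road", 6, "L3"),
    ("private house", 4, "L4"),
    ("aqueduct", 8, "L1"),
]

contributions = {name: value * LOSS[level] for name, value, level in damaged_elements}
direct_damage_index = sum(contributions.values())
print(contributions, direct_damage_index)
```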

  13. Assessment of genetic stability in micropropagules of Jatropha curcas genotypes by RAPD and AFLP analysis

    KAUST Repository

    Sharma, Sweta K.

    2011-07-01

    Jatropha curcas (Euphorbiaceae), a drought resistant non edible oil yielding plant, has acquired significant importance as an alternative renewable energy source. Low and inconsistent yields found in field plantations prompted for identification of high yielding clones and their large scale multiplication by vegetative propagation to obtain true to type plants. In the current investigation plantlets of J. curcas generated by axillary bud proliferation (micropropagation) using nodal segments obtained from selected high yielding genotypes were assessed for their genetic stability using Randomly Amplified Polymorphic DNA (RAPD) and Amplified Fragment Length Polymorphism (AFLP) analyses. For RAPD analysis, 21 out of 52 arbitrary decamer primers screened gave clear reproducible bands. In the micropropagated plantlets obtained from the 2nd sub-culture, 4 out of a total of 177 bands scored were polymorphic, but in the 8th and 16th sub-cultures (culture cycle) no polymorphisms were detected. AFLP analysis revealed 0.63%, 0% and 0% polymorphism in the 2nd, 8th and 16th generations, respectively. When different genotypes, viz. IC 56557 16, IC 56557 34 and IC 56557 13, were assessed by AFLP, 0%, 0.31% and 0.47% polymorphisms were found, respectively, indicating a difference in genetic stability among the different genotypes. To the best of our knowledge this is the first report on assessment of genetic stability of micropropagated plantlets in J. curcas and suggests that axillary shoot proliferation can safely be used as an efficient micropropagation method for mass propagation of J. curcas. © 2011 Elsevier B.V.
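The polymorphism percentages reported in this record reduce to a simple proportion of polymorphic bands among all scored bands, e.g. for the 2nd sub-culture RAPD figures (4 of 177 bands):

```python
def polymorphism_pct(polymorphic_bands, total_bands):
    """Percent polymorphism among scored marker bands."""
    return 100.0 * polymorphic_bands / total_bands

# figures from the RAPD assessment of the 2nd sub-culture: 4 of 177 bands
pct = polymorphism_pct(4, 177)
print(round(pct, 2))  # about 2.26% of bands polymorphic
```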

  14. Applications of life cycle assessment and cost analysis in health care waste management

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Sebastiao Roberto, E-mail: soares@ens.ufsc.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Finotti, Alexandra Rodrigues, E-mail: finotti@ens.ufsc.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Prudencio da Silva, Vamilson, E-mail: vamilson@epagri.sc.gov.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); EPAGRI, Rod. Admar Gonzaga 1347, Itacorubi, Florianopolis, Santa Catarina 88034-901 (Brazil); Alvarenga, Rodrigo A.F., E-mail: alvarenga.raf@gmail.com [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Ghent University, Department of Sustainable Organic Chemistry and Technology, Coupure Links 653/9000 Gent (Belgium)

    2013-01-15

    Highlights: ► Three Health Care Waste (HCW) scenarios were assessed through environmental and cost analysis. ► HCW treatment using microwave oven had the lowest environmental impacts and costs in comparison with autoclave and lime. ► Lime had the worst environmental and economic results for HCW treatment, in comparison with autoclave and microwave. - Abstract: The establishment of rules to manage Health Care Waste (HCW) is a challenge for the public sector. Regulatory agencies must ensure the safety of waste management alternatives for two very different profiles of generators: (1) hospitals, which concentrate the production of HCW and (2) small establishments, such as clinics, pharmacies and other sources, that generate dispersed quantities of HCW and are scattered throughout the city. To assist in developing sector regulations for the small generators, we evaluated three management scenarios using decision-making tools. They consisted of a disinfection technique (microwave, autoclave and lime) followed by landfilling, where transportation was also included. The microwave, autoclave and lime techniques were tested at the laboratory to establish the operating parameters to ensure their efficiency in disinfection. Using a life cycle assessment (LCA) and cost analysis, the decision-making tools aimed to determine the technique with the best environmental performance. This consisted of evaluating the eco-efficiency of each scenario. Based on the life cycle assessment, microwaving had the lowest environmental impact (12.64 Pt) followed by autoclaving (48.46 Pt). The cost analyses indicated values of US$ 0.12 kg−1 for the waste treated with microwaves, US$ 1.10 kg−1 for the waste treated by the autoclave and US$ 1.53 kg−1 for the waste treated with lime. The microwave disinfection presented the best eco-efficiency performance among those studied and provided a feasible alternative to subsidize the formulation of the policy for small generators of HCW.

  15. Applications of life cycle assessment and cost analysis in health care waste management

    International Nuclear Information System (INIS)

    Highlights: ► Three Health Care Waste (HCW) scenarios were assessed through environmental and cost analysis. ► HCW treatment using microwave oven had the lowest environmental impacts and costs in comparison with autoclave and lime. ► Lime had the worst environmental and economic results for HCW treatment, in comparison with autoclave and microwave. - Abstract: The establishment of rules to manage Health Care Waste (HCW) is a challenge for the public sector. Regulatory agencies must ensure the safety of waste management alternatives for two very different profiles of generators: (1) hospitals, which concentrate the production of HCW and (2) small establishments, such as clinics, pharmacies and other sources, that generate dispersed quantities of HCW and are scattered throughout the city. To assist in developing sector regulations for the small generators, we evaluated three management scenarios using decision-making tools. They consisted of a disinfection technique (microwave, autoclave and lime) followed by landfilling, where transportation was also included. The microwave, autoclave and lime techniques were tested at the laboratory to establish the operating parameters to ensure their efficiency in disinfection. Using a life cycle assessment (LCA) and cost analysis, the decision-making tools aimed to determine the technique with the best environmental performance. This consisted of evaluating the eco-efficiency of each scenario. Based on the life cycle assessment, microwaving had the lowest environmental impact (12.64 Pt) followed by autoclaving (48.46 Pt). The cost analyses indicated values of US$ 0.12 kg−1 for the waste treated with microwaves, US$ 1.10 kg−1 for the waste treated by the autoclave and US$ 1.53 kg−1 for the waste treated with lime. The microwave disinfection presented the best eco-efficiency performance among those studied and provided a feasible alternative to subsidize the formulation of the policy for small generators of HCW.
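One way to combine the reported LCA impact points and per-kilogram costs into a single eco-efficiency ranking is sketched below. The impact × cost scoring rule (lower is better) is an assumed illustration, not the paper's exact eco-efficiency method, and the abstract gives no impact score for lime, so it is left out of the ranking:

```python
scenarios = {
    # name: (LCA impact in Pt, treatment cost in US$ per kg) from the abstract
    "microwave": (12.64, 0.12),
    "autoclave": (48.46, 1.10),
    "lime": (None, 1.53),   # impact score not reported in the abstract
}

# assumed combination rule: impact x cost, lower score = better eco-efficiency
scored = {k: imp * cost for k, (imp, cost) in scenarios.items() if imp is not None}
best = min(scored, key=scored.get)
print(best, round(scored[best], 3))
```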

  16. A Support Analysis Framework for mass movement damage assessment: applications to case studies in Calabria (Italy)

    Science.gov (United States)

    Petrucci, O.; Gullã, G.

    2009-03-01

    The analysis of data describing damage caused by mass movements in Calabria (Italy) allowed the organisation of the Support Analysis Framework (SAF), a spreadsheet that converts damage descriptions into numerical indices expressing direct, indirect, and intangible damage. The SAF assesses damage indices of past mass movements and the potential outcomes of dormant phenomena re-activations. It is based on the effects on damaged elements and is independent of both physical and geometric phenomenon characteristics. SAF sections that assess direct damage encompass several lines, each describing an element characterised by a value fixed on a relative arbitrary scale. The levels of loss are classified as: L4: complete; L3: high; L2: medium; or L1: low. For a generic line l, the SAF multiplies the value of a damaged element by its level of loss, obtaining dl, the contribution of the line to the damage. Indirect damage is appraised by two sections accounting for: (a) actions aiming to overcome emergency situations and (b) actions aiming to restore pre-movement conditions. The level of loss depends on the number of people involved (a) or the cost of actions (b). For intangible damage, the level of loss depends on the number of people involved. We examined three phenomena, assessing damage using the SAF and SAFL, customised versions of SAF based on the elements actually present in the analysed municipalities that consider the values of elements in the community framework. We show that in less populated, inland, and affluent municipalities, the impact of mass movements is greater than in coastal areas. The SAF can be useful to sort groups of phenomena according to their probable future damage, supplying results significant either for insurance companies or for local authorities involved in both disaster management and planning of defensive measures.

  17. Confusion assessment method: a systematic review and meta-analysis of diagnostic accuracy

    Directory of Open Access Journals (Sweden)

    Shi Q

    2013-09-01

    Full Text Available Qiyun Shi,1,2 Laura Warren,3 Gustavo Saposnik,2 Joy C MacDermid1 1Health and Rehabilitation Sciences, Western University, London, Ontario, Canada; 2Stroke Outcomes Research Center, Department of Medicine, St Michael's Hospital, University of Toronto, Toronto, Ontario, Canada; 3Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada Background: Delirium is common in the early stages of hospitalization for a variety of acute and chronic diseases. Objectives: To evaluate the diagnostic accuracy of two delirium screening tools, the Confusion Assessment Method (CAM) and the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU). Methods: We searched MEDLINE, EMBASE, and PsychInfo for relevant articles published in English up to March 2013. We compared the two screening tools to Diagnostic and Statistical Manual of Mental Disorders IV criteria. Two reviewers independently assessed studies to determine their eligibility, validity, and quality. Sensitivity and specificity were calculated using a bivariate model. Results: Twenty-two studies (n = 2,442 patients) met the inclusion criteria. All studies demonstrated that these two scales can be administered within ten minutes by trained clinical or research staff. The pooled sensitivity and specificity were 82% (95% confidence interval [CI]: 69%–91%) and 99% (95% CI: 87%–100%) for CAM, and 81% (95% CI: 57%–93%) and 98% (95% CI: 86%–100%) for CAM-ICU, respectively. Conclusion: Both CAM and CAM-ICU are validated instruments for the diagnosis of delirium in a variety of medical settings. However, CAM and CAM-ICU both present higher specificity than sensitivity. Therefore, the use of these tools should not replace clinical judgment. Keywords: confusion assessment method, diagnostic accuracy, delirium, systematic review, meta-analysis
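Per-study sensitivity and specificity of a screening tool against the DSM-IV reference standard come from a 2×2 diagnostic table. The counts below are hypothetical, chosen only so the result matches the pooled CAM point estimates; the review's actual pooling uses a bivariate model, not this single-table arithmetic:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 diagnostic table
    (true/false positives and negatives vs the reference standard)."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical single-study table: CAM screen vs DSM-IV reference standard
tp, fn, tn, fp = 41, 9, 198, 2
sens, spec = sens_spec(tp, fn, tn, fp)
print(round(sens, 2), round(spec, 2))  # 0.82 0.99
```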

  18. Authenticity assessment of beef origin by principal component analysis of matrix-assisted laser desorption/ionization mass spectrometric data.

    Science.gov (United States)

    Zaima, Nobuhiro; Goto-Inoue, Naoko; Hayasaka, Takahiro; Enomoto, Hirofumi; Setou, Mitsutoshi

    2011-06-01

    It has become necessary to assess the authenticity of beef origin because of concerns regarding human health hazards. In this study, we used a metabolomic approach involving matrix-assisted laser desorption/ionization imaging mass spectrometry to assess the authenticity of beef origin. Highly accurate data were obtained for samples of extracted lipids from beef of different origin; the samples were grouped according to their origin. The analysis of extracted lipids in this study ended within 10 min, suggesting this approach can be used as a simple authenticity assessment before a definitive identification by isotope analysis.

  19. Short-Term Assessment of Risk and Treatability (START): systematic review and meta-analysis.

    Science.gov (United States)

    O'Shea, Laura E; Dickens, Geoffrey L

    2014-09-01

    This article describes a systematic review of the psychometric properties of the Short-Term Assessment of Risk and Treatability (START) and a meta-analysis to assess its predictive efficacy for the 7 risk domains identified in the manual (violence to others, self-harm, suicide, substance abuse, victimization, unauthorized leave, and self-neglect) among institutionalized patients with mental disorder and/or personality disorder. Comprehensive terms were used to search 5 electronic databases up to January 2013. Additional articles were located by examining references lists and hand-searching. Twenty-three papers were selected to include in the narrative review of START's properties, whereas 9 studies involving 543 participants were included in the meta-analysis. Studies about the feasibility and utility of the tool had positive results but lacked comparators. START ratings demonstrated high internal consistency, interrater reliability, and convergent validity with other risk measures. There was a lack of information about the variability of START ratings over time. Its use in an intervention to reduce violence in forensic psychiatric outpatients was not better than standard care. START risk estimates demonstrated strong predictive validity for various aggressive outcomes and good predictive validity for self-harm. Predictive validity for self-neglect and victimization was no better than chance, whereas evidence for the remaining outcomes is derived from a single, small study. Only 3 of the studies included in the meta-analysis were rated to be at a low risk of bias. Future research should aim to investigate the predictive validity of the START for the full range of adverse outcomes, using well-designed methodologies, and validated outcome tools. PMID:24796344

  20. Pre- analysis assessment of Sea Surface Temperature (SST) products in the region of Malaysian coastal water

    Science.gov (United States)

    Aziz, M. A. H.; Omar, K. M.; Din, A. H. M.; Reba, M. N. M.

    2016-06-01

    This paper presents a pre-analysis of the validation between acquired satellite data and in situ data. To carry out this assessment, Sea Surface Temperature (SST) data are regressed against in situ SST. With the launch of the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor on the Terra spacecraft, data sets of the global distribution of sea surface temperature are retrieved, and these need to be validated and analyzed. The Radar Altimeter Database System (RADS) also holds an archive of Optimal Interpolation SST (OISST) data that can be retrieved along the altimeter satellite track. The aim of this paper is to present an intercomparison study between the pixel-based (MODIS SST) and point-based (RADS SST) products. The root mean square error (rmse) is computed to assess and evaluate the performance of both data products. The objective of this paper is to evaluate the Malaysian coastal area through validation with in situ data. To achieve this objective, we perform a pre-analysis study of the MODIS and RADS SST products to examine the performance of both data sets in terms of spatial value during seasonal changes. However, the scope of this analysis covers only the spatial MODIS pixel values and the OISST point values during southwest monsoon daytime. From the results, RADS SST/RADS show higher rmse at 0.731/0.677 (before calibration) and 0.6951/0.476 (after calibration). From the rmse results, we deduce that the RADS SST has random error arising from the fact that the interpolated points are based on the satellite track.
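The rmse used in this record to compare a satellite SST product with in situ SST is computed as follows; the matched pairs below are invented for illustration:

```python
import numpy as np

def rmse(predicted, observed):
    """Root mean square error between satellite-retrieved and in situ SST."""
    predicted, observed = np.asarray(predicted, float), np.asarray(observed, float)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

# hypothetical matched SST pairs (deg C): satellite retrieval vs in situ buoy
sat = [29.1, 29.8, 30.2, 28.7, 29.5]
buoy = [29.4, 29.6, 30.9, 28.2, 29.9]
print(round(rmse(sat, buoy), 3))
```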

  1. Assessment of trace elements levels in patients with Type 2 diabetes using multivariate statistical analysis.

    Science.gov (United States)

    Badran, M; Morsy, R; Soliman, H; Elnimr, T

    2016-01-01

    Trace element metabolism has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Given the continuing increase in the population of patients with Type 2 diabetes (T2D), this study aims to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. The study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). Blood serum was digested and then used to determine the levels of 24 trace elements by inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical analysis based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA) was used to analyze the data. The results show significant changes in FBG and in the serum levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg, and As) in Type 2 diabetic patients relative to healthy controls. The multivariate techniques reduced the experimental variables and grouped the trace elements in patients into three clusters. PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu, and Zn, which appeared to be the factors most closely related to Type 2 diabetes. On this basis, the trace element content in Type 2 diabetic patients can be determined and specified through correlation and multivariate statistical analysis, confirming that alteration of some essential trace metals may play a role in the development of diabetes mellitus. PMID:26653752
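The multivariate workflow described (standardize the variables, form the correlation matrix, extract principal components) can be sketched with plain NumPy. The serum values and the four-element panel below are hypothetical placeholders, not the study's data:

```python
import numpy as np

# Hypothetical serum concentrations (rows: subjects; columns: Zn, Cu, Fe, Mg)
X = np.array([
    [80., 110.,  95., 20.],
    [75., 118.,  90., 18.],
    [92., 100., 105., 22.],
    [70., 125.,  85., 17.],
    [88., 104., 100., 21.],
])

# Standardize each variable, then diagonalize the correlation matrix (classic PCA)
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]      # sort components by variance explained
explained = eigvals[order] / eigvals.sum()
print(explained.round(3))              # share of total variance per component
```

Elements loading strongly on the same component (or falling in the same cluster) are candidates for a common metabolic association, which is the logic behind the Mg/Fe/Cu/Zn grouping reported above.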

  2. Unpacking High and Low Efficacy Teachers' Task Analysis and Competence Assessment in Teaching Low-Achieving Students in Secondary Schools

    Science.gov (United States)

    Wang, Li-Yi; Jen-Yi, Li; Tan, Liang-See; Tan, Irene; Lim, Xue-Fang; Wu, Bing Sheng

    2016-01-01

    This study adopted a pragmatic qualitative research design to unpack high and low efficacy teachers' task analysis and competence assessment in the context of teaching low-achieving students. Nine secondary school English and Science teachers were recruited and interviewed. Results of thematic analysis show that helping students perform well in…

  3. Safety Analysis in Design and Assessment of the Physical Protection of the OKG NPP

    International Nuclear Information System (INIS)

    OKG AB operates a three-unit nuclear power plant in the southern part of Sweden. As a result of recent developments in the legislation regarding physical protection of nuclear facilities, OKG has upgraded its protection against antagonistic actions. The new legislation includes requirements both on specific protective measures and on the performance of the physical protection as a whole; in short, the performance-related requirements state that sufficient measures shall be implemented to protect against antagonistic actions as defined by the regulator in the “Design Basis Threat” (DBT). Historically, physical protection and nuclear safety have largely been managed as separate issues with different, sometimes contradictory, objectives. Insights from the work on the security upgrade have now emphasized that physical protection needs to be regarded as an important part of the Defence-In-Depth (DiD) against nuclear accidents. Specifically, OKG has developed new DBT-based analysis methods, which may be characterized as probabilistically informed deterministic analysis, conforming to a format similar to the one used for conventional internal-events analysis. The result is a powerful tool for the design and assessment of the performance of the protection against antagonistic actions from a nuclear safety perspective. (author)

  4. Statistical Analysis of Meteorological Data to Assess Evapotranspiration and Infiltration at the Rifle Site, CO, USA

    Science.gov (United States)

    Faybishenko, B.; Long, P. E.; Tokunaga, T. K.; Christensen, J. N.

    2015-12-01

    Net infiltration to the vadose zone, especially in arid or semi-arid climates, is an important control on microbial activity and on solute and greenhouse gas fluxes. To assess net infiltration, we performed a statistical analysis of meteorological data as the basis for hydrological and climatic investigations and predictions for the Rifle site, Colorado, USA, located within a floodplain in a mountainous region along the Colorado River, with a semi-arid climate. We carried out a statistical analysis of a 30-year meteorological time series (1985-2015), including: (1) precipitation data, taking into account the evaluation of snowmelt, (2) evaluation of evapotranspiration (reference and actual), (3) estimation of the multi-scalar Standardized Precipitation-Evapotranspiration Index (SPEI), (4) evaluation of the net infiltration rate, and (5) corroborative analysis of the calculated net infiltration rate against groundwater recharge derived from radioisotopic measurements of samples collected in 2013. We determined that annual net infiltration varies from 4.7% to ~18% of precipitation, with a mean of ~10%, and concluded that calculations of net infiltration based on long-term meteorological data are comparable with those from strontium isotopic investigations. The SPEI evaluation showed an intermittent pattern of droughts and wet periods over the past 30 years, with a detectable decrease in the duration of droughts with time. Local measurements within the floodplain indicate a recharge gradient, with increased recharge closer to the Colorado River.
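The net-infiltration estimate rests on simple annual water-balance arithmetic: what precipitation does not leave as evapotranspiration or runoff infiltrates. A sketch with hypothetical numbers (chosen only to land near the ~10% mean reported for the site):

```python
# Illustrative annual water balance, values in mm/yr (not the site's measured data)
precipitation = 280.0
actual_et = 252.0          # actual evapotranspiration
runoff = 0.0               # assumed negligible on the floodplain terrace

net_infiltration = precipitation - actual_et - runoff
percent_of_precip = 100.0 * net_infiltration / precipitation
print(f"{net_infiltration:.1f} mm/yr = {percent_of_precip:.1f}% of precipitation")
```

In practice each term carries its own estimation procedure (snowmelt partitioning for precipitation, reference-to-actual conversion for ET), which is what the five-step analysis above addresses.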

  5. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment.

    Science.gov (United States)

    Boel, Annekatrien; Steyaert, Woutert; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-01-01

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome. PMID:27461955
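The per-sample mutagenesis efficiency such a tool reports is, at its core, the edited fraction of aligned reads at the target site. A toy illustration of that calculation (not BATCH-GE's actual code; the read labels are hypothetical stand-ins for real variant calls):

```python
def editing_efficiency(read_variants):
    """Fraction of sequencing reads carrying an edit at the target site.

    read_variants: one label per aligned read, 'wt' for wild-type or an
    indel/knock-in description (a simplified stand-in for real variant calls).
    """
    edited = sum(1 for v in read_variants if v != 'wt')
    return edited / len(read_variants)

# Ten hypothetical reads from one injected embryo
reads = ['wt', 'wt', 'del3', 'ins1', 'wt', 'del3', 'wt', 'wt', 'ins1', 'wt']
print(editing_efficiency(reads))  # 4 of 10 reads edited
```

Batch analysis then amounts to repeating this per amplicon and per sample, which is the bookkeeping the tool automates.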

  6. Assessment of Pollution into the Densu Delta Wetland Using Instrumental Neutron Activation Analysis

    Directory of Open Access Journals (Sweden)

    Dzifa Denutsui

    2011-11-01

    Full Text Available The objective of this work is to assess the levels of trace elements and the extent of pollution in surface water and sediments from the Densu Delta wetland using instrumental neutron activation analysis (INAA) and the pollution load index (PLI), and, with the help of statistical tools such as PCA and CA, to establish their common sources. Eight samples of surface water and eight samples of sediment were taken for elemental analysis. The samples were analyzed for physical and chemical parameters as well as trace elements (Al, Cu, V, Mn, Cd, Zn, Cr, Fe and As). The results showed that the surface water in the area can be described as slightly alkaline, with a pH of 7.5 and high conductivities of 150-3412 µS/cm with corresponding TDS of 80-1750 mg/L. Statistical analyses, namely principal component analysis (PCA) and cluster analysis (CA), were used to identify trace element pollution in the wetland area. Results from CA and PCA suggest positive relationships between the two analyses, with the trace elements identified as originating from a common source in all the analyses. The PLI calculated for sediments in the study area was found to be below the index limit of 1. This shows that the surface waters are not yet polluted with respect to trace elements, but the originality of the ecosystem is threatened.
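The PLI referred to above is conventionally computed as the geometric mean of per-element contamination factors (measured concentration over background). A minimal sketch; the element set, concentrations and background values are hypothetical, not the study's data:

```python
import math

def pollution_load_index(measured, background):
    """Tomlinson PLI: geometric mean of contamination factors (sample/background)."""
    cfs = [m / b for m, b in zip(measured, background)]
    return math.prod(cfs) ** (1.0 / len(cfs))

# Hypothetical sediment concentrations and background values, mg/kg
measured   = [12.0, 30.0, 450.0]   # e.g. Cu, Cr, Mn
background = [45.0, 90.0, 850.0]
pli = pollution_load_index(measured, background)
print(round(pli, 3))  # PLI < 1 suggests no overall trace-element pollution
```

A PLI below 1, as found for the Densu Delta sediments, indicates concentrations at or below baseline overall.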

  7. Assessment of Forest Conservation Value Using a Species Distribution Model and Object-based Image Analysis

    Science.gov (United States)

    Jin, Y.; Lee, D. K.; Jeong, S. G.

    2015-12-01

    The ecological and social values of forests have recently been highlighted. Assessments of the biodiversity of forests, as well as of their other ecological values, play an important role in regional and national conservation planning, and the preservation of habitats is linked to the protection of biodiversity. For mapping habitats, a species distribution model (SDM) is used to predict suitable habitat for significant species, and such distribution modeling is increasingly used in conservation science. However, pixel-based analysis carries no contextual or topological information; to provide more accurate habitat predictions, a continuous-field view that better approximates the real world is required. Here we analyze and compare, at different scales, habitats of the yellow-throated marten (Martes flavigula), a top predator and umbrella species in South Korea. The object scale, in which a group of pixels with similar spatial and spectral characteristics is treated as a unit, and the pixel scale were both used for the SDM. Our analysis at these different scales suggests that object-scale analysis provides a superior representation of continuous habitat, and will thus be useful in forest conservation planning as well as in species habitat monitoring.

  8. Economic analysis and assessment of syngas production using a modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei; Columbus, Eugene P.

    2011-08-10

    Economic analysis and modeling are essential and important issues for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a 60 Nm³ h⁻¹ capacity bio-gasifier, with the operating cost making up the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranked highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters, with the exception of loan life. The annual equipment, labor, feedstock, waste treatment, and utility costs showed a linear relationship with the percentage changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for economic analysis and assessment of syngas production using a modeling approach.
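The parametric structure described (annualized capital plus annual operating cost, divided by annual output) can be sketched as follows. All figures are hypothetical placeholders, not the measured data from the Mississippi State facility, and the original model was written in C++ rather than Python:

```python
def annualized_capital(capital_cost, interest_rate, loan_life_years):
    """Spread a lump-sum capital cost over the loan life (capital recovery factor)."""
    r, n = interest_rate, loan_life_years
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    return capital_cost * crf

def syngas_unit_cost(capital_cost, annual_operating_cost, interest_rate,
                     loan_life_years, capacity_nm3_per_h, annual_hours):
    """Unit cost per Nm3 = (annualized capital + operating cost) / annual output."""
    annual_total = (annualized_capital(capital_cost, interest_rate, loan_life_years)
                    + annual_operating_cost)
    return annual_total / (capacity_nm3_per_h * annual_hours)

# Hypothetical inputs for a 60 Nm3/h gasifier
cost = syngas_unit_cost(capital_cost=250_000, annual_operating_cost=180_000,
                        interest_rate=0.06, loan_life_years=10,
                        capacity_nm3_per_h=60, annual_hours=6000)
print(round(cost, 3))
```

The capital recovery factor also explains the non-linear sensitivity to loan life and interest rate noted above, since both enter the formula through the compounding term.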

  9. Seismic fragility assessment of concrete gravity dams using nonlinear dynamic analysis with massed foundation

    Energy Technology Data Exchange (ETDEWEB)

    Ghaemian, M.; MirzahosseinKashani, S. [Sharif Univ. of Technology, Tehran (Iran, Islamic Republic of). Dept. of Civil Engineering

    2010-07-01

    A significant concern for dam owners is keeping concrete gravity dams in good operating condition; for example, these dams should be able to continue to operate after a disaster such as an earthquake. However, many of these dams are older structures, and some are located near faults, so there are concerns regarding their performance under seismic loads. This paper developed seismic fragility curves for concrete gravity dams using nonlinear dynamic analysis and a continuum crack propagation (smeared crack) model, with the Pine Flat dam used for the calculations. Specifically, the paper presented the fragility analysis and probabilistic safety assessment as well as the structural modeling of dam behaviour using the smeared crack model. The finite element model of the dam was also presented, and the results of the nonlinear dynamic analysis were discussed. It was concluded that a seismic fragility curve based on the length of a crack at the base indicates higher probability than one based on areas of cracked elements. Seismic fragility curves based on the areas of cracked elements should therefore be a more realistic approach, especially when they account for areas of damaged elements at the neck. 10 refs., 5 tabs., 5 figs.

  11. Extraneous carbon assessment in ultra-microscale radiocarbon analysis using benzene polycarboxylic acids (BPCA)

    Science.gov (United States)

    Hanke, Ulrich M.; McIntyre, Cameron P.; Schmidt, Michael W. I.; Wacker, Lukas; Eglinton, Timothy I.

    2016-04-01

    Measurements of the natural abundance of radiocarbon (14C) in inorganic and organic carbon-containing materials can be used to investigate their date of origin. In particular, the biogeochemical cycling of specific compounds in the environment may be investigated by applying molecular marker analyses. However, the isolation of specific molecules from environmental matrices requires a complex processing procedure resulting in small sample sizes that often contain less than 30 μg C. Such small samples are sensitive to extraneous carbon (Cex) that is introduced during the purification of the compounds (Shah and Pearson, 2007). We present a thorough radiocarbon blank assessment for benzene polycarboxylic acids (BPCA), a proxy for combustion products that are formed during the oxidative degradation of condensed polyaromatic structures (Wiedemeier et al., in press). The extraneous carbon assessment includes reference material for (1) chemical extraction, (2) preparative liquid chromatography and (3) wet chemical oxidation, which are subsequently measured with gas-ion-source AMS (Accelerator Mass Spectrometer, 5-100 μg C). We always use pairs of reference materials, radiocarbon-depleted (14Cfossil) and modern (14Cmodern), to determine the fraction modern (F14C) of Cex. Our results include detailed information about the quantification of Cex in radiocarbon molecular marker analysis using BPCA. Error propagation calculations indicate that ultra-microscale samples (20-30 μg) are feasible with uncertainties of less than 10%. Calculations of the constant contamination reveal important information about the source (F14C) and mass (μg) of Cex (Wacker and Christl, 2011) for each sub-procedure. An external correction of compound-specific radiocarbon data is essential for robust results that allow a high degree of confidence in the 14C results. References Shah and Pearson, 2007. Ultra-microscale (5-25μg C) analysis of individual lipids by 14C AMS: Assessment and
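The external blank correction alluded to is commonly expressed as an isotope mass balance between the true sample and the constant contaminant. A minimal sketch of that algebra (the masses and F14C values below are hypothetical, and real corrections also propagate uncertainties, which this sketch omits):

```python
def blank_corrected_f14c(f_measured, m_measured, f_blank, m_blank):
    """Correct a measured fraction modern (F14C) for constant extraneous carbon.

    Isotope mass balance:
        F_meas * m_meas = F_sample * m_sample + F_blank * m_blank,
    with m_sample = m_meas - m_blank; solve for F_sample.
    """
    m_sample = m_measured - m_blank
    return (f_measured * m_measured - f_blank * m_blank) / m_sample

# Hypothetical numbers: a 25 ug C sample measured at F14C = 0.50 carrying
# 1 ug of extraneous carbon with F14C = 0.8 (illustration only)
print(round(blank_corrected_f14c(0.50, 25.0, 0.8, 1.0), 4))
```

Running the paired fossil/modern reference materials through each sub-procedure is what pins down F_blank and m_blank in this expression.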

  12. Expert assessments and content analysis of crew communication during ISS missions

    Science.gov (United States)

    Yusupova, Anna

    During the last seven years, we have analyzed the communication patterns between ISS crewmembers and mission control personnel and identified a number of distinct communication styles between these two groups (Gushin et al., 2005). In this paper, we report on an external validity check that compares our findings with those of another study using the same research material. For many years the group of psychologists at the Medical Center of Space Flight Control (TCUMOKO) at the Institute for Biomedical Problems (IBMP) in Moscow has been analyzing audio communication sessions between Russian space crews and ground-based mission control during long-duration spaceflight. We compared, week by week, the texts of the standard weekly monitoring reports made by the TsUP psychological group and the audio communication of space crews with mission control centers. Expert assessments of the crewmembers' psychological state are made by IBMP psychoneurologists on the basis of daily schedule fulfillment, video and audio materials, and psychophysiological data from on board; the second approach was based on crew-ground communication analysis. For both populations of messages we applied two corresponding content analysis schemas. All statements made in communication sessions and weekly reports were divided into three groups in terms of their communicative function (Lomov, 1981): 1) informative (e.g., demands for information, requests, professional slang); 2) socio-regulatory (e.g., rational consent or discord, operational complaints, refusal to cooperate); and 3) affective (emotional) (e.g., encouragement, sympathy, emotional consent or discord). The number of statements in the audio communication sessions correlated with the corresponding functions (informative, regulatory, affective) in the weekly monitoring reports made by experts. Crewmembers' verbal behavior expresses their psycho-emotional state, which is formulated by expert

  13. Life cycle assessment of fossil and biomass power generation chains. An analysis carried out for ALSTOM Power Services

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Ch.

    2008-12-15

    This final report from the Technology Assessment Department of the Paul Scherrer Institute (PSI) presents the results of an analysis carried out on behalf of the Alstom Power Services company. Fossil and biomass chains as well as co-combustion power plants are assessed. The general objective of this analysis is an evaluation of the specific as well as overall environmental burdens resulting from these different options for electricity production. The results obtained for fuel chains including hard coal, lignite, wood, natural gas and synthetic natural gas are discussed, an overall comparison is made, and the conclusions drawn from the results of the analysis are presented.

  14. Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marten, Alex; Kopp, Robert E.; Shouse, Kate C.; Griffiths, Charles; Hodson, Elke L.; Kopits, Elizabeth; Mignone, Bryan K.; Moore, Chris; Newbold, Steve; Waldhoff, Stephanie T.; Wolverton, Ann

    2013-04-01

    to updating the estimates regularly as modeling capabilities and scientific and economic knowledge improve. To help foster further improvements in estimating the SCC, the U.S. Environmental Protection Agency and the U.S. Department of Energy hosted a pair of workshops on “Improving the Assessment and Valuation of Climate Change Impacts for Policy and Regulatory Analysis.” The first focused on conceptual and methodological issues related to integrated assessment modeling, and the second brought together natural and social scientists to explore methods for improving damage assessment for multiple sectors. These two workshops provide the basis for the 13 papers in this special issue.

  15. Digital Cartographic Models as Analysis Support in Multicriterial Assessment of Vulnerable Flood Risk Elements

    Science.gov (United States)

    Nichersu, Iulian; Mierla, Marian; Trifanov, Cristian; Nichersu, Iuliana; Marin, Eugenia; Sela, Florentina

    2014-05-01

    In the last 20 years there has been an increase in the frequency of extreme weather and hydrological events. This increase raises the need to research the risk posed by extreme events with large environmental impacts. This paper presents a method for analyzing the elements vulnerable to risk from an extreme hydrological event, more precisely from floods, making use of LiDAR point clouds. The risk concept has two main components: hazard (represented by the frequency of occurrence and the intensity of the flood) and vulnerability (represented by the elements vulnerable to the flood). The study area is situated in the south-east of Europe (Romania, Danube Delta). The digital cartographic models were produced using LiDAR data obtained within the CARTODD project. These high-resolution models consist of three components: a digital terrain model (DTM), a digital elevation model (DEM) and elevation classes (EC); to complete the information of the three models, orthophotos in the visible (VIS) and infrared (IR) spectral slices were also used. The Digital Terrain Model gives information on the altitude of the terrain and, indirectly, on the flood hazard, given the high resolution of the final product. The Digital Elevation Model supplies information on the terrain surface plus the altitude of each object on it, and leads to the third model, the Elevation Classes model. We present three categories of point cloud analyses applied in flood risk assessment: building assessment, endangered species listed in Annex 1 of the European Habitats Directive, and morphologic/habitat damage. Pilot case studies of these applications are Sulina town; endangered species such as Osmoderma eremita, Vipera ursinii and Spermophilus citellus; and Sireasa Polder. For Sulina town were assessed the man-made elements vulnerable to

  16. Spiritual Assessment within Clinical Interventions Focused on Quality of Life Assessment in Palliative Care: A Secondary Analysis of a Systematic Review

    Directory of Open Access Journals (Sweden)

    Gianluca Catania

    2016-03-01

    Full Text Available One of the most crucial palliative care challenges is determining how patients’ needs are defined and assessed. Although physical and psychological needs are commonly documented in patients’ charts, spiritual needs are less frequently reported. The aim of this review was to determine whether explicit, longitudinal documentation of spiritual concerns would sufficiently affect clinical care to alleviate spiritual distress or promote spiritual wellbeing. A secondary analysis of a systematic review, originally aimed at appraising the effectiveness of complex interventions focused on quality of life (QoL) in palliative care, was conducted. Five databases were searched for articles reporting interventions focused on QoL that included at least two QoL dimensions, and a narrative synthesis was performed to synthesize findings. In total, 10 studies were included, of which only three included an assessment of spiritual wellbeing. The spirituality tools used to assess spiritual wellbeing differed between studies: the Hospital QoL Index 14; the Spiritual Needs Inventory; the Missoula-Vitas QoL Index; and the Needs Assessment Tool: Progressive Disease-Cancer. Only one study reported a training session for healthcare professionals in the use of the QoL tool. Two of the three studies showed an improvement in participants’ spiritual wellbeing, but the changes in spiritual wellbeing scores were not significant. Overall, patients receiving interventions focused on QoL assessment experienced improvements both in their QoL and in their spiritual needs. Although the spiritual changes were not significant, the results provide evidence that a spiritual need exists and that spiritual care should be appropriately planned and delivered. Spiritual needs assessment precedes spiritual caring. It is essential that interventions focused on QoL assessment in palliative care include training on how to conduct a spiritual assessment and appropriate interventions to be offered to patients to address their

  17. Problems and prospects of modern methods of business analysis in the process of assessment of solvency of borrowers

    Directory of Open Access Journals (Sweden)

    Aptekar Saveliy S.

    2013-03-01

    Full Text Available The goal of the article is a comparative analysis of modern methods of business analysis used in assessing the solvency of borrowers of Ukrainian commercial banks, together with a study of the prospects and problems of applying these methods in the credit process. The study systematizes and examines the conduct of the credit process in Ukrainian commercial banks. It makes clear that a single assessment of a borrower's solvency cannot be obtained by generalizing numerical and non-numerical data: beyond the information represented in numbers, a justified assessment of solvency requires the judgment of qualified analysts. Improving approaches to assessing borrower solvency, and adapting existing foreign experience in this field to the specific features of Ukrainian borrowers, are important tasks for the Ukrainian banking system. Prospects for further study in this direction lie in establishing the importance of business analysis and its key role in assessing borrower solvency as a main instrument for minimizing credit risk. Improvement of this sphere of analytical work in Ukrainian banks should proceed in the following main directions: study and analysis of qualitative indicators of business activity; analysis of the main sections of the business plan; expansion of the set of financial analysis indicators used; analysis of possible sources of repayment of loan liabilities; and active use of analysis of an enterprise's cash flows.

  18. Development of a quantitative morphological assessment of toxicant-treated zebrafish larvae using brightfield imaging and high-content analysis.

    Science.gov (United States)

    Deal, Samantha; Wambaugh, John; Judson, Richard; Mosher, Shad; Radio, Nick; Houck, Keith; Padilla, Stephanie

    2016-09-01

    One of the rate-limiting procedures in a developmental zebrafish screen is the morphological assessment of each larva. Most researchers opt for a time-consuming, structured visual assessment by trained human observer(s). The present studies were designed to develop a more objective, accurate and rapid method for screening zebrafish for dysmorphology. Instead of the very detailed human assessment, we have developed the computational malformation index, which combines high-content imaging with a very brief human visual assessment. Each larva was quickly assessed by a human observer (basic visual assessment), killed, fixed and assessed for dysmorphology with the Zebratox V4 BioApplication using the Cellomics® ArrayScan® V(TI) high-content image analysis platform. The basic visual assessment adds in-life parameters, and the high-content analysis assesses each individual larva for various features (total area, width, spine length, head-tail length, length-width ratio, perimeter-area ratio). In developing the computational malformation index, a training set of hundreds of embryos treated with hundreds of chemicals was visually assessed using the basic or detailed method. In the second phase, we assessed both the stability of these high-content measurements and their performance using a test set of zebrafish treated with a dose range of two reference chemicals (trans-retinoic acid or cadmium). We found the measures were stable for at least 1 week, and comparison of these automated measures to detailed visual inspection of the larvae showed excellent congruence. Our computational malformation index provides an objective means of rapid phenotypic brightfield assessment of individual larvae in a developmental zebrafish assay. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26924781
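One simple way to fold several morphometric features into a single dysmorphology score, in the spirit of the index described (though not the authors' actual formula), is to sum each larva's absolute z-scores against untreated controls. The feature statistics below are hypothetical:

```python
def malformation_index(larva, control_mean, control_sd):
    """Sum of absolute z-scores of morphometric features vs. untreated controls."""
    return sum(abs((larva[k] - control_mean[k]) / control_sd[k]) for k in larva)

# Hypothetical control statistics (arbitrary units), not the published values
control_mean = {"total_area": 1.00, "spine_length": 3.5, "length_width_ratio": 7.0}
control_sd   = {"total_area": 0.05, "spine_length": 0.2, "length_width_ratio": 0.5}

treated = {"total_area": 0.90, "spine_length": 3.0, "length_width_ratio": 5.5}
print(malformation_index(treated, control_mean, control_sd))  # larger = more dysmorphic
```

A threshold on such a score then separates morphologically normal from malformed larvae without per-feature visual scoring.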

  19. Advances in exergy analysis: a novel assessment of the Extended Exergy Accounting method

    International Nuclear Information System (INIS)

    Highlights: • General overview of exergy-based methods for system analysis is presented. • Taxonomy for exergy-based methods classification is proposed. • Theoretical foundations and details of Extended Exergy Accounting are described. - Abstract: Objective: This paper presents a theoretical reassessment of the Extended Exergy Accounting method (EEA in the following), a comprehensive exergy-based analytical paradigm for the evaluation of the total equivalent primary resource consumption in a generic system. Our intent in this paper is to rigorously review the EEA theory, to highlight its double “valence” as a resource quantifier, and to clarify its operative potential. On the one hand, EEA can be properly regarded as a general “costing” theory based on a proper knowledge of the cumulative exergy consumption of different supply chains, economic systems and the labour market: it is indeed the only method that translates externalities (capital, labour and environmental remediation) into cumulative exergetic costs and thus allows for their rigorous inclusion in a comprehensive resource cost assessment. Indeed, the extended exergy cost eec reflects both the thermodynamic “efficiency” of the production chain and the “hidden” resource costs for the society as a whole. On the other hand, from a perhaps even more innovative perspective, EEA can be viewed as a space- and time-dependent methodology, since economic and labour costs can only be included in the Extended Exergy balance via their exergy equivalents (via two rigorously defined postulates). Since the equivalent exergy cost of the externalities depends both on the type of society and on the time window of the analysis, the extended exergy cost eec reflects in a very real sense both the thermodynamic “efficiency” of the machinery and the “conversion efficiency” of the specific society within which the analysis is performed. We argue that these two intrinsic features of the EEA method provide both

  20. Crowd-based breath analysis: assessing behavior, activity, exposures, and emotional response of people in groups.

    Science.gov (United States)

    Williams, Jonathan; Pleil, Joachim

    2016-01-01

    A new concept for exhaled breath analysis has emerged wherein groups, or even crowds, of people are simultaneously sampled in enclosed environments to detect overall trends in their activities and recent exposures. The basic idea is to correlate the temporal profile of known breath markers such as carbon dioxide, isoprene, or acetone with all other volatile organics in the air space. Those that trend similarly in time are designated as breath constituents. The ultimate goal of this work is to develop technology for assessing group-based behaviors, chemical exposures, or even changes in stress or mood. Applications are myriad, ranging from chemical dose/toxicity screening to health and stress status for national security diagnostics. The basic technology employs real-time mass spectrometry capable of simultaneously measuring volatile chemicals and endogenous breath markers. PMID:27341381
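    The marker-correlation idea described above can be sketched in a few lines: flag an air-space volatile as breath-derived when its time series tracks an endogenous marker such as CO2. The synthetic traces, the threshold, and the function name below are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

# Minimal sketch: designate a room-air volatile as a "breath constituent" when
# its temporal profile correlates strongly with an endogenous marker (CO2).
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
co2 = 400 + 50 * np.sin(0.5 * t)                              # marker trace (ppm, synthetic)
voc_breath = 2 + 0.1 * np.sin(0.5 * t) + 0.01 * rng.standard_normal(t.size)
voc_ambient = 5 + 0.05 * rng.standard_normal(t.size)          # uncorrelated background

def is_breath_constituent(voc, marker, threshold=0.8):
    """Flag a volatile as breath-derived if |Pearson r| with the marker exceeds threshold."""
    r = np.corrcoef(voc, marker)[0, 1]
    return abs(r) >= threshold, r

breath_flag, r_breath = is_breath_constituent(voc_breath, co2)
ambient_flag, r_ambient = is_breath_constituent(voc_ambient, co2)
```

    A real deployment would screen many chemicals from a real-time mass spectrometry trace and could use more robust similarity measures than plain Pearson correlation.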

  1. Assessment of neural network, frequency ratio and regression models for landslide susceptibility analysis

    Science.gov (United States)

    Pradhan, B.; Buchroithner, M. F.; Mansor, S.

    2009-04-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis on Penang Island, Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Nine landslide-related factors were extracted from the spatial database, and the neural network, frequency ratio and logistic regression coefficients of each factor were computed. Landslide susceptibility maps were drawn for the study area using the neural network, frequency ratio and logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the frequency ratio model provides higher prediction accuracy than the ANN and regression models.
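    As a hedged sketch of one of the three models: the frequency ratio (FR) for a class of a causative factor is the share of landslide cells falling in that class divided by the class's share of the total area; values above 1 mark classes over-represented among landslides. The slope classes and cell counts below are invented for illustration.

```python
# Frequency ratio sketch for a single causative factor (slope angle).
slope_classes = ["0-15", "15-30", ">30"]
landslide_cells = {"0-15": 5, "15-30": 30, ">30": 65}   # cells containing landslides
area_cells = {"0-15": 500, "15-30": 300, ">30": 200}    # all cells in each class

total_slides = sum(landslide_cells.values())
total_area = sum(area_cells.values())

frequency_ratio = {
    c: (landslide_cells[c] / total_slides) / (area_cells[c] / total_area)
    for c in slope_classes
}

# A cell's susceptibility index is the sum of the FRs of its factor classes
# across all nine factors; with a single factor it is just that class's FR.
```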

  2. Assessment of shock capturing schemes for resonant flows in nonlinear instability analysis

    Science.gov (United States)

    Przekwas, A. J.; Yang, H. Q.; Mcconnaughey, P.; Tucker, K.

    1990-01-01

    The paper presents a computational assessment of advanced numerical schemes for nonlinear acoustic problems related to combustion instabilities in liquid rocket engines. Several time-accurate, shock-capturing schemes have been evaluated on a benchmark closed-end resonant pipe flow problem. It involves the numerical solution of the inviscid, compressible gas dynamics equations to predict acoustic wave propagation, wave steepening, formation of shocks, acoustic energy dissipation and wave-wall reflection over several hundred wave cycles. It was demonstrated that high-accuracy TVD-type schemes can be used for direct, exact nonlinear analysis of combustion instability problems, preserving high harmonic energy content for long periods of time. The selected scheme was then applied to analyze the acoustic responses of resonant pipe-resonator, radial acoustic mode and hub-baffle configurations. Interesting observations of wave shape and damping characteristics have been drawn from the presented computational studies.
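    A minimal 1-D illustration of the shock-capturing property being assessed (not the paper's actual rocket-engine solver): a minmod-limited TVD scheme for linear advection, which transports a square wave without spurious over- or undershoots.

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: smaller-magnitude difference when signs agree, else zero."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def step_tvd(u, cfl):
    """One conservative TVD step for u_t + a u_x = 0 (a > 0), periodic grid."""
    du_left = u - np.roll(u, 1)
    du_right = np.roll(u, -1) - u
    slope = minmod(du_left, du_right)
    # reconstructed value at each cell's right face, upwinded for a > 0
    u_face = u + 0.5 * (1.0 - cfl) * slope
    flux = cfl * u_face                     # numerical flux through the right face
    return u - (flux - np.roll(flux, 1))    # conservative flux-difference update

n, cfl = 200, 0.5
cells = np.arange(n)
u = np.where((cells > 50) & (cells < 100), 1.0, 0.0)   # square wave, 49 cells wide
u0_max, u0_min = u.max(), u.min()
for _ in range(400):
    u = step_tvd(u, cfl)
```

    Because the scheme is total-variation diminishing for CFL numbers between 0 and 1, the extrema of the initial data are never exceeded, and the flux-difference form conserves the integral of u exactly.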

  3. Analysis report for WIPP colloid model constraints and performance assessment parameters

    Energy Technology Data Exchange (ETDEWEB)

    Mariner, Paul E.; Sassani, David Carl

    2014-03-01

    An analysis of the Waste Isolation Pilot Plant (WIPP) colloid model constraints and parameter values was performed. The focus of this work was primarily on intrinsic colloids, mineral fragment colloids, and humic substance colloids, with a lesser focus on microbial colloids. Comments by the US Environmental Protection Agency (EPA) concerning intrinsic Th(IV) colloids and Mg-Cl-OH mineral fragment colloids were addressed in detail, assumptions and data used to constrain colloid model calculations were evaluated, and inconsistencies between data and model parameter values were identified. This work resulted in a list of specific conclusions regarding model integrity, model conservatism, and opportunities for improvement related to each of the four colloid types included in the WIPP performance assessment.

  4. Multifractal analysis of surface EMG signals for assessing muscle fatigue during static contractions

    Institute of Scientific and Technical Information of China (English)

    WANG Gang; REN Xiao-mei; LI Lei; WANG Zhi-zhong

    2007-01-01

    This study assessed muscle fatigue during static contraction using multifractal analysis and found that surface electromyographic (SEMG) signals exhibit multifractality during a static contraction. By applying the method of direct determination of the f(α) singularity spectrum, the area of the multifractal spectrum of the SEMG signals was computed. The results showed that the spectrum area significantly increased during muscle fatigue; the area could therefore be used as an indicator of muscle fatigue. Compared with the median frequency (MDF), the most popular indicator of muscle fatigue, the spectrum area presented here showed higher sensitivity during a static contraction. The singularity spectrum area is thus considered a more effective indicator than the MDF for estimating muscle fatigue.

  5. Gene set analysis for GWAS: assessing the use of modified Kolmogorov-Smirnov statistics.

    Science.gov (United States)

    Debrabant, Birgit; Soerensen, Mette

    2014-10-01

    We discuss the use of modified Kolmogorov-Smirnov (KS) statistics in the context of gene set analysis and review corresponding null and alternative hypotheses. Especially, we show that, when enhancing the impact of highly significant genes in the calculation of the test statistic, the corresponding test can be considered to infer the classical self-contained null hypothesis. We use simulations to estimate the power for different kinds of alternatives, and to assess the impact of the weight parameter of the modified KS statistic on the power. Finally, we show the analogy between the weight parameter and the genesis and distribution of the gene-level statistics, and illustrate the effects of differential weighting in a real-life example.
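    The modified KS statistic discussed here can be sketched as a weighted running sum over the ranked gene list: step up at gene-set members (weighted by |gene statistic| raised to a power p) and down otherwise, and take the maximum deviation. Setting p = 0 recovers the classical unweighted KS form, while p > 0 enhances the impact of highly significant genes. The toy statistics and set membership below are illustrative.

```python
import numpy as np

def weighted_ks(gene_stats, in_set, p=1.0):
    """Weighted KS-like enrichment score over a ranked gene list (p=0: classical KS)."""
    order = np.argsort(-np.abs(gene_stats))      # rank by decreasing |statistic|
    stats = np.abs(gene_stats)[order]
    hits = np.asarray(in_set)[order]
    w = stats ** p
    up = np.where(hits, w, 0.0)
    up = up / up.sum()                           # hit increments sum to 1
    down = np.where(hits, 0.0, 1.0 / (~hits).sum())
    running = np.cumsum(up - down)               # running sum ends at 0
    return running[np.argmax(np.abs(running))]   # signed maximum deviation

stats = np.array([3.0, 2.5, 2.0, 0.4, 0.3, 0.2, 0.1, 0.05])
membership = np.array([True, False, True, False, False, True, False, False])
es_weighted = weighted_ks(stats, membership, p=1.0)
es_classic = weighted_ks(stats, membership, p=0.0)
```

    With the set members concentrated among the large statistics, the weighted score exceeds the classical one, illustrating how the weight parameter shifts power toward alternatives driven by a few highly significant genes.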

  6. Assessment of Smolt Condition for Travel Time Analysis Project, 1987-1997 Project Review.

    Energy Technology Data Exchange (ETDEWEB)

    Schrock, Robin M.; Hans, Karen M.; Beeman, John W. [US Geological Survey, Western Fisheries Research Center, Columbia River Research Laboratory, Cook, WA

    1997-12-01

    The Assessment of Smolt Condition for Travel Time Analysis Project (Bonneville Power Administration Project 87-401) monitored attributes of salmonid smolt physiology in the Columbia and Snake River basins from 1987 to 1997, under the Northwest Power Planning Council Fish and Wildlife Program, in cooperation with the Smolt Monitoring Program of the Fish Passage Center. The primary goal of the project was to investigate the physiological development of juvenile salmonids related to migration rates. The assumption was made that the level of smolt development, interacting with environmental factors such as flow, would be reflected in travel times. The Fish Passage Center applied the physiological measurements of smolt condition to Water Budget management, to regulate flows so as to decrease travel time and increase survival.

  7. Assessment of TEES® applications for Wet Industrial Wastes: Energy benefit and economic analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, D.C.; Scheer, T.H.

    1992-02-01

    Fundamental work in catalyzed biomass pyrolysis/gasification led to the Thermochemical Environmental Energy System (TEES®) concept, a means of converting moist biomass feedstocks to high-value fuel gases such as methane. A low-temperature (350 °C), pressurized (3100 psig) reaction environment and a nickel catalyst are used to reduce volumes of very high-moisture wastes such as food processing byproducts while producing useful quantities of energy. A study was conducted to assess the economic viability of a range of potential applications of the process. Cases examined included feedstocks of cheese whey, grape pomace, spent grain, and an organic chemical waste stream. The analysis indicated that only the organic chemical waste process is economically attractive in the existing energy/economic environment. However, the food processing cases will become attractive as alternative disposal practices are curtailed and energy prices rise.

  8. Assessment of the Effective Prestress Force on Bonded Tendon by a Finite Element Analysis

    International Nuclear Information System (INIS)

    Bonded tendons have been used in the containment buildings, which house the nuclear reactors, of heavy water and light water reactors at several nuclear power plants operated in Korea. The assessment of the effective prestress force on these bonded tendons is becoming an important issue in assuring their continued operation beyond their design lifetime. To date, an indirect method has been adopted to evaluate the prestress force on the bonded tendons of containment buildings, using test beams manufactured at the time of construction. To complement the indirect method, a system identification (SI) technique was developed in preliminary research that mainly focused on the 1:4 scale prestressed concrete containment vessel tested by Sandia National Laboratories in 2000. This paper therefore carries out a finite element (FE) analysis to evaluate the effective prestress force of a bonded tendon using the SI technique.

  9. Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Sabey, Nickolas J.

    2014-01-01

    The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.

  10. Computational psycholinguistic analysis and its application in psychological assessment of college students

    Directory of Open Access Journals (Sweden)

    Kučera Dalibor

    2015-06-01

    Full Text Available The paper deals with the issue of computational psycholinguistic analysis (CPA) and its experimental application in basic psychological and pedagogical assessment. CPA is a new method which may potentially provide interesting, psychologically relevant information about the author of a particular text, regardless of the text’s factual (semantic) content and without the need to obtain additional materials. As part of our QPA-FPT research we studied the link between the linguistic form of texts by Czech college students and their personality characteristics obtained from a psychodiagnostic test battery. The article also discusses the basis of the method, opportunities for practical application and potential use within psychological and pedagogical disciplines.

  11. Analysis of Temporal Effects in Quality Assessment of High Definition Video

    Directory of Open Access Journals (Sweden)

    M. Slanina

    2012-04-01

    Full Text Available The paper deals with the temporal properties of a scoring session when assessing the subjective quality of full HD video sequences using continuous video quality tests. The performed experiment uses a modification of the standard test methodology described in ITU-R Rec. BT.500. It focuses on the reaction times and the time needed for user ratings to stabilize at the beginning of a video sequence. In order to compare the subjective scores with objective quality measures, we also provide an analysis of PSNR and VQM for the considered sequences, finding that the correlation of the objective metric results with user scores recorded during playback and after playback differs significantly.

  12. NMR analysis of cracking products of asphalt and assessment of catalyst performance

    International Nuclear Information System (INIS)

    This paper reports the 13C nuclear magnetic resonance (NMR) analysis of crackates obtained from asphalt cracking in a micro autoclave under nitrogen. The cracking was carried out in the presence of Zeolite Socony Mobil no. 5 (HZSM-5) and a cheap, locally available clay, Utimanzai Clay (UTIMAC), as catalysts. The crackate obtained from each run was analyzed by 13C NMR spectrometry using CDCl3 as the dissolving solvent and tetramethylsilane (TMS) as the internal standard. The 13C NMR data were used to assess the extent of hydrocracking and the degree of branching in the crackates from asphalt. The results indicate that the cheap local catalyst has suitability comparable to the conventional expensive catalyst in terms of asphalt cracking and its conversion to light products enriched with bulk n-alkane configurations.

  13. Initial assessment of facial nerve paralysis based on motion analysis using an optical flow method.

    Science.gov (United States)

    Samsudin, Wan Syahirah W; Sundaraj, Kenneth; Ahmad, Amirozi; Salleh, Hasriah

    2016-01-01

    An initial assessment method is proposed that can classify facial paralysis and categorize its severity into one of six levels according to the House-Brackmann (HB) system, based on facial landmark motion measured using an Optical Flow (OF) algorithm. The desired landmarks were obtained from video recordings of 5 normal and 3 Bell's Palsy subjects and tracked using the Kanade-Lucas-Tomasi (KLT) method. A new scoring system based on motion analysis using area measurement is proposed. This scoring system uses the individual scores from the facial exercises and grades the paralysis according to the HB system. The proposed method has obtained promising results and may play a pivotal role in improved rehabilitation programs for patients.
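    An area-based motion score of the kind described can be illustrated with the shoelace formula: compare the areas enclosed by landmark polygons on the healthy and affected sides of the face during an exercise. The polygons and the symmetry ratio below are hypothetical, not the paper's exact scoring rules.

```python
def polygon_area(points):
    """Shoelace formula for the area of a simple polygon given as [(x, y), ...]."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Regions traced by, e.g., mouth-corner landmarks during a smile exercise.
healthy_side = [(0, 0), (4, 0), (4, 3), (0, 3)]         # area 12
affected_side = [(0, 0), (2, 0), (2, 1.5), (0, 1.5)]    # area 3 (reduced motion)

symmetry_ratio = polygon_area(affected_side) / polygon_area(healthy_side)
# Lower ratios would map to more severe House-Brackmann grades.
```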

  14. Human factors assessment in PRA using Task Analysis Linked Evaluation Technique (TALENT)

    International Nuclear Information System (INIS)

    Thirty years ago the US military and US aviation industry, and more recently, in response to the US Three Mile Island and USSR Chernobyl accidents, the US commercial nuclear power industry, acknowledged that human error, as an immediate precursor, and as a latent or indirect influence in the form of training, maintainability, inservice test, and surveillance programs, is a primary contributor to unreliability and risk in complex high-reliability systems. A 1985 Nuclear Regulatory Commission (NRC) study of Licensee Event Reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Despite the magnitude and nature of human error cited in that study, there has been limited attention to personnel-centered issues, especially person-to-person issues involving group processes, management and organizational environment. The paper discusses NRC integration and applications research with respect to the Task Analysis Linked Evaluation Technique (TALENT) in risk assessment applications.

  15. Risk assessment for enterprise resource planning (ERP) system implementations: a fault tree analysis approach

    Science.gov (United States)

    Zeng, Yajun; Skibniewski, Miroslaw J.

    2013-08-01

    Enterprise resource planning (ERP) system implementations are often characterised with large capital outlay, long implementation duration, and high risk of failure. In order to avoid ERP implementation failure and realise the benefits of the system, sound risk management is the key. This paper proposes a probabilistic risk assessment approach for ERP system implementation projects based on fault tree analysis, which models the relationship between ERP system components and specific risk factors. Unlike traditional risk management approaches that have been mostly focused on meeting project budget and schedule objectives, the proposed approach intends to address the risks that may cause ERP system usage failure. The approach can be used to identify the root causes of ERP system implementation usage failure and quantify the impact of critical component failures or critical risk events in the implementation process.
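    The gate arithmetic behind such a fault tree can be sketched as follows, assuming independent basic events; the tree structure and probabilities are invented for illustration, not taken from the paper.

```python
# Fault tree gate combinators under the independence assumption.
def or_gate(*probs):
    """OR gate: P = 1 - prod(1 - p_i)."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

def and_gate(*probs):
    """AND gate: P = prod(p_i)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Hypothetical ERP usage-failure tree: the top event occurs if data migration
# fails, OR if user training AND vendor support are both inadequate.
p_migration = 0.05
p_training = 0.20
p_support = 0.10
p_top = or_gate(p_migration, and_gate(p_training, p_support))
```

    Minimal cut sets of the tree (here {migration} and {training, support}) identify the root-cause combinations whose probabilities dominate the top-event estimate.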

  16. Reliability assessment of thrust chamber cooling concepts using probabilistic analysis techniques

    Science.gov (United States)

    Rapp, Douglas C.

    1993-01-01

    The reliability of OFHC (Oxygen Free High Conductivity) copper and NARloy-Z thrust chambers is assessed by applying probabilistic structural analysis techniques to incorporate design parameter variability and uncertainty. The thrust chambers specifically evaluated are the cylindrical test fixtures employed in a plug-nozzle configuration at the NASA Lewis Research Center. Direct-sampling Monte Carlo simulations based on a simplified life prediction methodology established probability densities of firing cycles to structural failure. Simulated cyclic lives demonstrated modest agreement with experiment. Similarly, regions of high structural failure probability were determined using a limit state approach employing calculated cumulative distribution functions for effective stress response and an assumed material strength distribution. A probability of failure of 0.012 was calculated at the center of the coolant channel hot-gas-side wall for an OFHC milled channel. Structural response was found to be sensitive to the uncertainties in the thrust chamber thermal environment and the material's thermal expansion coefficient.
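    The limit-state idea can be sketched with a direct-sampling Monte Carlo estimate of P(stress > strength); the normal distributions and parameters below are assumptions for illustration, not the study's thrust-chamber values.

```python
import numpy as np

# Stress-strength interference: failure occurs when the sampled effective
# stress exceeds the sampled material strength.
rng = np.random.default_rng(42)
n = 200_000
stress = rng.normal(300.0, 30.0, n)      # MPa, assumed effective stress response
strength = rng.normal(400.0, 25.0, n)    # MPa, assumed material strength
p_fail = np.mean(stress > strength)      # Monte Carlo failure probability
```

    With both inputs normal, the exact answer is Φ of the standardized mean margin, so the Monte Carlo estimate can be checked analytically; for correlated or non-normal inputs, sampling is often the only practical route.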

  17. Life Cycle Assessment of Bio-diesel Production—A Comparative Analysis

    Science.gov (United States)

    Chatterjee, R.; Sharma, V.; Mukherjee, S.; Kumar, S.

    2014-04-01

    This work deals with a comparative analysis of the environmental impacts of bio-diesel produced from Jatropha curcas, Rapeseed and Palm oil by applying the life cycle assessment and eco-efficiency concepts. The environmental impact indicators considered in the present paper include global warming potential (GWP, CO2 equivalent), acidification potential (AP, SO2 equivalent) and eutrophication potential (EP, NO3 equivalent). Different weighting techniques have been used to present and evaluate the environmental characteristics of bio-diesel. With the assistance of normalization values, the eco-efficiency was demonstrated in this work. The results indicate that the energy consumption of bio-diesel production is lowest for Jatropha, while the AP and EP are higher for Jatropha than for Rapeseed and Palm oil.

  18. A multi-criteria decision analysis assessment of waste paper management options.

    Science.gov (United States)

    Hanan, Deirdre; Burnley, Stephen; Cooke, David

    2013-03-01

    The use of Multi-criteria Decision Analysis (MCDA) was investigated in an exercise using a panel of local residents and stakeholders to assess the options for managing waste paper on the Isle of Wight. Seven recycling, recovery and disposal options were considered by the panel who evaluated each option against seven environmental, financial and social criteria. The panel preferred options where the waste was managed on the island with gasification and recycling achieving the highest scores. Exporting the waste to the English mainland for incineration or landfill proved to be the least preferred options. This research has demonstrated that MCDA is an effective way of involving community groups in waste management decision making.
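    One common MCDA aggregation, the weighted sum, can be sketched in a few lines; the options echo the abstract, but the criteria weights and panel scores below are invented for illustration, not the exercise's actual data.

```python
import numpy as np

# Weighted-sum MCDA sketch: rows are waste paper management options, columns
# are criteria scored 0-100; weights sum to 1.
options = ["gasification", "recycling", "export_incineration", "export_landfill"]
scores = np.array([
    [80, 70, 75],   # gasification
    [85, 65, 80],   # recycling
    [40, 55, 30],   # export for incineration
    [30, 60, 20],   # export to landfill
], dtype=float)
weights = np.array([0.4, 0.3, 0.3])   # environmental, financial, social

overall = scores @ weights                       # one aggregate score per option
ranking = [options[i] for i in np.argsort(-overall)]
```

    Real MCDA exercises typically also normalize criteria to a common scale and test how sensitive the ranking is to the panel's weight choices.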

  19. Assessment of soil/structure interaction analysis procedures for nuclear power plant structures

    International Nuclear Information System (INIS)

    The paper presents an assessment of two state-of-the-art soil/structure interaction analysis procedures that are frequently used to provide seismic analyses of nuclear power plant structures. The advantages of large three-dimensional, elastic, discrete mass models and two-dimensional finite element models are compared. The discrete mass models can provide three-dimensional response capability with economical computer costs but only fair soil/structure interaction representation. The two-dimensional finite element models provide good soil/structure interaction representation, but cannot provide out-of-plane response. Three-dimensional finite element models would provide the most informative and complete analyses. For this model, computer costs would be much greater, but modeling costs would be approximately the same as those required for three-dimensional discrete mass models

  20. Central blood pressure assessment using 24-hour brachial pulse wave analysis

    Directory of Open Access Journals (Sweden)

    Muiesan ML

    2014-10-01

    Full Text Available Maria Lorenza Muiesan, Massimo Salvetti, Fabio Bertacchini, Claudia Agabiti-Rosei, Giulia Maruelli, Efrem Colonetti, Anna Paini Clinica Medica, Department of Clinical and Experimental Sciences, University of Brescia, Brescia, Italy Abstract: This review describes the use of central blood pressure (BP measurements during ambulatory monitoring, using noninvasive devices. The principles of measuring central BP by applanation tonometry and by oscillometry are reported, and information on device validation studies is described. The pathophysiological basis for the differences between brachial and aortic pressure is discussed. The currently available methods for central aortic pressure measurement are relatively accurate, and their use has important clinical implications, such as improving diagnostic and prognostic stratification of hypertension and providing a more accurate assessment of the effect of treatment on BP. Keywords: aortic blood pressure measurements, ambulatory monitoring, pulse wave analysis

  1. Admixture analysis and stocking impact assessment in brown trout ( Salmo trutta ), estimated with incomplete baseline data

    DEFF Research Database (Denmark)

    Hansen, Michael Møller; Eg Nielsen, Einar; Bekkevold, Dorte;

    2001-01-01

    Studies of genetic interactions between wild and domesticated fish are often hampered by unavailability of samples from wild populations prior to population admixture. We assessed the utility of a new Bayesian method, which can estimate individual admixture coefficients even with data missing from...... by the mean of individual admixture coefficients. This method proved more informative than a multidimensional scaling analysis of individual-based genetic distances and assignment tests. The results showed almost complete absence of stocked, domesticated trout in samples of trout from the rivers. Consequently......, stocking had little effect on improving fisheries. In one population, the genetic contribution by domesticated trout was small, whereas in the other population, some genetic impact was suggested. Admixture in this sample of anadromous trout despite absence of stocked domesticated trout could be because...

  2. Assessment of pipeline stability in the Gulf of Mexico during hurricanes using dynamic analysis

    Directory of Open Access Journals (Sweden)

    Yinghui Tian

    2015-03-01

    Full Text Available Pipelines are the critical link between major offshore oil and gas developments and the mainland. Any inadequate on-bottom stability design could result in disruption and failure, having a devastating impact on the economy and environment. Predicting the stability behavior of offshore pipelines in hurricanes is therefore vital to the assessment of both new design and existing assets. The Gulf of Mexico has a very dense network of pipeline systems constructed on the seabed. During the last two decades, the Gulf of Mexico has experienced a series of strong hurricanes, which have destroyed, disrupted and destabilized many pipelines. This paper first reviews some of these engineering cases. Following that, three case studies are retrospectively simulated using an in-house developed program. The study utilizes the offshore pipeline and hurricane details to conduct a Dynamic Lateral Stability analysis, with the results providing evidence as to the accuracy of the modeling techniques developed.

  3. Assessment of Cr(VI)-induced cytotoxicity and genotoxicity using high content analysis.

    Directory of Open Access Journals (Sweden)

    Chad M Thompson

    Full Text Available Oral exposure to high concentrations of hexavalent chromium [Cr(VI)] induces intestinal redox changes, villus cytotoxicity, crypt hyperplasia, and intestinal tumors in mice. To assess the effects of Cr(VI) in a cell model relevant to the intestine, undifferentiated (proliferating) and differentiated (confluent) Caco-2 cells were treated with Cr(VI), hydrogen peroxide or rotenone for 2-24 hours. DNA damage was then assessed by nuclear staining intensity of 8-hydroxydeoxyguanosine (8-OHdG) and phosphorylated histone variant H2AX (γ-H2AX) measured by high content analysis methods. In undifferentiated Caco-2, all three chemicals increased 8-OHdG and γ-H2AX staining at cytotoxic concentrations, whereas only 8-OHdG was elevated at non-cytotoxic concentrations at 24 hr. Differentiated Caco-2 were more resistant to cytotoxicity and DNA damage than undifferentiated cells, and there were no changes in the apoptotic markers p53 or annexin-V. However, Cr(VI) induced a dose-dependent translocation of the unfolded protein response transcription factor ATF6 into the nucleus. Micronucleus (MN) formation was assessed in CHO-K1 and A549 cell lines. Cr(VI) increased MN frequency in CHO-K1 only at highly cytotoxic concentrations. Relative to the positive control Mitomycin-C, Cr(VI) only slightly increased MN frequency in A549 at mildly cytotoxic concentrations. The results demonstrate that Cr(VI) genotoxicity correlates with cytotoxic concentrations, and that H2AX phosphorylation occurs at higher concentrations than oxidative DNA damage in proliferating Caco-2 cells. The findings suggest that the in vitro genotoxicity of Cr(VI) is primarily oxidative in nature at low concentrations. Implications for the in vivo intestinal toxicity of Cr(VI) will be discussed.

  4. Analysis of multi-dimensional contemporaneous EHR data to refine delirium assessments.

    Science.gov (United States)

    Corradi, John P; Chhabra, Jyoti; Mather, Jeffrey F; Waszynski, Christine M; Dicks, Robert S

    2016-08-01

    Delirium is a potentially lethal condition of altered mental status, attention, and level of consciousness with an acute onset and fluctuating course. Its causes are multi-factorial, and its pathophysiology is not well understood; therefore clinical focus has been on prevention strategies and early detection. One patient evaluation technique in routine use is the Confusion Assessment Method (CAM): a relatively simple test resulting in 'positive', 'negative' or 'unable-to-assess' (UTA) ratings. Hartford Hospital nursing staff use the CAM regularly on all non-critical care units, and a high frequency of UTA was observed after reviewing several years of records. In addition, patients with UTA ratings displayed poor outcomes such as in-hospital mortality, longer lengths of stay, and discharge to acute and long term care facilities. We sought to better understand the use of UTA, especially outside of critical care environments, in order to improve delirium detection throughout the hospital. An unsupervised clustering approach was used with additional, concurrent assessment data available in the EHR to categorize patient visits with UTA CAMs. The results yielded insights into the most common situations in which the UTA rating was used (e.g. impaired verbal communication, dementia), suggesting potentially inappropriate ratings that could be refined with further evaluation and remedied with updated clinical training. Analysis of the patient clusters also suggested that unrecognized delirium may contribute to the poor outcomes associated with the use of UTA. This method of using temporally related high dimensional EHR data to illuminate a dynamic medical condition could have wider applicability. PMID:27340924

  5. A meta-analysis of clinical trials assessing the effect of radiofrequency ablation for breast cancer

    Directory of Open Access Journals (Sweden)

    Chen J

    2016-03-01

    Full Text Available Jiayan Chen,1,* Chi Zhang,1,* Fei Li,1,* Liping Xu,1 Hongcheng Zhu,1 Shui Wang,2 Xiaoan Liu,2 Xiaoming Zha,2 Qiang Ding,2 Lijun Ling,2 Wenbin Zhou,2 Xinchen Sun1 1Department of Radiation Oncology, 2Department of Breast Surgery, The First Affiliated Hospital, Nanjing Medical University, Nanjing, People’s Republic of China *These authors contributed equally to this work Background: Radiofrequency ablation (RFA) is a minimally invasive thermal ablation technique. We conducted a meta-analysis based on eligible studies to assess the efficacy and safety of RFA for treating patients with breast cancer. Methods: A literature search was conducted in the PubMed, Embase, and Web of Science databases. Eligible studies were clinical trials that assessed RFA in patients with breast cancer. The outcomes included complete ablation rate, recurrence rate, excellent or good cosmetic rates, and complication rate. A random-effects or fixed-effects model was used to pool the estimates, according to the heterogeneity among the included studies. Results: Fifteen studies, with a total of 404 patients, were included in this meta-analysis. Pooled results showed that 89% (95% confidence interval: 85%–93%) of patients achieved complete ablation after RFA treatment and 96% of patients reported a good-to-excellent cosmetic result. Although the pooled result for recurrence rate was 0, several cases of relapse were observed at different follow-up times. No RFA-related complications were recorded, except for skin burn, with an incidence of 4% (95% confidence interval: 1%–6%). Conclusion: This meta-analysis showed that RFA can be a promising alternative option for treating breast cancer, since it produces a high complete ablation rate with a low complication rate. Further well-designed randomized controlled trials are needed to confirm the efficacy and safety of RFA for breast cancer. Keywords: radiofrequency ablation, breast cancer, meta-analysis
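    The pooling step of such a meta-analysis can be sketched with inverse-variance weighting of study proportions (fixed-effect form shown; a random-effects model would add a between-study variance term to each weight). The event counts below are illustrative, not the 15 included trials.

```python
import math

# Each study contributes (complete ablations, patients); weight = 1/variance.
studies = [(26, 30), (45, 50), (70, 80), (18, 20)]

weights, estimates = [], []
for events, n in studies:
    p = events / n
    var = p * (1 - p) / n            # binomial variance of the proportion
    weights.append(1.0 / var)
    estimates.append(p)

pooled = sum(w * p for w, p in zip(weights, estimates)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))   # standard error of the pooled estimate
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se
```

    In practice proportions near 0 or 1 are usually transformed (logit or arcsine) before pooling, since the raw binomial variance degenerates at the boundaries.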

  6. Groundwater quality assessment using chemometric analysis in the Adyar River, South India.

    Science.gov (United States)

    Venugopal, T; Giridharan, L; Jayaprakash, M

    2008-08-01

    A multivariate statistical technique has been used to assess the factors responsible for the chemical composition of the groundwater near the highly polluted Adyar River. Basic chemical parameters of the groundwater have been pooled together for evaluating and interpreting a few empirical factors controlling the chemical nature of the water. Twenty-three groundwater samples were collected in the vicinity of the Adyar River. Box-whisker plots were drawn to evaluate the chemical variation and the seasonal effect on the variables. R-mode factor analysis and cluster analysis were applied to the geochemical parameters of the water to identify the factors affecting the chemical composition of the groundwater. Dendrograms of both seasons give two major clusters reflecting the groups of polluted and unpolluted stations. The other two minor clusters and the movement of stations from one cluster to another clearly bring out the seasonal variation in the chemical composition of the groundwater. The results of the R-mode factor analysis reveal that the groundwater chemistry of the study area reflects the influence of anthropogenic activities, rock-water interactions, saline water intrusion into the river water, and subsequent percolation into the groundwater. The complex geochemical data of the groundwater were interpreted by reducing them to seven major factors, and the seasonal variation in the chemistry of water was clearly brought out by these factors. The higher concentration of heavy metals such as Fe and Cr is attributed to the rock-water interaction and effluents from industries such as tanning, chrome-plating, and dyeing. In the urban area, the Pb concentration is high due to industrial as well as urban runoff of the atmospheric deposition from automobile pollution. Factor score analysis was used successfully to delineate the stations under study with the contributing factors, and the seasonal effect on the sample stations was identified and evaluated.
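    The R-mode idea, factoring the correlation matrix of the chemical variables rather than of the samples, can be sketched with a plain eigen-decomposition (equivalently, PCA on standardized data); the synthetic variables below stand in for the measured water-chemistry parameters.

```python
import numpy as np

# Two latent drivers (anthropogenic pollution, saline intrusion) generate four
# observed variables; the eigen-decomposition of their correlation matrix
# should recover two dominant factors.
rng = np.random.default_rng(7)
n = 100
pollution = rng.standard_normal(n)       # latent anthropogenic factor
salinity = rng.standard_normal(n)        # latent saline-intrusion factor
data = np.column_stack([
    pollution + 0.1 * rng.standard_normal(n),   # e.g. Cr
    pollution + 0.1 * rng.standard_normal(n),   # e.g. Pb
    salinity + 0.1 * rng.standard_normal(n),    # e.g. Cl-
    salinity + 0.1 * rng.standard_normal(n),    # e.g. Na+
])
z = (data - data.mean(axis=0)) / data.std(axis=0)   # standardize each variable
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)             # ascending eigenvalue order
explained = eigvals[::-1] / eigvals.sum()           # variance share per factor
```

    A full factor analysis would then rotate the retained loadings (e.g. varimax) before interpreting each factor geochemically.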

  7. Assessment of shielding analysis methods, codes, and data for spent fuel transport/storage applications

    International Nuclear Information System (INIS)

    This report provides a preliminary assessment of the computational tools and existing methods used to obtain radiation dose rates from shielded spent nuclear fuel and high-level radioactive waste (HLW). Particular emphasis is placed on analysis tools and techniques applicable to facilities/equipment designed for the transport or storage of spent nuclear fuel or HLW. Applications to cask transport, storage, and facility handling are considered. The report reviews the analytic techniques for generating appropriate radiation sources, evaluating the radiation transport through the shield, and calculating the dose at a desired point or surface exterior to the shield. Discrete ordinates, Monte Carlo, and point kernel methods for evaluating radiation transport are reviewed, along with existing codes and data that utilize these methods. A literature survey was employed to select a cadre of codes and data libraries to be reviewed. The selection process was based on specific criteria presented in the report. Separate summaries were written for several codes (or family of codes) that provided information on the method of solution, limitations and advantages, availability, data access, ease of use, and known accuracy. For each data library, the summary covers the source of the data, applicability of these data, and known verification efforts. Finally, the report discusses the overall status of spent fuel shielding analysis techniques and attempts to illustrate areas where inaccuracy and/or uncertainty exist. The report notes the advantages and limitations of several analysis procedures and illustrates the importance of using adequate cross-section data sets. Additional work is recommended to enable final selection/validation of analysis tools that will best meet the US Department of Energy's requirements for use in developing a viable HLW management system. 188 refs., 16 figs., 27 tabs

  8. A latent transition analysis for the assessment of structured diagnostic interviews.

    Science.gov (United States)

    Scorza, Pamela; Masyn, Katherine E; Salomon, Joshua A; Betancourt, Theresa S

    2015-09-01

    Structured diagnostic interviews administered by lay people are commonly used to assess psychiatric disorders, including depression, in large epidemiologic studies. Many interviews utilize "gate" questions, such as screening questions, that allow interviewers to skip entire survey sections for a particular respondent, saving time and reducing respondent fatigue. However, most depression estimates based on these response data are predicated on the assumption that the gate questions function without measurement error or bias. The tenability of this assumption is questionable, and its violation could compromise the reliability and validity of those estimates of depression. In this study, we used a novel application of latent transition analysis to cross-sectional data, accounting for measurement error in different response pathways through the depression module in the World Mental Health Composite International Diagnostic Interview. The analysis included data from 19,734 participants ≥18 years of age in the Comprehensive Psychiatric Epidemiologic Surveys. The latent transition analysis, allowing for measurement error in screening questions and exclusion criteria, produced a higher estimate of the lifetime probability of experiencing depression than did the algorithm based on the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition, Text Revision. This illustration of latent transition analysis applied to item-level data from a complex structured diagnostic tool with gate questions demonstrates the potential utility of an analytic approach that does not automatically assume gate questions function without measurement error. This model could also be used to probe for evidence of measurement bias in the form of differential item functioning when using structured diagnostic tools in different cultures and languages. PMID:25894707

  9. Assessment of land degradation using time series trend analysis of vegetation indicators in Otindag Sandy Land

    International Nuclear Information System (INIS)

    Land condition assessment is a basic prerequisite for detecting the degradation of a territory, which might lead to desertification under climatic and human pressures. The temporal change in vegetation productivity is a key indicator of land degradation. In this paper, taking the Otindag Sandy Land as a case, the mean normalized difference vegetation index (NDVIa), net primary production (NPP) and vegetation rain use efficiency (RUE) dynamic trends during 2001–2010 were analysed. The Mann-Kendall test and the correlation analysis method were used and their sensitivities to land degradation were evaluated. The results showed that all three vegetation indicators (NDVIa, NPP and RUE) exhibited a downward trend under both methods over the past 10 years, indicating that the land was degraded. The three indicators showed a decreasing trend in 62.57%, 74.16% and 88.56% of the study area, respectively, according to the Mann-Kendall test, and in 57.85%, 68.38% and 85.29% according to the correlation analysis method. However, most of these trends were not significant; trends significant at the 95% confidence level accounted for only a small proportion. Analysis of the NDVIa, NPP and RUE series showed a significant decreasing trend in 9.21%, 4.81% and 6.51% of the area with the Mann-Kendall test. The NPP change trends showed an obvious positive link with precipitation in the study area. Because the effect of inter-annual precipitation variation on RUE was small, vegetation RUE can provide valuable insights into the status of land condition and had the best sensitivity to land degradation.
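    The Mann-Kendall test used above is straightforward to implement. A sketch of the no-ties form on an invented ten-year vegetation-index series (the data are illustrative, not from the study):

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: returns the S statistic and Z score.

    S > 0 suggests an upward trend, S < 0 a downward trend; |Z| > 1.96
    is significant at the 95% confidence level (ties ignored here)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0
    for k in range(n - 1):
        s += np.sign(x[k + 1:] - x[k]).sum()   # sign of every later-minus-earlier pair
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S with no ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Ten years of a declining vegetation index (synthetic illustration).
ndvi = [0.52, 0.50, 0.51, 0.48, 0.47, 0.48, 0.45, 0.44, 0.43, 0.41]
s, z = mann_kendall(ndvi)
print(s, z)   # negative S and Z indicate a downward trend
```

    With |Z| > 1.96 the downward trend counts as significant at the 95% level, matching the significance threshold used in the abstract.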

  10. Analysis of right ventricular areas to assess the severity of ascites syndrome in broiler chickens.

    Science.gov (United States)

    McGovern, R H; Feddes, J J; Robinson, F E; Hanson, J A

    1999-01-01

    Ascites syndrome in broiler chickens is defined as a condition associated with pulmonary hypertension leading to right heart failure, increased central venous pressure, passive congestion of the liver, and accumulations of serous fluids in body cavities. The syndrome is currently seen in fast-growing broiler chickens associated with an increase in the weight, volume, and area of the right ventricle of the heart. The ratio of the right ventricle weight to the total heart mass has been used to assess the consequences of increased blood pressure. The right ventricle area (RVA) can be quantified using image analysis technology. Hearts were removed from 719 male broilers at slaughter (42 d). All birds were visually scored for the incidence of ascites. A score of 0 or 1 represented slight hydropericardium, slight right heart hypertrophy, and slight edema. A score of 4 was assigned to birds with marked accumulation of ascitic fluid in one or more coelomic cavities, pronounced dilation of the right heart, and prominent liver lesions. A cross-sectional image of each heart slice (a 4-mm-thick slice of the ventricles) was digitally recorded. Using image analysis software, the RVA, left ventricular area (LVA), and total heart area (HA) were determined. Because a slice of the heart was used in image analysis, the importance of maintaining the original shape was determined. Twenty hearts in five ranges of RVA size were scanned in four different positions, with differing heart slice orientations and differing RVA shapes, to compare the effect of positioning (placement) on the measured RVA. The shape of the heart slice was observed not to be critical for small RVAs. For heart slices with large RVA values, it was found to be critical to analyze the heart slice in a standardized placement. PMID:10023748
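    Once a slice image is segmented, area quantification reduces to counting labelled pixels and applying a calibration factor. A hedged sketch with a synthetic label mask (the segmentation step, done by image-analysis software in the study, is assumed given; the label values and pixel calibration are invented):

```python
import numpy as np

# Hypothetical segmented heart-slice image: each pixel labelled as
# background (0), right ventricular region (1) or left ventricular
# region (2). Real labels would come from image-analysis software.
labels = np.zeros((100, 100), dtype=int)
labels[10:60, 10:40] = 1       # right ventricular region
labels[10:80, 45:90] = 2       # left ventricular region

pixel_area_mm2 = 0.01          # assumed calibration: 0.1 mm x 0.1 mm pixels

rva = (labels == 1).sum() * pixel_area_mm2   # right ventricular area
lva = (labels == 2).sum() * pixel_area_mm2   # left ventricular area
ha = rva + lva                                # total ventricular area

rv_fraction = rva / ha
print(f"RVA = {rva:.1f} mm^2, RV fraction = {rv_fraction:.2f}")
```

    The RV-area fraction computed this way plays the same role as the right-ventricle-to-total-heart weight ratio mentioned in the abstract.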

  11. Analysis of psychological factors for quality assessment of interactive multimodal service

    Science.gov (United States)

    Yamagishi, Kazuhisa; Hayashi, Takanori

    2005-03-01

    We proposed a subjective quality assessment model for interactive multimodal services. First, psychological factors of an audiovisual communication service were extracted by using the semantic differential (SD) technique and factor analysis. Forty subjects participated in subjective tests and performed point-to-point conversational tasks on a PC-based TV phone that exhibits various network qualities. The subjects assessed those qualities on the basis of 25 pairs of adjectives. Two psychological factors, i.e., an aesthetic feeling and a feeling of activity, were extracted from the results. Then, quality impairment factors affecting these two psychological factors were analyzed. We found that the aesthetic feeling is mainly affected by IP packet loss and video coding bit rate, and the feeling of activity depends on delay time and video frame rate. We then proposed an opinion model derived from the relationships among quality impairment factors, psychological factors, and overall quality. The results indicated that the estimation error of the proposed model is almost equivalent to the statistical reliability of the subjective score. Finally, using the proposed model, we discuss guidelines for quality design of interactive audiovisual communication services.

  12. Assessing short summaries with human judgments procedure and latent semantic analysis in narrative and expository texts.

    Science.gov (United States)

    León, José A; Olmos, Ricardo; Escudero, Inmaculada; Cañas, José J; Salmerón, Lalo

    2006-11-01

    In the present study, we tested a computer-based procedure for assessing very concise summaries (50 words long) of two types of text (narrative and expository) using latent semantic analysis (LSA) in comparison with the judgments of four human experts. LSA was used to estimate semantic similarity using six different methods: four holistic (summary-text, summary-summaries, summary-expert summaries, and pregraded-ungraded summary) and two componential (summary-sentence text and summary-main sentence text). A total of 390 Spanish middle and high school students (14-16 years old) and six experts read a narrative or expository text and later summarized it. The results support the viability of developing a computerized assessment tool using human judgments and LSA, although the correlation between human judgments and LSA was higher in the narrative text than in the expository, and LSA correlated more with human content ratings than with human coherence ratings. Finally, the holistic methods were found to be more reliable than the componential methods analyzed in this study.
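    The holistic methods compare a summary vector against a text vector in a reduced semantic space. A toy sketch of the mechanics with an invented four-document corpus (a real LSA grader is built on a large training corpus; this only illustrates the SVD projection and cosine comparison):

```python
import numpy as np

# Toy corpus: two sentences standing in for the source text, plus an
# on-topic and an off-topic "summary".
docs = [
    "the river flooded the village after heavy rain",
    "villagers rebuilt their homes after the flood",
    "heavy rain caused the river to flood the village",  # on-topic summary
    "weather was sunny and warm all summer long",        # off-topic summary
]

# Term-document matrix of raw counts.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        A[index[w], j] += 1

# LSA: truncated SVD projects each document into a low-rank semantic space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T               # one k-dim vector per doc

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

ref = doc_vecs[0] + doc_vecs[1]                      # source-text vector
sim_good = cosine(ref, doc_vecs[2])
sim_off = cosine(ref, doc_vecs[3])
print(sim_good, ">", sim_off)
```

    With such a tiny corpus the rank-2 space is crude, but the on-topic summary still scores far higher than the off-topic one, which is the essence of the summary-text holistic method.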

  13. Assessment of metal sorption mechanisms by aquatic macrophytes using PIXE analysis

    Energy Technology Data Exchange (ETDEWEB)

    Módenes, A.N., E-mail: anmodenes@yahoo.com.br [Department of Chemical Engineering-Postgraduate Program, West Parana State University, Campus of Toledo, rua da Faculdade 645, Jd. La Salle, 85903-000 Toledo, PR (Brazil); Espinoza-Quiñones, F.R.; Santos, G.H.F.; Borba, C.E. [Department of Chemical Engineering-Postgraduate Program, West Parana State University, Campus of Toledo, rua da Faculdade 645, Jd. La Salle, 85903-000 Toledo, PR (Brazil); Rizzutto, M.A. [Physics Institute, University of São Paulo, Rua do Matão s/n, Travessa R 187, 05508-900 São Paulo, SP (Brazil)

    2013-10-15

    Highlights: • Divalent metal ion removals by Egeria densa biosorbent. • Multielements concentrations in biosorbent samples by PIXE analysis. • Elements mass balance in liquid and solid phase before and after metal removals. • Assessment of the mechanisms involved in Cd{sup 2+} and Zn{sup 2+} removal by biosorbent. • Confirmation of the signature of ion exchange process in metal removal. -- Abstract: In this work, a study of the metal sorption mechanism by dead biomass has been performed. All batch metal biosorption experiments were performed using the aquatic macrophyte Egeria densa as biosorbent. Divalent cadmium and zinc solutions were used to assess the sorption mechanisms involved. Using a suitable equilibrium time of 2 h and a mixture of 300 mg biosorbent and 50 mL metal solution at pH 5, monocomponent sorption experiments were performed. In order to determine the residual amounts of metals in the aqueous solutions and the concentrations of removed metals in the dry biomass, Particle Induced X-ray Emission (PIXE) measurements in thin and thick target samples were carried out. Based on the strong experimental evidence from the mass balance among the major elements participating in the sorption processes, an ion exchange process was identified as the mechanism responsible for metal removal by the dry biomass.
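    The ion-exchange signature rests on a charge balance: the milliequivalents of metal taken up by the biomass should roughly equal the milliequivalents of light cations released into solution. A sketch of that arithmetic (all concentrations below are invented for illustration; they are not the paper's PIXE measurements):

```python
# Charge-balance check for an ion-exchange signature. Concentrations are
# illustrative assumptions, not measured values.
removed_mg_L = {"Cd": 22.5}                                   # Cd2+ taken up
released_mg_L = {"Ca": 4.8, "Mg": 1.2, "K": 2.0, "Na": 0.9}   # cations released

molar_mass = {"Cd": 112.41, "Ca": 40.08, "Mg": 24.31, "K": 39.10, "Na": 22.99}
charge = {"Cd": 2, "Ca": 2, "Mg": 2, "K": 1, "Na": 1}

def meq(element, mg_per_L):
    """Milliequivalents per litre: mg/L divided by (molar mass / charge)."""
    return mg_per_L / (molar_mass[element] / charge[element])

uptake = sum(meq(e, v) for e, v in removed_mg_L.items())
release = sum(meq(e, v) for e, v in released_mg_L.items())
print(f"uptake {uptake:.2f} meq/L vs release {release:.2f} meq/L")
```

    A near-match between the two totals, as in this invented example, is the kind of mass-balance evidence the abstract cites for an ion-exchange mechanism.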

  14. Life cycle assessment and economic analysis of a low concentrating photovoltaic system.

    Science.gov (United States)

    De Feo, G; Forni, M; Petito, F; Renno, C

    2016-10-01

    Many new photovoltaic (PV) applications, such as the concentrating PV (CPV) systems, are appearing on the market. The main characteristic of CPV systems is to concentrate sunlight on a receiver by means of optical devices and to decrease the solar cells area required. A low CPV (LCPV) system allows the PV effect to be optimized, with a large increase in generated electric power as well as a decrease in active surface area. In this paper, an economic analysis and a life cycle assessment (LCA) study of a particular LCPV scheme are presented and its environmental impacts are compared with those of a traditional PV system. The LCA study was performed with the software tool SimaPro 8.0.2, using the Ecoinvent 3.1 database. A functional unit of 1 kWh of electricity produced was chosen. Carbon Footprint, Ecological Footprint and ReCiPe 2008 were the methods used to assess the environmental impacts of the LCPV plant compared with a corresponding traditional system. All the methods demonstrated the environmental convenience of the LCPV system. The innovative system allowed saving 16.9% of CO2 equivalent in comparison with the traditional PV plant. The saving was 17% in terms of Ecological Footprint and, finally, 15.8% with the ReCiPe method. PMID:26935857

  15. Operational Modal Analysis and the Performance Assessment of Vehicle Suspension Systems

    Directory of Open Access Journals (Sweden)

    L. Soria

    2012-01-01

    Full Text Available Comfort, road holding and safety of passenger cars are mainly influenced by an appropriate design of suspension systems. Improvements of the dynamic behaviour can be achieved by implementing semi-active or active suspension systems. In these cases, the correct design of a well-performing suspension control strategy is of fundamental importance to obtain satisfying results. Operational Modal Analysis allows experimental structural identification under real operating conditions: it works from output-only data, leads to modal models linearised around the working points of interest and, in the case of controlled systems, provides the information needed for the optimal design and verification of the controller performance. All these characteristics are needed for the experimental assessment of vehicle suspension systems. In the paper two suspension architectures equipping the same car type are considered: the former is a semi-active commercial system, the latter a novel prototypic active system. For the assessment of suspension performance, two different kinds of tests have been considered: proving ground tests on different road profiles and laboratory four poster rig tests. By OMA-processing the signals acquired in the different testing conditions and by comparing the results, it is shown how this tool can be effectively utilised to verify the operation and the performance of those systems by carrying out only a simple, cost-effective road test.

  16. Fuzzy-logic-based network for complex systems risk assessment: application to ship performance analysis.

    Science.gov (United States)

    Abou, Seraphin C

    2012-03-01

    In this paper, a new interpretation of intuitionistic fuzzy sets in the advanced framework of the Dempster-Shafer theory of evidence is extended to monitor safety-critical systems' performance. Not only is the proposed approach more effective, but it also takes into account the fuzzy rules that deal with imperfect knowledge/information and, therefore, is different from the classical Takagi-Sugeno fuzzy system, which assumes that the rule (the knowledge) is perfect. We provide an analytical solution to the practical and important problem of the conceptual probabilistic approach for formal ship safety assessment using the fuzzy set theory that involves uncertainties associated with the reliability input data. Thus, the overall safety of the ship engine is investigated as an object of risk analysis using the fuzzy mapping structure, which considers uncertainty and partial truth in the input-output mapping. The proposed method integrates direct evidence of the frame of discernment and is demonstrated through references to examples where fuzzy set models are informative. These simple applications illustrate how to assess the conflict of sensor information fusion for a sufficient cooling power system of vessels under extreme operation conditions. It was found that propulsion engine safety systems are not only a function of many environmental and operation profiles but are also dynamic and complex.
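    The Dempster-Shafer evidence combination underlying the approach can be illustrated on a two-element frame of discernment. A sketch of Dempster's rule of combination (the sensor mass assignments below are invented; frozenset keys denote subsets of the frame, with the two-element set carrying the "unknown" mass):

```python
# Dempster's rule of combination for two independent mass functions
# over the frame {safe, unsafe}. Masses are illustrative only.
SAFE, UNSAFE = frozenset(["safe"]), frozenset(["unsafe"])
EITHER = SAFE | UNSAFE

m1 = {SAFE: 0.6, UNSAFE: 0.1, EITHER: 0.3}   # e.g. a temperature sensor
m2 = {SAFE: 0.5, UNSAFE: 0.2, EITHER: 0.3}   # e.g. a coolant-flow sensor

def combine(m1, m2):
    combined = {}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + pa * pb
            else:
                conflict += pa * pb          # mass assigned to disjoint sets
    # Normalise by 1 - K, where K is the total conflicting mass.
    return {s: v / (1.0 - conflict) for s, v in combined.items()}, conflict

m, k = combine(m1, m2)
print("conflict K =", k)
print("combined mass(safe) =", m[SAFE])
```

    The conflict mass K is exactly the quantity one would inspect when, as the abstract puts it, assessing the conflict of fused sensor information.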

  17. Assessment of thermal analysis software for the DOE Office of Civilian Radioactive Waste Management

    International Nuclear Information System (INIS)

    This assessment uses several recent assessments and the more general code compilations that have been completed to produce a list of 116 codes that can be used for thermal analysis. This list is then compared with criteria prepared especially for the Department of Energy Office of Civilian Radioactive Waste Management (DOE/OCRWM). Based on these criteria, fifteen codes are narrowed to three primary codes and four secondary codes for use by the OCRWM thermal analyst. The analyst is cautioned that since no single code is sufficient for all applications, a code must be selected based upon the predominant heat transfer mode of the problem to be solved, but the codes suggested in this report have been used successfully for a range of OCRWM applications. The report concludes with a series of recommendations for additional work, of which the major points include the following: the codes suggested by this report must be benchmarked with the existing US and international problems and validated when possible; an interactive code selection tool could be developed or, perhaps even more useful, a users group could be supported to ensure the proper selection of thermal codes and dissemination of information on the latest version; the status of the 116 codes identified by this report should be verified, and methods for maintaining the still-active codes must be established; and the special capabilities of each code in phase change, convection and radiation should be improved to better enable the thermal analyst to model OCRWM applications. 37 refs., 3 figs., 12 tabs

  19. Source term assessment with ASTEC and associated uncertainty analysis using SUNSET tool

    Energy Technology Data Exchange (ETDEWEB)

    Chevalier-Jabet, K., E-mail: karine.chevalier-jabet@irsn.fr; Cousin, F.; Cantrel, L.; Séropian, C.

    2014-06-01

    Several accidental scenarios have been simulated using the ASTEC integral IRSN-GRS code for a French 1300 MWe PWR, including several break sizes or locations, highlighting the effect of safety systems and of iodine chemistry in the reactor coolant system (RCS) and in the containment on iodine source term evaluations. Iodine chemistry in the RCS and in the containment is still subject to significant uncertainties and it is thus studied in on-going R and D programs. To assess the impact of uncertainties, ASTEC has been coupled to the IRSN uncertainty propagation and sensitivity analysis tool SUNSET. Focusing on a loss of feed-water of steam generator accident, ASTEC/SUNSET calculations have been performed to assess the effect of remaining uncertainties relative to iodine behaviour on the source term. Calculations show that the postulated lack of knowledge may impact the iodine source term in the environment by at least one decade, confirming the importance of the on-going R and D programs to improve the knowledge on iodine chemistry.
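    Uncertainty propagation of the kind performed with SUNSET is, at its core, Monte Carlo sampling of uncertain inputs through a model. A toy stand-in release model (the parameters, distributions and ranges are invented for illustration and are not ASTEC's):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs of a toy iodine-release model, sampled over assumed
# ranges: containment retention fraction and a leak-rate multiplier.
retention = rng.uniform(0.90, 0.999, n)
leak_factor = rng.lognormal(mean=0.0, sigma=0.5, size=n)

inventory = 1.0                               # normalised iodine inventory
release = inventory * (1.0 - retention) * leak_factor

lo, med, hi = np.percentile(release, [5, 50, 95])
print(f"5th/50th/95th percentile release: {lo:.2e} / {med:.2e} / {hi:.2e}")
print(f"spread (95th/5th): {hi / lo:.0f}x")
```

    Even this two-parameter toy spreads the output over more than a decade, the same order of sensitivity the abstract reports for the iodine source term.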

  20. Assessment of bilateral photoplethysmography for lower limb peripheral vascular occlusive disease using color relation analysis classifier.

    Science.gov (United States)

    Lin, Chia-Hung

    2011-09-01

    This paper proposes the assessment of bilateral photoplethysmography (PPG) for lower limb peripheral vascular occlusive disease (PVOD) using a color relation analysis (CRA) classifier. PPG signals are non-invasively recorded from the right and left sides at the big toe sites. With the time-domain technique, the right-to-left side difference is studied by comparing the subject's PPG data. The absolute bilateral differences construct various diminishing and damping patterns. These difference patterns in amplitude and shape distortion relate to the grades of PVOD, including the normal condition, lower-grade disease, and higher-grade disease. A CRA classifier is used to recognize the various patterns for PVOD assessment. Its concept is derived from the HSV color model and uses the hue, saturation, and value to depict the disease grades using the natural primary colors of red, green, and blue. PPG signals are obtained from 21 subjects aged 24-65 years using an optical measurement technique. The proposed CRA classifier is tested using the physiological measurements, and the tests reveal its practicality for monitoring PPG signals. PMID:20674063

  1. Assessment of arsenic concentration in stream water using neuro fuzzy networks with factor analysis.

    Science.gov (United States)

    Chang, Fi-John; Chung, Chang-Han; Chen, Pin-An; Liu, Chen-Wuing; Coynel, Alexandra; Vachaud, Georges

    2014-10-01

    We propose a systematical approach to assessing arsenic concentration in a river through: important factor extraction by a nonlinear factor analysis; arsenic concentration estimation by the neuro-fuzzy network; and impact assessment of important factors on arsenic concentration by the membership degrees of the constructed neuro-fuzzy network. The arsenic-contaminated Huang Gang Creek in northern Taiwan is used as a study case. Results indicate that rainfall, nitrite nitrogen and temperature are important factors and the proposed estimation model (ANFIS(GT)) is superior to the two comparative models, in which 50% and 52% improvements in RMSE are made over ANFIS(CC) and ANFIS(all), respectively. Results reveal that arsenic concentration reaches the highest in an environment of lower temperature, higher nitrite nitrogen concentration and larger one-month antecedent rainfall, while it reaches the lowest in an environment of higher temperature, lower nitrite nitrogen concentration and smaller one-month antecedent rainfall. It is noted that these three selected factors are easy to collect. We demonstrate that the proposed methodology is useful and effective and can be adapted to other similar settings to reliably model water quality based on parameters of interest and/or study areas of interest for universal usage. The proposed methodology gives a quick and reliable way to estimate arsenic concentration, which makes a good contribution to water environment management. PMID:25046611

  2. Environmental Impact Assessment of Rosia Jiu Opencast Area Using an Integrated SAR Analysis

    Science.gov (United States)

    Poenaru, V. D.; Negula, I. F. Dana; Badea, A.; Cuculici, R.

    2016-06-01

    The satellite data provide a new perspective to analyse and interpret environmental impact assessment as a function of topography and vegetation. The main goal of this paper is to investigate the new Staring Spotlight TerraSAR-X mode capabilities to monitor land degradation in the Rosia Jiu opencast area taking into account the mining engineering standards and specifications. The second goal is to relate mining activities with the spatio-temporal dynamics of land degradation by using differential Synthetic Aperture Radar interferometry (DInSAR). The experimental analysis was carried out on data acquired in the LAN_2277 scientific proposal framework during the 2014-2015 period. A set of 25 very high resolution SAR images gathered in the VV polarisation mode with a resolution of 0.45 m × 0.16 m and an incidence angle of 37° have been used in this study. Preliminary results showed that the altered terrain topography, with steep slopes and deep pits, has led to layover of the radar signal. Initially, ambiguous results were obtained due to the highly dynamic character of subsidence induced by activities which imply mass mining methods. By increasing the number of SAR scenes, the land degradation assessment was improved. Most of the interferometric pairs have low coherence, therefore the product coherence threshold was set to 0.3. A coherent and non-coherent analysis is performed to delineate land cover changes and complement the deformation model. Thus, the environmental impact of mining activities is better studied. Moreover, the monitoring of changes in pit depths, heights of stock-piles and waste dumps and levels of tailing dumps provides additional information about production data.
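    Interferometric coherence, the quantity thresholded at 0.3 above, can be estimated from a co-registered single-look-complex (SLC) pair. A sketch on synthetic data (a whole-patch estimate rather than the moving-window estimator used in practice; the scene model is invented):

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)

# Synthetic co-registered SLC pair: a master scene, a noisy copy of it
# (stable ground) and an independent speckle field (decorrelated area,
# e.g. an active mining pit).
master = rng.normal(size=shape) + 1j * rng.normal(size=shape)
stable = master + 0.3 * (rng.normal(size=shape) + 1j * rng.normal(size=shape))
decorr = rng.normal(size=shape) + 1j * rng.normal(size=shape)

def coherence(s1, s2):
    """Magnitude of the normalised complex cross-correlation of two
    SLC patches; ranges from 0 (decorrelated) to 1 (identical phase)."""
    num = np.abs((s1 * np.conj(s2)).sum())
    den = np.sqrt((np.abs(s1) ** 2).sum() * (np.abs(s2) ** 2).sum())
    return num / den

c_stable = coherence(master, stable)
c_decorr = coherence(master, decorr)
print(c_stable)   # high: interferometric phase is usable
print(c_decorr)   # low: falls below a 0.3 threshold and would be masked
```

    Pixels below the 0.3 threshold carry no reliable phase and are excluded from the deformation model, which is exactly the masking step the abstract describes.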

  3. ENVIRONMENTAL IMPACT ASSESSMENT OF ROSIA JIU OPENCAST AREA USING AN INTEGRATED SAR ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. D. Poenaru

    2016-06-01

    Full Text Available The satellite data provide a new perspective to analyse and interpret environmental impact assessment as a function of topography and vegetation. The main goal of this paper is to investigate the new Staring Spotlight TerraSAR-X mode capabilities to monitor land degradation in the Rosia Jiu opencast area taking into account the mining engineering standards and specifications. The second goal is to relate mining activities with the spatio-temporal dynamics of land degradation by using differential Synthetic Aperture Radar interferometry (DInSAR). The experimental analysis was carried out on data acquired in the LAN_2277 scientific proposal framework during the 2014-2015 period. A set of 25 very high resolution SAR images gathered in the VV polarisation mode with a resolution of 0.45 m × 0.16 m and an incidence angle of 37° have been used in this study. Preliminary results showed that the altered terrain topography, with steep slopes and deep pits, has led to layover of the radar signal. Initially, ambiguous results were obtained due to the highly dynamic character of subsidence induced by activities which imply mass mining methods. By increasing the number of SAR scenes, the land degradation assessment was improved. Most of the interferometric pairs have low coherence, therefore the product coherence threshold was set to 0.3. A coherent and non-coherent analysis is performed to delineate land cover changes and complement the deformation model. Thus, the environmental impact of mining activities is better studied. Moreover, the monitoring of changes in pit depths, heights of stock-piles and waste dumps and levels of tailing dumps provides additional information about production data.

  4. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    Science.gov (United States)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are one of the key factors in a country's economic growth. Inadequate infrastructural networks could be detrimental to a society if the transport between locations is hindered or delayed. Logistical hindrances can often be avoided whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can trigger other natural hazards such as landslides and debris flow. Disruptions caused by landslides are similar to those of floods and increase the maintenance cost considerably. The effect of natural disasters on society is likely to increase due to a changed climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce the infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis, which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least cost path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.
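    A common SMCA form is a weighted linear combination of normalised criterion rasters, with the weights encoding expert judgements. A sketch on synthetic layers (the criteria, weights and the 0.7 cut-off are illustrative assumptions, not the study's):

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (50, 50)

# Criterion rasters (slope, topographic wetness, distance to streams);
# synthetic here, but in practice extracted from GIS layers.
criteria = {
    "slope": rng.uniform(0, 35, shape),             # degrees
    "wetness_index": rng.uniform(2, 20, shape),
    "stream_distance": rng.uniform(0, 500, shape),  # metres
}

# Expert weights summing to 1; for stream distance, LOW values mean
# HIGH susceptibility, so that criterion is inverted after normalising.
weights = {"slope": 0.4, "wetness_index": 0.35, "stream_distance": 0.25}
invert = {"stream_distance"}

susceptibility = np.zeros(shape)
for name, layer in criteria.items():
    norm = (layer - layer.min()) / (layer.max() - layer.min())  # rescale to [0, 1]
    if name in invert:
        norm = 1.0 - norm
    susceptibility += weights[name] * norm

high_risk = susceptibility > 0.7        # candidate areas to avoid in routing
print("high-susceptibility share:", high_risk.mean())
```

    The resulting susceptibility raster is the surface a least-cost-path routine would traverse when comparing alternative road lines, as in the study.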

  5. Analysis of risk factors and risk assessment for ischemic stroke recurrence

    Directory of Open Access Journals (Sweden)

    Xiu-ying LONG

    2016-08-01

    Full Text Available Objective To screen the risk factors for recurrence of ischemic stroke and to assess the risk of recurrence. Methods The Essen Stroke Risk Score (ESRS) was used to evaluate the risk of recurrence in 176 patients with ischemic stroke (96 cases of first onset and 80 cases of recurrence). Univariate and multivariate stepwise Logistic regression analysis was used to screen risk factors for recurrence of ischemic stroke.  Results There were significant differences between the first-onset group and the recurrence group in age, the proportion of patients > 75 years old, hypertension, diabetes, coronary heart disease, peripheral angiopathy, transient ischemic attack (TIA) or ischemic stroke, drinking and ESRS score (P < 0.05, for all). The first-onset group included one case of ESRS 0 (1.04%), 8 cases of 1 (8.33%), 39 cases of 2 (40.63%), 44 cases of 3 (45.83%) and 4 cases of 4 (4.17%). The recurrence group included 2 cases of ESRS 3 (2.50%), 20 cases of 4 (25%), 37 cases of 5 (46.25%), 18 cases of 6 (22.50%) and 3 cases of 7 (3.75%). There was a significant difference between the 2 groups (Z = -11.376, P = 0.000). Logistic regression analysis showed that ESRS > 3 was an independent risk factor for recurrence of ischemic stroke (OR = 31.324, 95%CI: 3.934-249.430; P = 0.001).  Conclusions ESRS > 3 is an independent risk factor for recurrence of ischemic stroke. It is important to strengthen risk assessment of recurrence of ischemic stroke. Screening and controlling risk factors is the key to secondary prevention of ischemic stroke. DOI: 10.3969/j.issn.1672-6731.2016.07.011
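    The ESRS is an additive point score: one point per risk factor, with two points for age over 75 (one point for ages 65-75). A simplified sketch of the scoring (the field names are assumptions of this sketch; consult the published score for the exact item definitions):

```python
# Simplified Essen Stroke Risk Score calculator. Field names are this
# sketch's assumptions, not a clinical implementation.
def essen_score(p):
    score = 0
    if p["age"] > 75:
        score += 2          # age > 75 years scores two points
    elif p["age"] >= 65:
        score += 1          # age 65-75 years scores one point
    for factor in ("hypertension", "diabetes", "previous_mi",
                   "other_cardiovascular", "peripheral_artery_disease",
                   "smoker", "previous_tia_or_stroke"):
        if p.get(factor, False):
            score += 1
    return score

patient = {"age": 78, "hypertension": True, "diabetes": True,
           "previous_tia_or_stroke": True}
s = essen_score(patient)
print(s, "-> high recurrence risk" if s > 3 else "-> lower risk")
```

    A score above 3 places the patient in the high-recurrence-risk group, the cut-off the study identifies as an independent risk factor.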

  6. Assessment and structural analysis of a PCPV with hot liner and adjustable wall temperature

    International Nuclear Information System (INIS)

    The great adaptability of the concept with an elastic hot liner and adjustable wall temperature can be seen in the design and assessment of the PCPV for different reactor types. The first part of the paper presents an overview of the influence of, and possible reactions to, the main existing assumptions for this special concept. One of the most essential features - the limitation of liner stresses to elastic compression - can be attained by balancing liner and structural concrete temperatures. The temperature difference between these two components mainly influences the stress state of the liner. Transient conditions mostly extend only to the region of the liner and thermal barrier. Knowledge of material properties is a fundamental requirement of every analysis. A study demonstrates how temperature and the long-term behaviour of materials influence the stress and strain history of the vessel. The concept offers the possibility of vessel stabilization before operation. This method, which anticipates visco-elastic deformations, has particular importance for high operating temperatures. The decision whether to stabilize or not depends both on the thermal assessment and on the long-term restraints of the liner, and requires an optimization of these effects. The second part of the paper deals with the analysis methods used in the development of the Austrian PCPV and their results. By means of two- and three-dimensional calculations for the reference design of a PWR with 1500 MWe, some of the above-mentioned aspects are explained. Stress and deformation diagrams indicate the possibility of safe operation. These extensive investigations and analyses also yielded insight into the analytical possibilities for vessel design and their costs. (orig.)
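
    The role of the liner-concrete temperature difference can be illustrated with the standard biaxial thermal-stress formula for a fully restrained elastic liner, sigma = E * alpha * dT / (1 - nu). The material values below are generic steel assumptions, not the Austrian PCPV design data:

```python
def liner_thermal_stress(e_pa, alpha_per_k, delta_t_k, poisson):
    """Biaxial thermal stress (Pa) in a fully restrained elastic plate/liner."""
    return e_pa * alpha_per_k * delta_t_k / (1.0 - poisson)

# Example: a 10 K difference between liner and structural concrete,
# with generic steel properties (E = 200 GPa, alpha = 12e-6 /K, nu = 0.3).
stress = liner_thermal_stress(e_pa=200e9, alpha_per_k=12e-6,
                              delta_t_k=10.0, poisson=0.3)
```

    The linear dependence on delta_t_k is why balancing the two component temperatures limits liner stress to the elastic range.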

  7. Quantitative assessment of the stent/scaffold strut embedment analysis by optical coherence tomography.

    Science.gov (United States)

    Sotomi, Yohei; Tateishi, Hiroki; Suwannasom, Pannipa; Dijkstra, Jouke; Eggermont, Jeroen; Liu, Shengnan; Tenekecioglu, Erhan; Zheng, Yaping; Abdelghani, Mohammad; Cavalcante, Rafael; de Winter, Robbert J; Wykrzykowska, Joanna J; Onuma, Yoshinobu; Serruys, Patrick W; Kimura, Takeshi

    2016-06-01

    The degree of stent/scaffold embedment could be a surrogate parameter of the vessel wall-stent/scaffold interaction and could have biological implications in the vascular response. We have developed new dedicated software for the quantitative evaluation of the embedment of struts by optical coherence tomography (OCT). In the present study, we describe the algorithm of the embedment analysis and its reproducibility. The degree of embedment was evaluated as the ratio of the embedded part versus the whole strut height and subdivided into quartiles. The agreement and the inter- and intra-observer reproducibility were evaluated using the kappa coefficient and the intraclass correlation coefficient (ICC). A total of 4 pullbacks of OCT images in 4 randomly selected coronary lesions with 3.0 × 18 mm devices [2 lesions with Absorb BVS and 2 lesions with XIENCE (both from Abbott Vascular, Santa Clara, CA, USA)] from the Absorb Japan trial were evaluated by two investigators with QCU-CMS software version 4.69 (Leiden University Medical Center, Leiden, The Netherlands). Finally, 1481 polymeric struts in 174 cross-sections and 1415 metallic struts in 161 cross-sections were analyzed. Inter- and intra-observer reproducibility of quantitative measurements of the embedment ratio and categorical assessment of embedment in Absorb BVS and XIENCE showed excellent agreement, with ICC ranging from 0.958 to 0.999 and kappa ranging from 0.850 to 0.980. The newly developed embedment software showed excellent reproducibility. Computer-assisted embedment analysis could be a feasible tool to assess strut penetration into the vessel wall, which could be a surrogate of the acute injury caused by implantation of devices. PMID:26898315
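
    The embedment metric described above (embedded part over whole strut height, subdivided into quartiles) can be sketched directly; the measurements and the quartile-binning convention below are illustrative assumptions, not the software's actual implementation:

```python
def embedment_ratio(embedded_depth_um, strut_height_um):
    """Embedded part of the strut as a fraction of total strut height."""
    return embedded_depth_um / strut_height_um

def embedment_quartile(ratio):
    """Bin a 0-1 embedment ratio into quartiles Q1 (1) to Q4 (4)."""
    if not 0.0 <= ratio <= 1.0:
        raise ValueError("ratio must be within [0, 1]")
    return min(int(ratio * 4) + 1, 4)

# Example: a strut embedded 75 um into the wall out of a 150 um height.
ratio = embedment_ratio(75.0, 150.0)
quartile = embedment_quartile(ratio)
```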

  8. Extended defense systems: I. Adversary-defender modeling grammar for vulnerability analysis and threat assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Merkle, Peter Benedict

    2006-03-01

    Vulnerability analysis and threat assessment require systematic treatments of adversary and defender characteristics. This work addresses the need for a formal grammar for the modeling and analysis of adversary and defender engagements of interest to the National Nuclear Security Administration (NNSA). Analytical methods treating both linguistic and numerical information should ensure that neither aspect has disproportionate influence on assessment outcomes. The adversary-defender modeling (ADM) grammar employs classical set theory and notation. It is designed to incorporate contributions from subject matter experts in all relevant disciplines, without bias. The Attack Scenario Space U_S is the set universe of all scenarios possible under physical laws. An attack scenario is a postulated event consisting of the active engagement of at least one adversary with at least one defended target. The Target Information Space I_S is the universe of information about targets and defenders. Adversary and defender groups are described by their respective Character super-sets, (A)_P and (D)_F. Each super-set contains six elements: Objectives, Knowledge, Veracity, Plans, Resources, and Skills. The Objectives are the desired end-state outcomes. Knowledge comprises empirical and theoretical a priori knowledge and emergent knowledge (learned during an attack), while Veracity is the correspondence of Knowledge with fact or outcome. Plans are ordered activity-task sequences (tuples) with logical contingencies. Resources are the a priori and opportunistic physical assets and intangible attributes applied to the execution of associated Plans elements. Skills for both adversary and defender include the assumed general and task competencies for the associated plan set, the realized value of competence in execution or exercise, and the opponent's planning assumption of the task competence.
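
    The six-element Character super-set can be sketched as a data structure; the field types and the sample adversary/defender values are illustrative assumptions, not part of the ADM grammar itself:

```python
from dataclasses import dataclass, field

@dataclass
class Character:
    """One ADM character super-set: the same six elements describe
    both adversary (A)_P and defender (D)_F groups."""
    objectives: set = field(default_factory=set)   # desired end-state outcomes
    knowledge: set = field(default_factory=set)    # a priori + emergent knowledge
    veracity: float = 1.0                          # correspondence of knowledge with fact
    plans: list = field(default_factory=list)      # ordered activity-task tuples
    resources: set = field(default_factory=set)    # physical assets + intangibles
    skills: dict = field(default_factory=dict)     # task -> assumed competence

# Hypothetical instances for one engagement.
adversary = Character(objectives={"acquire material"},
                      plans=[("breach", "access", "remove")])
defender = Character(objectives={"deny access"},
                     skills={"detect": 0.9})
```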

  9. Value-Based Assessment of New Medical Technologies: Towards a Robust Methodological Framework for the Application of Multiple Criteria Decision Analysis in the Context of Health Technology Assessment.

    Science.gov (United States)

    Angelis, Aris; Kanavos, Panos

    2016-05-01

    In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
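
    The additive aggregation at the heart of an MAVT-based MCDA can be sketched as follows; the criteria, weights, and option scores are illustrative stand-ins, not an actual HTA appraisal:

```python
def mavt_value(scores, weights):
    """Additive multi-attribute value: weighted sum of 0-1 criterion scores."""
    total_w = sum(weights.values())
    return sum(scores[c] * weights[c] for c in weights) / total_w

# Elicited relative importance of each (hypothetical) criterion.
weights = {"efficacy": 0.5, "safety": 0.3, "innovation": 0.2}

# Standardized 0-1 performance of two hypothetical technologies.
option_a = {"efficacy": 0.8, "safety": 0.6, "innovation": 0.4}
option_b = {"efficacy": 0.5, "safety": 0.8, "innovation": 0.6}

options = {"A": option_a, "B": option_b}
ranking = sorted(options, key=lambda o: -mavt_value(options[o], weights))
```

    The two preference-elicitation steps the paper distinguishes map directly onto the inputs: scores elicit performance preferences within criteria, while weights elicit preferences across criteria.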

  11. Gait and function as tools for the assessment of fracture repair - the role of movement analysis for the assessment of fracture healing.

    Science.gov (United States)

    Rosenbaum, Dieter; Macri, Felipe; Lupselo, Fernando Silva; Preis, Osvaldo Cristiano

    2014-06-01

    Assessment of gait and function might be a sensitive tool to monitor the progress of fracture healing. Currently available assessment tools for function use instrumented three-dimensional gait analysis or pedobarography. The analysis focuses on gait or movement parameters and seeks to identify abnormalities or asymmetries between legs or arms. The additional inclusion of muscle function by electromyography can further elucidate functional performance and its temporal development. Alternative approaches abstain from directly assessing function in the laboratory and instead determine the amount of activities of daily living or the mere ability to perform defined tasks such as walking, stair climbing or running. Some of these methods have been applied to determine recovery after orthopaedic interventions including fracture repair. The combination of lab-based functional measurements and assessment of physical activities in daily life may offer a valuable level of information about gait quality and quantity in individual patients, shedding light on functional limitations or rehabilitation of gait and mobility after a disease or injury and the respective conservative, medical or surgical treatment.
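
    One widely used way to quantify the left-right asymmetries mentioned above is a symmetry index over a step parameter. The formula below is one common variant (0 % = perfect symmetry); the step lengths are illustrative values, not patient data:

```python
def symmetry_index(left, right):
    """SI = |L - R| / (0.5 * (L + R)) * 100, in percent."""
    return abs(left - right) / (0.5 * (left + right)) * 100.0

# Hypothetical step lengths (m) for the affected and unaffected side.
si = symmetry_index(0.68, 0.62)
```

    Tracking SI over the weeks after fracture fixation would give one scalar measure of how gait symmetry recovers.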

  12. Pupillometric analysis for assessment of gene therapy in Leber Congenital Amaurosis patients

    Directory of Open Access Journals (Sweden)

    Melillo Paolo

    2012-07-01

    Full Text Available Abstract Background Objective techniques to assess the amelioration of vision in patients with impaired visual function are needed to standardize efficacy assessment in gene therapy trials for ocular diseases. Pupillometry has been investigated in several diseases in order to provide objective information about the visual reflex pathway and has been adopted to quantify visual impairment in patients with Leber Congenital Amaurosis (LCA). In this paper, we describe detailed methods of pupillometric analysis and a case study on three Italian patients affected by LCA involved in a gene therapy clinical trial at two follow-up time-points: 1 year and 3 years after therapy administration. Methods Pupillary light reflexes (PLR) were measured in patients who had received a unilateral subretinal injection in a clinical gene therapy trial. Pupil images were recorded simultaneously in both eyes with a commercial pupillometer and related software. A program was written in MATLAB in order to enable enhanced pupil detection with revision of the acquired images (correcting aberrations due to the inability of these severely visually impaired patients to fixate) and computation of the pupillometric parameters for each stimulus. Pupil detection was performed through the Hough transform, and a non-parametric paired statistical test was adopted for comparison. Results The developed program provided correct pupil detection also for frames in which the pupil is not totally visible. Moreover, it provided automatic computation of the pupillometric parameters for each stimulus and enabled semi-automatic revision of computerized detection, eliminating the need for the user to manually check frame by frame. 
With reference to the case study, the amplitude of pupillary constriction and the constriction velocity were increased in the right (treated) eye compared to the left (untreated) eye at both follow-up time-points, showing stability of
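
    The two parameters reported for the case study, constriction amplitude and constriction velocity, can be sketched from a pupil-diameter trace. The trace, sampling rate, and baseline convention below are illustrative assumptions, not the trial's protocol:

```python
import numpy as np

def plr_parameters(diameter_mm, fs_hz, baseline_samples=10):
    """Constriction amplitude (baseline minus minimum diameter) and peak
    constriction velocity (most negative diameter derivative, mm/s)."""
    d = np.asarray(diameter_mm, dtype=float)
    baseline = d[:baseline_samples].mean()
    amplitude = baseline - d.min()
    velocity = np.diff(d) * fs_hz          # finite-difference derivative, mm/s
    return amplitude, velocity.min()

# Synthetic trace: 10 baseline samples at 5.0 mm, then constriction
# to 3.5 mm and partial redilation, sampled at 30 Hz.
trace = [5.0] * 10 + [4.5, 4.0, 3.5, 3.6, 3.8]
amplitude, peak_velocity = plr_parameters(trace, fs_hz=30.0)
```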

  13. Body electrical loss analysis (BELA) in the assessment of visceral fat: a demonstration

    Directory of Open Access Journals (Sweden)

    Blomqvist Kim H

    2011-11-01

    Full Text Available Abstract Background Body electrical loss analysis (BELA) is a new non-invasive way to assess visceral fat depot size through the use of electromagnetism. BELA has worked well in phantom measurements, but the technology is not yet fully validated. Methods Ten volunteers (5 men and 5 women, age: 22-60 y, BMI: 21-30 kg/m2, waist circumference: 73-108 cm) were measured with the BELA instrument and with cross-sectional magnetic resonance imaging (MRI) at the navel level, navel +5 cm and navel -5 cm. The BELA signal was compared with visceral and subcutaneous fat areas calculated from the MR images. Results The BELA signal did not correlate with subcutaneous fat area at any level, but correlated significantly with visceral fat area at the navel level and navel +5 cm. The correlation was best at the level of navel +5 cm (R2 = 0.74, LOOCV = 40.1 cm2, where LOOCV is the root mean squared error of leave-one-out style cross-validation). The average estimate of repeatability of the BELA signal observed through the study was ±9.6%. One of the volunteers had an exceptionally large amount of visceral fat, which was underestimated by BELA. Conclusions The correlation of the BELA signal with the visceral but not with the subcutaneous fat area as measured by MRI is promising. The lack of correlation with subcutaneous fat suggests that subcutaneous fat has only a minor influence on the BELA signal. Further research will show whether it is possible to develop a reliable low-cost method for the assessment of visceral fat either using BELA only or combining it, for example, with bioelectrical impedance measurement. The combination of these measurements may help assess visceral fat in large-scale body composition studies. Before large-scale clinical testing and ROC analysis, the initial BELA instrumentation requires improvements, as the accuracy of the present equipment is not yet sufficient.
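
    The LOOCV error metric used above can be sketched for a simple linear fit of visceral fat area on the BELA signal. The data points are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

def loocv_rmse(x, y):
    """Leave-one-out cross-validation RMSE for a straight-line fit of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i            # hold out one subject
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        errors.append(y[i] - (slope * x[i] + intercept))
    return float(np.sqrt(np.mean(np.square(errors))))

# Synthetic, perfectly linear signal/area pairs (error should be ~0).
bela_signal = [1.0, 2.0, 3.0, 4.0, 5.0]
visceral_area = [2.0, 4.0, 6.0, 8.0, 10.0]
rmse = loocv_rmse(bela_signal, visceral_area)
```

    On real data the held-out residuals are non-zero, and the resulting RMSE (40.1 cm2 in the study) indicates how well the calibration generalizes to an unseen subject.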

  14. Developmental assessment of the Fort St. Vrain version of the composite HTGR analysis program (CHAP-2)

    International Nuclear Information System (INIS)

    The Composite HTGR Analysis Program (CHAP) consists of a model-independent systems analysis mainframe named LASAN and model-dependent linked code modules, each representing a component, subsystem, or phenomenon of an HTGR plant. The Fort St. Vrain version (CHAP-2) includes 21 coded modules that model the neutron kinetics and thermal response of the core; the thermal-hydraulics of the reactor primary coolant system, secondary steam supply system, and balance-of-plant; the actions of the control system and plant protection system; the response of the reactor building; and the relative hazard resulting from fuel particle failure. FSV steady-state and transient plant data are being used to partially verify the component modeling and dynamic simulation techniques used to predict plant response to postulated accident sequences. Results of these preliminary validation efforts are presented showing good agreement between code output and plant data for the portions of the code that have been tested. Plans for further development and assessment as well as application of the validated code are discussed. (author)

  15. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    Science.gov (United States)

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
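
    The quantification step can be sketched as thresholding a stained grayscale image and reporting the stained-area fraction. The threshold value and the tiny synthetic "photograph" below are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def biofilm_coverage(gray_image, threshold=0.5):
    """Fraction of pixels whose stain intensity exceeds the threshold."""
    img = np.asarray(gray_image, dtype=float)
    return float((img > threshold).mean())

# Synthetic 4x4 grayscale image: bright (stained) biofilm on the left,
# dark (bare) substrate on the right.
image = np.array([[0.9, 0.8, 0.1, 0.2],
                  [0.7, 0.9, 0.0, 0.1],
                  [0.6, 0.2, 0.1, 0.0],
                  [0.8, 0.3, 0.2, 0.1]])
coverage = biofilm_coverage(image)
```

    Mapping the same boolean mask over image tiles, rather than averaging it globally, would give the spatial-heterogeneity view the abstract describes.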

  16. Pepper seed germination assessed by combined X-radiography and computer-aided imaging analysis

    International Nuclear Information System (INIS)

    A pepper seed lot with 87% germination was subjected to X-ray inspection using a non-lethal dose of radiation. Seeds with less than 2.7% free space area (on the basis of total seed area), i.e. the spaces between embryo and endosperm, were classified as highly viable seeds (97-100% germination) with the lowest level of abnormal seedlings. Seeds classified as good by X-ray were subjected to computerized image analysis to study seed imbibition and radicle elongation. The pattern of seed area increase, chosen as the most accurate indicator of seed swelling, resembled the triphasic curve of water uptake. The first phase was completed at 9 h, followed by a second phase that varied widely in duration, with germination completed between 52 and 96 h. The proportion of seeds with radicle protrusion between 52-56 h and 64-72 h assessed with the image analysis was significantly higher than that recorded using a conventional germination test. In addition, the rate of increase of seed area during the third phase of imbibition, mostly due to protrusion of the radicle tip and its growth, was highly correlated with the corresponding radicle elongation rate.
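
    The X-ray classification rule reduces to a simple percentage cut-off on the free-space area. The function below is a minimal sketch of that rule; the area values in the example are hypothetical:

```python
def classify_seed(free_space_area, total_seed_area, cutoff_percent=2.7):
    """Classify a seed by its free-space area (spaces between embryo and
    endosperm) as a percentage of total seed area; the study's cut-off
    for highly viable seeds was < 2.7 %."""
    percent = 100.0 * free_space_area / total_seed_area
    return "highly viable" if percent < cutoff_percent else "lower vigour"

# Hypothetical seeds: 1 % and 5 % free space of total seed area.
good = classify_seed(1.0, 100.0)
poor = classify_seed(5.0, 100.0)
```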

  17. APPLICATION OF LASER SCANNING SURVEYING TO ROCK SLOPES RISK ASSESSMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Corsetti

    2014-01-01

    Full Text Available The methods for understanding rock instability mechanisms and for evaluating potential destructive scenarios are of great importance in risk assessment analysis dedicated to the establishment of appropriate prevention and mitigation actions. When the portion of the unstable rock mass is very large, effective actions to counteract the risks are complex and expensive. In these conditions, optimal risk management cannot ignore procedures able to acquire, rapidly and accurately, (i) geometrical data for modelling the geometry of the rock walls and implementing reliable forecasting models, and (ii) monitoring data able to describe the magnitude and direction of deformation processes. These data contribute to the prediction of the behaviour of a landslide if the measurements are acquired frequently and reliable numerical models can be implemented. Innovative geomatic techniques, based on GPS, Terrestrial Laser Scanning (TLS), automated total stations, and satellite and ground SAR interferometry, have recently been applied to define the geometry and monitor the displacements of unstable slopes. Among these, TLS is mainly adopted to generate detailed 3D models useful to reconstruct rock wall geometry, contributing to the estimation of geo-mechanical parameters, that is, orientation, persistence and apparent spacing of rock discontinuities. Two examples of application of the TLS technique, to the analysis of a large front in a quarry and of a rock shoulder of a dam, are presented.
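
    One of the geo-mechanical parameters named above, discontinuity orientation, is commonly derived from a plane fitted to TLS points: the plane's unit normal is converted to dip and dip-direction angles. The conversion below is a standard geometric sketch, with z up and y north, not the workflow used in the paper:

```python
import math

def dip_and_dip_direction(nx, ny, nz):
    """Convert a discontinuity plane's unit normal (x=east, y=north, z=up)
    into dip (degrees from horizontal) and dip direction (azimuth)."""
    if nz < 0:                                # use the upward-pointing normal
        nx, ny, nz = -nx, -ny, -nz
    dip = math.degrees(math.acos(nz))         # angle between plane and horizontal
    dip_dir = math.degrees(math.atan2(nx, ny)) % 360.0
    return dip, dip_dir

# Example: a vertical discontinuity facing east (normal points east).
dip, dip_dir = dip_and_dip_direction(1.0, 0.0, 0.0)
```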

  19. Assessment of Slope Instability and Risk Analysis of Road Cut Slopes in Lashotor Pass, Iran

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Taherynia

    2014-01-01

    Full Text Available Assessment of the stability of natural and artificial rock slopes is an important topic in rock mechanics. One of the most widely used methods for this purpose is the classification of the slope rock mass. In recent decades, several rock slope classification systems have been presented by many researchers. Each of these rock mass classification systems uses different parameters and rating systems. These differences are due to the diversity of affecting parameters and their degree of influence on rock slope stability. Another important point in rock slope stability is hazard appraisal and risk analysis. In the risk analysis, the degree of danger of rock slope instability is determined. The Lashotor pass is located on the Shiraz-Isfahan highway in Iran. Field surveys indicate that there is a high potential for instability in the road cut slopes of the Lashotor pass. In the current paper, the stability of the rock slopes in the Lashotor pass is studied comprehensively with different classification methods. For the risk analysis, we estimated the endangered area using the RocFall software. Furthermore, the danger of falling rocks for vehicles passing through the Lashotor pass is estimated according to the rockfall hazard rating system.

  20. A New Analysis Tool Assessment for Rotordynamic Modeling of Gas Foil Bearings

    Science.gov (United States)

    Howard, Samuel A.; SanAndres, Luis

    2010-01-01

    Gas foil bearings offer several advantages over traditional bearing types that make them attractive for use in high-speed turbomachinery. They can operate at very high temperatures, require no lubrication supply (oil pumps, seals, etc.), exhibit very long life with no maintenance, and once operating airborne, have very low power loss. The use of gas foil bearings in high-speed turbomachinery has been accelerating in recent years, although the pace has been slow. One of the contributing factors to the slow growth has been a lack of analysis tools, benchmarked to measurements, to predict gas foil bearing behavior in rotating machinery. To address this shortcoming, NASA Glenn Research Center (GRC) has supported the development of analytical tools to predict gas foil bearing performance. One of the codes has the capability to predict rotordynamic coefficients, power loss, film thickness, structural deformation, and more. The current paper presents an assessment of the predictive capability of the code, named XLGFBTH (Texas A&M University). A test rig at GRC is used as a simulated case study to compare rotordynamic analysis using output from the code to actual rotor response as measured in the test rig. The test rig rotor is supported on two gas foil journal bearings manufactured at GRC, with all pertinent geometry disclosed. The resulting comparison shows that the rotordynamic coefficients calculated using XLGFBTH represent the dynamics of the system reasonably well, especially as they pertain to predicting critical speeds.
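
    A first-order feel for the critical-speed predictions being compared can be had from the undamped single-degree-of-freedom estimate omega_n = sqrt(k/m), using the bearings' combined direct stiffness. The stiffness and mass values below are illustrative assumptions, not XLGFBTH output or GRC rig data:

```python
import math

def critical_speed_rpm(total_stiffness_n_per_m, rotor_mass_kg):
    """Undamped critical speed (rpm) of a rigid rotor on bearings of
    combined direct stiffness k: omega_n = sqrt(k/m)."""
    omega = math.sqrt(total_stiffness_n_per_m / rotor_mass_kg)  # rad/s
    return omega * 60.0 / (2.0 * math.pi)

# Example: a 5 kg rotor on two foil bearings of ~1 MN/m direct stiffness each.
rpm = critical_speed_rpm(2.0e6, 5.0)
```

    A full rotordynamic code such as XLGFBTH predicts speed-dependent stiffness and damping coefficients, so its critical speeds shift with rotor speed; this constant-stiffness estimate is only a sanity check.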