WorldWideScience

Sample records for evaluation statistical support

  1. Monitoring and Evaluation; Statistical Support for Life-cycle Studies, 2003 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Skalski, John

    2003-12-01

    This report summarizes the statistical analysis and consulting activities performed under Contract No. 00004134, Project No. 199105100 funded by Bonneville Power Administration during 2003. These efforts are focused on providing real-time predictions of outmigration timing, assessment of life-history performance measures, evaluation of status and trends in recovery, and guidance on the design and analysis of Columbia Basin fish and wildlife monitoring and evaluation studies. The overall objective of the project is to provide BPA and the rest of the fisheries community with statistical guidance on design, analysis, and interpretation of monitoring data, which will lead to improved monitoring and evaluation of salmonid mitigation programs in the Columbia/Snake River Basin. This overall goal is being accomplished by making fisheries data readily available for public scrutiny, providing statistical guidance on the design and analyses of studies through hands-on support and written documents, and providing real-time analyses of tagging results during the smolt outmigration for review by decision makers. For a decade, this project has been providing in-season projections of smolt outmigration timing to assist in spill management. As many as 50 different fish stocks at 8 different hydroprojects are tracked in real time to predict the 'percent of run to date' and 'date to specific percentile'. The project also conducts added-value analyses of historical tagging data to understand relationships between fish responses, environmental factors, and anthropogenic effects. The statistical analysis of historical tagging data crosses agency lines in order to assimilate information on salmon population dynamics irrespective of origin. The lessons learned from past studies are used to improve the design and analyses of future monitoring and evaluation efforts. Through these efforts, the project attempts to provide the fisheries community with reliable analyses

  2. The Statistical Value Chain - a Benchmarking Checklist for Decision Makers to Evaluate Decision Support Seen from a Statistical Point-Of-View

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg; Henningsen, Geraldine; Wood, Christian D.

    2013-01-01

    quantitative methods exist for evaluating uncertainty—for example, Monte Carlo simulation—and such methods work very well when the AN is in full control of the data collection and model-building processes. In many cases, however, the AN is not in control of these processes. In this article we develop a simple...... method that a DM can employ in order to evaluate the process of decision support from a statistical point-of-view. We call this approach the “Statistical Value Chain” (SVC): a consecutive benchmarking checklist with eight steps that can be used to evaluate decision support seen from a statistical point-of-view....
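
    The Monte Carlo simulation mentioned in the abstract can be illustrated with a minimal sketch. Everything below is invented for illustration (the toy model, the input distributions, and the sample size are assumptions, not taken from the article): uncertain inputs are sampled repeatedly, and the spread of the outputs quantifies the uncertainty of the result.

```python
import random
import statistics

def monte_carlo_interval(model, samplers, n=100_000, alpha=0.05, seed=1):
    """Propagate input uncertainty through `model` by repeated sampling.

    samplers: callables that each draw one input value from an rng.
    Returns (mean, lower, upper), where [lower, upper] is an empirical
    (1 - alpha) interval of the model output.
    """
    rng = random.Random(seed)
    outputs = sorted(model(*(s(rng) for s in samplers)) for _ in range(n))
    lo = outputs[int(n * alpha / 2)]
    hi = outputs[int(n * (1 - alpha / 2))]
    return statistics.fmean(outputs), lo, hi

# Toy model: y = a * b with independent normal inputs (illustrative only).
mean, lo, hi = monte_carlo_interval(
    lambda a, b: a * b,
    [lambda r: r.gauss(2.0, 0.1), lambda r: r.gauss(3.0, 0.2)],
)
```

    As the abstract notes, this works well when the analyst controls the data collection and model building; the checklist in the article addresses the harder case where they do not.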

  3. Data for Development : An Evaluation of World Bank Support for Data and Statistical Capacity

    OpenAIRE

    Independent Evaluation Group

    2017-01-01

    This evaluation’s objective was to assess how effectively the World Bank has supported development data production, sharing, and use, and to suggest ways to improve its approach. This evaluation defines development data as data produced by country systems, the World Bank, or third parties on countries’ social, economic, and environmental issues. At the global level, the World Bank has a st...

  4. Evaluating European imports of Asian aquaculture products using statistically supported life cycle assessments

    NARCIS (Netherlands)

    Henriksson, Patrik John Gustav

    2015-01-01

    This thesis aims to evaluate the environmental sustainability of European imports of farmed aquatic food products from Asia, using life cycle assessment (LCA). Farming of Asian tiger prawn, whiteleg shrimp, freshwater prawn, tilapia and pangasius catfish in Bangladesh, China, Thailand and Vietnam

  5. Statistically downscaled climate projections to support evaluating climate change risks for hydropower

    International Nuclear Information System (INIS)

    Brekke, L.

    2008-01-01

    This paper described a web-served public-access archive of downscaled climate projections developed as a tool for water managers of river and hydropower systems. The archive provided access to climate projection data at basin-relevant resolution and included an extensive compilation of downscaled climate projections designed to support risk-based adaptation planning. Downscaled translations of 112 contemporary climate projections produced under the World Climate Research Programme's Coupled Model Intercomparison Project were also included. Datasets for the coupled models included temperature and precipitation, monthly time-steps, and geographic coverage for the United States and portions of Mexico and Canada. It was concluded that the archive will be used to develop risk-based studies on shifts in seasonal patterns, changes in mean annual runoff, and associated responses in water resources and hydroelectric power management. Case studies demonstrating reclamation applications of archive content and potential applications for hydroelectric power production impacts were included. tabs., figs

  6. Decision support using nonparametric statistics

    CERN Document Server

    Beatty, Warren

    2018-01-01

    This concise volume covers the nonparametric statistics topics that are most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.
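
    One concrete example of the kind of tool such a book covers is the Mann-Whitney U statistic, a standard nonparametric alternative to the two-sample t-test that makes no assumption about the shape of the data. This sketch is illustrative only; the satisfaction scores are invented.

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus sample y.

    Counts pairs (a, b) with a < b, crediting ties 0.5. Only the ordering
    of the observations matters, which is what makes the method
    nonparametric.
    """
    return sum(1.0 if a < b else 0.5 if a == b else 0.0
               for a in x for b in y)

# Hypothetical customer-satisfaction scores for two service designs.
scores_a = [3, 4, 4, 5, 6]
scores_b = [5, 6, 7, 7, 8]
u = mann_whitney_u(scores_a, scores_b)
```

    In practice one would compare U (here out of a maximum of 25) against its null distribution or use a library routine; the point of the sketch is that the statistic itself is just a count over pairs.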

  7. Evaluating statistical cloud schemes

    OpenAIRE

    Grützun, Verena; Quaas, Johannes; Morcrette , Cyril J.; Ament, Felix

    2015-01-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based re...

  8. Clinical Decision Support: Statistical Hopes and Challenges

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Zvárová, Jana

    2016-01-01

    Roč. 4, č. 1 (2016), s. 30-34 ISSN 1805-8698 Grant - others: Nadační fond na podporu vědy (CZ) Neuron Institutional support: RVO:67985807 Keywords: decision support * data mining * multivariate statistics * psychiatry * information based medicine Subject RIV: BB - Applied Statistics, Operational Research

  9. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  10. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in 't Veld, M. M.A.; Boogaard, S.A.A. van den

    2007-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation -meetings- by means of a set of

  11. Evaluating meeting support tools

    NARCIS (Netherlands)

    Post, W.M.; Huis in't Veld, M.A.A.; Boogaard, S.A.A. van den

    2008-01-01

    Many attempts are underway for developing meeting support tools, but less attention is paid to the evaluation of meetingware. This article describes the development and testing of an instrument for evaluating meeting tools. First, we specified the object of evaluation - meetings - by means of a set

  12. Statistical modeling to support power system planning

    Science.gov (United States)

    Staid, Andrea

    This dissertation focuses on data-analytic approaches that improve our understanding of power system applications to promote better decision-making. It tackles issues of risk analysis, uncertainty management, resource estimation, and the impacts of climate change. Tools of data mining and statistical modeling are used to bring new insight to a variety of complex problems facing today's power system. The overarching goal of this research is to improve the understanding of the power system risk environment for improved operation, investment, and planning decisions. The first chapter introduces some challenges faced in planning for a sustainable power system. Chapter 2 analyzes the driving factors behind the disparity in wind energy investments among states with a goal of determining the impact that state-level policies have on incentivizing wind energy. Findings show that policy differences do not explain the disparities; physical and geographical factors are more important. Chapter 3 extends conventional wind forecasting to a risk-based focus of predicting maximum wind speeds, which are dangerous for offshore operations. Statistical models are presented that issue probabilistic predictions for the highest wind speed expected in a three-hour interval. These models achieve a high degree of accuracy and their use can improve safety and reliability in practice. Chapter 4 examines the challenges of wind power estimation for onshore wind farms. Several methods for wind power resource assessment are compared, and the weaknesses of the Jensen model are demonstrated. For two onshore farms, statistical models outperform other methods, even when very little information is known about the wind farm. Lastly, chapter 5 focuses on the power system more broadly in the context of the risks expected from tropical cyclones in a changing climate. Risks to U.S. power system infrastructure are simulated under different scenarios of tropical cyclone behavior that may result from climate
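
    The dissertation's models are not reproduced here, but the flavor of probabilistic prediction of maxima (as in chapter 3) can be sketched with a common textbook approach: fit a Gumbel extreme-value distribution to observed maxima by the method of moments and read off exceedance probabilities. The distribution choice and the wind-speed data below are assumptions for illustration, not the author's method.

```python
import math
import statistics

def gumbel_exceedance(maxima, threshold):
    """P(max wind speed > threshold) under a Gumbel fit to observed maxima.

    Method-of-moments fit: beta = sqrt(6)*sd/pi and mu = mean - gamma*beta,
    where gamma is the Euler-Mascheroni constant.
    """
    mean = statistics.fmean(maxima)
    sd = statistics.stdev(maxima)
    beta = math.sqrt(6) * sd / math.pi
    mu = mean - 0.5772156649 * beta
    cdf = math.exp(-math.exp(-(threshold - mu) / beta))
    return 1.0 - cdf

# Invented three-hourly maximum wind speeds (m/s).
obs = [14.2, 16.8, 13.5, 18.1, 15.0, 17.3, 14.9, 16.1]
p25 = gumbel_exceedance(obs, 25.0)
```

    A forecaster would report such exceedance probabilities for operationally relevant thresholds; lower thresholds necessarily carry higher exceedance probabilities.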

  13. Electrical engineering research support for FDOT Traffic Statistics Office

    Science.gov (United States)

    2010-03-01

    The aim of this project was to provide electrical engineering support for the telemetered traffic monitoring sites (TTMSs) operated by the Statistics Office of the Florida Department of Transportation. This project was a continuation of project BD-54...

  14. Adaptive RAC codes employing statistical channel evaluation ...

    African Journals Online (AJOL)

    An adaptive encoding technique using row and column array (RAC) codes employing a different number of parity columns that depends on the channel state is proposed in this paper. The trellises of the proposed adaptive codes and a statistical channel evaluation technique employing these trellises are designed and ...

  15. PI-3 correlations and statistical evaluation results

    International Nuclear Information System (INIS)

    Pernica, R.; Cizek, J.

    1992-01-01

    Empirical Critical Heat Flux (CHF) correlations PI-3, which have the widest range of validity for flow conditions in both hexagonal and square rod bundle geometries, are presented and compared with published CHF correlations. They are valid for vertical water upflow through rod bundles with relatively wide and very tight rod lattices, and include axial and radial non-uniform heating. The correlations were developed with the use of more than 6000 data obtained from 119 electrically heated rod bundles. Comprehensive results of statistical evaluations of the new correlations are presented for various data bases. Also presented is a comparison of statistical evaluations of several well-known CHF correlations in the experimental data base used. A procedure which makes it possible to directly determine the probability that CHF does not occur is described for the purpose of nuclear safety assessment. (author) 8 tabs., 32 figs., 11 refs

  16. Developments in statistical evaluation of clinical trials

    CERN Document Server

    Oud, Johan; Ghidey, Wendimagegn

    2014-01-01

    This book describes various ways of approaching and interpreting the data produced by clinical trial studies, with a special emphasis on the essential role that biostatistics plays in clinical trials. Over the past few decades the role of statistics in the evaluation and interpretation of clinical data has become of paramount importance. As a result the standards of clinical study design, conduct and interpretation have undergone substantial improvement. The book includes 18 carefully reviewed chapters on recent developments in clinical trials and their statistical evaluation, with each chapter providing one or more examples involving typical data sets, enabling readers to apply the proposed procedures. The chapters employ a uniform style to enhance comparability between the approaches.

  17. A statistical evaluation of asbestos air concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Lange, J.H. [Envirosafe Training and Consultants, Pittsburgh, PA (United States)

    1999-07-01

    Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm{sup -3} of air, respectively. Summary values for area and personal samples suggest that exposures are low with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm{sup -3} of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are more related to the process than individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related, that is, there is no association observed for these two sampling methods when data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate exposure for asbestos abatement workers. (author)
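
    The correlation analysis described can be sketched as follows. The matched fibre concentrations below are invented for illustration (they are not the study's data); the point is that Pearson's r on matched area/personal pairs directly quantifies the (lack of) association the study reports.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient for two matched samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented matched pairs (f/cm^3): area vs personal sample for the same job.
area = [0.004, 0.006, 0.003, 0.007, 0.005]
personal = [0.030, 0.018, 0.025, 0.021, 0.026]
r = pearson_r(area, personal)
```

    An r near zero (with a non-significant test) would support the study's conclusion that area samples cannot serve as a surrogate for personal exposure.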

  18. A statistical evaluation of asbestos air concentrations

    International Nuclear Information System (INIS)

    Lange, J.H.

    1999-01-01

    Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm -3 of air, respectively. Summary values for area and personal samples suggest that exposures are low with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm -3 of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are more related to the process than individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related, that is, there is no association observed for these two sampling methods when data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate exposure for asbestos abatement workers. (author)

  19. Teaching Probability with the Support of the R Statistical Software

    Science.gov (United States)

    dos Santos Ferreira, Robson; Kataoka, Verônica Yumi; Karrer, Monica

    2014-01-01

    The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal's probabilistic literacy and Papert's constructionism. The results show improvement…

  20. Rapid Statistical Learning Supporting Word Extraction From Continuous Speech.

    Science.gov (United States)

    Batterink, Laura J

    2017-07-01

    The identification of words in continuous speech, known as speech segmentation, is a critical early step in language acquisition. This process is partially supported by statistical learning, the ability to extract patterns from the environment. Given that speech segmentation represents a potential bottleneck for language acquisition, patterns in speech may be extracted very rapidly, without extensive exposure. This hypothesis was examined by exposing participants to continuous speech streams composed of novel repeating nonsense words. Learning was measured on-line using a reaction time task. After merely one exposure to an embedded novel word, learners demonstrated significant learning effects, as revealed by faster responses to predictable than to unpredictable syllables. These results demonstrate that learners gained sensitivity to the statistical structure of unfamiliar speech on a very rapid timescale. This ability may play an essential role in early stages of language acquisition, allowing learners to rapidly identify word candidates and "break in" to an unfamiliar language.

  1. Quality research in healthcare: are researchers getting enough statistical support?

    Directory of Open Access Journals (Sweden)

    Ambler Gareth

    2006-01-01

    Background: Reviews of peer-reviewed health studies have highlighted problems with their methodological quality. As published health studies form the basis of many clinical decisions, including evaluation and provision of health services, this has scientific and ethical implications. The lack of involvement of methodologists (defined as statisticians or quantitative epidemiologists) has been suggested as one key reason for this problem, and this has been linked to the lack of access to methodologists. This issue was highlighted several years ago, and it was suggested that more investment was needed from health care organisations and universities to alleviate this problem. Methods: To assess the current level of methodological support available for health researchers in England, we surveyed the 25 National Health Service Trusts in England that are the major recipients of the Department of Health's research and development (R&D) support funding. Results and discussion: The survey shows that the earmarking of resources to provide appropriate methodological support to health researchers in these organisations is not widespread. Neither the level of R&D support funding received nor the volume of research undertaken by these organisations showed any association with the amount they spent in providing a central resource for methodological support for their researchers. Conclusion: The promotion and delivery of high-quality health research requires that organisations hosting health research and their academic partners put in place funding and systems to provide appropriate methodological support to ensure valid research findings. If resources are limited, health researchers may have to rely on short courses and/or a limited number of advisory sessions, which may not always produce satisfactory results.

  2. Ontology matching evaluation : A statistical perspective

    NARCIS (Netherlands)

    Mohammadi, M.; Hofman, W.J.; Tan, Y.H.

    2016-01-01

    This paper proposes statistical approaches to test whether the difference between two ontology matchers is real. Specifically, the performances of the matchers over multiple data sets are obtained, and based on their performances the conclusion can be drawn whether one method is better than the other
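
    A minimal version of such a comparison is the paired sign test: count the data sets on which matcher A beats matcher B and compute the one-sided binomial tail under the null hypothesis that each matcher is equally likely to win. The paper's actual procedure may differ, and the F1 scores below are invented.

```python
from math import comb

def sign_test_p(wins, n):
    """One-sided p-value: probability of at least `wins` wins in `n`
    paired comparisons if both matchers were equally good (p = 0.5)."""
    return sum(comb(n, k) for k in range(wins, n + 1)) / 2 ** n

# F1 scores of two hypothetical matchers on ten benchmark data sets.
f1_a = [0.81, 0.77, 0.90, 0.68, 0.74, 0.85, 0.79, 0.88, 0.72, 0.80]
f1_b = [0.78, 0.75, 0.86, 0.70, 0.71, 0.80, 0.77, 0.84, 0.69, 0.76]
wins = sum(a > b for a, b in zip(f1_a, f1_b))
p = sign_test_p(wins, len(f1_a))
```

    A small p-value indicates that one matcher's advantage across data sets is unlikely to be chance; rank-based alternatives such as the Wilcoxon signed-rank test use the magnitudes of the differences as well.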

  3. Ontology matching evaluation : A statistical perspective

    NARCIS (Netherlands)

    Mohammadi, M.; Hofman, Wout; Tan, Y.

    2016-01-01

    This paper proposes statistical approaches to test if the difference between two ontology matchers is real. Specifically, the performances of the matchers over multiple data sets are obtained and based on their performances, the conclusion can be drawn whether one method is better than one

  4. Evaluation of observables in statistical multifragmentation theories

    International Nuclear Information System (INIS)

    Cole, A.J.

    1989-01-01

    The canonical formulation of equilibrium statistical multifragmentation is examined. It is shown that the explicit construction of observables (average values) by sampling the partition probabilities is unnecessary insofar as closed expressions in the form of recursion relations can be obtained quite easily. Such expressions may conversely be used to verify the sampling algorithms

  5. Decision Support Systems: Applications in Statistics and Hypothesis Testing.

    Science.gov (United States)

    Olsen, Christopher R.; Bozeman, William C.

    1988-01-01

    Discussion of the selection of appropriate statistical procedures by educators highlights a study conducted to investigate the effectiveness of decision aids in facilitating the use of appropriate statistics. Experimental groups and a control group using a printed flow chart, a computer-based decision aid, and a standard text are described. (11…

  6. Statistics? You Must Be Joking: The Application and Evaluation of Humor when Teaching Statistics

    Science.gov (United States)

    Neumann, David L.; Hood, Michelle; Neumann, Michelle M.

    2009-01-01

    Humor has been promoted as a teaching tool that enhances student engagement and learning. The present report traces the pathway from research to practice by reflecting upon various ways to incorporate humor into the face-to-face teaching of statistics. The use of humor in an introductory university statistics course was evaluated via interviews…

  7. Gregor Mendel, His Experiments and Their Statistical Evaluation

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2014-01-01

    Roč. 99, č. 1 (2014), s. 87-99 ISSN 1211-8788 Institutional support: RVO:67985807 Keywords : Mendel * history of genetics * Mendel-Fisher controversy * statistical analysis * binomial distribution * numerical simulation Subject RIV: BB - Applied Statistics, Operational Research http://www.mzm.cz/fileadmin/user_upload/publikace/casopisy/amm_sb_99_1_2014/08kalina.pdf

  8. Evaluation of three immobilization supports and two nutritional ...

    African Journals Online (AJOL)

    Polyurethane foam, Luffa cylindrica sponge and Ca-alginate (3% w/v) were evaluated as immobilization supports for removing reactive black 5 dye using the white rot fungus Trametes versicolor at 1, 4 and 8 days of colonization. According to statistical results, the L. cylindrica sponge was the best support at 4 days of ...

  9. Supporting statistics in the workplace: experiences with two hospitals

    Directory of Open Access Journals (Sweden)

    M. Y. Mortlock

    2003-01-01

    This paper provides some reflections on the promotion of lifelong learning in statistics in the workplace. The initiative from which the reflections are drawn is a collaboration between a university and two public hospitals, of which one of the stated aims is to develop statistical skills among the hospitals' researchers. This is realized in the provision of 'biostatistical clinics' in which workplace teaching and learning of statistics takes place in one-on-one or small group situations. The central issue that is identified is the need to accommodate diversity: in the backgrounds, motivations and learning needs of workplace learners (in this case medical researchers), in the workplace environments themselves, and in the projects encountered. Operational issues for the statistician in providing such training are addressed. These considerations may reflect the experiences of the wider community of statisticians involved in service provision within a larger organization.

  10. Statistics and Probability Theory In Pursuit of Engineering Decision Support

    CERN Document Server

    Faber, Michael Havbro

    2012-01-01

    This book provides the reader with the basic skills and tools of statistics and probability in the context of engineering modeling and analysis. The emphasis is on the application, and the reasoning behind the application, of these skills and tools for the purpose of enhancing decision making in engineering. The purpose of the book is to ensure that the reader acquires the required theoretical basis and technical skills to feel comfortable with the theory of basic statistics and probability. Moreover, in this book, as opposed to many standard books on the same subject, the focus is on the use of the theory for the purpose of engineering model building and decision making. This work is suitable for readers with little or no prior knowledge of statistics and probability.

  11. Statistical methods for evaluating the attainment of cleanup standards

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.
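
    The general shape of an attainment test can be sketched as follows: declare attainment only when the upper confidence limit of the mean concentration falls below the cleanup standard, so that sampling variability counts against a premature "clean" declaration. This sketch uses a large-sample normal approximation rather than the document's exact procedures, and the soil data and standard are invented.

```python
import statistics
from statistics import NormalDist

def attains_standard(samples, standard, confidence=0.95):
    """True if the one-sided upper confidence limit of the mean
    concentration lies below the cleanup standard (normal approximation;
    a t-based limit would be used for small samples in practice)."""
    n = len(samples)
    mean = statistics.fmean(samples)
    se = statistics.stdev(samples) / n ** 0.5
    z = NormalDist().inv_cdf(confidence)
    return mean + z * se < standard

# Invented soil concentrations (mg/kg) tested against a 10 mg/kg standard.
soil = [6.1, 7.4, 5.8, 8.0, 6.6, 7.1, 6.9, 7.7]
ok = attains_standard(soil, 10.0)
```

    Framing attainment this way places the burden of proof on the remediated site: noisy or borderline data fail the test until more evidence is collected.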

  12. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring.

    Science.gov (United States)

    Zhang, Yuhai; Shang, Lei; Wang, Rui; Zhao, Qinbo; Li, Chanjuan; Xu, Yongyong; Su, Haixia

    2012-11-23

    In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students' attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students' attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students' achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics -28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students' attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes.

  13. Attitudes toward statistics in medical postgraduates: measuring, evaluating and monitoring

    Science.gov (United States)

    2012-01-01

    Background In medical training, statistics is considered a very difficult course to learn and teach. Current studies have found that students’ attitudes toward statistics can influence their learning process. Measuring, evaluating and monitoring the changes of students’ attitudes toward statistics are important. Few studies have focused on the attitudes of postgraduates, especially medical postgraduates. Our purpose was to understand current attitudes regarding statistics held by medical postgraduates and explore their effects on students’ achievement. We also wanted to explore the influencing factors and the sources of these attitudes and monitor their changes after a systematic statistics course. Methods A total of 539 medical postgraduates enrolled in a systematic statistics course completed the pre-form of the Survey of Attitudes Toward Statistics −28 scale, and 83 postgraduates were selected randomly from among them to complete the post-form scale after the course. Results Most medical postgraduates held positive attitudes toward statistics, but they thought statistics was a very difficult subject. The attitudes mainly came from experiences in a former statistical or mathematical class. Age, level of statistical education, research experience, specialty and mathematics basis may influence postgraduate attitudes toward statistics. There were significant positive correlations between course achievement and attitudes toward statistics. In general, student attitudes showed negative changes after completing a statistics course. Conclusions The importance of student attitudes toward statistics must be recognized in medical postgraduate training. To make sure all students have a positive learning environment, statistics teachers should measure their students’ attitudes and monitor their change of status during a course. Some necessary assistance should be offered for those students who develop negative attitudes. PMID:23173770

  14. Institutional Support : Institute of Statistical, Social and Economic ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The Institute of Statistical, Social and Economic Research (ISSER), established in 1969, is a semi-autonomous university-based research centre located at the University of Ghana, Legon, Accra. ISSER has a strong track record of undertaking high-quality policy-relevant research. This grant - the largest being awarded under ...

  15. Statistical analysis of support thickness and particle size effects in HRTEM imaging of metal nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    House, Stephen D., E-mail: sdh46@pitt.edu [Chemical and Petroleum Engineering, and Physics, University of Pittsburgh, Pittsburgh, PA 15261 (United States); Bonifacio, Cecile S.; Grieshaber, Ross V.; Li, Long; Zhang, Zhongfan [Chemical and Petroleum Engineering, and Physics, University of Pittsburgh, Pittsburgh, PA 15261 (United States); Ciston, Jim [National Center of Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stach, Eric A. [Center for Functional Nanomaterials, Brookhaven National Laboratory, Upton, NY 11973 (United States); Yang, Judith C. [Chemical and Petroleum Engineering, and Physics, University of Pittsburgh, Pittsburgh, PA 15261 (United States)

    2016-10-15

    High-resolution transmission electron microscopy (HRTEM) examination of nanoparticles requires their placement on some manner of support – either TEM grid membranes or part of the material itself, as in many heterogeneous catalyst systems – but a systematic quantification of the practical imaging limits of this approach has been lacking. Here we address this issue through a statistical evaluation of how nanoparticle size and substrate thickness affect the ability to resolve structural features of interest in HRTEM images of metallic nanoparticles on common support membranes. The visibility of lattice fringes from crystalline Au nanoparticles on amorphous carbon and silicon supports of varying thickness was investigated with both conventional and aberration-corrected TEM. Over the 1–4 nm nanoparticle size range examined, the probability of successfully resolving lattice fringes differed significantly as a function of both nanoparticle size and support thickness. Statistical analysis was used to formulate guidelines for the selection of supports and to quantify the impact a given support would have on HRTEM imaging of crystalline structure. For nanoparticles ≥1 nm, aberration correction was found to provide limited benefit for the purpose of visualizing lattice fringes; electron dose is more predictive of lattice fringe visibility than aberration correction. These results confirm that the ability to visualize lattice fringes is ultimately dependent on the signal-to-noise ratio of the HRTEM images, rather than the point-to-point resolving power of the microscope. This study provides a benchmark for HRTEM imaging of crystalline supported metal nanoparticles and is extensible to a wide variety of supports and nanostructures. - Highlights: • The impact of supports on imaging nanoparticle lattice structure is quantified. • Visualization probabilities given particle size and support thickness are estimated. • Aberration-correction provided limited benefit
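A statistical evaluation of this kind can be sketched, purely as an illustration (the observations, model form and fitting routine below are invented for the sketch, not taken from the paper), as a logistic regression of fringe visibility on particle diameter and support thickness:

```python
import math

# Hypothetical observations: (particle diameter in nm, support thickness in nm,
# 1 if lattice fringes were resolved, else 0). Invented numbers for illustration.
data = [(1.0, 5, 0), (1.5, 5, 1), (2.0, 5, 1), (3.0, 5, 1),
        (1.0, 20, 0), (1.5, 20, 0), (2.0, 20, 1), (3.0, 20, 1),
        (1.0, 40, 0), (1.5, 40, 0), (2.0, 40, 0), (3.0, 40, 1)]

def prob(b0, b1, b2, d, t):
    """Logistic probability that lattice fringes are resolved."""
    z = max(-30.0, min(30.0, b0 + b1 * d + b2 * t))  # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

def fit(data, lr=0.005, steps=30000):
    """Plain batch gradient descent on the logistic log-loss."""
    b0 = b1 = b2 = 0.0
    for _ in range(steps):
        g0 = g1 = g2 = 0.0
        for d, t, y in data:
            e = prob(b0, b1, b2, d, t) - y  # prediction residual
            g0 += e
            g1 += e * d
            g2 += e * t
        b0 -= lr * g0 / len(data)
        b1 -= lr * g1 / len(data)
        b2 -= lr * g2 / len(data)
    return b0, b1, b2

b0, b1, b2 = fit(data)
```

On data of this shape the fitted coefficients come out with a positive size effect and a negative thickness effect, which is the qualitative guideline the abstract describes: bigger particles and thinner supports raise the odds of resolving fringes.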

  16. Statistical analysis of support thickness and particle size effects in HRTEM imaging of metal nanoparticles

    International Nuclear Information System (INIS)

    House, Stephen D.; Bonifacio, Cecile S.; Grieshaber, Ross V.; Li, Long; Zhang, Zhongfan; Ciston, Jim; Stach, Eric A.; Yang, Judith C.

    2016-01-01

    High-resolution transmission electron microscopy (HRTEM) examination of nanoparticles requires their placement on some manner of support – either TEM grid membranes or part of the material itself, as in many heterogeneous catalyst systems – but a systematic quantification of the practical imaging limits of this approach has been lacking. Here we address this issue through a statistical evaluation of how nanoparticle size and substrate thickness affect the ability to resolve structural features of interest in HRTEM images of metallic nanoparticles on common support membranes. The visibility of lattice fringes from crystalline Au nanoparticles on amorphous carbon and silicon supports of varying thickness was investigated with both conventional and aberration-corrected TEM. Over the 1–4 nm nanoparticle size range examined, the probability of successfully resolving lattice fringes differed significantly as a function of both nanoparticle size and support thickness. Statistical analysis was used to formulate guidelines for the selection of supports and to quantify the impact a given support would have on HRTEM imaging of crystalline structure. For nanoparticles ≥1 nm, aberration correction was found to provide limited benefit for the purpose of visualizing lattice fringes; electron dose is more predictive of lattice fringe visibility than aberration correction. These results confirm that the ability to visualize lattice fringes is ultimately dependent on the signal-to-noise ratio of the HRTEM images, rather than the point-to-point resolving power of the microscope. This study provides a benchmark for HRTEM imaging of crystalline supported metal nanoparticles and is extensible to a wide variety of supports and nanostructures. - Highlights: • The impact of supports on imaging nanoparticle lattice structure is quantified. • Visualization probabilities given particle size and support thickness are estimated. • Aberration-correction provided limited benefit

  17. Sensory evaluation of food: statistical methods and procedures

    National Research Council Canada - National Science Library

    O'Mahony, Michael

    1986-01-01

    The aim of this book is to provide basic knowledge of the logic and computation of statistics for the sensory evaluation of food, or for other forms of sensory measurement encountered in, say, psychophysics...

  18. Analysis and Evaluation of Statistical Models for Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    Sáenz-Noval J.J.

    2011-10-01

    Statistical models for integrated circuits (ICs) allow us to estimate the percentage of acceptable devices in a batch before fabrication. Currently, Pelgrom's model is the statistical model most accepted in industry; however, it was derived for a micrometer-scale technology, which does not guarantee its reliability in nanometric manufacturing processes. This work considers three of the most relevant statistical models in industry and evaluates their limitations and advantages in analog design, so that the designer has a better criterion for making a choice. Moreover, it shows how several statistical models can be used for each of the design stages and purposes.
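For reference, the Pelgrom model mentioned above is the classic mismatch law (given here in its standard textbook form, not quoted from the article): for the difference $\Delta P$ of a parameter between two identically designed devices of gate area $W \cdot L$ at mutual distance $D$,

```latex
\sigma^2(\Delta P) = \frac{A_P^2}{W L} + S_P^2 D^2
```

where $A_P$ and $S_P$ are process-dependent constants. It is the extrapolation of this micrometer-era law to nanometric nodes that the article questions.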

  19. Statistical evaluation of diagnostic performance topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...

  20. Institutional Support : Institute of Statistical, Social and Economic ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ISSER has a strong track record of undertaking high-quality policy-relevant research. ... set organizational goals, and establish a monitoring and evaluation system. ... IWRA/IDRC webinar on climate change and adaptive water management.

  1. Fuzzy comprehensive evaluation method of F statistics weighting in ...

    African Journals Online (AJOL)

    In order to rapidly identify the source of water inrush in coal mines, and to provide a theoretical basis for the prevention and control of mine water damage, a fuzzy comprehensive evaluation model was established. The F statistics of the water samples were normalized as the weights of the fuzzy comprehensive evaluation for determining the ...
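The weighting scheme described above can be sketched as follows (all F statistics and membership degrees here are invented for illustration, not taken from the article):

```python
# Hypothetical F statistics for four hydrochemical indicators of a water sample.
f_stats = [12.4, 7.8, 3.1, 1.6]
weights = [f / sum(f_stats) for f in f_stats]  # normalize so the weights sum to 1

# Membership matrix R: rows = indicators, columns = candidate inrush sources (aquifers).
R = [[0.7, 0.2, 0.1],
     [0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

# Weighted-average fuzzy operator: b_j = sum_i w_i * r_ij
b = [sum(w * row[j] for w, row in zip(weights, R)) for j in range(len(R[0]))]
predicted_source = b.index(max(b))  # aquifer with the highest membership degree
```

The indicator with the largest F statistic dominates the weighting, so the evaluation favors the aquifer whose chemistry best matches that indicator.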

  2. Statistical Process Control in the Practice of Program Evaluation.

    Science.gov (United States)

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  3. Applying Bayesian Statistics to Educational Evaluation. Theoretical Paper No. 62.

    Science.gov (United States)

    Brumet, Michael E.

    Bayesian statistical inference is unfamiliar to many educational evaluators. While the classical model is useful in educational research, it is not as useful in evaluation because of the need to identify solutions to practical problems based on a wide spectrum of information. The reason Bayesian analysis is effective for decision making is that it…

  4. Storytelling, statistics and hereditary thought: the narrative support of early statistics.

    Science.gov (United States)

    López-Beltrán, Carlos

    2006-03-01

    This paper's main contention is that some basically methodological developments in science which are apparently distant and unrelated can be seen as part of a sequential story. Focusing on general inferential and epistemological matters, the paper links occurrences separated in both time and space through formal and representational issues rather than social or disciplinary links. It focuses on a few limited aspects of several cognitive practices in medical and biological contexts separated by geography, disciplines and decades, but connected by long-term transdisciplinary representational and inferential structures and constraints. The paper intends to show that a given set of knowledge claims based on statistically organizing empirical data can be seen to have been underpinned by a previous, more familiar, and probably more natural, narrative handling of similar evidence. To achieve this, the paper moves from medicine in France in the late eighteenth and early nineteenth century to the second half of the nineteenth century in England among gentleman naturalists, following its subject: the shift from narrative depiction of the hereditary transmission of physical peculiarities to posterior statistical articulations of the same phenomena. Some early defenders of heredity as an important (if not the most important) causal presence in the understanding of life adopted singular narratives, in the form of case stories from medical and natural history traditions, to flesh out a special kind of causality peculiar to heredity. This work tries to reconstruct historically the rationale that drove the use of such narratives. It then shows that when this rationale was methodologically challenged, its basic narrative and probabilistic underpinnings were transferred to the statistical quantification tools that took their place.

  5. A comparison of Asian aquaculture products using statistically supported LCA

    NARCIS (Netherlands)

    Henriksson, P.J.G.; Rico, A.; Zhang, W.; Al-Nahid, A.; Newton, R.; Phan, L.T.; Zhang, Z.; Jaithiang, J.; Dao, H.M.; Phu, T.M.; Little, D.C.; Murray, F.J.; Satapornvanit, K.; Liu, L.; Liu, Q.; Haque, M.M.; Kruijssen, F.; de Snoo, G.R.; Heijungs, R.; van Bodegom, P.

    2015-01-01

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different

  6. Statistical analysis of elastic beam with unilateral frictionless supports

    International Nuclear Information System (INIS)

    Feijoo, R.A.; Barbosa, H.J.C.

    1983-06-01

    A variational formulation of the elastic beam problem with unilateral frictionless supports is presented. It is shown that the solution of this problem can be characterized as the solution of a variational inequality or as the constrained minimum of the total potential energy of the structure. The finite-dimensional counterpart of this variational formulation is obtained using the finite element method, and the Gauss-Seidel method with projection and overrelaxation can be used to obtain an approximate solution. In order to show the numerical performance of the present approach, some numerical examples are also presented. (Author) [pt
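The Gauss-Seidel method with projection and overrelaxation mentioned above can be sketched as follows; this is a generic projected-SOR solver for a small symmetric positive-definite system with a nonnegativity constraint, not the paper's actual beam discretization:

```python
def projected_sor(A, b, omega=1.5, iters=500):
    """Projected SOR for: minimize 1/2 x^T A x - b^T x  subject to  x >= 0
    (the nonnegativity constraint plays the role of the unilateral support)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Gauss-Seidel update with overrelaxation factor omega...
            sigma = sum(A[i][j] * x[j] for j in range(n) if j != i)
            xi = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i][i]
            # ...followed by projection back onto the admissible set.
            x[i] = max(0.0, xi)
    return x

# Tiny SPD example (a stand-in for an assembled stiffness matrix).
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, -2.0]
x = projected_sor(A, b)  # converges to the constrained minimizer [0.25, 0.0]
```

The unconstrained minimizer here has a negative second component; the projection activates the constraint, exactly as a beam lifting off a frictionless support would deactivate that support's reaction.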

  7. Statistical methods to evaluate thermoluminescence ionizing radiation dosimetry data

    International Nuclear Information System (INIS)

    Segre, Nadia; Matoso, Erika; Fagundes, Rosane Correa

    2011-01-01

    Ionizing radiation levels, evaluated through the exposure of CaF2:Dy thermoluminescence dosimeters (TLD-200), have been monitored at Centro Experimental Aramar (CEA), located at Ipero in Sao Paulo state, Brazil, since 1991, resulting in a large number of measurements through 2009 (more than 2,000). The volume of data, combined with measurement dispersion (since every process has deviations), reinforces the use of statistical tools to evaluate the results, a procedure also imposed by the Brazilian standard CNEN-NN-3.01/PR-3.01-008, which regulates radiometric environmental monitoring. The thermoluminescence ionizing radiation dosimetry data are statistically compared in order to evaluate the potential environmental impact of CEA's activities. The statistical tools discussed in this work are box plots, control charts and analysis of variance. (author)
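As an illustration of one of the tools named above, a minimal Shewhart individuals control-chart check can be sketched like this (all dose values are invented for the sketch, not CEA data):

```python
def control_limits(readings):
    """Shewhart individuals-chart limits: mean +/- 3 sample standard deviations."""
    n = len(readings)
    mean = sum(readings) / n
    sd = (sum((r - mean) ** 2 for r in readings) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

# Hypothetical quarterly TLD readings (mSv); the last value is deliberately anomalous.
doses = [0.82, 0.79, 0.85, 0.81, 0.80, 0.84, 0.78, 1.35]

# Limits from the historical baseline; flag the newest reading for investigation.
lcl, ucl = control_limits(doses[:-1])
outlier = doses[-1] > ucl
```

A reading outside the control limits does not prove environmental impact by itself; it simply marks a measurement that is inconsistent with the historical process variation and warrants follow-up, which is the monitoring role the abstract describes.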

  8. Introducing StatHand: A cross-platform mobile application to support students’ statistical decision making

    Directory of Open Access Journals (Sweden)

    Peter James Allen

    2016-02-01

    Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer) are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students’ statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.

  9. Introducing StatHand: A Cross-Platform Mobile Application to Support Students' Statistical Decision Making.

    Science.gov (United States)

    Allen, Peter J; Roberts, Lynne D; Baughman, Frank D; Loxton, Natalie J; Van Rooy, Dirk; Rock, Adam J; Finlay, James

    2016-01-01

    Although essential to professional competence in psychology, quantitative research methods are a known area of weakness for many undergraduate psychology students. Students find selecting appropriate statistical tests and procedures for different types of research questions, hypotheses and data types particularly challenging, and these skills are not often practiced in class. Decision trees (a type of graphic organizer) are known to facilitate this decision making process, but extant trees have a number of limitations. Furthermore, emerging research suggests that mobile technologies offer many possibilities for facilitating learning. It is within this context that we have developed StatHand, a free cross-platform application designed to support students' statistical decision making. Developed with the support of the Australian Government Office for Learning and Teaching, StatHand guides users through a series of simple, annotated questions to help them identify a statistical test or procedure appropriate to their circumstances. It further offers the guidance necessary to run these tests and procedures, then interpret and report their results. In this Technology Report we will overview the rationale behind StatHand, before describing the feature set of the application. We will then provide guidelines for integrating StatHand into the research methods curriculum, before concluding by outlining our road map for the ongoing development and evaluation of StatHand.

  10. Financial Derivatives (Based on Two Supports Evaluation

    Directory of Open Access Journals (Sweden)

    Tiberiu Socaciu

    2016-07-01

    In this paper we build a Black-Scholes-like PDE under the hypothesis of a financial derivative that depends on two supports (usually a derivative depends on only one support), such as an option based on gold when the national currency fluctuates strongly. Keywords: financial derivatives, derivatives evaluation, derivatives based on two supports, extended Itō-like lemma.
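For context, the standard two-asset extension of the Black-Scholes PDE (the textbook form for a derivative $V(S_1, S_2, t)$ on two underlyings with volatilities $\sigma_1, \sigma_2$, correlation $\rho$ and risk-free rate $r$; not necessarily the exact equation derived in the paper) reads:

```latex
\frac{\partial V}{\partial t}
+ \tfrac{1}{2}\sigma_1^2 S_1^2 \frac{\partial^2 V}{\partial S_1^2}
+ \rho\,\sigma_1\sigma_2 S_1 S_2 \frac{\partial^2 V}{\partial S_1 \partial S_2}
+ \tfrac{1}{2}\sigma_2^2 S_2^2 \frac{\partial^2 V}{\partial S_2^2}
+ r S_1 \frac{\partial V}{\partial S_1}
+ r S_2 \frac{\partial V}{\partial S_2}
- rV = 0
```

The mixed second derivative, weighted by the correlation, is what distinguishes the two-support case from two independent one-support problems.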

  11. Statistical evaluation of cleanup: How should it be done?

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1993-02-01

    This paper discusses statistical issues that must be addressed when conducting statistical tests for the purpose of evaluating if a site has been remediated to guideline values or standards. The importance of using the Data Quality Objectives (DQO) process to plan and design the sampling plan is emphasized. Other topics discussed are: (1) accounting for the uncertainty of cleanup standards when conducting statistical tests, (2) determining the number of samples and measurements needed to attain specified DQOs, (3) considering whether the appropriate testing philosophy in a given situation is ''guilty until proven innocent'' or ''innocent until proven guilty'' when selecting a statistical test for evaluating the attainment of standards, (4) conducting tests using data sets that contain measurements that have been reported by the laboratory as less than the minimum detectable activity, and (5) selecting statistical tests that are appropriate for risk-based or background-based standards. A recent draft report by Berger that provides guidance on sampling plans and data analyses for final status surveys at US Nuclear Regulatory Commission licensed facilities serves as a focal point for discussion

  12. Evaluation of the Wishart test statistics for polarimetric SAR data

    DEFF Research Database (Denmark)

    Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut

    2003-01-01

    A test statistic for the equality of two covariance matrices following the complex Wishart distribution has previously been used in new algorithms for change detection, edge detection and segmentation in polarimetric SAR images. Previously, the results for change detection and edge detection have been quantitatively evaluated. This paper deals with the evaluation of segmentation. A segmentation performance measure originally developed for single-channel SAR images has been extended to polarimetric SAR images, and used to evaluate segmentation for a merge-using-moment algorithm for polarimetric SAR data.
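For reference, the likelihood-ratio statistic in question has the following form in the related literature (stated here as background, not quoted from this abstract): for two $p \times p$ complex Wishart distributed matrices $\mathbf{X}$ and $\mathbf{Y}$ with $n$ and $m$ looks respectively, the criterion for testing equality of the underlying covariance matrices is

```latex
Q = \frac{(n+m)^{p(n+m)}}{n^{pn}\, m^{pm}} \cdot
    \frac{\lvert \mathbf{X} \rvert^{\,n}\, \lvert \mathbf{Y} \rvert^{\,m}}
         {\lvert \mathbf{X}+\mathbf{Y} \rvert^{\,n+m}}
```

and $-2\ln Q$ is approximately $\chi^2$ distributed, which is what makes thresholding tractable in the change, edge and segmentation algorithms mentioned above.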

  13. Research evaluation support services in biomedical libraries.

    Science.gov (United States)

    Gutzman, Karen Elizabeth; Bales, Michael E; Belter, Christopher W; Chambers, Thane; Chan, Liza; Holmes, Kristi L; Lu, Ya-Ling; Palmer, Lisa A; Reznik-Zellen, Rebecca C; Sarli, Cathy C; Suiter, Amy M; Wheeler, Terrie R

    2018-01-01

    The paper provides a review of current practices related to evaluation support services reported by seven biomedical and research libraries. A group of seven libraries from the United States and Canada described their experiences with establishing evaluation support services at their libraries. A questionnaire was distributed among the libraries to elicit information as to program development, service and staffing models, campus partnerships, training, products such as tools and reports, and resources used for evaluation support services. The libraries also reported interesting projects, lessons learned, and future plans. The seven libraries profiled in this paper report a variety of service models in providing evaluation support services to meet the needs of campus stakeholders. The service models range from research center cores, partnerships with research groups, and library programs with staff dedicated to evaluation support services. A variety of products and services were described such as an automated tool to develop rank-based metrics, consultation on appropriate metrics to use for evaluation, customized publication and citation reports, resource guides, classes and training, and others. Implementing these services has allowed the libraries to expand their roles on campus and to contribute more directly to the research missions of their institutions. Libraries can leverage a variety of evaluation support services as an opportunity to successfully meet an array of challenges confronting the biomedical research community, including robust efforts to report and demonstrate tangible and meaningful outcomes of biomedical research and clinical care. These services represent a transformative direction that can be emulated by other biomedical and research libraries.

  14. Equipment Maintenance management support system based on statistical analysis of maintenance history data

    International Nuclear Information System (INIS)

    Shimizu, S.; Ando, Y.; Morioka, T.

    1990-01-01

    Plant maintenance is becoming important with the increase in the number of nuclear power stations and in plant operating time. Various kinds of requirements for plant maintenance are proposed, such as countermeasures for equipment degradation and saving maintenance costs while maintaining plant reliability and productivity. For this purpose, plant maintenance programs should be improved based on equipment reliability estimated from field data. In order to meet these requirements, it is planned to develop an equipment maintenance management support system for nuclear power plants based on statistical analysis of equipment maintenance history data. The large difference between this proposed new method and current similar methods is that it evaluates not only failure data but also maintenance data, which includes normal-termination data as well as data on some degree of degradation or functional disorder of equipment and parts. It is thus possible to utilize these field data to improve maintenance schedules and to evaluate actual equipment and parts reliability under the current maintenance schedule. In the present paper, the authors show the objectives of this system, an outline of the system and its functions, and the basic techniques for collecting and statistically analyzing maintenance history data. It is shown, from the results of feasibility tests using simulated maintenance history data, that this system has the ability to provide useful information for maintenance and design enhancement

  15. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  16. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

    The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model – the Harmonized Commodity Description and Coding System (HS). The practical implementation of the proposed three-stage model of index analysis and estimation of terms of trade is realized for Ukraine's traded commodities for the period 2011-2012.
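The headline quantity here, stated in its standard definition rather than the paper's exact three-stage formulation, is the net barter terms of trade: the ratio of an export price (unit-value) index $P_x$ to an import price (unit-value) index $P_m$,

```latex
\mathrm{ToT} = \frac{P_x}{P_m} \times 100
```

so that values above 100 indicate that a unit of exports buys more imports than in the base period, i.e. improving terms of trade.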

  17. Organizational Structures that Support Internal Program Evaluation

    Science.gov (United States)

    Lambur, Michael T.

    2008-01-01

    This chapter explores how the structure of large complex organizations such as Cooperative Extension affects their ability to support internal evaluation of their programs and activities. Following a literature review of organizational structure and its relation to internal evaluation capacity, the chapter presents the results of interviews with…

  18. Statistical Decision Support Tools for System-Oriented Runway Management, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The feasibility of developing a statistical decision support system for traffic flow management in the terminal area and runway load balancing was demonstrated in...

  19. Statistical evaluation of design-error related nuclear reactor accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1981-01-01

    In this paper, a general methodology for the statistical evaluation of design-error related accidents is proposed that can be applied to a variety of systems that evolve during the development of large-scale technologies. The evaluation aims at an estimate of the combined ''residual'' frequency of yet unknown types of accidents ''lurking'' in a certain technological system. A special categorization into incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of U.S. nuclear power reactor technology, considering serious accidents (category 2 events) that involved, in the accident progression, a particular design inadequacy. 9 refs

  20. Modeling a support system for the evaluator

    International Nuclear Information System (INIS)

    Lozano Lima, B.; Ilizastegui Perez, F.; Barnet Izquierdo, B.

    1998-01-01

    This work gives evaluators a tool they can employ to add soundness to their review of operational limits and conditions. The system will establish the most adequate method for carrying out the evaluation, as well as for evaluating the basis of the technical operational specifications. It also provides alternative questions to be supplied to the operating entity to support it in decision-making activities

  1. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  2. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into consideration aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document was prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team, as specified in the study protocol and detailed in the study case report form, were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described, with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with descriptions of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events, is described. The primary, secondary and tertiary outcomes are described, along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the

  3. The Evaluation Methodology of Information Support

    Directory of Open Access Journals (Sweden)

    Lubos Necesal

    2016-01-01

    Knowledge, information and people are the motive force in today's organizations. Successful organizations need to find the right employees and provide them with the right, high-quality information. This is a complex problem. In a world where information plays a more and more important role, employees have to be skilled at information activities (searching, processing, saving, etc.) and at working with the information system(s) (IS) of their organization. Organizations have to cover both these areas. Therefore, we need an effective instrument which can be used to evaluate new employees at admission or to regularly evaluate current employees, to evaluate whether the information system is an appropriate tool for fulfilling the employees' tasks within the organization, and to evaluate how well the organization covers the foregoing areas. Such an instrument is the “Evaluation methodology of information support in organization”. This paper defines the term “information support” and its role in the organization. The body of the paper proposes the “Evaluation methodology of information support in organization”. The conclusion discusses the contributions of information support evaluation

  4. Correct statistical evaluation for total dose in rural settlement

    International Nuclear Information System (INIS)

    Vlasova, N.G.; Skryabin, A.M.

    2001-01-01

    Statistical evaluation of dose reduces to determining an average value and its error. While the average total dose can in general be obtained by simply summing the averages of its external and internal components, an estimate of the error can be obtained only from the distribution of the total dose. Moreover, because the two components of the dose are interdependent, summing their distributions as if they were independent random variables is incorrect. It follows that the parameters of the total dose distribution, including its error, cannot in general be estimated empirically, particularly given scarce or absent data on one of the components, which constantly happens in practice. Even if a best estimate of the average total dose were somehow defined, as the mean of the distribution of individual total doses obtained by randomly pairing individual external and internal doses, no error estimate would be produced. A methodical approach to evaluating the total dose distribution under a lack of dosimetric information was therefore designed. Its essence is an original way of interpolating the external dose distribution using data on the internal dose
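The abstract's point, that interdependent dose components cannot be combined as if independent, can be sketched numerically. The following is a minimal illustration with entirely hypothetical lognormal dose components that share a common settlement-level factor; it shows that the mean of the total adds, while the variance picks up a covariance term:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical correlated dose components: a shared settlement-level factor
# makes external and internal doses interdependent (all values invented).
common = rng.normal(0.0, 0.3, n)
external = np.exp(common + rng.normal(0.0, 0.2, n))  # "external dose", mSv
internal = np.exp(common + rng.normal(0.0, 0.4, n))  # "internal dose", mSv
total = external + internal

# The average total dose is just the sum of the component averages...
assert np.isclose(total.mean(), external.mean() + internal.mean())

# ...but the variance (hence the error) includes the covariance term, so
# treating the components as independent understates the true spread.
print(total.var(), external.var() + internal.var())
```

With positively correlated components, the naive "independent" variance is visibly smaller than the actual variance of the total.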

  5. Statistical performance evaluation of ECG transmission using wireless networks.

    Science.gov (United States)

    Shakhatreh, Walid; Gharaibeh, Khaled; Al-Zaben, Awad

    2013-07-01

    This paper presents a simulation of the transmission of biomedical signals (using an ECG signal as an example) over wireless networks. The effects of channel impairments, including SNR, path loss exponent and path delay, and of network impairments such as packet loss probability, on the diagnosability of the received ECG signal are investigated. The ECG signal is transmitted through a wireless network system composed of two communication protocols: an 802.15.4 ZigBee protocol and an 802.11b protocol. The performance of the transmission is evaluated using higher-order statistical parameters such as kurtosis and negative entropy, in addition to common metrics such as the PRD, RMS and cross-correlation.
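The fidelity metrics named above are straightforward to compute. Below is a hedged sketch using a synthetic sinusoidal stand-in for an ECG trace and an invented noise level, showing the PRD, the zero-lag cross-correlation, and the kurtosis of the received signal:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)

# Hypothetical stand-in for an ECG trace: original vs. received with channel noise.
t = np.linspace(0, 1, 500)
original = np.sin(2 * np.pi * 5 * t) + 0.25 * np.sin(2 * np.pi * 15 * t)
received = original + rng.normal(0, 0.05, t.size)

# Percentage RMS difference (PRD), a common ECG fidelity metric
prd = 100 * np.sqrt(np.sum((original - received) ** 2) / np.sum(original ** 2))

# Cross-correlation coefficient at zero lag
rho = np.corrcoef(original, received)[0, 1]

# Higher-order statistic used by the paper: excess kurtosis of the received signal
k = kurtosis(received)

print(f"PRD={prd:.2f}%  corr={rho:.4f}  kurtosis={k:.3f}")
```

A low PRD and near-unity correlation indicate the received signal remains diagnosable; the higher-order statistics respond to distortions that RMS-type metrics can miss.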

  6. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes, precautionary stock fees and oil pollution fees

  7. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions from the use of fossil fuels; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in 2000; Energy exports by recipient country in 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  8. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-March 2000; Energy exports by recipient country in January-March 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  9. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-June 1999; Energy exports by recipient country in January-June 1999; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  10. Statistical estimate for evaluation of vitrified radioactive wastes

    International Nuclear Information System (INIS)

    Jedinakova-Krizova, V.; Dvorak, Z.

    1994-01-01

    The evaluation of experimental results by methods of mathematical statistics made it possible to derive a number of conclusions on the leachability of vitrified radioactive wastes. Practical application of this procedure requires that the ratio of Na and K concentrations in the solution be independent of the leaching time. The actual value of this ratio is influenced, above all, by the properties of the glass matrix. These results confirm the notion that the Na/K correlation found could be extended to the determination of the Na/137Cs concentration ratio. This finding was used in the application of a ln-ln correlation when evaluating the quality of vitrified radioactive waste products. (author) 7 refs.; 5 figs.; 1 tab
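The ln-ln correlation the authors apply is, in essence, an ordinary linear fit on log-transformed axes. A minimal sketch with invented leaching data (a power law with exponent 0.5, plus small noise) would look like:

```python
import numpy as np

# Hypothetical leaching data: cumulative fraction leached vs. time often follows
# a power law, which is linear on a ln-ln scale: ln(L) = a + b*ln(t).
t = np.array([1, 2, 4, 8, 16, 32], dtype=float)  # days (assumed)
L = 0.05 * t ** 0.5 * np.exp(np.random.default_rng(2).normal(0, 0.02, t.size))

# Linear regression in ln-ln coordinates recovers the power-law parameters
b, a = np.polyfit(np.log(t), np.log(L), 1)
print(f"ln-ln slope b={b:.3f}, intercept a={a:.3f}")
```

The fitted slope estimates the leaching exponent; a stable slope across samples is the kind of regularity the abstract exploits.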

  11. Statistical evaluation of design-error related accidents

    International Nuclear Information System (INIS)

    Ott, K.O.; Marchaterre, J.F.

    1980-01-01

    In a recently published paper (Campbell and Ott, 1979), a general methodology was proposed for the statistical evaluation of design-error related accidents. The evaluation aims at an estimate of the combined residual frequency of yet unknown types of accidents lurking in a certain technological system. Here, the original methodology is extended so as to apply to the variety of systems that evolves during the development of large-scale technologies. A special categorization of incidents and accidents is introduced to define the events that should be jointly analyzed. The resulting formalism is applied to the development of nuclear power reactor technology, considering serious accidents whose progression involves a particular design inadequacy

  12. Research evaluation support services in biomedical libraries

    Directory of Open Access Journals (Sweden)

    Karen Elizabeth Gutzman

    2018-01-01

    Conclusions: Libraries can leverage a variety of evaluation support services as an opportunity to successfully meet an array of challenges confronting the biomedical research community, including robust efforts to report and demonstrate tangible and meaningful outcomes of biomedical research and clinical care. These services represent a transformative direction that can be emulated by other biomedical and research libraries.

  13. Personal dosimetry statistics and specifics of low dose evaluation

    International Nuclear Information System (INIS)

    Avila, R.E.; Gómez Salinas, R.A.; Oyarzún Cortés, C.H.

    2015-01-01

    The dose statistics of a personal dosimetry service, considering more than 35,000 readings, display a sharp peak at low dose (below 0.5 mSv) with skewness toward higher values. A measure of the dispersion is that approximately 65% of the doses fall below the average plus 2 standard deviations, an observation which may prove helpful to radiation protection agencies. When the doses are categorized by the concomitant use of a finger-ring dosimeter, this skewness is larger for both the whole-body and ring dosimeters. The use of Harshaw 5500 readers at high gain leads to frequent values of the glow curve that are judged to be spurious, i.e. values not belonging to the roughly normal noise over the curve. A statistical criterion is shown for identifying those anomalous values and replacing them according to the local behavior, as fit by a cubic polynomial. As a result, the doses above 0.05 mSv that are affected by more than 2% comprise over 10% of the database. The low-dose peak of the statistics noted above has focused our attention on the evaluation of LiF(Mg,Ti) dosimeters exposed at low dose and read with Harshaw 5500 readers. The standard linear procedure, via an overall reader calibration factor, is observed to fail at low dose in detailed calibrations from 0.02 mSv to 1 Sv. A significant improvement is achieved by a piecewise polynomial calibration curve: a cubic at low dose is matched, at ∼10 mSv, in value and first derivative to a linear dependence at higher doses. This improvement is particularly noticeable below 2 mSv, where over 60% of the evaluated dosimeters are found. (author)
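The piecewise calibration described, a low-dose cubic matched in value and first derivative to a linear high-dose segment, can be set up as a small linear system. All gains, matching points, and calibration pairs below are invented for illustration:

```python
import numpy as np

# Hypothetical calibration: above the matching point s0 the reader response is
# linear, dose = k * signal; below s0 a cubic is used, matched at s0 in value
# and first derivative, and anchored by two low-dose calibration points.
k, s0 = 0.8, 10.0                   # linear gain and matching point (assumed)
low_pts = [(1.0, 0.6), (5.0, 3.6)]  # (signal, known dose) pairs (assumed)

# Solve for cubic coefficients c3..c0 from the four linear conditions
A, b = [], []
for s, d in low_pts:                             # p(s) = d at calibration points
    A.append([s**3, s**2, s, 1.0]); b.append(d)
A.append([s0**3, s0**2, s0, 1.0]); b.append(k * s0)  # value match at s0
A.append([3*s0**2, 2*s0, 1.0, 0.0]); b.append(k)     # slope match at s0
c = np.linalg.solve(np.array(A), np.array(b))

def dose(signal):
    """Piecewise calibration curve: cubic below s0, linear above."""
    return float(np.polyval(c, signal)) if signal < s0 else k * signal

print(dose(2.0), dose(10.0), dose(20.0))
```

Matching both value and slope at the join point keeps the curve smooth, so evaluated doses do not jump as readings cross the matching point.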

  14. Evaluation of air quality in a megacity using statistics tools

    Science.gov (United States)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2017-03-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region, since meteorology and topography affect air pollutant dispersion. This study used statistical tools (PCA, HCA, the Kruskal-Wallis test, the Mann-Whitney test and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all of these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, the previously defined air basins were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important to air quality management.
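As a sketch of the kind of nonparametric comparison the study relies on, the snippet below runs a Kruskal-Wallis test across three hypothetical monitoring sites (synthetic lognormal PM2.5 values, not the Rio de Janeiro data), followed by a pairwise Mann-Whitney test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical daily PM2.5 concentrations (ug/m3) at three sites; the question
# is whether the distributions differ by site without assuming normality,
# which is exactly what the Kruskal-Wallis test addresses.
site_a = rng.lognormal(mean=2.3, sigma=0.4, size=90)
site_b = rng.lognormal(mean=2.3, sigma=0.4, size=90)
site_c = rng.lognormal(mean=2.8, sigma=0.4, size=90)  # elevated source

h, p = stats.kruskal(site_a, site_b, site_c)
print(f"H={h:.2f}, p={p:.4f}")  # a small p suggests at least one site differs

# Pairwise follow-up with Mann-Whitney, as in the study
u, p_ab = stats.mannwhitneyu(site_a, site_b)
```

The same pattern, an omnibus Kruskal-Wallis test followed by pairwise Mann-Whitney comparisons, applies to grouping the data by season instead of site.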
  16. Statistical methods of evaluating and comparing imaging techniques

    International Nuclear Information System (INIS)

    Freedman, L.S.

    1987-01-01

    Over the past 20 years several new methods of generating images of internal organs and the anatomy of the body have been developed and used to enhance the accuracy of diagnosis and treatment. These include ultrasonic scanning, radioisotope scanning, computerised X-ray tomography (CT) and magnetic resonance imaging (MRI). The new techniques have made a considerable impact on radiological practice in hospital departments, not least on the investigational process for patients suspected or known to have malignant disease. As a consequence of the increased range of imaging techniques now available, there has developed a need to evaluate and compare their usefulness. Over the past 10 years formal studies of the application of imaging technology have been conducted and many reports have appeared in the literature. These studies cover a range of clinical situations. Likewise, the methodologies employed for evaluating and comparing the techniques in question have differed widely. While not attempting an exhaustive review of the clinical studies which have been reported, this paper aims to examine the statistical designs and analyses which have been used. First a brief review of the different types of study is given. Examples of each type are then chosen to illustrate statistical issues related to their design and analysis. In the final sections it is argued that a form of classification for these different types of study might be helpful in clarifying relationships between them and bringing a perspective to the field. A classification based upon a limited analogy with clinical trials is suggested

  17. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  18. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-March 2004; Energy exports by recipient country in January-March 2004; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees

  19. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-June 2000; Energy exports by recipient country in January-June 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  20. Supporting Students to Develop Concepts Underlying Sampling and to Shuttle Between Contextual and Statistical Spheres

    NARCIS (Netherlands)

    Bakker, A.; Dierdorp, A.; Maanen, J.A. van; Eijkelhof, H.M.C.

    2012-01-01

    To stimulate students’ shuttling between contextual and statistical spheres, we based tasks on professional practices. This article focuses on two tasks to support reasoning about sampling by students aged 16-17. The purpose of the tasks was to find out which smaller sample size would have been

  1. Optimum design of automobile seat using statistical design support system; Tokeiteki sekkei shien system no jidoshayo seat eno tekiyo

    Energy Technology Data Exchange (ETDEWEB)

    Kashiwamura, T [NHK Spring Co. Ltd., Yokohama (Japan); Shiratori, M; Yu, Q; Koda, I [Yokohama National University, Yokohama (Japan)

    1997-10-01

    The authors proposed a new practical optimum design method, called the statistical design support system, which consists of five steps: effectivity analysis, reanalysis, evaluation of dispersion, optimization, and evaluation of structural reliability. In this study, the authors applied the system to the analysis and optimum design of an automobile seat frame subjected to crushing. This study showed that the method can be applied to complex nonlinear problems involving large deformation and material nonlinearity as well as impact. It was also shown that the optimum design of the seat frame could be solved easily using the present system. 6 refs., 5 figs., 5 tabs.

  2. Evaluating and Reporting Statistical Power in Counseling Research

    Science.gov (United States)

    Balkin, Richard S.; Sheperis, Carl J.

    2011-01-01

    Despite recommendations from the "Publication Manual of the American Psychological Association" (6th ed.) to include information on statistical power when publishing quantitative results, authors seldom include analysis or discussion of statistical power. The rationale for discussing statistical power is addressed, approaches to using "G*Power" to…
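Statistical power need not be computed only in G*Power; for a two-sample t-test it follows directly from the noncentral t distribution. The sketch below (standard formulas, scipy-based) reproduces the conventional benchmark that n = 64 per group gives roughly 80% power for a medium effect (d = 0.5) at two-sided alpha = .05:

```python
import numpy as np
from scipy import stats

def t_test_power(d, n_per_group, alpha=0.05):
    """Power of a two-sided two-sample t-test for standardized effect size d."""
    df = 2 * n_per_group - 2
    nc = d * np.sqrt(n_per_group / 2)          # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    # P(|T| > t_crit) under the noncentral t distribution
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

print(round(t_test_power(0.5, 64), 2))  # close to the conventional 0.80 target
```

Reporting the computed power (or the n required for a target power) alongside effect sizes is exactly the practice the article recommends.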

  3. Evaluating Ethical Responsibility in Inverse Decision Support

    Directory of Open Access Journals (Sweden)

    Ahmad M. Kabil

    2012-01-01

    Full Text Available Decision makers have considerable autonomy over how they make decisions and what type of support they receive. This situation places the DSS analyst in a different relationship with the client than his colleagues who support regular MIS applications. This paper addresses an ethical dilemma in “Inverse Decision Support”: the analyst supports a decision maker who requires justification for a preconceived selection that does not correspond to the best option resulting from the professional resolution of the problem. An extended application of the AHP model is proposed for evaluating the ethical responsibility in selecting a suboptimal alternative. The extended application is consistent with the Inverse Decision Theory that is used extensively in medical decision making. A survey of decision analysts is used to assess their perspective on using the proposed extended application. The results show that 80% of the respondents felt that the proposed extended application is useful in business practice; 14% of them extended its usability to the academic teaching of ethics theory. The extended application is considered more usable in a country with a higher Transparency International Corruption Perceptions Index (TICPI) than in a country with a lower one.

  4. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to the Electric Power Research Institute, started a one-year program to develop methodology for the statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors, and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates, first for a simple thermal-hydraulic problem whose output variable of interest is given by a closed-form relationship of the input variables, and then for a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of this report is to document the results of the investigations completed under these tasks, giving the rationale for the choices of techniques and problems, and to present interim conclusions
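The first task, propagating input parameter uncertainties through a closed-form relationship by Monte Carlo, can be sketched as follows. The "thermal-hydraulic" model and all parameter values are invented placeholders, not the Westinghouse problem:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy closed-form relation (hypothetical): coolant temperature rise
# dT = q / (m_dot * cp), with uncertain heat input q and flow rate m_dot.
def model(q, m_dot, cp=4.2):
    return q / (m_dot * cp)

n = 100_000
q = rng.normal(100.0, 5.0, n)     # heat input, kW, mean and sd assumed
m_dot = rng.normal(2.0, 0.1, n)   # mass flow rate, kg/s, assumed

# Monte Carlo estimate of the output distribution and an upper percentile
dT = model(q, m_dot)
print(f"mean={dT.mean():.2f}  p95={np.percentile(dT, 95):.2f}")
```

The report's question is essentially how many such samples (or what analytical shortcut, e.g. a response surface) are needed before percentiles like the one printed above are trustworthy.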

  5. How Narrative Focus and a Statistical Map Shape Health Policy Support Among State Legislators.

    Science.gov (United States)

    Niederdeppe, Jeff; Roh, Sungjong; Dreisbach, Caitlin

    2016-01-01

    This study attempts to advance theorizing about health policy advocacy with combinations of narrative focus and a statistical map in an attempt to increase state legislators' support for policies to address the issue of obesity by reducing food deserts. Specifically, we examine state legislators' responses to variations in narrative focus (individual vs. community) about causes and solutions for food deserts in U.S. communities, and a statistical map (presence vs. absence) depicting the prevalence of food deserts across the United States. Using a Web-based randomized experiment (N=496), we show that narrative focus and the statistical map interact to produce different patterns of cognitive response and support for policies to reduce the prevalence of food deserts. The presence of a statistical map showing the prevalence of food deserts in the United States appeared to matter only when combined with an individual narrative, offsetting the fact that the individual narrative in isolation produced fewer thoughts consistent with the story's persuasive goal and more counterarguments in opposition to environmental causes and solutions for obesity than other message conditions. The image did not have an impact when combined with a story describing a community at large. Cognitive responses fully mediated message effects on intended persuasive outcomes. We conclude by discussing the study's contributions to communication theory and practice.

  6. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussion on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal-insulin-supported therapy in patients with type 2 diabetes initially prescribed glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses involve a trade-off between data availability, clinical plausibility, and statistical feasibility. The Cox proportional hazards model allows for the evaluation of outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
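The time-to-event machinery the authors recommend starts from estimators such as Kaplan-Meier, which respects censoring and differing follow-up times that a simple case-control tabulation ignores. A minimal pure-Python sketch with invented follow-up data:

```python
# A minimal Kaplan-Meier estimator (pure Python, illustrative data only).
def kaplan_meier(times, events):
    """times: follow-up durations; events: 1 = event observed, 0 = censored."""
    pairs = sorted(zip(times, events))
    at_risk, surv, curve = len(pairs), 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = n_t = 0
        while i < len(pairs) and pairs[i][0] == t:  # group ties at time t
            d += pairs[i][1]; n_t += 1; i += 1
        if d:                                       # event(s) at this time
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= n_t                              # events and censorings leave
    return curve

# Hypothetical months to insulin initiation, with censoring at end of follow-up
curve = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 1, 0, 0])
print(curve)
```

Censored patients contribute person-time while they are at risk and then drop out of the denominator, which is precisely what a naive case-control proportion gets wrong.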

  7. The large break LOCA evaluation method with the simplified statistic approach

    International Nuclear Information System (INIS)

    Kamata, Shinya; Kubo, Kazuo

    2004-01-01

    The USNRC published the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology for large break LOCA, which supported the revised rule for Emergency Core Cooling System performance, in 1989. USNRC Regulatory Guide 1.157 requires that the peak cladding temperature (PCT) not exceed 2200°F at the 95th percentile with high probability. In recent years, organizations overseas have developed statistical methodologies and best estimate codes with models that provide more realistic simulation of the phenomena, based on the CSAU evaluation methodology. To calculate the PCT probability distribution by Monte Carlo trials, there are approaches such as the response surface technique using polynomials, the order statistics method, etc. For the purpose of performing rational statistical analysis, Mitsubishi Heavy Industries, Ltd. (MHI) developed a statistical LOCA method using the best estimate LOCA code MCOBRA/TRAC and the simplified code HOTSPOT. HOTSPOT is a Monte Carlo heat conduction solver used to evaluate the uncertainties of the significant fuel parameters at the PCT positions of the hot rod. Direct uncertainty sensitivity studies can be performed without a response surface, because the Monte Carlo simulation for the key parameters can be performed in a short time using HOTSPOT. With regard to the parameter uncertainties, MHI established the treatment whereby bounding conditions are given for the LOCA boundary and plant initial conditions, and the Monte Carlo simulation using HOTSPOT is applied to the significant fuel parameters. The paper describes the large break LOCA evaluation method with the simplified statistical approach and the results of applying the method to a representative four-loop nuclear power plant. (author)
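The order statistics method mentioned above is usually Wilks' formula: with n independent code runs, the largest result bounds the true 95th percentile with confidence 1 - 0.95**n, which first reaches 95% at n = 59. A sketch:

```python
# Wilks' first-order, one-sided sample size: the largest of n independent runs
# exceeds the true `quantile` with confidence 1 - quantile**n.
def wilks_n(quantile=0.95, confidence=0.95):
    n = 1
    while quantile ** n > 1 - confidence:
        n += 1
    return n

print(wilks_n())  # 59, the classic 95/95 first-order sample size
```

This is why 95/95 statistical LOCA analyses are often quoted as needing 59 code runs; higher-order variants trade more runs for a less conservative bound.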

  8. Evaluation of the performance of Moses statistical engine adapted to ...

    African Journals Online (AJOL)

    ... of Moses statistical engine adapted to English-Arabic language combination. ... of Artificial Intelligence (AI) dedicated to Natural Language Processing (NLP). ... and focuses on SMT, then introducing the features of the open source Moses ...

  9. Empirical and Statistical Evaluation of the Effectiveness of Four ...

    African Journals Online (AJOL)

    Akorede

    ABSTRACT: Data compression is the process of reducing the size of a file to effectively ... Through the statistical analysis performed using Boxplot and ANOVA and comparison made ...... Automatic Control, Electronics and Computer Science.

  10. Use of Statistics for Data Evaluation in Environmental Radioactivity Measurements

    International Nuclear Information System (INIS)

    Sutarman

    2001-01-01

    Counting statistics provide a correction to environmental radioactivity measurement results. Statistics provides formulas to determine the standard deviation (S_B) and the minimum detectable concentration (MDC) according to the Poisson distribution. Both formulas depend on the background count rate, counting time, counting efficiency, gamma intensity, and sample size. A long background counting time results in a relatively low S_B and MDC, which yields relatively accurate measurement results. (author)
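
The record names the quantities S_B and MDC but not the author's exact equations; the sketch below therefore uses Currie's widely cited detection-limit formula (an assumption on my part, not taken from the paper) to show how a longer background count lowers the MDC.

```python
import math

def background_sd(b_rate, t):
    """Standard deviation of a background count (Poisson): sqrt(B)."""
    return math.sqrt(b_rate * t)

def mdc_currie(b_rate, t, efficiency, intensity, mass):
    """Minimum detectable concentration (Bq/kg) via Currie's expression
    L_D = 2.71 + 4.65*sqrt(B), with B the background counts in time t.
    (Assumed form; the record does not reproduce the author's equations.)"""
    B = b_rate * t                        # background counts
    L_D = 2.71 + 4.65 * math.sqrt(B)      # detection limit in counts
    return L_D / (t * efficiency * intensity * mass)
```

Since L_D grows like sqrt(t) while the denominator grows like t, the MDC falls roughly as 1/sqrt(t), which is the "long background counting" benefit the abstract describes.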

  11. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced transitions Monte Carlo method, direct statistical estimation Monte Carlo, and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest computational efficiency.
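
The paper's estimators are not reproduced in the record, but the variance advantage of weighted (importance-sampled) estimation over analog Monte Carlo is easy to demonstrate on a toy problem: estimating the small probability that an exponentially distributed lifetime ends before a mission time. All numbers below are invented.

```python
import math
import random

random.seed(1)
LAM, MU, T, N = 1e-4, 1e-2, 10.0, 20_000
p_true = 1.0 - math.exp(-LAM * T)        # ~1e-3, the target unreliability

# Direct (analog) Monte Carlo: almost every trial is a non-failure,
# so the estimator is dominated by rare-hit noise.
direct = sum(random.expovariate(LAM) < T for _ in range(N)) / N

# Weighted estimation: sample failure times from a biased density
# g(t) = MU*exp(-MU*t) that fails far more often, and correct each hit
# with the likelihood ratio f(t)/g(t).
acc = 0.0
for _ in range(N):
    t = random.expovariate(MU)
    if t < T:
        acc += (LAM * math.exp(-LAM * t)) / (MU * math.exp(-MU * t))
weighted = acc / N
```

With the same number of trials, the weighted estimator's relative error is on the order of a few percent, whereas the analog one sees only a handful of failures, illustrating why the weighted estimator wins the comparison reported above.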

  12. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that regression models relating calculated fuel temperatures to thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the fuel temperature within a given range.
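
As a rough illustration of the control-charting idea (not NDMAS's actual rules, which the record does not detail), the sketch below derives 3-sigma control limits from a baseline window of thermocouple readings and flags excursions such as a failed sensor; all readings are invented.

```python
# Hypothetical 3-sigma control chart for thermocouple readings.

def control_limits(baseline):
    n = len(baseline)
    mean = sum(baseline) / n
    var = sum((x - mean) ** 2 for x in baseline) / (n - 1)
    sd = var ** 0.5
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(readings, lo, hi):
    """Indices of readings outside the control limits."""
    return [i for i, x in enumerate(readings) if not lo <= x <= hi]

baseline = [1100.2, 1099.8, 1100.5, 1100.1, 1099.6, 1100.3, 1099.9, 1100.0]
lo, hi = control_limits(baseline)
flags = out_of_control([1100.1, 1099.7, 950.0, 1100.2], lo, hi)
```

A reading that drops far below the band (here the third one, resembling a failed-open thermocouple) is flagged immediately, complementing the correlation and regression checks described in the abstract.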

  13. Statistical evaluation and measuring strategy for extremely small line shifts

    International Nuclear Information System (INIS)

    Hansen, P.G.

    1978-01-01

    For a measuring situation limited by counting statistics, but where the level of precision is such that possible systematic errors are a major concern, it is proposed to determine the position of a spectral line from a measured line segment by applying a bias correction to the centre of gravity of the segment. This procedure is statistically highly efficient and not sensitive to small errors in assumptions about the line shape. The counting strategy for an instrument that takes data point by point is also considered. It is shown that an optimum ("two-point") strategy exists; a scan of the central part of the line is 68% efficient by this standard. (Auth.)

  14. Statistical methods for the evaluation of educational services and quality of products

    CERN Document Server

    Bini, Matilde; Piccolo, Domenico; Salmaso, Luigi

    2009-01-01

    The book presents statistical methods and models that can usefully support the evaluation of educational services and the quality of products. The evaluation of educational services, as well as the analysis of judgments and preferences, poses severe methodological challenges because of the following aspects: the observational nature of the context, which is associated with the problems of selection bias and the presence of nuisance factors; the hierarchical structure of the data (multilevel analysis); the multivariate and qualitative nature of the dependent variable; the presence of non-observable factors, e.g. satisfaction, calling for the use of latent variable models; and the simultaneous presence of components of pleasure and components of uncertainty in the explication of the judgments, which calls for the specification and estimation of mixture models. The contributions concern methodological advances developed mostly with reference to specific problems of evaluation using real data sets.

  15. A Statistical Approach for Deriving Key NFC Evaluation Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. K; Kang, G. B.; Ko, W. I [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Young, S. R.; Gao, R. X. [Univ. of Science and Technology, Daejeon (Korea, Republic of)

    2014-02-15

    This study suggests 5 evaluation criteria (safety and technology, environmental impact, economic feasibility, social factors, and institutional factors) and 24 evaluation indicators for a NFC (nuclear fuel cycle), derived using factor analysis. To do so, a survey using one-on-one interviews was administered to nuclear energy experts and local residents who live near nuclear power plants. In addition, by conducting a factor analysis, homogeneous evaluation indicators were grouped under the same evaluation criteria, and unnecessary evaluation criteria and indicators were dropped. As a result of analyzing the weights of the evaluation criteria for the samples of nuclear power experts and the general public, both sides recognized safety as the most important evaluation criterion, and social factors such as public acceptance were ranked as more important by the nuclear energy experts than by the general public.

  16. A Statistical Approach for Deriving Key NFC Evaluation Criteria

    International Nuclear Information System (INIS)

    Kim, S. K; Kang, G. B.; Ko, W. I; Young, S. R.; Gao, R. X.

    2014-01-01

    This study suggests 5 evaluation criteria (safety and technology, environmental impact, economic feasibility, social factors, and institutional factors) and 24 evaluation indicators for a NFC (nuclear fuel cycle), derived using factor analysis. To do so, a survey using one-on-one interviews was administered to nuclear energy experts and local residents who live near nuclear power plants. In addition, by conducting a factor analysis, homogeneous evaluation indicators were grouped under the same evaluation criteria, and unnecessary evaluation criteria and indicators were dropped. As a result of analyzing the weights of the evaluation criteria for the samples of nuclear power experts and the general public, both sides recognized safety as the most important evaluation criterion, and social factors such as public acceptance were ranked as more important by the nuclear energy experts than by the general public

  17. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence face growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low-quantity or degraded evidence leading to allele and locus dropout; allele sharing among contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
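
For a concrete sense of the CPI calculation: at each locus, the probability of including a random unrelated person is the square of the summed frequencies of the alleles detected in the mixture, and the per-locus values multiply across loci. The allele frequencies below are invented for illustration.

```python
# Combined Probability of Inclusion: per locus, PI = (sum of the observed
# alleles' population frequencies)^2; CPI is the product over loci.
# All frequencies here are made up for illustration.

def cpi(loci_allele_freqs):
    total = 1.0
    for freqs in loci_allele_freqs:
        total *= sum(freqs) ** 2
    return total

profile = [
    [0.10, 0.20, 0.05],   # locus 1: frequencies of alleles seen in mixture
    [0.15, 0.25],         # locus 2
    [0.30, 0.10, 0.05],   # locus 3
]
value = cpi(profile)      # probability a random individual is "included"
```

Note that the protocol issues the article addresses (dropout, stacking, stutter) concern which alleles may legitimately enter those per-locus lists, not the arithmetic itself.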

  18. Statistical evaluation of SAGE libraries: consequences for experimental design

    NARCIS (Netherlands)

    Ruijter, Jan M.; van Kampen, Antoine H. C.; Baas, Frank

    2002-01-01

    Since the introduction of serial analysis of gene expression (SAGE) as a method to quantitatively analyze the differential expression of genes, several statistical tests have been published for the pairwise comparison of SAGE libraries. Testing the difference between the number of specific tags

  19. Two statistics for evaluating parameter identifiability and error reduction

    Science.gov (United States)

    Doherty, John; Hunt, Randall J.

    2009-01-01

    Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability, in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero; and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics. © 2009 Elsevier B.V.
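
Under the paper's definitions, the first statistic can be computed as the length of each parameter axis's projection onto the space spanned by the first k right singular vectors of the sensitivity matrix. The sketch below uses an invented toy Jacobian in which the third parameter influences no observation, so its identifiability must be zero.

```python
import numpy as np

def identifiability(J, k):
    """Parameter identifiability in the sense of Doherty & Hunt: the cosine
    between each parameter axis and its projection onto the calibration
    solution space spanned by the first k right singular vectors of the
    weighted Jacobian J (rows = observations, columns = parameters)."""
    _, _, Vt = np.linalg.svd(J, full_matrices=True)
    V_sol = Vt[:k].T                      # solution-space basis, one row/param
    return np.sqrt((V_sol ** 2).sum(axis=1))

# Toy Jacobian: parameters 0 and 1 are observed, parameter 2 is not.
J = np.array([[1.0, 0.1, 0.0],
              [0.9, 0.2, 0.0],
              [0.0, 1.0, 0.0]])
ident = identifiability(J, k=2)
```

The unobserved parameter lies entirely in the calibration null space, so its direction cosine with the solution space is zero, while the two observed parameters are fully identifiable here.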

  20. Evaluating the One-in-Five Statistic: Women's Risk of Sexual Assault While in College.

    Science.gov (United States)

    Muehlenhard, Charlene L; Peterson, Zoë D; Humphreys, Terry P; Jozkowski, Kristen N

    In 2014, U.S. president Barack Obama announced a White House Task Force to Protect Students From Sexual Assault, noting that "1 in 5 women on college campuses has been sexually assaulted during their time there." Since then, this one-in-five statistic has permeated public discourse. It is frequently reported, but some commentators have criticized it as exaggerated. Here, we address the question, "What percentage of women are sexually assaulted while in college?" After discussing definitions of sexual assault, we systematically review available data, focusing on studies that used large, representative samples of female undergraduates and multiple behaviorally specific questions. We conclude that one in five is a reasonably accurate average across women and campuses. We also review studies that are inappropriately cited as either supporting or debunking the one-in-five statistic; we explain why they do not adequately address this question. We identify and evaluate several assumptions implicit in the public discourse (e.g., the assumption that college students are at greater risk than nonstudents). Given the empirical support for the one-in-five statistic, we suggest that the controversy occurs because of misunderstandings about studies' methods and results and because this topic has implications for gender relations, power, and sexuality; this controversy is ultimately about values.

  1. A STATISTICAL APPROACH FOR DERIVING KEY NFC EVALUATION CRITERIA

    Directory of Open Access Journals (Sweden)

    S.K. KIM

    2014-02-01

    As a result of analyzing the weight of evaluation criteria with the sample of nuclear power experts and the general public, both sides recognized safety as the most important evaluation criterion, and the social factors such as public acceptance appeared to be ranked as more important evaluation criteria by the nuclear energy experts than the general public.

  2. Evaluation of local corrosion life by statistical method

    International Nuclear Information System (INIS)

    Kato, Shunji; Kurosawa, Tatsuo; Takaku, Hiroshi; Kusanagi, Hideo; Hirano, Hideo; Kimura, Hideo; Hide, Koichiro; Kawasaki, Masayuki

    1987-01-01

    In this paper, for the purpose of achieving life extension of light water reactors, we examined the evaluation of local corrosion by statistical methods and their application to nuclear power plant components. There are many examples of evaluating the maximum cracking depth of local corrosion by the doubly exponential distribution, and this evaluation method has been established. However, evaluating the service lives of construction materials by statistical methods has not yet been established. In order to establish service life evaluation by statistical methods, we must strive to collect local corrosion data and pursue their analytical research. (author)
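
A minimal sketch of the established extreme-value approach mentioned above: fit the doubly exponential (Gumbel) distribution to maximum pit depths by the method of moments and extrapolate a return level. The depth values and the return period below are invented for illustration.

```python
import math

# Method-of-moments Gumbel fit for maximum pit depths (invented data).

def gumbel_fit(maxima):
    n = len(maxima)
    mean = sum(maxima) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in maxima) / (n - 1))
    beta = sd * math.sqrt(6) / math.pi    # scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni const.)
    return mu, beta

def depth_for_return_period(mu, beta, T):
    """Depth expected to be exceeded once in T inspected areas."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

mu, beta = gumbel_fit([0.42, 0.55, 0.48, 0.61, 0.51, 0.58, 0.45, 0.52])
d100 = depth_for_return_period(mu, beta, 100)   # e.g. depth in mm
```

Extrapolating a return level from maxima is exactly the step whose extension to full service-life prediction the abstract says is not yet established.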

  3. Statistical and Multifractal Evaluation of Soil Compaction in a Vineyard

    Science.gov (United States)

    Marinho, M.; Raposo, J. R.; Mirás Avalos, J. M.; Paz González, A.

    2012-04-01

    One of the detrimental effects caused by agricultural machines is soil compaction, which can be defined as an increase in soil bulk density. Soil compaction often has a negative impact on plant growth, since it reduces macroporosity and soil permeability and increases resistance to penetration. Our research explored the effect of agricultural machinery trafficking through a vineyard on the soil at a small spatial scale, based on an evaluation of the soil compaction status. The objectives of this study were: i) to quantify soil bulk density along transects following the vine row, the wheel track, and outside the track, and ii) to characterize the variability of the bulk density along these transects using multifractal analysis. The field work was conducted at the experimental farm of EVEGA (Viticulture and Enology Centre of Galicia) located in Ponte San Clodio, Leiro, Orense, Spain. Three parallel transects were marked on positions with contrasting machine traffic effects, i.e. vine row, wheel-track and outside-track. Undisturbed samples were collected at 16 points of each transect, spaced 0.50 m apart, for bulk density determination using the cylinder method. Samples were taken in autumn 2011, after grape harvest. Since the soil between vine rows had been tilled and homogenized at the beginning of spring 2011, the cumulative effects of traffic during the vine growth period could be evaluated. The distribution patterns of soil bulk density were characterized by multifractal analysis carried out by the method of moments. Multifractality was assessed by several indexes derived from the mass exponent, τq, the generalized dimension, Dq, and the singularity spectrum, f(α), curves. Mean soil bulk density values determined for the vine row, outside-track and wheel-track transects were 1.212 kg dm⁻³, 1.259 kg dm⁻³ and 1.582 kg dm⁻³, respectively. The respective coefficients of variation (CV) for these three transects were 7.76%, 4.82% and 2.03%. Therefore mean bulk density under wheel-track was 30
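
The method of moments used for the multifractal analysis can be sketched as follows: compute the partition function Z(q, s) over dyadic boxes along the transect, estimate the mass exponent τ(q) as the log-log slope, and obtain the generalized dimension D_q = τ(q)/(q − 1). The code below is a simplified illustration on a uniform 16-point transect (for which D_q should equal 1), not the authors' implementation.

```python
import math

def generalized_dimension(values, q):
    """Method-of-moments D_q for a 1D transect of positive measurements."""
    total = sum(values)
    masses = [v / total for v in values]          # normalized measure
    pts = []
    for s in (1, 2, 4, 8):                        # dyadic box sizes (samples)
        boxes = [sum(masses[i:i + s]) for i in range(0, len(masses), s)]
        z = sum(m ** q for m in boxes if m > 0)   # partition function Z(q, s)
        pts.append((math.log(s), math.log(z)))
    # least-squares slope of log Z versus log s estimates tau(q)
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    tau = (sum((x - mx) * (y - my) for x, y in pts)
           / sum((x - mx) ** 2 for x, _ in pts))
    return tau / (q - 1)

d2_uniform = generalized_dimension([1.0] * 16, q=2)   # ~1 for a uniform measure
```

A homogeneous transect gives D_q = 1 for all q; departures of D_q from a flat spectrum are what the study uses to characterize spatial variability of bulk density.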

  4. Statistical modeling for visualization evaluation through data fusion.

    Science.gov (United States)

    Chen, Xiaoyu; Jin, Ran

    2017-11-01

    There is high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements, and visualization logs available in user-centered evaluation. This paper proposes a data fusion model and the application procedure for quantitative and online visualization evaluation. 15 participants joined the study based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data sets for visualization evaluation. This model can be widely applied to data visualization evaluation, and to other user-centered design evaluation and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
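
The paper's exact model is not given in the record; as a generic illustration of a regularized regression on fused sensing features, here is closed-form ridge regression on invented EEG/eye-movement/interaction-log features.

```python
import numpy as np

# Closed-form ridge regression sketch: predict a task-complexity score from
# fused features (EEG band power, fixation rate, log event count).
# All feature values and scores are invented.

def ridge_fit(X, y, lam):
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)   # regularized normal equations
    return np.linalg.solve(A, X.T @ y)

X = np.array([[0.8, 3.1, 12.0],
              [0.5, 2.2,  7.0],
              [0.9, 3.5, 14.0],
              [0.4, 1.9,  6.0]])
y = np.array([4.0, 2.5, 4.5, 2.0])

w = ridge_fit(X, y, lam=0.1)
pred = X @ w
```

The penalty term lam stabilizes the fit when the fused feature streams are strongly correlated, which is the usual situation when EEG, gaze, and log features all respond to the same workload.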

  5. Liquid metal systems development: reactor vessel support structure evaluation

    International Nuclear Information System (INIS)

    McEdwards, J.A.

    1981-01-01

    Results of an evaluation of support structures for the reactor vessel are reported. The U ring, box ring, integral ring, tee ring and tangential beam supports were investigated. The U ring is the recommended vessel support structure configuration

  6. Reliability Evaluation of Concentric Butterfly Valve Using Statistical Hypothesis Test

    International Nuclear Information System (INIS)

    Chang, Mu Seong; Choi, Jong Sik; Choi, Byung Oh; Kim, Do Sik

    2015-01-01

    A butterfly valve is a type of flow-control device typically used to regulate a fluid flow. This paper presents an estimation of the shape parameter of the Weibull distribution, characteristic life, and B10 life for a concentric butterfly valve based on a statistical analysis of the reliability test data taken before and after the valve improvement. The difference in the shape and scale parameters between the existing and improved valves is reviewed using a statistical hypothesis test. The test results indicate that the shape parameter of the improved valve is similar to that of the existing valve, and that the scale parameter of the improved valve is found to have increased. These analysis results are particularly useful for a reliability qualification test and the determination of the service life cycles
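
The record reports Weibull shape and scale (characteristic life) estimates and a B10 life; for a Weibull distribution these are linked by the closed form B10 = η(−ln 0.9)^(1/β). The parameter values below are hypothetical, chosen only to mirror the reported finding (similar shape, larger scale after the improvement).

```python
import math

def b10_life(beta, eta):
    """B10 life: cycles by which 10% of units are expected to fail,
    from Weibull shape beta and characteristic life (scale) eta."""
    return eta * (-math.log(0.9)) ** (1.0 / beta)

# Hypothetical parameters for the existing vs. improved valve.
b10_old = b10_life(2.0, 50_000)
b10_new = b10_life(2.0, 80_000)
```

With the shape parameter unchanged, B10 scales linearly with η, so the increased scale parameter found for the improved valve translates directly into a proportionally longer B10 life.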

  7. Reliability Evaluation of Concentric Butterfly Valve Using Statistical Hypothesis Test

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Mu Seong; Choi, Jong Sik; Choi, Byung Oh; Kim, Do Sik [Korea Institute of Machinery and Materials, Daejeon (Korea, Republic of)

    2015-12-15

    A butterfly valve is a type of flow-control device typically used to regulate a fluid flow. This paper presents an estimation of the shape parameter of the Weibull distribution, characteristic life, and B10 life for a concentric butterfly valve based on a statistical analysis of the reliability test data taken before and after the valve improvement. The difference in the shape and scale parameters between the existing and improved valves is reviewed using a statistical hypothesis test. The test results indicate that the shape parameter of the improved valve is similar to that of the existing valve, and that the scale parameter of the improved valve is found to have increased. These analysis results are particularly useful for a reliability qualification test and the determination of the service life cycles.

  8. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  9. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident

  10. Environmental offences in 1995. An evaluation of statistics

    International Nuclear Information System (INIS)

    Goertz, M.; Werner, J.; Sanchez de la Cerda, J.; Schwertfeger, C.; Winkler, K.

    1997-01-01

    This publication deals with the execution of environmental criminal law. On the basis of police and judicial statistics it is pointed out how often an environmental criminal offence was at least suspected by the police or law courts, how they reacted to their suspicion, which individual environmental criminal offences were committed particularly frequently, and what segment of the population the typical perpetrator belonged to. (orig./SR) [de

  11. Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies

    International Nuclear Information System (INIS)

    Weber, K.H.

    1993-01-01

    In stochastic damages, the numbers of events, e.g. the persons who are affected by or have died of cancer, and thus the relative frequencies (incidence or mortality), are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damages in the low dose range, the following issues are interesting: - Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? - Is an observed frequency difference between two groups of persons random or statistically significant? - Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant, and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests. So-called distribution-free tests, which are not bound to the assumption of a normal distribution, are of particular interest, such as: the χ²-independence test (test in contingency tables); the Fisher-Yates test; the trend test according to Cochran; and the rank correlation test of Spearman. These tests are explained in terms of selected epidemiologic data, e.g. leukaemia clusters and the cancer mortality of the Japanese A-bomb survivors, especially in the low dose range, as well as the sample of cancer mortality in the high-background area of Yangjiang (China). (orig.) [de
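
As a worked example of the first test in the list, the χ²-independence statistic for a 2×2 contingency table can be computed directly from observed and expected counts; the table entries below are invented.

```python
# Hand-rolled chi-squared independence test for a 2x2 contingency table
# (exposed vs. unexposed, cases vs. controls); counts are invented.

def chi2_statistic(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    chi2 = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = rows[i] * cols[j] / total   # expected under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2

table = [[30, 70],    # exposed:   30 cases, 70 controls
         [15, 85]]    # unexposed: 15 cases, 85 controls
chi2 = chi2_statistic(table)
significant = chi2 > 3.84   # critical value of chi2 with 1 df at alpha = 0.05
```

Here the statistic exceeds the 5% critical value, so the frequency difference between the two groups would be judged statistically significant rather than random.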

  12. Evaluation of clustering statistics with N-body simulations

    International Nuclear Information System (INIS)

    Quinn, T.R.

    1986-01-01

    Two series of N-body simulations are used to determine the effectiveness of various clustering statistics in revealing initial conditions from evolved models. All the simulations contained 16384 particles and were integrated with the PPPM code. One series is a family of models with power at only one wavelength. The family contains five models with the wavelength of the power separated by factors of √2. The second series is a family of all equal power combinations of two wavelengths taken from the first series. The clustering statistics examined are the two point correlation function, the multiplicity function, the nearest neighbor distribution, the void probability distribution, the distribution of counts in cells, and the peculiar velocity distribution. It is found that the covariance function, the nearest neighbor distribution, and the void probability distribution are relatively insensitive to the initial conditions. The distribution of counts in cells show a little more sensitivity, but the multiplicity function is the best of the statistics considered for revealing the initial conditions

  13. Operational Contract Support: Economic Impact Evaluation and Measures of Effectiveness

    Science.gov (United States)

    2017-12-01

    Naval Postgraduate School, Monterey, California. MBA professional report: Operational Contract Support: Economic Impact Evaluation and Measures of Effectiveness. Keywords: economic impact evaluation, expeditionary economics, operational contract support, measure of effectiveness. 89 pages.

  14. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Binh T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that regression models relating calculated fuel temperatures to thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.

  15. Wind power statistics and an evaluation of wind energy density

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, M.; Parsa, S.; Majidi, M. [Materials and Energy Research Centre, Tehran (Iran, Islamic Republic of)

    1995-11-01

    In this paper the statistical data of fifty days' wind speed measurements at the MERC solar site are used to find the wind energy density and other wind characteristics with the help of the Weibull probability distribution function. It is emphasized that the Weibull and Rayleigh probability functions are useful tools for wind energy density estimation but are not quite appropriate for properly fitting actual wind data with a low mean speed or short-time records. One has to use either the actual wind data (histogram) or look for a better fit by other models of the probability function. (Author)
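
For the Weibull model referred to above, the mean wind power density has a closed form in the shape k and scale c: E = ½ρc³Γ(1 + 3/k). The sketch below evaluates it, taking standard sea-level air density as an assumption and the Rayleigh case k = 2 as an example.

```python
import math

def weibull_energy_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) for Weibull shape k and scale c (m/s):
    E = 0.5 * rho * c^3 * Gamma(1 + 3/k). rho defaults to sea-level air."""
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

def weibull_mean_speed(k, c):
    """Mean wind speed (m/s) implied by the same Weibull parameters."""
    return c * math.gamma(1.0 + 1.0 / k)

# Rayleigh distribution is the k = 2 special case commonly used for wind.
e_rayleigh = weibull_energy_density(2.0, 6.0)
```

Because the energy density depends on the third moment of the speed, a fitted distribution that matches the mean but misrepresents the tail (the paper's concern with low-mean, short records) can badly misestimate E.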

  16. Bayesian statistical evaluation of peak area measurements in gamma spectrometry

    International Nuclear Information System (INIS)

    Silva, L.; Turkman, A.; Paulino, C.D.

    2010-01-01

    We analyze results from determinations of peak areas for a radioactive source containing several radionuclides. The statistical analysis was performed using Bayesian methods based on the usual Poisson model for observed counts. This model does not appear to be a very good assumption for the counting system under investigation, even though it is not questioned as a whole by the inferential procedures adopted. We conclude that, in order to avoid incorrect inferences on relevant quantities, a further study is needed that allows us to include missing influence parameters and to select a model that explains the observed data much better.
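
The conjugate baseline that the paper questions is straightforward to write down: under the plain Poisson model with a Gamma prior on the count rate, the posterior is available in closed form. The prior parameters and counts below are invented for illustration.

```python
# Conjugate Bayesian update for a Poisson count rate: with a Gamma(a, b)
# prior on the rate and observed counts n_1..n_k from equal counting
# intervals, the posterior is Gamma(a + sum(n), b + k). Invented numbers.

def posterior_params(a, b, counts):
    return a + sum(counts), b + len(counts)

def posterior_mean(a, b):
    return a / b

a_post, b_post = posterior_params(1.0, 0.1, [52, 48, 50, 47, 53])
rate_estimate = posterior_mean(a_post, b_post)   # counts per interval
```

Overdispersion relative to this model (counts varying more than Poisson allows) is the kind of misfit that motivates the further study with additional influence parameters the authors call for.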

  17. Environmental offenses in 1999. An evaluation of statistics

    International Nuclear Information System (INIS)

    Goertz, M.; Werner, J.; Heinrich, M.

    2000-01-01

    A total of 43,382 known environmental offenses was recorded in 1999 as compared to 47,900 in 1998. There were 36,663 penal offenses (section 29 StGB), 48 penal offenses (section 28 StGB) and 6,671 offenses against other laws (BNatSchG, ChemG, etc.). This statistics covers chemical pollutants, radioactive materials, ionizing and non-ionizing radiation, noise and explosions. It is estimated that a much higher number of offenses went unnoticed [de

  18. A Blended Learning Experience in Statistics for Psychology Students Using the Evaluation as a Learning Tool

    Directory of Open Access Journals (Sweden)

    Alberto VALENTÍN CENTENO

    2016-05-01

    The statistics course in Applied Psychology was taught using different teaching models that incorporate active teaching methodologies. This experience combined approaches that prioritize the use of ICT with others in which evaluation becomes an element of learning. It involved the use of virtual platforms supporting teaching that facilitate learning, combining face-to-face and non-face-to-face activities. The design of the course components is inspired by the dimensions proposed by the Carless (2003) model, which uses evaluation as a learning element. The development of this experience has shown that the didactic proposal was interpreted positively by students. Students recognized that they had to learn and deeply understand the basic concepts of the subject in order to teach and assess their peers.

  19. Application of statistical distribution theory to launch-on-time for space construction logistic support

    Science.gov (United States)

    Morgenthaler, George W.

    1989-01-01

    The ability to launch on time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction, the subsequent planned operation of space stations, large unmanned space structures, and lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis includes development of a better understanding of launch-on-time capability and simulation of the required support systems for vehicle assembly and launch that are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models that describe these historical data and that can be used for several purposes, such as providing inputs to broader simulations of launch vehicle logistic space construction support processes and determining which launch operations sources cause the majority of the unscheduled 'holds', hence suggesting changes which might improve launch-on-time performance. In particular, the paper investigates the ability of a compound probability distribution model to fit the actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
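
    The kind of model comparison the paper describes can be sketched by fitting candidate distributions to hold durations and comparing log-likelihoods (a simplified stand-in: the paper considers compound distributions, while this sketch contrasts maximum-likelihood exponential and lognormal fits on a synthetic record):

```python
import math
import random

def loglik_exponential(data):
    # Maximum-likelihood exponential fit: rate = n / sum(x)
    lam = len(data) / sum(data)
    return sum(math.log(lam) - lam * x for x in data)

def loglik_lognormal(data):
    # Maximum-likelihood lognormal fit via the log-transformed sample
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    s2 = sum((l - mu) ** 2 for l in logs) / len(logs)
    return sum(-math.log(x * math.sqrt(2.0 * math.pi * s2))
               - (math.log(x) - mu) ** 2 / (2.0 * s2) for x in data)

# Synthetic unscheduled-hold durations in minutes (heavy-tailed on purpose);
# real launch 'hold' data would replace this list
random.seed(2)
holds = [random.lognormvariate(2.0, 1.0) for _ in range(500)]

print(loglik_exponential(holds))
print(loglik_lognormal(holds))  # the larger log-likelihood fits better
```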

  20. A new quantum statistical evaluation method for time correlation functions

    International Nuclear Information System (INIS)

    Loss, D.; Schoeller, H.

    1989-01-01

    Considering a system of N identical interacting particles which obey Fermi-Dirac or Bose-Einstein statistics, the authors derive new formulas for correlation functions of the type C(t) = ⟨Σ_{i=1}^{N} A_i(t) Σ_{j=1}^{N} B_j⟩ (where B_j is diagonal in the free-particle states) in the thermodynamic limit. Thereby they apply and extend a superoperator formalism recently developed for the derivation of long-time tails in semiclassical systems. As an illustrative application, the Boltzmann-equation value of the time-integrated correlation function C(t) is derived in a straightforward manner. Due to exchange effects, the obtained t-matrix and the resulting scattering cross section, which occurs in the Boltzmann collision operator, are now functionals of the Fermi-Dirac or Bose-Einstein distribution.

  1. Evaluation of selected environmental decision support software

    International Nuclear Information System (INIS)

    Sullivan, T.M.; Moskowitz, P.D.; Gitten, M.

    1997-06-01

    Decision Support Software (DSS) continues to be developed to support the analysis of decisions pertaining to environmental management. Decision support systems are computer-based systems that facilitate the use of data, models, and structured decision processes in decision making. The optimal DSS should integrate, analyze, and present environmental information to remediation project managers in order to select cost-effective cleanup strategies. The optimal system should strike a balance between the sophistication needed to address the wide range of complicated sites and site conditions present at DOE facilities, and ease of use (e.g., the system should not require data that are typically unknown and should have robust error checking of the problem definition through input). In the first phase of this study, an extensive review of the literature and the Internet, and discussions with sponsors and developers of DSS, led to the identification of approximately fifty software packages that met the preceding definition.

  2. STATISTICAL EVALUATION OF EXAMINATION TESTS IN MATHEMATICS FOR ECONOMISTS

    Directory of Open Access Journals (Sweden)

    KASPŘÍKOVÁ, Nikola

    2012-12-01

    Examination results are rather important for many students with regard to their future professional development. Results of exams should be carefully inspected by teachers to help improve the design and evaluation of tests and the education process in general. An analysis of examination papers in mathematics taken by students of the basic mathematics course at the University of Economics in Prague is reported. The first issue addressed is the identification of significant dependencies between performance in particular problem areas covered in the test, and also between particular items and the total score in the test or the ability level as a latent trait. The assessment is first performed with the Spearman correlation coefficient; items in the test are then evaluated within the Item Response Theory framework. The second analytical task addressed is a search for groups of students who are similar with respect to performance in the test. Cluster analysis is performed using the partitioning around medoids method, and final model selection is made according to average silhouette width. Results of the clustering, which may also be considered in connection with setting the minimum score for passing the exam, show that two groups of students can be identified. The group which may be called "well-performers" is the more clearly defined one.
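
    The first analytical step described above, rank correlation between an item and the total score, can be sketched in a few lines (the scores below are hypothetical, and this is a plain Spearman computation, not the authors' full IRT analysis):

```python
def ranks(xs):
    # Average ranks with tie handling (1-based)
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Spearman rho = Pearson correlation of the rank vectors
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

item = [2, 0, 1, 3, 2, 1]      # hypothetical scores on one test item
total = [14, 5, 8, 18, 12, 9]  # hypothetical total test scores
print(spearman(item, total))
```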

  3. Statistical evaluations of current sampling procedures and incomplete core recovery

    International Nuclear Information System (INIS)

    Heasler, P.G.; Jensen, L.

    1994-03-01

    This document develops two formulas that describe the effects of incomplete recovery on core sampling results for the Hanford waste tanks. The formulas evaluate incomplete core recovery from a worst-case (i.e., biased) and a best-case (i.e., unbiased) perspective. A core sampler is unbiased if the sample material recovered is a random sample of the material in the tank, while any sampler that preferentially recovers a particular type of waste over others is a biased sampler. There is strong evidence to indicate that the push-mode sampler presently used at the Hanford site is a biased one. The formulas presented here show the effects of incomplete core recovery on the accuracy of composition measurements, as functions of the vertical variability in the waste. These equations are evaluated using vertical variability estimates from previously sampled tanks (B110, U110, C109). Assuming that the values of vertical variability used in this study adequately describe the Hanford tank farm, one can use the formulas to compute the effect of incomplete recovery on the accuracy of an average constituent estimate. To determine acceptable recovery limits, we have assumed that the relative error of such an estimate should be no more than 20%.

  4. When Is Statistical Evidence Superior to Anecdotal Evidence in Supporting Probability Claims? The Role of Argument Type

    Science.gov (United States)

    Hoeken, Hans; Hustinx, Lettica

    2009-01-01

    Under certain conditions, statistical evidence is more persuasive than anecdotal evidence in supporting a claim about the probability that a certain event will occur. In three experiments, it is shown that the type of argument is an important condition in this respect. If the evidence is part of an argument by generalization, statistical evidence…

  5. ASA conference on radiation and health: Health effects of electric and magnetic fields: Statistical support for research strategies. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1990-05-01

    This report is a collection of papers documenting presentations made at the VIII ASA (American Statistical Association) Conference on Radiation and Health entitled Health Effects of Electric and Magnetic Fields: Statistical Support for Research Strategies. Individual papers are abstracted and indexed for the database.

  6. Evaluation of a newly developed media-supported 4-step approach for basic life support training

    Directory of Open Access Journals (Sweden)

    Sopka Saša

    2012-07-01

    Objective: The quality of external chest compressions (ECC) is of primary importance within basic life support (BLS). Recent guidelines delineate the so-called "4-step approach" for teaching practical skills within resuscitation training guided by a certified instructor. The objective of this study was to evaluate whether a "media-supported 4-step approach" for BLS training leads to practical performance equal to that of the standard 4-step approach. Materials and methods: After baseline testing, 220 laypersons were trained either using the widely accepted method for resuscitation training (4-step approach) or using a newly created "media-supported 4-step approach", both of equal duration. In the latter approach, steps 1 and 2 were delivered via a standardised self-produced podcast, which included all of the information regarding the BLS algorithm and resuscitation skills. Participants were tested on manikins in the same mock cardiac arrest single-rescuer scenario prior to the intervention, after one week, and after six months with respect to ECC performance, and participants were surveyed about the approach. Results: Participants (age 23 ± 11, 69% female) reached comparable practical ECC performance in both groups, with no statistical difference. Even after six months, there was no difference in the quality of the initial assessment algorithm or in the delay before initiation of CPR. Overall, at least 99% of the intervention group (n = 99; mean 1.5 ± 0.8; 6-point Likert scale: 1 = completely agree, 6 = completely disagree) agreed that the video provided an adequate introduction to BLS skills. Conclusions: The "media-supported 4-step approach" leads to practical ECC performance comparable to standard teaching, even with respect to retention of skills. Therefore, this approach could be useful in special educational settings where, for example, instructors' resources are sparse or large-group sessions

  7. Use of library statistics to support library and advisory services and ...

    African Journals Online (AJOL)

    Statistical information is a vital tool for management and development of organizations. Keeping statistics of activities is basic to the survival and progress of a library and enables the library to measure its performance periodically. The National Library of Nigeria (NLN) places high premium on the library statistics that it ...

  8. Fault diagnosis of automobile hydraulic brake system using statistical features and support vector machines

    Science.gov (United States)

    Jegadeeshwaran, R.; Sugumaran, V.

    2015-02-01

    Hydraulic brakes in automobiles are important components for the safety of passengers; therefore, the brakes are a good subject for condition monitoring. The condition of the brake components can be monitored by using their vibration characteristics. On-line condition monitoring using a machine learning approach is proposed in this paper as a possible solution to such problems. The vibration signals for both good and faulty conditions of the brakes were acquired from a hydraulic brake test setup with the help of a piezoelectric transducer and a data acquisition system. Descriptive statistical features were extracted from the acquired vibration signals, and feature selection was carried out using the C4.5 decision tree algorithm. There is no specific method for finding the right number of features required for classification for a given problem; hence, an extensive study is needed to find the optimum number of features. The effect of the number of features was therefore studied using the decision tree as well as Support Vector Machines (SVM). The selected features were classified using C-SVM and Nu-SVM with different kernel functions. The results are discussed and the conclusions of the study are presented.
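
    The feature-extraction stage described above can be sketched as follows (a minimal illustration assuming one window of vibration samples; the synthetic Gaussian signal and this particular feature set are stand-ins for the paper's measured signals and C4.5-selected features):

```python
import math
import random

def descriptive_features(signal):
    # Descriptive statistics of one vibration window, of the kind fed to a
    # feature selector and then to SVM classifiers
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in signal) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in signal) / (n * sd ** 4) - 3.0
    rms = math.sqrt(sum(x * x for x in signal) / n)
    return {"mean": mean, "sd": sd, "skew": skew, "kurtosis": kurt,
            "rms": rms, "min": min(signal), "max": max(signal)}

# Synthetic stand-in for one acquired vibration window
random.seed(0)
window = [random.gauss(0.0, 1.0) for _ in range(1024)]
print(descriptive_features(window))
```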

  9. Integrating observation and statistical forecasts over sub-Saharan Africa to support Famine Early Warning

    Science.gov (United States)

    Funk, Chris; Verdin, James P.; Husak, Gregory

    2007-01-01

    Famine early warning in Africa presents unique challenges and rewards. Hydrologic extremes must be tracked and anticipated over complex and changing climate regimes. The successful anticipation and interpretation of hydrologic shocks can initiate effective government response, saving lives and softening the impacts of droughts and floods. While both monitoring and forecast technologies continue to advance, discontinuities between monitoring and forecast systems inhibit effective decision making. Monitoring systems typically rely on high-resolution satellite remote-sensed normalized difference vegetation index (NDVI) and rainfall imagery. Forecast systems provide information in a variety of scales and formats. Non-meteorologists are often unable or unwilling to connect the dots between these disparate sources of information. To mitigate these problems, researchers at UCSB's Climate Hazard Group, NASA GIMMS, and USGS/EROS are implementing a NASA-funded integrated decision support system that combines the monitoring of precipitation and NDVI with statistical one-to-three-month forecasts. We present the monitoring/forecast system, assess its accuracy, and demonstrate its application in food-insecure sub-Saharan Africa.

  10. Statistical analysis supporting decision-making about opening a university library on Saturdays

    Directory of Open Access Journals (Sweden)

    Lisandra Maria Kovaliczn Nadal

    2017-06-01

    Concern for the welfare of employees and a significant reduction in the demand for book loans by postgraduate students on Saturdays led to a change in operating days at the information units of the Central Library Professor Faris Michaele (BICEN) at the State University of Ponta Grossa (UEPG), in Ponta Grossa, PR. The study was therefore intended to support the decision to close the university library on Saturdays in 2016. It was verified whether there is a statistically significant relationship between the type of library user and the number of books borrowed on Saturdays, and whether the loans of books by postgraduate students were relevant compared to those of other users. Based on loan data between February 2014 and December 2015, it was determined that there is a significant relationship between the type of library user and the number of borrowed books, and that the loans by undergraduate students are the most relevant. Also considering the saving of resources such as electricity and overtime, and continued compliance with the norms of the Ministry of Education (MEC) for the approval of undergraduate courses, closing the units on Saturdays during the academic year of 2016 was the right decision.
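
    The significance test implied above (the relationship between user type and loan counts) can be sketched as a Pearson chi-square test of independence (the contingency table below is invented for illustration; the study's actual data are not reproduced here):

```python
def chi_square_stat(table):
    # Pearson chi-square statistic for a contingency table of
    # user type (rows) x loan-volume band (columns)
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = row[i] * col[j] / total   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical Saturday-loan counts: rows = undergrad, postgrad, staff;
# columns = low / high number of books borrowed
table = [[120, 200], [45, 15], [30, 20]]
stat, df = chi_square_stat(table)
print(stat, df)  # compare stat to the chi-square critical value at df
```

    At df = 2, a statistic above 5.99 rejects independence at the 5% level.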

  11. Supporting Qualified Database for Uncertainty Evaluation

    International Nuclear Information System (INIS)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; Lisovyy, O.; D'Auria, F.

    2013-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. These uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the material, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The 'EH' (Engineering Handbook) of the input nodalization.

  12. Supporting qualified database for uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F. [Nuclear Research Group of San Piero A Grado, Univ. of Pisa, Via Livornese 1291, 56122 Pisa (Italy)

    2012-07-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. These uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the material, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The 'EH' (Engineering Handbook) of the input nodalization.

  13. The GNP testbed for operator support evaluation

    International Nuclear Information System (INIS)

    Goodstein, L.P.; Hedegaard, J.; Hoejberg, K.S.; Lind, M.

    1984-11-01

    The GNP project is an outgrowth of our work over the past few years in the area of man-machine system representation and modelling, particularly with an eye towards studying the activities of diagnosis and decision making in connection with complex technical systems. Previous publications have dealt with the conceptual basis for this work. However, there was felt to be a need for a realistic test bed of reasonable (and variable) complexity for evaluating the concepts by means of a suitably designed experimental program. This paper thus describes the so-called GNP project and the associated activity to date. The following points are covered: GNP as a prototypical process; GNP as a simulation; the current GNP experimental setup at Risoe; initial GNP experiments at Risoe; planning; and experience to date. (author)

  14. A Pilot Evaluation of the Family Caregiver Support Program

    Science.gov (United States)

    Chen, Ya-Mei; Hedrick, Susan C.; Young, Heather M.

    2010-01-01

    The purposes of this study were to evaluate a federal and state-funded Family Caregiver Support Program (FCSP) and explore what types of caregiver support service are associated with what caregiver outcomes. Information was obtained on a sample of 164 caregivers' use of eleven different types of support service. Descriptive and comparative…

  15. EVALUATION OF A NEW MEAN SCALED AND MOMENT ADJUSTED TEST STATISTIC FOR SEM.

    Science.gov (United States)

    Tong, Xiaoxiao; Bentler, Peter M

    2013-01-01

    Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and two well-known robust test statistics. A modification to the Satorra-Bentler scaled statistic is developed for the condition that sample size is smaller than degrees of freedom. The behavior of the four test statistics is evaluated with a Monte Carlo confirmatory factor analysis study that varies seven sample sizes and three distributional conditions obtained using Headrick's fifth-order transformation to nonnormality. The new statistic performs badly in most conditions except under the normal distribution. The goodness-of-fit χ² test based on maximum-likelihood estimation performed well under normal distributions as well as under a condition of asymptotic robustness. The Satorra-Bentler scaled test statistic performed best overall, while the mean scaled and variance adjusted test statistic outperformed the others at small and moderate sample sizes under certain distributional conditions.

  16. Application of Different Statistical Techniques in Integrated Logistics Support of the International Space Station Alpha

    Science.gov (United States)

    Sepehry-Fard, F.; Coulthard, Maurice H.

    1995-01-01

    The process used to predict the values of maintenance time dependent variable parameters, such as mean time between failures (MTBF), over time must be one that will not in turn introduce uncontrolled deviation into the results of the ILS analysis, such as the life cycle cost and spares calculations. A minor deviation in the values of maintenance time dependent variable parameters such as MTBF over time will have a significant impact on the logistics resource demands, International Space Station availability, and maintenance support costs. It is the objective of this report to identify the magnitude of the expected enhancement in the accuracy of the results for the International Space Station reliability and maintainability data packages by providing examples. These examples partially portray the necessary information by evaluating the impact of the said enhancements on the life cycle cost and the availability of the International Space Station.

  17. Program system for inclusion, settlement of account and statistical evaluation of on-line recherches

    International Nuclear Information System (INIS)

    Helmreich, F.; Nevyjel, A.

    1981-03-01

    The described program system is used to automate the administration of an information retrieval department. The data on the users and on every on-line session are stored in two files and can be evaluated in various statistics. Data acquisition is done interactively; the statistics programs run both in dialog and in batch mode. (author)

  18. A software package for acquisition, accounting and statistical evaluation of on-line retrieval

    International Nuclear Information System (INIS)

    Helmreich, F.; Nevyjel, A.

    1981-03-01

    The described program system is used to automate the administration of an information retrieval department. The data on the users and on every on-line session are stored in two files and can be evaluated in various statistics. Data acquisition is done interactively; the statistics programs run both in dialog and in batch mode. (author)

  19. Partial discharge testing: a progress report. Statistical evaluation of PD data

    International Nuclear Information System (INIS)

    Warren, V.; Allan, J.

    2005-01-01

    It has long been known that comparing the partial discharge results obtained from a single machine is a valuable tool enabling companies to observe the gradual deterioration of a machine stator winding and thus plan appropriate maintenance for the machine. In 1998, at the annual Iris Rotating Machines Conference (IRMC), a paper was presented that compared thousands of PD test results to establish the criteria for comparing results from different machines and the expected PD levels. At subsequent annual Iris conferences, using similar analytical procedures, papers were presented that supported the previous criteria and: in 1999, established sensor location as an additional criterion; in 2000, evaluated the effect of insulation type and age on PD activity; in 2001, evaluated the effect of manufacturer on PD activity; in 2002, evaluated the effect of operating pressure for hydrogen-cooled machines; in 2003, evaluated the effect of insulation type and setting Trac alarms; in 2004, re-evaluated the effect of manufacturer on PD activity. Before going further in database analysis procedures, it would be prudent to statistically evaluate the anecdotal evidence observed to date. The goal was to determine which variables of machine conditions greatly influenced the PD results and which did not. Therefore, this year's paper looks at the impact of operating voltage, machine type and winding type on the test results for air-cooled machines. Because of resource constraints, only data collected through 2003 were used; however, as before, the data are still standardized for frequency bandwidth and pruned to include only full-load-hot (FLH) results collected for one sensor on operating machines. All questionable data, and data from off-line testing or unusual machine conditions, were excluded, leaving 6824 results. Calibration of on-line PD test results is impractical; therefore, only results obtained using the same method of data collection and noise separation techniques are compared.

  20. Evaluating clinical and public health interventions: a practical guide to study design and statistics

    National Research Council Canada - National Science Library

    Katz, Mitchell H

    2010-01-01

    ... and observational studies. In addition to reviewing standard statistical analysis, the book has easy-to-follow explanations of cutting edge techniques for evaluating interventions, including propensity score analysis...

  1. AUTOMATIC LUNG NODULE DETECTION BASED ON STATISTICAL REGION MERGING AND SUPPORT VECTOR MACHINES

    Directory of Open Access Journals (Sweden)

    Elaheh Aghabalaei Khordehchi

    2017-06-01

    Lung cancer is one of the most common diseases in the world, and it can be treated if the lung nodules are detected in their early stages of growth. This study develops a new framework for computer-aided detection of pulmonary nodules through a fully automatic analysis of Computed Tomography (CT) images. In the present work, the multi-layer CT data are fed into a pre-processing step that exploits an adaptive diffusion-based smoothing algorithm in which the parameters are automatically tuned using an adaptation technique. After multiple levels of morphological filtering, the Regions of Interest (ROIs) are extracted from the smoothed images. The Statistical Region Merging (SRM) algorithm is applied to the ROIs in order to segment each layer of the CT data. Extracted segments in consecutive layers are then analyzed in such a way that if they intersect at more than a predefined number of pixels, they are labeled with the same index. The boundaries of the segments in adjacent layers which have the same indices are then connected together to form three-dimensional objects as the nodule candidates. After extracting four spectral, one morphological, and one textural feature from all candidates, they are finally classified into nodules and non-nodules using the Support Vector Machine (SVM) classifier. The proposed framework has been applied to two sets of lung CT images, and its performance has been compared to that of nine other competing state-of-the-art methods. The considerable efficiency of the proposed approach has been demonstrated quantitatively and validated by clinical experts as well.

  2. Consequence evaluation of hypothetical reactor pressure vessel support failure

    International Nuclear Information System (INIS)

    Lu, S.C.; Holman, G.S.; Lambert, H.E.

    1991-01-01

    This paper describes a consequence evaluation performed to address safety concerns raised by the radiation embrittlement of the reactor pressure vessel (RPV) supports of the Trojan nuclear power plant. The study comprises a structural evaluation and an effects evaluation, and assumes that all four reactor vessel supports have completely lost their load-carrying capability. The structural evaluation concludes that the Trojan reactor coolant loop (RCL) piping is capable of transferring loads to the steam generator (SG) supports and the reactor coolant pump (RCP) supports, and that the SG and RCP supports have sufficient design margins to accommodate the additional loads transferred to them through the RCL piping. The effects evaluation, employing a systems analysis approach, investigates initiating events and the reliability of the engineered safeguard systems as the RPV is subject to movements caused by the RPV support failure. The evaluation identifies a number of areas for further investigation and concludes that a hypothetical failure of the Trojan RPV supports due to radiation embrittlement will not result in consequences of significant safety concern. (author)

  3. Evaluation of design feature No.20 -- Ground support options

    International Nuclear Information System (INIS)

    Duan, F.

    2000-01-01

    Ground support options are primarily evaluated for emplacement drifts; ground support systems for non-emplacement openings such as access mains and ventilation drifts are not evaluated against LADS evaluation criteria in this report. Considerations include functional requirements for ground support, the use of a steel-lined system, and the feasibility of using an unlined ground support system principally with grouted rock bolts for permanent ground support. The feature evaluation also emphasizes the postclosure effects of ground support materials on waste isolation and preclosure aspects such as durability, maintainability, constructibility, safety, engineering acceptability, and cost. This evaluation is to: (A) Review the existing analyses, reports, and studies regarding this design feature, and compile relevant information on performance characteristics. (B) Develop an appropriate approach for evaluating ground support options against the evaluation criteria provided by the LADS team. (C) Evaluate ground support options not only for their preclosure performance in terms of drift stability, material durability, maintenance, constructibility, and cost, but also for their postclosure performance in terms of the chemical effects of ground support materials (i.e., concrete, steel) on waste isolation and radionuclide transport. Specifically, the scope of the ground support options evaluation includes: (1) all-steel-lined drifts (no cementitious materials), (2) unlined drifts with minimum cementitious materials (e.g., grout for rockbolts), and (3) concrete-lined drifts, with the focus on the postclosure acceptability evaluation. In addition, unlined drifts with zero cementitious materials (e.g., use of frictional bolts such as split sets or Swellex bolts) are briefly discussed. (D) Identify candidate ground support systems that have the potential to enhance repository performance based on the feature evaluation. (E) Provide conclusions and recommendations.

  4. Using statistical anomaly detection models to find clinical decision support malfunctions.

    Science.gov (United States)

    Ray, Soumi; McEvoy, Dustin S; Aaron, Skye; Hickman, Thu-Trang; Wright, Adam

    2018-05-11

    Malfunctions in Clinical Decision Support (CDS) systems occur due to a multitude of reasons, and often go unnoticed, leading to potentially poor outcomes. Our goal was to identify malfunctions within CDS systems. We evaluated 6 anomaly detection models: (1) Poisson Changepoint Model, (2) Autoregressive Integrated Moving Average (ARIMA) Model, (3) Hierarchical Divisive Changepoint (HDC) Model, (4) Bayesian Changepoint Model, (5) Seasonal Hybrid Extreme Studentized Deviate (SHESD) Model, and (6) E-Divisive with Median (EDM) Model and characterized their ability to find known anomalies. We analyzed 4 CDS alerts with known malfunctions from the Longitudinal Medical Record (LMR) and Epic® (Epic Systems Corporation, Madison, WI, USA) at Brigham and Women's Hospital, Boston, MA. The 4 rules recommend lead testing in children, aspirin therapy in patients with coronary artery disease, pneumococcal vaccination in immunocompromised adults and thyroid testing in patients taking amiodarone. Poisson changepoint, ARIMA, HDC, Bayesian changepoint and the SHESD model were able to detect anomalies in an alert for lead screening in children and in an alert for pneumococcal conjugate vaccine in immunocompromised adults. EDM was able to detect anomalies in an alert for monitoring thyroid function in patients on amiodarone. Malfunctions/anomalies occur frequently in CDS alert systems. It is important to be able to detect such anomalies promptly. Anomaly detection models are useful tools to aid such detections.
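
    The simplest of the six models, a Poisson changepoint, can be sketched in a few lines: scan every split of a daily alert-count series and keep the split that maximizes the two-segment Poisson likelihood. The counts below are invented for illustration, not data from the study.

    ```python
    import math

    def poisson_loglik(counts):
        """Profile log-likelihood of counts under one Poisson rate (MLE = mean).
        Log-factorial terms are dropped: they cancel when comparing splits
        of the same data."""
        total = sum(counts)
        if total == 0:
            return 0.0
        lam = total / len(counts)
        return total * math.log(lam) - len(counts) * lam

    def poisson_changepoint(counts):
        """Index k that maximizes the two-segment Poisson likelihood."""
        return max(range(1, len(counts)),
                   key=lambda k: poisson_loglik(counts[:k]) + poisson_loglik(counts[k:]))

    # Hypothetical daily firing counts for one CDS alert: the rule silently
    # breaks after day 6 and firings collapse.
    daily_alerts = [48, 52, 50, 47, 51, 49, 5, 4, 6, 3, 5, 4]
    print(poisson_changepoint(daily_alerts))  # -> 6
    ```

    A real deployment would add a significance test (e.g., a likelihood-ratio threshold) so that noise in a stable alert is not flagged as a malfunction.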

  5. A statistical approach to evaluate hydrocarbon remediation in the unsaturated zone

    International Nuclear Information System (INIS)

    Hajali, P.; Marshall, T.; Overman, S.

    1991-01-01

    This paper presents an evaluation of performance and cleanup effectiveness of a vapor extraction system (VES) in extracting chlorinated hydrocarbons and petroleum-based hydrocarbons (mineral spirits) from the unsaturated zone. The statistical analysis of soil concentration data to evaluate the VES remediation success is described. The site is a former electronics refurbishing facility in southern California; soil contamination from organic solvents was found mainly in five areas (Area A through E) beneath two buildings. The evaluation begins with a brief description of the site background, discusses the statistical approach, and presents conclusions

  6. A Formal Approach for RT-DVS Algorithms Evaluation Based on Statistical Model Checking

    Directory of Open Access Journals (Sweden)

    Shengxin Dai

    2015-01-01

    Energy saving is a crucial concern in embedded real-time systems. Many RT-DVS algorithms have been proposed to save energy while preserving deadline guarantees. This paper presents a novel approach to evaluating RT-DVS algorithms using statistical model checking. A scalable framework is proposed for RT-DVS algorithm evaluation, in which the relevant components are modeled as stochastic timed automata, and the evaluation metrics, including utilization bound, energy efficiency, battery awareness, and temperature awareness, are expressed as statistical queries. Evaluation of these metrics is performed by verifying the corresponding queries using UPPAAL-SMC and analyzing the statistical information provided by the tool. We demonstrate the applicability of our framework via a case study of five classical RT-DVS algorithms.

  7. Evaluation of decision support systems for nuclear accidents

    International Nuclear Information System (INIS)

    Sdouz, G.; Mueck, K.

    1998-05-01

    In order to adopt countermeasures to protect the public after an accident in a nuclear power plant in an appropriate and optimal way, decision support systems offer valuable assistance to the decision maker in choosing and optimizing protective actions. Such decision support systems range from simple systems that accumulate the parameters relevant to evaluating the situation, through prediction models for rapid evaluation of the expected dose, to systems that permit the evaluation and comparison of possible countermeasures. Since a decision support system is obviously also required in Austria, an evaluation was performed of the systems available, or under development, in other countries or unions. The aim was to determine the availability of decision support systems in various countries and to evaluate them with regard to the depth and extent of each system. The evaluation showed that most industrialized countries recognize the need for a decision support system, but actual systems are readily available and operable in only a few countries. Most systems are limited to early-phase consequences, i.e. dispersion calculations from calculated source terms and the estimation of exposure in the vicinity of the plant. Only a few systems offer the possibility of predicting long-term exposure by ingestion. A few systems also permit an evaluation of potential countermeasures, in most cases, however, limited to a few short-term countermeasures. Only one system, which is presently not operable, allows the evaluation of a large number of agricultural countermeasures. In this report the different systems are compared. The requirements for an Austrian decision support system are defined, and consequences for a possible utilization of a DSS or parts thereof in the Austrian decision support system are derived. (author)

  8. A Group Creativity Support System for Dynamic Idea Evaluation

    DEFF Research Database (Denmark)

    Ulrich, Frank

    2015-01-01

    Idea evaluation is necessary in most modern organizations to identify the level of novelty and usefulness of new ideas. However, current idea evaluation research hinders creativity by primarily supporting convergent thinking (narrowing ideas down to a few tangible solutions), while divergent...... thinking (the development of wildly creative and novel thought patterns) is discounted. In this paper, this current view of idea evaluation is challenged through the development of a prototype that supports dynamic idea evaluation. The prototype uses knowledge created during evaluative processes...... to facilitate divergent thinking in a Group Creativity Support System (GCSS) designed from state-of-the-art research. The prototype is interpretively explored through a field experiment in a Danish IS research department. Consequently, the prototype demonstrates the ability to include divergent thinking...

  9. Statistical evaluation of fracture characteristics of RPV steels in the ductile-brittle transition temperature region

    International Nuclear Information System (INIS)

    Kang, Sung Sik; Chi, Se Hwan; Hong, Jun Hwa

    1998-01-01

    A statistical analysis method was applied to the evaluation of fracture toughness in the ductile-brittle transition temperature region. Because cleavage fracture in steel is statistical in nature, fracture toughness data show a similar statistical trend. Using the three-parameter Weibull distribution, a fracture toughness vs. temperature curve (K-curve) was generated directly from a set of fracture toughness data at a selected temperature. Charpy V-notch impact energy was also used to obtain the K-curve through a K_IC-CVN (Charpy V-notch energy) correlation. Furthermore, this method was applied to evaluate the neutron irradiation embrittlement of reactor pressure vessel (RPV) steel. Most of the fracture toughness data were within the 95 percent confidence limits. The transition temperature shift predicted by the statistical analysis was compared with that obtained from the experimental data. (author)
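
    The three-parameter Weibull model used in such evaluations can be sketched directly. The threshold, scale, and shape values below are placeholders (the shape b = 4 is the value commonly assumed for cleavage fracture), not the paper's fitted parameters.

    ```python
    import math

    def weibull3_cdf(k_ic, k_min=20.0, k0=100.0, b=4.0):
        """Three-parameter Weibull failure probability for fracture toughness
        K_Ic (MPa*sqrt(m)). k_min is the threshold, k0 the scale, b the shape;
        b = 4 is the shape commonly assumed for cleavage fracture."""
        if k_ic <= k_min:
            return 0.0
        return 1.0 - math.exp(-(((k_ic - k_min) / (k0 - k_min)) ** b))

    def tolerance_bound(p, k_min=20.0, k0=100.0, b=4.0):
        """Inverse CDF: toughness at cumulative failure probability p,
        e.g. p = 0.025 and 0.975 for a 95 percent confidence band."""
        return k_min + (k0 - k_min) * (-math.log(1.0 - p)) ** (1.0 / b)

    # Median toughness for the illustrative parameters
    print(round(tolerance_bound(0.5), 1))  # -> 93.0
    ```

    With parameters fitted at several temperatures, evaluating `tolerance_bound` across the transition region traces out the K-curve and its confidence limits.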

  10. Statistical evaluation of characteristic SDDLV-induced stress resultants to discriminate between undamaged and damaged elements

    DEFF Research Database (Denmark)

    Hansen, Lasse Majgaard; Johansen, Rasmus Johan; Ulriksen, Martin Dalgaard

    2015-01-01

    of modified characteristic stress resultants, which are compared to a pre-defined tolerance value, without any thorough statistical evaluation. In the present paper, it is tested whether three widely used statistical pattern-recognition-based damage-detection methods can provide an effective statistical...... evaluation of the characteristic stress resultants, hence facilitating general discrimination between damaged and undamaged elements. The three detection methods in question enable outlier analysis on the basis of, respectively, Euclidean distance, Hotelling’s statistic, and Mahalanobis distance. The study...... alternately to an undamaged reference model with known stiffness matrix, hereby, theoretically, yielding characteristic stress resultants approaching zero in the damaged elements. At present, the discrimination between potentially damaged elements and undamaged ones is typically conducted on the basis...
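
    As a rough sketch of the third of these methods, the squared Mahalanobis distance of a two-dimensional "stress resultant" feature from an undamaged reference set can be computed directly from the sample covariance. The data points below are invented for illustration.

    ```python
    def mahalanobis_2d(x, data):
        """Squared Mahalanobis distance of a 2-D point from a reference set,
        using the sample mean and covariance of the reference data."""
        n = len(data)
        mx = sum(p[0] for p in data) / n
        my = sum(p[1] for p in data) / n
        sxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
        syy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
        sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
        det = sxx * syy - sxy * sxy  # covariance determinant
        dx, dy = x[0] - mx, x[1] - my
        # quadratic form d^T inv(Cov) d, with the 2x2 inverse written out
        return (syy * dx * dx - 2.0 * sxy * dx * dy + sxx * dy * dy) / det

    # Hypothetical stress-resultant pairs from the undamaged reference state
    healthy = [(1.0, 2.0), (1.1, 2.1), (0.9, 1.9),
               (1.05, 2.05), (0.95, 1.95), (1.02, 1.98)]
    d_ok = mahalanobis_2d((1.0, 2.0), healthy)   # near the reference cloud
    d_bad = mahalanobis_2d((0.2, 3.0), healthy)  # off the correlation axis
    print(d_bad > d_ok)  # -> True
    ```

    Unlike the Euclidean distance, the Mahalanobis form accounts for the correlation between the resultants, so a point that breaks the correlation pattern is flagged even if its raw deviation is modest.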

  11. Support for Alzheimer's Caregivers: Psychometric Evaluation of Familial and Friend Support Measures

    Science.gov (United States)

    Wilks, Scott E.

    2009-01-01

    Objective: Information on the shortened, 20-item version of the Perceived Social Support Scale (S-PSSS) is scarce. The purpose of this study is to evaluate the psychometric properties of the S-PSSS Family (SSfa) and Friends (SSfr) subscales. Method: Because of their common coping method of social support, a cross-sectional sample of Alzheimer's…

  12. Numerical evaluation of the statistical properties of a potential energy landscape

    International Nuclear Information System (INIS)

    Nave, E La; Sciortino, F; Tartaglia, P; Michele, C De; Mossa, S

    2003-01-01

    The techniques which allow the numerical evaluation of the statistical properties of the potential energy landscape for models of simple liquids are reviewed and critically discussed. Expressions for the liquid free energy and its vibrational and configurational components are reported. Finally, a possible model for the statistical properties of the landscape, which appears to describe correctly fragile liquids in the region where equilibrium simulations are feasible, is discussed

  13. Using Statistical and Probabilistic Methods to Evaluate Health Risk Assessment: A Case Study

    Directory of Open Access Journals (Sweden)

    Hongjing Wu

    2014-06-01

    Toxic chemicals and heavy metals in wastewater can cause serious adverse impacts on human health. Health risk assessment (HRA) is an effective tool for supporting decision-making and corrective actions in water quality management. HRA can also help people understand water quality and quantify the adverse effects of pollutants on human health. Due to the imprecision of data, measurement error and limited available information, uncertainty is inevitable in the HRA process. The purpose of this study is to integrate statistical and probabilistic methods to deal with censored and limited input data and thereby improve the reliability of the non-cancer HRA of dermal-contact exposure to contaminated river water by considering uncertainty. A case study in the Kelligrews River in St. John’s, Canada, was conducted to demonstrate the feasibility and capacity of the proposed approach. Five heavy metals were selected to evaluate the risk level: arsenic, molybdenum, zinc, uranium and manganese. The results showed that the probability of the total hazard index of dermal exposure exceeding 1 is very low, and there is no obvious evidence of risk in the study area.
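
    The core of a non-cancer screen of this kind is the hazard index, the sum of dose-to-reference-dose ratios across chemicals. A minimal sketch follows, with purely illustrative doses and reference doses (not the study's measured values):

    ```python
    def hazard_index(doses, rfds):
        """Non-cancer hazard index: the sum of hazard quotients (dose / RfD)
        across chemicals. HI <= 1 suggests no appreciable non-cancer risk,
        under the usual assumption of additive effects."""
        return sum(d / r for d, r in zip(doses, rfds))

    # Hypothetical dermal absorbed doses (mg/kg-day) and reference doses for
    # the five metals; these numbers are illustrative, not the study's values.
    doses = {"As": 1e-5, "Mo": 2e-4, "Zn": 5e-3, "U": 1e-5, "Mn": 4e-4}
    rfds = {"As": 3e-4, "Mo": 5e-3, "Zn": 3e-1, "U": 3e-3, "Mn": 2.4e-2}
    hi = hazard_index(doses.values(), rfds.values())
    print(hi < 1)  # -> True
    ```

    In a probabilistic HRA the point doses would be replaced by distributions (fitted to censored data) and the HI recomputed by Monte Carlo sampling, yielding the exceedance probability the abstract reports.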

  14. Learning across Languages: Bilingual Experience Supports Dual Language Statistical Word Segmentation

    Science.gov (United States)

    Antovich, Dylan M.; Graf Estes, Katharine

    2018-01-01

    Bilingual acquisition presents learning challenges beyond those found in monolingual environments, including the need to segment speech in two languages. Infants may use statistical cues, such as syllable-level transitional probabilities, to segment words from fluent speech. In the present study we assessed monolingual and bilingual 14-month-olds'…

  15. The effect on prospective teachers of the learning environment supported by dynamic statistics software

    Science.gov (United States)

    Koparan, Timur

    2016-02-01

    In this study, the effect of a learning environment supported by dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test/post-test control group design of the quasi-experimental research method, was carried out on a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four-hour classes on descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used in the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the application for a deeper examination of their views about it. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, and that the prospective teachers have an affirmative approach to the use of dynamic software and see it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.

  16. STATISTICAL EVALUATION OF SMALL SCALE MIXING DEMONSTRATION SAMPLING AND BATCH TRANSFER PERFORMANCE - 12093

    Energy Technology Data Exchange (ETDEWEB)

    GREER DA; THIEN MG

    2012-01-12

    The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste Treatment and Immobilization Plant (WTP) presents a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. DOE's Tank Operations Contractor, Washington River Protection Solutions (WRPS), has previously presented the results of mixing performance in two different sizes of small-scale DSTs to support scale-up estimates of full-scale DST mixing performance. Currently, sufficient sampling of DSTs is one of the largest programmatic risks that could prevent timely delivery of high-level waste to the WTP. WRPS has performed small-scale mixing and sampling demonstrations to study the ability to sufficiently sample the tanks. The statistical evaluation of the demonstration results, which leads to the conclusion that the two scales of small DST are behaving similarly and that full-scale performance is predictable, will be presented. This work is essential to reduce the risk of requiring a new dedicated feed sampling facility and will guide future optimization work to ensure the waste feed delivery mission will be accomplished successfully. This paper will focus on the analytical data collected from mixing, sampling, and batch transfer testing in the small-scale mixing demonstration tanks and how those data are being interpreted to begin to understand the relationship between samples taken prior to transfer and samples from the subsequent batches transferred. An overview of the types of data collected and examples of typical raw data will be provided. The paper will then discuss the processing and manipulation of the data that is necessary to begin evaluating sampling and batch transfer performance. This discussion will also include the evaluation of the analytical measurement capability with regard to the simulant material used in the demonstration tests. The

  17. Statistical analyses to support guidelines for marine avian sampling. Final report

    Science.gov (United States)

    Kinlan, Brian P.; Zipkin, Elise; O'Connell, Allan F.; Caldow, Chris

    2012-01-01

    Interest in development of offshore renewable energy facilities has led to a need for high-quality, statistically robust information on marine wildlife distributions. A practical approach is described to estimate the amount of sampling effort required to have sufficient statistical power to identify species-specific “hotspots” and “coldspots” of marine bird abundance and occurrence in an offshore environment divided into discrete spatial units (e.g., lease blocks), where “hotspots” and “coldspots” are defined relative to a reference (e.g., regional) mean abundance and/or occurrence probability for each species of interest. For example, a location with average abundance or occurrence that is three times larger than the mean (3x effect size) could be defined as a “hotspot,” and a location that is three times smaller than the mean (1/3x effect size) as a “coldspot.” The choice of the effect size used to define hot and coldspots will generally depend on a combination of ecological and regulatory considerations. A method is also developed for testing the statistical significance of possible hotspots and coldspots. Both methods are illustrated with historical seabird survey data from the USGS Avian Compendium Database. Our approach consists of five main components: 1. A review of the primary scientific literature on statistical modeling of animal group size and avian count data to develop a candidate set of statistical distributions that have been used or may be useful to model seabird counts. 2. Statistical power curves for one-sample, one-tailed Monte Carlo significance tests of differences of observed small-sample means from a specified reference distribution. These curves show the power to detect "hotspots" or "coldspots" of occurrence and abundance at a range of effect sizes, given assumptions which we discuss. 3. A model selection procedure, based on maximum likelihood fits of models in the candidate set, to determine an appropriate statistical
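
    Component 2, the Monte Carlo power curve, can be sketched under the simplest candidate distribution (Poisson): simulate a survey at a given effect size, compute a Monte Carlo p-value against the reference mean, and record the rejection rate. All parameter values below are illustrative, not the report's.

    ```python
    import math
    import random

    def mc_power(ref_mean, effect, n_samples, n_sims=200, n_null=199,
                 alpha=0.05, seed=7):
        """Estimate the power of a one-sample, one-tailed Monte Carlo test
        that a block's mean count exceeds the reference mean, assuming
        Poisson counts (the simplest member of a candidate distribution set)."""
        rng = random.Random(seed)

        def pois(lam):  # Knuth's method; adequate for small rates
            limit, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= limit:
                    return k
                k += 1

        rejections = 0
        for _ in range(n_sims):
            obs = sum(pois(ref_mean * effect) for _ in range(n_samples))
            # Monte Carlo p-value against the reference distribution
            ge = sum(1 for _ in range(n_null)
                     if sum(pois(ref_mean) for _ in range(n_samples)) >= obs)
            if (ge + 1) / (n_null + 1) <= alpha:
                rejections += 1
        return rejections / n_sims

    # Power to flag a 3x "hotspot" with 5 surveyed blocks vs. reference mean 2
    print(mc_power(2.0, 3.0, 5) > 0.9)  # -> True
    ```

    Sweeping `n_samples` at a fixed effect size traces out one power curve; repeating the sweep over effect sizes and candidate distributions reproduces the family of curves the report describes.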

  18. Evaluation of undergraduate nursing students' attitudes towards statistics courses, before and after a course in applied statistics.

    Science.gov (United States)

    Hagen, Brad; Awosoga, Olu; Kellett, Peter; Dei, Samuel Ofori

    2013-09-01

    Undergraduate nursing students must often take a course in statistics, yet there is scant research to inform teaching pedagogy. The objectives of this study were to assess nursing students' overall attitudes towards statistics courses - including (among other things) overall fear and anxiety, preferred learning and teaching styles, and the perceived utility and benefit of taking a statistics course - before and after taking a mandatory course in applied statistics. The authors used a pre-experimental research design (a one-group pre-test/post-test research design), by administering a survey to nursing students at the beginning and end of the course. The study was conducted at a University in Western Canada that offers an undergraduate Bachelor of Nursing degree. Participants included 104 nursing students, in the third year of a four-year nursing program, taking a course in statistics. Although students only reported moderate anxiety towards statistics, student anxiety about statistics had dropped by approximately 40% by the end of the course. Students also reported a considerable and positive change in their attitudes towards learning in groups by the end of the course, a potential reflection of the team-based learning that was used. Students identified preferred learning and teaching approaches, including the use of real-life examples, visual teaching aids, clear explanations, timely feedback, and a well-paced course. Students also identified preferred instructor characteristics, such as patience, approachability, in-depth knowledge of statistics, and a sense of humor. Unfortunately, students only indicated moderate agreement with the idea that statistics would be useful and relevant to their careers, even by the end of the course. 
Our findings validate anecdotal reports on statistics teaching pedagogy, although more research is clearly needed, particularly on how to increase students' perceptions of the benefit and utility of statistics courses for their nursing

  19. Development and psychometric evaluation of supportive leadership scales.

    Science.gov (United States)

    McGilton, Katherine S

    2003-12-01

    The purpose of this study was to develop and evaluate the psychometric properties of 2 supportive leadership scales, the Charge Nurse Support Scale and the Unit Manager Support Scale, designed for long-term-care environments. These 6-item self-report scales were administered to 70 nursing staff, and their internal consistency reliability, test-retest reliability, content validity, factor structure, and construct validity were investigated. Content validity was established with the assistance of experts. Both scales were deemed reliable. As hypothesized, a significant relationship was found between the measure of how nursing staff related to residents and measures of charge nurses' supportive behaviours (r = .42, p = .05). Reliable and valid measures of supportive leadership could thus be developed for use in identifying the quality of support provided to staff in long-term-care environments.

  20. Evaluation of mechanical properties of steel wire ropes by statistical methods

    Directory of Open Access Journals (Sweden)

    Boroška Ján

    1999-12-01

    The contribution deals with the evaluation of the mechanical properties of steel wire ropes using statistical methods, from the viewpoint of the quality of the single wires as well as the internal construction of the ropes. The evaluation is based on the loading capacity calculated from the strength and the number of folds and torsions. For better illustration, a box plot has been constructed.

  1. STATISTICAL EVALUATION OF THE IMPACT OF ECONOMIC FACTORS ON SOCIO-DEMOGRAPHICS OF THE COUNTRY

    Directory of Open Access Journals (Sweden)

    O. Evseenko

    2014-04-01

    The necessity of modeling economic and demographic indicators is substantiated theoretically. The influence of economic, social and environmental indicators on the socio-demographic factors of the country's development is researched. A statistical evaluation of the relationships, based on the method of correlation and regression analysis, is given.
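
    A correlation-and-regression evaluation of the kind described reduces, in its simplest form, to an ordinary least-squares fit plus Pearson's r. A self-contained sketch with toy data (not the study's indicators):

    ```python
    def simple_ols(x, y):
        """Ordinary least-squares fit y = a + b*x, plus Pearson's r."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        sxx = sum((xi - mx) ** 2 for xi in x)
        syy = sum((yi - my) ** 2 for yi in y)
        b = sxy / sxx          # slope
        a = my - b * mx        # intercept
        r = sxy / (sxx * syy) ** 0.5  # correlation coefficient
        return a, b, r

    # Toy data: a perfectly linear relation between two indicators
    a, b, r = simple_ols([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
    print(b, r)  # -> 2.0 1.0
    ```

    In a multi-indicator study the same machinery generalizes to multiple regression, with r replaced by a correlation matrix across the economic, social, and environmental series.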

  2. Evaluating an Active Learning Approach to Teaching Introductory Statistics: A Classroom Workbook Approach

    Science.gov (United States)

    Carlson, Kieth A.; Winquist, Jennifer R.

    2011-01-01

    The study evaluates a semester-long workbook curriculum approach to teaching a college level introductory statistics course. The workbook curriculum required students to read content before and during class and then work in groups to complete problems and answer conceptual questions pertaining to the material they read. Instructors spent class…

  3. Statistical evaluation of an interlaboratory comparison for the determination of uranium by potentiometric titration

    International Nuclear Information System (INIS)

    Ketema, D.J.; Harry, R.J.S.; Zijp, W.L.

    1990-09-01

    Upon request of the ESARDA working group 'Low enriched uranium conversion - and fuel fabrication plants' an interlaboratory comparison was organized, to assess the precision and accuracy concerning the determination of uranium by the potentiometric titration method. This report presents the results of a statistical evaluation on the data of the first phase of this exercise. (author). 9 refs.; 5 figs.; 24 tabs

  4. Performance Evaluations of Ion Exchanged Zeolite Membranes on Alumina Supports

    Energy Technology Data Exchange (ETDEWEB)

    Bhave, Ramesh R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jubin, Robert Thomas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Spencer, Barry B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nair, Sankar [Georgia Inst. of Technology, Atlanta, GA (United States)

    2017-08-27

    This report describes the synthesis and evaluation of molecular sieve zeolite membranes to separate and concentrate tritiated water (HTO) from dilute HTO-bearing aqueous streams. In the first phase of this effort, several monovalent and divalent cation-exchanged silicoaluminophosphate (SAPO-34) molecular sieve zeolite membranes were synthesized on disk supports and characterized with gas and vapor permeation measurements. In the second phase, Linde Type A (LTA) zeolite membranes were synthesized on disk and tubular supports. The pervaporation process performance was evaluated for the separation and concentration of tritiated water.

  5. Statistical process control support during Defense Waste Processing Facility chemical runs

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Product Composition Control System (PCCS) has been developed to ensure that the wasteforms produced by the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will satisfy the regulatory and processing criteria that will be imposed. The PCCS provides rigorous, statistically defensible management of a noisy, multivariate system subject to multiple constraints. The system has been successfully tested and has been used to control the production of the first two melter feed batches during DWPF Chemical Runs. These operations will demonstrate the viability of the DWPF process. This paper provides a brief discussion of the technical foundation for the statistical process control algorithms incorporated into PCCS, and describes the results obtained and lessons learned from DWPF Cold Chemical Run operations. The DWPF will immobilize approximately 130 million liters of high-level nuclear waste currently stored at the Site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive sludge and precipitate streams and less radioactive water-soluble salts. (In a separate facility, soluble salts are disposed of as low-level waste in a mixture of cement, slag, and flyash.) In DWPF, the precipitate stream (Precipitate Hydrolysis Aqueous, or PHA) is blended with the insoluble sludge and ground glass frit to produce a melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository
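
    PCCS itself manages a multivariate, constrained system, but its statistical building block can be illustrated with the simplest univariate case, a 3-sigma Shewhart control chart. The baseline assay values below are invented for illustration.

    ```python
    import statistics

    def shewhart_limits(baseline):
        """3-sigma Shewhart control limits estimated from an in-control
        baseline run of individual measurements."""
        mean = statistics.fmean(baseline)
        sd = statistics.stdev(baseline)
        return mean - 3 * sd, mean + 3 * sd

    def out_of_control(x, limits):
        """Flag a new measurement that falls outside the control limits."""
        lower, upper = limits
        return x < lower or x > upper

    # Hypothetical in-control assay results for one melter-feed property
    baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]
    limits = shewhart_limits(baseline)
    print(out_of_control(11.5, limits))  # -> True
    ```

    A multivariate scheme like PCCS generalizes this idea, replacing per-property limits with a joint acceptance region so that correlated drifts across constraints are caught as well.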

  6. Evaluating the effect of domestic support on international trade

    DEFF Research Database (Denmark)

    Urban, Kirsten; Brockmeier, Martina; Jensen, Hans Grinsted

    We use the Mercantilist Trade Restrictiveness Index (MTRI) to develop an extended index that measures the overall trade effects of domestic support payments in a general equilibrium framework. Our index is capable of analyzing the development of the trade restrictiveness of domestic...... support payments over time and across countries and of comparing these payments with other protection instruments. Furthermore, our index helps evaluate agricultural policy reforms that introduce changes into the composition of domestic support payments. We conduct this analysis with an extended version...... of the GTAP model and database using the EU as an example. Thus, we incorporate detailed EU domestic support payments taken from the OECD Producer Support Estimate (PSE) tables in the GTAP framework and reconcile PSE data with the WTO classification scheme. Although our index slightly increases from 2004

  7. Electromyographic evaluation of mastication and swallowing in elderly individuals with mandibular fixed implant-supported prostheses

    Directory of Open Access Journals (Sweden)

    Giédre Berretin-Felix

    2008-04-01

    This study evaluated the effect of implant-supported oral rehabilitation in the mandible on the electromyographic activity during mastication and swallowing in edentulous elderly individuals. Fifteen patients aged more than 60 years were evaluated, 10 females and 5 males. All patients were edentulous, wore removable complete dentures on both dental arches, and had the mandibular dentures replaced by implant-supported prostheses. All patients were submitted to electromyographic evaluation of the masseter, superior orbicularis oris, and submental muscles before surgery and 3, 6 and 18 months postoperatively, using foods of different textures. The results obtained at the different periods were analyzed statistically by the Kruskal-Wallis non-parametric test. Statistical analysis showed that only the masseter muscle had a significant loss in electromyographic activity (p<0.001), with a tendency toward a similar response for the submental muscles. Moreover, there was an increase in the activity of the orbicularis oris muscle during rubber chewing after treatment, yet without a statistically significant difference. Mandibular fixed implant-supported prostheses in elderly individuals revealed a decrease in electromyographic amplitude for the masseter muscles during swallowing, which may indicate adaptation to the new conditions of stability provided by fixation of the complete denture in the mandibular arch.

  8. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    Science.gov (United States)

    Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten

    2017-01-01

    To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience
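    The statistical DVH construction described above (per-dose-bin medians and quartiles over a population of volume-normalized curves) can be sketched as follows; the curves, bin layout, and numbers are invented for illustration and are not the study's data.

```python
# Illustrative sketch of a "statistical DVH": given volume-normalized DVH
# curves from historical plans (one volume fraction per dose bin), summarize
# the population with the median and first/third quartiles at each bin.
# All numbers are made up for illustration.
from statistics import quantiles

historical_dvhs = [
    [1.00, 0.90, 0.60, 0.20, 0.02],
    [1.00, 0.85, 0.55, 0.25, 0.05],
    [1.00, 0.95, 0.70, 0.30, 0.10],
    [1.00, 0.80, 0.50, 0.15, 0.01],
]

def statistical_dvh(curves):
    """Median and first/third quartiles across historical curves, per dose bin."""
    summary = []
    for bin_values in zip(*curves):
        q1, med, q3 = quantiles(bin_values, n=4)
        summary.append({"q1": q1, "median": med, "q3": q3})
    return summary

for i, s in enumerate(statistical_dvh(historical_dvhs)):
    print(f"bin {i}: q1={s['q1']:.2f} median={s['median']:.2f} q3={s['q3']:.2f}")
```

    A new plan's DVH curve could then be scored bin by bin against these population quartiles, which is the spirit of the weighted experience score described in the record.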

  9. Performance evaluation of CT measurements made on step gauges using statistical methodologies

    DEFF Research Database (Denmark)

    Angel, J.; De Chiffre, L.; Kruth, J.P.

    2015-01-01

    In this paper, a study is presented in which statistical methodologies were applied to evaluate the measurement of step gauges on an X-ray computed tomography (CT) system. In particular, the effects of step gauge material density and orientation were investigated. The step gauges consist of uni- and bidirectional lengths. By confirming the repeatability of measurements made on the test system, the number of required scans in the design of experiment (DOE) was reduced. The statistical model was checked using model adequacy principles; model adequacy checking is an important step in validating...

  10. Statistical evaluations concerning the failure behaviour of formed parts with superheated steam flow. Pt. 1

    International Nuclear Information System (INIS)

    Oude-Hengel, H.H.; Vorwerk, K.; Heuser, F.W.; Boesebeck, K.

    1976-01-01

    Statistical evaluations concerning the failure behaviour of formed parts with superheated-steam flow were carried out using data from VdTUEV inventory and failure statistics. Due to the great number of results, the findings will be published in two volumes. This first part will describe and classify the stock of data and will make preliminary quantitative statements on failure behaviour. More differentiated statements are made possible by including the operation time and the number of start-ups per failed part. On the basis of time-constant failure rates some materials-specific statements are given. (orig./ORU) [de

  11. Threat evaluation and weapon assignment decision support: A ...

    African Journals Online (AJOL)

    ... stands within the context of a ground-based air defence system (GBADS) at the turn of the twenty-first century. However, much of the content of the paper may be generalized to military environments other than a GBADS one. Keywords: Threat evaluation, weapon assignment, decision support. ORiON Vol. 23 (2) 2007: pp.

  12. Comparison of Asian Aquaculture Products by Use of Statistically Supported Life Cycle Assessment

    NARCIS (Netherlands)

    Henriksson, P.J.G.; Rico Artero, A.; Zhang, W.; Nahid, S.S.A.; Newton, R.; Phan, L.T.; Zhang, Z.

    2015-01-01

    We investigated aquaculture production of Asian tiger shrimp, whiteleg shrimp, giant river prawn, tilapia, and pangasius catfish in Bangladesh, China, Thailand, and Vietnam by using life cycle assessments (LCAs), with the purpose of evaluating the comparative eco-efficiency of producing different...

  13. Evaluation of atmospheric dispersion/consequence models supporting safety analysis

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Lazaro, M.A.; Woodard, K.

    1996-01-01

    Two DOE Working Groups have completed evaluation of accident phenomenology and consequence methodologies used to support DOE facility safety documentation. The independent evaluations each concluded that no one computer model adequately addresses all accident and atmospheric release conditions. MACCS2, MATHEW/ADPIC, TRAC RA/HA, and COSYMA are adequate for most radiological dispersion and consequence needs. ALOHA, DEGADIS, HGSYSTEM, TSCREEN, and SLAB are recommended for chemical dispersion and consequence applications. Additional work is suggested, principally in evaluation of new models, targeting certain models for continued development, training, and establishing a Web page for guidance to safety analysts

  14. Fast Monte Carlo reliability evaluation using support vector machine

    International Nuclear Information System (INIS)

    Rocco, Claudio M.; Moreno, Jose Ali

    2002-01-01

    This paper deals with the feasibility of using support vector machine (SVM) to build empirical models for use in reliability evaluation. The approach takes advantage of the speed of SVM in the numerous model calculations typically required to perform a Monte Carlo reliability evaluation. The main idea is to develop an estimation algorithm, by training a model on a restricted data set, and replace system performance evaluation by a simpler calculation, which provides reasonably accurate model outputs. The proposed approach is illustrated by several examples. Excellent system reliability results are obtained by training a SVM with a small amount of information
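    A minimal sketch of the surrogate idea in this record: train a cheap empirical model on a restricted set of exact evaluations, then let it replace the expensive system-performance evaluation inside the Monte Carlo loop. The record trains an SVM; a 1-nearest-neighbour surrogate and a toy two-component series system stand in here so the sketch stays self-contained.

```python
# Schematic of surrogate-accelerated Monte Carlo reliability evaluation.
# The paper uses an SVM as the surrogate; here a trivial 1-NN model stands in.
# The "system" is a toy 2-component series system (works iff both components work).
import random

random.seed(7)

def exact_system_state(x):
    # Expensive evaluation in a real study; trivial here.
    return int(x[0] == 1 and x[1] == 1)

# Restricted training set: all four component-state combinations.
train = [((a, b), exact_system_state((a, b))) for a in (0, 1) for b in (0, 1)]

def surrogate(x):
    # 1-nearest-neighbour prediction from the training set.
    return min(train, key=lambda t: sum(abs(u - v) for u, v in zip(t[0], x)))[1]

# Monte Carlo reliability estimate using the surrogate instead of the model.
p = 0.9          # assumed per-component availability
n = 20_000
hits = sum(surrogate((int(random.random() < p), int(random.random() < p)))
           for _ in range(n))
print(f"estimated reliability ~ {hits / n:.3f} (analytic: {p * p:.2f})")
```

    In a realistic setting the surrogate is only approximately correct, so the gain is trading a small bias for a large reduction in evaluation cost, as the record reports for the SVM.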

  15. Mathematical model of statistical identification of information support of road transport

    Directory of Open Access Journals (Sweden)

    V. G. Kozlov

    2016-01-01

    Full Text Available In this paper, a multifactor model of the relationship between road transport and the training system is built using the statistical identification method and the theory of self-organizing systems. Background information for the model is represented by a number of parameters of average annual road transport operations and information provision, including training complex system parameters (inputs), road management parameters, and output parameters. Two criteria are specified: a model stability criterion and a correlation test. The program determines their minimum, which identifies the unique model of optimal complexity. For the predetermined number of parameters, a mathematical relationship of each output parameter with the others is established. To improve the accuracy and regularity of the forecast, interpolation nodes are allocated in the test data sequence; the remaining data form the training sequence. The model is chosen on the principle of selection: the mathematical description is gradually complicated and all possible variants of the models are exhaustively searched against the specified criteria. Advantages of the proposed model: it adequately reflects the actual process; it allows any additional input parameters to be entered and their impact on individual output parameters of road transport to be determined; it allows the values of key parameters to be changed in turn in a certain ratio and the corresponding changes in the output parameters of road transport to be determined; and it allows the output parameters of road transport operations to be predicted.

  16. Editorial: Creating, Supporting, Sustaining and Evaluating Virtual Learning Communities

    Directory of Open Access Journals (Sweden)

    Xun Ge

    2011-12-01

    Full Text Available This special issue is dedicated to creating, building, supporting, sustaining and evaluating virtual learning communities (VLCs) using emerging technologies. The contributors from diverse disciplines have come together to share their valuable experiences and findings through their research in the following themes: (a) instructional models, strategies, and approaches for building, supporting and evaluating VLCs; (b) designing effective use of tools to promote discourse and scaffold peer interactions among members; (c) iterative processes and models of designing and evaluating VLCs; and (d) various variables concerning VLCs, such as virtual community behaviors, cultural factors, and adoption patterns of tools. It is hoped that these articles will provide practical guidance and offer valuable experience to both educators and researchers who are interested in designing effective VLCs and examining various aspects of VLCs to advance our understanding of VLCs.

  17. Development of a new statistical evaluation method for brain SPECT images

    International Nuclear Information System (INIS)

    Kawashima, Ryuta; Sato, Kazunori; Ito, Hiroshi; Koyama, Masamichi; Goto, Ryoui; Yoshioka, Seiro; Ono, Shuichi; Sato, Tachio; Fukuda, Hiroshi

    1996-01-01

    The purpose of this study was to develop a new statistical evaluation method for brain SPECT images. First, we built normal brain image databases using 99mTc-ECD SPECT in 10 normal subjects, as described previously. Each SPECT image was globally normalized and anatomically standardized to the standard brain shape using the Human Brain Atlas (HBA) of Roland et al. and each subject's X-CT. Then, mean and SD images were calculated voxel by voxel. In the next step, 99mTc-ECD SPECT images of a patient were obtained, and global normalization and anatomical standardization were performed in the same way. Then, a statistical map was calculated voxel by voxel as (P - mean)/SD x 10 + 50, where P, mean, and SD denote the voxel values of the patient image and of the mean and SD images of the normal database, respectively. We found this statistical map to be helpful in the clinical diagnosis of brain SPECT studies. (author)
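    The statistical map above reduces to a scaled voxel-wise z-score, (P - mean)/SD x 10 + 50, so a voxel equal to the database mean scores 50 and each SD of deviation shifts the score by 10. A minimal sketch with made-up voxel values:

```python
# Sketch of the voxel-wise statistical map described in the record:
# score = (P - mean) / SD * 10 + 50, where P is the patient voxel value and
# mean/SD come from the normal database. Voxel values are illustrative.

def statistical_map(patient, mean, sd):
    """Compute the scaled z-score map voxel by voxel."""
    return [
        (p - m) / s * 10 + 50 if s > 0 else 50.0
        for p, m, s in zip(patient, mean, sd)
    ]

patient = [1.2, 0.8, 1.0]
mean    = [1.0, 1.0, 1.0]
sd      = [0.1, 0.1, 0.2]

# A voxel at the database mean scores 50; each SD of deviation adds/subtracts 10.
print(statistical_map(patient, mean, sd))
```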

  18. Statistical approaches for evaluating body composition markers in clinical cancer research.

    Science.gov (United States)

    Bayar, Mohamed Amine; Antoun, Sami; Lanoy, Emilie

    2017-04-01

    The term 'morphomics' stands for the markers of body composition in muscle and adipose tissues. In recent years, as part of clinical cancer research, several associations between morphomics and outcome or toxicity were found in different treatment settings, leading to a growing interest. We aim to review statistical approaches used to evaluate these markers and to suggest practical statistical recommendations. Areas covered: We identified statistical methods used recently to take into account the properties of morphomics measurements. We also reviewed methods of adjustment for major confounding factors such as gender, and approaches to model morphomic data, especially mixed models for repeated measures. Finally, we focused on methods for determining a cut-off for a morphomic marker that could be used in clinical practice and on how to assess its robustness. Expert commentary: From our review, we proposed 13 key points to strengthen analyses and the reporting of clinical research assessing associations between morphomics and outcome or toxicity.

  19. Evaluating statistical tests on OLAP cubes to compare degree of disease.

    Science.gov (United States)

    Ordonez, Carlos; Chen, Zhibo

    2009-09-01

    Statistical tests represent an important technique used to formulate and validate hypotheses on a dataset. They are particularly useful in the medical domain, where hypotheses link disease with medical measurements, risk factors, and treatment. In this paper, we propose to compute parametric statistical tests treating patient records as elements in a multidimensional cube. We introduce a technique that combines dimension lattice traversal and statistical tests to discover significant differences in the degree of disease within pairs of patient groups. In order to understand a cause-effect relationship, we focus on patient group pairs differing in one dimension. We introduce several optimizations to prune the search space, to discover significant group pairs, and to summarize results. We present experiments showing important medical findings and evaluating scalability with medical datasets.
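    The group-pair comparison described above can be illustrated with a plain two-sample (Welch) t statistic on the degree of disease in two hypothetical patient groups that differ in a single dimension; the data and group labels below are invented.

```python
# Hypothetical sketch of the group-pair test described in the record: an
# unpooled two-sample t statistic on "degree of disease" between two patient
# groups differing in one dimension (e.g. smokers vs non-smokers).
from math import sqrt
from statistics import mean, variance

def welch_t(group_a, group_b):
    """Unpooled two-sample t statistic for the difference in group means."""
    na, nb = len(group_a), len(group_b)
    se = sqrt(variance(group_a) / na + variance(group_b) / nb)
    return (mean(group_a) - mean(group_b)) / se

smokers     = [4.1, 3.8, 4.5, 4.0, 4.3]   # degree of disease, illustrative
non_smokers = [2.9, 3.1, 2.7, 3.3, 3.0]

t = welch_t(smokers, non_smokers)
print(f"t = {t:.2f}")  # a large |t| flags a significant group-pair difference
```

    The OLAP optimization in the record amounts to running such a test across many group pairs in the dimension lattice while pruning pairs that cannot be significant.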

  20. Facilitating participation in formative evaluation supported by effect map

    DEFF Research Database (Denmark)

    Granlien, Maren Sander

    2009-01-01

    It has been suggested that formative evaluation should be an integrated part of system implementation in order to improve the outcome of system use. In a design project, an approach combining participatory design (PD) and formative evaluation has shown great potential for improving the design of systems by means of a specially designed effect map. The purpose of the effect map is twofold: a) to encourage user participation in the early activities of formative evaluation; b) the effects specified can be used as formative evaluation measures and as guidance in the process of improving the system. The evaluation approach and the effect map are applied in an action research study in the Danish health care sector aiming at improving the medication process and the use of the electronic medication record supporting the medication process.

  1. Statistically based evaluation of toughness properties of components in older nuclear power stations

    International Nuclear Information System (INIS)

    Aurich, D.; Jaenicke, B.; Veith, H.

    1996-01-01

    The KTA code 3201.2 contains provisions for the evaluation of K_Ic values measured in components, but there are no instructions on how to proceed. According to the present state of the art in science and technology, fracture toughness values K_Ic(T) should be evaluated statistically in order to specify the relationship to the loading values K_I(T). The 'Master Curve' concept of Wallin yields too flat a curve shape at high temperatures. The statistical evaluation of K_Ic values can also be carried out with the KTA K_Ic reference temperature function, assuming a normal distribution of the measured values. The KTA K_Ic reference temperature curve corresponds approximately to a fracture probability of 5% when the KTA K_Ic reference temperature function is used for the statistical evaluation of the test results. Conclusions for the assessment of the safety margins can be drawn from the steeper shape of the KTA K_Ic reference temperature function in comparison to the 'Master Curve'. (orig.) [de

  2. Usability Evaluation for Business Intelligence Applications: A User Support Perspective

    Directory of Open Access Journals (Sweden)

    Chrisna Jooste

    2014-08-01

    Full Text Available Business Intelligence (BI) applications provide business information to drive decision support. Usability is one of the factors determining the optimal use of, and the eventual benefit derived from, BI applications. The documented need for more BI usability research, together with the practical necessity for BI evaluation guidelines in the mining industry, provides the rationale for this study. The purpose of the study was to investigate the usability evaluation of BI applications in the context of a coal mining organization. The research is guided by the question: How can existing usability criteria be customized to evaluate the usability of BI applications? The research design included user observation, heuristic evaluation and a survey. Based on observations made during user support on a BI application used at a coal mining organization, a log of usability issues was compiled. The usability issues extracted from this log were compared and contrasted with general usability criteria from the literature to synthesize an initial set of BI usability evaluation criteria. These criteria were used as the basis for a heuristic evaluation of the BI application used at the coal mining organization. The same BI application was also evaluated using the Software Usability Measurement Inventory (SUMI) standardized questionnaire. The results from the two evaluations were triangulated and then compared with the BI user issues again to contextualize the findings and synthesize a validated and refined set of criteria. The main contribution of the study is the set of usability evaluation criteria for BI applications, presented as guidelines. These guidelines deviate from existing evaluation guidelines in their emphasis on information architecture, learnability and operability.

  3. A Statistical Classifier to Support Diagnose Meningitis in Less Developed Areas of Brazil.

    Science.gov (United States)

    Lélis, Viviane-Maria; Guzmán, Eduardo; Belmonte, María-Victoria

    2017-08-11

    This paper describes the development of statistical classifiers to help diagnose meningococcal meningitis, i.e., the most severe, most infectious and deadliest type of this disease. The goal is to find a mechanism able to determine whether a patient has this type of meningitis from a set of symptoms that can be directly observed in the earliest stages of this pathology. Currently, in Brazil, a country that is heavily affected by meningitis, all suspected cases require immediate hospitalization and the beginning of a treatment with invasive tests and medicines. This procedure, therefore, entails expensive treatments unaffordable in less developed regions. For this purpose, we have gathered a dataset of 22,602 records of suspected meningitis cases from the Brazilian state of Bahia. Seven classification techniques have been applied to input data of nine symptoms and other information about the patient, such as age, sex and the area they live in, and a 10-fold cross-validation has been performed. Results show that the techniques applied are suitable for diagnosing meningococcal meningitis. Several indexes, such as precision, recall and ROC area, have been computed to show the accuracy of the models. All of them provide good results, but the best corresponds to the J48 classifier, with a precision of 0.942 and a ROC area over 0.95. These results indicate that our model can indeed help lead to a non-invasive and early diagnosis of this pathology. This is especially useful in less developed areas, where the epidemiologic risk is usually high and medical expenses are sometimes unaffordable.

  4. A Statistics-Based Material Property Analysis to Support TPS Characterization

    Science.gov (United States)

    Copeland, Sean R.; Cozmuta, Ioana; Alonso, Juan J.

    2012-01-01

    Accurate characterization of entry capsule heat shield material properties is a critical component in modeling and simulating Thermal Protection System (TPS) response in a prescribed aerothermal environment. The thermal decomposition of the TPS material during the pyrolysis and charring processes is poorly characterized and typically results in large uncertainties in material properties as inputs for ablation models. These material property uncertainties contribute to large design margins on flight systems and cloud reconstruction efforts for data collected during flight and ground testing, making revision to existing models for entry systems more challenging. The analysis presented in this work quantifies how material property uncertainties propagate through an ablation model and guides an experimental test regimen aimed at reducing these uncertainties and characterizing the dependencies between properties in the virgin and charred states for a Phenolic Impregnated Carbon Ablator (PICA) based TPS. A sensitivity analysis identifies how the high-fidelity model behaves in the expected flight environment, while a Monte Carlo based uncertainty propagation strategy is used to quantify the expected spread in the in-depth temperature response of the TPS. An examination of how perturbations to the input probability density functions affect output temperature statistics is accomplished using a Kriging response surface of the high-fidelity model. Simulations are based on capsule configuration and aerothermal environments expected during the Mars Science Laboratory (MSL) entry sequence. We identify and rank primary sources of uncertainty from material properties in a flight-relevant environment, show the dependence on spatial orientation and in-depth location on those uncertainty contributors, and quantify how sensitive the expected results are.
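    The Monte Carlo uncertainty-propagation strategy described above can be sketched with a toy stand-in for the ablation model; the distributions, parameter names, and response function below are assumptions for illustration only, not the study's model.

```python
# Schematic of Monte Carlo uncertainty propagation: sample uncertain material
# properties from assumed probability density functions, push each sample
# through the response model, and collect output statistics. The response
# function is a toy stand-in, NOT a real ablation model.
import random
from statistics import mean, stdev

random.seed(42)

def toy_indepth_temperature(conductivity, heat_capacity):
    # Stand-in for the high-fidelity in-depth temperature response.
    return 1500.0 * conductivity / heat_capacity

samples = []
for _ in range(10_000):
    k  = random.gauss(0.5, 0.05)    # assumed conductivity PDF
    cp = random.gauss(1.8, 0.10)    # assumed heat-capacity PDF
    samples.append(toy_indepth_temperature(k, cp))

print(f"mean = {mean(samples):.1f}, spread (1 sigma) = {stdev(samples):.1f}")
```

    Replacing the toy function with a fitted response surface (the record uses Kriging) is what makes many such samples affordable for a high-fidelity model.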

  5. Statistical evaluation of waveform collapse reveals scale-free properties of neuronal avalanches

    Directory of Open Access Journals (Sweden)

    Aleena Shaukat

    2016-04-01

    Full Text Available Neural avalanches are a prominent form of brain activity characterized by network-wide bursts whose statistics follow a power-law distribution with a slope near 3/2. Recent work suggests that avalanches of different durations can be rescaled and thus collapsed together. This collapse mirrors work in statistical physics where it is proposed to form a signature of systems evolving in a critical state. However, no rigorous statistical test has been proposed to examine the degree to which neuronal avalanches collapse together. Here, we describe a statistical test based on functional data analysis, where raw avalanches are first smoothed with a Fourier basis, then rescaled using a time-warping function. Finally, an F ratio test combined with a bootstrap permutation is employed to determine if avalanches collapse together in a statistically reliable fashion. To illustrate this approach, we recorded avalanches from cortical cultures on multielectrode arrays as in previous work. Analyses show that avalanches of various durations can be collapsed together in a statistically robust fashion. However, a principal components analysis revealed that the offset of avalanches resulted in marked variance in the time-warping function, thus arguing for limitations to the strict fractal nature of avalanche dynamics. We compared these results with those obtained from cultures treated with an AMPA/NMDA receptor antagonist (APV/DNQX, which yield a power-law of avalanche durations with a slope greater than 3/2. When collapsed together, these avalanches showed marked misalignments both at onset and offset time-points. In sum, the proposed statistical evaluation suggests the presence of scale-free avalanche waveforms and constitutes an avenue for examining critical dynamics in neuronal systems.

  6. Evaluating statistical cloud schemes: What can we gain from ground-based remote sensing?

    Science.gov (United States)

    Grützun, V.; Quaas, J.; Morcrette, C. J.; Ament, F.

    2013-09-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based remote sensing such as lidar, microwave, and radar to evaluate prognostic distribution moments using the "perfect model approach." This means that we employ a high-resolution weather model as virtual reality and retrieve full three-dimensional atmospheric quantities and virtual ground-based observations. We then use statistics from the virtual observation to validate the modeled 3-D statistics. Since the data are entirely consistent, any discrepancy occurring is due to the method. Focusing on total water mixing ratio, we find that the mean ratio can be evaluated decently but that it strongly depends on the meteorological conditions as to whether the variance and skewness are reliable. Using some simple schematic description of different synoptic conditions, we show how statistics obtained from point or line measurements can be poor at representing the full three-dimensional distribution of water in the atmosphere. We argue that a careful analysis of measurement data and detailed knowledge of the meteorological situation is necessary to judge whether we can use the data for an evaluation of higher moments of the humidity distribution used by a statistical cloud scheme.

  7. An Evaluation of the Use of Statistical Procedures in Soil Science

    Directory of Open Access Journals (Sweden)

    Laene de Fátima Tavares

    2016-01-01

    Full Text Available Experimental statistical procedures used in almost all scientific papers are fundamental for clearer interpretation of the results of experiments conducted in agrarian sciences. However, incorrect use of these procedures can lead the researcher to incorrect or incomplete conclusions. Therefore, the aim of this study was to evaluate the characteristics of the experiments and the quality of the use of statistical procedures in soil science in order to promote better use of statistical procedures. For that purpose, 200 articles, published between 2010 and 2014, involving only experimentation and studies by sampling in the soil areas of fertility, chemistry, physics, biology, use and management, were randomly selected. A questionnaire containing 28 questions was used to assess the characteristics of the experiments, the statistical procedures used, and the quality of selection and use of these procedures. Most of the articles evaluated presented data from studies conducted under field conditions, and 27% of all papers involved studies by sampling. Most studies did not mention testing to verify normality and homoscedasticity, and most used the Tukey test for mean comparisons. Among studies with a factorial structure of the treatments, many ignored this structure, and data were compared assuming the absence of factorial structure, or the decomposition of interaction was performed without showing or mentioning the significance of the interaction. Almost none of the papers that had split-block factorial designs considered the factorial structure, or they considered it as a split-plot design. Among the articles that performed regression analysis, only a few of them tested non-polynomial fit models, and none reported verification of the lack of fit in the regressions. The articles evaluated thus reflected poor generalization and, in some cases, wrong generalization in experimental design and selection of procedures for statistical analysis.

  8. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Full Text Available Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM) is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA) index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O) Psychology journals that use SEMs. Results indicate that in many studies power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on the statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model of the study. Finally, results indicated that power is significantly related to the model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.

  9. Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.

    Science.gov (United States)

    Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe

    2017-12-27

    The present study aimed to evaluate the characteristics and the quality of the statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 by using the PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of the quality of the statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of a cluster-specific approach in data analysis. Owing to these concerns, the statistical methodology was judged to be inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain an appropriate assessment of the efficacy of desensitizing agents.

  10. Ten Years of Cloud Properties from MODIS: Global Statistics and Use in Climate Model Evaluation

    Science.gov (United States)

    Platnick, Steven E.

    2011-01-01

    The NASA Moderate Resolution Imaging Spectroradiometer (MODIS), launched onboard the Terra and Aqua spacecraft, began Earth observations on February 24, 2000 and June 24, 2002, respectively. Among the algorithms developed and applied to this sensor, a suite of cloud products includes cloud masking/detection, cloud-top properties (temperature, pressure), and optical properties (optical thickness, effective particle radius, water path, and thermodynamic phase). All cloud algorithms underwent numerous changes and enhancements for the latest Collection 5 production version; this process continues with the current Collection 6 development. We will show example MODIS Collection 5 cloud climatologies derived from global spatial and temporal aggregations provided in the archived gridded Level-3 MODIS atmosphere team product (product names MOD08 and MYD08 for MODIS Terra and Aqua, respectively). Data sets in this Level-3 product include scalar statistics as well as 1- and 2-D histograms of many cloud properties, allowing for higher-order information and correlation studies. In addition to these statistics, we will show trends and statistical significance in annual and seasonal means for a variety of the MODIS cloud properties, as well as the time required for detection given assumed trends. To assist in climate model evaluation, we have developed a MODIS cloud simulator with an accompanying netCDF file containing subsetted monthly Level-3 statistical data sets that correspond to the simulator output. Correlations of cloud properties with ENSO offer the potential to evaluate model cloud sensitivity; initial results will be discussed.

  11. Simulation-based decision support for evaluating operational plans

    Directory of Open Access Journals (Sweden)

    Johan Schubert

    2015-12-01

    Full Text Available In this article, we describe simulation-based decision support techniques for the evaluation of operational plans within effects-based planning. Using a decision support tool, developers of operational plans are able to evaluate thousands of alternative plans against possible courses of events and decide which of these plans are capable of achieving a desired end state. The objective of this study is to examine the potential of a decision support system that helps operational analysts understand the consequences of numerous alternative plans through simulation and evaluation. In the effects-based approach to operations concept, operational plans are described as a set of actions and effects. We use a representation in which a plan consists of several actions to be performed, each in one of several alternative ways. Together these action alternatives make up all possible plan instances, which are represented as a tree of action alternatives that may be searched for the most effective sequence of alternative actions. As a test case, we use an expeditionary operation with a plan of 43 actions and several alternatives for these actions, as well as a scenario of 40 group actors. Decision support for planners is provided by several methods that analyze the impact of a plan on the 40 actors, e.g., by visualizing time series of plan performance. Detailed decision support for finding the most influential actions of a plan is provided by sensitivity analysis and regression tree analysis. Finally, a decision maker may use the tool to determine the boundaries of an operation that it must not move beyond without risk of drastic failure. The significant contribution of this study is the presentation of an integrated approach for the evaluation of operational plans.
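The plan representation described here, several actions each with alternative ways of being performed, multiplies out to a set of plan instances that can be enumerated or searched. A toy sketch (the action and alternative names are invented for illustration; the actual tool evaluates each instance by simulation against the actor scenario):

```python
from itertools import product

# Hypothetical plan: each action maps to its alternative ways of execution.
plan = {
    "secure_port": ["air_assault", "amphibious"],
    "deliver_aid": ["convoy", "airdrop", "sea_lift"],
    "restore_power": ["repair_grid", "mobile_generators"],
}

def plan_instances(plan):
    """Yield every plan instance: one chosen alternative per action.

    The Cartesian product over per-action alternatives corresponds to a
    full traversal of the tree of action alternatives.
    """
    actions = sorted(plan)
    for combo in product(*(plan[a] for a in actions)):
        yield dict(zip(actions, combo))

instances = list(plan_instances(plan))  # 2 * 3 * 2 = 12 instances
```

With a realistic plan of 43 actions the product grows combinatorially, which is why the article searches the tree for effective sequences rather than simulating every instance exhaustively.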

  12. Statistical evaluation of internal contamination data in the man following the Chernobyl accident

    International Nuclear Information System (INIS)

    Tarroni, G.; Battisti, P.; Melandri, C.; Castellani, C.M.; Formignani, M.

    1989-01-01

    The main implications of general interest derived from the statistical analysis of internal human contamination data, obtained by ENEA-PAS with whole-body counter measurements performed in Bologna in consequence of the Chernobyl accident, are presented. In particular, the following are examined: the trend with time of the individual body activity of members of a homogeneous group, the variability of individual contamination in relation to the mean contamination, the statistical distribution of the data, the significance of mean values for small homogeneous groups of subjects, and the difference between subjects of different sex and its trend with time. Finally, it is pointed out that, when the evaluation is based on direct whole-body counter measurements, the individual committed dose equivalent due to the Chernobyl contamination is substantially independent of the hypothesized values of the metabolic parameters.

  13. Computer-based teaching and evaluation of introductory statistics for health science students: some lessons learned

    Directory of Open Access Journals (Sweden)

    Nuala Colgan

    1994-12-01

    Full Text Available In recent years, it has become possible to introduce health science students to statistical packages at an increasingly early stage in their undergraduate studies. This has enabled teaching to take place in a computer laboratory, using real data, and has encouraged an exploratory and research-oriented approach. This paper briefly describes a hypertext Computer Based Tutorial (CBT) concerned with descriptive statistics and introductory data analysis. The CBT has three primary objectives: the introduction of concepts, the facilitation of revision, and the acquisition of skills for project work. Objective testing is incorporated and used for both self-assessment and formal examination. Evaluation was carried out with a large group of Health Science students, heterogeneous with regard to their IT skills and basic numeracy. The results of the evaluation contain valuable lessons.

  14. Some tendencies of the radioanalytical literature statistical games for trend evaluation. Pt. 1

    International Nuclear Information System (INIS)

    Braun, T.

    1975-01-01

    The distribution of radioanalytical information sources was statistically evaluated by citation counting. Using several reviews and progress reports as the object of study, a significant concentration of radioanalytical information sources appears to have taken place over the period 1956-1973. The fundamental assumptions were that the information bank of each particular field is its published literature, and that the most important and most characteristic information sources of a given field are surveyed in reviews and progress reports that evaluate the published literature critically. The present study therefore analyses the references appended to such reviews and progress reports. The percentage distribution of the references of four reviews published between 1970 and 1975 was calculated with respect to their appearance in journals or non-journals, the latter including books, conference proceedings, reports and patents. Statistics taken from 1.4 million references appearing in the 1961 literature disclosed that 84% of these references are to journal articles. (F.Gy.)

  15. SIIFSCOP - a computer package for statistical evaluations of stable isotopes in precipitation

    International Nuclear Information System (INIS)

    Hussain, Q.M.; Qureshi, R.M.; Sajjad, M.I.

    1989-08-01

    SIIFSCOP is a FORTRAN 77 computer package developed for the statistical evaluation of stable isotopes in precipitation data using an IBM-compatible PC-XT/AT. The report describes the terminology and equations on which the program is based, the required format of the input data, sample outputs, and the program listing. Using the measured/calculated isotopic values and the available meteorological data, several correlations may be calculated, e.g. between deuterium content and temperature. (orig./A.B.)
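The kind of correlation SIIFSCOP reports, e.g. between deuterium content and temperature, is a plain Pearson coefficient. A self-contained sketch (the isotope and temperature values are invented, not from the package's data sets):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical monthly means: surface temperature (deg C) vs delta-2H (per mil)
temps = [-2.0, 5.0, 11.0, 18.0, 24.0]
d2h = [-95.0, -78.0, -60.0, -42.0, -30.0]
r = pearson_r(temps, d2h)  # strongly positive, as expected for the temperature effect
```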

  16. A Statistical Evaluation of Atmosphere-Ocean General Circulation Models: Complexity vs. Simplicity

    OpenAIRE

    Robert K. Kaufmann; David I. Stern

    2004-01-01

    The principal tools used to model future climate change are General Circulation Models which are deterministic high resolution bottom-up models of the global atmosphere-ocean system that require large amounts of supercomputer time to generate results. But are these models a cost-effective way of predicting future climate change at the global level? In this paper we use modern econometric techniques to evaluate the statistical adequacy of three general circulation models (GCMs) by testing thre...

  17. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards

    Directory of Open Access Journals (Sweden)

    Charles S. Mayo, PhD

    2017-07-01

    Conclusions: Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.

  18. Evaluating online diagnostic decision support tools for the clinical setting.

    Science.gov (United States)

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    Clinical decision support tools available at the point of care are an effective adjunct to support clinicians to make clinical decisions and improve patient outcomes. We developed a methodology and applied it to evaluate commercially available online clinical diagnostic decision support (DDS) tools for use at the point of care. We identified 11 commercially available DDS tools and assessed these against an evaluation instrument that included 6 categories: general information, content, quality control, search, clinical results and other features. We developed diagnostically challenging clinical case scenarios based on real patient experience that were commonly missed by junior medical staff. The evaluation was divided into 2 phases: an initial evaluation of all identified and accessible DDS tools conducted by the Clinical Information Access Portal (CIAP) team, and a second phase that further assessed the top 3 tools identified in the initial evaluation phase. An evaluation panel consisting of senior and junior medical clinicians from NSW Health conducted the second phase. Of the eleven tools assessed against the evaluation instrument, only 4 completely met the DDS definition adopted for this evaluation and were able to produce a differential diagnosis. From the initial phase of the evaluation, 4 DDS tools scored 70% or more (maximum score 96%) for the content category, 8 tools scored 65% or more (maximum 100%) for the quality control category, 5 tools scored 65% or more (maximum 94%) for the search category, and 4 tools scored 70% or more (maximum 81%) for the clinical results category. The second phase of the evaluation focused on assessing diagnostic accuracy for the top 3 tools identified in the initial phase. Best Practice ranked highest overall against the 6 clinical case scenarios used. Overall the differentiating factor between the top 3 DDS tools was determined by diagnostic accuracy ranking, ease of use and the confidence and

  19. A statistical framework for evaluating neural networks to predict recurrent events in breast cancer

    Science.gov (United States)

    Gorunescu, Florin; Gorunescu, Marina; El-Darzi, Elia; Gorunescu, Smaranda

    2010-07-01

    Breast cancer is the second leading cause of cancer deaths in women today. Sometimes, breast cancer can return after primary treatment. A medical diagnosis of recurrent cancer is often a more challenging task than the initial one. In this paper, we investigate the potential contribution of neural networks (NNs) to support health professionals in diagnosing such events. The NN algorithms are tested and applied to two different datasets. An extensive statistical analysis has been performed to verify our experiments. The results show that a simple network structure for both the multi-layer perceptron and radial basis function can produce equally good results, not all attributes are needed to train these algorithms and, finally, the classification performances of all algorithms are statistically robust. Moreover, we have shown that the best performing algorithm will strongly depend on the features of the datasets, and hence, there is not necessarily a single best classifier.

  20. Statistical evaluation of failures and repairs of the V-1 measuring and control system

    International Nuclear Information System (INIS)

    Laurinec, R.; Korec, J.; Mitosinka, J.; Zarnovican, V.

    1984-01-01

    A failure record card system was introduced for evaluating the reliability of the measurement and control equipment of the V-1 nuclear power plant. The SPU-800 microcomputer system is used for recording data on magnetic tape and their transmission to the central data processing department. The data are used for evaluating the reliability of components and circuits and a selection is made of the most failure-prone components, and the causes of failures are evaluated as are failure identification, repair and causes of outages. The system provides monthly, annual and total assessment data since the system was commissioned. The results of the statistical evaluation of failures are used for planning preventive maintenance and for determining optimal repair intervals. (E.S.)

  1. Uranium determination in U-Al alloy with statistical tools support

    International Nuclear Information System (INIS)

    Furusawa, Helio Akira; Medalla, Felipe Quirino; Cotrim, Marycel Elena Barbosa; Pires, Maria Aparecida Faustino

    2011-01-01

    ICP-OES was used to quantify total uranium in a natural UAlx powder alloy. A simple solubilisation procedure using diluted HNO3/HCl was successfully applied. Only 100 mg of sample were used, which is an advantage over volumetric methodologies, and only two dilutions were needed to reach a measurable concentration. No other treatment was applied to the solutions. Calibration curves for three uranium lines (367.007, 385.958 and 409.014 nm) were evaluated using ANOVA. Comparing the indicators, the 367.007 nm line was the poorest, though still exhibiting R² = 0.998, versus 0.9996 and 0.999 for the other two lines. No significant difference was found between these two lines. If needed, the 385.958 nm line could be used to quantify uranium at very low concentrations, though with few advantages over the 409.014 nm line. The average uranium concentration found was 0.80±0.01 μg.g-1, as expected for a predominantly UAl2-phase alloy. Higher uranium concentrations are also expected to be successfully quantified using these lines. To check for possible inhomogeneity due to the high uranium concentration, one-way ANOVA was applied to 3 replicates. Homogeneity was confirmed by measuring at both the 385.958 and 409.014 nm lines. The uncertainty due to solution homogeneity was also estimated at these two emission lines, giving 0.006 and 0.005 μg.g-1, respectively. These two values are in compliance with the standard deviation of the average. (author)
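A calibration-curve evaluation of the sort described, a least-squares line per emission line with its R², can be sketched as follows (the concentrations and intensities are invented, not the paper's data):

```python
def fit_line(x, y):
    """Ordinary least squares y = a + b*x, returning (a, b, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot
    return a, b, r2

# Hypothetical standards for one emission line: concentration vs intensity
conc = [0.0, 2.0, 4.0, 6.0, 8.0]          # e.g. ug/mL
intensity = [1.0, 51.0, 99.0, 153.0, 198.0]  # arbitrary counts
a, b, r2 = fit_line(conc, intensity)
```

Comparing the R² values of candidate lines, as the authors do with ANOVA on the three uranium lines, is a quick first screen before the more formal tests.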

  2. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    Science.gov (United States)

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become
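The c-statistic under discussion is simply the probability that a randomly chosen patient who had the event received a higher predicted risk than one who did not. A direct pairwise sketch (predicted risks and outcomes are invented):

```python
def c_statistic(scores, outcomes):
    """Pairwise concordance for a binary outcome.

    Over all (event, non-event) pairs, count a pair as concordant when
    the event patient has the higher predicted risk; ties count 0.5.
    """
    events = [s for s, y in zip(scores, outcomes) if y == 1]
    nonevents = [s for s, y in zip(scores, outcomes) if y == 0]
    conc = 0.0
    for e in events:
        for ne in nonevents:
            if e > ne:
                conc += 1.0
            elif e == ne:
                conc += 0.5
    return conc / (len(events) * len(nonevents))

risk = c_statistic([0.9, 0.6, 0.5, 0.2], [1, 0, 1, 0])  # 3 of 4 pairs concordant
```

Because the statistic depends only on the rank separation of events and non-events, restricting the case mix (as in the colorectal cohorts above) compresses the risk distribution and drives the c-statistic down even when calibration is unchanged.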

  3. Statistical properties of material strength for reliability evaluation of components of fast reactors. Austenitic stainless steels

    International Nuclear Information System (INIS)

    Takaya, Shigeru; Sasaki, Naoto; Tomobe, Masato

    2015-03-01

    Many efforts have been made to implement the System Based Code concept, whose objective is to optimize the margins dispersed across several codes and standards. Failure probability is expected to be a promising quantitative index for the optimization of margins, and statistical information on the random variables is needed to evaluate failure probability. Material strength, such as tensile strength, is an important random variable, but sufficient statistical information has not yet been provided. In this report, statistical properties of material strength, such as creep rupture time, steady creep strain rate, yield stress, tensile stress, flow stress, fatigue life and the cyclic stress-strain curve, were estimated for SUS304 and 316FR steel, which are typical structural materials for fast reactors. Other austenitic stainless steels, such as SUS316, were also used for the statistical estimation of some material properties, e.g. fatigue life. These materials are registered in the JSME code for design and construction of fast reactors, so the test data used in developing the code were used as much as possible in this report. (author)

  4. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  5. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    Science.gov (United States)

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
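The paper's central point, that the level of measurement dictates the appropriate summaries, can be sketched as a simple dispatch on variable type (a simplification for illustration: it treats any numeric variable as metric and anything else as categorical, ignoring the ordinal case):

```python
from statistics import mean, stdev
from collections import Counter

def describe(values):
    """Summaries chosen by level of measurement.

    Metric (numeric) variables get mean and standard deviation;
    categorical variables get absolute and relative frequencies.
    """
    if all(isinstance(v, (int, float)) for v in values):
        return {"mean": mean(values), "sd": stdev(values)}
    counts = Counter(values)
    n = len(values)
    return {k: (c, c / n) for k, c in counts.items()}

metric_summary = describe([1.0, 2.0, 3.0])      # mean and SD
category_summary = describe(["a", "a", "b"])    # counts and proportions
```

For skewed metric data the median and quartiles would replace mean and SD, which is exactly the kind of judgment the article asks authors to make explicit.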

  6. Supporting academic publication: evaluation of a writing course combined with writers' support group.

    Science.gov (United States)

    Rickard, Claire M; McGrail, Matthew R; Jones, Rebecca; O'Meara, Peter; Robinson, Anske; Burley, Mollie; Ray-Barruel, Gillian

    2009-07-01

    Publication rates are a vital measure of individual and institutional performance, yet many nurse academics publish rarely or not at all. Despite widespread acceptance of the need to increase academic publication rates and the pressure university faculty may experience to fulfil this obligation, little is known about the effectiveness of practical strategies to support academic writing. In this small cohort study (n=8) comprising nurses and other professionals involved in university education, a questionnaire survey was used to evaluate the effectiveness of a one-week "Writing for Publication" course combined with a monthly writers' support group in increasing publication rates. Submissions to peer-reviewed journals increased from 9 articles in the two years before the intervention to 33 in the two years after. Publications (in print) per person increased from a baseline of 0.5 to 1.2 per year. Participants reported increased writing confidence and greater satisfaction with the publishing process. Peer support and receiving recognition and encouragement from line managers were also cited as incentives to publish. Writing for publication is a skill that can be learned. The evaluated model of a formal writing course, followed by informal monthly group support meetings, can effectively increase publication rates.

  7. Exploring the practicing-connections hypothesis: using gesture to support coordination of ideas in understanding a complex statistical concept.

    Science.gov (United States)

    Son, Ji Y; Ramos, Priscilla; DeWolf, Melissa; Loftus, William; Stigler, James W

    2018-01-01

    In this article, we begin to lay out a framework and approach for studying how students come to understand complex concepts in rich domains. Grounded in theories of embodied cognition, we advance the view that understanding of complex concepts requires students to practice, over time, the coordination of multiple concepts, and the connection of this system of concepts to situations in the world. Specifically, we explore the role that a teacher's gesture might play in supporting students' coordination of two concepts central to understanding in the domain of statistics: mean and standard deviation. In Study 1 we show that university students who have just taken a statistics course nevertheless have difficulty taking both mean and standard deviation into account when thinking about a statistical scenario. In Study 2 we show that presenting the same scenario with an accompanying gesture to represent variation significantly impacts students' interpretation of the scenario. Finally, in Study 3 we present evidence that instructional videos on the internet fail to leverage gesture as a means of facilitating understanding of complex concepts. Taken together, these studies illustrate an approach to translating current theories of cognition into principles that can guide instructional design.

  8. Statistics of meteorology for dose evaluation of crews of nuclear ship

    International Nuclear Information System (INIS)

    Imai, Kazuhiko; Chino, Masamichi

    1981-01-01

    For the purpose of evaluating doses to crews of a nuclear ship, the statistics of wind speed and direction relative to the ship are discussed, using wind data reported from ships cruising the seas around the Japanese islands. Analysis of the data shows that the occurrence frequency of wind speed can be fitted with a γ-distribution having a shape parameter p around 3, and that the wind direction frequency can be treated as a uniform distribution. Using these distributions, and taking the ship speed u_s and the long-term mean natural wind speed ū as constant parameters, the frequency distribution of wind speed and direction relative to the ship was calculated, and the statistical quantities necessary for dose evaluation were obtained in a way similar to the procedure for reactor sites on land. The 97% value of wind speed, u_97, which should be used in the dose evaluation for accidental releases, gives conservative doses if it is evaluated as follows: u_97 = 0.64 u_s in the cases u_s > ū, and u_97 = 0.86 ū in the cases u_s < ū, including u_s = 0. (author)
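The relative-wind distribution described here can also be approximated by Monte Carlo: draw natural wind speed from a gamma distribution with shape parameter around 3, draw direction uniformly, and take the vector difference with the ship velocity. A sketch under stated assumptions (the 97% value is interpreted as the speed exceeded in 97% of samples, and the ship and mean wind speeds are illustrative, not the paper's):

```python
import math
import random

def relative_wind_speeds(u_ship, mean_wind, n=100_000, shape=3.0, seed=1):
    """Monte Carlo sample of wind speed relative to a moving ship.

    Natural wind speed ~ Gamma(shape, mean_wind/shape); direction uniform
    on [0, 2*pi). The ship moves along the x axis, so the relative wind
    is the vector difference between natural wind and ship velocity.
    """
    rng = random.Random(seed)
    scale = mean_wind / shape
    speeds = []
    for _ in range(n):
        w = rng.gammavariate(shape, scale)
        theta = rng.uniform(0.0, 2.0 * math.pi)
        speeds.append(math.hypot(w * math.cos(theta) - u_ship,
                                 w * math.sin(theta)))
    return speeds

speeds = sorted(relative_wind_speeds(u_ship=8.0, mean_wind=5.0))
u97 = speeds[int(0.03 * len(speeds))]  # exceeded by 97% of the samples
```

A low relative wind speed is the conservative case for dose evaluation (less dilution of the plume), which is why the 97% exceedance value rather than a mean is carried into the assessment.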

  9. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
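When the four-to-one ratio is not met and the exact uncertainty of a calibration must be computed, the usual route is a root-sum-square combination of independent standard uncertainty components, expanded by a coverage factor. A sketch with an invented budget (the component names and magnitudes are illustrative only, not the laboratory's actual force-machine figures):

```python
import math

# Hypothetical force-calibration uncertainty budget, as fractions of reading (k=1)
components = {
    "reference_standard": 0.010,
    "repeatability":      0.008,
    "temperature":        0.004,
    "alignment":          0.006,
}

def combined_uncertainty(components, k=2):
    """Root-sum-square combination of independent standard uncertainties,
    expanded with coverage factor k (k=2 for ~95% coverage)."""
    u_c = math.sqrt(sum(u * u for u in components.values()))
    return k * u_c

expanded = combined_uncertainty(components)
```

Quantifying the budget this way also shows where reduction effort pays off: the largest component dominates the quadrature sum, so halving a minor term barely moves the expanded uncertainty.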

  10. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and

  11. Method of public support evaluation for advanced NPP deployment

    International Nuclear Information System (INIS)

    Zezula, L.; Hermansky, B.

    2005-01-01

    Public support for nuclear power can be fully recovered only if, from the very beginning of the process of selecting a new power source, the public receives transparent information and is made part of an interactive dialogue. The presented method was developed with the objective of facilitating this complex process of interaction between utilities and the public. Our method of public support evaluation allows designs of new nuclear power plants to be classified taking into consideration the public attitude toward continued nuclear power deployment in the Czech Republic as well as the preference for a certain plant design. The method is based on a model with a set of probabilistic input metrics, which permits the offered concepts to be compared with a reference one with a high degree of objectivity. This method is part of a more complex evaluation procedure applicable to the assessment of new designs that uses the computer code ''Potencial'' developed at the NRI Rez plc. The metrics of the established public support criteria are discussed. (author)

  12. Quantitative Evaluation of Hybrid Aspen Xylem and Immunolabeling Patterns Using Image Analysis and Multivariate Statistics

    Directory of Open Access Journals (Sweden)

    David Sandquist

    2015-06-01

    Full Text Available A new method is presented for the quantitative evaluation of hybrid aspen genotype xylem morphology and immunolabeling micro-distribution. This method can be used as an aid in assessing differences between genotypes from classic tree breeding studies, as well as genetically engineered plants. The method is based on image analysis and multivariate statistical evaluation of light and immunofluorescence microscopy images of wood xylem cross sections. The selected immunolabeling antibodies targeted five different epitopes present in aspen xylem cell walls. Twelve down-regulated hybrid aspen genotypes were included in the method development. The 12 knock-down genotypes were selected based on pre-screening by pyrolysis-IR of global chemical content. The multivariate statistical evaluations successfully identified comparative trends for modifications in the down-regulated genotypes compared to the unmodified control, even when no definitive conclusions could be drawn from any individual studied variable alone. Of the 12 genotypes analyzed, three showed significant trends for modifications in both morphology and immunolabeling. Six genotypes showed significant trends for modifications in either morphology or immunocoverage. The remaining three genotypes did not show any significant trends for modification.

  13. GeneTrailExpress: a web-based pipeline for the statistical evaluation of microarray experiments

    Directory of Open Access Journals (Sweden)

    Kohlbacher Oliver

    2008-12-01

    Full Text Available Abstract Background High-throughput methods that allow the expression of thousands of genes or proteins to be measured simultaneously have opened new avenues for studying biochemical processes. While the noisiness of the data necessitates extensive pre-processing of the raw data, the high dimensionality requires effective statistical analysis methods that facilitate the identification of crucial biological features and relations. For these reasons, the evaluation and interpretation of expression data is a complex, labor-intensive, multi-step process. While a variety of tools for normalizing, analysing, or visualizing expression profiles have been developed in recent years, most of these tools offer functionality for accomplishing only certain steps of the evaluation pipeline. Results Here, we present a web-based toolbox that provides rich functionality for all steps of the evaluation pipeline. Besides standard normalization procedures, our tool GeneTrailExpress offers powerful statistical analysis methods for studying a large variety of biological categories and pathways. Furthermore, an integrated graph visualization tool, BiNA, enables the user to draw the relevant biological pathways applying cutting-edge graph-layout algorithms. Conclusion Our gene expression toolbox, with its interactive visualization of the pathways and the expression values projected onto the nodes, will simplify the analysis and interpretation of biochemical pathways considerably.

  14. Addressing issues associated with evaluating prediction models for survival endpoints based on the concordance statistic.

    Science.gov (United States)

    Wang, Ming; Long, Qi

    2016-09-01

    Prediction models for disease risk and prognosis play an important role in biomedical research, and evaluating their predictive accuracy in the presence of censored data is of substantial interest. The standard concordance (c) statistic has been extended to provide a summary measure of predictive accuracy for survival models. Motivated by a prostate cancer study, we address several issues associated with evaluating survival prediction models based on the c-statistic, with a focus on estimators using the technique of inverse probability of censoring weighting (IPCW). Compared to the existing work, we provide complete results on the asymptotic properties of the IPCW estimators under the assumption of coarsening at random (CAR), and propose a sensitivity analysis under the mechanism of noncoarsening at random (NCAR). In addition, we extend the IPCW approach as well as the sensitivity analysis to high-dimensional settings. The predictive accuracy of prediction models for cancer recurrence after prostatectomy is assessed by applying the proposed approaches. We find that the estimated predictive accuracy for the models under consideration is sensitive to the NCAR assumption, and we thus identify the best predictive model. Finally, we further evaluate the performance of the proposed methods in both low-dimensional and high-dimensional settings under CAR and NCAR through simulations. © 2016, The International Biometric Society.
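    The IPCW machinery in this record builds on the basic concordance computation. As a point of reference, here is a minimal sketch of the unweighted Harrell-type c-statistic for right-censored data (not the paper's IPCW estimator, which additionally reweights pairs by the censoring distribution); data values are made up:

```python
def concordance(times, events, scores):
    """Harrell-type c-statistic for right-censored survival data.

    A pair (i, j) is usable when the shorter observed time is an
    event; the pair is concordant when the higher risk score goes
    with the shorter survival time.  Score ties count 1/2.
    """
    conc, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                usable += 1
                if scores[i] > scores[j]:
                    conc += 1.0
                elif scores[i] == scores[j]:
                    conc += 0.5
    return conc / usable

# perfectly concordant risk scores: shorter times carry higher scores
c = concordance([2, 4, 6], [1, 1, 0], [0.9, 0.5, 0.1])  # → 1.0
```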

  15. Evaluation of RxNorm for Medication Clinical Decision Support.

    Science.gov (United States)

    Freimuth, Robert R; Wix, Kelly; Zhu, Qian; Siska, Mark; Chute, Christopher G

    2014-01-01

    We evaluated the potential use of RxNorm to provide standardized representations of generic drug name and route of administration to facilitate management of drug lists for clinical decision support (CDS) rules. We found a clear representation of generic drug name but not route of administration. We identified several issues related to data quality, including erroneous or missing defined relationships, and the use of different concept hierarchies to represent the same drug. More importantly, we found extensive semantic precoordination of orthogonal concepts related to route and dose form, which would complicate the use of RxNorm for drug-based CDS. This study demonstrated that while RxNorm is a valuable resource for the standardization of medications used in clinical practice, additional work is required to enhance the terminology so that it can support expanded use cases, such as managing drug lists for CDS.

  16. Peer-supported review of teaching: an evaluation.

    Science.gov (United States)

    Thampy, Harish; Bourke, Michael; Naran, Prasheena

    2015-09-01

    Peer-supported review (also called peer observation) of teaching is a commonly implemented method of ascertaining teaching quality that supplements student feedback. A large variety of scheme formats, with rather differing purposes, are described in the literature. They range from purely formative, developmental formats that facilitate tutors' reflection on their own teaching, reaffirming strengths and identifying potential areas for development, through to faculty- or institution-driven, summative, quality assurance-based schemes. Much of the current literature in this field focuses on general higher education and on the development of rating scales, checklists or observation tools to help guide the process. This study reports findings from a qualitative evaluation of a purely formative peer-supported review of teaching scheme that was implemented for general practice clinical tutors at our medical school, and describes tutors' attitudes and perceived benefits and challenges when undergoing observation.

  17. Evaluating the statistical power of DNA-based identification, exemplified by 'The missing grandchildren of Argentina'.

    Science.gov (United States)

    Kling, Daniel; Egeland, Thore; Piñero, Mariana Herrera; Vigeland, Magnus Dehli

    2017-11-01

    Methods and implementations of DNA-based identification are well established in several forensic contexts. However, assessing the statistical power of these methods has been largely overlooked, except in the simplest cases. In this paper we outline general methods for such power evaluation, and apply them to a large set of family reunification cases, where the objective is to decide whether a person of interest (POI) is identical to the missing person (MP) in a family, based on the DNA profile of the POI and available family members. As such, this application closely resembles database searching and disaster victim identification (DVI). If parents or children of the MP are available, they will typically provide sufficient statistical evidence to settle the case. However, if one must resort to more distant relatives, it is not a priori obvious that a reliable conclusion is likely to be reached. In these cases power evaluation can be highly valuable, for instance in the recruitment of additional family members. To assess the power in an identification case, we advocate the combined use of two statistics: the Probability of Exclusion, and the Probability of Exceedance. The former is the probability that the genotypes of a random, unrelated person are incompatible with the available family data. If this is close to 1, it is likely that a conclusion will be achieved regarding general relatedness, but not necessarily the specific relationship. To evaluate the ability to recognize a true match, we use simulations to estimate exceedance probabilities, i.e. the probability that the likelihood ratio will exceed a given threshold, assuming that the POI is indeed the MP. All simulations are done conditionally on available family data. Such conditional simulations have a long history in medical linkage analysis, but to our knowledge this is the first systematic forensic genetics application. Also, for forensic markers mutations cannot be ignored and therefore current models and
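    The Probability of Exclusion described in this record can be computed exactly for a single marker under Hardy-Weinberg equilibrium when both parents of the MP are typed: a random person is excluded whenever their genotype is incompatible with every possible parental transmission. A toy sketch, ignoring mutations; the marker, genotypes and allele frequencies are hypothetical:

```python
from itertools import product

def exclusion_probability(freqs, parent1, parent2):
    """P that a random, unrelated person's genotype at one marker is
    incompatible with being a child of parent1 and parent2, assuming
    Hardy-Weinberg genotype frequencies and no mutation."""
    # every genotype a true child could carry
    compatible = {tuple(sorted((a, b)))
                  for a, b in product(parent1, parent2)}
    p_compat = 0.0
    alleles = list(freqs)
    for i, a in enumerate(alleles):
        for b in alleles[i:]:
            g = tuple(sorted((a, b)))
            prob = freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]
            if g in compatible:
                p_compat += prob
    return 1.0 - p_compat

# both parents homozygous "A" at a biallelic marker with p(A) = 0.5:
# only genotype ("A", "A") is compatible, so PE = 1 - 0.25 = 0.75
pe = exclusion_probability({"A": 0.5, "B": 0.5}, ("A", "A"), ("A", "A"))
```

    Multiplying (one minus) such per-marker values across independent markers gives the overall compatibility probability; the exceedance probability, by contrast, requires the conditional simulation described in the abstract.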

  18. Scientific Opinion on Statistical considerations for the safety evaluation of GMOs

    DEFF Research Database (Denmark)

    Sørensen, Ilona Kryspin

    in the experimental design of field trials, such as the inclusion of commercial varieties, in order to ensure sufficient statistical power and reliable estimation of natural variability. A graphical representation is proposed to allow the comparison of the GMO, its conventional counterpart and the commercial...... such estimates are unavailable may they be estimated from databases or literature. Estimated natural variability should be used to specify equivalence limits to test the difference between the GMO and the commercial varieties. Adjustments to these equivalence limits allow a simple graphical representation so...... in this opinion may be used, in certain cases, for the evaluation of GMOs other than plants....

  19. Center of Excellence for Applied Mathematical and Statistical Research in support of development of multicrop production monitoring capability

    Science.gov (United States)

    Woodward, W. A.; Gray, H. L.

    1983-01-01

    Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.

  20. Explanation of the methods employed in the statistical evaluation of SALE program data

    International Nuclear Information System (INIS)

    Bracey, J.T.; Soriano, M.

    1981-01-01

    The analysis of Safeguards Analytical Laboratory Evaluation (SALE) bimonthly data is described. Statistical procedures are discussed in Section A, followed by the descriptions of tabular and graphic values in Section B. Calculation formulae for the various statistics in the reports are presented in Section C. SALE data reported to New Brunswick Laboratory (NBL) are entered into a computerized system through routine data processing procedures. Bimonthly and annual reports are generated from this data system. In the bimonthly data analysis, data from the six most recent reporting periods of each laboratory-material-analytical method combination are utilized. Analysis results in the bimonthly reports are only presented for those participants who have reported data at least once during the last 12-month period. Reported values are transformed to relative percent difference values calculated by [(reported value - reference value)/reference value] x 100. Analysis of data is performed on these transformed values. Accordingly, the results given in the bimonthly report are (relative) percent differences (% DIFF). Suspect, large variations are verified with individual participants to eliminate errors in the transcription process. Statistical extreme values are not excluded from bimonthly analysis; all data are used
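    The transformation applied to reported values is stated explicitly in the record and is directly codable; a minimal sketch (the numeric values are made up):

```python
def relative_percent_difference(reported, reference):
    """SALE bimonthly transformation:
    (reported value - reference value) / reference value * 100."""
    return (reported - reference) / reference * 100.0

d = relative_percent_difference(10.2, 10.0)  # ≈ 2.0 (% DIFF)
```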

  1. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.

    1999-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound master curve has the same inherent degree of safety as originally intended for the KIC-reference curve. Similarly, the 1% lower bound master curve corresponds to the KIR-reference curve. (orig.)
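    The percentile lower bounds discussed in this record follow from the standard master-curve model: a three-parameter Weibull distribution with shape 4 and threshold 20 MPa·√m, together with the usual median curve K_med = 30 + 70·exp(0.019(T − T0)). The sketch below assumes those standard equations (the form later codified in ASTM E1921) rather than anything specific to the re-evaluated database:

```python
import math

def kjc_bound(temp_c, t0_c, quantile):
    """Master-curve fracture toughness K_Jc (MPa*sqrt(m)) at a given
    cumulative failure probability.

    Median curve:  K_med = 30 + 70*exp(0.019*(T - T0))
    Weibull model: P_f   = 1 - exp(-((K - 20)/(K0 - 20))**4)
    """
    k_med = 30.0 + 70.0 * math.exp(0.019 * (temp_c - t0_c))
    # scale parameter K0 recovered from the median curve
    k0 = 20.0 + (k_med - 20.0) / math.log(2.0) ** 0.25
    return 20.0 + (k0 - 20.0) * (-math.log(1.0 - quantile)) ** 0.25

# at T = T0 the median toughness is 100 MPa*sqrt(m) by construction
k50 = kjc_bound(-50.0, -50.0, 0.5)   # → 100
k05 = kjc_bound(-50.0, -50.0, 0.05)  # 5% lower bound, ≈ 61.7
```

    Evaluating `kjc_bound` at `quantile=0.05` and `quantile=0.01` over a temperature range yields the 5% and 1% lower-bound curves that the record compares with the KIC and KIR reference curves.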

  2. Statistical re-evaluation of the ASME KIC and KIR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.

    1998-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the KIC-reference curve. Similarly, the 1% lower bound Master curve corresponds to the KIR-reference curve. (orig.)

  3. Improving alignment in Tract-based spatial statistics: evaluation and optimization of image registration.

    Science.gov (United States)

    de Groot, Marius; Vernooij, Meike W; Klein, Stefan; Ikram, M Arfan; Vos, Frans M; Smith, Stephen M; Niessen, Wiro J; Andersson, Jesper L R

    2013-08-01

    Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS establishes spatial correspondence using a combination of nonlinear registration and a "skeleton projection" that may break topological consistency of the transformed brain images. We therefore investigated the feasibility of replacing the two-stage registration-projection procedure in TBSS with a single, regularized, high-dimensional registration. To optimize registration parameters and to evaluate registration performance in diffusion MRI, we designed an evaluation framework that uses native space probabilistic tractography for 23 white matter tracts, and quantifies tract similarity across subjects in standard space. We optimized parameters for two registration algorithms on two diffusion datasets of different quality. We investigated reproducibility of the evaluation framework, and of the optimized registration algorithms. Next, we compared registration performance of the regularized registration methods and TBSS. Finally, feasibility and effect of incorporating the improved registration in TBSS were evaluated in an example study. The evaluation framework was highly reproducible for both algorithms (R² = 0.993 and 0.931). The optimal registration parameters depended on the quality of the dataset in a graded and predictable manner. At optimal parameters, both algorithms outperformed the registration of TBSS, showing feasibility of adopting such approaches in TBSS. This was further confirmed in the example experiment. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Small nodule detectability evaluation using a generalized scan-statistic model

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M; Lewitt, Robert M

    2006-01-01

    The use of the scan statistic for evaluating the detectability of small nodules in medical images is investigated in this paper. The scan-statistic method is often used in applications in which random fields must be searched for abnormal local features. Several results from detection-with-localization theory are reviewed, and a generalization is presented using the noise-nodule distribution obtained by scanning arbitrary areas. One benefit of the noise-nodule model is that it enables determination of the scan-statistic distribution using only a few image samples, in a way suitable for both simulation and experimental setups. Also, based on the noise-nodule model, the case of multiple targets per image is addressed, and an image abnormality test using the likelihood ratio and an alternative test using multiple decision thresholds are derived. The results obtained reveal that in the case of low-contrast nodules or multiple nodules, the usual test strategy based on a single decision threshold underperforms compared with the alternative tests. That is a consequence of the fact that not only the contrast or the size, but also the number of suspicious nodules is a clue indicating image abnormality. In the case of the likelihood ratio test, the multiple clues are unified in a single decision variable. Other tests that process multiple clues differently do not necessarily produce a unique ROC curve, as shown in examples using a test involving two decision thresholds. We present examples with two-dimensional time-of-flight (TOF) and non-TOF PET image sets analysed using the scan statistic for different search areas, as well as the fixed-position observer

  5. Statistical evaluation of the analytical method involved in French nuclear glasses leaching rate determination

    Energy Technology Data Exchange (ETDEWEB)

    Broudic, V.; Marques, C.; Bonnal, M

    2004-07-01

    Chemical durability studies of nuclear glasses involve a large number of water leaching experiments at different temperatures and pressures on both glasses doped with fission products and actinides and non-radioactive surrogates. The leaching rates of these glasses are evaluated through ICPAES analysis of the leachate over time. This work presents a statistical evaluation of the analysis method used to determine the concentrations of various vitreous matrix constituents: Si, B, Na, Al, Ca and Li as major elements, and Ba, Cr, Fe, Mn, Mo, Ni, P, Sr, Zn and Zr as minor elements. Calibration characteristics, limits of detection, limits of quantification and uncertainty quantification are illustrated with different examples of analyses performed on surrogates and on radioactive leachates in a glove box. (authors)

  6. Laser ektacytometry and evaluation of statistical characteristics of inhomogeneous ensembles of red blood cells

    Science.gov (United States)

    Nikitin, S. Yu.; Priezzhev, A. V.; Lugovtsov, A. E.; Ustinov, V. D.; Razgulin, A. V.

    2014-10-01

    The paper is devoted to development of the laser ektacytometry technique for evaluation of the statistical characteristics of inhomogeneous ensembles of red blood cells (RBCs). We have analyzed theoretically laser beam scattering by the inhomogeneous ensembles of elliptical discs, modeling red blood cells in the ektacytometer. The analysis shows that the laser ektacytometry technique allows for quantitative evaluation of such population characteristics of RBCs as the cells mean shape, the cells deformability variance and asymmetry of the cells distribution in the deformability. Moreover, we show that the deformability distribution itself can be retrieved by solving a specific Fredholm integral equation of the first kind. At this stage we do not take into account the scatter in the RBC sizes.

  7. Statistical evaluation of the analytical method involved in French nuclear glasses leaching rate determination

    International Nuclear Information System (INIS)

    Broudic, V.; Marques, C.; Bonnal, M.

    2004-01-01

    Chemical durability studies of nuclear glasses involve a large number of water leaching experiments at different temperatures and pressures on both glasses doped with fission products and actinides and non-radioactive surrogates. The leaching rates of these glasses are evaluated through ICPAES analysis of the leachate over time. This work presents a statistical evaluation of the analysis method used to determine the concentrations of various vitreous matrix constituents: Si, B, Na, Al, Ca and Li as major elements, and Ba, Cr, Fe, Mn, Mo, Ni, P, Sr, Zn and Zr as minor elements. Calibration characteristics, limits of detection, limits of quantification and uncertainty quantification are illustrated with different examples of analyses performed on surrogates and on radioactive leachates in a glove box. (authors)

  8. A commercial microbial enhanced oil recovery process: statistical evaluation of a multi-project database

    Energy Technology Data Exchange (ETDEWEB)

    Portwood, J.T.

    1995-12-31

    This paper discusses a database of information collected and organized during the past eight years from 2,000 producing oil wells in the United States, all of which have been treated with special application techniques developed to improve the effectiveness of MEOR technology. The database, believed to be the first of its kind, has been generated for the purpose of statistically evaluating the effectiveness and economics of the MEOR process in a wide variety of oil reservoir environments, and is a tool that can be used to improve the predictability of treatment response. The information in the database has also been evaluated to determine which, if any, reservoir characteristics are dominant factors in determining the applicability of MEOR.

  9. Efficient computational model for classification of protein localization images using Extended Threshold Adjacency Statistics and Support Vector Machines.

    Science.gov (United States)

    Tahir, Muhammad; Jan, Bismillah; Hayat, Maqsood; Shah, Shakir Ullah; Amin, Muhammad

    2018-04-01

    Discriminative and informative feature extraction is the core requirement for accurate and efficient classification of protein subcellular localization images, so that drug development can be more effective. The objective of this paper is to propose a novel modification of the Threshold Adjacency Statistics technique and enhance its discriminative power. In this work, we utilized Threshold Adjacency Statistics from a novel perspective to enhance its discrimination power and efficiency. In this connection, we utilized seven threshold ranges to produce seven distinct feature spaces, which are then used to train seven SVMs. The final prediction is obtained through a majority voting scheme. The proposed ETAS-SubLoc system is tested on two benchmark datasets using the 5-fold cross-validation technique. We observed that our proposed novel utilization of the TAS technique improved the discriminative power of the classifier. The ETAS-SubLoc system achieved 99.2% accuracy, 99.3% sensitivity and 99.1% specificity for the Endogenous dataset, outperforming the classical Threshold Adjacency Statistics technique. Similarly, 91.8% accuracy, 96.3% sensitivity and 91.6% specificity values are achieved for the Transfected dataset. Simulation results validated the effectiveness of ETAS-SubLoc, which provides superior prediction performance compared to the existing technique. The proposed methodology aims at providing support to the pharmaceutical industry as well as the research community toward better drug design and innovation in the fields of bioinformatics and computational biology. The implementation code for replicating the experiments presented in this paper is available at: https://drive.google.com/file/d/0B7IyGPObWbSqRTRMcXI2bG5CZWs/view?usp=sharing. Copyright © 2018 Elsevier B.V. All rights reserved.
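    The final prediction step described in this record, majority voting over the seven per-threshold SVMs, can be sketched independently of the classifiers themselves. The labels and votes below are hypothetical stand-ins for the SVM outputs:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label predictions by majority vote;
    on a tie, the label that first reaches the winning count wins."""
    return Counter(predictions).most_common(1)[0][0]

# seven hypothetical SVMs, one per threshold range, voting on a sample
votes = ["nuclear", "nuclear", "cytoplasmic", "nuclear",
         "mitochondrial", "nuclear", "cytoplasmic"]
label = majority_vote(votes)  # → "nuclear"
```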

  10. Walkability and Urban Capabilities: evaluation and Planning Decision Support

    Directory of Open Access Journals (Sweden)

    Ivan Blečić

    2015-06-01

    Full Text Available We propose a methodology for the evaluation of urban walkability, and a related software tool for decision and planning support. In the introduction, we discuss the relevance of the concept of walkability for urban quality of life and attempt to place it within the framework of the capability approach. The central part of the article is dedicated to the presentation of the spatial multi-criteria evaluation model for walkability. Our construction of walkability in the model proposes a certain change of perspective with regard to the methods suggested thus far: rather than evaluating how walkable a place is in itself, the walkability score we calculate reflects how and where one can walk from that place; in other words, the walkability the place is endowed with. The walkability score therefore combines three components: (1) the number of available destinations (urban “opportunities”) reachable by foot; (2) their distances; and (3) the quality of pedestrian routes towards those destinations. The quality of pedestrian routes is evaluated on different attributes relevant for walkability, related to the characteristics of the streets and their surrounding environment that contribute to making a route pleasant, secure and attractive. By way of example, in the third part we present an application to the city of Alghero (Italy).
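    A toy sketch of how the three components named in this record (count of reachable destinations, their distances, route quality) might combine into a single score. The linear distance decay and the 1000 m walking range are assumptions for illustration, not the authors' model:

```python
def walkability(destinations, max_dist=1000.0):
    """Toy walkability score for one place: each foot-reachable
    destination contributes its route quality (0..1) scaled by a
    linear distance decay out to `max_dist` metres; destinations
    beyond the walking range contribute nothing."""
    score = 0.0
    for dist, quality in destinations:
        if dist <= max_dist:
            score += quality * (1.0 - dist / max_dist)
    return score

# (distance in m, pedestrian-route quality) for nearby opportunities;
# the 1500 m destination is out of walking range and is ignored
s = walkability([(200.0, 0.9), (800.0, 0.5), (1500.0, 1.0)])  # ≈ 0.82
```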

  11. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)

    2015-01-15

    Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling
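    The criterion that PC1 explain a large share (>40%) of the variation has a closed form in the two-variable case: the correlation matrix [[1, r], [r, 1]] has eigenvalues 1 + |r| and 1 − |r|, so PC1 carries (1 + |r|)/2 of the total variance of two standardized variables. A small sketch of that fact, with made-up data in place of the leachate parameters:

```python
import math

def pc1_explained(x, y):
    """Fraction of total variance carried by PC1 for two standardized
    variables: eigenvalues of [[1, r], [r, 1]] are 1 +/- |r|, so the
    fraction is (1 + |r|) / 2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x) / n)
    sy = math.sqrt(sum((v - my) ** 2 for v in y) / n)
    r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n * sx * sy)
    return (1.0 + abs(r)) / 2.0

# a perfectly correlated pair: PC1 carries all of the variance
f = pc1_explained([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])  # → 1.0
```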

  12. Statistical Support for Analysis of the Social Stratification and Economic Inequality of the Country’s Population

    Directory of Open Access Journals (Sweden)

    Aksyonova Irina V.

    2017-12-01

    Full Text Available The aim of the article is to summarize the theoretical and methodological as well as the information and analytical support for statistical research on economic and social stratification in society, and to analyze the differentiation of the population of Ukraine in terms of the economic component of social inequality. The theoretical and methodological level of the research is examined, and criteria for social stratification and inequality in society, along with systems, models and theories of social stratification of the population, are singled out. The indicators of social and economic statistics regarding the differentiation of the population by income level are considered as the research tools. As a result of the analysis it was concluded that the economic inequality of the population leads to changes in the social structure, which requires the formation of a new social stratification of society. The basis of social stratification is indicators of the population's well-being, which require a comprehensive study. Prospects for further research in this area are the analysis of the components of economic inequality that determine and influence the social stratification of the country's population, the formation of the middle class, and the study of the components of the human development index as a composite indicator of the socio-economic inequality of the population.

  13. Statistical evaluation of major human errors during the development of new technological systems

    International Nuclear Information System (INIS)

    Campbell, G; Ott, K.O.

    1979-01-01

    Statistical procedures are presented to evaluate major human errors during the development of a new system, errors that have led or can lead to accidents or major failures. The first procedure aims at estimating the average residual occurrence rate for accidents or major failures after several have occurred. The procedure is based solely on the historical record. Certain idealizations are introduced that allow the application of a sound statistical evaluation procedure. These idealizations are realized in practice to a sufficient degree that the proposed estimation procedure yields meaningful results, even for situations with a sparse data base represented by very few accidents. Under the assumption that the possible human-error-related failure times have exponential distributions, the statistical technique of isotonic regression is proposed to estimate the failure rates due to human design error at the failure times of the system. The last value in the sequence of estimates gives the residual accident chance. In addition, the actual situation is tested against the hypothesis that the failure rate of the system remains constant over time. This test determines the chance that a decreasing failure rate is incidental, rather than an indication of an actual learning process. Both techniques can be applied not merely to a single system but to an entire series of similar systems that a technology would generate, enabling the assessment of technological improvement. For the purpose of illustration, the nuclear decay of isotopes was chosen as an example, since the assumptions of the model are rigorously satisfied in this case. This application shows satisfactory agreement of the estimated and actual failure rates (which are exactly known in this example), although the estimation was deliberately based on a sparse historical record
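    The isotonic-regression step mentioned in this record is typically carried out with the classic pool-adjacent-violators algorithm. A minimal sketch for a nonincreasing (antitonic) fit, the natural shape constraint when a learning process should drive failure rates down over time; the rate values are made up:

```python
def antitonic_fit(values):
    """Least-squares nonincreasing fit via pool-adjacent-violators:
    merge neighbouring blocks whenever a later block mean exceeds
    an earlier one, then expand the block means back out."""
    blocks = []  # list of (sum, count) pairs
    for v in values:
        blocks.append((float(v), 1))
        # a later block with a larger mean violates monotonicity: pool
        while len(blocks) > 1 and \
                blocks[-1][0] / blocks[-1][1] > blocks[-2][0] / blocks[-2][1]:
            s1, c1 = blocks.pop()
            s2, c2 = blocks.pop()
            blocks.append((s1 + s2, c1 + c2))
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit

# raw rate estimates at successive failure times, smoothed to nonincreasing
rates = antitonic_fit([5.0, 2.0, 3.0, 1.0])  # → [5.0, 2.5, 2.5, 1.0]
```

    The last fitted value plays the role of the residual accident rate described in the abstract.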

  14. Hedonic approaches based on spatial econometrics and spatial statistics: application to evaluation of project benefits

    Science.gov (United States)

    Tsutsumi, Morito; Seya, Hajime

    2009-12-01

    This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics and/or spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses by applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are conducted under both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits are quite different, especially between isotropic and anisotropic SPM and between isotropic SPM and SAEM; the estimated benefits are similar for SAEM and anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.

  15. Characterization of groundwater quality using water evaluation indices, multivariate statistics and geostatistics in central Bangladesh

    Directory of Open Access Journals (Sweden)

    Md. Bodrud-Doza

    2016-04-01

    Full Text Available This study investigates the groundwater quality in the Faridpur district of central Bangladesh based on 60 preselected sample points. Water evaluation indices and a number of statistical approaches, such as multivariate statistics and geostatistics, are applied to characterize the water quality with respect to its suitability for drinking purposes. The study reveals that the EC, TDS, Ca2+, total As and Fe values of the groundwater samples exceeded Bangladesh and international standards. The groundwater quality index (GWQI) showed that about 47% of the samples belonged to good-quality water for drinking purposes. The heavy metal pollution index (HPI), degree of contamination (Cd) and heavy metal evaluation index (HEI) reveal that most of the samples belong to a low level of pollution; however, Cd provides a better alternative than the other indices. Principal component analysis (PCA) suggests that groundwater quality is mainly related to geogenic (rock–water interaction) and anthropogenic (agrogenic and domestic sewage) sources in the study area. The findings of cluster analysis (CA) and the correlation matrix (CM) are consistent with the PCA results. The spatial distributions of the groundwater quality parameters are determined by geostatistical modeling; the exponential semivariogram model is validated as the best-fitted model for most of the index values. It is expected that the outcomes of the study will provide insights for decision makers taking proper measures for groundwater quality management in central Bangladesh.
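
A water quality index of the general weighted-arithmetic kind denoted by GWQI can be sketched as below; the exact formulation, the concentrations, and the drinking-water limits are illustrative assumptions, not the study's data.

```python
def weighted_wqi(samples):
    """Generic weighted-arithmetic water quality index.

    samples maps parameter -> (measured concentration, standard limit).
    The weight of each parameter is proportional to 1/limit, so stricter
    limits (e.g. arsenic) dominate the index.  A common reading: WQI < 50
    excellent, 50-100 good, > 100 poor.
    """
    inv_sum = sum(1.0 / std for _, std in samples.values())
    wqi = 0.0
    for conc, std in samples.values():
        weight = (1.0 / std) / inv_sum
        rating = 100.0 * conc / std  # quality rating as percent of the limit
        wqi += weight * rating
    return wqi

# Illustrative sample: As at half its limit, Fe above, TDS below (mg/L).
sample = {"As": (0.005, 0.01), "Fe": (0.6, 0.3), "TDS": (800.0, 1000.0)}
wqi = weighted_wqi(sample)
assert 50.0 < wqi < 100.0  # falls in the "good" class
```

Because the arsenic limit is two orders of magnitude stricter than the TDS limit, its rating dominates the index even though iron exceeds its own standard.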

  16. Statistical evaluation of the degree of nominal convergence of the inflation rate in Romania

    Directory of Open Access Journals (Sweden)

    Mihai GHEORGHE

    2011-07-01

    Full Text Available Nominal convergence is a process characterised by the gradual harmonisation, to a relatively high degree, of the national institutions and policies of the Member States with those of the EU in the monetary and financial fields. The birth of nominal convergence is marked by the Maastricht Treaty, by means of which the criteria required for adopting the euro were established. One of the criteria refers to price stability (inflation rate), which is measured by the Harmonised Index of Consumer Prices. A Member State meets this criterion if it has a price performance that is sustainable and an average rate of inflation, observed over a period of one year before the examination, that does not exceed by more than 1.5 percentage points that of, at most, the three best-performing Member States in terms of price stability. The article proposes a model for the statistical evaluation of the degree to which the nominal convergence criterion related to price stability is met. The evaluation is based on the following pillars: a theoretical synthesis of the Harmonised Index of Consumer Prices, a statistical analysis of the evolution of inflation in Romania, and the gap vis-à-vis the reference value for meeting the nominal convergence criterion.
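
The reference-value computation behind the criterion reduces to a few lines; this is a simplified reading of the rule, and the inflation figures are invented for illustration, not Eurostat data.

```python
def maastricht_reference(avg_inflation_by_state, margin=1.5):
    """Reference value for the price-stability criterion: the unweighted
    mean of the three lowest 12-month average inflation rates among the
    Member States, plus 1.5 percentage points."""
    best_three = sorted(avg_inflation_by_state.values())[:3]
    return sum(best_three) / 3.0 + margin

# Illustrative 12-month average HICP inflation rates, in percent.
rates = {"A": 1.0, "B": 1.2, "C": 1.4, "D": 2.0, "E": 3.1, "RO": 5.0}
ref = maastricht_reference(rates)
gap = rates["RO"] - ref  # the gap vis-a-vis the reference value
assert abs(ref - 2.7) < 1e-9 and abs(gap - 2.3) < 1e-9
```

Here the three best performers average 1.2%, so the reference value is 2.7% and the hypothetical candidate overshoots it by 2.3 percentage points.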

  17. IMPORTANCE OF MATERIAL BALANCES AND THEIR STATISTICAL EVALUATION IN RUSSIAN MATERIAL PROTECTION, CONTROL AND ACCOUNTING

    International Nuclear Information System (INIS)

    Fishbone, L.G.

    1999-01-01

    While substantial work has been performed in the Russian MPC and A Program, much more needs to be done at Russian nuclear facilities to complete four necessary steps. These are (1) periodically measuring the physical inventory of nuclear material, (2) continuously measuring the flows of nuclear material, (3) using the results to close the material balance, particularly at bulk processing facilities, and (4) statistically evaluating any apparent loss of nuclear material. The periodic closing of material balances provides an objective test of the facility's system of nuclear material protection, control and accounting. The statistical evaluation using the uncertainties associated with individual measurement systems involved in the calculation of the material balance provides a fair standard for concluding whether the apparent loss of nuclear material means a diversion or whether the facility's accounting system needs improvement. In particular, if unattractive flow material at a facility is not measured well, the accounting system cannot readily detect the loss of attractive material if the latter substantially derives from the former
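
Steps (3) and (4) above — closing the balance and testing the apparent loss against measurement uncertainty — can be sketched as follows; the inventory figures, uncertainties, and the 95% test level are invented for illustration.

```python
import math

def material_balance(begin_inv, additions, removals, end_inv):
    """Material unaccounted for (MUF): what the books say should remain
    minus what the physical inventory actually found."""
    return begin_inv + additions - removals - end_inv

def muf_significant(muf, sigmas, z=1.96):
    """Test |MUF| against its propagated measurement uncertainty.

    sigmas are the standard deviations of the individual measurement
    systems entering the balance; assuming independence, they combine in
    quadrature.  Exceeding z * sigma_MUF (z = 1.96 for a 95% test) flags
    an apparent loss that measurement error cannot readily explain.
    """
    sigma_muf = math.sqrt(sum(s * s for s in sigmas))
    return abs(muf) > z * sigma_muf, sigma_muf

# Illustrative balance, in kilograms of nuclear material.
muf = material_balance(begin_inv=100.0, additions=50.0, removals=60.0, end_inv=88.0)
flag, sigma_muf = muf_significant(muf, sigmas=[0.5, 0.4, 0.3, 0.6])
assert muf == 2.0 and flag  # 2.0 kg exceeds 1.96 * 0.93 kg
```

The same arithmetic also shows the point made in the abstract about poorly measured flows: inflate one of the sigmas and the identical 2.0 kg MUF stops being statistically distinguishable from measurement error.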

  18. Support for system connectivity –Learning from evaluating an evaluation

    DEFF Research Database (Denmark)

    Christensen, Jesper Lindgaard

    Innovation is said to be dependent upon collaboration and networks. The innovation system thinking emphasizes networks, but also their supporting informal institutions, learning processes, and the relations between actors in the system. Despite the importance of networks, evaluation studies have been sparse on investigating their effects, as well as on the broader criteria by which to evaluate innovation networks and the functioning of the system. This paper discusses what such criteria are for evaluating innovation policies that rely on enhancing system connectivity and repairing system failures. By way of illustration, and as a means to be specific on these criteria, the paper discusses the possible rationale for governments to support business angel networks (BAN) and what criteria to apply when evaluating such networks. It is found that applying traditional evaluation criteria for assessing...

  19. Evaluating the effects of cognitive support on psychiatric clinical comprehension.

    Science.gov (United States)

    Dalai, Venkata V; Khalid, Sana; Gottipati, Dinesh; Kannampallil, Thomas; John, Vineeth; Blatter, Brett; Patel, Vimla L; Cohen, Trevor

    2014-10-01

    Clinicians' attention is a precious resource, which in current healthcare practice is consumed by the cognitive demands arising from complex patient conditions, information overload, time pressure, and the need to aggregate and synthesize information from disparate sources. The ability to organize information in ways that facilitate the generation of effective diagnostic solutions is a distinguishing characteristic of expert physicians, suggesting that automated systems that organize clinical information in a similar manner may augment physicians' decision-making capabilities. In this paper, we describe the design and evaluation of a theoretically driven cognitive support system (CSS) that assists psychiatrists in their interpretation of clinical cases. The system highlights, and provides the means to navigate to, text that is organized in accordance with a set of diagnostically and therapeutically meaningful higher-level concepts. To evaluate the interface, 16 psychiatry residents interpreted two clinical case scenarios, with and without the CSS. Think-aloud protocols captured during their interpretation of the cases were transcribed and analyzed qualitatively. In addition, the frequency and relative position of content related to key higher-level concepts in a verbal summary of the case were evaluated, and the transcripts from both groups were compared to an expert-derived reference standard using latent semantic analysis (LSA). Qualitative analysis showed that users of the system better attended to specific clinically important aspects of both cases when these were highlighted by the system, and revealed ways in which the system mediates hypothesis generation and evaluation. Analysis of the summary data showed differences in emphasis with and without the system. The LSA analysis suggested that users of the system were more "expert-like" in their emphasis, and that cognitive support was more effective in the more complex case. Cognitive support impacts
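
The LSA comparison against an expert reference can be illustrated with a toy term-document matrix; the three "transcripts" and the choice of two latent dimensions are assumptions for the sketch, not the study's materials.

```python
import numpy as np

def lsa_vectors(docs, k=2):
    """Project documents into a k-dimensional latent semantic space
    via SVD of the raw term-document count matrix."""
    vocab = sorted({w for d in docs for w in d.split()})
    counts = np.array([[d.split().count(w) for d in docs] for w in vocab], float)
    u, s, vt = np.linalg.svd(counts, full_matrices=False)
    return (np.diag(s[:k]) @ vt[:k]).T  # one row per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

expert = "patient reports hallucinations and paranoid delusions"
resident_a = "subject describes paranoid delusions and hallucinations"
resident_b = "routine follow up for stable mild hypertension"
v = lsa_vectors([expert, resident_a, resident_b])
# The summary sharing clinical content scores closer to the reference.
assert cosine(v[0], v[1]) > cosine(v[0], v[2])
```

In a real application the reference standard and transcripts would be long texts and k much larger, but the mechanics — factor the term-document matrix, compare documents by cosine in the reduced space — are the same.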

  20. Critical evaluation of national vital statistics: the case of preterm birth trends in Portugal.

    Science.gov (United States)

    Correia, Sofia; Rodrigues, Teresa; Montenegro, Nuno; Barros, Henrique

    2015-11-01

    Using vital statistics, the Portuguese National Health Plan predicts that 14% of live births will be preterm in 2016. The prediction was based on a preterm birth rise from 5.9% in 2000 to 8.8% in 2009. However, the same source showed an actual decline from 2010 onwards. To assess the plausibility of national preterm birth trends, we aimed to compare the evolution of preterm birth and low birthweight rates between vital statistics and a hospital database. A time-trend analysis (2004-2011) of preterm birth rates was conducted using data on singleton births from the national birth certificates (n = 801,783) and an electronic maternity unit database (n = 21,392). Annual prevalence estimates, ratios of preterm birth:low birthweight and adjusted prevalence ratios were estimated to compare data sources. Although the national prevalence of preterm birth increased from 2004 (5.4%), particularly between 2006 and 2009 (the highest rate was 7.5%, in 2007), and decreased after 2009 (5.7% in 2011), the prevalence at the maternity unit remained constant. Between 2006 and 2009, preterm birth was almost 1.4 times higher in the national statistics (using the national or the catchment region samples) than in the maternity unit, but no differences were found for low birthweight. The Portuguese preterm birth prevalence seems biased between 2006 and 2009, suggesting that early term babies were misclassified as preterm. As civil registration systems are important to support public health decisions, monitoring strategies should be adopted to assure good quality data. © 2015 Nordic Federation of Societies of Obstetrics and Gynecology.

  1. Using support vector machines with tract-based spatial statistics for automated classification of Tourette syndrome children

    Science.gov (United States)

    Wen, Hongwei; Liu, Yue; Wang, Jieqiong; Zhang, Jishui; Peng, Yun; He, Huiguang

    2016-03-01

    Tourette syndrome (TS) is a developmental neuropsychiatric disorder with the cardinal symptoms of motor and vocal tics, which emerges in early childhood and fluctuates in severity in later years. To date, the neural basis of TS is not fully understood, and TS has a long-term prognosis that is difficult to estimate accurately. Few studies have looked at the potential of using diffusion tensor imaging (DTI) in conjunction with machine learning algorithms to automate the classification of healthy children and TS children. Here we applied the Tract-Based Spatial Statistics (TBSS) method to 44 TS children and 48 age- and gender-matched healthy children in order to extract the diffusion values from each voxel in the white matter (WM) skeleton, and a feature selection algorithm (ReliefF) was used to select the most salient voxels for subsequent classification with a support vector machine (SVM). We used nested cross-validation to yield an unbiased assessment of the classification method and prevent overestimation. An accuracy of 88.04%, sensitivity of 88.64% and specificity of 87.50% were achieved, with the peak performance of the SVM classifier obtained using the axial diffusivity (AD) metric, demonstrating the potential of a joint TBSS and SVM pipeline for fast, objective classification of healthy and TS children. These results support that our methods may be useful for the early identification of subjects with TS, and hold promise for predicting prognosis and treatment outcome for individuals with TS.
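
The nested cross-validation scheme can be illustrated independently of DTI data; the one-dimensional toy data and the threshold "classifier" below are stand-ins for the ReliefF-selected voxels and the SVM, chosen only to keep the sketch self-contained.

```python
import random

def nested_cv(data, params, fit_score, outer_k=4, inner_k=3, seed=0):
    """Outer folds give an unbiased accuracy estimate; the inner folds
    choose the hyper-parameter, so the held-out test fold never
    influences model selection."""
    data = data[:]
    random.Random(seed).shuffle(data)
    outer = [data[i::outer_k] for i in range(outer_k)]
    scores = []
    for i in range(outer_k):
        test = outer[i]
        train = [x for j in range(outer_k) if j != i for x in outer[j]]
        inner = [train[j::inner_k] for j in range(inner_k)]

        def inner_score(p):
            s = 0.0
            for v in range(inner_k):
                tr = [x for u in range(inner_k) if u != v for x in inner[u]]
                s += fit_score(p, tr, inner[v])
            return s / inner_k

        best = max(params, key=inner_score)   # model selection, inner loop only
        scores.append(fit_score(best, train, test))  # unbiased outer estimate
    return sum(scores) / outer_k

def threshold_accuracy(t, _train, test):
    """Toy stand-in for an SVM: predict class 1 when x > t."""
    return sum((x > t) == bool(y) for x, y in test) / len(test)

# Two well-separated classes; threshold 0.6 is the only good parameter.
data = [(0.1 * i, 0) for i in range(1, 5)] + [(0.9 + 0.1 * i, 1) for i in range(4)]
data = data * 2  # 16 points so every fold is populated
acc = nested_cv(data, params=[0.05, 0.6, 1.25], fit_score=threshold_accuracy)
assert abs(acc - 1.0) < 1e-12
```

The point of the structure is the one the abstract makes: the parameter chosen by the inner loop is evaluated on outer data it never saw, preventing the overestimation that plagues single-loop tuning.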

  2. Statistical evaluation of GLONASS amplitude scintillation over low latitudes in the Brazilian territory

    Science.gov (United States)

    de Oliveira Moraes, Alison; Muella, Marcio T. A. H.; de Paula, Eurico R.; de Oliveira, César B. A.; Terra, William P.; Perrella, Waldecir J.; Meibach-Rosa, Pâmela R. P.

    2018-04-01

    The ionospheric scintillation, generated by ionospheric plasma irregularities, affects the radio signals that pass through it. Its effects are widely studied in the literature with two different approaches. The first deals with the use of radio signals to study and understand the morphology of this phenomenon, while the second seeks to understand and model how much this phenomenon interferes with the radio signals and, consequently, with the services that these systems provide. The interest of several areas, particularly those that are life-critical, has increased in using the concept of satellite multi-constellation, which consists of receiving, processing and using data from different navigation and positioning systems. Although there is a vast literature analyzing the effects of ionospheric scintillation on satellite navigation systems, studies using signals received from the Russian satellite positioning system (GLONASS) are still very rare. This work presents for the first time in the Brazilian low-latitude sector a statistical analysis of ionospheric scintillation data for all levels of magnetic activity, obtained by a set of scintillation monitors that receive signals from the GLONASS system. In this study, data collected from four stations were used in the analysis: Fortaleza, Presidente Prudente, São José dos Campos and Porto Alegre. The GLONASS L-band signals were analyzed for the period from December 21, 2012 to June 20, 2016, which includes the peak of solar cycle 24 that occurred in 2014. The main characteristics of scintillation presented in this study include: (1) the statistical evaluation of seasonal and solar activity, showing the chances that a user under similar geophysical conditions may be susceptible to the effects of ionospheric scintillation; (2) a temporal analysis based on the local time distribution of scintillation at different seasons and intensity levels; and (3) the evaluation of number of

  3. Visual classification of very fine-grained sediments: Evaluation through univariate and multivariate statistics

    Science.gov (United States)

    Hohn, M. Ed; Nuhfer, E.B.; Vinopal, R.J.; Klanderman, D.S.

    1980-01-01

    Classifying very fine-grained rocks through fabric elements provides information about depositional environments, but is subject to the biases of visual taxonomy. To evaluate the statistical significance of an empirical classification of very fine-grained rocks, samples from Devonian shales in four cored wells in West Virginia and Virginia were measured for 15 variables: quartz, illite, pyrite and expandable clays determined by X-ray diffraction; total sulfur, organic content, inorganic carbon, matrix density, bulk density, porosity, and silt; as well as density, sonic travel time, resistivity, and γ-ray response measured from well logs. The four lithologic types comprised: (1) sharply banded shale, (2) thinly laminated shale, (3) lenticularly laminated shale, and (4) nonbanded shale. Univariate and multivariate analyses of variance showed that the lithologic classification reflects significant differences for the variables measured, differences that can be detected independently of stratigraphic effects. Little-known statistical methods found useful in this work included: the multivariate analysis of variance with more than one effect, simultaneous plotting of samples and variables on canonical variates, and the use of parametric ANOVA and MANOVA on ranked data. © 1980 Plenum Publishing Corporation.

  4. Pattern recognition by the use of multivariate statistical evaluation of macro- and micro-PIXE results

    International Nuclear Information System (INIS)

    Tapper, U.A.S.; Malmqvist, K.G.; Loevestam, N.E.G.; Swietlicki, E.; Salford, L.G.

    1991-01-01

    The importance of statistical evaluation of multielemental data is illustrated using the data collected in a macro- and micro-PIXE analysis of human brain tumours. By employing a multivariate statistical classification methodology (SIMCA) it was shown that the total information collected from each specimen separates three types of tissue: High malignant, less malignant and normal brain tissue. This makes a classification of a given specimen possible based on the elemental concentrations. Partial least squares regression (PLS), a multivariate regression method, made it possible to study the relative importance of the examined nine trace elements, the dry/wet weight ratio and the age of the patient in predicting the survival time after operation for patients with the high malignant form, astrocytomas grade III-IV. The elemental maps from a microprobe analysis were also subjected to multivariate analysis. This showed that the six elements sorted into maps could be presented in three maps containing all the relevant information. The intensity in these maps is proportional to the value (score) of the actual pixel along the calculated principal components. (orig.)

  5. Various Statistical Methods in Use for Evaluating Human Malignant Gastric Specimens

    Directory of Open Access Journals (Sweden)

    Ventzeslav Enchev

    1998-01-01

    Full Text Available This paper presents the use of certain statistical methods (comparison of means via the independent-samples t-test, multiple linear regression analysis, multiple logistic regression analysis, cluster analysis, etc.) included in the SPSS statistical package, used to classify patients quantitatively evaluated after a subtotal resection of the stomach. The group consisted of 40 patients subdivided into two groups: primary neoplasia of the stomach (20 patients) and corresponding lymphogenic deposits in the abdominal perigastric lymph nodes (20 patients). Paraffin-embedded tissue sections (thickness 4–5 µm) prepared as consecutive hematoxylin-eosin-stained slides were measured morphometrically by rotation of a graduated eyepiece micrometer; thus, we obtained the minor and major axis lengths of the elliptic nuclear profiles and the minor and major caliper diameters of the corresponding cellular profiles. These four variables were used to determine the dynamic changes in quantitative features of human gastric lesions when passing from normal histological structures, through hyperplastic processes (chronic gastritis) and gastric precancer (ulcers and polyps with or without malignancy), to the development of primary carcinomas and their corresponding lymphogenous metastases. Besides the increased cytomorphometrical measures, we also noted an opportunity to classify the patients according to these data, as well as to add to the knowledge of our consultation system for clinical aid and use, recently published in the literature.

  6. Evaluation and projection of daily temperature percentiles from statistical and dynamical downscaling methods

    Directory of Open Access Journals (Sweden)

    A. Casanueva

    2013-08-01

    Full Text Available The study of extreme events has become of great interest in recent years due to their direct impact on society. Extremes are usually evaluated by using extreme indicators, based on order statistics on the tail of the probability distribution function (typically percentiles). In this study, we focus on the tail of the distribution of daily maximum and minimum temperatures. For this purpose, we analyse high (95th) and low (5th) percentiles of daily maximum and minimum temperatures on the Iberian Peninsula, respectively, derived from different downscaling methods (statistical and dynamical). First, we analyse the performance of reanalysis-driven downscaling methods under present climate conditions. The comparison among the different methods is performed in terms of the bias of seasonal percentiles, considering as observations the public gridded data sets E-OBS and Spain02, and obtaining an estimation of both the mean and spatial percentile errors. Secondly, we analyse the increments of future percentile projections under the SRES A1B scenario and compare them with those corresponding to the mean temperature, showing that their relative importance depends on the method, and stressing the need to consider an ensemble of methodologies.
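
The seasonal-percentile bias used for the comparison reduces to a simple computation; the temperature series below are synthetic, with the "model" shifted uniformly one degree warm to make the expected bias obvious.

```python
def percentile(xs, p):
    """Linear-interpolation percentile (the same convention as numpy's default)."""
    s = sorted(xs)
    rank = (len(s) - 1) * p / 100.0
    lo = int(rank)
    frac = rank - lo
    return s[lo] if lo + 1 == len(s) else s[lo] * (1 - frac) + s[lo + 1] * frac

# Synthetic summer daily maximum temperatures (degC): observations vs. a
# downscaling method that runs uniformly 1 degC too warm.
obs = [28.1, 30.4, 26.9, 33.0, 31.2, 29.5, 34.8, 27.3, 32.1, 30.0]
model = [t + 1.0 for t in obs]
bias_p95 = percentile(model, 95) - percentile(obs, 95)
assert abs(bias_p95 - 1.0) < 1e-9  # a uniform shift moves every percentile by 1
```

With real downscaled series the shift is not uniform, which is exactly why the tail percentiles can show biases that differ from the mean-temperature bias.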

  7. Evaluating correlation between geometrical relationship and dose difference caused by respiratory motion using statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Dong Seok; Kim, Dong Su; Kim, Tae Ho; Kim, Kyeong Hyeon; Yoon, Do Kun; Suh, Tae Suk [The Catholic University of Korea, Seoul (Korea, Republic of); Kang, Seong Hee [Seoul National University Hospital, Seoul (Korea, Republic of); Cho, Min Seok [Asan Medical Center, Seoul (Korea, Republic of); Noh, Yu Yoon [Eulji University Hospital, Daejeon (Korea, Republic of)

    2017-04-15

    Three-dimensional dose (3D dose) can account for the coverage of a moving target; however, it cannot capture the dosimetric effects caused by respiratory motion. Four-dimensional dose (4D dose), which uses a deformable image registration (DIR) algorithm on four-dimensional computed tomography (4DCT) images, can consider the dosimetric effects of respiratory motion. The dose difference between the 3D dose and the 4D dose can vary according to the geometrical relationship between the planning target volume (PTV) and an organ at risk (OAR). The purpose of this study is to evaluate the correlation between the overlap volume histogram (OVH), which quantitatively characterizes the geometrical relationship between the PTV and the OAR, and the dose differences. In conclusion, no statistically significant correlation was found between the OVH and the dose differences; however, it was confirmed that a larger difference between the 3D and 4D doses could occur in cases with smaller OVH values.

  8. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    Science.gov (United States)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chances that this and future space seals will satisfy or exceed design specifications.
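
A minimal individuals control chart of the kind used to isolate unusual compression behaviour can be sketched as follows; the load values are invented, not the test-article data.

```python
import statistics

def control_limits(baseline, n_sigma=3.0):
    """Center line plus/minus n_sigma estimated from in-control data."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)  # sample standard deviation
    return mean - n_sigma * sd, mean + n_sigma * sd

def out_of_control(points, lcl, ucl):
    """Flag each new observation falling outside the control limits."""
    return [not (lcl <= x <= ucl) for x in points]

# Illustrative compression loads (kN) from an assumed in-control build...
baseline = [10.0, 10.2, 9.8, 10.1, 9.9]
lcl, ucl = control_limits(baseline)
# ...and three later test articles; the last one signals a process change,
# e.g. a shift in the molding process.
flags = out_of_control([10.1, 10.3, 11.2], lcl, ucl)
assert flags == [False, False, True]
```

A flagged point does not by itself identify the cause; as in the study, it is the cue to go back and look for a change in the manufacturing process.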

  9. Jsub(Ic)-testing of A-533 B - statistical evaluation of some different testing techniques

    International Nuclear Information System (INIS)

    Nilsson, F.

    1978-01-01

    The purpose of the present study was to compare statistically some different methods for the evaluation of the fracture toughness of the nuclear reactor material A-533 B. Since linear elastic fracture mechanics is not applicable to this material at the temperature of interest (275 °C), the so-called Jsub(Ic) testing method was employed. Two main difficulties are inherent in this type of testing. The first is to determine the quantity J as a function of the deflection of the three-point bend specimens used. Three different techniques were used, the first two based on the experimentally observed input of energy to the specimen and the third employing finite element calculations. The second main problem is to determine the point at which crack growth begins. For this, two methods were used, a direct electrical method and the indirect R-curve method. A total of forty specimens were tested at two laboratories. No statistically significant differences were found between the results from the two laboratories. The three methods of calculating J yielded somewhat different results, although the discrepancy was small. The two methods of determining the growth initiation point also yielded consistent results; the R-curve method, however, exhibited a larger uncertainty as measured by the standard deviation. The resulting Jsub(Ic) value also agreed well with earlier presented results. The relative standard deviation was of the order of 25%, which is quite small for this type of experiment. (author)

  10. Statistical evaluation of effluent monitoring data for the 200 Area Treated Effluent Disposal Facility

    International Nuclear Information System (INIS)

    Chou, C.J.; Johnson, V.G.

    2000-01-01

    The 200 Area Treated Effluent Disposal Facility (TEDF) consists of a pair of infiltration basins that receive wastewater originating from the 200 West and 200 East Areas of the Hanford Site. TEDF has been in operation since 1995 and is regulated by State Waste Discharge Permit ST 4502 (Ecology 1995) under the authority of Chapter 90.48 Revised Code of Washington (RCW) and Washington Administrative Code (WAC) Chapter 173-216. The permit stipulates monitoring requirements for effluent (or end-of-pipe) discharges and groundwater monitoring for TEDF. Groundwater monitoring began in 1992 prior to TEDF construction. Routine effluent monitoring in accordance with the permit requirements began in late April 1995 when the facility began operations. The State Waste Discharge Permit ST 4502 included a special permit condition (S.6). This condition specified a statistical study of the variability of permitted constituents in the effluent from TEDF during its first year of operation. The study was designed to (1) demonstrate compliance with the waste discharge permit; (2) determine the variability of all constituents in the effluent that have enforcement limits, early warning values, and monitoring requirements (WHC 1995); and (3) determine if concentrations of permitted constituents vary with season. Additional and more frequent sampling was conducted for the effluent variability study. Statistical evaluation results were provided in Chou and Johnson (1996). Parts of the original first year sampling and analysis plan (WHC 1995) were continued with routine monitoring required up to the present time

  11. Developing Statistical Evaluation Model of Introduction Effect of MSW Thermal Recycling

    Science.gov (United States)

    Aoyama, Makoto; Kato, Takeyoshi; Suzuoki, Yasuo

    For the effective utilization of municipal solid waste (MSW) through thermal recycling, new technologies, such as an incineration plant using a molten carbonate fuel cell (MCFC), are being developed. The impact of new technologies should be evaluated statistically for various municipalities, so that the target of technological development or the potential cost reduction due to the increased cumulative number of installed systems can be discussed. For this purpose, we developed a model for discussing the impact of new technologies, in which a statistical mesh data set was utilized to estimate the heat demand around the incineration plant. This paper examines a case study using the developed model, in which a conventional MSW incineration plant and an MCFC-type plant are compared in terms of the reduction in primary energy and the revenue from both electricity and heat supply. Based on the difference in annual revenue, we calculate the allowable additional investment in an MCFC-type MSW incineration plant relative to a conventional plant. The results suggest that the allowable investment can be about 30 million yen/(t/day) in small municipalities, while it is only 10 million yen/(t/day) in large municipalities. The sensitivity analysis shows the model can be useful for discussing the differing impact of material recycling of plastics on thermal recycling technologies.

  12. Evaluation of higher order statistics parameters for multi channel sEMG using different force levels.

    Science.gov (United States)

    Naik, Ganesh R; Kumar, Dinesh K

    2011-01-01

    The electromyography (EMG) signal provides information about the performance of muscles and nerves. The shape of the muscle signal and motor unit action potential (MUAP) varies due to movement of the electrode position or due to changes in contraction level. This research deals with evaluating the non-Gaussianity of the surface electromyogram (sEMG) signal using higher order statistics (HOS) parameters. To achieve this, experiments were conducted for four different finger and wrist actions at different levels of maximum voluntary contraction (MVC). Our experimental analysis shows that at constant force and for non-fatiguing contractions, the probability density functions (PDF) of sEMG signals were non-Gaussian. For lower MVCs (below 30% of MVC), the PDF measures tend towards a Gaussian process. These measures were verified by computing the kurtosis values for different MVCs.
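
Excess kurtosis as a non-Gaussianity measure can be computed directly from the sample moments; the two toy signals below are illustrative, not recorded sEMG.

```python
def excess_kurtosis(xs):
    """Fourth standardized moment minus 3 (zero for a Gaussian):
    negative for flat, sub-Gaussian data; positive for spiky,
    heavy-tailed data."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 * m2) - 3.0

# A flat (sub-Gaussian) signal vs. a spiky (super-Gaussian) one.
flat = [-4.5, -3.5, -2.5, -1.5, -0.5, 0.5, 1.5, 2.5, 3.5, 4.5]
spiky = [0.0] * 8 + [5.0, -5.0]
assert excess_kurtosis(flat) < 0.0 < excess_kurtosis(spiky)
```

In the study's terms, kurtosis near zero at low contraction levels is what signals the drift towards Gaussianity below 30% MVC.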

  13. Statistical Evaluation of the Azimuth and Elevation Angles Seen at the Output of the Receiving Antenna

    Science.gov (United States)

    Ziolkowski, Cezary; Kelner, Jan M.

    2018-04-01

    A method is presented to evaluate the statistical properties of the reception angle seen at the receiver input, taking the receiving antenna pattern into account. In particular, the impact of the direction and beamwidth of the antenna pattern on the distribution of the reception angle is shown on the basis of 3D simulation studies. The obtained results show significant differences between the distributions of the angle of arrival and the angle of reception. This means that the presented method allows assessing the impact of the receiving antenna pattern on the correlation and spectral characteristics at the receiver input in simulation studies of wireless channels. The method also provides an opportunity to analyse the co-existence of small cells and wireless backhaul, which is currently a significant problem in designing 5G networks.

  14. Statistical evaluation of the dose-distribution charts of the National Computerized Irradiation Planning Network

    International Nuclear Information System (INIS)

    Varjas, Geza; Jozsef, Gabor; Gyenes, Gyoergy; Petranyi, Julia; Bozoky, Laszlo; Pataki, Gezane

    1985-01-01

    The establishment of the National Computerized Irradiation Planning Network made it possible to perform the statistical evaluation presented in this report. During the first 5 years, 13389 dose-distribution charts were calculated for the treatment of 5320 patients, i.e., on average, 2.5 dose-distribution chart variants per patient. This number practically did not change in the last 4 years. The irradiation plans for certain tumour localizations were based on the calculation of, on average, 1.6-3.0 dose-distribution charts. Recently, radiation procedures assuring optimal dose distribution, such as the use of moving fields and two or three irradiation fields, have been gaining ground. (author)

  15. Evaluation of statistical control charts for on-line radiation monitoring

    International Nuclear Information System (INIS)

    Hughes, L.D.; DeVol, T.A.

    2008-01-01

    Statistical control charts are presented for the evaluation of time-series radiation counter data from flow cells used for monitoring of low levels of ⁹⁹TcO₄⁻ in environmental solutions. Control chart methods consisted of the 3-sigma (3σ) chart, the cumulative sum (CUSUM) chart, and the exponentially weighted moving average (EWMA) chart. Each method involves a control limit based on the detector background, which constitutes the detection limit. Both the CUSUM and EWMA charts are suitable to detect and estimate sample concentration while requiring less solution volume than a 3σ control chart. Data presented here indicate that the overall accuracy and precision of the CUSUM method is the best. (author)
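
    A minimal sketch of the three alarm rules on a deterministic count series with a small upward mean shift. The parameter choices (k = 0.5σ and h = 5σ for the CUSUM, λ = 0.2 and L = 2.5 for the EWMA) are common textbook defaults, not values taken from the paper; they illustrate why the CUSUM and EWMA charts flag a small sustained shift that a 3σ chart never detects.

```python
import math

def three_sigma_alarm(data, mu, sigma):
    """Index of the first point above mu + 3*sigma, or None."""
    for i, x in enumerate(data):
        if x > mu + 3.0 * sigma:
            return i
    return None

def cusum_alarm(data, mu, sigma, k=0.5, h=5.0):
    """One-sided upper CUSUM: S = max(0, S + x - mu - k*sigma); alarm when S > h*sigma."""
    s = 0.0
    for i, x in enumerate(data):
        s = max(0.0, s + x - mu - k * sigma)
        if s > h * sigma:
            return i
    return None

def ewma_alarm(data, mu, sigma, lam=0.2, L=2.5):
    """EWMA chart: z = lam*x + (1-lam)*z; alarm when z exceeds the asymptotic limit."""
    limit = mu + L * sigma * math.sqrt(lam / (2.0 - lam))
    z = mu
    for i, x in enumerate(data):
        z = lam * x + (1.0 - lam) * z
        if z > limit:
            return i
    return None

# Background count rate mu = 10, then a small upward shift to 11 at index 20.
data = [10.0] * 20 + [11.0] * 30
print(three_sigma_alarm(data, 10.0, 1.0))  # None: the shift never crosses 3-sigma
print(ewma_alarm(data, 10.0, 1.0))         # 28
print(cusum_alarm(data, 10.0, 1.0))        # 30
```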

  16. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques.

    Directory of Open Access Journals (Sweden)

    Nsikak U Benson

    Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources.
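
    The core idea behind the PCA step, that strongly covarying concentration series collapse onto one dominant component pointing to a shared source, can be sketched for two variables, where the covariance eigenvalues have a closed form. The data below are synthetic stand-ins, not the study's measurements.

```python
import math

def pca_2d(xs, ys):
    """Eigenvalues of the 2x2 sample covariance matrix, largest first."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    half_trace = (sxx + syy) / 2.0
    d = math.sqrt(((sxx - syy) / 2.0) ** 2 + sxy ** 2)
    return half_trace + d, half_trace - d

# Two strongly covarying "metal" series (synthetic): a shared source signal
# plus a small independent wiggle.
cu = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
pb = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9, 14.2, 15.8]  # ~2*cu plus small noise

lam1, lam2 = pca_2d(cu, pb)
explained = lam1 / (lam1 + lam2)
print(round(explained, 3))  # PC1 carries nearly all the variance: one dominant source
```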

  17. Evaluation of statistical distributions to analyze the pollution of Cd and Pb in urban runoff.

    Science.gov (United States)

    Toranjian, Amin; Marofi, Safar

    2017-05-01

    Heavy metal pollution in urban runoff causes severe environmental damage. Identification of these pollutants and their statistical analysis is necessary to provide management guidelines. In this study, 45 continuous probability distribution functions were selected to fit the Cd and Pb data in the runoff events of an urban area during October 2014-May 2015. The sampling was conducted at the outlet of the city basin during seven precipitation events. For evaluation and ranking of the functions, we used the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests. The results of the Cd analysis showed that the Hyperbolic Secant, Wakeby and Log-Pearson 3 (LP3) distributions are suitable for frequency analysis of the event mean concentration (EMC), the instantaneous concentration series (ICS) and the instantaneous concentration of each event (ICEE), respectively. In addition, the LP3, Wakeby and Generalized Extreme Value functions were chosen for the EMC, ICS and ICEE related to Pb contamination.
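
    The ranking criterion rests on goodness-of-fit distances such as the Kolmogorov-Smirnov (KS) statistic, the largest gap between the empirical CDF and a candidate CDF. The sketch below is illustrative: the "concentration" sample and the two candidate distributions (fitted exponential vs. fitted uniform) are stand-ins, not the study's Cd/Pb series or its 45 candidates.

```python
import math

def ks_statistic(sample, cdf):
    """Two-sided Kolmogorov-Smirnov distance between a sample and a candidate CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

# Illustrative "concentration" sample built from exact exponential quantiles.
n, scale = 100, 0.8
sample = [-scale * math.log(1.0 - (i + 0.5) / n) for i in range(n)]
mean = sum(sample) / len(sample)
xmax = max(sample)

def exp_cdf(x):           # fitted exponential candidate
    return 1.0 - math.exp(-x / mean)

def uni_cdf(x):           # fitted uniform-on-[0, max] candidate
    return min(1.0, x / xmax)

d_exp = ks_statistic(sample, exp_cdf)
d_uni = ks_statistic(sample, uni_cdf)
print(round(d_exp, 3), round(d_uni, 3))  # smaller D => better-fitting candidate
```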

  18. Damage localization by statistical evaluation of signal-processed mode shapes

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Damkilde, Lars

    2015-01-01

    Due to their inherent ability to provide structural information on a local level, mode shapes and their derivatives are utilized extensively for structural damage identification. Typically, more or less advanced mathematical methods are implemented to identify damage-induced discontinuities in the mode shapes [...] and subsequent application of a generalized discrete Teager-Kaiser energy operator (GDTKEO) to identify damage-induced mode shape discontinuities. In order to evaluate whether the identified discontinuities are in fact damage-induced, outlier analysis of principal components of the signal-processed mode shapes is conducted on the basis of T2-statistics. The proposed method is demonstrated in the context of analytical work with a free-vibrating Euler-Bernoulli beam under noisy conditions.

  19. Evaluation of seizure propagation on ictal brain SPECT using statistical parametric mapping in temporal lobe epilepsy

    International Nuclear Information System (INIS)

    Jeon, Tae Joo; Lee, Jong Doo; Kim, Hee Joung; Lee, Byung In; Kim, Ok Joon; Kim, Min Jung; Jeon, Jeong Dong

    1999-01-01

    Ictal brain SPECT has a high diagnostic sensitivity, exceeding 90%, in the localization of seizure focus; however, it often shows increased uptake within extratemporal areas due to early propagation of seizure discharge. This study aimed to evaluate seizure propagation on ictal brain SPECT in patients with temporal lobe epilepsy (TLE) by statistical parametric mapping (SPM). Twenty-one patients (age 27.14 ± 5.79 y) with temporal lobe epilepsy (right in 8, left in 13) who had a successful seizure outcome after surgery and nine normal controls were included. The ictal and interictal brain SPECT data of the patients and the baseline SPECT of the normal control group were analyzed using automatic image registration and SPM96 software. The statistical analysis compared the mean SPECT image of the normal group with each individual ictal SPECT, and each mean image of the interictal groups of the right or left TLE with individual ictal scans. The t statistic SPM [t] was transformed to SPM [Z] with a threshold of 1.64. The statistical results were displayed and rendered on reference 3-dimensional MRI images with a P value of 0.05 and an uncorrected extent threshold p value of 0.5 for SPM [Z]. SPM data demonstrated increased uptake within the epileptic lesion in 19 patients (90.4%); among them, localized increased uptake confined to the epileptogenic lesion was seen in only 4 (19%), but 15 patients (71.4%) showed hyperperfusion within propagation sites. Bi-temporal hyperperfusion was observed in 11 of 19 patients (57.9%; 5 in the right and 6 in the left): higher uptake within the lesion than on the contralateral side in 9, similar activity in 1, and higher uptake within the contralateral lobe in one. Extra-temporal hyperperfusion was observed in 8 (2 in the right, 3 in the left, 3 bilateral): unilateral hyperperfusion within the epileptogenic temporal lobe and an extra-temporal area in 4, and bi-temporal with extra-temporal hyperperfusion in the remaining 4. Ictal brain SPECT is highly

  20. An Evaluation of Mesoscale Model Based Model Output Statistics (MOS) During the 2002 Olympic and Paralympic Winter Games

    National Research Council Canada - National Science Library

    Hart, Kenneth

    2003-01-01

    The skill of a mesoscale model based Model Output Statistics (MOS) system that provided hourly forecasts for 18 sites over northern Utah during the 2002 Winter Olympic and Paralympic Games is evaluated...

  1. Evaluation of Web-Based Ostomy Patient Support Resources.

    Science.gov (United States)

    Pittman, Joyce; Nichols, Thom; Rawl, Susan M

    To evaluate currently available, no-cost, Web-based patient support resources designed for those who have recently undergone ostomy surgery. Descriptive, correlational study using telephone survey. The sample comprised 202 adults who had ostomy surgery within the previous 24 months in 1 of 5 hospitals within a large healthcare organization in the Midwestern United States. Two of the hospitals were academic teaching hospitals, and 3 were community hospitals. The study was divided into 2 phases: (1) gap analysis of 4 Web sites (labeled A-D) based on specific criteria; and (2) telephone survey of individuals with an ostomy. In phase 1, a comprehensive checklist based on best practice standards was developed to conduct the gap analysis. In phase 2, data were collected from 202 participants by trained interviewers via 1-time structured telephone interviews that required approximately 30 minutes to complete. Descriptive analyses were performed, along with correlational analysis of relationships among Web site usage, acceptability and satisfaction, demographic characteristics, and medical history. Gap analysis revealed that Web site D, managed by a patient advocacy group, received the highest total content score of 155/176 (88%) and the highest usability score of 31.7/35 (91%). Two hundred two participants completed the telephone interview, with 96 (48%) reporting that they used the Internet as a source of information. Sixty participants (30%) reported that friends or family member had searched the Internet for ostomy information on their behalf, and 148 (75%) indicated they were confident they could get information about ostomies on the Internet. Of the 90 participants (45%) who reported using the Internet to locate ostomy information, 73 (82%) found the information on the Web easy to understand, 28 (31%) reported being frustrated during their search for information, 24 (27%) indicated it took a lot of effort to get the information they needed, and 39 (43%) were

  2. Statistical analysis of the electrocatalytic activity of Pt nanoparticles supported on novel functionalized reduced graphene oxide-chitosan for methanol electrooxidation

    Science.gov (United States)

    Ekrami-Kakhki, Mehri-Saddat; Abbasi, Sedigheh; Farzaneh, Nahid

    2018-01-01

    The purpose of this study is to statistically analyze the anodic current density and peak potential of methanol oxidation at Pt nanoparticles supported on functionalized reduced graphene oxide (RGO), using design of experiments methodology. RGO is functionalized with methyl viologen (MV) and chitosan (CH). The novel Pt/MV-RGO-CH catalyst is successfully prepared and characterized by transmission electron microscopy (TEM). The electrocatalytic activity of the Pt/MV-RGO-CH catalyst is experimentally evaluated for methanol oxidation. The effects of methanol concentration and scan rate are investigated both experimentally and statistically. The effects of these two main factors and their interactions are examined using an analysis of variance test, Duncan's multiple range test and response surface methodology. The results of the analysis of variance show that all the main factors and their interactions have a significant effect on the anodic current density and peak potential of methanol oxidation at α = 0.05. The suggested models, which encompass the significant factors, can predict the variation of the anodic current density and peak potential of methanol oxidation. The results of Duncan's multiple range test confirmed that there is a significant difference between the studied levels of the main factors.

  3. Evaluation of statistical and geostatistical models of digital soil properties mapping in tropical mountain regions

    Directory of Open Access Journals (Sweden)

    Waldir de Carvalho Junior

    2014-06-01

    Soil properties have an enormous impact on economic and environmental aspects of agricultural production. Quantitative relationships between soil properties and the factors that influence their variability are the basis of digital soil mapping. The predictive models of soil properties evaluated in this work are statistical (multiple linear regression, MLR) and geostatistical (ordinary kriging and co-kriging). The study was conducted in the municipality of Bom Jardim, RJ, using a soil database with 208 sampling points. Predictive models were evaluated for sand, silt and clay fractions, pH in water and organic carbon at six depths according to the specifications of the consortium for digital soil mapping at the global level (GlobalSoilMap). Continuous covariates and categorical predictors were used and their contributions to the models assessed. Only the environmental covariates elevation, aspect, stream power index (SPI), soil wetness index (SWI), normalized difference vegetation index (NDVI), and the b3/b2 band ratio were significantly correlated with soil properties. The predictive models had a mean coefficient of determination of 0.21. The best results were obtained with the geostatistical predictive models, where the highest coefficient of determination (0.43) was associated with the sand fraction at 60 to 100 cm depth. The use of a sparse data set of soil properties for digital mapping can explain only part of the spatial variation of these properties. The results may be related to the sampling density and the quantity and quality of the environmental covariates and predictive models used.
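
    The MLR part of such a workflow regresses a soil property on environmental covariates. A minimal sketch via the normal equations follows; the covariates (elevation, NDVI) and the target relation are hypothetical, constructed so that the fit must recover the known coefficients, and the solver is plain Gaussian elimination rather than the study's software.

```python
def ols_fit(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y.

    Each row of X carries a leading 1.0 for the intercept.
    """
    p = len(X[0])
    # Build the augmented system [X^T X | X^T y].
    A = []
    for i in range(p):
        row = [sum(r[i] * r[j] for r in X) for j in range(p)]
        row.append(sum(r[i] * yv for r, yv in zip(X, y)))
        A.append(row)
    # Gaussian elimination with partial pivoting.
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p + 1):
                A[r][c] -= f * A[col][c]
    # Back-substitution.
    b = [0.0] * p
    for i in range(p - 1, -1, -1):
        b[i] = (A[i][p] - sum(A[i][j] * b[j] for j in range(i + 1, p))) / A[i][i]
    return b

# Hypothetical covariates standing in for elevation (m) and NDVI; the target
# is generated exactly as clay = 20 + 0.01*elev - 5*ndvi, so OLS must recover it.
elev = [100, 300, 500, 700, 900, 1100, 250, 650]
ndvi = [0.2, 0.5, 0.3, 0.7, 0.4, 0.6, 0.8, 0.1]
X = [[1.0, e, v] for e, v in zip(elev, ndvi)]
y = [20 + 0.01 * e - 5 * v for e, v in zip(elev, ndvi)]

b0, b1, b2 = ols_fit(X, y)
print(round(b0, 3), round(b1, 5), round(b2, 3))  # ~20, 0.01, -5
```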

  4. Empirical and Statistical Evaluation of the Effectiveness of Four Lossless Data Compression Algorithms

    Directory of Open Access Journals (Sweden)

    N. A. Azeez

    2017-04-01

    Data compression is the process of reducing the size of a file to effectively reduce storage space and communication cost. Technological evolution in the digital age has led to an unparalleled usage of digital files in the current decade. This growth in data transmitted via various channels of data communication has prompted the need to examine current lossless data compression algorithms and check their level of effectiveness, so as to maximally reduce the bandwidth requirement in the communication and transfer of data. Four lossless data compression algorithms were selected for implementation: the Lempel-Ziv-Welch algorithm, the Shannon-Fano algorithm, the Adaptive Huffman algorithm and Run-Length Encoding. The choice of these algorithms was based on their similarities, particularly in application areas. Their efficiency and effectiveness were evaluated using a set of predefined performance evaluation metrics, namely compression ratio, compression factor, compression time, saving percentage, entropy and code efficiency. The algorithms were implemented in the NetBeans Integrated Development Environment using Java as the programming language. Through the statistical analysis performed using boxplots and ANOVA, comparisons were made across the four algorithms.
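
    The size-based metrics from the list above (compression ratio, compression factor, saving percentage) can be sketched directly. Since the four algorithms studied are not in the Python standard library, zlib's DEFLATE serves here only as a stand-in compressor to illustrate how the metrics are computed.

```python
import zlib

def compression_metrics(data: bytes):
    """Compression ratio, factor, and saving fraction for a DEFLATE (zlib) pass."""
    comp = zlib.compress(data, 9)
    ratio = len(comp) / len(data)    # smaller is better
    factor = len(data) / len(comp)   # larger is better
    saving = 1.0 - ratio             # fraction of space saved
    return ratio, factor, saving

# Highly repetitive text compresses well under any LZ-family scheme.
text = ("the quick brown fox jumps over the lazy dog " * 200).encode()
ratio, factor, saving = compression_metrics(text)
print(f"ratio={ratio:.3f} factor={factor:.1f} saving={saving:.1%}")
```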

  5. Statistical analysis of correlated experimental data and neutron cross section evaluation

    International Nuclear Information System (INIS)

    Badikov, S.A.

    1998-01-01

    The technique for the evaluation of neutron cross sections on the basis of statistical analysis of correlated experimental data is presented. The most important stages of the evaluation, from the compilation of the correlation matrix for measurement uncertainties to the representation of the analysis results in the ENDF-6 format, are described in detail. Special attention is paid to restrictions (positive definiteness) on the covariance matrix of approximated parameter uncertainties generated within the least-squares fit, which are derived from physical reasons. The requirements for source experimental data assuring satisfaction of the restrictions mentioned above are formulated. In particular, correlation matrices of measurement uncertainties should also be positive definite. Variants of modelling positive definite correlation matrices of measurement uncertainties, for situations when their calculation on the basis of experimental information is impossible, are discussed. The technique described is used for creating a new generation of estimates of dosimetric reaction cross sections for the first version of the Russian dosimetric file (including nontrivial covariance information).

  6. Examining publication bias—a simulation-based evaluation of statistical tests on publication bias

    Directory of Open Access Journals (Sweden)

    Andreas Schneck

    2017-11-01

    Background: Publication bias is a form of scientific misconduct. It threatens the validity of research results and the credibility of science. Although several tests on publication bias exist, no in-depth evaluations are available that examine which test performs best for different research settings. Methods: Four tests on publication bias, Egger's test (FAT), p-uniform, the test of excess significance (TES), and the caliper test, were evaluated in a Monte Carlo simulation. Two different types of publication bias and its degree (0%, 50%, 100%) were simulated. The type of publication bias was defined either as file-drawer, meaning the repeated analysis of new datasets, or p-hacking, meaning the inclusion of covariates in order to obtain a significant result. In addition, the underlying effect (β = 0, 0.5, 1, 1.5), effect heterogeneity, the number of observations in the simulated primary studies (N = 100, 500), and the number of observations for the publication bias tests (K = 100, 1,000) were varied. Results: All tests evaluated were able to identify publication bias both in the file-drawer and p-hacking conditions. The false positive rates were, with the exception of the 15%- and 20%-caliper tests, unbiased. The FAT had the largest statistical power in the file-drawer conditions, whereas under p-hacking the TES was, except under effect heterogeneity, slightly better. The caliper tests were, however, inferior to the other tests under effect homogeneity and had decent statistical power only in conditions with 1,000 primary studies. Discussion: The FAT is recommended as a test for publication bias in standard meta-analyses with no or only small effect heterogeneity. If two-sided publication bias is suspected, as well as under p-hacking, the TES is the first alternative to the FAT. The 5%-caliper test is recommended under conditions of effect heterogeneity and a large number of primary studies, which may be found if publication bias is examined in a
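
    The caliper test is the simplest of the four to sketch: count test statistics falling just above versus just below the significance threshold, where an even split is expected without bias, and apply a one-sided sign (binomial) test. The z-scores and the 0.1 caliper width below are illustrative, not values from the simulation.

```python
import math

def caliper_test(z_values, caliper=0.1, threshold=1.96):
    """One-sided sign test: are z-scores piled up just above the threshold?

    Within the narrow caliper around the threshold, unbiased reporting should
    place results above or below it with probability ~1/2 each.
    """
    above = sum(1 for z in z_values if threshold <= z < threshold + caliper)
    below = sum(1 for z in z_values if threshold - caliper < z < threshold)
    n = above + below
    # P(X >= above) for X ~ Binomial(n, 0.5)
    p = sum(math.comb(n, k) for k in range(above, n + 1)) / 2 ** n
    return above, below, p

# Illustrative z-scores clustered suspiciously just over 1.96.
zs = [1.97, 1.99, 2.00, 2.02, 2.03, 1.98, 2.01, 2.04, 1.88, 2.05]
above, below, p = caliper_test(zs)
print(above, below, round(p, 4))  # 9 above vs 1 below: p ~ 0.011
```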

  7. Predictive analysis of beer quality by correlating sensory evaluation with higher alcohol and ester production using multivariate statistics methods.

    Science.gov (United States)

    Dong, Jian-Jun; Li, Qing-Liang; Yin, Hua; Zhong, Cheng; Hao, Jun-Guang; Yang, Pan-Fei; Tian, Yu-Hong; Jia, Shi-Ru

    2014-10-15

    Sensory evaluation is regarded as a necessary procedure to ensure a reproducible quality of beer. Meanwhile, high-throughput analytical methods provide a powerful tool to analyse various flavour compounds, such as higher alcohols and esters. In this study, the relationship between flavour compounds and sensory evaluation was established by non-linear models such as partial least squares (PLS), genetic algorithm back-propagation neural network (GA-BP), and support vector machine (SVM). It was shown that SVM with a radial basis function (RBF) kernel had better prediction accuracy for both the calibration set (94.3%) and the validation set (96.2%) than the other models. Relatively lower prediction abilities were observed for GA-BP (52.1%) and PLS (31.7%). In addition, the kernel function played an essential role in model training: with a polynomial kernel, the prediction accuracy of SVM was only 32.9%. As a powerful multivariate statistics method, SVM holds great potential to assess beer quality. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Evaluation of the applicability in the future climate of a statistical downscaling method in France

    Science.gov (United States)

    Dayon, G.; Boé, J.; Martin, E.

    2013-12-01

    The uncertainties in climate projections during the next decades generally remain large, with an important contribution from internal climate variability. To quantify and capture the impact of those uncertainties in impact projections, multi-model and multi-member approaches are essential. Statistical downscaling (SD) methods are computationally inexpensive, allowing for large ensemble approaches. The main weakness of SD is that it relies on a stationarity hypothesis, namely that the statistical relation established in the present climate remains valid in the climate change context. In this study, the evaluation of SD methods developed for a future study of hydrological changes during the next decades over France is presented, focusing on precipitation. The SD methods are all based on the analogs method, which is quite simple to set up and permits easy testing of different combinations of predictors, the only changing parameter in the methods discussed in this presentation. The basic idea of the analogs method is that for the same large-scale climatic state, the state of local variables will be identical. In a climate change context, the statistical relation established on past climate is assumed to remain valid in the future climate. In practice, this stationarity assumption is impossible to verify until the future climate is effectively observed. It is possible to evaluate the ability of SD methods to reproduce the interannual variability in the present climate, but this approach does not guarantee their validity in the future climate, as the mechanisms at play in the interannual and climate change contexts may not be identical. Another common approach is to test whether an SD method is able to reproduce observed trends, as they may be partly caused by climate change. The observed trends in precipitation are compared to those obtained by downscaling 4 different atmospheric reanalyses with analogs methods. The uncertainties in downscaled trends due to reanalyses are very large

  9. Evaluating the Use of Random Distribution Theory to Introduce Statistical Inference Concepts to Business Students

    Science.gov (United States)

    Larwin, Karen H.; Larwin, David A.

    2011-01-01

    Bootstrapping methods and random distribution methods are increasingly recommended as better approaches for teaching students about statistical inference in introductory-level statistics courses. The authors examined the effect of teaching undergraduate business statistics students using random distribution and bootstrapping simulations. It is the…
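
    The kind of simulation such courses rely on can be sketched as a percentile bootstrap for a sample mean: resample with replacement, recompute the statistic, and read the confidence interval off the resampling distribution. The scores below are hypothetical classroom data, not from the study.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.fmean, n_boot=5000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for a statistic of the data."""
    rng = random.Random(seed)
    reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical quiz scores from an introductory business statistics class.
scores = [62, 71, 74, 78, 80, 81, 84, 85, 88, 90, 91, 95]
lo, hi = bootstrap_ci(scores)
print(round(lo, 1), round(hi, 1))  # the interval straddles the sample mean
```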

  10. Evaluation of Theoretical and Empirical Characteristics of the Communication, Language, and Statistics Survey (CLASS)

    Science.gov (United States)

    Wagler, Amy E.; Lesser, Lawrence M.

    2018-01-01

    The interaction between language and the learning of statistical concepts has been receiving increased attention. The Communication, Language, And Statistics Survey (CLASS) was developed in response to the need to focus on dynamics of language in light of the culturally and linguistically diverse environments of introductory statistics classrooms.…

  11. A statistical approach to evaluate flood risk at the regional level: an application to Italy

    Science.gov (United States)

    Rossi, Mauro; Marchesini, Ivan; Salvati, Paola; Donnini, Marco; Guzzetti, Fausto; Sterlacchini, Simone; Zazzeri, Marco; Bonazzi, Alessandro; Carlesi, Andrea

    2016-04-01

    Floods are frequent and widespread in Italy, causing multiple fatalities and extensive damage to public and private structures every year. A pre-requisite for the development of mitigation schemes, including financial instruments such as insurance, is the ability to quantify their costs starting from the estimation of the underlying flood hazard. However, comprehensive and coherent information on flood-prone areas, and estimates of the frequency and intensity of flood events, are often not available at scales appropriate for risk pooling and diversification. In Italy, River Basin Hydrogeological Plans (PAI), prepared by basin administrations, are the basic descriptive, regulatory, technical and operational tools for environmental planning in flood-prone areas. Nevertheless, such plans do not cover the entire Italian territory, having significant gaps along the minor hydrographic network and in ungauged basins. Several process-based modelling approaches have been used by different basin administrations for flood hazard assessment, resulting in an inhomogeneous hazard zonation of the territory. As a result, flood hazard assessments and expected damage estimations are not always coherent across the different Italian basin administrations. To overcome these limitations, we propose a simplified multivariate statistical approach for regional flood hazard zonation coupled with a flood impact model. This modelling approach has been applied in different Italian basin administrations, allowing a preliminary but coherent and comparable estimation of the flood hazard and the relative impact. Model performance is evaluated by comparing the predicted flood-prone areas with the corresponding PAI zonation. The proposed approach will provide standardized information (following the EU Floods Directive specifications) on flood risk at a regional level, which can in turn be more readily applied to assess flood economic impacts.
Furthermore, in the assumption of an appropriate

  12. ERROR DISTRIBUTION EVALUATION OF THE THIRD VANISHING POINT BASED ON RANDOM STATISTICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    C. Li

    2012-07-01

    POS, integrated by GPS/INS (Inertial Navigation Systems), has allowed rapid and accurate determination of the position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have system error, but it is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute for INS in MMS in some special land-based scenes, such as ground façades, where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY). How to set initial weights for the adjustment solution of single-image vanishing points is presented. The vanishing points are solved and their error distributions estimated based on an iteration method with variable weights, the co-factor matrix and error ellipse theory. Thirdly, under the condition of known error ellipses of two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for the vanishing point coordinates and their error distributions are shown and analyzed.

  13. Error Distribution Evaluation of the Third Vanishing Point Based on Random Statistical Simulation

    Science.gov (United States)

    Li, C.

    2012-07-01

    POS, integrated by GPS/INS (Inertial Navigation Systems), has allowed rapid and accurate determination of the position and attitude of remote sensing equipment for MMS (Mobile Mapping Systems). However, not only does INS have system error, but it is also very expensive. Therefore, in this paper the error distributions of vanishing points are studied and tested in order to substitute for INS in MMS in some special land-based scenes, such as ground façades, where usually only two vanishing points can be detected. Thus, the traditional calibration approach based on three orthogonal vanishing points is being challenged. In this article, firstly, the line clusters, which are parallel to each other in object space and correspond to the vanishing points, are detected based on RANSAC (Random Sample Consensus) and a parallelism geometric constraint. Secondly, condition adjustment with parameters is utilized to estimate the nonlinear error equations of two vanishing points (VX, VY). How to set initial weights for the adjustment solution of single-image vanishing points is presented. The vanishing points are solved and their error distributions estimated based on an iteration method with variable weights, the co-factor matrix and error ellipse theory. Thirdly, under the condition of known error ellipses of two vanishing points (VX, VY) and on the basis of the triangle geometric relationship of three vanishing points, the error distribution of the third vanishing point (VZ) is calculated and evaluated by random statistical simulation, ignoring camera distortion. Moreover, the Monte Carlo methods utilized for random statistical estimation are presented. Finally, experimental results for the vanishing point coordinates and their error distributions are shown and analyzed.
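
    The Monte Carlo step can be sketched using the standard constraint that, for three mutually orthogonal directions, the principal point is the orthocenter of the vanishing-point triangle: with the principal point at the origin, VZ solves the 2x2 linear system VX·VZ = VX·VY and VY·VZ = VY·VX. The coordinates and the 2-pixel noise level below are illustrative assumptions, not the paper's data, and camera distortion is ignored as in the abstract.

```python
import math
import random

def third_vanishing_point(vx, vy):
    """VZ from the orthocenter constraint with the principal point at the
    origin: VX.VZ = VX.VY and VY.VZ = VY.VX (a 2x2 linear system, by Cramer)."""
    rhs = vx[0] * vy[0] + vx[1] * vy[1]
    det = vx[0] * vy[1] - vx[1] * vy[0]
    return rhs * (vy[1] - vx[1]) / det, rhs * (vx[0] - vy[0]) / det

def monte_carlo_vz(vx, vy, sigma=2.0, n=2000, seed=7):
    """Propagate isotropic Gaussian pixel noise on VX and VY to VZ."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        vxn = (rng.gauss(vx[0], sigma), rng.gauss(vx[1], sigma))
        vyn = (rng.gauss(vy[0], sigma), rng.gauss(vy[1], sigma))
        pts.append(third_vanishing_point(vxn, vyn))
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sx = math.sqrt(sum((p[0] - mx) ** 2 for p in pts) / (n - 1))
    sy = math.sqrt(sum((p[1] - my) ** 2 for p in pts) / (n - 1))
    return (mx, my), (sx, sy)

vx, vy = (200.0, -50.0), (-150.0, -60.0)   # illustrative pixel coordinates
vz = third_vanishing_point(vx, vy)
mean_vz, std_vz = monte_carlo_vz(vx, vy)
print(tuple(round(c, 1) for c in vz))       # nominal VZ
print(tuple(round(c, 1) for c in std_vz))   # spread of VZ under 2 px input noise
```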

  14. Indicators to support the dynamic evaluation of air quality models

    Science.gov (United States)

    Thunis, P.; Clappier, A.

    2014-12-01

    Air quality models are useful tools for the assessment and forecast of pollutant concentrations in the atmosphere. Most of the evaluation process relies on the "operational phase", in other words the comparison of model results with available measurements, which provides insight into the model's capability to reproduce measured concentrations for a given application. But one of the key advantages of air quality models lies in their ability to assess the impact of precursor emission reductions on air quality levels. Models are then used in a dynamic mode (i.e. assessing the response to a change in a given model input), for which evaluation of model performance becomes a challenge. The objective of this work is to propose common indicators and diagrams to facilitate the understanding of model responses to emission changes when models are used for policy support. These indicators are shown to be useful for retrieving information on the magnitude of the locally produced impacts of emission reductions on concentrations with respect to the contribution external to the domain, but also for identifying, distinguishing and quantifying impacts arising from different factors (different precursors). In addition, information about the robustness of the model results is provided. As such, these indicators might prove useful as a first screening methodology to assess the feasibility of a given action, as well as to prioritize the factors on which to act for increased efficiency. Finally, all indicators are made dimensionless to facilitate the comparison of results obtained with different models, different resolutions, or over different geographical areas.

  15. Evaluation of photographs supporting an FFQ developed for adolescents.

    Science.gov (United States)

    Brito, Alessandra Page; Guimarães, Celso Pereira; Pereira, Rosangela Alves

    2014-01-01

    To evaluate the validity of food photographs used to support the reporting of food intake with an FFQ designed for adolescents from Rio de Janeiro, Brazil. A set of ninety-five food photographs was elaborated. The photographs' evaluation process included the recognition of foods and portions in the pictures. In the identification of foods (ninety-five photographs) and typical portions (twelve photographs), the adolescents were requested to answer a structured questionnaire related to the food photographs. The identification of the portion size of amorphous foods (forty-three photographs) was performed using three different portion sizes of actual preparations. The proportions (and 95% confidence intervals) of adolescents who correctly identified foods and portion sizes in each photograph were estimated. The setting was a public school in Niterói, Rio de Janeiro State, Brazil. Sixty-two randomly selected adolescents between 11·0 and 18·9 years old participated. At least 90% of adolescents correctly identified the food in ninety-two photographs, and the food in the three remaining photographs was recognized by 80-89% of the adolescents. At least 98% of the adolescents correctly identified eleven typical or natural portions in the food photographs. For amorphous foods, at least 70% of teenagers correctly identified the portion size in the photograph for thirty-one foods; for the other photographs, the portion size was correctly recognized by 50-69% of the adolescents for eight foods and by less than 50% of adolescents for four foods. The analysed photographs are appropriate visual aids for the reporting of food consumption by adolescents.

  16. Meta-analysis as Statistical and Analytical Method of Journal's Content Scientific Evaluation.

    Science.gov (United States)

    Masic, Izet; Begic, Edin

    2015-02-01

    A meta-analysis is a statistical and analytical method which combines and synthesizes different independent studies and integrates their results into one common result. This study analysed the journals "Medical Archives", "Materia Socio Medica" and "Acta Informatica Medica", which are indexed in the most eminent databases of the biomedical milieu. The study has a retrospective and descriptive character and covers the calendar year 2014, comprising six issues of each of the three journals (18 issues in total). In this period a total of 291 articles were published (110 in the "Medical Archives", 97 in "Materia Socio Medica", and 84 in "Acta Informatica Medica"). Original articles made up the largest share; smaller numbers were published as professional papers, review articles and case reports. Clinical topics were most common in the first two journals, while articles in "Acta Informatica Medica" belonged to the field of medical informatics, as part of the pre-clinical medical disciplines. Articles usually required a period of fifty to fifty-nine days for review. Articles were received from four continents, mostly from Europe, with authors most often from the territory of Bosnia and Herzegovina, followed by Iran, Kosovo and Macedonia. The number of articles published each year is increasing, with greater participation of authors from different continents and from abroad. Clinical medical disciplines are the most common, with a broadening spectrum of topics and a growing number of original articles. Greater support from the wider scientific community is needed for the further development of all three of the aforementioned journals.

  17. Degrees of separation as a statistical tool for evaluating candidate genes.

    Science.gov (United States)

    Nelson, Ronald M; Pettersson, Mats E

    2014-12-01

    Selection of candidate genes is an important step in the exploration of complex genetic architecture. The number of gene networks available is increasing, and these can provide information to help with candidate gene selection. It is currently common to use the degree of connectedness in gene networks as validation in Genome Wide Association (GWA) and Quantitative Trait Locus (QTL) mapping studies. However, this practice can produce misleading results if not validated properly. Here we present a method and tool for validating the gene pairs from GWA studies given the context of the network they co-occur in. It ensures that proposed interactions and gene associations are not statistical artefacts inherent to the specific gene network architecture. The CandidateBacon package provides an easy and efficient method to calculate the average degree of separation (DoS) between pairs of genes in currently available gene networks. We show how these empirical estimates of average connectedness are used to validate candidate gene pairs. Validating interacting genes by comparing their connectedness with the average connectedness in the gene network provides support for said interactions by utilising the growing amount of gene network information available. Copyright © 2014 Elsevier Ltd. All rights reserved.
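
    The average DoS reported in the abstract is, at its core, a mean shortest-path length over candidate gene pairs in an undirected network. A minimal sketch of that computation (gene names and the network are hypothetical; this is not the CandidateBacon code itself):

```python
from collections import deque

def degrees_of_separation(network, source, target):
    """Breadth-first search for the shortest path length (degrees of
    separation) between two genes in an undirected network given as an
    adjacency dict {gene: set of neighbouring genes}."""
    if source == target:
        return 0
    seen = {source}
    queue = deque([(source, 0)])
    while queue:
        gene, dist = queue.popleft()
        for neighbour in network.get(gene, ()):
            if neighbour == target:
                return dist + 1
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, dist + 1))
    return None  # no path: the genes are disconnected

def average_dos(network, pairs):
    """Average degrees of separation over the connected candidate gene pairs."""
    dists = [d for d in (degrees_of_separation(network, a, b) for a, b in pairs)
             if d is not None]
    return sum(dists) / len(dists) if dists else None

# Toy chain network with hypothetical gene names G1-G4.
net = {
    "G1": {"G2"}, "G2": {"G1", "G3"}, "G3": {"G2", "G4"}, "G4": {"G3"},
}
print(average_dos(net, [("G1", "G3"), ("G1", "G4")]))  # (2 + 3) / 2 = 2.5
```

    Candidate pairs whose DoS is markedly smaller than this network-wide average would then count as supported by the network context.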

  18. Comparing the Goodness of Different Statistical Criteria for Evaluating the Soil Water Infiltration Models

    Directory of Open Access Journals (Sweden)

    S. Mirzaee

    2016-02-01

    Full Text Available Introduction: The infiltration process is one of the most important components of the hydrologic cycle. Quantifying the infiltration of water into soil is of great importance in watershed management. Prediction of flooding, erosion and pollutant transport all depend on the rate of runoff, which is directly affected by the rate of infiltration. Quantification of water infiltration into soil is also necessary to determine the availability of water for crop growth and to estimate the amount of additional water needed for irrigation. Thus, an accurate model is required to estimate the infiltration of water into soil. The ability of physical and empirical models to simulate soil processes is commonly measured through comparisons of simulated and observed values. For these reasons, a large variety of indices have been proposed and used over the years in the comparison of soil water infiltration models. Among the proposed indices, some are absolute criteria such as the widely used root mean square error (RMSE), while others are relative (i.e. normalized) criteria such as the Nash and Sutcliffe (1970) efficiency criterion (NSE). Selecting and using appropriate statistical criteria to evaluate and interpret the results of soil water infiltration models is essential because each criterion focuses on specific types of errors. Also important are descriptions of the various goodness-of-fit indices or indicators, including their advantages and shortcomings, and rigorous discussion of the suitability of each index. The objective of this study is to compare the goodness of different statistical criteria for evaluating models of water infiltration into soil. Comparison techniques were considered to define the best models: coefficient of determination (R2), root mean square error (RMSE), efficiency criteria (NSEI) and modified forms (such as NSEjI, NSESQRTI, NSElnI and NSEiI). Comparatively little work has been carried out on the meaning and
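
    The absolute and relative criteria the study compares can be sketched as follows; the `transform` argument to `nse` mimics the modified forms such as the square-root and log variants (the function names here are illustrative, not the study's notation):

```python
import math

def rmse(obs, sim):
    """Root mean square error: an absolute goodness-of-fit criterion."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nse(obs, sim, transform=lambda x: x):
    """Nash-Sutcliffe efficiency, optionally computed on transformed values
    (e.g. transform=math.sqrt or math.log for the modified forms).
    NSE = 1 is a perfect fit; NSE <= 0 means the model predicts no better
    than the mean of the observations."""
    o = [transform(v) for v in obs]
    s = [transform(v) for v in sim]
    mean_o = sum(o) / len(o)
    num = sum((oi - si) ** 2 for oi, si in zip(o, s))
    den = sum((oi - mean_o) ** 2 for oi in o)
    return 1.0 - num / den

# Hypothetical observed vs simulated cumulative infiltration values.
observed = [2.0, 4.0, 6.0, 8.0]
simulated = [2.5, 3.5, 6.5, 7.5]
print(rmse(observed, simulated))            # 0.5 (every error is +-0.5)
print(nse(observed, simulated))             # 1 - 1.0/20.0 = 0.95
print(nse(observed, simulated, math.sqrt))  # square-root-transformed variant
```

    The point the abstract makes is visible even here: RMSE weights large errors in absolute units, while NSE normalizes by the variance of the observations, so the two criteria can rank competing models differently.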

  19. Comments on Brodsky's statistical methods for evaluating epidemiological results, and reply by Brodsky, A

    International Nuclear Information System (INIS)

    Frome, E.L.; Khare, M.

    1980-01-01

    Brodsky's paper 'A Statistical Method for Testing Epidemiological Results, as applied to the Hanford Worker Population', (Health Phys., 36, 611-628, 1979) proposed two test statistics for use in comparing the survival experience of a group of employees and controls. This letter states that both of the test statistics were computed using incorrect formulas and concludes that the results obtained using these statistics may also be incorrect. In his reply Brodsky concurs with the comments on the proper formulation of estimates of pooled standard errors in constructing test statistics but believes that the erroneous formulation does not invalidate the major points, results and discussions of his paper. (author)

  20. Evaluation of SOVAT: an OLAP-GIS decision support system for community health assessment data analysis.

    Science.gov (United States)

    Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie

    2008-06-09

    Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable the management and analysis of spatial data, but have limitations in performing analysis of numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for CHA data analysis. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurements included: task completion time, success in answering the tasks, and system satisfaction. Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed statistically significantly (alpha = .01) from SPSS-GIS for satisfaction and time. The results support OLAP-GIS decision support systems as a valuable tool for CHA data analysis.

  1. Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    Directory of Open Access Journals (Sweden)

    Parmanto Bambang

    2008-06-01

    Full Text Available Abstract Background Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable the management and analysis of spatial data, but have limitations in performing analysis of numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for CHA data analysis. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurements included: task completion time, success in answering the tasks, and system satisfaction. Results Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed statistically significantly (α = .01) from SPSS-GIS for satisfaction and time. Conclusion Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than with the current IT (SPSS-GIS).

  2. Evaluation of a hanging core support concept for LMR application

    International Nuclear Information System (INIS)

    Burelbach, J.P.; Cha, B.K.; Huebotter, P.R.; Kann, W.J.; Pan, Y.C.; Saiveau, J.G.; Seidensticker, R.W.; Wu, T.S.

    1985-01-01

    The paper describes an innovative design concept for a liquid metal reactor (LMR) core support structure (CSS). A hanging core support structure is described and analyzed. The design offers inherent safety features, constructability advantages, and potential cost reductions. Safety considerations examined include in-service inspection (ISI), the backup support system, and the structural behavior in the hypothetical case of a broken beam in the core support structure

  3. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

    In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as the 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation and interpretation of order statistics in safety analysis are not fully consistent within the industry, which has led to an extensive public debate among regulators and researchers that can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights into the ASTRUM statistical approach, with a more in-depth analysis of the pros and cons of order statistics and of the Westinghouse approach to implementing this statistical methodology. (authors)
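
    The 124-run figure is commonly derived from the non-parametric tolerance-limit formula of Wilks, in the multi-output extension usually attributed to Guba and Makai: the confidence that the maxima of N runs jointly bound the 95th percentiles of p outputs is β = Σ_{j=0}^{N-p} C(N,j) γ^j (1-γ)^(N-j). A numerical sketch of that sample-size calculation (an illustration of the theory, not Westinghouse's implementation):

```python
from math import comb

def confidence(n, gamma, p):
    """Confidence beta that the p largest order statistics from n Monte Carlo
    runs jointly bound the gamma-quantiles of p output variables:
    beta = sum_{j=0}^{n-p} C(n, j) * gamma**j * (1 - gamma)**(n - j)."""
    return sum(comb(n, j) * gamma**j * (1 - gamma)**(n - j)
               for j in range(n - p + 1))

def minimum_runs(gamma=0.95, beta=0.95, p=1):
    """Smallest n achieving confidence beta for coverage gamma and p outputs."""
    n = p
    while confidence(n, gamma, p) < beta:
        n += 1
    return n

print(minimum_runs(p=1))  # 59: the classic single-output 95/95 Wilks result
print(minimum_runs(p=3))  # 124: three criteria (PCT, LMO, CWO) bounded jointly
```

    For p = 1 the formula collapses to 1 - γ^n ≥ β, giving the familiar 59 runs; requiring all three acceptance criteria to be bounded simultaneously at 95/95 raises the count to 124, matching the number of simulations cited above.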

  4. Statistical Analysis of Model Data for Operational Space Launch Weather Support at Kennedy Space Center and Cape Canaveral Air Force Station

    Science.gov (United States)

    Bauman, William H., III

    2010-01-01

    The 12-km resolution North American Mesoscale (NAM) model (MesoNAM) is used by the 45th Weather Squadron (45 WS) Launch Weather Officers at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) to support space launch weather operations. The 45 WS tasked the Applied Meteorology Unit (AMU) to conduct an objective statistics-based analysis of MesoNAM output compared to wind tower mesonet observations and then develop an operational tool to display the results. The National Centers for Environmental Prediction began running the current version of the MesoNAM in mid-August 2006. The period of record for the dataset was 1 September 2006 - 31 January 2010. The AMU evaluated MesoNAM hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The MesoNAM forecast winds, temperature and dew point were compared to the observed values of these parameters from the sensors in the KSC/CCAFS wind tower network. The data sets were stratified by model initialization time, month and onshore/offshore flow for each wind tower. Statistics computed included bias (mean difference), standard deviation of the bias, root mean square error (RMSE) and a hypothesis test for bias = 0. Twelve wind towers located in close proximity to key launch complexes were used for the statistical analysis, with the sensors on the towers positioned at varying heights (6 ft, 30 ft, 54 ft, 60 ft, 90 ft, 162 ft, 204 ft and 230 ft) depending on the launch vehicle and associated weather launch commit criteria being evaluated. These twelve wind towers support activities for the Space Shuttle (launch and landing), Delta IV, Atlas V and Falcon 9 launch vehicles. For all twelve towers, the results indicate a diurnal signal in the bias of temperature (T) and a weaker but discernible diurnal signal in the bias of dewpoint temperature (Td) in the MesoNAM forecasts. Also, the standard deviation of the bias and RMSE of T, Td, wind speed and wind
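
    The verification statistics named above (bias, standard deviation of the bias, RMSE, and a test for bias = 0) can be sketched for one forecast/observation pair series as follows. The numbers are synthetic; tower IDs, stratifications, and the AMU's actual tool are not reproduced here:

```python
import math

def verification_stats(forecast, observed):
    """Bias (mean forecast-minus-observed difference), sample standard
    deviation of the differences, RMSE, and the one-sample t statistic for
    the hypothesis bias = 0 (compare |t| against a t table with n - 1
    degrees of freedom)."""
    diffs = [f - o for f, o in zip(forecast, observed)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    t = bias / (sd / math.sqrt(n)) if sd > 0 else float("inf")
    return bias, sd, rmse, t

# Hypothetical tower temperatures (deg F): model forecast vs observation.
fcst = [71.2, 73.5, 75.1, 74.0, 72.8]
obs = [70.0, 72.9, 74.2, 73.1, 72.5]
bias, sd, rmse, t = verification_stats(fcst, obs)
print(f"bias={bias:.2f}  sd={sd:.2f}  rmse={rmse:.2f}  t={t:.2f}")
```

    A consistently positive bias with a large |t|, as in this toy series, is the kind of warm diurnal signal the study reports for temperature.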

  5. Gonorrhea Statistics

    Science.gov (United States)


  6. An Evaluation Framework for Energy Aware Buildings using Statistical Model Checking

    DEFF Research Database (Denmark)

    David, Alexandre; Du, DeHui; Larsen, Kim Guldstrand

    2012-01-01

    Cyber-physical systems are to be found in numerous applications throughout society. The principal barrier to developing trustworthy cyber-physical systems is the lack of expressive modelling and specification formalisms supported by efficient tools and methodologies. To overcome this barrier, we ... properties of this formalism. A particular kind of cyber-physical system are Smart Grids, which together with Intelligent, Energy Aware Buildings will play a major role in achieving an energy-efficient society of the future. In this paper we present a framework in Uppaal-smc for energy aware buildings, allowing evaluation of the performance of proposed control strategies in terms of their induced comfort and energy profiles under varying environmental settings (e.g. weather, user behaviour, ...). To demonstrate the intended use and usefulness of our framework, we present an application to the Hybrid ...

  7. Simulating snow maps for Norway: description and statistical evaluation of the seNorge snow model

    Directory of Open Access Journals (Sweden)

    T. M. Saloranta

    2012-11-01

    Full Text Available Daily maps of snow conditions have been produced in Norway with the seNorge snow model since 2004. The seNorge snow model operates at 1 × 1 km resolution, uses gridded observations of daily temperature and precipitation as its input forcing, and simulates, among other variables, snow water equivalent (SWE), snow depth (SD), and snow bulk density (ρ). In this paper the set of equations contained in the seNorge model code is described, and a thorough spatiotemporal statistical evaluation of the model performance over 1957–2011 is made using the two major sets of extensive in situ snow measurements that exist for Norway. The evaluation results show that the seNorge model generally overestimates both SWE and ρ, and that the overestimation of SWE increases with elevation throughout the snow season. However, the R2-values for model fit are 0.60 for (log-transformed) SWE and 0.45 for ρ, indicating that after removal of the detected systematic model biases (e.g. by recalibrating the model or expressing snow conditions in relative units) the model performs rather well. The seNorge model provides a relatively simple, not very data-demanding, yet nonetheless process-based method to construct snow maps of high spatiotemporal resolution. It is an especially well-suited alternative for operational snow mapping in regions with rugged topography and large spatiotemporal variability in snow conditions, as is the case in mountainous Norway.

  8. Predicting of biomass in Brazilian tropical dry forest: a statistical evaluation of generic equations

    Directory of Open Access Journals (Sweden)

    ROBSON B. DE LIMA

    2017-08-01

    Full Text Available ABSTRACT Dry tropical forests are a key component of the global carbon cycle, and their biomass estimates depend almost exclusively on equations fitted to multi-species or individual-species data. Therefore, a systematic evaluation of statistical models through validation of estimates of aboveground biomass stocks is justifiable. In this study, the capacity of generic and specific equations obtained from different locations in Mexico and Brazil to estimate aboveground biomass was analyzed at the multi-species level and for four different species. Generic equations developed in Mexico and Brazil performed better in estimating tree biomass for multi-species data. For Poincianella bracteosa and Mimosa ophthalmocentra, only the Sampaio and Silva (2005) generic equation is recommended; these equations show lower tendency and lower bias, and their biomass estimates are similar. For the species Mimosa tenuiflora and Aspidosperma pyrifolium and for the genus Croton, the specific regional equations are more recommended, although the generic equation of Sampaio and Silva (2005) is not discarded for biomass estimates. Models considering genus, family, successional group, climatic variables and wood specific gravity should be adjusted and tested, and the resulting equations should be validated at both local and regional levels as well as at the scale of the tropics with dry-forest dominance.

  9. Predicting of biomass in Brazilian tropical dry forest: a statistical evaluation of generic equations.

    Science.gov (United States)

    Lima, Robson B DE; Alves, Francisco T; Oliveira, Cinthia P DE; Silva, José A A DA; Ferreira, Rinaldo L C

    2017-01-01

    Dry tropical forests are a key component of the global carbon cycle, and their biomass estimates depend almost exclusively on equations fitted to multi-species or individual-species data. Therefore, a systematic evaluation of statistical models through validation of estimates of aboveground biomass stocks is justifiable. In this study, the capacity of generic and specific equations obtained from different locations in Mexico and Brazil to estimate aboveground biomass was analyzed at the multi-species level and for four different species. Generic equations developed in Mexico and Brazil performed better in estimating tree biomass for multi-species data. For Poincianella bracteosa and Mimosa ophthalmocentra, only the Sampaio and Silva (2005) generic equation is recommended; these equations show lower tendency and lower bias, and their biomass estimates are similar. For the species Mimosa tenuiflora and Aspidosperma pyrifolium and for the genus Croton, the specific regional equations are more recommended, although the generic equation of Sampaio and Silva (2005) is not discarded for biomass estimates. Models considering genus, family, successional group, climatic variables and wood specific gravity should be adjusted and tested, and the resulting equations should be validated at both local and regional levels as well as at the scale of the tropics with dry-forest dominance.

  10. EVALUATION OF PHOTOGRAMMETRIC BLOCK ORIENTATION USING QUALITY DESCRIPTORS FROM STATISTICALLY FILTERED TIE POINTS

    Directory of Open Access Journals (Sweden)

    A. Calantropio

    2018-05-01

    Full Text Available Due to the increasing number of low-cost sensors widely available on the market, and because of the supposedly guaranteed correctness of the semi-automatic 3D-reconstruction workflows implemented in recent commercial software, more and more users now operate without following the rigour of classical photogrammetric methods. This behaviour often naively leads to 3D products that lack any metric quality assessment. This paper proposes and analyses an approach that gives users the possibility to preserve the trustworthiness of the metric information inherent in the 3D model without sacrificing the automation offered by modern photogrammetry software. First, the importance of data quality assessment is outlined, together with a recall of photogrammetric best practices. With the purpose of guiding the user through a correct pipeline for a certified 3D model reconstruction, an operative workflow is proposed, focusing on the first part of the object reconstruction steps (tie-point extraction, camera calibration, and relative orientation). A new GUI (Graphical User Interface) developed for the open source MicMac suite is then presented, and a sample dataset is used for the evaluation of the photogrammetric block orientation using statistically obtained quality descriptors. The results and future directions are then presented and discussed.
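
    One common way to statistically filter tie points before evaluating block orientation is a robust outlier screen on reprojection residuals. The median/MAD rule below is an illustrative sketch of that idea, not MicMac's actual filter, and the residual values are hypothetical:

```python
import statistics

def filter_tie_points(residuals, k=3.5):
    """Return the indices of tie points whose reprojection residual (pixels)
    deviates from the median by more than k robust standard deviations,
    estimated via the median absolute deviation (MAD)."""
    med = statistics.median(residuals)
    mad = statistics.median(abs(r - med) for r in residuals)
    sigma = 1.4826 * mad  # scales MAD to a std. dev. under normality
    return [i for i, r in enumerate(residuals) if abs(r - med) > k * sigma]

# Hypothetical reprojection residuals; the last tie point is a blunder.
resid = [0.4, 0.5, 0.6, 0.5, 0.45, 0.55, 4.0]
print(filter_tie_points(resid))  # [6]
```

    Because the median and MAD are insensitive to the blunder itself, the screen flags the 4-pixel residual without letting it inflate the threshold, which is the advantage of robust statistics over a plain mean/standard-deviation cut.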

  11. Do Different Mental Models Influence Cybersecurity Behavior? Evaluations via Statistical Reasoning Performance

    Directory of Open Access Journals (Sweden)

    Gary L. Brase

    2017-11-01

    Full Text Available Cybersecurity research often describes people as understanding internet security in terms of metaphorical mental models (e.g., disease risk, physical security risk, or criminal behavior risk). However, little research has directly evaluated if this is an accurate or productive framework. To assess this question, two experiments asked participants to respond to a statistical reasoning task framed in one of four different contexts (cybersecurity, plus the above alternative models). Each context was also presented using either percentages or natural frequencies, and these tasks were followed by a behavioral likelihood rating. As in previous research, consistent use of natural frequencies promoted correct Bayesian reasoning. There was little indication, however, that any of the alternative mental models generated consistently better understanding or reasoning over the actual cybersecurity context. There was some evidence that different models had some effects on patterns of responses, including the behavioral likelihood ratings, but these effects were small, as compared to the effect of the numerical format manipulation. This points to a need to improve the content of actual internet security warnings, rather than working to change the models users have of warnings.

  12. Do Different Mental Models Influence Cybersecurity Behavior? Evaluations via Statistical Reasoning Performance.

    Science.gov (United States)

    Brase, Gary L; Vasserman, Eugene Y; Hsu, William

    2017-01-01

    Cybersecurity research often describes people as understanding internet security in terms of metaphorical mental models (e.g., disease risk, physical security risk, or criminal behavior risk). However, little research has directly evaluated if this is an accurate or productive framework. To assess this question, two experiments asked participants to respond to a statistical reasoning task framed in one of four different contexts (cybersecurity, plus the above alternative models). Each context was also presented using either percentages or natural frequencies, and these tasks were followed by a behavioral likelihood rating. As in previous research, consistent use of natural frequencies promoted correct Bayesian reasoning. There was little indication, however, that any of the alternative mental models generated consistently better understanding or reasoning over the actual cybersecurity context. There was some evidence that different models had some effects on patterns of responses, including the behavioral likelihood ratings, but these effects were small, as compared to the effect of the numerical format manipulation. This points to a need to improve the content of actual internet security warnings, rather than working to change the models users have of warnings.
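
    The statistical reasoning task in studies of this kind is typically a Bayesian base-rate problem. The sketch below (with hypothetical numbers, not the study's actual task) shows why the percentage and natural-frequency formats are mathematically equivalent even though natural frequencies are easier for people to reason with:

```python
from fractions import Fraction

def posterior_from_rates(base_rate, hit_rate, false_alarm_rate):
    """P(threat | warning) via Bayes' theorem in probability/percentage format."""
    p_warning = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
    return base_rate * hit_rate / p_warning

def posterior_from_frequencies(population, base_rate, hit_rate, false_alarm_rate):
    """The same computation in natural-frequency format: count concrete cases
    out of a reference population, the representation that aids reasoning."""
    threats = round(population * base_rate)
    true_warnings = round(threats * hit_rate)
    false_warnings = round((population - threats) * false_alarm_rate)
    return Fraction(true_warnings, true_warnings + false_warnings)

# Hypothetical warning problem: 1% of downloads are malicious; the warning
# fires on 90% of malicious and 5% of benign downloads.
p = posterior_from_rates(0.01, 0.90, 0.05)
f = posterior_from_frequencies(10000, 0.01, 0.90, 0.05)
print(round(p, 4), f)  # both ~= 0.154: 90 true warnings out of 90 + 495
```

    The counterintuitive result (a warning is still wrong about 85% of the time) is immediate from the frequency counts, while extracting it from the percentage format requires the full Bayes computation.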

  13. Evaluation of partially premixed turbulent flame stability from mixture fraction statistics in a slot burner

    KAUST Repository

    Kruse, Stephan

    2018-04-11

    Partially premixed combustion is characterized by mixture fraction inhomogeneity upstream of the reaction zone and occurs in many applied combustion systems. The temporal and spatial fluctuations of the mixture fraction have a tremendous impact on the combustion characteristics, emission formation, and flame stability. In this study, turbulent partially premixed flames are experimentally studied in a slot burner configuration. The local temperature and gas composition are determined by means of one-dimensional, simultaneous detection of Rayleigh and Raman scattering. The statistics of the mixture fraction are utilized to characterize the impact of the Reynolds number, the global equivalence ratio, the progress of mixing within the flame, as well as the mixing length on the mixing field. Furthermore, these effects are evaluated by means of a regime diagram for partially premixed flames. In this study, it is shown that the increase of the mixing length results in a significantly more stable flame. The impact of the Reynolds number on flame stability is found to be minor.

  14. Evaluation of partially premixed turbulent flame stability from mixture fraction statistics in a slot burner

    KAUST Repository

    Kruse, Stephan; Mansour, Mohy S.; Elbaz, Ayman M.; Varea, Emilien; Grünefeld, Gerd; Beeckmann, Joachim; Pitsch, Heinz

    2018-01-01

    Partially premixed combustion is characterized by mixture fraction inhomogeneity upstream of the reaction zone and occurs in many applied combustion systems. The temporal and spatial fluctuations of the mixture fraction have a tremendous impact on the combustion characteristics, emission formation, and flame stability. In this study, turbulent partially premixed flames are experimentally studied in a slot burner configuration. The local temperature and gas composition are determined by means of one-dimensional, simultaneous detection of Rayleigh and Raman scattering. The statistics of the mixture fraction are utilized to characterize the impact of the Reynolds number, the global equivalence ratio, the progress of mixing within the flame, as well as the mixing length on the mixing field. Furthermore, these effects are evaluated by means of a regime diagram for partially premixed flames. In this study, it is shown that the increase of the mixing length results in a significantly more stable flame. The impact of the Reynolds number on flame stability is found to be minor.

  15. Statistical evaluation of variables affecting occurrence of hydrocarbons in aquifers used for public supply, California

    Science.gov (United States)

    Landon, Matthew K.; Burton, Carmen A.; Davis, Tracy A.; Belitz, Kenneth; Johnson, Tyler D.

    2014-01-01

    The variables affecting the occurrence of hydrocarbons in aquifers used for public supply in California were assessed based on statistical evaluation of three large statewide datasets; gasoline oxygenates also were analyzed for comparison with hydrocarbons. Benzene is the most frequently detected (1.7%) compound among 17 hydrocarbons analyzed at generally low concentrations (median detected concentration 0.024 μg/l) in groundwater used for public supply in California; methyl tert-butyl ether (MTBE) is the most frequently detected (5.8%) compound among seven oxygenates analyzed (median detected concentration 0.1 μg/l). At aquifer depths used for public supply, hydrocarbons and MTBE rarely co-occur and are generally related to different variables; in shallower groundwater, co-occurrence is more frequent and there are similar relations to the density or proximity of potential sources. Benzene concentrations are most strongly correlated with reducing conditions, regardless of groundwater age and depth. Multiple lines of evidence indicate that benzene and other hydrocarbons detected in old, deep, and/or brackish groundwater result from geogenic sources of oil and gas. However, in recently recharged (since ~1950), generally shallower groundwater, higher concentrations and detection frequencies of benzene and hydrocarbons were associated with a greater proportion of commercial land use surrounding the well, likely reflecting effects of anthropogenic sources, particularly in combination with reducing conditions.

  16. Statistical Evaluation of Causal Factors Associated with Astronaut Shoulder Injury in Space Suits.

    Science.gov (United States)

    Anderson, Allison P; Newman, Dava J; Welsch, Roy E

    2015-07-01

    Shoulder injuries due to working inside the space suit are some of the most serious and debilitating injuries astronauts encounter. Space suit injuries occur primarily in the Neutral Buoyancy Laboratory (NBL) underwater training facility due to accumulated musculoskeletal stress. We quantitatively explored the underlying causal mechanisms of injury. Logistic regression was used to identify relevant space suit components, training environment variables, and anthropometric dimensions related to an increased propensity for space-suited injury. Two groups of subjects were analyzed: those whose reported shoulder incident is attributable to the NBL or working in the space suit, and those whose shoulder incident began during active duty, meaning working in the suit could be a contributing factor. For both groups, percent of training performed in the space suit planar hard upper torso (HUT) was the most important predictor variable for injury. Frequency of training and recovery between training were also significant metrics. The most relevant anthropometric dimensions were bideltoid breadth, expanded chest depth, and shoulder circumference. Finally, record of previous injury was found to be a relevant predictor for subsequent injury. The first statistical model correctly identifies 39% of injured subjects, while the second model correctly identifies 68% of injured subjects. A review of the literature suggests this is the first work to quantitatively evaluate the hypothesized causal mechanisms of all space-suited shoulder injuries. Although limited in predictive capability, each of the identified variables can be monitored and modified operationally to reduce future impacts on an astronaut's health.
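
    Logistic regression of a binary injury outcome on continuous predictors, as used in this study, can be sketched with plain gradient descent. Everything below is synthetic and hypothetical (a single made-up predictor standing in for "fraction of training in the planar HUT"); it is not the astronaut dataset or the authors' model:

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Stochastic-gradient logistic regression; returns [intercept, weights...]."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)  # w[0] is the intercept
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = yi - p  # gradient of the log-likelihood w.r.t. z
            w[0] += lr * err
            for j in range(d):
                w[j + 1] += lr * err * xi[j]
    return w

def predict(w, xi):
    """Predicted injury probability for one subject."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic data: injury risk rises with the (hypothetical) HUT fraction.
random.seed(1)
X = [[random.random()] for _ in range(200)]
y = [1 if x[0] + random.gauss(0, 0.3) > 0.6 else 0 for x in X]
w = fit_logistic(X, y)
print(predict(w, [0.9]) > predict(w, [0.1]))  # higher fraction -> higher risk
```

    In practice such models are fitted with a statistics package and judged by classification rates, as in the 39% and 68% figures above; the sketch only shows the mechanics.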

  17. Implementation of Statistical Methods and SWOT Analysis for Evaluation of Metal Waste Management in Engineering Company

    Directory of Open Access Journals (Sweden)

    Záhorská Renáta

    2016-12-01

This paper presents the results of waste management research in a selected engineering company, RIBE Slovakia, k. s., Nitra factory. Within its manufacturing programme, the factory uses a wide range of manufacturing technologies (cutting operations, metal cold-forming, thread rolling, metal surface finishing, automatic sorting, metrology, assembly) to produce its final products: connecting components (fasteners) delivered to many industrial fields (agricultural machinery manufacturers, the car industry, etc.). Data characterizing the production technologies and the range of manufactured products were obtained. Key attention is paid to the classification of waste produced by engineering production and to waste management within the company. Within the research, data characterizing the time course of production of various waste types were obtained and evaluated by statistical methods using STATGRAPHICS. Based on the application of SWOT analysis, the company's waste management is objectively assessed in terms of strengths and weaknesses, and the opportunities and potential threats are determined. The results of the SWOT analysis support the conclusion that the RIBE Slovakia, k. s., Nitra factory has a well organized waste management system. The fact that the waste management system is incorporated into the company management system can be considered an advantage.

  18. Application of Multivariate Statistical Analysis in Evaluation of Surface River Water Quality of a Tropical River

    Directory of Open Access Journals (Sweden)

    Teck-Yee Ling

    2017-01-01

The present study evaluated the spatial variations of surface water quality in a tropical river using multivariate statistical techniques, including cluster analysis (CA) and principal component analysis (PCA). Twenty physicochemical parameters were measured at 30 stations along the Batang Baram and its tributaries. The water quality of the Batang Baram was categorized as “slightly polluted”, with chemical oxygen demand and total suspended solids being the most deteriorated parameters. The CA grouped the 30 stations into four clusters which shared similar characteristics within the same cluster, representing the upstream, middle, and downstream regions of the main river and the tributaries from the middle to downstream regions of the river. The PCA identified a reduced set of six principal components that explained 83.6% of the data set variance. The first PC indicated that total suspended solids, turbidity, and hydrogen sulphide were the dominant polluting factors, which are attributed to logging activities, followed in the second PC by the five-day biochemical oxygen demand, total phosphorus, organic nitrogen, and nitrate-nitrogen, which are related to discharges from domestic wastewater. The components also imply that logging activities are the major anthropogenic activities responsible for water quality variations in the Batang Baram when compared to domestic wastewater discharge.
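The PCA workflow of the study can be illustrated with a small SVD-based NumPy sketch; the matrix below is synthetic, standing in for the 30-station by 20-parameter dataset:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical water-quality matrix: 30 stations x 8 parameters
# (think TSS, turbidity, BOD5, ...), standardized before PCA.
X = rng.normal(size=(30, 8))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=30)   # a correlated pair -> shared PC

Z = (X - X.mean(axis=0)) / X.std(axis=0)        # z-score standardization
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()                 # variance explained per PC
loadings = Vt                                   # rows: PCs, cols: parameters
```

The first rows of `loadings` would then be inspected to name the dominant polluting factors, as the study does for total suspended solids and turbidity.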

  19. Statistical Evaluation of the Identified Structural Parameters of an idling Offshore Wind Turbine

    International Nuclear Information System (INIS)

    Kramers, Hendrik C.; Van der Valk, Paul L.C.; Van Wingerden, Jan-Willem

    2016-01-01

With the increased need for renewable energy, new offshore wind farms are being developed at an unprecedented scale. However, as the costs of offshore wind energy are still too high, design optimization and new innovations are required to lower its cost. The design of modern offshore wind turbines relies on numerical models for estimating ultimate and fatigue loads. The dynamic behavior and the resulting structural loading of the turbines are determined in large part by their structural properties, such as the natural frequencies and damping ratios; hence, it is important to obtain accurate estimates of these modal properties. For this purpose, stochastic subspace identification (SSI), in combination with clustering and statistical evaluation methods, is used to obtain the variance of the identified modal properties of an installed 3.6 MW offshore wind turbine in idling conditions. It is found that one is able to obtain confidence intervals for the means of the eigenfrequencies and damping ratios of the fore-aft and side-side modes of the wind turbine. (paper)

  20. Evaluation of nutritional support in a regional hospital.

    Science.gov (United States)

    Morán López, Jesús Manuel; Hernández González, Miriam; Peñalver Talavera, David; Peralta Watt, María; Temprano Ferreras, José Luis; Redondo Llorente, Cristina; Rubio Blanco, María Yolanda

    2018-05-08

Disease-related malnutrition (DRM) is highly prevalent in Spanish hospitals (occurring in 1 out of every 4 patients). The 'Más Nutridos' Alliance has developed an action plan to detect and treat DRM. In Extremadura (Spain), the public health system has included nutritional screening as the only mechanism to fight malnutrition. The results of this strategy are evaluated here. An agreement study was conducted in standard clinical practice. Variables collected included the following rates: nutritional screening at entry, coded nutritional diagnoses, nutritional status assessment, nutritional requirements, successful nutritional therapy, weight and height at entry and discharge, and referral to a nutritional support unit (NSU). Comparison standards were based on the results of the Netherlands programme to fight malnutrition. The nutritional screening rate at entry was 20.5% (95% CI: 18.00-21.00). The rate of coding and nutritional status assessment at entry was 13%. Weight and height were both measured in 16.5% of patients at entry and 20% at discharge. Nutritional requirements were estimated in 30% of patients and were poorly monitored (13.3%). Only 15% of patients were referred to a NSU. Significantly lower values were found for all indicators as compared to standards, with kappa values lower than 0.2 in all cases. Data analysis showed poorer results when patients referred to the NSU were excluded. A strategy to fight malnutrition based on nutritional screening alone is highly inefficient in hospitals such as HVP. Copyright © 2018 SEEN y SED. Published by Elsevier España, S.L.U. All rights reserved.

  1. Flight Deck Weather Avoidance Decision Support: Implementation and Evaluation

    Science.gov (United States)

    Wu, Shu-Chieh; Luna, Rocio; Johnson, Walter W.

    2013-01-01

Weather-related disruptions account for seventy percent of the delays in the National Airspace System (NAS). A key component of the weather plan of the Next Generation Air Transportation System (NextGen) is to assimilate observed weather information and probabilistic forecasts into the decision processes of flight crews and air traffic controllers. In this research we explore supporting flight crew weather decision making through the development of a flight deck predicted weather display system that utilizes weather predictions generated by ground-based radar. This system integrates and presents weather information, together with in-flight trajectory modification tools, within a cockpit display of traffic information (CDTI) prototype. The CDTI features 2D and perspective 3D visualization models of weather. The weather forecast products we implemented were the Corridor Integrated Weather System (CIWS) and the Convective Weather Avoidance Model (CWAM), both developed by MIT Lincoln Laboratory. We evaluated the use of CIWS and CWAM for flight deck weather avoidance in two part-task experiments. Experiment 1 compared pilots' en route weather avoidance performance in four weather information conditions that differed in the type and amount of predicted forecast (CIWS current weather only, CIWS current and historical weather, CIWS current and forecast weather, CIWS current and forecast weather plus CWAM predictions). Experiment 2 compared the use of perspective 3D and 2.5D presentations of weather for flight deck weather avoidance. Results showed that pilots could take advantage of longer-range predicted weather forecasts in performing en route weather avoidance, but more research will be needed to determine what combinations of information are optimal and how best to present them.

  2. Evaluating Restorative Justice Circles of Support and Accountability: Can Social Support Overcome Structural Barriers?

    Science.gov (United States)

    Bohmert, Miriam Northcutt; Duwe, Grant; Hipple, Natalie Kroovand

    2018-02-01

    In a climate in which stigmatic shaming is increasing for sex offenders as they leave prison, restorative justice practices have emerged as a promising approach to sex offender reentry success and have been shown to reduce recidivism. Criminologists and restorative justice advocates believe that providing ex-offenders with social support that they may not otherwise have is crucial to reducing recidivism. This case study describes the expressive and instrumental social support required and received, and its relationship to key outcomes, by sex offenders who participated in Circles of Support and Accountability (COSAs), a restorative justice, reentry program in Minnesota. In-depth interviews with re-entering sex offenders and program volunteers revealed that 75% of offenders reported weak to moderate levels of social support leaving prison, 70% reported receiving instrumental support in COSAs, and 100% reported receiving expressive support. Findings inform work on social support, structural barriers, and restorative justice programming during sex offender reentry.

  3. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

    In the framework of safety analysis for the Belgian nuclear power plants and for the reload compatibility studies, Tractebel Energy Engineering (TEE) has developed, to define a 95/95 DNBR criterion, a statistical thermal design method based on the analytical full statistical approach: the Statistical Thermal Design Procedure (STDP). In that methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (Critical Heat Flux) correlation implemented in the sub-channel code Cobra for core thermal hydraulic analysis. The uncertainties of the correlation are represented by the statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (Departure from Nucleate Boiling Ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value adjusted with some penalties and deterministic factors. The search of a realistic value for the SDL is the objective of the statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion or SDL (Statistical Design Limit) with the strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)
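For context, the classic nonparametric route to a 95/95 limit (Wilks' formula, a simpler relative of the full STDP described above, not the STDP itself) fixes the minimum number of samples needed so that the most extreme observation bounds 95% of the population with 95% confidence:

```python
# Smallest sample size n for which the sample extreme is a one-sided
# 95/95 tolerance bound (Wilks' nonparametric formula):
# P(at least `coverage` of the population lies inside the bound)
#   = 1 - coverage**n  >=  confidence.
def wilks_n(coverage=0.95, confidence=0.95):
    n = 1
    while 1.0 - coverage**n < confidence:
        n += 1
    return n

n_9595 = wilks_n()   # first-order, one-sided 95/95 bound
```

This reproduces the well-known figure of 59 samples (or code runs) for a first-order one-sided 95/95 bound.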

  4. Meta-analysis as Statistical and Analytical Method of Journal’s Content Scientific Evaluation

    Science.gov (United States)

    Masic, Izet; Begic, Edin

    2015-01-01

Introduction: A meta-analysis is a statistical and analytical method which combines and synthesizes different independent studies and integrates their results into one common result. Goal: Analysis of the journals “Medical Archives”, “Materia Socio Medica” and “Acta Informatica Medica”, which are indexed in the most eminent databases of the biomedical milieu. Material and methods: The study has a retrospective and descriptive character, and covered the calendar year 2014. The study included six issues of each of the three journals (18 issues in total). Results: In this period a total of 291 articles was published (110 in “Medical Archives”, 97 in “Materia Socio Medica”, and 84 in “Acta Informatica Medica”). Original articles were the most numerous; smaller numbers were published as professional papers, review articles, and case reports. Clinical topics were most common in the first two journals, while articles in “Acta Informatica Medica” mostly belonged to the field of medical informatics, as part of the pre-clinical medical disciplines. Articles usually required a period of fifty to fifty-nine days for review. Articles were received from four continents, mostly from Europe. The authors were most often from the territory of Bosnia and Herzegovina, followed by Iran, Kosovo and Macedonia. Conclusion: The number of articles published each year is increasing, with greater participation of authors from different continents and from abroad. Clinical medical disciplines are the most common, with a broader spectrum of topics and a growing number of original articles. Greater support of the wider scientific community is needed for the further development of all three of the aforementioned journals. PMID:25870484

  5. The Canadian Precipitation Analysis (CaPA): Evaluation of the statistical interpolation scheme

    Science.gov (United States)

    Evans, Andrea; Rasmussen, Peter; Fortin, Vincent

    2013-04-01

CaPA (Canadian Precipitation Analysis) is a data assimilation system which employs statistical interpolation to combine observed precipitation with gridded precipitation fields produced by Environment Canada's Global Environmental Multiscale (GEM) climate model into a final gridded precipitation analysis. Precipitation is important in many fields and applications, including agricultural water management projects, flood control programs, and hydroelectric power generation planning. Precipitation is a key input to hydrological models, and there is a desire to have access to the best available information about precipitation in time and space. The principal goal of CaPA is to produce this type of information. In order to perform the necessary statistical interpolation, CaPA requires the estimation of a semi-variogram. This semi-variogram is used to describe the spatial correlations between precipitation innovations, defined as the observed precipitation amounts minus the GEM forecasted amounts predicted at the observation locations. Currently, CaPA uses a single isotropic variogram across the entire analysis domain. The present project investigates the implications of this choice by first conducting a basic variographic analysis of precipitation innovation data across the Canadian prairies, with specific interest in identifying and quantifying potential anisotropy within the domain. This focus is further expanded by identifying the effect of storm type on the variogram. The ultimate goal of the variographic analysis is to develop improved semi-variograms for CaPA that better capture the spatial complexities of precipitation over the Canadian prairies. CaPA presently applies a Box-Cox data transformation to both the observations and the GEM data, prior to the calculation of the innovations. The data transformation is necessary to satisfy the normal distribution assumption, but introduces a significant bias. The second part of the investigation aims at devising a bias
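The Box-Cox step mentioned above can be illustrated with `scipy.stats.boxcox`; the data below are synthetic lognormal amounts, not CaPA innovations:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical strictly positive, right-skewed precipitation amounts,
# standing in for the observation/GEM inputs to the innovation calculation.
precip = rng.lognormal(mean=0.0, sigma=1.0, size=1000)

# boxcox returns the transformed data and the fitted exponent lambda.
transformed, lam = stats.boxcox(precip)

skew_before = stats.skew(precip)
skew_after = stats.skew(transformed)   # near zero: approximately normal
```

The transformation makes the normality assumption behind the interpolation defensible, but back-transforming the analysis reintroduces the bias the record refers to.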

  6. Community Post-Tornado Support Groups: Intervention and Evaluation.

    Science.gov (United States)

    McCammon, Susan; And Others

    Post-tornado support groups were organized by the Greene County, North Carolina disaster coordinators and the Pitt County outreach workers from the Community Mental Health Center sponsored tornado follow-up project. The most significant intervention used was the emphasis on creating a climate of group support by establishing a forum for…

  7. Support Provided to the External Tank (ET) Project on the Use of Statistical Analysis for ET Certification Consultation Position Paper

    Science.gov (United States)

    Null, Cynthia H.

    2009-01-01

In June 2004, the Space Flight Leadership Council (SFLC) jointly assigned an action to the NASA Engineering and Safety Center (NESC) and the External Tank (ET) Project to characterize the available dataset [of defect sizes from dissections of foam], identify resultant limitations to the statistical treatment of ET as-built foam as part of the overall thermal protection system (TPS) certification, and report to the Program Requirements Change Board (PRCB) and SFLC in September 2004. The NESC statistics team was formed to assist the ET statistics group in August 2004. The NESC's conclusions are presented in this report.

  8. A comparative empirical analysis of statistical models for evaluating highway segment crash frequency

    Directory of Open Access Journals (Sweden)

    Bismark R.D.K. Agbelie

    2016-08-01

The present study conducted an empirical highway segment crash frequency analysis on the basis of fixed-parameters negative binomial and random-parameters negative binomial models. Using 4 years of data from a total of 158 highway segments, with a total of 11,168 crashes, the results from both models are presented, discussed, and compared. About 58% of the selected variables produced normally distributed parameters across highway segments, while the remainder produced fixed parameters. The presence of a noise barrier along a highway segment would increase mean annual crash frequency by 0.492 for 88.21% of the highway segments, and would decrease crash frequency for the remaining 11.79%. Similarly, the number of vertical curves per mile along a segment would increase mean annual crash frequency by 0.006 for 84.13% of the highway segments, and would decrease crash frequency for the remaining 15.87%. Thus, constraining the parameters to be fixed across all highway segments would lead to inaccurate conclusions. Although the estimated parameters from both models were consistent in direction, their magnitudes were significantly different. Of the two models, the random-parameters negative binomial model was found to be statistically superior for evaluating highway segment crashes compared with the fixed-parameters negative binomial model. On average, the marginal effects from the fixed-parameters negative binomial model were significantly overestimated compared with those from the random-parameters negative binomial model.
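Why constraining a parameter to be fixed can bias mean effects follows from Jensen's inequality under the log link used in count models; a toy simulation with hypothetical coefficients (not the study's estimates) makes the point:

```python
import numpy as np

rng = np.random.default_rng(3)

# Crash frequency model with log link: lambda_i = exp(b0 + b_i * x).
# Fixed-parameters model:  b_i = mu for every segment.
# Random-parameters model: b_i ~ Normal(mu, sigma) across segments.
mu, sigma, b0, x = 0.2, 0.3, -1.0, 2.0          # illustrative values only

lam_fixed = np.exp(b0 + mu * x)                  # plug in the mean slope

b = rng.normal(mu, sigma, size=100_000)          # segment-level heterogeneity
lam_random = np.exp(b0 + b * x).mean()           # average over segments
```

Because exp is convex, averaging the exponentiated random slopes always exceeds exponentiating the average slope, so a fixed-parameters model mis-states mean crash frequency whenever real heterogeneity across segments exists.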

  9. Pulmonary complications of liver transplantation: radiological appearance and statistical evaluation of risk factors in 300 cases

    International Nuclear Information System (INIS)

    Golfieri, R.; Giampalma, E.; D'Arienzo, P.; Maffei, M.; Muzzi, C.; Tancioni, S.; Gavelli, G.; Morselli Labate, A.M.; Sama, C.; Jovine, E.; Grazi, G.L.; Mazziotti, A.; Cavallari, A.

    2000-01-01

The aim of this study was to evaluate the incidence, radiographic appearance, time of onset, outcome and risk factors of non-infectious and infectious pulmonary complications following liver transplantation. Chest X-ray features of 300 consecutive patients who had undergone 333 liver transplants over an 11-year period were analysed: the type of pulmonary complication, the infecting pathogens and the mean time of their occurrence are described. The main risk factors for lung infections were quantified through univariate and multivariate statistical analysis. Non-infectious pulmonary abnormalities (atelectasis and/or pleural effusion: 86.7%) and pulmonary oedema (44.7%) appeared during the first postoperative week. Infectious pneumonia was observed in 13.7% of patients, with a mortality of 36.6%. Bacterial and viral pneumonia made up the bulk of infections (63.4% and 29.3%, respectively), followed by fungal infiltrates (24.4%). A fairly good correlation between chest X-ray pattern, time of onset and the cultured microorganisms was observed in all cases. In multivariate analysis, persistent non-infectious abnormalities and pulmonary oedema were identified as the major independent predictors of post-transplant pneumonia, followed by prolonged assisted mechanical ventilation and traditional caval anastomosis. A "pneumonia-risk score" was calculated, separating a low-risk from a high-risk population (score threshold 3.30). The "pneumonia-risk score" identifies a specific group of patients in whom closer radiographic monitoring is recommended. In addition, a highly significant correlation (p<0.001) was observed between the pneumonia-risk score and expected survival, thus confirming pulmonary infections as a major cause of death in OLT recipients. (orig.)

  10. Evaluation of statistical models for forecast errors from the HBV model

    Science.gov (United States)

    Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur

    2010-04-01

Three statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) median values of the forecast distribution and the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order auto-regressive model was constructed for the forecast errors; the parameters were conditioned on weather classes. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order auto-regressive model was constructed for the forecast errors. In the third model, positive and negative errors were modeled separately; the errors were first NQT-transformed before conditioning the mean error values on climate, forecasted inflow and yesterday's error. To test the three models we applied three criteria: we wanted (a) the forecast distribution to be reliable; (b) the forecast intervals to be narrow; (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe efficiency (Reff) increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals; their main drawback was that their distributions are less reliable than Model 3's. For Model 3 the median values did not fit well since the auto-correlation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the other two models. At the same time, Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
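The first-order auto-regressive error structure shared by Models 1 and 2 can be sketched as follows (synthetic errors with an assumed coefficient of 0.6; the Box-Cox/NQT transformation steps are omitted):

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate transformed forecast errors following an AR(1) process,
# e_t = phi * e_{t-1} + w_t, as in Models 1 and 2 of the record.
phi_true, n = 0.6, 3000
e = np.zeros(n)
for t in range(1, n):
    e[t] = phi_true * e[t - 1] + rng.normal()

# Least-squares estimate of phi from the lag-1 regression.
phi_hat = (e[:-1] @ e[1:]) / (e[:-1] @ e[:-1])
```

In operation, the fitted `phi_hat` propagates yesterday's observed error into today's forecast correction, which is what shifts the forecast median toward the observations.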

  11. T2* Mapping Provides Information That Is Statistically Comparable to an Arthroscopic Evaluation of Acetabular Cartilage.

    Science.gov (United States)

    Morgan, Patrick; Nissi, Mikko J; Hughes, John; Mortazavi, Shabnam; Ellerman, Jutta

    2017-07-01

    Objectives The purpose of this study was to validate T2* mapping as an objective, noninvasive method for the prediction of acetabular cartilage damage. Methods This is the second step in the validation of T2*. In a previous study, we established a quantitative predictive model for identifying and grading acetabular cartilage damage. In this study, the model was applied to a second cohort of 27 consecutive hips to validate the model. A clinical 3.0-T imaging protocol with T2* mapping was used. Acetabular regions of interest (ROI) were identified on magnetic resonance and graded using the previously established model. Each ROI was then graded in a blinded fashion by arthroscopy. Accurate surgical location of ROIs was facilitated with a 2-dimensional map projection of the acetabulum. A total of 459 ROIs were studied. Results When T2* mapping and arthroscopic assessment were compared, 82% of ROIs were within 1 Beck group (of a total 6 possible) and 32% of ROIs were classified identically. Disease prediction based on receiver operating characteristic curve analysis demonstrated a sensitivity of 0.713 and a specificity of 0.804. Model stability evaluation required no significant changes to the predictive model produced in the initial study. Conclusions These results validate that T2* mapping provides statistically comparable information regarding acetabular cartilage when compared to arthroscopy. In contrast to arthroscopy, T2* mapping is quantitative, noninvasive, and can be used in follow-up. Unlike research quantitative magnetic resonance protocols, T2* takes little time and does not require a contrast agent. This may facilitate its use in the clinical sphere.
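The sensitivity and specificity reported above come straight from a 2x2 confusion table; the counts below are hypothetical, chosen only to roughly reproduce the reported 0.713/0.804 operating point:

```python
# Sensitivity and specificity from a 2x2 confusion table, as used to
# summarize the ROC analysis (counts are illustrative, not the paper's).
tp, fn = 57, 23    # diseased ROIs: correctly / incorrectly classified
tn, fp = 305, 74   # healthy  ROIs: correctly / incorrectly classified

sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
```

Sweeping the T2*-based decision threshold and recomputing these two rates at each point is what traces out the ROC curve used for disease prediction.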

  12. Statistical evaluation of the impact of shale gas activities on ozone pollution in North Texas.

    Science.gov (United States)

    Ahmadi, Mahdi; John, Kuruvilla

    2015-12-01

Over the past decade, substantial growth in shale gas exploration and production across the US has changed the country's energy outlook. Beyond its economic benefits, the negative impacts of shale gas development on air and water are less well known. In this study the relationship between shale gas activities and ground-level ozone pollution was statistically evaluated. The Dallas-Fort Worth (DFW) area in north-central Texas was selected as the study region. The Barnett Shale, which is one of the most productive and fastest growing shale gas fields in the US, is located in the western half of DFW. Hourly meteorological and ozone data were acquired for fourteen years from monitoring stations established and operated by the Texas Commission on Environmental Quality (TCEQ). The area was divided into two regions, the shale gas region (SGR) and the non-shale gas region (NSGR), according to the number of gas wells in close proximity to each monitoring site. The study period was also divided into 2000-2006 and 2007-2013 because the western half of DFW has experienced significant growth in shale gas activities since 2007. An evaluation of the raw ozone data showed that, while the overall trend in ozone concentration was downward over the entire region, the monitoring sites in the NSGR showed an additional 4% reduction in the annual number of ozone exceedance days compared with those in the SGR. Directional analysis of ozone showed that winds blowing from areas with high shale gas activities contributed to higher ozone downwind. The KZ-filtering method and linear regression techniques were used to remove the effects of meteorological variations on ozone and to construct long-term and short-term meteorologically adjusted (M.A.) ozone time series. The mean value of all M.A. ozone components was 8% higher at the sites located within the SGR than in the NSGR. These findings may be useful for understanding the overall impact of shale gas activities on the local and regional ozone
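The KZ (Kolmogorov-Zurbenko) filter used for the meteorological adjustment is simply an iterated moving average; a minimal sketch on synthetic ozone data (the window and iteration counts are illustrative, not the study's settings):

```python
import numpy as np

def kz_filter(x, window, iterations):
    """Kolmogorov-Zurbenko filter: an iterated centered moving average."""
    kernel = np.ones(window) / window
    for _ in range(iterations):
        x = np.convolve(x, kernel, mode="same")
    return x

rng = np.random.default_rng(5)
t = np.arange(1000)
# Synthetic daily ozone: baseline + seasonal cycle + weather-driven noise.
ozone = 50 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 5, size=1000)

baseline = kz_filter(ozone, window=15, iterations=5)  # long-term + seasonal
short_term = ozone - baseline                          # weather-driven part
```

Subtracting the smooth baseline isolates the short-term component, which the regression step then relates to meteorological variables to produce the meteorologically adjusted series.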

  13. PROCEDURE FOR THE EVALUATION OF MEASURED DATA IN TERMS OF VIBRATION DIAGNOSTICS BY APPLICATION OF A MULTIDIMENSIONAL STATISTICAL MODEL

    Directory of Open Access Journals (Sweden)

    Tomas TOMKO

    2016-06-01

The evaluation of measured data in terms of vibration diagnostics is a problematic process, and its complexity is compounded by the fact that it involves a large amount of disparate measurement data. One of the most effective analytical approaches when dealing with large amounts of data is to use multidimensional statistical methods, which can provide a picture of the current condition of the machinery. The more methods that are used, the more precise the statistical analysis of the measurement data, making it possible to obtain a better picture of the current condition of the machinery.

  14. Evaluation and application of summary statistic imputation to discover new height-associated loci.

    Science.gov (United States)

    Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán

    2018-05-01

As most of the heritability of complex traits is attributed to common and low-frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discovering the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome as it requires re-imputation of the genetic data, re-running the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and its practical utility have not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01, and 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian
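The core of summary statistics imputation is the conditional mean of a multivariate normal: the z-score of an untyped variant is predicted from typed neighbours through the LD (correlation) matrix. A toy NumPy sketch with synthetic LD, not a real reference panel:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy LD structure: 4 typed SNVs and 1 untyped SNV. The untyped SNV is
# constructed as a perfect copy of typed SNV 0, so its imputed z-score
# should exactly reproduce z_typed[0].
A = rng.normal(size=(5, 200))
A[4] = A[0]                        # untyped SNV in perfect LD with SNV 0
C = np.corrcoef(A)                 # 5x5 correlation (LD) matrix
C_tt = C[:4, :4]                   # LD among the typed SNVs
C_ut = C[4, :4]                    # LD between untyped and typed SNVs

z_typed = rng.normal(size=4)
# Conditional-mean imputation: z_u = C_ut @ inv(C_tt) @ z_t
z_imputed = C_ut @ np.linalg.solve(C_tt, z_typed)
```

With realistic, imperfect LD the prediction degrades, which is precisely the error the record quantifies against full genotype imputation.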

  15. Statistical program for the data evaluation of a thermal ionization mass spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    van Raaphorst, J. G.

    1978-12-15

    A computer program has been written to statistically analyze mass spectrometer measurements. The program tests whether the difference between signal and background intensities is statistically significant, corrects for signal drift in the measured values, and calculates ratios against the main isotope from the corrected intensities. Repeated ratio value measurements are screened for outliers using the Dixon statistical test. Means of ratios and the coefficient of variation are calculated and reported. The computer program is written in Basic and is available for anyone who is interested.
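The Dixon outlier screen mentioned above can be sketched in a few lines of Python; the critical values are the commonly tabulated 95%-confidence Q values for small samples, and the replicate data are invented for illustration:

```python
# Dixon's Q test for a single suspect value in a small sample.
# Commonly tabulated 95%-confidence critical values for n = 3..7.
Q_CRIT_95 = {3: 0.970, 4: 0.829, 5: 0.710, 6: 0.625, 7: 0.568}

def dixon_q(values):
    """Return (Q, is_outlier) for the most extreme value in the sample."""
    x = sorted(values)
    n = len(x)
    spread = x[-1] - x[0]
    gap = max(x[1] - x[0], x[-1] - x[-2])   # gap at the suspect end
    q = gap / spread
    return q, q > Q_CRIT_95[n]

# Hypothetical isotope-ratio replicates with one suspiciously high value.
q, outlier = dixon_q([0.112, 0.118, 0.119, 0.121, 0.205])
```

A value flagged as an outlier would be excluded before the means of the ratios and the coefficient of variation are computed and reported.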

  16. A decision support system-based procedure for evaluation and ...

    Indian Academy of Sciences (India)

with an overview of the web-based Decision Support System (DSS) developed to facilitate its wide adoption. … contributes significant catchment management and water supply functions … experience in engagement and facilitation methods.

  17. Improving alignment in Tract-based spatial statistics: evaluation and optimization of image registration

    NARCIS (Netherlands)

    de Groot, Marius; Vernooij, Meike W.; Klein, Stefan; Ikram, M. Arfan; Vos, Frans M.; Smith, Stephen M.; Niessen, Wiro J.; Andersson, Jesper L. R.

    2013-01-01

    Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS

  18. Development of a statistical shape model of multi-organ and its performance evaluation

    International Nuclear Information System (INIS)

    Nakada, Misaki; Shimizu, Akinobu; Kobatake, Hidefumi; Nawano, Shigeru

    2010-01-01

    Existing statistical shape modeling methods for a single organ cannot take into account the correlation between neighboring organs. This study focuses on a level set distribution model and proposes two modeling methods for multiple organs that can take this correlation into account. The first method combines the level set functions of multiple organs into a vector, then analyses the distribution of these vectors over a training dataset by principal component analysis and builds a statistical shape model of the multiple organs. The second method constructs a statistical shape model for each organ independently and assembles the component scores of the different organs in a training dataset into a vector; it then analyses the distribution of these vectors to build a statistical shape model of the multiple organs. This paper shows the results of applying the proposed methods, trained on 15 abdominal CT volumes, to 8 unknown CT volumes. (author)

  19. Improving alignment in Tract-based spatial statistics : Evaluation and optimization of image registration

    NARCIS (Netherlands)

    De Groot, M.; Vernooij, M.W.; Klein, S.; Arfan Ikram, M.; Vos, F.M.; Smith, S.M.; Niessen, W.J.; Andersson, J.L.R.

    2013-01-01

    Anatomical alignment in neuroimaging studies is of such importance that considerable effort is put into improving the registration used to establish spatial correspondence. Tract-based spatial statistics (TBSS) is a popular method for comparing diffusion characteristics across subjects. TBSS

  20. A statistical method for evaluation of the experimental phase equilibrium data of simple clathrate hydrates

    DEFF Research Database (Denmark)

    Eslamimanesh, Ali; Gharagheizi, Farhad; Mohammadi, Amir H.

    2012-01-01

    We herein present a statistical method for diagnosing outliers in the phase equilibrium data (dissociation data) of simple clathrate hydrates. The applied algorithm is based on the Leverage mathematical approach, in which the statistical Hat matrix, the Williams plot, and the residuals of the model are used. ... A correlation in exponential form is used to represent/predict the hydrate dissociation pressures for three-phase equilibrium conditions (liquid water/ice-vapor-hydrate). The investigated hydrate formers are methane, ethane, propane, carbon dioxide, nitrogen, and hydrogen sulfide. It is interpreted from the obtained results...
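
The Leverage diagnostics described above can be sketched generically as follows (a minimal illustration, not the paper's actual correlation or hydrate data; the warning leverage limit 3p/n and the ±3 band on standardized residuals are the conventions usually paired with Williams plots):

```python
import numpy as np

def leverage_diagnostics(X, y):
    """Hat-matrix leverages and standardized residuals for a linear fit.

    Points with leverage above 3p/n or |standardized residual| > 3
    are flagged as suspect, as on a Williams plot.
    """
    X1 = np.column_stack([np.ones(len(X)), X])      # add intercept column
    H = X1 @ np.linalg.pinv(X1.T @ X1) @ X1.T        # Hat matrix
    h = np.diag(H)                                   # leverages
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]
    resid = y - X1 @ beta
    r_std = resid / resid.std(ddof=X1.shape[1])      # standardized residuals
    h_star = 3 * X1.shape[1] / len(y)                # warning leverage limit
    suspect = (h > h_star) | (np.abs(r_std) > 3)
    return h, r_std, h_star, suspect
```

Points inside the leverage limit with small residuals form the "applicability domain" of the correlation; everything else is a candidate outlier.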

  1. Evaluation of significantly modified water bodies in Vojvodina by using multivariate statistical techniques

    Directory of Open Access Journals (Sweden)

    Vujović Svetlana R.

    2013-01-01

    Full Text Available This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to obtaining better information about water quality and the design of a monitoring network for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and the interpretation of a water quality data set of natural water bodies obtained during the 2010 monitoring year, covering 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to the physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, summing to more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding the data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounts for 28% of the total variance and represents the hydrochemical dimension of water quality. The second factor, F2, accounts for 18% of the total variance and may be taken as a factor of water eutrophication. The third factor, F3, accounts for 17% of the total variance and represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounts for 13% of the total variance and may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
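
The factor-retention logic in the abstract (principal components of the correlation matrix, keeping components with eigenvalue > 1, the Kaiser criterion) can be sketched as follows; the data here are synthetic, not the Vojvodina monitoring set:

```python
import numpy as np

def retain_factors(data):
    """PCA on the correlation matrix; keep components with eigenvalue > 1."""
    Z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)
    R = np.corrcoef(Z, rowvar=False)
    vals, vecs = np.linalg.eigh(R)              # eigh returns ascending order
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    k = int((vals > 1.0).sum())                 # Kaiser criterion
    loadings = vecs[:, :k] * np.sqrt(vals[:k])  # factor loadings
    explained = vals[:k] / vals.sum()           # share of total variance
    return k, loadings, explained
```

Loadings with absolute value above 0.75 would then be labelled "strong" and those between 0.50 and 0.75 "moderate", as in the paper.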

  2. An Evaluation of the WSSC (Weapon System Support Cost) Cost Allocation Algorithms. II. Installation Support.

    Science.gov (United States)

    1983-06-01

    S XX30XX, or XX37XX is found. As a result, the following two host-financed tenant support accounts currently will be treated as unit operations costs ... Horngren, Cost Accounting: A Managerial Emphasis, Prentice-Hall Inc., Englewood Cliffs, NJ, 1972. ... D. B. Levine and J. M. Jondrow, "The ...

  3. Evaluating clinical ethics support in mental healthcare: a systematic literature review.

    NARCIS (Netherlands)

    Hem, M.H.; Pedersen, R.; Norvoll, R.; Molewijk, A.C.

    2015-01-01

    A systematic literature review on evaluation of clinical ethics support services in mental healthcare is presented and discussed. The focus was on (a) forms of clinical ethics support services, (b) evaluation of clinical ethics support services, (c) contexts and participants and (d) results. Five

  4. Using assemblage data in ecological indicators: A comparison and evaluation of commonly available statistical tools

    Science.gov (United States)

    Smith, Joseph M.; Mather, Martha E.

    2012-01-01

    Ecological indicators are science-based tools used to assess how human activities have impacted environmental resources. For monitoring and environmental assessment, existing species assemblage data can be used to make these comparisons through time or across sites. An impediment to using assemblage data, however, is that these data are complex and need to be simplified in an ecologically meaningful way. Because multivariate statistics are mathematical relationships, statistical groupings may not make ecological sense and will not have utility as indicators. Our goal was to define a process to select defensible and ecologically interpretable statistical simplifications of assemblage data in which researchers and managers can have confidence. For this, we chose a suite of statistical methods, compared the groupings that resulted from these analyses, identified convergence among groupings, and then interpreted the groupings using species and ecological guilds. When we tested this approach using a statewide stream fish dataset, not all statistical methods worked equally well. For our dataset, logistic regression (Log), detrended correspondence analysis (DCA), cluster analysis (CL), and non-metric multidimensional scaling (NMDS) provided consistent, simplified output. Specifically, the Log, DCA, CL-1, and NMDS-1 groupings were ≥60% similar to each other, overlapped with the fluvial-specialist ecological guild, and contained a common subset of species. Groupings based on number of species (e.g., Log, DCA, CL and NMDS) outperformed groupings based on abundance [e.g., principal components analysis (PCA) and Poisson regression]. Although the specific methods that worked on our test dataset have generality, here we are advocating a process (e.g., identifying convergent groupings with redundant species composition that are ecologically interpretable) rather than the automatic use of any single statistical tool. We summarize this process in step-by-step guidance for the
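
The convergence check described above (groupings from different methods being ≥60% similar and sharing a common subset of species) can be sketched with a simple Jaccard similarity over species groups; the species names and method labels below are hypothetical, not the study's data:

```python
from itertools import combinations

def jaccard(a, b):
    """Similarity of two species groups: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def convergent_groupings(groupings, threshold=0.60):
    """Return pairs of methods whose groupings meet the similarity threshold."""
    return [(m1, m2)
            for (m1, g1), (m2, g2) in combinations(groupings.items(), 2)
            if jaccard(g1, g2) >= threshold]
```

Convergent pairs whose shared species also map onto a recognizable ecological guild (here, fluvial specialists) are the groupings worth carrying forward as indicators.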

  5. Batch-to-Batch Quality Consistency Evaluation of Botanical Drug Products Using Multivariate Statistical Analysis of the Chromatographic Fingerprint

    OpenAIRE

    Xiong, Haoshu; Yu, Lawrence X.; Qu, Haibin

    2013-01-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. The rational evaluation and control of product quality consistency are essential to ensure the efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has shown its efficacy and applicability in the quality evaluation of many ...

  6. A framework for evaluating innovative statistical and risk assessment tools to solve environment restoration problems

    International Nuclear Information System (INIS)

    Hassig, N.L.; Gilbert, R.O.; Pulsipher, B.A.

    1991-09-01

    Environmental restoration activities at the US Department of Energy (DOE) Hanford site face complex issues due to a history of varied contaminant disposal practices. Data collection and analysis required for site characterization, pathway modeling, and remediation selection decisions must deal with inherent uncertainties and unique problems associated with the restoration. A framework for working through the statistical aspects of the site characterization and remediation selection problems is needed. This framework would facilitate the selection of appropriate statistical tools for solving unique aspects of the environmental restoration problem. This paper presents a framework for selecting appropriate statistical and risk assessment methods. The following points will be made: (1) pathway modelers and risk assessors often recognize that "some type" of statistical methods are required but do not work with statisticians on tool development in the early planning phases of the project; (2) statistical tool selection and development are problem-specific and often site-specific, further indicating a need for up-front involvement of statisticians; and (3) the right tool, applied in the right way, can minimize sampling costs, extract as much information as possible from the data that do exist, provide consistency and defensibility for the results, and give structure and quantitative measures to decision risks and uncertainties

  7. Evaluating automatically parallelized versions of the support vector machine

    NARCIS (Netherlands)

    Codreanu, V.; Dröge, B.; Williams, D.; Yasar, B.; Yang, P.; Liu, B.; Dong, F.; Surinta, O.; Schomaker, L.R.B.; Roerdink, J.B.T.M.; Wiering, M.A.

    2016-01-01

    The support vector machine (SVM) is a supervised learning algorithm used for recognizing patterns in data. It is a very popular technique in machine learning and has been successfully used in applications such as image classification, protein classification, and handwriting recognition. However, the

  8. Biological evaluation of mechanical circulatory support systems in calves

    NARCIS (Netherlands)

    Rakhorst, G; VanDerMeer, J; Kik, C; Mihaylov, D; Havlik, P; Trinkl, J; Monties

    Data from animal experiments with mechanical circulatory support systems (MCSS) performed in Groningen and Marseille over the past years were used to obtain normal values of hematological, coagulation, rheological and blood chemistry parameters in calves. These parameters were divided between two

  9. Evaluation of a Reproductive Health Program to Support Married ...

    African Journals Online (AJOL)

    ... self-esteem, reproductive health and gender through girls' groups. The husbands' program focused on non-violence, support to families, and reproductive health. Population-based surveys were undertaken among married girls, at midterm and end line. Outcomes of interest were husbands' assistance with domestic work, ...

  10. Evaluating automatically parallelized versions of the support vector machine

    NARCIS (Netherlands)

    Codreanu, Valeriu; Droge, Bob; Williams, David; Yasar, Burhan; Yang, Fo; Liu, Baoquan; Dong, Feng; Surinta, Olarik; Schomaker, Lambertus; Roerdink, Jos; Wiering, Marco

    2014-01-01

    The support vector machine (SVM) is a supervised learning algorithm used for recognizing patterns in data. It is a very popular technique in machine learning and has been successfully used in applications such as image classification, protein classification, and handwriting recognition. However, the

  11. Evaluation of Rock Bolt Support for Polish Hard Rock Mines

    Science.gov (United States)

    Skrzypkowski, Krzysztof

    2018-03-01

    The article presents the different types of rock bolt support used in Polish ore mining. Individual point resin and expansion rock bolt supports are characterized, and the roof classes for zinc-lead and copper ore mines are presented. Furthermore, laboratory tests of point resin rock bolt support at a geometric scale of 1:1, with a minimal fixing length of 0.6 m, were carried out. Static testing of point resin rock bolt support was performed on a laboratory test facility of the Department of Underground Mining that simulates mine conditions for Polish ore and hard coal mining. The laboratory tests of point resin bolts were carried out especially for the ZGH Bolesław zinc and lead mine "Olkusz - Pomorzany". The primary aim of the research was to check whether, at an anchoring length of 0.6 m using one and a half resin cartridges, the "Olkusz - 20A" type bolt is able to carry the load. The second purpose of the study was to obtain the load-displacement characteristic and determine the elastic and plastic range of the bolt. For the best simulation of mine conditions, steel cylinders with an external diameter of 0.1 m and a length of 0.6 m, with a core of rock from the roof of the underground excavations, were used at the test station.

  12. Decision support for natural resource management; models and evaluation methods

    NARCIS (Netherlands)

    Wessels, J.; Makowski, M.; Nakayama, H.

    2001-01-01

    When managing natural resources or agrobusinesses, one always has to deal with autonomous processes. These autonomous processes play a core role in designing model-based decision support systems. This chapter tries to give insight into the question of which types of models might be used in which

  13. A multicriteria decision support methodology for evaluating airport expansion plans

    NARCIS (Netherlands)

    Vreeker, R.; Nijkamp, P.; ter Welle, C.

    2001-01-01

    Rational decision-making requires an assessment of advantages and disadvantages of choice possibilities, including non-market effects (such as externalities). This also applies to strategic decision-making in the transport sector (including aviation). In the past decades various decision support and

  14. Physical activity support community togetheractive - architecture, implementation and evaluation

    NARCIS (Netherlands)

    Elloumi, Lamia; van Beijnum, Bernhard J.F.; Hermens, Hermanus J.

    Reducing sedentary lifestyles and physical inactivity is receiving increased attention from researchers and health organizations due to its significant health benefits. In the same direction, we propose a virtual community system, TogetherActive, which supports people in their daily physical

  15. Designing, Modeling and Evaluating Influence Strategiesfor Behavior Change Support Systems

    NARCIS (Netherlands)

    Öörni, Anssi; Kelders, Saskia Marion; van Gemert-Pijnen, Julia E.W.C.; Oinas-Kukkonen, Harri

    2014-01-01

    Behavior change support systems (BCSS) research is an evolving area. While the systems have been demonstrated to be effective, much work remains to better understand the influence mechanisms of behavior change and to work out their influence on the systems' architecture. The

  16. A Decision Support Framework for Evaluation of Engineered Nanomaterials

    Science.gov (United States)

    Engineered nanomaterials (ENM) are currently being developed and applied at rates that far exceed our ability to evaluate their potential for environmental or human health risks. The gap between material development and capacity for assessment grows wider every day. Transforma...

  17. FHWA operations support : port peak pricing program evaluation

    Science.gov (United States)

    2009-01-01

    This report evaluates the applicability, Federal policy implications, and possible public and private sector roles related to peak pricing strategies at ports and intermodal facilities in the U.S. A number of ports and intermodal terminals are consid...

  18. Halon Flightline Extinguisher Evaluation: Data Supporting Standard Development

    National Research Council Canada - National Science Library

    Dierdorf, Dougls S; Kiel, Jennifer C

    2005-01-01

    .... An F-100 engine nacelle mockup was used to evaluate the full extinguishment times and amount of extinguishing agent used on a series of twenty aft engine and pool fires of 100-ft2 and ten access panel fires...

  19. To the problem of the statistical basis of evaluation of the mechanical safety factor

    International Nuclear Information System (INIS)

    Tsyganov, S.V.

    2009-01-01

    The methodology applied for the safety factor assessment of WWER fuel cycles uses the methods and terms of statistics. The value of the factor is calculated on the basis of an estimate of the probability of meeting predefined limits. Such an approach demands special attention to the statistical properties of the parameters of interest. Considering the mechanical constituents of the engineering factor, it is assumed that the uncertainty factors of the safety parameters are stochastic values, characterized by probabilistic distributions that may be unknown. Traditionally, in the safety factor assessment process the unknown parameters are estimated from a conservative point of view. This paper analyses how important the refinement of the factors' distribution parameters is for the assessment of the mechanical safety factor. For the analysis, a statistical approach is applied to model different types of factor probabilistic distributions. A significant influence of the shape and parameters of the distributions for some factors on the value of the mechanical safety factor is shown. (Authors)

  20. The System of Indicators for the Statistical Evaluation of Market Conjuncture

    Directory of Open Access Journals (Sweden)

    Chernenko Daryna I.

    2017-04-01

    Full Text Available The article aims to systematize and improve the system of statistical indicators for the market of laboratory health services (LHS) and to develop methods for their calculation. In forming the system of statistical indicators for the LHS market, the allocation of nine blocks is proposed: market size; proportionality of the market; market demand; market supply; level and dynamics of prices; variation of the LHS; dynamics, development trends, and cycles of the market; market structure; and level of competition and monopolization. The proposed system of statistical indicators, together with the methods for their calculation, should make it possible to study the trends and regularities in the formation of the market for laboratory health services in Ukraine.

  1. To the problem of the statistical basis of evaluation of the mechanical safety factor

    International Nuclear Information System (INIS)

    Tsyganov, S.

    2009-01-01

    The methodology applied for the safety factor assessment of VVER fuel cycles uses the methods and terms of statistics. The value of the factor is calculated on the basis of an estimate of the probability of meeting predefined limits. Such an approach demands special attention to the statistical properties of the parameters of interest. Considering the mechanical constituents of the engineering factor, it is assumed that the uncertainty factors of the safety parameters are stochastic values, characterized by probabilistic distributions that may be unknown. Traditionally, in the safety factor assessment process the unknown parameters are estimated from a conservative point of view. This paper analyses how important the refinement of the factors' distribution parameters is for the assessment of the mechanical safety factor. For the analysis, a statistical approach is applied to model different types of factor probabilistic distributions. A significant influence of the shape and parameters of the distributions for some factors on the value of the mechanical safety factor is shown. (author)

  2. Use of an on-line "test"-type evaluation to support face-to-face teaching: a quantitative approach to its contribution to the teaching of multivariate statistical tools

    Directory of Open Access Journals (Sweden)

    Erica Ferreira Marques

    2011-07-01

    This work aims to show the important role an online assessment test can play as a tool developed to support the face-to-face (presential) teaching of multivariate statistical resources to Business Management undergraduate students at FEARP/USP enrolled in Applied Statistics to Business Management II. This study is part of a project named LaVie, a virtual teaching-learning environment for Statistics, applied and developed to support presential teaching in this field. Based on the importance of an assessment tool, LaVie created content, interaction and "test your knowledge" tools. This assessment tool was developed based on online quizzes with three adaptation levels, basic (I), intermediate (II), and advanced (III), for each module of the discipline. The methodology used for checking the efficiency of this online test tool was based on a quantitative assessment of the students' (users') opinions. Four assumptions were investigated in this study. Data were collected on two distinct occasions: the second semester of 2005, as a pilot project, and the second semester of 2006, thus enabling a comparative analysis of the system by the users. The survey was conducted in class, where students completed two questionnaires, one before the presential assessment and the other immediately after it. The study shows the importance of this tool as a support to presential teaching and its contribution to the teaching-learning process.

  3. Statistical evaluation of a project to estimate fish trajectories through the intakes of Kaplan hydropower turbines

    Science.gov (United States)

    Sutton, Virginia Kay

    This paper examines statistical issues associated with estimating the paths of juvenile salmon through the intakes of Kaplan turbines. Passive sensors (hydrophones) detecting signals from ultrasonic transmitters implanted in individual fish released into the pre-turbine region were used to obtain the information needed to estimate fish paths through the intake. The aim and location of the sensors affect the spatial region in which the transmitters can be detected, and formulas relating this region to sensor aiming directions are derived. Cramer-Rao lower bounds for the variance of estimators of fish location are used to optimize the placement of each sensor. Finally, a statistical methodology is developed for analyzing angular data collected from optimally placed sensors.

  4. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation

    Directory of Open Access Journals (Sweden)

    Robin Dyers

    2017-06-01

    Full Text Available Background. The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative to improve ICD coding quality. Objective. To compare ICD coding quality between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. Methods. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. Results. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had a 6.6 (95% confidence interval 3.5 - 16.2) adjusted odds ratio of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on the impact on ICD coding accuracy were not significant. Conclusion. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.

  5. Training and support to improve ICD coding quality: A controlled before-and-after impact evaluation.

    Science.gov (United States)

    Dyers, Robin; Ward, Grant; Du Plooy, Shane; Fourie, Stephanus; Evans, Juliet; Mahomed, Hassan

    2017-05-24

    The proposed National Health Insurance policy for South Africa (SA) requires hospitals to maintain high-quality International Statistical Classification of Diseases (ICD) codes for patient records. While considerable strides had been made to improve ICD coding coverage by digitising the discharge process in the Western Cape Province, further intervention was required to improve data quality. The aim of this controlled before-and-after study was to evaluate the impact of a clinician training and support initiative to improve ICD coding quality. To compare ICD coding quality between two central hospitals in the Western Cape before and after the implementation of a training and support initiative for clinicians at one of the sites. The difference in differences in data quality between the intervention site and the control site was calculated. Multiple logistic regression was also used to determine the odds of data quality improvement after the intervention and to adjust for potential differences between the groups. The intervention had a positive impact of 38.0% on ICD coding completeness over and above changes that occurred at the control site. Relative to the baseline, patient records at the intervention site had a 6.6 (95% confidence interval 3.5 - 16.2) adjusted odds ratio of having a complete set of ICD codes for an admission episode after the introduction of the training and support package. The findings on impact on ICD coding accuracy were not significant. There is sufficient pragmatic evidence that a training and support package will have a considerable positive impact on ICD coding completeness in the SA setting.
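
The difference-in-differences calculation the study describes is simple to state; the completeness proportions below are hypothetical, chosen only to mirror the 38-percentage-point effect reported:

```python
def difference_in_differences(i_pre, i_post, c_pre, c_post):
    """Change at the intervention site minus change at the control site."""
    return (i_post - i_pre) - (c_post - c_pre)

# Hypothetical ICD-coding completeness proportions (not the study's raw data).
effect = difference_in_differences(i_pre=0.30, i_post=0.80,
                                   c_pre=0.30, c_post=0.42)
```

Here the intervention site improves by 50 points and the control site by 12, leaving a net effect of 0.38; the study's logistic regression then adjusts this contrast for differences between the groups.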

  6. Do Open Source LMSs Support Personalization? A Comparative Evaluation

    Science.gov (United States)

    Kerkiri, Tania; Paleologou, Angela-Maria

    A number of parameters that support LMSs' capabilities for content personalization are presented and substantiated. These parameters constitute critical criteria for an exhaustive investigation of the personalization capabilities of the most popular open source LMSs. Results are comparatively shown and commented upon, thus highlighting a course of conduct for the implementation of new personalization methodologies for these LMSs, aligned with their existing infrastructure, to maintain support for the numerous educational institutions entrusting a major part of their curricula to them. Meanwhile, new capabilities arise from a more efficient description of the existing resources, especially when organized into widely available repositories, that lead to qualitatively advanced learner-oriented courses which would ideally meet the challenge of combining personification of demand and personalization of thematic content at once.

  7. Evaluation of trauma care using TRISS method: the role of adjusted misclassification rate and adjusted w-statistic

    Directory of Open Access Journals (Sweden)

    Bytyçi Cen I

    2009-01-01

    Full Text Available Abstract Background Major trauma is a leading cause of death worldwide. Evaluation of trauma care using the Trauma Injury and Injury Severity Score (TRISS) method is focused on trauma outcome (deaths and survivors). For testing the TRISS method, the TRISS misclassification rate is used. Calculating the w-statistic, as the difference between observed and TRISS-expected survivors, we compare our trauma care results with the TRISS standard. Aim The aim of this study is to analyze the interaction between the misclassification rate and the w-statistic and to adjust these parameters to be closer to the truth. Materials and methods Analysis of the components of the TRISS misclassification rate and w-statistic and actual trauma outcome. Results The false negative (FN) component (deaths unexpected by the TRISS method) has two parts: preventable (Pd) and non-preventable (nonPd) trauma deaths. Pd represents inappropriate trauma care at an institution; non-preventable trauma deaths represent errors in the TRISS method. Removing patients with preventable trauma deaths, we get an adjusted misclassification rate: (FP + FN - Pd)/N or (b + c - Pd)/N. Subtracting nonPd from the FN value in the w-statistic formula, we get an adjusted w-statistic: [FP - (FN - nonPd)]/N, that is, (FP - Pd)/N or (b - Pd)/N. Conclusion Because the adjusted formulas clean the method of inappropriate trauma care, and clean trauma care of the method's error, the TRISS adjusted misclassification rate and adjusted w-statistic give more realistic results and may be used in research on trauma outcome.

  8. Evaluation of trauma care using TRISS method: the role of adjusted misclassification rate and adjusted w-statistic.

    Science.gov (United States)

    Llullaku, Sadik S; Hyseni, Nexhmi Sh; Bytyçi, Cen I; Rexhepi, Sylejman K

    2009-01-15

    Major trauma is a leading cause of death worldwide. Evaluation of trauma care using the Trauma Injury and Injury Severity Score (TRISS) method is focused on trauma outcome (deaths and survivors). For testing the TRISS method, the TRISS misclassification rate is used. Calculating the w-statistic, as the difference between observed and TRISS-expected survivors, we compare our trauma care results with the TRISS standard. The aim of this study is to analyze the interaction between the misclassification rate and the w-statistic and to adjust these parameters to be closer to the truth. Analysis of the components of the TRISS misclassification rate and w-statistic and actual trauma outcome. The false negative (FN) component (deaths unexpected by the TRISS method) has two parts: preventable (Pd) and non-preventable (nonPd) trauma deaths. Pd represents inappropriate trauma care at an institution; non-preventable trauma deaths represent errors in the TRISS method. Removing patients with preventable trauma deaths, we get an adjusted misclassification rate: (FP + FN - Pd)/N or (b + c - Pd)/N. Subtracting nonPd from the FN value in the w-statistic formula, we get an adjusted w-statistic: [FP - (FN - nonPd)]/N, that is, (FP - Pd)/N or (b - Pd)/N. Because the adjusted formulas clean the method of inappropriate trauma care, and clean trauma care of the method's error, the TRISS adjusted misclassification rate and adjusted w-statistic give more realistic results and may be used in research on trauma outcome.
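
The adjusted formulas can be written directly in code (the counts below are hypothetical, for illustration only):

```python
def adjusted_trauma_metrics(fp, fn, pd_deaths, n):
    """Adjusted TRISS misclassification rate and w-statistic.

    fp        : false positives (TRISS-expected deaths who survived)
    fn        : false negatives (deaths unexpected by the TRISS method)
    pd_deaths : preventable deaths within fn
    n         : total number of patients
    """
    non_pd = fn - pd_deaths                        # errors of the TRISS method
    adj_misclassification = (fp + fn - pd_deaths) / n
    adj_w = (fp - (fn - non_pd)) / n               # simplifies to (fp - pd)/n
    return adj_misclassification, adj_w
```

With, say, fp = 10 and fn = 8, of which 5 deaths are judged preventable, in n = 200 patients, the adjusted misclassification rate is 0.065 and the adjusted w-statistic 0.025.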

  9. Batch-to-batch quality consistency evaluation of botanical drug products using multivariate statistical analysis of the chromatographic fingerprint.

    Science.gov (United States)

    Xiong, Haoshu; Yu, Lawrence X; Qu, Haibin

    2013-06-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. Rational evaluation and control of product quality consistency are essential to ensure efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has shown its efficacy and applicability in the quality evaluation of many kinds of industrial products. In this paper, the combined use of multivariate statistical analysis and chromatographic fingerprinting is presented to evaluate the batch-to-batch quality consistency of botanical drug products. A typical botanical drug product in China, Shenmai injection, was selected as the example to demonstrate the feasibility of this approach. High-performance liquid chromatographic fingerprint data of historical batches were collected from a traditional Chinese medicine manufacturing factory. Characteristic peaks were weighted by their variability among production batches. A principal component analysis model was established after outliers were modified or removed. Multivariate (Hotelling T² and DModX) control charts were then successfully applied to evaluate quality consistency. The results suggest useful applications for combining multivariate statistical analysis with chromatographic fingerprinting in batch-to-batch quality consistency evaluation for the manufacture of botanical drug products.
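The Hotelling T² chart on PCA scores mentioned in the abstract can be sketched as follows; the data here are simulated stand-ins for per-batch peak areas (the Shenmai fingerprint data are not reproduced), and retaining two components is an arbitrary choice for illustration.

```python
# Minimal sketch of a Hotelling T^2 statistic on PCA scores of batch data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 6))             # 20 batches x 6 characteristic peaks
Xc = X - X.mean(axis=0)                  # mean-center each peak
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                                    # retain 2 principal components
scores = Xc @ Vt[:k].T                   # batch scores on the retained PCs
lam = (s[:k] ** 2) / (len(X) - 1)        # variance captured by each PC
T2 = (scores ** 2 / lam).sum(axis=1)     # Hotelling T^2, one value per batch
print(T2.shape)  # (20,)
```

A batch whose T² exceeds a control limit (e.g. an F-distribution-based bound) would be flagged as inconsistent; computing that limit is omitted here.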

  10. Statistics to the Rescue!: Using Data to Evaluate a Manufacturing Process

    Science.gov (United States)

    Keithley, Michael G.

    2009-01-01

    The use of statistics and process controls is too often overlooked in educating students. This article describes an activity appropriate for high school students who have a background in material processing. It gives them a chance to advance their knowledge by determining whether or not a manufacturing process works well. The activity follows a…

  11. Multivariate statistical evaluation of trace elements in groundwater in a coastal area in Shenzhen, China

    International Nuclear Information System (INIS)

    Chen Kouping; Jiao, Jiu J.; Huang Jianmin; Huang Runqiu

    2007-01-01

    Multivariate statistical techniques are efficient ways to display complex relationships among many objects. An attempt was made to study trace-element data in groundwater using multivariate statistical techniques such as principal component analysis (PCA), Q-mode factor analysis and cluster analysis. The original matrix consisted of 17 trace elements determined in 55 groundwater samples collected from 27 wells located in a coastal area in Shenzhen, China. PCA results show that the trace elements V, Cr, As, Mo, W, and U, with the greatest positive loadings, typically occur as soluble oxyanions in oxidizing waters, while Mn and Co, with the greatest negative loadings, are generally more soluble in oxygen-depleted groundwater. Cluster analyses demonstrate that most groundwater samples collected from the same well in the study area during summer and winter still fall into the same group. This study also demonstrates the usefulness of multivariate statistical analysis in hydrochemical studies. - Multivariate statistical analysis was used to investigate relationships among trace elements and factors controlling trace element distribution in groundwater
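The clustering result described above (summer and winter samples from the same well grouping together) can be illustrated with a toy single-linkage sketch; the 2-D points are assumed stand-ins for standardized trace-element profiles, not the Shenzhen data.

```python
# Toy single-linkage agglomerative clustering in pure Python (illustrative only).
import math

def single_linkage(points, n_clusters):
    """Merge the two closest clusters (by minimum pairwise distance) until
    n_clusters remain; returns lists of point indices."""
    clusters = [[i] for i in range(len(points))]
    dist = lambda a, b: math.dist(points[a], points[b])
    while len(clusters) > n_clusters:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: min(dist(a, b) for a in clusters[ij[0]] for b in clusters[ij[1]]),
        )
        clusters[i] += clusters.pop(j)
    return clusters

# Two seasonal samples per "well": the nearby pairs should cluster together.
samples = [(0.1, 0.2), (0.15, 0.25), (5.0, 5.1), (5.2, 5.0)]
print(single_linkage(samples, 2))  # [[0, 1], [2, 3]]
```

Real hydrochemical work would standardize each element first and typically use a library implementation (e.g. hierarchical clustering in SciPy) rather than this O(n³) sketch.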

  12. Robust statistical methods for significance evaluation and applications in cancer driver detection and biomarker discovery

    DEFF Research Database (Denmark)

    Madsen, Tobias

    2017-01-01

    In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger m...

  13. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    Science.gov (United States)

    Porter, Kristin E.

    2016-01-01

    In education research and in many other fields, researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple…

  14. Statistical evaluation of the data obtained from the K East Basin Sandfilter Backwash Pit samples

    International Nuclear Information System (INIS)

    Welsh, T.L.

    1994-01-01

    Samples were obtained from different locations in the K East Basin Sandfilter Backwash Pit to characterize the sludge material. These samples were analyzed chemically for elements, radionuclides, and residual compounds. The analytical results were statistically analyzed to determine the mean analyte content and the associated variability for each mean value.

  15. Statistical Power in Evaluations That Investigate Effects on Multiple Outcomes: A Guide for Researchers

    Science.gov (United States)

    Porter, Kristin E.

    2018-01-01

    Researchers are often interested in testing the effectiveness of an intervention on multiple outcomes, for multiple subgroups, at multiple points in time, or across multiple treatment groups. The resulting multiplicity of statistical hypothesis tests can lead to spurious findings of effects. Multiple testing procedures (MTPs) are statistical…

  16. Evaluating effects of residential treatment for juvenile offenders by statistical metaanalysis : A review

    NARCIS (Netherlands)

    Grietens, H; Hellinckx, W

    Statistical metaanalyses on the effects of residential treatment for juvenile offenders were reviewed to examine the mean effect sizes and reductions of recidivism reported for this group. Five metaanalyses (three on North American and two on European studies) were selected and synthesized in a

  17. Evaluating Two Models of Collaborative Tests in an Online Introductory Statistics Course

    Science.gov (United States)

    Björnsdóttir, Auðbjörg; Garfield, Joan; Everson, Michelle

    2015-01-01

    This study explored the use of two different types of collaborative tests in an online introductory statistics course. A study was designed and carried out to investigate three research questions: (1) What is the difference in students' learning between using consensus and non-consensus collaborative tests in the online environment?, (2) What is…

  18. Statistical approach To understand MALDI-TOFMS matrices : discovery and evaluation of new MALDI matrices

    NARCIS (Netherlands)

    Meier, M.A.R.; Adams, N.; Schubert, U.S.

    2007-01-01

    A statistical approach is described to better understand the role of the matrix during a MALDI-TOF MS experiment. Potential matrix molecules were selected based on a rational experimental design and subsequently screened in order to investigate whether a certain compound can act as a matrix for synthetic polymers. The

  19. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin

    2014-01-01

    The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive

  20. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Science.gov (United States)

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  1. Design and statistical considerations for studies evaluating the efficacy of a single dose of the human papillomavirus (HPV) vaccine.

    Science.gov (United States)

    Sampson, Joshua N; Hildesheim, Allan; Herrero, Rolando; Gonzalez, Paula; Kreimer, Aimee R; Gail, Mitchell H

    2018-05-01

    Cervical cancer is a leading cause of cancer mortality in women worldwide. Human papillomavirus (HPV) types 16 and 18 cause about 70% of all cervical cancers. Clinical trials have demonstrated that three doses of either commercially available HPV vaccine, Cervarix® or Gardasil®, prevent most new HPV 16/18 infections and associated precancerous lesions. Based on evidence of immunological non-inferiority, 2-dose regimens have been licensed for adolescents in the United States, European Union, and elsewhere. However, if a single dose were effective, vaccine costs would be reduced substantially and the logistics of vaccination would be greatly simplified, enabling vaccination programs in developing countries. The National Cancer Institute (NCI) and the Agencia Costarricense de Investigaciones Biomédicas (ACIB) are conducting, with support from the Bill & Melinda Gates Foundation and the International Agency for Research on Cancer (IARC), a large study of 24,000 girls to evaluate the efficacy of a 1-dose regimen. The first component of the study is a four-year non-inferiority trial comparing 1- to 2-dose regimens of the two licensed vaccines. The second component is an observational study that estimates the vaccine efficacy (VE) of each regimen by comparing the HPV infection rates in the trial arms to those in a contemporaneous survey group of unvaccinated girls. In this paper, we describe the design and statistical analysis for this study. We explain the advantage of defining non-inferiority on the absolute risk scale when the expected event rate is near 0 and, given this definition, suggest an approach to account for missing clinic visits. We then describe the problem of estimating VE in the absence of a randomized placebo arm and offer our solution. Copyright © 2018. Published by Elsevier Inc.
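The design point about non-inferiority on the absolute-risk scale can be sketched as follows. The counts and the 0.02 margin are purely illustrative, and the simple Wald interval stands in for whatever exact interval the study actually specifies.

```python
# Sketch: non-inferiority of 1-dose vs 2-dose on the risk-difference scale.
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for p1 - p2 (e.g. 1-dose minus 2-dose infection risk)."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    d = p1 - p2
    return d - z * se, d + z * se

# Declare non-inferiority if the upper CI bound lies below the margin.
lo, hi = risk_difference_ci(x1=12, n1=2000, x2=10, n2=2000)
print(hi < 0.02)  # True
```

With event rates near zero, a relative-risk margin becomes unstable (small absolute changes give huge ratios), which is the motivation for working on the absolute scale.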

  2. Numerical evaluation of micro-structural parameters of porous supports in metal-supported solid oxide fuel cells

    DEFF Research Database (Denmark)

    Reiss, Georg; Frandsen, Henrik Lund; Brandstätter, Wilhelm

    2014-01-01

    Metallic supported Solid Oxide Fuel Cells (SOFCs) are considered a durable and cost-effective alternative to the state-of-the-art ceramic supported cell designs. In order to understand the mass and charge transport in the metal-support of this new type of cell, a novel technique involving X-ray tomography and micro-structural modelling is presented in this work. The simulation technique comprises a novel treatment of the boundary conditions, which leads to more accurate effective transport parameters compared to those which can be achieved with the conventional homogenisation procedures. Furthermore, the porosity distribution in the metal-support was determined, which provided information about the inhomogeneous nature of the material. In addition, transport parameters for two identified, distinct dense layers of the metal-support are evaluated separately. The results

  3. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  4. Special study for the statistical evaluation of groundwater data trends. Final report

    International Nuclear Information System (INIS)

    1993-05-01

    Analysis of trends over time in the concentrations of chemicals in groundwater at Uranium Mill Tailings Remedial Action (UMTRA) Project sites can provide valuable information for monitoring the performance of disposal cells and the effectiveness of groundwater restoration activities. Random variation in data may obscure real trends or may produce the illusion of a trend where none exists, so statistical methods are needed to reliably detect and estimate trends. Trend analysis includes both trend detection and estimation. Trend detection uses statistical hypothesis testing and provides a yes or no answer regarding the existence of a trend. Hypothesis tests try to reach a balance between false negative and false positive conclusions. To quantify the magnitude of a trend, estimation is required. This report presents the statistical concepts that are necessary for understanding trend analysis. The types of patterns most likely to occur in UMTRA data sets are emphasized. Two general approaches to analyzing data for trends are proposed and recommendations are given to assist UMTRA Project staff in selecting an appropriate method for their site data. Trend analysis is much more difficult when data contain values less than the reported laboratory detection limit. The complications that arise are explained. This report also discusses the impact of data collection procedures on statistical trend methods and offers recommendations to improve the efficiency of the methods and reduce sampling costs. Guidance for determining how many sampling rounds might be needed by statistical methods to detect trends of various magnitudes is presented. This information could be useful in planning site monitoring activities
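Trend detection as described above is a hypothesis test on a time-ordered series. The report does not commit to a specific test in this abstract; one standard nonparametric choice for such groundwater concentration series is the Mann-Kendall test, sketched here for the no-ties case.

```python
# Mann-Kendall trend test sketch (no-ties variance approximation).
import math

def mann_kendall(series):
    """Return the Mann-Kendall S statistic and its normal-approximation z-score."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18          # variance assuming no ties
    z = (s - math.copysign(1, s)) / math.sqrt(var_s) if s != 0 else 0.0
    return s, z

# Illustrative concentration series with an upward drift.
s, z = mann_kendall([1.0, 1.2, 1.1, 1.4, 1.6, 1.5, 1.9, 2.0])
print(s, round(z, 2))  # 24 2.85
```

Large |z| (e.g. > 1.96) indicates a statistically significant monotonic trend; values below a detection limit would need the censored-data handling the report discusses, which this sketch omits.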

  5. Statistically rigorous calculations do not support common input and long-term synchronization of motor-unit firings

    Science.gov (United States)

    Kline, Joshua C.

    2014-01-01

    Over the past four decades, various methods have been implemented to measure synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles—a data set one order of magnitude greater than that reported in previous studies. Only firing data, obtained from surface electromyographic signal decomposition with >95% accuracy, were used in the study. The data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values that we calculated provide an improved estimate of physiologically driven synchronization. Compared with three other commonly used techniques, ours revealed three types of discrepancies that result from failing to use sufficient statistical tests necessary to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization from 100% of motor-unit pairs studied. SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization. PMID:25210152
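The abstract's central point is that detecting synchronization requires an explicit statistical test with correction for the many latencies examined, rather than an ad hoc z-score threshold. A toy illustration of that idea (our sketch, not the SigMax algorithm, whose implementation is not given here) tests each latency bin's coincidence count against a Poisson expectation with a Bonferroni correction:

```python
# Toy rigorous-testing sketch: flag latency bins whose coincidence counts
# exceed chance, correcting for the number of bins tested.
import math

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), summed directly."""
    term = math.exp(-mu)
    cdf = 0.0
    for i in range(k):
        cdf += term
        term *= mu / (i + 1)
    return 1.0 - cdf

def significant_bins(counts, expected, alpha=0.05):
    thresh = alpha / len(counts)          # Bonferroni across all latencies
    return [i for i, c in enumerate(counts) if poisson_sf(c, expected) < thresh]

# 21 latency bins, ~10 chance coincidences expected per bin; one real peak.
counts = [10, 9, 11, 10, 8, 12, 10, 9, 11, 10, 30,
          10, 9, 11, 10, 12, 8, 10, 11, 9, 10]
print(significant_bins(counts, expected=10.0))  # [10]
```

Without the multiplicity correction, ordinary fluctuations like the 12-count bins would eventually be "detected" somewhere among the many latencies, which is the failure mode the paper attributes to the z-score method.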

  6. Statistical Analyses of Second Indoor Bio-Release Field Evaluation Study at Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-12-17

    In September 2008 a large-scale testing operation (referred to as the INL-2 test) was performed within a two-story building (PBF-632) at the Idaho National Laboratory (INL). The report “Operational Observations on the INL-2 Experiment” defines the seven objectives for this test and discusses the results and conclusions. This is further discussed in the introduction of this report. The INL-2 test consisted of five tests (events) in which a floor (level) of the building was contaminated with the harmless biological warfare agent simulant Bg and samples were taken in most, if not all, of the rooms on the contaminated floor. After the sampling, the building was decontaminated, and the next test performed. Judgmental samples and probabilistic samples were determined and taken during each test. Vacuum, wipe, and swab samples were taken within each room. The purpose of this report is to study an additional four topics that were not within the scope of the original report. These topics are: 1) assess the quantitative assumptions about the data being normally or log-normally distributed; 2) evaluate differences and quantify the sample to sample variability within a room and across the rooms; 3) perform geostatistical types of analyses to study spatial correlations; and 4) quantify the differences observed between surface types and sampling methods for each scenario and study the consistency across the scenarios. The following four paragraphs summarize the results of each of the four additional analyses. All samples after decontamination came back negative. Because of this, it was not appropriate to determine if these clearance samples were normally distributed. As Table 1 shows, the characterization data consists of values between and inclusive of 0 and 100 CFU/cm2 (100 was the value assigned when the number is too numerous to count). The 100 values are generally much bigger than the rest of the data, causing the data to be right skewed. There are also a significant
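Topic 1 above, checking whether data are plausibly normal or log-normal, can be illustrated with a minimal sketch on simulated right-skewed data (illustrative values, not the INL-2 measurements): right-skewed data often look far more symmetric after a log transform, which sample skewness makes visible.

```python
# Sketch: compare sample skewness of raw vs log-transformed values.
import math
import random

def skewness(xs):
    """Biased sample skewness (third standardized moment)."""
    n = len(xs)
    m = sum(xs) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum(((x - m) / sd) ** 3 for x in xs) / n

random.seed(1)
raw = [random.lognormvariate(2.0, 1.0) for _ in range(500)]  # right-skewed
logged = [math.log(x) for x in raw]                          # ~normal
print(skewness(raw), skewness(logged))
```

A large positive skew for the raw values with near-zero skew after the log transform supports the log-normal assumption; censored values at a "too numerous to count" ceiling, as in the INL-2 data, would need separate treatment.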

  7. Early Career Academic Staff Support: Evaluating Mentoring Networks

    Science.gov (United States)

    Thomas, J. Denard; Lunsford, Laura Gail; Rodrigues, Helena A.

    2015-01-01

    Which academics benefit from participation in formal mentoring programmes? This study examined the needs and mentoring networks of new academics with evaluative data from a pilot mentoring programme. Themes from these data point towards re-envisioning initiatives for academic staff development. First, an examination of the expansion of mentoring…

  8. Evaluation Support and Follow Up (Acacia) | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The IDRC program initiative, Acacia (communities and the information society in Africa), seeks to integrate an evaluation process within its activities and those of its partners. This project aims to ... A new website and resource library will help improve developing country registration and information systems for vital events.

  9. Texas Teacher Evaluation & Support System (T-TESS) Appraiser Handbook

    Science.gov (United States)

    Texas Education Agency, 2016

    2016-01-01

    The Texas Education Agency's (TEA) approved instrument for evaluating teachers, the Professional Development and Appraisal System (PDAS), was the primary instrument used by 86 percent of local education agencies (LEAs) in the state and has been in place since 1997. In acknowledging the vital roles teachers play in student achievement, and based on…

  10. Evaluation of Medicare Health Support chronic disease pilot program.

    Science.gov (United States)

    Cromwell, Jerry; McCall, Nancy; Burton, Joe

    2008-01-01

    The Medicare Program is conducting a randomized trial of care management services among fee-for-service (FFS) beneficiaries called the Medicare Health Support (MHS) pilot program. Eight disease management (DM) companies have contracted with CMS to improve clinical quality, increase beneficiary and provider satisfaction, and achieve targeted savings for chronically ill Medicare FFS beneficiaries. In this article, we present 6-month intervention results on beneficiary selection and participation rates, mortality rates, trends in hospitalizations, and success in achieving Medicare cost savings. Results to date indicate limited success in achieving Medicare cost savings or reducing acute care utilization.

  11. Walkability Explorer. An Evaluation and Design Support Tool for Walkability

    Directory of Open Access Journals (Sweden)

    Ivan Blečić

    2014-05-01

    Walkability Explorer is a software tool for the evaluation of urban walkability, which, we argue, is an important aspect of the quality of life in cities. Many conventional approaches to the assessment of quality of life measure the distribution, density and distances of different opportunities in space. But distance is not all there is. To reason in terms of the urban capabilities of people, we should also take into account the quality of pedestrian accessibility and of the urban opportunities offered by the city. The software tool we present in this paper is a user-friendly implementation of such an evaluation approach to walkability. It includes several GIS and analysis features, and is interoperable with other standard GIS and data-analysis tools.

  12. Statistical identifiability and convergence evaluation for nonlinear pharmacokinetic models with particle swarm optimization.

    Science.gov (United States)

    Kim, Seongho; Li, Lang

    2014-02-01

    The statistical identifiability of nonlinear pharmacokinetic (PK) models with the Michaelis-Menten (MM) kinetic equation is considered using a global optimization approach, namely particle swarm optimization (PSO). If a model is statistically non-identifiable, the conventional derivative-based estimation approach is often terminated early, without converging, due to singularity. To circumvent this difficulty, we develop a derivative-free global optimization algorithm by combining PSO with a derivative-free local optimization algorithm to improve the rate of convergence of PSO. We further propose an efficient approach to not only checking the convergence of estimation but also detecting the identifiability of nonlinear PK models. PK simulation studies demonstrate that the convergence and identifiability of the PK model can be detected efficiently through the proposed approach. The proposed approach is then applied to clinical PK data along with a two-compartmental model. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
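The global search step can be illustrated with a bare-bones PSO sketch on a toy objective. The paper couples PSO with a derivative-free local optimizer and applies it to PK likelihoods; none of that is reproduced here, and all parameters below are generic textbook choices.

```python
# Bare-bones particle swarm optimization on a toy 2-D objective.
import random

def pso(f, dim, n_particles=30, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5):
    rng = random.Random(42)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                  # per-particle best positions
    pval = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]         # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (pbest[i][d] - X[i][d])
                           + c2 * rng.random() * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            val = f(X[i])
            if val < pval[i]:
                pbest[i], pval[i] = X[i][:], val
                if val < gval:
                    gbest, gval = X[i][:], val
    return gbest, gval

best, val = pso(lambda x: sum((xi - 1.0) ** 2 for xi in x), dim=2)
print(val)
```

For a non-identifiable model the objective has a flat valley rather than a point minimum, so repeated PSO runs scatter along the valley instead of agreeing, which is the diagnostic intuition behind the paper's convergence/identifiability check.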

  13. Statistically sound evaluation of trace element depth profiles by ion beam analysis

    International Nuclear Information System (INIS)

    Schmid, K.; Toussaint, U. von

    2012-01-01

    This paper presents the underlying physics and statistical models used in the newly developed program NRADC for fully automated deconvolution of trace-level impurity depth profiles from ion beam data. The program applies Bayesian statistics to find the most probable depth profile given ion beam data measured at different energies and angles for a single sample. Limiting the analysis to %-level amounts of material allows one to linearize the forward calculation of ion beam data, which greatly improves the computation speed. This makes it possible, for the first time, to apply the maximum likelihood approach both to the fitting of the experimental data and to the determination of confidence intervals of the depth profiles for real-world applications. The different steps of the automated deconvolution are exemplified by applying the program to artificial and real experimental data.

  14. Possible uses of animal databases for further statistical evaluation and modeling

    International Nuclear Information System (INIS)

    Griffith, W.C.; Boecker, B.B.; Gerber, G.B.

    1995-01-01

    Many studies have been performed in animals which mimic potential exposures of people, in order to understand how various factors modify radiation dose-response relationships. Cooperative analyses by investigators in different laboratories have a large potential for strengthening the conclusions that can be drawn from individual studies. When information on each animal is combined, formal tests can be made to demonstrate that apparent consistencies or inconsistencies are statistically significant. Statistical methods must be carefully chosen so that differences between laboratories or studies can be controlled for, or described as part of the analysis, in the interpretation of the conclusions. In this report, bone cancer is used as an example, drawing on the large number of studies of modifying factors for bone cancer available from US and European laboratories

  15. Evaluation of artillery equipment maintenance support capability based on grey clustering

    Science.gov (United States)

    Zhai, Mei-jie; Gao, Peng

    2017-12-01

    In this paper, the theory and methods of evaluating equipment maintenance support capability in China and abroad are studied from the point of view of the combat tasks of artillery troops and the strategic importance attached to them in the future military struggle. The paper establishes a framework for the evaluation index system of the equipment maintenance support capability of artillery units, applies the grey clustering method to the evaluation of that capability, and finally evaluates the equipment maintenance and support capability of an artillery brigade as an example and analyzes the evaluation results. The paper identifies the outstanding problems in the maintenance and support of military equipment and puts forward constructive suggestions, in order to improve the status of military equipment maintenance and support and the level of future equipment maintenance.

  16. Statistical evaluation of recorded knowledge in nuclear and other instrumental analytical techniques

    International Nuclear Information System (INIS)

    Braun, T.

    1987-01-01

    The main points addressed in this study are the following: Statistical distribution patterns of published literature on instrumental analytical techniques 1981-1984; structure of scientific literature and heuristics for identifying active specialities and emerging hot spot research areas in instrumental analytical techniques; growth and growth rates of the literature in some of the identified hot research areas; quality and quantity in instrumental analytical research output. (orig.)

  17. Statistical Association Criteria in Forensic Psychiatry–A criminological evaluation of casuistry

    Science.gov (United States)

    Gheorghiu, V; Buda, O; Popescu, I; Trandafir, MS

    2011-01-01

    Purpose. Identification of potential shared primary psychoprophylaxis and crime prevention, measured by analyzing the rate of commitments for patients subject to forensic examination. Material and method. The study is a retrospective, document-based statistical analysis. The statistical lot consists of 770 initial examination reports performed and completed during 2007, primarily analyzed in order to summarize the data within the National Institute of Forensic Medicine, Bucharest, Romania (INML), with one of the group variables being ‘particularities of the psychiatric patient history’, containing the items ‘forensic onset’, ‘commitments within the last year prior to the examination’ and ‘absence of commitments within the last year prior to the examination’. The method used was the Kendall bivariate correlation. For this study, the authors separately analyze only the two items regarding commitments, by other correlation alternatives and by modern, elaborate statistical analyses, i.e. recording of the standard case study variables, Kendall bivariate correlation, cross tabulation, factor analysis and hierarchical cluster analysis. Results. The results are varied, from theoretically presumed clinical nosography (such as schizophrenia or manic depression) to non-presumed (conduct disorders) or unexpected behavioral acts, and are therefore difficult to interpret. Conclusions. The features of the batch were taken into consideration, as well as the results of the previous standard correlation of the whole statistical lot. The authors emphasize the role of medical security measures as actually applied in therapeutic management in general, and in risk and second-offence management in particular, as well as the role of forensic psychiatric examinations in detecting certain aspects related to the monitoring of mental patients. PMID:21505571

  18. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    Science.gov (United States)

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
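The generalized-Gaussian fit described above can be sketched by moment matching, a standard GGD shape estimator; this is our reconstruction for illustration, and the paper's exact fitting procedure may differ. The estimator inverts the ratio (E|x|)² / E[x²] = Γ(2/β)² / (Γ(1/β)·Γ(3/β)), which increases monotonically in the shape β.

```python
# Sketch: GGD shape parameter by moment matching, inverted with bisection.
import math
from statistics import NormalDist

def ggd_ratio(b):
    """(E|x|)^2 / E[x^2] for a zero-mean generalized Gaussian with shape b."""
    return math.gamma(2 / b) ** 2 / (math.gamma(1 / b) * math.gamma(3 / b))

def ggd_shape(xs, lo=0.1, hi=10.0):
    n = len(xs)
    rho = (sum(abs(x) for x in xs) / n) ** 2 / (sum(x * x for x in xs) / n)
    for _ in range(80):                  # ggd_ratio(b) is increasing in b
        mid = (lo + hi) / 2
        if ggd_ratio(mid) < rho:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Deterministic near-Gaussian sample (normal quantiles): the fitted shape
# should come out near 2, the Gaussian special case (Laplacian data give ~1).
xs = [NormalDist().inv_cdf((i + 0.5) / 4000) for i in range(4000)]
print(round(ggd_shape(xs), 1))
```

In the paper's setting, the fitted shape parameters of filtered-image features are compared against standards learned from undistorted MR images to score quality.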

  19. A laboratory evaluation of the influence of weighing gauges performance on extreme events statistics

    Science.gov (United States)

    Colli, Matteo; Lanza, Luca

    2014-05-01

    The effects of inaccurate ground-based rainfall measurements on the information derived from rain records are not yet well documented in the literature. La Barbera et al. (2002) investigated the propagation of the systematic mechanical errors of tipping-bucket rain gauges (TBR) into the most common statistics of rainfall extremes, e.g. the assessment of the return period T (or the related non-exceedance probability) of short-duration/high-intensity events. Colli et al. (2012) and Lanza et al. (2012) extended the analysis to a 22-year precipitation data set obtained from a virtual weighing-type gauge (WG). The artificial WG time series was derived from real precipitation data measured at the meteo-station of the University of Genova, with the weighing gauge output modelled as a linear dynamic system. This approximation was previously validated with dedicated laboratory experiments and is based on the evidence that the accuracy of WG measurements under real-world, time-varying rainfall conditions is mainly affected by the dynamic response of the gauge (as revealed during the last WMO Field Intercomparison of Rainfall Intensity Gauges). The investigation is now completed by analyzing actual measurements performed by two common weighing gauges, the OTT Pluvio2 load-cell gauge and the GEONOR T-200 vibrating-wire gauge, since both instruments demonstrated very good performance in previous constant-flow-rate calibration efforts. A laboratory dynamic rainfall generation system has been set up and validated in order to simulate a number of precipitation events with variable reference intensities. These artificial events were generated from real-world rainfall intensity (RI) records obtained from the meteo-station of the University of Genova, so that the statistical structure of the time series is preserved. The influence of the accuracy of the WG RI measurements on the associated extreme-event statistics is analyzed by comparing the original intensity
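    The return period T mentioned in this abstract relates to the non-exceedance probability F as T = 1/(1 - F). A minimal sketch of how T can be estimated empirically from a series of annual maxima, using the Weibull plotting position F = i/(n+1); the intensity values below are made up for illustration and are not from the study:

```python
# Hypothetical annual-maximum rainfall intensities (mm/h); illustrative values only.
annual_maxima = [62, 48, 95, 71, 55, 120, 66, 80, 51, 103]

n = len(annual_maxima)
ranked = sorted(annual_maxima)

# Weibull plotting position: empirical non-exceedance probability F = i/(n+1),
# and the corresponding return period T = 1 / (1 - F).
for i, x in enumerate(ranked, start=1):
    F = i / (n + 1)
    T = 1.0 / (1.0 - F)
    print(f"intensity {x:5.1f} mm/h  F = {F:.3f}  T = {T:5.1f} years")
```

    With n = 10 annual maxima, the largest observed event is assigned F = 10/11 and hence a return period of 11 years; systematic under-reading by a gauge shifts the intensities but not these empirical probabilities, which is why gauge errors propagate directly into the intensity quantiles.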

  20. Evaluation of PDA Technical Report No 33. Statistical Testing Recommendations for a Rapid Microbiological Method Case Study.

    Science.gov (United States)

    Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David

    2015-01-01

    New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological methods system being evaluated for water bioburden testing. Results presented demonstrate that the statistical methods described in the PDA Technical Report 33 chapter can all be successfully applied to the rapid microbiological method data sets and gave the same interpretation for equivalence to the standard method. The rapid microbiological method was in general able to pass the requirements of PDA Technical Report 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same comparability results for similarity or difference as the standard method. © PDA, Inc

  1. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  2. Optimism in the face of uncertainty supported by a statistically-designed multi-armed bandit algorithm.

    Science.gov (United States)

    Kamiura, Moto; Sano, Kohei

    2017-10-01

    The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method, based on this principle, is an effective algorithm for solving multi-armed bandit problems. In previous work it was defined by a set of heuristic formulation patterns. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with upper bounds of confidence intervals of expected rewards on statistics. This unification enhances the universality of the Overtaking method. As a result, we newly obtain an Overtaking method for exponentially distributed rewards, analyze it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that, in the context of multi-armed bandit problems, the principle of optimism in the face of uncertainty should be regarded not as a heuristic but as the statistics-based consequence of the law of large numbers for the sample mean of rewards and of the estimation of upper bounds of expected rewards. Copyright © 2017 Elsevier B.V. All rights reserved.
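    The abstract compares its Overtaking method against the standard UCB algorithm. As background, a minimal sketch of UCB1 (the classic index of Auer et al.) for Bernoulli rewards; the three arm means are invented for illustration and the Overtaking method itself is not reproduced here:

```python
import math
import random

random.seed(1)

# True mean rewards of three hypothetical arms (unknown to the algorithm).
true_means = [0.2, 0.5, 0.8]

counts = [0] * len(true_means)   # pulls per arm
sums = [0.0] * len(true_means)   # cumulative reward per arm

def ucb_index(arm, t):
    # Upper confidence bound: sample mean plus an exploration bonus that
    # shrinks as the arm accumulates pulls (optimism in the face of uncertainty).
    return sums[arm] / counts[arm] + math.sqrt(2 * math.log(t) / counts[arm])

T = 5000
for t in range(1, T + 1):
    if t <= len(true_means):
        arm = t - 1                      # pull each arm once to initialise
    else:
        arm = max(range(len(true_means)), key=lambda a: ucb_index(a, t))
    reward = 1.0 if random.random() < true_means[arm] else 0.0  # Bernoulli draw
    counts[arm] += 1
    sums[arm] += reward

print(counts)  # the best arm (index 2) receives the large majority of pulls
```

    The exploration bonus is exactly an upper bound of a confidence interval for the expected reward, which is the statistical reading of "optimism" that the paper argues for.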

  3. VALIDATION OF SPRING OPERATED PRESSURE RELIEF VALVE TIME TO FAILURE AND THE IMPORTANCE OF STATISTICALLY SUPPORTED MAINTENANCE INTERVALS

    Energy Technology Data Exchange (ETDEWEB)

    Gross, R; Stephen Harris, S

    2009-02-18

    The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, the National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates, including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut', defined by industry as the valve opening at greater than or equal to 1.5 times the cold set pressure. The actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. The paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging well enough to estimate risk when basing proof test intervals on proof test results?
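    The censoring described above — knowing only whether a valve had failed by the time of its proof test — still supports maximum-likelihood estimation of a failure rate. A toy sketch under an assumed exponential lifetime model (this is not the paper's quantal response model, and the proof-test records below are fabricated for illustration):

```python
import math

# Hypothetical proof-test records: (time in service in years, failed_on_test flag).
# Lifetimes are interval-censored: we only know whether failure occurred by the test.
records = [(2.0, 0), (3.0, 0), (4.0, 1), (5.0, 0), (5.0, 1),
           (6.0, 1), (2.5, 0), (7.0, 1), (3.5, 0), (8.0, 1)]

def log_lik(lam):
    # Log-likelihood under an exponential lifetime with rate lam:
    # P(fail by t) = 1 - exp(-lam * t).
    ll = 0.0
    for t, failed in records:
        p_fail = 1.0 - math.exp(-lam * t)
        ll += math.log(p_fail) if failed else math.log(1.0 - p_fail)
    return ll

# Crude grid-search MLE for the failure rate.
grid = [i / 1000 for i in range(1, 1000)]
lam_hat = max(grid, key=log_lik)

print(f"MLE failure rate: {lam_hat:.3f} per year")
print(f"Predicted failure probability over a 4-year interval: "
      f"{1 - math.exp(-lam_hat * 4):.2f}")
```

    The same likelihood construction generalises to richer aging models (e.g. lift pressure drifting with set pressure and manufacturer), which is what the Monte Carlo validation in the paper exercises.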

  4. Global optimization based on noisy evaluations: An empirical study of two statistical approaches

    International Nuclear Information System (INIS)

    Vazquez, Emmanuel; Villemonteix, Julien; Sidorkiewicz, Maryan; Walter, Eric

    2008-01-01

    The optimization of the output of complex computer codes often has to be achieved with a small budget of evaluations. Algorithms dedicated to such problems have been developed and compared, such as the Expected Improvement algorithm (EI) or the Informational Approach to Global Optimization (IAGO). However, the influence of noisy evaluation results on the outcome of these comparisons has often been neglected, despite its frequent occurrence in industrial problems. In this paper, empirical convergence rates for EI and IAGO are compared when an additive noise corrupts the result of an evaluation. IAGO appears more efficient than EI and various modifications of EI designed to deal with noisy evaluations. Keywords: global optimization; computer simulations; kriging; Gaussian process; noisy evaluations.

  5. Principal Components of Superhigh-Dimensional Statistical Features and Support Vector Machine for Improving Identification Accuracies of Different Gear Crack Levels under Different Working Conditions

    Directory of Open Access Journals (Sweden)

    Dong Wang

    2015-01-01

    Gears are widely used in gearboxes to transmit power from one shaft to another. Gear crack is one of the most frequent gear fault modes found in industry. Identification of different gear crack levels is beneficial in preventing unexpected machine breakdown and reducing economic loss, because a gear crack leads to gear tooth breakage. In this paper, an intelligent fault diagnosis method for identification of different gear crack levels under different working conditions is proposed. First, statistical features are extracted from the continuous wavelet transform at different scales; the method extracts 920 such features, making the feature set superhigh-dimensional. To reduce the dimensionality of the extracted statistical features and generate new, significant low-dimensional features, a simple and effective method, principal component analysis, is used. To further improve identification accuracies of different gear crack levels under different working conditions, a support vector machine is employed. Three experiments are investigated to show the superiority of the proposed method. Comparisons with other existing gear crack level identification methods are conducted. The results show that the proposed method has the highest identification accuracies among all existing methods.
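    The dimensionality-reduction step described above — projecting many correlated statistical features onto their principal components — can be illustrated in miniature. A sketch of PCA on toy 2-D data (the values are a standard textbook example, not the paper's 920 wavelet features, and only the PCA stage is shown, not the SVM):

```python
import math

# Toy 2-D feature vectors standing in for correlated statistical features;
# real use would start from hundreds of wavelet-based features.
data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2), (3.1, 3.0),
        (2.3, 2.7), (2.0, 1.6), (1.0, 1.1), (1.5, 1.6), (1.1, 0.9)]

n = len(data)
mean = [sum(x[i] for x in data) / n for i in range(2)]
centred = [(x[0] - mean[0], x[1] - mean[1]) for x in data]

# 2x2 sample covariance matrix.
cxx = sum(a * a for a, _ in centred) / (n - 1)
cyy = sum(b * b for _, b in centred) / (n - 1)
cxy = sum(a * b for a, b in centred) / (n - 1)

# Leading eigenvector by power iteration: the first principal component.
v = (1.0, 0.0)
for _ in range(100):
    w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
    norm = math.hypot(*w)
    v = (w[0] / norm, w[1] / norm)

# Project each sample onto the first principal component (dimension 2 -> 1).
scores = [a * v[0] + b * v[1] for a, b in centred]
print("PC1 direction:", v)
print("1-D scores:", [round(s, 2) for s in scores])
```

    The projected scores are the "new significant low-dimensional statistical features" that a classifier such as an SVM would then consume.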

  6. The economic support index evaluation study on the pig breeding scale of the six provinces in central China

    Science.gov (United States)

    Leng, Bi-Bin; Le, Xi-lin; Yuan, Gang; Ji, Xue-Qiang

    2017-11-01

    Shanxi, Anhui, Jiangxi, Henan, Hubei and Hunan provinces are located in the central part of China and play an essential role in China's economic and social development. In this article, we use the analytic hierarchy process (AHP), on the basis of 2016 statistical yearbook data, to conduct an appraisal of the economic support index of the pig breeding scale in the six provinces of central China. The evaluation shows that Hubei tops all of the provinces on the economic support index, followed by Hunan, Anhui, Henan and Jiangxi; the lowest index is in Shanxi province. This indicates that the economic conditions in Hubei province are the most capable of supporting its pig breeding scale, while Shanxi province is the opposite.
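    The core AHP computation is deriving a priority (weight) vector from a pairwise-comparison matrix. A minimal sketch using the geometric-mean row method on an invented 3x3 matrix (the paper's actual criteria and judgments are not reproduced here):

```python
import math

# Hypothetical 3x3 AHP pairwise-comparison matrix on Saaty's 1-9 scale;
# a[i][j] expresses how strongly criterion i outweighs criterion j.
a = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   3.0],
     [1/5.0, 1/3.0, 1.0]]

n = len(a)

# Approximate priority vector via the geometric mean of each row,
# normalised so the weights sum to one.
geo = [math.prod(row) ** (1.0 / n) for row in a]
total = sum(geo)
weights = [g / total for g in geo]

print([round(w, 3) for w in weights])  # -> [0.637, 0.258, 0.105]
```

    In a full AHP study one would also compute the consistency ratio of the matrix and aggregate criterion weights with per-province scores to obtain each province's support index.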

  7. Statistical Analysis and Evaluation of the Depth of the Ruts on Lithuanian State Significance Roads

    Directory of Open Access Journals (Sweden)

    Erinijus Getautis

    2011-04-01

    The aim of this work is to gather information about the rut depth on national flexible-pavement roads, to determine its statistical dispersion indices, and to assess conformity with the applicable requirements. An analysis of scientific works on rut appearance in asphalt and its influence on driving is presented, along with dynamic models of rutting in asphalt. Experimental data on rut depth dispersion on the Lithuanian national highway Vilnius–Kaunas are prepared. Conclusions are formulated and presented. Article in Lithuanian

  8. A statistical evaluation of the design and precision of the shrimp trawl survey off West Greenland

    DEFF Research Database (Denmark)

    Folmer, Ole; Pennington, M.

    2000-01-01

    statistical techniques were used to estimate two indices of shrimp abundance and their precision, and to determine the effective sample sizes for estimates of length-frequency distributions. It is concluded that the surveys produce a fairly precise abundance index, and that given the relatively small...... effective sample size, reducing tow duration to 15 min would increase overall survey precision. An unexpected outcome of the analysis is that the density of shrimp appears to have been fairly stable over the last 11 years. (C) 2000 Elsevier Science B.V. All rights reserved....

  9. Radiographic evaluation and statistical analysis of canine skull form: Dolichocephaly, mesocephaly and brachycephaly

    International Nuclear Information System (INIS)

    Regodon, S.; Robina, A.; Franco, A.; Vivo, J.M.; Lignereux, Y.

    1991-01-01

    The cranio-encephalic morphology of three breeds of dogs (Greyhound, Pointer and Pekinese; 10 subjects each, 5 males and 5 females) was observed radiologically. Radiographs in dorso-ventral and latero-lateral positions were taken and analyzed before and after visualisation of the encephalic cavity using barium sulfate. Eighteen cranio-encephalic measurements were chosen and interpreted statistically. The results showed that certain variables were more closely correlated with the morphologic type of the cranium than others. We discuss the validity of the data for clinical diagnosis or osteo-archeological determinations

  10. Statistical evaluation of Pacific Northwest Residential Energy Consumption Survey weather data

    Energy Technology Data Exchange (ETDEWEB)

    Tawil, J.J.

    1986-02-01

    This report addresses an issue relating to energy consumption and conservation in the residential sector. BPA has obtained two meteorological data bases for use with its 1983 Pacific Northwest Residential Energy Survey (PNWRES). One data base consists of temperature data from weather stations; these have been aggregated to form a second data base that covers the National Oceanographic and Atmospheric Administration (NOAA) climatic divisions. At BPA's request, Pacific Northwest Laboratory has produced a household energy use model for both electricity and natural gas in order to determine whether the statistically estimated parameters of the model significantly differ when the two different meteorological data bases are used.

  11. Statistical evaluation of mature landfill leachate treatment by homogeneous catalytic ozonation

    Directory of Open Access Journals (Sweden)

    A. L. C. Peixoto

    2010-12-01

    This study presents the results of a mature landfill leachate treated by a homogeneous catalytic ozonation process with Fe2+ and Fe3+ ions at acidic pH. Quality assessments were performed using Taguchi's method (L8 design). Strong synergism was observed statistically between molecular ozone and ferric ions, pointing to their catalytic effect on •OH generation. Achieving better organic matter depollution rates requires an ozone flow of 5 L h-1 (590 mg h-1 O3) and a ferric ion concentration of 5 mg L-1.

  12. The Effects of Types of Training Evaluation on Support of Training among Corporate Managers.

    Science.gov (United States)

    Kusy, Mitchell E., Jr.

    A study was conducted to determine which type of training evaluation method elicited the most management support of the training function among corporate managers. The investigator designed and distributed a case study survey instrument called the Training Evaluation Methods Survey (TEMS) to assess the extent of management support for each type of…

  13. Toward Building a Typology for the Evaluation of Services in Family Support Programs.

    Science.gov (United States)

    Manalo, Victor; Meezan, William

    2000-01-01

    Articulates how the family support movement emerged in the last 20 years, and describes the philosophical premises, principles, and practices that currently guide it. Considers the inability of current family support program typologies to guide outcome evaluations, and introduces a typology that deconstructs family support programs into their…

  14. Development of 4S and related technologies. (3) Statistical evaluation of safety performance of 4S on ULOF event

    International Nuclear Information System (INIS)

    Ishii, Kyoko; Matsumiya, Hisato; Horie, Hideki; Miyagi, Kazumi

    2009-01-01

    The purpose of this work is to evaluate quantitatively and statistically the safety performance of the Super-Safe, Small and Simple reactor (4S) by analysis with the ARGO code, a plant dynamics code for sodium-cooled fast reactors. In this evaluation an Anticipated Transient Without Scram (ATWS) is assumed, and an Unprotected Loss of Flow (ULOF) event is selected as a typical ATWS case. After a safety-design metric is defined as the performance factor, a Phenomena Identification Ranking Table (PIRT) is produced in order to select the plausible phenomena that affect the metric. A sensitivity analysis is then performed for the parameters related to the selected phenomena. Finally, the metric is evaluated with statistical methods to determine whether it satisfies the given safety acceptance criteria. The result is as follows: the Cumulative Damage Fraction (CDF) for the cladding is defined as the metric, and the statistical estimate of the one-sided upper tolerance limit of 95 percent probability at a 95 percent confidence level in CDF is within the safety acceptance criterion, CDF < 0.1. The result shows that the 4S safety performance is acceptable in the ULOF event. (author)
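    A 95%/95% one-sided tolerance limit of the kind quoted above is commonly obtained with the first-order Wilks formula: if the largest of n independent code runs is taken as the bound, the confidence that it covers the 95th percentile is 1 - 0.95^n. A sketch of the standard sample-size calculation (the abstract does not state which order of the formula was used, so this is illustrative):

```python
# One-sided, first-order Wilks criterion: using the maximum of n random code
# runs as the tolerance bound, the confidence of covering quantile p is 1 - p^n.
p, confidence = 0.95, 0.95

n = 1
while 1.0 - p ** n < confidence:
    n += 1

print(n)  # -> 59, the classic run count for a 95%/95% one-sided tolerance limit
```

    This is why 59 code runs is the canonical minimum for first-order 95/95 statements in nuclear safety analysis; higher-order formulas trade more runs for a less conservative bound.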

  15. Evaluation of Two Statistical Methods Provides Insights into the Complex Patterns of Alternative Polyadenylation Site Switching

    Science.gov (United States)

    Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng

    2015-01-01

    Switching between different alternative polyadenylation (APA) sites plays an important role in the fine-tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that occur between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
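    The contrast between the two tests can be made concrete with a toy 2 x 3 contingency table of read counts per poly(A) site. In the invented example below (not data from the paper), usage shifts from the middle site to both extremes, so the average 3'-UTR length proxy is unchanged — a "complex" pattern that an independence statistic flags but a length-trend comparison misses:

```python
# Read counts at three hypothetical poly(A) sites (proximal, middle, distal)
# in two samples. Usage moves from the middle site to both extremes, so the
# average 3'-UTR length barely changes: a "complex" switching pattern.
sample_a = [100, 300, 100]
sample_b = [200, 100, 200]

def chi_square(a, b):
    # Pearson chi-square statistic for a 2 x K contingency table
    # (the basis of the independence test).
    col = [x + y for x, y in zip(a, b)]
    ra, rb, n = sum(a), sum(b), sum(a) + sum(b)
    stat = 0.0
    for j, c in enumerate(col):
        for obs, r in ((a[j], ra), (b[j], rb)):
            exp = r * c / n
            stat += (obs - exp) ** 2 / exp
    return stat

def mean_site_index(counts):
    # Proxy for average 3'-UTR length: read-weighted mean of the site index.
    return sum(j * c for j, c in enumerate(counts)) / sum(counts)

print(chi_square(sample_a, sample_b))          # large: site usage clearly differs
print(mean_site_index(sample_a), mean_site_index(sample_b))  # identical means
```

    A trend-style test keyed to the change in average 3'-UTR length sees no signal here, while the independence statistic is very large — exactly the discrepancy the paper attributes to complex switching patterns.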

  16. A Bayesian statistical analysis of mouse dermal tumor promotion assay data for evaluating cigarette smoke condensate.

    Science.gov (United States)

    Kathman, Steven J; Potts, Ryan J; Ayres, Paul H; Harp, Paul R; Wilson, Cody L; Garner, Charles D

    2010-10-01

    The mouse dermal assay has long been used to assess the dermal tumorigenicity of cigarette smoke condensate (CSC). This mouse skin model has been developed for use in carcinogenicity testing utilizing the SENCAR mouse as the standard strain. Though the model has limitations, it remains as the most relevant method available to study the dermal tumor promoting potential of mainstream cigarette smoke. In the typical SENCAR mouse CSC bioassay, CSC is applied for 29 weeks following the application of a tumor initiator such as 7,12-dimethylbenz[a]anthracene (DMBA). Several endpoints are considered for analysis including: the percentage of animals with at least one mass, latency, and number of masses per animal. In this paper, a relatively straightforward analytic model and procedure is presented for analyzing the time course of the incidence of masses. The procedure considered here takes advantage of Bayesian statistical techniques, which provide powerful methods for model fitting and simulation. Two datasets are analyzed to illustrate how the model fits the data, how well the model may perform in predicting data from such trials, and how the model may be used as a decision tool when comparing the dermal tumorigenicity of cigarette smoke condensate from multiple cigarette types. The analysis presented here was developed as a statistical decision tool for differentiating between two or more prototype products based on the dermal tumorigenicity. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  17. EVALUATION OF SEMANTIC SIMILARITY FOR SENTENCES IN NATURAL LANGUAGE BY MATHEMATICAL STATISTICS METHODS

    Directory of Open Access Journals (Sweden)

    A. E. Pismak

    2016-03-01

    Subject of Research. The paper focuses on the structural organization of Wiktionary articles with respect to its use as the basis for a semantic network. Wiktionary community references, article templates and article markup features are analyzed. The problem of numerical estimation of the semantic similarity of structural elements in Wiktionary articles is considered. Existing software for semantic similarity estimation of such elements is analyzed; the algorithms behind it are studied, and their advantages and disadvantages are shown. Methods. Mathematical statistics methods were used to analyze the markup features of Wiktionary articles. A method of computing semantic similarity based on statistics for the compared structural elements is proposed. Main Results. We conclude that Wiktionary articles cannot be used directly as the source for a semantic network. We propose instead to find hidden similarity between article elements, and for that purpose we have developed an algorithm that calculates confidence coefficients indicating that a pair of sentences is semantically close. A study of the quantitative and qualitative characteristics of the developed algorithm shows a major performance advantage over other existing solutions, at the cost of an insignificantly higher error rate. Practical Relevance. The resulting algorithm may be useful in developing tools for automatic parsing of Wiktionary articles. The developed method could be used to compute the semantic similarity of short text fragments in natural language when performance requirements are stricter than accuracy requirements.

  18. Some aspects of statistic evaluation of fast reactor fuel element reliability

    International Nuclear Information System (INIS)

    Proshkin, A.A.; Likhachev, Yu.I.; Tuzov, A.N.; Zabud'ko, L.M.

    1980-01-01

    Certain aspects of the application of statistical methods to forecasting the operating ability of fuel elements of fast reactors with liquid-metal coolants are considered. Results of a statistical analysis of fuel element operating ability with oxide fuel (U,Pu)O 2 under the stationary power regime of a fast power reactor are given. The analysis makes it possible to single out the main parameters that considerably affect the calculated fuel element operating ability. It is shown that the parameters introducing the greatest uncertainty are: steel creep rate, up to 30%; steel swelling, up to 20%; fuel creep rate, up to 30%; fuel swelling, up to 20%; cladding material corrosion, up to 15%; and contact conductivity of the fuel-cladding gap, up to 10%. The contribution of these parameters differs in each given case depending on the design, the operating conditions and the fuel element cross section considered. The contribution of the cladding temperature uncertainty to the total dispersion does not exceed a few percent. It is shown that for the given reactor operating conditions the number of depressurized fuel elements increases almost exponentially with burnup, starting from burnup higher than 7% of heavy atoms

  19. Evaluation of Diffusivity in Dense Polymeric Membranes by Statistical Moment Analysis

    Czech Academy of Sciences Publication Activity Database

    Řezníčková Čermáková, Jiřina; Kudrna, Vladimír; Setničková, Kateřina; Uchytil, Petr

    2013-01-01

    Roč. 435, 15 MAY (2013), s. 46-51 ISSN 0376-7388 R&D Projects: GA ČR GA104/09/1165 Institutional support: RVO:67985858 Keywords : diffusion * membranes * transport processes Subject RIV: CI - Industrial Chemistry, Chemical Engineering Impact factor: 4.908, year: 2013

  20. Improving esthetic results in benign parotid surgery: statistical evaluation of facelift approach, sternocleidomastoid flap, and superficial musculoaponeurotic system flap application.

    Science.gov (United States)

    Bianchi, Bernardo; Ferri, Andrea; Ferrari, Silvano; Copelli, Chiara; Sesenna, Enrico

    2011-04-01

    The purpose of this article was to analyze the efficacy of the facelift incision, sternocleidomastoid muscle flap, and superficial musculoaponeurotic system flap for improving the esthetic results in patients undergoing partial parotidectomy for benign parotid tumor resection. The usefulness of partial parotidectomy is discussed, and a statistical evaluation of the esthetic results was performed. From January 1, 1996, to January 1, 2007, 274 patients treated for benign parotid tumors were studied. Of these, 172 underwent partial parotidectomy. The 172 patients were divided into 4 groups: partial parotidectomy with classic or modified Blair incision without reconstruction (group 1), partial parotidectomy with facelift incision and without reconstruction (group 2), partial parotidectomy with facelift incision associated with sternocleidomastoid muscle flap (group 3), and partial parotidectomy with facelift incision associated with superficial musculoaponeurotic system flap (group 4). Patients were considered, after a follow-up of at least 18 months, for functional and esthetic evaluation. The functional outcome was assessed considering facial nerve function, Frey syndrome, and recurrence. The esthetic evaluation was performed by inviting the patients and a blind panel of 1 surgeon and 2 secretaries of the department to give a score of 1 to 10 to assess the final cosmetic outcome. The statistical analysis was performed using the Mann-Whitney U test for nonparametric data to compare the group results; P less than .05 was considered significant. No recurrence developed in any of the 4 groups or in any of the 274 patients during the follow-up period. The statistical analysis, comparing group 1 with the other groups, revealed a highly significant statistical difference (P < .05) in favor of the reconstructive approaches, indicating that these techniques improve esthetic results in benign parotid surgery. The evaluation of functional complications and the recurrence rate in this series of patients has confirmed that this technique can be safely

  1. A Decision Support Framework for Evaluation of Engineered ...

    Science.gov (United States)

    Engineered nanomaterials (ENM) are currently being developed and applied at rates that far exceed our ability to evaluate their potential environmental or human health risks. The gap between material development and capacity for assessment grows wider every day. Transformative approaches are required that enhance our ability to forecast potential exposure and adverse health risks based on limited information, such as the physical and chemical parameters of ENM, their proposed uses, and functional assays reflective of key ENM-environmental interactions. We are developing a framework that encompasses the potential for release of nanomaterials across a product life cycle, environmental transport, transformations and fate, exposure of sensitive species, including humans, and the potential for causing adverse effects. Each component of the framework is conceived of as a sequential segmented model depicting the movement, transformations and actions of ENM through environmental or biological compartments, along which targeted functional assays can be developed that are indicative of projected rates of ENM movement or action. The eventual goal is to allow simple predictive models to be built that incorporate the data from key functional assays and thereby allow rapid screening of the projected margin of exposure for proposed applications of ENM-enabled products. In this way, cases where a substantially safe margin of exposure is forecast can be reduced in

  2. Evaluation of auto-assessment method for C-D analysis based on support vector machine

    International Nuclear Information System (INIS)

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kuniharu; Kamihira, Hiroaki; Kishimoto, Tomonari; Goto, Hiroya

    2010-01-01

    Contrast-detail (C-D) analysis is one of the visual quality assessment methods in medical imaging, and many auto-assessment methods for C-D analysis have been developed in recent years. However, the effects of nonlinear image processing on such auto-assessment methods are not clear. We therefore built an auto-assessment method for C-D analysis using a support vector machine (SVM) and evaluated its performance on images processed with a noise reduction method. The feature indexes used in the SVM were the normalized cross-correlation (NCC) coefficient on each signal between the noise-free and noised images, the contrast-to-noise ratio (CNR) on each signal, the radius of each signal, and the Student's t-test statistic for the mean difference between the signal and background pixel values. The results showed that the auto-assessment method based on the Student's t-test statistic alone agreed well with the visual assessment for the non-processed images, but disagreed for the images processed with the noise reduction method. Our results also showed that the auto-assessment method using an SVM built on NCC and CNR agreed well with the visual assessment for both the non-processed and noise-reduced images. Therefore, the SVM-based auto-assessment method for C-D analysis is expected to be robust to nonlinear image processing. (author)
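    The two feature indexes that carried the SVM in this study, NCC and CNR, have simple standard definitions. A sketch of both on tiny made-up 1-D "image" patches (the pixel values are invented; real features would be computed over 2-D signal regions):

```python
import math

# Tiny toy "images": a noise-free reference, a noised version of the same
# signal region, and a background patch (hypothetical pixel values).
reference  = [10, 12, 15, 20, 15, 12, 10]
noised     = [11, 11, 16, 19, 14, 13,  9]
background = [10, 11, 10,  9, 10, 11,  9]

def ncc(x, y):
    # Normalized cross-correlation between the reference and noised signal:
    # covariance of the patches divided by the product of their norms.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def cnr(signal, bg):
    # Contrast-to-noise ratio: mean signal-background difference over the
    # background standard deviation.
    ms, mb = sum(signal) / len(signal), sum(bg) / len(bg)
    sd = math.sqrt(sum((v - mb) ** 2 for v in bg) / len(bg))
    return (ms - mb) / sd

print(round(ncc(reference, noised), 3))   # close to 1: shapes agree well
print(round(cnr(noised, background), 3))  # clearly positive: signal is visible
```

    An SVM trained on (NCC, CNR) pairs per signal then learns the boundary between "detected" and "not detected", which is the classification task the paper automates.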

  3. EVA Swab Tool to Support Planetary Protection and Astrobiology Evaluations

    Science.gov (United States)

    Rucker, Michelle A.; Hood, Drew; Walker, Mary; Venkateswaran, Kasthuri J.; Schuerger, Andrew C.

    2018-01-01

    various pressure environments. To further minimize cost, the design team acquired extensive ground test experience in a relevant flight environment by piggy-backing onto suited crew training runs. These training runs allowed the project to validate tool interfaces with pressurized EVA gloves and collect user feedback on the tool design and function, as well as characterize baseline microbial data for different types of spacesuits. In general, test subjects found the EVA Swab Kit relatively straightforward to operate, but identified a number of design improvements that will be incorporated into the final design. Although originally intended to help characterize human forward contaminants, this tool has other potential applications, such as for collecting and preserving space-exposed materials to support astrobiology experiments.

  4. A recoil resilient lumen support, design, fabrication and mechanical evaluation

    Science.gov (United States)

    Mehdizadeh, Arash; Ali, Mohamed Sultan Mohamed; Takahata, Kenichi; Al-Sarawi, Said; Abbott, Derek

    2013-06-01

    smaller than the recoil reported for commercial stents. These experimental results demonstrate the effectiveness of the device design for the targeted luminal support and stenting applications.

  5. A recoil resilient lumen support, design, fabrication and mechanical evaluation

    International Nuclear Information System (INIS)

    Mehdizadeh, Arash; Al-Sarawi, Said; Abbott, Derek; Ali, Mohamed Sultan Mohamed; Takahata, Kenichi

    2013-01-01

    11× smaller than the recoil reported for commercial stents. These experimental results demonstrate the effectiveness of the device design for the targeted luminal support and stenting applications. (paper)

  6. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    International Nuclear Information System (INIS)

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-01-01

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level tiered approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  7. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single- and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. detailed, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single
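The composition figures above (e.g. 42 ± 5% food waste, mass per wet basis) are mean ± confidence-interval estimates across sub-areas. A minimal sketch of that kind of calculation, using invented sub-area fractions rather than the study's data:

```python
import statistics
from math import sqrt

# Hypothetical food-waste mass fractions (% wet basis) for 10 sub-areas
food_waste_pct = [38.2, 44.1, 40.5, 46.3, 41.9, 43.7, 39.8, 45.0, 42.6, 40.1]

mean = statistics.mean(food_waste_pct)
sem = statistics.stdev(food_waste_pct) / sqrt(len(food_waste_pct))
ci95 = 1.96 * sem  # normal approximation; a t-quantile would be stricter for n=10

print(f"food waste: {mean:.1f} +/- {ci95:.1f} % (95% CI)")
```

The same recipe, applied per fraction and per sub-area, gives the tiered comparison the abstract describes.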

  8. Estimation of the residual bromine concentration after disinfection of cooling water by statistical evaluation.

    Science.gov (United States)

    Megalopoulos, Fivos A; Ochsenkuehn-Petropoulou, Maria T

    2015-01-01

    A statistical model based on multiple linear regression is developed to estimate the bromine residual that can be expected after the bromination of cooling water. Make-up water sampled from a power plant in Greece was used to create the various cooling water matrices under investigation. The amount of bromine fed to the circuit, as well as other important operational parameters such as concentration at the cooling tower, temperature, organic load and contact time, are taken as the independent variables. It is found that the highest contribution to the model's predictive ability comes from the cooling water's organic load concentration, followed by the amount of bromine fed to the circuit, the water's mean temperature, the duration of the bromination period and finally its conductivity. Comparison of the model results with the experimental data confirms its ability to predict residual bromine under specific bromination conditions.
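The model described is an ordinary multiple linear regression. A stdlib-only sketch of fitting such a model via the normal equations, with synthetic data and assumed predictor names (the paper's actual variables and coefficients are not reproduced here):

```python
import random

# Least-squares fit of residual bromine (mg/L) on two assumed predictors:
# organic load (TOC, mg/L) and bromine dose (mg/L). Synthetic data only.
random.seed(0)
n = 50
toc  = [random.uniform(1, 10) for _ in range(n)]
dose = [random.uniform(0.5, 3) for _ in range(n)]
resid = [0.8 * d - 0.15 * t + 0.3 + random.gauss(0, 0.05) for d, t in zip(dose, toc)]

# Solve the 3x3 normal equations (X'X) b = X'y for [intercept, dose, toc]
X = [[1.0, d, t] for d, t in zip(dose, toc)]
XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(3)] for a in range(3)]
Xty = [sum(X[i][a] * resid[i] for i in range(n)) for a in range(3)]

# Gaussian elimination with partial pivoting
for col in range(3):
    piv = max(range(col, 3), key=lambda r: abs(XtX[r][col]))
    XtX[col], XtX[piv] = XtX[piv], XtX[col]
    Xty[col], Xty[piv] = Xty[piv], Xty[col]
    for r in range(col + 1, 3):
        f = XtX[r][col] / XtX[col][col]
        XtX[r] = [XtX[r][k] - f * XtX[col][k] for k in range(3)]
        Xty[r] -= f * Xty[col]
b = [0.0] * 3
for r in (2, 1, 0):
    b[r] = (Xty[r] - sum(XtX[r][k] * b[k] for k in range(r + 1, 3))) / XtX[r][r]

print("intercept, dose, TOC coefficients:", [round(v, 3) for v in b])
```

With more predictors (temperature, contact time, conductivity) the same normal-equations scheme applies, just with a larger matrix; in practice a library least-squares routine would be used instead.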

  9. Evaluating the effect of disturbed ensemble distributions on SCFG based statistical sampling of RNA secondary structures

    Directory of Open Access Journals (Sweden)

    Scheid Anika

    2012-07-01

    Full Text Available Abstract Background Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. Results In this work, we will consider the SCFG-based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. Thus, it might then be possible to decrease the worst

  10. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    DEFF Research Database (Denmark)

    Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona

    2015-01-01

    Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both...... comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub......-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste...

  11. Environmental offences in 1995. An evaluation of statistics; Umweltdelikte 1995. Eine Auswertung der Statistiken

    Energy Technology Data Exchange (ETDEWEB)

    Goertz, M.; Werner, J.; Sanchez de la Cerda, J.; Schwertfeger, C.; Winkler, K.

    1997-06-01

    This publication deals with the execution of environmental criminal law. On the basis of police and judicial statistics it is pointed out how often an environmental criminal offence was at least suspected by the police or law courts, how they reacted to their suspicion, which individual environmental criminal offences were committed particularly frequently, and what segment of the population the typical perpetrator belonged to. (orig./SR) [Deutsch] Mit der vorliegenden Schrift soll ein Blick auf den Vollzug des Umweltstrafrechts geworfen werden. Auf der Basis der Polizei- und Gerichtsstatistiken wird dargelegt, wie oft bei diesen Stellen mindestens der Verdacht einer Umweltstraftat bestand, wie auf diesen Verdacht reagiert wurde, welche einzelnen Umweltstraftaten besonders haeufig registriert wurden und aus welchem Personenkreis der typische Taeter stammt. (orig./SR)

  12. Evaluating Michigan's community hospital access: spatial methods for decision support

    Directory of Open Access Journals (Sweden)

    Varnakovida Pariwate

    2006-09-01

    Full Text Available Abstract Background Community hospital placement is dictated by a diverse set of geographical factors and historical contingency. In the summer of 2004, a multi-organizational committee headed by the State of Michigan's Department of Community Health approached the authors of this paper with questions about how spatial analyses might be employed to develop a revised community hospital approval procedure. Three objectives were set. First, the committee needed visualizations of both the spatial pattern of Michigan's population and its 139 community hospitals. Second, the committee required a clear, defensible assessment methodology to quantify access to existing hospitals statewide, taking into account factors such as distance to nearest hospital and road network density to estimate travel time. Third, the committee wanted to contrast the spatial distribution of existing community hospitals with a theoretical configuration that best met statewide demand. This paper presents our efforts to first describe the distribution of Michigan's current community hospital pattern and its people, and second, develop two models, access-based and demand-based, to identify areas with inadequate access to existing hospitals. Results Using the product from the access-based model and contiguity and population criteria, two areas were identified as being "under-served." The lower area, located north/northeast of Detroit, contained the greater total land area and population of the two areas. The upper area was centered north of Grand Rapids. A demand-based model was applied to evaluate the existing facility arrangement by allocating daily bed demand in each ZIP code to the closest facility. We found 1,887 beds per day were demanded by ZIP centroids more than 16.1 kilometers from the nearest existing hospital. This represented 12.7% of the average statewide daily bed demand. If a 32.3 kilometer radius was employed, unmet demand dropped to 160 beds per day (1
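The demand-based model described above allocates each ZIP code's daily bed demand to the closest facility and counts demand beyond a distance cutoff as unmet. A toy sketch of that allocation rule (hospital coordinates and demand figures are invented; distances are straight-line, not road-network travel):

```python
from math import hypot

# Nearest-facility allocation: each ZIP centroid's daily bed demand goes to
# the closest hospital; demand farther than the cutoff counts as "unmet".
hospitals = {"A": (0.0, 0.0), "B": (40.0, 10.0)}  # km on an assumed grid
zip_demand = [  # (x_km, y_km, beds_per_day)
    (2.0, 3.0, 120.0), (35.0, 12.0, 90.0), (20.0, 25.0, 60.0),
]

CUTOFF_KM = 16.1  # the 16.1 km radius used in the study
allocated, unmet = {k: 0.0 for k in hospitals}, 0.0
for x, y, beds in zip_demand:
    name, dist = min(((k, hypot(x - hx, y - hy)) for k, (hx, hy) in hospitals.items()),
                     key=lambda kv: kv[1])
    if dist > CUTOFF_KM:
        unmet += beds
    else:
        allocated[name] += beds

print(allocated, "unmet:", unmet)
```

In the study this allocation was run for every ZIP centroid statewide, yielding the 1,887 unmet beds per day (12.7% of demand) at the 16.1 km radius.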

  13. Internet-based peer support for Ménière's disease: a summary of web-based data collection, impact evaluation, and user evaluation.

    Science.gov (United States)

    Pyykkö, Ilmari; Manchaiah, Vinaya; Levo, Hilla; Kentala, Erna; Juhola, Martti

    2017-07-01

    This paper presents a summary of web-based data collection, impact evaluation, and user evaluations of an Internet-based peer support program for Ménière's disease (MD). The program is written in HTML form. The data are stored in a MySQL database, and the program uses machine learning in the diagnosis of MD. The program works interactively with the user and assesses the participant's disorder profile in various dimensions (i.e., symptoms, impact, personal traits, and positive attitude). The inference engine uses a database to compare the impact with 50 referents, and provides regular feedback to the user. Data were analysed using descriptive statistics and regression analysis. The impact evaluation was based on 740 cases and the user evaluation on a sample of 75 cases of MD, respectively. The web-based system was useful in data collection and impact evaluation of people with MD. Among those with a recent onset of MD, 78% rated the program as useful or very useful, whereas 55% of those with chronic MD did so. We suggest that web-based data collection and impact evaluation for peer support can be helpful when formulating rehabilitation goals of building the self-confidence needed for coping and increasing social participation.

  14. Statistical evaluation of steam condensation loads in pressure suppression pool, (1)

    International Nuclear Information System (INIS)

    Kukita, Yutaka; Takeshita, Isao; Namatame, Ken; Shiba, Masayoshi; Kato, Masami; Moriya, Kumiaki.

    1981-10-01

    The LOCA steam condensation loads in the BWR pressure suppression pool were evaluated using the test data obtained in the first eight tests of the JAERI Full-Scale Mark II CRT Program. Through this evaluation, finite desynchronization between the vent pressures during the chugging and condensation oscillation phases was identified and quantified. The characteristics of the pressure oscillation propagation through the vent pipe and in the pool water, the fluid-structure interaction (FSI) effects on the pool pressure loads, and the characteristics of the vent lateral loads were also investigated. (author)

  15. Levers supporting program evaluation culture and capacity in Romanian public administration: The role of leadership

    OpenAIRE

    Cristina Mora; Raluca Antonie

    2012-01-01

    Program evaluation culture and capacity is at the very beginning of its development in Romania. In this article we highlight one of the fundamental, but not always obvious, connections that support sustainable evaluation culture and capacity building and development: the link between leadership and program evaluation. If properly used, program evaluation results can be a strong instrument in leadership, just as leadership can fundamentally encourage the development of evaluation culture and c...

  16. Group Peer Assessment for Summative Evaluation in a Graduate-Level Statistics Course for Ecologists

    Science.gov (United States)

    ArchMiller, Althea; Fieberg, John; Walker, J.D.; Holm, Noah

    2017-01-01

    Peer assessment is often used for formative learning, but few studies have examined the validity of group-based peer assessment for the summative evaluation of course assignments. The present study contributes to the literature by using online technology (the course management system Moodle™) to implement structured, summative peer review based on…

  17. An Assessment of Statistical Process Control-Based Approaches for Charting Student Evaluation Scores

    Science.gov (United States)

    Ding, Xin; Wardell, Don; Verma, Rohit

    2006-01-01

    We compare three control charts for monitoring data from student evaluations of teaching (SET) with the goal of improving student satisfaction with teaching performance. The two charts that we propose are a modified "p" chart and a z-score chart. We show that these charts overcome some of the shortcomings of the more traditional charts…

  18. Rasch analysis of the Knee injury and Osteoarthritis Outcome Score (KOOS): a statistical re-evaluation

    DEFF Research Database (Denmark)

    Comins, J; Brodersen, J; Krogsgaard, M

    2008-01-01

    The knee injury and Osteoarthritis Outcome Score (KOOS), based on the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), is widely used to evaluate subjective outcome in anterior cruciate ligament (ACL) reconstructed patients. However, the validity of KOOS has not been assessed...

  19. Squared Euclidean distance: a statistical test to evaluate plant community change

    Science.gov (United States)

    Raymond D. Ratliff; Sylvia R. Mori

    1993-01-01

    The concepts and a procedure for evaluating plant community change using the squared Euclidean distance (SED) resemblance function are described. Analyses are based on the concept that Euclidean distances constitute a sample from a population of distances between sampling units (SUs) for a specific number of times and SUs. With different times, the distances will be...
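The SED resemblance function itself is simple: the sum of squared differences between two sampling units' species values. A minimal illustration with invented cover values:

```python
# Squared Euclidean distance (SED) between two plant-community samples,
# each a vector of per-species cover values (illustrative numbers only).
def sed(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

time1 = [10.0, 5.0, 0.0, 2.0]  # cover of 4 species at time 1
time2 = [7.0, 6.0, 1.0, 2.0]   # same sampling unit at time 2
print(sed(time1, time2))  # 9 + 1 + 1 + 0 = 11.0
```

The statistical test in the paper then treats such distances between sampling units over time as a sample from a population of distances.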

  20. Evaluation of Oceanic Transport Statistics By Use of Transient Tracers and Bayesian Methods

    Science.gov (United States)

    Trossman, D. S.; Thompson, L.; Mecking, S.; Bryan, F.; Peacock, S.

    2013-12-01

    Key variables that quantify the time scales over which atmospheric signals penetrate into the oceanic interior and their uncertainties are computed using Bayesian methods and transient tracers from both models and observations. First, the mean residence times, subduction rates, and formation rates of Subtropical Mode Water (STMW) and Subpolar Mode Water (SPMW) in the North Atlantic and Subantarctic Mode Water (SAMW) in the Southern Ocean are estimated by combining a model and observations of chlorofluorocarbon-11 (CFC-11) via Bayesian Model Averaging (BMA), a statistical technique that weights model estimates according to how closely they agree with observations. Second, a Bayesian method is presented to find two oceanic transport parameters associated with the age distribution of ocean waters, the transit-time distribution (TTD), by combining an eddying global ocean model's estimate of the TTD with hydrographic observations of CFC-11, temperature, and salinity. Uncertainties associated with objectively mapping irregularly spaced bottle data are quantified by making use of a thin-plate spline and then propagated via the two Bayesian techniques. It is found that the subduction of STMW, SPMW, and SAMW is mostly an advective process, but up to about one-third of STMW subduction is likely due to non-advective processes. Also, while the formation of STMW is mostly due to subduction, the formation of SPMW is mostly due to other processes. About half of the formation of SAMW is due to subduction and half is due to other processes. A combination of air-sea flux, acting on relatively short time scales, and turbulent mixing, acting on a wide range of time scales, is likely the dominant SPMW erosion mechanism. Air-sea flux is likely responsible for most STMW erosion, and turbulent mixing is likely responsible for most SAMW erosion.
Two oceanic transport parameters, the mean age of a water parcel and the half-variance associated with the TTD, estimated using the model's tracers as
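The BMA step weights each model's transport estimate by how well its simulated tracer matches observations. A schematic sketch assuming Gaussian likelihood weights (the study's actual weighting scheme, tracer values, and transport quantities are not reproduced; all numbers are invented):

```python
from math import exp

# BMA-style weighting: each model's simulated tracer is scored by a Gaussian
# likelihood against the observation; the normalised weights then combine the
# models' transport estimates.
obs, obs_sigma = 1.20, 0.10                 # observed CFC-11 (assumed units)
model_tracer = {"m1": 1.15, "m2": 1.45}     # each model's simulated tracer
model_transport = {"m1": 8.0, "m2": 12.0}   # each model's residence time (yr)

lik = {m: exp(-0.5 * ((v - obs) / obs_sigma) ** 2) for m, v in model_tracer.items()}
total = sum(lik.values())
weights = {m: l / total for m, l in lik.items()}
bma_estimate = sum(weights[m] * model_transport[m] for m in weights)
print({m: round(w, 3) for m, w in weights.items()}, round(bma_estimate, 2))
```

The model whose tracer field sits closer to the observation dominates the averaged estimate, which is the qualitative behaviour BMA is chosen for.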

  1. Evaluation of ictal brain SPET using statistical parametric mapping in temporal lobe epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.D.; Kim, H.-J.; Jeon, T.J.; Kim, M.J. [Div. of Nuclear Medicine, Yonsei University Medical College, Seoul (Korea); Lee, B.I.; Kim, O.J. [Dept. of Neurology, Yonsei University Medical College, Seoul (Korea)

    2000-11-01

    An automated voxel-based analysis of brain images using statistical parametric mapping (SPM) is accepted as a standard approach in the analysis of activation studies in positron emission tomography and functional magnetic resonance imaging. This study aimed to investigate whether or not SPM would increase the diagnostic yield of ictal brain single-photon emission tomography (SPET) in temporal lobe epilepsy (TLE). Twenty-one patients (age 27.14±5.79 years) with temporal lobe epilepsy (right in 8, left in 13) who had a successful seizure outcome after surgery and nine normal subjects were included in the study. The data of ictal and interictal brain SPET of the patients and baseline SPET of the normal control group were analysed using SPM96 software. The t statistic SPM(t) was transformed to SPM(Z) with various thresholds of P<0.05, 0.005 and 0.001, and corrected extent threshold P value of 0.05. The SPM data were compared with the conventional ictal and interictal subtraction method. On group comparison, ictal SPET showed increased uptake within the epileptogenic mesial temporal lobe. On single case analysis, ictal SPET images correctly lateralized the epileptogenic temporal lobe in 18 cases, falsely lateralized it in one and failed to lateralize it in two as compared with the mean image of the normal group at a significance level of P<0.05. Comparing the individual ictal images with the corresponding interictal group, 15 patients were correctly lateralized, one was falsely lateralized and four were not lateralized. At significance levels of P<0.005 and P<0.001, correct lateralization of the epileptogenic temporal lobe was achieved in 15 and 13 patients, respectively, as compared with the normal group. On the other hand, when comparison was made with the corresponding interictal group, only 7 out of 21 patients were correctly lateralized at the threshold of P<0.005 and five at P<0.001.
The result of the subtraction method was close to the single case analysis on

  2. Evaluation of ictal brain SPET using statistical parametric mapping in temporal lobe epilepsy

    International Nuclear Information System (INIS)

    Lee, J.D.; Kim, H.-J.; Jeon, T.J.; Kim, M.J.; Lee, B.I.; Kim, O.J.

    2000-01-01

    An automated voxel-based analysis of brain images using statistical parametric mapping (SPM) is accepted as a standard approach in the analysis of activation studies in positron emission tomography and functional magnetic resonance imaging. This study aimed to investigate whether or not SPM would increase the diagnostic yield of ictal brain single-photon emission tomography (SPET) in temporal lobe epilepsy (TLE). Twenty-one patients (age 27.14±5.79 years) with temporal lobe epilepsy (right in 8, left in 13) who had a successful seizure outcome after surgery and nine normal subjects were included in the study. The data of ictal and interictal brain SPET of the patients and baseline SPET of the normal control group were analysed using SPM96 software. The t statistic SPM(t) was transformed to SPM(Z) with various thresholds of P<0.05, 0.005 and 0.001, and corrected extent threshold P value of 0.05. The SPM data were compared with the conventional ictal and interictal subtraction method. On group comparison, ictal SPET showed increased uptake within the epileptogenic mesial temporal lobe. On single case analysis, ictal SPET images correctly lateralized the epileptogenic temporal lobe in 18 cases, falsely lateralized it in one and failed to lateralize it in two as compared with the mean image of the normal group at a significance level of P<0.05. Comparing the individual ictal images with the corresponding interictal group, 15 patients were correctly lateralized, one was falsely lateralized and four were not lateralized. At significance levels of P<0.005 and P<0.001, correct lateralization of the epileptogenic temporal lobe was achieved in 15 and 13 patients, respectively, as compared with the normal group. On the other hand, when comparison was made with the corresponding interictal group, only 7 out of 21 patients were correctly lateralized at the threshold of P<0.005 and five at P<0.001. The result of the subtraction method was close to the single case analysis on
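The significance levels quoted in these two records (P<0.05, 0.005, 0.001) correspond to one-sided Z cutoffs once SPM(t) has been transformed to SPM(Z). A quick way to compute those cutoffs:

```python
from statistics import NormalDist

# One-sided Z thresholds for the voxel-level significance levels used in the
# study, assuming the SPM(t) map has already been transformed to SPM(Z).
for p in (0.05, 0.005, 0.001):
    z = NormalDist().inv_cdf(1 - p)
    print(f"P<{p}: Z > {z:.2f}")
```

Lowering P from 0.05 to 0.001 raises the Z cutoff from about 1.64 to about 3.09, which is why fewer patients were lateralized at the stricter thresholds.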

  3. Statistical evaluation of low cycle loading curves parameters for structural materials by mechanical characteristics

    International Nuclear Information System (INIS)

    Daunys, Mykolas; Sniuolis, Raimondas

    2006-01-01

    About 300 welded joint materials used in nuclear power engineering were tested under monotonic tension and low cycle loading at Kaunas University of Technology together with the St. Petersburg Central Research Institute of Structural Materials in 1970-2000. The main mechanical, low cycle loading and fracture characteristics of base metals, weld metals and some heat-affected zones of welded joints were determined during these experiments. Analytical dependences of low cycle fatigue parameters on the mechanical characteristics of structural materials were proposed on the basis of a large number of experimental data obtained by the same methods and testing equipment. When these dependences are used, expensive low cycle fatigue tests may be omitted, and it is possible to compute low cycle loading curve parameters and lifetime for structural materials from the main mechanical characteristics given in technical manuals. Dependences of low cycle loading curve parameters on mechanical characteristics for several groups of structural materials used in the Russian nuclear power industry are obtained by statistical methods and proposed in this paper.

  4. Evaluating anemometer drift: A statistical approach to correct biases in wind speed measurement

    Science.gov (United States)

    Azorin-Molina, Cesar; Asin, Jesus; McVicar, Tim R.; Minola, Lorenzo; Lopez-Moreno, Juan I.; Vicente-Serrano, Sergio M.; Chen, Deliang

    2018-05-01

    Recent studies on observed wind variability have revealed a decline (termed "stilling") of near-surface wind speed during the last 30-50 years over many mid-latitude terrestrial regions, particularly in the Northern Hemisphere. The well-known impact of cup anemometer drift (i.e., wear on the bearings) on the observed weakening of wind speed has been mentioned as a potential contributor to the declining trend. However, to date, no research has quantified its contribution to stilling based on measurements, most likely due to the lack of quantification of the ageing effect. In this study, a 3-year field experiment (2014-2016) with 10-minute paired wind speed measurements from one new and one malfunctioning (i.e., old bearings) SEAC SV5 cup anemometer, a model used by the Spanish Meteorological Agency in automatic weather stations since the mid-1980s, was conducted to assess for the first time the role of anemometer drift in wind speed measurement. The results showed a statistically significant impact of anemometer drift on wind speed measurements, with the old anemometer measuring lower wind speeds than the new one. Biases show a marked temporal pattern and a clear dependency on wind speed, with both weak and strong winds causing significant biases. This pioneering quantification of biases has allowed us to define two regression models that correct up to 37% of the artificial bias in wind speed due to measurement with an old anemometer.
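The correction described above amounts to regressing the new (reference) anemometer's readings on the old one's and applying the fitted line to records from worn instruments. A synthetic sketch of that idea (the bias model below is invented for illustration, not the study's fitted relationship):

```python
import random

# Paired wind-speed readings: the old (worn-bearing) anemometer is assumed to
# read low with a speed-dependent bias. Fit new ~ a + b*old to correct it.
random.seed(1)
old, new = [], []
for _ in range(200):
    true = random.uniform(0.5, 12.0)                      # "true" wind speed, m/s
    new.append(true + random.gauss(0, 0.1))               # new sensor: unbiased
    old.append(0.9 * true - 0.2 + random.gauss(0, 0.1))   # worn sensor: reads low

n = len(old)
mx, my = sum(old) / n, sum(new) / n
b = sum((x - mx) * (y - my) for x, y in zip(old, new)) / sum((x - mx) ** 2 for x in old)
a = my - b * mx
corrected = [a + b * x for x in old]
mean_bias_before = my - mx
mean_bias_after = my - sum(corrected) / n
print(f"slope={b:.3f}, intercept={a:.3f}, bias {mean_bias_before:.2f} -> {mean_bias_after:.2f} m/s")
```

By construction, ordinary least squares removes the mean bias entirely on the fitting data; the study's 37% figure reflects that real drift is only partly captured by such regression models.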

  5. Environmental offenses in 1999. An evaluation of statistics; Umweltdelikte 1999. Eine Auswertung der Statistiken

    Energy Technology Data Exchange (ETDEWEB)

    Goertz, M.; Werner, J.; Heinrich, M.

    2000-11-01

    A total of 43,382 known environmental offenses was recorded in 1999, as compared to 47,900 in 1998. There were 36,663 offenses under chapter 29 of the StGB (offenses against the environment), 48 environmentally relevant offenses under chapter 28 of the StGB (offenses constituting a public danger), and 6,671 offenses against ancillary environmental criminal law (BNatSchG, ChemG, etc.). The statistics cover all offenses involving chemical pollutants, radioactive materials, ionizing and non-ionizing radiation, noise and vibrations. The number of unreported offenses is estimated to be much higher. [German] Mit insgesamt 43 382 bekannt gewordenen umweltrelevanten Straftaten ist die Umweltkriminalitaet im Jahre 1999 gegenueber 47 900 im Jahre 1998 gesunken. Die 43 382 Taten verteilen sich auf 36 663 Taten nach dem 29. Abschnitt des StGB (Straftaten gegen die Umwelt), 48 umweltrelevante Taten nach dem 28. Abschnitt des StGB (gemeingefaehrliche Straftaten) und 6 671 Straftaten nach dem Umweltnebenstrafrecht (BNatSchG, ChemG u.a.). Erfasst wurden alle Delikte bezueglich chemischer Schadstoffe, radioaktiver Stoffe, ionisierender und nichtionisierender Strahlen, Laerm und Erschuetterungen. Das Dunkelfeld bei Umweltstraftaten wird von den mit ihnen befassten Personen im Vergleich zu tatsaechlich angezeigten Umweltdelikten ueberwiegend als bedeutend groesser eingeschaetzt. (orig.)

  6. MEG/EEG source reconstruction, statistical evaluation, and visualization with NUTMEG.

    Science.gov (United States)

    Dalal, Sarang S; Zumer, Johanna M; Guggisberg, Adrian G; Trumpis, Michael; Wong, Daniel D E; Sekihara, Kensuke; Nagarajan, Srikantan S

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions.

  7. Quantitative evaluation of ASiR image quality: an adaptive statistical iterative reconstruction technique

    Science.gov (United States)

    Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan

    2012-03-01

    Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This reconstruction method combines the idealized system representation known from the standard filtered back projection (FBP) algorithm with the strength of iterative reconstruction by including a noise model in the reconstruction scheme. The algorithm models how noise propagates through the reconstruction steps, feeds this model back into the loop, and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper the effect of ASiR on the contrast-to-noise ratio is studied using the low-contrast module of the Catphan phantom. The experiments were done on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast-to-noise ratio, the images from ASiR can be obtained using 60% less current, leading to a dose reduction of the same amount.
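The contrast-to-noise ratio used in such phantom evaluations is typically the ROI-background mean difference divided by the background noise. A minimal sketch with invented pixel values:

```python
import statistics

# Contrast-to-noise ratio (CNR) for a low-contrast insert against background:
# CNR = |mean_roi - mean_bg| / sd_bg. Pixel values below are illustrative.
roi = [112, 115, 110, 113, 114, 111]  # low-contrast insert pixels
bg  = [100, 103, 98, 101, 99, 102]    # background pixels

cnr = abs(statistics.mean(roi) - statistics.mean(bg)) / statistics.stdev(bg)
print(round(cnr, 2))
```

Because iterative reconstruction lowers the background noise (the denominator), the same CNR can be reached at a lower tube current, which is the dose argument the abstract makes.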

  8. Permafrost distribution in the European Alps: calculation and evaluation of an index map and summary statistics

    Directory of Open Access Journals (Sweden)

    L. Boeckli

    2012-07-01

    Full Text Available The objective of this study is the production of an Alpine Permafrost Index Map (APIM) covering the entire European Alps. A unified statistical model that is based on Alpine-wide permafrost observations is used for debris and bedrock surfaces across the entire Alps. The explanatory variables of the model are mean annual air temperatures, potential incoming solar radiation and precipitation. Offset terms were applied to make model predictions for topographic and geomorphic conditions that differ from the terrain features used for model fitting. These offsets are based on literature review and involve some degree of subjective choice during model building. The assessment of the APIM is challenging because limited independent test data are available for comparison and these observations represent point information in a spatially highly variable topography. The APIM provides an index that describes the spatial distribution of permafrost and comes together with an interpretation key that helps to assess map uncertainties and to relate map contents to their actual expression in the terrain. The map can be used as a first resource to estimate permafrost conditions at any given location in the European Alps in a variety of contexts such as research and spatial planning.

    Results show that Switzerland likely is the country with the largest permafrost area in the Alps, followed by Italy, Austria, France and Germany. Slovenia and Liechtenstein may have marginal permafrost areas. In all countries the permafrost area is expected to be larger than the glacier-covered area.

  9. Methodological and Statistical Quality in Research Evaluating Nutritional Attitudes in Sports.

    Science.gov (United States)

    Kouvelioti, Rozalia; Vagenas, George

    2015-12-01

    The assessment of dietary attitudes and behaviors provides information of interest to sports nutritionists. Although there has been little analysis of the quality of research undertaken in this field, there is evidence of a number of flaws and methodological concerns in some of the studies in the available literature. This review undertook a systematic assessment of the attributes of research assessing the nutritional knowledge and attitudes of athletes and coaches. Sixty questionnaire-based studies were identified by a search of official databases using specific key terms with subsequent analysis by certain inclusion-exclusion criteria. These studies were then analyzed using 33 research quality criteria related to the methods, questionnaires, and statistics used. We found that many studies did not provide information on critical issues such as research hypotheses (92%), the gaining of ethics approval (50%) or informed consent (35%), or acknowledgment of limitations in the implementation of studies or interpretation of data (72%). Many of the samples were nonprobabilistic (85%) and rather small (42%). Many questionnaires were of unknown origin (30%), validity (72%), and reliability (70%) and resulted in low (≤ 60%) response rates (38%). Pilot testing was not undertaken in 67% of the studies. Few studies dealt with sample size (2%), power (3%), assumptions (7%), confidence intervals (3%), or effect sizes (3%). Improving some of these problems and deficits may enhance future research in this field.

  10. Statistical Evaluation of the Emissions Level Of CO, CO2 and HC Generated by Passenger Cars

    Directory of Open Access Journals (Sweden)

    Claudiu Ursu

    2014-12-01

    Full Text Available This paper evaluates differences in the emission levels of CO, CO2 and HC generated by passenger cars in different running regimes and at different times, in order to identify measures for reducing pollution. A sample of Dacia Logan passenger cars (n = 515) manufactured during 2004-2007, equipped with spark-ignition engines and assigned to emission standards EURO 3 (E3) and EURO 4 (E4), was analyzed. These cars were evaluated twice at the periodical technical inspection (ITP) in two running regimes (slow idle and accelerated idle). Using the t-test for paired samples (Paired Samples T Test), the results showed that there are significant differences between the emission levels (CO, CO2, HC) generated by the Dacia Logan passenger cars at the two assessments, and regression analysis showed that these differences are not significantly influenced by turnover differences.
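The paired-samples comparison described above can be sketched as follows; the emission readings are synthetic stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical CO readings (% vol) for the same 20 cars measured in two
# regimes: slow idle vs. accelerated idle (values are invented).
slow_idle = rng.normal(0.30, 0.05, size=20)
accelerated_idle = slow_idle - rng.normal(0.05, 0.02, size=20)  # systematically lower

# Paired-samples t-test: each car is its own control, so we test
# whether the mean within-car difference is zero.
t_stat, p_value = stats.ttest_rel(slow_idle, accelerated_idle)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Because each car serves as its own control, the paired test gains power over an unpaired comparison of the two regimes.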

  11. Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico

    International Nuclear Information System (INIS)

    2012-01-01

    This report evaluates the chemistry of seep water occurring in three desert drainages near Shiprock, New Mexico: Many Devils Wash, Salt Creek Wash, and Eagle Nest Arroyo. Through the use of geochemical plotting tools and multivariate statistical analysis techniques, analytical results of samples collected from the three drainages are compared with the groundwater chemistry at a former uranium mill in the Shiprock area (the Shiprock site), managed by the U.S. Department of Energy Office of Legacy Management. The objective of this study was to determine, based on the water chemistry of the samples, if statistically significant patterns or groupings are apparent between the sample populations and, if so, whether there are any reasonable explanations for those groupings.

  12. The implementation of the Strategy Europe 2020 objectives in European Union countries: the concept analysis and statistical evaluation.

    Science.gov (United States)

    Stec, Małgorzata; Grzebyk, Mariola

    2018-01-01

    The European Union (EU), striving to create economic dominance on the global market, has prepared comprehensive development programmes, initially the Lisbon Strategy and then the Strategy Europe 2020. The attainment of the strategic goals included in these prospective development programmes is intended to transform the EU into the most competitive knowledge-based economy in the world. This paper presents a statistical evaluation of the progress made by EU member states in meeting the Europe 2020 targets. As the basis of the assessment, the authors propose a general synthetic measure in dynamic terms, which allows objective comparison of EU member states on 10 major statistical indicators. The results indicate that most EU countries show average progress in the realisation of Europe's development programme, which may suggest that the goals will not be achieved in the prescribed time. It is particularly important to monitor the implementation of Europe 2020 in order to arrive at the right decisions, which will guarantee the accomplishment of the EU's development strategy.

  13. A Framework for Establishing Standard Reference Scale of Texture by Multivariate Statistical Analysis Based on Instrumental Measurement and Sensory Evaluation.

    Science.gov (United States)

    Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye

    2016-01-13

    A framework for establishing a standard reference scale for texture is proposed using multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for a texture attribute (hardness) using well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of food were selected to construct the hardness standard reference scale. The results indicate that the regression between the estimated sensory value and the instrumentally measured value is significant (R² = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
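The selection pipeline described in the abstract (cluster the instrumental profiles, then pick representative samples) can be roughly sketched as follows; the texture profiles, cluster count, and nearest-to-centroid selection rule are illustrative assumptions, not the paper's actual protocol.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Hypothetical instrumental texture profiles (e.g. hardness, springiness,
# cohesiveness, chewiness) for 30 food products in 3 latent groups.
offsets = np.repeat(np.arange(3), 10)[:, None] * 3.0
X = rng.normal(size=(30, 4)) + offsets

# Hierarchical (Ward) clustering groups products with similar profiles.
labels = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")

# PCA via SVD on centered data: PC1 summarizes the dominant texture gradient.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]
print(f"PC1 span: {pc1.min():.1f} to {pc1.max():.1f}")

# Candidate reference sample per cluster: the product closest to its
# cluster centroid (a stand-in for the paper's multi-criteria selection).
refs = []
for k in np.unique(labels):
    members = np.where(labels == k)[0]
    centroid = X[members].mean(axis=0)
    refs.append(int(members[np.argmin(np.linalg.norm(X[members] - centroid, axis=1))]))
print("candidate reference products:", sorted(refs))
```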

  14. Multivariate Statistical Analysis of Water Chemistry in Evaluating the Origin of Contamination in Many Devils Wash, Shiprock, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2012-12-31

    This report evaluates the chemistry of seep water occurring in three desert drainages near Shiprock, New Mexico: Many Devils Wash, Salt Creek Wash, and Eagle Nest Arroyo. Through the use of geochemical plotting tools and multivariate statistical analysis techniques, analytical results of samples collected from the three drainages are compared with the groundwater chemistry at a former uranium mill in the Shiprock area (the Shiprock site), managed by the U.S. Department of Energy Office of Legacy Management. The objective of this study was to determine, based on the water chemistry of the samples, if statistically significant patterns or groupings are apparent between the sample populations and, if so, whether there are any reasonable explanations for those groupings.

  15. The relationship between the number of loci and the statistical support for the topology of UPGMA trees obtained from genetic distance data.

    Science.gov (United States)

    Highton, R

    1993-12-01

    An analysis of the relationship between the number of loci utilized in an electrophoretic study of genetic relationships and the statistical support for the topology of UPGMA trees is reported for two published data sets: Highton and Larson (Syst. Zool. 28:579-599, 1979), an analysis of the relationships of 28 species of plethodontine salamanders, and Hedges (Syst. Zool. 35:1-21, 1986), a similar study of 30 taxa of Holarctic hylid frogs. The statistical support for the topology at each node of the UPGMA trees was determined by both the bootstrap and jackknife methods as the number of loci was increased. The results show that the bootstrap and jackknife probabilities supporting the topology at some nodes of UPGMA trees increase as the number of loci utilized in a study is increased, as expected for nodes with groupings that reflect phylogenetic relationships. The pattern of increase varies and is especially rapid in the case of groups with no close relatives. At nodes that likely do not represent correct phylogenetic relationships, the bootstrap probabilities do not increase and often decline with the addition of more loci.
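The core computation, bootstrap resampling of loci followed by UPGMA (average-linkage) clustering of a genetic-distance matrix, can be sketched as below; the taxa, distance measure, and clade of interest are synthetic assumptions for illustration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)

# Synthetic "loci" scores for 6 taxa forming two clades of three:
# taxa 0-2 share base1, taxa 3-5 share base2.
n_loci = 200
base1, base2 = rng.normal(size=n_loci), rng.normal(size=n_loci)
taxa = np.array([base1 + rng.normal(0, 0.3, n_loci) for _ in range(3)] +
                [base2 + rng.normal(0, 0.3, n_loci) for _ in range(3)])

def upgma_labels(data, k=2):
    """UPGMA = average-linkage clustering on a pairwise distance matrix."""
    dm = np.array([[np.mean(np.abs(a - b)) for b in data] for a in data])
    return fcluster(linkage(squareform(dm, checks=False), method="average"),
                    t=k, criterion="maxclust")

# Bootstrap over loci: resample columns with replacement, rebuild the
# tree, and count how often taxa {0, 1, 2} still form their own group.
n_boot, support = 100, 0
for _ in range(n_boot):
    cols = rng.integers(0, n_loci, n_loci)
    labels = upgma_labels(taxa[:, cols])
    support += len(set(labels[:3])) == 1 and set(labels[:3]).isdisjoint(labels[3:])
print(f"bootstrap support for clade (0,1,2): {support / n_boot:.2f}")
```

With more loci the within-clade and between-clade distances separate more sharply, which is the mechanism behind the rising bootstrap support the paper reports.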

  16. Evaluating statistical approaches to leverage large clinical datasets for uncovering therapeutic and adverse medication effects.

    Science.gov (United States)

    Choi, Leena; Carroll, Robert J; Beck, Cole; Mosley, Jonathan D; Roden, Dan M; Denny, Joshua C; Van Driest, Sara L

    2018-04-18

    Phenome-wide association studies (PheWAS) have been used to discover many genotype-phenotype relationships and have the potential to identify therapeutic and adverse drug outcomes using longitudinal data within electronic health records (EHRs). However, statistical methods for PheWAS applied to longitudinal EHR medication data have not been established. In this study, we developed methods to address two challenges faced when reusing EHR data for this purpose: confounding by indication, and low exposure and event rates. We used Monte Carlo simulation to assess propensity score (PS) methods, focusing on two of the most commonly used approaches, PS matching and PS adjustment, to address confounding by indication. We also compared two logistic regression approaches (the default Wald method vs. Firth's penalized maximum likelihood, PML) to address complete separation due to sparse data with low exposure and event rates. PS adjustment resulted in greater power than PS matching, while controlling Type I error at 0.05. The PML method provided reasonable p-values, even in cases with complete separation, with well-controlled Type I error rates. Using PS adjustment and the PML method, we identified novel latent drug effects in pediatric patients exposed to two common antibiotic drugs, ampicillin and gentamicin. R packages PheWAS and EHR are available at https://github.com/PheWAS/PheWAS and at CRAN (https://www.r-project.org/), respectively. The R script for data processing and the main analysis is available at https://github.com/choileena/EHR. leena.choi@vanderbilt.edu. Supplementary data are available at Bioinformatics online.
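A minimal sketch of the PS-adjustment idea under simulated confounding by indication is shown below; the data-generating model is invented, the outcome model is a plain (Wald-style) maximum-likelihood fit, and Firth's penalization is deliberately not implemented here.

```python
import numpy as np

rng = np.random.default_rng(7)

def fit_logistic(X, y, n_iter=25):
    """Ordinary maximum-likelihood logistic regression via Newton-Raphson.
    (Firth's PML would add a Jeffreys-prior correction against complete
    separation; it is omitted from this sketch.)"""
    X = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return beta

# Confounding by indication: sicker patients (higher z) are more likely
# both to receive the drug and to have the event; true drug effect = 0.
n = 5000
z = rng.normal(size=n)
drug = (rng.random(n) < 1.0 / (1.0 + np.exp(-2.0 * z))).astype(float)
event = (rng.random(n) < 1.0 / (1.0 + np.exp(2.0 - 1.5 * z))).astype(float)

# Naive outcome model (drug only) absorbs the confounding from z.
naive = fit_logistic(drug[:, None], event)

# PS adjustment: model exposure from z, then include the propensity
# score as a covariate in the outcome model.
ps_beta = fit_logistic(z[:, None], drug)
ps = 1.0 / (1.0 + np.exp(-(ps_beta[0] + ps_beta[1] * z)))
adjusted = fit_logistic(np.column_stack([drug, ps]), event)

print(f"naive log-OR {naive[1]:.2f} vs PS-adjusted log-OR {adjusted[1]:.2f}")
```

The naive coefficient is inflated by the shared dependence on severity; conditioning on the propensity score removes most, though not necessarily all, of that bias.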

  17. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon

    2014-03-01

    High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus, respectively. The applicability of the CEB-FIP (Comité Euro-international du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion better predicts the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of the fly ash substitution rate was initially higher than that of the w/b ratio but decreased over time. © 2014 Elsevier Ltd. All rights reserved.
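The regression comparison via mean squared residuals can be sketched as follows; the mix-design variables, coefficients, and the nonlinear stand-in for a "volume-based" predictor are all synthetic assumptions (a real conversion would use the constituents' densities).

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical HVFA mixes: water/binder ratio and fly-ash substitution
# rate, with strength generated from the weight-based variables (MPa).
n = 60
w_b = rng.uniform(0.3, 0.5, n)            # water-to-binder ratio
fa = rng.uniform(0.4, 0.7, n)             # fly-ash substitution rate
strength = 80.0 - 60.0 * w_b - 25.0 * fa + rng.normal(0, 2, n)

def mean_squared_residual(X, y):
    """Least-squares fit of y on [1, X]; returns the mean squared residual."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return float(np.mean((y - X1 @ beta) ** 2))

# Compare two candidate predictor bases for the same response: the
# weight-based variables vs. a nonlinear stand-in for a volume basis.
msr_weight = mean_squared_residual(np.column_stack([w_b, fa]), strength)
msr_volume = mean_squared_residual(np.column_stack([np.sqrt(w_b), fa]), strength)
print(f"MSR weight-based: {msr_weight:.2f}  MSR volume-based: {msr_volume:.2f}")
```

The lower MSR identifies the mix-proportion basis whose linear model leaves less unexplained variance, which is how the paper ranks the two bases.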

  18. Determination of grain boundary mobility during recrystallization by statistical evaluation of electron backscatter diffraction measurements

    International Nuclear Information System (INIS)

    Basu, I.; Chen, M.; Loeck, M.; Al-Samman, T.; Molodov, D.A.

    2016-01-01

    One of the key aspects influencing microstructural design pathways in metallic systems is grain boundary motion. The present work introduces a method by means of which direct measurement of the grain boundary mobility vs. misorientation dependence is made possible. The technique utilizes datasets acquired by serial electron backscatter diffraction (EBSD) measurements. The experimental EBSD measurements are collectively analyzed, whereby the datasets were used to obtain grain boundary mobility and grain aspect ratio with respect to grain boundary misorientation. The proposed method is further validated using cellular automata (CA) simulations. Single crystal aluminium was cold rolled and scratched in order to nucleate random orientations. Subsequent annealing at 300 °C resulted in grains growing, in the direction normal to the scratch, into a single deformed orientation. Growth selection was observed, wherein the boundaries with misorientations close to the Σ7 CSL orientation relationship (38° 〈111〉) migrated considerably faster. The obtained boundary mobility distribution exhibited a non-monotonic behavior with a maximum corresponding to a misorientation of 38° ± 2° about 〈111〉 axes ± 4°, which was 10–100 times higher than the mobility values of random high angle boundaries. Correlation with the grain aspect ratio values indicated a strong growth anisotropy displayed by the fast growing grains. The observations have been discussed in terms of the influence of grain boundary character on grain boundary motion during recrystallization. - Highlights: • Statistical microstructure method to measure grain boundary mobility during recrystallization • Method implementation independent of material or crystal structure • Mobility of the Σ7 boundaries in 5N Al was calculated as 4.7 × 10⁻⁸ m⁴/(J·s). • Pronounced growth selection in the recrystallizing nuclei in Al • Boundary mobility values during recrystallization 2–3 orders of magnitude

  19. Statistical quality control charts for liver transplant process indicators: evaluation of a single-center experience.

    Science.gov (United States)

    Varona, M A; Soriano, A; Aguirre-Jaime, A; Barrera, M A; Medina, M L; Bañon, N; Mendez, S; Lopez, E; Portero, J; Dominguez, D; Gonzalez, A

    2012-01-01

    Liver transplantation, the best option for many end-stage liver diseases, is indicated in more candidates than donor availability can meet. In this situation, this demanding treatment must achieve excellence, accessibility, and patient satisfaction to be ethical, scientific, and efficient. The current consensus on quality measurement promoted by the Sociedad Española de Trasplante Hepático (SETH) seeks to define criteria, indicators, and standards for liver transplantation in Spain. Following this recommendation, the Canary Islands liver program has studied its experience. We separated the 411 cadaveric transplants performed in the last 15 years into 2 groups: the first 100 and the subsequent 311. The 8 criteria of SETH 2010 were correctly fulfilled. For most indicators the outcomes were favorable, with actuarial survival at 1, 3, 5, and 10 years of 84%, 79%, 76%, and 65%, respectively; excellent results in retransplant rates (early 0.56% and long-term 5.9%), primary nonfunction rate (0.43%), waiting list mortality (13.34%), and patient satisfaction (91.5%). On the other hand, some mortality indicators were worse, such as perioperative, postoperative, and early mortality with normal graft function, as well as the reoperation rate. After analyzing the series with statistical quality control charts, we observed an improvement in all indicators, even in the apparently worst one, early mortality with normal graft function, in a stable program. Such results helped us to discover specific areas in which to improve the program. Our study has shown that applying quality measurement, as the SETH consensus recommends, is a useful tool despite being a time-consuming process. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Evaluation Of Statistical Models For Forecast Errors From The HBV-Model

    Science.gov (United States)

    Engeland, K.; Kolberg, S.; Renard, B.; Stensland, I.

    2009-04-01

    Three statistical models for the forecast errors of inflow to the Langvatn reservoir in northern Norway were constructed and tested according to how well the distributions and median values of the forecast errors fit the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order autoregressive model was constructed for the forecast errors; the parameters were conditioned on climatic conditions. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order autoregressive model was constructed for the forecast errors. For the last model, positive and negative errors were modeled separately: the errors were first NQT-transformed before a model was constructed in which the mean values were conditioned on climate, forecasted inflow, and the previous day's error. To test the three models we applied three criteria: we wanted a) the median values to be close to the observed values; b) the forecast intervals to be narrow; c) the distribution to be correct. The results showed that it is difficult to obtain a correct model for the forecast errors, and that the main challenge is to account for the autocorrelation in the errors. Models 1 and 2 gave similar results, and their main drawback is that the distributions are not correct: the 95% forecast intervals were well identified, but smaller forecast intervals were over-estimated and larger intervals were under-estimated. Model 3 gave a distribution that fits better, but the median values do not fit well since the autocorrelation is not properly accounted for. If the 95% forecast interval is of interest, Model 2 is recommended; if the whole distribution is of interest, Model 3 is recommended.
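Model 1's structure (transform the inflows, then fit a first-order autoregressive model to the forecast errors) can be sketched as below, using a log transform, i.e. the λ = 0 case of Box-Cox, and synthetic inflows.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic inflows: observed = forecast * exp(error), with an AR(1)
# error process, so log is the natural transform here.
n = 500
forecast = rng.lognormal(3.0, 0.5, n)
e = np.empty(n)
e[0] = rng.normal(0, 0.1)
for t in range(1, n):
    e[t] = 0.7 * e[t - 1] + rng.normal(0, 0.1)
observed = forecast * np.exp(e)

# Transform, form forecast errors, and fit err_t = phi * err_{t-1} + eps
# by least squares.
err = np.log(observed) - np.log(forecast)
phi = float(err[1:] @ err[:-1] / (err[:-1] @ err[:-1]))
resid = err[1:] - phi * err[:-1]

# 95% interval for tomorrow's error, centered on phi * (today's error):
# this is how accounting for autocorrelation narrows the interval.
center, half = phi * err[-1], 1.96 * resid.std()
print(f"phi = {phi:.2f}, next-error 95% interval: "
      f"[{center - half:.3f}, {center + half:.3f}]")
```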

  1. Forecasting methodologies for Ganoderma spore concentration using combined statistical approaches and model evaluations

    Science.gov (United States)

    Sadyś, Magdalena; Skjøth, Carsten Ambelas; Kennedy, Roy

    2016-04-01

    High concentrations of Ganoderma spp. spores were observed in Worcester, UK, during 2006-2010. These basidiospores are known to cause sensitization due to their allergen content and their small dimensions, which enable them to penetrate the lower part of the human respiratory tract. Establishing a link between symptoms of sensitization to Ganoderma spp. and other basidiospores is challenging due to the lack of information regarding spore concentrations in the air. Hence, aerobiological monitoring should be conducted and, if possible, extended with the construction of forecast models. The daily mean concentration of allergenic Ganoderma spp. spores in the atmosphere of Worcester was measured using a 7-day volumetric spore sampler over five consecutive years. The relationships between the presence of spores in the air and weather parameters were examined. Forecast models were constructed for Ganoderma spp. spores using advanced statistical techniques, i.e. multivariate regression trees and artificial neural networks. Dew point temperature, along with maximum temperature, was the most important factor influencing the presence of spores in the air of Worcester. Based on these two major factors and several others of lesser importance, thresholds for certain levels of fungal spore concentration, i.e. low (0-49 s m-3), moderate (50-99 s m-3), high (100-149 s m-3) and very high (≥ 150 s m-3), could be designated. Despite some deviation in the results obtained by the artificial neural networks, the authors achieved an accurate forecasting model (correlations between observed and predicted values varied from rs = 0.57 to rs = 0.68).

  2. Correlating tephras and cryptotephras using glass compositional analyses and numerical and statistical methods: Review and evaluation

    Science.gov (United States)

    Lowe, David J.; Pearce, Nicholas J. G.; Jorgensen, Murray A.; Kuehn, Stephen C.; Tryon, Christian A.; Hayward, Chris L.

    2017-11-01

    We define tephras and cryptotephras and their components (mainly ash-sized particles of glass ± crystals in distal deposits) and summarize the basis of tephrochronology as a chronostratigraphic correlational and dating tool for palaeoenvironmental, geological, and archaeological research. We then document and appraise recent advances in analytical methods used to determine the major, minor, and trace elements of individual glass shards from tephra or cryptotephra deposits to aid their correlation and application. Protocols developed recently for the electron probe microanalysis of major elements in individual glass shards help to improve data quality and standardize reporting procedures. A narrow electron beam (diameter ∼3-5 μm) can now be used to analyze smaller glass shards than previously attainable. Reliable analyses of 'microshards' (defined here as glass shards T2 test). Randomization tests can be used where distributional assumptions such as multivariate normality underlying parametric tests are doubtful. Compositional data may be transformed and scaled before being subjected to multivariate statistical procedures including calculation of distance matrices, hierarchical cluster analysis, and PCA. Such transformations may make the assumption of multivariate normality more appropriate. A sequential procedure using Mahalanobis distance and the Hotelling two-sample T2 test is illustrated using glass major element data from trachytic to phonolitic Kenyan tephras. All these methods require a broad range of high-quality compositional data which can be used to compare 'unknowns' with reference (training) sets that are sufficiently complete to account for all possible correlatives, including tephras with heterogeneous glasses that contain multiple compositional groups. Currently, incomplete databases are tending to limit correlation efficacy. The development of an open, online global database to facilitate progress towards integrated, high
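The sequential procedure mentioned above, Mahalanobis distance combined with a two-sample Hotelling T² test, can be sketched as follows; the glass compositions are simulated stand-ins, not real tephra data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical major-element glass data (e.g. SiO2, K2O wt%) for shards
# from an "unknown" tephra and a candidate correlative reference tephra.
unknown = rng.multivariate_normal([62.0, 4.5], [[1.0, 0.2], [0.2, 0.1]], size=25)
reference = rng.multivariate_normal([62.3, 4.6], [[1.0, 0.2], [0.2, 0.1]], size=30)

def hotelling_t2(a, b):
    """Two-sample Hotelling T2 test; returns (T2, F statistic, p-value)."""
    n1, n2, p = len(a), len(b), a.shape[1]
    diff = a.mean(0) - b.mean(0)
    # Pooled covariance, then the Mahalanobis-type quadratic form.
    S = ((n1 - 1) * np.cov(a, rowvar=False) +
         (n2 - 1) * np.cov(b, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(S, diff)
    f = t2 * (n1 + n2 - p - 1) / ((n1 + n2 - 2) * p)
    p_val = stats.f.sf(f, p, n1 + n2 - p - 1)
    return float(t2), float(f), float(p_val)

t2, f, p_val = hotelling_t2(unknown, reference)
print(f"T2 = {t2:.2f}, F = {f:.2f}, p = {p_val:.3f}")
```

A large p-value fails to distinguish the unknown from the reference, consistent with a correlation; a small p-value argues against it, subject to the usual caveats about heterogeneous glass populations.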

  3. Impact of adaptive statistical iterative reconstruction on radiation dose in evaluation of trauma patients.

    Science.gov (United States)

    Maxfield, Mark W; Schuster, Kevin M; McGillicuddy, Edward A; Young, Calvin J; Ghita, Monica; Bokhari, S A Jamal; Oliva, Isabel B; Brink, James A; Davis, Kimberly A

    2012-12-01

    A recent study showed that computed tomographic (CT) scans contributed 93% of the radiation exposure of 177 patients admitted to our Level I trauma center. Adaptive statistical iterative reconstruction (ASIR) is an algorithm that reduces the noise level in reconstructed images and therefore allows the use of less ionizing radiation during CT scans without significantly affecting image quality. ASIR was instituted on all CT scans performed on trauma patients in June 2009. Our objective was to determine if implementation of ASIR reduced radiation dose without compromising patient outcomes. We identified 300 patients activating the trauma system before and after the implementation of ASIR imaging. After applying inclusion criteria, 245 charts were reviewed. Baseline demographics, presenting characteristics, number of delayed diagnoses, and missed injuries were recorded. The postexamination volume CT dose index (CTDIvol) and dose-length product (DLP) reported by the scanner for CT scans of the chest, abdomen, and pelvis and for CT scans of the brain and cervical spine were recorded. Subjective image quality was compared between the two groups. For CT scans of the chest, abdomen, and pelvis, the mean CTDIvol was lower with ASIR (17.1 mGy vs. 14.2 mGy). For CT scans of the brain and cervical spine, the mean CTDIvol was likewise lower with ASIR (61.7 mGy vs. 49.6 mGy). There was no subjective difference in image quality between ASIR and non-ASIR scans; all CT scans were deemed of good or excellent image quality. There were no delayed diagnoses or missed injuries related to CT scanning identified in either group. Implementation of ASIR imaging for CT scans performed on trauma patients led to a nearly 20% reduction in ionizing radiation without compromising outcomes or image quality. Therapeutic study, level IV.

  4. Evaluating the statistical performance of less applied algorithms in classification of worldview-3 imagery data in an urbanized landscape

    Science.gov (United States)

    Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa

    2018-03-01

    In the recent decade, analysis of remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, with supervised image classification techniques playing a central role. Using a high-resolution Worldview-3 image over a mixed urbanized landscape in Iran, three less commonly applied image classification methods, Bagged CART, stochastic gradient boosting, and neural network with feature extraction, were tested and compared with two prevalent methods: random forest and support vector machine with a linear kernel. Each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross validation, independent validation, and validation with the full training data set. Moreover, the statistical significance of differences between the classification methods was assessed using ANOVA and Tukey tests. In general, the results showed that random forest, with a marginal difference compared to Bagged CART and the stochastic gradient boosting model, is the best-performing method, whilst based on independent validation there was no significant difference between the performances of the classification methods. Finally, the neural network with feature extraction and the linear support vector machine had better processing speed than the others.
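The run-comparison machinery (repeated runs, then ANOVA with a Tukey post hoc test) can be sketched as below; the accuracy values are invented for illustration, and `scipy.stats.tukey_hsd` requires SciPy ≥ 1.8.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Hypothetical overall accuracies from ten repeated runs of three
# classifiers on the same scene (values are made up).
rf = rng.normal(0.90, 0.01, 10)    # random forest
gbm = rng.normal(0.89, 0.01, 10)   # stochastic gradient boosting
svm = rng.normal(0.85, 0.01, 10)   # linear SVM

# One-way ANOVA: is any mean accuracy different from the others?
f_stat, p = stats.f_oneway(rf, gbm, svm)
print(f"ANOVA: F = {f_stat:.1f}, p = {p:.2g}")

# Tukey's HSD pinpoints which pairs differ while controlling the
# family-wise error rate across all pairwise comparisons.
res = stats.tukey_hsd(rf, gbm, svm)
print(res.pvalue.round(3))
```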

  5. The evaluation of the statistical monomineral thermobarometric methods for the reconstruction of the lithospheric mantle structure

    Science.gov (United States)

    Ashchepkov, I.; Vishnyakova, E.

    2009-04-01

    The modified versions of the thermobarometers for mantle assemblages were revised using statistical calibrations against the results of Opx thermobarometry. The modifications suggest calculating the Fe# of coexisting olivine (Fe#Ol) according to statistical approximations by regressions obtained from a kimberlite xenolith database including >700 associations. They allow the Opx-based TP estimates to be reproduced and a complete set of TP values to be obtained for mantle xenoliths and xenocrysts. For GARNET, three variants of the barometer give similar results. The first is published (Ashchepkov, 2006). The second calculates the Al2O3 in orthopyroxene from garnet according to the procedure: xCrOpx=Cr2O3/CaO)/FeO/MgO/500 xAlOpx=1/(3875*(exp(Cr2O3^0.2/CaO)-0.3)*CaO/989+16)-XcrOpx Al2O3=xAlOp*24.64/Cr2O3^0.2*CaO/2.+FeO*(ToK-501)/1002 and then supposes use of the Al2O3-in-Opx barometer (McGregor, 1974). The third variant is a transformation of the G. Grutter (2006) method by introducing the influence of temperature: P=40+(Cr2O3)-4.5)*10/3-20/7*CaO+(ToC)*0.0000751*MgO)*CaO+2.45*Cr2O3*(7-xv(5,8)) -Fe*0.5, with the correction for P>55: P=55+(P-55)*55/(1+0.9*P). The average of these three methods gives appropriate values, comparable with those determined with the (McGregor, 1974) barometer. Temperatures are estimated according to the transformed Krogh thermometer: Fe#Ol_Gar=Fe#Gar/2+(T(K)-1420)*0.000112+0.01. For the deep-seated associations (P>55 kbar): T=T-(0.25/(0.4-0.004*(20-P))-0.38/Ca)*275+51*Ca*Cr2-378*CaO-0.51)-Cr/Ca2*5+Mg/(Fe+0.0001)*17.4. ILMENITE: P= ((TiO2-23.)*2.15-(T0-973)/20*MgO*Cr2O3 and next P=(60-P)/6.1+P. ToK is determined according to (Taylor et al., 1998): Fe#Ol_Chr =(Fe/(Fe+Mg)ilm -0.35)/2.252-0.0000351*(T(K)-973). CHROMITE: the equations for PT estimates from chromite compositions are P=Cr/(Cr+Al)*T(K)/14.+Ti*0.10, with the next iteration P=-0.0053*P^2+1.1292*P+5.8059 +0.00135*T(K)*Ti*410-8.2; for P> 57, P=P+(P-57)*2.75. Temperature estimates are according to the O

  6. A Statistical Evaluation of Rules for Biochemical Failure After Radiotherapy in Men Treated for Prostate Cancer

    International Nuclear Information System (INIS)

    Bellera, Carine A.; Hanley, James A.; Joseph, Lawrence; Albertsen, Peter C.

    2009-01-01

    Purpose: The 'PSA nadir + 2' rule, defined as any rise of 2 ng/ml above the current prostate-specific antigen (PSA) nadir, has replaced the American Society for Therapeutic Radiology and Oncology (ASTRO) rule, defined as three consecutive PSA rises, as the indicator of biochemical failure (BF) after radiotherapy in patients treated for prostate cancer. We propose an original approach to evaluating BF rules, based on the PSA doubling time (PSAdt) as the gold standard and on a simulation process allowing us to evaluate the BF rules under multiple settings (different frequencies and durations of follow-up, different PSAdt). Methods and Materials: We relied on a retrospective, population-based cohort of individuals identified by the Connecticut Tumor Registry and treated for localized prostate cancer with radiotherapy. We estimated the 470 underlying true PSA trajectories, including the PSAdt, using a Bayesian hierarchical changepoint model. Next, we simulated realistic data sets that accurately reflect the systematic and random variations observed in PSA series. We estimated sensitivity and specificity by comparing the simulated PSA series to the underlying true PSAdt. Results: For follow-up of more than 3 years, the specificity of the PSA nadir + 2 rule was systematically greater than that of the ASTRO criterion. In a few settings, the nadir + 2 rule had lower sensitivity than the ASTRO rule. The PSA nadir + 2 rule appeared less dependent on the frequency and duration of follow-up than the ASTRO rule. Conclusions: Our results refine earlier findings, as the BF rules were evaluated across various parameters. In most settings, the PSA nadir + 2 rule outperforms the ASTRO criterion.
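The two competing definitions of biochemical failure are simple enough to state in code; this is a sketch of the rule logic only, not the paper's simulation framework, and the PSA series below is hypothetical.

```python
def astro_failure(psa):
    """ASTRO rule: biochemical failure on the third consecutive PSA rise.
    Returns the index of the third rise, or None if no failure."""
    rises = 0
    for i in range(1, len(psa)):
        rises = rises + 1 if psa[i] > psa[i - 1] else 0
        if rises == 3:
            return i
    return None

def nadir_plus_2_failure(psa):
    """'Nadir + 2' rule: failure when PSA exceeds the current nadir
    by 2 ng/ml. Returns the index of failure, or None."""
    nadir = psa[0]
    for i, v in enumerate(psa):
        nadir = min(nadir, v)
        if v >= nadir + 2.0:
            return i
    return None

# A hypothetical post-radiotherapy PSA series (ng/ml): a transient
# "bounce" trips the ASTRO rule but not the nadir + 2 rule.
series = [4.0, 1.2, 0.8, 1.0, 1.3, 1.5, 1.1, 0.9]
print(astro_failure(series), nadir_plus_2_failure(series))  # → 5 None
```

This illustrates why the nadir + 2 rule tends to be more specific: small consecutive fluctuations can satisfy the ASTRO criterion without any clinically meaningful rise above the nadir.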

  7. Statistical evaluation of physical examinations conducted under atomic bomb survivors medical treatment law Nagasaki

    Energy Technology Data Exchange (ETDEWEB)

    Ohri, Shigehisa; Shimada, Daisaburo; Ishida, Morthiro; Onishi, Shigeyuki

    1961-09-19

    An evaluation was made of the reliability and validity of the information obtained from the first examination completed under the ABSMTL. Results of the analysis show clearly that the materials can hardly be utilized for studying the relationship between findings obtained from the medical examination and distance from the hypocenter. From the standpoint of clinical medicine, the lack of exactness in the examinations may be a major difficulty. However, as long as the degree of inexactness of the medical examinations is distributed equally among all sample members, comparison of the findings may be made within the limits of their accuracy. 4 references, 1 figure, 3 tables.

  8. Statistical Evaluations of Variations in Dairy Cows’ Milk Yields as a Precursor of Earthquakes

    Science.gov (United States)

    Yamauchi, Hiroyuki; Hayakawa, Masashi; Asano, Tomokazu; Ohtani, Nobuyo; Ohta, Mitsuaki

    2017-01-01

    Simple Summary There are many reports of abnormal changes occurring in various natural systems prior to earthquakes. Unusual animal behavior is one of these abnormalities; however, there are few objective indicators and, to date, reliability has remained uncertain. We found that milk yields of dairy cows decreased prior to an earthquake in our previous case study. In this study, we examined the reliability of decreases in milk yields as a precursor for earthquakes using long-term observation data. The results showed that milk yields decreased approximately three weeks before earthquakes. We have come to the conclusion that dairy cow milk yields are applicable as an objectively observable form of unusual animal behavior prior to earthquakes, and that dairy cows respond to some physical or chemical precursors of earthquakes. Abstract Previous studies have provided quantitative data regarding unusual animal behavior prior to earthquakes; however, few studies include long-term observational data. Our previous study revealed that the milk yields of dairy cows decreased prior to an extremely large earthquake. To clarify whether milk yields decrease prior to earthquakes, we examined the relationship between earthquakes of various magnitudes and daily milk yields. The observation period was one year. Cross-correlation analyses revealed a significant negative correlation between earthquake occurrence and milk yields approximately three weeks beforehand. Approximately a week and a half beforehand, a positive correlation was revealed, and the correlation gradually receded to zero as the day of the earthquake approached. Future studies that use data from a longer observation period are needed because this study considered only ten earthquakes and therefore does not have strong statistical power. Additionally, we compared the milk yields with the subionospheric very low frequency/low frequency (VLF/LF) propagation data indicating ionospheric perturbations.
The results showed
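    The lagged-correlation idea behind this analysis can be sketched on synthetic data (the series below are invented, not the study's records): a milk-yield series dips 21 days before a single "earthquake" day, and scanning Pearson correlations over lags recovers that lead time.

```python
import numpy as np

n = 200
quake = np.zeros(n)
quake[100] = 1.0                  # one event, on day 100
milk = np.full(n, 30.0)
milk[79] = 25.0                   # yield dip 21 days before the event

def lagged_corr(lag):
    # correlate quake occurrence on day t with milk yield on day t - lag
    return np.corrcoef(quake[lag:], milk[:n - lag])[0, 1]

lags = list(range(1, 31))
corrs = [lagged_corr(k) for k in lags]
best_lag = lags[int(np.argmin(corrs))]  # lag with the most negative correlation
```

    With the dip and the event perfectly aligned at a 21-day offset, the minimum correlation falls at `best_lag = 21`; real data would of course show a much noisier profile.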

  9. Statistical evaluation of tablet coating processes: influence of pan design and solvent type

    Directory of Open Access Journals (Sweden)

    Valdomero Pereira de Melo Junior

    2010-12-01

    Full Text Available Partially and fully perforated pan coaters are among the most relevant types of equipment currently used in the process of coating tablets. The goal of this study was to assess the performance differences among these types of equipment employing a factorial design. This statistical approach allowed the simultaneous study of the process variables and verification of interactions among them. The study included partially perforated and fully perforated pan coaters, aqueous and organic solvents, as well as a hypromellose-based immediate-release coating. The dependent variables were process time, energy consumption, mean weight of tablets, and process yield. For the tests, placebo tablets with a mean weight of 250 mg were produced, divided into eight lots of two kilograms each, and coated in duplicate using both partially perforated and fully perforated pan coaters. The results showed a significant difference between the types of equipment used (partially and fully perforated pan coaters) with regard to process time and energy consumption, whereas no significant difference was identified for mean weight of the coated tablets and process yield.

  10. Evaluation of Statistical Methods for Modeling Historical Resource Production and Forecasting

    Science.gov (United States)

    Nanzad, Bolorchimeg

    This master's thesis project consists of two parts. Part I of the project compares modeling of historical resource production and forecasting of future production trends using the logit/probit transform advocated by Rutledge (2011) with conventional Hubbert curve fitting, using global coal production as a case study. The conventional Hubbert/Gaussian method fits a curve to historical production data, whereas a logit/probit transform uses a linear fit to a subset of transformed production data. Within the errors and limitations inherent in this type of statistical modeling, these methods provide comparable results. That is, despite the apparent goodness of fit achievable using the logit/probit methodology, neither approach provides a significant advantage over the other in either explaining the observed data or in making future projections. For mature production regions, those that have already substantially passed peak production, results obtained by either method are closely comparable and reasonable, and estimates of ultimately recoverable resources obtained by either method are consistent with geologically estimated reserves. In contrast, for immature regions, estimates of ultimately recoverable resources generated by either of these alternative methods are unstable and thus need to be used with caution. Although the logit/probit transform generates a high quality of fit to historical production data, this approach provides no new information compared to conventional Gaussian or Hubbert-type models and may have the effect of masking the noise and/or instability in the data and the derived fits. In particular, production forecasts for immature or marginally mature production systems based on either method need to be regarded with considerable caution. Part II of the project investigates the utility of a novel alternative method for multicyclic Hubbert modeling tentatively termed "cycle-jumping" wherein overlap of multiple cycles is limited. The
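    The linear-transform variant of this family of methods can be sketched with Hubbert linearization on synthetic logistic data (the parameters r, K and the data are assumptions for illustration, not the thesis's figures): for logistic cumulative production Q(t) = K / (1 + exp(-r(t - t0))), annual production P satisfies P/Q ≈ r(1 - Q/K), so P/Q is linear in Q and the intercept on the Q axis estimates the ultimately recoverable resource K.

```python
import numpy as np

r_true, K_true, t0 = 0.1, 1000.0, 100.0
t = np.arange(0, 201)
Q = K_true / (1.0 + np.exp(-r_true * (t - t0)))   # cumulative production
P = np.diff(Q)                                     # yearly production
Qm = 0.5 * (Q[1:] + Q[:-1])                        # midpoint cumulative

# linear fit of P/Q against Q:  P/Q = r - (r/K) Q
slope, intercept = np.polyfit(Qm, P / Qm, 1)
r_est = intercept
K_est = -intercept / slope                         # ultimately recoverable estimate
```

    On noiseless logistic data the fit recovers r and K almost exactly; the thesis's point is that on noisy, immature production series the same linear fit can look deceptively good while the extrapolated K remains unstable.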

  11. Bladder cancer staging in CT urography: effect of stage labels on statistical modeling of a decision support system

    Science.gov (United States)

    Gandikota, Dhanuj; Hadjiiski, Lubomir; Cha, Kenny H.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.; Weizer, Alon; Alva, Ajjai; Paramagul, Chintana; Wei, Jun; Zhou, Chuan

    2018-02-01

    In bladder cancer, stage T2 is an important threshold in the decision of administering neoadjuvant chemotherapy. Our long-term goal is to develop a quantitative computerized decision support system (CDSS-S) to aid clinicians in accurate staging. In this study, we examined the effect of the stage labels of the training samples on modeling such a system. We used a data set of 84 bladder cancers imaged with CT Urography (CTU). At clinical staging prior to treatment, 43 lesions were staged as below stage T2 and 41 were stage T2 or above. After cystectomy and pathological staging, which is considered the gold standard, 10 of the lesions were upstaged to stage T2 or above. After correcting the stage labels, 33 lesions were below stage T2, and 51 were stage T2 or above. For the CDSS-S, the lesions were segmented using our AI-CALS method and radiomic features were extracted. We trained a linear discriminant analysis (LDA) classifier with leave-one-case-out cross validation to distinguish between bladder lesions of stage T2 or above and those below stage T2. The CDSS-S was trained and tested with the corrected post-cystectomy labels, and as a comparison, the CDSS-S was also trained with understaged pre-treatment labels and tested on lesions with corrected labels. The test AUC for the CDSS-S trained with corrected labels was 0.89 ± 0.04. For the CDSS-S trained with understaged pre-treatment labels and tested on the lesions with corrected labels, the test AUC was 0.86 ± 0.04. The likelihood of stage T2 or above for 9 out of the 10 understaged lesions was correctly increased for the CDSS-S trained with corrected labels. The CDSS-S is sensitive to the accuracy of stage labeling. The CDSS-S trained with correct labels shows promise in prediction of the bladder cancer stage.
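    The training scheme, two-class LDA with leave-one-case-out cross validation, can be sketched with a minimal Fisher discriminant on synthetic "radiomic" features (the data, dimensions, and separation here are invented for illustration, not derived from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n_per, d = 30, 4
below_t2 = rng.normal(0.0, 1.0, (n_per, d))        # label 0: below stage T2
above_t2 = rng.normal(3.0, 1.0, (n_per, d))        # label 1: stage T2 or above
X = np.vstack([below_t2, above_t2])
y = np.array([0] * n_per + [1] * n_per)

def lda_predict(Xtr, ytr, x):
    m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    Sw = np.cov(Xtr[ytr == 0].T) + np.cov(Xtr[ytr == 1].T)  # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)               # Fisher discriminant direction
    thresh = w @ (m0 + m1) / 2.0                   # midpoint decision boundary
    return int(w @ x > thresh)

# leave-one-case-out: train on all but one lesion, test on the held-out one
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    correct += lda_predict(X[mask], y[mask], X[i]) == y[i]
accuracy = correct / len(y)
```

    The label-noise experiment in the abstract amounts to rerunning this loop with some training `y` entries flipped while testing against the corrected labels.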

  12. SOME STATISTICAL RESULTS REGARDING THE EVALUATION OF THE QUALITY OF THE MASTER EDUCATION

    Directory of Open Access Journals (Sweden)

    POPA IRIMIE

    2011-12-01

    Full Text Available The article emphasizes aspects of evaluating the quality of higher education. In certain countries, questionnaires on the quality of the activity of HEIs (Higher Education Institutions) are administered by specialized institutions under the Ministry of Education or by university associations. The evaluation principles derive from well-known economic and social theories and have even evolved into evaluation models (see the SERVQUAL model). As a result of the Bologna Declaration (1999), the European Union has the objective of becoming an international reference for higher education quality and of being more attractive than at present to students, professors, and researchers from other regions of the world. To fulfill these objectives, ENQA (the European Association for Quality Assurance in Higher Education) recommends that HEIs include in their development plans five quality-related principles described in the article. The practical study refers to the results of a questionnaire applied to master's business students at a Romanian university. To assess the students' satisfaction with the master's programme they are enrolled in, a questionnaire was administered to a sample of 200 such persons. The responses were then analyzed using multidimensional data analysis methods; of these, the present research is based on multiple response analysis. In the questionnaire, students were asked to report their level of satisfaction with different aspects of the educational process they are involved in. The questions were constructed as five-level Likert items, ensuring a connection between the answers given to each of the questions assessing the quality of the programme. Only 0.2% of the answers given relate to aspects with which the students were not satisfied at all. These answers represent 3.2% of the number of respondents.
30% of the students were slightly satisfied, returning 57 choices

  13. Subjective evaluation of a peer support program by women with breast cancer: A qualitative study.

    Science.gov (United States)

    Ono, Miho; Tsuyumu, Yuko; Ota, Hiroko; Okamoto, Reiko

    2017-01-01

    The aim of this study was to determine the subjective evaluation of a breast cancer peer support program based on a survey of the participants who completed the program. Semistructured interviews were held with 10 women with breast cancer. The responses were subjected to a qualitative inductive analysis. Women with breast cancer who participated in the breast cancer peer support program evaluated the features of the program and cited benefits, such as "Receiving individual peer support tailored to your needs," "Easily consulted trained peer supporters," and "Excellent coordination." Also indicated were benefits of the peer support that was received, such as "Receiving peer-specific emotional support," "Obtaining specific experimental information," "Re-examining yourself," and "Making preparations to move forward." The women also spoke of disadvantages, such as "Strict management of personal information" and "Matching limitations." In this study, the subjective evaluation of a peer support program by women with breast cancer was clarified. The women with breast cancer felt that the program had many benefits and some disadvantages. These results suggest that there is potential for peer support-based patient-support programs in medical services that are complementary to the current support that is provided by professionals. © 2016 Japan Academy of Nursing Science.

  14. Evaluation of the ICS and DEW scatter correction methods for low statistical content scans in 3D PET

    International Nuclear Information System (INIS)

    Sossi, V.; Oakes, T.R.; Ruth, T.J.

    1996-01-01

    The performance of the Integral Convolution and the Dual Energy Window scatter correction methods in 3D PET has been evaluated over a wide range of statistical content of acquired data (1M to 400M events). The order in which scatter correction and detector normalization should be applied has also been investigated. Phantom and human neuroreceptor studies were used with the following figures of merit: axial and radial uniformity, sinogram and image noise, contrast accuracy, and contrast accuracy uniformity. Both scatter correction methods perform reliably over the range of numbers of events examined. Normalization applied after scatter correction yields better radial uniformity and fewer image artifacts.

  15. Comparison of statistical evaluation of criticality calculations for reactors VENUS-F and ALFRED

    Directory of Open Access Journals (Sweden)

    Janczyszyn Jerzy

    2017-01-01

    Full Text Available Limitations of the correct evaluation of keff in Monte Carlo calculations, claimed in the literature, need to be addressed more thoroughly, apart from the nuclear data uncertainty. The respective doubts concern: the proper number of discarded initial cycles, the sufficient number of neutrons in a cycle, and the recognition of and dealing with the keff bias. Calculations were performed to provide more information on these points with the use of the MCB code, solely for fast cores. We present the applied methods and results, such as: calculation results for the stability of variance, the relation between the standard deviation reported by MCNP and that obtained from the dispersion of multiple independent keff values, and second-order standard deviations obtained from different numbers of grouped results. All results obtained for numbers of discarded initial cycles from 0 to 3000 were analysed, leading to interesting conclusions.
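    The second check mentioned, comparing the single-run reported standard deviation with the dispersion of independent keff values, can be illustrated with synthetic numbers (not MCB/MCNP output): for uncorrelated cycles the two estimates agree, while in real calculations inter-cycle correlation makes the reported value an underestimate.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_cycle, n_cycles, n_runs = 0.005, 400, 50
# each row: the active-cycle keff estimates of one independent run
cycles = 1.0 + sigma_cycle * rng.standard_normal((n_runs, n_cycles))

keff_runs = cycles.mean(axis=1)                        # one keff per run
# SD a code would report for a single run, from the spread of its cycles
reported_sd = cycles.std(axis=1, ddof=1).mean() / np.sqrt(n_cycles)
# SD from the dispersion of the independent run results
dispersion_sd = keff_runs.std(ddof=1)

ratio = dispersion_sd / reported_sd                    # ≈ 1 for uncorrelated cycles
```

    A ratio systematically above 1 on real data would signal the cycle-correlation problem the abstract alludes to.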

  16. Using complete measurement statistics for optimal device-independent randomness evaluation

    International Nuclear Information System (INIS)

    Nieto-Silleras, O; Pironio, S; Silman, J

    2014-01-01

    The majority of recent works investigating the link between non-locality and randomness, e.g. in the context of device-independent cryptography, do so with respect to some specific Bell inequality, usually the CHSH inequality. However, the joint probabilities characterizing the measurement outcomes of a Bell test are richer than just the degree of violation of a single Bell inequality. In this work we show how to take this extra information into account in a systematic manner in order to optimally evaluate the randomness that can be certified from non-local correlations. We further show that taking into account the complete set of outcome probabilities is equivalent to optimizing over all possible Bell inequalities, thereby allowing us to determine the optimal Bell inequality for certifying the maximal amount of randomness from a given set of non-local correlations. (paper)

  17. Evaluation of Groundwater for Arsenic Contamination Using Hydrogeochemical Properties and Multivariate Statistical Methods in Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Abdullah S. Al-Farraj

    2013-01-01

    Full Text Available The aim of this research is to evaluate arsenic distribution and associated hydrogeochemical parameters in 27 randomly selected boreholes representing aquifers in the Al-Kharj geothermal fields of Saudi Arabia. Arsenic was detected at all sites, with 92.5% of boreholes yielding concentrations above the WHO permissible limit of 10 μg/L. The maximum concentration recorded was 122 μg/L (SD = 29 μg/L, skewness = 1.87). The groundwater types were mainly Ca2+-Mg2+-SO42−-Cl− and Na+-Cl−-SO42−, accounting for 67% of the total composition. Principal component analysis (PCA) showed that the main source of arsenic release was geothermal in nature and was linked to processes similar to those involved in the release of boron. The PCA yielded five components, which accounted for 44.1%, 17.0%, 10.1%, 8.4%, and 6.5% of the total variance. The first component had positive loadings for arsenic and boron along with other hydrogeochemical parameters, indicating that the primary sources of As mobilization are regional geothermal systems and the weathering of minerals. The remaining principal components indicated reductive dissolution of iron oxyhydroxides as a possible mechanism. Spatial evaluation of the PCA results indicated that this secondary mechanism of arsenic mobilization may be active and correlates positively with total organic carbon. The aquifers were found to be contaminated to a high degree, with organic carbon ranging from 0.57 mg/L to 21.42 mg/L and high concentrations of NO3− ranging from 8.05 mg/L to 248.2 mg/L.
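    The co-loading of arsenic and boron on the first principal component can be sketched with PCA on standardized synthetic hydrogeochemical variables (the data below simply build in an As-B correlation through a hidden common factor to mimic a geothermal signature; none of it comes from the study):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
geothermal = rng.standard_normal(n)                 # hidden common factor
arsenic = 2.0 * geothermal + 0.3 * rng.standard_normal(n)
boron   = 1.5 * geothermal + 0.3 * rng.standard_normal(n)
nitrate = rng.standard_normal(n)                    # unrelated variable
X = np.column_stack([arsenic, boron, nitrate])

Z = (X - X.mean(0)) / X.std(0, ddof=1)              # standardize each variable
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z.T))   # PCA of the correlation matrix
order = np.argsort(eigval)[::-1]
explained = eigval[order] / eigval.sum()            # variance fraction per component
pc1 = eigvec[:, order[0]]                           # loadings of the first component
```

    As in the abstract's interpretation, the correlated pair (As, B) loads on PC1 with the same sign and dominates the explained variance, while the unrelated variable is picked up by a later component.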

  18. Statistical Analysis for Subjective and Objective Evaluations of Dental Drill Sounds.

    Directory of Open Access Journals (Sweden)

    Tomomi Yamada

    Full Text Available The sound produced by a dental air turbine handpiece (dental drill can markedly influence the sound environment in a dental clinic. Indeed, many patients report that the sound of a dental drill elicits an unpleasant feeling. Although several manufacturers have attempted to reduce the sound pressure levels produced by dental drills during idling based on ISO 14457, the sound emitted by such drills under active drilling conditions may negatively influence the dental clinic sound environment. The physical metrics related to the unpleasant impressions associated with dental drill sounds have not been determined. In the present study, psychological measurements of dental drill sounds were conducted with the aim of facilitating improvement of the sound environment at dental clinics. Specifically, we examined the impressions elicited by the sounds of 12 types of dental drills in idling and drilling conditions using a semantic differential. The analysis revealed that the impressions of dental drill sounds varied considerably between idling and drilling conditions and among the examined drills. This finding suggests that measuring the sound of a dental drill in idling conditions alone may be insufficient for evaluating the effects of the sound. We related the results of the psychological evaluations to those of measurements of the physical metrics of equivalent continuous A-weighted sound pressure levels (LAeq and sharpness. Factor analysis indicated that impressions of the dental drill sounds consisted of two factors: "metallic and unpleasant" and "powerful". LAeq had a strong relationship with "powerful impression", calculated sharpness was positively related to "metallic impression", and "unpleasant impression" was predicted by the combination of both LAeq and calculated sharpness. 
The present analyses indicate that, in addition to a reduction in sound pressure level, refining the frequency components of dental drill sounds is important for creating a
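    The combination-of-metrics finding, that "unpleasant impression" is predicted jointly by LAeq and sharpness, corresponds to a simple multiple regression, sketched here on synthetic ratings (all numbers and coefficients invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
laeq = rng.uniform(60.0, 90.0, n)        # A-weighted sound pressure level, dB
sharp = rng.uniform(1.0, 3.0, n)         # sharpness, acum
# synthetic "unpleasant impression" rating driven by both metrics plus noise
rating = 0.08 * laeq + 0.5 * sharp + 0.05 * rng.standard_normal(n)

X = np.column_stack([np.ones(n), laeq, sharp])      # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, rating, rcond=None)   # ordinary least squares
b0, b_laeq, b_sharp = coef
```

    Recovering nonzero coefficients on both predictors mirrors the abstract's conclusion that reducing level alone is not enough and the frequency content (sharpness) must also be refined.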

  19. Supporting Sustainable Markets Through Life Cycle Assessment: Evaluating emerging technologies, incorporating uncertainty and the consumer perspective

    Science.gov (United States)

    Merugula, Laura

    As civilization's collective knowledge grows, we are met with the realization that human-induced physical and biological transformations, influenced by exogenous psychosocial and economic factors, affect virtually every ecosystem on the planet. Despite improvements in energy generation and efficiencies, demand for material goods and energy services increases with no sign of a slowing pace. Sustainable development requires a multi-prong approach that involves reshaping demand, consumer education, sustainability-oriented policy, and supply chain management that does not serve the expansionist mentality. Thus, decision support tools are needed that inform developers, consumers, and policy-makers for short-term and long-term planning. These tools should incorporate uncertainty through quantitative methods as well as by qualitatively framing the model as imperfect but necessary and adequate. A case study is presented of the manufacture and deployment of utility-scale wind turbines evaluated for a proposed change in blade manufacturing. It provides the first life cycle assessment (LCA) evaluating the impact of carbon nanofibers, an emerging material, proposed for integration into wind power generation systems as blade reinforcement. Few LCAs of nanoproducts are available in the scientific literature because research and development (R&D) for applications continues to outpace R&D for environmental, health, and safety (EHS) and life cycle impacts. LCAs of emerging technologies are crucial for informing developers of potential impacts, especially where market growth is swift and dissipative. A second case study is presented that evaluates consumer choice between disposable and reusable beverage cups. While there are a few studies that attempt to make the comparison using LCA, none adequately address uncertainty, nor are they representative of the typical American consumer. By disaggregating U.S.
power generation into 26 subregional grid production mixes and evaluating

  20. Supporting the Future Total Force: A Methodology for Evaluating Potential Air National Guard Mission Assignments

    National Research Council Canada - National Science Library

    Lynch, Kristin F; Drew, John G; Sleeper, Sally; Williams, William A; Masters, James M; Luangkesorn, Louis; Tripp, Robert S; Lichter, Dahlia S; Roll, Charles R

    2007-01-01

    ... trained, highly experienced personnel with no aircraft to operate and support. The authors develop a methodology to evaluate missions that could be transferred from the active component to the ANG without significant cost to the total force...

  1. MASH evaluation of TxDOT high-mounting-height temporary work zone sign support system.

    Science.gov (United States)

    2017-02-01

    The objective of this research was to develop a nonproprietary, lightweight, crashworthy, temporary work-zone single sign support for use with an aluminum sign substrate. The device is intended to meet the evaluation criteria in American Association ...

  2. To Be or Not to Be?: A Method for Evaluating Academic Support Units.

    Science.gov (United States)

    Cohn, Roy E.

    1979-01-01

    Reasons for the budget cut vulnerability of instructional support agencies and the haphazard, capricious criteria often used to judge their effectiveness are discussed. An evaluation strategy for rational decision-making is proposed. (Author/PHR)

  3. Evaluation of Combat Service Support Logistics Concepts for Supplying a USMC Regimental Task Force

    National Research Council Canada - National Science Library

    Lenhardt, Thomas

    2001-01-01

    .... This thesis evaluates existing and proposed concepts on how to best use the CSSE resources of a Force Service Support Group to transport supplies to Regimental Combat Teams over constrained networks...

  4. Evaluation of strategies to promote learning using ICT: the case of a course on Topics of Multivariate Statistics

    Directory of Open Access Journals (Sweden)

    Mario Miguel Ojeda Ramírez

    2017-01-01

    Full Text Available Currently, some teachers implement different methods in order to promote education linked to reality, to provide more effective training and meaningful learning. Active methods aim to increase motivation and create scenarios in which student participation is central to achieving more meaningful learning. This paper reports on the implementation of a process of educational innovation in the course Topics of Multivariate Statistics, offered in the degree in Statistical Sciences and Techniques at the Universidad Veracruzana (Mexico). The strategies used, such as data collection, the design and development of projects, and individual and group presentations, are described. The information and communication technologies (ICT) used were EMINUS, the distributed education platform of the Universidad Veracruzana, and file management with Dropbox, plus communication via WhatsApp. The R software was used for statistical analysis and for making presentations in academic forums. To explore students' perceptions, in-depth interviews were conducted and indicators for evaluating student satisfaction were defined; the results show positive evidence, concluding that students were satisfied with the way the course was designed and implemented. They also stated that they feel able to apply what they have learned, and that these strategies made them feel prepared for their professional life. Finally, some suggestions for improving the course in future editions are included.

  5. Statistical re-evaluation of the ASME K{sub IC} and K{sub IR} fracture toughness reference curves

    Energy Technology Data Exchange (ETDEWEB)

    Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)

    1998-11-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve' has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the Master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the Master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the K{sub IC}-reference curve. Similarly, the 1% lower bound Master curve corresponds to the K{sub IR}-reference curve. (orig.)
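    The Master curve's probability-indexed lower bounds can be sketched numerically. The formulas below follow the commonly cited ASTM E1921 form (an assumption here, not taken from the original report): median toughness K_med(T) = 30 + 70·exp[0.019(T − T0)] in MPa·m^0.5 with T in °C, and the bound at failure probability p given by K_p(T) = 20 + (ln(1/(1−p)))^0.25 · (11 + 77·exp[0.019(T − T0)]).

```python
import math

def k_median(T, T0):
    # median fracture toughness of the Master curve, MPa·m^0.5
    return 30.0 + 70.0 * math.exp(0.019 * (T - T0))

def k_bound(p, T, T0):
    # toughness curve at cumulative failure probability p (0 < p < 1)
    return 20.0 + (math.log(1.0 / (1.0 - p))) ** 0.25 * (
        11.0 + 77.0 * math.exp(0.019 * (T - T0)))

# At T = T0 the median is 100 MPa·m^0.5; the 5% and 1% curves sit below it,
# mirroring the report's pairing of K_IC with the 5% bound and K_IR with the 1%.
k50 = k_bound(0.50, 0.0, 0.0)
k05 = k_bound(0.05, 0.0, 0.0)
k01 = k_bound(0.01, 0.0, 0.0)
```

    The regulatory subtlety the abstract raises is visible here: the "lower bound" is not absolute but shifts with the chosen p.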

  6. EVALUATION OF TRAINING AND‐METHODOLOGICAL SUPPORT OF UNIVERSITY COURSES (in Russian

    Directory of Open Access Journals (Sweden)

    Natalia BELKINA

    2012-04-01

    Full Text Available The quality of teaching at a Higher Education Institution certainly depends on the integrity and quality of its training and methodological support. However, in order to improve this quality it is necessary to have a sound methodology for the evaluation of such support. This article contains a list of recommended university teaching course materials, criteria for evaluating their separate components, and an approach to calculating the quality levels of the separate components and of the teaching course materials as a whole.

  7. Application of Statistics to Evaluate Iranian Analytical Laboratories Proficiency: Case of Aflatoxins in Pistachio

    Directory of Open Access Journals (Sweden)

    Leila Fotouhi

    2015-12-01

    Full Text Available The aim of this study was to evaluate the utility of a proficiency testing program among a limited number of local laboratories as an alternative to the IUPAC/CITAC guide on proficiency testing with a limited number of participants, especially where international schemes are not accessible. As a sample scheme, we planned to determine aflatoxins (B1, G1, B2, G2, and total) in an Iranian pistachio matrix. A part of a naturally contaminated pistachio sample was tested for sufficient homogeneity by a competent laboratory, and homogenized sub-samples were then distributed among participants all across the country. The median of the participants' results was selected as the assigned value. Student's t-test was applied to show there is no significant difference between the assigned value and the mean of the homogeneity test results obtained by the competent laboratory. Calculated z-scores showed that 6 out of 8 results for aflatoxin B1, 7 out of 8 results for aflatoxin B2, 5 out of 8 results for aflatoxin G1, 7 out of 8 results for aflatoxin G2, and 6 out of 9 results for total aflatoxin were in the satisfactory range. Together, our studies indicate that the approach described here is highly cost efficient and applicable for quality assurance of test results when there is no access to international proficiency testing providers.
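    The scoring scheme reduces to: assigned value = median of participants' results, a robust standard deviation, and z = (x − assigned)/sd, with |z| ≤ 2 counted as satisfactory. A minimal sketch using invented aflatoxin results (µg/kg) and a MAD-based robust SD (an assumption; the study may have used a different robust estimator):

```python
import numpy as np

results = np.array([4.8, 5.0, 5.05, 5.1, 5.2, 5.3, 5.5, 6.9])

assigned = np.median(results)                   # assigned value: 5.15 for these data
mad = np.median(np.abs(results - assigned))     # median absolute deviation
robust_sd = 1.483 * mad                         # MAD scaled to a normal-consistent SD
z = (results - assigned) / robust_sd            # one z-score per laboratory

satisfactory = np.sum(np.abs(z) <= 2)           # |z| <= 2 counts as satisfactory
```

    Here the single outlying result (6.9) receives |z| well above 2, so 7 of the 8 laboratories score as satisfactory, matching the "k out of n satisfactory" style of reporting in the abstract.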

  8. Application of a new methodology to evaluate Dnb limits based on statistical propagation of uncertainties

    International Nuclear Information System (INIS)

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermal-hydraulic safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The current methodology in use at Eletronuclear to determine the DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate the DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a limit design value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIIC/MIT code, modified for Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best-estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)

  9. Multivariate statistical techniques for the evaluation of groundwater quality of Amaravathi River Basin: South India

    Science.gov (United States)

    Loganathan, K.; Ahamed, A. Jafar

    2017-12-01

    The study of groundwater in the Amaravathi River basin of Karur District resulted in a large geochemical data set. A total of 24 water samples were collected and analyzed for physico-chemical parameters; the abundance of cation and anion concentrations was in the following order: Na+ > Ca2+ > Mg2+ > K+ and Cl- > HCO3- > SO42-. The correlation matrix shows that the basic ionic chemistry is influenced by Na+, Ca2+, Mg2+, and Cl-, and also suggests that the samples contain Na+-Cl-, Ca2+-Cl-, and mixed Ca2+-Mg2+-Cl- types of water. The association of HCO3-, SO42-, and F- is weaker than that of the other parameters due to the scarcity of minerals bearing these ions. PCA extracted six components, which account for the data composition, explaining 81% of the total variance of the data set, and allowed the selected parameters to be grouped according to common features and the contribution of each group to the overall variation in water quality to be evaluated. Cluster analysis results show that groundwater quality does not vary extensively as a function of season, but shows two main clusters.

  10. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high 214Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
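
    The convergent k-means procedure named above can be sketched as an assignment/update loop that stops once cluster labels are stable. This is a minimal illustration under assumed data: the report used three radiometric variables, 6991 observations, and four clusters, whereas the toy example below uses four 2-D points and k = 2.

```python
# Minimal convergent k-means sketch: alternate assignment and update steps
# until the cluster labels stop changing. Points and starting centroids
# below are illustrative, not NURE survey data.
def kmeans(points, centroids):
    labels = None
    while True:
        new_labels = [
            min(range(len(centroids)),
                key=lambda j: sum((p - c) ** 2 for p, c in zip(pt, centroids[j])))
            for pt in points
        ]
        if new_labels == labels:          # converged: assignments are stable
            return labels, centroids
        labels = new_labels
        for j in range(len(centroids)):   # update: move centroid to member mean
            members = [pt for pt, l in zip(points, labels) if l == j]
            if members:
                centroids[j] = [sum(d) / len(members) for d in zip(*members)]

pts = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
labels, cents = kmeans(pts, [[0.0, 0.0], [5.0, 5.0]])
print(labels)   # two tight groups -> [0, 0, 1, 1]
```

    In the report's pipeline, a hierarchical pass (or a PCA) supplies the starting partition that a loop like this then refines.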

  11. EEG windowed statistical wavelet scoring for evaluation and discrimination of muscular artifacts

    International Nuclear Information System (INIS)

    Vialatte, François-Benoit; Cichocki, Andrzej; Solé-Casals, Jordi

    2008-01-01

    EEG recordings are usually corrupted by spurious extra-cerebral artifacts, which should be rejected or cleaned up by the practitioner. Since manual screening of human EEGs is inherently error-prone and might induce experimental bias, automatic artifact detection is an issue of importance. Automatic artifact detection is the best guarantee for objective and clean results. We present a new approach, based on the time–frequency shape of muscular artifacts, to achieve reliable and automatic scoring. By placing emphasis on the analysis of EEG activity, the methodology evaluates the impact of muscular activity on the signal. The method is used to discriminate evoked potentials from several types of recorded muscular artifacts, with a sensitivity of 98.8% and a specificity of 92.2%. Automatic cleaning of EEG data is then successfully realized using this method combined with independent component analysis. The outcome of the automatic cleaning is then compared with the Slepian multitaper spectrum based technique introduced by Delorme et al (2007 Neuroimage 34 1443–9).
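
    The sensitivity and specificity figures quoted above reduce to simple ratios over confusion-matrix counts. A minimal sketch, with hypothetical counts chosen to reproduce rates close to those reported:

```python
# Sketch: sensitivity and specificity as reported for the artifact
# discrimination (98.8% / 92.2%). The counts below are hypothetical.
def sensitivity(tp, fn):
    return tp / (tp + fn)       # true-positive rate: artifacts caught

def specificity(tn, fp):
    return tn / (tn + fp)       # true-negative rate: clean epochs kept

print(round(sensitivity(247, 3), 3), round(specificity(83, 7), 3))  # 0.988 0.922
```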

  12. Evaluating clinical ethics support in mental healthcare: a systematic literature review.

    Science.gov (United States)

    Hem, Marit Helene; Pedersen, Reidar; Norvoll, Reidun; Molewijk, Bert

    2015-06-01

    A systematic literature review on evaluation of clinical ethics support services in mental healthcare is presented and discussed. The focus was on (a) forms of clinical ethics support services, (b) evaluation of clinical ethics support services, (c) contexts and participants and (d) results. Five studies were included. The ethics support activities described were moral case deliberations and ethics rounds. Different qualitative and quantitative research methods were utilized. The results show that (a) participants felt that they gained an increased insight into moral issues through systematic reflection; (b) there was improved cooperation among multidisciplinary team members; (c) it was uncertain whether clinical ethics support services led to better patient care; (d) the issue of patient and client participation is complex; and (e) the implementation process is challenging. Clinical ethics support services have mainly been studied through the experiences of the participating facilitators and healthcare professionals. Hence, there is limited knowledge of whether and how various types of clinical ethics support services influence the quality of care and how patients and relatives may evaluate clinical ethics support services. Based on the six excluded 'grey zone articles', in which there was an implicit focus on ethics reflection, other ways of working with ethical reflection in practice are discussed. The focus is on implementing and evaluating clinical ethics support services as approaches that are more integrated into the development of good practice. In order to meet some of the shortcomings of the field of clinical ethics support services, a research project that aims to strengthen ethics support in the mental health services, including patients' and caregivers' views on ethical challenges, is presented. © The Author(s) 2014.

  13. EVALUATING THE EFFECT OF AN EDUCATIONAL INTERVENTION ON PARENTS' NUTRITIONAL SOCIAL SUPPORT

    OpenAIRE

    Mokhtari, Fatemeh; Ehsanpour, Soheila; Kazemi, Ashraf

    2017-01-01

    Background: Social support is one of the important factors affecting health-related behaviors in different groups. The present study evaluated the effect of an educational intervention on parents’ nutritional social support for a healthy diet among teenagers. Methods: This field trial was conducted in two groups on the parents of 63 female early adolescents. The level of parents’ nutritional social support for a healthy diet was measured using a questionnaire. One month after...

  14. Improving Vision Awareness in Autism Services: Evaluation of a Dedicated Education Programme for Support Practitioners

    Science.gov (United States)

    Long, Joseph J.; Butchart, Maggie; Brown, Michael; Bain, Janice; McMillan, Anne; Karatzias, Thanos

    2018-01-01

    Background: The research reported here sought to evaluate whether a dedicated education programme in vision awareness improved the knowledge and skills of autism support practitioners in identifying visual impairment in autistic people with intellectual disabilities and providing better support to those individuals identified as visually impaired.…

  15. A BDI Dialogue Agent for Social Support : Specification and Evaluation Method

    NARCIS (Netherlands)

    Van der Zwaan, J.M.; Dignum, V.; Jonker, C.M.

    2012-01-01

    An important task for empathic agents is to provide social support, that is, to help people increase their well-being and decrease the perceived burden of their problems. The contributions of this paper are 1) the specification of speech acts for a social support dialogue agent, and 2) an evaluation

  16. Consequence evaluation of radiation embrittlement of Trojan reactor pressure vessel supports

    International Nuclear Information System (INIS)

    Lu, S.C.; Sommer, S.C.; Johnson, G.L.; Lambert, H.E.

    1990-10-01

    This report describes a consequence evaluation to address safety concerns raised by the radiation embrittlement of the reactor pressure vessel (RPV) supports for the Trojan nuclear power plant. The study comprises a structural evaluation and an effects evaluation and assumes that all four reactor vessel supports have completely lost their load-carrying capability. By demonstrating that the ASME code requirements governing Level D service limits are satisfied, the structural evaluation concludes that the Trojan reactor coolant loop (RCL) piping is capable of transferring loads to the steam generator (SG) supports and the reactor coolant pump (RCP) supports, which have sufficient design margins to accommodate the additional loads transferred to them through the RCL piping. The effects evaluation, employing a systems analysis approach, investigates initiating events and the reliability of the engineered safeguard systems as the RPV is subject to movements caused by the RPV support failure. The evaluation identifies a number of areas of additional safety concern; further investigation of these concerns, however, concludes that a hypothetical failure of the Trojan RPV supports due to radiation embrittlement will not result in consequences of significant safety concern

  17. Report of the summative evaluation by the advisory committee on research support and collaborative activities

    International Nuclear Information System (INIS)

    2005-03-01

    The Research Evaluation Committee of the Japan Atomic Energy Research Institute (JAERI) set up an Advisory Committee on Research Support and Collaborative Activities in accordance with the 'Fundamental Guideline for the Evaluation of Research and Development (R and D) at JAERI' and its subsidiary regulations. The Advisory Committee on Research Support and Collaborative Activities evaluated the adequacy of the plans for research support and collaborative activities to be succeeded from JAERI to a new research institute to be established by the integration of JAERI and the Japan Nuclear Cycle Development Institute (JNC). The Advisory Committee, consisting of nine specialists from outside JAERI, conducted its activities from June 2004 to August 2004. The evaluation was performed on the basis of materials submitted in advance and of oral presentations made at the Advisory Committee meeting held on July 21, 2004, in line with the items, viewpoints, and criteria for the evaluation specified by the Research Evaluation Committee. The result of the evaluation by the Advisory Committee was submitted to the Research Evaluation Committee, and was judged to be appropriate at its meeting held on December 1, 2004. This report describes the result of the evaluation by the Advisory Committee on Research Support and Collaborative Activities. (author)

  18. The evaluation of stress and piping support loads on RSG-GAS secondary cooling system

    International Nuclear Information System (INIS)

    Pustandyo, W.; Sitandung, Y. B.; Sujalmo, S.

    1998-01-01

    Stresses and piping support loads were evaluated for a segment of the secondary cooling water piping. In this paper, the analysis methods are presented using the computer code PS+CAEPIPE Version 3.4.05.W. For the selected pipe segment, data on pipe characteristics, material properties, operating and design conditions, equipment, and supports were used as inputs. The results of the analysis show that, with the location, type, and number of supports equal to those of the installed system, the stresses are 3638 psi for sustained loads (node 160), 13517 psi for thermal loads (node 90), and 16747 psi for the combination of sustained and thermal loads (node 90). With an optimized support arrangement, the stresses are 4238 psi for sustained loads (node 10), 13517 psi for thermal loads (node 90), and 17350 psi for the combination of sustained and thermal loads (node 90). The allowable values based on the PS+CAEPIPE code are 15000 psi for sustained loads, 22500 psi for thermal loads, and 37500 psi for the combination of sustained and thermal loads. The evaluation concludes that the stresses and support loads of the secondary cooling system piping are sufficiently low, and that the installed supports are excessive and uneconomical

  19. Methods for dependency estimation and system unavailability evaluation based on failure data statistics

    International Nuclear Information System (INIS)

    Azarm, M.A.; Hsu, F.; Martinez-Guridi, G.; Vesely, W.E.

    1993-07-01

    This report introduces a new perspective on the basic concept of dependent failures where the definition of dependency is based on clustering in failure times of similar components. This perspective has two significant implications: first, it relaxes the conventional assumption that dependent failures must be simultaneous and result from a severe shock; second, it allows the analyst to use all the failures in a time continuum to estimate the potential for multiple failures in a window of time (e.g., a test interval), therefore arriving at a more accurate value for system unavailability. In addition, the models developed here provide a method for plant-specific analysis of dependency, reflecting the plant-specific maintenance practices that reduce or increase the contribution of dependent failures to system unavailability. The proposed methodology can be used for screening analysis of failure data to estimate the fraction of dependent failures among the failures. In addition, the proposed method can evaluate the impact of the observed dependency on system unavailability and plant risk. The formulations derived in this report have undergone various levels of validation through computer simulation studies and pilot applications. The pilot applications of these methodologies showed that the contribution of dependent failures of diesel generators in one plant was negligible, while in another plant it was quite significant. They also showed that in the plant with a significant contribution of dependency to Emergency Power System (EPS) unavailability, the contribution changed with time. Similar findings were reported for the Containment Fan Cooler breakers. Drawing such conclusions about system performance would not have been possible with any other reported dependency methodologies
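
    The report's central idea, counting failures of similar components that cluster within a time window rather than only simultaneous failures, can be sketched roughly as follows. This is not the report's formulation; the screening rule, the timestamps, and the 7-day window are illustrative assumptions.

```python
# Hedged sketch: flag failures as "dependent" when failures of similar
# components cluster within a time window (e.g. a test interval), then
# report the flagged fraction as a screening statistic.
def dependent_fraction(times, window):
    times = sorted(times)
    flagged = set()
    for i, t in enumerate(times):
        for j in range(i + 1, len(times)):
            if times[j] - t <= window:
                flagged.update((i, j))   # both members of the cluster count
            else:
                break                    # sorted times: no later match either
    return len(flagged) / len(times)

failures = [10.0, 12.5, 95.0, 200.0, 203.0, 400.0]   # hypothetical days
print(round(dependent_fraction(failures, 7.0), 3))   # 4 of 6 cluster -> 0.667
```

    A plant where this fraction is near zero would, in the spirit of the report, show a negligible dependent-failure contribution to system unavailability.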

  20. WE-AB-202-04: Statistical Evaluation of Lung Function Using 4DCT Ventilation Imaging: Proton Therapy VS IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Q; Zhang, M; Chen, T; Yue, N; Zou, J [Rutgers University, New Brunswick, NJ (United States)

    2016-06-15

    Purpose: Variation in the function of different lung regions has so far been ignored in conventional lung cancer treatment planning, which may lead to a higher risk of radiation-induced lung disease. 4DCT-based lung ventilation imaging provides a novel yet convenient approach to lung functional imaging, as 4DCT is routine for lung cancer treatment. Our work aims to evaluate the impact of accounting for spatial heterogeneity in lung function, using 4DCT-based lung ventilation imaging, on proton and IMRT plans. Methods: Six patients with advanced-stage lung cancer of various tumor locations were retrospectively evaluated for the study. Proton and IMRT plans were designed following identical planning objectives and constraints for each patient. Ventilation images were calculated from patients' 4DCT using deformable image registration implemented by Velocity AI software based on Jacobian metrics. The lung was delineated into two function-level regions based on ventilation (low and high functional areas). The high functional region was defined as lung ventilation greater than 30%. Dose distributions and statistics in the different lung function areas were calculated for the patients. Results: Variation in the dosimetric statistics of the different lung function regions was observed between proton and IMRT plans. In all proton plans, high-function lung regions received a lower maximum dose (100.2%–108.9%) than in IMRT plans (106.4%–119.7%). Interestingly, three out of six proton plans gave a mean dose to the high-function lung region up to 2.2% higher than IMRT. A lower mean dose (by up to 14.1%) and maximum dose (by up to 9%) were observed in low-function lung for proton plans. Conclusion: A systematic approach was developed to generate functional lung ventilation imaging and use it to evaluate plans. This method holds great promise for functional analysis of the lung during planning. We are currently studying more subjects to evaluate this tool.
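
    The per-region dose bookkeeping described in the Methods can be sketched as a simple threshold-and-mask computation. The 30% ventilation threshold follows the abstract; the voxel ventilation and dose values are hypothetical.

```python
# Sketch: split lung voxels into high/low function by a ventilation
# threshold, then report (mean, max) dose per region. Values are made up.
def region_stats(ventilation, dose, threshold=0.30):
    high = [d for v, d in zip(ventilation, dose) if v > threshold]
    low = [d for v, d in zip(ventilation, dose) if v <= threshold]
    stat = lambda xs: (sum(xs) / len(xs), max(xs)) if xs else (0.0, 0.0)
    return {"high": stat(high), "low": stat(low)}

vent = [0.10, 0.45, 0.60, 0.25, 0.35]   # fractional ventilation per voxel
dose = [55.0, 20.0, 12.0, 40.0, 18.0]   # dose per voxel (arbitrary units)
print(region_stats(vent, dose))
```

    Comparing the "high" tuple between two plans is the kind of statistic the study tabulates for proton versus IMRT.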

  1. Evaluation of image quality and radiation dose by adaptive statistical iterative reconstruction technique level for chest CT examination.

    Science.gov (United States)

    Hong, Sun Suk; Lee, Jong-Woong; Seo, Jeong Beom; Jung, Jae-Eun; Choi, Jiwon; Kweon, Dae Cheol

    2013-12-01

    The purpose of this research is to determine the adaptive statistical iterative reconstruction (ASIR) level that enables optimal image quality and dose reduction in the chest computed tomography (CT) protocol with ASIR. A chest phantom was scanned at ASIR levels of 0-50 %, and the noise power spectrum (NPS), signal, noise, and degree of distortion (peak signal-to-noise ratio (PSNR) and root-mean-square error (RMSE)) were measured. In addition, the objectivity of the experiment was verified using the American College of Radiology (ACR) phantom. Moreover, on a qualitative basis, the resolution, latitude, and degree of distortion of five lesions in the chest phantom were evaluated and the results compiled statistically. The NPS value decreased as the frequency increased. The lowest noise and deviation occurred at the 20 % ASIR level (mean 126.15 ± 22.21). The signal-to-noise ratio and PSNR were highest at the 20 % ASIR level, at 31.0 and 41.52, while the maximum absolute error and RMSE showed the lowest values, 11.2 and 16. In the ACR phantom study, all ASIR levels were within the acceptable allowance of the guidelines. The 20 % ASIR level also performed best in the qualitative evaluation of the five chest phantom lesions, with a resolution score of 4.3, latitude of 3.47, and degree of distortion of 4.25. The 20 % ASIR level proved best in all experiments: noise, distortion evaluation using ImageJ, and qualitative evaluation of five lesions of a chest phantom. Therefore, optimal images and reduced radiation dose would be acquired when a 20 % ASIR level is applied in thoracic CT.
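
    The two distortion metrics used in the study, RMSE and PSNR, are standard and easy to state in code. A minimal sketch over flattened pixel arrays; the tiny four-pixel "images" and the 255 peak value are illustrative assumptions, not phantom data.

```python
# Sketch: RMSE and PSNR between a reference image and a reconstruction,
# the two distortion measures compared across ASIR levels above.
from math import log10, sqrt

def rmse(ref, img):
    diffs = [(a - b) ** 2 for a, b in zip(ref, img)]
    return sqrt(sum(diffs) / len(diffs))

def psnr(ref, img, max_val=255.0):
    e = rmse(ref, img)
    return float("inf") if e == 0 else 20 * log10(max_val / e)

reference = [100, 110, 120, 130]   # hypothetical flattened pixel values
recon     = [102, 108, 121, 129]
print(round(rmse(reference, recon), 2), round(psnr(reference, recon), 1))
```

    A higher PSNR with a lower RMSE at a given ASIR level, as reported for 20 %, indicates less distortion relative to the reference.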

  2. Strata control in tunnels and an evaluation of support units and systems currently used with a view to improving the effectiveness of support, stability and safety of tunnels.

    CSIR Research Space (South Africa)

    Haile, AT

    1998-12-01

    This report details a methodology for the rational design of tunnel support systems based on a mechanistic evaluation of the interaction between the components of a tunnel support system and a highly discontinuous rock mass structure. This analysis...

  3. Evaluation and Comparison Research on the Support of Websites to Enterprise's E-Commerce

    Institute of Scientific and Technical Information of China (English)

    SHAO Peiji; HUANG Yixiao; WAN Jie; YANG Jing

    2004-01-01

    This paper comparatively analyzes existing evaluation indices for websites, and puts forward an evaluation index and method for the support a website provides to an enterprise's e-commerce. Through a survey of 56 leading information-industry enterprises in Sichuan province, throughout China, and worldwide, analyzing and comparing the support ability of websites for enterprises' e-commerce, this paper proposes five levels for categorizing the support ability of a website for enterprise e-commerce. Finally, the flaws in enterprises' e-commerce practice in Sichuan province and the corresponding countermeasures are illustrated.

  4. The stress analysis evaluation and pipe support layout for pressurizer discharge system

    International Nuclear Information System (INIS)

    Mao Qing; Wang Wei; Zhang Yixiong

    2000-01-01

    The author presents the stress analysis and evaluation of the pipe layout and support adjustment process for the Qinshan Phase II pressurizer discharge system. Using the PDL-SYSPIPE interface software, the characteristic parameters of the system are obtained from the 3-D CAD engineering design software PDL and output in the input data file format of the dedicated pipe stress analysis program SYSPIPE. Based on that, the fast stress analysis function of the SYSPIPE program is applied in adjusting the pipe layout, support layout, and support types. According to the RCC-M standard, the pipe stress analysis and evaluation under deadweight, internal pressure, thermal expansion, seismic, pipe rupture, and discharge loads are fulfilled

  5. Evaluation of support loss in micro-beam resonators: A revisit

    Science.gov (United States)

    Chen, S. Y.; Liu, J. Z.; Guo, F. L.

    2017-12-01

    This paper presents an analytical study of the evaluation of support loss in micromechanical resonators undergoing in-plane flexural vibrations. Two-dimensional elastic wave theory is used to determine the energy transmission from the vibrating resonator to the support. Fourier transform and Green's function techniques are adopted to solve the problem of wave motions on the surface of the support excited by the forces transmitted by the resonator onto the support. Analytical expressions for support loss in terms of quality factor have been derived, taking into account the distributed normal and shear stresses in the attachment region, the coupling between the normal and shear stresses, and the material disparity between the support and the resonator. Effects of the geometry of micro-beam resonators and of material dissimilarity between support and resonator on support loss are examined. Numerical results show that a 'harder resonator' and 'softer support' combination leads to larger support loss. In addition, the Perfectly Matched Layer (PML) numerical simulation technique is employed for validation of the proposed analytical model. Compared with the quality factors obtained by the PML technique, the present model agrees well, while the pure-shear model overestimates support loss noticeably, especially for resonators with small aspect ratio and large material dissimilarity between the support and resonator.

  6. The evaluator as technical assistant: A model for systemic reform support

    Science.gov (United States)

    Century, Jeanne Rose

    This study explored evaluation of systemic reform. Specifically, it focused on the evaluation of a systemic effort to improve K-8 science, mathematics and technology education. The evaluation was of particular interest because it used both technical assistance and evaluation strategies. Through studying the combination of these roles, this investigation set out to increase understanding of potentially new evaluator roles, distinguish important characteristics of the evaluator/project participant relationship, and identify how these roles and characteristics contribute to effective evaluation of systemic science education reform. This qualitative study used interview, document analysis, and participant observation as methods of data collection. Interviews were conducted with project leaders, project participants, and evaluators and focused on the evaluation strategies and process, the use of the evaluation, and technical assistance. Documents analyzed included transcripts of evaluation team meetings and reports, memoranda and other print materials generated by the project leaders and the evaluators. Data analysis consisted of analytic and interpretive procedures consistent with the qualitative data collected and entailed a combined process of coding transcripts of interviews and meetings, field notes, and other documents; analyzing and organizing findings; writing of reflective and analytic memos; and designing and diagramming conceptual relationships. The data analysis resulted in the development of the Multi-Function Model for Systemic Reform Support. This model organizes systemic reform support into three functions: evaluation, technical assistance, and a third, named here as "systemic perspective." These functions work together to support the project's educational goals as well as a larger goal--building capacity in project participants. This model can now serve as an informed starting point or "blueprint" for strategically supporting systemic reform.

  7. Evaluating electronic performance support systems: A methodology focused on future use-in-practice

    NARCIS (Netherlands)

    Collis, Betty; Verwijs, C.A.

    1995-01-01

    Electronic performance support systems, as an emerging type of software environment, present many new challenges in relation to effective evaluation. In this paper, a global approach to a 'usage-orientated' evaluation methodology for software product is presented, followed by a specific example of

  8. Smartphone Apps for Cardiopulmonary Resuscitation Training and Real Incident Support: A Mixed-Methods Evaluation Study

    NARCIS (Netherlands)

    Kalz, Marco; Lenssen, Niklas; Felzen, Marco; Rossaint, Rolf; Tabuenca, Bernardo; Specht, Marcus; Skorning, Max

    2014-01-01

    Background: No systematic evaluation of smartphone/mobile apps for resuscitation training and real incident support is available to date. To provide medical, usability, and additional quality criteria for the development of apps, we conducted a mixed-methods sequential evaluation combining the

  9. Helping families improve: an evaluation of two primary care approaches to parenting support in the Netherlands

    NARCIS (Netherlands)

    Graaf, I.M. de; Onrust, S.A.; Haverman, M.C.C.; Janssens, J.M.A.M.

    2009-01-01

    The present study evaluated two primary care parenting interventions. First, we evaluated the most widely used Dutch practices for primary care parenting support. Second, we assessed the applicability of the Primary Care Triple P approach, which is now being utilized in a wide variety of primary

  10. Usability evaluation of mobile ICT support used at the building construction site

    DEFF Research Database (Denmark)

    Christiansson, Per; Svidt, Kjeld

    2006-01-01

    The paper summarizes findings from field evaluations and controlled laboratory usability evaluations of new mobile Information and Communication Technology (ICT) support used by craftsmen at construction sites, as well as a discussion of methodologies for user-centred ICT tool design. The finding...

  11. A Preliminary Evaluation of Reach: Training Early Childhood Teachers to Support Children's Social and Emotional Development

    Science.gov (United States)

    Conners-Burrow, Nicola A.; Patrick, Terese; Kyzer, Angela; McKelvey, Lorraine

    2017-01-01

    This paper describes the development, implementation and preliminary evaluation of the Reaching Educators and Children (REACH) program, a training and coaching intervention designed to increase the capacity of early childhood teachers to support children's social and emotional development. We evaluated REACH with 139 teachers of toddler and…

  12. How Feedback Can Improve Managerial Evaluations of Model-based Marketing Decision Support Systems

    NARCIS (Netherlands)

    U. Kayande (Ujwal); A. de Bruyn (Arnoud); G.L. Lilien (Gary); A. Rangaswamy (Arvind); G.H. van Bruggen (Gerrit)

    2006-01-01

    textabstractMarketing managers often provide much poorer evaluations of model-based marketing decision support systems (MDSSs) than are warranted by the objective performance of those systems. We show that a reason for this discrepant evaluation may be that MDSSs are often not designed to help users

  13. Evaluation of a clinical decision support algorithm for patient-specific childhood immunization.

    Science.gov (United States)

    Zhu, Vivienne J; Grannis, Shaun J; Tu, Wanzhu; Rosenman, Marc B; Downs, Stephen M

    2012-09-01

    To evaluate the effectiveness of a clinical decision support system (CDSS) implementing standard childhood immunization guidelines, using real-world patient data from the Regenstrief Medical Record System (RMRS). Study subjects were aged 6 years or younger in 2008 and had visited the pediatric clinic on the campus of Wishard Memorial Hospital. Immunization records were retrieved from the RMRS for 135 randomly selected pediatric patients. We compared vaccine recommendations from the CDSS for both eligible and recommended timelines, based on the child's date of birth and vaccine history, to recommendations from registered nurses who routinely selected vaccines for administration in a busy inner-city hospital, using the same date of birth and vaccine history. Aggregated and stratified agreement and Kappa statistics were reported. The reasons for disagreement between suggestions from the CDSS and nurses were also identified. For the 135 children, a total of 1215 vaccination suggestions were generated by nurses and compared to the recommendations of the CDSS. The overall agreement rates were 81.3% and 90.6% for the eligible and recommended timelines, respectively. The overall Kappa values were 0.63 for the eligible timeline and 0.80 for the recommended timeline. Common reasons for disagreement between the CDSS and nurses were: (1) missed vaccination opportunities by nurses, (2) nurses sometimes suggested a vaccination before the minimal age or minimal waiting interval, (3) nurses usually did not validate patient immunization history, and (4) nurses sometimes gave an extra vaccine dose. Our childhood immunization CDSS can assist providers in delivering accurate childhood vaccinations. Copyright © 2012 Elsevier B.V. All rights reserved.
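
    The agreement and Kappa statistics reported above can be reproduced with a few lines. A minimal sketch of Cohen's kappa; the label sequences below are hypothetical, not the study's data.

```python
# Sketch: raw agreement and Cohen's kappa between two raters, here a CDSS
# and nurses. Kappa corrects the observed agreement for chance agreement.
def cohens_kappa(a, b):
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed
    cats = set(a) | set(b)
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)   # by chance
    return (po - pe) / (1 - pe)

cdss  = ["give", "give", "hold", "give", "hold", "give"]   # hypothetical
nurse = ["give", "give", "hold", "hold", "hold", "give"]
print(round(cohens_kappa(cdss, nurse), 2))   # -> 0.67
```

    Kappa values of 0.63 and 0.80, as in the study, are conventionally read as substantial agreement.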

  14. Development and evaluation of measurement devices used to support testing of radioactive material transportation packages

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Stenberg, D.R.; Bronowski, D.R.; Arviso, M.

    1992-01-01

    Radioactive material package designers use structural testing to verify and demonstrate package performance. A major part of evaluating structural response is the collection of instrumentation measurement data. Sandia National Laboratories (SNL) has an ongoing program to develop and evaluate measurement devices to support testing of radioactive material packages. Measurement devices developed in support of this activity include evaluation channels, ruggedly constructed linear variable differential transformers, and piezoresistive accelerometers with enhanced measurement capabilities. In addition to developing measurement devices, a method has been derived to evaluate accelerometers and strain gages for measurement repeatability, ruggedness, and manufacturers' calibration data under both laboratory and field conditions. The developed measurement devices and evaluation technique will be discussed and the results of the evaluation will be presented

  15. Nuclear and radiation emergency evaluation and decision-making support system for ministry of environmental protection

    International Nuclear Information System (INIS)

    Yue Huiguo; Lin Quanyi; Zhang Jiangang

    2010-01-01

    This article introduces the design features and main functions of the Nuclear and Radiation Emergency Evaluation and Decision Support System. The Ministry of Environmental Protection (MEP) will construct a complete evaluation and decision-making system at the Nuclear Safety Center of the Ministry of Environmental Protection to cope with sudden events. The system will provide comprehensive technical support for consequence evaluation and decision-making in anti-terrorism events according to the responsibilities of the MEP in such events, using the data provided by the MEP's anti-terrorism information platform. (authors)

  16. Evaluation of 241-AZ tank farm supporting phase 1 privatization waste feed delivery

    Energy Technology Data Exchange (ETDEWEB)

    CARLSON, A.B.

    1998-11-19

    This evaluation is one in a series of evaluations determining the process needs and assessing the adequacy of existing and planned equipment in meeting those needs at various double-shell tank farms in support of Phase 1 privatization. A number of tank-to-tank transfers and waste preparation activities are needed to process and feed waste to the private contractor in support of Phase 1 privatization. The scope of this evaluation is limited to process needs associated with 241-AZ tank farm during the Phase 1 privatization.

  17. Evaluation of 241-AZ tank farm supporting phase 1 privatization waste feed delivery

    International Nuclear Information System (INIS)

    CARLSON, A.B.

    1998-01-01

    This evaluation is one in a series of evaluations determining the process needs and assessing the adequacy of existing and planned equipment to meet those needs at various double-shell tank farms in support of Phase 1 privatization. A number of tank-to-tank transfers and waste preparation activities are needed to process and feed waste to the private contractor in support of Phase 1 privatization. The scope of this evaluation is limited to process needs associated with the 241-AZ tank farm during Phase 1 privatization.

  18. Evaluation of support groups for women with breast cancer: importance of the navigator role

    Directory of Open Access Journals (Sweden)

    Till James E

    2003-05-01

    Background: At least some forms of breast cancer are increasingly being viewed as a chronic illness, where an emphasis is placed on meeting the various ongoing needs of people living with cancer, their families, and other members of their social support networks. This commentary outlines some approaches to the evaluation of cancer-related support groups, with a particular emphasis on those designed to provide long-distance support, via the internet, for women with breast cancer.

    Discussion: The literature on evaluations of community-based cancer support groups indicates that they offer a number of benefits, and that it is more reasonable to expect such interventions to affect psychosocial functioning and/or health-related quality of life than survival. The literature on both face-to-face and online social support groups suggests that they offer many advantages, although evaluation of the latter delivery mechanism presents some ethical issues that need to be addressed. Many popular online support groups are peer-moderated rather than professionally moderated, so an evaluation of online support groups needs to take different models of the "navigator" role into account. Some conceptual models are outlined for evaluating the navigator role in meeting the informational, decisional, and educational needs of women with breast cancer. The Breast-Cancer Mailing List, an example of an unmoderated internet-based peer-support group, is considered within the context of a Shared or Tacit Model of the navigator role.

    Conclusion: Application of the concept of a navigator role to support groups in general, and to unmoderated online ones in particular, has received little or no attention in the research literature. The navigator role should be taken into account in research on this increasingly important aspect of cancer communication.

  19. Chemical characteristics of surface systems in the Forsmark area. Visualisation and statistical evaluation of data from surface water, precipitation, shallow groundwater, and regolith

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-02-15

    The Swedish Nuclear Fuel and Waste Management Co (SKB) initiated site investigations for a deep repository for spent nuclear fuel at two different sites in Sweden, Forsmark and Oskarshamn, in 2002. This report evaluates the results from chemical investigations of the surface system in the Forsmark area during the period November 2002 - March 2005. The evaluation includes data from surface waters (lakes, streams and the sea), precipitation, shallow groundwater and regolith (till, soil, peat, sediments and biota) in the area. Results from surface waters are not presented in this report, since these were treated in a recently published report. The main focus of the study is to visualize the vast amount of data collected hitherto in the site investigations, and to give a chemical characterisation of the investigated media at the site. The results will be used to support the site descriptive models, which in turn are used for safety assessment studies and for the environmental impact assessment. The data consist of the water chemical composition of lakes, streams, coastal sites and precipitation, predominantly sampled on a monthly basis, and of groundwater from soil tubes and wells, sampled up to four times per year. Moreover, the regolith data include information on the chemical composition of till, soil, sediment and vegetation samples from the area. The characterisations include all measured chemical parameters, i.e. major and minor constituents, trace elements, nutrients, isotopes and radionuclides, as well as field-measured parameters. The evaluation of data from each medium has been divided into the following parts: characterisation of individual sampling sites, and comparisons within and among sampling sites as well as with local, regional and national reference data; analysis of time trends and seasonal variation (for shallow groundwater); and exploration of relationships among the various chemical parameters. For all investigated parameters, the
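
    The three analysis parts listed in the abstract (time trends, seasonal variation, relationships among parameters) can be illustrated with a minimal sketch. The function names and the sample series below are hypothetical assumptions for illustration only; they are not SKB's actual methodology or data.

```python
import math
import statistics

def linear_trend(values):
    """Ordinary least-squares slope of values against their index (per step)."""
    n = len(values)
    x_mean = (n - 1) / 2
    y_mean = statistics.mean(values)
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(values))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

def seasonal_means(values, period=12):
    """Average each position in the cycle (month-of-year for monthly data)."""
    buckets = [[] for _ in range(period)]
    for i, v in enumerate(values):
        buckets[i % period].append(v)
    return [statistics.mean(b) for b in buckets]

def pearson_r(x, y):
    """Pearson correlation between two equal-length parameter series."""
    xm, ym = statistics.mean(x), statistics.mean(y)
    num = sum((a - xm) * (b - ym) for a, b in zip(x, y))
    den = math.sqrt(sum((a - xm) ** 2 for a in x) *
                    sum((b - ym) ** 2 for b in y))
    return num / den

# Two years of hypothetical monthly concentrations with an upward trend:
series = [10 + 0.4 * i for i in range(24)]
slope_per_month = linear_trend(series)
```

For real site data one would typically use robust or nonparametric alternatives (e.g. the Mann-Kendall test for trends), but the sketch shows the shape of the three analysis steps on monthly sampled series.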

  20. Chemical characteristics of surface systems in the Forsmark area. Visualisation and statistical evaluation of data from shallow groundwater, precipitation, and regolith

    International Nuclear Information System (INIS)

    Troejbom, Mats; Soederbaeck, Bjoern

    2006-02-01

    The Swedish Nuclear Fuel and Waste Management Co (SKB) initiated site investigations for a deep repository for spent nuclear fuel at two different sites in Sweden, Forsmark and Oskarshamn, in 2002. This report evaluates the results from chemical investigations of the surface system in the Forsmark area during the period November 2002 - March 2005. The evaluation includes data from surface waters (lakes, streams and the sea), precipitation, shallow groundwater and regolith (till, soil, peat, sediments and biota) in the area. Results from surface waters are not presented in this report, since these were treated in a recently published report. The main focus of the study is to visualize the vast amount of data collected hitherto in the site investigations, and to give a chemical characterisation of the investigated media at the site. The results will be used to support the site descriptive models, which in turn are used for safety assessment studies and for the environmental impact assessment. The data consist of the water chemical composition of lakes, streams, coastal sites and precipitation, predominantly sampled on a monthly basis, and of groundwater from soil tubes and wells, sampled up to four times per year. Moreover, the regolith data include information on the chemical composition of till, soil, sediment and vegetation samples from the area. The characterisations include all measured chemical parameters, i.e. major and minor constituents, trace elements, nutrients, isotopes and radionuclides, as well as field-measured parameters. The evaluation of data from each medium has been divided into the following parts: characterisation of individual sampling sites, and comparisons within and among sampling sites as well as with local, regional and national reference data; analysis of time trends and seasonal variation (for shallow groundwater); and exploration of relationships among the various chemical parameters. For all investigated parameters, the