WorldWideScience

Sample records for project prerequisites statistics

  1. Changes in Math Prerequisites and Student Performance in Business Statistics: Do Math Prerequisites Really Matter?

    OpenAIRE

    Jeffrey J. Green; Courtenay C. Stone; Abera Zegeye; Thomas A. Charles

    2007-01-01

    We use a binary probit model to assess the impact of several changes in math prerequisites on student performance in an undergraduate business statistics course. While the initial prerequisites did not necessarily provide students with the necessary math skills, our study, the first to examine the effect of math prerequisite changes, shows that these changes were deleterious to student performance. Our results helped convince the College of Business to change the math prerequisite again begin...
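
    The abstract above names a binary probit specification. As a purely illustrative sketch, such a model can be fit with statsmodels; the variable names (passed, new_prereq, gpa, math_grade) and the data file are hypothetical placeholders, not the authors' actual model:

```python
# Minimal binary probit sketch in the spirit of the study above.
# Column names and the CSV file are illustrative placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("students.csv")                      # hypothetical data set
y = df["passed"]                                      # 1 = passed business statistics
X = sm.add_constant(df[["new_prereq", "gpa", "math_grade"]])

probit = sm.Probit(y, X).fit()
print(probit.summary())                               # sign/size of new_prereq estimates
                                                      # the effect of the prerequisite change
print(probit.get_margeff().summary())                 # marginal effects are easier to interpret
```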

  2. [''R"--project for statistical computing

    DEFF Research Database (Denmark)

    Dessau, R.B.; Pipper, Christian Bressen

    2008-01-01

    An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as potent and free software for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 28 January 2008.

  3. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management, then explores how to temper quantitative analysis with qualitative…

  4. Commentary: Prerequisite Knowledge

    Science.gov (United States)

    Taylor, Ann T. S.

    2013-01-01

    Most biochemistry, genetics, cell biology, and molecular biology classes have extensive prerequisite or co-requisite requirements, often including introductory chemistry, introductory biology, and organic chemistry coursework. But what is the function of these prerequisites? While it seems logical that a basic understanding of biological and…

  5. The Prerequisites for a Degrowth Paradigm Shift

    DEFF Research Database (Denmark)

    Buch-Hansen, Hubert

    2018-01-01

    What would it take for a degrowth paradigm shift to take place? Drawing on contemporary critical political economy scholarship, this article identifies four prerequisites for socio-economic paradigm shifts: deep crisis, an alternative political project, a comprehensive coalition of social forces … currently facing humanity. On the other hand, the prospects for a degrowth paradigm shift remain bleak: unlike political projects that became hegemonic in the past, degrowth has neither support from a comprehensive coalition of social forces nor any consent to its agenda among the broader population…

  6. Accurate KAP meter calibration as a prerequisite for optimisation in projection radiography

    International Nuclear Information System (INIS)

    Malusek, A.; Sandborg, M.; Alm Carlsson, G.

    2016-01-01

    Modern X-ray units register the air kerma-area product, PKA, with a built-in KAP meter. Some KAP meters show an energy-dependent bias comparable with the maximum uncertainty articulated by the IEC (25%), adversely affecting dose-optimisation processes. To correct for the bias, a reference KAP meter calibrated at a standards laboratory and the two calibration methods described here can be used to achieve an uncertainty of <7%, as recommended by the IAEA. A computational model of the reference KAP meter is used to calculate beam quality correction factors for transfer of the calibration coefficient from the standards laboratory beam quality, Q0, to any beam quality, Q, in the clinic. Alternatively, beam quality corrections are measured with an energy-independent dosemeter via a reference beam quality in the clinic, Q1, to the beam quality, Q. Biases of up to 35% in built-in KAP meter readings were noted. Energy-dependent calibration factors are needed for unbiased PKA. (authors)
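
    The correction described above amounts to multiplying the built-in meter reading by a calibration coefficient obtained at the reference quality Q0 and a beam quality correction factor for the clinical quality Q. A schematic sketch of that arithmetic (the numeric values are placeholders, not figures from the study):

```python
# Schematic KAP correction, assuming the usual formalism
# P_KA = M * N_Q0 * k_Q. Values are placeholders, not data from the paper.
def corrected_pka(reading: float, n_q0: float, k_q: float) -> float:
    """reading: built-in KAP meter value (Gy*cm^2);
    n_q0: calibration coefficient at reference beam quality Q0;
    k_q: beam quality correction factor for clinical beam quality Q."""
    return reading * n_q0 * k_q

print(f"corrected P_KA = {corrected_pka(1.20, 0.97, 1.08):.3f} Gy*cm^2")
```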

  7. A Multi-Class, Interdisciplinary Project Using Elementary Statistics

    Science.gov (United States)

    Reese, Margaret

    2012-01-01

    This article describes a multi-class project that employs statistical computing and writing in a statistics class. Three courses, General Ecology, Meteorology, and Introductory Statistics, cooperated on a project for the EPA's Student Design Competition. The continuing investigation has also spawned several undergraduate research projects in…

  8. Projection operator techniques in nonequilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Grabert, H.

    1982-01-01

    This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics the Fokker-Planck and the master equation approach are described together with the response theory. Then, as applications the damped harmonic oscillator, simple fluids, and the spin relaxation are considered. (HSI)

  9. A Statistical Project Control Tool for Engineering Managers

    Science.gov (United States)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs using the SPCT method plotting the results of 3 successful projects and 3 failed projects are reviewed, with success and failure being defined by the owner.
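
    The SPCT itself is not reproduced in the abstract. As a generic illustration of statistical project control (not the SPCT algorithm from the presentation), the sketch below flags reporting periods whose cost variance falls outside 3-sigma control limits; the data are invented:

```python
# Generic control-chart check on a project performance metric (e.g. monthly
# cost variance, in $k). Illustrates statistical project control in general,
# not the specific SPCT method reviewed in the slides above.
import numpy as np

baseline = np.array([1.2, -0.8, 0.5, -0.9, 0.3, 0.6, -0.4, 0.9])  # hypothetical history
mean, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # 3-sigma control limits from baseline

new_periods = [0.7, -0.2, 4.8]                  # hypothetical new cost-variance readings
for month, cv in enumerate(new_periods, start=1):
    status = "ok" if lcl <= cv <= ucl else "OUT OF CONTROL"
    print(f"period {month}: cost variance {cv:+.1f} -> {status}")
```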

  10. Prerequisites for Computer-Aided Cognitive Rehabilitation.

    Science.gov (United States)

    Legrand, Colette

    1989-01-01

    This paper describes computer-aided cognitive rehabilitation for mentally deficient persons. It lists motor, cognitive, emotional, and educational prerequisites to such rehabilitation and states advantages and disadvantages in using the prerequisites. (JDD)

  11. The Impact of Student-Directed Projects in Introductory Statistics

    Science.gov (United States)

    Spence, Dianna J.; Bailey, Brad; Sharp, Julia L.

    2017-01-01

    A multi-year study investigated the impact of incorporating student-directed discovery projects into introductory statistics courses. Pilot instructors at institutions across the United States taught statistics implementing student-directed projects with the help of a common set of instructional materials designed to facilitate such projects.…

  12. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  13. Goals, requirements and prerequisites for teleradiology

    International Nuclear Information System (INIS)

    Walz, M.; Wein, B.; Lehmann, K.J.; Bolte, R.; Kilbinger, M.; Loose, R.; Guenther, R.W.; Georgi, M.

    1997-01-01

    Specific radiological requirements have to be considered for the realization of telemedicine. In this article the goals and requirements for an extensive introduction of teleradiology are defined from the radiological user's point of view. Necessary medical, legal and professional prerequisites for teleradiology are presented. Essential requirements, such as data security, maintenance of personal rights and standardization, must be realized. Application-specific requirements, e.g. the quality and extent of teleradiological functions, as well as technological alternatives, are discussed. Each project must be carefully planned in relation to one's own needs, extent of functions and system selection. Topics such as acknowledgement of electronic documentation, reimbursement of teleradiology and liability must be clarified. Legal advice and the observance of quality guidelines are recommended. (orig.)

  14. The GenABEL Project for statistical genomics.

    Science.gov (United States)

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  15. Investigating Validity of Math 105 as Prerequisite to Math 201 among Undergraduate Students, Nigeria

    Science.gov (United States)

    Zakariya, Yusuf F.

    2016-01-01

    In this study, the author examined the validity of MATH 105 as a prerequisite to MATH 201. The data for this study were extracted directly from the examination results log of the university. Descriptive statistics in the form of correlations and linear regressions were used to analyze the obtained data. Three research questions were formulated and…

  16. Prerequisites for Correctness in Legal Argumentation

    OpenAIRE

    Mackuvienė, Eglė

    2011-01-01

    A phenomenon called legal argumentation is analyzed in the dissertation. The aim of the thesis is to identify the prerequisites that allow legal argumentation to be considered correct, and to evaluate those prerequisites logically. Legal argumentation is analyzed as a phenomenon per se, without relating it to any particular arguing subject. Other dimensions of the process of making a legal decision, such as legal reasoning, legal discourse, interpretation of law and others, are discussed…

  17. Using R-Project for Free Statistical Analysis in Extension Research

    Science.gov (United States)

    Mangiafico, Salvatore S.

    2013-01-01

    One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…

  18. How Much Math Do Students Need to Succeed in Business and Economics Statistics? An Ordered Probit Analysis

    OpenAIRE

    Jeffrey J. Green; Courtenay C. Stone; Abera Zegeye; Thomas A. Charles

    2008-01-01

    Because statistical analysis requires both familiarity with and the ability to use mathematics, students typically are required to take one or more prerequisite math courses prior to enrolling in the business statistics course. Despite these math prerequisites, however, students find it extremely difficult to learn business statistics. In this study, we use an ordered probit model to examine the effect of alternative prerequisite math course sequences on the grade performance of 1,684 business…
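
    The ordered probit named above can be sketched as follows; the grade categories, column names and data file are hypothetical stand-ins for illustration, not the authors' actual specification:

```python
# Minimal ordered probit sketch in the spirit of the study above.
# Column names (grade, math_sequence, gpa) and the CSV file are illustrative.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("business_stats_grades.csv")   # hypothetical data set
df["grade"] = pd.Categorical(df["grade"],
                             categories=["F", "D", "C", "B", "A"], ordered=True)

model = OrderedModel(df["grade"], df[["math_sequence", "gpa"]], distr="probit")
result = model.fit(method="bfgs")
print(result.summary())   # coefficients indicate how the math sequence shifts the
                          # latent propensity toward higher letter grades
```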

  19. A method of 2D/3D registration of a statistical mouse atlas with a planar X-ray projection and an optical photo.

    Science.gov (United States)

    Wang, Hongkai; Stout, David B; Chatziioannou, Arion F

    2013-05-01

    The development of sophisticated and high throughput whole body small animal imaging technologies has created a need for improved image analysis and increased automation. The registration of a digital mouse atlas to individual images is a prerequisite for automated organ segmentation and uptake quantification. This paper presents a fully automatic method for registering a statistical mouse atlas with individual subjects based on an anterior-posterior X-ray projection and a lateral optical photo of the mouse silhouette. The mouse atlas was trained as a statistical shape model based on 83 organ-segmented micro-CT images. For registration, a hierarchical approach is applied which first registers high contrast organs, and then estimates low contrast organs based on the registered high contrast organs. To register the high contrast organs, a 2D-registration-back-projection strategy is used that deforms the 3D atlas based on the 2D registrations of the atlas projections. For validation, this method was evaluated using 55 subjects of preclinical mouse studies. The results showed that this method can compensate for moderate variations of animal postures and organ anatomy. Two different metrics, the Dice coefficient and the average surface distance, were used to assess the registration accuracy of major organs. The Dice coefficients vary from 0.31 ± 0.16 for the spleen to 0.88 ± 0.03 for the whole body, and the average surface distance varies from 0.54 ± 0.06 mm for the lungs to 0.85 ± 0.10 mm for the skin. The method was compared with a direct 3D deformation optimization (without 2D-registration-back-projection) and a single-subject atlas registration (instead of using the statistical atlas). The comparison revealed that the 2D-registration-back-projection strategy significantly improved the registration accuracy, and the use of the statistical mouse atlas led to more plausible organ shapes than the single-subject atlas. This method was also tested with shoulder
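
    The Dice coefficient quoted above is a standard overlap measure between binary segmentation masks; a minimal sketch (not the authors' implementation):

```python
# Dice coefficient between two binary organ masks, the overlap metric reported
# in the abstract above (illustrative sketch only).
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    overlap = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * overlap / total if total else 1.0

atlas = np.zeros((64, 64, 64), dtype=bool)
subject = np.zeros_like(atlas)
atlas[20:40, 20:40, 20:40] = True      # toy "organ" in the atlas
subject[22:42, 20:40, 20:40] = True    # slightly shifted "organ" in the subject
print(f"Dice = {dice(atlas, subject):.2f}")   # 1.0 would mean perfect overlap
```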

  20. Quality dementia care: Prerequisites and relational ethics among multicultural healthcare providers.

    Science.gov (United States)

    Sellevold, Gerd Sylvi; Egede-Nissen, Veslemøy; Jakobsen, Rita; Sørlie, Venke

    2017-01-01

    Many nursing homes appear as multicultural workplaces where the majority of healthcare providers have an ethnic minority background. This environment creates challenges linked to communication, interaction and cultural differences. Furthermore, the healthcare providers have varied experiences and understandings of what quality care of patients with dementia involves. The aim of this study is to illuminate multi-ethnic healthcare providers' lived experiences of their own working relationships, and their importance to quality care for people with dementia. The study is part of a larger participatory action research project: 'Hospice values in the care for persons with dementia'. The data material consists of extensive notes from seminars, project meetings and dialogue-based teaching. The text material was subjected to phenomenological-hermeneutical interpretation. Participants and research context: participants in the project were healthcare providers working in a nursing home unit. The participants came from 15 different countries and had different formal qualifications, varied backgrounds and ethnic origins. Ethical considerations: the study is approved by the Norwegian Regional Ethics Committee and the Norwegian Social Science Data Services. The results show that good working relationships, characterized by understanding each other's vulnerability and willingness to learn from each other through shared experiences, are prerequisites for quality care. The healthcare providers further described ethical challenges such as uncertainty and differing understandings. The results are discussed in the light of Løgstrup's relational philosophy of ethics and the concepts of vulnerability, ethical responsibility, trust and openness of speech. The prerequisite for quality care for persons with dementia in a multicultural working environment is to create arenas for open discussions between the healthcare providers. Leadership is of great importance.

  1. Academic Performance in MBA Programs: Do Prerequisites Really Matter?

    Science.gov (United States)

    Christensen, Donald Gene; Nance, William R.; White, Darin W.

    2012-01-01

    Many researchers have examined criteria used in Master of Business Administration (MBA) admissions decisions. However, prior research has not examined predictive ability of undergraduate prerequisite courses in core business disciplines. The authors investigated whether undergraduate prerequisite courses predicted MBA success by analyzing the…

  2. How Much Math Do Students Need to Succeed in Business and Economics Statistics? An Ordered Probit Analysis

    Science.gov (United States)

    Green, Jeffrey J.; Stone, Courtenay C.; Zegeye, Abera; Charles, Thomas A.

    2009-01-01

    Because statistical analysis requires the ability to use mathematics, students typically are required to take one or more prerequisite math courses prior to enrolling in the business statistics course. Despite these math prerequisites, however, many students find it difficult to learn business statistics. In this study, we use an ordered probit…

  3. 76 FR 14678 - Communications Unit Leader Prerequisite and Evaluation

    Science.gov (United States)

    2011-03-17

    ... evaluation form. OEC will use the evaluation form to identify course attendees, verify satisfaction of course... and evaluation of OEC events. Evaluation forms will be available in hard copy at each training session... Prerequisite and Evaluation. OMB Number: 1670--NEW. COML Prerequisites Verification Frequency: On occasion...

  4. Lectures on statistical mechanics

    CERN Document Server

    Bowler, M G

    1982-01-01

    Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus, and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more conventional…

  5. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    Science.gov (United States)

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  6. 6 CFR 13.6 - Prerequisites for issuing a Complaint.

    Science.gov (United States)

    2010-01-01

    6 CFR 13.6 (Domestic Security; DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY; PROGRAM FRAUD CIVIL REMEDIES), § 13.6 Prerequisites for issuing a Complaint: (a) The Reviewing Official may issue a...

  7. Minimal Impact of Organic Chemistry Prerequisite on Student Performance in Introductory Biochemistry

    Science.gov (United States)

    Wright, Robin; Cotner, Sehoya; Winkel, Amy

    2009-01-01

    Curriculum design assumes that successful completion of prerequisite courses will have a positive impact on student performance in courses that require the prerequisite. We recently had the opportunity to test this assumption concerning the relationship between completion of the organic chemistry prerequisite and performance in introductory…

  8. Statistical projection effects in a hydrodynamic pilot-wave system

    Science.gov (United States)

    Sáenz, Pedro J.; Cristea-Platon, Tudor; Bush, John W. M.

    2018-03-01

    Millimetric liquid droplets can walk across the surface of a vibrating fluid bath, self-propelled through a resonant interaction with their own guiding or `pilot' wave fields. These walking droplets, or `walkers', exhibit several features previously thought to be peculiar to the microscopic, quantum realm. In particular, walkers confined to circular corrals manifest a wave-like statistical behaviour reminiscent of that of electrons in quantum corrals. Here we demonstrate that localized topological inhomogeneities in an elliptical corral may lead to resonant projection effects in the walker's statistics similar to those reported in quantum corrals. Specifically, we show that a submerged circular well may drive the walker to excite specific eigenmodes in the bath that result in drastic changes in the particle's statistical behaviour. The well tends to attract the walker, leading to a local peak in the walker's position histogram. By placing the well at one of the foci, a mode with maxima near the foci is preferentially excited, leading to a projection effect in the walker's position histogram towards the empty focus, an effect strongly reminiscent of the quantum mirage. Finally, we demonstrate that the mean pilot-wave field has the same form as the histogram describing the walker's statistics.

  9. Do screencasts help to revise prerequisite mathematics? An investigation of student performance and perception

    Science.gov (United States)

    Loch, Birgit; Jordan, Camilla R.; Lowe, Tim W.; Mestel, Ben D.

    2014-02-01

    Basic calculus skills that are prerequisites for advanced mathematical studies continue to be a problem for a significant proportion of higher education students. While there are many types of revision material that could be offered to students, in this paper we investigate whether short, narrated video recordings of mathematical explanations (screencasts) are a useful tool to enhance student learning when revisiting prerequisite topics. We report on the outcomes of a study that was designed to both measure change in student performance before and after watching screencasts, and to capture students' perception of the usefulness of screencasts in their learning. Volunteers were recruited from students enrolled on an entry module for the Mathematics Master of Science programme at the Open University to watch two screencasts sandwiched between two online calculus quizzes. A statistical analysis of student responses to the quizzes shows that screencasts can have a positive effect on student performance. Further analysis of student feedback shows that student confidence was increased by watching the screencasts. Student views on the value of screencasts for their learning indicated that they appreciated being able to watch a problem being solved and explained by an experienced mathematician; hear the motivation for a particular problem-solving approach; engage more readily with the material being presented, thereby retaining it more easily. The positive student views and impact on student scores indicate that short screencasts could play a useful role in revising prerequisite mathematics.

  10. Relationship between Students' Scores on Research Methods and Statistics, and Undergraduate Project Scores

    Science.gov (United States)

    Ossai, Peter Agbadobi Uloku

    2016-01-01

    This study examined the relationship between students' scores on Research Methods and Statistics and undergraduate project scores in the final year. The purpose was to find out whether students matched knowledge of research with project-writing skill. The study adopted an ex post facto correlational design. Scores on Research Methods and Statistics for…

  11. PREREQUISITES OF THE RESOLUTION OF A CONTRACT

    Directory of Open Access Journals (Sweden)

    Vlad-Victor OCHEA

    2017-05-01

    I herein want to emphasise the prerequisites of the resolution of a contract according to the Romanian Civil Code of 2009. The prerequisites of the resolution of a contract are substantially different from those identified under the former fundamental civil legislation (the Romanian Civil Code of 1864). This study aims at a better understanding of the new prerequisites of the resolution of a contract: a. a fundamental non-performance of the obligation; b. an unjustified non-performance of the obligation; c. mora debitoris. The analysis of these prerequisites reveals a new possible trait of the resolution: a remedy for the non-performance of the contract rather than a sanction or a variety of contractual liability. Thus the modern legislator of the Romanian Civil Code of 2009 proposed to partially change the physiognomy of the resolution of a contract, so that it differs from the former institution and we are faced with a new legal institution. The resolution of a contract under the Romanian Civil Code of 2009 is regulated in the 5th Book (The Obligations), the second chapter (The Enforcement of the Obligations), the 5th Section (Resolution of the Contract), namely Articles 1549-1554. As will be shown below, the resolution of a contract has a homogeneous structure, without being spread across different parts of the Civil Code. The gain lies in the organisation of the new legal provisions, apparently enriched in comparison to those found in the Romanian Civil Code of 1864. Most notably, the Romanian Civil Code of 2009 preserves the Roman legacy. The modern legislator had a difficult task: 146 years of legal doctrine and jurisprudence transposed into a new legislation which, of course, has its flaws. Nevertheless, it should be praised, as it encompasses useful tools to regulate social relations.

  12. Correcting a Persistent Manhattan Project Statistical Error

    Science.gov (United States)

    Reed, Cameron

    2011-04-01

    In his 1987 autobiography, Major-General Kenneth Nichols, who served as the Manhattan Project's ``District Engineer'' under General Leslie Groves, related that when the Clinton Engineer Works at Oak Ridge, TN, was completed it was consuming nearly one-seventh (~ 14%) of the electric power being generated in the United States. This statement has been reiterated in several editions of a Department of Energy publication on the Manhattan Project. This remarkable claim has been checked against power generation and consumption figures available in Manhattan Engineer District documents, Tennessee Valley Authority records, and historical editions of the Statistical Abstract of the United States. The correct figure is closer to 0.9% of national generation. A speculation will be made as to the origin of Nichols' erroneous one-seventh figure.

  13. Studying Student Benefits of Assigning a Service-Learning Project Compared to a Traditional Final Project in a Business Statistics Class

    Science.gov (United States)

    Phelps, Amy L.; Dostilio, Lina

    2008-01-01

    The present study addresses the efficacy of using service-learning methods to meet the GAISE guidelines (http://www.amstat.org/education/gaise/GAISECollege.htm) in a second business statistics course and further explores potential advantages of assigning a service-learning (SL) project as compared to the traditional statistics project assignment.…

  14. Neuroanatomical prerequisites for language functions in the maturing brain.

    Science.gov (United States)

    Brauer, Jens; Anwander, Alfred; Friederici, Angela D

    2011-02-01

    The 2 major language-relevant cortical regions in the human brain, Broca's area and Wernicke's area, are connected via the fibers of the arcuate fasciculus/superior longitudinal fasciculus (AF/SLF). Here, we compared this pathway in adults and children and its relation to language processing during development. Comparison of fiber properties demonstrated lower anisotropy in children's AF/SLF, arguing for an immature status of this particular pathway with conceivably a lower degree of myelination. Combined diffusion tensor imaging (DTI) data and functional magnetic resonance imaging (fMRI) data indicated that in adults the termination of the AF/SLF fiber projection is compatible with functional activation in Broca's area, that is pars opercularis. In children, activation in Broca's area extended from the pars opercularis into the pars triangularis revealing an alternative connection to the temporal lobe (Wernicke's area) via the ventrally projecting extreme capsule fiber system. fMRI and DTI data converge to indicate that adults make use of a more confined language network than children based on ongoing maturation of the structural network. Our data suggest relations between language development and brain maturation and, moreover, indicate the brain's plasticity to adjust its function to available structural prerequisites.

  15. Statistics in Action: The Story of a Successful Service-Learning Project

    Science.gov (United States)

    DeHart, Mary; Ham, Jim

    2011-01-01

    The purpose of this article is to share the stories of an Introductory Statistics service-learning project in which students from both New Jersey and Michigan design and conduct phone surveys that lead to publication in local newspapers; to discuss the pedagogical benefits and challenges of the project; and to provide information for those who…

  16. Steam Generator Group Project. Progress report on data acquisition/statistical analysis

    International Nuclear Information System (INIS)

    Doctor, P.G.; Buchanan, J.A.; McIntyre, J.M.; Hof, P.J.; Ercanbrack, S.S.

    1984-01-01

    A major task of the Steam Generator Group Project (SGGP) is to establish the reliability of the eddy current inservice inspections of PWR steam generator tubing, by comparing the eddy current data to the actual physical condition of the tubes via destructive analyses. This report describes the plans for the computer systems needed to acquire, store and analyze the diverse data to be collected during the project. The real-time acquisition of the baseline eddy current inspection data will be handled using a specially designed data acquisition computer system based on a Digital Equipment Corporation (DEC) PDP-11/44. The data will be archived in digital form for use after the project is completed. Data base management and statistical analyses will be done on a DEC VAX-11/780. Color graphics will be heavily used to summarize the data and the results of the analyses. The report describes the data that will be taken during the project and the statistical methods that will be used to analyze the data. 7 figures, 2 tables

  17. Assessment of Problem-Based Learning in the Undergraduate Statistics Course

    Science.gov (United States)

    Karpiak, Christie P.

    2011-01-01

    Undergraduate psychology majors (N = 51) at a mid-sized private university took a statistics examination on the first day of the research methods course, a course for which a grade of "C" or higher in statistics is a prerequisite. Students who had taken a problem-based learning (PBL) section of the statistics course (n = 15) were compared to those…

  18. The Importance of Mathematics as a Prerequisite to Introductory Financial Accounting

    Science.gov (United States)

    McCarron, Karen B.; Burstein, Alan N.

    2017-01-01

    Mathematics has long served as a prerequisite to introductory financial accounting in the 4-year college business curriculum. However, 2-year colleges have been slower to adopt math as a prerequisite. Its usefulness in relation to achieving successful completion of accounting has not been demonstrated at either a 2-year or 4-year college. Using…

  19. Web-based diagnosis and therapy of auditory prerequisites for reading and spelling

    Directory of Open Access Journals (Sweden)

    Krammer, Sandra

    2006-11-01

    Cognitive deficits in auditory or visual processing or in verbal short-term memory are, among others, risk factors for the development of dyslexia (reading and spelling disability). By early identification and intervention (optimally before school entry), detrimental effects of these cognitive deficits on reading and spelling might be prevented. The goal of the CASPAR project is to develop and evaluate web-based tools for the diagnosis and therapy of cognitive prerequisites for reading and spelling which are appropriate for kindergarten children. In its first approach, CASPAR addresses auditory processing disorders. This article describes a computerized, web-based approach for screening and testing phoneme discrimination and for promoting phoneme discrimination abilities in kindergarteners through interactive games.

  20. Statistics on Science and Technology in Latin America, Experience with UNESCO Pilot Projects, 1972-1974.

    Science.gov (United States)

    Thebaud, Schiller

    This report examines four UNESCO pilot projects undertaken in 1972 in Brazil, Colombia, Peru, and Uruguay to study the methods used for national statistical surveys of science and technology. The projects specifically addressed the problems of comparing statistics gathered by different methods in different countries. Surveys carried out in Latin…

  1. Prerequisite Coursework as a Predictor of Performance in a Graduate Management Course

    Science.gov (United States)

    McMillan-Capehart, Amy; Adeyemi-Bello, Tope

    2008-01-01

    There have been many studies published concerning predictors of academic performance but few of these studies have examined the impact of prerequisites. As such, we investigated the impact of a prerequisite management course on graduate student performance in an Organizational Behavior (OB) course. In this longitudinal study, we explored…

  2. The success of international development projects, trust and communication: an African perspective

    Energy Technology Data Exchange (ETDEWEB)

    Diallo, A.; Thuillier, D. [Universite du Quebec a Montreal (Canada). Dept. Management et Technologie

    2005-04-01

    Project success is strongly linked to communication and cooperation between stakeholders. This research explores the relationship between trust and communication and tests the influence of these factors on project success and success criteria for international development projects financed by multilateral institutions in sub-Saharan Africa. The research analyses the coordinator's perceptions of project success, the communication climate and the interpersonal relationships between himself and his stakeholders (the task manager in the multilateral agency and the national supervisor) and within the project team. Data were collected from questionnaires completed by project coordinators of development projects. The statistical analysis confirms that trust and communication between players are proxy variables. Trust between the task manager and the coordinator is the key success factor, whereas team cohesion is the second most important factor. Trust between the coordinator and his national supervisor does not play a prominent role, although the task manager considers significant local autonomy for the coordinator a prerequisite for funding a subsequent phase when the project comes to an end. (author)

  3. Prerequisites of ideal safety-critical organizations

    International Nuclear Information System (INIS)

    Takeuchi, Michiru; Hikono, Masaru; Matsui, Yuko; Goto, Manabu; Sakuda, Hiroshi

    2013-01-01

    This study explores the prerequisites of ideal safety-critical organizations, marshalling arguments from 4 partly overlapping areas of organizational research on safety: safety culture, high reliability organizations (HROs), organizational resilience, and leadership, especially in safety-critical organizations. The approach taken in this study was to retrieve questionnaire items or checklist items from the 4 research areas and use them as materials for abduction (as referred to in the KJ method). The results showed that the prerequisites of ideal safety-oriented organizations consist of the following 9 factors: (1) The organization provides resources and infrastructure to ensure safety. (2) The organization has a sharable vision. (3) Management attaches importance to safety. (4) Employees openly communicate issues and share wide-ranging information with each other. (5) Adjustments and improvements are made as the organization's situation changes. (6) Learning activities from mistakes and failures are performed. (7) Management creates a positive work environment and promotes good relations in the workplace. (8) Workers have good relations in the workplace. (9) Employees have all the necessary requirements to undertake their own functions, and act conservatively. (author)

  4. A Familiar(ity) Problem: Assessing the Impact of Prerequisites and Content Familiarity on Student Learning.

    Directory of Open Access Journals (Sweden)

    Justin F Shaffer

    Prerequisites are embedded in most STEM curricula. However, the assumption that the content presented in these courses will improve learning in later courses has not been verified. Because a direct comparison of performance between students with and without required prerequisites is logistically difficult to arrange in a randomized fashion, we developed a novel familiarity scale, and used this to determine whether concepts introduced in a prerequisite course improved student learning in a later course (in two biology disciplines). Exam questions in the latter courses were classified into three categories, based on the degree to which the tested concept had been taught in the prerequisite course. If content familiarity mattered, it would be expected that exam scores on topics covered in the prerequisite would be higher than scores on novel topics. We found this to be partially true for "Very Familiar" questions (concepts covered in depth in the prerequisite). However, scores for concepts only briefly discussed in the prerequisite ("Familiar") were indistinguishable from performance on topics that were "Not Familiar" (concepts only taught in the later course). These results imply that merely "covering" topics in a prerequisite course does not result in improved future performance, and that some topics may be able to be removed from a course, thereby freeing up class time. Our results may therefore support the implementation of student-centered teaching methods such as active learning, as the time-intensive nature of active learning has been cited as a barrier to its adoption. In addition, we propose that our familiarity system could be broadly utilized to aid in the assessment of the effectiveness of prerequisites.

  5. A Familiar(ity) Problem: Assessing the Impact of Prerequisites and Content Familiarity on Student Learning.

    Science.gov (United States)

    Shaffer, Justin F; Dang, Jennifer V; Lee, Amanda K; Dacanay, Samantha J; Alam, Usman; Wong, Hollie Y; Richards, George J; Kadandale, Pavan; Sato, Brian K

    2016-01-01

    Prerequisites are embedded in most STEM curricula. However, the assumption that the content presented in these courses will improve learning in later courses has not been verified. Because a direct comparison of performance between students with and without required prerequisites is logistically difficult to arrange in a randomized fashion, we developed a novel familiarity scale, and used this to determine whether concepts introduced in a prerequisite course improved student learning in a later course (in two biology disciplines). Exam questions in the latter courses were classified into three categories, based on the degree to which the tested concept had been taught in the prerequisite course. If content familiarity mattered, it would be expected that exam scores on topics covered in the prerequisite would be higher than scores on novel topics. We found this to be partially true for "Very Familiar" questions (concepts covered in depth in the prerequisite). However, scores for concepts only briefly discussed in the prerequisite ("Familiar") were indistinguishable from performance on topics that were "Not Familiar" (concepts only taught in the later course). These results imply that merely "covering" topics in a prerequisite course does not result in improved future performance, and that some topics may be able to be removed from a course, thereby freeing up class time. Our results may therefore support the implementation of student-centered teaching methods such as active learning, as the time-intensive nature of active learning has been cited as a barrier to its adoption. In addition, we propose that our familiarity system could be broadly utilized to aid in the assessment of the effectiveness of prerequisites.

  6. Number projected statistics and the pairing correlations at high excitation energies

    International Nuclear Information System (INIS)

    Esebbag, C.; Egido, J.L.

    1993-01-01

    We analyze the use of particle-number projected statistics (PNPS) as an effective way to include the quantum and statistical fluctuations, associated with the pairing degree of freedom, that are left out in finite-temperature mean-field theories. As a numerical application, the exactly soluble degenerate model is worked out. In particular, we find that the sharp temperature-induced superfluid-normal phase transition, predicted in the mean-field approximations, is washed out in the PNPS. Some approximations, as well as the Landau prescription to include statistical fluctuations, are also discussed. We find that the Landau prescription provides a reasonable approximation to the PNPS. (orig.)

  7. The GenABEL Project for statistical genomics [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Lennart C. Karssen

    2016-05-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination.

  8. The CASE Project: Evaluation of Case-Based Approaches to Learning and Teaching in Statistics Service Courses

    Science.gov (United States)

    Fawcett, Lee

    2017-01-01

    The CASE project (Case-based Approaches to Statistics Education; see www.mas.ncl.ac.uk/~nlf8/innovation) was established to investigate how the use of real-life, discipline-specific case study material in Statistics service courses could improve student engagement, motivation, and confidence. Ultimately, the project aims to promote deep learning…

  9. Prerequisites for Successful Strategic Partnerships for Sustainable Building Renovation

    DEFF Research Database (Denmark)

    Jensen, Per Anker; Johansen, Jakob Berg; Thuesen, Christian

    The purpose of this paper is to identify the prerequisites for establishing successful strategic partnerships in relation to renovating buildings sustainably. Establishing strategic partnerships is in the paper seen as a potential way to make building renovation more sustainable in Denmark … and analysis of strategic partnership models as well as typical processes used in building renovation. Experiences from the development of new strategic partnerships have particularly been found in the UK and Sweden. Based on two workshops with practitioners representing the whole value chain in the construction industry and analyses of two exemplary cases, the paper suggests prerequisites for establishing successful strategic partnerships for sustainable building renovation. The results show that strategic partnerships are collaborations set up between two or more organizations that remain independent …

  10. The Effect of Project Based Learning on the Statistical Literacy Levels of Student 8th Grade

    Science.gov (United States)

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study examines the effect of project-based learning on 8th grade students' statistical literacy levels. A performance test was developed for this aim. A quasi-experimental research model was used. In this context, statistics was taught with the traditional method in the control group and was taught using project-based…

  11. 18 Prerequisite for Sustainable Agricultural Development in the Sub ...

    African Journals Online (AJOL)

    User

    2011-07-21

    Jul 21, 2011 ... Keywords: Prerequisite, agricultural development, sustainable .... into many areas of policy and public provision, reducing subsidies and bringing ... indirectly influence agricultural prices is often far greater than the effects of.

  12. Statistical language learning in neonates revealed by event-related brain potentials

    Directory of Open Access Journals (Sweden)

    Näätänen Risto

    2009-03-01

    Background: Statistical learning is a candidate for one of the basic prerequisites underlying the expeditious acquisition of spoken language. Infants from 8 months of age exhibit this form of learning to segment fluent speech into distinct words. To test the statistical learning skills at birth, we recorded event-related brain responses of sleeping neonates while they were listening to a stream of syllables containing statistical cues to word boundaries. Results: We found evidence that sleeping neonates are able to automatically extract statistical properties of the speech input and thus detect the word boundaries in a continuous stream of syllables containing no morphological cues. Syllable-specific event-related brain responses found in two separate studies demonstrated that the neonatal brain treated the syllables differently according to their position within pseudowords. Conclusion: These results demonstrate that neonates can efficiently learn transitional probabilities or frequencies of co-occurrence between different syllables, enabling them to detect word boundaries and in this way isolate single words out of fluent natural speech. The ability to adopt statistical structures from speech may play a fundamental role as one of the earliest prerequisites of language acquisition.
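
    The transitional probabilities referred to above are simply conditional frequencies with which one syllable follows another. The toy stream below is invented purely to illustrate the computation:

```python
# Toy estimate of transitional probabilities P(next syllable | current syllable)
# from a continuous syllable stream. The stream itself is invented.
from collections import Counter, defaultdict

stream = "tu pi ro go la bu tu pi ro pa do ti go la bu tu pi ro".split()

counts = defaultdict(Counter)
for current, nxt in zip(stream, stream[1:]):
    counts[current][nxt] += 1

for current, followers in sorted(counts.items()):
    total = sum(followers.values())
    for nxt, n in sorted(followers.items()):
        print(f"P({nxt} | {current}) = {n / total:.2f}")
# High within-word probabilities (e.g. tu -> pi) versus lower across-word
# probabilities are the statistical cue to word boundaries.
```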

  13. Mathematics authentic assessment on statistics learning: the case for student mini projects

    Science.gov (United States)

    Fauziah, D.; Mardiyana; Saputro, D. R. S.

    2018-03-01

    Mathematics authentic assessment is a form of meaningful measurement of student learning outcomes in the spheres of attitude, skill and knowledge in mathematics. The construction of attitude, skill and knowledge is achieved through the fulfilment of tasks which involve the active and creative role of the students. One type of authentic assessment is the student mini project, which runs from planning, data collection and organization through processing, analysis and presentation of the data. The purpose of this research is to study the process of using authentic assessment in statistics learning as conducted by teachers, and specifically to discuss the use of mini projects to improve students' learning in schools in Surakarta. This research is an action research in which the data were collected through the assessment rubrics of student mini projects. The analysis shows that the average rubric score for student mini projects is 82, with 96% classical completeness. This study shows that the application of authentic assessment can improve students' mathematics learning outcomes. Findings showed that teachers and students participate actively during the teaching and learning process, both inside and outside the school. Student mini projects also provide opportunities to interact with other people in a real context while collecting information and giving presentations to the community. Additionally, students are able to achieve more in the process of statistics learning using authentic assessment.

  14. Prerequisites for Successful Strategic Partnerships for Sustainable Building Renovation

    DEFF Research Database (Denmark)

    Jensen, Per Anker; Johansen, Jakob Berg; Thuesen, Christian

    2017-01-01

    The purpose of this paper is to identify the prerequisites for establishing successful strategic partnerships in relation to renovating buildings sustainably. Establishing strategic partnerships is in the paper seen as a potential way to make building renovation more sustainable in Denmark … industry and analyses of two exemplary cases, the paper suggests prerequisites for establishing successful strategic partnerships for sustainable building renovation. The results show that strategic partnerships are collaborations set up between two or more organizations that remain independent … particularly in terms of reducing energy consumption and use of resources and increasing productivity. However, until now we have only had a limited number of such partnerships implemented, and the few examples that do exist mostly concern the construction of new buildings. The paper is based on an investigation…

  15. Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Hoa T. [Univ. of Utah, Salt Lake City, UT (United States); Stone, Daithi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-01

    An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or susceptibility to error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in the data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in the data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based on averaging.
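
    As a rough illustration of basing a reduced-resolution representation on a statistical measure that preserves variation (a sketch of the general concept, not the authors' pipeline):

```python
# Block-wise downsampling of a 2D field that stores the variance of each block
# alongside the usual block mean, so the variation signal survives the
# resolution reduction. Purely illustrative; not the authors' implementation.
import numpy as np

field = np.random.default_rng(0).normal(size=(512, 512))   # synthetic 2D data
block = 8
h, w = field.shape
blocks = field.reshape(h // block, block, w // block, block)

low_res_mean = blocks.mean(axis=(1, 3))   # conventional averaged downsample
low_res_var = blocks.var(axis=(1, 3))     # statistical summary preserving variation

print(low_res_mean.shape, low_res_var.shape)   # (64, 64) (64, 64)
```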

  16. Indigenous dress as a prerequisite for cultural preservation in ...

    African Journals Online (AJOL)

    Indigenous dress as a prerequisite for cultural preservation in traditional African festival: ... EJOTMAS: Ekpoma Journal of Theatre and Media Arts ... The study concludes that in this era of globalization where western influence seems to be the ...

  17. Mathematical Representation Ability by Using Project Based Learning on the Topic of Statistics

    Science.gov (United States)

    Widakdo, W. A.

    2017-09-01

    Given the importance of the role of mathematics in everyday life, mastery of the subject areas of mathematics is a must. Representation ability is one of the fundamental abilities used in mathematics to connect abstract ideas with logical thinking in understanding mathematics. The researcher observed a lack of mathematical representation ability and sought an alternative solution to address it by using project-based learning. This research uses a literature study of books and journal articles to examine the importance of mathematical representation ability in mathematics learning and how project-based learning is able to increase this ability on the topic of statistics. The indicators for mathematical representation ability in this research are classified as visual representation (picture, diagram, graph, or table); symbolic representation (mathematical statements, mathematical notation, numerical/algebraic symbols); and verbal representation (written text). This article explains why project-based learning is able to influence students' mathematical representation, drawing on theories from cognitive psychology, and shows an example of project-based learning that can be used in teaching statistics, one of the mathematics topics that is very useful for analyzing data.

  18. A Study of the Comparative Effectiveness of Zoology Prerequisites at Slippery Rock State College.

    Science.gov (United States)

    Morrison, William Sechler

    This study compared the effectiveness of three sequences of prerequisite courses required before taking zoology. The Sequence 1 prerequisite courses consisted of general biology and human biology; Sequence 2 consisted of general biology; and Sequence 3 required cell biology. Zoology students in the spring of 1972 were given a pretest and a posttest. The mean…

  19. Prerequisite programs at schools: diagnosis and economic evaluation.

    Science.gov (United States)

    Lockis, Victor R; Cruz, Adriano G; Walter, Eduardo H M; Faria, Jose A F; Granato, Daniel; Sant'Ana, Anderson S

    2011-02-01

    In this study, 20 Brazilian public schools were assessed regarding the implementation of good manufacturing practices and standard sanitation operating procedures. We used a checklist comprising 10 parts (facilities and installations, water supply, equipment and tools, pest control, waste management, personal hygiene, sanitation, storage, documentation, and training), for a total of 69 questions. The cost of correcting the nonconformities found was also determined, so that technical data could serve as a basis for prioritizing decisions. The average nonconformity percentage at the schools with respect to the prerequisite program was 36%: 66% of the schools had inadequate installations, 65% inadequate waste management, 44% inadequate documentation, and 35% inadequate water supply and sanitation. The initial estimated cost of the changes was U.S.$24,438, with monthly investments of 1.55% of the initially required investment. This would result in a U.S.$0.015 increase in the cost of each served meal to recover the investment within a year. Thus, we conclude that such modifications are economically feasible and should be considered among the technical requirements when prerequisite program implementation priorities are established.

  20. Prerequisites to promote energy efficiency investments in Bulgaria

    International Nuclear Information System (INIS)

    Boernsen, O.

    1994-01-01

    The PHARE Energy Programme team's observations and advice to the Committee of Energy in Bulgaria are outlined. Compared with the Western European countries, energy intensity in Bulgaria is 2-3 times higher. This is explained by the energy-intensive industrial structure and the old and depreciated capital equipment. Cost-covering energy prices would make energy efficiency investment financially feasible and would attract financiers. The lesson from Western European experience, however, is that the availability of finance capital and cost-reflecting energy prices is not at all a necessary prerequisite for energy efficiency improvement; such improvement can be achieved with no-cost or low-cost measures. The potential for energy efficiency in industry (which consumes more than 50% of the energy) is 11%-20%; in buildings, 6%; and in transport, 4%. There are other obstacles, such as lack of information, other business interests and a lack of internal expertise, especially for small and medium-sized industries. The basic prerequisites for improving energy efficiency are raising awareness and changing the management culture, as well as radical change in organisational and management structures. (orig.)

  1. PRICES - PREREQUISITE OF MARKET DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    VĂDUVA MARIA

    2017-08-01

    Full Text Available Prices are the key points of transfer and interactions. Balance means knowing the real demand and adapting supply to its level and structure. In studying prices, knowledge of their economic content and of the mechanism by which they are formed in the exchange process is a crucial prerequisite for moving from theoretical foundations to the practical foundations of concrete modalities and pricing techniques. If demand can absorb the output of the enterprises considered, then the manufacturer is concerned with determining the level of production that will yield maximum profit, the profitability threshold and the elasticity of supply with respect to price, in order to choose the best outlet. Price depends on the intersection of demand and supply.

  2. Students' Perceptions on Intrapreneurship Education--Prerequisites for Learning Organisations

    Science.gov (United States)

    Kansikas, Juha; Murphy, Linda

    2010-01-01

    The aim of this qualitative study is to understand the prerequisites for learning organisations (LO) as perceived by university students. Intrapreneurship education offers possibilities to increase student's adaptation of learning organisation's climate and behaviour. By analysing students' perceptions, more information about learning organisation…

  3. Prerequisites for Forming the Institutional Concept of the National Economy Competitiveness under Conditions of Globalization

    Directory of Open Access Journals (Sweden)

    Yaremenko Oleh L.

    2015-03-01

    Full Text Available The article attempts to prove that under conditions of globalization objective and subjective prerequisites have developed for forming an institutional concept of the national economy. The objective prerequisites are the newest information and communication technologies, post-industrial trends and the market transformation of civilization intensified by globalization. Under such conditions, instability and volatility of the institutional environment are observed both within national economies and at the international level. The aggravation of global competition between national economies heightens the role of such institutional factors as the political system, property, public administration, economic organization, culture, etc. The subjective prerequisites are related to the fact that institutional economic theory is currently one of the leading trends in modern world and Ukrainian economic thought. Interest in it is explained not only by the fact that it overcomes the limitations of a number of the mainstream's premises, but also because it allows modern economic processes to be considered in their complexity

  4. Prerequisite Change and Its Effect on Intermediate Accounting Performance

    Science.gov (United States)

    Huang, Jiunn; O'Shaughnessy, John; Wagner, Robin

    2005-01-01

    As of Fall 1996, San Francisco State University changed its introductory financial accounting course to focus on a "user's" perspective, de-emphasizing the accounting cycle. Anticipating that these changes could impair subsequent performance, the Department of Accounting instituted a new prerequisite for intermediate accounting: Students would…

  5. Prerequisites for successful nuclear generation in southern Africa

    International Nuclear Information System (INIS)

    Semark, P.

    1990-01-01

    The prerequisites and the requisites for successful nuclear powered electricity generation in southern Africa are explored. There are four elements essential to success, namely, the mission or vision; the appropriate means; the right and sufficient time, and the skilled, committed executor. The ongoing success of nuclear powered electricity generation in South Africa is discussed in the light of these four elements. 2 ills

  6. A Laboratory Experiment, Based on the Maillard Reaction, Conducted as a Project in Introductory Statistics

    Science.gov (United States)

    Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh

    2005-01-01

    A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…

  7. The effect of project-based learning on students' statistical literacy levels for data representation

    Science.gov (United States)

    Koparan, Timur; Güven, Bülent

    2015-07-01

    The aim of this study is to determine the effect of the project-based learning approach on 8th grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before and once after the intervention. All raw scores were converted into linear scores using the Winsteps 3.72 modelling program, which performs Rasch analysis, and t-tests and an ANCOVA were then carried out on the linear scores. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the intervention were shown through the obtained person-item maps.
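
    The analysis pipeline described above, converting raw scores to interval-level measures and then comparing groups while adjusting for the pretest, can be illustrated with a small ANCOVA sketch. The data, group labels and effect size below are hypothetical, and an ordinary least squares model on simulated scores stands in for the Rasch-calibrated measures produced by Winsteps.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(42)

        # Hypothetical linearized (interval-level) literacy scores for 35 + 35 students.
        n = 35
        pre = rng.normal(50, 10, 2 * n)
        group = np.repeat(["control", "project_based"], n)
        effect = np.where(group == "project_based", 6.0, 0.0)  # assumed treatment effect
        post = 15 + 0.7 * pre + effect + rng.normal(0, 8, 2 * n)

        df = pd.DataFrame({"pre": pre, "post": post, "group": group})

        # ANCOVA: post-test scores modelled on group membership, adjusting for pre-test.
        model = smf.ols("post ~ pre + C(group)", data=df).fit()
        print(model.summary().tables[1])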

  8. A statistical proposal for environmental impact assessment of development projects

    International Nuclear Information System (INIS)

    Plazas C, Julian A; De J Lema T, Alvaro; Leon P, Juan Diego

    2009-01-01

    Environmental impact assessment of development projects is a fundamental process whose main goal is to avoid serious negative consequences for the environment from their construction and operation. Some of the most important limitations of the models employed to assess environmental impacts are the subjectivity of their parameters and weights, and the multicollinearity among the variables, which carry large amounts of similar information. This work presents a multivariate statistical method that tries to reduce these limitations. For this purpose, the environmental impact assessment is evaluated through different environmental impact attributes and environmental elements, synthesized in an environmental quality index (ICA in Spanish). The ICA can be applied at different levels, for example to a project as a whole, or partially to one or several environmental components.

  9. The Legal Prerequisites of Juvenile Delinquency Mediation Institution Creation

    Directory of Open Access Journals (Sweden)

    Zabuga E. E.

    2012-11-01

    Full Text Available In this article the author analyzes the criminal procedure legislation of the Russian Federation and stresses the presence of prerequisites for creating a mediation institution for juvenile delinquency cases. In particular, the legal preconditions of utmost importance at the international and national levels are considered

  10. Prerequisites for sustainable care improvement using the reflective team as a work model.

    Science.gov (United States)

    Jonasson, Lise-Lotte; Carlsson, Gunilla; Nyström, Maria

    2014-01-01

    Several work models for care improvement have been developed in order to meet the requirement for evidence-based care. This study examines a work model for reflection, entitled the reflective team (RT). The main idea behind RTs is that caring skills exist among those who work closest to the patients. The team leader (RTL) encourages sustainable care improvement, rooted in research and proven experience, by using a lifeworld perspective to stimulate further reflection and a developmental process leading to research-based caring actions within the team. In order to maintain focus, it is important that the RTL has a clear idea of what sustainable care improvement means, and what the prerequisites are for such improvement. The aim of the present study is, therefore, to explore the prerequisites for improving sustainable care, seeking to answer how RTLs perceive these and use RTs for concrete planning. Nine RTLs were interviewed, and their statements were phenomenographically analysed. The analysis revealed three separate qualitative categories, which describe personal, interpersonal, and structural aspects of the prerequisites. In the discussion, these categories are compared with previous research on reflection, and the conclusion is reached that the optimal conditions for RTs to work, when focussed on sustainable care improvement, occur when the various aspects of the prerequisites are intertwined and become a natural part of the reflective work.

  11. 38 CFR 42.6 - Prerequisites for issuing a complaint.

    Science.gov (United States)

    2010-07-01

    ... this section), the amount of money or the value of property or services, or both, demanded or requested... are unrelated or were not submitted simultaneously, regardless of the amount of money, or the value of... AFFAIRS (CONTINUED) STANDARDS IMPLEMENTING THE PROGRAM FRAUD CIVIL REMEDIES ACT § 42.6 Prerequisites for...

  12. 22 CFR 35.6 - Prerequisites for issuing a complaint.

    Science.gov (United States)

    2010-04-01

    ... this section), the amount of money or the value of property or services demanded or requested in... simultaneously, regardless of the amount of money, or the value of property or services, demanded or requested. ... § 35.6 Prerequisites for issuing a complaint. (a) The reviewing official may issue a complaint under...

  13. 10 CFR 1013.6 - Prerequisites for issuing a complaint.

    Science.gov (United States)

    2010-01-01

    ... in paragraph (b) of this section), the amount of money or the value of property or services demanded... that are unrelated or were not submitted simultaneously, regardless of the amount of money, or the... § 1013.6 Prerequisites for issuing a complaint. (a) The reviewing official may issue a complaint under...

  14. The Effect on the 8th Grade Students' Attitude towards Statistics of Project Based Learning

    Science.gov (United States)

    Koparan, Timur; Güven, Bülent

    2014-01-01

    This study investigates the effect of the project based learning approach on 8th grade students' attitude towards statistics. With this aim, an attitude scale towards statistics was developed. Quasi-experimental research model was used in this study. Following this model in the control group the traditional method was applied to teach statistics…

  15. Gender-Equal Organizations as a Prerequisite for Workplace Learning

    Science.gov (United States)

    Johansson, Kristina; Abrahamsson, Lena

    2018-01-01

    Purpose: This paper aims to explore how gendering of the learning environment acts to shape the design and outcome of workplace learning. The primary intention is to reflect on the idea of gender-equal organizations as a prerequisite for workplace learning. Design/methodology/approach: A review of literature relating to gender and workplace…

  16. 20 CFR 355.6 - Prerequisites for issuing a complaint.

    Science.gov (United States)

    2010-04-01

    ...), the amount of money or the value of property or services demanded or requested in violation of § 355.3... simultaneously, regardless of the amount of money or the value of property or services demanded or requested. ... OR STATEMENTS REGULATIONS UNDER THE PROGRAM FRAUD CIVIL REMEDIES ACT OF 1986 § 355.6 Prerequisites...

  17. Sensitivity Analysis of Arctic Sea Ice Extent Trends and Statistical Projections Using Satellite Data

    Directory of Open Access Journals (Sweden)

    Ge Peng

    2018-02-01

    Full Text Available An ice-free Arctic summer would have pronounced impacts on global climate, coastal habitats, national security, and the shipping industry. Rapid and accelerated Arctic sea ice loss has placed the reality of an ice-free Arctic summer even closer to the present day. Accurate projection of the first Arctic ice-free summer year is extremely important for business planning and climate change mitigation, but the projection can be affected by many factors. Using an inter-calibrated satellite sea ice product, this article examines the sensitivity of decadal trends of Arctic sea ice extent and statistical projections of the first occurrence of an ice-free Arctic summer. The projection based on the linear trend of the last 20 years of data places the first Arctic ice-free summer year at 2036, 12 years earlier compared to that of the trend over the last 30 years. The results from a sensitivity analysis of six commonly used curve-fitting models show that the projected timings of the first Arctic ice-free summer year tend to be earlier for exponential, Gompertz, quadratic, and linear with lag fittings, and later for linear and log fittings. Projections of the first Arctic ice-free summer year by all six statistical models appear to converge to the 2037 ± 6 timeframe, with a spread of 17 years, and the earliest first ice-free Arctic summer year at 2031.
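
    To make the projection procedure concrete, here is a minimal sketch of the simplest of the six fittings, a linear trend extrapolated to a nominal ice-free threshold, applied to synthetic September extents. The extent values, the cubic decline used to generate them and the 1 million km2 threshold are assumptions for illustration; the study itself uses an inter-calibrated satellite record and five further curve families.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical September Arctic sea-ice extents (million km^2) with an
        # accelerating decline, standing in for the satellite record.
        years = np.arange(1979, 2017)
        extent = 7.8 - 0.00008 * (years - 1979) ** 3 + rng.normal(0, 0.25, years.size)

        threshold = 1.0  # an "ice-free" summer is commonly taken as < 1 million km^2

        def linear_projection(yrs, ext):
            """Fit a linear trend and extrapolate to the ice-free threshold."""
            slope, intercept = np.polyfit(yrs, ext, 1)
            return (threshold - intercept) / slope

        print(f"trend of last 30 years -> {linear_projection(years[-30:], extent[-30:]):.0f}")
        print(f"trend of last 20 years -> {linear_projection(years[-20:], extent[-20:]):.0f}")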

  18. Classical and statistical thermodynamics

    CERN Document Server

    Rizk, Hanna A

    2016-01-01

    This is a textbook of thermodynamics for the student who seeks thorough training in science or engineering. Systematic and thorough treatment of the fundamental principles, rather than the presentation of a large mass of facts, has been stressed. The book includes some of the historical and humanistic background of thermodynamics, but without affecting the continuity of the analytical treatment. For a clearer and more profound understanding of thermodynamics this book is highly recommended. In this respect, the author believes that a sound grounding in classical thermodynamics is an essential prerequisite for the understanding of statistical thermodynamics. A book comprising these two broad branches of thermodynamics is in fact unprecedented. As a work dealing systematically with the two main branches of thermodynamics, namely classical thermodynamics and statistical thermodynamics, together with some important indexes under one cover, this treatise is eminently useful.

  19. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    Science.gov (United States)

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  20. Statistical Analysis of the Polarimetric Cloud Analysis and Seeding Test (POLCAST) Field Projects

    Science.gov (United States)

    Ekness, Jamie Lynn

    The North Dakota farming industry brings in more than $4.1 billion annually in cash receipts. Unfortunately, agricultural sales vary significantly from year to year, due in large part to weather events such as hail storms and droughts. One method to mitigate drought is to use hygroscopic seeding to increase the precipitation efficiency of clouds. The North Dakota Atmospheric Research Board (NDARB) sponsored the Polarimetric Cloud Analysis and Seeding Test (POLCAST) research project to determine the effectiveness of hygroscopic seeding in North Dakota. The POLCAST field projects obtained airborne and radar observations while conducting randomized cloud seeding. The Thunderstorm Identification Tracking and Nowcasting (TITAN) program is used to analyze radar data (33 usable cases) to determine differences in storm duration, rain rate and total rain amount between seeded and non-seeded clouds. The single ratio of seeded to non-seeded cases is 1.56 (0.28 mm/0.18 mm), a 56% increase in average hourly rainfall during the first 60 minutes after target selection. A seeding effect is also indicated by the lifetime of the storms, which increases by 41% between non-seeded and seeded clouds for the first 60 minutes past the seeding decision. A double ratio statistic, which compares the radar-derived rain amount of the last 40 minutes of a case (seeded/non-seeded) to that of the first 20 minutes (seeded/non-seeded), is used to account for the natural variability of the cloud systems and gives a double ratio of 1.85. The Mann-Whitney test on the double ratio of seeded to non-seeded cases (33 cases) gives a significance (p-value) of 0.063. Bootstrapping analysis of the POLCAST data set indicates that 50 cases would provide statistically significant results based on the Mann-Whitney test of the double ratio. All the statistical analyses conducted on the POLCAST data set show that hygroscopic seeding in North Dakota does increase precipitation. While an additional POLCAST field
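
    The ratio statistics described above can be reproduced on made-up data. In the sketch below the per-case rain amounts, the number of cases and the use of per-case growth factors as the input to the Mann-Whitney test are all assumptions for illustration; they are not the POLCAST measurements.

        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(7)

        # Hypothetical radar-derived rain amounts (mm) for the first 20 minutes and the
        # last 40 minutes of each case: 17 seeded and 16 non-seeded cases.
        def make_cases(n, growth):
            first20 = rng.gamma(2.0, 0.05, n)                    # natural variability
            last40 = first20 * growth * rng.lognormal(0, 0.4, n)
            return first20, last40

        seed_f, seed_l = make_cases(17, growth=2.0)
        ctrl_f, ctrl_l = make_cases(16, growth=1.2)

        # Single ratio: seeded vs non-seeded mean rainfall after the seeding decision.
        single_ratio = seed_l.mean() / ctrl_l.mean()

        # Double ratio: the late-period ratio normalised by the early-period ratio,
        # which accounts for natural differences between the sampled clouds.
        double_ratio = (seed_l.mean() / ctrl_l.mean()) / (seed_f.mean() / ctrl_f.mean())

        # Mann-Whitney test on the per-case growth factors (one plausible reading of
        # "the Mann-Whitney test on the double ratio").
        u, p = mannwhitneyu(seed_l / seed_f, ctrl_l / ctrl_f, alternative="greater")

        print(f"single ratio {single_ratio:.2f}, double ratio {double_ratio:.2f}, p = {p:.3f}")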

  1. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    Science.gov (United States)

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  2. Statistical analysis of real-time, enviromental radon monitoring results at the Fernald Enviromental Management Project

    International Nuclear Information System (INIS)

    Liu, Ning; Spitz, H.B.; Tomezak, L.

    1996-01-01

    A comprehensive real-time, environmental radon monitoring program is being conducted at the Fernald Environmental Management Project, where a large quantity of radium-bearing residues is stored in two covered, earth-bermed silos. Statistical analyses were conducted to determine what impact radon emitted by the radium-bearing materials contained in the silos has on the ambient radon concentration at the Fernald Environmental Management Project site. The distribution that best describes the outdoor radon monitoring data was determined before the statistical analyses were conducted. Random effects associated with the selection of radon monitoring locations were accommodated by using nested and nested-factorial classification models. The Project site was divided into four general areas according to their characteristics and functions: (1) the silo area, where the radium-bearing waste is stored; (2) the production/administration area; (3) the perimeter area, or fence-line, of the Fernald Environmental Management Project site; and (4) a background area, located approximately 13 km from the Fernald Environmental Management Project site, representing the naturally occurring radon concentration. A total of 15 continuous, hourly-readout radon monitors were installed to measure the outdoor radon concentration. Measurement results from each individual monitor were found to be log-normally distributed. A series of contrast tests, which take random effects into account, was performed to compare the radon concentrations between different areas of the site. These comparisons demonstrate that the radon concentrations in the production/administration area and the perimeter area are statistically equal to the natural background, whereas the concentration in the silo area is significantly higher than background. The study also showed that the radon concentration in the silo area was significantly reduced after a sealant barrier was applied to the contents of the silos. 10 refs., 6 figs., 8 tabs
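
    Two of the steps described above, checking that each monitor's readings are log-normal and then comparing areas, can be roughly illustrated as follows. The synthetic concentrations, the number of monitors per area and the use of monitor-level geometric means with a simple t-test are assumptions for illustration; the study's nested random-effects contrast tests are not reproduced here.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        # Synthetic hourly radon concentrations (pCi/L) for monitors grouped by area.
        areas = {
            "background": [rng.lognormal(np.log(0.4), 0.5, 2000) for _ in range(3)],
            "silo": [rng.lognormal(np.log(0.9), 0.5, 2000) for _ in range(4)],
        }

        # Step 1: the log of a log-normal variable should look normal.
        for area, monitors in areas.items():
            w, p = stats.shapiro(np.log(monitors[0][:500]))
            print(f"{area:10s} monitor 1: Shapiro-Wilk on log data, p = {p:.2f}")

        # Step 2: compare areas using monitor-level geometric means as the unit of
        # analysis (a crude stand-in for the nested-model contrast tests).
        gm = {a: [np.exp(np.log(m).mean()) for m in mons] for a, mons in areas.items()}
        t, p = stats.ttest_ind(gm["silo"], gm["background"], equal_var=False)
        print(f"silo vs background geometric means: t = {t:.2f}, p = {p:.3f}")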

  3. The Healthy ALLiances (HALL) framework: prerequisites for success.

    Science.gov (United States)

    Koelen, Maria A; Vaandrager, Lenneke; Wagemakers, Annemarie

    2012-04-01

    Chronic conditions are on the rise worldwide, and there is increasingly a call for the primary care and public health sectors to join forces in alliances. GPs have an important role to play in such alliances. However, successful cooperation is not as obvious as it may seem, and the sectors are not used to working together. The objective is to identify conditions and prerequisites for successful alliances. Identification of conditions and prerequisites is mainly based on stepwise analysis and iterative developments in research on collaboration processes in the area of health promotion and public health. The process as a whole resulted in the framework presented in this paper. This so-called HALL framework identifies three clusters of factors that either hinder or facilitate the success of alliances: (i) institutional factors, (ii) personal factors of participants in the alliance and (iii) factors relating to the organization of the alliance. The institutional and personal factors 'stick' to the stakeholders and are brought into the alliance. The third group refers to the lessons learned from dealing with the first two characteristics to make the alliance successful. Partners in alliances bring in personal attributes and institutional characteristics that can form obstacles to successful alliances, but, when they are addressed in a flexible and positive way, obstacles can be turned into contributory factors, leading to many potential benefits, such as collaborative learning and innovation.

  4. 7 CFR 70.52 - Prerequisites to packaging ready-to-cook poultry or rabbits identified with consumer grademarks.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Prerequisites to packaging ready-to-cook poultry or... ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) VOLUNTARY GRADING OF POULTRY PRODUCTS AND... Prerequisites to packaging ready-to-cook poultry or rabbits identified with consumer grademarks. The official...

  5. Projecting future precipitation and temperature at sites with diverse climate through multiple statistical downscaling schemes

    Science.gov (United States)

    Vallam, P.; Qin, X. S.

    2017-10-01

    Anthropogenic-driven climate change would affect the global ecosystem and is becoming a world-wide concern. Numerous studies have been undertaken to determine the future trends of meteorological variables at different scales. Despite these studies, there remains significant uncertainty in the prediction of future climates. To examine the uncertainty arising from using different schemes to downscale the meteorological variables for the future horizons, projections from different statistical downscaling schemes were examined. These schemes included statistical downscaling method (SDSM), change factor incorporated with LARS-WG, and bias corrected disaggregation (BCD) method. Global circulation models (GCMs) based on CMIP3 (HadCM3) and CMIP5 (CanESM2) were utilized to perturb the changes in the future climate. Five study sites (i.e., Alice Springs, Edmonton, Frankfurt, Miami, and Singapore) with diverse climatic conditions were chosen for examining the spatial variability of applying various statistical downscaling schemes. The study results indicated that the regions experiencing heavy precipitation intensities were most likely to demonstrate the divergence between the predictions from various statistical downscaling methods. Also, the variance computed in projecting the weather extremes indicated the uncertainty derived from selection of downscaling tools and climate models. This study could help gain an improved understanding about the features of different downscaling approaches and the overall downscaling uncertainty.
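
    Of the schemes compared above, the change-factor approach is the simplest to illustrate: the observed series is perturbed by the ratio of the GCM-simulated future to baseline climatology. The sketch below applies hypothetical monthly change factors to a synthetic daily precipitation record; it is a toy version of the idea, not the SDSM, LARS-WG or bias-corrected disaggregation implementations used in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical observed daily precipitation (mm) for a 30-year baseline on a
        # simplified 360-day calendar, plus GCM monthly means for baseline and future.
        days = 360 * 30
        month_of_day = np.tile(np.repeat(np.arange(12), 30), 30)
        obs = rng.gamma(0.4, 6.0, days) * (rng.random(days) < 0.45)

        gcm_base_monthly = np.array([60, 55, 50, 45, 40, 35, 30, 35, 45, 55, 60, 65.0])
        gcm_future_monthly = gcm_base_monthly * rng.uniform(0.9, 1.3, 12)

        # Multiplicative change factor per calendar month, applied to the observed series.
        change_factor = gcm_future_monthly / gcm_base_monthly
        future_series = obs * change_factor[month_of_day]

        print("observed annual mean precipitation :", round(obs.sum() / 30, 1))
        print("projected annual mean precipitation:", round(future_series.sum() / 30, 1))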

  6. Statistical Emulation of Climate Model Projections Based on Precomputed GCM Runs*

    KAUST Repository

    Castruccio, Stefano

    2014-03-01

    The authors describe a new approach for emulating the output of a fully coupled climate model under arbitrary forcing scenarios that is based on a small set of precomputed runs from the model. Temperature and precipitation are expressed as simple functions of the past trajectory of atmospheric CO2 concentrations, and a statistical model is fit using a limited set of training runs. The approach is demonstrated to be a useful and computationally efficient alternative to pattern scaling and captures the nonlinear evolution of spatial patterns of climate anomalies inherent in transient climates. The approach does as well as pattern scaling in all circumstances and substantially better in many; it is not computationally demanding; and, once the statistical model is fit, it produces emulated climate output effectively instantaneously. It may therefore find wide application in climate impacts assessments and other policy analyses requiring rapid climate projections.
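
    The core idea, fitting temperature as a simple function of the past CO2 trajectory on a few training runs and then emulating new scenarios essentially instantaneously, can be sketched as a two-term regression on log CO2 and an exponentially lagged version of it. The forcing form, response coefficients, lag scale and scenarios below are all invented for illustration and are not the authors' statistical model.

        import numpy as np

        rng = np.random.default_rng(5)

        def lagged(f, tau=40):
            """Exponentially weighted memory of the past forcing trajectory."""
            kernel = np.exp(-np.arange(f.size) / tau)
            return np.convolve(f, kernel, mode="full")[: f.size] / tau

        def make_run(growth_rate, years=150):
            """A fake 'GCM run': CO2 trajectory and a noisy regional temperature response."""
            co2 = 280 * np.exp(growth_rate * np.arange(years))
            forcing = np.log(co2 / 280)
            temp = 2.0 * forcing + 1.5 * lagged(forcing) + rng.normal(0, 0.15, years)
            return co2, temp

        def features(co2):
            forcing = np.log(co2 / 280)
            return np.column_stack([forcing, lagged(forcing)])

        # Fit the emulator on three training runs with different emission pathways.
        train = [make_run(g) for g in (0.003, 0.006, 0.009)]
        X = np.vstack([features(c) for c, _ in train])
        y = np.concatenate([t for _, t in train])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Emulate an unseen scenario without rerunning the "GCM".
        co2_new, _ = make_run(0.0075)
        print("emulated end-of-run anomaly:", round(float(features(co2_new)[-1] @ coef), 2), "K")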

  7. Evaluation of excitation energy and spin in fission fragments using the statistical model, and the FIPPS project

    International Nuclear Information System (INIS)

    Faust, H.; Koester, U.; Kessedjian, G.; Sage, C.; Chebboubi, A.

    2013-01-01

    We review the statistical model and its application for the process of nuclear fission. The expressions for excitation energy and spin distributions for the individual fission fragments are given. We will finally emphasize the importance of measuring prompt gamma decay to further test the statistical model in nuclear fission with the FIPPS project. (authors)

  8. Teaching biology through statistics: application of statistical methods in genetics and zoology courses.

    Science.gov (United States)

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We distinguished the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.

  9. A Survey of Statistical Capstone Projects

    Science.gov (United States)

    Martonosi, Susan E.; Williams, Talithia D.

    2016-01-01

    In this article, we highlight the advantages of incorporating a statistical capstone experience in the undergraduate curriculum, where students perform an in-depth analysis of real-world data. Capstone experiences develop statistical thinking by allowing students to engage in a consulting-like experience that requires skills outside the scope of…

  10. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    Science.gov (United States)

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  11. Frog Statistics

    Science.gov (United States)

    Web access statistics (wwwstats output) for the Whole Frog Project and Virtual Frog Dissection pages, starting from January 1. The counts exclude duplicate or extraneous accesses, include POST requests for images, and under-represent the total bytes requested.

  12. Prerequisites for building a computer security incident response capability

    CSIR Research Space (South Africa)

    Mooi, M

    2015-08-01

    Full Text Available 2) Handbook for Computer Security Incident Response Teams (CSIRTs) [18] (CMU-SEI): Providing guidance on building and running a CSIRT, this handbook has a particular focus on the incident handling service [18, p. xv]. In addition, a basic CSIRT...

  13. The Effect of Project-Based Learning on Students' Statistical Literacy Levels for Data Representation

    Science.gov (United States)

    Koparan, Timur; Güven, Bülent

    2015-01-01

    The aim of this study is to determine the effect of the project-based learning approach on 8th grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th grade secondary-school students, 35…

  14. Prerequisite programs and food hygiene in hospitals: food safety knowledge and practices of food service staff in Ankara, Turkey.

    Science.gov (United States)

    Bas, Murat; Temel, Mehtap Akçil; Ersun, Azmi Safak; Kivanç, Gökhan

    2005-04-01

    Our objective was to determine food safety practices related to prerequisite program implementation in hospital food services in Turkey. Staff often lack basic food hygiene knowledge. Problems of implementing HACCP and prerequisite programs in hospitals include lack of food hygiene management training, lack of financial resources, and inadequate equipment and environment.

  15. Self-Knowledge, Capacity and Sensitivity: Prerequisites to Authentic Leadership by School Principals

    Science.gov (United States)

    Begley, Paul T.

    2006-01-01

    Purpose: The article proposes three prerequisites to authentic leadership by school principals: self-knowledge, a capacity for moral reasoning, and sensitivity to the orientations of others. Design/methodology/approach: A conceptual framework, based on research on the valuation processes of school principals and their strategic responses to…

  16. Japanese Mobile Phone Usage in Sweden - Technological and Social Prerequisites

    OpenAIRE

    Fredriksson, Susanne; Hillerdal, Ida

    2010-01-01

    Japan is an advanced country when it comes to mobile phone technology. This thesis firstly investigates the mobile phone usage in Japan. Secondly it describes the prerequisites for implementation of some distinguished Japanese mobile phone functions in Sweden. This is done from a social as well as a technological aspect. The Japanese mobile phone usage is investigated on three levels; governmental, industrial and consumer. The governmental level is characterised by an ICT policy which strives...

  17. Preeminence and prerequisites of sample size calculations in clinical trials

    OpenAIRE

    Richa Singhal; Rakesh Rana

    2015-01-01

    The key components while planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation is different for different study designs. The article in detail describes the sample size calculation for a randomized controlled trial when the primary out...

  18. Deconstructing Engineering Education Programmes: The DEEP Project to Reform the Mechanical Engineering Curriculum

    Science.gov (United States)

    Busch-Vishniac, Ilene; Kibler, Tom; Campbell, Patricia B.; Patterson, Eann; Guillaume, Darrell; Jarosz, Jeffrey; Chassapis, Constantin; Emery, Ashley; Ellis, Glenn; Whitworth, Horace; Metz, Susan; Brainard, Suzanne; Ray, Pradosh

    2011-01-01

    The goal of the Deconstructing Engineering Education Programmes project is to revise the mechanical engineering undergraduate curriculum to make the discipline more able to attract and retain a diverse community of students. The project seeks to reduce and reorder the prerequisite structure linking courses to offer greater flexibility for…

  19. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    Science.gov (United States)

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview over some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  20. Clinical Governance in Primary Care; Principles, Prerequisites and Barriers: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Jaafar Sadeq Tabrizi

    2013-07-01

    Full Text Available Introduction: Primary care organizations are the entities through which clinical governance is developed at the local level. To implement clinical governance in primary care, awareness of the principles, prerequisites and barriers of this quality improvement paradigm is necessary. The aim of this study is to pool evidence about implementing clinical governance in primary care organizations. Data sources: The literature search was conducted in July 2012. PubMed, Web of Science, Emerald, Springerlink, and MD Consult were searched using the MeSH keywords “clinical governance” and “primary care”. Study selection: The search was limited to English-language journals with no time limitation. Articles that were either quantitative or qualitative on concepts of implementing clinical governance in primary care were eligible for this study. Data extraction: From the selected articles, data on the principles, prerequisites and barriers of clinical governance in primary health care were extracted and classified in extraction tables. Results: We classified our findings about the principles of clinical governance in primary care into four groups: general principles, and principles related to staff, patients and communication. Prerequisites were categorized into eight clusters, corresponding to the dimensions of the National Health System (NHS) models of clinical governance. Barriers were sorted into five categories: structure and organization, cultural, resource, theoretical and logistical. Conclusion: Primary care organizations must provide budget holding, incentivized programs, data feedback, peer review, education, human relations, health information technology (HIT) support, and resources. Key elements include enrolled populations, an interdisciplinary team approach, HIT interoperability and access between all providers as well as patients, devolution of hospital-based services into the community, inter-sectorial integration, blended payments, and a balance of

  1. Understanding spermatogenesis is a prerequisite for treatment

    Directory of Open Access Journals (Sweden)

    Schulze Wolfgang

    2003-11-01

    Full Text Available Abstract Throughout spermatogenesis, the multiplication, maturation and differentiation of germ cells result in the formation of the male gamete. Understanding spermatogenesis requires detailed information about the organization of the germinal epithelium, the structure and function of the different types of germ cells, endocrine and paracrine cells and mechanisms, and the intratesticular and extratesticular regulation of spermatogenesis. Normal germ cells must be discriminated from malformed, apoptotic and degenerating germ cells and from tumor cells. Identification of the borderline between normal and disturbed spermatogenesis substantiates the diagnosis of impaired male fertility. Profound knowledge of the complicated process of spermatogenesis, and of all the cells and cell systems involved in it, is the prerequisite for developing concepts for the therapy of male infertility and for handling germ cells in the management of assisted reproduction.

  2. Connection between perturbation theory, projection-operator techniques, and statistical linearization for nonlinear systems

    International Nuclear Information System (INIS)

    Budgor, A.B.; West, B.J.

    1978-01-01

    We employ the equivalence between Zwanzig's projection-operator formalism and perturbation theory to demonstrate that the approximate-solution technique of statistical linearization for nonlinear stochastic differential equations corresponds to the lowest-order β truncation in both the consolidated perturbation expansions and in the ''mass operator'' of a renormalized Green's function equation. Other consolidated equations can be obtained by selectively modifying this mass operator. We particularize the results of this paper to the Duffing anharmonic oscillator equation

  3. Recalling Prerequisite Material in a Calculus II Course to Improve Student Success

    Science.gov (United States)

    Mokry, Jeanette

    2016-01-01

    This article discusses preparation assignments used in a Calculus II course that cover material from prerequisite courses. Prior to learning new material, students work on problems outside of class involving concepts from algebra, trigonometry, and Calculus I. These problems are directly built upon in order to answer Calculus II questions,…

  4. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  5. Prerequisites for understanding climate-change impacts on northern prairie wetlands

    Science.gov (United States)

    Anteau, Michael J.; Wiltermuth, Mark T.; Post van der Burg, Max; Pearse, Aaron T.

    2016-01-01

    The Prairie Pothole Region (PPR) contains ecosystems that are typified by an extensive matrix of grasslands and depressional wetlands, which provide numerous ecosystem services. Over the past 150 years the PPR has experienced numerous landscape modifications resulting in agricultural conversion of 75–99 % of native prairie uplands and drainage of 50–90 % of wetlands. There is concern over how and where conservation dollars should be spent within the PPR to protect and restore wetland basins to support waterbird populations that will be robust to a changing climate. However, while hydrological impacts of landscape modifications appear substantial, they are still poorly understood. Previous modeling efforts addressing impacts of climate change on PPR wetlands have yet to fully incorporate interacting or potentially overshadowing impacts of landscape modification. We outlined several information needs for building more informative models to predict climate change effects on PPR wetlands. We reviewed how landscape modification influences wetland hydrology and present a conceptual model to describe how modified wetlands might respond to climate variability. We note that current climate projections do not incorporate cyclical variability in climate between wet and dry periods even though such dynamics have shaped the hydrology and ecology of PPR wetlands. We conclude that there are at least three prerequisite steps to making meaningful predictions about effects of climate change on PPR wetlands. Those evident to us are: 1) an understanding of how physical and watershed characteristics of wetland basins of similar hydroperiods vary across temperature and moisture gradients; 2) a mechanistic understanding of how wetlands respond to climate across a gradient of anthropogenic modifications; and 3) improved climate projections for the PPR that can meaningfully represent potential changes in climate variability including intensity and duration of wet and dry periods. Once

  6. Practical statistics a handbook for business projects

    CERN Document Server

    Buglear, John

    2013-01-01

    Practical Statistics is a hands-on guide to statistics, progressing by complexity of data (univariate, bivariate, multivariate) and analysis (portray, summarise, generalise) in order to give the reader a solid understanding of the fundamentals and how to apply them.

  7. Introducing Statistical Research to Undergraduate Mathematical Statistics Students Using the Guitar Hero Video Game Series

    Science.gov (United States)

    Ramler, Ivan P.; Chapman, Jessica L.

    2011-01-01

    In this article we describe a semester-long project, based on the popular video game series Guitar Hero, designed to introduce upper-level undergraduate statistics students to statistical research. Some of the goals of this project are to help students develop statistical thinking that allows them to approach and answer open-ended research…

  8. Statistics and data analysis for financial engineering with R examples

    CERN Document Server

    Ruppert, David

    2015-01-01

    The new edition of this influential textbook, geared towards graduate or advanced undergraduate students, teaches the statistics necessary for financial engineering. In doing so, it illustrates concepts using financial markets and economic data, R Labs with real-data exercises, and graphical and analytic methods for modeling and diagnosing modeling errors. Financial engineers now have access to enormous quantities of data. To make use of these data, the powerful methods in this book, particularly about volatility and risks, are essential. Strengths of this fully-revised edition include major additions to the R code and the advanced topics covered. Individual chapters cover, among other topics, multivariate distributions, copulas, Bayesian computations, risk management, multivariate volatility and cointegration. Suggested prerequisites are basic knowledge of statistics and probability, matrices and linear algebra, and calculus. There is an appendix on probability, statistics and linear algebra. Practicing fina...

  9. System supplier approach to projects and operations efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Moe, P O [Siemens Offshore A/S (Norway)

    1994-12-31

    The conference paper outlines the most important elements of a new approach to project realisation that enables a cost reduction of 30-50% compared with conventional methods. The achievements are based on studies and evaluations related to the Norwegian Vigdis development project. The system elements covered are the electrical and automation systems, including safety and process control, and all traditional phases of a project from concept design to the operational phase. The concept involves new principles for project execution in which traditional borderlines and interfaces between the various participants have been redefined. Management attention has been verified as an important prerequisite for a successful implementation of this strategy. 2 figs.

  10. The development of mini project interactive media on junior statistical materials (developmental research in junior high school)

    Science.gov (United States)

    Fauziah, D.; Mardiyana; Saputro, D. R. S.

    2018-05-01

    Assessment is an integral part of the learning process, and the process and the result should be in line in order to measure the ability of learners. Authentic assessment refers to a form of assessment that measures competence in attitudes, knowledge, and skills. In fact, many teachers, including mathematics teachers who have implemented teaching based on the 2013 curriculum, feel confused and find it difficult to master the use of authentic assessment instruments. Therefore, it is necessary to design an authentic assessment instrument with interactive mini-project media that teachers can adopt in their assessment. The type of this research is developmental research, following the 4D development model, which consists of four stages: define, design, develop and disseminate. The research purpose is to create valid interactive mini-project media on statistical materials in junior high school. The instrument was judged valid by experts, with scores of 3.1 for the construction aspect, 3.2 for the presentation aspect, 3.25 for the contents aspect, and 2.9 for the didactic aspect. The research produced interactive mini-project media on statistical materials built with Adobe Flash, which can help teachers and students achieve the learning objectives.

  11. Organizational Diagnosis in Project-Based Companies

    Directory of Open Access Journals (Sweden)

    Behrouz Zarei

    2014-05-01

    Full Text Available The purpose of this article is to develop a new method for corporate diagnosis (CD). To this end, a method is developed for the diagnosis process of project-based companies. The article presents a case study in a large company where data have been collected through focus groups. Project delay, high project cost, and low profitability are examples of project deficiencies in project-based companies. Such issues have made managers pay special attention to finding effective solutions to improve them. Prominent factors are inappropriate strategy, structure, systems, human resource management, and PMBOK (Project Management Body of Knowledge) processes. Thus, CD and analysis is an important task in the improvement of corporate performance. The CD model developed in this article can be used for project-based companies; the proposed method is applicable to CD in any project-based company. This article provides an emphatic application of CD as a prerequisite for restructuring in project-based companies.

  12. Reasons and Prerequisites of Goodwill Devaluation in the Ukrainian Banking Sector

    Directory of Open Access Journals (Sweden)

    Kundrya-Vysotska Oksana P.

    2014-01-01

    Full Text Available The authors study the economic prerequisites and factors that led to the writing-off of significant amounts of goodwill value in Ukrainian banking institutions that are structural parts of international financial institutions. Based on an analysis of the main prerequisites, the article identifies the external and internal reasons for recognizing goodwill devaluation in the domestic banking sector. The study finds that recognition of goodwill devaluation testifies to a negative mood among foreign investors with respect to the prospects of business development in the domestic banking market. The article justifies the expediency of identifying, separately from goodwill, intangible assets subject to recognition as a result of the combination of banks, in particular a bank's client base, as an intangible asset with a finite useful life. It shows that recognition of this asset upon combination would make it possible to avoid writing off significant amounts of goodwill value under unfavourable economic conditions. A prospect for further studies in this direction is the justification of an optimal method for determining the amount of goodwill devaluation, which would help avoid manipulation of financial reporting and would improve the quality of the information presented about the real financial state of banking institutions.

  13. Descriptive statistics: the specification of statistical measures and their presentation in tables and graphs. Part 7 of a series on evaluation of scientific publications.

    Science.gov (United States)

    Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria

    2009-09-01

    Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
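
    As the preceding abstract notes, the level of measurement determines which summary statistics are appropriate. The short sketch below, using invented data and variable names, reports mean/SD and median/IQR for a metric variable and absolute and relative frequencies for a categorical one.

        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(11)

        # Metric (continuous) variable: mean/SD or median with quartiles.
        systolic_bp = rng.normal(128, 15, 200)
        q1, med, q3 = np.percentile(systolic_bp, [25, 50, 75])
        print(f"systolic BP: mean {systolic_bp.mean():.1f} (SD {systolic_bp.std(ddof=1):.1f}), "
              f"median {med:.1f} [IQR {q1:.1f}-{q3:.1f}]")

        # Categorical (nominal) variable: absolute and relative frequencies.
        smoking = rng.choice(["never", "former", "current"], 200, p=[0.5, 0.3, 0.2])
        for level, count in Counter(smoking).most_common():
            print(f"smoking = {level:<8} n = {count:3d} ({100 * count / len(smoking):.0f}%)")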

  14. Preeminence and prerequisites of sample size calculations in clinical trials

    Directory of Open Access Journals (Sweden)

    Richa Singhal

    2015-01-01

    Full Text Available The key components while planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation is different for different study designs. The article in detail describes the sample size calculation for a randomized controlled trial when the primary outcome is a continuous variable and when it is a proportion or a qualitative variable.
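
    For the two situations mentioned, the standard normal-approximation formulas can be written down directly: with a continuous primary outcome the sample size per arm is 2(z_(1-alpha/2) + z_(1-beta))^2 * sigma^2 / delta^2, and with a binary outcome it is (z_(1-alpha/2) + z_(1-beta))^2 * [p1(1-p1) + p2(1-p2)] / (p1-p2)^2. The sketch below implements these textbook formulas; the numerical inputs are arbitrary examples, not values from the article.

        import math
        from scipy.stats import norm

        def n_per_arm_means(sigma, delta, alpha=0.05, power=0.80):
            """Two-arm RCT, continuous outcome: normal-approximation sample size per arm."""
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            return math.ceil(2 * (z * sigma / delta) ** 2)

        def n_per_arm_proportions(p1, p2, alpha=0.05, power=0.80):
            """Two-arm RCT, binary outcome: normal-approximation sample size per arm."""
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            return math.ceil(z ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2)

        # e.g. detect a 5 mmHg difference (SD 12), or a 30% vs 45% response rate,
        # with two-sided alpha = 0.05 and 80% power.
        print(n_per_arm_means(sigma=12, delta=5))        # about 91 per arm
        print(n_per_arm_proportions(p1=0.30, p2=0.45))   # about 160 per arm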

  15. Event-based stochastic point rainfall resampling for statistical replication and climate projection of historical rainfall series

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Korup Andersen, Aske; Larsen, Anders Badsberg

    2017-01-01

    Continuous and long rainfall series are a necessity in rural and urban hydrology for analysis and design purposes. Local historical point rainfall series often cover several decades, which makes it possible to estimate rainfall means at different timescales, and to assess return periods of extreme...... includes climate changes projected to a specific future period. This paper presents a framework for resampling of historical point rainfall series in order to generate synthetic rainfall series, which has the same statistical properties as an original series. Using a number of key target predictions...... for the future climate, such as winter and summer precipitation, and representation of extreme events, the resampled historical series are projected to represent rainfall properties in a future climate. Climate-projected rainfall series are simulated by brute force randomization of model parameters, which leads...

  16. [Evaluation of prerequisites programs for a HACCP plan for frozen sardine plant].

    Science.gov (United States)

    Rosas, Patricia; Reyes, Genara

    2008-06-01

    Good manufacturing practices (GMP) and sanitation standard operating procedures (SSOP) are prerequisite programs for the application of the Hazard Analysis and Critical Control Point (HACCP) system as a food safety approach during processing. The aim of this study was to evaluate the GMP/SSOP prerequisites in a processing line for frozen whole sardine (Sardinella aurita). GMP compliance was verified according to a standard procedure of the Ministry of Health of Venezuela, and the SSOP were assessed according to a checklist proposed by the FDA. GMP and SSOP were evaluated following a demerit-based approach, and a percentage value referred to as sanitary effectiveness was calculated. The results indicated that the plant had a good level of compliance with GMP, based on the assessment of buildings and facilities, equipment and tools, hygienic requisites of production, assurance of hygienic quality, and storage and transportation; the percentage of sanitary effectiveness was 84%. The level of compliance with the SSOP was 53.12%, with demerits found in all assessed aspects, consisting of nonexistent guidelines, lack of control in the sanitary plan and lack of leadership in applying corrective actions. Thus, an improvement of the plant sanitation program was designed targeting the SSOP.

  17. Prerequisites for data-based decision making in the classroom: Research evidence and practical illustrations

    NARCIS (Netherlands)

    Hoogland, Inge; Schildkamp, Kim; van der Kleij, Fabienne; Heitink, Maaike Christine; Kippers, Wilma Berdien; Veldkamp, Bernard P.; Dijkstra, Anne M.

    2016-01-01

    Data-based decision making can lead to increased student learning. The desired effects of increased student learning can only be realized if data-based decision making is implemented successfully. Therefore, a systematic literature review was conducted to identify prerequisites of such successful

  18. The Hodge theory of projective manifolds

    CERN Document Server

    de Cataldo, Mark Andrea

    2007-01-01

    This book is a written-up and expanded version of eight lectures on the Hodge theory of projective manifolds. It assumes very little background and aims at describing how the theory becomes progressively richer and more beautiful as one specializes from Riemannian, to Kähler, to complex projective manifolds. Though the proof of the Hodge Theorem is omitted, its consequences - topological, geometrical and algebraic - are discussed at some length. The special properties of complex projective manifolds constitute an important body of knowledge and readers are guided through it with the help of selected exercises. Despite starting with very few prerequisites, the concluding chapter works out, in the meaningful special case of surfaces, the proof of a special property of maps between complex projective manifolds, which was discovered only quite recently.

  19. Inter-comparison of statistical downscaling methods for projection of extreme flow indices across Europe

    DEFF Research Database (Denmark)

    Hundecha, Yeshewatesfa; Sunyer Pinya, Maria Antonia; Lawrence, Deborah

    2016-01-01

    The effect of methods of statistical downscaling of daily precipitation on changes in extreme flow indices under a plausible future climate change scenario was investigated in 11 catchments selected from 9 countries in different parts of Europe. The catchments vary from 67 to 6171 km2 in size...... catchments to simulate daily runoff. A set of flood indices were derived from daily flows and their changes have been evaluated by comparing their values derived from simulations corresponding to the current and future climate. Most of the implemented downscaling methods project an increase in the extreme...... flow indices in most of the catchments. The catchments where the extremes are expected to increase have a rainfall-dominated flood regime. In these catchments, the downscaling methods also project an increase in the extreme precipitation in the seasons when the extreme flows occur. In catchments where...

  20. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    International Nuclear Information System (INIS)

    Kendrew, S.; Robitaille, T. P.; Simpson, R.; Lintott, C. J.; Bressert, E.; Povich, M. S.; Sherman, R.; Schawinski, K.; Wolf-Chase, G.

    2012-01-01

    The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.
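
    The core of the analysis described above is a positional cross-match of point sources against bubble positions and radii. The toy sketch below shows the idea with randomly generated coordinates under a flat-sky approximation; a real analysis would use the Milky Way Project bubble catalogue and the RMS source catalogue, which are not reproduced here.

        # Toy positional cross-match: fraction of sources within 2 bubble radii.
        # Coordinates and radii are random placeholders, not catalogue data.
        import numpy as np

        rng = np.random.default_rng(0)
        bubble_l = rng.uniform(10, 60, 50)          # Galactic longitude (deg)
        bubble_b = rng.uniform(-1, 1, 50)           # Galactic latitude (deg)
        bubble_r = rng.uniform(0.01, 0.1, 50)       # effective radius (deg)
        source_l = rng.uniform(10, 60, 200)
        source_b = rng.uniform(-1, 1, 200)

        # Separation of every source from every bubble (small-angle approximation).
        dl = source_l[:, None] - bubble_l[None, :]
        db = source_b[:, None] - bubble_b[None, :]
        sep = np.hypot(dl, db)

        associated = (sep <= 2 * bubble_r[None, :]).any(axis=1)
        print(f"{associated.mean():.0%} of sources lie within 2 bubble radii of a bubble")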

  1. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    Energy Technology Data Exchange (ETDEWEB)

    Kendrew, S.; Robitaille, T. P. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); Simpson, R.; Lintott, C. J. [Department of Astrophysics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Bressert, E. [School of Physics, University of Exeter, Stocker Road, Exeter EX4 4QL (United Kingdom); Povich, M. S. [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States); Sherman, R. [Department of Astronomy and Astrophysics, University of Chicago, 5640 S. Ellis Avenue, Chicago, IL 60637 (United States); Schawinski, K. [Yale Center for Astronomy and Astrophysics, Yale University, P.O. Box 208121, New Haven, CT 06520 (United States); Wolf-Chase, G., E-mail: kendrew@mpia.de [Astronomy Department, Adler Planetarium, 1300 S. Lake Shore Drive, Chicago, IL 60605 (United States)

    2012-08-10

    The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.

  2. What transformations in the international system are prerequisites for the complete elimination of nuclear weapons?

    International Nuclear Information System (INIS)

    Tsipis, K.

    1993-01-01

    The author reviews prerequisites for the complete elimination of nuclear weapons (NW), among which are: symmetry of NW possession; a halt to NW tests; establishment of a multinational nuclear deterrent force; and common regional security arrangements aimed at denuclearization.

  3. Engaging Diverse Students in Statistical Inquiry: A Comparison of Learning Experiences and Outcomes of Under-Represented and Non-Underrepresented Students Enrolled in a Multidisciplinary Project-Based Statistics Course

    Science.gov (United States)

    Dierker, Lisa; Alexander, Jalen; Cooper, Jennifer L.; Selya, Arielle; Rose, Jennifer; Dasgupta, Nilanjana

    2016-01-01

    Introductory statistics needs innovative, evidence-based teaching practices that support and engage diverse students. To evaluate the success of a multidisciplinary, project-based course, we compared experiences of under-represented (URM) and non-underrepresented students in 4 years of the course. While URM students considered the material more…

  4. Technical/institutional prerequisite for nuclear forensics response framework

    International Nuclear Information System (INIS)

    Tamai, Hiroshi; Okubo, Ayako; Kimura, Yoshiki; Kokaji, Lisa; Shinohara, Nobuo; Tomikawa, Hirofumi

    2016-01-01

    Nuclear forensics capability has been developed under international collaborations. For it to function effectively, technical development in the analysis of seized nuclear materials, as well as institutional development of a comprehensive response framework, is required under individual national responsibility. In order to maintain the “chain of custody” through the proper operation of sample collection at the event scene, radiological analysis at the laboratory, storage of the samples, and further inspection and trial, close cooperation and information sharing between the relevant organisations are essential. The IAEA issues an Implementing Guide that provides a model action plan and assists individual national development. Some countries at an advanced stage of their national response framework promote international cooperation for technical improvement and awareness cultivation. Examples of such national developments are introduced, and prospective technical/institutional prerequisites for a nuclear forensics response framework are studied. (author)

  5. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...

  6. Syllable division: Prerequisite to dyslexics' literacy.

    Science.gov (United States)

    Cox, A R; Hutcheson, L

    1988-01-01

    Skill in reading long words is a prerequisite to dyslexics' literacy. Instant recognition of printed symbols is easy for those readers with photographic memories, but dyslexics often fail to recognize visually many long words which are actually familiar to them auditorially. Scientific, automatic, multisensory procedures for dividing longer words into easily read syllables can enable students to translate visual symbols rapidly and thereby to read, write, or spell accurately words of any length. Over one thousand dyslexics, aged seven to fifteen, guided the interdisciplinary team at Texas Scottish Rite Hospital in Dallas to develop, observe results, and test specific structured, sequential steps in working out longer words. The ten-year study (1965-1975) in the Language Laboratory of the Hospital established the Alphabetic Phonics curriculum, which is now used successfully, not only in remedial groups but in regular classes of any size or age, in public and private schools in 45 states and six foreign countries. The newly established Aylett Royall Cox Institute in Dallas prepares teachers and Master Instructors to train both students and other teachers. Comparable Alphabetic Phonics Teacher Training Centers are already established in Houston and Lubbock, Texas, in Oklahoma City, and at Columbia University Teachers College in New York.

  7. A quantitative method for selecting renewable energy projects in the mining industry based on sustainability

    OpenAIRE

    Mostert, M.

    2014-01-01

    Mining companies sponsor a range of non-core, corporate social responsibility projects to adhere to social and labour plans and environmental management prerequisites that form part of a mining licence application. Some companies go above and beyond such projects, sponsoring initiatives that generate renewable energy through solar power, wind energy, natural gas, etc. The challenge for these companies is to choose between a variety of projects to ensure maximum value, especially in times when...

  8. PREREQUISITE PROGRAMMES IN OWN CHECKS IN STATUTORY AND VOLUNTARY LEGISLATION

    Directory of Open Access Journals (Sweden)

    E. Guidi

    2012-08-01

    The Prerequisite Programmes approach is a requirement for implementing a correct own-check plan. This approach, introduced by European legislation, is fully recognized by third-country authorities and by private inspection and accreditation bodies. The method is the basis for verifying whether an own-check system is under control and whether corrective actions are set up to guarantee hygienic production standards. The present work demonstrates that a correct own-check plan can be built up only through a Prerequisite Programmes approach. The new UNI EN ISO 22000:2005 standard describes this concept, specifying the difference between PRPs and CCPs.

  9. Statistical CT noise reduction with multiscale decomposition and penalized weighted least squares in the projection domain

    International Nuclear Information System (INIS)

    Tang Shaojie; Tang Xiangyang

    2012-01-01

    Purposes: The suppression of noise in x-ray computed tomography (CT) imaging is of clinical relevance for diagnostic image quality and the potential for radiation dose saving. Toward this purpose, statistical noise reduction methods in either the image or projection domain have been proposed, which employ a multiscale decomposition to enhance the performance of noise suppression while maintaining image sharpness. Recognizing the advantages of noise suppression in the projection domain, the authors propose a projection domain multiscale penalized weighted least squares (PWLS) method, in which the angular sampling rate is explicitly taken into consideration to account for the possible variation of interview sampling rate in advanced clinical or preclinical applications. Methods: The projection domain multiscale PWLS method is derived by converting an isotropic diffusion partial differential equation in the image domain into the projection domain, wherein a multiscale decomposition is carried out. With adoption of the Markov random field or soft thresholding objective function, the projection domain multiscale PWLS method deals with noise at each scale. To compensate for the degradation in image sharpness caused by the projection domain multiscale PWLS method, an edge enhancement is carried out following the noise reduction. The performance of the proposed method is experimentally evaluated and verified using the projection data simulated by computer and acquired by a CT scanner. Results: The preliminary results show that the proposed projection domain multiscale PWLS method outperforms the projection domain single-scale PWLS method and the image domain multiscale anisotropic diffusion method in noise reduction. In addition, the proposed method can preserve image sharpness very well while the occurrence of “salt-and-pepper” noise and mosaic artifacts can be avoided. Conclusions: Since the interview sampling rate is taken into account in the projection domain
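
    As a simplified, single-scale illustration of the penalized weighted least squares idea the abstract builds on, the sketch below smooths one noisy 1-D projection by minimizing a weighted data-fidelity term plus a quadratic roughness penalty. The Poisson-like noise model, the weights and the penalty strength are assumptions made for the example; the paper's actual method is multiscale and operates on full CT projection data.

        # Single-scale PWLS smoothing of a noisy 1-D projection (illustrative only).
        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import spsolve

        rng = np.random.default_rng(1)
        n = 256
        truth = 100.0 * np.sin(np.linspace(0.0, np.pi, n))      # idealized projection
        y = rng.poisson(truth + 10.0).astype(float)             # noisy measurement

        w = 1.0 / np.maximum(y, 1.0)             # weights ~ inverse variance (Poisson-like)
        beta = 50.0                              # roughness penalty strength
        D = diags([-1.0, 1.0], [0, 1], shape=(n - 1, n))        # first differences
        W = diags(w)

        # PWLS objective: (x - y)^T W (x - y) + beta * ||D x||^2
        # Normal equations: (W + beta * D^T D) x = W y
        x = spsolve((W + beta * (D.T @ D)).tocsc(), W @ y)
        print("mean abs error, denoised vs raw:",
              float(np.mean(np.abs(x - truth))), float(np.mean(np.abs(y - truth))))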

  10. Support Provided to the External Tank (ET) Project on the Use of Statistical Analysis for ET Certification Consultation Position Paper

    Science.gov (United States)

    Null, Cynthia H.

    2009-01-01

    In June 2004, the Space Flight Leadership Council (SFLC) assigned an action to the NASA Engineering and Safety Center (NESC) and External Tank (ET) project jointly to characterize the available dataset [of defect sizes from dissections of foam], identify resultant limitations to statistical treatment of ET as-built foam as part of the overall thermal protection system (TPS) certification, and report to the Program Requirements Change Board (PRCB) and SFLC in September 2004. The NESC statistics team was formed to assist the ET statistics group in August 2004. The NESC's conclusions are presented in this report.

  11. Principles, Economic and Institutional Prerequisites for Fiscal Decentralization under Conditions of Post-Conflict Reconstruction

    Directory of Open Access Journals (Sweden)

    Vishnevsky Valentine P.

    2016-11-01

    The aim of the article is to study the principles and the economic and institutional prerequisites for fiscal decentralization in post-conflict territories. It is determined that fiscal decentralization is one of the main ways to solve the problems of post-conflict areas. The principles and the economic and institutional prerequisites for fiscal decentralization in post-conflict territories are justified with regard to the specificity of individual spheres of fiscal relations. Moreover, different spheres of fiscal relations require different approaches: the sphere of public revenues — providing economic efficiency by forming a tax structure that contributes to the expansion of the tax base; the sphere of public spending — ensuring social justice and transparency in the allocation of social costs at the local level; the sphere of subsidies — narrowing the scope of application of intergovernmental transfers by organizing the redistribution of financial resources under the principle of «center - post-conflict regions - post-conflict recipients»; the sphere of external assistance — ensuring proper coordination for the cultivation of new cooperative institutions.

  12. US Urban Forest Statistics, Values, and Projections

    Science.gov (United States)

    David J Nowak; Eric J. Greenfield

    2018-01-01

    U.S. urban land increased from 2.6% (57.9 million acres) in 2000 to 3.0% (68.0 million acres) in 2010. States with the greatest amount of urban growth were in the South/Southeast (TX, FL, NC, GA and SC). Between 2010 and 2060, urban land is projected to increase another 95.5 million acres to 163.1 million acres (8.6%) with 18 states projected to have an increase of...

  13. Evaluation and projection of daily temperature percentiles from statistical and dynamical downscaling methods

    Directory of Open Access Journals (Sweden)

    A. Casanueva

    2013-08-01

    The study of extreme events has become of great interest in recent years due to their direct impact on society. Extremes are usually evaluated by using extreme indicators, based on order statistics on the tail of the probability distribution function (typically percentiles). In this study, we focus on the tail of the distribution of daily maximum and minimum temperatures. For this purpose, we analyse high (95th) percentiles of daily maximum temperature and low (5th) percentiles of daily minimum temperature on the Iberian Peninsula, derived from different downscaling methods (statistical and dynamical). First, we analyse the performance of reanalysis-driven downscaling methods in present climate conditions. The comparison among the different methods is performed in terms of the bias of seasonal percentiles, considering as observations the public gridded data sets E-OBS and Spain02, and obtaining an estimation of both the mean and spatial percentile errors. Secondly, we analyse the increments of future percentile projections under the SRES A1B scenario and compare them with those corresponding to the mean temperature, showing that their relative importance depends on the method, and stressing the need to consider an ensemble of methodologies.
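
    A minimal sketch of the kind of percentile-bias evaluation described above is given below: it compares the seasonal 95th percentile of daily maximum temperature from a downscaled series with the observed one. The daily series are synthetic placeholders; a real evaluation would use the downscaled model output and the E-OBS or Spain02 observations.

        # Seasonal 95th-percentile bias of a "downscaled" series vs. observations.
        # Both series are synthetic placeholders generated for illustration.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(2)
        dates = pd.date_range("1991-01-01", "2000-12-31", freq="D")
        cycle = 10.0 * np.sin(2 * np.pi * (dates.dayofyear - 80) / 365.25)
        obs = pd.Series(20.0 + cycle + rng.normal(0, 3, len(dates)), index=dates)
        model = obs + rng.normal(0.5, 1.0, len(dates))   # a slightly biased downscaled series

        month_to_season = {12: "DJF", 1: "DJF", 2: "DJF", 3: "MAM", 4: "MAM", 5: "MAM",
                           6: "JJA", 7: "JJA", 8: "JJA", 9: "SON", 10: "SON", 11: "SON"}
        season = np.asarray(obs.index.month.map(month_to_season))

        p95_obs = obs.groupby(season).quantile(0.95)
        p95_mod = model.groupby(season).quantile(0.95)
        print((p95_mod - p95_obs).round(2))              # seasonal bias of the 95th percentile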

  14. Quality Partnership as a Contextual Prerequisite of Successful Learning of Young and Preschool-Aged Children

    Science.gov (United States)

    Ljubetic, Maja; Ercegovac, Ina Reic; Koludrovic, Morana

    2016-01-01

    The paper discusses quality partnership as a prerequisite for the functioning of the institutions of early and pre-school education and for the child's overall development and learning. Considering that child's development and learning take place in different contexts (family, educational institutions, clubs, local and wider communities), the…

  15. ADVERTISEMENTS FOR ICT PROJECT MANAGERS SHOW DIVERSITY BETWEEN SWEDISH EMPLOYERS’ AND PROJECT MANAGEMENT ASSOCIATIONS’ VIEWS OF PM CERTIFICATIONS

    Directory of Open Access Journals (Sweden)

    Siw Lundqvist

    2014-05-01

    Appointing ICT project managers is a delicate issue for management, not least since ICT projects are known to be unsuccessful in delivering the required product in time and on budget. Hence, it is even more important to find the “right” individual for the job. According to project management associations, certification in project management is a prerequisite for a project manager’s successful career. The appreciation of project management certifications among Swedish employers was studied using data collected from job advertisements for ICT project managers during four years (2010-2013). Judging from how the advertisements were worded, the results indicate surprisingly low interest from the employers’ side, which conflicts with the project management associations’ statements about the certifications’ indispensable value for successful projects. Furthermore, it conflicts with a common understanding of certifications as essential for appointment as a project manager. The findings identify a possible gap between PM associations’ and employers’ views regarding the certifications’ value, and highlight the necessity of seriously considering whether it is worthwhile for individuals to strive for certification, and for organizations to promote it, since it is costly in time, effort and money.

  16. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.

  17. Self-Leadership Change Project: The Continuation of an Ongoing Experiential Program

    Science.gov (United States)

    Phillips, James I.; Kern, Dave; Tewari, Jitendra; Jones, Kenneth E.; Beemraj, Eshwar Prasad; Ettigi, Chaitra Ashok

    2017-01-01

    Purpose: The self-leadership change project (SLCP) is an ongoing program for senior level students at a regional university designed to provide hands-on experience in building self-management skills, which is considered a pre-requisite by many leaders and scholars (e.g. Drucker, 1996; Schaetti et al., 2008). The paper aims to discuss this issue.…

  18. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    International Nuclear Information System (INIS)

    Ren, Qingguo; Dewan, Sheilesh Kumar; Li, Ming; Li, Jianying; Mao, Dingbiao; Wang, Zhenglei; Hua, Yanqing

    2012-01-01

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (the hospital is renowned for its geriatric medicine department, and the two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so cerebral tumors were not included in this study) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between the image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  19. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (the hospital is renowned for its geriatric medicine department, and the two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so cerebral tumors were not included in this study) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between the image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  20. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in quantifying coronary calcium.

    Science.gov (United States)

    Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi

    2016-01-01

    Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, the change of image parameters by ASIR as compared to filtered back projection (FBP) may influence the quantification of coronary calcium. The aim was to investigate the influence of ASIR on calcium quantification in comparison to FBP. In 352 patients, CT images were reconstructed using FBP alone and FBP combined with ASIR 30%, 50%, 70%, and ASIR 100%, based on the same raw data. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP; ASIR reduced the Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, ASIR may significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
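
    For context, the sketch below shows the standard Agatston weighting that such scores are based on: each lesion's area is multiplied by a factor determined by its peak attenuation, and the products are summed. The lesion areas and peak HU values are invented; they merely illustrate how the lower peak CT numbers and smaller plaque sizes reported for ASIR translate into a lower score.

        # Standard Agatston weighting (per-lesion area x peak-attenuation factor).
        # Lesion areas (mm^2) and peak HU values below are invented for illustration.
        def agatston_weight(peak_hu):
            if peak_hu < 130:
                return 0    # below the calcium threshold, not scored
            if peak_hu < 200:
                return 1
            if peak_hu < 300:
                return 2
            if peak_hu < 400:
                return 3
            return 4

        def agatston_score(lesions):
            """lesions: iterable of (area_mm2, peak_hu); lesions < 1 mm^2 are ignored."""
            return sum(area * agatston_weight(hu) for area, hu in lesions if area >= 1.0)

        fbp_lesions  = [(4.2, 410), (2.5, 280), (1.8, 190)]
        asir_lesions = [(4.0, 395), (2.3, 265), (1.6, 175)]   # smaller, lower peak HU
        print(agatston_score(fbp_lesions), agatston_score(asir_lesions))   # 23.6 vs 18.2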

  1. Characteristics and drivers of drought in Europe-a summary of the DROUGHT-R&SPI project

    NARCIS (Netherlands)

    Tallaksen, Lena M.; Stagge, James H.; Stahl, Kerstin; Gudmundsson, Lukas; Orth, Rene; Seneviratne, Sonia I.; Loon, van Anne F.; Lanen, van Henny A.J.

    2015-01-01

    A prerequisite to mitigate the wide range of drought impacts is to establish a good understanding of the drought generating mechanisms from their initiation as a meteorological drought through to their development as soil moisture and hydrological drought. The DROUGHT-R&SPI project has

  2. PSYCHOLINGUISTIC PREREQUISITES FOR DEVELOPING LISTENING COMPETENCE OF PRE-SERVICE TEACHERS THROUGH FICTION AUDIOBOOKS

    Directory of Open Access Journals (Sweden)

    Iryna Bilianska

    2017-07-01

    The quality of the professional training of foreign language teachers presupposes a high level of listening competence. However, in a non-authentic language environment, developing proficiency in listening is recognized as a difficult task. Therefore, Ukrainian methodologists are searching for new ways to improve the listening skills of pre-service teachers. The purpose of this article is to explore recent research into psycholinguistic issues and analyse the grounds for developing listening competence by means of fiction audiobooks. The paper therefore analyses cognitive processes and psychological mechanisms and the stages of listening (motivational, analytic-synthetic, executive and controlling). It goes on to focus on artistic perception and its mechanisms and on the information processing mechanisms. Since fiction is an art of words, the specific features of listening to audiobooks are primarily related to the category of art. It is revealed that at all levels of the structure of an artistic text (genre, plot, structure) there are author's guidelines which guide and direct attention and activate apperception. The typical benchmarks of audiobooks that help to activate apperception (genre, cover, title, sample, summary, reviews, author/narrator, volume, rating, etc.) have been determined. Listening to an audiobook should result in its "projection" in the recipient's mind; this "projection" may be materialized through a secondary text. It is concluded that the mechanisms of listening to fiction audiobooks are: 1) mental processes (perception, thinking, memory, attention); 2) psychological mechanisms (speech hearing, articulation, anticipation, comprehension, working memory); 3) mechanisms of artistic perception (emotions and feelings, imagination, apperception, figurative and associative thinking); 4) information processing mechanisms (mechanism of equivalent replacements, transcoding, compression, expansion

  3. Cost effective solutions for field development. System supplier approach to projects and operations

    International Nuclear Information System (INIS)

    Moe, P.O.

    1994-01-01

    The conference paper outlines the most important elements of a new approach to project realisation that enables a cost reduction of 30-50% compared to conventional methods. The achievements are based on studies and evaluations related to the Norwegian Vigdis development project. The system elements covered are the electrical and automation systems, including safety and process control, and all traditional phases of a project from concept design to the operational phase. The concept involves new principles for project execution in which traditional borderlines and interfaces between the various participants have been redefined. Management attention has been verified as an important prerequisite for successful implementation of this strategy. 2 figs

  4. [Transparency as a prerequisite of innovation in health services research: deficits in the reporting of model projects concerning managed care].

    Science.gov (United States)

    Wiethege, J; Ommen, O; Ernstmann, N; Pfaff, H

    2010-10-01

    Currently, elements of managed care are being implemented in the German health-care system. The legal basis for these innovations is § 140, § 73, § 137, and §§ 63 et seq. of the German Social Code - Part 5 (SGB V). For the model projects according to §§ 63 et seq. of the German Social Code, a scientific evaluation and publication of the evaluation results are mandatory. The present study examines the evaluation status of German model projects. It has a mixed-method design: a mail and telephone survey of the German Federal Social Insurance Authority, the health insurance funds, and the regional Associations of Statutory Health Insurance Physicians was conducted. Furthermore, an internet search on "Medpilot" and "Google" was carried out to identify model projects and their evaluation reports. 34 model projects met the inclusion criteria. 13 of these projects had ended by 30/9/2008; 6 of them have published an evaluation report, and 4 model projects have published substantial documents. One model project in progress has published a meaningful interim report. 12 model projects failed to give information concerning the evaluator or the duration of the model project. The results show a significant deficit in the mandatory reporting of the evaluation of model projects in Germany. There is a need for action by the legislator and the health insurance funds in terms of promoting the evaluation and the publication of the results. The institutions evaluating the model projects should commit themselves to publishing the evaluation results. Publication is an essential precondition for the development of managed care structures in the health-care system and for the development of scientific evaluation methods. © Georg Thieme Verlag KG Stuttgart · New York.

  5. The prerequisites for effective competition in restructured wholesale electricity markets

    International Nuclear Information System (INIS)

    Haas, R.; Auer, H.

    2006-01-01

    This paper argues that effective competition in reformed wholesale electricity markets can only be achieved if the following six prerequisites are met: (1) separation of the grid from generation and supply; (2) wholesale price deregulation; (3) sufficient transmission capacity for a competitive market and non-discriminating grid access; (4) excess generation capacity developed by a large number of competing generators; (5) an equilibrium relationship between short-term spot markets and the long-term financial instruments that marketers use to manage spot-market price volatility; (6) an essentially hands-off government policy that encompasses reduced oversight and privatization. The absence of any one of the first five conditions may result in an oligopoly or monopoly market whose economic performance does not meet the efficiency standards of a competently managed regulated electrical utility. (author)

  6. Overview of ACTYS project on development of indigenous state-of-the-art code suites for nuclear activation analysis

    International Nuclear Information System (INIS)

    Subhash, P.V.; Tadepalli, Sai Chaitanya; Deshpande, Shishir P.; Kanth, Priti; Srinivasan, R.

    2017-01-01

    Rigorous activation calculations are warranted for the safer and more efficient design of future fusion machines. Suitable activation codes, which yield accurate results with faster performance yet include all fusion-relevant reactions, are a prerequisite. To meet these needs, an indigenous project called ACTYS-Project has been initiated, and as a result four state-of-the-art codes have been developed so far. The goal of this project is to develop indigenous state-of-the-art code suites for nuclear activation analysis.

  7. Democratization of Education as Prerequisite for Social Economic and Cultural Progress in a Multi-Cultural Society

    Science.gov (United States)

    Madumere, S. C.; Olisaemeka, B. U.

    2011-01-01

    This paper focuses on the democratization of education as a prerequisite for social, economic and cultural progress in a multi-cultural society, such as Nigeria. An attempt was made to define and explain the major concepts in the paper. Education was explained as an instrument of democracy and as a function of socialization, culture and economic…

  8. Combined equations for estimating global solar radiation: Projection of radiation field over Japan under global warming conditions by statistical downscaling

    International Nuclear Information System (INIS)

    Iizumi, T.; Nishimori, M.; Yokozawa, M.

    2008-01-01

    For this study, we developed a new statistical model to estimate the daily accumulated global solar radiation on the earth's surface and used the model to generate a high-resolution climate change scenario of the radiation field in Japan. The statistical model mainly relies on precipitable water vapor calculated from air temperature and relative humidity at the surface to estimate seasonal changes in global solar radiation. On the other hand, to estimate daily radiation fluctuations, the model uses either the diurnal temperature range or relative humidity. The diurnal temperature range, calculated from the daily maximum and minimum temperatures, and relative humidity are general outputs of most climate models, and pertinent observation data are comparatively easy to access. The statistical model performed well when estimating the monthly mean value, daily fluctuation statistics, and regional differences in the radiation field in Japan. To project the change in the radiation field for the years 2081 to 2100, we applied the statistical model to the climate change scenario of a high-resolution Regional Climate Model with a 20-km mesh size (RCM20) developed at the Meteorological Research Institute based on the Special Report on Emissions Scenarios (SRES) A2 scenario. The projected change shows the following tendency: global solar radiation will increase in the warm season and decrease in the cool season in many areas of Japan, indicating that global warming may cause changes in the radiation field in Japan. The generated climate change scenario for the radiation field is linked to long-term and short-term changes in air temperature and relative humidity obtained from the RCM20 and, consequently, is expected to complement the RCM20 datasets for an impact assessment study in the agricultural sector.
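
    The paper's own combined equations are not reproduced in the record above. As a stand-in that captures one of the ingredients mentioned (the link between daily global solar radiation and the diurnal temperature range), the sketch below uses the well-known Hargreaves-type relation together with the FAO-56 formula for extraterrestrial radiation; the coefficient k_rs and the example location and day are assumptions for illustration.

        # Hargreaves-type estimate of daily global solar radiation (illustrative stand-in,
        # not the statistical model of the paper). Units: MJ m-2 d-1, temperatures in deg C.
        import math

        def extraterrestrial_radiation(lat_deg, day_of_year):
            """Daily extraterrestrial radiation Ra (FAO-56 formulation)."""
            phi = math.radians(lat_deg)
            dr = 1 + 0.033 * math.cos(2 * math.pi * day_of_year / 365)
            delta = 0.409 * math.sin(2 * math.pi * day_of_year / 365 - 1.39)
            ws = math.acos(-math.tan(phi) * math.tan(delta))
            return (24 * 60 / math.pi) * 0.0820 * dr * (
                ws * math.sin(phi) * math.sin(delta)
                + math.cos(phi) * math.cos(delta) * math.sin(ws))

        def hargreaves_radiation(tmax, tmin, lat_deg, day_of_year, k_rs=0.16):
            """Rs = k_rs * sqrt(Tmax - Tmin) * Ra."""
            return k_rs * math.sqrt(tmax - tmin) * extraterrestrial_radiation(lat_deg, day_of_year)

        # Example: latitude 36.1 deg N, 1 July, an 8 deg C diurnal temperature range.
        print(round(hargreaves_radiation(30.0, 22.0, 36.1, 182), 1), "MJ m-2 d-1")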

  9. Introductory Statistics Education and the National Science Foundation

    Science.gov (United States)

    Hall, Megan R.; Rowell, Ginger Holmes

    2008-01-01

    This paper describes 27 National Science Foundation supported grant projects that have innovations designed to improve teaching and learning in introductory statistics courses. The characteristics of these projects are compared with the six recommendations given in the "Guidelines for Assessment and Instruction in Statistics Education (GAISE)…

  10. Fast computation of statistical uncertainty for spatiotemporal distributions estimated directly from dynamic cone beam SPECT projections

    International Nuclear Information System (INIS)

    Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.

    2001-01-01

    The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image-based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of

  11. Developing statistical wildlife habitat relationships for assessing cumulative effects of fuels treatments: Final Report for Joint Fire Science Program Project

    Science.gov (United States)

    Samuel A. Cushman; Kevin S. McKelvey

    2006-01-01

    The primary weakness in our current ability to evaluate future landscapes in terms of wildlife lies in the lack of quantitative models linking wildlife to forest stand conditions, including fuels treatments. This project focuses on 1) developing statistical wildlife habitat relationships models (WHR) utilizing Forest Inventory and Analysis (FIA) and National Vegetation...

  12. Projection of spatial and temporal changes of rainfall in Sarawak of Borneo Island using statistical downscaling of CMIP5 models

    Science.gov (United States)

    Sa'adi, Zulfaqar; Shahid, Shamsuddin; Chung, Eun-Sung; Ismail, Tarmizi bin

    2017-11-01

    This study assesses the possible changes in the rainfall patterns of Sarawak, on Borneo Island, due to climate change through statistical downscaling of General Circulation Model (GCM) projections. Available in-situ observed rainfall data were used to downscale future rainfall from an ensemble of 20 GCMs of the Coupled Model Intercomparison Project phase 5 (CMIP5) for four Representative Concentration Pathway (RCP) scenarios, namely RCP2.6, RCP4.5, RCP6.0 and RCP8.5. Model Output Statistics (MOS) based downscaling models were developed using two data mining approaches, Random Forest (RF) and Support Vector Machine (SVM). The SVM was found to downscale all GCMs with a normalized mean square error (NMSE) of 48.2-75.2 and a skill score (SS) of 0.94-0.98 during validation. The results show that projected annual rainfall increases in some regions and catchments and decreases in others, owing to the influence of the monsoon season affecting the coast of Sarawak. The ensemble mean of the GCM projections shows an increase in mean annual precipitation at 33 stations, at rates of 0.1% to 19.6%, and a decrease at one station, at rates of -7.9% to -3.1%, under all RCP scenarios. The remaining 15 stations showed no consistent direction of change (rates of -5.6% to 5.2%), mainly a decreasing trend in rainfall during the first period (2010-2039) followed by increasing rainfall during 2070-2099.
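
    The sketch below illustrates, on synthetic data, an MOS-style downscaling step of the kind described: a support vector regressor maps a coarse GCM rainfall predictor to station rainfall and is then scored. The NMSE and skill-score formulas used here are common textbook variants and are not necessarily the exact definitions used in the study.

        # Illustrative MOS downscaling with an SVM regressor on synthetic data.
        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(3)
        gcm = rng.gamma(shape=2.0, scale=5.0, size=(1000, 1))       # coarse predictor
        station = 0.8 * gcm[:, 0] + rng.gamma(2.0, 2.0, 1000)       # local predictand

        train, test = slice(0, 700), slice(700, None)
        model = SVR(kernel="rbf", C=10.0, epsilon=0.5)
        model.fit(gcm[train], station[train])
        pred = model.predict(gcm[test])

        obs = station[test]
        mse = np.mean((pred - obs) ** 2)
        nmse = mse / np.var(obs)                                    # normalized mean square error
        clim = station[train].mean()                                # climatology reference forecast
        skill = 1.0 - mse / np.mean((obs - clim) ** 2)              # skill score vs. climatology
        print(round(float(nmse), 3), round(float(skill), 3))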

  13. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  14. Responses of mink to auditory stimuli: Prerequisites for applying the ‘cognitive bias’ approach

    DEFF Research Database (Denmark)

    Svendsen, Pernille Maj; Malmkvist, Jens; Halekoh, Ulrich

    2012-01-01

    The aim of the study was to determine and validate prerequisites for applying a cognitive (judgement) bias approach to assessing welfare in farmed mink (Neovison vison). We investigated discrimination ability and associative learning ability using auditory cues. The mink (n = 15 females) were...... farmed mink in a judgement bias approach would thus appear to be feasible. However several specific issues are to be considered in order to successfully adapt a cognitive bias approach to mink, and these are discussed....

  15. Video self-modeling in children with autism: a pilot study validating prerequisite skills and extending the utilization of VSM across skill sets.

    Science.gov (United States)

    Williamson, Robert L; Casey, Laura B; Robertson, Janna Siegel; Buggey, Tom

    2013-01-01

    Given the recent interest in the use of video self-modeling (VSM) to provide instruction within iPod apps and other pieces of handheld mobile assistive technologies, investigating appropriate prerequisite skills for effective use of this intervention is particularly timely and relevant. To provide additional information regarding the efficacy of VSM for students with autism and to provide insights into any possible prerequisite skills students may require for such efficacy, the authors investigated the use of VSM in increasing the instances of effective initiations of interpersonal greetings for three students with autism that exhibited different pre-intervention abilities. Results showed that only one of the three participants showed an increase in self-initiated greetings following the viewing of videos edited to show each participant self-modeling a greeting when entering his or her classroom. Due to the differences in initial skill sets between the three children, this finding supports anecdotally observed student prerequisite abilities mentioned in previous studies that may be required to effectively utilize video based teaching methods.

  16. Uncertainties in projecting climate-change impacts in marine ecosystems

    DEFF Research Database (Denmark)

    Payne, Mark; Barange, Manuel; Cheung, William W. L.

    2016-01-01

    with a projection and building confidence in its robustness. We review how uncertainties in such projections are handled in marine science. We employ an approach developed in climate modelling by breaking uncertainty down into (i) structural (model) uncertainty, (ii) initialization and internal variability......Projections of the impacts of climate change on marine ecosystems are a key prerequisite for the planning of adaptation strategies, yet they are inevitably associated with uncertainty. Identifying, quantifying, and communicating this uncertainty is key to both evaluating the risk associated...... and highlight the opportunities and challenges associated with doing a better job. We find that even within a relatively small field such as marine science, there are substantial differences between subdisciplines in the degree of attention given to each type of uncertainty. We find that initialization...

  17. Hualapai Wind Project Feasibility Report

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, Kevin [Hualapai Tribe; Randall, Mark [Daystar Consulting; Isham, Tom [Power Engineers; Horna, Marion J [MJH Power Consulting LLC; Koronkiewicz, T [SWCA Environmental, Inc.; Simon, Rich [V-Bar, LLC; Matthew, Rojas [Squire Sanders Dempsey; MacCourt, Doug C. [Ater Wynne, LLP; Burpo, Rob [First American Financial Advisors, Inc.

    2012-12-20

    The Hualapai Department of Planning and Economic Development, with funding assistance from the U.S. Department of Energy, Tribal Energy Program, and with the aid of six consultants, has completed the four key prerequisites as follows: 1. Identify the site area for development and its suitability for construction. 2. Determine the wind resource potential for the identified site area. 3. Determine the electrical transmission and interconnection feasibility to get the electrical power produced to the marketplace. 4. Complete an initial permitting and environmental assessment to determine the feasibility of getting the project permitted. Those studies indicated a suitable wind resource and favorable conditions for permitting and construction. The permitting and environmental study did not reveal any fatal flaws. A review of the best power sale opportunities indicates that southern California has the highest potential for obtaining a PPA that may make the project viable. Based on these results, the recommendation is for the Hualapai Tribal Nation to move forward with attracting a qualified wind developer to work with the Tribe to move the project into the second phase - determining the reality factors for developing a wind project. A qualified developer will bid to a utility or negotiate a PPA to make the project viable for financing.

  18. [Quality assurance in student training. Prerequisites for DIN EN ISO 9001:2000 in teaching].

    Science.gov (United States)

    Ochsner, W; Kaiser, C; Schirmer, U

    2007-07-01

    Standards of quality assurance according to DIN EN ISO 9001:2000 have been implemented in many university hospital departments, but teaching activities are often not included. This work presents a method that, after defining the various teaching activities as sub-processes of a single core process, makes it possible to include the manifold teaching activities of university hospital departments in the certification process. The stepwise description of the prerequisites for including teaching activities in ISO 9001 certification is illustrated by a concrete implementation example.

  19. Determination of prerequisites for the estimation of transportation cost of spent fuels

    International Nuclear Information System (INIS)

    Choi, Heui Joo; Lee, Jong Youl; Kim, Seong Ki; Cha, Jeong Hoon; Choi, Jong Won

    2007-10-01

    The cost of spent fuel management includes the costs of interim storage, transportation, and permanent disposal of the spent fuel. The scope of this report is limited to the cost of spent fuel transportation. KAERI is developing a cost estimation method for spent fuel transportation through a joint study with the French company AREVA TN. Several prerequisites should be fixed in order to estimate the cost of spent fuel transportation properly; in this report we defined them considering the current Korean status of spent fuel management. The representative characteristics of the spent fuel generated by the six nuclear reactors at the YG site were determined. A total of 7,200 tons of spent fuel is projected over a lifespan of 60 years. As the transportation mode, sea transportation and road transportation are recommended, considering the locations of the YG site and the hypothetical Centralized Interim Storage Facility (CISF) and Final Repository (FR). The sea route and transportation time were analyzed using a sea distance analysis program that the NORI (National Oceanographic Research Institute) provides on the web. Based on the results of the analysis, the shipping rates were determined. The regulations related to spent fuel transportation were reviewed, and the characteristics of the transportation vessel and a trailer were suggested. The handling and transportation systems at the YG site, the Centralized Interim Storage Facility, and the Final Repository were described in detail for the purpose of estimating the cost of spent fuel transportation. From the detailed description, the major components of the transportation system were determined for the conceptual design. It is believed that the conceptual design of the transportation system developed in this report will be used for the analysis of transportation logistics and the cost estimation of spent fuel transportation.

  20. Physical and psychosocial prerequisites of functioning in relation to work ability and general subjective well-being among office workers.

    Science.gov (United States)

    Sjögren-Rönkä, Tuulikki; Ojanen, Markku T; Leskinen, Esko K; Mustalampi, Sirpa; Mälkiä, Esko A

    2002-06-01

    The purpose of the study was to investigate the physical and psychological prerequisites of functioning, as well as the social environment at work and personal factors, in relation to work ability and general subjective well-being in a group of office workers. The study was a descriptive cross-sectional investigation, using path analysis, of office workers. The subjects comprised 88 volunteers, 24 men and 64 women, from the same workplace [mean age 45.7 (SD 8.6) years]. The independent variables were measured using psychosocial and physical questionnaires and physical measurements. The first dependent variable, work ability, was measured by a work ability index. The second dependent variable, general subjective well-being, was assessed by life satisfaction and meaning of life. The variables were structured according to a modified version of the International Classification of Functioning, Disability and Health. Forward flexion of the spine, intensity of musculoskeletal symptoms, self-confidence, and mental stress at work explained 58% of work ability and had indirect effects on general subjective well-being. Self-confidence, mood, and work ability had a direct effect on general subjective well-being. The model developed explained 68% of general subjective well-being. Age played a significant role in this study population. The prerequisites of physical functioning are important in maintaining work ability, particularly among aging workers, and psychological prerequisites of functioning are of even greater importance in maintaining general subjective well-being.

  1. Can donor aid for health be effective in a poor country? Assessment of prerequisites for aid effectiveness in Uganda

    Directory of Open Access Journals (Sweden)

    Ssengooba Freddie

    2009-10-01

    Background: Inadequate funding for health is a challenge to attaining health-related Millennium Development Goals. A significant increase in health funding was recommended by the Commission on Macroeconomics and Health. Indeed, Official Development Assistance has increased significantly in Uganda. However, the effectiveness of donor aid has come under greater scrutiny. This paper scrutinizes the prerequisites for aid effectiveness. The objective of the study was to assess the prerequisites for effectiveness of donor aid, specifically, its proportion to overall health funding, predictability, comprehensiveness, alignment to country priorities, and channeling mechanisms. Methods: Secondary data obtained from various official reports and surveys were analyzed against the variables mentioned under objectives. This was augmented by observations and participation in discussions with all stakeholders to discuss sector performance including health financing. Results: Between 2004 and 2007, the level of aid increased from US$6 per capita to US$11. Aid was found to be unpredictable, with expenditure varying between 174−360 percent of budgets. More than 50% of aid was found to be off budget and unavailable for comprehensive planning. There was disproportionate funding for some items such as drugs. Key health system elements such as human resources and infrastructure have not been given due attention in investment. The government’s health funding from domestic sources grew only modestly, which did not guarantee fiscal sustainability. Conclusion: Although donor aid is significant, there is a need to invest in the prerequisites that would guarantee its effective use.

  2. Can donor aid for health be effective in a poor country? Assessment of prerequisites for aid effectiveness in Uganda.

    Science.gov (United States)

    Juliet, Nabyonga Orem; Freddie, Ssengooba; Okuonzi, Sam

    2009-10-22

    Inadequate funding for health is a challenge to attaining the health-related Millennium Development Goals. A significant increase in health funding was recommended by the Commission on Macroeconomics and Health. Indeed, Official Development Assistance has increased significantly in Uganda. However, the effectiveness of donor aid has come under greater scrutiny. This paper examines the prerequisites for aid effectiveness. The objective of the study was to assess the prerequisites for the effectiveness of donor aid, specifically its proportion of overall health funding, predictability, comprehensiveness, alignment to country priorities, and channeling mechanisms. Secondary data obtained from various official reports and surveys were analyzed against the variables mentioned under the objectives. This was augmented by observation of, and participation in, stakeholder discussions of sector performance, including health financing. Between 2004 and 2007, the level of aid increased from US$6 per capita to US$11. Aid was found to be unpredictable, with expenditure varying between 174 and 360 percent from budgets. More than 50% of aid was found to be off budget and unavailable for comprehensive planning. There was disproportionate funding for some items, such as drugs. Key health system elements such as human resources and infrastructure have not been given due attention in investment. The government's health funding from domestic sources grew only modestly, which did not guarantee fiscal sustainability. Although donor aid is significant, there is a need to invest in the prerequisites that would guarantee its effective use.

  3. Two Understandings of "Soft Power": Prerequisites, Correlates and Consequences

    Directory of Open Access Journals (Sweden)

    Pavel Parshin

    2014-01-01

    Full Text Available The category of "soft power" suggested by Joseph Nye in the early 1990s is analyzed in the paper as one realization of the tactile metaphor. The paper highlights those cognitive-semantic peculiarities of this metaphor which contribute to its wide popularity and, at the same time, produce prerequisites for two dramatically different understandings of "soft power". According to the technological understanding, "soft power" is an instrument or, more broadly, a technology, especially a communicative one, applied in world politics in such a way as to minimize the damage caused to the object of power exertion in comparison with other, "hard power" instruments. According to the resource understanding, "soft power" is peculiar to the influence exerted by an actor due to his/her/its attractiveness and shared values. The author analyses the political and ideological correlates of these two understandings and relates them to different traditions in the study of country image and reputation, namely international relations theory and nation branding. Also analyzed are the most topical disagreements about "soft power" in the discourse of world politics.

  4. Statistics for clinical nursing practice: an introduction.

    Science.gov (United States)

    Rickard, Claire M

    2008-11-01

    Difficulty in understanding statistics is one of the most frequently reported barriers to nurses applying research results in their practice. Yet the amount of nursing research published each year continues to grow, as does the expectation that nurses will undertake practice based on this evidence. Critical care nurses do not need to be statisticians, but they do need to develop a working knowledge of statistics so they can be informed consumers of research and so practice can evolve and improve. For those undertaking a research project, statistical literacy is required to interact with other researchers and statisticians, so as to best design and undertake the project. This article is the first in a series that guides critical care nurses through statistical terms and concepts relevant to their practice.

  5. Six sigma for organizational excellence a statistical approach

    CERN Document Server

    Muralidharan, K

    2015-01-01

    This book discusses the integrated concepts of statistical quality engineering and management tools. It will help readers to understand and apply the concepts of quality through project management and technical analysis, using statistical methods. Prepared in a ready-to-use form, the text will equip practitioners to implement the Six Sigma principles in projects. The concepts discussed are all critically assessed and explained, allowing them to be practically applied in managerial decision-making, and in each chapter, the objectives and connections to the rest of the work are clearly illustrated. To aid understanding, the book includes a wealth of tables, graphs, descriptions and checklists, as well as charts and plots, worked-out examples and exercises. Perhaps the most distinctive feature of the book is its approach of using statistical tools to explain the science behind Six Sigma project management, integrated with engineering concepts. The material on quality engineering and statistical management tools of...
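
    As a concrete instance of the statistical tools such a text covers, the sketch below computes process capability indices, a staple Six Sigma calculation. The measurements and specification limits are invented for illustration and are not taken from the book.

    ```python
    # Process capability indices Cp and Cpk from a sample of measurements.
    import numpy as np

    rng = np.random.default_rng(42)
    measurements = rng.normal(loc=10.02, scale=0.03, size=200)  # e.g. a critical dimension in mm

    lsl, usl = 9.90, 10.10                       # assumed lower/upper specification limits
    mu, sigma = measurements.mean(), measurements.std(ddof=1)

    cp = (usl - lsl) / (6 * sigma)               # potential capability (spread only)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability accounting for centring

    print(f"Cp  = {cp:.2f}")
    print(f"Cpk = {cpk:.2f}  (values of about 1.33 or more are a common target)")
    ```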

  6. Review of Time Management for the Research Reactor Project

    Energy Technology Data Exchange (ETDEWEB)

    Park, Kook-Nam; Park, Su-Jin; Choi, Min-Ho; Yoon, Hyung-Mo; Kim, Hyeonil [KAERI, Daejeon (Korea, Republic of); Lee, Eung-Jae [DAEWOO E and C, Seoul (Korea, Republic of)

    2016-05-15

    In this paper, the time management processes that have actually been implemented for JRTR are presented. In JRTR, a master schedule was submitted in December 2012, whereas the project was contracted in October 2010. The schedule includes fixing the Engineering Deliverable List (EDL), the list of equipment, the actual issue date, the results of Primavera (a piece of software used to manage progress), the progress rate, and the issuance of the schedule based on Project level III. Afterwards, JAEC approved the extension of the schedule from 56 months to 70.5 months, mainly due to late preparation of the Jordanian nuclear legislative system. The project schedule was updated through a fifth revision to compensate for the delay with recovery measures for design, purchase, and construction, and the owner of the project, the Jordanian Atomic Energy Commission (JAEC), finally approved it in August 2014. Construction work, the prerequisite for commissioning stage A, was finished in February 2016, and commissioning stage A is now being performed.

  7. Review of Time Management for the Research Reactor Project

    International Nuclear Information System (INIS)

    Park, Kook-Nam; Park, Su-Jin; Choi, Min-Ho; Yoon, Hyung-Mo; Kim, Hyeonil; Lee, Eung-Jae

    2016-01-01

    In this paper, the time management processes that have actually been implemented for JRTR are presented. In JRTR, a master schedule was submitted in December 2012, whereas the project was contracted in October 2010. The schedule includes fixing the Engineering Deliverable List (EDL), the list of equipment, the actual issue date, the results of Primavera (a piece of software used to manage progress), the progress rate, and the issuance of the schedule based on Project level III. Afterwards, JAEC approved the extension of the schedule from 56 months to 70.5 months, mainly due to late preparation of the Jordanian nuclear legislative system. The project schedule was updated through a fifth revision to compensate for the delay with recovery measures for design, purchase, and construction, and the owner of the project, the Jordanian Atomic Energy Commission (JAEC), finally approved it in August 2014. Construction work, the prerequisite for commissioning stage A, was finished in February 2016, and commissioning stage A is now being performed.

  8. Underestimating Costs in Public Works Projects

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren L.

    2002-01-01

    This article presents results from the first statistically significant study of cost escalation in transportation infrastructure projects. Based on a sample of 258 transportation infrastructure projects worth $90 billion (U.S.), it is found with overwhelming statistical significance that the cost estimates used to decide whether important infrastructure should be built are highly and systematically misleading. The result is continuous cost escalation of billions of dollars. The sample used in the study is the largest of its kind, allowing for the first time statistically valid conclusions regarding cost escalation. Those who value honest numbers should not trust the cost estimates and cost-benefit analyses produced by project promoters and their analysts. Independent estimates and analyses are needed, as are institutional checks and balances to curb deception.

  9. Cost Underestimation in Public Works Projects

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren L.

    This article presents results from the first statistically significant study of cost escalation in transportation infrastructure projects. Based on a sample of 258 transportation infrastructure projects worth $90 billion (U.S.), it is found with overwhelming statistical significance that the cost estimates used to decide whether important infrastructure should be built are highly and systematically misleading. The result is continuous cost escalation of billions of dollars. The sample used in the study is the largest of its kind, allowing for the first time statistically valid conclusions regarding cost escalation. Those who value honest numbers should not trust the cost estimates and cost-benefit analyses produced by project promoters and their analysts. Independent estimates and analyses are needed, as are institutional checks and balances to curb deception.

  10. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing.

    Science.gov (United States)

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-12-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation.

  11. MUSIC EDUCATION AND MULTIMEDIA PROJECTS

    Directory of Open Access Journals (Sweden)

    Orlova Elena V.

    2013-12-01

    Full Text Available The article deals with the prerequisites of the shift in the music education paradigm in the XXI century and describes the emergence of new forms in the creative efforts of musicians at primary, secondary, and higher education levels. Different types and genres of multimedia creativity are considered. These were in demand among musicians at various events and contests, including Russian and international festivals and competitions in which music was called upon to play a leading role. Criteria for evaluating new forms of artistic expression are developed. The article contains video examples drawn from the various multimedia projects noted by the juries of several international competitions held in Moscow (Russia) in 2008-2013.

  12. Phasor Simulator for Operator Training Project

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, Jim [Electric Power Group, Llc, Pasadena, CA (United States)

    2016-09-14

    Synchrophasor systems are being deployed in power systems throughout the North American Power Grid and there are plans to integrate this technology and its associated tools into Independent System Operator (ISO)/utility control room operations. A prerequisite to using synchrophasor technologies in control rooms is for operators to obtain training and understand how to use this technology in real-time situations. The Phasor Simulator for Operator Training (PSOT) project objective was to develop, deploy and demonstrate a pre-commercial training simulator for operators on the use of this technology and to promote acceptance of the technology in utility and ISO/Regional Transmission Owner (RTO) control centers.

  13. Using statistics to understand the environment

    CERN Document Server

    Cook, Penny A

    2000-01-01

    Using Statistics to Understand the Environment covers all the basic tests required for environmental practicals and projects and points the way to the more advanced techniques that may be needed in more complex research designs. Following an introduction to project design, the book covers methods to describe data, to examine differences between samples, and to identify relationships and associations between variables. Featuring worked examples covering a wide range of environmental topics, drawings and icons, chapter summaries, a glossary of statistical terms and a further reading section, this book focuses on the needs of the researcher rather than on the mathematics behind the tests.
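
    The three strands the book covers (describing data, examining differences between samples, and identifying relationships) can be illustrated with a brief sketch; the nitrate and flow figures below are simulated, not taken from the book.

    ```python
    # Describe two samples, test for a difference, and test for an association.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    upstream = rng.normal(2.1, 0.4, 30)     # simulated nitrate (mg/L) upstream of an outfall
    downstream = rng.normal(2.6, 0.5, 30)   # simulated nitrate downstream

    print("means:", upstream.mean().round(2), downstream.mean().round(2))

    # Difference between samples: Welch's two-sample t-test (no equal-variance assumption)
    t, p = stats.ttest_ind(upstream, downstream, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.4f}")

    # Relationship between variables: Pearson correlation of river flow vs. nitrate (simulated)
    flow = rng.normal(5, 1, 30)
    nitrate = 3 - 0.2 * flow + rng.normal(0, 0.3, 30)
    r, p_r = stats.pearsonr(flow, nitrate)
    print(f"r = {r:.2f}, p = {p_r:.4f}")
    ```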

  14. Should Research Thesis be a Prerequisite for Doctor of Medicine Degree? A Cross-sectional Study at Jordan University of Science and Technology

    Directory of Open Access Journals (Sweden)

    Aisha Gharaibeh

    2014-02-01

    Full Text Available Background: University-based research is an integral part of many prestigious medical schools worldwide. The benefits of student-conducted research have long been highlighted in the literature. This article aims to identify the insights of medical students concerning research training, including perceived hurdles in the way of conducting research, and the utility of a research thesis in acquiring a Doctor of Medicine degree. Methods: A total of 808 medical students at Jordan University of Science and Technology were selected by random sampling with a confidence level of 95%. A survey was constructed by a group of students through literature review and group discussions. The survey utilized polar and Likert-scale questions to collect data from the students. Statistical inferences were then obtained through analysis of means and one-sample t-tests of the hypotheses. Results: A total of 687 students filled out the survey (85%). Analysis shows that respondents have a strong and positive attitude towards research. The respondents with past research experience constituted 14.3% of those surveyed. Those respondents identified the barriers faced by them during their experience. The students showed a high degree of agreement that a research thesis should be a prerequisite for graduation, with statistical significance (p ≤ 0.05). Conclusion: Modifying the curriculum to include research methodology is recommended, and developing it to incorporate a thesis as a requirement for graduation may be advised upon further review.
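
    A minimal sketch of the kind of analysis the abstract describes, a one-sample t-test of mean Likert agreement against a neutral midpoint; the simulated responses and the 5-point scale are assumptions, not the study's data.

    ```python
    # One-sample t-test of agreement scores against the scale midpoint.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Simulated 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree)
    responses = rng.choice([1, 2, 3, 4, 5], size=687, p=[0.05, 0.10, 0.20, 0.35, 0.30])

    t, p = stats.ttest_1samp(responses, popmean=3)  # H0: mean agreement equals the neutral midpoint
    print(f"mean = {responses.mean():.2f}, t = {t:.2f}, p = {p:.3g}")
    print("reject H0 at alpha = 0.05" if p <= 0.05 else "fail to reject H0")
    ```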

  15. [Anthropology and oral health projects in developing countries].

    Science.gov (United States)

    Grasveld, A E

    2016-01-01

    The mouth and teeth play an important role in social interactions around the world. The way people deal with their teeth and mouth, however, is determined culturally. When oral healthcare projects are being carried out in developing countries, differing cultural worldviews can cause misunderstandings between oral healthcare providers and their patients. The oral healthcare volunteer often has to try to understand the local assumptions about teeth and oral hygiene first, before he or she can bring about a change of behaviour, increase therapy compliance and make the oral healthcare project sustainable. Anthropology can be helpful in this respect. In 2014, in a pilot project commissioned by the Dutch Dental Care Foundation, in which oral healthcare was provided in combination with anthropological research, an oral healthcare project in Kwale (Kenya) was evaluated. The study identified 6 primary themes that indicate the most important factors influencing the oral health of school children in Kwale. Research into the local culture by oral healthcare providers would appear to be an important prerequisite to meaningful work in developing countries.

  16. State Support: A Prerequisite for Global Health Network Effectiveness

    Science.gov (United States)

    Marten, Robert; Smith, Richard D.

    2018-01-01

    Shiffman recently summarized lessons for network effectiveness from an impressive collection of case-studies. However, in common with most global health governance analysis in recent years, Shiffman underplays the important role of states in these global networks. Since states decide and sign international agreements, often provide the resourcing, and are responsible for implementing initiatives, all of which contribute to the prioritization of certain issues over others, state recognition and support is a prerequisite to enabling and determining global health networks' success. The role of states deserves greater attention, analysis and consideration. We reflect upon the underappreciated role of the state within the current discourse on global health. We present the tobacco case study to illustrate the decisive role of states in determining progress for global health networks, and highlight how states use a legitimacy loop to gain legitimacy from and provide legitimacy to global health networks. Moving forward in assessing global health networks' effectiveness, further investigating state support as a determinant of success will be critical. Understanding how global health networks and states interact and evolve to shape and support their respective interests should be a focus for future research. PMID:29524958

  17. Between Certainty and Uncertainty Statistics and Probability in Five Units with Notes on Historical Origins and Illustrative Numerical Examples

    CERN Document Server

    Laudański, Ludomir M

    2013-01-01

    "Between Certainty & Uncertainty" is a one-of-a-kind short course on statistics for students, engineers and researchers. It is a fascinating introduction to statistics and probability with notes on historical origins and 80 illustrative numerical examples organized in five units. Chapter 1, Descriptive Statistics: compressing small samples, basic averages (mean and variance), their main properties including God's proof; linear transformations and z-scored statistics. Chapter 2, Grouped data: Udny Yule's concept of qualitative and quantitative variables; grouping these two kinds of data; graphical tools; combinatorial rules and qualitative variables; designing a frequency histogram; direct and coded evaluation of quantitative data; significance of percentiles. Chapter 3, Regression and correlation: geometrical distance and equivalent distances in two orthogonal directions as a prerequisite to the concept of two regressi...

  18. Environmental Interfaces in Teaching Economic Statistics

    Science.gov (United States)

    Campos, Celso; Wodewotzki, Maria Lucia; Jacobini, Otavio; Ferreira, Denise

    2016-01-01

    The objective of this article is, based on the Critical Statistics Education assumptions, to value some environmental interfaces in teaching Statistics by modeling projects. Due to this, we present a practical case, one in which we address an environmental issue, placed in the context of the teaching of index numbers, within the Statistics…
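
    Since the teaching context is index numbers, a small sketch of a Laspeyres price index may be helpful; the items, prices, and base-period quantities are invented.

    ```python
    # Laspeyres price index: sum(p_t * q_0) / sum(p_0 * q_0) * 100.
    base_prices = {"electricity": 0.40, "water": 1.20, "waste": 15.0}
    base_quantities = {"electricity": 300, "water": 12, "waste": 1}
    current_prices = {"electricity": 0.46, "water": 1.35, "waste": 16.5}

    def laspeyres(p0, q0, pt):
        numer = sum(pt[k] * q0[k] for k in q0)
        denom = sum(p0[k] * q0[k] for k in q0)
        return 100 * numer / denom

    print(f"Laspeyres index: {laspeyres(base_prices, base_quantities, current_prices):.1f}")
    ```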

  19. Statistical Content in Middle Grades Mathematics Textbooks

    Science.gov (United States)

    Pickle, Maria Consuelo Capiral

    2012-01-01

    This study analyzed the treatment and scope of statistical concepts in four, widely-used, contemporary, middle grades mathematics textbook series: "Glencoe Math Connects," "Prentice Hall Mathematics," "Connected Mathematics Project," and "University of Chicago School Mathematics Project." There were three…

  20. Motivation, values, and work design as drivers of participation in the R open source project for statistical computing

    Science.gov (United States)

    Mair, Patrick; Hofmann, Eva; Gruber, Kathrin; Hatzinger, Reinhold; Zeileis, Achim; Hornik, Kurt

    2015-01-01

    One of the cornerstones of the R system for statistical computing is the multitude of packages contributed by numerous package authors. This amount of packages makes an extremely broad range of statistical techniques and other quantitative methods freely available. Thus far, no empirical study has investigated psychological factors that drive authors to participate in the R project. This article presents a study of R package authors, collecting data on different types of participation (number of packages, participation in mailing lists, participation in conferences), three psychological scales (types of motivation, psychological values, and work design characteristics), and various socio-demographic factors. The data are analyzed using item response models and subsequent generalized linear models, showing that the most important determinants for participation are a hybrid form of motivation and the social characteristics of the work design. Other factors are found to have less impact or influence only specific aspects of participation. PMID:26554005

  1. New way of working: Professionals' expectations and experiences of the Culture and Health Project for clients with psychiatric disabilities: A focus group study.

    Science.gov (United States)

    Wästberg, Birgitta A; Sandström, Boel; Gunnarsson, Anna Birgitta

    2018-02-01

    There is a need for various types of interventions when meeting the needs of clients with psychiatric disabilities, and complementary interventions may also influence their well-being. The Culture and Health project, based on complementary interventions with 270 clients, was created in a county in Sweden for clients with psychiatric disabilities and for professionals to carry out the interventions. The aim of this study was to investigate the professionals' expectations regarding the project and their clients' possibilities for participating, and to investigate the professionals' experiences of the project after its completion. Focus group data were collected from a total of 30 participating professionals. A qualitative content analysis revealed four categories of the professionals' expectations before entering the project: "Clients' own possibilities and limitations for their development and independence", "Professionals' possibilities for supporting the clients", "Societal prerequisites", and "Expectations of a new way of working". Furthermore, the analysis regarding professionals' experiences after working with the project revealed three categories: "Adopting the challenges", "Having ways of working that function - prerequisites and possibilities", and "Meeting the future - an ambition to continue". Working in the Culture and Health project together with the clients in group-based activities was perceived as beneficial, although challenges arose. When implementing cultural activities, support from stakeholder organisations is needed. © 2017 Australian College of Mental Health Nurses Inc.

  2. A New Statistical Tool: Scalar Score Function

    Czech Academy of Sciences Publication Activity Database

    Fabián, Zdeněk

    2011-01-01

    Roč. 2, - (2011), s. 109-116 ISSN 1934-7332 R&D Projects: GA ČR GA205/09/1079 Institutional research plan: CEZ:AV0Z10300504 Keywords : statistics * inference function * data characteristics * point estimates * heavy tails Subject RIV: BB - Applied Statistics, Operational Research

  3. Topology for statistical modeling of petascale data.

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A&M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre; Gyulassy, Attila (University of Utah, Salt Lake City, UT); Thompson, David C.; Rojas, Joseph Maurice (Texas A&M University, College Station, TX)

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.

  4. The SYOEKSY research project. Electrically-powered vehicles in ring rail line feeder traffic and short-distance travel; Saehkoeiset ajoneuvot kehaeradan syoettoe- ja asiointiliikenteessae. SYOEKSY-tutkimushankkeen loppuraportti 21.9.2011

    Energy Technology Data Exchange (ETDEWEB)

    2011-07-01

    The research report is divided into three sections: a general description of the project, separate reports on each of the themes investigated and a summary of the research results and recommendations. Section 2 deals with themes such as: Low-emissions-technology solutions for traffic and their technical prerequisites in an urban environment and the results obtained from pilot projects. New service models based on electrically-powered traffic technology with an assessment of their feasibility and their effect on urban infrastructure and levels of CO2 emissions. User needs in new feeder traffic, short-distance travel and other journeys on personal business together with proposals for planning measures which will promote sustainable mobility in new residential areas. The prerequisites for electrically-powered feeder traffic and short-distance travel are therefore handled in a wide-ranging manner. A condensed version of the project conclusions and recommendations is provided in Section 3. A list of the data sources and publications resulting from project work can be found at the end of the report.

  5. Electrical engineering research support for FDOT Traffic Statistics Office

    Science.gov (United States)

    2010-03-01

    The aim of this project was to provide electrical engineering support for the telemetered traffic monitoring sites (TTMSs) operated by the Statistics Office of the Florida Department of Transportation. This project was a continuation of project BD-54...

  6. Engineer’s estimate reliability and statistical characteristics of bids

    Directory of Open Access Journals (Sweden)

    Fariborz M. Tehrani

    2016-12-01

    Full Text Available The objective of this report is to provide a methodology for examining bids and evaluating the performance of engineers' estimates in capturing the true cost of projects. This study reviews cost development for transportation projects in addition to two sources of uncertainty in a cost estimate: modeling errors and inherent variability. The sample projects are highway maintenance projects with a similar scope of work, size, and schedule. Statistical analysis of engineering estimates and bids examines the adaptability of statistical models for the sample projects. Further, the variation of engineering cost estimates from inception to implementation is presented and discussed for selected projects. Moreover, the applicability of extreme value theory is assessed for the available data. The results indicate that the performance of the engineer's estimate is best evaluated against the trimmed average of bids, excluding discordant bids.
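
    A brief sketch of the recommended comparison, the engineer's estimate against a trimmed average of bids; the bid amounts, estimate, and trimming proportion are invented, not the report's data.

    ```python
    # Trimmed mean of bids versus an engineer's estimate.
    import numpy as np
    from scipy import stats

    bids = np.array([1.82, 1.95, 2.01, 2.04, 2.10, 2.16, 3.40])  # $ millions; one discordant high bid
    engineers_estimate = 2.20

    trimmed = stats.trim_mean(bids, proportiontocut=0.15)  # drop ~15% of bids from each tail
    print(f"plain mean     : {bids.mean():.2f}")
    print(f"trimmed mean   : {trimmed:.2f}")
    print(f"estimate error : {(engineers_estimate - trimmed) / trimmed:+.1%}")
    ```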

  7. Statistical physics of networks, information and complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    In this project we explore the mathematical methods and concepts of statistical physics that are finding abundant applications across the scientific and technological spectrum from soft condensed matter systems and bio-informatics to economic and social systems. Our approach exploits the considerable similarity of concepts between statistical physics and computer science, allowing for a powerful multi-disciplinary approach that draws its strength from cross-fertilization and multiple interactions of researchers with different backgrounds. The work on this project takes advantage of the newly appreciated connection between computer science and statistics and addresses important problems in data storage, decoding, optimization, the information processing properties of the brain, the interface between quantum and classical information science, the verification of large software programs, modeling of complex systems including disease epidemiology, resource distribution issues, and the nature of highly fluctuating complex systems. Common themes that the project has been emphasizing are (i) neural computation, (ii) network theory and its applications, and (iii) a statistical physics approach to information theory. The project's efforts focus on the general problem of optimization and variational techniques, algorithm development and information theoretic approaches to quantum systems. These efforts are responsible for fruitful collaborations and the nucleation of science efforts that span multiple divisions such as EES, CCS, D, T, ISR and P. This project supports the DOE mission in Energy Security and Nuclear Non-Proliferation by developing novel information science tools for communication, sensing, and interacting complex networks such as the internet or energy distribution system. The work also supports programs in Threat Reduction and Homeland Security.

  8. Precipitation projections under GCMs perspective and Turkish Water Foundation (TWF) statistical downscaling model procedures

    Science.gov (United States)

    Dabanlı, İsmail; Şen, Zekai

    2018-04-01

    The statistical climate downscaling model by the Turkish Water Foundation (TWF) is further developed and applied to a set of monthly precipitation records. The model is structured in two phases, spatial (regional) and temporal downscaling of global circulation model (GCM) scenarios. The TWF model takes into consideration the regional dependence function (RDF) for the spatial structure and the Markov whitening process (MWP) for the temporal characteristics of the records to set projections. The impact of climate change on monthly precipitation is studied by downscaling Intergovernmental Panel on Climate Change-Special Report on Emission Scenarios (IPCC-SRES) A2 and B2 emission scenarios from the Max Planck Institute (EH40PYC) and the Hadley Centre (HadCM3). The main purposes are to explain the TWF statistical climate downscaling model procedures and to present the validation tests, which rate the model as "very good" for all stations except one (Suhut) in the Akarcay basin, in the west-central part of Turkey. Even though the validation score is slightly lower at the Suhut station, the results there are still "satisfactory." It is, therefore, possible to say that the TWF model has reasonably acceptable skill for highly accurate estimation regarding the standard deviation ratio (SDR), Nash-Sutcliffe efficiency (NSE), and percent bias (PBIAS) criteria. Based on the validated model, precipitation predictions are generated from 2011 to 2100 using the 30-year reference observation period (1981-2010). The precipitation arithmetic average and standard deviation have less than 5% error for the EH40PYC and HadCM3 SRES (A2 and B2) scenarios.
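
    The three validation criteria named above can be computed as in the sketch below; the observed and simulated precipitation series are invented, and the sign and normalisation conventions shown are one common choice rather than necessarily those used by the TWF model.

    ```python
    # Nash-Sutcliffe efficiency, percent bias, and standard deviation ratio.
    import numpy as np

    def nse(obs, sim):
        """1 is a perfect fit; 0 means no better than predicting the observed mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def pbias(obs, sim):
        """Percent bias; positive values indicate underestimation under this convention."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 100 * np.sum(obs - sim) / np.sum(obs)

    def sdr(obs, sim):
        """Ratio of simulated to observed standard deviation."""
        return np.std(sim, ddof=1) / np.std(obs, ddof=1)

    obs = np.array([42.0, 55.3, 61.2, 30.8, 12.4, 5.1, 3.9, 4.4, 11.7, 28.9, 40.2, 50.6])
    sim = np.array([40.1, 58.0, 57.9, 33.5, 10.8, 6.0, 3.2, 5.1, 13.0, 26.7, 43.8, 48.9])

    print(f"NSE = {nse(obs, sim):.2f}, PBIAS = {pbias(obs, sim):.1f}%, SDR = {sdr(obs, sim):.2f}")
    ```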

  9. Environmental Impact Assessment: Uri hydroelectric power project on River Jhelum in Kashmir, India

    International Nuclear Information System (INIS)

    Nyman, L.

    1995-09-01

    This report is an Initial Aquatic Environmental Impact Assessment of the Uri Hydroelectric Power Project on River Jhelum in Kashmir, India. It includes the Terms of Reference of the assessment, a discussion on biodiversity and threats to it, the environmental indicators used to monitor and predict the impacts, a description of the physical, chemical and biological prerequisites of the River Jhelum ecosystem, a description of the survey sites chosen, and an overview of the present fish and bottom fauna. Finally, there are sections on the potential impacts on biota of the Uri Project and a list of proposals for how mitigating and enhancing measures could be enforced

  10. Burden of proof for the illegal immissions as prerequisite of in rem removal claim

    Directory of Open Access Journals (Sweden)

    Knežević Marko

    2013-01-01

    Full Text Available The paper examines the question of the burden of proof for the facts that imply an illegal immission as a prerequisite of the in rem removal claim. The approach differs from the standard doctrine and follows the general rule of the burden of proof in litigation, the so-called modified norm theory. At the centre of attention is the distinction between so-called constitutive and impeditive facts, and the criteria for that distinction. Applying the modified norm theory to the issue of this paper shows that the primary distinguishing point is not suitable, so other modification methods should be applied in order to obtain the answer.

  11. The Conception of the Information Management of Innovation Project and the Stages of its Implementation

    Directory of Open Access Journals (Sweden)

    Babinska Solomiia Ya.

    2017-01-01

    Full Text Available Informational support plays an important role in the development and implementation of innovation projects and is a prerequisite for developing their conception. Proceeding from this, the article considers researchers' approaches to the components of the life cycle of an innovation project, revealing that most of them distinguish three phases (development, implementation, and completion). For each of these stages, the information support was defined in terms of objectives, tasks, managerial decisions (selecting an innovation object; choosing the economic sector; identifying sources of financing; costing; substantiating the expediency of implementing a project; choosing counterparties; product pricing; selecting markets; further use of the property objects), information arrays, and the necessary sources of information, and it was considered how the conception of information management of an innovation project is implemented in stages.

  12. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    Science.gov (United States)

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics, of environmental and biological sciences students, through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  13. Against Inferential Statistics: How and Why Current Statistics Teaching Gets It Wrong

    Science.gov (United States)

    White, Patrick; Gorard, Stephen

    2017-01-01

    Recent concerns about a shortage of capacity for statistical and numerical analysis skills among social science students and researchers have prompted a range of initiatives aiming to improve teaching in this area. However, these projects have rarely re-evaluated the content of what is taught to students and have instead focussed primarily on…

  14. What to be implemented at the early stage of a large-scale project

    CERN Document Server

    Bachy, Gérard; Hameri, Ari Pekka

    1997-01-01

    This paper addresses the importance of the actions to be taken before the project planning phases begin. The approach taken stems from the production planning paradigm, with emphasis on the product, rather than on the process. It is argued that a complete part list or product breakdown structure (PBS) is the absolute prerequisite for the design of a successful work breakdown structure (WBS) for a project. This process requires the definition of the design and configuration disciplines during the engineering phase. These critical issues of concurrent engineering and product development are also emphasized in the paper. The WBS is, in turn, needed to establish a suitable organizational breakdown structure (OBS or organigram) for the project. Finally, the assembly sequence and the related assembly breakdown structure (ABS) of the end product is required before commencing the project planning phase, which provides the schedules, resource allocation, progress control, and the like for the project management. Detai...
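
    A toy sketch of the PBS-before-WBS idea follows; the part hierarchy and the rule for deriving work packages are invented for illustration and are not the paper's procedure.

    ```python
    # Derive candidate WBS work packages from the leaves of a product breakdown structure.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Part:
        name: str
        children: List["Part"] = field(default_factory=list)

    def leaves(part: Part):
        """Leaf parts of the PBS are natural anchors for WBS work packages."""
        if not part.children:
            yield part
        for child in part.children:
            yield from leaves(child)

    pbs = Part("Detector", [
        Part("Magnet", [Part("Coil"), Part("Cryostat")]),
        Part("Tracker", [Part("Sensor modules"), Part("Support frame")]),
    ])

    for leaf in leaves(pbs):
        print(f"Work package: design / procure / assemble {leaf.name}")
    ```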

  15. The use of statistics in real and simulated investigations performed by undergraduate health sciences' students

    OpenAIRE

    Pimenta, Rui; Nascimento, Ana; Vieira, Margarida; Costa, Elísio

    2010-01-01

    In previous works, we evaluated the statistical reasoning ability acquired by health sciences students carrying out their final undergraduate projects. We found that these students achieved a good level of statistical literacy and reasoning in descriptive statistics. However, concerning inferential statistics, the students did not reach a similar level. Statistics educators therefore call for more effective ways to learn statistics, such as project-based investigations. These can be simulat...

  16. Teaching statistics a bag of tricks

    CERN Document Server

    Gelman, Andrew

    2002-01-01

    Students in the sciences, economics, psychology, social sciences, and medicine take introductory statistics. Statistics is increasingly offered at the high school level as well. However, statistics can be notoriously difficult to teach as it is seen by many students as difficult and boring, if not irrelevant to their subject of choice. To help dispel these misconceptions, Gelman and Nolan have put together this fascinating and thought-provoking book. Based on years of teaching experience, the book provides a wealth of demonstrations, examples and projects that involve active student participation.

  17. Visuanimation in statistics

    KAUST Repository

    Genton, Marc G.

    2015-04-14

    This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.

  18. 42 CFR 56.603 - Project elements.

    Science.gov (United States)

    2010-10-01

    ... set forth in the most recent CSA Poverty Income Guidelines (42 CFR 1060.2) (except that nominal fees... statistical data, cost accounting, management information, and reporting or monitoring systems which will meet the project's management needs and shall enable the project to provide such statistics and other...

  19. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...
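
    One of the newly added techniques, the randomized response survey, can be sketched with Warner's classic estimator; the survey counts and design probability below are made up.

    ```python
    # Warner's randomized response estimator: each respondent answers the sensitive
    # statement with probability p and its negation with probability 1 - p, so a
    # "yes" cannot be tied to the individual.  With observed yes-rate lambda,
    #   lambda = p * pi + (1 - p) * (1 - pi)  =>  pi = (lambda - (1 - p)) / (2p - 1).
    def warner_estimate(n_yes, n_total, p=0.7):
        lam = n_yes / n_total
        return (lam - (1 - p)) / (2 * p - 1)

    print(f"estimated sensitive-trait proportion: {warner_estimate(n_yes=410, n_total=1000):.3f}")
    ```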

  20. The Optimization of the Time-Cost Tradeoff Problem in Projects with Conditional Activities Using of the Multi-Objective Charged System Search Algorithm (SMOCSS)

    Directory of Open Access Journals (Sweden)

    M. K. Sharbatdar

    2016-11-01

    Full Text Available Abstract Appropriate planning and scheduling to reach project goals in the most economical way is the most basic issue of project management. In each project, the project manager must determine the activities required to implement the project and select the best option for implementing each activity, so that the lowest final cost and time of the project are achieved. Given the number of activities and the implementation options for each of them, the selection usually has no unique solution; instead, it consists of a set of solutions that are not preferred to one another, known as Pareto solutions. Moreover, in some real projects there are activities whose implementation options depend on how the prerequisite activity is implemented, so that not all implementation options are applicable, and in some cases even whether an activity is implemented at all depends on the implementation of its prerequisite. Such projects can be described as conditional projects. Much research has been conducted on acquiring the Pareto solution set using different methods and algorithms, but none of this work has considered the time-cost optimization of conditional projects. Thus, in the present study the concept of a conditional network is defined along with some practical examples; then an appropriate way to represent these networks and a suitable time-cost formulation for them are presented. Finally, for several instances of conditional activity networks, conditional project time-cost optimization is conducted multi-objectively using known meta-heuristic algorithms such as the multi-objective genetic algorithm, the multi-objective particle swarm algorithm, and the multi-objective charged system search algorithm.
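
    The notion of a Pareto (non-dominated) set of time-cost solutions can be illustrated with a short sketch; the candidate schedules are invented, and the brute-force filter stands in for, rather than reproduces, the meta-heuristics used in the study.

    ```python
    # Keep (duration, cost) schedules not dominated by any other candidate
    # (dominated = another schedule is no worse in both objectives and strictly better in one).
    def pareto_front(candidates):
        front = []
        for a in candidates:
            dominated = any(b[0] <= a[0] and b[1] <= a[1] and b != a for b in candidates)
            if not dominated:
                front.append(a)
        return sorted(set(front))

    schedules = [(120, 5.4), (110, 6.1), (130, 5.0), (110, 5.9), (125, 5.2), (140, 5.0)]
    print(pareto_front(schedules))  # -> [(110, 5.9), (120, 5.4), (125, 5.2), (130, 5.0)]
    ```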

  1. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Science.gov (United States)

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
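
    A minimal sketch of the interrupted time series approach via segmented regression follows; the simulated monthly series and model terms are assumptions for illustration, not data from the cited evaluation.

    ```python
    # Segmented regression: allow a level change ('post') and a slope change ('t_after')
    # at the intervention point.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n_pre, n_post = 24, 24
    t = np.arange(n_pre + n_post)
    post = (t >= n_pre).astype(int)
    t_after = np.where(post == 1, t - n_pre, 0)

    # Simulated outcome: baseline trend, then a level drop and a steeper decline
    y = 50 + 0.2 * t - 6 * post - 0.5 * t_after + rng.normal(0, 2, t.size)
    df = pd.DataFrame({"y": y, "t": t, "post": post, "t_after": t_after})

    model = smf.ols("y ~ t + post + t_after", data=df).fit()
    print(model.params.round(2))  # 'post' = level change, 't_after' = slope change
    ```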

  2. Detection and statistics of gusts

    DEFF Research Database (Denmark)

    Hannesdóttir, Ásta; Kelly, Mark C.; Mann, Jakob

    In this project, a more realistic representation of gusts, based on statistical analysis, will account for the variability observed in real-world gusts. The gust representation will focus on temporal, spatial, and velocity scales that are relevant for modern wind turbines and which possibly affect...

  3. Shippingport Station Decommissioning Project (SSDP): configuration control system and project activity controls

    International Nuclear Information System (INIS)

    Mullee, G.R.

    1986-01-01

    The SSDP has been using a Configuration Control system as a significant element in the management plan for the safe and effective performance of the project. The objective of the Configuration Control system is to control the physical plant configuration, system status, work schedules, status tracking, and day-to-day problem resolution. Prior to the Decommissioning Operations Contractor (DOC) assuming operational responsibility for the Shippingport Plant, an assessment was made of the status of the configuration of the systems and related documentation. Action was taken as required to match the operating procedures and system documentation with the actual physical condition of the plant. During the first stage of the project, planning was put in place for subsequent decommissioning activities. This planning included defining organizational responsibilities, completing the necessary project instructions and procedures, and doing the planning and scheduling for the subsequent decommissioning phase activities. Detailed instructions for the performance of the various decommissioning tasks were prepared. Prior to the start of any work on a given Activity Package, a Work Authorization is required. The Work Authorization form provides a complete checklist to ensure that all necessary prerequisites are completed. A computerized Communications Configuration Control Information system monitors status including information on system status, tag-outs, radiological work permits, etc. An ongoing effort is being directed toward maintaining operating instructions and system schematics, etc. current as the Plant configuration changes. The experience with the Configuration Control System to date has been favorable

  4. Kolmogorov complexity, pseudorandom generators and statistical models testing

    Czech Academy of Sciences Publication Activity Database

    Šindelář, Jan; Boček, Pavel

    2002-01-01

    Roč. 38, č. 6 (2002), s. 747-759 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * pseudorandom generators * statistical models testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002

  5. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    Science.gov (United States)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip students of Mathematics Education with descriptive statistics and inferential statistics. The students' understanding of descriptive and inferential statistics is important for students in the Mathematics Education Department, especially for those whose final task involves quantitative research. In quantitative research, students are required to be able to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to relate the independent and dependent variables defined in their research. In fact, when students worked on final projects involving quantitative research, it was not rare to find them making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure. As a result, they reached incorrect conclusions, which is a fatal mistake in quantitative research. Several outcomes were gained from the implementation of reflective pedagogy in the teaching and learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade, A, was achieved by 18 students. 3. According to all students, they were able to develop their critical stance and build care for each other through the learning process in this course. 4. All students agreed that through the learning process they underwent in the course, they could build care for each other.

  6. Interdisciplinary collaboration as a prerequisite for inclusive education

    DEFF Research Database (Denmark)

    Hedegaard-Sørensen, Lotte; Riis Jensen, Charlotte; Tofteng, Ditte Maria Børglum

    2017-01-01

    This article reports on findings from a research project on interdisciplinary collaboration between mainstream school teachers and special school teachers. The aim of the research project has been to examine the knowledge of special school teachers and how this knowledge can contribute...

  7. Technical clarity in inter-agency negotiations: Lessons from four hydropower projects

    Science.gov (United States)

    Burkardt, Nina; Lamb, Berton Lee; Taylor, Jonathan G.; Waddle, Terry J.

    1995-01-01

    We investigated the effect of technical clarity on success in multi-party negotiations in the Federal Energy Regulatory Commission (FERC) licensing process. Technical clarity is the shared understanding of dimensions such as the geographic extent of the project, range of flows to be considered, important species and life stages, and variety of water uses considered. The results of four hydropower licensing consultations are reported. Key participants were interviewed to ascertain the level of technical clarity present during the consultations and the degree to which the consultations were successful. Technical clarity appears to be a prerequisite for successful outcomes. Factors that enhance technical clarity include simple project design, new rather than existing projects, precise definition of issues, a sense of urgency to reach agreement, a sense of fairness among participants, and consistency in participation. Negotiators should not neglect the critical pre-negotiation steps of defining technical issues and determining appropriate studies, deciding how to interpret studies, and agreeing on responses to study results.

  8. Statistically significant relational data mining :

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.

    2014-02-01

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
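
    The reported concern that a popular partition-comparison measure can flatter over-resolved communities is easy to probe with a small sketch; normalized mutual information is used here only as an example of such a measure, and the toy labelings are not the report's data.

    ```python
    # Compare community assignments against a ground truth with normalized mutual information.
    from sklearn.metrics import normalized_mutual_info_score

    ground_truth  = [0, 0, 0, 0, 1, 1, 1, 1]   # two true communities
    reasonable    = [0, 0, 0, 1, 1, 1, 1, 1]   # one node misplaced
    over_resolved = [0, 1, 2, 3, 4, 5, 6, 7]   # every node in its own community

    print("NMI vs reasonable   :", round(normalized_mutual_info_score(ground_truth, reasonable), 3))
    print("NMI vs over-resolved:", round(normalized_mutual_info_score(ground_truth, over_resolved), 3))
    # The over-resolved partition still scores surprisingly well, illustrating the concern.
    ```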

  9. Shippingport Station Decommissioning Project (SSDP). A progress report

    International Nuclear Information System (INIS)

    Mullee, G.R.; Usher, J.M.

    1986-01-01

    The Shippingport Atomic Power Station was shut down in October, 1982 by the Plant Operator, Duquesne Light Company, for decommissioning by the US Department of Energy. The planning for decommissioning was completed in September, 1983. In September, 1984 operational responsibility for the station was transferred to the DOE's Decommissioning Operations Contractor - the General Electric Company (assisted by an integrated subcontractor, MK Ferguson Company). Significant accomplishments to date include the completion of all prerequisites for decommissioning, the removal of asbestos from plant systems, loading of irradiated reactor components into the reactor vessel for shipment, the commencement of electrical deactivations and the commencement of piping/component removal. Decontamination and waste processing are progressing in support of the project schedule. The reactor vessel will be shipped as one piece on a barge for burial at Hanford, Washington. The final release of the site is scheduled for April, 1990. A technology transfer program is being utilized to disseminate information about the project

  10. Is Pulp Inflammation a Prerequisite for Pulp Healing and Regeneration?

    Directory of Open Access Journals (Sweden)

    Michel Goldberg

    2015-01-01

    Full Text Available The importance of inflammation has been underestimated in pulpal healing, and in the past, it has been considered only as an undesirable effect. Associated with moderate inflammation, necrosis includes pyroptosis, apoptosis, and nemosis. There is now evidence that inflammation is a prerequisite for pulp healing, with a series of events preceding regeneration. Immunocompetent cells are recruited in the apical part. They slide along the root and migrate toward the crown. Due to the high alkalinity of the capping agent, pulp cells display mild inflammation, proliferate, and increase in number and size and initiate mineralization. Pulp fibroblasts become odontoblast-like cells producing type I collagen, alkaline phosphatase, and SPARC/osteonectin. Molecules of the SIBLING family, matrix metalloproteinases, and vascular and nerve mediators are also implicated in the formation of a reparative dentinal bridge, osteo/orthodentin closing the pulp exposure. Beneath a calciotraumatic line, a thin layer identified as reactionary dentin underlines the periphery of the pulp chamber. Inflammatory and/or noninflammatory processes contribute to produce a reparative dentinal bridge closing the pulp exposure, with minute canaliculi and large tunnel defects. Depending on the form and severity of the inflammatory and noninflammatory processes, and according to the capping agent, pulp reactions are induced specifically.

  11. Multivariate spatial Gaussian mixture modeling for statistical clustering of hemodynamic parameters in functional MRI

    International Nuclear Information System (INIS)

    Fouque, A.L.; Ciuciu, Ph.; Risser, L.; Fouque, A.L.; Ciuciu, Ph.; Risser, L.

    2009-01-01

    In this paper, a novel statistical parcellation of intra-subject functional MRI (fMRI) data is proposed. The key idea is to identify functionally homogenous regions of interest from their hemodynamic parameters. To this end, a non-parametric voxel-based estimation of hemodynamic response function is performed as a prerequisite. Then, the extracted hemodynamic features are entered as the input data of a Multivariate Spatial Gaussian Mixture Model (MSGMM) to be fitted. The goal of the spatial aspect is to favor the recovery of connected components in the mixture. Our statistical clustering approach is original in the sense that it extends existing works done on univariate spatially regularized Gaussian mixtures. A specific Gibbs sampler is derived to account for different covariance structures in the feature space. On realistic artificial fMRI datasets, it is shown that our algorithm is helpful for identifying a parsimonious functional parcellation required in the context of joint detection estimation of brain activity. This allows us to overcome the classical assumption of spatial stationarity of the BOLD signal model. (authors)
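
    As a rough, non-spatial analogue of the clustering step described above (the actual method adds a spatial prior and a Gibbs sampler, and the feature names and dimensions below are invented), the sketch fits a plain multivariate Gaussian mixture to synthetic voxel-wise hemodynamic features.

    ```python
    # Simplified, non-spatial analogue of the parcellation step: cluster voxels by
    # their hemodynamic features with a multivariate Gaussian mixture.
    # The real method adds a spatial prior and a dedicated Gibbs sampler.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Synthetic features for 500 voxels: e.g. response delay, dispersion, amplitude.
    features = np.vstack([
        rng.normal([4.0, 1.0, 0.5], 0.3, size=(250, 3)),   # one homogeneous region
        rng.normal([6.0, 2.0, 1.5], 0.3, size=(250, 3)),   # another region
    ])

    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    labels = gmm.fit_predict(features)
    print("voxels per parcel:", np.bincount(labels))
    ```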

  12. Statistically downscaled climate projections to support evaluating climate change risks for hydropower

    International Nuclear Information System (INIS)

    Brekke, L.

    2008-01-01

    This paper described a web-served public access archive of downscaled climate projections developed as a tool for water managers of river and hydropower systems. The archive provided access to climate projection data at basin-relevant resolution and included an extensive compilation of downscaled climate projections designed to support risk-based adaptation planning. Downscaled translations of 112 contemporary climate projections produced using the World Climate Research Program's coupled model intercomparison project were also included. Datasets for the coupled model included temperature and precipitation, monthly time-steps, and geographic coverage for the United States and portions of Mexico and Canada. It was concluded that the archive will be used to develop risk-based studies on shifts in seasonal patterns, changes in mean annual runoff, and associated responses in water resources and hydroelectric power management. Case studies demonstrating reclamation applications of archive content and potential applications for hydroelectric power production impacts were included. tabs., figs

  13. Research projects in family medicine funded by the European Union.

    Science.gov (United States)

    Pavličević, Ivančica; Barać, Lana

    2014-01-01

    This study aimed at synthesizing funding opportunities in the field of family medicine by determining the number of family medicine projects, as well as the number of project leaderships and/or participations by each country. This was done in order to encourage inclusion of physicians in countries with underdeveloped research networks in successful research networks or to encourage them to form new ones. We searched the Community Research and Development Information Service (CORDIS) project database in February 2013. The study covered the period 1992-2012, selecting the projects within the field of general/family medicine. The first search of the CORDIS database returned a total of 466 projects. After excluding 241 projects with insufficient data, we analysed 225 remaining projects; out of those, 22 (9.8%) were in the field of family medicine and 203 (90.2%) were from other fields of medicine. Sorted by the number of projects per country, Dutch institutions had the highest involvement in family medicine projects and were partners or coordinators in 18 out of 22 selected projects (81.8%), followed by British institutions with 15 (68.8%), and Spanish with 10 projects (45.5%). Croatia was a partner in a single FP7 Health project. Research projects in family medicine funded by the European Union show significant differences between countries. Constant and high-quality international cooperation in family medicine is the prerequisite for improvement and development of scientific research and the profession. Copyright © 2014 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  14. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training for the use of designs and teaches statistical and non-statistical skills in the design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  15. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs
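
    As one hedged illustration of what statistical validation of a submodel can look like in general (the report details its own methods; the mechanistic curve, data and tests below are invented), the sketch compares synthetic laboratory measurements against a hypothetical corrosion prediction and checks whether the residuals behave like random measurement error.

    ```python
    # Illustrative validation check for a submodel: do observed lab values scatter
    # around the mechanistic prediction like random measurement error, or is there
    # a systematic lack of fit? All data here are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    time_days = np.linspace(10, 365, 25)
    predicted = 0.8 * np.sqrt(time_days)              # hypothetical corrosion model
    observed = predicted + rng.normal(0, 0.5, 25)     # synthetic lab measurements

    residuals = observed - predicted

    # Test residuals for zero mean (one-sample t-test) and normality (Shapiro-Wilk).
    t_stat, t_p = stats.ttest_1samp(residuals, 0.0)
    w_stat, w_p = stats.shapiro(residuals)
    print(f"zero-mean residuals: t={t_stat:.2f}, p={t_p:.3f}")
    print(f"normality of residuals: W={w_stat:.2f}, p={w_p:.3f}")
    ```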

  16. Improve projections of changes in southern African summer rainfall through comprehensive multi-timescale empirical statistical downscaling

    Science.gov (United States)

    Dieppois, B.; Pohl, B.; Eden, J.; Crétat, J.; Rouault, M.; Keenlyside, N.; New, M. G.

    2017-12-01

    The water management community has hitherto neglected or underestimated many of the uncertainties in climate impact scenarios, in particular, uncertainties associated with decadal climate variability. Uncertainty in the state-of-the-art global climate models (GCMs) is time-scale-dependent, e.g. stronger at decadal than at interannual timescales, in response to the different parameterizations and to internal climate variability. In addition, non-stationarity in statistical downscaling is widely recognized as a key problem, in which time-scale dependency of predictors plays an important role. As with global climate modelling, therefore, the selection of downscaling methods must proceed with caution to avoid unintended consequences of over-correcting the noise in GCMs (e.g. interpreting internal climate variability as a model bias). GCM outputs from the Coupled Model Intercomparison Project 5 (CMIP5) have therefore first been selected based on their ability to reproduce southern African summer rainfall variability and their teleconnections with Pacific sea-surface temperature across the dominant timescales. In observations, southern African summer rainfall has recently been shown to exhibit significant periodicities at the interannual (2-8 years), quasi-decadal (8-13 years) and inter-decadal (15-28 years) timescales, which can be interpreted as the signature of ENSO, the IPO, and the PDO over the region. Most of the CMIP5 GCMs underestimate southern African summer rainfall variability and their teleconnections with Pacific SSTs at these three timescales. In addition, according to a more in-depth analysis of historical and pi-control runs, this bias might result from internal climate variability in some of the CMIP5 GCMs, suggesting potential for bias-corrected, prediction-based empirical statistical downscaling. A multi-timescale regression-based downscaling procedure, which determines the predictors across the different timescales, has thus been used to

  17. THE FLUORBOARD A STATISTICALLY BASED DASHBOARD METHOD FOR IMPROVING SAFETY

    International Nuclear Information System (INIS)

    PREVETTE, S.S.

    2005-01-01

    The FluorBoard is a statistically based dashboard method for improving safety. Fluor Hanford has achieved significant safety improvements--including more than an 80% reduction in OSHA cases per 200,000 hours--during its work at the US Department of Energy's Hanford Site in Washington state. The massive project on the former nuclear materials production site is considered one of the largest environmental cleanup projects in the world. Fluor Hanford's safety improvements were achieved by a committed partnering of workers, managers, and statistical methodology. Safety achievements at the site have been due to a systematic approach to safety. This includes excellent cooperation between the field workers, the safety professionals, and management through OSHA Voluntary Protection Program principles. Fluor corporate values are centered around safety, and safety excellence is important for every manager in every project. In addition, Fluor Hanford has utilized a rigorous approach to using its safety statistics, based upon Dr. Shewhart's control charts, and Dr. Deming's management and quality methods.
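
    As a hedged sketch of the kind of Shewhart control chart such a dashboard might rest on (the monthly counts below are invented and Fluor Hanford's actual charts may differ), the code computes a c-chart centre line and 3-sigma limits for recordable-case counts and flags months outside the limits.

    ```python
    # Sketch of a Shewhart c-chart for monthly recordable-case counts.
    # The counts are synthetic; they do not come from the Hanford data.
    import numpy as np

    monthly_cases = np.array([7, 5, 6, 9, 4, 8, 6, 5, 16, 6, 7, 5])

    c_bar = monthly_cases.mean()                 # centre line
    sigma = np.sqrt(c_bar)                       # Poisson-based standard deviation
    ucl = c_bar + 3 * sigma                      # upper control limit
    lcl = max(c_bar - 3 * sigma, 0.0)            # lower control limit (floored at 0)

    print(f"centre line = {c_bar:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
    for month, count in enumerate(monthly_cases, start=1):
        if count > ucl or count < lcl:
            print(f"month {month}: {count} cases -> investigate (outside control limits)")
    ```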

  18. Project Leadership and Quality Performance of Construction Projects

    Directory of Open Access Journals (Sweden)

    SPG Buba

    2017-05-01

    Full Text Available Background: The construction industry in Nigeria is pigeonholed by poor quality of construction products as a result of the inherent corruption in the country. Lack of purposeful leadership and inappropriate choice of leadership styles in the industry have been attributed to project failure. Abandoned and failed projects are more predominant in the public sector and litter every corner of the country. Objectives: The objective of this paper is to assess the impact of leadership styles on the quality performance criteria of public projects in Nigeria. Methodology: A total of 43 questionnaires were distributed to 3 key groups of respondents (Quantity Surveyors, Builders, and Architects) who are project managers in Nigeria. Descriptive and inferential statistics were used to analyse the data using the Statistical Package for Social Sciences (SPSS). A Likert scale was used to measure the independent variables (leadership styles: facilitative, coaching, delegating and directing) and the level of achievement of projects based on the dependent variables (quality and function performance criteria, namely achieving the highest aesthetic quality and a functional building that fits its purpose). Findings: The study revealed that directing is the major leadership style used by project managers in Nigeria. Amongst the leadership styles, directing also has the most impact on the quality performance indicators, with the greatest relative influence on achieving the highest aesthetic quality and a functional building that fits its purpose. Conclusion/Recommendation/Way forward: The underlying relationship between the directing leadership style and the performance criteria of achieving the highest aesthetic quality and a functional building that fits its purpose will be beneficial to the Nigerian construction environment.

  19. Editorial to: Six papers on Dynamic Statistical Models

    DEFF Research Database (Denmark)

    2014-01-01

    The following six papers are based on invited lectures at the satellite meeting held at the University of Copenhagen before the 58th World Statistics Congress of the International Statistical Institute in Dublin in 2011. At the invitation of the Bernoulli Society, the satellite meeting was organized around the theme “Dynamic Statistical Models” as a part of the Program of Excellence at the University of Copenhagen on “Statistical methods for complex and high dimensional models” (http://statistics.ku.dk/). The Excellence Program in Statistics was a research project to develop and investigate statistical methodology and theory for large and complex data sets that included biostatisticians and mathematical statisticians from three faculties at the University of Copenhagen. The satellite meeting took place August 17–19, 2011. Its purpose was to bring together researchers in statistics and related...

  20. The value of statistical tools to detect data fabrication

    NARCIS (Netherlands)

    Hartgerink, C.H.J.; Wicherts, J.M.; van Assen, M.A.L.M.

    2016-01-01

    We aim to investigate how statistical tools can help detect potential data fabrication in the social and medical sciences. In this proposal we outline three projects to assess the value of such statistical tools to detect potential data fabrication and take the first steps towards applying them.

  1. Project JADE. Description of the MLH-method

    International Nuclear Information System (INIS)

    Sandstedt, H.; Munier, R.; Wichmann, C.; Isaksson, Therese

    2001-08-01

    This report constitutes a part of a series of reports within project JADE, a comparison of deposition methods. A comparison of the deposition methods MLH (Medium Long Holes, with approximately 25 copper canisters emplaced in a horizontal deposition hole about 200 metres in length bored between central and side tunnels) and KBS-3 (copper canisters emplaced in vertical deposition holes bored in the floors of horizontal tunnels) has earlier been performed, and KBS-3 was judged to be more advantageous than MLH. However, the prerequisites for the comparison have changed with time and an updated evaluation of MLH was therefore required. In this report, the current knowledge of MLH is summarized with focus on geological prerequisites, methods for boring long, horizontal deposition holes, reinforcement and sealing, deposition and cost. Comparisons with KBS-3 are performed sequentially. An MLH repository is judged to be more sensitive to ingress of water to the deposition holes during the deposition process. This implies that an MLH repository based on today's knowledge is basically recommended for bedrock with fairly low water-bearing capacity. It has been demonstrated that MLH has considerable economic potential compared to KBS-3. However, the method is judged to be more technically immature than KBS-3. In particular, methods and equipment for deposition of canisters need to be developed further. Methods and equipment for deposition that fulfill the demands on function and safety can be developed in the near future. MLH cannot therefore be rejected as a deposition method.

  2. Statistical deprojection of galaxy pairs

    Science.gov (United States)

    Nottale, Laurent; Chamaraux, Pierre

    2018-06-01

    Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested by the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky-plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designed those methods under the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely intervelocity vz, interdistance rp, their ratio, and the product rp v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and we apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
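
    As a small, hedged illustration of the projection problem described above (a forward Monte Carlo simulation, not the authors' inversion formulae), the sketch draws random isotropic orientations for pairs with a fixed true 3D separation and shows how the projected sky-plane separation is distributed.

    ```python
    # Forward Monte Carlo check of the projection effect: pairs with the same true
    # 3D separation r, seen at random (isotropic) orientations, yield a spread of
    # projected sky-plane separations r_p with mean (pi/4) * r.
    import numpy as np

    rng = np.random.default_rng(7)
    r_true = 1.0                                   # true 3D interdistance (arbitrary units)

    cos_theta = rng.uniform(-1.0, 1.0, 100_000)    # isotropic orientation w.r.t. line of sight
    r_proj = r_true * np.sqrt(1.0 - cos_theta**2)  # projected separation on the sky plane

    print("mean r_p / r  :", r_proj.mean())        # ~ pi/4 ~ 0.785
    print("expected pi/4 :", np.pi / 4)
    ```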

  3. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V) a Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years with mean ± SD body mass index of 26.9 ± 4.4 kg/m. Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP (P ASIR-V 60% with respective CNR values of 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77 (P ASIR 80% had the best and worst spatial resolution, respectively. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.

  4. Statistical methods in physical mapping

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work

  5. Statistical methods in physical mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, David O. [Univ. of California, Berkeley, CA (United States)

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  6. Statistical downscaling based on dynamically downscaled predictors: Application to monthly precipitation in Sweden

    Science.gov (United States)

    Hellström, Cecilia; Chen, Deliang

    2003-11-01

    A prerequisite of a successful statistical downscaling is that large-scale predictors simulated by the General Circulation Model (GCM) must be realistic. It is assumed here that features smaller than the GCM resolution are important in determining the realism of the large-scale predictors. It is tested whether a three-step method can improve conventional one-step statistical downscaling. The method uses predictors that are upscaled from a dynamical downscaling instead of predictors taken directly from a GCM simulation. The method is applied to downscaling of monthly precipitation in Sweden. The statistical model used is a multiple regression model that uses indices of large-scale atmospheric circulation and 850-hPa specific humidity as predictors. Data from two GCMs (HadCM2 and ECHAM4) and two RCM experiments of the Rossby Centre model (RCA1) driven by the GCMs are used. It is found that upscaled RCA1 predictors capture the seasonal cycle better than those from the GCMs, and hence increase the reliability of the downscaled precipitation. However, there are only slight improvements in the simulation of the seasonal cycle of downscaled precipitation. Due to the cost of the method and the limited improvements in the downscaling results, the three-step method is not justified to replace the one-step method for downscaling of Swedish precipitation.
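
    As a hedged, minimal analogue of the regression model described above (the predictor and predictand values are synthetic, and the real study uses upscaled RCA1 fields rather than invented indices), the sketch regresses monthly precipitation on two circulation indices and 850-hPa specific humidity.

    ```python
    # Minimal analogue of a regression-based statistical downscaling model:
    # monthly precipitation regressed on circulation indices and 850-hPa humidity.
    # All values are synthetic stand-ins for the predictors named in the abstract.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    n_months = 360

    u_index = rng.normal(0, 1, n_months)       # zonal circulation index
    v_index = rng.normal(0, 1, n_months)       # meridional circulation index
    q850 = rng.normal(4, 1, n_months)          # 850-hPa specific humidity (g/kg)

    # Synthetic "observed" precipitation with noise (mm/month).
    precip = 40 + 12 * u_index - 5 * v_index + 8 * q850 + rng.normal(0, 10, n_months)

    X = np.column_stack([u_index, v_index, q850])
    model = LinearRegression().fit(X, precip)
    print("coefficients:", model.coef_, "intercept:", model.intercept_)
    print("explained variance (R^2):", model.score(X, precip))
    ```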

  7. Encryption of covert information into multiple statistical distributions

    International Nuclear Information System (INIS)

    Venkatesan, R.C.

    2007-01-01

    A novel strategy to encrypt covert information (code) via unitary projections into the null spaces of ill-conditioned eigenstructures of multiple host statistical distributions, inferred from incomplete constraints, is presented. The host pdf's are inferred using the maximum entropy principle. The projection of the covert information is dependent upon the pdf's of the host statistical distributions. The security of the encryption/decryption strategy is based on the extreme instability of the encoding process. A self-consistent procedure to derive keys for both symmetric and asymmetric cryptography is presented. The advantages of using a multiple pdf model to achieve encryption of covert information are briefly highlighted. Numerical simulations exemplify the efficacy of the model
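
    A hedged numerical sketch of the core linear-algebra idea only (projecting a code vector onto the null space of an ill-conditioned matrix); it is not the paper's full encryption scheme, and the host matrix and message vector below are arbitrary stand-ins.

    ```python
    # Core linear-algebra step only: project a "covert" vector onto the null space
    # of a rank-deficient (hence ill-conditioned) matrix standing in for the host
    # distribution's eigenstructure. Matrix and message are invented.
    import numpy as np

    rng = np.random.default_rng(11)

    # 6x6 host matrix of rank 4, so its null space has dimension 2.
    A = rng.normal(size=(6, 4)) @ rng.normal(size=(4, 6))

    # Null-space basis from the SVD: right singular vectors with ~zero singular values.
    U, s, Vt = np.linalg.svd(A)
    null_basis = Vt[s < 1e-10 * s.max()].T          # shape (6, 2)

    code = rng.normal(size=6)                       # covert information as a vector
    projected = null_basis @ (null_basis.T @ code)  # orthogonal projection onto null(A)

    print("residual ||A @ projected|| :", np.linalg.norm(A @ projected))  # ~0
    ```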

  8. Climate change estimates of South American riverflow through statistical downscaling

    CSIR Research Space (South Africa)

    Landman, W

    2014-03-01

    Full Text Available projections are assimilated into a linear statistical model in order to produce an ensemble of downscaled riverflows in the La Plata Basin and in southern-central Chile. The statistical model uses atmospheric circulation fields (geopotential heights at the 200...

  9. Statistics in three biomedical journals

    Czech Academy of Sciences Publication Activity Database

    Pilčík, Tomáš

    2003-01-01

    Roč. 52, č. 1 (2003), s. 39-43 ISSN 0862-8408 R&D Projects: GA ČR GA310/03/1381 Grant - others:Howard Hughes Medical Institute(US) HHMI55000323 Institutional research plan: CEZ:AV0Z5052915 Keywords : statistics * usage * biomedical journals Subject RIV: EC - Immunology Impact factor: 0.939, year: 2003

  10. Photogrammetric computer vision statistics, geometry, orientation and reconstruction

    CERN Document Server

    Förstner, Wolfgang

    2016-01-01

    This textbook offers a statistical view on the geometry of multiple view analysis, required for camera calibration and orientation and for geometric scene reconstruction based on geometric image features. The authors have backgrounds in geodesy and also long experience with development and research in computer vision, and this is the first book to present a joint approach from the converging fields of photogrammetry and computer vision. Part I of the book provides an introduction to estimation theory, covering aspects such as Bayesian estimation, variance components, and sequential estimation, with a focus on the statistically sound diagnostics of estimation results essential in vision metrology. Part II provides tools for 2D and 3D geometric reasoning using projective geometry. This includes oriented projective geometry and tools for statistically optimal estimation and test of geometric entities and transformations and their relations, tools that are useful also in the context of uncertain reasoning in po...

  11. eSACP - a new Nordic initiative towards developing statistical climate services

    Science.gov (United States)

    Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine

    2015-04-01

    The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields and between practitioners in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and include functionality to utilize the extensive and dynamically growing repositories of data and use state-of-the-art statistical techniques to quantify the uncertainty and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case more clearly on the consequences of our changing climate to policy makers and the general public. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark

  12. Topology for Statistical Modeling of Petascale Data

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States); Levine, Joshua [Univ. of Utah, Salt Lake City, UT (United States); Gyulassy, Attila [Univ. of Utah, Salt Lake City, UT (United States); Bremer, P. -T. [Univ. of Utah, Salt Lake City, UT (United States)

    2013-10-31

    Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, the approach of the entire team involving all three institutions is based on the complementary techniques of combinatorial topology and statistical modelling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modelling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. The overall technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modelling, and (3) new integrated topological and statistical methods. Roughly speaking, the division of labor between our 3 groups (Sandia Labs in Livermore, Texas A&M in College Station, and U Utah in Salt Lake City) is as follows: the Sandia group focuses on statistical methods and their formulation in algebraic terms, and finds the application problems (and data sets) most relevant to this project; the Texas A&M group develops new algebraic geometry algorithms, in particular with fewnomial theory; and the Utah group develops new algorithms in computational topology via Discrete Morse Theory. However, we hasten to point out that our three groups stay in tight contact via videoconference every 2 weeks, so there is much synergy of ideas between the groups. The remainder of this document is focused on the contributions that had greater direct involvement from the team at the University of Utah in Salt Lake City.

  13. Tube problems: worldwide statistics reviewed

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    EPRI's Steam Generator Strategic Management Project issues an annual report on the progress being made in tackling steam generator problems worldwide, containing a wealth of detailed statistics on the status of operating units and degradation mechanisms encountered. A few highlights are presented from the latest report, issued in October 1993, which covers the period to 31 December 1992. (Author)

  14. Watt-Lite; Energy Statistics Made Tangible

    DEFF Research Database (Denmark)

    Jönsson, Li; Broms, Loove; Katzeff, Cecilia

    2011-01-01

    of consumers its consequences are poorly understood. In order to better understand how we can use design to increase awareness of electricity consumption in everyday life, we will discuss the design of Watt-Lite, a set of three oversized torches projecting real time energy statistics of a factory...... in the physical environments of its employees. The design of Watt-Lite is meant to explore ways of representing, understanding and interacting with electricity in industrial workspaces. We discuss three design inquiries and their implications for the design of Watt-Lite: the use of tangible statistics...

  15. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  16. Comprehensive Benefit Evaluation of the Power Distribution Network Planning Project Based on Improved IAHP and Multi-Level Extension Assessment Method

    OpenAIRE

    Qunli Wu; Chenyang Peng

    2016-01-01

    Reasonable distribution network planning is an essential prerequisite of the economics and security of the future power grid. The comprehensive benefit evaluation of a distribution network planning project can make significant contributions towards guiding decisions during the planning scheme, the optimization of the distribution network structure, and the rational use of resources. In this paper, in light of the characteristics of the power distribution network, the comprehensive benefit eva...

  17. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  18. Measuring the Social Impact of Infrastructure Projects: The Case of Gdańsk International Fair Co.

    Directory of Open Access Journals (Sweden)

    Anna Zamojska

    2017-01-01

    Full Text Available Efficient infrastructure is a prerequisite of, and critical to, development. Only some projects generate a positive rate of return, but all of them should generate positive non-economic impacts and contribute social gains. Social impact is considered as a consequence or effect of decisions or interventions which lead to development. It can also be considered as a social consequence of development. The main problem of social costs and benefits is that the impact is difficult to predict and quantify and can be taken into account differently by authorities, decision makers and project developers. The main purpose of the paper is to identify and demonstrate a concept of the social impact of infrastructure projects. The principal methods used are a review of existing social science literature and surveys based on focus group interviews, devoted to stakeholders of infrastructure projects and their involvement at different stages of the project. The expected result is a set of outputs and outcomes which demonstrates social impacts (costs and benefits) related to stakeholders' groups of the analyzed project.

  19. Fulfilling the needs for statistical expertise at Aalborg Hospital

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    In 2005, the first statistician was employed at Aalborg Hospital due to expanding research activities as part of Aarhus University Hospital. Since then, there has been an increased demand for statistical expertise at all levels. In the talk, I will give an overview of the current staff of statisticians and the organisation. I will give examples from our statistical consultancy and illustrate some of the challenges that have led to research projects with heavy statistical involvement.

  20. Expansion of the On-line Archive "Statistically Downscaled WCRP CMIP3 Climate Projections"

    Science.gov (United States)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Das, T.; Duffy, P.; White, K.

    2009-12-01

    Presentation highlights status and plans for a public-access archive of downscaled CMIP3 climate projections. Incorporating climate projection information into long-term evaluations of water and energy resources requires analysts to have access to projections at "basin-relevant" resolution. Such projections would ideally be bias-corrected to account for climate model tendencies to systematically simulate historical conditions different than observed. In 2007, the U.S. Bureau of Reclamation, Santa Clara University and Lawrence Livermore National Laboratory (LLNL) collaborated to develop an archive of 112 bias-corrected and spatially disaggregated (BCSD) CMIP3 temperature and precipitation projections. These projections were generated using 16 CMIP3 models to simulate three emissions pathways (A2, A1b, and B1) from one or more initializations (runs). Projections are specified on a monthly time step from 1950-2099 and at 0.125 degree spatial resolution within the North American Land Data Assimilation System domain (i.e. contiguous U.S., southern Canada and northern Mexico). Archive data are freely accessible at LLNL Green Data Oasis (url). Since being launched, the archive has served over 3500 data requests by nearly 500 users in support of a range of planning, research and educational activities. Archive developers continue to look for ways to improve the archive and respond to user needs. One request has been to serve the intermediate datasets generated during the BCSD procedure, helping users to interpret the relative influences of the bias-correction and spatial disaggregation on the transformed CMIP3 output. This request has been addressed with intermediate datasets now posted at the archive web-site. Another request relates closely to studying hydrologic and ecological impacts under climate change, where users are asking for projected diurnal temperature information (e.g., projected daily minimum and maximum temperature) and daily time step resolution. In

  1. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to less known, but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, Tiku. Thanks to the component-based design and the usage of the standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. This Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit and describe the code validation
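
    For readers unfamiliar with the tests listed above, here is a hedged sketch (using SciPy as a stand-in; the Toolkit's own API is not reproduced here) of comparing a sample against a normal reference with Kolmogorov-Smirnov, Anderson-Darling and chi-squared statistics.

    ```python
    # Goodness-of-fit checks of a sample against a normal reference, using SciPy
    # as a stand-in for the Toolkit's own test implementations.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    sample = rng.normal(loc=0.0, scale=1.0, size=500)

    # Kolmogorov-Smirnov against the standard normal CDF.
    ks_stat, ks_p = stats.kstest(sample, "norm")

    # Anderson-Darling (returns a statistic and critical values, not a p-value).
    ad = stats.anderson(sample, dist="norm")

    # Chi-squared on binned counts vs. expected normal bin probabilities.
    edges = np.linspace(-3, 3, 13)
    observed, _ = np.histogram(sample, bins=edges)
    expected = len(sample) * np.diff(stats.norm.cdf(edges))
    expected *= observed.sum() / expected.sum()          # match totals for the test
    chi2_stat, chi2_p = stats.chisquare(observed, f_exp=expected)

    print(f"KS: D={ks_stat:.3f}, p={ks_p:.3f}")
    print(f"Anderson-Darling: A2={ad.statistic:.3f}")
    print(f"Chi-squared: X2={chi2_stat:.3f}, p={chi2_p:.3f}")
    ```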

  2. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    Energy Technology Data Exchange (ETDEWEB)

    Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  3. Projection of future climate change conditions using IPCC simulations, neural networks and Bayesian statistics. Part 2: Precipitation mean state and seasonal cycle in South America

    Energy Technology Data Exchange (ETDEWEB)

    Boulanger, Jean-Philippe [LODYC, UMR CNRS/IRD/UPMC, Tour 45-55/Etage 4/Case 100, UPMC, Paris Cedex 05 (France); University of Buenos Aires, Departamento de Ciencias de la Atmosfera y los Oceanos, Facultad de Ciencias Exactas y Naturales, Buenos Aires (Argentina); Martinez, Fernando; Segura, Enrique C. [University of Buenos Aires, Departamento de Computacion, Facultad de Ciencias Exactas y Naturales, Buenos Aires (Argentina)

    2007-02-15

    Evaluating the response of climate to greenhouse gas forcing is a major objective of the climate community, and the use of large ensembles of simulations is considered a significant step toward that goal. The present paper thus discusses a new methodology based on neural networks to mix ensembles of climate model simulations. Our analysis consists of one simulation of seven Atmosphere-Ocean Global Climate Models, which participated in the IPCC Project and provided at least one simulation for the twentieth century (20c3m) and one simulation for each of three SRES scenarios: A2, A1B and B1. Our statistical method based on neural networks and Bayesian statistics computes a transfer function between models and observations. Such a transfer function was then used to project future conditions and to derive what we would call the optimal ensemble combination for twenty-first century climate change projections. Our approach is therefore based on one statement and one hypothesis. The statement is that an optimal ensemble projection should be built by giving larger weights to models which have more skill in representing present climate conditions. The hypothesis is that our method based on neural networks actually weights the models that way. While the statement is actually an open question, whose answer may vary according to the region or climate signal under study, our results demonstrate that the neural network approach indeed allows weighting models according to their skill. As such, our method is an improvement on existing Bayesian methods developed to mix ensembles of simulations. However, the general low skill of climate models in simulating precipitation mean climatology implies that the final projection maps (whatever the method used to compute them) may significantly change in the future as models improve. Therefore, the projection results for late twenty-first century conditions are presented as possible projections based on the

  4. Using Facebook Data to Turn Introductory Statistics Students into Consultants

    Science.gov (United States)

    Childers, Adam F.

    2017-01-01

    Facebook provides businesses and organizations with copious data that describe how users are interacting with their page. This data affords an excellent opportunity to turn introductory statistics students into consultants to analyze the Facebook data using descriptive and inferential statistics. This paper details a semester-long project that…

  5. A Statistical Framework for the Functional Analysis of Metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Sharon, Itai; Pati, Amrita; Markowitz, Victor; Pinter, Ron Y.

    2008-10-01

    Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. They present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
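
    As a hedged illustration of the Lander-Waterman style reasoning invoked above (not the authors' estimator; all numbers are invented), the sketch computes the expected number of shotgun reads hitting a gene family of a given aggregate length, plus a Poisson interval for an observed count as a crude reliability gauge.

    ```python
    # Lander-Waterman style expectation: how many shotgun reads should hit a gene
    # family occupying a given fraction of the community's genomic sequence?
    # All numbers below are invented for illustration.
    import numpy as np
    from scipy import stats

    G = 200_000_000      # total genomic sequence in the community (bp), assumed
    L = 400              # read length (bp)
    N = 1_000_000        # number of reads in the metagenome
    family_len = 60_000  # aggregate length of the gene family across genomes (bp)

    # A read "hits" the family if its start falls within family_len + L - 1 positions.
    p_hit = (family_len + L - 1) / G
    expected_hits = N * p_hit
    print(f"expected reads hitting the family: {expected_hits:.1f}")

    # Poisson interval around an observed count, a simple reliability check.
    observed = 350
    low, high = stats.poisson.interval(0.95, observed)
    print(f"95% interval around observed count {observed}: [{low:.0f}, {high:.0f}]")
    ```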

  6. Energy statistics. France

    International Nuclear Information System (INIS)

    2002-10-01

    This document summarizes in a series of tables the energy statistical data for France: consumption since 1973; energy supplies (production, imports, exports, stocks) and uses (refining, power production, internal uses, sectoral consumption) for coal, petroleum, gas, electricity, and renewable energy sources; national production and consumption of primary energy; final consumption per sector and per energy source; general indicators (energy bill, US$ change rate, prices, energy independence, internal gross product); projections. Details (resources, uses, prices, imports, internal consumption) are given separately for petroleum, natural gas, electric power and solid mineral fuels. (J.S.)

  7. Project Compliance with Enterprise Architecture

    NARCIS (Netherlands)

    Foorthuis, R.M.

    2012-01-01

    This research project set out to identify effective practices and models for working with projects that are required to comply with Enterprise Architecture (EA), and investigate the benefits and drawbacks brought about by compliance. Research methods used are canonical action research, a statistical

  8. Servant leadership behaviors of aerospace and defense project managers and their relation to project success

    Science.gov (United States)

    Dominik, Michael T.

    The success of a project is dependent in part on the skills, knowledge, and behavior of its leader, the project manager. Despite advances in project manager certifications and professional development, the aerospace and defense industry has continued to see highly visible and expensive project failures partially attributable to failures in leadership. Servant leadership is an emerging leadership theory whose practitioners embrace empowerment, authenticity, humility, accountability, forgiveness, courage, standing back, and stewardship, but it has not yet been fully examined in the context of the project manager as leader. The objective of this study was to examine the relationship between servant leadership behaviors demonstrated by aerospace and defense project managers and the resulting success of their projects. Study participants were drawn from aerospace- and defense-oriented affinity groups on the LinkedIn social media system. The participants rated their project managers using a 30-item servant leadership scale, and rated the success of their project using a 12-item project success scale. One hundred and fifteen valid responses were analyzed from 231 collected samples from persons who had worked for a project manager on an aerospace and defense project within the past year. The results of the study demonstrated statistically significant levels of positive correlation to project success for all eight servant leadership factors independently evaluated. Using multiple linear regression methods, the servant leadership factors of empowerment and authenticity were determined to be substantial and statistically significant predictors of project success. The study results established the potential application of servant leadership as a valid approach for improving outcomes of projects.
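
    To make the analysis style concrete (a generic sketch with simulated survey scores, not the study's data or exact model), the code below fits a multiple linear regression of a project-success score on two of the servant-leadership factors named above and reports coefficient significance.

    ```python
    # Generic sketch of the study's analysis style: multiple linear regression of a
    # project-success score on servant-leadership factor scores, with p-values.
    # Survey responses here are simulated, not the study's data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(9)
    n = 115  # analysed sample size reported in the abstract

    empowerment = rng.uniform(1, 5, n)     # Likert-style factor scores
    authenticity = rng.uniform(1, 5, n)
    success = 1.0 + 0.6 * empowerment + 0.4 * authenticity + rng.normal(0, 0.5, n)

    X = sm.add_constant(np.column_stack([empowerment, authenticity]))
    result = sm.OLS(success, X).fit()
    print(result.params)    # intercept and factor coefficients
    print(result.pvalues)   # significance of each predictor
    ```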

  9. A comparison of dynamical and statistical downscaling methods for regional wave climate projections along French coastlines.

    Science.gov (United States)

    Laugel, Amélie; Menendez, Melisa; Benoit, Michel; Mattarolo, Giovanni; Mendez, Fernando

    2013-04-01

    Wave climate forecasting is a major issue for numerous marine and coastal related activities, such as offshore industries, flooding risks assessment and wave energy resource evaluation, among others. Generally, there are two main ways to predict the impacts of the climate change on the wave climate at regional scale: the dynamical and the statistical downscaling of GCM (Global Climate Model). In this study, both methods have been applied on the French coast (Atlantic , English Channel and North Sea shoreline) under three climate change scenarios (A1B, A2, B1) simulated with the GCM ARPEGE-CLIMAT, from Météo-France (AR4, IPCC). The aim of the work is to characterise the wave climatology of the 21st century and compare the statistical and dynamical methods pointing out advantages and disadvantages of each approach. The statistical downscaling method proposed by the Environmental Hydraulics Institute of Cantabria (Spain) has been applied (Menendez et al., 2011). At a particular location, the sea-state climate (Predictand Y) is defined as a function, Y=f(X), of several atmospheric circulation patterns (Predictor X). Assuming these climate associations between predictor and predictand are stationary, the statistical approach has been used to project the future wave conditions with reference to the GCM. The statistical relations between predictor and predictand have been established over 31 years, from 1979 to 2009. The predictor is built as the 3-days-averaged squared sea level pressure gradient from the hourly CFSR database (Climate Forecast System Reanalysis, http://cfs.ncep.noaa.gov/cfsr/). The predictand has been extracted from the 31-years hindcast sea-state database ANEMOC-2 performed with the 3G spectral wave model TOMAWAC (Benoit et al., 1996), developed at EDF R&D LNHE and Saint-Venant Laboratory for Hydraulics and forced by the CFSR 10m wind field. Significant wave height, peak period and mean wave direction have been extracted with an hourly-resolution at

  10. Study on Semi-Parametric Statistical Model of Safety Monitoring of Cracks in Concrete Dams

    Directory of Open Access Journals (Sweden)

    Chongshi Gu

    2013-01-01

    Full Text Available Cracks are one of the hidden dangers in concrete dams. The study of safety monitoring models for concrete dam cracks has always been difficult. Starting from the parametric statistical model of safety monitoring of cracks in concrete dams, with the help of semi-parametric statistical theory, and considering the abnormal behaviors of these cracks, a semi-parametric statistical model of safety monitoring of concrete dam cracks is established to overcome the limitation of the parametric model in expressing the objective model. Previous projects show that the semi-parametric statistical model fits more closely and explains cracks in concrete dams better than the parametric statistical model. However, when used for forecasting, the forecasting capability of the semi-parametric statistical model is equivalent to that of the parametric statistical model. The semi-parametric statistical model is simple to build, rests on a reasonable principle, and is highly practical, with good application prospects in actual projects.
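
    As a hedged sketch of what a semi-parametric monitoring model can look like in general (a partially linear fit on synthetic data, not the paper's formulation), the code combines a parametric linear term for reservoir water level with a nonparametric LOWESS term for the time-dependent component of crack opening.

    ```python
    # Generic partially linear (semi-parametric) fit for a monitoring series:
    # a parametric effect of water level plus a nonparametric trend in time.
    # Data are synthetic; this is not the paper's model.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(13)
    t = np.linspace(0, 10, 300)                          # years of monitoring
    water_level = 20 + 3 * np.sin(2 * np.pi * t)         # reservoir level (synthetic)
    crack = 0.05 * water_level + 0.3 * np.log1p(t) + rng.normal(0, 0.05, 300)

    # Step 1: parametric part -- linear regression on water level.
    lin = LinearRegression().fit(water_level.reshape(-1, 1), crack)
    residual = crack - lin.predict(water_level.reshape(-1, 1))

    # Step 2: nonparametric part -- LOWESS smooth of the residual over time.
    trend = lowess(residual, t, frac=0.3, return_sorted=False)

    fitted = lin.predict(water_level.reshape(-1, 1)) + trend
    print("RMS misfit:", np.sqrt(np.mean((crack - fitted) ** 2)))
    ```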

  11. Leadership and Project Success in Development Sector

    Directory of Open Access Journals (Sweden)

    Saghir Ahmed

    2017-10-01

    Full Text Available Aim/purpose - The study aims to investigate the relationship among leadership, operational efficiency and project success in general and the impact of transformational leadership and operational efficiency on project success in particular. Design/methodology/approach - Mean comparison from descriptive statistics and multiple linear regression from inferential statistics were used to determine the association between variables and, further, the impact of transformational leadership and operational efficiency on project success in the development sector. The paper presents the results of a survey conducted among 200 employees from the top, middle & lower management levels of various national & international development organizations working in Pakistan, such as Microfinance Banks and other Rural Support Programs. The Statistical Package for Social Sciences (SPSS) was used to process data. Findings - The results show a positive association among transformational leadership, operational efficiency and project success. In addition, it was found that transformational leadership and operational efficiency have a positive and statistically significant impact on project success. It is concluded that both transformational leadership and operational efficiency are vital to achieving the optimum level of success in any project, especially in the development sector. Research implications/limitations - The main limitation of the study was respondent access, because most of the development organizations have their operations in rural areas that were difficult to reach with limited time and resources. In addition, such organizations are always reluctant to provide survey feedback. Originality/value/contribution - The paper contributes to the theoretical and practical knowledge of project success factors in the development sector, which is still a somewhat unexplored area. Regulators of the development sector may benefit from this study.

  12. Analysis of natural prerequisites for the development of ecological tourism in the Belgorod region

    Science.gov (United States)

    Pendyurin, Eu A.; Glamazda, S. N.; Genenko, O. N.; Ryadnova, S. A.

    2018-01-01

    The tourism industry is related to entertainment, leisure, new impressions, emotions and pleasure. Tourism liberates people from a sense of fatigue through a specific change of environment and activity. Eco-tourism today is becoming one of the most promising developing sectors of the tourism business. Ecotourism lies somewhere between measured leisurely walks and extreme sports: it is contemplative and moderately informative, yet usually sporty and mobile, and quite impressive. Analysis of the natural prerequisites for the development of ecological tourism is one of the stages of site assessment, as natural resources are one of the important determinants of its use. The Belgorod region has high recreational and touristic potential, and its territory contains unique natural and recreational resources. By identifying and analyzing the tourist and recreation resources relevant to tourism development in the Belgorod region, we want to draw attention to the large potential of this field in the region.

  13. Topology for Statistical Modeling of Petascale Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States); Levine, Joshua [Univ. of Utah, Salt Lake City, UT (United States); Gyulassy, Attila [Univ. of Utah, Salt Lake City, UT (United States); Rojas, Maurice [Texas A & M Univ., College Station, TX (United States)

    2014-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled "Topology for Statistical Modeling of Petascale Data", funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program.

  14. Integrated pest management and entomopathogenic fungal biotechnology in the Latin Americas: II key research and development prerequisites

    International Nuclear Information System (INIS)

    Khachatourians, George G; Valencia, Edison

    1999-01-01

    In the first part of this review article (Valencia and Khachatourians, 1998) we presented the special opportunity that entomopathogenic fungi (EPF) offer for integrated pest management (IPM) in the Latin Americas. As expected, along with the opportunities there are challenges for the use of EPF. First, there are only two fungi, Beauveria bassiana and Metarhizium anisopliae, for which some prerequisite knowledge of basic and applied mycology for industrial research and development (R and D) is in place. Because of precedent-setting leadership in the development of certain EPF, e.g., B. bassiana in IPM, Latin America stands to contribute to and gain from future

  15. Statistics for NAEG: past efforts, new results, and future plans

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.; Engel, D.W.

    1983-06-01

    A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given

  16. Statistical Emulation of Climate Model Projections Based on Precomputed GCM Runs*

    KAUST Repository

    Castruccio, Stefano; McInerney, David J.; Stein, Michael L.; Liu Crouch, Feifei; Jacob, Robert L.; Moyer, Elisabeth J.

    2014-01-01

    functions of the past trajectory of atmospheric CO2 concentrations, and a statistical model is fit using a limited set of training runs. The approach is demonstrated to be a useful and computationally efficient alternative to pattern scaling and captures

  17. A portfolio evaluation framework for air transportation improvement projects

    Science.gov (United States)

    Baik, Hyeoncheol

    This thesis explores the application of portfolio theory to the Air Transportation System (ATS) improvement. The ATS relies on complexly related resources and different stakeholder groups. Moreover, demand for air travel is significantly increasing relative to capacity of air transportation. In this environment, improving the ATS is challenging. Many projects, which are defined as technologies or initiatives, for improvement have been proposed and some have been demonstrated in practice. However, there is no clear understanding of how well these projects work in different conditions nor of how they interact with each other or with existing systems. These limitations make it difficult to develop good project combinations, or portfolios that maximize improvement. To help address this gap, a framework for identifying good portfolios is proposed. The framework can be applied to individual projects or portfolios of projects. Projects or portfolios are evaluated using four different groups of factors (effectiveness, time-to-implement, scope of applicability, and stakeholder impacts). Portfolios are also evaluated in terms of interaction-determining factors (prerequisites, co-requisites, limiting factors, and amplifying factors) because, while a given project might work well in isolation, interdependencies between projects or with existing systems could result in lower overall performance in combination. Ways to communicate a portfolio to decision makers are also introduced. The framework is unique because (1) it allows using a variety of available data, and (2) it covers diverse benefit metrics. For demonstrating the framework, an application to ground delay management projects serves as a case study. The portfolio evaluation approach introduced in this thesis can aid decision makers and researchers at universities and aviation agencies such as Federal Aviation Administration (FAA), National Aeronautics and Space Administration (NASA), and Department of Defense (DoD), in

  18. Statistical Compilation of the ICT Sector and Policy Analysis

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ... Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia.

  19. Weighted regularized statistical shape space projection for breast 3D model reconstruction.

    Science.gov (United States)

    Ruiz, Guillermo; Ramon, Eduard; García, Jaime; Sukno, Federico M; Ballester, Miguel A González

    2018-05-02

    The use of 3D imaging has increased as a practical and useful tool for plastic and aesthetic surgery planning. Specifically, the possibility of representing the patient's breast anatomy as a 3D shape and simulating aesthetic or plastic procedures is a great tool for communication between surgeon and patient during surgery planning. For the purpose of obtaining the specific 3D model of the breast of a patient, model-based reconstruction methods can be used. In particular, 3D morphable models (3DMM) are a robust and widely used method to perform 3D reconstruction. However, if additional prior information (i.e., known landmarks) is combined with the 3DMM statistical model, shape constraints can be imposed to improve the 3DMM fitting accuracy. In this paper, we present a framework to fit a 3DMM of the breast to two possible inputs: 2D photos and 3D point clouds (scans). Our method consists of a Weighted Regularized (WR) projection into the shape space. The contribution of each point to the 3DMM shape is weighted, allowing more relevance to be assigned to those points that we want to impose as constraints. Our method is applied at multiple stages of the 3D reconstruction process. Firstly, it can be used to obtain a 3DMM initialization from a sparse set of 3D points. Additionally, we embed our method in the 3DMM fitting process, in which more reliable or already known 3D points or regions of points can be weighted in order to preserve their shape information. The proposed method has been tested in two different input settings, scans and 2D pictures, assessing both reconstruction frameworks with very positive results. Copyright © 2018 Elsevier B.V. All rights reserved.
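
    A stripped-down sketch of the underlying idea, assuming a linear PCA-style shape model x ≈ mean + basis · b: projecting an observed point set into the shape space then becomes a weighted, ridge-regularized least-squares problem in which per-point weights control how strongly each point acts as a constraint. This is only an illustration of the weighted regularized projection concept, not the authors' exact formulation.

        import numpy as np

        def weighted_regularized_projection(y, mean_shape, basis, weights, lam=1.0):
            """Project an observed shape y (3N,) into a linear shape space.

            basis   : (3N, k) matrix of shape modes (e.g. PCA eigenvectors)
            weights : (3N,) per-coordinate weights; large values enforce constraints
            lam     : ridge regularization strength on the shape coefficients
            Solves  min_b ||W^(1/2) (mean + basis @ b - y)||^2 + lam * ||b||^2
            """
            W = np.diag(weights)
            A = basis.T @ W @ basis + lam * np.eye(basis.shape[1])
            rhs = basis.T @ W @ (y - mean_shape)
            b = np.linalg.solve(A, rhs)
            return mean_shape + basis @ b, b

        # Toy example: 5 landmarks (15 coordinates) and a 3-mode shape basis.
        rng = np.random.default_rng(1)
        mean_shape = rng.normal(size=15)
        basis, _ = np.linalg.qr(rng.normal(size=(15, 3)))
        y = mean_shape + basis @ np.array([0.5, -0.2, 0.1]) + rng.normal(0, 0.05, 15)
        weights = np.ones(15)
        weights[:3] = 10.0  # treat the first landmark as a strongly trusted constraint
        fitted, coeffs = weighted_regularized_projection(y, mean_shape, basis, weights)
        print(coeffs)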

  20. Bayer CropScience model village project: Contributions to agricultural suppliers’ competitiveness and human development

    Directory of Open Access Journals (Sweden)

    Regina Moczadlo

    2014-12-01

    Bayer CropScience is carrying out a Model Village Project (MVP) in rural India as part of its supply chain management and corporate social responsibility activities. The MVP includes actions related to future business cases and higher competitiveness as well as philanthropic activities. The preparation of future business case actions aims at creating prerequisites for win-win situations. In the long run, these prerequisites, such as long-term business relations with suppliers based on trust from both sides, can lead to higher competitiveness of the whole supply chain and simultaneously improve human development. The impacts on the latter are evaluated using the capability approach (CA) developed by Amartya Sen (2000, c1999). The case of the MVP indicates the potential of companies to contribute to human development on a strategic win-win basis. Actions have to be distinguished based on the living and financial conditions different supplier groups face. In the future, the MVP aims at assessing whether and how MNCs may be able to combine competitive enhancement with human development, provided that potential corporate risks for the villagers' human development are also taken into account.

  1. Environmental impact assessment for uranium mine, mill and in situ leach projects

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-11-01

    Environmental impact assessments and/or statements are an inherent part of any uranium mining project and are a prerequisite for the future opening of an exploitation and its final closure and decommissioning. Since they contain all information related to the physical, biological, chemical and economic condition of the areas where industrial projects are proposed or planned, they present invaluable guidance for the planning and implementation of environmental mitigation as well as environmental restoration after the mine is closed. They further yield relevant data on the socio-economic impacts of a project. The present report provides guidance on the environmental impact assessment of uranium mining and milling projects, including in situ leach projects which will be useful for companies in the process of planning uranium developments as well as for the regional or national authorities who will assess such developments. Additional information and advice is given through environmental case histories from five different countries. Those case histories are not meant to be prescriptions for conducting assessments nor even firm recommendations, but should serve as examples for the type and extent of work involved in assessments. A model assessment and licensing process is recommended based on the experience of the five countries. 1 fig., 5 tabs.

  2. Environmental impact assessment for uranium mine, mill and in situ leach projects

    International Nuclear Information System (INIS)

    1997-11-01

    Environmental impact assessments and/or statements are an inherent part of any uranium mining project and are a prerequisite for the future opening of an exploitation and its final closure and decommissioning. Since they contain all information related to the physical, biological, chemical and economic condition of the areas where industrial projects are proposed or planned, they present invaluable guidance for the planning and implementation of environmental mitigation as well as environmental restoration after the mine is closed. They further yield relevant data on the socio-economic impacts of a project. The present report provides guidance on the environmental impact assessment of uranium mining and milling projects, including in situ leach projects which will be useful for companies in the process of planning uranium developments as well as for the regional or national authorities who will assess such developments. Additional information and advice is given through environmental case histories from five different countries. Those case histories are not meant to be prescriptions for conducting assessments nor even firm recommendations, but should serve as examples for the type and extent of work involved in assessments. A model assessment and licensing process is recommended based on the experience of the five countries

  3. Annotations to quantum statistical mechanics

    CERN Document Server

    Kim, In-Gee

    2018-01-01

    This book is a rewritten and annotated version of Leo P. Kadanoff and Gordon Baym’s lectures that were presented in the book Quantum Statistical Mechanics: Green’s Function Methods in Equilibrium and Nonequilibrium Problems. The lectures were devoted to a discussion on the use of thermodynamic Green’s functions in describing the properties of many-particle systems. The functions provided a method for discussing finite-temperature problems with no more conceptual difficulty than ground-state problems, and the method was equally applicable to boson and fermion systems and equilibrium and nonequilibrium problems. The lectures also explained nonequilibrium statistical physics in a systematic way and contained essential concepts on statistical physics in terms of Green’s functions with sufficient and rigorous details. In-Gee Kim thoroughly studied the lectures during one of his research projects but found that the unspecialized method used to present them in the form of a book reduced their readability. He st...

  4. Permafrost sub-grid heterogeneity of soil properties key for 3-D soil processes and future climate projections

    Directory of Open Access Journals (Sweden)

    Christian Beer

    2016-08-01

    There are massive carbon stocks stored in permafrost-affected soils due to the 3-D soil movement process called cryoturbation. For a reliable projection of the past, recent and future Arctic carbon balance, and hence climate, a reliable concept for representing cryoturbation in a land surface model (LSM) is required. The basis of the underlying transport processes is pedon-scale heterogeneity of soil hydrological and thermal properties as well as insulating layers, such as snow and vegetation. Today we still lack a concept of how to reliably represent pedon-scale properties and processes in an LSM. One possibility could be a statistical approach. This perspective paper demonstrates the importance of sub-grid heterogeneity in permafrost soils as a prerequisite to implementing any lateral transport parametrization. Representing such heterogeneity at the sub-pixel size of an LSM is the next logical step of model advancement. As a result of a theoretical experiment, heterogeneity of thermal and hydrological soil properties alone leads to a remarkable initial sub-grid range of subsoil temperature of 2 deg C, and of active-layer thickness of 150 cm, in East Siberia. These results show the way forward in representing combined lateral and vertical transport of water and soil in LSMs.

  5. Methods for Clustering Variables and the Use of them in Statistical Packages

    Czech Academy of Sciences Publication Activity Database

    Řezanková, H.; Húsek, Dušan

    2002-01-01

    Roč. 10, - (2002), s. 153-160 ISSN 1210-809X. [Applications of Mathematics and Statistics in Economy. Zadov, 13.09.2001-14.09.2001] R&D Projects: GA ČR GA201/01/1192 Institutional research plan: AV0Z1030915 Keywords : factor analysis * cluster analysis * multidimensional * statistical packages Subject RIV: BB - Applied Statistics, Operational Research

  6. Monitoring and Evaluation; Statistical Support for Life-cycle Studies, 2003 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Skalski, John

    2003-12-01

    This report summarizes the statistical analysis and consulting activities performed under Contract No. 00004134, Project No. 199105100, funded by Bonneville Power Administration during 2003. These efforts are focused on providing real-time predictions of outmigration timing, assessment of life-history performance measures, evaluation of status and trends in recovery, and guidance on the design and analysis of Columbia Basin fish and wildlife monitoring and evaluation studies. The overall objective of the project is to provide BPA and the rest of the fisheries community with statistical guidance on design, analysis, and interpretation of monitoring data, which will lead to improved monitoring and evaluation of salmonid mitigation programs in the Columbia/Snake River Basin. This overall goal is being accomplished by making fisheries data readily available for public scrutiny, providing statistical guidance on the design and analyses of studies through hands-on support and written documents, and providing real-time analyses of tagging results during the smolt outmigration for review by decision makers. For a decade, this project has been providing in-season projections of smolt outmigration timing to assist in spill management. As many as 50 different fish stocks at 8 different hydroprojects are tracked and analyzed in real time to predict the 'percent of run to date' and 'date to specific percentile'. The project also conducts added-value analyses of historical tagging data to understand relationships between fish responses, environmental factors, and anthropogenic effects. The statistical analysis of historical tagging data crosses agency lines in order to assimilate information on salmon population dynamics irrespective of origin. The lessons learned from past studies are used to improve the design and analyses of future monitoring and evaluation efforts. Through these efforts, the project attempts to provide the fisheries community with reliable analyses

  7. Projection of Climate Change Based on Multi-Site Statistical Downscaling over Gilan area, Iran

    Directory of Open Access Journals (Sweden)

    Vesta Afzali

    2017-01-01

    Introduction: The phenomenon of climate change and its consequences is a familiar topic, associated with natural disasters such as flooding, hurricanes and drought that cause water crises and irreparable damage. Studying this phenomenon is a serious warning regarding long-term change in the earth's climate. Materials and Methods: In order to understand and survey the impacts of climate change on water resources, Global Circulation Models (GCMs) are used; their main role is analyzing the current climate and projecting the future climate. Climate change scenarios developed from GCMs are the initial source of information for estimating plausible future climate. For transforming coarse-resolution outputs of the GCMs into finer resolutions influenced by local variables, reliable downscaling techniques are needed in order to analyze climate change in a region. Classical statistical methods run the model and generate the future climate considering only the time variable. Multi-site daily rainfall and temperature time series are the primary inputs in most hydrological analyses such as rainfall-runoff modeling. Water resource management is directly influenced by the spatial and temporal variation of rainfall and temperature. Therefore, spatial-temporal modeling of daily rainfall or temperature including climate change effects is required for sustainable planning of water resources. Results and Discussion: For the first time, in this study the ASD model (Automated regression-based Statistical Downscaling tool) developed by M. Hessami et al. was used for multi-site downscaling of temperature and precipitation with CGCM3.1A2 outputs and two synoptic stations (Rasht and Bandar Anzali) simultaneously, considering the correlations of multiple sites. The model can process conditionally on the occurrence of precipitation, or unconditionally for temperature. Hence, the modeling of daily precipitation involves two steps: one step

  8. [Delirium in stroke patients : Critical analysis of statistical procedures for the identification of risk factors].

    Science.gov (United States)

    Nydahl, P; Margraf, N G; Ewers, A

    2017-04-01

    Delirium is a relevant complication following an acute stroke. It is a multifactorial occurrence with numerous interacting risk factors that mutually influence each other. The risk factors for delirium in stroke patients are often based on limited clinical studies. The statistical procedures and clinical relevance of delirium-related risk factors in adult stroke patients should therefore be questioned. This secondary analysis includes clinically relevant studies that give evidence for the clinical relevance and statistical significance of delirium-associated risk factors in stroke patients. The quality of the reporting of regression analyses was assessed using Ottenbacher's quality criteria. The delirium-associated risk factors identified were examined with regard to statistical significance using the Bonferroni method for multiple testing, to guard against falsely positive hypotheses. This was followed by a literature-based discussion on clinical relevance. Nine clinical studies were included. None of the studies fulfilled all the prerequisites and assumptions given for the reporting of regression analyses according to Ottenbacher. Of the 108 delirium-associated risk factors, a total of 48 (44.4%) were significant, of which 28 (58.3%) were false positives after Bonferroni correction. Following a literature-based discussion on clinical relevance, the assumption of statistical significance and clinical relevance could be upheld for only four risk factors (dementia or cognitive impairment, total anterior infarct, severe infarct and infections). The statistical procedures used in the existing literature are questionable, as are their results. A post-hoc analysis and critical appraisal reduced the number of possible delirium-associated risk factors to just a few clinically relevant factors.
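
    For reference, the Bonferroni step used above is straightforward to reproduce: with m tests, each raw p-value is compared against alpha/m (equivalently, p-values are multiplied by m and capped at 1). A minimal sketch follows; the p-values are invented purely for illustration.

        import numpy as np

        def bonferroni(p_values, alpha=0.05):
            """Return Bonferroni-adjusted p-values and a mask of tests still significant."""
            p = np.asarray(p_values, dtype=float)
            m = p.size
            adjusted = np.minimum(p * m, 1.0)
            return adjusted, p < alpha / m

        # Hypothetical raw p-values for delirium-associated risk factors.
        raw = [0.001, 0.004, 0.02, 0.03, 0.049]
        adjusted, significant = bonferroni(raw)
        print(adjusted)     # 0.005, 0.02, 0.1, 0.15, 0.245
        print(significant)  # only the first two survive the correction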

  9. Paradigm shifts and other prerequisites to facilitate the institutionalising of strategy in South African organisations

    Directory of Open Access Journals (Sweden)

    S. Kruger

    2003-12-01

    South African organisations must undergo a mind shift and adhere to certain prerequisites in order to survive and be successful. It is evident that companies that do not change their mindsets will not survive, nor be able to create a sustainable competitive advantage and compete in world markets. Companies have to solve new problems with new paradigms, constantly create something better or new, and create new markets as opposed to merely increasing market share. The Third Wave development will lead to societal transformation. Moving to the Third Wave will require growth organisations to act like small entrepreneurial businesses that have the benefit of speed and simplicity but are also able to implement strategy more effectively. Time is of the essence, and South African companies have no option but to move swiftly. The transformation from the Second to the Third Wave is inevitable.

  10. Uncertainties of statistical downscaling from predictor selection: Equifinality and transferability

    Science.gov (United States)

    Fu, Guobin; Charles, Stephen P.; Chiew, Francis H. S.; Ekström, Marie; Potter, Nick J.

    2018-05-01

    The nonhomogeneous hidden Markov model (NHMM) statistical downscaling model, 38 catchments in southeast Australia and 19 general circulation models (GCMs) were used in this study to demonstrate statistical downscaling uncertainties caused by equifinality and transferability. That is to say, there could be multiple sets of predictors that give similar daily rainfall simulation results for both calibration and validation periods, but project different magnitudes (or even directions) of rainfall change in the future. Results indicated that two sets of predictors (Set 1 with predictors of sea level pressure north-south gradient, u-wind at 700 hPa, v-wind at 700 hPa, and specific humidity at 700 hPa, and Set 2 with predictors of sea level pressure north-south gradient, u-wind at 700 hPa, v-wind at 700 hPa, and dewpoint temperature depression at 850 hPa) as inputs to the NHMM produced satisfactory simulations of seasonal rainfall in comparison with observations. For example, during the model calibration period, the relative errors across the 38 catchments ranged from 0.48 to 1.76% with a mean value of 1.09% for predictor Set 1, and from 0.22 to 2.24% with a mean value of 1.16% for predictor Set 2. However, NHMM projections of future rainfall based on the 19 GCMs had a different sign for the two sets of predictors: Set 1 predictors project an increase of future rainfall, with magnitudes depending on future time periods and emission scenarios, but Set 2 predictors project a decline of future rainfall. Such divergent projections may present a significant challenge for applications of statistical downscaling as well as climate change impact studies, and could potentially imply caveats in many existing studies in the literature.

  11. A statistical adjustment approach for climate projections of snow conditions in mountain regions using energy balance land surface models

    Science.gov (United States)

    Verfaillie, Deborah; Déqué, Michel; Morin, Samuel; Lafaysse, Matthieu

    2017-04-01

    Projections of future climate change have been increasingly called for lately, as the reality of climate change has been gradually accepted and societies and governments have started to plan upcoming mitigation and adaptation policies. In mountain regions such as the Alps or the Pyrenees, where winter tourism and hydropower production are large contributors to the regional revenue, particular attention is paid to current and future snow availability. The question of the vulnerability of mountain ecosystems as well as the occurrence of climate-related hazards such as avalanches and debris flows is also under consideration. In order to generate projections of snow conditions, however, downscaling global climate models (GCMs) by using regional climate models (RCMs) is not sufficient to capture the fine-scale processes and thresholds at play. In particular, the altitudinal resolution matters, since the phase of precipitation is mainly controlled by temperature, which is altitude-dependent. Simulations from GCMs and RCMs moreover suffer from biases compared to local observations, due to their rather coarse spatial and altitudinal resolution, and often provide outputs at too coarse a time resolution to drive impact models. RCM simulations must therefore be adjusted using empirical-statistical downscaling and error correction methods before they can be used to drive specific models such as energy balance land surface models. In this study, time series of hourly temperature, precipitation, wind speed, humidity, and short- and longwave radiation were generated over the Pyrenees and the French Alps for the period 1950-2100, by using a new approach (named ADAMONT for ADjustment of RCM outputs to MOuNTain regions) based on quantile mapping applied to daily data, followed by time disaggregation accounting for weather pattern selection. We first introduce a thorough evaluation of the method using model runs from the ALADIN RCM driven by a global reanalysis over the
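
    The core adjustment step in approaches of this kind is quantile mapping: model values are mapped onto the observed distribution via calibration-period quantiles. The sketch below shows only this generic empirical step for a single variable, not the full ADAMONT chain with weather-pattern selection and time disaggregation; the data are synthetic.

        import numpy as np

        def empirical_quantile_map(obs_cal, mod_cal, mod_new, n_q=100):
            """Map model values onto the observed distribution (empirical quantile mapping).

            obs_cal : observations over the calibration period
            mod_cal : model output over the same calibration period
            mod_new : model output to be adjusted (e.g. a future scenario)
            """
            q = np.linspace(0.0, 1.0, n_q)
            obs_q = np.quantile(obs_cal, q)
            mod_q = np.quantile(mod_cal, q)
            # Locate each new model value in the model CDF and read off the
            # corresponding observed quantile.
            return np.interp(mod_new, mod_q, obs_q)

        # Toy example: the model runs 2 degrees too cold with too little spread.
        rng = np.random.default_rng(2)
        obs = rng.normal(0.0, 5.0, 5000)
        mod_hist = rng.normal(-2.0, 3.5, 5000)
        mod_future = rng.normal(-1.0, 3.5, 5000)
        adjusted = empirical_quantile_map(obs, mod_hist, mod_future)
        print(round(mod_future.mean(), 2), round(adjusted.mean(), 2))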

  12. Tucker Tensor analysis of Matern functions in spatial statistics

    KAUST Repository

    Litvinenko, Alexander

    2018-03-09

    In this work, we describe advanced numerical tools for working with multivariate functions and for the analysis of large data sets. These tools will drastically reduce the required computing time and the storage cost, and, therefore, will allow us to consider much larger data sets or finer meshes. Covariance matrices are crucial in spatio-temporal statistical tasks, but are often very expensive to compute and store, especially in 3D. Therefore, we approximate covariance functions by cheap surrogates in a low-rank tensor format. We apply the Tucker and canonical tensor decompositions to a family of Matern- and Slater-type functions with varying parameters and demonstrate numerically that their approximations exhibit exponentially fast convergence. We prove the exponential convergence of the Tucker and canonical approximations with respect to the tensor rank parameters. Several statistical operations are performed in this low-rank tensor format, including evaluating the conditional covariance matrix, spatially averaged estimation variance, computing a quadratic form, determinant, trace, loglikelihood, inverse, and Cholesky decomposition of a large covariance matrix. Low-rank tensor approximations reduce the computing and storage costs substantially. For example, the storage cost is reduced from an exponential O(n^d) to a linear scaling O(drn), where d is the spatial dimension, n is the number of mesh points in one direction, and r is the tensor rank. Prerequisites for applicability of the proposed techniques are the assumptions that the data, locations, and measurements lie on a tensor (axes-parallel) grid and that the covariance function depends on a distance, ||x-y||.
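
    As a small illustration of the kind of compression involved (not the paper's algorithm or its error analysis), the sketch below evaluates a smooth, distance-dependent kernel on a 3-D axes-parallel grid and compresses the resulting tensor with a plain HOSVD-style Tucker decomposition in NumPy; the kernel, grid size and ranks are arbitrary choices.

        import numpy as np

        def unfold(T, mode):
            """Mode-n unfolding of a 3-D tensor into a matrix."""
            return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

        def tucker_hosvd(T, ranks):
            """Truncated HOSVD: factor matrices from leading left singular vectors."""
            U = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
                 for m, r in enumerate(ranks)]
            core = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])
            return core, U

        def tucker_reconstruct(core, U):
            return np.einsum('abc,ia,jb,kc->ijk', core, U[0], U[1], U[2])

        # Smooth kernel of the distance from a reference point, on an n^3 grid.
        n = 32
        x = np.linspace(0.0, 1.0, n)
        X, Y, Z = np.meshgrid(x, x, x, indexing='ij')
        T = np.exp(-np.sqrt((X - 0.3)**2 + (Y - 0.5)**2 + (Z - 0.7)**2))

        core, U = tucker_hosvd(T, ranks=(6, 6, 6))
        rel_err = np.linalg.norm(T - tucker_reconstruct(core, U)) / np.linalg.norm(T)
        print(rel_err)  # modest ranks already give a small relative error
        # Storage drops from n^3 values to r^3 + 3*n*r in the Tucker format.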

  13. Neuroinflammation is not a prerequisite for diabetes-induced tau phosphorylation

    Directory of Open Access Journals (Sweden)

    Judith M Van Der Harg

    2015-11-01

    Abnormal phosphorylation and aggregation of tau is a key hallmark of Alzheimer's disease (AD). AD is a multifactorial neurodegenerative disorder for which Diabetes Mellitus (DM) is a risk factor. In animal models of DM, the phosphorylation and aggregation of tau is induced or exacerbated; however, the underlying mechanism is unknown. In addition to metabolic dysfunction, DM is characterized by chronic low-grade inflammation, which was reported to be associated with a neuroinflammatory response in the hypothalamus of DM animal models. Neuroinflammation is also implicated in the development and progression of AD. It is unknown whether DM also induces neuroinflammation in brain areas affected in AD, the cortex and hippocampus. Here we investigated whether neuroinflammation could be the mechanistic trigger to induce tau phosphorylation in the brain of DM animals. Two distinct diabetic animal models were used: rats on a free-choice high-fat high-sugar (fcHFHS) diet that are insulin resistant, and streptozotocin-treated rats that are insulin deficient. The streptozotocin-treated animals demonstrated increased tau phosphorylation in the brain as expected, whereas the fcHFHS diet fed animals did not. Remarkably, neither of the diabetic animal models showed reactive microglia or increased GFAP and COX-2 levels in the cortex or hippocampus. From this, we conclude: 1. DM does not induce neuroinflammation in brain regions affected in AD, and 2. Neuroinflammation is not a prerequisite for tau phosphorylation. Neuroinflammation is therefore not the mechanism that explains the close connection between DM and AD.

  14. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    International Nuclear Information System (INIS)

    Reed, J.K.

    1999-01-01

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and 'expert' data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) Development of a groundwater quality baseline prior to remediation startup, (2) Targeting of constituents for removal from the RCRA GWPS, (3) Targeting of constituents for removal from the UIC permit, (4) Targeting of constituents for reduced monitoring, (5) Targeting of monitoring wells not producing representative samples, (6) Reduction in statistical evaluation, and (7) Identification of contamination from other facilities

  15. What Causes Cost Overrun in Transport Infrastructure Projects?

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren L.

    This article presents results from the first statistically significant study of causes of cost escalation in transport infrastructure projects. The study is based on a sample of 258 rail, bridge, tunnel and road projects worth US$90 billion. The focus is on the dependence of cost escalation on (1) length of project implementation phase, (2) size of project and (3) type of project ownership. First, it is found with very high statistical significance that cost escalation is strongly dependent on length of implementation phase. The policy implications are clear: decision makers and planners should... Comparing cost escalation for three types of project ownership - private, state-owned enterprise and other public ownership - it is shown that the oft-seen claim that public ownership is problematic and private ownership effective in curbing cost escalation is an oversimplification. Type of accountability...

  16. Improved custom statistics visualization for CA Performance Center data

    CERN Document Server

    Talevi, Iacopo

    2017-01-01

    The main goal of my project is to understand and explore the possibilities that CA Performance Center (CA PC) offers for creating custom applications to display stored information through interesting visual means, such as maps. In particular, I have re-written some of the network statistics web pages in order to fetch data from the new statistics modules in CA PC, which has its own API, and to stop using the RRD data.

  17. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

    Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : counting process * compound process * hazard function * Cox model Subject RIV: BB - Applied Statistics, Operational Research

  18. Mathematics Prerequisites for Introductory Geoscience Courses: Using Technology to Help Solve the Problem

    Science.gov (United States)

    Burn, H. E.; Wenner, J. M.; Baer, E. M.

    2011-12-01

    The quantitative components of introductory geoscience courses can pose significant barriers to students. Many academic departments respond by stripping courses of their quantitative components or by attaching prerequisite mathematics courses [PMC]. PMCs cause students to incur additional costs and credits and may deter enrollment in introductory courses; yet, stripping quantitative content from geoscience courses masks the data-rich, quantitative nature of geoscience. Furthermore, the diversity of math skills required in geoscience and students' difficulty with transferring mathematical knowledge across domains suggest that PMCs may be ineffective. Instead, this study explores an alternative strategy -- to remediate students' mathematical skills using online modules that provide students with opportunities to build contextual quantitative reasoning skills. The Math You Need, When You Need It [TMYN] is a set of modular online student resources that address mathematical concepts in the context of the geosciences. TMYN modules are online resources that employ a "just-in-time" approach - giving students access to skills and then immediately providing opportunities to apply them. Each module places the mathematical concept in multiple geoscience contexts. Such an approach illustrates the immediate application of a principle and provides repeated exposure to a mathematical skill, enhancing long-term retention. At the same time, placing mathematics directly in several geoscience contexts better promotes transfer of learning by using similar discourse (words, tools, representations) and context that students will encounter when applying mathematics in the future. This study uses quantitative and qualitative data to explore the effectiveness of TMYN modules in remediating students' mathematical skills. Quantitative data derive from ten geoscience courses that used TMYN modules during the fall 2010 and spring 2011 semesters; none of the courses had a PMC. In all courses

  19. Statistical significance of trends in monthly heavy precipitation over the US

    KAUST Repository

    Mahajan, Salil

    2011-05-11

    Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
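
    A simplified, hedged sketch of the resampling idea (not the paper's exact procedure): the observed least-squares trend of a monthly heavy-precipitation series is compared against a null distribution of trends obtained by bootstrapping the series, which destroys any temporal ordering. The synthetic series below is purely illustrative.

        import numpy as np

        def bootstrap_trend_pvalue(series, n_boot=5000, seed=0):
            """Two-sided p-value for a linear trend via non-parametric bootstrapping."""
            rng = np.random.default_rng(seed)
            t = np.arange(series.size)
            observed_slope = np.polyfit(t, series, 1)[0]
            null_slopes = np.empty(n_boot)
            for i in range(n_boot):
                resampled = rng.choice(series, size=series.size, replace=True)
                null_slopes[i] = np.polyfit(t, resampled, 1)[0]
            p_value = np.mean(np.abs(null_slopes) >= abs(observed_slope))
            return observed_slope, p_value

        # Synthetic monthly heavy-precipitation values with a weak upward trend.
        rng = np.random.default_rng(3)
        months = 600
        series = 50 + 0.02 * np.arange(months) + rng.gumbel(0.0, 8.0, months)
        slope, p = bootstrap_trend_pvalue(series)
        print(slope, p)  # a small p suggests the trend is unlikely under the null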

  20. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    Science.gov (United States)

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities were implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
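
    A generic individuals (I) control chart, of the kind used to flag special-cause variation in a monthly mean-HbA1c series, can be sketched as below. The moving-range estimate of sigma and the 3-sigma limits follow standard Shewhart practice; the data are invented and this is not the project's actual chart.

        import numpy as np

        def individuals_chart(values):
            """Shewhart individuals chart: center line and 3-sigma control limits."""
            values = np.asarray(values, dtype=float)
            center = values.mean()
            moving_range = np.abs(np.diff(values))
            sigma_hat = moving_range.mean() / 1.128  # d2 constant for subgroups of size 2
            return center, center - 3 * sigma_hat, center + 3 * sigma_hat

        # Hypothetical monthly mean HbA1c (%) for the diabetes panel.
        monthly_hba1c = [9.6, 9.4, 9.7, 9.5, 9.6, 9.3, 9.0, 8.8, 8.7, 8.5, 8.6, 8.4]
        center, lcl, ucl = individuals_chart(monthly_hba1c[:6])  # baseline months only
        signals = [x for x in monthly_hba1c[6:] if x < lcl or x > ucl]
        print(center, lcl, ucl)
        print(signals)  # post-intervention points beyond the limits indicate special cause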

  1. Multivariate statistical assessment of coal properties

    Czech Academy of Sciences Publication Activity Database

    Klika, Z.; Serenčíšová, J.; Kožušníková, Alena; Kolomazník, I.; Študentová, S.; Vontorová, J.

    2014-01-01

    Roč. 128, č. 128 (2014), s. 119-127 ISSN 0378-3820 R&D Projects: GA MŠk ED2.1.00/03.0082 Institutional support: RVO:68145535 Keywords : coal properties * structural,chemical and petrographical properties * multivariate statistics Subject RIV: DH - Mining, incl. Coal Mining Impact factor: 3.352, year: 2014 http://dx.doi.org/10.1016/j.fuproc.2014.06.029

  2. Software Engineering and eLearning: The MuSofT Project - www.musoft.org

    Directory of Open Access Journals (Sweden)

    Ernst-Erich Doberkat

    2005-12-01

    eLearning supports education in certain disciplines. Here, we report on novel eLearning concepts, techniques, and tools to support education in Software Engineering, a subdiscipline of computer science. We call this "Software Engineering eLearning". On the other side, software support is a substantial prerequisite for eLearning in any discipline. Thus, Software Engineering techniques have to be applied to develop and maintain those software systems. We call this "eLearning Software Engineering". Both aspects have been investigated in a large joint, BMBF-funded research project, termed MuSofT (Multimedia in Software Engineering). The main results are summarized in this paper.

  3. Indoor location-based services prerequisites and foundations

    CERN Document Server

    Werner, Martin

    2014-01-01

    This book delivers concise coverage of classical methods and new developments related to indoor location-based services. It collects results from isolated domains including geometry, artificial intelligence, statistics, cooperative algorithms, and distributed systems and thus provides an accessible overview of fundamental methods and technologies. This makes it an ideal starting point for researchers, students, and professionals in pervasive computing. Location-based services are services using the location of a mobile computing device as their primary input. While such services are fairly e

  4. Statistical variability of hydro-meteorological variables as indicators ...

    African Journals Online (AJOL)

    Statistical variability of hydro-meteorological variables as indicators of climate change in north-east Sokoto-Rima basin, Nigeria. ... water resources development including water supply project, agriculture and tourism in the study area. Key word: Climate change, Climatic variability, Actual evapotranspiration, Global warming ...

  5. Statistics and the statistician in nuclear site decontamination and decommissioning: lecture notes for a 4-day short course

    International Nuclear Information System (INIS)

    Barnes, M.G.

    1981-04-01

    This course provides information on statistical aspects of radiological surveys and the remedial action (RA) process to enable statisticians to serve effectively in Decommissioning and Decontamination (D and D) projects. The material covered is pertinent both to a statistician working onsite and handling day-to-day data analysis and to a statistical member of project planning and management teams. Participants will learn the types of statistical problems that can arise and the kinds of questions that must be answered to enable efficient statistical designs and analyses to be developed. The ultimate goal is to help extract the maximum benefit from statistical contributions to D and D projects. Among the topics covered are: Nature of the D and D Problem: Measuring Devices and Sampling Methods; Estimating Activity in or on Soil, Buildings, Equipment and Other Materials; Estimating Isotope Ratios and Error Variances; Acceptance Sampling as a General Method; Special Estimation Problems; Non-Statistical Considerations; Challenge Problems

  6. Derivation from first principles of the statistical distribution of the mass peak intensities of MS data.

    Science.gov (United States)

    Ipsen, Andreas

    2015-02-03

    Despite the widespread use of mass spectrometry (MS) in a broad range of disciplines, the nature of MS data remains very poorly understood, and this places important constraints on the quality of MS data analysis as well as on the effectiveness of MS instrument design. In the following, a procedure for calculating the statistical distribution of the mass peak intensity for MS instruments that use analog-to-digital converters (ADCs) and electron multipliers is presented. It is demonstrated that the physical processes underlying the data-generation process, from the generation of the ions to the signal induced at the detector, and on to the digitization of the resulting voltage pulse, result in data that can be well-approximated by a Gaussian distribution whose mean and variance are determined by physically meaningful instrumental parameters. This allows for a very precise understanding of the signal-to-noise ratio of mass peak intensities and suggests novel ways of improving it. Moreover, it is a prerequisite for being able to address virtually all data analytical problems in downstream analyses in a statistically rigorous manner. The model is validated with experimental data.
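
    A toy Monte Carlo along these lines (not the paper's derivation) makes the Gaussian approximation plausible: ion arrivals are drawn from a Poisson distribution, each ion contributes a random single-ion gain, and ADC/electronic noise is added. The Poisson rate, the exponential gain model and the noise level are all illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate_peak_intensity(n_spectra, mean_ions, mean_gain, adc_sigma):
            """Simulate the summed detector response for one mass peak across many spectra."""
            ions = rng.poisson(mean_ions, size=n_spectra)           # ions reaching the detector
            # Sum of per-ion electron-multiplier gains (exponential gain is an assumption).
            signal = np.array([rng.exponential(mean_gain, n).sum() for n in ions])
            return signal + rng.normal(0.0, adc_sigma, size=n_spectra)  # digitization noise

        intensity = simulate_peak_intensity(n_spectra=20000, mean_ions=200,
                                            mean_gain=50.0, adc_sigma=30.0)
        print(intensity.mean(), intensity.std())
        # With many ions per peak, the histogram of `intensity` is close to Gaussian,
        # with mean and variance set by the instrumental parameters above.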

  7. Metallogenic geologic prerequisites of sandstone-type uranium deposits and target area selection. Taking Erlian and Ordos basins as examples

    International Nuclear Information System (INIS)

    Chen Fazheng

    2002-01-01

    Sandstone-type uranium deposit is the main target of recent uranium prospecting and exploration. According to the metallogenic characteristics, sandstone-type uranium deposits are divided into three groups: paleo-channel type, interlayer oxidation zone type and phreatic interlayer oxidation type. The author makes an analysis on the geologic prerequisites of the three types of uranium deposits, the similarities and difference, and preliminarily summarizes genetic models of different types of uranium deposits. Finally, taking Erlian and Ordos basins as examples, the author makes an evaluation and a strategic analysis on the uranium metallogenic prospect of the above two basins

  8. Development of a funding, cost, and spending model for satellite projects

    Science.gov (United States)

    Johnson, Jesse P.

    1989-01-01

    The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort to extend the modeling capabilities from total budget analysis to total budget and budget outlays over time analysis was conducted. A statistically based and data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in each specific year and their 1989 dollar equivalent. This data was converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data was analyzed to find a model in the generic form of a LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage of total budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
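
    The cumulative spending profile described here can be illustrated by fitting a Weibull cumulative distribution function with nonlinear least squares. This is a hedged sketch with made-up outlay data; the actual RAO model, its parameterization and the probability-space transformation are not reproduced.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull_cdf(t, shape, scale):
            """Fraction of total budget spent by normalized project time t."""
            return 1.0 - np.exp(-(t / scale) ** shape)

        # Hypothetical cumulative outlays (fraction of total budget) over the project life.
        t = np.linspace(0.1, 1.0, 10)                      # fraction of schedule elapsed
        spent = np.array([0.02, 0.07, 0.16, 0.30, 0.46,
                          0.62, 0.76, 0.87, 0.94, 0.98])   # fraction of budget spent

        (shape, scale), _ = curve_fit(weibull_cdf, t, spent, p0=(2.0, 0.5))
        print(shape, scale)
        # Inverting the fitted curve answers questions such as: at what point in the
        # schedule is 50% of the budget expected to have been spent?
        print(scale * (-np.log(1 - 0.5)) ** (1 / shape))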

  9. Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs

    Science.gov (United States)

    Carr, Nathan T.

    2008-01-01

    Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…

  10. the human genome project

    African Journals Online (AJOL)

    Enrique

    have resulted in the biological diversity, both past and present, on this planet. ... Consortium. He is principal investigator on ... Leeuwenhoek was a necessary prerequisite to the vast array of high-definition .... with evidence left at crime scenes.

  11. Concept Specifications/Prerequisites for DeepWind Deliverable D8.1

    DEFF Research Database (Denmark)

    Schmidt Paulsen, Uwe; Schløer, Signe; Larsén, Xiaoli Guo

    The work is a result of the contributions within the DeepWind project, which is supported by the European Commission, Grant 256769 FP7 Energy 2010 - Future emerging technologies, and by the DeepWind beneficiaries: DTU(DK), AAU(DK), TUDELFT(NL), TUTRENTO(I), DHI(DK), SINTEF(N), MARINTEK(N), MARIN(NL), NREL(USA), STATOIL(N), VESTAS(DK) and NENUPHAR(F). The report discusses the design considerations for offshore wind turbines, both in general and specifically for Darrieus-type floating turbines, as is the focus of the DeepWind project. The project is considered in a North Sea environment, notably close...

  12. DEVELOPMENT OF THE SOCIALLY-ORIENTED ECONOMY IN UKRAINE: PREREQUISITES AND STRATEGIC FORECASTING

    Directory of Open Access Journals (Sweden)

    Nataliia Kravchuk

    2017-11-01

    The purpose of this article is to substantiate the prerequisites and prospects for the formation and development of a socially-oriented economy in Ukraine. The theoretical and methodological background of the research consists of the systemic and synergetic scientific approaches, whose use in researching the development of a socially-oriented economy is determined by an orientation toward the basic values and fundamentals of the market economic system and a focus on building a democratic society in Ukraine. Scientific results: it is substantiated that, under modern conditions of transformational change, Ukraine's strategic course is to combine mechanisms of market self-regulation and state regulation, based on the principles of a socially-oriented economy. Such a model of the economy is focused on people, their needs and interests, and their relations with other members of society during production, distribution, exchange, and consumption. At the same time, social orientation foresees highly productive economic management that combines private initiative and competition. The results of the analysis of analytical indicators of the socio-economic development of Ukraine show an urgent and objective necessity of forming a socially-oriented economy. It is argued that, taking into account the increasing openness of the national economy, it is appropriate to consider the strict conditions concerning its adequacy to the requirements of current international competition. In this regard, the article analyses the vectors of the formation of a socially-oriented economy in Ukraine declared by the Strategy of Sustainable Development "Ukraine – 2020", within which a key reference point is the introduction of European standards of living in the country and the achievement of a dominant position in the world. An economic platform for the formation and development of the socially-oriented economy includes the following: providing economic freedom; availability of

  13. Associate Degree Nursing: Model Prerequisites Validation Study. California Community College Associate Degree Programs by The Center for Student Success, A Health Care Initiative Sponsored Project.

    Science.gov (United States)

    Phillips, Brad C.; Spurling, Steven; Armstrong, William A.

    California faces a severe nursing shortage, with the number of registered nurses far below what is required to avert a potential state health care crisis. The Associate Degree Nursing (ADN) Project is a joint project involving scholars, educational researchers, and analysts from the Center for Student Success (CSS) housed at City College of San…

  14. Challenges for statistics teaching and teacher’s training in Mexico

    Directory of Open Access Journals (Sweden)

    Sergio Hernández González

    2013-08-01

    This work covers the problems found in teacher training and professional development in Probability and Statistics in higher education in Mexico. It is approached through four focuses: (a) the characterization and training of teachers who drive the development and implementation of curriculum reforms in the teaching of Statistics; (b) the challenges teachers face in the instruction of university-level Statistics; (c) new curricular reforms with respect to the instruction of Statistics that propose the development of project-based learning through the use of appropriate statistical software; and (d) educational innovation as a body of knowledge in development, through which networks of professors who favor the emergence of real innovation take shape. Starting from these perspectives, the challenges confronted in the teaching and training of Statistics professors are set out.

  15. Statistical methods for quantitative indicators of impacts, applied to transmission line projects

    International Nuclear Information System (INIS)

    Ospina Norena, Jesus Efren; Lema Tapias, Alvaro de Jesus

    2005-01-01

    Multivariate statistical analyses are proposed for uncovering the relationships between variables and impacts, in order to obtain high explanatory power for the interpretation of causes and effects, achieve the highest certainty possible, and evaluate and classify impacts by their level of influence

  16. Spike sorting using locality preserving projection with gap statistics and landmark-based spectral clustering.

    Science.gov (United States)

    Nguyen, Thanh; Khosravi, Abbas; Creighton, Douglas; Nahavandi, Saeid

    2014-12-30

    Understanding neural functions requires knowledge from analysing electrophysiological data. The process of assigning spikes of a multichannel signal into clusters, called spike sorting, is one of the important problems in such analysis. There have been various automated spike sorting techniques with both advantages and disadvantages regarding accuracy and computational costs. Therefore, developing spike sorting methods that are highly accurate and computationally inexpensive is always a challenge in biomedical engineering practice. An automatic unsupervised spike sorting method is proposed in this paper. The method uses features extracted by the locality preserving projection (LPP) algorithm. These features afterwards serve as inputs for the landmark-based spectral clustering (LSC) method. Gap statistics (GS) is employed to evaluate the number of clusters before the LSC can be performed. The proposed LPP-LSC is a highly accurate and computationally inexpensive spike sorting approach. LPP spike features are very discriminative and thereby boost the performance of clustering methods. Furthermore, the LSC method exhibits its efficiency when integrated with the cluster evaluator GS. The proposed method's accuracy is approximately 13% superior to that of the benchmark combination of wavelet transformation and superparamagnetic clustering (WT-SPC). Additionally, LPP-LSC computing time is six times less than that of the WT-SPC. LPP-LSC obviously demonstrates a win-win spike sorting solution meeting both accuracy and computational cost criteria. LPP and LSC are linear algorithms that help reduce computational burden and thus their combination can be applied to real-time spike analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
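
    To illustrate just the cluster-number selection step, a bare-bones version of the gap statistic with k-means (used here as a stand-in for the landmark-based spectral clustering of the paper) might look as follows; the spike-feature data are simulated.

        import numpy as np
        from sklearn.cluster import KMeans

        def gap_statistic(X, k_max=6, n_ref=10, seed=0):
            """Return gap values for k = 1..k_max (a larger gap supports that k)."""
            rng = np.random.default_rng(seed)
            mins, maxs = X.min(axis=0), X.max(axis=0)
            gaps = []
            for k in range(1, k_max + 1):
                log_wk = np.log(KMeans(n_clusters=k, n_init=10,
                                       random_state=seed).fit(X).inertia_)
                ref_log_wk = [
                    np.log(KMeans(n_clusters=k, n_init=10, random_state=seed)
                           .fit(rng.uniform(mins, maxs, size=X.shape)).inertia_)
                    for _ in range(n_ref)  # uniform reference data sets
                ]
                gaps.append(np.mean(ref_log_wk) - log_wk)
            return np.array(gaps)

        # Simulated 2-D spike features forming three clusters.
        rng = np.random.default_rng(4)
        X = np.vstack([rng.normal(c, 0.3, size=(100, 2))
                       for c in ((0, 0), (3, 0), (0, 3))])
        print(np.argmax(gap_statistic(X)) + 1)  # suggested number of clusters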

  17. Statistical Engineering in Air Traffic Management Research

    Science.gov (United States)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  18. Education and training as prerequisites for overcoming the difficulties in the implementation of ethical and legal norms concerning gender equality in a social environment

    Directory of Open Access Journals (Sweden)

    Gavrilović Danijela

    2008-01-01

    Full Text Available In this paper, the author advances the thesis that in today's Serbia there is no social consensus concerning the unequal treatment of men and women, and that the 'patriarchal syndrome', stereotypes and prejudices are still widely present and greatly influence the functioning of social mechanisms and the achievement of gender equality. In Serbia the process of achieving the equal treatment of women de jure is still in progress. In the absence of consensus, which is a prerequisite for 'transmitting' the social values encompassed by gender equality, the chances are small that equality will be attained de facto. This paper is meant as a warning that not all types of women's inequality are easily noticeable, and that many different and intertwined social actors on the social scene influence how the problem of inequality is dealt with, how international and domestic legal acts and ethical standards are implemented, and what steps are taken to introduce mechanisms for achieving women's equality in society. One of the prerequisites for overcoming these difficulties is a system of education and educational resources that promote the idea of gender equality.

  19. Learning Statistics at the Farmers Market? A Comparison of Academic Service Learning and Case Studies in an Introductory Statistics Course

    Science.gov (United States)

    Hiedemann, Bridget; Jones, Stacey M.

    2010-01-01

    We compare the effectiveness of academic service learning to that of case studies in an undergraduate introductory business statistics course. Students in six sections of the course were assigned either an academic service learning project (ASL) or business case studies (CS). We examine two learning outcomes: students' performance on the final…

  20. Demonstration project: Load management on the user side at power shortages

    International Nuclear Information System (INIS)

    Lindskoug, Stefan

    2005-10-01

    The risk of power shortages during extremely cold weather has increased in Sweden. It is noted that high electricity spot prices are important for holding down demand: through consumers' higher price sensitivity, the electricity system can be operated with a lower reserve capacity. The objective of the demonstration project is to show methods for reducing electricity demand at the national level when spot prices are high. An important prerequisite is that the measures must be profitable for all parties involved. Four separate studies were made: two concerning households, one industry, and one the district heating sector. The conclusion from the studies is that load management on the customer's side is an economic alternative to investment in new production capacity

  1. All projects related to Ghana | Page 4 | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Climate change constitutes a real threat to the livelihood and well-being of the Ghanaian population. ... HEALTH STATISTICS, STATISTICAL DATA, STATISTICAL ANALYSIS ... Impact of Foreign Direct Investment Flows on Poverty in Ghana. Project. Ghana will need considerable external assistance to achieve its Poverty ...

  2. Rényi statistics for testing composite hypotheses in general exponential models

    Czech Academy of Sciences Publication Activity Database

    Morales, D.; Pardo, L.; Pardo, M. C.; Vajda, Igor

    2004-01-01

    Roč. 38, č. 2 (2004), s. 133-147 ISSN 0233-1888 R&D Projects: GA ČR GA201/02/1391 Grant - others:BMF(ES) 2003-00892; BMF(ES) 2003-04820 Institutional research plan: CEZ:AV0Z1075907 Keywords : natural exponential models * Levy processes * generalized Wald statistics Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.323, year: 2004

  3. Proposed methodology and infrastructure for standards development and implementation within a national statistical agency

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2005-09-01

    Full Text Available The principal objective of the Data Management and Information Delivery (DMID) project in Statistics South Africa (Stats SA) is to develop an infrastructure that supports the business of a statistical organisation, which includes data stores...

  4. Biophotons, coherence and photocount statistics: A critical review

    Czech Academy of Sciences Publication Activity Database

    Cifra, Michal; Brouder, Ch.; Nerudová, Michaela; Kučera, Ondřej

    2015-01-01

    Roč. 164, August (2015), s. 38-51 ISSN 0022-2313 R&D Projects: GA ČR GA13-29294S Institutional support: RVO:67985882 Keywords : Photocount statistics * Chemiluminescence * Squeezed states * Ultra-weak photon emission Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering Impact factor: 2.693, year: 2015

  5. Help guide for setting up photovoltaic projects born by agricultural companies and farmers

    International Nuclear Information System (INIS)

    2010-01-01

    After a brief recall of energy production and consumption challenges in France, and a brief presentation of photovoltaic energy production connected to the distribution network in urban settings, this document describes the arrangements that aim to support this production through purchase tariffs. Eligibility criteria and the different tariff levels are presented; they depend on the type of building and on the level of integration of the production module. Then, after highlighting the reasons to invest in such projects, the document specifies the technical prerequisites (building orientation, roof slope angle, shadow effect plotting, module technologies, technical and economic feasibility of the connection) and how to carry the project to a successful conclusion (internal communication, urban planning approaches, engineering consultancy, relationship with financiers, administrative aspects). It also comments on the various aspects of the technical-economic analysis (photovoltaic system choice and installation, maintenance, insurance, connection to the network), the legal and tax issues (roof renting, taxes), and the operation. Four examples are briefly presented

  6. Engaging Students in Survey Research Projects across Research Methods and Statistics Courses

    Science.gov (United States)

    Lovekamp, William E.; Soboroff, Shane D.; Gillespie, Michael D.

    2017-01-01

    One innovative way to help students make sense of survey research has been to create a multifaceted, collaborative assignment that promotes critical thinking, comparative analysis, self-reflection, and statistical literacy. We use a short questionnaire adapted from the Higher Education Research Institute's Cooperative Institutional Research…

  7. From the sea to the laboratory: Characterization of microplastic as prerequisite for the assessment of ecotoxicological impact.

    Science.gov (United States)

    Potthoff, Annegret; Oelschlägel, Kathrin; Schmitt-Jansen, Mechthild; Rummel, Christoph Daniel; Kühnel, Dana

    2017-05-01

    The presence of microplastic (MP) in the aquatic environment is recognized as a global-scale pollution issue. Secondary MP particles result from an ongoing fragmentation process governed by various biotic and abiotic factors. For a reliable risk assessment of these MP particles, knowledge about interactions with biota is needed. However, extensive testing with standard organisms under reproducible laboratory conditions with well-characterized MP suspensions is not available yet. As MP in the environment represents a mixture of particles differing in properties (e.g., size, color, polymer type, surface characteristics), it is likely that only specific particle fractions pose a threat towards organisms. In order to assign hazardous effects to specific particle properties, these characteristics need to be analyzed. As shown by the testing of particles (e.g. nanoparticles), characteristics other than chemical properties are important for the emergence of toxicity in organisms, and parameters such as surface area or size distribution need consideration. Therefore, the use of "well-defined" particles for ecotoxicological testing (i.e., standard particles) facilitates the establishment of causal links between physical-chemical properties of MP particles and toxic effects in organisms. However, the benefits of well-defined particles under laboratory conditions are offset by the disadvantage of the unknown comparability with MP in the environment. Therefore, weathering effects caused by biological, chemical, physical or mechanical processes have to be considered. To date, the characterization of the progression of MP weathering based on powder and suspension characterization methods is in its infancy. The aim of this commentary is to illustrate the prerequisites for testing MP in the laboratory from 3 perspectives: (i) knowledge of particle properties; (ii) behavior of MP in test setups involving ecotoxicological test organisms; and (iii) accordingly, test conditions that

  8. 2D Affine and Projective Shape Analysis.

    Science.gov (United States)

    Bryner, Darshan; Klassen, Eric; Huiling Le; Srivastava, Anuj

    2014-05-01

    Current techniques for shape analysis tend to seek invariance to similarity transformations (rotation, translation, and scale), but certain imaging situations require invariance to larger groups, such as affine or projective groups. Here we present a general Riemannian framework for shape analysis of planar objects where metrics and related quantities are invariant to affine and projective groups. Highlighting two possibilities for representing object boundaries-ordered points (or landmarks) and parameterized curves-we study different combinations of these representations (points and curves) and transformations (affine and projective). Specifically, we provide solutions to three out of four situations and develop algorithms for computing geodesics and intrinsic sample statistics, leading up to Gaussian-type statistical models, and classifying test shapes using such models learned from training data. In the case of parameterized curves, we also achieve the desired goal of invariance to re-parameterizations. The geodesics are constructed by particularizing the path-straightening algorithm to geometries of current manifolds and are used, in turn, to compute shape statistics and Gaussian-type shape models. We demonstrate these ideas using a number of examples from shape and activity recognition.

  9. Statistical analysis of the potassium concentration obtained through

    International Nuclear Information System (INIS)

    Pereira, Joao Eduardo da Silva; Silva, Jose Luiz Silverio da; Pires, Carlos Alberto da Fonseca; Strieder, Adelir Jose

    2007-01-01

    The present work was developed on outcrops of the Santa Maria region, southern Brazil, Rio Grande do Sul State. Statistical evaluations were applied to different rock types. The possibility of distinguishing different geologic units, sedimentary and volcanic (acid and basic types), by means of statistical analyses of airborne gamma-ray spectrometry, integrating potassium radiation emission data with geological and geochemical data, is discussed. This project was carried out in 1973 by the Geological Survey of Brazil/Companhia de Pesquisas de Recursos Minerais. The Camaqua Project evaluated the behavior of potassium concentrations, generating XYZ Geosof 1997 format files, a grid, a thematic map and digital thematic map files for the total area. Using this database, the integration of statistical analyses of sedimentary formations belonging to the Depressao Central do Rio Grande do Sul and/or volcanic rocks from the Planalto da Serra Geral at the border of the Parana Basin was tested. A univariate statistical model was used: the mean, the standard error of the mean, and the confidence limits were estimated. Tukey's test was used to compare mean values. The results made it possible to create criteria to distinguish geological formations based on their potassium content. The back-calibration technique was employed to transform K radiation into percentages. In this context it was possible to define characteristic values of radioactive potassium emissions and their confidence ranges in relation to the geologic formations. The potassium variable, when evaluated in relation to the Universal Transverse Mercator geographic coordinate system, showed a spatial relation following a second-order polynomial model, with a corresponding coefficient of determination. The Statistica 7.1 software (General Linear Models), from the Statistics Department of the Federal University of Santa Maria, Brazil, was used. (author)

  10. Stakeholder preferences towards the sustainable development of CDM projects: Lessons from biomass (rice husk) CDM project in Thailand

    International Nuclear Information System (INIS)

    Parnphumeesup, Piya; Kerr, Sandy A.

    2011-01-01

    This research applies both quantitative and qualitative methods to investigate stakeholder preferences towards sustainable development (SD) priorities in Clean Development Mechanism (CDM) projects. The CDM's contribution to SD is explored in the context of a biomass (rice husk) case study conducted in Thailand. Quantitative analysis ranks increasing the usage of renewable energy as the highest priority, followed by employment and technology transfer. Air pollution (dust) is ranked as the most important problem. Preference weights expressed by experts and local residents are statistically different in the cases of: employment generation; emission reductions; dust; waste disposal; and noise. Qualitative results suggest that rice husk CDM projects contribute significantly to SD in terms of employment generation, an increase in the usage of renewable energy, and transfer of knowledge. However, rice husk biomass projects create a potential negative impact on air quality. In order to ensure the environmental sustainability of CDM projects, stakeholders suggest that Thailand should cancel the Environmental Impact Assessment (EIA) exemption for CDM projects with an installed capacity below 10 MW and apply the EIA to all CDM projects. - Highlights: → Stakeholders rank increasing the usage of renewable energy as the highest priority. → Biomass (rice husk) CDM projects create a potential negative impact on air quality. → Rice husk CDM projects cannot provide extra income to farmers. → Preference weights expressed by experts and local residents are statistically different.

  11. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.

  12. The European project Trappist: transfer, processing and interpretation of 3D NDT data in a standard environment

    International Nuclear Information System (INIS)

    Georgel, B.; Nockemann, C.

    1994-01-01

    The European CEC-funded project TRAPPIST aims to provide the prerequisites for combining various NDT methods. The key components to achieve this goal are a multi-method NDT standard data format and a platform-independent software environment. Another important feature is the communication of both NDT data and expertise between remote workstations through the state-of-the-art European ISDN broadband network. A full-scale prototype is under development to demonstrate the feasibility of this system. A survey of recent literature showing the originality of the TRAPPIST features is included in the paper. (authors). 2 figs., 12 refs

  13. STATISTICAL DOWNSCALING WITH TIME LAG BASED ON CROSS CORRELATION

    Directory of Open Access Journals (Sweden)

    Aji Hamim Wigena

    2015-09-01

    Full Text Available A time lag in time series data analysis is needed especially when analysing the relationship between two variables, as in statistical downscaling. The time lag is determined from a high cross correlation, which corresponds to a strong relationship between the two variables, and can then be used in modelling for more accurate forecasts. This paper concerns statistical downscaling that takes into account the cross correlation between rainfall data and precipitation output of a Global Circulation Model (GCM) from the Climate Model Inter Comparison Project (CMIP5). One of the conditions in statistical downscaling is that the local-scale and global-scale variables are highly correlated. Both types of variables are time series data, so the cross-correlation function is applied to obtain the time lags. A high cross correlation determines the time lag of the GCM output that yields a stronger functional relationship between the two types of variables. Principal component regression and partial least squares regression models are used in this paper. The models with time lags predict rainfall better than the models without time lags.
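
    As a rough illustration of the lag selection step (not the authors' code), the GCM precipitation series can be shifted against the local rainfall series and the lag with the largest absolute cross-correlation kept; the series below are synthetic and the names are hypothetical.

      import numpy as np

      def best_lag(rainfall, gcm_precip, max_lag=12):
          """Return the lag (in time steps) at which the GCM series best precedes local rainfall."""
          r = (rainfall - rainfall.mean()) / rainfall.std()
          g = (gcm_precip - gcm_precip.mean()) / gcm_precip.std()
          cc = []
          for lag in range(max_lag + 1):
              # pair rainfall at time t with GCM output at time t - lag
              x, y = r[lag:], g[:len(g) - lag]
              cc.append(np.corrcoef(x, y)[0, 1])
          return int(np.argmax(np.abs(cc))), cc

      # Synthetic monthly series in which rainfall lags the GCM predictor by 3 steps;
      # the predictor is then shifted by the chosen lag before fitting principal
      # component or partial least squares regression.
      rng = np.random.default_rng(0)
      gcm_precip = rng.normal(size=240)
      rainfall = np.roll(gcm_precip, 3) + 0.5 * rng.normal(size=240)
      lag, cc = best_lag(rainfall, gcm_precip)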

  14. Confinement projections for the Burning Plasma Experiment (BPX)

    International Nuclear Information System (INIS)

    Goldston, R.J.; Bateman, G.; Kaye, S.M.; Perkins, F.W.; Pomphrey, N.; Stotler, D.P.; Zarnstorff, M.C.; Porkolab, M.; Reidel, K.S.; Stambaugh, R.D.; Waltz, R.E.

    1991-01-01

    The mission of the Burning Plasma Experiment (BPX, formerly CIT) is to study the physics of self-heated fusion plasmas (Q = 5 to ignition), and to demonstrate the production of substantial amounts of fusion power (P_fus = 100 to 500 MW). Confinement projections for BPX have been made on the basis of (1) dimensional extrapolation, (2) theory-based modeling calibrated to experiment, and (3) statistical scaling from the available empirical data base. The results of all three approaches, discussed in this paper, roughly coincide. We presently view the third approach, statistical scaling, as the most reliable means for projecting the confinement performance of BPX, and especially for assessing the uncertainty in the projection. 11 refs., 2 figs., 1 tab

  15. Statistical methods for evaluating the attainment of cleanup standards

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.

  16. Statistical physics of an anyon gas

    International Nuclear Information System (INIS)

    Dasnieres de Veigy, A.

    1994-01-01

    In two-dimensional quantum physics, anyons are particles which have an intermediate statistics between Bose-Einstein and Fermi-Dirac statistics. The wave amplitude can change by an arbitrary phase under particle exchanges. Contrary to bosons or fermions, the permutation group cannot uniquely characterize this phase, and one must introduce the braid group. It is shown that the statistical ''interaction'' is equivalent to an Aharonov-Bohm interaction which derives from a Chern-Simons lagrangian. The main subject of this thesis is the thermodynamics of an anyon gas. Since the complete spectrum of N anyons seems out of reach, we have done a perturbative computation of the equation of state at second order near Bose or Fermi statistics. Ultraviolet divergences are avoided by noticing that the short-range singularities of the statistical interaction force the wave functions to vanish when two particles approach each other (statistical exclusion). The gas is confined in a harmonic well in order to obtain the thermodynamic limit as the harmonic attraction goes to zero. Infrared divergences thus cancel in this limit and a finite virial expansion is obtained. The complexity of the anyon model appears in this result. We have also computed the equation of state of an anyon gas in a magnetic field strong enough to project the system onto its degenerate ground state. This result concerns anyons with any statistics. One then finds an exclusion principle generalizing the Pauli principle to anyons. On the other hand, we have defined a model of two-dimensional particles topologically interacting at a distance. The anyon model is recovered as a particular case where all particles are identical. (orig.)

  17. Statistical methods to monitor the West Valley off-gas system

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1990-01-01

    This paper reports on the off-gas system for the ceramic melter operated at the West Valley Demonstration Project in West Valley, NY, which is monitored during melter operation. A one-at-a-time method of monitoring the parameters of the off-gas system is not statistically sound. Therefore, multivariate statistical methods appropriate for monitoring many correlated parameters will be used. Monitoring a large number of parameters increases the probability of a false out-of-control signal. If the parameters being monitored are statistically independent, the control limits can be easily adjusted to obtain the desired probability of a false out-of-control signal. The principal component (PC) scores have desirable statistical properties when the original variables follow a multivariate normal distribution. Two statistics derived from the PC scores and used to form multivariate control charts are outlined and their distributional properties reviewed
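
    The abstract does not name the two statistics, but multivariate control charts built on PC scores commonly use the Hotelling T-squared statistic on the retained components and the squared prediction error (Q) on the residuals. A minimal sketch, with simulated data standing in for the correlated off-gas parameters:

      import numpy as np

      # Simulated baseline readings: rows = time points, columns = correlated parameters.
      rng = np.random.default_rng(1)
      baseline = rng.normal(size=(200, 6)) @ rng.normal(size=(6, 6))

      mu, sd = baseline.mean(axis=0), baseline.std(axis=0, ddof=1)
      Z = (baseline - mu) / sd
      U, s, Vt = np.linalg.svd(Z, full_matrices=False)
      k = 3                                   # number of retained principal components (assumed)
      P = Vt[:k].T                            # loadings
      lam = (s[:k] ** 2) / (Z.shape[0] - 1)   # variances of the retained PC scores

      def t2_and_q(x):
          """Hotelling T^2 on retained PCs and squared prediction error (Q) for one new sample."""
          z = (x - mu) / sd
          t = z @ P                           # PC scores of the new sample
          T2 = np.sum(t ** 2 / lam)
          residual = z - t @ P.T              # part not explained by the retained PCs
          Q = residual @ residual
          return T2, Q

      T2, Q = t2_and_q(rng.normal(size=6))    # compare against control limits in practice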

  18. Positive projections of symmetric matrices and Jordan algebras

    DEFF Research Database (Denmark)

    Fuglede, Bent; Jensen, Søren Tolver

    2013-01-01

    An elementary proof is given that the projection from the space of all symmetric p×p matrices onto a linear subspace is positive if and only if the subspace is a Jordan algebra. This solves a problem in a statistical model.

  19. Statistical downscaling of daily temperature in Central Europe

    Czech Academy of Sciences Publication Activity Database

    Huth, Radan

    2002-01-01

    Roč. 15, - (2002), s. 1731-1742 ISSN 0894-8755 R&D Projects: GA ČR GA205/99/1561; GA AV ČR IAA3042903 Institutional research plan: CEZ:AV0Z3042911 Keywords : statistical downscaling * daily temperature * Central Europe Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 3.250, year: 2002

  20. Testing for changes using permutations of U-statistics

    Czech Academy of Sciences Publication Activity Database

    Horvath, L.; Hušková, Marie

    2005-01-01

    Roč. 2005, č. 128 (2005), s. 351-371 ISSN 0378-3758 R&D Projects: GA ČR GA201/00/0769 Institutional research plan: CEZ:AV0Z10750506 Keywords : U-statistics * permutations * change-point * weighted approximation * Brownian bridge Subject RIV: BD - Theory of Information Impact factor: 0.481, year: 2005

  1. Reaming process improvement and control: An application of statistical engineering

    DEFF Research Database (Denmark)

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  2. Kinematical analysis of the data from three-particle reactions by statistical methods

    International Nuclear Information System (INIS)

    Krug, J.; Nocken, U.

    1976-01-01

    A statistical procedure to unfold the kinematics of coincidence spectra from three-particle reactions is presented, which is used to project the coincidence events onto the kinematical curve. The width of the projection intervals automatically matches the experimental resolution. The method is characterized by its consistency, thus also permitting a reasonable projection of sum-coincidences. (Auth.)

  3. Using Paper Helicopters to Teach Statistical Process Control

    Science.gov (United States)

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  4. Project Documentation as a Risk for Public Projects

    Directory of Open Access Journals (Sweden)

    Vladěna Štěpánková

    2015-08-01

    Full Text Available Purpose of the article: The paper presents the different methodologies used for creating documentation and focuses on public projects and their requirements for this documentation. Since documentation is incorporated in the overall planning of a project and its duration is estimated by qualified expert judgement, any change in the documentation can delay the project or increase its cost through additional administration; documentation is therefore seen as a risk that may threaten a public contract the company is trying to win, and more generally any project. Methodology/methods: Methods of obtaining information are used in this paper, mainly structured interviews combined with brainstorming, supplemented by a questionnaire for companies dealing with public procurement. MS Excel and basic statistical methods based on regression analysis were used for data processing. Scientific aim: The article deals with the construction market in the Czech Republic and examines the impact of changes in the project documentation of public projects on companies' turnover. Findings: The paper summarizes the advantages and disadvantages of having project documentation. In the case of public contracts and changes in legislation, it is necessary to focus on creating documentation in advance, follow the new requirements, and try to meet them in the shortest possible time. Conclusions: The paper concludes with recommendations on how to proceed if such changes occur and how to reduce the costs and risks that documentation may cause.

  5. Helping Raise the Official Statistics Capability of Government Employees

    Directory of Open Access Journals (Sweden)

    Forbes Sharleen

    2016-12-01

    Full Text Available Both the production and the use of official statistics are important in the business of government. In New Zealand, concern persists about many government advisors' low level of statistical capability. One programme designed specifically to enhance capability is New Zealand's National Certificate of Official Statistics, first introduced in 2007 and originally targeted at government policy analysts and advisors. It now includes participants from many agencies, including the National Statistics Office. The competency-based 40-credit certificate comprises four taught units, which aim to give students skills in basic official statistics and in critically evaluating statistical, research, policy, or media publications for their quality (of data, survey design, analysis, and conclusions) and appropriateness for a given policy issue (e.g., how to reduce problem gambling), together with an 'umbrella' workplace-based statistics project. Case studies are used to embed the statistics learning into the real-world context of these students. Several surveys of students and their managers were undertaken to evaluate the effectiveness of the certificate in terms of enhancing skill levels and meeting organisational needs, and also to examine barriers to completion of the certificate. The results were used both to modify the programme and to extend its international applicability.

  6. Inferring monopartite projections of bipartite networks: an entropy-based approach

    Science.gov (United States)

    Saracco, Fabio; Straka, Mika J.; Di Clemente, Riccardo; Gabrielli, Andrea; Caldarelli, Guido; Squartini, Tiziano

    2017-05-01

    Bipartite networks are currently regarded as providing a major insight into the organization of many real-world systems, unveiling the mechanisms driving the interactions occurring between distinct groups of nodes. One of the most important issues encountered when modeling bipartite networks is devising a way to obtain a (monopartite) projection on the layer of interest, which preserves as much as possible the information encoded into the original bipartite structure. In the present paper we propose an algorithm to obtain statistically-validated projections of bipartite networks, according to which any two nodes sharing a statistically-significant number of neighbors are linked. Since assessing the statistical significance of nodes similarity requires a proper statistical benchmark, here we consider a set of four null models, defined within the exponential random graph framework. Our algorithm outputs a matrix of link-specific p-values, from which a validated projection is straightforwardly obtainable, upon running a multiple hypothesis testing procedure. Finally, we test our method on an economic network (i.e. the countries-products World Trade Web representation) and a social network (i.e. MovieLens, collecting the users’ ratings of a list of movies). In both cases non-trivial communities are detected: while projecting the World Trade Web on the countries layer reveals modules of similarly-industrialized nations, projecting it on the products layer allows communities characterized by an increasing level of complexity to be detected; in the second case, projecting MovieLens on the films layer allows clusters of movies whose affinity cannot be fully accounted for by genre similarity to be individuated.
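
    A simplified sketch of the validation idea in Python: here a hypergeometric null replaces the entropy-based (exponential random graph) null models used in the paper, and Benjamini-Hochberg correction handles the multiple hypothesis testing; the biadjacency matrix B is synthetic.

      import numpy as np
      from scipy.stats import hypergeom
      from statsmodels.stats.multitest import multipletests

      # B[i, j] = 1 if node i on the layer of interest is linked to node j on the other layer.
      rng = np.random.default_rng(0)
      B = (rng.random((30, 100)) < 0.1).astype(int)
      n_rows, n_cols = B.shape
      deg = B.sum(axis=1)

      pvals, pairs = [], []
      for i in range(n_rows):
          for j in range(i + 1, n_rows):
              shared = int(B[i] @ B[j])           # number of common neighbours
              # P(X >= shared) when deg[j] neighbours are drawn at random from n_cols,
              # deg[i] of which are "marked": a hypergeometric null model.
              p = hypergeom.sf(shared - 1, n_cols, deg[i], deg[j])
              pvals.append(p)
              pairs.append((i, j))

      reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      validated_edges = [pairs[k] for k in range(len(pairs)) if reject[k]]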

  7. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  8. Differential sensitivity of regulatory and effector T cells to cell death: a prerequisite for transplant tolerance

    Directory of Open Access Journals (Sweden)

    Sylvaine eYou

    2015-05-01

    Full Text Available Despite significant progress achieved in transplantation, the immunosuppressive therapies currently used to prevent graft rejection still have severe side effects that impair their efficiency over the long term. Thus, the development of graft-specific, non-toxic, innovative therapeutic strategies has become a major challenge, the goal being to selectively target alloreactive effector T cells while sparing CD4+Foxp3+ regulatory T cells (Tregs) to promote operational tolerance. Various approaches, notably those based on monoclonal antibodies or fusion proteins directed against the TCR/CD3 complex, TCR coreceptors, or costimulatory molecules, have been proposed to reduce the alloreactive T cell pool, which is an essential prerequisite for creating a therapeutic window allowing Tregs to induce and maintain allograft tolerance. In this minireview, we focus on the differential sensitivity of Tregs and effector T cells to the depleting and inhibitory effects of these immunotherapies, with a particular emphasis on CD3-specific antibodies that, beyond their immunosuppressive effect, also have potent tolerogenic capacities.

  9. THE ECONOMIC AND SOCIAL COORDINATES OF DEVELOPING A SPORT ENTREPRENEURSHIP INDEX – CURRENT CHALLENGES AND PREREQUISITES

    Directory of Open Access Journals (Sweden)

    MUNTEANU SEBASTIAN MADALIN

    2015-07-01

    Full Text Available Focusing on the multivariate relationship between sport and economics, the present study is based on the potential of sport to support economic growth through the enhancement of the relatively new domain of 'sport entrepreneurship'. A brief review of the specialized literature regarding the development of sport entrepreneurship in general terms is followed by a series of fundamental factors for the innovative proposal of a sport entrepreneurship index (SEI) for the European Union member countries. The methodological issues concerning the structure of the index represent the main novelty of this research, which aims to be a prerequisite for a future thorough analysis of the micro- and macro-economic implications of developing a sport entrepreneurship index (SEI) in EU countries. The main challenge identified is a scientific one: the summative assessment of the elements representing entrepreneurship, sport and culture in the SEI, while the need to determine such an index arises for economic and social reasons.

  10. Statistical Challenges of Big Data Analysis in Medicine

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2015-01-01

    Roč. 3, č. 1 (2015), s. 24-27 ISSN 1805-8698 R&D Projects: GA ČR GA13-23940S Grant - others:CESNET Development Fund(CZ) 494/2013 Institutional support: RVO:67985807 Keywords : big data * variable selection * classification * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research http://www.ijbh.org/ijbh2015-1.pdf

  11. Signal processing and statistical analysis of space-based measurements

    International Nuclear Information System (INIS)

    Iranpour, K.

    1996-05-01

    The report deals with data obtained by the ROSE rocket project. This project was designed to investigate the low-altitude auroral instabilities in the electrojet region. The spectral and statistical analyses indicate the existence of unstable waves in the ionized gas in the region. An experimentally obtained dispersion relation for these waves was established. It was demonstrated that the characteristic phase velocities are much lower than expected from standard theoretical results. The analysis of the ROSE data indicates a cascading of energy from lower to higher frequencies. 44 refs., 54 figs.

  12. Social impact assessment in energy projects

    International Nuclear Information System (INIS)

    Koivujaervi, S.; Kantola, I.; Maekinen, P.

    1998-01-01

    , interviews and earlier experience, a suggestion has been given for the implementation of the SIA for future energy projects. Several methods should be used in the assessment of social impact, and the methods should be chosen case by case according to the specific features of each project. Good information is a prerequisite for the interaction between citizens and the operator, which is essential for the assessment of social impact. Citizens should be encouraged to participate, and efforts should be made to use the feedback received. (orig.) 83 refs

  13. Modelling the Reduction of Project Making Duration

    Directory of Open Access Journals (Sweden)

    Oleinik Pavel

    2017-01-01

    Full Text Available The article points out why earlier patterns of the investment process were ineffective in developing construction projects and identifies sources for reducing their total duration. It describes the procedure of statistical modelling for obtaining the medium-term time parameters required for the modern pattern of project making; offers design formulas for assessing the total time required for project making as well as for its main stages; and reveals the advantage of the modern system of project making over the traditional one by comparing indicators of their duration.

  14. Prerequisite competencies for third-year clerkships: an interdisciplinary approach.

    Science.gov (United States)

    Matson, Christine C; Stearns, Jeffrey A; Defer, Thomas; Greenberg, Larrie; Ullian, John A

    2007-01-01

    The Collaborative Curriculum Project (CCP) is one of three components of the Family Medicine Curriculum Resource Project (FMCRP), a federally funded effort to provide resources for medical education curricula at the beginning of the 21st century. Medical educators and staff from public and private geographically distributed medical schools and national specialty organizations in family medicine, internal medicine, and pediatrics developed by consensus essential clinical competencies that all students should have by the beginning of the traditional clerkship year. These competencies are behaviorally measurable and organized into the domains used for the Accreditation Council for Graduate Medical Education (ACGME) core competencies. Exemplary teaching, assessment, and faculty development resources are cited, and attention is given to budgetary considerations, application to diverse populations and settings, and opportunities for integration within existing courses. The CCP also developed a subset of competencies meriting higher priority than currently provided in the pre-clerkship years. These priority areas were empirically validated through a national survey of clerkship directors in six disciplines. The project's documents are not intended to prescribe curricula for any school but rather to provide curricular decision makers with suggestions regarding priorities for allocation of time and resources and detailed clinical competency statements and other resources useful for faculty developing clinical courses in the first 2 years of medical school.

  15. The database of the PREDICTS (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems) project

    OpenAIRE

    Hudson, L. N.; Newbold, T.; Contu, S.; Hill, S. L.; Lysenko, I.; De Palma, A.; Phillips, H. R.; Alhusseini, T. I.; Bedford, F. E.; Bennett, D. J.; Booth, H.; Burton, V. J.; Chng, C. W.; Choimes, A.; Correia, D. L.

    2017-01-01

    The PREDICTS project-Projecting Responses of Ecological Diversity In Changing Terrestrial Systems (www.predicts.org.uk)-has collated from published studies a large, reasonably representative database of comparable samples of biodiversity from multiple sites that differ in the nature or intensity of human impacts relating to land use. We have used this evidence base to develop global and regional statistical models of how local biodiversity responds to these measures. We describe and make free...

  16. Statistical Acceptance Plan for Asphalt Pavement Construction.

    Science.gov (United States)

    1998-05-01

    K. C., Freidenrich, J., and Weed, R. M. (1992). "Managing Quality: Time for a National Policy," Transportation Research Record 1340... Baecher, G. (1987). "Statistical Quality Control of Engineered Embankments," Contract Report GL-87-2, Waterways Experiment Station, U.S. Army Corps of Engineers.

  17. Challenges in dental statistics: survey methodology topics

    OpenAIRE

    Pizzo, Giuseppe; Milani, Silvano; Spada, Elena; Ottolenghi, Livia

    2013-01-01

    This paper gathers some contributions concerning survey methodology in dental research, as discussed during the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona on the 28th September 2011.The first contribution deals with the European Global Oral Health Indicators Development (EGOHID) Project which proposed a comprehensive and standardized system of epidemiological tools (questionnaires and clinical forms) for national da...

  18. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.

  19. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  20. Experience in statistical quality control for road construction in South Africa

    CSIR Research Space (South Africa)

    Mitchell, MF

    1977-06-01

    Full Text Available of statistically oriented acceptance control procedures to a major road construction project is examined and it is concluded that such procedures promise to be of benefit to both the client and the contractor....

  1. Image statistics and nonlinear artifacts in composed transmission x-ray tomography

    International Nuclear Information System (INIS)

    Duerinckx, A.J.G.

    1979-01-01

    Knowledge of the image quality and image statistics in Computed Tomography (CT) images obtained with transmission x-ray CT scanners can increase the amount of clinically useful information that can be retrieved. Artifacts caused by nonlinear shadows are strongly object-dependent and are visible over larger areas of the image. No simple technique exists for their complete elimination. One source of artifacts in the first order statistics is the nonlinearities in the measured shadow or projection data used to reconstruct the image. One of the leading causes is the polychromaticity of the x-ray beam used in transmission CT scanners. Ways to improve the resulting image quality and techniques to extract additional information using dual energy scanning are discussed. A unique formalism consisting of a vector representation of the material dependence of the photon-tissue interactions is generalized to allow an in depth analysis. Poly-correction algorithms are compared using this analytic approach. Both quantum and detector electronic noise decrease the quality or information content of first order statistics. Preliminary results are presented using an heuristic adaptive nonlinear noise filter system for projection data. This filter system can be improved and/or modified to remove artifacts in both first and second order image statistics. Artifacts in the second order image statistics arise from the contribution of quantum noise. This can be described with a nonlinear detection equivalent model, similar to the model used to study artifacts in first order statistics. When analyzing these artifacts in second order statistics, one can divide them into linear artifacts, which do not present any problem of interpretation, and nonlinear artifacts, referred to as noise artifacts. A study of noise artifacts is presented together with a discussion of their relative importance in diagnostic radiology

  2. Project-Oriented University – Building the Capability for Innovation

    Directory of Open Access Journals (Sweden)

    Paul DOBRESCU

    2009-01-01

    Full Text Available The global economy is increasingly a knowledge economy, making people's skills and qualifications more important than traditional power indicators such as territory, geography, and natural resources. Globalization imposes new rhythms of performance on every economic or social field. Higher education is no exception, since it lies at the interface with the external environment, where skills and qualifications will be used and exploited for economic benefit. Universities are under a two-fold pressure. First, they provide services, knowledge, and skills for fast-moving sectors, and this knowledge and these skills may quickly become obsolete and irrelevant for the economy. Second, universities need to innovate and to adapt to situations of constant change. Both types of pressure force universities to develop their capability for innovation, which becomes a prerequisite for survival. The purpose of this paper is to explain the concept of the project-oriented university as a type of university that explicitly uses projects to perform processes of medium to high complexity, thus allowing it to deal with the increasing turbulence and dynamics of its environments. This concept is premised on the idea that there is a connection between a university's maturity in project management and its managerial competitiveness and innovativeness. The concept inherits the conceptual core of the model of the project-oriented company and comprises two components. The former is concerned with the structural dimensions of project management, the "hard" component: processes, procedures, organizational structures, terminology. The latter is concerned with the social dimension of project management, the "soft" component: skills, attitudes, competences, project management culture. The empirical results are considered representative of the Romanian higher education system as a whole, with due nuances and exceptions.

  3. 9th Annual UNCG Regional Mathematics and Statistics Conference

    CERN Document Server

    Chhetri, Maya; Gupta, Sat; Shivaji, Ratnasingham

    2015-01-01

    This volume contains rigorously reviewed papers on the topics presented by students at The 9th Annual University of North Carolina at Greensboro Regional Mathematics and Statistics Conference (UNCG RMSC) that took place on November 2, 2013.  All papers are coauthored by student researchers and their faculty mentors. This conference series was inaugurated in 2005, and it now attracts over 150 participants from over 30 universities from North Carolina and surrounding states. The conference is specifically tailored for students to present their research projects that encompass a broad spectrum of topics in mathematics, mathematical biology, statistics, and computer science.

  4. The Key Lake project

    International Nuclear Information System (INIS)

    1991-01-01

    Key Lake is located in the Athabasca sand stone basin, 640 kilometers north of Saskatoon, Saskatchewan, Canada. The three sources of ore at Key Lake contain 70 100 tonnes of uranium. Features of the Key Lake Project were described under the key headings: work force, mining, mill process, tailings storage, permanent camp, environmental features, worker health and safety, and economic benefits. Appendices covering the historical background, construction projects, comparisons of western world mines, mining statistics, Northern Saskatchewan surface lease, and Key Lake development and regulatory agencies were included

  5. Energy Efficiency in Building as a Basic Prerequisite for a Long Term Energy Strategies Realization, Environmental Protection and Sustainability

    International Nuclear Information System (INIS)

    Miscevic, Lj.

    2006-01-01

    Energy efficiency in buildings at the low-energy and 'passive house' standard levels is presently the basic prerequisite for considering and formulating long-term strategies that, while meeting energy needs and maintaining the system, respond to the demands of environmental protection and improvement in the context of sustainable development. Orientation towards sustainable development is integrated in the development strategies of Croatia. The application of renewable energy sources, in particular solar energy in passive and active systems in architecture, is continually confirmed by energy monitoring and by a growing number of domestic studies, projects and realizations. The long-term European Union research project Cost Efficient Passive Houses as European Standards (CEPHEUS), with its scientific monitoring, corroborated the energy and economic efficiency of such architectural designs in Germany, France, Austria, and Switzerland. Thus, the 'passive house' is proposed as a standard for residential architecture, but also for the construction of other functional types of architecture in general. The accomplished energy efficiency and the verified favourable profitability of the investment have led to new forms of incentives for low-energy and passive architecture and to relevant changes in the concepts of long-term energy strategies in the European Union member states. In Austria the 1000th passive house has been built, and the city of Frankfurt/M made a decision regarding the financing of building construction through the city budget at the 'passive house' level. The new Technical Regulation on energy savings and thermal protection in Croatia, which is effectively in force as of 1 July, is a long-awaited step towards energy efficiency. Although, according to this Regulation, the tolerated energy use for space heating goes, in the worst-case calculation, up to 89 kWh/m2 a year, any more favourable calculation with the obligation to calculate the share of solar radiation for buildings opens

  6. Downscaled projections of Caribbean coral bleaching that can inform conservation planning.

    Science.gov (United States)

    van Hooidonk, Ruben; Maynard, Jeffrey Allen; Liu, Yanyun; Lee, Sang-Ki

    2015-09-01

    Projections of climate change impacts on coral reefs produced at the coarse resolution (~1°) of Global Climate Models (GCMs) have informed debate but have not helped target local management actions. Here, projections of the onset of annual coral bleaching conditions in the Caribbean under Representative Concentration Pathway (RCP) 8.5 are produced using an ensemble of 33 Coupled Model Intercomparison Project phase-5 models and via dynamical and statistical downscaling. A high-resolution (~11 km) regional ocean model (MOM4.1) is used for the dynamical downscaling. For statistical downscaling, sea surface temperature (SST) means and annual cycles in all the GCMs are replaced with observed data from the ~4-km NOAA Pathfinder SST dataset. Spatial patterns in all three projections are broadly similar; the average year for the onset of annual severe bleaching is 2040-2043 for all projections. However, downscaled projections show many locations where the onset of annual severe bleaching (ASB) varies 10 or more years within a single GCM grid cell. Managers in locations where this applies (e.g., Florida, Turks and Caicos, Puerto Rico, and the Dominican Republic, among others) can identify locations that represent relative albeit temporary refugia. Both downscaled projections are different for the Bahamas compared to the GCM projections. The dynamically downscaled projections suggest an earlier onset of ASB linked to projected changes in regional currents, a feature not resolved in GCMs. This result demonstrates the value of dynamical downscaling for this application and means statistically downscaled projections have to be interpreted with caution. However, aside from west of Andros Island, the projections for the two types of downscaling are mostly aligned; projected onset of ASB is within ±10 years for 72% of the reef locations. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
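
    As a rough sketch of the statistical downscaling step described here (not the authors' code), the GCM mean and annual cycle at a pixel can be replaced with an observed climatology while keeping the GCM anomalies; all series below are synthetic, the climatology stands in for the NOAA Pathfinder data, and the baseline window is an assumption.

      import numpy as np

      rng = np.random.default_rng(0)
      months = np.arange(1200) % 12                      # 100 years of monthly time steps
      # Synthetic GCM SST (deg C): annual cycle plus a warming trend and noise.
      gcm_sst = (27.5 + 2.0 * np.sin(2 * np.pi * months / 12)
                 + 0.02 * np.arange(1200) / 12 + 0.3 * rng.normal(size=1200))
      # Hypothetical observed monthly climatology for the same pixel.
      obs_clim = 26.8 + 1.6 * np.sin(2 * np.pi * np.arange(12) / 12)

      # GCM climatology from a historical baseline window (here the first 30 years).
      base = slice(0, 30 * 12)
      gcm_clim = np.array([gcm_sst[base][months[base] == m].mean() for m in range(12)])

      # Keep the GCM anomalies (trend and variability), swap in the observed mean and annual cycle.
      downscaled = obs_clim[months] + (gcm_sst - gcm_clim[months])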

  7. An application of an optimal statistic for characterizing relative orientations

    Science.gov (United States)

    Jow, Dylan L.; Hill, Ryley; Scott, Douglas; Soler, J. D.; Martin, P. G.; Devlin, M. J.; Fissel, L. M.; Poidevin, F.

    2018-02-01

    We present the projected Rayleigh statistic (PRS), a modification of the classic Rayleigh statistic, as a test for non-uniform relative orientation between two pseudo-vector fields. In the application here, this gives an effective way of investigating whether polarization pseudo-vectors (spin-2 quantities) are preferentially parallel or perpendicular to filaments in the interstellar medium. There are other potential applications in astrophysics, e.g. when comparing small-scale orientations with larger-scale shear patterns. We compare the efficiency of the PRS against histogram binning methods that have previously been used for characterizing the relative orientations of gas column density structures with the magnetic field projected on the plane of the sky. We examine data for the Vela C molecular cloud, where the column density is inferred from Herschel submillimetre observations, and the magnetic field from observations by the Balloon-borne Large-Aperture Submillimetre Telescope in the 250-, 350- and 500-μm wavelength bands. We find that the PRS has greater statistical power than approaches that bin the relative orientation angles, as it makes more efficient use of the information contained in the data. In particular, the use of the PRS to test for preferential alignment results in a higher statistical significance in each of the four Vela C regions, with the greatest increase being by a factor of 1.3 in the South-Nest region in the 250-μm band.
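
    A minimal sketch of the statistic as it is commonly defined (the exact form here is an assumption, with theta the relative orientation angle between the two spin-2 fields): strongly positive values indicate preferentially parallel alignment, strongly negative values preferentially perpendicular, and values near zero are consistent with no preferred relative orientation.

      import numpy as np

      def projected_rayleigh_statistic(theta):
          """PRS for relative angles theta between two orientation (spin-2) fields.

          Z = sum(cos 2*theta) / sqrt(n/2); under uniformly random relative angles
          the statistic has approximately zero mean and unit variance.
          """
          theta = np.asarray(theta)
          n = theta.size
          return np.sum(np.cos(2.0 * theta)) / np.sqrt(n / 2.0)

      # Example: angles drawn near 90 degrees give a large negative statistic.
      rng = np.random.default_rng(0)
      theta = rng.normal(np.pi / 2, 0.3, size=1000)
      print(projected_rayleigh_statistic(theta))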

  8. The international Chernobyl project

    International Nuclear Information System (INIS)

    1991-01-01

    This article summarizes the official report of the International Advisory Committee at the conference of the International Chernobyl Project held in Vienna, May 1991. More details will be found in the actual report, ''The International Chernobyl Project: An Overview'' (INI22:066284/5). Measurements and assessments carried out under the project provided general corroboration of the levels of surface cesium-137 contamination reported in the official maps. The project also concluded that the official procedures for estimating radiation doses to the population were scientifically sound, although they generally resulted in overestimates of two- to threefold. The project could find no marked increase in the incidence of leukemia or cancer, but reported absorbed thyroid doses in children might lead to a statistically detectable rise in the incidence of thyroid tumors. Significant non-radiation-related health disorders were found, and the accident had substantial psychological consequences in terms of anxiety and stress. The project concluded that the protective measures taken were too extreme, and that population relocation and foodstuff restrictions should have been less extensive

  9. The use of machine learning and nonlinear statistical tools for ADME prediction.

    Science.gov (United States)

    Sakiyama, Yojiro

    2009-02-01

    Absorption, distribution, metabolism and excretion (ADME)-related failure of drug candidates is a major issue for the pharmaceutical industry today. Prediction of ADME by in silico tools has now become an inevitable paradigm for reducing cost and enhancing efficiency in pharmaceutical research. Recently, machine learning as well as nonlinear statistical tools have been widely applied to predict routine ADME end points. To achieve accurate and reliable predictions, it is a prerequisite to understand the concepts, mechanisms and limitations of these tools. Here, we have devised a small synthetic nonlinear data set to help understand the mechanism of machine learning by 2D visualisation. We applied six new machine learning methods to four different data sets. The methods include the Naive Bayes classifier, classification and regression trees, random forests, Gaussian processes, support vector machines and k nearest neighbours. The results demonstrated that ensemble learning and kernel machines displayed greater prediction accuracy than classical methods, irrespective of the data set size. The importance of interaction with the engineering field is also addressed. The results described here provide insights into the mechanism of machine learning, which will enable appropriate usage in the future.
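
    The six methods named in the abstract are all available in standard libraries; the hedged sketch below compares them with 5-fold cross-validation on a synthetic classification problem standing in for an ADME end point (scikit-learn is assumed; the data, settings and scores are illustrative and are not those of the study).

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for an ADME end point (e.g. permeable / not permeable);
# real work would use measured molecular descriptors and labels.
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "CART": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "Gaussian process": GaussianProcessClassifier(random_state=0),
    "SVM (RBF kernel)": SVC(kernel="rbf", gamma="scale"),
    "k nearest neighbour": KNeighborsClassifier(n_neighbors=5),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)   # 5-fold accuracy
    print(f"{name:20s} {scores.mean():.3f} +/- {scores.std():.3f}")
```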

  10. Statistical Models to Assess the Health Effects and to Forecast Ground Level Ozone

    Czech Academy of Sciences Publication Activity Database

    Schlink, U.; Herbath, O.; Richter, M.; Dorling, S.; Nunnari, G.; Cawley, G.; Pelikán, Emil

    2006-01-01

    Roč. 21, č. 4 (2006), s. 547-558 ISSN 1364-8152 R&D Projects: GA AV ČR 1ET400300414 Institutional research plan: CEZ:AV0Z10300504 Keywords : statistical models * ground level ozone * health effects * logistic model * forecasting * prediction performance * neural network * generalised additive model * integrated assessment Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.992, year: 2006

  11. Application UPTASK project management platform to support management decisions in I.T.

    Directory of Open Access Journals (Sweden)

    Lagerev D.G.

    2017-03-01

    Full Text Available Project planning and management accounting for all factors, under conditions of multitasking and spontaneity in project management, is a voluminous and complex task, especially in isolation from methodological and instrumental support. Calculation of indicators, statistical metrics and strategies by the decision maker may be biased or incorrect, and the human factor cannot be excluded. It was therefore decided to automate the process of statistical analysis and probabilistic forecasting in order to help the decision maker make the right choices in strategic and project planning. The problem is solved by developing a project management system for companies in the information technology sector. One of the most important features of the development is a module for statistical and probabilistic analysis based on Bayesian networks. The use of the proposed tool and methodological complex will provide a high level of optimization in the allocation of time to work processes and will increase the correctness and continuity of the decisions taken by the project manager.
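
    The abstract refers to a statistical and probabilistic analysis module based on Bayesian networks. As a minimal illustration of the kind of inference such a module performs, the sketch below works through a single two-node Bayesian update by hand; the node names and all probabilities are hypothetical and are not taken from UPTASK.

```python
# Minimal hand-rolled Bayesian update for one project-planning question:
# "Given that a deadline was missed, how likely is it that the sprint was
# overloaded?"  All probabilities are illustrative placeholders.

p_overload = 0.30                 # prior P(sprint overloaded)
p_delay_given_overload = 0.70     # P(missed deadline | overloaded)
p_delay_given_ok = 0.15           # P(missed deadline | not overloaded)

# Marginal probability of a missed deadline (law of total probability)
p_delay = (p_delay_given_overload * p_overload
           + p_delay_given_ok * (1.0 - p_overload))

# Bayes' rule: probability the sprint was overloaded given a missed deadline
p_overload_given_delay = p_delay_given_overload * p_overload / p_delay

print(f"P(delay) = {p_delay:.3f}")
print(f"P(overload | delay) = {p_overload_given_delay:.3f}")
```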

  12. Organizational prerequisites for the preservation of library collections in monastery libraries

    Directory of Open Access Journals (Sweden)

    Maja Krtalić

    2012-02-01

    Full Text Available The aim of the paper is to investigate the preservation of written heritage in monastery libraries from legislative, institutional and organizational perspectives, and to establish the organizational prerequisites necessary for improvement. Setting off from the presupposition that the library collections of monastery libraries are of immense cultural value, and can therefore be considered a cultural good and part of Croatian written heritage, the paper discusses the need for a systematic approach to their protection, both on the operative level, in the libraries themselves, and on the strategic level, by the authorities and other relevant institutions in the Republic of Croatia. In addition to the analysis of the legal and institutional frameworks and of library collection preservation projects, three case studies were conducted in Franciscan monasteries in Mostar, Požega and Zadar, including interviews with their managers and with one subject from the Croatian Institute of Librarianship. The case studies aimed to investigate the context of the preservation of library collections in monastery libraries and to provide answers to the following questions: how is preservation in monastery libraries defined; how does it differ from preservation in other libraries; and how is the preservation of collections in these libraries organized on the institutional, local and national levels? The research sets off from several core presuppositions: monastery libraries hold valuable collections of Croatian and European written heritage; the heritage collections in monastery libraries are not investigated, organized, protected or presented at an adequate level; the responsibility for their preservation is not clearly assigned; there are not enough staff trained in preservation; and improvements in the preservation and availability of library collections in monastery libraries should result from a better organization and management of the heritage preservation system in monastery libraries.

  13. Statistical Model of the 2001 Czech Census for Interactive Presentation

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří; Hora, Jan; Boček, Pavel; Somol, Petr; Pudil, Pavel

    Vol. 26, č. 4 (2010), s. 1-23 ISSN 0282-423X R&D Projects: GA ČR GA102/07/1594; GA MŠk 1M0572 Grant - others:GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : Interactive statistical model * census data presentation * distribution mixtures * data modeling * EM algorithm * incomplete data * data reproduction accuracy * data mining Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.492, year: 2010 http://library.utia.cas.cz/separaty/2010/RO/grim-0350513.pdf

  14. USE OF POWERPOINT POSSIBILITIES IN IMPLEMENTATION OF PROJECT METHOD INTO THE LEARNING PROCESS WHILE STUDYING CHEMISTRY IN PROFESSION-ORIENTED SCHOOL

    Directory of Open Access Journals (Sweden)

    Maria D. Tukalo

    2011-02-01

    Full Text Available The article reveals the importance, opportunities and prospects of introducing the PowerPoint-based project method into chemistry lessons in profession-oriented schools. This is a prerequisite in the search for innovative, effective and optimal computer-oriented teaching methods and forms that create conditions for developing the creative potential of students on the basis of competences, through the organization of individual, creative and research activities within an educational system built on modern technology. It enables teachers to design the educational process and to use generalized training methods that provide instruction with guaranteed results.

  15. Statistical model of exotic rotational correlations in emergent space-time

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, Craig; Kwon, Ohkyung; Richardson, Jonathan

    2017-06-06

    A statistical model is formulated to compute exotic rotational correlations that arise as inertial frames and causal structure emerge on large scales from entangled Planck scale quantum systems. Noncommutative quantum dynamics are represented by random transverse displacements that respect causal symmetry. Entanglement is represented by covariance of these displacements in Planck scale intervals defined by future null cones of events on an observer's world line. Light that propagates in a nonradial direction inherits a projected component of the exotic rotational correlation that accumulates as a random walk in phase. A calculation of the projection and accumulation leads to exact predictions for statistical properties of exotic Planck scale correlations in an interferometer of any configuration. The cross-covariance for two nearly co-located interferometers is shown to depart only slightly from the autocovariance. Specific examples are computed for configurations that approximate realistic experiments, and show that the model can be rigorously tested.

  16. A statistical evaluation of asbestos air concentrations

    International Nuclear Information System (INIS)

    Lange, J.H.

    1999-01-01

    Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm⁻³ of air, respectively. Summary values for area and personal samples suggest that exposures are low with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm⁻³ of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are more related to the process than individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related, that is, there is no association observed for these two sampling methods when data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate exposure for asbestos abatement workers. (author)

  17. Critical Success Factors in Construction Projects (Governmental Projects as a Case Study

    Directory of Open Access Journals (Sweden)

    Hatem Khaleefah Al-Ageeli

    2016-03-01

    Full Text Available The importance of the construction sector and its great role in the provision of services and infrastructure, in reducing poverty, improving living conditions and improving the economic situation of the country demand attention to the way in which projects are implemented, so as to improve the sector and obtain successful projects. The objective of this research was to determine the criteria for success as well as the critical success and failure factors that have a significant impact on project success. Seventy-five selected engineers (department managers, project managers and engineers) were asked to fill in the questionnaire form; sixty-seven valid questionnaire forms were analyzed statistically to obtain the results, which were as follows: twelve critical success factors, the most important of which were "contractor financial efficiency", "security, political and economic stability", "project manager competence" and "integration and clarity of contract documents"; thirteen critical failure factors, the most important of which were "corruption", "external circumstances" and "financial difficulties of owner"; and ten success criteria, the most important of which were "within allocated budget", "within time period" and "quality".

  18. Forecasting of Radiation Belts: Results From the PROGRESS Project.

    Science.gov (United States)

    Balikhin, M. A.; Arber, T. D.; Ganushkina, N. Y.; Walker, S. N.

    2017-12-01

    The overall goal of the PROGRESS project, funded within the framework of the EU Horizon 2020 programme, is to combine first-principles-based models with systems-science methodologies to achieve reliable forecasts of the geo-space particle radiation environment. PROGRESS incorporates three themes: the propagation of the solar wind to L1, the forecast of geomagnetic indices, and the forecast of fluxes of energetic electrons within the magnetosphere. One of the important aspects of the PROGRESS project is the development of statistical wave models for magnetospheric waves that affect the dynamics of energetic electrons, such as lower-band chorus, hiss and equatorial noise. The error reduction ratio (ERR) concept has been used to optimise the set of solar wind and geomagnetic parameters for the organisation of statistical wave models for these emissions. The resulting sets of parameters and statistical wave models will be presented and discussed. However, the ERR analysis also indicates that the combination of solar wind and geomagnetic parameters accounts for only part of the variance of the emissions under investigation (lower-band chorus, hiss and equatorial noise). In addition, advances in the forecast of fluxes of energetic electrons achieved by the PROGRESS project, exploiting empirical models and the first-principles IMPTAM model, are presented.

  19. On Statistical Analysis of Competing Risks with Application to the Time of First Goal

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2016-01-01

    Roč. 2, č. 10 (2016), s. 606-623, č. článku 2. ISSN 2411-2518 R&D Projects: GA ČR GA13-14445S Institutional support: RVO:67985556 Keywords : survival analysis * competing risks * sports statistics Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2016/SI/volf-0466157.pdf

  20. The Prerequisites for Implementing and Ensuring the Efficiency of Marketing Audit at the Publishing and Printing Enterprises

    Directory of Open Access Journals (Sweden)

    Bezpalko Iryna R.

    2017-03-01

    Full Text Available The article is aimed at defining the prerequisites for the successful and efficient implementation of the practice of marketing audit at domestic publishing and printing enterprises. It has been determined that, given the highly competitive market for publishing and printing services and the systemic problems of the domestic market, marketing audit can be an efficient tool of analysis and control for identifying problems in the system for managing the marketing activities of a publishing and printing enterprise, for preventively identifying non-conformity of its status with the requirements of the market environment, and for developing recommendations on how to reduce such non-conformity. Determining and taking into account the external and internal factors that affect the quality and efficiency of marketing audit of entities in the publishing and printing market is the basis for developing an organizational-economic mechanism for the implementation of such audit and for its active adoption under domestic market conditions.

  1. Statistical properties of multiphoton time-dependent three-boson coupled oscillators

    Czech Academy of Sciences Publication Activity Database

    Abdalla, M. S.; Peřina, Jan; Křepelka, Jaromír

    2006-01-01

    Roč. 23, č. 6 (2006), s. 1146-1160 ISSN 0740-3224 R&D Projects: GA MŠk(CZ) OC P11.003 Institutional research plan: CEZ:AV0Z10100522 Keywords : quantum statistic * coupled oscillators * multiphoton Subject RIV: BH - Optics, Masers, Lasers Impact factor: 2.002, year: 2006

  2. A Statistical Model for the Estimation of Natural Gas Consumption

    Czech Academy of Sciences Publication Activity Database

    Vondráček, Jiří; Pelikán, Emil; Konár, Ondřej; Čermáková, Jana; Eben, Kryštof; Malý, Marek; Brabec, Marek

    2008-01-01

    Roč. 85, c. 5 (2008), s. 362-370 ISSN 0306-2619 R&D Projects: GA AV ČR 1ET400300513 Institutional research plan: CEZ:AV0Z10300504 Keywords : nonlinear regression * gas consumption modeling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.371, year: 2008

  3. Planning international transit oil pipeline projects in Croatia

    International Nuclear Information System (INIS)

    Sekulic, G.; Vrbic, D.

    2004-01-01

    Planning and development of international oil pipeline projects are aimed primarily at enhancing the safety of crude oil supply. Pipeline development is affected by a variety of overlapping factors, such as energy- and environment-protection-related factors, as well as political, economic, legislative, social, technical and technological ones. The success of any pipeline planning, construction and operation in the present conditions will depend upon the degree to which the above factors have been brought in line with global trends. The government should create stable political, economic and legislative frameworks that will meet the global requirements of crude oil transport development. As regards (new) transportation companies, their function is to secure safe transport by providing competitive tariffs and guaranteeing environmental protection. A prerequisite for pipeline planning is to have both major crude oil consumers and producers (as well as their economic and political integrations) consider a given state or company as a potential partner for crude oil transport and transit, respectively. Croatia and the JANAF transport company have been 'chosen' as one of the priority routes for the European supply of crude oil from the Caspian region and Russia, and as one of the directions for Russian crude oil export, due to a number of advantages, opportunities and prospects for successful development. Two international oil pipeline projects - the Druzba Adria Project and the Constanta-Pancevo-Omisalj-Trieste Project - are currently under consideration. The government's commitment to these projects has been documented in the Croatian Energy Development Strategy (April 2002) and in the Programme for its implementation (March 2004). JANAF has assumed the responsibility for carrying out the project preparation activities assigned to it by the Croatian Government and the pertinent ministries. Cooperation between JANAF and government institutions is an integral part of the procedure.

  4. Positive Prerequisites for the Use of Reliefs in the Payment of Dues on Social Insurance Contributions

    Directory of Open Access Journals (Sweden)

    Zbigniew Ofiarski

    2014-06-01

    Full Text Available It is permissible to use reliefs in the payment of social security contributions, based either on a definitive waiver by the creditor of the whole or a relevant part of the amount due (partial or complete remission) or only on a temporary waiver of such amounts (payment deferral or payment in installments). The use of such reliefs is possible upon the occurrence of conditions laid down in the Act, for example in the case of total non-recovery of contributions, for economic or other reasons worth considering, or if justified by important interests of the person concerned. The prerequisites mentioned above have the nature of general clauses, allowing for their flexible adjustment to specific situations. Entities authorized to grant reliefs in the payment of social security contributions act within the limits of administrative discretion. This discretion is not unlimited, however, because the economic impact resulting from the use of such reliefs has a direct effect on the financial balance of the earmarked funds which finance social security benefits, in particular pensions, disability allowances and other benefits.

  5. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics

  6. Energy statistics. France; Statistiques energetiques. France

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-10-01

    This document summarizes in a series of tables the energy statistical data for France: consumption since 1973; energy supplies (production, imports, exports, stocks) and uses (refining, power production, internal uses, sectoral consumption) for coal, petroleum, gas, electricity, and renewable energy sources; national production and consumption of primary energy; final consumption per sector and per energy source; general indicators (energy bill, US$ change rate, prices, energy independence, internal gross product); projections. Details (resources, uses, prices, imports, internal consumption) are given separately for petroleum, natural gas, electric power and solid mineral fuels. (J.S.)

  7. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods occupy a prominent place in psychologists' educational programmes. Known to be difficult to understand and hard to learn, these contents are feared by students. Those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education makes sense only as a means of commanding respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics deals with numbers, while psychotherapy deals with subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. For this purpose, we analyzed 46 original papers of a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write and critically read original papers, as a backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research and, not least, to leave one's own professional work open to arbitrariness.

  8. Actuarial pricing of energy efficiency projects: lessons foul and fair

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul E-mail: pamathew@lbl.gov; Kromer, J. Stephen; Sezgen, Osman; Meyers, Steven

    2005-07-01

    Recent market convulsions in the energy industry have generated a plethora of post-mortem analyses on a wide range of issues, including accounting rules, corporate governance, commodity markets, and energy policy. While most of these analyses have focused on business practices related to wholesale energy trading, there has been limited analysis of retail energy services, particularly energy efficiency projects. We suggest that there were several business concepts and strategies in the energy efficiency arena whose inherent value may have been masked by the larger failure of companies such as Enron. In this paper, we describe one such concept, namely, actuarial pricing of energy efficiency projects, which leverages a portfolio-based approach to risk management. First, we discuss the business drivers, contrasting this approach with conventional industry practice. We then describe the implementation of this approach, including an actuarial database, pricing curves, and a pricing process compatible with commodity pricing. We conclude with a discussion of the prospects and barriers for the further development of transparent and quantifiable risk management products for energy efficiency, a prerequisite for developing energy efficiency as a tradeable commodity. We address these issues from an experiential standpoint, drawing mostly on our experience in developing and implementing such strategies at Enron.

  9. Actuarial pricing of energy efficiency projects: lessons foul and fair

    International Nuclear Information System (INIS)

    Mathew, Paul; Kromer, J. Stephen; Sezgen, Osman; Meyers, Steven

    2005-01-01

    Recent market convulsions in the energy industry have generated a plethora of post-mortem analyses on a wide range of issues, including accounting rules, corporate governance, commodity markets, and energy policy. While most of these analyses have focused on business practices related to wholesale energy trading, there has been limited analysis of retail energy services, particularly energy efficiency projects. We suggest that there were several business concepts and strategies in the energy efficiency arena whose inherent value may have been masked by the larger failure of companies such as Enron. In this paper, we describe one such concept, namely, actuarial pricing of energy efficiency projects, which leverages a portfolio-based approach to risk management. First, we discuss the business drivers, contrasting this approach with conventional industry practice. We then describe the implementation of this approach, including an actuarial database, pricing curves, and a pricing process compatible with commodity pricing. We conclude with a discussion of the prospects and barriers for the further development of transparent and quantifiable risk management products for energy efficiency, a prerequisite for developing energy efficiency as a tradeable commodity. We address these issues from an experiential standpoint, drawing mostly on our experience in developing and implementing such strategies at Enron

  10. Energy statistics. France. August 2001

    International Nuclear Information System (INIS)

    2001-08-01

    This document summarizes in a series of tables the statistical data relative to the production, consumption, supplies, resources, and prices of energies in France: 1 - all energies (coal, oil, gas, electric power, renewable energies): supplies, uses per sector, national production and consumption of primary energies, final consumption, general indicators (energy bill, US$ change rate, prices index, prices of imported crude oil, energy independence, internal gross product, evolution between 1973 and 2000, and projections for 2020). 2 - detailed data per energy source (petroleum, natural gas, electric power, solid mineral fuels): resources, uses, and prices. An indicative comparison is made with the other countries of the European Union. (J.S.)

  11. Variance in parametric images: direct estimation from parametric projections

    International Nuclear Information System (INIS)

    Maguire, R.P.; Leenders, K.L.; Spyrou, N.M.

    2000-01-01

    Recent work has shown that it is possible to apply linear kinetic models to dynamic projection data in PET in order to calculate parameter projections. These can subsequently be back-projected to form parametric images - maps of parameters of physiological interest. Critical to the application of these maps, to test for significant changes between normal and pathophysiology, is an assessment of the statistical uncertainty. In this context, parametric images also include simple integral images from, e.g., [O-15]-water used to calculate statistical parametric maps (SPMs). This paper revisits the concept of parameter projections and presents a more general formulation of the parameter projection derivation as well as a method to estimate parameter variance in projection space, showing which analysis methods (models) can be used. Using simulated pharmacokinetic image data we show that a method based on an analysis in projection space inherently calculates the mathematically rigorous pixel variance. This results in an estimation which is as accurate as either estimating variance in image space during model fitting, or estimation by comparison across sets of parametric images - as might be done between individuals in a group pharmacokinetic PET study. The method based on projections has, however, a higher computational efficiency, and is also shown to be more precise, as reflected in smooth variance distribution images when compared to the other methods. (author)
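
    As a generic illustration of estimating a parameter and its variance directly from a fit in projection space, the sketch below performs a weighted least-squares fit of a linear model to one simulated projection-bin time course and reads the parameter covariance off the normal equations; this is a textbook construction offered as an aid, not the specific formulation of the cited paper.

```python
import numpy as np

# Weighted least-squares fit of a linear kinetic model y = X @ p to a
# (simulated) projection-bin time course, with the parameter covariance
# obtained analytically from the same fit.
rng = np.random.default_rng(2)
n_frames = 20
t = np.linspace(1.0, 60.0, n_frames)                 # frame mid-times (min)
X = np.column_stack([np.ones(n_frames), t])          # design: intercept, slope
p_true = np.array([5.0, 0.4])
sigma = 0.8 * np.sqrt(t / t.mean())                  # noise s.d. per frame
y = X @ p_true + rng.normal(0.0, sigma)              # one projection-bin time course

W = np.diag(1.0 / sigma**2)                          # weights = inverse variances
XtWX = X.T @ W @ X
p_hat = np.linalg.solve(XtWX, X.T @ W @ y)           # weighted LS estimate
cov_p = np.linalg.inv(XtWX)                          # parameter covariance matrix

print("estimated parameters:", p_hat)
print("parameter standard errors:", np.sqrt(np.diag(cov_p)))
```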

  12. Inter-comparison of statistical downscaling methods for projection of extreme precipitation in Europe

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Hundecha, Y.; Lawrence, D.

    2015-01-01

    Information on extreme precipitation for future climate is needed to assess the changes in the frequency and intensity of flooding. The primary source of information in climate change impact studies is climate model projections. However, due to the coarse resolution and biases of these models......), three are bias correction (BC) methods, and one is a perfect prognosis method. The eight methods are used to downscale precipitation output from 15 regional climate models (RCMs) from the ENSEMBLES project for 11 catchments in Europe. The overall results point to an increase in extreme precipitation...... that at least 30% and up to approximately half of the total variance is derived from the SDMs. This study illustrates the large variability in the expected changes in extreme precipitation and highlights the need for considering an ensemble of both SDMs and climate models. Recommendations are provided...

  13. Statistical Compilation of the ICT Sector and Policy Analysis

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. ... The outcomes of such efforts will subsequently inform policy discourse and ... Studies. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia ... Asian outlook: New growth dependent on new productivity.

  14. Measured PET Data Characterization with the Negative Binomial Distribution Model.

    Science.gov (United States)

    Santarelli, Maria Filomena; Positano, Vincenzo; Landini, Luigi

    2017-01-01

    An accurate statistical model of PET measurements is a prerequisite for correct image reconstruction when using statistical image reconstruction algorithms, or when pre-filtering operations must be performed. Although radioactive decay follows a Poisson distribution, deviation from Poisson statistics occurs in projection data prior to reconstruction due to physical effects, measurement errors, and the correction of scatter and random coincidences. Modelling projection data can aid in understanding the statistical nature of the data in order to develop efficient processing methods and to reduce noise. This paper outlines the statistical behaviour of measured emission data by evaluating the goodness of fit of the negative binomial (NB) distribution model to PET data for a wide range of emission activity values. An NB distribution model is characterized by the mean of the data and the dispersion parameter α that describes the deviation from Poisson statistics. Monte Carlo simulations were performed to evaluate: (a) the performance of the dispersion parameter α estimator, and (b) the goodness of fit of the NB model for a wide range of activity values. We focused on the effect produced by correction for random and scatter events in the projection (sinogram) domain, owing to their importance in the quantitative analysis of PET data. The analysis developed herein allowed us to assess the accuracy of the NB distribution model in fitting corrected sinogram data, and to evaluate the sensitivity of the dispersion parameter α for quantifying deviation from Poisson statistics. The sinogram ROI-based analysis demonstrated that deviation of the measured data from Poisson statistics can be quantitatively characterized by the dispersion parameter α, under any noise conditions and corrections.
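
    A hedged sketch of one simple way to quantify over-dispersion with the parameterisation used above (variance = mean + α·mean²) is shown below; it applies a method-of-moments estimate of α to synthetic sinogram-like counts and is an illustration, not the estimator evaluated in the paper.

```python
import numpy as np

def nb_dispersion_mom(counts):
    """Method-of-moments estimate of the dispersion parameter alpha for a
    negative binomial model with variance = mu + alpha * mu**2.
    alpha -> 0 recovers the Poisson case; alpha > 0 indicates
    over-dispersion relative to Poisson (illustrative estimator only)."""
    counts = np.asarray(counts, dtype=float)
    m = counts.mean()
    s2 = counts.var(ddof=1)
    return max((s2 - m) / m**2, 0.0)

# Example: counts in a sinogram region of interest (synthetic)
rng = np.random.default_rng(3)
mu, alpha = 50.0, 0.05                     # NB mean and dispersion
n_nb = 1.0 / alpha                         # numpy's (n, p) parameterisation
p_nb = n_nb / (n_nb + mu)
roi_counts = rng.negative_binomial(n_nb, p_nb, size=5000)

print("estimated alpha:", nb_dispersion_mom(roi_counts))                # near 0.05
print("Poisson-like data:", nb_dispersion_mom(rng.poisson(mu, 5000)))   # near 0
```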

  15. Prerequisites for carbon capture and storage (CCS) in Sweden - a synthesis of the Baltic Sea Project; Foerutsaettningar foer avskiljning och lagring av koldioxid (CCS) i Sverige - En syntes av Oestersjoeprojektet

    Energy Technology Data Exchange (ETDEWEB)

    Gode, Jenny; Stigson, Peter; Hoeglund, Jonas; Bingel, Eva

    2011-07-01

    This publication summarizes a project on carbon capture and storage (CCS) in the Baltic region conducted at the initiative of the Energy Agency. The project is called 'the Baltic Project', and its aim has been to highlight the prospects for CCS in Sweden and how the Baltic Sea region affects these prospects.

  16. PINGU and the neutrino mass hierarchy: Statistical and systematical aspects

    International Nuclear Information System (INIS)

    Capozzi, F.; Marrone, A.; Lisi, E.

    2016-01-01

    The proposed PINGU project (Precision IceCube Next Generation Upgrade) is intended to determine the neutrino mass hierarchy through matter effects on atmospheric neutrinos crossing the Earth's core and mantle, which lead to variations in the event spectrum in energy and zenith angle. The presence of non-negligible (and partly unknown) systematics on the spectral shape can make the statistical analysis particularly challenging in the limit of high statistics. Assuming plausible spectral shape uncertainties at the percent level (due to effective volume, cross section, resolution functions, oscillation parameters, etc.), we obtain a significant reduction in the sensitivity to the hierarchy. The results obtained show the importance of a dedicated research programme aimed at a better characterization and reduction of the uncertainties in future high-statistics experiments with atmospheric neutrinos.

  17. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

    In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of the quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► Arguing that many results on q-deformation distributions in the literature are inaccurate or incomplete
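
    For reference, the textbook form of the Gentile occupation number with maximum occupation number M is quoted below as an aid; it reduces to Fermi–Dirac for M = 1 and to Bose–Einstein as M → ∞ (this is the standard expression, not a formula reproduced from the paper):

\[
  \langle n \rangle_{\text{Gentile}}
  = \frac{1}{e^{\beta(\varepsilon-\mu)}-1}
  - \frac{M+1}{e^{(M+1)\beta(\varepsilon-\mu)}-1},
  \qquad
  M = 1 \;\Rightarrow\; \text{Fermi--Dirac},
  \quad
  M \to \infty \;\Rightarrow\; \text{Bose--Einstein}.
\]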

  18. Decision support for choice optimal power generation projects: Fuzzy comprehensive evaluation model based on the electricity market

    International Nuclear Information System (INIS)

    Liang Zhihong; Yang Kun; Sun Yaowei; Yuan Jiahai; Zhang Hongwei; Zhang Zhizheng

    2006-01-01

    In 2002, China began to encourage restructuring of the electric power sector in order to improve its performance. In particular, with the rapid increase of electricity demand in China, there is a need for non-utility generation investment that cannot be met by government finance alone. However, a first prerequisite is that regulators and decision-makers (DMs) should carefully consider how to balance the need to attract private investment against the policy objectives of minimizing monopoly power and fostering competitive markets. So, in the interim stage of the electricity market, a decentralized decision-making process should eventually replace centralized generation capacity expansion planning. In this paper, firstly, on the basis of the current situation, a model for evaluating generation projects through the combined use of fuzzy appraisal and the analytic hierarchy process (AHP) is developed. Secondly, a case study of generation project evaluation in China is presented to illustrate the effectiveness of the model in selecting optimal generation projects and attracting private investors. In the case study, with a view to attracting adequate private investment and promoting energy conservation in China, the five most promising policy instruments selected as evaluation factors are project duration, project costs, predicted on-grid price level, environmental protection, and enterprise credit grading and performance. Finally, a comprehensive framework is designed that enables the DM to concentrate better and to make sounder decisions by combining the proposed model with modern computer science.
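
    An illustrative sketch of the AHP step in such a model is given below: criterion weights are derived from a Saaty-style pairwise-comparison matrix via its principal eigenvector and then used to aggregate per-criterion scores for candidate projects. The comparison values, factor names and project scores are placeholders, not those of the cited study.

```python
import numpy as np

criteria = ["duration", "cost", "on-grid price", "environment", "credit/performance"]

# Reciprocal pairwise-comparison matrix (illustrative judgements of importance)
A = np.array([
    [1,   1/2, 1/3, 1,   2],
    [2,   1,   1/2, 2,   3],
    [3,   2,   1,   3,   4],
    [1,   1/2, 1/3, 1,   2],
    [1/2, 1/3, 1/4, 1/2, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                   # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # normalised criterion weights

# Consistency ratio (random index RI for a 5x5 matrix is about 1.12)
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print("weights:", dict(zip(criteria, np.round(w, 3))), "CR:", round(ci / 1.12, 3))

# Fuzzy-style evaluation: each project scored on [0, 1] per criterion,
# aggregated with the AHP weights.
scores = np.array([[0.6, 0.7, 0.8, 0.5, 0.9],     # project A
                   [0.8, 0.5, 0.6, 0.7, 0.6]])    # project B
print("project scores:", scores @ w)
```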

  19. Technical prerequisites for efficient drive systems - Fundamentals for SwissEnergy measures; Technische Grundlagen effizienter Antriebssysteme. Grundlagen fuer Aktionen (Massnahmen) von Energieschweiz

    Energy Technology Data Exchange (ETDEWEB)

    Schnyder, G.; Ritz, Ch.

    2007-03-15

    This final report for the Swiss Federal Office of Energy (SFOE) reports on the technical prerequisites necessary for the implementation of various measures that are to be taken to promote efficient electrical drive systems. The document defines the approach taken and describes the methodologies to be used, including market analysis, the collection of basic data, the definition of measures and the acquisition of partners. The potential for making savings is estimated. Eight areas of action are defined, including the organisation of tutorials, exchange of experience, knowledge transfer, basic consulting services, the deployment of consultants, the setting-up of an Internet portal, information transfer in conferences and the optimisation of auxiliaries in domestic installations. A comprehensive annex completes the report.

  20. Supporting statistics in the workplace: experiences with two hospitals

    Directory of Open Access Journals (Sweden)

    M. Y. Mortlock

    2003-01-01

    Full Text Available This paper provides some reflections on the promotion of lifelong learning in statistics in the workplace. The initiative from which the reflections are drawn is a collaboration between a university and two public hospitals, one of whose stated aims is to develop statistical skills among the hospitals' researchers. This is realized in the provision of 'biostatistical clinics' in which workplace teaching and learning of statistics take place in one-on-one or small-group situations. The central issue that is identified is the need to accommodate diversity: in the backgrounds, motivations and learning needs of workplace learners (in this case medical researchers), in the workplace environments themselves, and in the projects encountered. Operational issues for the statistician providing such training are addressed. These considerations may reflect the experiences of the wider community of statisticians involved in service provision within a larger organization.

  1. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  2. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    Science.gov (United States)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  3. Time Overrun in Construction Project

    Science.gov (United States)

    Othman, I.; Shafiq, Nasir; Nuruddin, M. F.

    2017-12-01

    Timely completion is a key criterion for achieving success in any project, regardless of the industry. Unfortunately, the construction industry in Malaysia has been labelled as an industry with poor performance, leading to failure to achieve effective time management. As a consequence, most projects face a huge amount of time overrun. This study assesses the causes of time overrun in construction projects in Malaysia using a structured questionnaire survey. Each respondent was asked to assign a one-to-five rating to each of the 18 time factors identified from the literature review. Out of the 50 questionnaires sent out, 33 were received back, representing a response rate of 68%. Data received from the questionnaires were analysed and processed using descriptive statistics procedures. Findings from the study revealed that design and documentation issues, project management and contract administration, ineffective project planning and scheduling, the contractor's site management, and financial resource management were the major factors causing time overrun. It is hoped that this study will help practitioners to implement mitigation measures at the planning stage in order to achieve successful construction projects.

  4. Special study for the statistical evaluation of groundwater data trends. Final report

    International Nuclear Information System (INIS)

    1993-05-01

    Analysis of trends over time in the concentrations of chemicals in groundwater at Uranium Mill Tailings Remedial Action (UMTRA) Project sites can provide valuable information for monitoring the performance of disposal cells and the effectiveness of groundwater restoration activities. Random variation in data may obscure real trends or may produce the illusion of a trend where none exists, so statistical methods are needed to reliably detect and estimate trends. Trend analysis includes both trend detection and estimation. Trend detection uses statistical hypothesis testing and provides a yes or no answer regarding the existence of a trend. Hypothesis tests try to reach a balance between false negative and false positive conclusions. To quantify the magnitude of a trend, estimation is required. This report presents the statistical concepts that are necessary for understanding trend analysis. The types of patterns most likely to occur in UMTRA data sets are emphasized. Two general approaches to analyzing data for trends are proposed and recommendations are given to assist UMTRA Project staff in selecting an appropriate method for their site data. Trend analysis is much more difficult when data contain values less than the reported laboratory detection limit. The complications that arise are explained. This report also discusses the impact of data collection procedures on statistical trend methods and offers recommendations to improve the efficiency of the methods and reduce sampling costs. Guidance for determining how many sampling rounds might be needed by statistical methods to detect trends of various magnitudes is presented. This information could be useful in planning site monitoring activities
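
    One widely used nonparametric option for the trend-detection step described above is the Mann-Kendall test; the hedged sketch below implements its basic form (ignoring ties and values below the detection limit, complications the report treats separately) and is offered as an illustration rather than the report's prescribed method.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Basic Mann-Kendall test for a monotonic trend in a time series.
    Returns the S statistic and a two-sided p-value (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # S = number of increasing pairs minus number of decreasing pairs
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # variance of S without ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))          # two-sided p-value
    return s, p

# Example: quarterly concentrations with a weak upward trend plus noise
rng = np.random.default_rng(4)
conc = 0.02 * np.arange(24) + rng.normal(0, 0.3, 24)
print(mann_kendall(conc))
```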

  5. New selection effect of statistical investigations of supernova remnants

    International Nuclear Information System (INIS)

    Allakhverdiev, A.O.; Gusejnov, O.Kh.; Kasumov, F.K.

    1986-01-01

    The influence of H II regions on the parameters of supernova remnants (SNR) is investigated. It has been shown that the projection of such regions onto the SNRs leads to local changes in the morphological structure of young shell-type SNRs and considerable distortions of the integral parameters of evolved shell-type SNRs (with D > 10 pc) and plerions, up to their complete undetectability against the background of classical and gigantic H II regions. A new selection effect, in fact, arises from these factors, connected with the additional limitations that the real structure of the interstellar medium places on statistical investigations of SNRs. The influence of this effect on the statistical completeness of objects has been estimated.

  6. New selection effect in statistical investigations of supernova remnants

    Science.gov (United States)

    Allakhverdiev, A. O.; Guseinov, O. Kh.; Kasumov, F. K.

    1986-01-01

    The influence of H II regions on the parameters of supernova remnants (SNR) is investigated. It has been shown that the projection of such regions on the SNRs leads to: a) local changes of morphological structure of young shell-type SNRs and b) considerable distortions of integral parameters of evolved shell-type SNRs (with D > 10 pc) and plerions, up to their complete undetectability on the background of classical and gigantic H II regions. A new selection effect, in fact, arises from these factors connected with additional limitations made by the real structure of the interstellar medium on the statistical investigations of SNRs. The influence of this effect on the statistical completeness of objects has been estimated.

  7. A statistical evaluation of asbestos air concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Lange, J.H. [Envirosafe Training and Consultants, Pittsburgh, PA (United States)

    1999-07-01

    Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm⁻³ of air, respectively. Summary values for area and personal samples suggest that exposures are low with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm⁻³ of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are more related to the process than individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related, that is, there is no association observed for these two sampling methods when data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate exposure for asbestos abatement workers. (author)

  8. The Kolmogorov-Obukhov Statistical Theory of Turbulence

    Science.gov (United States)

    Birnir, Björn

    2013-08-01

    In 1941 Kolmogorov and Obukhov postulated the existence of a statistical theory of turbulence, which allows the computation of statistical quantities that can be simulated and measured in a turbulent system. These are quantities such as the moments, the structure functions and the probability density functions (PDFs) of the turbulent velocity field. In this paper we will outline how to construct this statistical theory from the stochastic Navier-Stokes equation. The additive noise in the stochastic Navier-Stokes equation is generic noise given by the central limit theorem and the large deviation principle. The multiplicative noise consists of jumps multiplying the velocity, modeling jumps in the velocity gradient. We first estimate the structure functions of turbulence and establish the Kolmogorov-Obukhov 1962 scaling hypothesis with the She-Leveque intermittency corrections. Then we compute the invariant measure of turbulence, writing the stochastic Navier-Stokes equation as an infinite-dimensional Ito process, and solving the linear Kolmogorov-Hopf functional differential equation for the invariant measure. Finally we project the invariant measure onto the PDF. The PDFs turn out to be the normalized inverse Gaussian (NIG) distributions of Barndorff-Nilsen, and compare well with PDFs from simulations and experiments.
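
    For orientation, the structure-function scaling and the She-Leveque form of the intermittency-corrected exponents referred to above are commonly written as follows (standard expressions quoted as an aid, not reproduced from the paper); the uncorrected Kolmogorov 1941 value is ζ_p = p/3, while the corrected form below still gives ζ_3 = 1:

\[
  S_p(\ell) \;=\; \langle |\, u(x+\ell) - u(x) \,|^p \rangle \;\sim\; \ell^{\,\zeta_p},
  \qquad
  \zeta_p \;=\; \frac{p}{9} \;+\; 2\left[\,1 - \left(\tfrac{2}{3}\right)^{p/3}\right].
\]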

  9. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  10. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  11. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  12. Role Of Non-Governmental Organizations Leadership In The Implementation Of Community Development Projects In Arumeru District Tanzania

    Directory of Open Access Journals (Sweden)

    Rajabu Ally Mtunge

    2015-08-01

    Full Text Available The purpose of this study was to examine the role of leadership in the implementation of community development projects by local non-governmental organizations in Arumeru District, Tanzania. The study applied a survey design covering a sample of 46 respondents, including the District Executive Director, district social workers, non-governmental organization leaders, workers and volunteers, and community members in Arumeru District, Tanzania. The study employed a simple random sampling technique to ensure that each individual had an equal chance of being included, since inferential statistics were considered. Data were collected from the sample of 46 NGO employees using a semi-structured questionnaire with both closed and open-ended questions. The collected data were analyzed using both descriptive and inferential statistics. The descriptive statistical tools used included frequencies, means and standard deviations, while the inferential statistical tool used was correlation. The Statistical Package for the Social Sciences (SPSS) version 19 was used for analyzing the data collected. The study achieved a response of 46 out of a sample of 47, representing a response rate of 97.87%. The results show that a significant number of NGOs (34.8%) had not completed their projects, 21.7% stated that fewer than five projects were complete, and 43.5% of the respondents confirmed that more than five projects had not been completed over the last year. Regarding the influence of leadership on the implementation of projects, Spearman's rank correlation revealed a very strong positive correlation (0.910) between leadership vision and the implementation of community development projects, a strong positive correlation between communication and the implementation of community development projects (rho = 0.730, n = 46, p = .001), a strong positive correlation between commitment and the implementation of community development projects, which was statistically significant (rs = .601, p = .000), and a positive correlation between accountability and
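
    A minimal sketch of the kind of Spearman rank correlation reported above is shown below, using synthetic Likert-scale ratings for n = 46 respondents (SciPy is assumed; the variable names and data are illustrative, not the study's).

```python
import numpy as np
from scipy import stats

# Synthetic 1-5 ratings of a leadership attribute and an implementation score
rng = np.random.default_rng(5)
leadership_vision = rng.integers(1, 6, size=46)                # 1-5 ratings
implementation = np.clip(
    leadership_vision + rng.normal(0, 0.8, size=46), 1, 5)     # correlated score

rho, p_value = stats.spearmanr(leadership_vision, implementation)
print(f"rho = {rho:.3f}, n = 46, p = {p_value:.4f}")
```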

  13. Sparse Power-Law Network Model for Reliable Statistical Predictions Based on Sampled Data

    Directory of Open Access Journals (Sweden)

    Alexander P. Kartun-Giles

    2018-04-01

    Full Text Available A projective network model is a model that enables predictions to be made based on a subsample of the network data, with the predictions remaining unchanged if a larger sample is taken into consideration. An exchangeable model is a model that does not depend on the order in which nodes are sampled. Despite a large variety of non-equilibrium (growing) and equilibrium (static) sparse complex network models that are widely used in network science, how to reconcile sparseness (constant average degree) with the desired statistical properties of projectivity and exchangeability is currently an outstanding scientific problem. Here we propose a network process with hidden variables which is projective and can generate sparse power-law networks. Despite the model not being exchangeable, it can be closely related to exchangeable uncorrelated networks as indicated by its information theory characterization and its network entropy. The use of the proposed network process as a null model is here tested on real data, indicating that the model offers a promising avenue for statistical network modelling.

  14. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    Science.gov (United States)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
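
    The model above derives its rain-rate parameters by applying extreme value statistics to National Weather Service rainfall records. A minimal sketch of that general step (not the ACTS model itself) is to fit a generalized extreme value distribution to a series of annual maximum rain rates and read off a return level; the data below are synthetic.

```python
# Sketch only: fit a generalized extreme value (GEV) distribution to synthetic
# annual-maximum rain rates and estimate the rate exceeded once in 10 years.
# This illustrates the extreme-value step, not the full attenuation model.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
annual_max_rain_rate = rng.gumbel(loc=40.0, scale=12.0, size=30)   # mm/h, synthetic

shape, loc, scale = genextreme.fit(annual_max_rain_rate)
r_10yr = genextreme.ppf(1 - 1.0 / 10.0, shape, loc=loc, scale=scale)
print(f"Estimated 10-year return-level rain rate: {r_10yr:.1f} mm/h")
```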

  15. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. The fault tree analysis is used for the determination of risks in the transportation system. This method offers a possibility for the determination of frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  16. Suppression of intensity transition artifacts in statistical x-ray computer tomography reconstruction through Radon inversion initialization

    International Nuclear Information System (INIS)

    Zbijewski, Wojciech; Beekman, Freek J.

    2004-01-01

    Statistical reconstruction (SR) methods provide a general and flexible framework for obtaining tomographic images from projections. For several applications SR has been shown to outperform analytical algorithms in terms of resolution-noise trade-off achieved in the reconstructions. A disadvantage of SR is the long computational time required to obtain the reconstructions, in particular when large data sets characteristic for x-ray computer tomography (CT) are involved. As was shown recently, by combining statistical methods with block iterative acceleration schemes [e.g., like in the ordered subsets convex (OSC) algorithm], the reconstruction time for x-ray CT applications can be reduced by about two orders of magnitude. There are, however, some factors lengthening the reconstruction process that hamper both accelerated and standard statistical algorithms to similar degree. In this simulation study based on monoenergetic and scatter-free projection data, we demonstrate that one of these factors is the extremely high number of iterations needed to remove artifacts that can appear around high-contrast structures. We also show (using the OSC method) that these artifacts can be adequately suppressed if statistical reconstruction is initialized with images generated by means of Radon inversion algorithms like filtered back projection (FBP). This allows the reconstruction time to be shortened by even as much as one order of magnitude. Although the initialization of the statistical algorithm with FBP image introduces some additional noise into the first iteration of OSC reconstruction, the resolution-noise trade-off and the contrast-to-noise ratio of final images are not markedly compromised
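
    The point of the study above is that initializing a statistical (iterative) reconstruction with a Radon-inversion image such as FBP suppresses transition artifacts and shortens convergence. A rough analogue using scikit-image (SART standing in for the OSC algorithm used in the paper) shows the mechanics of seeding an iterative method with an FBP image; treat it as a sketch under these assumptions, not a reproduction of the authors' method.

```python
# Sketch: initialize an iterative (SART) reconstruction with an FBP image,
# analogous to seeding a statistical algorithm with a Radon-inversion result.
# Uses scikit-image; SART stands in for the OSC algorithm of the paper.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, iradon_sart, rescale

phantom = rescale(shepp_logan_phantom(), 0.5)           # smaller for speed
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=angles)

fbp_image = iradon(sinogram, theta=angles, filter_name="ramp")   # analytic start
recon = iradon_sart(sinogram, theta=angles, image=fbp_image)     # seeded iteration
recon = iradon_sart(sinogram, theta=angles, image=recon)         # further iteration

print("RMSE after seeded iterations:",
      np.sqrt(np.mean((recon - phantom) ** 2)))
```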

  17. Statistical methods for assessing agreement between continuous measurements

    DEFF Research Database (Denmark)

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    Background: Clinical research often involves study of agreement amongst observers. Agreement can be measured in different ways, and one can obtain quite different values depending on which method one uses. Objective: We review the approaches that have been discussed to assess the agreement between...... continuous measures and discuss their strengths and weaknesses. Different methods are illustrated using actual data from the 'Delay in diagnosis of cancer in general practice' project in Aarhus, Denmark. Subjects and Methods: We use weighted kappa-statistic, intraclass correlation coefficient (ICC......), concordance coefficient, Bland-Altman limits of agreement and percentage of agreement to assess the agreement between patient reported delay and doctor reported delay in diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product...
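
    One of the methods compared above, the Bland-Altman limits of agreement, reduces to simple summary statistics of the paired differences. A minimal sketch with synthetic paired delay measurements (not the Aarhus data) follows:

```python
# Minimal Bland-Altman sketch on synthetic paired measurements
# (e.g. patient-reported vs doctor-reported delay); not the actual study data.
import numpy as np

rng = np.random.default_rng(2)
patient_delay = rng.normal(30.0, 10.0, size=100)            # days, synthetic
doctor_delay = patient_delay + rng.normal(2.0, 5.0, size=100)

diff = patient_delay - doctor_delay
bias = diff.mean()
sd = diff.std(ddof=1)
lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
print(f"Bias = {bias:.2f} days, 95% limits of agreement: [{lower:.2f}, {upper:.2f}]")
```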

  18. Comparison of precipitation nowcasting by extrapolation and statistical-advection methods

    Czech Academy of Sciences Publication Activity Database

    Sokol, Zbyněk; Kitzmiller, D.; Pešice, Petr; Mejsnar, Jan

    2013-01-01

    Roč. 123, 1 April (2013), s. 17-30 ISSN 0169-8095 R&D Projects: GA MŠk ME09033 Institutional support: RVO:68378289 Keywords : Precipitation forecast * Statistical models * Regression * Quantitative precipitation forecast * Extrapolation forecast Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 2.421, year: 2013 http://www.sciencedirect.com/science/article/pii/S0169809512003390

  19. Statistical data processing of mobility curves of univalent weak bases

    Czech Academy of Sciences Publication Activity Database

    Šlampová, Andrea; Boček, Petr

    2008-01-01

    Roč. 29, č. 2 (2008), s. 538-541 ISSN 0173-0835 R&D Projects: GA AV ČR IAA400310609; GA ČR GA203/05/2106 Institutional research plan: CEZ:AV0Z40310501 Keywords : mobility curve * univalent weak bases * statistical evaluation Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.509, year: 2008

  20. [AN OVERALL SOUND PROCESS] Syntactic parameters, statistic parameters, and universals

    Directory of Open Access Journals (Sweden)

    Nicolas Meeùs

    2016-05-01

    My paper intends to show that comparative musicology, in fact if not in principle, appears inherently linked to the syntactic elements of music – and so too is any encyclopedic project aiming at uncovering universals in music. Not that statistical elements cannot be universal, but they cannot be commented on as such, because they remain largely unquantifiable.

  1. Relativistic beaming and quasar statistics

    International Nuclear Information System (INIS)

    Orr, M.J.L.; Browne, I.W.A.

    1982-01-01

    The statistical predictions of a unified scheme for the radio emission from quasars are explored. This scheme attributes the observed differences between flat- and steep-spectrum quasars to projection and the effects of relativistic beaming of the emission from the nuclear components. We use a simple quasar model consisting of a compact relativistically beamed core with spectral index zero and unbeamed lobes, spectral index - 1, to predict the proportion of flat-spectrum sources in flux-limited samples selected at different frequencies. In our model this fraction depends on the core Lorentz factor, γ and we find that a value of approximately 5 gives satisfactory agreement with observation. In a similar way the model is used to construct the expected number/flux density counts for flat-spectrum quasars from the observed steep-spectrum counts. Again, good agreement with the observations is obtained if the average core Lorentz factor is about 5. Independent estimates of γ from observations of superluminal motion in quasars are of the same order of magnitude. We conclude that the statistical properties of quasars are entirely consistent with the predictions of simple relativistic-beam models. (author)

  2. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  3. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  4. Application for approval of the Cold Lake Expansion Project: volume 3: environmental impact assessment: socio-economic assessment

    International Nuclear Information System (INIS)

    Anon.

    1997-02-01

    Anticipated social and economic effects from the construction and operation of the proposed Cold Lake Expansion Project in Alberta were described. The regions and communities that would be directly affected by the project were identified and project impacts with respect to the following areas were analyzed: population and demographics, local economy and labour force, education, health, social and recreational services, protective and emergency services, physical community infrastructure, real estate, and current community planning and land use initiatives. The main sources of information used for the analysis were public statistical data compiled by Statistics Canada, provincial government statistics and municipal planning documents. The overall social impact of the project was expected to range from low to medium. 50 refs., 44 tabs., 6 figs

  5. Statistical mechanics for a class of quantum statistics

    International Nuclear Information System (INIS)

    Isakov, S.B.

    1994-01-01

    Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived

  6. METHODOLOGY FOR PROJECTION OF OCCUPATIONAL TRENDS IN THE DENVER STANDARD METROPOLITAN STATISTICAL AREA.

    Science.gov (United States)

    FISHMAN, LESLIE; AND OTHERS

    The final stage of a program for achieving a balance between the aggregate supply and demand for labor is the distribution of the industry employment estimates into occupational requirements based on projections of occupational patterns by industry. This can be used to evaluate potential areas of substantial surplus or shortage, and provide the…

  7. Statistical methods for longitudinal data with agricultural applications

    DEFF Research Database (Denmark)

    Anantharama Ankinakatte, Smitha

    The PhD study focuses on modeling two kinds of longitudinal data arising in agricultural applications: continuous time series data and discrete longitudinal data. Firstly, two statistical methods, neural networks and generalized additive models, are applied to predict mastitis using multivariate... algorithm. This was found to compare favourably with the algorithm implemented in the well-known Beagle software. Finally, an R package to apply APFA models developed as part of the PhD project is described...

  8. From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities

    Science.gov (United States)

    Kunjwal, Ravi; Spekkens, Robert W.

    2018-05-01

    The Kochen-Specker theorem rules out models of quantum theory wherein projective measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly projective. For nonprojective measurements, therefore, one must drop the requirement that an outcome be assigned deterministically in the model and merely require that it be assigned a distribution over outcomes in a manner that is context-independent. By demanding context independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring an operational analog of the quantum notion of projectiveness. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It significantly extends previous techniques that worked only for logical proofs, which are based on sets of projective measurements that fail to admit of any deterministic noncontextual assignment, to the case of statistical proofs, which are based on sets of projective measurements that do admit of some deterministic noncontextual assignments, but not enough to explain the quantum statistics.

  9. Assessment of hi-resolution multi-ensemble statistical downscaling regional climate scenarios over Japan

    Science.gov (United States)

    Dairaku, K.

    2017-12-01

    The Asia-Pacific regions are increasingly threatened by large-scale natural disasters. There are growing concerns that losses and damages from natural disasters will be further exacerbated by climate change and socio-economic change. Climate information and services for risk assessments are of great concern. Fundamental regional climate information is indispensable for understanding the changing climate and making decisions on when and how to act. To meet the needs of stakeholders such as national and local governments, spatio-temporally comprehensive and consistent information is necessary and useful for decision making. Multi-model ensemble regional climate scenarios with 1 km horizontal grid spacing over Japan are developed using 37 CMIP5 GCMs (RCP8.5) and a statistical downscaling method (Bias Corrected Spatial Disaggregation, BCSD) to investigate the uncertainty of projected change associated with structural differences among the GCMs for the historical climate (1950-2005) and near-future climate (2026-2050) periods. The statistical downscaling regional climate scenarios show good performance for annual and seasonal averages of precipitation and temperature. The regional climate scenarios show a systematic underestimate of extreme events, such as hot days over 35 °C and annual maximum daily precipitation, because of the interpolation processes in the BCSD method. Each model projects a different response in the near-future climate because of structural differences. Most of the 37 CMIP5 models show a qualitatively consistent increase of average and extreme temperature and precipitation. The added value of statistical/dynamical downscaling methods is also investigated for locally forced nonlinear phenomena such as extreme events.
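
    The BCSD method referred to above combines a bias-correction step (quantile mapping against observations) with spatial disaggregation. The fragment below sketches only the quantile-mapping step for a single grid cell, using synthetic data; it is not the full BCSD implementation used in the study.

```python
# Sketch of empirical quantile mapping, the bias-correction step of BCSD,
# for one grid cell and one variable. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(3)
obs = rng.gamma(shape=2.0, scale=3.0, size=2000)        # observed daily precipitation
gcm_hist = rng.gamma(shape=2.0, scale=4.0, size=2000)   # biased GCM, historical period
gcm_fut = rng.gamma(shape=2.0, scale=4.5, size=2000)    # biased GCM, future period

def quantile_map(x, model_hist, observed):
    """Map model values to the observed distribution via their historical quantiles."""
    quantiles = np.searchsorted(np.sort(model_hist), x) / len(model_hist)
    quantiles = np.clip(quantiles, 0.0, 1.0)
    return np.quantile(observed, quantiles)

gcm_fut_corrected = quantile_map(gcm_fut, gcm_hist, obs)
print("Raw future mean:      ", gcm_fut.mean())
print("Corrected future mean:", gcm_fut_corrected.mean())
```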

  10. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Floros, D; Zhang, Y; Yin, FF; Ren, L; Pitsianis, N

    2016-01-01

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial

  11. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Floros, D [Aristotle University of Thessaloniki (Greece); Zhang, Y; Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States)

    2016-06-15

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial
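
    The mechanism described above builds a pyramid of local statistics (mean, standard deviation, and so on) at several spatial scales and compares them across scales. A compact sketch of the local-statistics step with SciPy, on a synthetic image, is given below; it does not reproduce the authors' full adaptive-filtering pipeline.

```python
# Sketch: local mean and local standard deviation of an image at several
# window sizes, the building block of a multi-scale local-statistics pyramid.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(4)
image = rng.normal(0.0, 1.0, size=(256, 256))
image[64:192, 64:192] += 3.0          # a block of signal with different statistics

def local_mean_std(img, size):
    mean = uniform_filter(img, size=size)
    mean_sq = uniform_filter(img * img, size=size)
    var = np.maximum(mean_sq - mean * mean, 0.0)
    return mean, np.sqrt(var)

pyramid = {size: local_mean_std(image, size) for size in (5, 11, 21, 41)}
for size, (mean, std) in pyramid.items():
    print(f"window {size:2d}: median local std = {np.median(std):.3f}")
```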

  12. Analyzing the effect of selected control policy measures and sociodemographic factors on alcoholic beverage consumption in Europe within the AMPHORA project: statistical methods.

    Science.gov (United States)

    Baccini, Michela; Carreras, Giulia

    2014-10-01

    This paper describes the methods used to investigate variations in total alcoholic beverage consumption as related to selected control intervention policies and other socioeconomic factors (unplanned factors) within 12 European countries involved in the AMPHORA project. The analysis presented several critical points: presence of missing values, strong correlation among the unplanned factors, and long-term waves or trends in both the time series of alcohol consumption and the time series of the main explanatory variables. These difficulties were addressed by implementing a multiple imputation procedure for filling in missing values and then specifying, for each country, a multiple regression model which accounted for time trend, policy measures and a limited set of unplanned factors selected in advance on the basis of sociological and statistical considerations. This approach allowed estimating the "net" effect of the selected control policies on alcohol consumption, but not the association between each unplanned factor and the outcome.
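
    As a toy illustration of the per-country model described above (a regression of consumption on a time trend, a policy indicator and unplanned factors), the sketch below uses ordinary least squares on invented data; the multiple-imputation step and the actual AMPHORA covariates are omitted.

```python
# Toy version of a per-country regression of alcohol consumption on a time
# trend, a policy dummy and one socioeconomic covariate. Data are invented;
# the multiple-imputation step of the AMPHORA analysis is not shown.
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1970, 2011)
trend = years - years.min()
policy = (years >= 1995).astype(float)          # hypothetical control policy from 1995
income = 100 + 2.0 * trend + rng.normal(0, 5, trend.size)
consumption = (12.0 - 0.05 * trend - 1.2 * policy + 0.01 * income
               + rng.normal(0, 0.4, trend.size))

X = np.column_stack([np.ones_like(trend), trend, policy, income])
beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
print("Estimated policy effect (litres per capita):", round(beta[2], 3))
```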

  13. Developing Statistical Literacy Using Real-World Data: Investigating Socioeconomic Secondary Data Resources Used in Research and Teaching

    Science.gov (United States)

    Carter, Jackie; Noble, Susan; Russell, Andrew; Swanson, Eric

    2011-01-01

    Increasing volumes of statistical data are being made available on the open web, including from the World Bank. This "data deluge" provides both opportunities and challenges. Good use of these data requires statistical literacy. This paper presents results from a project that set out to better understand how socioeconomic secondary data…

  14. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. Thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic

  15. Hazard rate model and statistical analysis of a compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2005-01-01

    Roč. 41, č. 6 (2005), s. 773-786 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : counting process * compound process * Cox regression model * intensity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.343, year: 2005

  16. A Case Study in Elementary Statistics: The Florida Panther Population

    Science.gov (United States)

    Lazowski, Andrew; Stopper, Geffrey

    2013-01-01

    We describe a case study that was created to intertwine the fields of biology and mathematics. This project is given in an elementary probability and statistics course for non-math majors. Some goals of this case study include: to expose students to biology in a math course, to apply probability to real-life situations, and to display how far a…

  17. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Digital Model-Based Engineering: Expectations, Prerequisites, and Challenges of Infusion

    Science.gov (United States)

    Hale, J. P.; Zimmerman, P.; Kukkala, G.; Guerrero, J.; Kobryn, P.; Puchek, B.; Bisconti, M.; Baldwin, C.; Mulpuri, M.

    2017-01-01

    Digital model-based engineering (DMbE) is the use of digital artifacts, digital environments, and digital tools in the performance of engineering functions. DMbE is intended to allow an organization to progress from documentation-based engineering methods to digital methods that may provide greater flexibility, agility, and efficiency. The term 'DMbE' was developed as part of an effort by the Model-Based Systems Engineering (MBSE) Infusion Task team to identify what government organizations might expect in the course of moving to or infusing MBSE into their organizations. The Task team was established by the Interagency Working Group on Engineering Complex Systems, an informal collaboration among government systems engineering organizations. This Technical Memorandum (TM) discusses the work of the MBSE Infusion Task team to date. The Task team identified prerequisites, expectations, initial challenges, and recommendations for areas of study to pursue, as well as examples of efforts already in progress. The team identified the following five expectations associated with DMbE infusion, discussed further in this TM: (1) Informed decision making through increased transparency, and greater insight. (2) Enhanced communication. (3) Increased understanding for greater flexibility/adaptability in design. (4) Increased confidence that the capability will perform as expected. (5) Increased efficiency. The team identified the following seven challenges an organization might encounter when looking to infuse DMbE: (1) Assessing value added to the organization. Not all DMbE practices will be applicable to every situation in every organization, and not all implementations will have positive results. (2) Overcoming organizational and cultural hurdles. (3) Adopting contractual practices and technical data management. (4) Redefining configuration management. The DMbE environment changes the range of configuration information to be managed to include performance and design models

  19. Open Access!: Review of Online Statistics: An Interactive Multimedia Course of Study by David Lane

    Directory of Open Access Journals (Sweden)

    Samuel L. Tunstall

    2016-01-01

    Full Text Available David M. Lane (project leader). Online Statistics Education: An Interactive Multimedia Course of Study (http://onlinestatbook.com/). Also: David M. Lane (primary author and editor, with David Scott, Mikki Hebl, Rudy Guerra, Dan Osherson, and Heidi Zimmer). Introduction to Statistics. Online edition (http://onlinestatbook.com/Online_Statistics_Education.pdf, 694 pp.). It is rare that students receive high-quality textbooks for free, but David Lane's Online Statistics: An Interactive Multimedia Course of Study permits precisely that. This review gives an overview of the many features in Lane's online textbook, including the Java Applets, the textbook itself, and the resources available for instructors. A discussion of uses of the site, as well as a comparison of the text to alternative online statistics textbooks, is included.

  20. Statistics Anxiety and Business Statistics: The International Student

    Science.gov (United States)

    Bell, James A.

    2008-01-01

    Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…

  1. Successful Control of Major Project Budgets

    Directory of Open Access Journals (Sweden)

    Steen Lichtenberg

    2016-07-01

    Full Text Available This paper differs from scientific papers describing current research. In line with the theme of this special issue, it challenges conventional risk management practice against the background of research results successfully completed decades ago. It is well known that conventional practice frequently results in budget overruns of large projects; international reviews document this. Severe schedule delays are also well known. This paper describes successful research results from almost three decades ago, which challenge this severe problem and have led to new practices. The research involved an unusual mix of Scandinavian researchers from psychology, statistical theory and engineering economy. The resulting procedure has been widely used since around 1990 and challenges conventional procedures. The procedure is documented to be able to yield statistically correct prognoses when the “rules of the game” have been correctly followed. After a short summary of the basic situation, this paper summarizes the research, followed by some resulting experiences, focusing on two recent studies, each of 40 infrastructure and other major projects. In both sets, the actual final cost largely equaled the expected project cost. This result is a marked change from international past and present experience. Finally, the need for further research and progress is discussed.

  2. Joint Duty Prerequisite for Promotion to 07 (Brigadier General

    Science.gov (United States)

    1989-03-13

    LTC Julius E. Coats, Jr., U.S. Army War College. ...new personnel policy; to wit, the Army leadership at all levels should view the joint duty requirement for selection for flag officer with a positive attitude, not as a means for

  3. Applied statistical training to strengthen analysis and health research capacity in Rwanda.

    Science.gov (United States)

    Thomson, Dana R; Semakula, Muhammed; Hirschhorn, Lisa R; Murray, Megan; Ndahindwa, Vedaste; Manzi, Anatole; Mukabutera, Assumpta; Karema, Corine; Condo, Jeanine; Hedt-Gauthier, Bethany

    2016-09-29

    To guide efficient investment of limited health resources in sub-Saharan Africa, local researchers need to be involved in, and guide, health system and policy research. While extensive survey and census data are available to health researchers and program officers in resource-limited countries, local involvement and leadership in research are limited due to inadequate experience, lack of dedicated research time and weak interagency connections, among other challenges. Many research-strengthening initiatives host prolonged fellowships out-of-country, yet their approaches have not been evaluated for effectiveness in involvement and development of local leadership in research. We developed, implemented and evaluated a multi-month, deliverable-driven, survey analysis training based in Rwanda to strengthen skills of five local research leaders, 15 statisticians, and a PhD candidate. Research leaders applied with a specific research question relevant to country challenges and committed to leading an analysis to publication. Statisticians with prerequisite statistical training and experience with statistical software applied to participate in class-based trainings and complete an assigned analysis. Both statisticians and research leaders were provided ongoing in-country mentoring for analysis and manuscript writing. Participants reported a high level of skill, knowledge and collaborator development from class-based trainings and out-of-class mentorship that were sustained 1 year later. Five of six manuscripts were authored by multi-institution teams and submitted to international peer-reviewed scientific journals, and three-quarters of the participants mentored others in survey data analysis or conducted an additional survey analysis in the year following the training. Our model was effective in utilizing existing survey data and strengthening skills among full-time working professionals, without disrupting ongoing work commitments and while using few resources. Critical to our

  4. Impact of promoting sustainable agriculture project on livelihood ...

    African Journals Online (AJOL)

    This study on impact assessment of Promoting Sustainable Agriculture Project ... the Fisher Index, Focused Group Discussion and descriptive statistical analysis. ... The qualitative analysis showed that 30%, 45% and 10% of men, women and ...

  5. Towards bridging the gap between climate change projections and maize producers in South Africa

    Science.gov (United States)

    Landman, Willem A.; Engelbrecht, Francois; Hewitson, Bruce; Malherbe, Johan; van der Merwe, Jacobus

    2018-05-01

    Multi-decadal regional projections of future climate change are introduced into a linear statistical model in order to produce an ensemble of austral mid-summer maximum temperature simulations for southern Africa. The statistical model uses atmospheric thickness fields from a high-resolution (0.5° × 0.5°) reanalysis-forced simulation as predictors in order to develop a linear recalibration model which represents the relationship between atmospheric thickness fields and gridded maximum temperatures across the region. The regional climate model, the conformal-cubic atmospheric model (CCAM), projects maximum temperature increases over southern Africa on the order of 4 °C under low mitigation towards the end of the century, or even higher. The statistical recalibration model is able to replicate these increasing temperatures, and the atmospheric thickness-maximum temperature relationship is shown to be stable under future climate conditions. Since dry land crop yields are not explicitly simulated by climate models but are sensitive to maximum temperature extremes, the effect of projected maximum temperature change on dry land crops of the Witbank maize production district of South Africa, assuming other factors remain unchanged, is then assessed by employing a statistical approach similar to the one used for the maximum temperature projections.
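
    A stripped-down version of the recalibration idea above, regressing maximum temperature on an atmospheric-thickness predictor over a training period and then applying the fitted relation to climate-model thickness values, is sketched here with synthetic one-dimensional data; the operational model works on full gridded fields and an ensemble of CCAM projections.

```python
# Sketch of a linear recalibration model: fit Tmax ~ thickness on a training
# (reanalysis-forced) period, then apply it to projected thickness values.
# All numbers are synthetic and one-dimensional for clarity.
import numpy as np

rng = np.random.default_rng(6)
thickness_train = rng.normal(5650.0, 40.0, size=300)               # geopotential thickness, m
tmax_train = 0.05 * (thickness_train - 5650.0) + 30.0 + rng.normal(0, 0.8, 300)

A = np.column_stack([np.ones_like(thickness_train), thickness_train])
coef, *_ = np.linalg.lstsq(A, tmax_train, rcond=None)

thickness_future = rng.normal(5650.0 + 60.0, 40.0, size=300)       # warmer, thicker atmosphere
tmax_future = coef[0] + coef[1] * thickness_future
print(f"Projected mean Tmax change: {tmax_future.mean() - tmax_train.mean():+.2f} degC")
```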

  6. Spreadsheets as tools for statistical computing and statistics education

    OpenAIRE

    Neuwirth, Erich

    2000-01-01

    Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.

  7. Evaluation of Project Based Learning in the Area of Manufacturing and Statistics in the Degree of Industrial Technology

    Science.gov (United States)

    Buj-Corral, Irene; Marco-Almagro, Lluís; Riba, Alex; Vivancos-Calvet, Joan; Tort-Martorell, Xavier

    2015-01-01

    In the subject Project I, in the second year of the Degree in Industrial Technology Engineering taught at the School of Industrial Engineering of Barcelona (ETSEIB), subgroups of 3-4 students within groups of 20 students develop a project over a semester. Results of two projects are presented, related to manufacturing, measurement of parts and the…

  8. Development and validation of a national data registry for midwife-led births: the Midwives Alliance of North America Statistics Project 2.0 dataset.

    Science.gov (United States)

    Cheyney, Melissa; Bovbjerg, Marit; Everson, Courtney; Gordon, Wendy; Hannibal, Darcy; Vedam, Saraswathi

    2014-01-01

    In 2004, the Midwives Alliance of North America's (MANA's) Division of Research developed a Web-based data collection system to gather information on the practices and outcomes associated with midwife-led births in the United States. This system, called the MANA Statistics Project (MANA Stats), grew out of a widely acknowledged need for more reliable data on outcomes by intended place of birth. This article describes the history and development of the MANA Stats birth registry and provides an analysis of the 2.0 dataset's content, strengths, and limitations. Data collection and review procedures for the MANA Stats 2.0 dataset are described, along with methods for the assessment of data accuracy. We calculated descriptive statistics for client demographics and contributing midwife credentials, and assessed the quality of data by calculating point estimates, 95% confidence intervals, and kappa statistics for key outcomes on pre- and postreview samples of records. The MANA Stats 2.0 dataset (2004-2009) contains 24,848 courses of care, 20,893 of which are for women who planned a home or birth center birth at the onset of labor. The majority of these records were planned home births (81%). Births were attended primarily by certified professional midwives (73%), and clients were largely white (92%), married (87%), and college-educated (49%). Data quality analyses of 9932 records revealed no differences between pre- and postreviewed samples for 7 key benchmarking variables (kappa, 0.98-1.00). The MANA Stats 2.0 data were accurately entered by participants; any errors in this dataset are likely random and not systematic. The primary limitation of the 2.0 dataset is that the sample was captured through voluntary participation; thus, it may not accurately reflect population-based outcomes. The dataset's primary strength is that it will allow for the examination of research questions on normal physiologic birth and midwife-led birth outcomes by intended place of birth.
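
    The data-quality check described above compares key benchmarking variables before and after record review using kappa statistics. A minimal sketch with scikit-learn on invented pre-/post-review codes (not the MANA Stats data) follows:

```python
# Sketch: agreement between pre-review and post-review values of a categorical
# benchmarking variable, summarised with Cohen's kappa. Data are invented.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(7)
pre_review = rng.choice(["home", "birth_center", "hospital_transfer"], size=500)
post_review = pre_review.copy()
flip = rng.random(500) < 0.01                     # ~1% of records changed on review
post_review[flip] = "hospital_transfer"

kappa = cohen_kappa_score(pre_review, post_review)
print(f"Cohen's kappa = {kappa:.3f}")
```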

  9. The Meta-Analysis of Clinical Judgment Project: Fifty-Six Years of Accumulated Research on Clinical Versus Statistical Prediction

    Science.gov (United States)

    Aegisdottir, Stefania; White, Michael J.; Spengler, Paul M.; Maugherman, Alan S.; Anderson, Linda A.; Cook, Robert S.; Nichols, Cassandra N.; Lampropoulos, Georgios K.; Walker, Blain S.; Cohen, Genna; Rush, Jeffrey D.

    2006-01-01

    Clinical predictions made by mental health practitioners are compared with those using statistical approaches. Sixty-seven studies were identified from a comprehensive search of 56 years of research; 92 effect sizes were derived from these studies. The overall effect of clinical versus statistical prediction showed a somewhat greater accuracy for…

  10. Download this PDF file

    African Journals Online (AJOL)

    are some common pitfalls in using statistical methodology which may result ... It is also easy to overlook statistical and mathematical assumptions of ..... Knowledge as a Prerequisite for Confounding Evaluation: An Application to Birth Defects ...

  11. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  12. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  13. Statistical evaluation of a project to estimate fish trajectories through the intakes of Kaplan hydropower turbines

    Science.gov (United States)

    Sutton, Virginia Kay

    This paper examines statistical issues associated with estimating paths of juvenile salmon through the intakes of Kaplan turbines. Passive sensors (hydrophones) detecting signals from ultrasonic transmitters implanted in individual fish released into the preturbine region were used to obtain the information needed to estimate fish paths through the intake. The aim and location of the sensors affect the spatial region in which the transmitters can be detected, and formulas relating this region to sensor aiming directions are derived. Cramer-Rao lower bounds for the variance of estimators of fish location are used to optimize the placement of each sensor. Finally, a statistical methodology is developed for analyzing angular data collected from optimally placed sensors.
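
    The sensor-placement step above uses Cramer-Rao lower bounds on the variance of location estimates. The sketch below computes the CRLB for a two-dimensional position estimated from range measurements with Gaussian noise, a simplified stand-in for the hydrophone geometry in the study; the sensor coordinates and noise level are made up.

```python
# Sketch: Cramer-Rao lower bound for estimating a 2-D source position from
# range measurements with i.i.d. Gaussian noise. For range-only measurements
# the Fisher information is (1/sigma^2) * sum_i u_i u_i^T, where u_i is the
# unit vector from sensor i to the source. Geometry and noise are hypothetical.
import numpy as np

sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # m
source = np.array([6.0, 4.0])                                             # true fish position, m
sigma = 0.3                                                                # range noise std, m

diffs = source - sensors
units = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)
fisher = units.T @ units / sigma**2
crlb = np.linalg.inv(fisher)                     # covariance lower bound (m^2)

print("CRLB standard deviations (m):", np.sqrt(np.diag(crlb)))
```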

  14. Statistical and Machine-Learning Classifier Framework to Improve Pulse Shape Discrimination System Design

    Energy Technology Data Exchange (ETDEWEB)

    Wurtz, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kaplan, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-28

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm, and such improvements can come from conventional statistical classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier’s receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
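
    The recommendation above is to report a PSD classifier's ROC curve and its behavior at a given gamma rejection rate. A minimal sketch with scikit-learn, using synthetic discrimination scores for "neutron" and "gamma" events, shows one way to extract such an operating point:

```python
# Sketch: ROC curve for a PSD-style score and the neutron acceptance obtained
# at a fixed gamma rejection rate (GRR). Scores and labels are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(8)
gamma_scores = rng.normal(0.10, 0.03, size=5000)     # class 0
neutron_scores = rng.normal(0.20, 0.04, size=5000)   # class 1
scores = np.concatenate([gamma_scores, neutron_scores])
labels = np.concatenate([np.zeros(5000), np.ones(5000)])

fpr, tpr, thresholds = roc_curve(labels, scores)
grr = 1.0 - fpr                                       # gamma rejection rate
target_grr = 0.999
idx = np.argmax(grr <= target_grr)                    # first point at/below the target GRR
print(f"Neutron acceptance at GRR ~ {grr[idx]:.4f}: {tpr[idx]:.3f}")
```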

  15. Seeking a Balance between the Statistical and Scientific Elements in Psychometrics

    Science.gov (United States)

    Wilson, Mark

    2013-01-01

    In this paper, I will review some aspects of psychometric projects that I have been involved in, emphasizing the nature of the work of the psychometricians involved, especially the balance between the statistical and scientific elements of that work. The intent is to seek to understand where psychometrics, as a discipline, has been and where it…

  16. The statistical distribution of aerosol properties in sourthern West Africa

    Science.gov (United States)

    Haslett, Sophie; Taylor, Jonathan; Flynn, Michael; Bower, Keith; Dorsey, James; Crawford, Ian; Brito, Joel; Denjean, Cyrielle; Bourrianne, Thierry; Burnet, Frederic; Batenburg, Anneke; Schulz, Christiane; Schneider, Johannes; Borrmann, Stephan; Sauer, Daniel; Duplissy, Jonathan; Lee, James; Vaughan, Adam; Coe, Hugh

    2017-04-01

    The population and economy in southern West Africa have been growing at an exceptional rate in recent years and this trend is expected to continue, with the population projected to more than double to 800 million by 2050. This will result in a dramatic increase in anthropogenic pollutants, already estimated to have tripled between 1950 and 2000 (Lamarque et al., 2010). It is known that aerosols can modify the radiative properties of clouds. As such, the entrainment of anthropogenic aerosol into the large banks of clouds forming during the onset of the West African Monsoon could have a substantial impact on the region's response to climate change. Such projections, however, are greatly limited by the scarcity of observations in this part of the world. As part of the Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa (DACCIWA) project, three research aircraft were deployed, each carrying equipment capable of measuring aerosol properties in-situ. Instrumentation included Aerosol Mass Spectrometers (AMS), Single Particle Soot Photometers (SP2), Condensation Particle Counters (CPC) and Scanning Mobility Particle Sizers (SMPS). Throughout the intensive aircraft campaign, 155 hours of scientific flights covered an area including large parts of Benin, Togo, Ghana and parts of Côte D'Ivoire. Approximately 70 hours were dedicated to the measurement of cloud-aerosol interactions, with many other flights producing data contributing towards this objective. Using datasets collected during this campaign period, it is possible to build a robust statistical understanding of aerosol properties in this region for the first time, including size distributions and optical and chemical properties. Here, we describe preliminary results from aerosol measurements on board the three aircraft. These have been used to describe aerosol properties throughout the region and time period encompassed by the DACCIWA aircraft campaign. Such statistics will be invaluable for improving future

  17. Statistical methods for quality assurance basics, measurement, control, capability, and improvement

    CERN Document Server

    Vardeman, Stephen B

    2016-01-01

    This undergraduate statistical quality assurance textbook clearly shows with real projects, cases and data sets how statistical quality control tools are used in practice. Among the topics covered is a practical evaluation of measurement effectiveness for both continuous and discrete data. Gauge Reproducibility and Repeatability methodology (including confidence intervals for Repeatability, Reproducibility and the Gauge Capability Ratio) is thoroughly developed. Process capability indices and corresponding confidence intervals are also explained. In addition to process monitoring techniques, experimental design and analysis for process improvement are carefully presented. Factorial and Fractional Factorial arrangements of treatments and Response Surface methods are covered. Integrated throughout the book are rich sets of examples and problems that help readers gain a better understanding of where and how to apply statistical quality control tools. These large and realistic problem sets in combination with the...

  18. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Science.gov (United States)

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  19. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident
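
    As an elementary illustration of the kind of method the report examines, characterizing the distribution of a code's output response from uncertain inputs and ranking input importance, the sketch below propagates sampled inputs through a stand-in analytic response function and uses rank correlations for sensitivity; it is not the methodology developed in the project.

```python
# Sketch: Monte Carlo propagation of input uncertainty through a surrogate
# "safety code" response, with rank-correlation sensitivity measures.
# The response function and input distributions are stand-ins, not real models.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(9)
n = 10_000
power = rng.normal(3000.0, 50.0, n)          # MW, uncertain input
flow = rng.normal(17000.0, 300.0, n)         # kg/s, uncertain input
gap_conductance = rng.lognormal(mean=0.0, sigma=0.2, size=n)

# Stand-in response: peak clad temperature as a simple function of the inputs.
peak_temp = 600.0 + 0.08 * power - 0.005 * flow + 40.0 * gap_conductance

print(f"95th percentile of response: {np.percentile(peak_temp, 95):.1f}")
for name, x in [("power", power), ("flow", flow), ("gap conductance", gap_conductance)]:
    rho, _ = spearmanr(x, peak_temp)
    print(f"rank correlation with {name:15s}: {rho:+.2f}")
```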

  20. Effectiveness of prerequisites and the HACCP plan in the control of microbial contamination in ice cream and cheese companies.

    Science.gov (United States)

    Domenech, Eva; Amorós, José Antonio; Escriche, Isabel

    2013-03-01

    In food safety, implementation of prerequisites and application of Hazard Analysis and Critical Control Points (HACCP) guarantee the control of processes, and microbiological criteria permit validation of their effectiveness. With these aims in mind, this article presents the results obtained by the official control carried out by the Valencian administration in ice cream and cheese companies, located in the Xativa/Ontinyente area (Valencian region, Spain) in the period between 2005 and 2010. The audits of Good Hygienic Practices (GHP) and HACCP show that "Structure & Design" followed by "Hygiene & Cleaning" and "Traceability" were the evaluated items with most nonconformities. Pathogenic microorganisms were not found in any of the final products analyzed. Microorganism indicators of unhygienic conditions were present in 100% of the analyses; however, 87.98% of them had low levels, which did not exceed the microbiological criteria. These results highlight the general good effectiveness of the safety management systems implemented and emphasize that companies and official control must continue working in order to guarantee the consumers' welfare.

  1. Ensemble of regional climate model projections for Ireland

    Science.gov (United States)

    Nolan, Paul; McGrath, Ray

    2016-04-01

    of over 35 days per year. Results show significant projected decreases in mean annual, spring and summer precipitation amounts by mid-century. The projected decreases are largest for summer, with "likely" reductions ranging from 0% to 20%. The frequencies of heavy precipitation events show notable increases (approximately 20%) during the winter and autumn months. The number of extended dry periods is projected to increase substantially during autumn and summer. Regional variations of projected precipitation change remain statistically elusive. The energy content of the wind is projected to significantly decrease for the future spring, summer and autumn months. Projected increases for winter were found to be statistically insignificant. The projected decreases were largest for summer, with "likely" values ranging from 3% to 15%. Results suggest that the tracks of intense storms are projected to extend further south over Ireland relative to those in the reference simulation. As extreme storm events are rare, the storm-tracking research needs to be extended. Future work will focus on analysing a larger ensemble, thus allowing a robust statistical analysis of extreme storm track projections.

  2. Rainfall Downscaling Conditional on Upper-air Variables: Assessing Rainfall Statistics in a Changing Climate

    Science.gov (United States)

    Langousis, Andreas; Deidda, Roberto; Marrocu, Marino; Kaleris, Vassilios

    2014-05-01

    Due to its intermittent and highly variable character, and the modeling parameterizations used, precipitation is one of the least well reproduced hydrologic variables by both Global Climate Models (GCMs) and Regional Climate Models (RCMs). This is especially the case at a regional level (where hydrologic risks are assessed) and at small temporal scales (e.g. daily) used to run hydrologic models. In an effort to remedy those shortcomings and assess the effect of climate change on rainfall statistics at hydrologically relevant scales, Langousis and Kaleris (2013) developed a statistical framework for simulation of daily rainfall intensities conditional on upper air variables. The developed downscaling scheme was tested using atmospheric data from the ERA-Interim archive (http://www.ecmwf.int/research/era/do/get/index), and daily rainfall measurements from western Greece, and was proved capable of reproducing several statistical properties of actual rainfall records, at both annual and seasonal levels. This was done solely by conditioning rainfall simulation on a vector of atmospheric predictors, properly selected to reflect the relative influence of upper-air variables on ground-level rainfall statistics. In this study, we apply the developed framework for conditional rainfall simulation using atmospheric data from different GCM/RCM combinations. This is done using atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com), and daily rainfall measurements for an intermediate-sized catchment in Italy; i.e. the Flumendosa catchment. Since GCM/RCM products are suited to reproduce the local climatology in a statistical sense (i.e. in terms of relative frequencies), rather than ensuring a one-to-one temporal correspondence between observed and simulated fields (i.e. as is the case for ERA-interim reanalysis data), we proceed in three steps: a) we use statistical tools to establish a linkage between ERA-Interim upper-air atmospheric forecasts and

  3. Understanding Statistics and Statistics Education: A Chinese Perspective

    Science.gov (United States)

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  4. An ANOVA approach for statistical comparisons of brain networks.

    Science.gov (United States)

    Fraiman, Daniel; Fraiman, Ricardo

    2018-03-16

    The study of brain networks has developed extensively over the last couple of decades. By contrast, techniques for the statistical analysis of these networks are less developed. In this paper, we focus on the statistical comparison of brain networks in a nonparametric framework and discuss the associated detection and identification problems. We test network differences between groups with an analysis of variance (ANOVA) test that we developed specifically for networks. We also propose and analyse the behaviour of a new statistical procedure designed to identify differing subnetworks. As an example, we show the application of this tool to resting-state fMRI data obtained from the Human Connectome Project. We identify, among other variables, that the amount of sleep in the days before the scan is a relevant variable that must be controlled for. Finally, we discuss the potential bias in neuroimaging findings that is generated by some behavioural and brain structure variables. Our method can also be applied to other kinds of networks, such as protein interaction networks, gene networks or social networks.
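
    A nonparametric group comparison of networks of this kind can be sketched as a permutation test on adjacency matrices. The distance function, group sizes and toy data below are illustrative assumptions, not the authors' actual ANOVA statistic.

      import numpy as np

      def network_distance(a, b):
          """Simple edge-wise L1 distance between two adjacency matrices."""
          return np.abs(a - b).sum()

      def permutation_test(nets_g1, nets_g2, n_perm=2000, seed=0):
          """Test whether the mean networks of two groups differ.

          nets_g1, nets_g2: arrays of shape (n_subjects, n_nodes, n_nodes).
          Returns the observed statistic and a permutation p-value.
          """
          rng = np.random.default_rng(seed)
          observed = network_distance(nets_g1.mean(axis=0), nets_g2.mean(axis=0))
          pooled = np.concatenate([nets_g1, nets_g2], axis=0)
          n1 = len(nets_g1)
          count = 0
          for _ in range(n_perm):
              idx = rng.permutation(len(pooled))
              stat = network_distance(pooled[idx[:n1]].mean(axis=0),
                                      pooled[idx[n1:]].mean(axis=0))
              if stat >= observed:
                  count += 1
          return observed, (count + 1) / (n_perm + 1)

      # Toy usage: two groups of 10 random 20-node weighted networks
      rng = np.random.default_rng(1)
      g1 = rng.random((10, 20, 20))
      g2 = rng.random((10, 20, 20)) + 0.05  # slightly shifted edge weights
      print(permutation_test(g1, g2))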

  5. Geologic mapping as a prerequisite to hazardous waste facility siting

    International Nuclear Information System (INIS)

    LaMoreaux, P.E.

    1993-01-01

    The nation's welfare is based on its capability to develop the mineral, water, and energy resources of the land. In addition, these resources must be developed with adequate consideration of environmental impact and the future welfare of the country. Geologic maps are an absolute necessity in the discovery and development of natural resources; for managing radioactive, toxic, and hazardous wastes; and for the assessment of hazards and risks such as those associated with volcanic action, earthquakes, landslides, and subsidence. Geologic maps are the basis for depicting rocks and rock materials, minerals, coal, oil, and water at or near the earth's surface. Hazardous waste facility projects require the preparation of detailed geologic maps. Throughout most of the USA, this type of mapping detail is not available. If these maps were available, it is estimated that the duration of an individual project could be reduced by at least one-fourth (1/4). Therefore, adequate site-specific mapping is required if one is to eliminate environmental problems associated with hazardous, toxic, radioactive, and municipal waste sites

  6. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady; V. R. Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  7. Outcomes of care for 16,924 planned home births in the United States: the Midwives Alliance of North America Statistics Project, 2004 to 2009.

    Science.gov (United States)

    Cheyney, Melissa; Bovbjerg, Marit; Everson, Courtney; Gordon, Wendy; Hannibal, Darcy; Vedam, Saraswathi

    2014-01-01

    Between 2004 and 2010, the number of home births in the United States rose by 41%, increasing the need for accurate assessment of the safety of planned home birth. This study examines outcomes of planned home births in the United States between 2004 and 2009. We calculated descriptive statistics for maternal demographics, antenatal risk profiles, procedures, and outcomes of planned home births in the Midwives Alliance of North America Statistics Project (MANA Stats) 2.0 data registry. Data were analyzed according to intended and actual place of birth. Among 16,924 women who planned home births at the onset of labor, 89.1% gave birth at home. The majority of intrapartum transfers were for failure to progress, and only 4.5% of the total sample required oxytocin augmentation and/or epidural analgesia. The rates of spontaneous vaginal birth, assisted vaginal birth, and cesarean were 93.6%, 1.2%, and 5.2%, respectively. Of the 1054 women who attempted a vaginal birth after cesarean, 87% were successful. Low Apgar scores (<7) ... Among planned home births in the United States, outcomes are congruent with the best available data from population-based, observational studies that evaluated outcomes by intended place of birth and perinatal risk factors. Low-risk women in this cohort experienced high rates of physiologic birth and low rates of intervention without an increase in adverse outcomes. © 2014 by the American College of Nurse-Midwives.

  8. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees.

  9. METHODS OF SELECTING THE EFFECTIVE MODELS OF BUILDINGS REPROFILING PROJECTS

    Directory of Open Access Journals (Sweden)

    Александр Иванович МЕНЕЙЛЮК

    2016-02-01

    Full Text Available The article highlights the important task of project management in the reprofiling of buildings. In construction project management, it is expedient to pay attention to selecting effective engineering solutions that reduce project duration and cost. This article presents a methodology for selecting efficient organizational and technical solutions for the reconstruction and reprofiling of buildings. The method is based on compiling project variants in Microsoft Project and on experimental statistical analysis using the program COMPEX. Introducing this technique into building reprofiling allows efficient project models to be chosen, depending on the given constraints. The technique can also be used for various other construction projects.

  10. Cost deviation in road construction projects: The case of Palestine

    Directory of Open Access Journals (Sweden)

    Ibrahim Mahamid

    2012-02-01

    Full Text Available This paper investigates the statistical relationship between the actual and estimated costs of road construction projects using data from road construction projects awarded in the West Bank in Palestine over the years 2004–2008. The study is based on a sample of 169 road construction projects. Based on these data, regression models are developed. The findings reveal that 100% of projects suffer from cost divergence: 76% of projects were underestimated, while 24% were overestimated. The discrepancy between estimated and actual cost has an average of 14.6%, ranging from -39% to 98%. The relation between project size (length and width) and cost divergence is also discussed.
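
    A regression analysis of this kind can be set up in a few lines; the figures and the single predictor (road length) below are hypothetical and only illustrate the computation of cost deviation and a least-squares fit, not the paper's models.

      import numpy as np

      # Hypothetical data: estimated and actual costs (thousand USD) and road length (km)
      estimated = np.array([120.0, 340.0, 95.0, 410.0, 220.0])
      actual    = np.array([150.0, 310.0, 130.0, 560.0, 260.0])
      length_km = np.array([2.5, 7.0, 1.8, 9.5, 4.2])

      # Percentage deviation of actual from estimated cost
      deviation_pct = 100.0 * (actual - estimated) / estimated

      # Ordinary least squares: deviation ~ a + b * length
      X = np.column_stack([np.ones_like(length_km), length_km])
      coef, *_ = np.linalg.lstsq(X, deviation_pct, rcond=None)
      print("intercept, slope:", coef)
      print("mean deviation (%):", deviation_pct.mean())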

  11. On divergence of finite measures and their applicability in statistics and information theory

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; Stummer, W.

    2009-01-01

    Roč. 44, č. 2 (2009), s. 169-187 ISSN 0233-1888 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR(CZ) GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Local and global divergences of finite measures * Divergences of sigma-finite measures * Statistical censoring * Pinsker's inequality, Ornstein's distance * Differential power entropies Subject RIV: BD - Theory of Information Impact factor: 0.759, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/vajda-on divergence of finite measures and their applicability in statistics and information theory.pdf

  12. Statistical Process Control for KSC Processing

    Science.gov (United States)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the KSC classroom sessions, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation as needs arise.
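
    As a minimal illustration of the SPC material referred to above, the following sketch computes X-bar control limits for subgrouped process data; the measurements and subgroup structure are invented, not KSC data.

      import numpy as np

      # Hypothetical process data: 20 subgroups of 5 measurements each
      rng = np.random.default_rng(42)
      subgroups = rng.normal(loc=10.0, scale=0.2, size=(20, 5))

      xbar = subgroups.mean(axis=1)            # subgroup means
      rbar = np.ptp(subgroups, axis=1).mean()  # average subgroup range

      # X-bar chart limits using the standard A2 factor for subgroup size 5
      A2 = 0.577
      center = xbar.mean()
      ucl = center + A2 * rbar
      lcl = center - A2 * rbar

      out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
      print(f"CL={center:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}  flagged subgroups={out_of_control}")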

  13. Statistical methods of estimating mining costs

    Science.gov (United States)

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
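
    Cost relations of this type are commonly fitted as log-linear (power-law) regressions; the sketch below is a generic illustration with made-up project data, not the USGS model or its coefficients.

      import numpy as np

      # Hypothetical porphyry copper projects: processing rate (kt/day), strip ratio,
      # distance to nearest railroad (km), capital cost (million USD)
      rate  = np.array([20., 45., 60., 90., 120.])
      strip = np.array([1.5, 2.0, 3.0, 2.5, 1.8])
      dist  = np.array([15., 80., 40., 5., 120.])
      capex = np.array([350., 700., 900., 1200., 1600.])

      # Log-linear model: log(capex) = b0 + b1*log(rate) + b2*log(strip) + b3*log(dist)
      X = np.column_stack([np.ones_like(rate), np.log(rate), np.log(strip), np.log(dist)])
      b, *_ = np.linalg.lstsq(X, np.log(capex), rcond=None)
      print("fitted coefficients:", b)

      # Predict capital cost for a new hypothetical project
      new = np.array([1.0, np.log(75.), np.log(2.2), np.log(30.)])
      print("predicted capex (M$):", np.exp(new @ b))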

  14. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. This skill can be developed through various levels of education. However, the skill is often low because many people, including students, assume that statistics is just counting and using formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyse students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analysing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If the minimum value for meeting the standard achievement of course competence is 65, the students' mean values are lower than the standard competence. The results of the misconception study emphasize which sub-topics should be considered. Based on the assessment results, it was found that students' misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. In statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  15. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  16. Buddy-Tutor Project. Hilo Intermediate School. Final Report, March-July, 1974.

    Science.gov (United States)

    Hawaii Univ., Honolulu. Social Welfare Development and Research Center.

    An operational description of the 1973-74 Buddy-Tutor Project at Hilo Intermediate School in Hilo, Hawaii and an evaluative assessment of its outcome with statistical treatment of the data is provided in this report. This project is an exploratory behavioral intervention program for educationally deprived students and focuses its efforts on the…

  17. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...
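
    As a minimal example of one of the inferential techniques mentioned (the chi-square test of independence), the following sketch uses a hypothetical library survey contingency table; the counts are invented for illustration.

      import numpy as np
      from scipy.stats import chi2_contingency

      # Hypothetical contingency table: library users by status and preferred service
      #                    e-resources  print
      observed = np.array([[120,  80],    # undergraduates
                           [ 60,  90]])   # faculty

      chi2, p, dof, expected = chi2_contingency(observed)
      print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")

      # Descriptive summary alongside the inferential test
      print("row proportions:", observed / observed.sum(axis=1, keepdims=True))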

  18. Solution of the statistical bootstrap with Bose statistics

    International Nuclear Information System (INIS)

    Engels, J.; Fabricius, K.; Schilling, K.

    1977-01-01

    A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density

  19. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  20. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  1. Full Counting Statistics for Interacting Fermions with Determinantal Quantum Monte Carlo Simulations.

    Science.gov (United States)

    Humeniuk, Stephan; Büchler, Hans Peter

    2017-12-08

    We present a method for computing the full probability distribution function of quadratic observables such as particle number or magnetization for the Fermi-Hubbard model within the framework of determinantal quantum Monte Carlo calculations. Especially in cold atom experiments with single-site resolution, such full counting statistics can be obtained from repeated projective measurements. We demonstrate that the full counting statistics can provide important information on the size of preformed pairs. Furthermore, we compute the full counting statistics of the staggered magnetization in the repulsive Hubbard model at half filling and find excellent agreement with recent experimental results. We show that current experiments are capable of probing the difference between the Hubbard model and the limiting Heisenberg model.

  2. All of statistics a concise course in statistical inference

    CERN Document Server

    Wasserman, Larry

    2004-01-01

    This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...

  3. Comparison of additive (absolute) risk projection models and multiplicative (relative) risk projection models in estimating radiation-induced lifetime cancer risk

    International Nuclear Information System (INIS)

    Kai, Michiaki; Kusama, Tomoko

    1990-01-01

    Lifetime cancer risk estimates depend on risk projection models. While the increasing lengths of the follow-up observation periods of atomic bomb survivors in Hiroshima and Nagasaki bring about changes in cancer risk estimates, the validity of the two risk projection models, the additive risk projection model (AR) and the multiplicative risk projection model (MR), comes into question. This paper compares the lifetime risk or loss of life expectancy between the two projection models on the basis of the BEIR-III report and a recently published RERF report. With Japanese cancer statistics, the estimates of MR were greater than those of AR, but a reversal of these results was seen when the cancer hazard function for India was used. When we investigated the validity of the two projection models using epidemiological human data and animal data, the results suggested that MR was superior to AR with respect to temporal change, but there was little evidence to support its validity. (author)
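
    In generic form (not the specific BEIR-III parameterisation), the two projection models can be written as follows, where \lambda_0(a) is the baseline cancer hazard at age a, D is the dose, and \beta_A, \beta_M are fitted excess-risk coefficients:

      \lambda_{AR}(a, D) = \lambda_0(a) + \beta_A D
      \lambda_{MR}(a, D) = \lambda_0(a)\,\bigl(1 + \beta_M D\bigr)

    Under the additive model the excess rate is independent of the baseline hazard, whereas under the multiplicative model it scales with it, which is why the two models diverge when applied to populations with different baseline cancer statistics (e.g. Japan versus India).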

  4. The Structural Reforms of the Chinese Statistical System

    Directory of Open Access Journals (Sweden)

    Günter Moser

    2009-04-01

    Full Text Available The quality of statistical data covering the economic and social development of the People's Republic of China has been questioned by international and national data users for years. The reasons for this doubt lie mainly in the structure of the Chinese system of statistics. Two parallel systems exist which operate largely autonomously: the national system of statistics and the sectoral system of statistics. In the national statistical system, the National Bureau of Statistics (NBS) has the authority to order and collect statistics; in the sectoral system, this competence lies with the ministries and authorities below the ministerial level. This article describes and analyses these structures, the resulting problems, and the reform measures taken to date. It also aims to provide a better understanding of the statistical data about the People's Republic of China and to enable an assessment of them within a changing structural context. In conclusion, approaches to further reforms will be provided based on the author's long-standing experience in cooperation projects with the official Chinese statistics agencies. (Abstract in German, translated:) The quality of statistics on the economic and social development of the People's Republic of China has recently been questioned both by foreign and by domestic users of the data. The reasons for this lie above all in the structure of the data collection system in China.

  5. World Energy Projection System model documentation

    International Nuclear Information System (INIS)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal computer based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product GDP), and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA

  6. World Energy Projection System model documentation

    Energy Technology Data Exchange (ETDEWEB)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal computer based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product GDP), and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA.
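
    The accounting-framework logic described in these records (consumption tracking GDP times energy intensity, with incremental requirements split across fuel sources) can be sketched generically; all growth rates and fuel shares below are illustrative placeholders, not EIA assumptions.

      # Sketch of an energy-accounting projection: consumption = GDP * energy intensity,
      # with incremental requirements allocated to fuel shares. All numbers are illustrative.
      gdp_growth = 0.025          # assumed annual GDP growth rate
      intensity_decline = 0.010   # assumed annual decline in energy intensity
      fuel_shares = {"natural_gas": 0.4, "coal": 0.35, "renewables": 0.25}

      gdp, intensity = 100.0, 1.0          # index values in the base year
      consumption = gdp * intensity

      for year in range(1, 11):            # project ten years ahead
          gdp *= 1 + gdp_growth
          intensity *= 1 - intensity_decline
          new_consumption = gdp * intensity
          increment = new_consumption - consumption
          by_fuel = {f: share * increment for f, share in fuel_shares.items()}
          consumption = new_consumption
          print(year, round(consumption, 2), {f: round(v, 2) for f, v in by_fuel.items()})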

  7. MQSA National Statistics

    Science.gov (United States)

    National statistics from the Mammography Quality Standards Act (MQSA) program are published as scorecard statistics; archived scorecards are available for 2016, 2017, and 2018.

  8. Recent developments of the ROOT mathematical and statistical software

    International Nuclear Information System (INIS)

    Moneta, L; Antcheva, I; Brun, R

    2008-01-01

    Advanced mathematical and statistical computational methods are required by the LHC experiments to analyze their data. These methods are provided by the Math work package of the ROOT project. An overview of the recent developments of this work package is presented, describing the restructuring of the core mathematical library into a coherent set of C++ classes and interfaces. The improvements achieved in the performance and quality of the numerical methods present in ROOT are shown as well. New developments in the fitting and minimization packages are reviewed. A new graphics interface has been developed to drive the fitting process, and new classes are being introduced to extend the fitting functionality. Furthermore, recent and planned developments in integrating new advanced statistical tools, required for the analysis of the LHC data, into the ROOT environment are presented.

  9. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
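
    For reference, the two special cases recovered by such generalizations are the standard Fermi-Dirac and Bose-Einstein mean occupation numbers; the generalized statistics itself, characterized by the dimension of the single-particle Fock space, is not reproduced here.

      \langle n \rangle_{FD} = \frac{1}{e^{(\varepsilon - \mu)/k_B T} + 1}, \qquad
      \langle n \rangle_{BE} = \frac{1}{e^{(\varepsilon - \mu)/k_B T} - 1}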

  10. Regional projection of climate impact indices over the Mediterranean region

    Science.gov (United States)

    Casanueva, Ana; Frías, M.; Dolores; Herrera, Sixto; Bedia, Joaquín; San Martín, Daniel; Gutiérrez, José Manuel; Zaninovic, Ksenija

    2014-05-01

    Climate Impact Indices (CIIs) are being increasingly used in different socioeconomic sectors to transfer information about climate change impacts and risks to stakeholders. CIIs are typically based on different weather variables such as temperature, wind speed, precipitation or humidity and comprise, in a single index, the relevant meteorological information for the particular impact sector (in this study wildfires and tourism). This dependence on several climate variables poses important limitations to the application of statistical downscaling techniques, since physical consistency among variables is required in most cases to obtain reliable local projections. The present study assesses the suitability of the "direct" downscaling approach, in which the downscaling method is directly applied to the CII. In particular, for illustrative purposes, we consider two popular indices used in the wildfire and tourism sectors, the Fire Weather Index (FWI) and the Physiological Equivalent Temperature (PET), respectively. As an example, two case studies are analysed over two representative Mediterranean regions of interest for the EU CLIM-RUN project: continental Spain for the FWI and Croatia for the PET. Results obtained with this "direct" downscaling approach are similar to those found from the application of the statistical downscaling to the individual meteorological drivers prior to the index calculation ("component" downscaling) thus, a wider range of statistical downscaling methods could be used. As an illustration, future changes in both indices are projected by applying two direct statistical downscaling methods, analogs and linear regression, to the ECHAM5 model. Larger differences were found between the two direct statistical downscaling approaches than between the direct and the component approaches with a single downscaling method. While these examples focus on particular indices and Mediterranean regions of interest for CLIM-RUN stakeholders, the same study

  11. The sanitation value chain: its concept and new research collaboration project

    Science.gov (United States)

    Funamizu, N.

    2017-03-01

    Sanitation is essential for promoting health, preventing environmental pollution, conserving ecosystems, and recovering and recycling resources. Therefore, it can be said that sanitation is closely related to such current global issues as poverty, urban slums, ecosystem conservation, and resources management. Namely, the question "How can we handle the waste from 10 billion people in the future?" is a global environmental problem to be solved. In the developing world, population is growing rapidly, especially in urban slums, which still face high under-5 mortality and poverty. It is also reported that 2.4 billion people were still using unimproved sanitation facilities in 2015, including 946 million people still practicing open defecation (UN, 2015). On the other hand, depopulation and aging are progressing, especially in rural areas of the developed world. Based on the above background, a new research project on the sanitation value chain has started. This is a collaboration project involving LIPI, RIHN (Research Institute for Humanity and Nature, Kyoto) and HU (Hokkaido University). The concept of the sanitation value chain and a brief summary of the project are discussed in the keynote presentation. The concept of the sanitation value chain proposed in the project: the project proposes a new concept, the Sanitation Value Chain, with the following basic policies: 1) Put the values of people and community at the centre of discussion, and prepare the sanitation system to drive this value chain; 2) Design the sanitation system by focusing on incentives for individual users and the community; 3) Recognize a sanitation system as an integrated system of social and technical systems; 4) Design the sanitation system by making a good match between social characteristics and the prerequisites of the technologies. The goals of the research are 1) To propose the Sanitation Value Chain as a common solution for both developing and developed countries, 2) To show the validity of the

  12. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear - this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  13. Ad hoc statistical consulting within a large research organization

    CSIR Research Space (South Africa)

    Elphinstone, CD

    2009-08-01

    Full Text Available requests were growing to the extent where it was difficult to manage them together with project and research workload. Also, access to computing and some basic statistical literacy meant that a high proportion of advanced queries were received.... The challenge was to achieve this in a cost-effective way with limited financial and personnel resources. Experience: Some of the challenges experienced with the HotSeat service: • Researchers consulting with a statistician after the data is collected...

  14. Knowledge fusion: Comparison of fuzzy curve smoothers to statistically motivated curve smoothers

    International Nuclear Information System (INIS)

    Burr, T.; Strittmatter, R.B.

    1996-03-01

    This report describes work during FY 95 that was sponsored by the Department of Energy, Office of Nonproliferation and National Security (NN) Knowledge Fusion (KF) Project. The project team selected satellite sensor data as the main example to which its analysis algorithms would be applied. The specific sensor-fusion problem has many generic features, which make it a worthwhile problem to attempt to solve in a general way. The generic problem is to recognize events of interest from multiple time series that define a possibly noisy background. By implementing a suite of time series modeling and forecasting methods and using well-chosen alarm criteria, we reduce the number of false alarms. We then further reduce the number of false alarms by analyzing all suspicious sections of data, as judged by the alarm criteria, with pattern recognition methods. This report gives a detailed comparison of two of the forecasting methods (a fuzzy forecaster and statistically motivated curve smoothers used as forecasters). The two methods are compared on five simulated and five real data sets. One of the five real data sets is satellite sensor data. The conclusion is that the statistically motivated curve smoother is superior on simulated data of the type we studied. The statistically motivated method is also superior on most real data. In defense of the fuzzy-logic motivated methods, we point out that fuzzy-logic methods were never intended to compete with statistical methods on numeric data. Fuzzy logic was developed to handle real-world situations where real data is either not available or is supplemented with 'expert opinion' or some sort of linguistic information

  15. Statistical methods for detecting differentially abundant features in clinical metagenomic samples.

    Directory of Open Access Journals (Sweden)

    James Robert White

    2009-04-01

    Full Text Available Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries is computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g. as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. While designed for metagenomic applications, our software
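
    The two ingredients named above, Fisher's exact test for sparsely sampled features and false-discovery-rate control, can be combined in a short sketch; the counts are invented and the code is a generic illustration, not the Metastats implementation.

      import numpy as np
      from scipy.stats import fisher_exact

      # Hypothetical sparse feature counts: (present, absent) in each of two populations
      features = {
          "OTU_1": ((3, 97), (15, 85)),
          "OTU_2": ((0, 100), (2, 98)),
          "OTU_3": ((20, 80), (22, 78)),
      }

      pvals = {}
      for name, (pop_a, pop_b) in features.items():
          table = np.array([pop_a, pop_b])   # 2x2 contingency table
          _, p = fisher_exact(table)
          pvals[name] = p

      # Benjamini-Hochberg false discovery rate control at q = 0.05
      names = sorted(pvals, key=pvals.get)
      m, q = len(names), 0.05
      significant = []
      for i, name in enumerate(names, start=1):
          if pvals[name] <= q * i / m:
              significant = names[:i]        # reject all hypotheses up to the largest such i
      print(pvals, significant)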

  16. Involvement of Individuals in the Development of Technical Solutions and Rules of Management for Building Renovation Projects: A Case Study of Latvia

    Science.gov (United States)

    Pukite, I.; Grekis, A.; Geipele, I.; Zeltins, N.

    2017-08-01

    In March 2016, the Latvian government approved a new support program for increasing energy efficiency in residential apartment buildings. For the support of apartment building renovation in the period from 2016 to 2023, 166 470 588 EUR will be available. Different specialists, such as energy auditors, designers, architects, project managers and builders, will be involved in the planning, development and implementation of building renovation. At the development stage of a building renovation project, special attention should be devoted to the first stage - the energy audit and technical project development. The problem arises because these specialists do not work as a completely unified system during the development of the technical building documentation. The implementation of a construction project planning and organisational management system is one of the most important factors in guaranteeing that the quality of a building renovation project meets the laws and regulatory standards. The paper studies the mutual cooperation, professionalism and information feedback of the personnel involved in the planning stage of building renovation, an essential prerequisite for achieving high-quality work and reducing the energy performance indicator. The present research includes the analysis of different technical solutions and their impact on energy efficiency. Mutual harmonisation of technical specifications is also investigated.

  17. A Simplified Algorithm for Statistical Investigation of Damage Spreading

    International Nuclear Information System (INIS)

    Gecow, Andrzej

    2009-01-01

    On the way to simulating the adaptive evolution of a complex system describing a living object or a human-developed project, a fitness should be defined on node states or network external outputs. Feedbacks lead to circular attractors of these states or outputs, which makes it difficult to define a fitness. The main statistical effects of the adaptive condition result from the tendency toward small change; for them to appear, only a statistically correct size of the damage initiated by an evolutionary change of the system is needed. This observation allows loops of feedbacks to be cut and, in effect, a particular statistically correct state to be obtained instead of the long circular attractor which the quenched model expects for a chaotic network with feedback. Defining fitness on such states is simple. We calculate only damaged nodes, and only once. Such an algorithm is optimal for the investigation of damage spreading, i.e. the statistical connection between the structural parameters of the initial change and the size of the resulting damage. It is a reversed-annealed method: functions and states (signals) may be randomly substituted, but connections are important and are preserved. The small damages important for adaptive evolution are depicted correctly, in contrast to the Derrida annealed approximation, which expects equilibrium levels for large networks. The algorithm indicates these levels correctly. The relevant program in Pascal, which executes the algorithm for a wide range of parameters, can be obtained from the author.
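
    The core of such a damage-spreading measurement, flipping one node and propagating the change while recomputing each node at most once, can be sketched for a random Boolean network; this generic Python illustration is not the author's Pascal program, and it treats the current configuration as the undamaged reference.

      import numpy as np
      from collections import deque

      def damage_size(n_nodes=200, k_inputs=2, seed=0):
          """Propagate damage from one flipped node through a random Boolean network,
          recomputing each node at most once (loops are effectively cut), and return
          the number of nodes whose state changed."""
          rng = np.random.default_rng(seed)
          inputs = rng.integers(0, n_nodes, size=(n_nodes, k_inputs))   # wiring
          tables = rng.integers(0, 2, size=(n_nodes, 2 ** k_inputs))    # Boolean functions
          state = rng.integers(0, 2, size=n_nodes)

          # successors[i] = nodes that use node i as an input
          successors = [[] for _ in range(n_nodes)]
          for node in range(n_nodes):
              for src in inputs[node]:
                  successors[src].append(node)

          def output(node, s):
              idx = int((s[inputs[node]] * (2 ** np.arange(k_inputs))).sum())
              return tables[node, idx]

          damaged_state = state.copy()
          start = int(rng.integers(n_nodes))
          damaged_state[start] ^= 1              # initiating change
          damaged = {start}
          queue = deque(successors[start])
          visited = set()
          while queue:
              node = queue.popleft()
              if node in visited:
                  continue                       # each node is evaluated only once
              visited.add(node)
              new = output(node, damaged_state)
              if new != state[node]:
                  damaged_state[node] = new
                  damaged.add(node)
                  queue.extend(successors[node])
          return len(damaged)

      print("mean damage size:", np.mean([damage_size(seed=s) for s in range(50)]))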

  18. Software for statistical data analysis used in Higgs searches

    International Nuclear Information System (INIS)

    Gumpert, Christian; Moneta, Lorenzo; Cranmer, Kyle; Kreiss, Sven; Verkerke, Wouter

    2014-01-01

    The analysis and interpretation of data collected by the Large Hadron Collider (LHC) requires advanced statistical tools in order to quantify the agreement between observation and theoretical models. RooStats is a project providing a statistical framework for data analysis with the focus on discoveries, confidence intervals and combination of different measurements in both Bayesian and frequentist approaches. It employs the RooFit data modelling language where mathematical concepts such as variables, (probability density) functions and integrals are represented as C++ objects. RooStats and RooFit rely on the persistency technology of the ROOT framework. The usage of a common data format enables the concept of digital publishing of complicated likelihood functions. The statistical tools have been developed in close collaboration with the LHC experiments to ensure their applicability to real-life use cases. Numerous physics results have been produced using the RooStats tools, with the discovery of the Higgs boson by the ATLAS and CMS experiments being certainly the most popular among them. We will discuss tools currently used by LHC experiments to set exclusion limits, to derive confidence intervals and to estimate discovery significances based on frequentist statistics and the asymptotic behaviour of likelihood functions. Furthermore, new developments in RooStats and performance optimisation necessary to cope with complex models depending on more than 1000 variables will be reviewed

  19. Uncertainties in the Dutch Reference Projections. Background information for the report 'Reference Projections Energy and Emissions 2005-2020'

    International Nuclear Information System (INIS)

    Seebregts, A.J.; Gijsen, A.

    2005-09-01

    The Dutch targets for greenhouse gases, ammonia and non-methane VOCs will likely be met in 2010, according to our calculations from an uncertainty analysis in the framework of the project on Reference Projections for energy, climate and acidifying emissions. However, it is unlikely that the targets for sulphur dioxide and nitrogen oxides will be attained. This study distinguished between sources of uncertainty in the input variables of the Reference Projections. These sources were quantified with the help of the 'Guidance for Uncertainty Assessment and Communication' and expert judgement. With the aid of a statistical Monte Carlo analysis, margins and probability distributions were determined for the most important outcomes of the Reference Projections. These probability distributions led, for example, to several statements being made on the chances of meeting certain targets. The use of the 'Guidance for Uncertainty Assessment and Communication' was also evaluated.

  20. Multivariate clustering of reindeer herding districts in Sweden according to range prerequisites for reindeer husbandry

    Directory of Open Access Journals (Sweden)

    Henrik Lundqvist

    2009-01-01

    Full Text Available The 51 reindeer herding districts in Sweden vary in productivity and in their prerequisites for reindeer herding. In this study we characterize and group reindeer herding districts based on relevant factors affecting reindeer productivity, i.e. topography, vegetation, forage value, habitat fragmentation and reachability, as well as season lengths, snowfall, ice-crust probability, and insect harassment, quantified in a total of 15 variables. The herding districts were grouped into seven main groups and three single outliers through cluster analyses. The largest group, consisting of 14 herding districts, was further divided into four subgroups. The range properties of herding districts and groups of districts were characterized through principal component analyses. Comparison of the suggested grouping of herding districts with existing administrative divisions showed that the two do not coincide. A new division of herding districts into six administrative sets of districts was suggested in order to improve administrative planning and management of the reindeer herding industry. The results also make it possible to project alterations caused by future global climate change. Large-scale investigations using geographical information systems (GIS) and meteorological data would be helpful for administrative purposes, both nationally and internationally, as science-based decision tools in legislative, economic, ecological and structural assessments. Abstract in Swedish / Sammanfattning (translated): Multivariate grouping of Swedish reindeer herding districts based on the basic prerequisites of the grazing ranges. The Swedish reindeer husbandry area consists of 51 herding districts that vary in productivity and in their prerequisites for reindeer herding. We analysed the variation between herding districts with respect to 15 variables describing topography, vegetation, forage value, fragmentation of grazing land, climate, ice-crust occurrence and the activity of parasitic insects, and we propose a division of

  1. Identification and Definition of Lexically Ambiguous Words in Statistics by Tutors and Students

    Science.gov (United States)

    Richardson, Alice M.; Dunn, Peter K.; Hutchins, Rene

    2013-01-01

    Lexical ambiguity arises when a word from everyday English is used differently in a particular discipline, such as statistics. This paper reports on a project that begins by identifying tutors' perceptions of words that are potentially lexically ambiguous to students, in two different ways. Students' definitions of nine lexically ambiguous words…

  2. Inflated Uncertainty in Multimodel-Based Regional Climate Projections

    Science.gov (United States)

    Madsen, Marianne Sloth; Langen, Peter L.; Boberg, Fredrik; Christensen, Jens Hesselbjerg

    2017-11-01

    Multimodel ensembles are widely analyzed to estimate the range of future regional climate change projections. For an ensemble of climate models, the result is often portrayed by showing maps of the geographical distribution of the multimodel mean results and associated uncertainties represented by model spread at the grid point scale. Here we use a set of CMIP5 models to show that presenting statistics this way results in an overestimation of the projected range, leading to physically implausible patterns of change on global as well as regional scales. We point out that similar inconsistencies occur in impact analyses relying on multimodel information extracted using statistics at the regional scale, for example, when a subset of CMIP models is selected to represent regional model spread. Consequently, the risk of unwanted impacts may be overestimated at larger scales, as climate change impacts will never be realized as the worst (or best) case everywhere.
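
    The grid-point statistic in question, a multimodel mean and spread computed independently at each grid point, can be illustrated with synthetic data; note that the grid-point maxima are contributed by different models, which is why a map of mean plus spread is not a field any single model actually simulated.

      import numpy as np

      # Hypothetical ensemble: 10 models, 4 x 5 grid of projected temperature change (K)
      rng = np.random.default_rng(0)
      ensemble = rng.normal(loc=2.0, scale=0.5, size=(10, 4, 5))

      mean_change = ensemble.mean(axis=0)   # multimodel mean at each grid point
      spread = ensemble.std(axis=0)         # model spread at each grid point

      # The "upper" map mixes different models at different grid points
      upper = mean_change + spread
      which_model_is_max = ensemble.argmax(axis=0)
      print("distinct models contributing to the grid-point maxima:",
            np.unique(which_model_is_max).size)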

  3. Comparative validation of statistical and dynamical downscaling models on a dense grid in central Europe: temperature

    Czech Academy of Sciences Publication Activity Database

    Huth, Radan; Mikšovský, J.; Štěpánek, P.; Belda, M.; Farda, A.; Chládová, Zuzana; Pišoft, P.

    2015-01-01

    Roč. 120, 3-4 (2015), s. 533-553 ISSN 0177-798X R&D Projects: GA ČR(CZ) GAP209/11/2405 EU Projects: European Commission(XE) 37005 Institutional support: RVO:68378289 Keywords: statistical downscaling models * regional climate models * central Europe Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 2.433, year: 2015 http://link.springer.com/article/10.1007%2Fs00704-014-1190-3

  4. Statistics in Schools

    Science.gov (United States)

    Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real life data. Explore the site for standards-aligned, classroom-ready activities.

  5. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Raff, D. [U.S. Dept. of the Interior, Bureau of Reclamation, Denver, Colorado (United States)

    2008-07-01

    Large flood events, which influence regulatory guidelines as well as safety-of-dams decisions, are likely to be affected by climate change. This talk will evaluate the use of climate projections downscaled and run through a rainfall-runoff model and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows will be evaluated for changes in large events at look-ahead horizons of 2011-2040, 2041-2070, and 2071-2099. The sensitivity of the results will be evaluated with respect to projection selection criteria and re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)

  6. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    International Nuclear Information System (INIS)

    Raff, D.

    2008-01-01

    Large flood events, which influence regulatory guidelines as well as safety-of-dams decisions, are likely to be affected by climate change. This talk will evaluate the use of climate projections downscaled and run through a rainfall-runoff model and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows will be evaluated for changes in large events at look-ahead horizons of 2011-2040, 2041-2070, and 2071-2099. The sensitivity of the results will be evaluated with respect to projection selection criteria and re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)

  7. Hedonic approaches based on spatial econometrics and spatial statistics: application to evaluation of project benefits

    Science.gov (United States)

    Tsutsumi, Morito; Seya, Hajime

    2009-12-01

    This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics and/or spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach, since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses by applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are conducted based on both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits are quite different, especially between isotropic and anisotropic SPM and between isotropic SPM and SAEM; the estimated benefits are similar for SAEM and anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.
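
    The spatial weight matrix that distinguishes the spatial econometrics approach can be sketched, for example, as a row-standardized inverse-distance matrix; the coordinates below are hypothetical and the weighting scheme is only one of many in use.

      import numpy as np

      # Hypothetical property coordinates (km)
      coords = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 2.0], [0.5, 1.5]])

      # Pairwise distances and an inverse-distance spatial weight matrix
      diff = coords[:, None, :] - coords[None, :, :]
      dist = np.sqrt((diff ** 2).sum(axis=-1))
      W = np.zeros_like(dist)
      mask = ~np.eye(len(coords), dtype=bool)
      W[mask] = 1.0 / dist[mask]

      # Row standardization, as commonly used in spatial autoregressive error models
      W = W / W.sum(axis=1, keepdims=True)
      print(np.round(W, 3))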

  8. Studying the microlenses mass function from statistical analysis of the caustic concentration

    Energy Technology Data Exchange (ETDEWEB)

    Mediavilla, T; Ariza, O [Departamento de Estadistica e Investigacion Operativa, Universidad de Cadiz, Avda de Ramon Puyol, s/n 11202 Algeciras (Spain); Mediavilla, E [Instituto de Astrofisica de Canarias, Avda Via Lactea s/n, La Laguna (Spain); Munoz, J A, E-mail: teresa.mediavilla@ca.uca.es, E-mail: octavio.ariza@uca.es, E-mail: emg@iac.es [Departamento de Astrofisica y Astronomia, Universidad de Valencia, Burjassot, Valencia (Spain)

    2011-09-22

    The statistical distribution of caustic crossings by the images of a lensed quasar depends on the properties of the distribution of microlenses in the lens galaxy. We use a procedure based on Inverse Polygon Mapping to easily identify the critical and caustic curves generated by a distribution of stars in the lens galaxy. We analyze the statistical distributions of the number of caustic crossings by a pixel-sized source for several projected mass densities and different mass distributions. We compare the results of the simulations with theoretical binomial distributions. Finally, we apply this method to the study of the stellar mass distribution in the lens galaxy of QSO 2237+0305.
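
    The comparison of simulated crossing counts with a theoretical binomial distribution can be illustrated generically; the pixel counts and parameters below are synthetic and do not correspond to the QSO 2237+0305 analysis.

      import numpy as np
      from scipy.stats import binom

      # Hypothetical simulation output: number of caustic crossings in each of 10000 pixels
      rng = np.random.default_rng(3)
      crossings = rng.binomial(n=50, p=0.08, size=10000)

      # Fit a binomial by matching the mean for fixed n, then compare frequencies
      n_trials = 50
      p_hat = crossings.mean() / n_trials
      values, counts = np.unique(crossings, return_counts=True)
      empirical = counts / counts.sum()
      theoretical = binom.pmf(values, n_trials, p_hat)
      print(np.column_stack([values, np.round(empirical, 4), np.round(theoretical, 4)])[:8])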

  9. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  10. Substorm-associated large-scale magnetic field changes in the magnetotail: a prerequisite for "magnetotail deflation" events

    Directory of Open Access Journals (Sweden)

    H. Nakai

    Full Text Available An attempt is made to search for a critical condition in the lobe magnetic field that initiates large-scale magnetic field changes associated with substorm expansions. Using data from ISEE-1 for 1978, sudden decreases in the lobe magnetic field accompanied by magnetic field dipolarizations are identified. In this study, such events are designated as magnetotail deflations. The magnetic field component parallel to the equatorial plane, BE, is normalized to a fixed geocentric distance, BEN, and is corrected for the compression effect of the solar wind dynamic pressure, BENC. It is shown that the BENC value just prior to a magnetotail deflation correlates well with the Dst index; BENC = 37.5 - 0.217 Dst0, where Dst0 denotes the Dst value corrected for the solar wind dynamic pressure. This regression function appears to delineate the upper limit of BENC values when they are sorted by the Dst0 index. On the basis of this finding, it is suggested that a prerequisite condition for magnetotail deflations must exist in the magnetosphere.

    Key words. Magnetospheric physics (magnetotail; current systems; storms and substorms)

  11. COMBO-FISH Enables High Precision Localization Microscopy as a Prerequisite for Nanostructure Analysis of Genome Loci

    Directory of Open Access Journals (Sweden)

    Rainer Kaufmann

    2010-10-01

    Full Text Available With the completeness of genome databases, it has become possible to develop a novel FISH (Fluorescence in Situ Hybridization) technique called COMBO-FISH (COMBinatorial Oligo FISH). In contrast to other FISH techniques, COMBO-FISH makes use of a bioinformatics approach for probe set design. By means of computer genome database searching, several oligonucleotide stretches of typical lengths of 15–30 nucleotides are selected in such a way that all uniquely colocalize at the given genome target. The probes applied here were Peptide Nucleic Acids (PNAs)—synthetic DNA analogues with a neutral backbone—which were synthesized under high purity conditions. For a probe repetitively highlighted in centromere 9, PNAs labeled with different dyes were tested, among which Alexa 488® showed reversible photobleaching (blinking between a dark and a bright state), a prerequisite for the application of SPDM (Spectral Precision Distance/Position Determination Microscopy), a novel technique of high-resolution fluorescence localization microscopy. Although COMBO-FISH labeled cell nuclei under SPDM conditions sometimes revealed fluorescent background, the specific locus was clearly discriminated by the signal intensity and the resulting localization accuracy in the range of 10–20 nm for a detected oligonucleotide stretch. The results indicate that COMBO-FISH probes with blinking dyes are well suited for SPDM, which will open new perspectives on molecular nanostructural analysis of the genome.

  12. 77 FR 33729 - Disability and Rehabilitation Research Projects and Centers Program-National Data and Statistical...

    Science.gov (United States)

    2012-06-07

    ... inclusion and integration of individuals with disabilities into society, and promote the employment... DEPARTMENT OF EDUCATION Disability and Rehabilitation Research Projects and Centers Program.... Final priority; National Institute on Disability and Rehabilitation Research (NIDRR)--Disability and...

  13. Statistical downscaling of CMIP5 outputs for projecting future changes in rainfall in the Onkaparinga catchment

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Md. Mamunur, E-mail: mdmamunur.rashid@mymail.unisa.edu.au [Centre for Water Management and Reuse, School of Natural and Built Environments, University of South Australia, Mawson Lakes, SA 5095 (Australia); Beecham, Simon, E-mail: simon.beecham@unisa.edu.au [Centre for Water Management and Reuse, School of Natural and Built Environments, University of South Australia, Mawson Lakes, SA 5095 (Australia); Chowdhury, Rezaul K., E-mail: rezaulkabir@uaeu.ac.ae [Centre for Water Management and Reuse, School of Natural and Built Environments, University of South Australia, Mawson Lakes, SA 5095 (Australia); Department of Civil and Environmental Engineering, United Arab Emirates University, Al Ain, PO Box 15551 (United Arab Emirates)

    2015-10-15

    A generalized linear model was fitted to stochastically downscaled multi-site daily rainfall projections from CMIP5 General Circulation Models (GCMs) for the Onkaparinga catchment in South Australia to assess future changes to hydrologically relevant metrics. For this purpose three GCMs, two multi-model ensembles (one by averaging the predictors of GCMs and the other by regressing the predictors of GCMs against reanalysis datasets) and two scenarios (RCP4.5 and RCP8.5) were considered. The downscaling model was able to reasonably reproduce the observed historical rainfall statistics when the model was driven by NCEP reanalysis datasets. Significant bias was observed in the rainfall when downscaled from historical outputs of GCMs. Bias was corrected using the Frequency Adapted Quantile Mapping technique. Future changes in rainfall were computed from the bias-corrected downscaled rainfall forced by GCM outputs for the period 2041–2060 and these were then compared to the base period 1961–2000. The results show that annual and seasonal rainfalls are likely to significantly decrease for all models and scenarios in the future. The number of dry days and maximum consecutive dry days will increase whereas the number of wet days and maximum consecutive wet days will decrease. Future changes of daily rainfall occurrence sequences combined with a reduction in rainfall amounts will lead to a drier catchment, thereby reducing the runoff potential. Because this catchment is a significant source of Adelaide's water supply, irrigation water and water for maintaining environmental flows, an effective climate change adaptation strategy is needed in order to face future potential water shortages. - Highlights: • A generalized linear model was used for multi-site daily rainfall downscaling. • Rainfall was downscaled from CMIP5 GCM outputs. • Two multi-model ensemble approaches were used. • Bias was corrected using the Frequency Adapted Quantile Mapping technique.
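
    The sketch below illustrates plain empirical quantile mapping, a simplified stand-in for the Frequency Adapted Quantile Mapping used in the study (the frequency-adaptation step for dry days is omitted). All rainfall series are synthetic; nothing here reproduces the authors' implementation.

      # Simplified empirical quantile mapping for bias-correcting downscaled rainfall.
      # Generic sketch only; not the Frequency Adapted Quantile Mapping variant used in the study.
      import numpy as np

      def quantile_map(model_hist, obs_hist, model_future):
          """Map future model values through the historical model->observation quantile relation."""
          quantiles = np.linspace(0.01, 0.99, 99)
          model_q = np.quantile(model_hist, quantiles)
          obs_q = np.quantile(obs_hist, quantiles)
          # For each future value, find its quantile in the historical model distribution
          # and replace it with the observed value at the same quantile.
          return np.interp(model_future, model_q, obs_q)

      rng = np.random.default_rng(42)
      obs_hist = rng.gamma(shape=0.8, scale=6.0, size=5000)      # synthetic observed daily rainfall
      model_hist = rng.gamma(shape=0.8, scale=9.0, size=5000)    # synthetic biased GCM-driven rainfall
      model_future = rng.gamma(shape=0.8, scale=8.0, size=5000)  # synthetic future simulation

      corrected = quantile_map(model_hist, obs_hist, model_future)
      print("mean before/after correction:", model_future.mean().round(2), corrected.mean().round(2))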

  14. Renal artery origins: best angiographic projection angles.

    Science.gov (United States)

    Verschuyl, E J; Kaatee, R; Beek, F J; Patel, N H; Fontaine, A B; Daly, C P; Coldwell, D M; Bush, W H; Mali, W P

    1997-10-01

    To determine the best projection angles for imaging the renal artery origins in profile. A mathematical model of the anatomy at the renal artery origins in the transverse plane was used to analyze the amount of aortic lumen that projects over the renal artery origins at various projection angles. Computed tomographic (CT) angiographic data about the location of 400 renal artery origins in 200 patients were statistically analyzed. In patients with an abdominal aortic diameter no larger than 3.0 cm, approximately 0.5 mm of the proximal part of the renal artery and origin may be hidden from view if there is a projection error of +/-10 degrees from the ideal image. A combination of anteroposterior and 20 degrees and 40 degrees left anterior oblique projections resulted in a 92% yield of images that adequately profiled the renal artery origins. Right anterior oblique projections resulted in the least useful images. An error in projection angle of +/-10 degrees is acceptable for angiographic imaging of the renal artery origins. Patient sex, the site of interest (left or right artery) and the local diameter of the abdominal aorta are important factors to consider.

  15. Successfully reducing newborn asphyxia in the labour unit in a large academic medical centre: a quality improvement project using statistical process control.

    Science.gov (United States)

    Hollesen, Rikke von Benzon; Johansen, Rie Laurine Rosenthal; Rørbye, Christina; Munk, Louise; Barker, Pierre; Kjaerbye-Thygesen, Anette

    2018-02-03

    A safe delivery is part of a good start in life, and a continuous focus on preventing harm during delivery is crucial, even in settings with a good safety record. In January 2013, the labour unit at Copenhagen University Hospital, Hvidovre, undertook a quality improvement (QI) project to prevent asphyxia and reduced the percentage of newborns with asphyxia by 48%. The change theory consisted of two primary elements: (1) the clinical content, including three clinical bundles of evidence-based care, a 'delivery bundle', an 'oxytocin bundle' and a 'vacuum extraction bundle'; (2) an implementation theory, including improving skills in interpretation of cardiotocography, use of QI methods and participation in a national learning network. The Model for Improvement and Deming's system of profound knowledge were used as a methodological framework. Data on compliance with the care bundles and the number of deliveries between newborns with asphyxia (Apgar score <7 after 5 min and/or pH <7) were analysed using statistical process control. Compliance with all three clinical care bundles improved to 95% or more, and the percentages of newborns with pH <7 and Apgar <7 after 5 min were reduced by 48% and 31%, respectively. In general, the QI approach strengthened multidisciplinary teamwork, systematised workflow and structured communication around the deliveries. Changes included making a standard memo in the medical record, the use of a bedside whiteboard, bedside handovers, shared decisions with a peer when using an oxytocin infusion and the use of a checklist before vacuum extractions. This QI project illustrates how aspects of patient safety, such as the prevention of asphyxia, can be improved using QI methods to more reliably implement best practice, even in high-performing systems.
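
    Statistical process control for rare events such as asphyxia is often done with a g-chart of the number of deliveries between cases. The sketch below shows that logic with invented counts and the conventional geometric-distribution limits; it is not the unit's actual chart or data.

      # Minimal g-chart sketch for "number of deliveries between newborns with asphyxia".
      # Counts are invented; limits follow the usual geometric-distribution formulation.
      import statistics

      deliveries_between_events = [180, 240, 150, 310, 420, 380, 500, 610, 560, 700]  # hypothetical

      g_bar = statistics.mean(deliveries_between_events)
      # Conventional g-chart limits: g_bar +/- 3 * sqrt(g_bar * (g_bar + 1)); lower limit floored at 0.
      sigma = (g_bar * (g_bar + 1)) ** 0.5
      ucl = g_bar + 3 * sigma
      lcl = max(0.0, g_bar - 3 * sigma)

      print(f"centre line = {g_bar:.0f}, LCL = {lcl:.0f}, UCL = {ucl:.0f}")
      for i, g in enumerate(deliveries_between_events, start=1):
          flag = "signal (improvement)" if g > ucl else ""
          print(f"point {i:2d}: {g:4d} {flag}")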

  16. All projects related to | Page 520 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Topic: HEALTH SURVEYS, HEALTH STATISTICS, DATA COLLECTING, INFORMATION TECHNOLOGY, COMPUTERS. Region: Bangladesh, India, Viet Nam, Tanzania, Canada. Program: Foundations for Innovation. Total Funding: CA$ 65,200.00. Building Peace and Security Research Capacity in Eastern Africa. Project.

  17. Directions for new developments on statistical design and analysis of small population group trials.

    Science.gov (United States)

    Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel

    2016-06-14

    Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies if the sample size is unavoidably small; such settings are usually termed small populations. The specific sample-size cut-off at which the standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials.
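
    To make the point about asymptotic approximations concrete (this is an illustration, not part of the projects' methodology), the sketch below compares an asymptotic chi-square test with Fisher's exact test on a small, invented 2x2 table.

      # Illustration of asymptotic vs exact inference at very small sample sizes.
      # The 2x2 table is invented; it is not data from Asterix, IDeAl or InSPiRe.
      from scipy.stats import chi2_contingency, fisher_exact

      table = [[7, 1],   # responders / non-responders on treatment
               [2, 6]]   # responders / non-responders on control

      chi2, p_asymptotic, dof, expected = chi2_contingency(table)
      odds_ratio, p_exact = fisher_exact(table)

      print(f"chi-square p (asymptotic) = {p_asymptotic:.3f}")
      print(f"Fisher exact p            = {p_exact:.3f}")
      # With expected cell counts this small, the asymptotic p-value is unreliable,
      # which is one reason small-population trials need tailored methodology.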

  18. Using statistical models to explore ensemble uncertainty in climate impact studies: the example of air pollution in Europe

    Directory of Open Access Journals (Sweden)

    V. E. P. Lemaire

    2016-03-01

    Full Text Available Because of its sensitivity to unfavorable weather patterns, air pollution is sensitive to climate change, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists of implementing chemistry-transport models forced by climate projections. However, the computing cost of such methods requires optimizing ensemble exploration techniques. By using a training data set from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for eight regions in Europe and developed statistical models that could be used to predict air pollutant concentrations. The evolution of the key climate variables driving either particulate or gaseous pollution allows selecting the members of the EuroCordex ensemble of regional climate projections that should be used in priority for future air quality projections (CanESM2/RCA4; CNRM-CM5-LR/RCA4 and CSIRO-Mk3-6-0/RCA4 and MPI-ESM-LR/CCLM, following the EuroCordex terminology). After having tested the validity of the statistical model in predictive mode, we can provide ranges of uncertainty attributed to the spread of the regional climate projection ensemble by the end of the century (2071–2100) for the RCP8.5 scenario. In the three regions where the statistical model of the impact of climate change on PM2.5 offers satisfactory performance, we find a climate benefit (a decrease of PM2.5 concentrations under future climate) of −1.08 (±0.21), −1.03 (±0.32) and −0.83 (±0.14) µg m−3 for Eastern Europe, Mid-Europe and Northern Italy, respectively. In the British-Irish Isles, Scandinavia, France, the Iberian Peninsula and the Mediterranean, the statistical model is not considered skillful enough to draw any conclusion for PM2.5. In Eastern Europe, France, the Iberian Peninsula, Mid-Europe and Northern Italy, the statistical model of the
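
    A minimal sketch of the kind of statistical emulator described above: a pollutant concentration regressed on a few meteorological drivers by ordinary least squares, then driven with projected values of those drivers. The choice of drivers, the synthetic data and the "future" values are placeholders and do not correspond to the paper's models.

      # Minimal sketch of a statistical "emulator" of air quality: regress PM2.5 on
      # meteorological drivers. Drivers, coefficients and data are placeholders only.
      import numpy as np

      rng = np.random.default_rng(7)
      n_days = 2000
      temperature = rng.normal(12, 8, n_days)        # deg C
      wind_speed = rng.gamma(2.0, 2.0, n_days)       # m/s
      boundary_layer = rng.gamma(3.0, 300.0, n_days) # m

      # Synthetic "training" PM2.5 built from the drivers plus noise.
      pm25 = (18 + 0.3 * temperature - 1.2 * wind_speed - 0.004 * boundary_layer
              + rng.normal(0, 2.0, n_days))

      X = np.column_stack([np.ones(n_days), temperature, wind_speed, boundary_layer])
      coef, *_ = np.linalg.lstsq(X, pm25, rcond=None)
      print("fitted coefficients (intercept, T, wind, BLH):", np.round(coef, 3))

      # The fitted model can then be driven by climate-projection values of the same
      # meteorological variables to estimate future PM2.5 without rerunning a chemistry-transport model.
      future_X = np.array([1.0, 14.0, 3.5, 900.0])   # hypothetical future-mean drivers
      print("projected PM2.5 for the hypothetical future means:", round(float(future_X @ coef), 2))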

  19. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are attracting rapidly growing interest and acceptance in the field of health economics. The reasons for this success probably lie in the theoretical foundations of the discipline, which make these techniques particularly appealing for decision analysis. Added to this is modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach lie in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models into a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
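
    As a language-agnostic illustration of the idea (the paper itself uses the BUGS language and WinBUGS), the Python sketch below draws a transition probability from a conjugate Beta posterior and feeds each draw into a small probabilistic Markov cohort model. The states, event counts and cycle length are invented for the sketch.

      # Sketch of feeding a Bayesian posterior directly into a probabilistic Markov model.
      # Conjugate Beta-Binomial inference stands in for a WinBUGS model; numbers are invented.
      import numpy as np

      rng = np.random.default_rng(3)

      # Evidence: 12 progressions observed in 80 patient-cycles (hypothetical).
      events, trials = 12, 80
      posterior_p = rng.beta(1 + events, 1 + trials - events, size=5000)  # Beta(1,1) prior

      # Three-state Markov cohort model: Well -> Sick -> Dead, run for 20 cycles per draw.
      def run_markov(p_progress, p_die_sick=0.10, cycles=20):
          state = np.array([1.0, 0.0, 0.0])            # cohort starts Well
          life_years = 0.0
          for _ in range(cycles):
              transition = np.array([
                  [1 - p_progress, p_progress, 0.0],
                  [0.0, 1 - p_die_sick, p_die_sick],
                  [0.0, 0.0, 1.0],
              ])
              state = state @ transition
              life_years += state[0] + state[1]        # alive states accrue one cycle
          return life_years

      life_years_draws = np.array([run_markov(p) for p in posterior_p])
      print("mean life-years:", life_years_draws.mean().round(2))
      print("95% credible interval:", np.percentile(life_years_draws, [2.5, 97.5]).round(2))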

  20. Designing Solutions by a Student Centred Approach: Integration of Chemical Process Simulation with Statistical Tools to Improve Distillation Systems

    Directory of Open Access Journals (Sweden)

    Isabel M. Joao

    2017-09-01

    Full Text Available Projects thematically focused on simulation and statistical techniques for designing and optimizing chemical processes can be helpful in chemical engineering education in order to meet the needs of engineers. We argue for the relevance of the projects to improve a student-centred approach and boost higher-order thinking skills. This paper addresses the use of Aspen HYSYS by Portuguese chemical engineering master's students to model distillation systems, together with statistical experimental design techniques, in order to optimize the systems, highlighting the value of applying problem-specific knowledge, simulation tools and sound statistical techniques. The paper summarizes the work developed by the students to model steady-state and dynamic processes and to optimize the distillation systems, emphasizing the benefits of the simulation tools and statistical techniques in helping the students learn how to learn. Students strengthened their domain-specific knowledge and became motivated to rethink and improve chemical processes in their future chemical engineering profession. We discuss the main advantages of the methodology from the students' and teachers' perspectives.
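
    A minimal sketch of the kind of two-level factorial analysis that can be paired with simulator output: main and interaction effects of two distillation factors on distillate purity. The factors, levels and purity values are invented stand-ins for Aspen HYSYS runs, not the students' actual experiments.

      # Two-level factorial analysis of simulated distillation runs (illustrative only).
      # The responses stand in for Aspen HYSYS outputs; factor names and values are invented.
      import itertools
      import numpy as np

      # Coded factor levels: reflux ratio (-1: 2.0, +1: 4.0), feed stage (-1: 10, +1: 20).
      design = np.array(list(itertools.product([-1, 1], repeat=2)))   # 2^2 full factorial
      purity = np.array([0.912, 0.953, 0.921, 0.968])                 # hypothetical distillate purity

      # Main effects: mean response at +1 minus mean response at -1 for each factor.
      for name, column in zip(["reflux ratio", "feed stage"], design.T):
          effect = purity[column == 1].mean() - purity[column == -1].mean()
          print(f"main effect of {name}: {effect:+.3f}")

      # Interaction effect from the product column of the coded factors.
      interaction = purity[design.prod(axis=1) == 1].mean() - purity[design.prod(axis=1) == -1].mean()
      print(f"reflux x feed-stage interaction: {interaction:+.3f}")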