WorldWideScience

Sample records for standard chi-square distribution

  1. Applied noncentral Chi-squared distribution in CFAR detection of hyperspectral projected images

    Science.gov (United States)

    Li, Zhiyong; Chen, Dong; Shi, Gongtao; Yang, Guopeng; Wang, Gang

    2015-10-01

    In this paper, the noncentral chi-squared distribution is applied in Constant False Alarm Rate (CFAR) detection on hyperspectral projected images to distinguish anomaly points from the background. The processing performed by hyperspectral anomaly detectors can usually be considered a linear projection: these operators are linear transforms, and their results are quadratic forms derived from the transformed spectral vectors. In general, a chi-squared distribution would be the natural choice to describe the statistical characteristics of such a projected image. However, because of the strong correlation among the bands, the standard central chi-squared distribution often cannot fit the stochastic characteristics of the projected images precisely. In this paper, we use a noncentral chi-squared distribution to approximate the projected image of subspace-based anomaly detectors. First, the statistical model of the projected multivariate data is analysed and a noncentral chi-squared distribution is deduced. Then, the approach for calculating its parameters is introduced. Finally, aerial hyperspectral images are used to verify the effectiveness of the proposed method in tightly modeling the statistical distribution of the projected image.
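
The key fact this abstract relies on — that a quadratic form of Gaussian data with nonzero means follows a noncentral chi-squared law — can be checked numerically. The sketch below (plain Python, with hypothetical projection means, not the paper's hyperspectral data) simulates sums of squared shifted unit normals and verifies the textbook identity E[X] = k + λ, where k is the number of components and λ is the sum of squared means.

```python
import random

def simulate_noncentral_chi2(mu, n_samples, seed=0):
    """Draw sums of squared unit-variance normals with means mu.

    Each draw follows a noncentral chi-squared distribution with
    k = len(mu) degrees of freedom and noncentrality
    lambda = sum(m**2 for m in mu).
    """
    rng = random.Random(seed)
    return [sum(rng.gauss(m, 1.0) ** 2 for m in mu) for _ in range(n_samples)]

mu = [2.0, 0.0, 0.0]          # hypothetical projection means
lam = sum(m * m for m in mu)  # noncentrality parameter = 4.0
k = len(mu)                   # degrees of freedom = 3

samples = simulate_noncentral_chi2(mu, 20000)
mean = sum(samples) / len(samples)
# Theory: E[X] = k + lambda = 7; the sample mean should be close.
print(round(mean, 2))
```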

  2. Noncentral Chi-Square versus Normal Distributions in Describing the Likelihood Ratio Statistic: The Univariate Case and Its Multivariate Implication

    Science.gov (United States)

    Yuan, Ke-Hai

    2008-01-01

    In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…

  3. THE CHI SQUARE TEST

    OpenAIRE

    Ugoni, Antony; Walker, Bruce F.

    1995-01-01

    The Chi square test is a statistical test which measures the association between two categorical variables. A working knowledge of tests of this nature is important for the chiropractor and osteopath in order to be able to critically appraise the literature.

  4. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    Energy Technology Data Exchange (ETDEWEB)

    Conover, W.J. [Texas Tech Univ., Lubbock, TX (United States); Cox, D.D. [Rice Univ., Houston, TX (United States); Martz, H.F. [Los Alamos National Lab., NM (United States)

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided.
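
The goodness-of-fit machinery this abstract builds on can be illustrated generically (this is not the paper's empirical-Bayes extension to varying sample sizes): the sketch below computes Pearson's goodness-of-fit statistic for hypothetical die-roll counts and compares it against the chi-squared 0.05 critical value for 5 degrees of freedom (approximately 11.07).

```python
def chi2_gof(observed, expected):
    """Pearson goodness-of-fit statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts from 120 rolls of a die; a fair die expects 20 per face.
observed = [18, 22, 16, 25, 19, 20]
expected = [20.0] * 6

stat = chi2_gof(observed, expected)   # (4 + 4 + 16 + 25 + 1 + 0) / 20 = 2.5
CRITICAL_5DF = 11.07                  # chi-squared 0.05 critical value, 5 df
print(stat, stat < CRITICAL_5DF)      # statistic below critical value:
                                      # no evidence against fairness
```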

  5. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    International Nuclear Information System (INIS)

    Conover, W.J.; Cox, D.D.; Martz, H.F.

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided.

  6. Chi-square test and its application in hypothesis testing

    Directory of Open Access Journals (Sweden)

    Rakesh Rana

    2015-01-01

    In medical research, studies often collect data on categorical variables that can be summarized as a series of counts, commonly arranged in a tabular format known as a contingency table. The chi-square test statistic can be used to evaluate whether there is an association between the rows and columns of a contingency table. More specifically, this statistic can be used to determine whether there is any difference between the study groups in the proportions of the risk factor of interest. The chi-square test and the logic of hypothesis testing were developed by Karl Pearson. This article describes in detail what a chi-square test is, the type of data it is used on, the assumptions associated with its application, how to calculate it manually, and how to use an online calculator to obtain the chi-square statistic and its associated P-value.
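
The contingency-table mechanics described above are easy to reproduce. The sketch below (plain Python, with hypothetical 2×2 counts) computes expected cell frequencies from the marginal totals, the Pearson statistic, and, for the 1-degree-of-freedom case, the p-value via the complementary error function.

```python
import math

def chi2_contingency(table):
    """Pearson chi-squared statistic and df for an r x c contingency table."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, o in enumerate(row):
            e = row_tot[i] * col_tot[j] / n   # expected count under independence
            stat += (o - e) ** 2 / e
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

def chi2_pvalue_df1(x):
    """Survival function of chi-squared with 1 df: P(X > x)."""
    return math.erfc(math.sqrt(x / 2.0))

# Hypothetical 2x2 exposure-by-outcome counts
table = [[20, 30],
         [30, 20]]
stat, df = chi2_contingency(table)
p = chi2_pvalue_df1(stat)
print(stat, df, round(p, 4))   # stat = 4.0, df = 1, p ≈ 0.0455
```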

  7. Chi-squared goodness of fit tests with applications

    CERN Document Server

    Balakrishnan, N; Nikulin, MS

    2013-01-01

    Chi-Squared Goodness of Fit Tests with Applications provides a thorough and complete context for the theoretical basis and implementation of Pearson's monumental contribution and its wide applicability for chi-squared goodness of fit tests. The book is ideal for researchers and scientists conducting statistical analysis in the processing of experimental data, as well as for students and practitioners with a good mathematical background who use statistical methods. The historical context, especially Chapter 7, provides great insight into the importance of this subject, with an authoritative author team.

  8. SPECT electronic collimation resolution enhancement using chi-square minimization

    International Nuclear Information System (INIS)

    Durkee, J.W. Jr.; Antich, P.P.; Tsyganov, E.N.; Constantinescu, A.; Fernando, J.L.; Kulkarni, P.V.; Smith, B.J.; Arbique, G.M.; Lewis, M.A.; Nguyen, T.; Raheja, A.; Thambi, G.; Parkey, R.W.

    1998-01-01

    An electronic collimation technique is developed which utilizes the chi-square goodness-of-fit measure to filter scattered gammas incident upon a medical imaging detector. In this data mining technique, Compton kinematic expressions are used as the chi-square fitting templates for measured energy-deposition data involving multiple-interaction scatter sequences. Fit optimization is conducted using the Davidon variable metric minimization algorithm to simultaneously determine the best-fit gamma scatter angles and their associated uncertainties, with the uncertainty associated with the first scatter angle corresponding to the angular resolution precision for the source. The methodology requires no knowledge of materials and geometry. This pattern recognition application enhances the ability to select those gammas that will provide the best resolution for input to reconstruction software. Illustrative computational results are presented for a conceptual truncated-ellipsoid polystyrene position-sensitive fibre head-detector Monte Carlo model using a triple Compton scatter gamma sequence assessment for a 99mTc point source. A filtration rate of 94.3% is obtained, resulting in an estimated sensitivity approximately three orders of magnitude greater than a high-resolution mechanically collimated device. The technique improves the nominal single-scatter angular resolution by up to approximately 24 per cent as compared with the conventional analytic electronic collimation measure. (author)

  9. A Note on the Asymptotic Normality of the Kernel Deconvolution Density Estimator with Logarithmic Chi-Square Noise

    Directory of Open Access Journals (Sweden)

    Yang Zu

    2015-07-01

    This paper studies the asymptotic normality of the kernel deconvolution estimator when the noise distribution is logarithmic chi-square; both independent and identically distributed observations and strong mixing observations are considered. The dependent case of the result is applied to obtain the pointwise asymptotic distribution of the deconvolution volatility density estimator in discrete-time stochastic volatility models.

  10. How-To-Do-It: Snails, Pill Bugs, Mealworms, and Chi-Square? Using Invertebrate Behavior to Illustrate Hypothesis Testing with Chi-Square.

    Science.gov (United States)

    Biermann, Carol

    1988-01-01

    Described is a study designed to introduce students to the behavior of common invertebrate animals, and to the use of the chi-square statistical technique. Discusses activities with snails, pill bugs, and mealworms. Provides an abbreviated chi-square table and instructions for performing the experiments and statistical tests. (CW)

  11. Chi-squared: A simpler evaluation function for multiple-instance learning

    National Research Council Canada - National Science Library

    McGovern, Amy; Jensen, David

    2003-01-01

    ...) but finds the best concept using the chi-square statistic. This approach is simpler than diverse density and allows us to search more extensively by using properties of the contingency table to prune in a guaranteed manner...

  12. Jet pairing algorithm for the 6-jet Higgs channel via energy chi-square criterion

    International Nuclear Information System (INIS)

    Magallanes, J.B.; Arogancia, D.C.; Gooc, H.C.; Vicente, I.C.M.; Bacala, A.M.; Miyamoto, A.; Fujii, K.

    2002-01-01

    Study and discovery of the Higgs boson at the JLC (Joint Linear Collider) is one of the tasks of the ACFA (Asian Committee for Future Accelerators)-JLC Group. The mode of Higgs production at the JLC is e⁺e⁻ → Z⁰H⁰. In this paper, studies concentrate on the Higgsstrahlung process and the selection of its signals by finding the right jet-pairing algorithm for the 6-jet final state at 300 GeV, assuming a Higgs boson mass of 120 GeV and a luminosity of 500 fb⁻¹. The total decay width Γ(H⁰ → all) and the efficiency of the signals at the JLC are studied utilizing the 6-jet channel. Out of 91,500 Higgsstrahlung events, 4,174 6-jet events are selected. The PYTHIA Monte Carlo generator generates the 6-jet Higgsstrahlung channel according to the Standard Model, and the generated events are then simulated by the Quick Simulator using the JLC parameters. After tagging all 6 quarks corresponding to the 6-jet final state of the Higgsstrahlung, the mean energies of the Z, H, and W's are obtained. From this information, an event energy chi-square is defined, and it is found that the correct combination generally has a smaller value. This criterion can be used to find the correct jet pairing and, later on, as one of the cuts against background signals. Other chi-square definitions are also proposed. (S. Funahashi)
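
The pairing criterion can be illustrated with a toy version of the search: enumerate all ways to group six jets into three pairs, assign the pairs to the expected bosons, and keep the assignment with the smallest energy chi-square. All jet energies, expected pair energies, and resolutions below are hypothetical, chosen only to show the combinatorics; they are not the JLC analysis values.

```python
from itertools import permutations

def pair_partitions(indices):
    """Yield every way to split an even-sized index list into unordered pairs."""
    if not indices:
        yield []
        return
    first = indices[0]
    for j in indices[1:]:
        rest = [i for i in indices[1:] if i != j]
        for sub in pair_partitions(rest):
            yield [(first, j)] + sub

def energy_chi2(pairing, jet_e, expected, sigma):
    """Sum over pairs of (pair energy - expected energy)^2 / sigma^2."""
    return sum((jet_e[a] + jet_e[b] - e) ** 2 / s ** 2
               for (a, b), e, s in zip(pairing, expected, sigma))

def best_pairing(jet_e, expected, sigma):
    best, best_chi2 = None, float("inf")
    for part in pair_partitions(list(range(len(jet_e)))):
        for assignment in permutations(part):   # pairs -> boson hypotheses
            c = energy_chi2(assignment, jet_e, expected, sigma)
            if c < best_chi2:
                best, best_chi2 = assignment, c
    return best, best_chi2

jet_e = [62.0, 48.0, 45.0, 35.0, 40.0, 30.0]   # hypothetical jet energies (GeV)
expected = [110.0, 80.0, 70.0]                  # hypothetical pair energies
sigma = [5.0, 5.0, 5.0]                         # hypothetical resolutions

pairing, chi2 = best_pairing(jet_e, expected, sigma)
print(pairing, chi2)   # the (0,1), (2,3), (4,5) grouping matches exactly
```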

  13. Calibration of Self-Efficacy for Conducting a Chi-Squared Test of Independence

    Science.gov (United States)

    Zimmerman, Whitney Alicia; Goins, Deborah D.

    2015-01-01

    Self-efficacy and knowledge, both concerning the chi-squared test of independence, were examined in education graduate students. Participants rated statements concerning self-efficacy and completed a related knowledge assessment. After completing a demographic survey, participants completed the self-efficacy and knowledge scales a second time.…

  14. Chi-square-based scoring function for categorization of MEDLINE citations.

    Science.gov (United States)

    Kastrin, A; Peterlin, B; Hristovski, D

    2010-01-01

    Text categorization has been used in biomedical informatics for identifying documents containing relevant topics of interest. We developed a simple method that uses a chi-square-based scoring function to determine the likelihood that a MEDLINE citation covers a genetic topic. Our procedure requires construction of a genetic-domain and a nongenetic-domain document corpus. We used the MeSH descriptors assigned to MEDLINE citations for this categorization task, comparing the frequencies of MeSH descriptors between the two corpora by applying the chi-square test. A MeSH descriptor was considered a positive indicator if its relative observed frequency in the genetic-domain corpus was greater than its relative observed frequency in the nongenetic-domain corpus. The output of the proposed method is a list of scores for all the citations, with the highest scores given to citations containing MeSH descriptors typical of the genetic domain. Validation was done on a set of 734 manually annotated MEDLINE citations, achieving a predictive accuracy of 0.87 with 0.69 recall and 0.64 precision. We evaluated the method by comparing it to three machine-learning algorithms (support vector machines, decision trees, naïve Bayes). Although the differences were not statistically significant, the results showed that our chi-square scoring performs as well as the compared machine-learning algorithms. We suggest that chi-square scoring is an effective way to help categorize MEDLINE citations. The algorithm is implemented in the BITOLA literature-based discovery support system as a preprocessor for the gene symbol disambiguation process.
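
The descriptor-level computation can be sketched as a signed 2×2 chi-square: for each MeSH descriptor, compare how often it occurs in the genetic versus the nongenetic corpus, and give the score a positive sign when the descriptor is over-represented on the genetic side. The counts below are hypothetical, and this is only a reconstruction of the general idea, not the BITOLA implementation.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    stat = 0.0
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        e = row * col / n          # expected count under independence
        stat += (obs - e) ** 2 / e
    return stat

def descriptor_score(k_gen, n_gen, k_non, n_non):
    """Signed chi-square score for one MeSH descriptor.

    Positive when the descriptor is relatively more frequent in the
    genetic-domain corpus than in the nongenetic-domain corpus.
    """
    stat = chi2_2x2(k_gen, n_gen - k_gen, k_non, n_non - k_non)
    sign = 1.0 if k_gen / n_gen >= k_non / n_non else -1.0
    return sign * stat

# Hypothetical counts: descriptor in 30 of 100 genetic citations,
# 10 of 100 nongenetic citations.
score = descriptor_score(30, 100, 10, 100)
print(score)   # 12.5
```

A citation's overall score would then be the sum of the scores of its assigned descriptors.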

  15. False star detection and isolation during star tracking based on improved chi-square tests.

    Science.gov (United States)

    Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Yang, Yanqiang; Su, Guohua

    2017-08-01

    The star sensor is a precise attitude measurement device for a spacecraft, and star tracking is its main and key working mode. During star tracking, however, false stars are an inevitable interference for star sensor applications and may degrade measurement accuracy. A false star detection and isolation algorithm for star tracking, based on improved chi-square tests, is proposed in this paper. Two estimations are established, based on a Kalman filter and on a priori information, respectively. False star detection is performed by applying the global state chi-square test in the Kalman filter, while false star isolation is achieved using a local state chi-square test. Semi-physical experiments under different trajectories with various false stars were designed for verification. Experiment results show that various false stars can be detected and isolated from navigation stars during star tracking, and that attitude measurement accuracy is hardly influenced by false stars. The proposed algorithm is shown to have excellent performance in terms of speed, stability, and robustness.
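
The core of such a local test is a normalized-innovation-squared (NIS) gate: the Kalman innovation, scaled by its predicted covariance, should follow a chi-squared distribution when the measurement comes from a genuine star. The scalar sketch below uses hypothetical numbers (the paper's filter is multivariate) and flags a measurement as a false star when the NIS exceeds the 95% chi-squared critical value for 1 degree of freedom.

```python
def nis_gate(residual, s_var, threshold=3.841):
    """Local chi-square test on a scalar Kalman innovation.

    residual : measurement minus prediction
    s_var    : predicted innovation variance
    Returns (is_consistent, nis); 3.841 is the 95% chi-squared
    critical value for 1 degree of freedom.
    """
    nis = residual ** 2 / s_var
    return nis <= threshold, nis

ok_star, nis1 = nis_gate(0.5, 1.0)     # small residual: consistent, NIS = 0.25
false_star, nis2 = nis_gate(4.0, 1.0)  # large residual: rejected, NIS = 16.0
print(ok_star, false_star)
```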

  16. FREQFIT: Computer program which performs numerical regression and statistical chi-squared goodness of fit analysis

    International Nuclear Information System (INIS)

    Hofland, G.S.; Barton, C.C.

    1990-01-01

    The computer program FREQFIT is designed to perform regression and statistical chi-squared goodness of fit analysis on one-dimensional or two-dimensional data. The program features an interactive user dialogue, numerous help messages, an option for screen or line printer output, and the flexibility to use practically any commercially available graphics package to create plots of the program's results. FREQFIT is written in Microsoft QuickBASIC, for IBM-PC compatible computers. A listing of the QuickBASIC source code for the FREQFIT program, a user manual, and sample input data, output, and plots are included. 6 refs., 1 fig

  17. Association between litterers' profile and littering behavior: A chi-square approach

    Science.gov (United States)

    Asmui, Mas'udah; Zaki, Suhanom Mohd; Wahid, Sharifah Norhuda Syed; Mokhtar, Noorsuraya Mohd; Harith, Siti Suhaila

    2017-05-01

    Littering is not a novelty, yet a prolonged issue. The solutions have been discussed for a long time; however this issue still remains unresolved. Littering is commonly associated with littering behavior and awareness. The littering behavior is normally influenced by the litter profile such as gender, family income, education level and age. Jengka Street market, which is located in Pahang, is popularly known as a trade market. It offers diversities of wet and dry goods and is awaited by local residents and tourists. This study analyzes association between litterers' profile and littering behavior. Littering behavior is measured based on factors of trash bin facilities, awareness campaign and public littering behavior. 114 respondents were involved in this study with 62 (54.39%) are female aged more than 18 years old and majority of these female respondents are diploma holders. In addition, 78.95% of the respondents have family income below than RM3,000.00 per month. Based on the data analysis, it was found that first-time visitors littered higher than frequent visitors, lack of providing trash bin facilities contributes to positive littering behavior and there is a significant association between litterers' age and littering behavior by using chi-square approach.

  18. Computation of the Percentage Points of the Chi-Square Distribution

    Science.gov (United States)

    1977-04-01


  19. Measures of effect size for chi-squared and likelihood-ratio goodness-of-fit tests.

    Science.gov (United States)

    Johnston, Janis E; Berry, Kenneth J; Mielke, Paul W

    2006-10-01

    A fundamental shift in editorial policy for psychological journals was initiated when the fourth edition of the Publication Manual of the American Psychological Association (1994) placed emphasis on reporting measures of effect size. This paper presents measures of effect size for the chi-squared and likelihood-ratio goodness-of-fit tests.
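
One widely used effect-size measure derived from the chi-squared statistic is Cramér's V (which reduces to the phi coefficient for a 2×2 table): V = sqrt(χ² / (n · (min(r, c) − 1))). The sketch below, with hypothetical counts, is a generic illustration rather than the specific measures proposed in the paper.

```python
import math

def pearson_chi2(table):
    """Pearson chi-squared statistic for an r x c table of counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    return sum((o - row_tot[i] * col_tot[j] / n) ** 2
               / (row_tot[i] * col_tot[j] / n)
               for i, r in enumerate(table) for j, o in enumerate(r))

def cramers_v(table):
    """Cramer's V effect size: sqrt(chi2 / (n * (min(r, c) - 1)))."""
    chi2 = pearson_chi2(table)
    n = sum(sum(r) for r in table)
    k = min(len(table), len(table[0])) - 1
    return math.sqrt(chi2 / (n * k))

v = cramers_v([[20, 30], [30, 20]])
print(v)   # chi2 = 4.0, n = 100, so V = 0.2
```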

  20. Multiway contingency tables: Monte Carlo resampling probability values for the chi-squared and likelihood-ratio tests.

    Science.gov (United States)

    Long, Michael A; Berry, Kenneth J; Mielke, Paul W

    2010-10-01

    Monte Carlo resampling methods to obtain probability values for chi-squared and likelihood-ratio test statistics for multiway contingency tables are presented. A resampling algorithm provides random arrangements of cell frequencies in a multiway contingency table, given fixed marginal frequency totals. Probability values are obtained from the proportion of resampled test statistic values equal to or greater than the observed test statistic value.
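
The resampling scheme can be sketched for a two-way table: expand the table into paired row/column labels, shuffle the column labels (which preserves both sets of marginal totals), retabulate, and count how often the resampled chi-squared statistic reaches the observed one. The table below is hypothetical, and this two-way version only illustrates the multiway idea.

```python
import random

def chi2_stat(table):
    """Pearson chi-squared statistic for a table of counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    return sum((o - row_tot[i] * col_tot[j] / n) ** 2
               / (row_tot[i] * col_tot[j] / n)
               for i, r in enumerate(table) for j, o in enumerate(r))

def resampled_pvalue(table, n_resamples=2000, seed=1):
    """Monte Carlo p-value with fixed marginal frequency totals."""
    rows, cols = [], []
    for i, r in enumerate(table):
        for j, count in enumerate(r):
            rows.extend([i] * count)
            cols.extend([j] * count)
    observed = chi2_stat(table)
    rng = random.Random(seed)
    nr, nc = len(table), len(table[0])
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(cols)                      # marginal totals stay fixed
        t = [[0] * nc for _ in range(nr)]
        for i, j in zip(rows, cols):
            t[i][j] += 1
        if chi2_stat(t) >= observed:
            hits += 1
    return hits / n_resamples

p = resampled_pvalue([[20, 30], [30, 20]])
print(p)   # close to the exact two-sided permutation p-value for this table
```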

  1. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    CERN Document Server

    Hawkins, Donovan Lee

    2005-01-01

    In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.

  2. Chi-Square Test of Word of Mouth Marketing with Impact on the Evaluation of Patients' Hospital and Services: An Application in Teaching and Research Hospital

    Directory of Open Access Journals (Sweden)

    Yelda ŞENER

    2014-12-01

    Using data from 223 inpatients at a teaching and research hospital, the purpose of this study is to explain the effect of word-of-mouth marketing on hospital preference. To this end, the word-of-mouth marketing process is evaluated in terms of the information provided about the hospital, the level of intimacy between the patient and the information provider, the expertise of both the patient and the information provider regarding the hospital and its services, the patient's perceived level of risk concerning the hospital and its services, and the impact of the provided information on the patient's decision to be treated at the hospital. After evaluating the data with frequency distributions, the impact of these factors on word-of-mouth marketing is demonstrated through descriptive statistics, chi-square analysis, and Pearson's correlation analysis. The study concludes that word-of-mouth marketing has a significant impact on patients' preference for the teaching and research hospital.

  3. Principal components in the discrimination of outliers: A study in simulation sample data corrected by Pearson's and Yates´s chi-square distance

    Directory of Open Access Journals (Sweden)

    Manoel Vitor de Souza Veloso

    2016-04-01

    The current study employs Monte Carlo simulation to build a significance test indicating which principal components best discriminate outliers. Samples of different sizes were generated from multivariate normal distributions with different numbers of variables and correlation structures. Corrections by Pearson's and Yates's chi-square distance were provided for each sample size, and Pearson's version showed the best performance. As the number of variables increased, the significance probabilities in favor of hypothesis H0 decreased. To illustrate the proposed method, it was applied to a multivariate time series of sales volume rates in the state of Minas Gerais, obtained from different market segments.

  4. Landslide susceptibility mapping using decision-tree based CHi-squared automatic interaction detection (CHAID) and Logistic regression (LR) integration

    International Nuclear Information System (INIS)

    Althuwaynee, Omar F; Pradhan, Biswajeet; Ahmad, Noordin

    2014-01-01

    This article uses a methodology based on chi-squared automatic interaction detection (CHAID), a multivariate method with an automatic classification capacity for analysing large numbers of landslide conditioning factors. This algorithm was developed to overcome the subjectivity of manually categorizing the scale data of landslide conditioning factors, and to predict a rainfall-induced susceptibility map for Kuala Lumpur city and surrounding areas using a geographic information system (GIS). The main objective of this article is to use the CHAID method to find the best classification fit for each conditioning factor and then combine it with logistic regression (LR); the LR model was used to find the coefficients of the best-fitting function that assesses the optimal terminal nodes. A cluster pattern of landslide locations was extracted in a previous study using the nearest neighbor index (NNI), which was then used to identify the range of clustered landslide locations. The clustered locations were used as model training data together with 14 landslide conditioning factors, such as topographically derived parameters, lithology, NDVI, and land use and land cover maps. The Pearson chi-squared value was used to find the best classification fit between the dependent variable and the conditioning factors. Finally, the relationships among the conditioning factors were assessed and the landslide susceptibility map (LSM) was produced. An area under the curve (AUC) was used to test the model's reliability and prediction capability with the training and validation landslide locations, respectively. This study proved the efficiency and reliability of the decision tree (DT) model in landslide susceptibility mapping and provides a valuable scientific basis for spatial decision making in planning and urban management studies.

  5. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    Science.gov (United States)

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

    A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model that suitably combines the chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death predictor for some life-threatening ailments using the chi-square case-based reasoning (χ²CBR) model. The proposed predictive engine reduces runtime and speeds up execution through the use of the critical χ² distribution value. This work also showcases the development of a novel feature selection method, referred to as the frequent-item-based rule (FIBR) method, which is used to select the best features for the proposed χ²CBR model at the preprocessing stage of the predictive procedure. The risk calculator is implemented as an in-house PHP program running on an XAMPP/Apache HTTP server, with data acquisition and case-base development implemented using MySQL. Performance comparisons between our system and the NBY, ED-KNN, ANN, SVM, Random Forest, and traditional CBR techniques show that the quality of predictions produced by our system outperforms the baseline methods studied. Our experiments show that the precision rate and predictive quality of our system are in most cases equal to or greater than 70%, and that the proposed system executes faster than the baseline methods. The proposed risk calculator can therefore provide useful, consistent, fast, accurate, and efficient risk-level prediction to both patients and physicians, online and in real time.
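
The distance at the heart of a chi-square CBR retrieval step can be sketched simply: compare a query feature vector against each stored case with the chi-squared distance and return the nearest case's outcome. The feature vectors and risk labels below are hypothetical, not the paper's clinical data.

```python
def chi2_distance(x, y):
    """Chi-squared distance between two nonnegative feature vectors."""
    return 0.5 * sum((a - b) ** 2 / (a + b) for a, b in zip(x, y) if a + b > 0)

def retrieve(case_base, query):
    """Return the stored case closest to the query under chi-squared distance."""
    return min(case_base, key=lambda case: chi2_distance(case["features"], query))

# Hypothetical case base: feature counts with known risk outcomes.
case_base = [
    {"features": [1, 0, 3, 2], "risk": "high"},
    {"features": [0, 1, 1, 0], "risk": "low"},
]

nearest = retrieve(case_base, query=[1, 0, 2, 2])
print(nearest["risk"])   # the query is far closer to the "high" case
```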

  6. A DoS/DDoS Attack Detection System Using Chi-Square Statistic Approach

    Directory of Open Access Journals (Sweden)

    Fang-Yie Leu

    2010-04-01

    Nowadays, users can easily download network attack tools from the Internet, tools which often provide friendly interfaces and easily operated features. Therefore, even a naive hacker can launch a large-scale DoS or DDoS attack to prevent a system, i.e., the victim, from providing Internet services. In this paper, we propose an agent-based intrusion detection architecture, a distributed detection system, that detects DoS/DDoS attacks by invoking a statistical approach which compares source IP addresses' normal and current packet statistics to discriminate whether a DoS/DDoS attack is underway. It first collects all source IPs' packet statistics so as to create their normal packet distribution. Once some IPs' current packet distribution suddenly changes, very often it is an attack. Experimental results show that this approach can effectively detect DoS/DDoS attacks.
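
The detection idea — compare current per-source packet counts against the learned normal distribution with a chi-squared divergence — can be sketched as follows. The counts, the smoothing constant for previously unseen sources, and the alarm threshold are all hypothetical choices, not the paper's tuned values.

```python
def chi2_divergence(baseline, current, smooth=0.5):
    """Chi-squared divergence of current per-source packet counts
    from the counts expected under the baseline distribution."""
    total_b = sum(baseline.values())
    total_c = sum(current.values())
    stat = 0.0
    for src in set(baseline) | set(current):
        # expected count, with smoothing for sources absent from the baseline
        e = total_c * baseline.get(src, smooth) / total_b
        o = current.get(src, 0)
        stat += (o - e) ** 2 / e
    return stat

baseline = {"10.0.0.1": 500, "10.0.0.2": 300, "10.0.0.3": 200}
normal   = {"10.0.0.1": 50,  "10.0.0.2": 30,  "10.0.0.3": 20}
attack   = {"10.0.0.1": 50,  "10.0.0.2": 30,  "10.0.0.3": 20, "6.6.6.6": 900}

THRESHOLD = 50.0   # hypothetical alarm level
print(chi2_divergence(baseline, normal) > THRESHOLD)   # False: traffic matches
print(chi2_divergence(baseline, attack) > THRESHOLD)   # True: flood detected
```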

  7. A revisit to contingency table and tests of independence: bootstrap is preferred to Chi-square approximations as well as Fisher's exact test.

    Science.gov (United States)

    Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu

    2015-01-01

    To test the mutual independence of two qualitative variables (or attributes), it is a common practice to follow the Chi-square tests (Pearson's as well as likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions that appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small-cell frequencies as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than the Fisher's exact test which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
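
The bootstrap alternative can be sketched by drawing tables from the independence model fitted to the observed marginals (a parametric bootstrap under H0) and using the resampled chi-squared statistics as the reference distribution. The 2×2 counts are hypothetical; the authors' study covers general r×c tables and compares against Fisher's exact test as well.

```python
import random

def chi2_stat(table):
    """Pearson chi-squared statistic for a table of counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    return sum((o - row_tot[i] * col_tot[j] / n) ** 2
               / (row_tot[i] * col_tot[j] / n)
               for i, r in enumerate(table) for j, o in enumerate(r))

def bootstrap_pvalue(table, n_boot=2000, seed=2):
    """Parametric bootstrap p-value for independence in an r x c table."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    observed = chi2_stat(table)
    rng = random.Random(seed)
    nr, nc = len(table), len(table[0])
    hits = 0
    for _ in range(n_boot):
        t = [[0] * nc for _ in range(nr)]
        # draw n observations under the fitted independence model
        rows = rng.choices(range(nr), weights=row_tot, k=n)
        cols = rng.choices(range(nc), weights=col_tot, k=n)
        for i, j in zip(rows, cols):
            t[i][j] += 1
        if chi2_stat(t) >= observed:
            hits += 1
    return hits / n_boot

p = bootstrap_pvalue([[20, 30], [30, 20]])
print(p)   # near the asymptotic chi-squared p-value for this table
```

Unlike the permutation scheme, which holds both sets of marginal totals fixed, this bootstrap resamples cell membership from the product of the estimated marginal proportions.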

  8. Results from the Cryogenic Dark Matter Search Using a Chi Squared Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sander, Joel [UC, Santa Barbara

    2007-12-01

    Most of the mass-energy density of the universe remains undetected and is understood only through its effects on visible, baryonic matter. Visible, baryonic matter accounts for only about half a percent of the universe's total mass-energy budget, while the remainder remains dark, or undetected. About a quarter of the dark mass-energy density of the universe is comprised of massive particles that do not interact via the strong or electromagnetic forces. If these particles interact via the weak force, they are termed weakly interacting massive particles, or WIMPs, and their interactions with baryonic matter could be detectable. The CDMS II experiment attempts to detect WIMP interactions in the Soudan Underground Laboratory using germanium and silicon detectors. A WIMP can interact with a detector nucleus, causing the nucleus to recoil; a nuclear recoil is distinguished from background electron recoils by comparing the deposited ionization and phonon energies. Electron recoils occurring near detector surfaces are more difficult to reject. This thesis describes the results of a χ² analysis designed to reject events occurring near detector surfaces. Because no WIMP signal was observed, separate limits are set on the WIMP cross section using the germanium and silicon detectors under standard astrophysical assumptions.

  9. O uso da estatística de qui-quadrado no controle de processos [The noncentral chi-square statistic applied to process control]

    Directory of Open Access Journals (Sweden)

    Antônio Fernando Branco Costa

    2005-08-01

    It is standard practice to use joint charts in process control: one designed to detect shifts in the mean and the other to detect changes in the variance of the process. In this paper, we propose the use of a single chart to control both mean and variance. Based on the noncentral chi-square statistic, the single chart detects shifts in the mean and increases in variance faster than its competitor, the joint X-bar and R charts. Moreover, when decisions about the process parameters are based on the history of observations rather than only on the last observation, the noncentral chi-square statistic is well suited to detecting small perturbations. The noncentral chi-square statistic can also be used with the exponentially weighted moving average (EWMA) procedure, particularly for detecting small mean shifts, accompanied or not by slight increases in variance.

  10. ANALYSE DATA IN THE FORM OF HISTOGRAM WITH COMPARISON BETWEEN KOLMOGOROV-SMIRNOV TEST AND CHI SQUARE TEST

    CERN Document Server

    Pg Haji Mohd Ariffin, Ak Muhamad Amirul Irfan

    2015-01-01

    This paper presents the project I was tasked with while attending a three-month Summer Programme at CERN. The project specification is to analyse weekly data produced by the Compact Muon Solenoid (CMS) in the form of histograms. CMS is a multi-purpose detector operated at the Large Hadron Collider (LHC) at CERN. It yields head-on collisions of two proton (ion) beams of 7 TeV (2.75 TeV per nucleon) each, with a design luminosity of 10^34 cm^-2 s^-1. A comparison of the results is then made using two methods, namely the Kolmogorov-Smirnov test and the chi-squared test. These tests are further elaborated in the subsequent paragraphs. To execute this project, I first had to study basic computer programming, in particular C++ and the ROOT framework, to ensure the given tasks could be completed within the allotted time. A program was subsequently written to produce histograms and calculate the Kolmogorov-Smirnov test and Ch...
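A Python stand-in for the two comparisons can look as follows (the actual project used ROOT; the sample data and bin count here are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 1000)   # reference sample
b = rng.normal(0.1, 1.0, 1000)   # comparison sample

# Kolmogorov-Smirnov works directly on the unbinned samples
ks_stat, ks_p = stats.ks_2samp(a, b)

# The chi-squared test compares binned counts on common bin edges,
# with expected counts rescaled to the observed total
edges = np.histogram_bin_edges(np.concatenate([a, b]), bins=20)
obs, _ = np.histogram(a, edges)
exp, _ = np.histogram(b, edges)
exp = exp * obs.sum() / exp.sum()
mask = exp > 0                    # skip empty expected bins
chi2_stat = float(np.sum((obs[mask] - exp[mask]) ** 2 / exp[mask]))
```

The KS test is sensitive to the largest deviation between cumulative distributions and needs no binning; the chi-squared result depends on the chosen bins and the expected count per bin.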

  11. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China.

    Science.gov (United States)

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-05-20

    In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially regarding comparisons between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. The prevalence of anemia was 12.60%, with a range of 3.47%-40.00% across subgroups. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model (maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population), CHAID decision tree analysis also identified a fourth risk factor, maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. The infant anemic status in large cities is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis demonstrated better performance in hierarchical analysis of populations with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities.
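The chi-square machinery behind a single CHAID split can be illustrated with a 2x2 contingency test on one candidate predictor (the counts below are hypothetical, not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# hypothetical cross-tabulation: maternal anemia during pregnancy
# (rows) versus infant anemia (columns: anemic, not anemic)
table = np.array([[40, 160],
                  [98, 793]])

chi2, p, dof, expected = chi2_contingency(table)  # Yates-corrected for 2x2
```

CHAID repeatedly performs such tests, choosing at each node the predictor with the smallest (Bonferroni-adjusted) p-value to define the next split.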

  12. A Search for WIMP Dark Matter Using an Optimized Chi-square Technique on the Final Data from the Cryogenic Dark Matter Search Experiment (CDMS II)

    Energy Technology Data Exchange (ETDEWEB)

    Manungu Kiveni, Joseph [Syracuse Univ., NY (United States)

    2012-12-01

    This dissertation describes the results of a WIMP search using CDMS II data sets accumulated at the Soudan Underground Laboratory in Minnesota. Results from the original analysis of these data were published in 2009; two events were observed in the signal region with an expected leakage of 0.9 events. Further investigation revealed an issue with the ionization-pulse reconstruction algorithm, leading to a software upgrade and a subsequent reanalysis of the data. As part of the reanalysis, I performed an advanced discrimination technique to better distinguish (potential) signal events from backgrounds using a 5-dimensional chi-square method. This data-analysis technique combines the event information recorded for each WIMP-search event to derive a background-discrimination parameter capable of reducing the expected background to less than one event, while maintaining high efficiency for signal events. Furthermore, optimizing the cut positions of this 5-dimensional chi-square parameter for the 14 viable germanium detectors yields an improved expected sensitivity to WIMP interactions relative to previous CDMS results. This dissertation describes my improved (and optimized) discrimination technique and the results obtained from a blind application to the reanalyzed CDMS II WIMP-search data.
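A generic version of such a multi-dimensional chi-square discrimination parameter (not the CDMS II implementation; the five event quantities and the signal calibration sample are simulated placeholders) is the Mahalanobis-style distance of an event from the mean of signal-like calibration events:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)

# placeholder calibration sample: 500 signal-like events, 5 quantities each
signal_sample = rng.normal(size=(500, 5))

mu = signal_sample.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(signal_sample, rowvar=False))

def chi2_5d(event):
    """Chi-square of one event against the 5-dimensional signal model."""
    diff = np.asarray(event) - mu
    return float(diff @ cov_inv @ diff)

# events with small chi2_5d are signal-like; one possible cut position
cut = chi2.ppf(0.95, df=5)
```

In an analysis like the one described, the cut position would instead be optimized per detector for expected sensitivity rather than fixed at a quantile.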

  13. Figure-of-merit (FOM), an improved criterion over the normalized chi-squared test for assessing goodness-of-fit of gamma-ray spectral peaks

    International Nuclear Information System (INIS)

    Garo Balian, H.; Eddy, N.W.

    1977-01-01

    A careful experimenter knows that in order to choose the best curve fits of peaks from a gamma-ray spectrum for such purposes as energy or intensity calibration, half-life determination, etc., the application of the normalized chi-squared test, χ²_N = χ²/(n−m), is insufficient. One must normally verify the goodness-of-fit with plots, detailed scans of residuals, etc. Because of different techniques of application, variations in backgrounds, in peak sizes and shapes, etc., quotation of the χ²_N value associated with an individual peak fit conveys very little information unless accompanied by considerable ancillary data. (This is not to say that the traditional χ² formula should not be used as the source of the normal equations in the least-squares fitting procedure. But after the fitting, it is unreliable as a criterion for comparison with other fits.) The authors present a formula designated figure-of-merit (FOM) which greatly improves on the uncertainty and fluctuations of the χ²_N formula. An FOM value of less than 2.5% indicates a good fit (in the authors' judgement) irrespective of background conditions and variations in peak sizes and shapes. Furthermore, the authors feel the FOM formula is less subject to fluctuations resulting from different techniques of application. (Auth.)
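The normalized statistic χ²_N = χ²/(n−m) criticized in this record can be computed for a simple peak fit as follows (synthetic Gaussian-plus-background data; the FOM formula itself is not reproduced, since the record does not give it):

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, amp, mu, sigma, bkg):
    """Gaussian peak on a flat background."""
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + bkg

rng = np.random.default_rng(3)
x = np.linspace(-5.0, 5.0, 60)
y = rng.poisson(gauss(x, 100.0, 0.0, 1.0, 10.0)).astype(float)
err = np.sqrt(np.maximum(y, 1.0))   # Poisson counting uncertainties

popt, _ = curve_fit(gauss, x, y, p0=[80.0, 0.2, 1.2, 8.0], sigma=err)
resid = (y - gauss(x, *popt)) / err
chi2_n = float(np.sum(resid ** 2)) / (len(x) - len(popt))  # chi2/(n - m)
```

For a good fit chi2_n is near 1, but as the record argues, its value also depends on background level and peak shape, which is what motivates a complementary figure of merit.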

  14. Type I error rates and power of several versions of scaled chi-square difference tests in investigations of measurement invariance.

    Science.gov (United States)

    Brace, Jordan Campbell; Savalei, Victoria

    2017-09-01

    A Monte Carlo simulation study was conducted to investigate Type I error rates and power of several corrections for nonnormality to the normal-theory chi-square difference test in the context of evaluating measurement invariance via structural equation modeling. Studied statistics include the uncorrected difference test, D_ML; Satorra and Bentler's (2001) original correction, D_SB1; Satorra and Bentler's (2010) strictly positive correction, D_SB10; and a hybrid procedure, D_SBH (Asparouhov & Muthén, 2013). Multiple-group data were generated from confirmatory factor analytic population models invariant on all parameters, or lacking invariance on residual variances, indicator intercepts, or factor loadings. Conditions varied in terms of the number of indicators associated with each factor in the population model, the location of noninvariance (if any), sample size, sample size ratio in the 2 groups, and nature of nonnormality. Type I error rates and power of corrected statistics were evaluated for a series of 4 nested invariance models. Overall, the strictly positive correction, D_SB10, is the best and most consistently performing statistic, as it was found to be much less sensitive than the original correction, D_SB1, to model size and sample evenness. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
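For reference, the original 2001 scaled difference D_SB1 is commonly computed from the two models' scaled statistics, scaling correction factors, and degrees of freedom as sketched below; the input values are hypothetical, and the strictly positive D_SB10 variant modifies how the scaling factor is obtained so that it stays positive:

```python
def sb_scaled_diff(T0, c0, df0, T1, c1, df1):
    """Satorra-Bentler (2001) scaled chi-square difference test.
    T0, c0, df0: scaled statistic, scaling factor, and df of the more
    restricted model; T1, c1, df1: the same for the less restricted model."""
    cd = (df0 * c0 - df1 * c1) / (df0 - df1)   # difference scaling factor
    return (T0 * c0 - T1 * c1) / cd, df0 - df1

# hypothetical fitted values for two nested invariance models
D, df_diff = sb_scaled_diff(T0=35.2, c0=1.18, df0=24, T1=20.4, c1=1.12, df1=20)
```

The returned statistic is referred to a chi-square distribution with df_diff degrees of freedom; when cd comes out negative (possible in small samples), the 2010 correction is the usual remedy.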

  15. Electrocardiography interpretation training in emergency medicine: methods, resources, competency assessment, and national standardization

    OpenAIRE

    Özel, Betül Akbuğa; Demircan, Ahmet; Keleş, Ayfer; Bildik, Fikret; Özel, Deniz; Ergin, Mehmet; Günaydin, Gül Pamukçu

    2015-01-01

    Objective(s). The aim of this study was to evaluate the status of electrocardiography (ECG) training in emergency medicine residency programs in Turkey, and the attitude of the program representatives towards standardization of such training. Methods. This investigation was planned as a cross-sectional study. An 18-item questionnaire was distributed to directors of residency programs. Responses were evaluated using SPSS (v.16.0), and analyzed using the chi-square test. Results. Thirty...

  16. Modelo de detección de intrusiones en sistemas computacionales, realizando selección de características con chi square, entrenamiento y clasificación con ghsom

    Directory of Open Access Journals (Sweden)

    Johan Mardini

    2017-07-01

    Full Text Available Since information has become one of the most valuable assets of organizations, it must be safeguarded through different protection strategies in order to prevent intrusive access or any other incident causing its deterioration or misuse. For precisely this reason, this article evaluates the efficiency of a network intrusion detection model using metrics of sensitivity, specificity, precision, and accuracy, via a simulation process based on the NSL-KDD DARPA dataset, with the most relevant features selected using CHI SQUARE. Classification is performed by a neural network that uses an unsupervised learning algorithm based on hierarchical self-organizing maps (GHSOM), with which the two-class (BI-CLASE) network traffic was classified automatically. The results show that the GHSOM classifier used with the CHI SQUARE technique produces its best result at 15 features in terms of precision, sensitivity, specificity, and accuracy

  17. Neutral and Stable Equilibria of Genetic Systems and The Hardy-Weinberg Principle: Limitations of the Chi-Square Test and Advantages of Auto-Correlation Functions of Allele Frequencies

    Directory of Open Access Journals (Sweden)

    Francisco A Bosco

    2012-12-01

    Full Text Available Since the foundations of Population Genetics, the notion of genetic equilibrium (in close analogy to Classical Mechanics) has been associated with the Hardy-Weinberg (HW) Principle, and the identification of equilibrium is currently assumed by stating that the HW axioms are valid if appropriate Chi-Square values (p<0.05) are observed in experiments. Here we show, by numerical experiments with the genetic system of one locus/two alleles, that when large ensembles of populations are considered the Chi-Square test is not decisive and may lead to false negatives in random-mating populations and false positives in nonrandom-mating populations. This result confirms the logical statement that statistical tests cannot be used to deduce whether a genetic population is under the HW conditions. Furthermore, we show that under the HW conditions populations of any size evolve in time according to what can be identified as neutral dynamics, for which the very notion of equilibrium is unattainable for any practical purpose. Therefore, under the HW conditions the identification of equilibrium properties needs a different approach and the use of more appropriate concepts. We also show that by relaxing the condition of random mating the dynamics acquires all the characteristics of asymptotically stable equilibrium. As a consequence, our results show that the question of equilibrium in genetic systems should be approached in close analogy to non-equilibrium statistical physics, and its observability should be focused on dynamical quantities like the typical decay properties of the allelic auto-correlation function in time. In this perspective one should abandon the classical notion of genetic equilibrium and its relation to the HW proportions, and open investigations in the direction of searching for unifying general principles of population genetic transformations capable of taking these systems into consideration in their full complexity.
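The ensemble behaviour of the chi-square test under the HW conditions can be sketched for one locus/two alleles; population size, allele frequency, and ensemble size below are illustrative choices, not the paper's settings:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)

def hw_chi2(n_AA, n_Aa, n_aa):
    """Chi-square statistic against Hardy-Weinberg proportions (1 df)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)       # estimated allele frequency
    exp = n * np.array([p**2, 2*p*(1-p), (1-p)**2])
    obs = np.array([n_AA, n_Aa, n_aa], dtype=float)
    return float(np.sum((obs - exp) ** 2 / exp))

# ensemble of populations sampled exactly at HW proportions:
# rejection rate of the chi-square test at the 5% level
crit = chi2.ppf(0.95, df=1)
p_true, trials, rejections = 0.3, 2000, 0
for _ in range(trials):
    geno = rng.multinomial(500, [p_true**2, 2*p_true*(1-p_true), (1-p_true)**2])
    rejections += hw_chi2(*geno) > crit
rate = rejections / trials
```

Even for populations sampled exactly under HW proportions, roughly 5% of the ensemble is flagged as departing from HW, illustrating why a single test cannot decide equilibrium status.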

  18. Distributed Energy Resource (DER) Cybersecurity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Saleem, Danish [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Johnson, Jay [Sandia National Laboratories

    2017-11-08

    This presentation covers the work that Sandia National Laboratories and National Renewable Energy Laboratory are doing for distributed energy resource cybersecurity standards, prepared for NREL's Annual Cybersecurity & Resilience Workshop on October 9-10, 2017.

  19. Banding of connection standards for distributed generation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-05-04

    This report presents the views of distributed network operators (DNOs), developers, equipment manufacturers and consultants on the current banding of distributed generation in terms of connection standards and recommendations. The Documents ER G59/1, ER G75/1, ER G83/1 and ETR 113/1 covering recommendations for the connection of embedded generating plant to distribution systems and guidance notes for the protection of embedded generating plant are examined. The way in which the recommendations are applied in practice is investigated. Multiple distribution generator installations, fault ride through, and banding are considered as well as both protection required and maximum generator sizes at respective voltage levels.

  20. PROWAY - a standard for distributed control systems

    International Nuclear Information System (INIS)

    Gellie, R.W.

    1980-01-01

    The availability of cheap and powerful microcomputer and data communications equipment has led to a major revision of instrumentation and control systems. Intelligent devices can now be used and distributed about the control system in a systematic and economic manner. These sub-units are linked by a communications system to provide a total system capable of meeting the required plant objectives. PROWAY, an international standard process data highway for interconnecting processing units in distributed industrial process control systems, is currently being developed. This paper describes the salient features and current status of the PROWAY effort. (auth)

  1. Economic-statistical design of variable parameters non-central chi-square control chart Projeto econômico-estatístico de gráficos de controle qui-quadrado não-central com parâmetros variáveis

    Directory of Open Access Journals (Sweden)

    Maysa Sacramento de Magalhães

    2011-06-01

    Full Text Available Production processes are monitored by control charts since their inception by Shewhart (1924). This surveillance is useful in improving the production process due to increased stabilization of the process, and consequently standardization of the output. Control charts keep track of a few key quality characteristics of the outcome of the production process. This is done by means of univariate or multivariate charts. Small improvements in control chart methodology can have significant economic impact in the production process. In this investigation, we propose the monitoring of a single variable by means of a variable parameter non-central chi-square control chart. The design of the chart is accomplished by means of optimizing a cost function. We use here a simulated annealing optimization tool, due to the difficulty of classical gradient based optimization techniques to handle the optimization of the cost function. The results show some of the drawbacks of using this model. Processos de produção são monitorados por gráficos de controle desde a sua introdução por Shewhart (1924). Este monitoramento é útil na melhoria do processo de produção devido à crescente estabilização do processo, e consequentemente, padronização do produto. Gráficos de controle mantêm vigilância de características de qualidade de um processo de produção. Isto é feito por intermédio de gráficos univariados ou multivariados. Melhorias na metodologia de gráficos de controle podem levar a um impacto econômico significativo no processo de produção. Neste artigo, propomos um gráfico de controle de parâmetros variáveis baseado na estatística qui-quadrado não-central para monitorar uma característica de qualidade de interesse. O projeto do gráfico é realizado através da otimização de uma função custo. O algoritmo simulated annealing é usado devido à dificuldade dos métodos clássicos de otimização baseados no gradiente, de lidarem com a

  2. Characterization of Distributive and Standard Ideals in Semilattices

    African Journals Online (AJOL)

    Bheema

    introduced and studied by Hashimoto (1952); and Gratzer and Schmidt (1961). Properties of distributive ideals of Birkhoff (1967) are considered in our work. In this paper we studied the notion of distributive (dually) ideal and standard ideal in a semilattice of Gratzer (1978) and produced a characterization theorem of ...

  3. Trade and the distributional politics of international labour standards

    OpenAIRE

    Oslington, Paul

    2005-01-01

    This paper constructs a simple general equilibrium model of the trade and distributional effects of spreading advanced country international labour standards to developing countries. Labour standards (including minimum safety requirements, prohibition of prison and child labour, and rights to unionise) are represented as a floor to the cost of employing labour. The model shows how the spread of standards affects the terms of trade and pattern of international specialisation, and can shift ...

  4. Quantum chi-squared and goodness of fit testing

    Energy Technology Data Exchange (ETDEWEB)

    Temme, Kristan [IQIM, California Institute of Technology, Pasadena, California 91125 (United States); Verstraete, Frank [Fakultät für Physik, Universität Wien, Boltzmanngasse 5, 1090 Wien, Austria and Faculty of Science, Ghent University, B-9000 Ghent (Belgium)

    2015-01-15

    A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are very much related to each other, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d quantum states is developed and a max-min characterization for the optimal measurement is introduced. We find the quantum measurement which leads both to the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.

  5. Components of the Pearson-Fisher chi-squared statistic

    Directory of Open Access Journals (Sweden)

    G. D. Raynery

    2002-01-01

    interpretation of the corresponding test statistic components has not previously been investigated. This paper provides the necessary details, as well as an overview of the decomposition options available, and revisits two published examples.

  6. One implementation of the Chi Square Test with SPSS

    OpenAIRE

    Tinoco Gómez, Oscar

    2014-01-01

    This paper illustrates the use of the statistical software SPSS applied to the chi-square (Chi Cuadrado) test of independence between two variables. The application is carried out in the evaluation of the impact generated by the educational web page of the Faculty of Administrative Sciences of the National University Federico Villarreal, in relation to the use of some of the tools of the so-called information and communication technologies in the process of professional training. Se ilustra el uso del software esta...

  7. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for the US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. The potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant, because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. The objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation standards for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  8. Can Bt maize change the spatial distribution of predator Cycloneda ...

    African Journals Online (AJOL)

    The aggregation indices (variance/mean ratio, Morisita index, and exponent k of the negative binomial distribution) and chi-square fit of the observed and expected values to the theoretical frequency distributions (Poisson, positive binomial, and negative binomial) revealed that, in both cultivars, the adults of C. sanguinea ...
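Two of the aggregation indices named in this record can be computed directly; the per-plant count data below are invented for illustration:

```python
import numpy as np

# hypothetical counts of C. sanguinea adults on 16 plants
counts = np.array([0, 2, 1, 0, 3, 1, 0, 0, 2, 1, 4, 0, 1, 2, 0, 1])

mean = counts.mean()
var = counts.var(ddof=1)
dispersion = var / mean     # variance/mean ratio: ~1 random, >1 aggregated

n, total = counts.size, counts.sum()
# Morisita index: n * sum(x*(x-1)) / (N*(N-1)); ~1 for a random pattern
morisita = n * np.sum(counts * (counts - 1)) / (total * (total - 1))
```

The exponent k of the negative binomial would additionally be estimated (e.g. by maximum likelihood) when the dispersion ratio indicates aggregation.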

  9. International Review of Standards and Labeling Programs for Distribution Transformers

    Energy Technology Data Exchange (ETDEWEB)

    Letschert, Virginie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Scholand, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carreño, Ana María [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hernandez, Carolina [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-06-20

    Transmission and distribution (T&D) losses in electricity networks represent 8.5% of final energy consumption in the world. In Latin America, T&D losses range between 6% and 20% of final energy consumption, and represent 7% in Chile. Because approximately one-third of T&D losses take place in distribution transformers alone, there is significant potential to save energy and reduce costs and carbon emissions through policy intervention to increase distribution transformer efficiency. A large number of economies around the world have recognized the significant impact of addressing distribution losses and have implemented policies to support market transformation towards more efficient distribution transformers. As a result, there is considerable international experience to be shared and leveraged to inform countries interested in reducing distribution losses through policy intervention. The report builds upon past international studies of standards and labeling (S&L) programs for distribution transformers to present the current energy efficiency programs for distribution transformers around the world.

  10. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
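The two-step procedure described (archetypal distributions per subpopulation, then scenario-specific sampling) can be sketched as follows; the pooled body-weight data, the normal family, and the 60% demographic split are all illustrative assumptions, not the report's values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# step 1: pooled data partitioned by a demographic variable (sex),
# with an archetypal distribution fitted to each subpopulation
bw_male = rng.normal(78.0, 12.0, 5000)     # placeholder body weights (kg)
bw_female = rng.normal(65.0, 11.0, 5000)
arch = {
    "male": stats.norm.fit(bw_male),       # (loc, scale)
    "female": stats.norm.fit(bw_female),
}

# step 2: sample from the archetypes per scenario-specific demographics
def scenario_sample(n, frac_female):
    """Draw a scenario-specific body-weight input distribution."""
    is_female = rng.random(n) < frac_female
    out = np.empty(n)
    for key, mask in (("female", is_female), ("male", ~is_female)):
        loc, scale = arch[key]
        out[mask] = rng.normal(loc, scale, mask.sum())
    return out

sample = scenario_sample(1000, frac_female=0.6)
```

The point of step 1 is reuse: the fitted archetypes can serve many assessments, so each new PRA only repeats the cheap sampling step.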

  11. Distribution of standard deviation of an observable among superposed states

    International Nuclear Information System (INIS)

    Yu, Chang-shui; Shao, Ting-ting; Li, Dong-mo

    2016-01-01

    The standard deviation (SD) quantifies the spread of the observed values on a measurement of an observable. In this paper, we study the distribution of SD among the different components of a superposition state. It is found that the SD of an observable on a superposition state can be well bounded by the SDs of the superposed states. We also show that the bounds also serve as good bounds on coherence of a superposition state. As a further generalization, we give an alternative definition of incompatibility of two observables subject to a given state and show how the incompatibility subject to a superposition state is distributed.

  12. Distribution of standard deviation of an observable among superposed states

    Science.gov (United States)

    Yu, Chang-shui; Shao, Ting-ting; Li, Dong-mo

    2016-10-01

    The standard deviation (SD) quantifies the spread of the observed values on a measurement of an observable. In this paper, we study the distribution of SD among the different components of a superposition state. It is found that the SD of an observable on a superposition state can be well bounded by the SDs of the superposed states. We also show that the bounds also serve as good bounds on coherence of a superposition state. As a further generalization, we give an alternative definition of incompatibility of two observables subject to a given state and show how the incompatibility subject to a superposition state is distributed.

  13. Applicability of the FASTBUS standard to distributed control

    International Nuclear Information System (INIS)

    Deiss, S.R.; Downing, R.W.; Gustavson, D.B.; Larsen, R.S.; Logg, C.A.; Paffrath, L.

    1981-03-01

    The new FASTBUS standard has been designed to provide a framework for distributed processing in both experimental data acquisition and accelerator control. The features of FASTBUS which support distributed control are a priority arbitration scheme which allows intercrate as well as intracrate message flow between processors and slave devices; and a high bandwidth to permit efficient sharing of the data paths by high-speed devices. Sophisticated diagnostic aids permit system-wide error checking and/or correction. Software has been developed for large distributed systems. This consists of a system data base description, and initialization algorithms to allocate address space and establish preferred message routes. A diagnostics package is also being developed, based on an independent Ethernet-like serial link. The paper describes available hardware and software, on-going developments, and current applications

  14. On the null distribution of Bayes factors in linear regression

    Science.gov (United States)

    We show that under the null, the 2 log (Bayes factor) is asymptotically distributed as a weighted sum of chi-squared random variables with a shifted mean. This claim holds for Bayesian multi-linear regression with a family of conjugate priors, namely, the normal-inverse-gamma prior, the g-prior, and...

  15. Prevalence, distribution and risk factors associated with taeniid ...

    African Journals Online (AJOL)

    In this study we used the sodium chloride floatation technique and well structured close ended questionnaires to determine the prevalence, distribution and risk factors associated with these infections in trade dogs in Dawaki, Plateau State. Data were analysed using chi-square (x2) test, odds ratio and logistic regression at ...

  16. Distribution of Standard deviation of an observable among superposed states

    OpenAIRE

    Yu, Chang-shui; Shao, Ting-ting; Li, Dong-mo

    2016-01-01

    The standard deviation (SD) quantifies the spread of the observed values on a measurement of an observable. In this paper, we study the distribution of SD among the different components of a superposition state. It is found that the SD of an observable on a superposition state can be well bounded by the SDs of the superposed states. We also show that the bounds also serve as good bounds on coherence of a superposition state. As a further generalization, we give an alternative definition of in...

  17. Data distribution architecture based on standard real time protocol

    International Nuclear Information System (INIS)

    Castro, R.; Vega, J.; Pereira, A.; Portas, A.

    2009-01-01

    The data distribution architecture (DDAR) has been designed to conform to new requirements, taking into account the type of data that will be generated by experiments in the International Thermonuclear Experimental Reactor (ITER). The main goal of this architecture is to implement a system able to manage, on line, all data being generated by an experiment, supporting its distribution for processing, storage, analysis, or visualization. The first objective is to have a distribution architecture that supports long-pulse experiments (even hours). The described system is able to distribute, using the real-time protocol (RTP), stored data or live data generated while the experiment is running. It enables researchers to access data on line instead of waiting for the end of the experiment. Another important objective is scalability, so the presented architecture can easily grow based on actual necessities, simplifying estimation and design tasks. A third important objective is security. In this sense, the architecture is based on standards, so complete security mechanisms can be applied, from secure transmission solutions to elaborate access control policies, and it is fully compatible with multi-organization federation systems such as PAPI or Shibboleth.

  18. RELIABLE COGNITIVE DIMENSIONAL DOCUMENT RANKING BY WEIGHTED STANDARD CAUCHY DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S Florence Vijila

    2017-04-01

    Full Text Available Categorization of cognitively uniform and consistent documents such as university question papers is in demand by e-learners. The literature indicates that the standard Cauchy distribution and the derived values are extensively used for checking the uniformity and consistency of documents. The paper attempts to apply this technique for categorizing question papers according to four selected cognitive dimensions. For this purpose, cognitive dimensional keyword sets of these four categories (also termed portrayal concepts) are assumed, and an automatic procedure is developed to quantify these dimensions in question papers. The categorization is relatively accurate when checked against manual methods. Hence the simple and well-established term frequency / inverse document frequency (tf-idf) technique is considered for automating the categorization process. After the document categorization, the standard Cauchy formula is applied to rank-order the documents that have the least differences among Cauchy values (according to the Cauchy theorem), so as to obtain consistent and uniform documents in ranked order. For the purpose of experiments and a social survey, seven question papers (documents) have been designed with various consistencies. To validate the proposed technique, a social survey was administered on selected samples of e-learners of Tamil Nadu, India. Results are encouraging, and conclusions drawn from the experiments will be useful to researchers in concept mining and categorizing documents according to concepts. The findings have also contributed utility value to e-learning system designers.
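The tf-idf weighting this record relies on for categorization can be shown in a few lines (the toy question-paper texts and keyword choices are invented; the Cauchy-based ranking step is not reproduced, since the record does not give the exact formula):

```python
import math
from collections import Counter

# toy "question papers" reduced to cognitive-dimension keywords
docs = {
    "q1": "define explain analysis apply define",
    "q2": "apply compute evaluate apply design",
    "q3": "recall list define state recall",
}

N = len(docs)
tf = {d: Counter(text.split()) for d, text in docs.items()}
df = Counter(term for counts in tf.values() for term in counts)

def tfidf(doc, term):
    """Relative term frequency times log inverse document frequency."""
    return tf[doc][term] / sum(tf[doc].values()) * math.log(N / df[term])

score = tfidf("q1", "analysis")   # "analysis" occurs in q1 only
```

Terms concentrated in one category's papers receive high weights, which is what lets the keyword sets quantify each cognitive dimension before the consistency ranking.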

  19. 77 FR 10997 - Energy Conservation Program: Energy Conservation Standards for Distribution Transformers; Correction

    Science.gov (United States)

    2012-02-24

    ... Conservation Program: Energy Conservation Standards for Distribution Transformers; Correction AGENCY: Office of... standards for distribution transformers. It was recently discovered that values in certain tables of the...,'' including distribution transformers. The Energy Policy Act of 1992 (EPACT 1992), Public Law 102-486, amended...

  20. Standardization of quantum key distribution and the ETSI standardization initiative ISG-QKD

    Science.gov (United States)

    Länger, Thomas; Lenhart, Gaby

    2009-05-01

    In recent years, quantum key distribution (QKD) has been the object of intensive research activities and of rapid progress, and it is now developing into a competitive industry with commercial products. Once QKD systems are transferred from the controlled environment of physical laboratories into a real-world environment for practical use, a number of practical security, compatibility and connectivity issues need to be resolved. In particular, comprehensive security evaluation and watertight security proofs need to be addressed to increase trust in QKD. System interoperability with existing infrastructures and applications as well as conformance with specific user requirements have to be assured. Finding common solutions to these problems involving all actors can provide an advantage for the commercialization of QKD as well as for further technological development. The ETSI industry specification group for QKD (ISG-QKD) offers a forum for creating such universally accepted standards and will promote significant leverage effects on coordination, cooperation and convergence in research, technical development and business application of QKD.

  1. Tsallis distribution as a standard maximum entropy solution with 'tail' constraint

    International Nuclear Information System (INIS)

    Bercher, J.-F.

    2008-01-01

    We show that Tsallis distributions can be derived from the standard (Shannon) maximum entropy setting by incorporating a constraint on the divergence between the distribution and another distribution imagined as its tail. In this setting, we find that the underlying entropy is the Rényi entropy. Furthermore, escort distributions and generalized means appear as a direct consequence of the construction. Finally, the 'maximum entropy tail distribution' is identified as a generalized Pareto distribution.
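
    Schematically, and in common notation rather than necessarily the paper's, the resulting maximum entropy solution is the q-exponential family, with the Rényi entropy as the underlying functional:

```latex
% Tsallis (q-exponential) form of the maximum entropy solution;
% q -> 1 recovers the ordinary exponential and Shannon entropy.
p(x) \;\propto\; \bigl[\,1-(1-q)\,\beta x\,\bigr]^{\frac{1}{1-q}},
\qquad
H_q(p) \;=\; \frac{1}{1-q}\,\ln\!\int p(x)^{q}\,dx .
```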

  2. Linear Estimation of Standard Deviation of Logistic Distribution ...

    African Journals Online (AJOL)

    The paper presents a theoretical method based on order statistics and a FORTRAN program for computing the variance and relative efficiencies of the standard deviation of the logistic population with respect to the Cramer-Rao lower variance bound and the best linear unbiased estimators (BLUEs) when the mean is ...

  3. 78 FR 43817 - Distribution of Reference Biological Standards and Biological Preparations

    Science.gov (United States)

    2013-07-22

    ... HUMAN SERVICES 42 CFR Part 7 RIN 0920-AA53 Distribution of Reference Biological Standards and Biological... sections of its regulations titled ``Distribution of Reference Biological Standards and Biological... section states that HHS/CDC may prepare any biological product described under section 351 of the Public...

  4. Central limit theorems for classical likelihood ratio tests for high-dimensional normal distributions

    OpenAIRE

    Jiang, Tiefeng; Yang, Fan

    2013-01-01

    For random samples of size $n$ obtained from $p$-variate normal distributions, we consider the classical likelihood ratio tests (LRT) for their means and covariance matrices in the high-dimensional setting. These test statistics have been extensively studied in multivariate analysis, and their limiting distributions under the null hypothesis were proved to be chi-square distributions as $n$ goes to infinity and $p$ remains fixed. In this paper, we consider the high-dimensional case where both...
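
    The classical fixed-dimension result that this abstract takes as its starting point is easy to see numerically. The sketch below (an illustration under simple assumptions, not the paper's high-dimensional setting) simulates the likelihood ratio statistic for testing a univariate normal mean with unknown variance; under the null, -2 log Lambda behaves like chi-square with one degree of freedom.

```python
import numpy as np

rng = np.random.default_rng(0)

def lrt_stat(x):
    """-2 log Lambda for H0: mu = 0 vs H1: mu free (normal, unknown variance)."""
    n = len(x)
    s0 = np.mean(x**2)                  # variance MLE under H0 (mu = 0)
    s1 = np.mean((x - x.mean())**2)     # variance MLE under H1 (mu free)
    return n * np.log(s0 / s1)

# Simulate under H0: the statistic should look like chi-square(1)
stats = np.array([lrt_stat(rng.normal(0.0, 2.0, size=200))
                  for _ in range(5000)])
# mean close to 1 (the chi-square(1) mean); about 5% exceed 3.841,
# the 95% quantile of chi-square(1)
```

    The paper's point is that this chi-square limit breaks down when the dimension p grows with n, which is why corrected central limit theorems are needed.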

  5. 77 FR 32916 - Energy Conservation Standards for Distribution Transformers: Public Meeting and Availability of...

    Science.gov (United States)

    2012-06-04

    ...-2010-BT-STD-0048] RIN 1904-AC04 Energy Conservation Standards for Distribution Transformers: Public... information that it is making available about the liquid-immersed distribution transformer equipment classes... equipment classes for several different types of liquid-immersed distribution transformers. In addition to...

  6. 76 FR 45471 - Energy Efficiency Standards for Distribution Transformers; Notice of Intent To Negotiate Proposed...

    Science.gov (United States)

    2011-07-29

    ... EERE-2010-BT-STD-0048] RIN 1904-AC04 Energy Efficiency Standards for Distribution Transformers; Notice...-type distribution transformers. The purpose of the subcommittee will be to discuss and, if possible, reach consensus on a proposed rule for the energy efficiency of distribution transformers, as authorized...

  7. 14 CFR 25.1445 - Equipment standards for the oxygen distributing system.

    Science.gov (United States)

    2010-01-01

    ... distributing system. 25.1445 Section 25.1445 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT... Miscellaneous Equipment § 25.1445 Equipment standards for the oxygen distributing system. (a) When oxygen is supplied to both crew and passengers, the distribution system must be designed for either— (1) A source of...

  8. The number of degrees of freedom for statistical distribution of s wave reduced neutron width for several nuclei

    International Nuclear Information System (INIS)

    Zhixiang, Z.

    1983-01-01

    A least squares fit has been performed using the chi-squared distribution function for all available evaluated data on the s-wave reduced neutron widths of several nuclei. The number of degrees of freedom and the average value have been obtained. The missing levels of weak s-wave resonances and extra p-wave levels have been taken into account, where present. For 75As and 103Rh, the s-wave population has been separated by Bayes' theorem before making the fit. The results thus obtained are consistent with the Porter-Thomas distribution, i.e., the chi-squared distribution with ν=1, as one would expect. This work has not confirmed the suggestion, reported by H.C. Sharma et al. (1976) at the international conference on interactions of neutrons with nuclei, that the number of degrees of freedom for the distribution of s-wave reduced neutron widths might be greater than one. (Auth.)
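
    The quantity being fitted here, the effective number of degrees of freedom ν, can be estimated from reduced widths in a very simple way: for widths normalized to unit mean that follow a chi-square distribution with ν degrees of freedom, the variance equals 2/ν. The sketch below uses this moment relation (a simpler stand-in for the paper's least squares fit) on synthetic Porter-Thomas widths.

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate_dof(widths):
    """Moment estimate of the chi-square degrees of freedom nu:
    for widths normalized to unit mean, var(x) = 2/nu."""
    x = widths / widths.mean()
    return 2.0 / x.var()

# Porter-Thomas: reduced neutron widths follow chi-square with nu = 1
widths = rng.chisquare(df=1, size=20000)
nu_hat = estimate_dof(widths)   # should come out close to 1
```

    On real resonance data, missing weak levels bias such estimates upward, which is exactly why the abstract emphasizes correcting for missed s-wave and spurious p-wave levels.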

  9. IEEE 1547 and 2030 Standards for Distributed Energy Resources Interconnection and Interoperability with the Electricity Grid

    Energy Technology Data Exchange (ETDEWEB)

    Basso, T.

    2014-12-01

    Public-private partnerships have been a mainstay of the U.S. Department of Energy and the National Renewable Energy Laboratory (DOE/NREL) approach to research and development. These partnerships also include technology development that enables grid modernization and distributed energy resources (DER) advancement, especially renewable energy systems integration with the grid. Through DOE/NREL and industry support of Institute of Electrical and Electronics Engineers (IEEE) standards development, the IEEE 1547 series of standards has helped shape the way utilities and other businesses have worked together to realize increasing amounts of DER interconnected with the distribution grid. And more recently, the IEEE 2030 series of standards is helping to further realize greater implementation of communications and information technologies that provide interoperability solutions for enhanced integration of DER and loads with the grid. For these standards development partnerships, for approximately $1 of federal funding, industry partnering has contributed $5. In this report, the status update is presented for the American National Standards IEEE 1547 and IEEE 2030 series of standards. A short synopsis of the history of the 1547 standards is first presented, then the current status and future direction of the ongoing standards development activities are discussed.

  10. Standard services for the capture, processing, and distribution of packetized telemetry data

    Science.gov (United States)

    Stallings, William H.

    1989-01-01

    Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.

  11. Distributions of the Kullback-Leibler divergence with applications.

    Science.gov (United States)

    Belov, Dmitry I; Armstrong, Ronald D

    2011-05-01

    The Kullback-Leibler divergence (KLD) is a widely used measure of the fit between two distributions. In general, the distribution of the KLD is unknown. Under reasonable assumptions, common in psychometrics, the KLD is shown to be asymptotically distributed as a scaled (non-central) chi-square with one degree of freedom or as a scaled (doubly non-central) F. Applications of the KLD to detecting heterogeneous response data are discussed, with particular emphasis on test security. © The British Psychological Society.
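
    The scaled chi-square behaviour of the KLD can be illustrated in the discrete case (a generic sketch, not the psychometric setting of the paper): for a multinomial sample, 2n times the KLD between the empirical and the true distribution is the G statistic, asymptotically chi-square with k-1 degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(2)

def kld(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # 0 * log 0 contributes nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# 2*n*D(p_hat || p) is the G statistic, asymptotically chi-square(k-1)
p = np.array([0.2, 0.3, 0.5])        # true cell probabilities (k = 3)
n = 1000
g = np.array([2 * n * kld(rng.multinomial(n, p) / n, p)
              for _ in range(4000)])
# sample mean of g is close to k-1 = 2
```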

  12. Standardization of 32P activity determination method in soil-root cores for root distribution studies

    International Nuclear Information System (INIS)

    Sharma, R.B.; Ghildyal, B.P.

    1976-01-01

    The root distribution of wheat variety UP 301 was obtained by determining the 32P activity in soil-root cores by two methods, viz., ignition and triacid digestion. The root distribution obtained by these two methods was compared with that from the standard root-core washing procedure. The percent error in root distribution as determined by the triacid digestion method was within ±2.1 to ±9.0, as against ±5.5 to ±21.2 for the ignition method. The triacid digestion method thus proved better than the ignition method. (author)

  13. 78 FR 57293 - Distribution of Reference Biological Standards and Biological Preparations

    Science.gov (United States)

    2013-09-18

    ... From the Federal Register Online via the Government Publishing Office ] DEPARTMENT OF HEALTH AND HUMAN SERVICES 42 CFR Part 7 RIN 0920-AA52 Distribution of Reference Biological Standards and Biological Preparations AGENCY: Centers for Disease Control and Prevention (HHS/CDC), Department of Health and Human...

  14. Standard Error of Linear Observed-Score Equating for the NEAT Design with Nonnormally Distributed Data

    Science.gov (United States)

    Zu, Jiyun; Yuan, Ke-Hai

    2012-01-01

    In the nonequivalent groups with anchor test (NEAT) design, the standard error of linear observed-score equating is commonly estimated by an estimator derived assuming multivariate normality. However, real data are seldom normally distributed, causing this normal estimator to be inconsistent. A general estimator, which does not rely on the…

  15. Practical Statistics for Particle Physics Analyses: Chi-Squared and Goodness of Fit (2/4)

    CERN Multimedia

    CERN. Geneva; Moneta, Lorenzo

    2016-01-01

    This will be a 4-day series of 2-hour sessions as part of CERN's Academic Training Course. Each session will consist of a 1-hour lecture followed by one hour of practical computing, with exercises based on that day's lecture. While it is possible to follow just the lectures or just the computing exercises, we highly recommend that, because of the way this course is designed, participants come to both parts. In order to follow the hands-on exercise sessions, students need to bring their own laptops. The exercises will be run on a dedicated CERN web notebook service, SWAN (swan.cern.ch), which is open to everybody holding a CERN computing account. Using the SWAN service requires a CERN account and access to CERNBox, the shared storage service at CERN. New users are invited to activate CERNBox beforehand by simply connecting to https://cernbox.cern.ch. A basic prior knowledge of ROOT and C++ is also recommended for participation in the practical session....
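
    As a standalone illustration of the lecture's topic (not course material; the bin counts are invented), the Pearson chi-squared goodness-of-fit statistic for binned data is simply the sum of squared deviations of observed from expected counts, each scaled by the expected count:

```python
import numpy as np

def chi2_gof(observed, expected):
    """Pearson chi-squared statistic for binned data."""
    observed = np.asarray(observed, float)
    expected = np.asarray(expected, float)
    return float(np.sum((observed - expected) ** 2 / expected))

# Hypothetical histogram: 60 events over 6 equal-probability bins,
# so 10 expected per bin
observed = [8, 12, 9, 11, 10, 10]
expected = [10.0] * 6
stat = chi2_gof(observed, expected)
# compare stat against the chi-squared distribution with 5 degrees
# of freedom (6 bins minus 1 for the fixed total)
```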

  16. Chi-Square Discriminators for Transiting Planet Detection in Kepler Data

    OpenAIRE

    Seader, Shawn; Tenenbaum, Peter; Jenkins, Jon M.; Burke, Christopher J.

    2013-01-01

    The Kepler spacecraft observes a host of target stars to detect transiting planets. Requiring a 7.1 sigma detection in twelve quarters of data yields over 100,000 detections, many of which are false alarms. After a second cut is made on a robust detection statistic, some 50,000 or more targets still remain. These false alarms waste resources as they propagate through the remainder of the software pipeline and so a method to discriminate against them is crucial in maintaining the desired sensi...

  17. Learning Word Embeddings with Chi-Square Weights for Healthcare Tweet Classification

    Directory of Open Access Journals (Sweden)

    Sicong Kuang

    2017-08-01

    Full Text Available Twitter is a popular source for monitoring healthcare information and public disease. However, tweets contain much noise: even when appropriate keywords appear in a tweet, they do not guarantee that it is truly health-related, so the traditional keyword-based classification task is largely ineffective. Word embedding algorithms have proved useful in many natural language processing (NLP) tasks. We introduce two algorithms based on an existing word embedding learning algorithm, the continuous bag-of-words model (CBOW), and apply them to the task of recognizing healthcare-related tweets. In the CBOW model, the vector representation of a word is learned from its contexts. To simplify the computation, the context is represented by an average of all words inside the context window. However, not all words in the context window contribute equally to the prediction of the target word: greedily incorporating all of them limits the contribution of the useful semantic words and brings noisy or irrelevant words into the learning process. Some existing word embedding algorithms do learn a weighted CBOW model, but their weights are based on pre-defined syntactic rules and ignore the task for which the embedding is learned. We propose learning weights based on the words' relative importance in the classification task. Our intuition is that such learned weights place more emphasis on words that have comparatively more to contribute to the later task. We evaluate the embeddings learned by our algorithms on two healthcare-related datasets. The experimental results demonstrate that embeddings learned by the proposed algorithms outperform existing techniques by a relative accuracy improvement of over 9%.
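
    A chi-square weight for a word, as used in feature selection and (per the title) as a task-aware word weight, measures how strongly the word's presence is associated with the class label. The sketch below computes the standard chi-square statistic from a 2x2 word/class contingency table; it illustrates the weighting idea only and is not the paper's full CBOW modification.

```python
def chi2_term_weight(n11, n10, n01, n00):
    """Chi-square statistic for a 2x2 term/class contingency table:
    n11 = in-class docs containing the term, n10 = in-class docs without it,
    n01 = out-of-class docs with the term, n00 = out-of-class docs without it."""
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n01) * (n10 + n00) * (n11 + n10) * (n01 + n00)
    return num / den if den else 0.0
```

    A word distributed independently of the class gets weight 0, while a word perfectly aligned with the class gets the maximum weight N; normalizing such weights over a context window emphasizes class-indicative words in the averaged context vector.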

  18. A study of standard building blocks for the design of fault-tolerant distributed computer systems

    Science.gov (United States)

    Rennels, D. A.; Avizienis, A.; Ercegovac, M.

    1978-01-01

    This paper presents the results of a study that has established a standard set of four semiconductor VLSI building-block circuits. These circuits can be assembled with off-the-shelf microprocessors and semiconductor memory modules into fault-tolerant distributed computer configurations. The resulting multi-computer architecture uses self-checking computer modules backed up by a limited number of spares. A redundant bus system is employed for communication between computer modules.

  19. Probability distribution function values in mobile phones;Valores de funciones de distribución de probabilidad en el teléfono móvil

    Directory of Open Access Journals (Sweden)

    Luis Vicente Chamorro Marcillllo

    2013-06-01

    Full Text Available Engineering, in both its academic and applied forms, as well as any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Managing those tables poses physical problems (wasteful transport and consultation) and operational ones (incomplete lists and limited accuracy). The study "Probability distribution function values in mobile phones" determined, through a needs survey applied to students involved in statistics studies at Universidad de Nariño, that the best-known and most-used values correspond to the Chi-Square, Binomial, Student's t, and Standard Normal distributions. Similarly, it showed users' interest in having the values in question available through an alternative means to correct, at least in part, the problems presented by "the famous tables". To contribute to the solution, we built software that allows the values of the probability distribution functions most commonly used to be obtained immediately and dynamically on mobile phones.
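
    Replacing the printed tables amounts to evaluating the distribution functions directly. As a minimal sketch (standard formulas, not the study's software), the standard normal CDF follows from the error function, and the chi-square CDF from the regularized lower incomplete gamma function, computed here by its power series:

```python
import math

def normal_cdf(x):
    """Standard normal cumulative distribution function via erf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def chi2_cdf(x, k):
    """Chi-square CDF with k degrees of freedom: the regularized lower
    incomplete gamma function P(k/2, x/2), via its power series."""
    if x <= 0:
        return 0.0
    a, s = k / 2.0, x / 2.0
    term = 1.0 / a                 # first series term: 1/a
    total = term
    n = 0
    while True:                    # sum x^n / (a (a+1) ... (a+n))
        n += 1
        term *= s / (a + n)
        total += term
        if term < 1e-12 * total:
            break
    # multiply by s^a e^{-s} / Gamma(a), in log space for stability
    return total * math.exp(-s + a * math.log(s) - math.lgamma(a))
```

    For example, `normal_cdf(1.96)` is about 0.975 and `chi2_cdf(3.841, 1)` about 0.95 — the familiar tabulated values.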

  20. Standard test method for distribution coefficients of inorganic species by the batch method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This test method covers the determination of distribution coefficients of chemical species to quantify uptake onto solid materials by a batch sorption technique. It is a laboratory method primarily intended to assess sorption of dissolved ionic species subject to migration through pores and interstices of site specific geomedia. It may also be applied to other materials such as manufactured adsorption media and construction materials. Application of the results to long-term field behavior is not addressed in this method. Distribution coefficients for radionuclides in selected geomedia are commonly determined for the purpose of assessing potential migratory behavior of contaminants in the subsurface of contaminated sites and waste disposal facilities. This test method is also applicable to studies for parametric studies of the variables and mechanisms which contribute to the measured distribution coefficient. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement a...
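
    The batch-method distribution coefficient itself is a simple mass balance between solution and solid. A minimal sketch of the commonly used relation (symbol names are ours, and the specific test conditions of the standard are not reproduced here):

```python
def distribution_coefficient(c0, ceq, volume_ml, mass_g):
    """Batch-method distribution coefficient Kd (mL/g):
    Kd = ((C0 - Ceq) / Ceq) * (V / m), where C0 and Ceq are the initial
    and equilibrium solution concentrations (same units), V the solution
    volume in mL and m the mass of solid in g."""
    return (c0 - ceq) / ceq * volume_ml / mass_g

# Example: initial 100, equilibrium 20 (concentration units cancel),
# 30 mL of solution contacted with 1 g of geomedium
kd = distribution_coefficient(100.0, 20.0, 30.0, 1.0)   # 120 mL/g
```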

  1. Regional drought assessment using a distributed hydrological model coupled with Standardized Runoff Index

    Directory of Open Access Journals (Sweden)

    H. Shen

    2015-05-01

    Full Text Available Drought assessment is essential for coping with the frequent droughts of recent years. Owing to the large spatio-temporal variations in hydrometeorology in most regions of China, it is necessary to use a physically based hydrological model to produce rational spatial and temporal distributions of hydro-meteorological variables for drought assessment. In this study, the large-scale distributed hydrological model Variable Infiltration Capacity (VIC) was coupled with a modified standardized runoff index (SRI) for drought assessment in the Weihe River basin, northwest China. The results indicate that the coupled model is capable of reasonably reproducing the spatial distribution of drought occurrence. It reflects the spatial heterogeneity of regional drought and improves the physical basis of the SRI. The model also has potential for drought forecasting, early warning and mitigation, provided that accurate meteorological forcing data are available.
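
    An SRI, like the SPI it mirrors, maps aggregated runoff to standard normal quantiles: fit a distribution (commonly a gamma) to the runoff series, then transform each cumulative probability through the inverse normal CDF. The sketch below uses a rank-based empirical CDF instead of a fitted gamma, a simplification we adopt to keep the example self-contained:

```python
from statistics import NormalDist

def standardized_runoff_index(runoff):
    """Empirical-CDF variant of the SRI: rank-based plotting positions
    mapped through the inverse standard normal CDF. (The usual
    formulation fits a parametric distribution, e.g. a gamma, instead.)"""
    n = len(runoff)
    order = sorted(range(n), key=lambda i: runoff[i])
    nd = NormalDist()
    sri = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p = (rank - 0.44) / (n + 0.12)   # Gringorten plotting position
        sri[i] = nd.inv_cdf(p)
    return sri
```

    Negative SRI values then flag drier-than-median periods, with thresholds (e.g. below -1, below -2) marking increasing drought severity.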

  2. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    Science.gov (United States)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions, which limits the utilization of powerful parametric statistical models for analyzing SUV measurements. An ad hoc approach, frequently used in practice, is to blindly apply a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors, and to assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions both pre and post treatment. This study included 57 patients who underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (a publicly available dataset). In addition, to test the generality of our transformation methodology, we included an analysis of 27 patients who underwent 18F-fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for the 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. For both 18F-FDG and 18F-FLT there were types of distributions encountered for which a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when
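
    The λ-grid search described in the Methods can be sketched as below. Note one substitution: the paper maximizes the Shapiro-Wilk P-value, whereas this self-contained sketch minimizes absolute sample skewness as a lighter normality proxy; the SUV data are simulated as lognormal.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform of positive data x with parameter lam."""
    return np.log(x) if abs(lam) < 1e-12 else (x**lam - 1.0) / lam

def best_lambda(x, grid=np.linspace(-2, 2, 81)):
    """Pick the Box-Cox parameter giving the least-skewed transformed
    sample. (The paper's criterion is the Shapiro-Wilk P-value;
    absolute skewness is used here as a simple stand-in.)"""
    def skew(v):
        v = v - v.mean()
        return np.mean(v**3) / np.mean(v**2) ** 1.5
    return min(grid, key=lambda lam: abs(skew(boxcox(x, lam))))

rng = np.random.default_rng(3)
suv = rng.lognormal(mean=1.0, sigma=0.5, size=500)   # skewed, SUV-like
lam = best_lambda(suv)   # near 0: a log transform symmetrizes lognormal data
```

    For genuinely lognormal data the search recovers λ near 0 (the log transform), but, as the Results note, real post-treatment SUV distributions can drive the optimum well away from the log case.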

  3. GLN standard as a facilitator of physical location identification within process of distribution

    Directory of Open Access Journals (Sweden)

    Davor Dujak

    2017-09-01

    Full Text Available Background: Distribution, from the business point of view, is a set of decisions and actions that provide the right products at the right time and place, in line with customer expectations. It is a process that generates significant cost but, effectively implemented, also significantly affects the positive perception of the company. The Institute of Logistics and Warehousing (IliM), based on research results related to the optimization of distribution networks and on consulting projects for companies, points to the high importance of correctly describing physical locations within supply chains in order to make transport processes more effective. Individual companies work on their own geocoding of warehouse locations and of the locations of their business partners (suppliers, customers), but the lack of standardization in this area causes delays related to delivery problems with reaching the right destination. Furthermore, the cooperating companies have no precise indication of the operating conditions of each location, e.g. the time windows of a plant, the logistic units accepted at particular sites, the supported transport, etc. The lack of this information generates additional costs associated with re-operation, as well as the opportunity costs of goods not arriving on time. The solution to this problem appears to be a wide-scale implementation of the GS1 Global Location Number (GLN) standard, which, thanks to a broad base of information, will assist distribution processes. Material and methods: The results of a survey conducted among Polish companies in the second half of 2016 indicate an unsatisfactory execution of transport processes, resulting from incorrect or inaccurate descriptions of locations, and thus a significant number of errors in deliveries. Accordingly, the authors studied the literature and examined case studies indicating the possibility of using the GLN standard to identify physical locations and to show the

  4. Shock Reactivity Study on Standard and Reduced Sensitivity Rdx of Different Particle Size Distributions

    Science.gov (United States)

    McGregor, N. M.; Lindfors, A. J.

    2007-12-01

    Embedded gauge experiments have been performed using a three inch high velocity powder gun to assess the effects of RDX particle size and crystal quality on shock induced reactivity in support of the Combat Safe Insensitive Munitions (CSIM) program. Four monomodal experimental compositions containing 73% solids loading by weight and 27% HTPB binder were tested. The compositions were made using either standard or reduced sensitivity grades of RDX in Class 5 or Class 1 150-300 micron sieve cut particle size classes. Results have shown marked changes in the mode of reaction between the two particle size classes. Both RDX grades at the Class 1 sieve cut particle size distribution showed significant reaction at the shock front as well as behind the front. The Class 5 RDX compositions however showed little reaction at the shock front with rapid growth behind the front. Reaction modes were similar but occurring at greater input pressures for the reduced sensitivity grade of RDX compared to the corresponding particle size distribution standard grade RDX counterpart.

  5. British Standard method for determining the luminance distribution of electro-optical x-ray image intensifiers

    International Nuclear Information System (INIS)

    1982-01-01

    Under the direction of the Light Electrical Engineering Standards Committee, a British Standard method has been prepared for determining the luminance distribution of electro-optical X-ray image intensifiers. The luminance distribution is determined from the measurement of the luminance over the area of the output image related to conditions of uniform exposure rate in the entrance plane of an electro-optical X-ray image intensifier. (U.K.)

  6. Establishment of a Standard Analytical Model of Distribution Network with Distributed Generators and Development of Multi Evaluation Method for Network Configuration Candidates

    Science.gov (United States)

    Hayashi, Yasuhiro; Kawasaki, Shoji; Matsuki, Junya; Matsuda, Hiroaki; Sakai, Shigekazu; Miyazaki, Teru; Kobayashi, Naoki

    Since a distribution network has many sectionalizing switches, there is a huge number of radial network configuration candidates determined by the states (open or closed) of the sectionalizing switches. Recently, the number of distributed generators, such as photovoltaic and wind turbine generation systems, connected to the distribution network has increased drastically. A distribution network with distributed generators must be operated so as to maintain reliability of power supply and power quality. Therefore, the many configurations of such a network must be evaluated from various viewpoints, such as distribution loss, total harmonic distortion, voltage imbalance and so on. In this paper, the authors propose a multi evaluation method to assess the distribution network configuration candidates that satisfy the voltage and line-current limits from three viewpoints: (1) distribution loss, (2) total harmonic distortion and (3) voltage imbalance. After establishing a standard analytical model of a three-sectionalized, three-connected distribution network configuration with distributed generators based on practical data, a multi evaluation of the established model is carried out using the proposed method based on EMTP (the Electro-Magnetic Transients Program).
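
    The screening-and-ranking step can be sketched as follows. This is only an illustration of the multi-viewpoint idea: the candidate names and numbers are invented, and the paper's actual evaluation (EMTP simulation of each configuration) is replaced by pre-computed indices combined in a normalized weighted sum.

```python
# Hypothetical per-candidate evaluation results:
# (name, loss_kW, THD_percent, voltage_imbalance_percent, constraints_ok)
candidates = [
    ("config-A", 42.0, 3.1, 0.8, True),
    ("config-B", 39.5, 4.9, 1.2, True),
    ("config-C", 37.0, 2.8, 0.6, False),   # violates a line-current limit
]

def rank(cands, w_loss=1.0, w_thd=1.0, w_imb=1.0):
    """Drop candidates violating voltage/current constraints, then rank
    the rest by a min-max-normalized weighted sum of the three indices
    (lower is better)."""
    ok = [c for c in cands if c[4]]
    def norm(i):
        vals = [c[i] for c in ok]
        lo, hi = min(vals), max(vals)
        return lambda v: 0.0 if hi == lo else (v - lo) / (hi - lo)
    f1, f2, f3 = norm(1), norm(2), norm(3)
    return sorted(ok, key=lambda c: w_loss * f1(c[1])
                                    + w_thd * f2(c[2])
                                    + w_imb * f3(c[3]))
```

    Adjusting the weights shifts the trade-off among loss, harmonic distortion and imbalance without re-running the per-candidate evaluations.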

  7. Thermal neutron standard field with a Maxwellian distribution using the KUR heavy water facility

    International Nuclear Information System (INIS)

    Kanda, K.; Kobayashi, K.; Okamoto, S.; Shibata, T.

    1978-01-01

    A heavy water facility attached to the KUR (Kyoto University Reactor, swimming pool type, 5 MW) yields pure thermal neutrons with a Maxwellian distribution. The facility is placed next to the core of the KUR and contains about 2 t of heavy water. The width of the heavy water layer is about 140 cm. The neutron spectrum was measured with the time-of-flight technique using a fast chopper. The measured spectrum was in good agreement with a Maxwellian distribution over the whole thermal energy region. The neutron temperature was slightly higher than the heavy water temperature. The contamination by epithermal and fast neutrons caused by photo-neutrons from the γ-n reaction in heavy water is very small. The maximum intensity of thermal neutrons is 3 × 10^11 n/cm^2·s. When a bismuth scatterer is attached, the gamma-ray contamination is decreased to a ratio of 0.05 of gamma rays to neutrons in rem. This standard neutron field has been used for experiments such as thermal neutron cross-section measurement, diffusion length measurement, detector calibration and activation analysis, and for biomedical purposes. (Auth.)

  8. Mt. Graham: optical turbulence vertical distribution with standard and high resolution

    Science.gov (United States)

    Masciadri, Elena; Stoesz, Jeff; Hagelin, Susanna; Lascaux, Franck

    2010-07-01

    A characterization of the vertical distribution of optical turbulence and of the main integrated astroclimatic parameters derived from the C_N^2 and wind speed profiles above Mt. Graham is presented. The statistics include measurements from 43 nights obtained with a Generalized Scidar (GS) used in standard configuration, with a vertical resolution of ~1 km over the whole 20-22 km, and with the new technique (HVR-GS) in the first kilometer, which achieves a resolution of ~20-30 m in this region of the atmosphere. Measurements taken in different periods of the year permit a seasonal-variation analysis of the C_N^2. A discretized distribution of the typical C_N^2 profiles, useful for Ground Layer Adaptive Optics (GLAO) simulations, is provided, and a specific analysis for the case of the LBT Laser Guide Star system ARGOS is carried out, including the calculation of the 'gray zones' for the J, H and K bands. Mt. Graham is confirmed to be an excellent site, with a median seeing without dome contribution of 0.72", an isoplanatic angle of 2.5" and a wavefront coherence time of 4.8 ms. We provide a cumulative distribution of the percentage of turbulence developed below a height H* in the (0, 1 km) range, finding that 50% of the whole turbulence develops in the first 80 m from the ground. The turbulence decreasing rate is very similar to what has been observed above Mauna Kea.

  9. A Virtual Geophysical Network: Using Industry Standard Technology to Link Geographically Distributed Sensors and Data Centers

    Science.gov (United States)

    Ahern, T. K.; Benson, R. B.; Crotwell, H. P.

    2003-12-01

    The IRIS Data Management System has long supported distributed data centers as a method of providing scientific researchers access to data from seismological networks around the world. For nearly a decade, the NetDC system used email as the method through which users could access data centers located around the globe in a seamless fashion. More recently the IRIS DMC has partnered with the University of South Carolina to develop a new method through which a virtual data center can be created. The Common Object Request Broker Architecture (CORBA) technology is an industry standard distributed computing architecture. Traditionally used by major corporations, IRIS has developed a Data Handling Interface (DHI) system that is capable of connecting services at participating data centers (servers) to applications running on end-users computing platforms (clients). For seismology we have identified three services. 1) A network service that provides information about geophysical observatories around the world such as where the sensors exist, what types of information are recorded on the sensors, and calibration information that allows proper use of the data, 2) an event service that allows applications to access information about earthquakes and seismological events and 3) waveform services that allow users to gain access to seismograms or time series data from other geophysical sensors. Seismological Data Centers operate the servers thereby allowing a variety of client applications to directly access the information at these data centers. Currently IRIS, the U. of South Carolina, UC Berkeley, and a European Data Center (ORFEUS) have been involved in the DHI project. This talk will highlight some of the DHI enabled clients that allow geophysical information to be directly transferred to the clients. Since the data center servers appear with the same interface specification (Interface Definition Language) a client that can talk to one DHI server can talk to any DHI enabled

  10. Autoregressive moving average fitting for real standard deviation in Monte Carlo power distribution calculation

    International Nuclear Information System (INIS)

    Ueki, Taro

    2010-01-01

    The noise propagation of tallies in the Monte Carlo power method can be represented by an autoregressive moving average process of orders p and p-1 (ARMA(p,p-1)), where p is an integer greater than or equal to two. The formula for the autocorrelation of ARMA(p,q), p≥q+1, indicates that ARMA(3,2) fitting is equivalent to lumping the eigenmodes of fluctuation propagation into three modes: the slow, intermediate and fast attenuation modes. Therefore, ARMA(3,2) fitting was applied to the real standard deviation estimation of fuel assemblies at particular heights. The numerical results show that straightforward ARMA(3,2) fitting is promising, but a stability issue must be resolved before incorporation into the distributed version of production Monte Carlo codes. The same numerical results reveal that the average performance of ARMA(3,2) fitting is equivalent to that of the batch method in MCNP with a batch size larger than one hundred and smaller than two hundred cycles for a 1100 MWe pressurized water reactor. The bias correction of low-lag autocovariances in MVP/GMVP is demonstrated to have the potential to improve the average performance of ARMA(3,2) fitting. (author)
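
The batch method used above as the performance baseline is simple to illustrate: consecutive cycles are grouped into batches so that correlation between cycles is absorbed into the batch means. The sketch below is a generic illustration only; the AR(1) surrogate tally series and all names are assumptions for illustration, not taken from MCNP, MVP/GMVP, or the paper.

```python
import random
import statistics

def batch_std_of_mean(tallies, batch_size):
    """Estimate the real standard deviation of the mean of a correlated
    tally series by grouping consecutive cycles into batches and taking
    the standard error of the batch means."""
    n_batches = len(tallies) // batch_size
    batch_means = [
        statistics.fmean(tallies[i * batch_size:(i + 1) * batch_size])
        for i in range(n_batches)
    ]
    # Standard error of the grand mean estimated from batch means.
    return statistics.stdev(batch_means) / n_batches ** 0.5

# Synthetic correlated tally series (AR(1) noise stands in for the
# inter-cycle correlation of a Monte Carlo power iteration).
random.seed(0)
x, tallies = 0.0, []
for _ in range(20_000):
    x = 0.9 * x + random.gauss(0.0, 1.0)
    tallies.append(1.0 + x)

naive = statistics.stdev(tallies) / len(tallies) ** 0.5
batched = batch_std_of_mean(tallies, batch_size=150)
print(naive < batched)  # positive correlation inflates the real uncertainty
```

The naive per-cycle estimate underestimates the true uncertainty when cycles are positively correlated, which is the effect both ARMA fitting and batching aim to correct.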

  11. [Development of national and international standards of population age distribution for medical statistics, health-demographic analysis and risk assessment].

    Science.gov (United States)

    Demin, V F; Pal'tsev, M A; Chaban, E A

    2013-01-01

    The current European standard (CES) and the world standard of population age distribution are widely used in medical and demographic studies performed by international (WHO, etc.) and national organizations. The Russian Federal Service of State Statistics (RosStat) uses the CES in demographic yearbooks and other publications. The standard is applied in the calculation of the standardized mortality rate (SMR) of populations in different countries and territories. The CES is also used in risk assessment. The standards are based on the idea of assessing mortality against a uniform standard, so as to make it possible to compare the mortality rates of populations in different countries and regions, of different genders, and in different calendar years. Analysis of the results of test calculations of SMR values for the population of Russia and other countries using the current standards revealed serious shortcomings of the latter and set the task of improving them. A new concept for the development of standards, based on the concepts of the stable equilibrium age distribution of a population and the survivorship function, is proposed.
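
The direct-standardization idea underlying such comparisons can be sketched as follows: each age-specific rate is weighted by the standard population's share of that age group, so that populations with different age structures become comparable. The age bands, rates, and weights below are hypothetical, not the actual CES or world-standard weights.

```python
def age_standardized_rate(age_specific_rates, standard_weights):
    """Directly standardized rate: weight each age-specific rate by the
    standard population's share of that age group."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(age_specific_rates, standard_weights))

# Hypothetical deaths-per-1000 rates for three age bands, and a
# hypothetical standard age distribution.
rates_region_a = [1.0, 5.0, 40.0]
rates_region_b = [2.0, 4.0, 30.0]
standard = [0.4, 0.4, 0.2]

print(age_standardized_rate(rates_region_a, standard))  # 10.4
print(age_standardized_rate(rates_region_b, standard))  # 8.4
```

Because both regions are weighted by the same standard distribution, the resulting rates can be compared directly, which is exactly the property the choice of standard weights is meant to guarantee.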

  12. A data science based standardized Gini index as a Lorenz dominance preserving measure of the inequality of distributions.

    Directory of Open Access Journals (Sweden)

    Alfred Ultsch

    Full Text Available The Gini index is a measure of the inequality of a distribution that can be derived from Lorenz curves. While commonly used in, e.g., economic research, it suffers from ambiguity via lack of Lorenz dominance preservation. Here, investigation of large sets of empirical distributions of incomes of the World's countries over several years indicated, firstly, that the Gini indices are centered on a value of 33.33%, corresponding to the Gini index of the uniform distribution, and, secondly, that the Lorenz curves of these distributions are consistent with Lorenz curves of log-normal distributions. This can be employed to provide a Lorenz dominance preserving equivalent of the Gini index. Therefore, a modified measure based on log-normal approximation and standardization of Lorenz curves is proposed. The so-called UGini index provides a meaningful and intuitive standardization on the uniform distribution, as this characterizes societies that provide equal chances. The novel UGini index preserves Lorenz dominance. Analysis of the probability density distributions of the UGini index of the World's countries' income data indicated multimodality in two independent data sets. Applying Bayesian statistics provided a data-based classification of the World's countries' income distributions. The UGini index can be re-transferred into the classical index to preserve comparability with previous research.
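
The classical Gini index that this construction starts from can be computed directly from sorted incomes, as a discrete form of twice the area between the Lorenz curve and the diagonal of equality. The sketch below shows only the classical index; the log-normal approximation and standardization steps of the UGini are not reproduced here.

```python
def gini_index(incomes):
    """Gini index from the discrete Lorenz curve of a sample of
    non-negative incomes (0 = perfect equality, -> 1 = total concentration)."""
    x = sorted(incomes)
    n = len(x)
    total = sum(x)
    if total == 0:
        return 0.0
    # Closed form equivalent to twice the area between the Lorenz
    # curve and the diagonal of equality.
    weighted = sum((i + 1) * xi for i, xi in enumerate(x))
    return 2.0 * weighted / (n * total) - (n + 1) / n

print(gini_index([1, 1, 1, 1]))   # 0.0  -- perfect equality
print(gini_index([0, 0, 0, 1]))   # 0.75 -- extreme concentration
```

A distribution that Lorenz-dominates another (its Lorenz curve lies everywhere at or above the other's) always receives a smaller or equal value from this formula; the ambiguity the paper addresses arises when Lorenz curves cross.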

  13. A data science based standardized Gini index as a Lorenz dominance preserving measure of the inequality of distributions.

    Science.gov (United States)

    Ultsch, Alfred; Lötsch, Jörn

    2017-01-01

    The Gini index is a measure of the inequality of a distribution that can be derived from Lorenz curves. While commonly used in, e.g., economic research, it suffers from ambiguity via lack of Lorenz dominance preservation. Here, investigation of large sets of empirical distributions of incomes of the World's countries over several years indicated, firstly, that the Gini indices are centered on a value of 33.33%, corresponding to the Gini index of the uniform distribution, and, secondly, that the Lorenz curves of these distributions are consistent with Lorenz curves of log-normal distributions. This can be employed to provide a Lorenz dominance preserving equivalent of the Gini index. Therefore, a modified measure based on log-normal approximation and standardization of Lorenz curves is proposed. The so-called UGini index provides a meaningful and intuitive standardization on the uniform distribution, as this characterizes societies that provide equal chances. The novel UGini index preserves Lorenz dominance. Analysis of the probability density distributions of the UGini index of the World's countries' income data indicated multimodality in two independent data sets. Applying Bayesian statistics provided a data-based classification of the World's countries' income distributions. The UGini index can be re-transferred into the classical index to preserve comparability with previous research.

  14. OVERVIEW OF EVALUATION OF DISTRIBUTION OF CERTIFICATES OF STANDARDS AND ISO 9000 ISO 14000 IN NATIONAL AND FOREIGN COMPANIES

    OpenAIRE

    GILSON EDUARDO TARRENTO; CELSO FERNANDES JOAQUIM JUNIOR

    2010-01-01

    Quality is considered an essential factor for the competitiveness of companies, due to the necessity of improving processes and making better use of their resources. In this context, the standardization of processes can assist in improving customer satisfaction and contribute to the sustainability and image of organizations. Considering that quality certification leads to standardization, this paper aimed to discuss the outlook of the distribution of the certificates of ISO 9000 and ISO 14000, gr...

  15. Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ronald [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Neymark, Joel [J. Neymark & Associates; Kennedy, Mike D. [Mike D. Kennedy, Inc.; Gall, J. [AAON, Inc.; Henninger, R. [GARD Analytics, Inc.; Hong, T. [Lawrence Berkeley National Laboratory; Knebel, D. [AAON, Inc.; McDowell, T. [Thermal Energy System Specialists, LLC; Witte, M. [GARD Analytics, Inc.; Yan, D. [Tsinghua University; Zhou, X. [Tsinghua University

    2017-08-07

    This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC Equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.

  16. Designing a Distributed Space Systems Simulation in Accordance with the Simulation Interoperability Standards Organization (SISO)

    Science.gov (United States)

    Cowen, Benjamin

    2011-01-01

    Simulations are essential for engineering design. These virtual realities provide characteristic data to scientists and engineers in order to understand the details and complications of the desired mission. A standard simulation development package known as Trick is used to develop source code that models a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for a federation execution, to develop source code for communication between federates, and to foster data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, thereby relying on simulation interoperability. Participants communicated throughout the semester to work out how to create this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot and Mobile ISRU Plant) as well as transporting materials to the desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit. Materials that it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from the participating universities will work on and engineer their own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper will focus on the Lunar Rover federate.

  17. The rate commitment to ISO 214 standard among the persian abstracts of approved research projects at school of health management and medical informatics, Isfahan University of Medical Sciences, Isfahan, Iran.

    Science.gov (United States)

    Papi, Ahmad; Khalaji, Davoud; Rizi, Hasan Ashrafi; Shabani, Ahmad; Hassanzadeh, Akbar

    2014-01-01

    Commitment to abstracting standards has a very significant role in information retrieval. The present research aimed to evaluate the rate of commitment to the ISO 214 standard among the Persian abstracts of approved research projects at the School of Health Management and Medical Informatics, Isfahan University of Medical Sciences, Isfahan, Iran. This descriptive study used a researcher-made checklist to collect data, which was then analyzed through content analysis. The studied population consisted of 227 approved research projects in the School of Health Management and Medical Informatics, Isfahan University of Medical Sciences, during 2001-2010. The validity of the checklist was measured by face and content validity. Data was collected through direct observations. Statistical analyses including descriptive (frequency distribution and percent) and inferential statistics (chi-square test) were performed in SPSS-16. The highest and lowest commitment rates to the ISO 214 standard were in using third person pronouns (100%) and using active verbs (34.4%), respectively. In addition, the highest commitment rates to the ISO 214 standard (100%) related to using third person pronouns, starting the abstract with a sentence explaining the subject of the research, abstract placement, and including keywords in 2009. On the other hand, during 2001-2003, the lowest commitment rate was observed in reporting research findings (16.7%). Moreover, the various educational groups differed significantly only in commitment to stating study goals, providing research findings, and abstaining from using abbreviations, signs, and acronyms. Furthermore, the educational level of the corresponding author was significantly related to extracting the keywords from the text. Other factors of the ISO 214 standard did not have significant relations with the educational level of the corresponding author. In general, a desirable rate of commitment to the ISO 214 standard was observed among the Persian abstracts of approved research

  18. Critical factors influencing hospitals' adoption of HL7 version 2 standards: an empirical investigation.

    Science.gov (United States)

    Lin, Chi-Hung; Lin, I-Chun; Roan, Jin-Sheng; Yeh, Jehn-Shan

    2012-06-01

    Industry predictions focus on future e-hospitals that will integrate all stakeholders into a seamless network, allowing data to be shared. Health Level Seven (HL7) is a standard for the interchange of data within the healthcare industry. It simplifies communication interfaces and allows interoperability among heterogeneous applications. Although the benefits of adopting HL7 are well known, only a few hospitals in Taiwan have actually adopted it. What are the reasons behind the hospitals' lack of intention to adopt HL7? Most prior studies on HL7 have focused on technical issues and generally overlooked the managerial side. This has caused a lack of understanding of the factors influencing hospitals' decisions on HL7 adoption. In fact, the main reasons behind a hospital's decision on whether to adopt an innovative technology are more often related to organizational than purely technical issues. Hence, we paid attention to these organizational considerations regarding HL7 adoption. Based on the Innovation Diffusion Theory, we proposed a research model to explore the critical factors influencing Taiwan hospitals' intention to adopt HL7. 472 questionnaires were distributed to all accredited hospitals in Taiwan and 122 were returned. The valid response rate was 25.21% (119). Factor analysis, logistic regression and the Pearson chi-square test were conducted to verify the research model. The results showed that environmental pressure, top management attitude towards HL7, staff's technology capability, system integrity, and hospital scale were critical factors influencing hospitals' intention on whether to adopt HL7. The research findings provide the government, the healthcare industry, hospital administrators and academia with practical and theoretical references. These factors should be considered in planning promotion programs to encourage hospital adoption of HL7. This study also opens up a new research direction as well as a new viewpoint, and consequentially

  19. The negotiation of quality standards: A social interactionists approach to fruit and vegetable distribution in Argentina

    NARCIS (Netherlands)

    Arce, A.M.G.; Viteri, M.L.

    2013-01-01

    This article addresses food quality standards. It suggests that writing on standards creates a flat view of the subject, failing to grasp the richness of the multiple self-organizing practices that shape quality within functioning markets. The article documents the social dimension of quality and

  20. Investigation of element distribution and homogeneity of TXRF samples using SR-micro-XRF to validate the use of an internal standard and improve external standard quantification.

    Science.gov (United States)

    Horntrich, C; Smolek, S; Maderitsch, A; Simon, R; Kregsamer, P; Streli, C

    2011-06-01

    Total reflection X-ray fluorescence analysis (TXRF) offers nondestructive qualitative and quantitative analysis of trace elements. Due to its outstanding properties, TXRF is widely used in the semiconductor industry for the analysis of silicon wafer surfaces and in the chemical analysis of liquid samples. Two problems occur in quantification: the large statistical uncertainty in wafer surface analysis and the validity of using an internal standard in chemical analysis. In general, TXRF is known to allow for linear calibration. For small sample amounts (low nanogram (ng) region) the thin film approximation is valid, neglecting absorption effects of the exciting and the detected radiation. For higher total sample amounts, deviations from the linear relation between fluorescence intensity and sample amount can be observed. This could be caused by the sample itself, because inhomogeneities and different sample shapes can lead to differences in the emitted fluorescence intensities and high statistical errors. The aim of the study was to investigate the elemental distribution inside a sample. Single- and multi-element samples were investigated with synchrotron-radiation-induced micro X-ray fluorescence analysis (SR-μ-XRF) and with an optical microscope. It could be proven that the microscope images are all based on the investigated elements. This allows the determination of the sample shape and potential inhomogeneities using only light microscope images. For the multi-element samples, it was furthermore shown that the elemental distribution inside the samples is homogeneous. This justifies internal standard quantification.

  1. Distribution of stress in greenhouses frames estimated by aerodynamic coefficients of Brazilian and European standards

    Directory of Open Access Journals (Sweden)

    José Gabriel Vieira Neto

    2016-04-01

    Full Text Available ABSTRACT Widely disseminated in both national and international scenarios, greenhouses are agribusiness solutions which are designed to allow for greater efficiency and control of the cultivation of plants. Bearing this in mind, the construction of greenhouses should take into consideration the incidence of wind and other such aspects of comfort and safety, and ensure they are factored into the design of structural elements. In this study, we evaluated the effects of the pressure coefficients established by the European standard EN 13031-1 (2001) and the Brazilian standard ABNT (1988), which are applicable to the structures of greenhouses with flat roofs, taking into account the following variables: roof slope, external and internal pressure coefficients and height-span ratio of the structure. Using the ANSYS computer program, zones of columns and roof were discretized by the Beam44 finite element to identify the maximum and minimum stress portions connected to the aerodynamic coefficients. With this analysis, we found that, at the smallest roof slope (α equal to 20°), the frame stress was quite similar for the standards adopted. On the other hand, for the greatest inclination (α equal to 26°), the stress was consistently lower under the Brazilian standard. In view of this, we came to the conclusion that the differences between stresses when applying the two standards were more significant at higher degrees of height-span ratio and roof slope.

  2. Hypothesis Tests in Complex Wishart Distributions

    Science.gov (United States)

    Frery, Alejandro C.; Cintra, Renato J.; Nascimento, Abraao D. C.

    2011-03-01

    Quantifying the image contrast between different regions is a major step in image processing. Images obtained from coherent illumination processes are inexorably contaminated with speckle noise. Multilook polarimetric synthetic aperture radar (PolSAR) imagery is a prominent example. Statistical image processing based on the complex Wishart distribution represents a successful approach for modelling PolSAR backscatter data from pasture and forest regions. This work introduces statistical tests for image contrast. Such tests are based on the Kullback-Leibler, Bhattacharyya, Hellinger, Rényi, and chi-square distances. Results of Monte Carlo experiments single out the Kullback-Leibler distance as the best one. This result was based on empirical test sizes under several situations, which include pure and contaminated data. The proposed methodology was applied to real data, obtained by an E-SAR sensor over the surroundings of Weßling, Bavaria, Germany.
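
As a minimal illustration of one of the distances listed above, a symmetrized chi-square distance between two discrete distributions can be computed as below. Note that the paper derives these distances analytically between fitted complex Wishart models, not from histograms; this histogram form is only an assumed analogue for illustration.

```python
def chi_square_distance(p, q):
    """Symmetrized chi-square distance between two discrete
    distributions given as equal-length lists of probabilities."""
    assert len(p) == len(q)
    d = 0.0
    for pi, qi in zip(p, q):
        if pi + qi > 0:
            d += (pi - qi) ** 2 / (pi + qi)
    return d / 2.0

h1 = [0.2, 0.3, 0.5]
h2 = [0.2, 0.3, 0.5]
h3 = [0.5, 0.3, 0.2]
print(chi_square_distance(h1, h2))       # 0.0 for identical distributions
print(chi_square_distance(h1, h3) > 0)   # strictly positive when they differ
```

In a contrast test, a large distance between the models fitted to two image regions is evidence that the regions differ.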

  3. Effects of Year-Round Schooling on Disadvantaged Students and the Distribution of Standardized Test Performance

    Science.gov (United States)

    Graves, Jennifer

    2011-01-01

    Using detailed longitudinal data for the state of California, this paper estimates the effect of year-round school calendars on nationally standardized test performance of traditionally disadvantaged students. The student subgroups studied in this paper are: low socioeconomic status, limited English proficiency, Hispanic and Latino, and African…

  4. Determination of the relations governing the evolution of the standard deviations of the distribution of pollution

    International Nuclear Information System (INIS)

    Crabol, B.

    1985-04-01

    An original concept on the difference in behaviour of the high-frequency (small-scale) and low-frequency (large-scale) atmospheric turbulence relative to the mean wind speed has been introduced. Through a dimensional analysis based on Taylor's formulation, it has been shown that the governing parameter of the atmospheric dispersion standard deviations is the travel distance near the source, and the travel time far from the source. Using hypotheses on the energy spectrum in the atmosphere, a numerical application has made it possible to quantify the evolution of the horizontal standard deviation for different mean wind speeds between 0.2 and 10 m/s. The areas of validity of each parameter (travel distance or travel time) are clearly shown. The first one is confined to the near field and is all the smaller the lower the wind speed. For t > 5000 s, the dependence on the wind speed of the horizontal standard deviation expressed as a function of the travel time becomes insignificant: the horizontal standard deviation is only a function of the travel time. The results are compared with experimental data obtained in the atmosphere. The similar evolution of the calculated and experimental curves confirms the validity of the hypotheses and input data used in the calculation. This study can be applied to the transport of radioactive effluents in the atmosphere
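
Taylor's classical result, on which the dimensional analysis above rests, can be written as follows (the notation, with Lagrangian autocorrelation $R(\tau)$, Lagrangian time scale $t_L$, velocity variance $\sigma_v^2$ and mean wind speed $\bar{u}$, is assumed here, since the abstract does not give the formula explicitly):

```latex
\sigma_y^2(t) = 2\,\sigma_v^2 \int_0^{t} (t-\tau)\, R(\tau)\, d\tau
\quad\Longrightarrow\quad
\sigma_y \approx
\begin{cases}
\sigma_v\, t = (\sigma_v/\bar{u})\, x, & t \ll t_L \quad \text{(travel distance } x=\bar{u}t \text{ governs)}\\[2pt]
\sigma_v \sqrt{2\, t_L\, t}, & t \gg t_L \quad \text{(travel time governs)}
\end{cases}
```

The two limits mirror the abstract's near-field/far-field split: close to the source the spread scales with distance (and hence depends on wind speed for a given distance), while far from the source it depends only on travel time.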

  5. The factor analytic structure of the Roberts Apperception Test for Children: a comparison of the standardization sample with a sample of chronically ill children.

    Science.gov (United States)

    Palomares, R S; Crowley, S L; Worchel, F F; Olson, T K; Rae, W A

    1991-06-01

    A confirmatory principal component factor analysis of the Roberts Apperception Test for Children was conducted using the standardization sample and a sample of chronically ill children. An interpretation of three- and four-factor solutions identified the three-factor solution as superior to the four-factor solution as measured by chi-square goodness of fit and coefficients of convergence. A cluster analysis using Ward's minimum variance method was calculated to determine the typical profiles that best describe the chronically ill sample. Results of this analysis reveal two distinct profiles that differ primarily on the level of adaptive psychological functioning.

  6. Security Standards and Best Practice Considerations for Quantum Key Distribution (QKD)

    Science.gov (United States)

    2012-03-01

    ...communicate via a public channel to reveal which basis they selected for each photon. Photons where matching bases were chosen are kept; photons where... "Distribution for a Satellite Uplink with 50 dB Channel Loss" addresses how to implement QKD via satellite, which acts as a trusted node to link two or more

  7. Standardized Low-Power Wireless Communication Technologies for Distributed Sensing Applications

    Directory of Open Access Journals (Sweden)

    Xavier Vilajosana

    2014-02-01

    Full Text Available Recent standardization efforts on low-power wireless communication technologies, including time-slotted channel hopping (TSCH and DASH7 Alliance Mode (D7AM, are starting to change industrial sensing applications, enabling networks to scale up to thousands of nodes whilst achieving high reliability. Past technologies, such as ZigBee, rooted in IEEE 802.15.4, and ISO 18000-7, rooted in frame-slotted ALOHA (FSA, are based on contention medium access control (MAC layers and have very poor performance in dense networks, thus preventing the Internet of Things (IoT paradigm from really taking off. Industrial sensing applications, such as those being deployed in oil refineries, have stringent requirements on data reliability and are being built using new standards. Despite the benefits of these new technologies, industrial shifts are not happening due to the enormous technology development and adoption costs and the fact that new standards are not well-known and completely understood. In this article, we provide a deep analysis of TSCH and D7AM, outlining operational and implementation details with the aim of facilitating the adoption of these technologies to sensor application developers.

  8. Web-Enabled Distributed Health-Care Framework for Automated Malaria Parasite Classification: an E-Health Approach.

    Science.gov (United States)

    Maity, Maitreya; Dhane, Dhiraj; Mungle, Tushar; Maiti, A K; Chakraborty, Chandan

    2017-10-26

    Web-enabled e-healthcare systems, or computer-assisted disease diagnosis, have the potential to improve the quality and service of the conventional healthcare delivery approach. The article describes the design and development of a web-based distributed healthcare management system for medical information and the quantitative evaluation of microscopic images using a machine learning approach for malaria. In the proposed study, all the health-care centres are connected in a distributed computer network. Each peripheral centre manages its own health-care service independently and communicates with the central server for remote assistance. The proposed methodology for automated evaluation of parasites includes pre-processing of blood smear microscopic images followed by erythrocyte segmentation. To differentiate between different parasites, a total of 138 quantitative features characterising colour, morphology, and texture are extracted from segmented erythrocytes. An integrated pattern classification framework is designed where four feature selection methods, viz. Correlation-based Feature Selection (CFS), Chi-square, Information Gain, and RELIEF, are employed with three different classifiers, i.e. Naive Bayes, C4.5, and Instance-Based Learning (IB1), individually. The optimal feature subset with the best classifier is selected for achieving maximum diagnostic precision. The proposed method achieved 99.2% sensitivity and 99.6% specificity by combining CFS and C4.5, in comparison with other methods. Moreover, the web-based tool is entirely designed using open standards like Java for the web application, ImageJ for image processing, and WEKA for data mining, considering its feasibility in rural places with minimal health care facilities.
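
Chi-square feature selection, one of the four selectors named above, scores each feature by its statistic of independence from the class labels and keeps the highest-scoring features. A minimal binary-feature sketch of the scoring step is below; it is an illustration only, not the WEKA implementation used in the paper.

```python
def chi_square_score(feature, labels):
    """Chi-square statistic of independence between a binary feature
    and binary class labels, as used for feature ranking."""
    # 2x2 contingency table of observed counts.
    obs = [[0, 0], [0, 0]]
    for f, y in zip(feature, labels):
        obs[f][y] += 1
    n = len(labels)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = sum(obs[i]) * (obs[0][j] + obs[1][j]) / n
            if expected > 0:
                chi2 += (obs[i][j] - expected) ** 2 / expected
    return chi2

# A feature perfectly aligned with the labels scores higher than a
# feature independent of them.
labels    = [0, 0, 1, 1, 0, 0, 1, 1]
aligned   = [0, 0, 1, 1, 0, 0, 1, 1]
unrelated = [0, 1, 0, 1, 0, 1, 0, 1]
print(chi_square_score(aligned, labels) > chi_square_score(unrelated, labels))
```

Ranking all 138 features by such a score and keeping the top subset is the general shape of the chi-square branch of the selection framework.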

  9. The Nigeria wealth distribution and health seeking behaviour: evidence from the 2012 national HIV/AIDS and reproductive health survey.

    Science.gov (United States)

    Fagbamigbe, Adeniyi F; Bamgboye, Elijah A; Yusuf, Bidemi O; Akinyemi, Joshua O; Issa, Bolakale K; Ngige, Evelyn; Amida, Perpetua; Bashorun, Adebobola; Abatta, Emmanuel

    2015-01-01

    Recently, Nigeria emerged as the largest economy in Africa and the 26th in the world. However, a pertinent question is how this new economic status has impacted the wealth and health of her citizens. There is a dearth of empirical studies on the wealth distribution in Nigeria, which could be important in explaining the general disparities in health seeking behavior. An adequate knowledge of the Nigerian wealth distribution will no doubt inform policy makers in their decision making to improve the quality of life of Nigerians. This study is a retrospective analysis of the assets of households in Nigeria collected during the 2012 National HIV/AIDS and Reproductive Health Survey (NARHS Plus 2). We used principal component analysis methods to construct wealth quintiles across households in Nigeria. At the 5% significance level, we used ANOVA to determine differences in some health outcomes across the WQs and the chi-square test to assess the association between WQs and some reproductive health seeking behaviours. The wealth quintiles were found to be internally valid and coherent. However, there is a wide gap in the reproductive health seeking behavior of household members across the wealth quintiles, with members of households in the lowest quintiles being less likely (33.0%) to receive antenatal care than those in the highest quintiles (91.9%). While only 3% were currently using modern contraceptives in the lowest wealth quintile, it was 17.4% in the highest wealth quintile (p < 0.05). The wealth quintiles showed a great disparity in the standard of living of Nigerian households across geo-political zones, states and rural-urban locations, which had greatly influenced household health seeking behavior.
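
The quintile construction downstream of the PCA wealth index can be sketched as a simple rank-based assignment: households are sorted by their index score and split into five equal groups. The scores below are hypothetical stand-ins for first-principal-component values, not survey data.

```python
def wealth_quintiles(scores):
    """Assign households to quintiles 1 (poorest) .. 5 (richest) by the
    rank of their wealth-index score."""
    order = sorted(range(len(scores)), key=scores.__getitem__)
    quintile = [0] * len(scores)
    for rank, idx in enumerate(order):
        quintile[idx] = rank * 5 // len(scores) + 1
    return quintile

# Hypothetical first-principal-component scores for ten households.
scores = [0.1, 2.3, -1.2, 0.8, 3.1, -0.5, 1.9, -2.0, 0.3, 1.1]
print(wealth_quintiles(scores))
```

Outcomes such as antenatal-care coverage can then be cross-tabulated against these quintile labels for the chi-square test of association.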

  10. Child and Adolescent Behavior Inventory (CABI): Standardization for Age 6-17 Years and First Clinical Application.

    Science.gov (United States)

    Cianchetti, Carlo; Pasculli, Marcello; Pittau, Andrea; Campus, Maria Grazia; Carta, Valeria; Littarru, Roberta; Fancello, Giuseppina Sannio; Zuddas, Alessandro; Ledda, Maria Giuseppina

    2017-01-01

    The Child and Adolescent Behavior Inventory (CABI) is a questionnaire designed to collect information from the parents of children and adolescents, both for screening and epidemiological studies and for clinical evaluation. It was published in CPEMH in 2013, with the first data on 8-10 year old school children. Here we report an extended standardization on a school population aged 6-17 years and the first results of its application in a clinical sample. Parents, after giving their informed consent, answered the questionnaire. Complete and reliable data were obtained from the parents of 659 school children and adolescents aged 6-17 years, with a balanced gender distribution. Moreover, in a population of 84 patients, the results of the CABI were compared with the clinical evaluation and the CBCL. In the school population, scores differed in relation to gender and age. The values for externalizing disorders were higher in males, with the highest values for ADHD in 6-10 year old children. On the contrary, the scores for internalizing disorders and eating disorders tended to be slightly higher in females. In the clinical population, scores on the CABI were in agreement with the clinical evaluation in 84% of cases for depressive symptoms (compared to 66% for the CBCL), 53% for anxiety symptoms (CBCL 42%) and 87% for ODD (CBCL 69%); these differences, however, were without statistical significance (chi-square). The study obtained normative data for the CABI and gave information on the behavioral differences in relation to age and gender of the school population as evaluated by parents/caregivers. Clinically, the CABI provided useful information for the clinical evaluation of the patient, sometimes with better agreement with the final diagnosis compared to the CBCL.

  11. Standard distribution for unclassified scientific and technical reports: instructions and category scope notes

    International Nuclear Information System (INIS)

    1980-12-01

    The US Department of Energy Technical Information Center (DOE/TIC) uses a subject category scheme for classifying and distributing DOE-originated or -sponsored unclassified scientific and technical reports. This document contains the subject category scope notes used for these purposes. Originators of DOE or DOE-sponsored scientific and technical reports are urged to adhere to the instructions contained in this publication. A limited number of copies of the unabridged version (addresses included) are available to Department of Energy offices and their contractors as DOE/TIC-4500(Rev.69)(Unabridged)

  12. Analysis of Chi-square Automatic Interaction Detection (CHAID) and Classification and Regression Tree (CRT) for Classification of Corn Production

    Science.gov (United States)

    Susanti, Yuliana; Zukhronah, Etik; Pratiwi, Hasih; Respatiwulan; Sri Sulistijowati, H.

    2017-11-01

To achieve food resilience in Indonesia, food diversification by exploring the potential of local foods is required. Corn is one of the alternative staple foods of Javanese society. For that reason, corn production needs to be improved by considering the influencing factors. CHAID and CRT are data mining methods that can be used to classify the influencing variables. The present study seeks to uncover information on the potential local availability of corn in the regencies and cities of Java Island. CHAID analysis yields four classifications with an accuracy of 78.8%, while CRT analysis yields seven classifications with an accuracy of 79.6%.
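CHAID's core step can be sketched as follows: cross-tabulate each candidate predictor against the target and split on the predictor whose chi-square test is most significant. The predictor names and data below are invented for illustration, not taken from the study.

```python
# Illustrative sketch of CHAID's split selection, not the authors' code:
# pick the predictor whose cross-tabulation with the target yields the
# most significant chi-square test. All names and data are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
target = rng.integers(0, 2, 200)  # e.g. high/low corn production class

predictors = {
    "rainfall_class": (target + rng.integers(0, 2, 200)) % 3,  # associated
    "soil_class": rng.integers(0, 3, 200),                     # pure noise
}

best, best_p = None, 1.0
for name, x in predictors.items():
    table = np.zeros((3, 2), dtype=int)       # 3 predictor levels x 2 classes
    for xi, ti in zip(x, target):
        table[xi, ti] += 1
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"{name}: chi2 = {chi2:.1f}, p = {p:.3g}")
    if p < best_p:
        best, best_p = name, p

print("CHAID would split first on:", best)
```

Real CHAID additionally merges predictor categories that do not differ significantly and applies a Bonferroni correction, but the chi-square ranking above is the heart of the method.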

  13. Predicting Vocational Rehabilitation Outcomes for People with Alcohol Abuse/Dependence: An Application of Chi-Squared Automatic Interaction Detector

    Science.gov (United States)

    Brickham, Dana M.

    2012-01-01

    People with alcohol abuse/dependence disabilities are often faced with a complex recovery process due to the exacerbating and chronic aspects of their condition. Vocational rehabilitation for people with alcohol abuse/dependence can help individuals access and maintain employment, and through employment can enhance physical and psychological…

  14. On the Existence of Uniformly Most Powerful Bayesian Tests With Application to Non-Central Chi-Squared Tests

    OpenAIRE

    Nikooienejad, Amir; Johnson, Valen E.

    2018-01-01

    Uniformly most powerful Bayesian tests (UMPBTs) are an objective class of Bayesian hypothesis tests that can be considered the Bayesian counterpart of classical uniformly most powerful tests. Unfortunately, UMPBTs have only been exposed for application in one parameter exponential family models. The purpose of this article is to describe methodology for deriving UMPBTs for a larger class of tests. Specifically, we introduce sufficient conditions for the existence of UMPBTs and propose a unifi...
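As a small numerical companion to the non-central chi-squared setting above (the numbers are illustrative, not from the article): under the alternative, the chi-squared statistic follows a noncentral chi-square, so a test's power is the noncentral tail mass beyond the central critical value.

```python
# Power of a chi-squared test via the noncentral distribution.
# df, alpha and the noncentrality are illustrative choices.
from scipy.stats import chi2, ncx2

df, alpha, nc = 4, 0.05, 10.0     # degrees of freedom, level, noncentrality
crit = chi2.ppf(1 - alpha, df)    # central chi-square critical value
power = ncx2.sf(crit, df, nc)     # P(reject H0 | alternative with nc)
print(f"critical value = {crit:.3f}, power = {power:.3f}")
```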

  15. Quantitative analysis of urban sprawl in Tripoli using Pearson's Chi-Square statistics and urban expansion intensity index

    Science.gov (United States)

    Al-sharif, Abubakr A. A.; Pradhan, Biswajeet; Zulhaidi Mohd Shafri, Helmi; Mansor, Shattri

    2014-06-01

Urban expansion is a spatial phenomenon that reflects the increased importance of metropolises. Remotely sensed data and GIS have been widely used to study and analyze the process of urban expansion and its patterns. The capital of Libya (Tripoli) was selected for this study to examine its urban growth patterns. Four satellite images of the study area from different dates (1984, 1996, 2002 and 2010) were used to conduct this research. The main goal of this work is to identify and analyze the urban sprawl of the Tripoli metropolitan area. The urban expansion intensity index (UEII) and a degree-of-freedom test were used to analyze and assess urban expansion in the study area. The results show that Tripoli has sprawled urban expansion patterns and a high urban expansion intensity index, and that its urban development had a high degree of freedom over its expansion history during the period 1984-2010. Moreover, the novel hypothesis proposed for dividing the area into zones provided good insight into the direction of urban expansion and the effect of distance from the central business district (CBD).
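The abstract does not give the UEII formula, so the sketch below assumes the definition commonly used in the urban-growth literature: the annual change in urban area expressed as a percentage of the zone's total land area. All input values are hypothetical.

```python
# Assumed UEII definition (not stated in the abstract):
# UEII = 100 * (urban area at t2 - urban area at t1) / (years * total area)
def ueii(urban_t1_km2, urban_t2_km2, years, total_area_km2):
    """Annual urban expansion as a percentage of total zone area."""
    return 100.0 * (urban_t2_km2 - urban_t1_km2) / (years * total_area_km2)

# Hypothetical zone of 400 km^2 whose urban area grew from 60 to 130 km^2
# over the 26-year study window (1984-2010):
print(f"UEII = {ueii(60, 130, 26, 400):.3f} % per year")
```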

  16. High Level Architecture Distributed Space System Simulation for Simulation Interoperability Standards Organization Simulation Smackdown

    Science.gov (United States)

    Li, Zuqun

    2011-01-01

Modeling and simulation play a very important role in mission design. They not only reduce design cost, but also prepare astronauts for their mission tasks. The SISO Smackdown is a simulation event that promotes modeling and simulation in academia. The scenario of this year's Smackdown was to simulate a lunar base supply mission. The mission objective was to transfer Earth supply cargo to a lunar base supply depot and retrieve He-3 to take back to Earth. Federates for this scenario included the environment federate, Earth-Moon transfer vehicle, lunar shuttle, lunar rover, supply depot, mobile ISRU plant, exploratory hopper, and communication satellite. These federates were built by teams from around the world, including teams from MIT, JSC, the University of Alabama in Huntsville, the University of Bordeaux in France, and the University of Genoa in Italy. This paper focuses on the lunar shuttle federate, which was programmed by the USRP intern team at NASA JSC. The shuttle was responsible for providing transportation between lunar orbit and the lunar surface. The lunar shuttle federate was built using the NASA standard simulation package called Trick, and it was extended with HLA functions using TrickHLA. HLA functions of the lunar shuttle federate include sending and receiving interactions, publishing and subscribing to attributes, and packing and unpacking fixed record data. The dynamics model of the lunar shuttle had three degrees of freedom, and state propagation obeyed two-body dynamics. The descent trajectory of the lunar shuttle was designed by first defining a unique descent orbit in 2D space, then defining a unique orbit in 3D space under the assumption of a non-rotating Moon. Finally, this assumption was removed to define the initial position of the lunar shuttle so that it would begin its descent one second after joining the execution. VPN software from SonicWall was used to connect federates with the RTI during testing.

  17. Viability of Distributed Manufacturing of Bicycle Components with 3-D Printing: CEN Standardized Polylactic Acid Pedal Testing

    Directory of Open Access Journals (Sweden)

    Nagendra G. Tanikella

    2017-05-01

Recent advancements in open-source self-replicating rapid prototypers (RepRap) have radically reduced the costs of 3-D printing. The low cost of additive manufacturing enables distributed manufacturing of open source appropriate technologies (OSAT) to assist in sustainable development. To investigate this potential, this study carefully examines the use of RepRap 3-D printers to fabricate widely used Black Mamba bicycle components in the developing world. Specifically, this study tests pedals. A CAD model of the pedal was created using parametric open source software (FreeCAD) to enable future customization. Then poly-lactic acid (PLA), a biodegradable and recyclable bioplastic, was selected among the various commercial 3-D printable materials based on strength and cost. The pedal was 3-D printed on a commercial RepRap and tested following the CEN (European Committee for Standardization) standards for racing bicycles for (1) static strength, (2) impact, and (3) dynamic durability. The results show the pedals meet the CEN standards and can be used on bicycles. The 3-D printed pedals are significantly lighter than the stock pedals used on the Black Mamba, which provides a performance enhancement while reducing cost if raw PLA or recycled materials are used, helping to reduce bicycle costs even for those living in extreme poverty. Other bicycle parts could also be manufactured using 3-D printers, providing a return on investment in the printer and indicating that this model of distributed manufacturing of OSAT may be technically and economically appropriate throughout much of the Global South.

  18. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even when a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with its graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  19. Preliminary estimation of infantile exposure to BPA based on the standard quality of baby bottles distributed in Isfahan urban society

    Directory of Open Access Journals (Sweden)

    Zohreh Abdi Moghadam

    2013-01-01

Aims: This study aimed to estimate bisphenol A (BPA) intake from baby bottles, considering the diversity and standard quality of the baby bottles distributed in an Isfahan urban society. Materials and Methods: A cross-sectional study was performed in Isfahan in 2011. Baby shops (n = 33) and drug stores (n = 7) in four district areas were included in the study. The distribution of baby bottles was investigated with regard to their brand, origin, and being labeled "BPA free." Exposure to BPA from baby bottles was estimated based on national and international representative data. Results: Products marked as "BPA free" were found only among the western products and were limited to two of the selected areas. No "BPA free" labeled baby bottle was identified among the Iranian-made products. Among the 8% of infants who were exclusively formula-fed, 90% may be high consumers of BPA from polycarbonate baby bottles, with an intake of 1.5-2 μg/kg b.w./day in the moderate case and 7.5-10 μg/kg b.w./day in the worst case. Conclusion: Considering the currently accepted global tolerable daily intake (TDI) for BPA, the preliminary exposure estimate suggests that feeding with non-BPA-free baby bottles is not a serious health concern in Iran. However, since a reduction of the TDI threshold is under discussion, improvement and revision of the national standards could be effective in reducing BPA exposure in Iranian infants and provide a larger margin of safety for them.
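Intake figures of this kind come from a simple back-of-the-envelope calculation. The abstract does not state the formula or its inputs, so the migration concentration, formula volume and body weight below are assumptions chosen only to show the arithmetic.

```python
# Assumed intake model (not stated in the abstract):
# daily intake = migration concentration x daily formula volume / body weight
def bpa_intake(conc_ug_per_l, volume_l_per_day, body_weight_kg):
    """Estimated BPA intake in micrograms per kg body weight per day."""
    return conc_ug_per_l * volume_l_per_day / body_weight_kg

# e.g. 10 ug/L migrating into 0.75 L/day of formula for a 5 kg infant
# lands at the lower end of the reported moderate range (1.5-2 ug/kg/day):
print(f"{bpa_intake(10, 0.75, 5):.1f} ug/kg b.w./day")
```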

  20. A WEB-BASED FRAMEWORK FOR VISUALIZING INDUSTRIAL SPATIOTEMPORAL DISTRIBUTION USING STANDARD DEVIATIONAL ELLIPSE AND SHIFTING ROUTES OF GRAVITY CENTERS

    Directory of Open Access Journals (Sweden)

    Y. Song

    2017-09-01

Analysing the spatiotemporal distribution patterns and dynamics of different industries can help us learn the macro-level development trends of those industries, which in turn provides references for industrial spatial planning. However, the analysis is a challenging task that requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to meet this visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly development trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset of Mainland China from 1960 to 2015, which contains fine-grained location information (i.e., the coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experimental results show that the developed visual analytics method is helpful for understanding the multi-level patterns and development trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime or disease.

  1. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    Science.gov (United States)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

Analysing the spatiotemporal distribution patterns and dynamics of different industries can help us learn the macro-level development trends of those industries, which in turn provides references for industrial spatial planning. However, the analysis is a challenging task that requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to meet this visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly development trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset of Mainland China from 1960 to 2015, which contains fine-grained location information (i.e., the coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experimental results show that the developed visual analytics method is helpful for understanding the multi-level patterns and development trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime or disease.
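The two descriptive tools the framework visualizes, gravity centers and standard deviational ellipses, can be sketched with plain NumPy. The point data here are simulated, and the ellipse is built by the usual principal-axis construction rather than the authors' Spark implementation.

```python
# Gravity center and standard deviational ellipse of a point set.
# Coordinates are simulated; a real run would use enterprise locations.
import numpy as np

rng = np.random.default_rng(3)
pts = rng.normal([115.0, 30.5], [0.8, 0.3], size=(1000, 2))  # lon, lat

center = pts.mean(axis=0)                 # gravity (mean) center
d = pts - center
cov = d.T @ d / len(pts)                  # second moments about the center
eigvals, eigvecs = np.linalg.eigh(cov)    # principal axes of the ellipse
semi_axes = np.sqrt(eigvals)              # std. deviations along each axis
angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))

print("gravity center:", center.round(3))
print("semi-axes:", semi_axes.round(3), "rotation (deg):", round(angle, 1))
```

Tracking how `center` and the ellipse change year by year gives exactly the "shifting routes of gravity centers" the framework plots.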

  2. Pulp Stones, Prevalence and Distribution in an Iranian Population.

    Science.gov (United States)

    Kuzekanani, Maryam; Haghani, Jahangir; Walsh, Laurence J; Estabragh, Mohammad Am

    2018-01-01

This study determined the prevalence and distribution of pulp stones in the permanent dentition of an adult population using periapical radiographs. The study followed a cross-sectional design. A total of 800 periapical radiographs collected from 412 patients attending dental clinics in Kerman, Islamic Republic of Iran, were examined under magnification. Pulp stones were present in 9.6% of all permanent teeth examined, being most common in maxillary first and second molars, followed by mandibular first and second molars. They were present in 31.5% of all adult patients, with a significantly increased prevalence in females compared with males (40.5 vs 23.9%, chi-squared test). Pulp stones are of significance for endodontic treatment: they obstruct access to the canal orifices and thus complicate the procedure. Knowing where and when pulp stones are likely to occur improves the quality of root canal treatments.

  3. Use of a non-linear method for including the mass uncertainty of gravimetric standards and system measurement errors in the fitting of calibration curves for XRFA freeze-dried UNO3 standards

    International Nuclear Information System (INIS)

    Pickles, W.L.; McClure, J.W.; Howell, R.H.

    1978-05-01

A sophisticated nonlinear multiparameter fitting program was used to produce a best-fit calibration curve for the response of an x-ray fluorescence analyzer to freeze-dried uranium nitrate gravimetric standards accurate to 0.2%. The program is based on the unconstrained minimization subroutine VA02A. The program treats the mass values of the gravimetric standards as parameters to be fit along with the normal calibration curve parameters. The fitting procedure weights the system errors and the mass errors in a consistent way. The resulting best-fit calibration curve parameters reflect the fact that the masses of the standard samples are measured quantities with a known error. Error estimates for the calibration curve parameters can be obtained from the curvature of the ''chi-squared matrix'' or from error relaxation techniques. It was shown that nondispersive XRFA of 0.1 to 1 mg of freeze-dried UNO3 can achieve an accuracy of 0.2% in 1000 s.
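A modern analogue of this approach (not the original VA02A code) is an errors-in-variables fit, in which the standard masses carry their own uncertainties: scipy's orthogonal distance regression minimizes a chi-square weighted by both the mass and the response errors. All data values below are invented for illustration.

```python
# Errors-in-variables calibration fit: both x (mass) and y (counts)
# uncertainties enter the chi-square. Data are invented for illustration.
import numpy as np
from scipy.odr import ODR, Model, RealData

def response(beta, mass):
    # Hypothetical linear calibration curve: counts = b0 + b1 * mass
    return beta[0] + beta[1] * mass

mass = np.array([0.1, 0.25, 0.5, 0.75, 1.0])      # mg, gravimetric standards
counts = np.array([105.0, 260.0, 515.0, 740.0, 1010.0])
mass_err = 0.002 * mass                            # 0.2% mass uncertainty
counts_err = np.full_like(counts, 8.0)             # system measurement error

data = RealData(mass, counts, sx=mass_err, sy=counts_err)
fit = ODR(data, Model(response), beta0=[0.0, 1000.0]).run()
print("intercept, slope:", fit.beta)
print("reduced chi-square:", fit.res_var)
```

As in the paper, the fitted parameters then reflect the fact that the standard masses are themselves measured quantities with known errors.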

  4. Distribution of hydrogen within the HDR-containment under severe accident conditions. OECD standard problem. Final comparison report

    International Nuclear Information System (INIS)

    Karwat, H.

    1992-08-01

The present report summarizes the results of the International Standard Problem Exercise ISP-29, based on the HDR hydrogen distribution experiment E11.2. Post-test analyses are compared with experimentally measured parameters that were well known to the analysts. This report has been prepared by the Institute for Reactor Dynamics and Reactor Safety of the Technical University Munich under contract with the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), which received funding for this activity from the German Ministry for Research and Technology (BMFT) under research contract RS 792. The HDR experiment E11.2 was performed by the Kernforschungszentrum Karlsruhe (KfK) in the framework of the 'Projekt HDR-Sicherheitsprogramm' sponsored by the BMFT. Ten institutions from eight countries participated in the post-test analysis exercise, which focused on the long-lasting gas distribution processes expected inside a PWR containment under severe accident conditions. The gas release experiment was coupled to a long-lasting steam release into the containment, typical of an unmitigated small-break loss-of-coolant accident. In lieu of pure hydrogen, a gas mixture of 15% hydrogen and 85% helium was used in order to avoid reaching flammability during the experiment. Of central importance are common overlay plots comparing calculated transients with measurements of the global pressure and the local temperature, steam and gas concentration distributions throughout the entire HDR containment. The comparisons indicate relatively large margins between most calculations and the experiment. Bearing in mind that this exercise was specified as an 'open post-test' analysis of well-known measured data, the reasons for the discrepancies between measurements and simulations were extensively discussed during a final workshop. It was concluded that analytical shortcomings as well as some uncertainties in the experimental boundary conditions may be responsible for the deviations.

  5. A parametric model to estimate the proportion from true null using a distribution for p-values.

    Science.gov (United States)

    Yu, Chang; Zelterman, Daniel

    2017-10-01

Microarray studies generate a large number of p-values from many gene expression comparisons. Estimating the proportion of these p-values sampled under the null hypothesis draws broad interest. The two-component mixture model is often used to estimate this proportion. If the data are generated under the null hypothesis, the p-values follow the uniform distribution. But what is the distribution of p-values when the data are sampled from the alternative hypothesis? Here this distribution is derived for the chi-squared test, and it is then used to estimate the proportion of p-values sampled from the null hypothesis in a parametric framework. Simulation studies are conducted to evaluate its performance in comparison with five recent methods. Even in scenarios with clusters of correlated p-values and a multicomponent or continuous mixture under the alternative, the new method performs robustly. The methods are demonstrated through an analysis of a real microarray dataset.
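The setting can be sketched numerically: under the alternative, chi-squared statistics follow a noncentral chi-square, so their p-values pile up near zero, while the uniform null component dominates near p = 1. The simple tail estimator below is a stand-in for the paper's parametric method, with illustrative mixture parameters.

```python
# Simulate a null/alternative mixture of chi-squared-test p-values and
# estimate the null proportion pi0 from the flat tail near p = 1.
import numpy as np
from scipy.stats import chi2, ncx2

rng = np.random.default_rng(1)
df, nc, pi0, n = 1, 8.0, 0.7, 100_000   # pi0 = true null proportion

n_null = round(pi0 * n)
null_p = rng.uniform(size=n_null)                              # uniform under H0
alt_stats = ncx2.rvs(df, nc, size=n - n_null, random_state=rng)
alt_p = chi2.sf(alt_stats, df)                                 # p-values under H1
p = np.concatenate([null_p, alt_p])

# Near p = 1 almost all p-values come from the null (alternative p-values
# concentrate near 0), so the density there approximates pi0.
pi0_hat = np.mean(p > 0.5) / 0.5
print(f"true pi0 = {pi0}, estimated pi0 = {pi0_hat:.3f}")
```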

  6. Survey of basic medical researchers on the awareness of animal experimental designs and reporting standards in China.

    Directory of Open Access Journals (Sweden)

    Bin Ma

To investigate the awareness and use of the Systematic Review Center for Laboratory Animal Experimentation's (SYRCLE) risk-of-bias tool, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) reporting guidelines, and the Gold Standard Publication Checklist (GSPC) among basic medical researchers conducting animal experimental studies in China. A national questionnaire-based survey targeting basic medical researchers was carried out in China to investigate basic information and awareness of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, the GSPC, and animal experimental bias risk control factors. The EpiData 3.1 software was used for data entry, and Microsoft Excel 2013 was used for statistical analysis. The number of cases (n) and percentage (%) of classified information were statistically described, and comparison between groups (i.e., current students vs. research staff) was performed using the chi-square test. A total of 298 questionnaires were distributed, and 272 responses were received, which included 266 valid questionnaires (from 118 current students and 148 research staff). Among the 266 survey participants, only 15.8% were aware of SYRCLE's risk-of-bias tool, with a significant difference between the two groups (P = 0.003), and the awareness rates of the ARRIVE guidelines and the GSPC were only 9.4% and 9.0%, respectively; 58.6% of survey participants believed that reports of animal experimental studies in the Chinese literature were inadequate, with a significant difference between the two groups (P = 0.004). In addition, only approximately one third of the survey participants had read systematic reviews and meta-analysis reports of animal experimental studies; only 16/266 (6.0%) had carried out or participated in, and 11/266 (4.1%) had published, systematic reviews/meta-analyses of animal experimental studies. The awareness and use rates of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, and the GSPC were low among Chinese basic medical researchers. Therefore

  7. Survey of basic medical researchers on the awareness of animal experimental designs and reporting standards in China.

    Science.gov (United States)

    Ma, Bin; Xu, Jia-Ke; Wu, Wen-Jing; Liu, Hong-Yan; Kou, Cheng-Kun; Liu, Na; Zhao, Lulu

    2017-01-01

To investigate the awareness and use of the Systematic Review Center for Laboratory Animal Experimentation's (SYRCLE) risk-of-bias tool, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) reporting guidelines, and the Gold Standard Publication Checklist (GSPC) among basic medical researchers conducting animal experimental studies in China. A national questionnaire-based survey targeting basic medical researchers was carried out in China to investigate basic information and awareness of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, the GSPC, and animal experimental bias risk control factors. The EpiData 3.1 software was used for data entry, and Microsoft Excel 2013 was used for statistical analysis. The number of cases (n) and percentage (%) of classified information were statistically described, and comparison between groups (i.e., current students vs. research staff) was performed using the chi-square test. A total of 298 questionnaires were distributed, and 272 responses were received, which included 266 valid questionnaires (from 118 current students and 148 research staff). Among the 266 survey participants, only 15.8% were aware of SYRCLE's risk-of-bias tool, with a significant difference between the two groups (P = 0.003), and the awareness rates of the ARRIVE guidelines and the GSPC were only 9.4% and 9.0%, respectively; 58.6% of survey participants believed that reports of animal experimental studies in the Chinese literature were inadequate, with a significant difference between the two groups (P = 0.004). In addition, only approximately one third of the survey participants had read systematic reviews and meta-analysis reports of animal experimental studies; only 16/266 (6.0%) had carried out or participated in, and 11/266 (4.1%) had published, systematic reviews/meta-analyses of animal experimental studies. The awareness and use rates of SYRCLE's risk-of-bias tool, the ARRIVE guidelines, and the GSPC were low among Chinese basic medical researchers. Therefore, specific

  8. Multivariate data analysis as a semi-quantitative tool for interpretive evaluation of comparability or equivalence of aerodynamic particle size distribution profiles.

    Science.gov (United States)

    Shi, Shuai; Hickey, Anthony J

    2009-01-01

The purpose of this article is to investigate the performance of multivariate data analysis, especially orthogonal partial least squares (OPLS) analysis, as a semi-quantitative tool to evaluate the comparability or equivalence of aerodynamic particle size distribution (APSD) profiles of orally inhaled and nasal drug products (OINDP). Monte Carlo simulation was employed to reconstitute APSD profiles based on 55 realistic scenarios proposed by the Product Quality Research Institute (PQRI) working group. OPLS analyses with different data pretreatment methods were performed on each of the reconstituted profiles. Compared to unit-variance scaling, equivalence determined based on OPLS analysis with Pareto scaling was shown to be more consistent with the working group assessment. A chi-square statistic was employed to compare the performance of OPLS analysis (Pareto scaling) with that of the combination test (i.e., the chi-square ratio statistic and the population bioequivalence test for impactor-sized mass) in terms of achieving greater consistency with the working group evaluation. A p-value of 0.036 suggested that OPLS analysis with Pareto scaling may be more predictive than the combination test with respect to consistency. Furthermore, OPLS analysis may also be employed to analyze the part of the APSD profiles that contributes to the calculation of the mass median aerodynamic diameter. Our results show that OPLS analysis performed on partial deposition sites does not interfere with the performance on all deposition sites.
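Pareto scaling, which the study found preferable to unit-variance scaling, divides each mean-centred variable by the square root of its standard deviation, damping rather than removing scale differences between deposition sites. A minimal sketch, with invented APSD masses:

```python
# Pareto scaling: mean-centre, then divide by sqrt(std) per variable,
# so each scaled column ends up with standard deviation sqrt(sd) rather
# than 1 (unit-variance scaling) or sd (no scaling).
import numpy as np

def pareto_scale(X):
    """Rows = profiles, columns = deposition sites."""
    mu = X.mean(axis=0)
    sd = X.std(axis=0, ddof=1)
    return (X - mu) / np.sqrt(sd)

X = np.array([[10.0, 2.0],      # invented deposition masses
              [12.0, 2.5],
              [8.0, 1.5]])
Xs = pareto_scale(X)
print(Xs.round(3))
```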

  9. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

In practice, probability distributions are applied in fields as diverse as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help us select the best-fitting model. Some graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold value. Calculating the goodness-of-fit statistics also enables us to rank the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and chi-squared. A large dataset is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
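The three tests compared in the paper can be run side by side with scipy on a simulated sample; the sample, the fitted model (a normal distribution) and the binning choices for the chi-squared test are illustrative.

```python
# Kolmogorov-Smirnov, Anderson-Darling and chi-squared goodness-of-fit
# tests applied to one simulated sample against a fitted normal model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.normal(loc=5.0, scale=2.0, size=500)
mu, sigma = data.mean(), data.std(ddof=1)         # fitted normal parameters

ks = stats.kstest(data, 'norm', args=(mu, sigma))  # Kolmogorov-Smirnov
ad = stats.anderson(data, dist='norm')             # Anderson-Darling

# Chi-squared: bin the data and compare observed vs expected bin counts.
edges = np.quantile(data, np.linspace(0, 1, 11))   # 10 equiprobable bins
observed, _ = np.histogram(data, bins=edges)
probs = np.diff(stats.norm.cdf(edges, mu, sigma))
expected = probs / probs.sum() * observed.sum()    # match total count
chi2_stat, chi2_p = stats.chisquare(observed, expected, ddof=2)  # 2 fitted params

print(f"KS D = {ks.statistic:.3f} (p = {ks.pvalue:.3f})")
print(f"AD statistic = {ad.statistic:.3f}")
print(f"chi2 = {chi2_stat:.2f} (p = {chi2_p:.3f})")
```

Note the usual caveat, which also applies to the paper's comparison: when parameters are estimated from the same data, the nominal KS p-value is conservative, and the chi-squared degrees of freedom must be reduced by the number of fitted parameters (the `ddof=2` above).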

  10. Gold internal standard correction for elemental imaging of soft tissue sections by LA-ICP-MS: element distribution in eye microstructures.

    Science.gov (United States)

    Konz, Ioana; Fernández, Beatriz; Fernández, M Luisa; Pereiro, Rosario; González, Héctor; Alvarez, Lydia; Coca-Prados, Miguel; Sanz-Medel, Alfredo

    2013-04-01

    Laser ablation coupled to inductively coupled plasma mass spectrometry has been developed for the elemental imaging of Mg, Fe and Cu distribution in histological tissue sections of fixed eyes, embedded in paraffin, from human donors (cadavers). This work presents the development of a novel internal standard correction methodology based on the deposition of a homogeneous thin gold film on the tissue surface and the use of the (197)Au(+) signal as internal standard. Sample preparation (tissue section thickness) and laser conditions were carefully optimized, and internal normalisation using (197)Au(+) was compared with (13)C(+) correction for imaging applications. (24)Mg(+), (56)Fe(+) and (63)Cu(+) distributions were investigated in histological sections of the anterior segment of the eye (including the iris, ciliary body, cornea and trabecular meshwork) and were shown to be heterogeneously distributed along those tissue structures. Reproducibility was assessed by imaging different human eye sections from the same donor and from ten different eyes from adult normal donors, which showed that similar spatial maps were obtained and therefore demonstrate the analytical potential of using (197)Au(+) as internal standard. The proposed analytical approach could offer a robust tool with great practical interest for clinical studies, e.g. to investigate trace element distribution of metals and their alterations in ocular diseases.

  11. What influences national and foreign physicians' geographic distribution? An analysis of medical doctors' residence location in Portugal.

    Science.gov (United States)

    Russo, Giuliano; Ferrinho, Paulo; de Sousa, Bruno; Conceição, Cláudia

    2012-07-02

The debate over physicians' geographical distribution has attracted the attention of the economic and public health literature over the last forty years. Nonetheless, it remains unclear what influences physicians' location, and whether foreign physicians help fill the geographical gaps left by national doctors in a given country. The present research investigates the current distribution of national and international physicians in Portugal, with the objective of understanding its determinants and providing an evidence base for policy-makers to identify policies to influence it. A cross-sectional study of physicians currently registered in Portugal was conducted to describe the population and explore the association of physician residence patterns with relevant personal and municipality characteristics. Data from the Portuguese Medical Council on physicians' residence and characteristics were analysed, as well as data from the National Institute of Statistics on municipalities' population, living standards and health care network. Descriptive statistics, chi-square tests, and negative binomial and logistic regression modelling were applied to determine: (a) the municipality characteristics predicting Portuguese and international physicians' geographical distribution, and (b) the doctors' characteristics that could increase the odds of residing outside the country's metropolitan areas. There were 39,473 physicians in Portugal in 2008, 51.1% of whom were male, and 40.2% of whom were between 41 and 55 years of age. They were predominantly Portuguese (90.5%), with Spanish, Brazilian and African nationalities also represented. Population, the population's purchasing power, nurses per capita and the Municipality Development Index (MDI) were the municipality characteristics displaying the strongest association with national physicians' location. For foreign physicians, the MDI was not statistically significant, while municipalities' foreign population applying for residence
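As an illustration of the chi-square test of association used in the study, the sketch below tests whether residing outside metropolitan areas depends on nationality. The contingency counts are invented, since the abstract reports only marginal percentages.

```python
# Chi-square test of association: nationality vs metropolitan residence.
# The counts are hypothetical, chosen only to demonstrate the machinery.
from scipy.stats import chi2_contingency

#                metropolitan, non-metropolitan
table = [[22000, 13700],   # Portuguese physicians (hypothetical)
         [2600, 1150]]     # foreign physicians (hypothetical)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.2g}")
```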

  12. NODC Standard Product: International ocean atlas Volume 4 - Atlas of temperature / salinity frequency distributions (2 disc set) (NCEI Accession 0101473)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This Atlas presents more than 80,000 plots of the empirical frequency distributions of temperature and salinity for each 5-degree square area of the North Atlantic...

  13. Information and communication technologies for operating of smart distribution grids based on the German standardization roadmap; Informations- und Kommunikationstechnologien zur Betriebsfuehrung smarter Verteilungsnetze auf Basis der Deutschen Normungsroadmap

    Energy Technology Data Exchange (ETDEWEB)

    Brunner, Christoph [IT4Power, Zug (Switzerland); Buchholz, Bernd Michael [NTB Technoservice, Pyrbaum (Germany); Hampel, Herman [iAD GmbH, Grosshabersdorf (Germany); Naumann, A. [Magdeburg Univ. (Germany)]

    2012-07-01

    The current challenges of the distribution networks arise from a growing volume of distributed energy in-feed and new types of load. They require the introduction of information and communication technologies (ICT) down to the low-voltage level. Innovative monitoring and control tasks were developed in the framework of the European lighthouse project ''Web2Energy'' (W2E) and demonstrated in practice in the 20/0.4 kV network of HSE AG in Darmstadt. An overview of the implemented functions is given, and the related information exchange between the control centre and the distributed plants is considered. The project applies the standards IEC 61850 for data communication and IEC 61968/70 for data management (CIM - Common Information Model) in the control centre (CC). The client-server architecture of the developed communication system is considered in detail. The project-related W2E CC also addresses smart distribution in the context of market activities. Data acquisition and control of the 20/0.4 kV terminals and the various power plants are executed by a mini remote terminal unit. The W2E RTU offers a 100 MBd Ethernet interface providing the IEC 61850 protocol for access to the communication system. The required application-specific extensions of the standards are discussed, and first practical experiences are presented. (orig.)

  14. Influence of forest roads standards and networks on water yield as predicted by the distributed hydrology-soil-vegetation model

    Science.gov (United States)

    Salli F. Dymond; W. Michael Aust; Steven P. Prisley; Mark H. Eisenbies; James M. Vose

    2013-01-01

    Throughout the country, foresters are continually looking at the effects of logging and forest roads on stream discharge and overall stream health. In the Pacific Northwest, a distributed hydrology-soil-vegetation model (DHSVM) has been used to predict the effects of logging on peak discharge in mountainous regions. DHSVM uses elevation, meteorological, vegetation, and...

  15. Evaluation of Antibiotic Residues in Pasteurized and Raw Milk Distributed in the South of Khorasan-e Razavi Province, Iran.

    Science.gov (United States)

    Moghadam, Mortez Mohammadzadeh; Amiri, Mostafa; Riabi, Hamed Ramezani Awal; Riabi, Hamid Ramezani Awal

    2016-12-01

    The presence of antibiotic residues in milk and other livestock products is a health problem which can endanger public health. Antibiotics are widely used in animal husbandry to treat diseases related to bacterial infections. Antimicrobial drugs have been in use for decades in industry, and are commonly used in livestock facilities to treat mastitis. This study aimed to investigate antibiotic residues in pasteurized milk distributed in schools, in milk collection centers, and in milk production factories in Gonabad city. This cross-sectional study was conducted on 251 samples of commercial pasteurized milk packets distributed in schools (code A), raw milk collection centers in Gonabad city (code B), and pasteurized milk production factories (code C) in Gonabad city. The Copan test kit of the Christian Hansen Company (Denmark) was used to monitor antibiotic residues in milk. The data were analysed employing the chi-square test and one-way analysis of variance (ANOVA) to determine significant differences, using SPSS software version 20. The significance level was set at p < 0.05. In total, 251 milk samples were collected, of which 143 (57%) were code A, 84 (33.5%) code B and 24 (9.6%) code C. A total of 189 samples (75.2%) were negative and 62 (24.8%) were positive. Of the three types of milk samples, 41 samples (28.7%) of code A, 18 samples (21.4%) of code B and 3 samples (12.5%) of code C were positive. In general, among the milk samples most contaminated with antibiotics, 17 samples were positive in January, and regarding code A, 13 samples were positive in the same month. There was not a significant difference among the three types of milk (p > 0.05). The highest number of milk samples (n = 7) contaminated with antibiotics was related to code B (38.5%). Most positive cases were related to code A in winter. Also, there was no significant difference among the three types of contaminated milk regarding year and month (p = 0.164 and p = 0.917, respectively). Pasteurized milk
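The reported comparison of positivity across the three milk codes can be reproduced with a plain chi-square test of homogeneity. For 2 degrees of freedom the chi-square p-value has the closed form exp(-x/2), so the standard library suffices; the counts below are taken from the abstract:

```python
import math

def chi_square_homogeneity_2df(groups):
    """Chi-square test of homogeneity for three groups of (positive,
    negative) counts. With (3 - 1) * (2 - 1) = 2 degrees of freedom the
    p-value has the closed form exp(-chi2 / 2)."""
    total_pos = sum(p for p, n in groups)
    total = sum(p + n for p, n in groups)
    rate = total_pos / total
    chi2 = 0.0
    for pos, neg in groups:
        row = pos + neg
        e_pos, e_neg = row * rate, row * (1 - rate)
        chi2 += (pos - e_pos) ** 2 / e_pos + (neg - e_neg) ** 2 / e_neg
    return chi2, math.exp(-chi2 / 2)

# (positive, negative) antibiotic-residue counts for codes A, B and C
chi2, p = chi_square_homogeneity_2df([(41, 102), (18, 66), (3, 21)])
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

On these counts the test gives p ≈ 0.16, consistent with the non-significant difference reported among the three milk types (p > 0.05).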

  16. Finite element analysis of the stress distributions in peri-implant bone in modified and standard-threaded dental implants

    Directory of Open Access Journals (Sweden)

    Serkan Dundar

    2016-01-01

    Full Text Available The aim of this study was to examine the stress distributions under three different loads in two different geometric and threaded types of dental implants by finite element analysis. For this purpose, two different implant models, Nobel Replace and Nobel Active (Nobel Biocare, Zurich, Switzerland), which are currently used in clinical cases, were constructed by using ANSYS Workbench 12.1. The stress distributions on components of the implant system under three different static loadings were analysed for the two models. The maximum stress values in all components were observed under FIII (300 N) loading when the Nobel Replace implant was used, whereas the lowest occurred under FI (150 N) loading in the Nobel Active implant. In all models, the maximum stresses were observed in the neck region of the implants. Increasing the connection between the implant and the bone surface may allow a more uniform distribution of forces on the dental implant and may protect the bone around the implant. Thus, the implant could remain in the mouth for longer periods. Variable-thread tapered implants can increase implant-bone contact.

  17. Mobility particle size spectrometers: harmonization of technical standards and data structure to facilitate high quality long-term observations of atmospheric particle number size distributions

    Directory of Open Access Journals (Sweden)

    A. Wiedensohler

    2012-03-01

    Full Text Available Mobility particle size spectrometers, often referred to as DMPS (Differential Mobility Particle Sizers) or SMPS (Scanning Mobility Particle Sizers), have found a wide range of applications in atmospheric aerosol research. However, comparability of measurements conducted world-wide is hampered by a lack of generally accepted technical standards and guidelines with respect to the instrumental set-up, measurement mode, data evaluation and quality control. Technical standards were therefore developed as a minimum requirement for mobility size spectrometry to perform long-term atmospheric aerosol measurements. Technical recommendations include continuous monitoring of flow rates, temperature, pressure, and relative humidity for the sheath and sample air in the differential mobility analyzer.

    We compared commercial and custom-made inversion routines that calculate the particle number size distribution from the measured electrical mobility distribution. All inversion routines agree to within a few per cent uncertainty for a given set of raw data.

    Furthermore, this work summarizes the results from several instrument intercomparison workshops conducted within the European infrastructure projects EUSAAR (European Supersites for Atmospheric Aerosol Research) and ACTRIS (Aerosols, Clouds, and Trace gases Research InfraStructure Network) to determine the present uncertainties, especially of custom-built mobility particle size spectrometers. Under controlled laboratory conditions, the particle number size distributions from 20 to 200 nm determined by mobility particle size spectrometers of different design are within an uncertainty range of around ±10% after correcting for internal particle losses, while below and above this size range the discrepancies increase. For particles larger than 200 nm, the uncertainty range increased to 30%, which could not be explained. The network reference mobility spectrometers with identical design agreed within ±4% in the

  18. Service Oriented Integration of Distributed Heterogeneous IT Systems in Production Engineering Using Information Standards and Linked Data

    Directory of Open Access Journals (Sweden)

    Navid Shariat Zadeh

    2017-01-01

    Full Text Available While the design of production systems based on digital models brings benefits, the communication of models comes with challenges, since models typically reside in a heterogeneous IT environment using different syntax and semantics. Coping with heterogeneity requires a smart integration strategy. One main paradigm for integrating data and IT systems is to deploy information standards. In particular, ISO 10303 STEP has been endorsed as a suitable standard to exchange a wide variety of product manufacturing data. On the other hand, service-oriented tool integration solutions are progressively being adopted for the integration of data and IT tools, especially with the emergence of Open Services for Lifecycle Collaboration, whose focus is on the linking of data from heterogeneous software tools. In practice, these approaches should be combined to facilitate the integration process. Hence, the aim of this paper is to investigate the applications of the approaches and the principles behind them, and to identify criteria for where to use which approach. In addition, we explore the synergy between them and consequently suggest an approach based on their combination. Finally, a systematic approach is suggested to identify the required levels of integration and their corresponding approaches, exemplified in a typical IT system architecture in production engineering.

  19. Are the stages of change socioeconomically distributed? A scoping review.

    Science.gov (United States)

    Adams, Jean; White, Martin

    2007-01-01

    To conduct a rapid scoping review to explore the hypothesis that socioeconomic affluence is associated with a more advanced stage of change for health behaviors. Key-word searches of MEDLINE, Embase, PsycINFO, and www.google.com were conducted. Studies identified by the searches were included if they were published between 1982 and September 2003, written in English, and reported information on the distribution of the stages of change for any health behavior according to a marker of socioeconomic position (SEP). Data on the behavior studied, the sample studied, the measure of SEP used, the definitions of the stages of change used, and the distribution of the stages of change according to SEP were extracted by a single reviewer. As far as possible, data were reanalyzed by the chi-square test to determine whether the distribution of the stages of change varied according to SEP. A formal meta-analysis was not appropriate. Twenty-one studies reporting data on 30 samples and 188,850 individuals were included. Significant variations in the distribution of the stages of change were found according to SEP, in the expected direction, in 16 (53%) samples representing 171,183 (91%) individuals. There is substantial published evidence that more-affluent people tend to be in more-advanced stages of change than are more-deprived people.

  20. Comparison of high energy gamma rays from absolute value of b greater than 30 deg with the galactic neutral hydrogen distribution

    Science.gov (United States)

    Ozel, M. E.; Ogelman, H.; Tumer, T.; Fichtel, C. E.; Hartman, R. C.; Kniffen, D. A.; Thompson, F. J.

    1978-01-01

    High-energy gamma-ray (energy above 35 MeV) data from the SAS 2 satellite have been used to compare the intensity distribution of gamma rays with that of neutral hydrogen (H I) density along the line of sight, at high galactic latitudes (absolute values greater than 30 deg). A model has been constructed for the case where the observed gamma-ray intensity has been assumed to be the sum of a galactic component proportional to the H I distribution plus an isotropic extragalactic emission. A chi-squared test of the model parameters indicates that about 30% of the total high-latitude emission may originate within the Galaxy.
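The two-component model used above (gamma-ray intensity as a galactic term proportional to the H I column density plus an isotropic extragalactic term) can be fitted by minimizing chi-square over the two coefficients. The sketch below does this with a coarse grid search on synthetic data; all numbers are illustrative, not SAS 2 measurements:

```python
def chi_square(a, b, nhi, obs, sigma):
    """Chi-square of the model I = a * N_HI + b against observed intensities."""
    return sum(((o - (a * n + b)) / s) ** 2 for n, o, s in zip(nhi, obs, sigma))

def grid_fit(nhi, obs, sigma, a_grid, b_grid):
    """Return (a, b, chi2) minimizing chi-square over a parameter grid."""
    return min(((a, b, chi_square(a, b, nhi, obs, sigma))
                for a in a_grid for b in b_grid), key=lambda t: t[2])

def frange(lo, hi, n):
    return [lo + i * (hi - lo) / n for i in range(n + 1)]

# Synthetic sky bins: H I column densities and gamma-ray intensities
# (arbitrary units), generated roughly as 1.0 * N_HI + 2.0 plus noise
nhi   = [1.0, 2.0, 3.0, 4.0, 5.0]
obs   = [3.1, 3.9, 5.2, 6.0, 7.1]
sigma = [0.3] * 5
a, b, chi2 = grid_fit(nhi, obs, sigma, frange(0.0, 2.0, 40), frange(0.0, 4.0, 40))
print(f"a = {a:.2f}, b = {b:.2f}, chi2 = {chi2:.2f}")
```

For a linear model a weighted least-squares solution is available in closed form; the grid search is used only to make the chi-square minimization explicit.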

  1. Standard bone healing stages occur during delayed bone healing, albeit with a different temporal onset and spatial distribution of callus tissues.

    Science.gov (United States)

    Peters, Anja; Schell, Hanna; Bail, Hermann J; Hannemann, Marion; Schumann, Tanja; Duda, Georg N; Lienau, Jasmin

    2010-09-01

    Bone healing is considered as a recapitulation of a developmental program initiated at the time of injury. This study tested the hypothesis that in delayed bone healing the regular cascade of healing events, including remodeling of woven to lamellar bone, would be similar compared to standard healing, although the temporal onset would be delayed. A tibial osteotomy was performed in sheep and stabilized with a rotationally unstable fixator leading to delayed healing. The sheep were sacrificed at 2, 3, 6, 9 weeks and 6 months postoperatively. The temporal and spatial tissue distributions in the calluses and the bone microstructure were examined by histology. Although histological analysis demonstrated temporal and spatial callus tissue distribution differences, delayed healing exhibited the same characteristic stages as those seen during uneventful standard healing. The delayed healing process was characterized by a prolonged presence of hematoma, a different spatial distribution of new bone and delayed and prolonged endochondral bone formation. A change in the spatial distribution of callus formation was seen by week 6 leading to bone formation and resorption of the cortical bone fragments, dependent on the degree to which the cortical bone fragments were dislocated. At 6 months, only 5 out of 8 animals showed complete bony bridging with a continuous periosteum, although lamellar bone and newly formed woven bone were present in the other 3 animals. This study demonstrates that during delayed bone healing all stages of the healing cascade likely take place, even if bony consolidation does not occur. Furthermore, the healing outcome might be related to the periosteum's regenerative capacity leading to bony union or absence of bony bridging.

  2. Building America Case Study: Standard- Versus High-Velocity Air Distribution in High-Performance Townhomes, Denver, Colorado

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-09

    IBACOS investigated the performance of a small-diameter high-velocity heat pump system compared to a conventional system in a newly constructed triplex townhouse. A ductless heat pump system also was to be installed for comparison, but the homebuyer backed out because of aesthetic concerns about that system. In total, two buildings, having identical solar orientation and comprising six townhomes, were monitored for comfort and energy performance. Results show that the small-diameter system provides more uniform temperatures from floor to floor in the three-story townhome. No clear energy consumption benefit was observed from either system. The builder is continuing to explore the small-diameter system as its new standard system to provide better comfort and indoor air quality. The homebuilder also explored the possibility of shifting its townhome product to meet the U.S. Department of Energy Challenge Home National Program Requirements. Ultimately, the builder decided that adoption of these practices would be too disruptive midstream in the construction cycle. However, the townhomes met the ENERGY STAR Version 3.0 program requirements.

  3. Dose distribution of gamma radiation in a new geometric configuration of a standard carton date package and its experimental application for disinfestation of packed dates

    Science.gov (United States)

    Ahmed, M. S. H.; Al-Taweel, A. A.; Hameed, A. A.

    1994-06-01

    A new geometrical configuration composed of three standard carton boxes (SCBs) filled with polyethylene bags (PBs), each bag containing 1 kg of dates, was placed on a single turntable of a Gammabeam-650 and irradiated with low doses. The mean radiation absorbed dose for disinfestation of this geometrical unit, measured at 15 equally distributed positions (Fricke dosimeters) inside the 3 SCBs on a single turntable, was calculated to be 0.46 ± 0.20 kGy, with a dose uniformity ratio (U) of 1.0019/0.2500 = 4.00. Development and genetic tests carried out on insects found in the PBs 1-2 days after irradiation showed that all insects were completely sterile and died within a short period of time. No sign of reinfestation was recorded in the treated packages even after 30 days of storage in an insectary. Apparently, the prevention of insects from invading and/or penetrating the date packages is due mainly to the new combination of standard carton boxes, which are widely used commercially, and hermetically heat-sealed polyethylene bags of dates, in addition to the complete prevention of reproduction induced by the low doses of γ radiation. Therefore, using a similar geometrical configuration, 18 big standard carton date packages can be disinfested simultaneously with the same range of doses by utilizing all 6 turntables inside the radiation chamber of the Gammabeam-650 irradiation facility.

  4. Curve fitting of the corporate recovery rates: the comparison of Beta distribution estimation and kernel density estimation.

    Directory of Open Access Journals (Sweden)

    Rongda Chen

    Full Text Available The recovery rate is essential to the estimation of a portfolio's loss and economic capital. Neglecting the randomness of the distribution of the recovery rate may underestimate risk. This study introduces two distribution models, Beta distribution estimation and kernel density estimation, to simulate the distribution of recovery rates of corporate loans and bonds. Models based on the Beta distribution are in common use, for example in CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and LossCalc by Moody's. However, the Beta distribution has a fatal defect: it cannot fit bimodal or multimodal distributions, such as the recovery rates of corporate loans and bonds shown by Moody's new data. To overcome this flaw, kernel density estimation is introduced, and we compare the simulation results of the histogram, Beta distribution estimation and kernel density estimation, concluding that the Gaussian kernel density estimate better imitates the distribution of bimodal or multimodal data samples of corporate loans and bonds. Finally, a chi-square test of the Gaussian kernel density estimate confirms that it fits the curve of recovery rates of loans and bonds. Thus, using the kernel density estimate to precisely delineate the bimodal recovery rates of bonds is optimal in credit risk management.
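A Gaussian kernel density estimate of the kind favoured here for bimodal recovery-rate data is short to sketch with only the standard library, using Silverman's rule-of-thumb bandwidth; the sample below is synthetic, not Moody's data:

```python
import math
import random

def gaussian_kde(sample, bandwidth=None):
    """Return a Gaussian kernel density estimate f(x); the default
    bandwidth is Silverman's rule of thumb, 1.06 * sd * n**(-1/5)."""
    n = len(sample)
    if bandwidth is None:
        mean = sum(sample) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
        bandwidth = 1.06 * sd * n ** -0.2
    norm = n * bandwidth * math.sqrt(2 * math.pi)
    def density(x):
        return sum(math.exp(-0.5 * ((x - xi) / bandwidth) ** 2)
                   for xi in sample) / norm
    return density

# Synthetic bimodal "recovery rates": modes near 0.2 and 0.8
random.seed(1)
sample = ([random.gauss(0.2, 0.05) for _ in range(300)] +
          [random.gauss(0.8, 0.05) for _ in range(300)])
f = gaussian_kde(sample)
print(f(0.2), f(0.5), f(0.8))  # the estimate peaks near both modes
```

A single Beta density is unimodal (or U-shaped) and cannot reproduce two interior peaks, which is exactly the flaw the abstract highlights.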

  5. Curve fitting of the corporate recovery rates: the comparison of Beta distribution estimation and kernel density estimation.

    Science.gov (United States)

    Chen, Rongda; Wang, Ze

    2013-01-01

    The recovery rate is essential to the estimation of a portfolio's loss and economic capital. Neglecting the randomness of the distribution of the recovery rate may underestimate risk. This study introduces two distribution models, Beta distribution estimation and kernel density estimation, to simulate the distribution of recovery rates of corporate loans and bonds. Models based on the Beta distribution are in common use, for example in CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and LossCalc by Moody's. However, the Beta distribution has a fatal defect: it cannot fit bimodal or multimodal distributions, such as the recovery rates of corporate loans and bonds shown by Moody's new data. To overcome this flaw, kernel density estimation is introduced, and we compare the simulation results of the histogram, Beta distribution estimation and kernel density estimation, concluding that the Gaussian kernel density estimate better imitates the distribution of bimodal or multimodal data samples of corporate loans and bonds. Finally, a chi-square test of the Gaussian kernel density estimate confirms that it fits the curve of recovery rates of loans and bonds. Thus, using the kernel density estimate to precisely delineate the bimodal recovery rates of bonds is optimal in credit risk management.

  6. Quantitative analysis of urban sprawl in Tripoli using Pearson's Chi-Square statistics and urban expansion intensity index

    International Nuclear Information System (INIS)

    Al-sharif, Abubakr A A; Pradhan, Biswajeet; Shafri, Helmi Zulhaidi Mohd; Mansor, Shattri

    2014-01-01

    Urban expansion is a spatial phenomenon that reflects the increased importance of metropolises. Remotely sensed data and GIS have been widely used to study and analyze the process of urban expansion and its patterns. The capital of Libya (Tripoli) was selected for this study, to examine its urban growth patterns. Four satellite images of the study area from different dates (1984, 1996, 2002 and 2010) were used to conduct this research. The main goal of this work is to identify and analyze the urban sprawl of the Tripoli metropolitan area. The urban expansion intensity index (UEII) and a degree-of-freedom test were used to analyze and assess urban expansion in the study area. The results show that Tripoli has a sprawled urban expansion pattern and a high urban expansion intensity index, and that its urban development had a high degree of freedom over the period studied (1984-2010). Moreover, the proposed zoning hypothesis provided good insight into the direction of urban expansion and the effect of distance from the central business district (CBD).
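For reference, one common formulation of the urban expansion intensity index (an assumption here; the paper's exact definition may differ) divides the average annual growth in built-up area by the zone's total land area. The figures below are hypothetical, not Tripoli's:

```python
def ueii(area_start, area_end, years, total_area):
    """Urban expansion intensity index (one common formulation): average
    annual growth in built-up area as a percentage of total land area."""
    return (area_end - area_start) / (years * total_area) * 100

# Hypothetical zone: built-up area grows from 40 to 90 km^2 over the
# 26-year study window (1984-2010) in a zone of 500 km^2 total area
result = round(ueii(40.0, 90.0, 26, 500.0), 3)
print(result)
```

Computed per zone and per period, the index lets growth intensities be compared across zones of different sizes.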

  7. Factors Contributing to Successful Employment Outcomes for Hispanic Women Who Are Deaf: Utilization of Chi-Squared Automatic Interaction Detector and Logistic Regression Analysis

    Science.gov (United States)

    Feist, Amber M.

    2013-01-01

    Hispanic women who are deaf constitute a heterogeneous group of individuals with varying vocational needs. To understand the unique needs of this population, it is important to analyze how consumer characteristics, presence of public supports, and type of services provided influence employment outcomes for Hispanic women who are deaf. The purpose…

  8. The stochastic community and the logistic-J distribution

    Science.gov (United States)

    Dewdney, A. K.

    2003-12-01

    A new formal model called the multispecies logistical (MSL) system produces species-abundance distributions that are compared statistically with those found in natural communities. The system, which is able to handle thousands of individuals from hundreds of species, iteratively selects random pairs of individuals and transfers a unit of biomass (or energy) between the respective species. Several elaborations of the model, including those with trophic compartments, appear to produce the same distribution. The theoretical distribution underlying the MSL system is a hyperbolic section, here called the logistic-J distribution. In the study reported here, the logistic-J distribution has been fitted to the species-abundance histograms of 125 randomly selected taxocoenoses. Since the overall chi-square score of the logistic-J achieved near-optimality in this study, it cannot be distinguished statistically from the J-curves observed by field biologists. For comparison, the log-series distribution was given the same test and scored significantly higher (i.e. more poorly) than the mean logistic-J score. If there is a single major distribution underlying natural communities, it is not the log-series distribution. Nor, owing to a mathematical error in its formulation, can it be the lognormal distribution. In the MSL system each species follows a "stochastic orbit" about the mean abundance, producing, in consequence, the logistic-J distribution. Such orbits are produced by any system in which the probabilities of reproduction and death are approximately equal. Accordingly, the "stochastic communities hypothesis" is proposed here as the overall mechanism governing abundances in all natural communities. It is not a single mechanism per se, but the net effect of all environmental influences.
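The MSL update rule described above (repeatedly pick two individuals at random and transfer a unit of biomass between their species) is simple to simulate. The sketch below is a toy version; the minimum-abundance rule is an assumption, not taken from the source:

```python
import random

def msl_simulate(n_species=50, units_per_species=10, steps=50_000, seed=0):
    """Toy multispecies logistical (MSL) system: each step draws two
    individuals at random (i.e. species chosen with probability
    proportional to abundance) and moves one unit of biomass from the
    first species to the second. Keeping every species at abundance >= 1
    is an assumption of this sketch."""
    rng = random.Random(seed)
    abundance = [units_per_species] * n_species
    species = list(range(n_species))
    for _ in range(steps):
        donor, recipient = rng.choices(species, weights=abundance, k=2)
        if donor != recipient and abundance[donor] > 1:
            abundance[donor] -= 1
            abundance[recipient] += 1
    return abundance

abundance = sorted(msl_simulate())
# Total biomass is conserved; the sorted abundances form a J-shaped
# profile: many rare species and a few very abundant ones.
print(abundance[:5], abundance[-5:])
```

The random walk each species performs about the mean abundance is the "stochastic orbit" the abstract refers to, and the resulting skewed histogram is the J-curve.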

  9. Weight-for-age standard score - distribution and effect on in-hospital mortality: A retrospective analysis in pediatric cardiac surgery

    Directory of Open Access Journals (Sweden)

    Antony George

    2015-01-01

    Full Text Available Objective: To study the distribution of the weight-for-age standard score (Z score) in pediatric cardiac surgery and its effect on in-hospital mortality. Introduction: The WHO recommends the standard score (Z score) to quantify and describe anthropometric data. The distribution of the weight-for-age Z score and its effect on mortality in congenital heart surgery has not been studied. Methods: All patients younger than 5 years who underwent cardiac surgery from July 2007 to June 2013 under a single surgical unit at our institute were enrolled. The Z score for weight for age was calculated. Patients were classified according to Z score, and mortality across the classes was compared. Discrimination and calibration of the Z score model were assessed. The improvement in predictability of mortality after addition of the Z score to the Aristotle Comprehensive Complexity (ACC) score was analyzed. Results: The median Z score was -3.2 (interquartile range: -4.24 to -1.91) with a weight (mean ± SD) of 8.4 ± 3.38 kg. Overall mortality was 11.5%. 71% and 52.59% of patients had Z scores < -2 and < -3, respectively. Lower Z score classes were associated with progressively increasing mortality. The Z score as a continuous variable was associated with an OR of 0.622 (95% CI: 0.527 to 0.733, P < 0.0001) for in-hospital mortality and remained a significant predictor even after adjusting for age, gender, bypass duration and ACC score. Addition of the Z score to the ACC score improved its predictability for in-hospital mortality (δC = 0.0661 [95% CI: 0.017 to 0.0595, P = 0.0169], IDI = 3.83% [95% CI: 0.017 to 0.0595, P = 0.00042]). Conclusion: Z scores were lower in our cohort and were associated with in-hospital mortality. Addition of the Z score to the ACC score significantly improves predictive ability for in-hospital mortality.
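Weight-for-age Z scores such as those analysed here are conventionally computed from WHO growth references with the LMS method. The sketch below implements the LMS transformation with illustrative parameter values, not official WHO reference data:

```python
import math

def lms_z_score(value, L, M, S):
    """LMS z-score: z = ((value / M)**L - 1) / (L * S); as L -> 0 this
    tends to the limit ln(value / M) / S."""
    if abs(L) < 1e-9:
        return math.log(value / M) / S
    return ((value / M) ** L - 1) / (L * S)

# Illustrative LMS parameters for one age/sex group (NOT official WHO data):
# L = skewness (Box-Cox power), M = median weight in kg, S = coefficient
# of variation
L, M, S = -0.35, 12.0, 0.11
for weight in (8.4, 12.0, 15.0):
    print(f"weight {weight:4.1f} kg -> Z = {lms_z_score(weight, L, M, S):+.2f}")
```

A weight equal to the reference median M yields Z = 0 by construction, and weights below it yield negative scores, as in the cohort's median of -3.2.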

  10. Solving the interoperability challenge of a distributed complex patient guidance system: a data integrator based on HL7's Virtual Medical Record standard.

    Science.gov (United States)

    Marcos, Carlos; González-Ferrer, Arturo; Peleg, Mor; Cavero, Carlos

    2015-05-01

    We show how the HL7 Virtual Medical Record (vMR) standard can be used to design and implement a data integrator (DI) component that collects patient information from heterogeneous sources and stores it in a personal health record, from which it can then retrieve data. Our working hypothesis is that the HL7 vMR standard in its release 1 version can properly capture the semantics needed to drive evidence-based clinical decision support systems. To achieve seamless communication between the personal health record and heterogeneous data consumers, we used a three-pronged approach. First, the choice of the HL7 vMR as a message model for all components, accompanied by the use of medical vocabularies, eases their semantic interoperability. Second, the DI follows a service-oriented approach to provide access to system components. Third, an XML database provides the data layer. Results: The DI supports the requirements of a guideline-based clinical decision support system implemented in two clinical domains and settings, ensuring reliable and secure access, high performance, and simplicity of integration, while complying with standards for the storage and processing of patient information needed for decision support and analytics. This was tested within the framework of a multinational project (www.mobiguide-project.eu) aimed at developing a ubiquitous patient guidance system (PGS). The vMR model with its extension mechanism is demonstrated to be effective for data integration and communication within a distributed PGS implemented for two clinical domains across different healthcare settings in two nations.

  11. A novel standardized algorithm using SPECT/CT evaluating unhappy patients after unicondylar knee arthroplasty - a combined analysis of tracer uptake distribution and component position

    International Nuclear Information System (INIS)

    Suter, Basil; Testa, Enrique; Stämpfli, Patrick; Konala, Praveen; Rasch, Helmut; Friederich, Niklaus F; Hirschmann, Michael T

    2015-01-01

    The introduction of a standardized SPECT/CT algorithm including a localization scheme, which allows accurate identification of specific patterns and thresholds of SPECT/CT tracer uptake, could lead to a better understanding of the bone remodeling and specific failure modes of unicondylar knee arthroplasty (UKA). The purpose of the present study was to introduce a novel standardized SPECT/CT algorithm for patients after UKA and evaluate its clinical applicability, usefulness and inter- and intra-observer reliability. Tc-HDP-SPECT/CT images of consecutive patients (median age 65, range 48–84 years) with 21 knees after UKA were prospectively evaluated. The tracer activity on SPECT/CT was localized using a specific standardized UKA localization scheme. For tracer uptake analysis (intensity and anatomical distribution pattern) a 3D volumetric quantification method was used. The maximum intensity values were recorded for each anatomical area. In addition, ratios between the respective value in the measured area and the background tracer activity were calculated. The femoral and tibial component position (varus-valgus, flexion-extension, internal and external rotation) was determined in 3D-CT. The inter- and intraobserver reliability of the localization scheme, grading of the tracer activity and component measurements were determined by calculating the intraclass correlation coefficients (ICC). The localization scheme, grading of the tracer activity and component measurements showed high inter- and intra-observer reliabilities for all regions (tibia, femur and patella). For measurement of component position there was strong agreement between the readings of the two observers; the ICC for the orientation of the femoral component was 0.73-1.00 (intra-observer reliability) and 0.91-1.00 (inter-observer reliability). The ICC for the orientation of the tibial component was 0.75-1.00 (intra-observer reliability) and 0.77-1.00 (inter-observer reliability). The SPECT

  12. A novel standardized algorithm using SPECT/CT evaluating unhappy patients after unicondylar knee arthroplasty--a combined analysis of tracer uptake distribution and component position.

    Science.gov (United States)

    Suter, Basil; Testa, Enrique; Stämpfli, Patrick; Konala, Praveen; Rasch, Helmut; Friederich, Niklaus F; Hirschmann, Michael T

    2015-03-20

    The introduction of a standardized SPECT/CT algorithm including a localization scheme, which allows accurate identification of specific patterns and thresholds of SPECT/CT tracer uptake, could lead to a better understanding of the bone remodeling and specific failure modes of unicondylar knee arthroplasty (UKA). The purpose of the present study was to introduce a novel standardized SPECT/CT algorithm for patients after UKA and evaluate its clinical applicability, usefulness and inter- and intra-observer reliability. Tc-HDP-SPECT/CT images of consecutive patients (median age 65, range 48-84 years) with 21 knees after UKA were prospectively evaluated. The tracer activity on SPECT/CT was localized using a specific standardized UKA localization scheme. For tracer uptake analysis (intensity and anatomical distribution pattern) a 3D volumetric quantification method was used. The maximum intensity values were recorded for each anatomical area. In addition, ratios between the respective value in the measured area and the background tracer activity were calculated. The femoral and tibial component position (varus-valgus, flexion-extension, internal and external rotation) was determined in 3D-CT. The inter- and intraobserver reliability of the localization scheme, grading of the tracer activity and component measurements were determined by calculating the intraclass correlation coefficients (ICC). The localization scheme, grading of the tracer activity and component measurements showed high inter- and intra-observer reliabilities for all regions (tibia, femur and patella). For measurement of component position there was strong agreement between the readings of the two observers; the ICC for the orientation of the femoral component was 0.73-1.00 (intra-observer reliability) and 0.91-1.00 (inter-observer reliability). The ICC for the orientation of the tibial component was 0.75-1.00 (intra-observer reliability) and 0.77-1.00 (inter-observer reliability). 
The SPECT/CT algorithm

  13. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    Science.gov (United States)

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic network events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from the Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and a particular case of the previous model). To that end, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two methods for estimating their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: the Chi-square test and the discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful for describing the road accident blackspot datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.
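    The chi-square goodness-of-fit step used in this record can be sketched in a few lines. The counts and the fitted geometric model below are invented for illustration; they are not the DGT data or the paper's discrete Pareto model.

    ```python
    # Hypothetical blackspot crash-count frequencies (NOT the Spanish DGT data):
    # number of blackspots recording 0, 1, 2, 3 and 4+ crashes.
    observed = [118, 62, 31, 13, 16]
    n = sum(observed)

    # Expected frequencies under an illustrative fitted model
    # (a geometric distribution with p = 0.5, chosen only for demonstration).
    p = 0.5
    expected = [n * p * (1 - p) ** k for k in range(4)]
    expected.append(n - sum(expected))  # lump the 4+ tail into one cell

    # Pearson chi-square statistic
    x2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

    # df = cells - 1 - fitted parameters = 5 - 1 - 1 = 3;
    # the 5% critical value of chi-square with 3 df is 7.815.
    print(f"X^2 = {x2:.3f}; reject at 5%: {x2 > 7.815}")  # X^2 = 0.467; reject at 5%: False
    ```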

  14. Standard setting of objective structured practical examination by modified Angoff method: A pilot study.

    Science.gov (United States)

    Kamath, M Ganesh; Pallath, Vinod; Ramnarayan, K; Kamath, Asha; Torke, Sharmila; Gonsalves, James

    2016-01-01

    The undergraduate curriculum at our institution is divided system-wise into four blocks, each block ending with a theory examination and an objective structured practical examination (OSPE). The OSPE in Physiology consists of 12 stations, and the conventional minimum score to qualify is 50%. We aimed to incorporate standard setting using the modified Angoff method in the OSPE to differentiate the competent from the non-competent student and to explore the possibility of introducing standard setting in the Physiology OSPE at our institution. Experts rated the OSPE using the modified Angoff method to obtain the standard-set cut-off in two of the four blocks. We assessed the OSPE marks of 110 first-year medical students. The Chi-square test was used to compare the number of students who scored less than the standard-set cut-off and the conventional cut-off; the correlation coefficient was used to assess the relation between OSPE and theory marks in both blocks. Feedback was obtained from the experts. The standard-set cut-off was 62% and 67% for blocks II and III, respectively. The use of the standard-set cut-off resulted in 16.3% (n=18) and 22.7% (n=25) of students being declared unsuccessful in blocks II and III, respectively. The comparison between the numbers who scored less than the standard-set and conventional cut-offs was statistically significant (p=0.001). The correlation coefficient was 0.65 (p=0.003) and 0.52 (p<0.001) in blocks II and III, respectively. The experts welcomed the idea of standard setting. Standard setting helped in differentiating the competent from the non-competent student, indicating that standard setting enhances the quality of the OSPE as an assessment tool.
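    The arithmetic of the modified Angoff method reduces to averaging expert judgments of a borderline student's chance of passing each station. The ratings below are invented placeholders (three experts, four stations), not the study's data.

    ```python
    # Modified Angoff sketch: each expert estimates the probability that a
    # borderline student passes each station; the cut-off is the mean of these
    # ratings. All ratings here are invented for illustration.
    ratings = [                      # rows: 3 experts; columns: 4 OSPE stations
        [0.60, 0.70, 0.65, 0.55],
        [0.65, 0.75, 0.60, 0.60],
        [0.70, 0.65, 0.70, 0.50],
    ]

    # Average over experts for each station, then over stations.
    station_means = [sum(col) / len(col) for col in zip(*ratings)]
    cutoff = sum(station_means) / len(station_means)
    print(f"standard-set cut-off = {cutoff:.4f}")  # ~0.6375, i.e. about 64%
    ```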

  15. ABO blood group distribution and ischaemic heart disease

    International Nuclear Information System (INIS)

    Lutfullah, A.; Bhatti, T.A.; Hanif, A.; Shaikh, S.H.

    2011-01-01

    To study the association of ABO blood groups with ischaemic heart disease (IHD) in our setting. Analytic comparative study. Department of Cardiology, Mayo Hospital, Lahore, over a period of two years from January 2008 to December 2009. The study group included 907 patients with IHD. The distribution of ABO blood groups in IHD patients was compared with a control group of 907 non-IHD individuals. Data were analyzed using SPSS 16. The Chi-square test for significance was used. A p-value less than 0.05 was taken as significant. In this study, the following pattern of ABO blood groups was observed in IHD patients and non-IHD patients, respectively: blood group A, 251 (27.67%) and 248 (27.34%); blood group B, 329 (36.27%) and 358 (39.47%); blood group O, 235 (25.90%) and 240 (24.46%); blood group AB, 92 (10.14%) and 61 (6.72%); p-value = 0.06. There is no association between ABO blood groups and ischaemic heart disease. (author)
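    The counts in this abstract are complete, so the chi-square test of independence can be reproduced directly; the statistic lands just under the 5% critical value, consistent with the reported p-value of 0.06.

    ```python
    # Blood-group counts (A, B, O, AB) taken from the abstract.
    ihd     = [251, 329, 235, 92]   # IHD patients (n = 907)
    control = [248, 358, 240, 61]   # non-IHD controls (n = 907)

    n_ihd, n_ctl = sum(ihd), sum(control)
    grand = n_ihd + n_ctl

    # Pearson chi-square test of independence on the 2x4 contingency table.
    x2 = 0.0
    for o_i, o_c in zip(ihd, control):
        col = o_i + o_c
        e_i = col * n_ihd / grand   # expected count in the IHD row
        e_c = col * n_ctl / grand   # expected count in the control row
        x2 += (o_i - e_i) ** 2 / e_i + (o_c - e_c) ** 2 / e_c

    # df = (2 - 1) * (4 - 1) = 3; the 5% critical value is 7.815.
    print(f"X^2 = {x2:.3f}")                  # X^2 = 7.576, just under 7.815
    print("significant at 5%:", x2 > 7.815)   # False, matching p = 0.06
    ```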

  16. Secure distributed genome analysis for GWAS and sequence comparison computation

    Science.gov (United States)

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  17. Investigating the Statistical Distribution of Learning Coverage in MOOCs

    Directory of Open Access Journals (Sweden)

    Xiu Li

    2017-11-01

    Full Text Available Learners participating in Massive Open Online Courses (MOOCs) have a wide range of backgrounds and motivations. Many MOOC learners enroll in the courses to take a brief look; only a few go through the entire content, and even fewer are able to eventually obtain a certificate. We discovered this phenomenon after examining 92 courses on both the xuetangX and edX platforms. More specifically, we found that the learning coverage in many courses—one of the metrics used to estimate the learners’ active engagement with the online courses—follows a Zipf distribution. We apply the maximum likelihood estimation method to fit Zipf’s law and test our hypothesis using a chi-square test. In the xuetangX dataset, the learning coverage in 53 of 76 courses fits Zipf’s law, but in all 16 courses on the edX platform the learning coverage rejects Zipf’s law. The results from our study are expected to bring insight into the unique learning behavior on MOOCs.
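    The fit-and-test procedure described here can be sketched on invented rank-frequency data (not the xuetangX/edX datasets); the MLE below is a coarse grid search rather than a proper optimizer, which is enough to show the mechanics.

    ```python
    import math

    # Invented tallies: observed[r-1] learners fall in the r-th ranked
    # learning-coverage bin (NOT the xuetangX/edX data).
    observed = [500, 240, 160, 120, 95, 80, 70, 60]
    N, R = sum(observed), len(observed)

    def zipf_probs(s):
        """Zipf probabilities p(r) = r^-s / H(s) over ranks 1..R."""
        w = [r ** -s for r in range(1, R + 1)]
        H = sum(w)
        return [x / H for x in w]

    def log_lik(s):
        return sum(o * math.log(p) for o, p in zip(observed, zipf_probs(s)))

    # Maximum-likelihood estimate of the exponent via a coarse grid search.
    s_hat = max((s / 100 for s in range(1, 301)), key=log_lik)

    # Pearson chi-square goodness of fit against the fitted Zipf law.
    expected = [N * p for p in zipf_probs(s_hat)]
    x2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

    # df = R - 1 - 1 (one fitted parameter); 5% critical value for 6 df is 12.592.
    print(f"s_hat = {s_hat:.2f}, X^2 = {x2:.2f}, fits: {x2 < 12.592}")
    ```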

  18. Algorithm for systematic peak extraction from atomic pair distribution functions.

    Science.gov (United States)

    Granlund, L; Billinge, S J L; Duxbury, P M

    2015-07-01

    The study presents an algorithm, ParSCAPE, for model-independent extraction of peak positions and intensities from atomic pair distribution functions (PDFs). It provides a statistically motivated method for determining parsimony of extracted peak models using the information-theoretic Akaike information criterion (AIC) applied to plausible models generated within an iterative framework of clustering and chi-square fitting. All parameters the algorithm uses are in principle known or estimable from experiment, though careful judgment must be applied when estimating the PDF baseline of nanostructured materials. ParSCAPE has been implemented in the Python program SrMise. Algorithm performance is examined on synchrotron X-ray PDFs of 16 bulk crystals and two nanoparticles using AIC-based multimodeling techniques, and particularly the impact of experimental uncertainties on extracted models. It is quite resistant to misidentification of spurious peaks coming from noise and termination effects, even in the absence of a constraining structural model. Structure solution from automatically extracted peaks using the Liga algorithm is demonstrated for 14 crystals and for C60. Special attention is given to the information content of the PDF, theory and practice of the AIC, as well as the algorithm's limitations.
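    The AIC-based parsimony criterion applied to chi-square fits can be illustrated with invented numbers for competing peak models (these values are not SrMise output).

    ```python
    # For chi-square (least-squares) fits, AIC = chi^2 + 2k up to a constant,
    # where k is the number of free parameters; the lowest AIC wins.
    # The (chi^2, k) pairs below are invented for illustration.
    candidates = {
        "2 peaks": (150.0, 6),
        "3 peaks": (95.0, 9),
        "4 peaks": (93.5, 12),
    }

    aic = {name: x2 + 2 * k for name, (x2, k) in candidates.items()}
    best = min(aic, key=aic.get)
    # The 4-peak model barely lowers chi^2, so its 3 extra parameters are not
    # justified and the 3-peak model is the parsimonious choice.
    print(best, aic[best])  # 3 peaks 113.0
    ```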

  19. Distribution and persistence of the anti sea-lice drug teflubenzuron in wild fauna and sediments around a salmon farm, following a standard treatment

    Energy Technology Data Exchange (ETDEWEB)

    Samuelsen, Ole B. [Institute of Marine Research, P.O. Box 1870 Nordnes, N-5817 Bergen (Norway); Lunestad, Bjørn T.; Hannisdal, Rita [National Institute of Nutrition and Seafood Research, P.O. Box 2029 Nordnes, N-5817 Bergen (Norway); Bannister, Raymond; Olsen, Siri [Institute of Marine Research, P.O. Box 1870 Nordnes, N-5817 Bergen (Norway); Tjensvoll, Tore [National Institute of Nutrition and Seafood Research, P.O. Box 2029 Nordnes, N-5817 Bergen (Norway); Farestveit, Eva; Ervik, Arne [Institute of Marine Research, P.O. Box 1870 Nordnes, N-5817 Bergen (Norway)

    2015-03-01

    The salmon louse (Lepeophtheirus salmonis) is a challenge in the farming of Atlantic salmon (Salmo salar). To treat an infestation, different insecticides are used, such as the orally administered chitin synthetase inhibitor teflubenzuron. The concentrations and distribution of teflubenzuron were measured in water, organic particles, marine sediment and biota caught in the vicinity of a fish farm following a standard medication. Low concentrations were found in water samples, whereas the organic waste from the farm, collected by sediment traps, had concentrations higher than those in the medicated feed. Most of the organic waste was deposited on the bottom close to the farm, but organic particles containing teflubenzuron were collected 1100 m from the farm. The sediment under the farm consisted of 5 to 10% organic material, and the concentration of teflubenzuron there was therefore much lower than in the organic waste. Teflubenzuron was persistent in the sediment, with a stipulated half-life of 170 days. Sediment-consuming polychaetes had high but decreasing concentrations of teflubenzuron throughout the experimental period, reflecting the decrease of teflubenzuron in the sediment. During medication most wild fauna contained teflubenzuron residues, with polychaetes and saithe showing the highest concentrations. Eight months later only polychaetes and some crustaceans contained drug residues. Which dosages induce mortality in various crustaceans following short- or long-term exposure is not known, but the results indicate that the concentrations in defined individuals of king crab, shrimp, squat lobster and Norway lobster were high enough shortly after medication to induce mortality if moulting was imminent. Considering food safety, saithe and the brown meat of crustaceans contained, at the first sampling, concentrations of teflubenzuron higher than the MRL value set for Atlantic salmon. The concentrations were, however, moderate and the amount of saithe fillet or brown meat of crustaceans to be
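    The stipulated 170-day sediment half-life implies first-order decay; a quick sketch follows (the 240-day horizon standing in for "eight months" is our assumption, not a figure from the study).

    ```python
    import math

    half_life = 170.0                   # days, as stipulated in the study
    k = math.log(2) / half_life         # first-order decay rate constant

    def fraction_remaining(days):
        """Fraction of the initial sediment concentration left after `days`."""
        return math.exp(-k * days)

    # Roughly eight months (240 days is our illustrative figure).
    print(f"{fraction_remaining(240):.3f}")  # 0.376 — about 38% remains
    ```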

  20. Distribution-based estimates of clinically significant changes in the International Standards for Neurological Classification of Spinal Cord Injury motor and sensory scores.

    Science.gov (United States)

    Scivoletto, G; Tamburella, F; Laurenza, L; Molinari, M

    2013-06-01

    Although the psychometric properties and statistical significance of the International Standards for Neurological Classification of Spinal Cord Injury Patients (ISNCSCI) have been widely examined, the clinical significance of motor and sensory scores (i.e., the improvement in score that has a meaningful impact on patients) is unknown. To calculate the clinical significance of the ISNCSCI. Analysis of prospectively collected data. Spinal Cord Unit of a rehabilitation hospital in the centre of Italy. Analysis of the data of 600 patients with registration of the ISNCSCI motor scores (total score and, separately, upper and lower extremity scores) and ISNCSCI sensory scores. Clinical significance was calculated using several distribution-based approaches: minimal important differences, effect size-based estimates for small and substantial changes, standard error of measurement, and minimal detectable change. The calculated clinical significance was compared with the improvements achieved by the patients to determine the percentage of patients who achieved significant improvement. Furthermore, the functional status (as evaluated by the Spinal Cord Independence Measure [SCIM III]) was studied in patients who achieved significant improvement compared to those who did not. The results of the study showed that motor scores were more amenable to change than sensory scores. A 5-point change in motor score corresponded to a clinically significant improvement of 0.2 standard deviation units, and an 11-point change in motor score was associated with an improvement of 0.5 standard deviation units. The percentages of patients with a significant improvement varied from 8 to 80% according to the level and severity of the lesion. 
In some AIS grade/level of lesion groups, patients who achieved clinical significant scores also showed a better functional status with significantly higher SCIM III scores than
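    The distribution-based thresholds listed in this record follow standard formulas; the SD and ICC values in the sketch below are invented placeholders, not the study's estimates.

    ```python
    import math

    sd  = 22.0    # hypothetical SD of baseline motor scores (placeholder)
    icc = 0.95    # hypothetical test-retest reliability (placeholder)

    small_change = 0.2 * sd                   # effect-size-based "small" change
    substantial  = 0.5 * sd                   # effect-size-based "substantial" change
    sem          = sd * math.sqrt(1 - icc)    # standard error of measurement
    mdc95        = 1.96 * math.sqrt(2) * sem  # minimal detectable change (95%)

    print(f"small = {small_change:.1f}, substantial = {substantial:.1f}")
    print(f"SEM = {sem:.2f}, MDC95 = {mdc95:.2f}")
    ```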

  1. Adaptation Measurement of CAD/CAM Dental Crowns with X-Ray Micro-CT: Metrological Chain Standardization and 3D Gap Size Distribution

    Directory of Open Access Journals (Sweden)

    L. Tapie

    2016-01-01

    Full Text Available Computer-Aided Design and Manufacturing systems are increasingly used to produce dental prostheses, but the parts produced suffer from a lack of evaluation, especially concerning the internal gap of the final assembly, that is, the space between the prepared tooth and the prosthesis. X-ray micro-Computed Tomography (micro-CT) is a noninvasive imaging technique enabling the internal inspection of the assembly. It has proved to be an efficient tool for measuring the gap. In this study, a critical review of the protocols using micro-CT to quantify the gap is proposed as an introduction to a new protocol aimed at minimizing errors and enabling comparison between CAD/CAM systems. To compare different systems, a standardized protocol is proposed, including two reference geometries. Micro-CT is used to acquire the reference geometries. A new 3D method is then proposed and a new indicator, the Gap Size Distribution (GSD), is defined. In addition, the usual 2D measurements are described and discussed. The 3D gap measurement method proposed can be used in clinical case geometries and has the considerable advantage of minimizing the data processing steps before performing the measurements.

  2. Evaluation of Kinetic Entropy of Breast Masses Initially Found on MRI using Whole-lesion Curve Distribution Data: Comparison with the Standard Kinetic Analysis.

    Science.gov (United States)

    Shimauchi, Akiko; Abe, Hiroyuki; Schacht, David V; Yulei, Jian; Pineda, Federico D; Jansen, Sanaz A; Ganesh, Rajiv; Newstead, Gillian M

    2015-08-01

    To quantify kinetic heterogeneity of breast masses that were initially detected with dynamic contrast-enhanced MRI, using whole-lesion kinetic distribution data obtained from computer-aided evaluation (CAE), and to compare that with standard kinetic curve analysis. Clinical MR images from 2006 to 2011 with breast masses initially detected with MRI were evaluated with CAE. The relative frequencies of six kinetic patterns (medium-persistent, medium-plateau, medium-washout, rapid-persistent, rapid-plateau, rapid-washout) within the entire lesion were used to calculate kinetic entropy (KE), a quantitative measure of enhancement pattern heterogeneity. Initial uptake (IU) and signal enhancement ratio (SER) were obtained from the most-suspicious kinetic curve. Mann-Whitney U test and ROC analysis were conducted for differentiation of malignant and benign masses. Forty benign and 37 malignant masses comprised the case set. IU and SER were not significantly different between malignant and benign masses, whereas KE was significantly greater for malignant than benign masses (p = 0.748, p = 0.083, and p < 0.05, respectively). Quantifying kinetic heterogeneity of whole-lesion time-curve data with KE has the potential to improve differentiation of malignant from benign breast masses on breast MRI. • Kinetic heterogeneity can be quantified by computer-aided evaluation of breast MRI • Kinetic entropy was greater in malignant masses than benign masses • Kinetic entropy has the potential to improve differentiation of breast masses.
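    Kinetic entropy as described is the Shannon entropy of the six kinetic-pattern frequencies; here is a sketch with invented lesion data (hypothetical frequencies, not the study's CAE output).

    ```python
    import math

    # Hypothetical within-lesion frequencies of the six kinetic patterns
    # (medium/rapid x persistent/plateau/washout); each list sums to 1.
    single_pattern = [1.00, 0.00, 0.00, 0.00, 0.00, 0.00]
    mixed_patterns = [0.25, 0.20, 0.15, 0.15, 0.15, 0.10]

    def kinetic_entropy(freqs):
        """Shannon entropy of the pattern distribution (0 * log 0 taken as 0)."""
        # The "+ 0.0" normalizes a possible -0.0 for clean printing.
        return sum(-p * math.log(p) for p in freqs if p > 0) + 0.0

    # A homogeneous lesion scores 0; the maximum possible is ln(6) ~ 1.79.
    print(f"{kinetic_entropy(single_pattern):.3f}")  # 0.000
    print(f"{kinetic_entropy(mixed_patterns):.3f}")  # 1.752
    ```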

  3. The impact of a standardized program on short and long-term outcomes in bariatric surgery.

    Science.gov (United States)

    Aird, Lisa N F; Hong, Dennis; Gmora, Scott; Breau, Ruth; Anvari, Mehran

    2017-02-01

    The purpose of this study was to determine whether there has been an improvement in short- and long-term clinical outcomes since 2010, when the Ontario Bariatric Network led a province-wide initiative to establish a standardized system of care for bariatric patients. The system includes nine bariatric centers, a centralized referral system, and a research registry. Standardization of procedures has progressed yearly, including guidelines for preoperative assessment and perioperative care. Analysis of the OBN registry data was performed by fiscal year between April 2010 and March 2015. Three-month overall postoperative complication rates and 30-day postoperative mortality were calculated. The mean percentage of weight loss at 1, 2, and 3 years postoperative, and regression of obesity-related diseases were calculated. The analysis of continuous and nominal data was performed using ANOVA, Chi-square, and McNemar's testing. A multiple logistic regression analysis was performed for factors affecting postoperative complication rate. Eight thousand and forty-three patients were included in the bariatric registry between April 2010 and March 2015. Thirty-day mortality was rare. The standardized system of bariatric care has contributed to improvements in complication rates and supported prolonged weight loss and regression of obesity-related diseases in patients undergoing bariatric surgery in Ontario.

  4. Management of post abortion complications in Botswana -The need for a standardized approach.

    Directory of Open Access Journals (Sweden)

    Tadele Melese

    Full Text Available Post abortion complications are the third leading cause of maternal death after hemorrhage and hypertension in Botswana, where abortion is not legalized. This study aimed at assessing the management of post abortion complications in Botswana. A retrospective study was conducted at four hospitals in Botswana in 2014. Socio-demographic, patient management and outcomes data were extracted from patients' medical records. Descriptive statistics and the chi-square test were used to analyze and present the data. A total of 619 patients' medical records were reviewed. The duration of hospital stay prior to uterine evacuation ranged from less than an hour to 480 hours. All the patients received either prophylactic or therapeutic antibiotics. Use of parenteral antibiotics was significantly associated with severity of abortion, second-trimester abortion, use of blood products and the interval between the management decision and uterine evacuation. Uterine evacuation for retained products of conception was achieved by metallic curettage in 516 (83.4%) patients and by vacuum aspiration in 18 (2.9%). At all the study sites, Misoprostol or Oxytocin were used concurrently with surgical evacuation of the uterus. Non-use of analgesics or anesthetics in the four hospitals ranged from 12.4% to 28.8%. There is evidence of delayed patient care and prolonged hospital stay. The metallic curette was the primary method used for uterine evacuation across all the facilities. Pain management and antibiotic use were not standardized. A protocol has to be developed with the aim of standardizing post abortion care.

  5. The level of observing standard tracheostomy care and some barriers from perspective of nurses

    Directory of Open Access Journals (Sweden)

    Mosazade Sari Z

    2015-05-01

    Full Text Available Background & objective: Tracheostomy is one of the most common surgical procedures in the intensive care unit. Although the implementation of standard care can decrease postoperative complications, some factors create negative attitudes in nurses and consequently lead to routine, non-standard care. The current study aimed to determine the level of observing standard tracheostomy care and some barriers from the perspective of nurses. Materials and Methods: In this descriptive-analytical study, 42 nurses who were working in two selected hospitals in Tehran in 2013 entered the study through a census. Data were gathered through a researcher-made questionnaire for demographic data and an observation checklist for standard tracheostomy care and its barriers. The researcher observed each nurse's tracheostomy care three times, and the nurses then completed the demographic and barriers-to-standard-care questionnaires. Data were analyzed in SPSS 21 using the one-sample t-test, independent t-test, chi-square, ANOVA and Pearson's correlation coefficient. Results: According to the results, only 2 nurses (4.8 percent) performed standard tracheostomy care. The barriers most affecting standard tracheostomy care were administrative barriers (54.76 percent). Also, there was a moderate positive correlation between standard care and administrative barriers (r=0.46, p=0.002), personal barriers (r=0.38, p=0.012) and all of the barriers (r=0.46, p=0.002). Conclusion: Most tracheostomy care is not performed according to standard protocols, and the most important barriers to its implementation are administrative. Given the importance of this care, measures must be taken to resolve these barriers.

  6. Eliciting hyperparameters of prior distributions for the parameters of paired comparison models

    Directory of Open Access Journals (Sweden)

    Nasir Abbas

    2013-02-01

    Full Text Available In the study of paired comparisons (PC), items may be ranked or issues may be prioritized through the subjective assessment of certain judges. PC models are developed and then used to serve the purpose of ranking. The PC models may be studied through a classical or Bayesian approach. Bayesian inference is a modern statistical technique used to draw conclusions about population parameters. Its beauty lies in incorporating prior information about the parameters into the analysis in addition to current information (i.e. data). The prior and current information are formally combined to yield a posterior distribution for the population parameters, which is the workbench of Bayesian statisticians. However, the problems Bayesians face concern the selection and formal utilization of the prior distribution. Once the type of prior distribution is decided, the problem of estimating the parameters of the prior distribution (i.e. elicitation) still persists. Different methods are devised to serve the purpose. In this study an attempt is made to use Minimum Chi-square (henceforth MCS) for the elicitation purpose. Though it is a classical estimation technique, it is used here for elicitation. The entire elicitation procedure is illustrated through a numerical data set.
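    The mechanics of minimum chi-square estimation can be sketched on a toy paired-comparison setup; the Bradley-Terry parameterization and the counts below are our illustrative assumptions, not the paper's model or data.

    ```python
    # Minimum chi-square sketch on one paired comparison: an expert states how
    # often, out of 100 imagined comparisons, item A would beat item B; we pick
    # the Bradley-Terry worth ratio theta minimizing the chi-square discrepancy.
    stated = [62, 38]            # expert's counts: A preferred, B preferred
    n = sum(stated)

    def chi_square(theta):
        """X^2 between stated counts and Bradley-Terry expectations."""
        p_a = theta / (theta + 1)          # P(A beats B) with worth ratio theta
        expected = [n * p_a, n * (1 - p_a)]
        return sum((o - e) ** 2 / e for o, e in zip(stated, expected))

    # Coarse grid search for the minimum chi-square estimate of theta.
    theta_hat = min((t / 100 for t in range(10, 501)), key=chi_square)
    print(f"theta_hat = {theta_hat:.2f}")  # 1.63, close to 62/38 ~ 1.632
    ```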

  7. Changing distributions of larger ungulates in the Kruger National Park from ecological aerial survey data

    Directory of Open Access Journals (Sweden)

    George J. Chirima

    2012-07-01

    Full Text Available Documenting current species distribution patterns and their association with habitat types is important as a basis for assessing future range shifts in response to climate change or other influences. We used the adaptive local convex hull (a-LoCoH) method to map distribution ranges of 12 ungulate species within the Kruger National Park (KNP) based on locations recorded during aerial surveys (1980–1993). We used log-linear models to identify changes in regional distribution patterns and chi-square tests to determine shifts in habitat occupation over this period. We compared observed patterns with earlier, more subjectively derived distribution maps for these species. Zebra, wildebeest and giraffe distributions shifted towards the far northern section of the KNP, whilst buffalo and kudu showed proportional declines in the north. Sable antelope distribution contracted most in the north, whilst tsessebe, eland and roan antelope distributions showed no shifts. Warthog and waterbuck contracted in the central and northern regions, respectively. The distribution of impala did not change. Compared with earlier distributions, impala, zebra, buffalo, warthog and waterbuck had become less strongly concentrated along rivers. Wildebeest, zebra, sable antelope and tsessebe had become less prevalent in localities west of the central region. Concerning habitat occupation, the majority of grazers showed a concentration on basaltic substrates, whilst sable antelope favoured mopane-dominated woodland and sour bushveld on granite. Buffalo showed no strong preference for any habitats and waterbuck were concentrated along rivers. Although widespread, impala were absent from sections of mopane shrubveld and sandveld. Kudu and giraffe were widespread through most habitats, but with a lesser prevalence in northern mopane-dominated habitats. Documented distribution shifts appeared to be related to the completion of the western boundary fence and widened provision of

  9. Radiographic assessment of distribution of mandibular third molar impaction: A retrospective study

    Directory of Open Access Journals (Sweden)

    Tejavathi Nagaraj

    2016-01-01

    Full Text Available Introduction: Third molars are the most common teeth that may follow an abortive eruption path and become impacted as a result of pathology, anatomical structures or insufficient osseous space posterior to the second molars. Aims and Objectives: The present study evaluated (1) the distribution of the impaction of mandibular third molars; (2) the distribution of the patterns of impaction radiographically; and (3) the gender distribution for pattern of impaction. Materials and Methods: This hospital-based retrospective study was conducted over a course of 6 months in the Department of Oral Medicine and Radiology and presents the analysis of 122 panoramic radiographs of patients in the age group of 18-30 years. They were interpreted and assessed for the impaction of mandibular third molars. Statistical analysis was done by the Chi-square test. Results: Bilateral impaction of the mandibular third molar is more common than unilateral in both sexes, with mesioangular being the most common pattern. In males, the mesioangular pattern was followed by horizontal, whereas in females it was followed by vertical. Conclusion: The present study provides useful data regarding the radiographic status of impacted mandibular third molars in patients.

  10. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    Science.gov (United States)

    Mcguire, Stephen C.

    1987-01-01

    The use is described of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, Fe approaches 0 at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
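
    The composite-unit-vector test mentioned above can be sketched numerically: for an isotropic azimuthal distribution, the squared length of the resultant of N unit vectors has expectation N, so values of R²/N far above 1 signal asymmetry. A minimal illustration in Python with synthetic angles (not JACEE event data); the von Mises concentration parameter is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(6)

def resultant_sq(phi):
    """Squared length of the sum of unit vectors at azimuths phi."""
    return np.cos(phi).sum() ** 2 + np.sin(phi).sum() ** 2

N = 500
iso = rng.uniform(0.0, 2.0 * np.pi, N)           # isotropic azimuths
aniso = rng.vonmises(mu=0.0, kappa=1.0, size=N)  # clustered azimuths

# For isotropic emission E[R^2] = N, so R^2/N near 1 is consistent
# with randomness; R^2/N >> 1 indicates azimuthal asymmetry.
print("isotropic   R^2/N =", round(resultant_sq(iso) / N, 2))
print("anisotropic R^2/N =", round(resultant_sq(aniso) / N, 2))
```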

  11. Skewness of the generalized centrifugal force divergence for a joint normal distribution of strain and vorticity components

    Science.gov (United States)

    Hua, Bach Lien

    1994-09-01

    This note attempts to connect the skewness of the probability distribution function (PDF) of pressure, which is commonly observed in two-dimensional turbulence, to differences in the geometry of the strain and vorticity fields. This paper illustrates analytically the respective roles of strain and vorticity in shaping the PDF of pressure, in the particular case of a joint normal distribution of velocity gradients. The latter assumption is not valid in general in direct numerical simulations (DNS) of two-dimensional turbulence but may apply to geostrophic turbulence in the presence of a differential rotation (β effect). In essence, minus the Laplacian of pressure is the difference of squared strain and vorticity, a quantity which is named the generalized centrifugal force divergence (GCFD). Squared strain and vorticity distributions follow chi-square statistics with unequal numbers of degrees of freedom when one assumes a joint normal distribution of their components. Squared strain has two degrees of freedom and squared vorticity only one, thereby causing a skewness of the PDF of the GCFD and hence of pressure.
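
    The skewness mechanism described here is easy to check by Monte Carlo. Under the joint-normal assumption, squared strain is chi-square with two degrees of freedom and squared vorticity with one; the sketch below assumes independent, unit-variance components purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Squared strain: sum of two squared standard-normal components,
# i.e. chi-square with 2 degrees of freedom.
s2 = rng.standard_normal(n) ** 2 + rng.standard_normal(n) ** 2
# Squared vorticity: one squared component, chi-square with 1 dof.
w2 = rng.standard_normal(n) ** 2

# GCFD (up to normalization): squared strain minus squared vorticity.
gcfd = s2 - w2

d = gcfd - gcfd.mean()
skew = (d ** 3).mean() / gcfd.std() ** 3
print(f"mean = {gcfd.mean():.3f}, skewness = {skew:.3f}")
```

    For these unit-variance components the theoretical skewness is 8/6^(3/2) ≈ 0.54, positive because the chi-square with two degrees of freedom has the larger third moment; this mirrors the skewed pressure PDF discussed in the note.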

  12. Analysis of vehicle headway distribution on multi-lane freeway considering car–truck interaction

    Directory of Open Access Journals (Sweden)

    Dewen Kong

    2016-04-01

    The primary objective of this study is to identify the characteristics of vehicle headway on multi-lane freeways under lane management in China, considering car–truck interaction. More specifically, the study focused on answering the following two questions: (1) whether the car–truck interaction has an impact on headway, that is, whether headway varies by different leading and following vehicle types; and (2) what the best-fitted distribution model is for a particular headway type under lane management. The team collected traffic data, including traffic flow rates, percentages of trucks, speeds, headways, and so on, from four segments of the Shanghai–Nanjing freeway, Jiangsu Province, China. Then, statistical methods were used to analyze the vehicle headways. It was found that car–car, car/truck, and truck–truck headways are significantly different from each other. Also, the traffic flow rate, percentage of trucks, and lane position were found, through these tests, to have an influence on vehicle headway. Using maximum-likelihood estimation, the Kolmogorov–Smirnov test, and chi-square test techniques, the distribution models and parameter functions for each headway type were built and validated. The results showed that the lognormal model is suitable for the car–car and truck–truck headway types, and the inverse Gaussian model fits the car/truck headway type well.
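
    The fitting pipeline described (maximum-likelihood estimation followed by a Kolmogorov–Smirnov check) can be sketched with synthetic headways; the lognormal shape and scale below are illustrative values, not the study's fitted parameters:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic car-car headways in seconds; shape/scale are illustrative.
headways = stats.lognorm.rvs(s=0.6, scale=2.5, size=2000, random_state=rng)

# Maximum-likelihood fit, with the location parameter pinned at zero.
shape, loc, scale = stats.lognorm.fit(headways, floc=0)

# Kolmogorov-Smirnov goodness-of-fit check against the fitted model.
ks_stat, p_value = stats.kstest(headways, "lognorm", args=(shape, loc, scale))
print(f"shape = {shape:.3f}, scale = {scale:.3f}, KS D = {ks_stat:.3f}")
```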

  13. Comparison of the Porter-Thomas distribution with neutron resonance data of even-even nuclei

    International Nuclear Information System (INIS)

    Camarda, H.S.

    1994-01-01

    The low-energy neutron resonance data of the even-even nuclei ¹⁵²Sm, ¹⁵⁸Gd, ¹⁶²Dy, ¹⁶⁶,¹⁶⁸Er, ¹⁸²W, ²³²Th, and ²³⁶,²³⁸U have been examined in order to test the validity of the Porter-Thomas distribution of the reduced neutron widths---a chi-squared distribution with one degree of freedom (ν = 1). In an attempt to circumvent the ever-present problems of missed or spurious s wave levels as well as extra p wave levels, a maximum likelihood statistic was employed which used only measured widths greater than some minimum value. A Bayes-theory test applied to the data helped to ensure that p wave contamination of the s wave level population was not significant. The error-weighted value of the number of degrees of freedom for the nine nuclei studied, ⟨ν⟩ = 0.98 ± 0.10, is consistent with the theoretical expectation of ν = 1.
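
    The thresholded maximum-likelihood idea can be sketched numerically: simulate reduced widths as squared Gaussian amplitudes (Porter-Thomas, ν = 1), discard widths below a minimum value to mimic missed weak levels, and recover ν by maximizing the truncated chi-squared likelihood. A simplified version in which the mean width is held fixed at its true value of 1 (the real analysis must also handle the unknown mean):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)

# Porter-Thomas widths: squared Gaussian amplitudes, chi-square, nu = 1.
widths = rng.standard_normal(5000) ** 2          # mean reduced width = 1

# Mimic missed weak levels: keep only widths above a minimum value.
w_min = 0.1
observed = widths[widths > w_min]

def neg_log_like(nu):
    # Chi-square with nu dof, scaled to unit mean, truncated at w_min.
    dist = stats.chi2(df=nu, scale=1.0 / nu)
    return -(dist.logpdf(observed) - np.log(dist.sf(w_min))).sum()

res = optimize.minimize_scalar(neg_log_like, bounds=(0.2, 3.0),
                               method="bounded")
print(f"estimated nu = {res.x:.2f}")
```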

  14. Extended Distributed State Estimation: A Detection Method against Tolerable False Data Injection Attacks in Smart Grids

    Directory of Open Access Journals (Sweden)

    Dai Wang

    2014-03-01

    False data injection (FDI) is considered to be one of the most dangerous cyber-attacks in smart grids, as it may lead to energy theft from end users, false dispatch in the distribution process, and device breakdown during power generation. In this paper, a novel kind of FDI attack, named tolerable false data injection (TFDI), is constructed. Such attacks exploit the traditional detector's tolerance of observation errors to bypass traditional bad data detection. Then, a method based on extended distributed state estimation (EDSE) is proposed to detect TFDI in smart grids. The smart grid is decomposed into several subsystems using graph partition algorithms. Each subsystem is extended outward to include the adjacent buses and tie lines, generating the extended subsystem. The chi-square test is applied to detect false data in each extended subsystem. Through decomposition, the false data stands out distinctively from normal observation errors and the detection sensitivity is increased. Extensive TFDI attack cases are simulated on the Institute of Electrical and Electronics Engineers (IEEE) 14-, 39-, 118- and 300-bus systems. Simulation results show that the detection precision of the EDSE-based method is much higher than that of the traditional method, while the proposed method significantly reduces the associated computational costs.
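
    The chi-square residual test underlying this detection scheme can be sketched on a toy weighted least-squares state estimator. The measurement matrix, noise level, and attack magnitude below are illustrative, not taken from the IEEE test systems:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Toy DC state estimation: z = H x + noise, m measurements, n states.
m, n, sigma = 8, 3, 0.01
H = rng.normal(size=(m, n))
x_true = rng.normal(size=n)
z = H @ x_true + rng.normal(scale=sigma, size=m)

def bad_data_detected(z, alpha=0.01):
    # Weighted least-squares estimate and normalized residual sum J;
    # under H0 (no bad data) J follows chi-square with m - n dof.
    x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
    r = (z - H @ x_hat) / sigma
    return r @ r > stats.chi2.ppf(1.0 - alpha, df=m - n)

z_bad = z.copy()
z_bad[0] += 0.5                     # inject false data (50 sigma)
print("clean flagged:   ", bool(bad_data_detected(z)))
print("attacked flagged:", bool(bad_data_detected(z_bad)))
```

    A TFDI attack, by contrast, is crafted so that the injected error stays inside the detector's tolerance; the EDSE approach counters this by running the same test on smaller extended subsystems, where the residual contribution of the false data is proportionally larger.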

  15. Efficacy of parenting education compared to the standard method in improvement of reading and writing disabilities in children.

    Science.gov (United States)

    Karahmadi, Mojgan; Shakibayee, Fereshteh; Amirian, Hushang; Bagherian-Sararoudi, Reza; Maracy, Mohammad Reza

    2014-01-01

    The present study aimed to evaluate the effect of parenting education on improvement of reading and writing disabilities in children. A randomized controlled trial was done on primary school students with reading and writing disabilities and their mothers. The subjects were divided into three groups with 26 members in each group. The first group (mothers' education group) received six one-hour sessions of the new educational method. The second group (standard group) received 12-15 standard educational sessions for learning disability, and the third group (control group), which consisted of students with learning disability, did not receive any treatment. Research instruments included reading and writing tests and a demographic questionnaire. The three groups were evaluated via pretest and posttests at baseline and after one and three months of educational interventions. Data were analyzed using the chi-square test, t-test, and repeated measures multivariate analysis of variance (MANOVA). The mean reading speed showed the greatest progression in the mothers' education group. The comparison among reading speed, reading accuracy, and spelling scores was statistically significant (F(2, 6) = 90.64; p < 0.05). The mean reading accuracy increased most after the 3-month intervention in the mothers' education group. The control group had the lowest mean reading accuracy scores. Parenting education of mothers had a positive effect on the treatment of children with reading and writing disabilities. None. Clinical Trial Registration-URL: http://www.irct.ir. Unique identifier: IRCT201101205653N1.

  16. Spatial Distribution of Eggs of Alabama argillacea Hübner and Heliothis virescens Fabricius (Lepidoptera: Noctuidae on Bt and non-BtCotton

    Directory of Open Access Journals (Sweden)

    TATIANA R. RODRIGUES

    2015-12-01

    Among the options to control Alabama argillacea (Hübner, 1818) and Heliothis virescens (Fabricius, 1781) on cotton, insecticide spraying and biological control have been extensively used. GM 'Bt' cotton has been introduced as an extremely viable alternative, but it is not yet known how transgenic plants affect populations of organisms that are interrelated in an agroecosystem. For this reason, it is important to know how the spatial arrangements of pests and beneficial insects are affected, which may call for changes in the methods used for sampling these species. This study was conducted with the goal of investigating the pattern of spatial distribution of eggs of A. argillacea and H. virescens in DeltaOpal™ (non-Bt) and DP90B™ (Bt) cotton cultivars. Data were collected during the agricultural year 2006/2007 in two areas of 5,000 m2, located in the district of Nova América, Caarapó municipality. In each sampling area, comprising 100 plots of 50 m2, 15 evaluations were performed on two plants per plot. The sampling consisted of counting the eggs. The aggregation indices (variance/mean ratio, Morisita index and exponent k of the negative binomial distribution) and the chi-square fit of the observed and expected values to the theoretical frequency distributions (Poisson, Positive Binomial and Negative Binomial) showed that in both cultivars the eggs of these species are distributed according to the aggregate distribution model, fitting the pattern of the negative binomial distribution.
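
    The aggregation indices used here are straightforward to compute. A sketch with synthetic per-plot egg counts, where the negative binomial parameters are arbitrary illustrative values rather than the study's data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic egg counts per plot, drawn from an aggregated (negative
# binomial) distribution; n and p are arbitrary illustrative values.
counts = rng.negative_binomial(n=2.0, p=0.4, size=100)

mean = counts.mean()
var = counts.var(ddof=1)

# Variance-to-mean ratio: > 1 suggests an aggregated spatial pattern.
vmr = var / mean

# Moment estimate of the negative binomial exponent k, from
# var = mean + mean^2 / k.
k = mean ** 2 / (var - mean)
print(f"variance/mean = {vmr:.2f}, k = {k:.2f}")
```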

  17. Outcome of tight versus standard glycemic control in coronary artery bypass patients

    International Nuclear Information System (INIS)

    Subhani, H.

    2012-01-01

    Objectives: To compare the outcome of tight versus standard glycemic control and its impact on postoperative morbidity and short-term mortality in patients undergoing Coronary Artery Bypass Grafting (CABG). Patients and Methods: A prospective surveillance of 124 patients undergoing isolated CABG surgery (on pump) was included in the study; 62 patients were randomly assigned to each of the tight and standard glucose control groups. The main exposure was insulin with respect to the level of blood glucose, and the primary outcome measures were sternotomy wound infection, leg wound infection and new myocardial infarction. Surgical site infection was assessed on a daily basis during the patient's stay in the Department of Cardio-thoracic Surgery, Sheikh Zayed Hospital, Lahore, or within 30 days of operation prompting the patient to return to the hospital. The chi-square test was used to identify the significance of various short-term morbidities and mortality. Results: In this study, 12 patients in the standard group and 4 patients in the tightly controlled group developed sternal wound infection (p value 0.046). Similarly, 9 versus 2 patients in the standard and tight groups respectively developed leg wound infection (p value 0.035). A test of proportion was applied and it was found that there was a significant difference in the proportion of infection between the two groups (p value 0.05). However, there were no significant differences in other morbidities and short-term mortality. Conclusion: The study confirmed that tight glucose control postoperatively in CABG patients results in reduced sternal and leg wound infection rates; however, there was no effect on other morbidities and short-term mortality. (author)
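
    The sternal-infection comparison can be re-run from the counts quoted in the abstract (12/62 vs 4/62). Note that the abstract does not specify whether a continuity correction was used, so the p-value below need not reproduce its reported 0.046 exactly:

```python
from scipy.stats import chi2_contingency

# Sternal wound infections from the abstract:
# standard group 12/62 infected, tight-control group 4/62 infected.
table = [[12, 50],
         [4, 58]]

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```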

  18. Determination of the relations governing trends in the standard deviations of the distribution of pollution based on observations on the atmospheric turbulence spectrum and the possibility of laboratory simulation

    International Nuclear Information System (INIS)

    Crabol, B.

    1980-01-01

    Using TAYLOR's calculation, which takes account of the low-pass filter effect of the transfer time on the value of the standard deviation of particle dispersion, we have introduced a high-pass filter which translates the effect of the observation time, by definition finite, onto the true atmospheric scale. It is then possible to identify those conditions under which the relations governing variation of the standard deviations of the pollution distribution are dependent upon the distance of transfer alone, or the time of transfer alone. Then, making certain simplifying assumptions, practical quantitative relationships are deduced for the variation of the horizontal standard deviation of pollution dispersion as a function of wind speed and time of transfer.

  19. The Distribution of Instructional Time and Its Effect on Group Cohesion in the Foreign Language Classroom: A Comparison of Intensive and Standard Format Courses

    Science.gov (United States)

    Hinger, Barbara

    2006-01-01

    This paper argues for the influence of the distribution of instructional time on group cohesion in the foreign language classroom and postulates that concentrating classroom time enhances group cohesion. To test the hypothesis, a comparative classroom study of two groups of Spanish learners in their second year of learning, one following an…

  20. Was I Right? Testing Observations against Predictions in Mendelian Genetics.

    Science.gov (United States)

    Hursch, Thomas Mercer

    1979-01-01

    Two statistical tools, the Chi-square and standard error approaches, are compared for use in Mendelian genetics experiments. Although the Chi-square technique is more often used, the standard error approach is to be preferred for both research investigations and student experiments. (BB)
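
    Both tools can be applied to the same hypothetical F2 counts. For a two-category cross the approaches agree exactly, since the squared z-score equals the chi-square statistic; the counts below are illustrative, not from the article:

```python
from math import sqrt
from scipy.stats import chisquare

# Hypothetical F2 counts against a 3:1 Mendelian prediction.
observed = [290, 110]
expected = [300, 100]

# Chi-square approach.
chi2, p = chisquare(observed, f_exp=expected)
print(f"chi-square: chi2 = {chi2:.3f}, p = {p:.3f}")

# Standard-error approach: z-score of the observed proportion against
# the predicted 0.75, using the binomial standard error.
n = sum(observed)
p_hat = observed[0] / n
z = (p_hat - 0.75) / sqrt(0.75 * 0.25 / n)
print(f"standard error: p_hat = {p_hat:.3f}, z = {z:.3f}")
```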

  1. Prevalence and distribution of permanent canine agenesis in dental paediatric and orthodontic patients in Hungary.

    Science.gov (United States)

    Rózsa, N; Nagy, K; Vajó, Z; Gábris, K; Soós, A; Alberth, M; Tarján, I

    2009-08-01

    Non-syndromic permanent canine agenesis, whether isolated or combined with agenesis (developmental absence) of other tooth types, has occasionally been described in the literature, but isolated forms are rarely observed. The purpose of the present retrospective radiographic study was to provide data on the prevalence and distribution of permanent canine agenesis in the Hungarian population. Dental panoramic tomograms and the medical history data of 4417 children aged 6-18 years (average age 12 years, male-to-female ratio 1:1), who presented for treatment at the Department of Paediatric Dentistry and Orthodontics of the Semmelweis University Budapest, Hungary, were examined. Patients with systemic diseases were excluded. Chi-square and Fisher's tests were performed to determine statistical significance. The overall prevalence of permanent canine agenesis was 0.29 per cent: 0.27 per cent in the maxilla and 0.09 per cent in the mandible. Dental anomalies associated with permanent canine agenesis were found: 11 patients had retention of the primary canines, 10 had other types of agenesis of the permanent teeth, one a primary supernumerary tooth, one a supernumerary cusp, and nine occlusal disturbances.

  2. Study of the Distribution of Malassezia Species in Patients with Pityriasis Versicolor in Kolar Region, Karnataka.

    Science.gov (United States)

    Archana, Banur Raju; Beena, Paravangada Madappa; Kumar, Shiva

    2015-01-01

    Pityriasis versicolor is a superficial, chronically recurring fungal infection caused by Malassezia species. The genus has recently been revised taxonomically into 14 species, of which only 7 have been well studied in relation to pityriasis versicolor. To identify Malassezia species isolated from patients with pityriasis versicolor and to find any correlation between the species and the clinical presentation of lesions. A prospective study comprising 100 clinically diagnosed cases of pityriasis versicolor attending the Dermatology Outpatient Department over a period of 1 year. The clinical specimens were collected under aseptic precautions and subjected to culture on Sabouraud's Dextrose Agar overlaid with olive oil and on modified Dixon agar. The isolates were identified by biochemical tests. Statistical analysis was done using proportion, mean and the chi-square test. Of the 100 cases, 73% were males and 26% were females, and the predominant age group was 21-30 years. Out of 100 samples, 70 yielded growth. The most common isolate was M. sympodialis (50%), followed by M. furfur (32.86%), M. globosa (14.28%) and M. slooffiae (2.86%). Among the 100 cases, 74% had hypopigmented and 26% had hyperpigmented lesions. M. sympodialis and M. furfur were predominantly isolated from hypopigmented lesions, and M. globosa and M. slooffiae were found to be more common in hyperpigmented lesions. M. sympodialis was the most common isolate, followed by M. furfur, M. globosa and M. slooffiae. There was no significant difference in the distribution of the different species between patients with hypo- or hyperpigmented lesions.

  3. Assessment of naked mole-rat distribution and threats in Eastern Ethiopia

    Directory of Open Access Journals (Sweden)

    Mengistu Wale

    2016-08-01

    Objective: To identify the distribution of, threats to and community attitudes towards the naked mole-rat in Eastern Ethiopia. Methods: Data were collected through direct observation and interviews, and the chi-square test at the 95% confidence level was used for significance testing. Results: The naked mole-rat was identified in Fafan, City/Shinele, Eastern Hararghe Zone and Dire Dawa Administrative. The main threats to the naked mole-rat identified were agricultural expansion, human killing and lack of awareness. Of a total of 100 respondents, 92% considered the naked mole-rat a pest, with the result that 46% of them participated in direct killing. Literacy rate significantly affected the willingness to participate in the conservation of the naked mole-rat (χ2 = 7.478, df = 1, P < 0.05). Of the 26% of respondents who were not willing to participate in conservation, 80.8% were illiterate. Conclusions: The naked mole-rat is fairly common in many of the study sites. However, a rapid shift from a nomadic lifestyle to the cultivation of crops and a lack of awareness are the main threats to the naked mole-rat. Therefore, since there is currently no conservation action, a further comprehensive study is required to design a conservation plan for this species.

  4. A New Stellar Atmosphere Grid and Comparisons with HST /STIS CALSPEC Flux Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Bohlin, Ralph C.; Fleming, Scott W.; Gordon, Karl D.; Koekemoer, Anton M. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Mészáros, Szabolcs; Kovács, József [ELTE Gothard Astrophysical Observatory, H-9700 Szombathely, Szent Imre Herceg St. 112 (Hungary)

    2017-05-01

    The Space Telescope Imaging Spectrograph has measured the spectral energy distributions for several stars of types O, B, A, F, and G. These absolute fluxes from the CALSPEC database are fit with a new spectral grid computed from the ATLAS-APOGEE ATLAS9 model atmosphere database using a chi-square minimization technique in four parameters. The quality of the fits is compared for the complete LTE grids of Castelli and Kurucz (CK04) and our new comprehensive LTE grid (BOSZ). For the cooler stars, fits with the MARCS LTE grid are also evaluated, while the hottest stars are also fit with the NLTE Lanz and Hubeny OB star grids. Unfortunately, these NLTE models do not transition smoothly in the infrared to agree with our new BOSZ LTE grid at the NLTE lower limit of T_eff = 15,000 K. The new BOSZ grid is available via the Space Telescope Institute MAST archive and has a much more finely sampled IR wavelength scale than CK04, which will facilitate the modeling of stars observed by the James Webb Space Telescope. Our result for the angular diameter of Sirius agrees with the ground-based interferometric value.
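
    Grid-based chi-square minimization of this kind can be sketched with a toy blackbody "grid" standing in for the ATLAS9/BOSZ models; the wavelength range, noise level, and grid spacing are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy grid fit: choose the blackbody temperature that minimizes
# chi-square against noisy "observed" fluxes.
wav = np.linspace(0.3e-6, 2.0e-6, 50)            # wavelength, m
h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

def planck(wav, T):
    return (2 * h * c ** 2 / wav ** 5) / np.expm1(h * c / (wav * kB * T))

T_true = 9500.0
sigma = 0.01 * planck(wav, T_true).max()
obs = planck(wav, T_true) + rng.normal(scale=sigma, size=wav.size)

grid = np.arange(5000.0, 15000.0, 250.0)
chi2 = [np.sum(((obs - planck(wav, T)) / sigma) ** 2) for T in grid]
T_best = grid[int(np.argmin(chi2))]
print(f"best-fit T_eff = {T_best:.0f} K")
```

    The real fit varies four parameters simultaneously and interpolates within the model grid, but the principle of scanning models and keeping the minimum chi-square is the same.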

  5. Uncertainties of Predictions from Parton Distribution Functions 1, the Lagrange Multiplier Method

    CERN Document Server

    Stump, D R; Brock, R; Casey, D; Huston, J; Kalk, J; Lai, H L; Tung, W K

    2002-01-01

    We apply the Lagrange Multiplier method to study the uncertainties of physical predictions due to the uncertainties of parton distribution functions (PDFs), using the cross section for W production at a hadron collider as an archetypal example. An effective chi-squared function based on the CTEQ global QCD analysis is used to generate a series of PDFs, each of which represents the best fit to the global data for some specified value of the cross section. By analyzing the likelihood of these "alternative hypotheses", using available information on errors from the individual experiments, we estimate that the fractional uncertainty of the cross section due to current experimental input to the PDF analysis is approximately 4% at the Tevatron, and 10% at the LHC. We give sets of PDFs corresponding to these up and down variations of the cross section. We also present similar results on Z production at the colliders. Our method can be applied to any combination of physical variables in precision QCD phenomenology, an...

  6. Particle mobility size spectrometers: harmonization of technical standards and data structure to facilitate high quality long-term observations of atmospheric particle number size distributions

    OpenAIRE

    A. Wiedensohler; W. Birmili; A. Nowak; A. Sonntag; K. Weinhold; M. Merkel; B. Wehner; T. Tuch; S. Pfeifer; M. Fiebig; A. M. Fjäraa; E. Asmi; K. Sellegri; R. Depuy; H. Venzac

    2010-01-01

    Particle mobility size spectrometers often referred to as DMPS (Differential Mobility Particle Sizers) or SMPS (Scanning Mobility Particle Sizers) have found a wide application in atmospheric aerosol research. However, comparability of measurements conducted world-wide is hampered by lack of generally accepted technical standards with respect to the instrumental set-up, measurement mode, data evaluation as well as quality control. This article results from several instrument intercomp...

  7. [Geospatial distribution and detection of dengue virus in Aedes (Stegomyia) aegypti mosquitos in Ciudad Juárez, Chihuahua, Mexico].

    Science.gov (United States)

    de la Mora-Covarrubias, Antonio; Jiménez-Vega, Florinda; Treviño-Aguilar, Sandra Maritza

    2010-01-01

    To determine the distribution of Aedes aegypti in Ciudad Juárez, Mexico and evaluate it as a carrier of the dengue virus. Mosquitoes were collected using CDC minitraps. Nearest-neighbor and K-function analyses were used as geospatial tools. The chi-square test was utilized to evaluate the association between the presence of the vector and sociodemographic variables. Evidence of infection was detected by RT-PCR. A total of 122 female mosquitoes were captured. A tendency towards clustered distribution (R = -1.2995, p = 0.05) of the mosquito was shown up to 4000 m, but none of the sociodemographic variables showed significant associations. Seven of the pools tested were positive for DEN-2, ten were positive for DEN-3, and seven for both serotypes. This is the first report on the presence of Aedes aegypti mosquitoes infected with dengue in the region, which will enable the promotion of monitoring in order to reduce the probability of occurrence of the disease among the border population.

  8. Allele frequency distribution of 1691G >A F5 (which confers Factor V Leiden) across Europe, including Slavic populations.

    Science.gov (United States)

    Clark, Jeremy S C; Adler, Grażyna; Salkic, Nermin N; Ciechanowicz, Andrzej

    2013-11-01

    The allele 1691A F5, conferring Factor V Leiden, is a common risk factor for venous thromboembolism. The frequency distribution of this allele in Western Europe has been well documented; here, data from Central, Eastern and South-Eastern Europe are also included. In order to assess the significance of the collated data, a chi-squared test was applied, and Tukey tests and z-tests with Bonferroni correction were compared. A distribution with a North-Southeast band of high frequency of the 1691A F5 allele was discovered, with a pocket including some Southern Slavic populations with low frequency. European countries/regions can be arbitrarily delimited into low (group 1, <2.8 %, mean 1.9 % 1691A F5 allele) or high (group 2, ≥2.8 %, mean 4.0 %) frequency groups, with many significant differences between groups but only one intra-group difference (the Tukey test is suggested to be superior to the z-tests). In Europe a North-Southeast band of 1691A F5 high frequency has been found, clarified by the inclusion of data from Central, Eastern and South-Eastern Europe, which surrounds a pocket of low frequency in the Balkans that could possibly be explained by Slavic migration. There seem to be no indications of variation in environmental selection due to geographical location.
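
    A pairwise z-test between two allele-frequency groups, of the kind compared here with the Tukey test, can be sketched as follows; the allele counts are hypothetical, chosen only to mimic the reported group means of 1.9% and 4.0%:

```python
from math import sqrt
from scipy.stats import norm

def two_prop_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test on allele counts."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)              # pooled allele frequency
    se = sqrt(p * (1.0 - p) * (1.0 / n1 + 1.0 / n2))
    z = (p1 - p2) / se
    return z, 2.0 * norm.sf(abs(z))

# Hypothetical allele counts mimicking the group means (4.0% vs 1.9%).
z, p = two_prop_z(40, 1000, 19, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
# With k pairwise comparisons, compare p against a Bonferroni-adjusted
# threshold alpha / k rather than alpha itself.
```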

  9. ENSURING THERMAL REGIME FOR THE SUPPLY DISTRIBUTED DEVICES IN THE COMPOSITION OF THE SHIP'S SECONDARY POWER SUPPLY SYSTEMS ON THE BASE OF THE STANDARDIZED UNITS

    Directory of Open Access Journals (Sweden)

    T. A. Ismailov

    2016-01-01

    Aim. The article deals with the problem of constructing power supply devices within the ship's secondary power systems from standardized blocks and of securing their thermal regime. Methods. It is noted that with the advent of modern multifunctional power electronics components, developers of secondary power supplies have gained the means to improve the quality of secondary power supply and to upgrade existing systems. Results. The advantages of unified power units capable of parallel operation are outlined. Heat-transfer processes in a vertical channel under free convection are analyzed, together with the calculation of the minimum channel width that provides efficient heat removal. Conclusion. A model is proposed for determining the minimum distance between the blocks without deterioration of heat transfer in the channel formed by the walls of adjacent blocks.

  10. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  11. Download this PDF file

    African Journals Online (AJOL)

    Dr Kazungu

    importance of the sector in overall GDP, export earnings and employment as well as its forward and backward … and fertilizer, and finds that farmers with low education and land holdings are less likely to adopt improved … approximately a chi-square or a mixed chi-square distribution with degrees of freedom equal to.

  12. Distribution of motor unit potential velocities in short static and prolonged dynamic contractions at low forces: use of the within-subject's skewness and standard deviation variables.

    Science.gov (United States)

    Klaver-Król, E G; Henriquez, N R; Oosterloo, S J; Klaver, P; Bos, J M; Zwarts, M J

    2007-11-01

    Behaviour of motor unit potential (MUP) velocities in relation to (low) force and duration was investigated in biceps brachii muscle using a surface electrode array. Short static tests of 3.8 s (41 subjects) and prolonged dynamic tests (prolonged tests) of 4 min (30 subjects) were performed as position tasks, applying forces up to 20% of maximal voluntary contraction (MVC). Four variables, derived from the inter-peak latency technique, were used to describe changes in the surface electromyography signal: the mean muscle fibre conduction velocity (CV), the proportion between slow and fast MUPs expressed as the within-subject skewness of MUP velocities, the within-subject standard deviation of MUP velocities [SD-peak velocity (PV)], and the amount of MUPs per second (peak frequency=PF). In short static tests and the initial phase of prolonged tests, larger forces induced an increase of the CV and PF, accompanied with the shift of MUP velocities towards higher values, whereas the SD-PV did not change. During the first 1.5-2 min of the prolonged lower force levels tests (unloaded, and loaded 5 and 10% MVC) the CV and SD-PV slightly decreased and the MUP velocities shifted towards lower values; then the three variables stabilized. The PF values did not change in these tests. However, during the prolonged higher force (20% MVC) test, the CV decreased and MUP velocities shifted towards lower values without stabilization, while the SD-PV broadened and the PF decreased progressively. It is argued that these combined results reflect changes in both neural regulatory strategies and muscle membrane state.

  14. Spatial distribution of scorpions according to the socioeconomic conditions in Campina Grande, State of Paraíba, Brazil

    Directory of Open Access Journals (Sweden)

    Thassiany Sarmento Oliveira de Almeida

    Full Text Available Abstract: INTRODUCTION: Due to their frequency and morbidity, envenomations such as those caused by scorpions have achieved public health importance in certain regions of the world. The present exploratory ecological study aimed to characterize the epidemiological profile and spatial distribution of scorpion stings in Campina Grande, State of Paraíba in Northeastern Brazil. METHODS: Geographical information system techniques were used to record the scorpion stings, and Google Earth software, Track Maker, and ArcGIS 10 Esri were used as geocoding databases. The Moran test was used to evaluate spatial correlation, and the Pearson chi-square test was used to analyze associations between scorpion stings and socioeconomic variables. RESULTS: The study evaluated 1,466 scorpion stings. Envenomations were more frequent among women (n = 908, 61.9%), and most patients were aged 13-28 years (n = 428, 29.2%). The Southern region of the city had the largest number of registered cases (n = 548, 37.4%), followed by the Western region (n = 510, 34.8%). CONCLUSIONS: Spatial analysis of scorpionism revealed an irregular occurrence in Campina Grande. Further, no association was observed between the socioeconomic factors analyzed and the geographic location of the scorpion envenomations. Detection of spatial areas with an increased risk of scorpionism can help prioritize adoption of preventive measures in these regions to reduce the associated incidence and morbidity.
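
The Pearson chi-square test of independence used in the study above can be sketched as follows. The contingency counts, the `chi_square_independence` helper, and the sex-by-region cross-tabulation are invented for illustration; they are not the study's data.

```python
import numpy as np

def chi_square_independence(table):
    """Pearson chi-square test of independence on an r x c contingency table.

    Expected counts are derived from the row and column margins; returns the
    chi-square statistic and its degrees of freedom.
    """
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)     # row margins
    col = table.sum(axis=0, keepdims=True)     # column margins
    expected = row @ col / table.sum()         # counts under independence
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, dof

# Hypothetical 2x2 table: stings cross-tabulated by patient sex (rows)
# and city region (columns).
stat, dof = chi_square_independence([[300, 250], [200, 250]])
print(round(stat, 3), dof)
```

The statistic would then be compared against a chi-square distribution with `dof` degrees of freedom to decide whether sting counts and the grouping variable are associated.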

  15. Analysis of HBV genotype distribution and its association with liver cirrhosis in Xinjiang Uygur Autonomous Region, China

    Directory of Open Access Journals (Sweden)

    WANG Xiaozhong

    2014-12-01

    Full Text Available Objective: To investigate the distribution of hepatitis B virus (HBV) genotypes among patients in Xinjiang Uygur Autonomous Region, China, and to explore its association with liver cirrhosis. Methods: HBV genotypes of 1018 hepatitis B patients were determined by PCR analysis. The relationship of HBV genotype with clinical outcomes and relevant chronic liver diseases was assessed by contingency chi-square test, Kruskal-Wallis test, and multivariate unconditional logistic regression analysis. Results: Among the 828 patients whose HBV genotyping was completed in this study, type C was the major genotype, accounting for 54.11% (448/828); 25.15% (200/828) had type B, and 16.18% (134/828) had type D. Among the 116 patients with liver cirrhosis, 20.84% had type C, which was significantly more frequent than the other genotypes (P<0.00. The multivariate unconditional logistic regression model identified several risk factors for liver cirrhosis, including duration of hepatitis B ≥10 years, genotype C, high HBV DNA viral load, and impaired liver function characterized by an abnormal alanine aminotransferase test. Among all these factors, genotype C had the strongest association with liver cirrhosis (OR=2819. Conclusion: The leading genotype of HBV in Xinjiang Uygur Autonomous Region is type C, followed by type B and type D. Genotype C is an independent risk factor for HBV-related liver cirrhosis.

  16. Distributed creativity

    DEFF Research Database (Denmark)

    Glaveanu, Vlad Petre

    This book challenges the standard view that creativity comes only from within an individual by arguing that creativity also exists ‘outside’ of the mind or, more precisely, that the human mind extends through the means of action into the world. The notion of ‘distributed creativity’ is not commonly used within the literature and yet it has the potential to revolutionise the way we think about creativity, from how we define and measure it to what we can practically do to foster and develop creativity. Drawing on cultural psychology, ecological psychology and advances in cognitive science, this book offers a basic framework for the study of distributed creativity that considers three main dimensions of creative work: sociality, materiality and temporality. Starting from the premise that creativity is distributed between people, between people and objects and across time, the book reviews...

  17. 'Gold standard', not 'golden standard'

    NARCIS (Netherlands)

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same

  18. The diverse distribution of risk factors between breast cancer subtypes of ER, PR and HER2: a 10-year retrospective multi-center study in China.

    Directory of Open Access Journals (Sweden)

    Qingkun Song

    Full Text Available INTRODUCTION: Hormone receptors, human epidermal growth factor receptor 2 and some risk factors determine therapies and prognosis of breast cancer. These risk factors are distributed differently among patients according to receptor status. This study aimed to investigate the distribution of risk factors between subtypes of breast cancer defined by the 3 receptors in Chinese native women, with a large sample size. METHODS: The multi-center study analyzed 4211 patient medical records from 1999 to 2008 in 7 regions of China. Data on patients' demographic information, risk factors (menopausal status, parity, body mass index) and receptor statuses were extracted. Breast cancer subtypes included ER(+/-), PR(+/-), HER2(+/-), 4 ER/PR and 4 molecular subtypes. Wilcoxon and chi-square tests were used to estimate the difference. The unconditional logistic regression model was used for analysis, and p-values after Bonferroni correction are presented in the results. RESULTS: Compared to patients with negative progesterone receptor, the positive patients were younger at diagnosis, and reported less likely in postmenopausal status and lower parity (p1 parity (OR = 1.36 (p1 parity (OR = 1.19 (p20% (p<0.05. CONCLUSION: In this study, it was found that Chinese female patients did have statistically significant differences of age, menopausal status, parity and body mass index between breast cancer subtypes. Studies are warranted to further investigate the risk factors between subtypes, which would be meaningful for prevention and treatment among Chinese females.

  19. Distributed intelligence in CAMAC

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1977-01-01

    A simple extension of the CAMAC standard is described which allows distributed intelligence at the crate level. By distributed intelligence is meant that there is more than one source of control in a system. This standard is just now emerging from the NIM Dataway Working Group and its European counterpart. 1 figure

  20. Distributed intelligence in CAMAC

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1977-01-01

    The CAMAC digital interface standard has served us well since 1969. During this time there have been enormous advances in digital electronics. In particular, low cost microprocessors now make it feasible to consider use of distributed intelligence even in simple data acquisition systems. This paper describes a simple extension of the CAMAC standard which allows distributed intelligence at the crate level

  1. Charged-particle thermonuclear reaction rates: I. Monte Carlo method and statistical distributions

    International Nuclear Information System (INIS)

    Longland, R.; Iliadis, C.; Champagne, A.E.; Newton, J.R.; Ugalde, C.; Coc, A.; Fitzgerald, R.

    2010-01-01

    A method based on Monte Carlo techniques is presented for evaluating thermonuclear reaction rates. We begin by reviewing commonly applied procedures and point out that reaction rates that have been reported up to now in the literature have no rigorous statistical meaning. Subsequently, we associate each nuclear physics quantity entering in the calculation of reaction rates with a specific probability density function, including Gaussian, lognormal and chi-squared distributions. Based on these probability density functions the total reaction rate is randomly sampled many times until the required statistical precision is achieved. This procedure results in a median (Monte Carlo) rate which agrees under certain conditions with the commonly reported recommended 'classical' rate. In addition, we present at each temperature a low rate and a high rate, corresponding to the 0.16 and 0.84 quantiles of the cumulative reaction rate distribution. These quantities are in general different from the statistically meaningless 'minimum' (or 'lower limit') and 'maximum' (or 'upper limit') reaction rates which are commonly reported. Furthermore, we approximate the output reaction rate probability density function by a lognormal distribution and present, at each temperature, the lognormal parameters μ and σ. The values of these quantities will be crucial for future Monte Carlo nucleosynthesis studies. Our new reaction rates, appropriate for bare nuclei in the laboratory, are tabulated in the second paper of this issue (Paper II). The nuclear physics input used to derive our reaction rates is presented in the third paper of this issue (Paper III). In the fourth paper of this issue (Paper IV) we compare our new reaction rates to previous results.
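
The Monte Carlo scheme the abstract describes can be sketched roughly as follows. The input medians, the factor uncertainty of ~1.2, and the additive combination of two contributions are illustrative assumptions, not values from the evaluation; only the overall procedure (sample each input from its PDF, summarize the output by its median and 0.16/0.84 quantiles, then fit lognormal parameters μ and σ) follows the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: two contributions to a reaction rate (arbitrary units),
# each assigned a lognormal PDF with the given median and a factor
# uncertainty of about 1.2.
medians = np.array([3.0e-4, 1.1e-3])
sigma_in = np.log(1.2)

# Sample every input from its lognormal PDF and sum the contributions,
# repeating many times for statistical precision.
n = 100_000
samples = medians * np.exp(sigma_in * rng.standard_normal((n, 2)))
rates = samples.sum(axis=1)

# Low, recommended (median) and high rates: the 0.16, 0.50 and 0.84
# quantiles of the cumulative rate distribution.
low, med, high = np.quantile(rates, [0.16, 0.50, 0.84])

# Approximate the output rate PDF by a lognormal with parameters mu, sigma.
mu, sigma = np.log(rates).mean(), np.log(rates).std()
print(f"low={low:.3e} median={med:.3e} high={high:.3e}")
```

In a real evaluation each nuclear input (resonance energy, strength, S-factor, etc.) would carry its own Gaussian, lognormal or chi-squared PDF, and the sampling would be repeated at each temperature.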

  2. Pattern and Distribution of Colorectal Cancer in Tanzania: A Retrospective Chart Audit at Two National Hospitals

    Directory of Open Access Journals (Sweden)

    Leonard K. Katalambula

    2016-01-01

    Full Text Available Background. Colorectal cancer (CRC) is a growing public health concern with increasing rates in countries with previously known low incidence. This study determined the pattern and distribution of CRC in Tanzania and identified hot spots in case distribution. Methods. A retrospective chart audit reviewed hospital registers and patient files from two national institutions. Descriptive statistics, chi-square (χ2) tests, and regression analyses were employed and augmented by data visualization to display risk variable differences. Results. CRC cases increased sixfold in the last decade in Tanzania. There was a 1.5% decrease in incidence levels of rectal cancer and a 2% increase for colon cancer every year from 2005 to 2015. Nearly half of patients listed Dar es Salaam as their primary residence. CRC was equally distributed between males (50.06%) and females (49.94%), although gender likelihood of diagnosis type (i.e., rectal or colon) was significantly different (P=0.027). More than 60% of patients were between 40 and 69 years. Conclusions. Age (P=0.0183) and time (P=0.004) but not gender (P=0.0864) were significantly associated with rectal cancer in a retrospective study in Tanzania. Gender (P=0.0405), age (P=0.0015), and time (P=0.0075) were all significantly associated with colon cancer in this study. This retrospective study found that colon cancer is more prevalent among males at a relatively younger age than rectal cancer. Further, our study showed that although more patients were diagnosed with rectal cancer, the trend has shown that colon cancer is increasing at a faster rate.

  3. Pattern and Distribution of Colorectal Cancer in Tanzania: A Retrospective Chart Audit at Two National Hospitals.

    Science.gov (United States)

    Katalambula, Leonard K; Ntwenya, Julius Edward; Ngoma, Twalib; Buza, Joram; Mpolya, Emmanuel; Mtumwa, Abdallah H; Petrucka, Pammla

    2016-01-01

    Background. Colorectal cancer (CRC) is a growing public health concern with increasing rates in countries with previously known low incidence. This study determined the pattern and distribution of CRC in Tanzania and identified hot spots in case distribution. Methods. A retrospective chart audit reviewed hospital registers and patient files from two national institutions. Descriptive statistics, chi-square (χ2) tests, and regression analyses were employed and augmented by data visualization to display risk variable differences. Results. CRC cases increased sixfold in the last decade in Tanzania. There was a 1.5% decrease in incidence levels of rectal cancer and a 2% increase for colon cancer every year from 2005 to 2015. Nearly half of patients listed Dar es Salaam as their primary residence. CRC was equally distributed between males (50.06%) and females (49.94%), although gender likelihood of diagnosis type (i.e., rectal or colon) was significantly different (P = 0.027). More than 60% of patients were between 40 and 69 years. Conclusions. Age (P = 0.0183) and time (P = 0.004) but not gender (P = 0.0864) were significantly associated with rectal cancer in a retrospective study in Tanzania. Gender (P = 0.0405), age (P = 0.0015), and time (P = 0.0075) were all significantly associated with colon cancer in this study. This retrospective study found that colon cancer is more prevalent among males at a relatively younger age than rectal cancer. Further, our study showed that although more patients were diagnosed with rectal cancer, the trend has shown that colon cancer is increasing at a faster rate.

  4. Medical ethical standards in dermatology: an analytical study of knowledge, attitudes and practices.

    Science.gov (United States)

    Mostafa, W Z; Abdel Hay, R M; El Lawindi, M I

    2015-01-01

    Dermatology practice has not been ethically justified at all times. The objective of the study was to find out dermatologists' knowledge about medical ethics, their attitudes towards regulatory measures and their practices, and to study the different factors influencing the knowledge, attitudes and practices of dermatologists. This is a cross-sectional comparative study conducted among 214 dermatologists, from five academic universities and from participants in two conferences. A 54-item structured anonymous questionnaire was designed to describe the demographic characteristics of the study group as well as their knowledge, attitudes and practices regarding medical ethics standards in clinical and research settings. Five scoring indices were estimated regarding knowledge, attitude and practice. Inferential statistics were used to test differences between groups as indicated. The Student's t-test and analysis of variance were carried out for quantitative variables. The chi-squared test was conducted for qualitative variables. The results were considered statistically significant at P < 0.05. Analysis of the possible factors having an impact on the overall scores revealed that the highest knowledge scores were among dermatologists who practice in an academic setting plus an additional place; however, this difference was statistically non-significant (P = 0.060). Female dermatologists showed a higher attitude score compared to males (P = 0.028). The highest significant attitude score (P = 0.019) regarding clinical practice was recorded among those practicing cosmetic dermatology. The different studied groups of dermatologists revealed a significant impact on the attitude score (P = 0.049), and the evidence-practice score (P dermatology research. © 2014 European Academy of Dermatology and Venereology.

  5. Effect of simulated resistance, fleeing, and use of force on standardized field sobriety testing.

    Science.gov (United States)

    Ho, Jeffrey; Dawes, Donald; Nystrom, Paul; Moore, Johanna; Steinberg, Lila; Tilton, Annemarie; Miner, James

    2015-07-01

    When a law enforcement officer (LEO) stops a suspect believed to be operating (a vehicle) while impaired (OWI), the suspect may resist or flee, and the LEO may respond with force. The suspect may then undergo a Standardized Field Sobriety Test (SFST) to gauge impairment. It is not known whether resistance, fleeing, or actions of force can create an inaccurate SFST result. We examined the effect of resistance, fleeing, and force on the SFST. Human volunteers were prospectively randomized to have a SFST before and after one of five scenarios: (1) five-second conducted electrical weapon exposure; (2) 100-yard (91.4 m) sprint; (3) 45-second physical fight; (4) police dog bite with protective gear; and (5) Oleoresin Capsicum spray to the face with eyes shielded. The SFST was administered and graded by a qualified LEO. After the SFST, the volunteer entered their scenario and was then administered another SFST. Data were analyzed using descriptive statistics. SFST performance was compared before and after using chi-square tests. Fifty-seven subjects enrolled. Three received a single-point penalty during one component of the three-component SFST pre-scenario. No subject received a penalty point in any components of the SFST post-scenario (p = 0.08). This is the first human study to examine the effects of physical resistance, flight, and use of force on the SFST result. We did not detect a difference in the performance of subjects taking the SFST before and after exposure to resistance, flight, or a simulated use of force. © Australian Council for Educational Research 2014.

  6. GAIA Service and Standard Assessment

    DEFF Research Database (Denmark)

    Dormann, Claire; Øst, Alexander Gorm

    A delivery from the ACTS-project GAIA. The report validates the GAIA architecture and standard, and provides results concerning the deployment of distributed brokerage systems over broadband networks.

  7. Accounting standards

    NARCIS (Netherlands)

    Stellinga, B.; Mügge, D.

    2014-01-01

    The European and global regulation of accounting standards have witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed

  8. Co-occurrence and distribution of East (L1014S) and West (L1014F) African knock-down resistance in Anopheles gambiae sensu lato population of Tanzania

    Science.gov (United States)

    Kabula, Bilali; Kisinza, William; Tungu, Patrick; Ndege, Chacha; Batengana, Benard; Kollo, Douglas; Malima, Robert; Kafuko, Jessica; Mohamed, Mahdi; Magesa, Stephen

    2014-01-01

    Objective: Insecticide resistance molecular markers can provide sensitive indicators of resistance development in Anopheles vector populations. Assaying these markers is of paramount importance in the resistance monitoring programme. We investigated the presence and distribution of knock-down resistance (kdr) mutations in Anopheles gambiae s.l. in Tanzania. Methods: Indoor-resting Anopheles mosquitoes were collected from 10 sites and tested for insecticide resistance using the standard WHO protocol. Polymerase chain reaction-based molecular diagnostics were used to genotype mosquitoes and detect kdr mutations. Results: The An. gambiae tested were resistant to lambdacyhalothrin in Muheza, Arumeru and Muleba. Of 350 An. gambiae s.l. genotyped, 35% were An. gambiae s.s. and 65% An. arabiensis. L1014S and L1014F mutations were detected in both An. gambiae s.s. and An. arabiensis. The L1014S point mutation was found at an allelic frequency of 4–33%, while L1014F was found at an allelic frequency of 6–41%. The L1014S mutation was strongly associated with An. gambiae s.s. (χ2 = 23.41; P < 0.0001), and L1014F was associated with An. arabiensis (χ2 = 11.21; P = 0.0008). The L1014S allele was significantly associated with lambdacyhalothrin-resistant mosquitoes (P < 0.001). [The record's parallel Spanish translation of the Methods and Results, which duplicated the English text, has been rendered into English above.]

  9. dOFV distributions: a new diagnostic for the adequacy of parameter uncertainty in nonlinear mixed-effects models applied to the bootstrap.

    Science.gov (United States)

    Dosne, Anne-Gaëlle; Niebecker, Ronald; Karlsson, Mats O

    2016-12-01

    Knowledge of the uncertainty in model parameters is essential for decision-making in drug development. Contrary to other aspects of nonlinear mixed-effects models (NLMEM), scrutiny towards assumptions around parameter uncertainty is low, and no diagnostic exists to judge whether the estimated uncertainty is appropriate. This work aims at introducing a diagnostic capable of assessing the appropriateness of a given parameter uncertainty distribution. The new diagnostic was applied to case bootstrap examples in order to investigate for which dataset sizes the case bootstrap is appropriate for NLMEM. The proposed diagnostic is a plot comparing the distribution of differences in objective function values (dOFV) of the proposed uncertainty distribution to a theoretical chi-square distribution with degrees of freedom equal to the number of estimated model parameters. The uncertainty distribution was deemed appropriate if its dOFV distribution was overlaid with or below the theoretical distribution. The diagnostic was applied to the bootstrap of two real data and two simulated data examples, featuring pharmacokinetic and pharmacodynamic models and datasets of 20-200 individuals with between 2 and 5 observations on average per individual. In the real data examples, the diagnostic indicated that the case bootstrap was unsuitable for NLMEM analyses with around 70 individuals. A measure of parameter-specific "effective" sample size was proposed as a potentially better indicator of bootstrap adequacy than overall sample size. In the simulation examples, bootstrap confidence intervals were shown to underestimate inter-individual variability at low sample sizes. The proposed diagnostic proved a relevant tool for assessing the appropriateness of a given parameter uncertainty distribution and as such it should be routinely used.
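
The quantile comparison behind the proposed diagnostic can be sketched as follows. The number of parameters, the number of bootstrap samples, and the simulated dOFV values are all hypothetical, and a large simulated chi-square sample stands in for the theoretical reference distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n_params = 5            # hypothetical number of estimated model parameters

# Hypothetical dOFV values for 200 bootstrap parameter vectors. Here the
# uncertainty distribution is adequate by construction, so the dOFVs are
# drawn from a chi-square distribution with n_params degrees of freedom.
dofv = rng.chisquare(df=n_params, size=200)

# Theoretical reference: a large chi-square sample with the same degrees of
# freedom (avoids needing scipy for exact chi-square quantiles).
ref = rng.chisquare(df=n_params, size=1_000_000)

# Compare empirical dOFV quantiles to the theoretical ones, as in the
# diagnostic plot described in the abstract.
probs = np.linspace(0.05, 0.95, 19)
emp_q = np.quantile(dofv, probs)
theo_q = np.quantile(ref, probs)

# Adequacy rule: the uncertainty distribution is deemed appropriate if its
# dOFV distribution is overlaid with or below the theoretical curve, i.e.
# emp_q should not sit clearly above theo_q.
excess = np.max(emp_q - theo_q)
print(f"largest quantile excess over chi-square({n_params}): {excess:.2f}")
```

In practice `dofv` would come from re-evaluating the objective function at each bootstrap parameter vector rather than from simulation.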

  10. The paradoxical distribution of a shallow-rooted keystone species away from surface water, near the water-limited edge of its range in the Sonoran Desert: Seed-seedling conflicts

    Science.gov (United States)

    Drezner, Taly Dawn

    2013-02-01

    Species distributions reflect limiting factors, particularly near the margins of their range where density and abundance decrease as environmental factors decrease or increase to non-optimal conditions. I test whether the keystone saguaro cactus (Carnegiea gigantea), a shallow-rooted species, is indeed distributed disproportionately in areas of concentrated drainage (runnels) in a water-limited population. Carnegiea and a common nurse were sampled at a marginal site in and out of areas with concentrated surface water, and chi-square analysis was used to determine the pattern of distribution. In this study I found that, surprisingly, near the hot, water-limited edge of their range, C. gigantea are found significantly less often in areas where more water would be available to them. For example, while only 20% of nurses were on interfluves, half of the Carnegiea protégés were there. One possible explanation is that the subsequent redistribution of seeds by water away from preferred microsites may be important in shaping the final pattern of successful establishment. The shallow-rooted Carnegiea relies entirely on surface water for its moisture; it is thus paradoxical that the surface water so fundamentally essential to its survival throughout its life appears to hinder its establishment in precisely those sites where the greatest surface water would be available, particularly near the water-limited edge of its range.

  11. Relationship between cement distribution pattern and new compression fracture after percutaneous vertebroplasty.

    Science.gov (United States)

    Tanigawa, Noboru; Komemushi, Atsushi; Kariya, Shuji; Kojima, Hiroyuki; Shomura, Yuzo; Omura, Naoto; Sawada, Satoshi

    2007-12-01

    The objective of this study was to prospectively investigate relationships between cement distribution patterns and the occurrence rates of new compression fractures after percutaneous vertebroplasty. Percutaneous vertebroplasty was performed for osteoporotic compression fractures in 76 consecutive patients. Patients were divided into two groups according to the cement filling pattern shown on radiography and CT: cleft pattern group (group C, n = 34), compact and solid cement filling pattern in vertebrae; and trabecular pattern group (group T, n = 42), sponge-like filling pattern. A visual analog scale (VAS) was used to assess pain severity, and anterior and lateral radiographs of the thoracic and lumbar vertebrae were obtained 1-3 days and 1, 4, 10, 22, and 34 months after percutaneous vertebroplasty. Differences in treatment efficacy and the occurrence rates of new compression fractures were examined and compared for both groups using the Mann-Whitney U test and chi-square test. A significant difference was seen between groups with respect to the volume of cement injected per vertebra (mean volume: group C, 4.5 mL; group T, 3.7 mL; p = 0.01). VAS improvement did not differ significantly between group C (4.6) and group T (4.5). The mean follow-up period was 19.5 months, during which new compression fractures were significantly more frequent in group C (17 of 34 [50%]) than in group T (11 of 42 [26.2%]; p = 0.03). Although cement distribution patterns do not significantly affect initial clinical response, a higher incidence of new compression fractures is seen in patients with treated vertebrae exhibiting a cleft pattern.

  12. Wireless installation standard

    International Nuclear Information System (INIS)

    Lim, Hwang Bin

    2007-12-01

    This standard is divided into six parts: the radio regulation law on securing radio resources; the use of radio resources; the protection of radio resources; the radio regulation enforcement ordinance on the securing, distribution and assignment of radio resources; the radio regulation enforcement regulation on the utility of radio resources and technical qualification examinations; and wireless installation regulations covering technique standards and safety facility standards, together with radio regulations such as the certification regulation for information and communication equipment and the regulation of radio stations (compliance with signal security, radio equipment in radio stations, standard frequency stations and emergency communication).

  13. Re-evaluation of pulmonary titanium dioxide nanoparticle distribution using the "relative deposition index": Evidence for clearance through microvasculature

    Directory of Open Access Journals (Sweden)

    Gehr Peter

    2007-08-01

    Full Text Available Abstract Background Translocation of nanoparticles (NP) from the pulmonary airways into other pulmonary compartments or the systemic circulation is controversially discussed in the literature. In a previous study it was shown that titanium dioxide (TiO2) NP were "distributed in four lung compartments (air-filled spaces, epithelium/endothelium, connective tissue, capillary lumen) in correlation with compartment size". It was concluded that particles can move freely between these tissue compartments. To analyze whether the distribution of TiO2 NP in the lungs is really random or shows a preferential targeting we applied a newly developed method for comparing NP distributions. Methods Rat lungs exposed to an aerosol containing TiO2 NP were prepared for light and electron microscopy at 1 h and at 24 h after exposure. Numbers of TiO2 NP associated with each compartment were counted using energy filtering transmission electron microscopy. Compartment size was estimated by unbiased stereology from systematically sampled light micrographs. Numbers of particles were related to compartment size using a relative deposition index and chi-squared analysis. Results Nanoparticle distribution within the four compartments was not random at 1 h or at 24 h after exposure. At 1 h the connective tissue was the preferential target of the particles. At 24 h the NP were preferentially located in the capillary lumen. Conclusion We conclude that TiO2 NP do not move freely between pulmonary tissue compartments, although they can pass from one compartment to another with relative ease. The residence time of NP in each tissue compartment of the respiratory system depends on the compartment and the time after exposure. It is suggested that a small fraction of TiO2 NP are rapidly transported from the airway lumen to the connective tissue and subsequently released into the systemic circulation.
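
Relating particle counts to compartment size via a relative deposition index (RDI) and a chi-squared statistic can be sketched as below. The particle counts and compartment size fractions are invented for illustration, not the study's measurements; RDI is computed as observed count divided by the count expected if deposition were proportional to compartment size.

```python
import numpy as np

# Hypothetical counts of particles observed in four lung compartments and
# the relative size (volume fraction) of each compartment.
compartments = ["air-filled spaces", "epithelium/endothelium",
                "connective tissue", "capillary lumen"]
observed = np.array([120, 40, 95, 45])            # particle counts
size_fraction = np.array([0.55, 0.15, 0.20, 0.10])

expected = observed.sum() * size_fraction         # counts under random deposition
rdi = observed / expected                         # relative deposition index
chi2 = ((observed - expected) ** 2 / expected).sum()
dof = len(observed) - 1

# RDI > 1 marks compartments holding more particles than their size predicts.
for name, r in zip(compartments, rdi):
    print(f"{name}: RDI = {r:.2f}")
print(f"chi2 = {chi2:.1f} on {dof} df")
```

A chi-squared statistic well above the critical value for `dof` degrees of freedom would indicate, as in the study, that the distribution across compartments is not random.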

  14. Communications standards

    CERN Document Server

    Stokes, A V

    1986-01-01

    Communications Standards deals with the standardization of computer communication networks. This book examines the types of local area networks (LANs) that have been developed and looks at some of the relevant protocols in more detail. The work of Project 802 is briefly discussed, along with a protocol which has developed from one of the LAN standards and is now a de facto standard in one particular area, namely the Manufacturing Automation Protocol (MAP). Factors that affect the usage of networks, such as network management and security, are also considered. This book is divided into three se

  15. A comparison of three methods of assessing differential item functioning (DIF) in the Hospital Anxiety Depression Scale: ordinal logistic regression, Rasch analysis and the Mantel chi-square procedure.

    Science.gov (United States)

    Cameron, Isobel M; Scott, Neil W; Adler, Mats; Reid, Ian C

    2014-12-01

    It is important for clinical practice and research that measurement scales of well-being and quality of life exhibit only minimal differential item functioning (DIF). DIF occurs where different groups of people endorse items in a scale to different extents after being matched by the intended scale attribute. We investigate the equivalence or otherwise of common methods of assessing DIF. Three methods of measuring age- and sex-related DIF (ordinal logistic regression, Rasch analysis and the Mantel χ2 procedure) were applied to Hospital Anxiety Depression Scale (HADS) data pertaining to a sample of 1,068 patients consulting primary care practitioners. Three items were flagged by all three approaches as having either age- or sex-related DIF with a consistent direction of effect; a further three items identified did not meet stricter criteria for important DIF using at least one method. When applying strict criteria for significant DIF, ordinal logistic regression was slightly less sensitive. Ordinal logistic regression, Rasch analysis and contingency table methods yielded consistent results when identifying DIF in the HADS depression and HADS anxiety scales. Regardless of the methods applied, investigators should use a combination of statistical significance, magnitude of the DIF effect and investigator judgement when interpreting the results.

  16. Radiographic Study of the Prevalence and Distribution of Hypodontia Associated with Unilateral and Bilateral Cleft Lip and Palate in a Hungarian Population

    Science.gov (United States)

    Berniczei-Roykó, Ádám; Tappe, Jan-Hendrik; Krinner, Axel; Gredes, Tomasz; Végh, András; Gábor, Katona; Linkowska-Świdzińska, Kamila; Botzenhart, Ute Ulrike

    2016-01-01

    Background Cleft defects are one of the most frequent birth-deformities of the orofacial region and they are commonly associated with anomalies of the tooth structure, size, shape, formation, eruption, and tooth number. The aim of our study was to evaluate the prevalence, distribution, and potential association of combined hypodontia in cleft-affected patients with regard to all types of teeth in both jaws in the permanent dentition. Material/Methods This retrospective radiographic analysis included patients with various types of clefts treated orthodontically in the Department of Orofacial Orthopedics and Orthodontics at Heim Pál Children's Hospital, Budapest. There were 150 patients (84 males, 66 females) with non-syndromic unilateral (UCLP; n=120 patients) or bilateral (BCLP; n=30 patients) cleft formation (lip, alveolus and palate) who met the inclusion criteria. Statistical analysis was performed using the chi-square test and Fisher's exact test (significance level ppremolars of the upper and lower jaw. A significant correlation of congenitally missing teeth was observed in left-sided clefts between the upper and lower second premolar in the cleft area. Conclusions Hypodontia inside and outside the cleft area was frequently observed. This should affect the therapy plans, especially if the cleft-sided premolar is also absent. Further comprehensive research including numerous random samples is necessary for better estimating other possible associations. PMID:27767023

  17. A genomic study on distribution of human leukocyte antigen (HLA-A and HLA-B) alleles in Lak population of Iran

    Directory of Open Access Journals (Sweden)

    Farhad Shahsavar

    2017-03-01

    Full Text Available Anthropological studies based on the highly polymorphic gene, human leukocyte antigen (HLA), provide useful information for bone marrow donor registries, forensic medicine, and disease association studies, as well as for infertility treatment, designing peptide vaccines against tumors, and infectious or autoimmune diseases. The aim of this study was to determine HLA-A and HLA-B allele frequencies in 100 unrelated Lak (/lᴂk/) individuals from Lorestan province of Iran. Finally, we compared the results with those previously described in the Iranian population. Commercial HLA-Type kits from the BAG company (Lich, Germany) were used for determination of the HLA-A and HLA-B allele frequencies in genomic DNA, based on polymerase chain reaction with sequence-specific primers (PCR-SSP) assay. The differences between the populations in the distribution of HLA-A and HLA-B alleles were estimated by the chi-squared test with Yates' correction. The most frequent HLA-A alleles were *24 (20%), *02 (18%), *03 (12%) and *11 (10%), and the most frequent HLA-B alleles were *35 (24%), *51 (16%), *18 (6%) and *38 (6%) in the Lak population. HLA-A*66 (1%), *74 (1%) and HLA-B*48 (1%), *55 (1%) were the least frequently observed alleles in the Lak population. Our results based on HLA-A and HLA-B allele frequencies showed that the Lak population possesses the previously reported general features of Iranians, but still with unique characteristics.

  18. Prevalence of enterovirus and hepatitis A virus in bivalve molluscs from Galicia (NW Spain): inadequacy of the EU standards of microbiological quality.

    Science.gov (United States)

    Romalde, J L; Area, E; Sánchez, G; Ribao, C; Torrado, I; Abad, X; Pintó, R M; Barja, J L; Bosch, A

    2002-03-25

    A study of the presence of hepatitis A virus (HAV) and enterovirus (EV) in shellfish from the northwestern coast of Spain, one of the most important mussel producers in the world, was carried out employing dot-blot hybridization and RT-PCR techniques. In addition, bacterial contamination of the samples was evaluated by Escherichia coli (EC) counts, according to the European Union (EU) standards of shellfish microbiological quality. Shellfish samples included raft-cultured and wild mussels, as well as wild clams and cockles. Bacterial counts showed that the majority of samples (40.8%) could be classified as moderately polluted following the EU standards, and therefore should undergo depuration processes. However, differences in bacterial contamination were observed between cultured mussel and wild shellfish. Thus, percentage of clean samples (<230 EC/100 g shellfish) was clearly higher in cultured mussels (49.1%) than in wild mussels (22.8%) or clams and cockles (10.7%). HAV was detected in 27.4% and EV in 43.9% of the samples that were analyzed. Simultaneous detection of both viral types occurred in 14.1% of the samples. Statistical tests of dependence (chi-square test) showed no relationship either between viral and bacterial contamination, or between the presence of HAV and EV. Comparative analysis of hybridization and RT-PCR for viral detection yielded different results depending on the virus type that was studied, RT-PCR being effective for HAV but not for EV detection. The obtained results reinforce once again the inadequacy of bacteriological standards to assess viral contamination and suggest that although virological analysis of shellfish is possible by molecular techniques, interlaboratory standardization and validation studies are needed before the routine use in monitoring shellfish microbiological safety.
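
    The 2x2 dependence test applied above (viral versus bacterial contamination) can be sketched with SciPy's chi-square test of independence. The counts below are invented for illustration only; they are not the study's data.

```python
# Hypothetical 2x2 contingency table:
#   rows    = bacterial class (clean, polluted by EC count)
#   columns = HAV detection (positive, negative)
# Counts are illustrative, not taken from the study.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[12, 33],
                  [18, 50]])

# For 2x2 tables, SciPy applies the Yates continuity correction by default.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
```

A p-value above the chosen significance level, as reported in the abstract, would indicate no detectable relationship between bacterial indicators and viral presence.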

  19. Achieving Standardization

    DEFF Research Database (Denmark)

    Henningsson, Stefan

    2016-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay competitive, national customs and regional economic organizations are seeking to establish a standardized solution for digital reporting of customs data. However, standardization has proven hard to achieve in the socio-technical e-Customs solution. In this chapter, the authors identify and describe what has to be harmonized in order for a global company to perceive e-Customs as standardized. In doing so, they contribute an explanation of the challenges associated with using a standardization mechanism for harmonizing socio-technical information systems.

  20. Achieving Standardization

    DEFF Research Database (Denmark)

    Henningsson, Stefan

    2014-01-01

    International e-Customs is going through a standardization process. Driven by the need to increase control in the trade process to address security challenges stemming from threats of terrorists, diseases, and counterfeit products, and to lower the administrative burdens on traders to stay competitive, national customs and regional economic organizations are seeking to establish a standardized solution for digital reporting of customs data. However, standardization has proven hard to achieve in the socio-technical e-Customs solution. In this chapter, the authors identify and describe what has to be harmonized in order for a global company to perceive e-Customs as standardized. In doing so, they contribute an explanation of the challenges associated with using a standardization mechanism for harmonizing socio-technical information systems.

  1. Training Standardization

    International Nuclear Information System (INIS)

    Agnihotri, Newal

    2003-01-01

    The article describes the benefits of and required process and recommendations for implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the amount of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation.

  2. Standardized precipitation index based on Pearson type III distribution (Índice padronizado de precipitação baseado na distribuição Pearson tipo III)

    Directory of Open Access Journals (Sweden)

    Gabriel Constantino Blain

    2011-06-01

    Full Text Available The initial step in calculating the Standardized Precipitation Index (SPI) is to determine a probability density function (pdf) that describes the precipitation series under analysis. Once this pdf is determined, the cumulative probability of an observed precipitation amount is computed. The inverse normal function is then applied to the cumulative probability. The result is the SPI. This article assessed the changes in final SPI values when computed based on the two-parameter Gamma (Gam) and Pearson Type III (PE3) distributions (SPIGam and SPIPE3, respectively). Monthly rainfall series, available from five weather stations of the State of São Paulo, were chosen for this study. Considering quantitative and qualitative assessments of goodness-of-fit (evaluated at 1-, 3-, and 6-month precipitation totals), the PE3 distribution seems to be a better choice than the Gam distribution for describing the long-term rainfall series of the State of São Paulo. In addition, the number of SPI time series that could be regarded as normally distributed was higher when this drought index was computed from the PE3 distribution. Thus, the use of the Pearson type III distribution within the calculation algorithm of the SPI is recommended in the State of São Paulo.
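
    The three-step SPI algorithm described above (fit a pdf, take the cumulative probability of each observation, apply the inverse normal function) can be sketched with SciPy's pearson3 distribution. The monthly rainfall series below is synthetic, and SciPy's generic maximum-likelihood fit stands in for whatever estimator the authors used.

```python
# Sketch of the SPI computation: fit a Pearson type III pdf, compute the
# cumulative probability, then apply the inverse normal transform.
import numpy as np
from scipy.stats import pearson3, norm

rng = np.random.default_rng(42)
rain = rng.gamma(shape=2.0, scale=40.0, size=360)  # synthetic monthly totals (mm)

# 1) fit the candidate distribution to the precipitation series
params = pearson3.fit(rain)

# 2) cumulative probability of each observed amount under the fitted pdf
cdf = pearson3.cdf(rain, *params)

# 3) inverse normal transform: the result is the SPI
spi = norm.ppf(cdf)
print(spi[:5])
```

Swapping scipy.stats.gamma for pearson3 in the same three steps would yield the SPIGam variant the article compares against.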

  3. Delineation of upper urinary tract segments at MDCT urography in patients with extra-urinary mass lesions: retrospective comparison of standard and low-dose protocols for the excretory phase of imaging

    Energy Technology Data Exchange (ETDEWEB)

    Mueller-Lisse, Ulrike L. [University of Munich, Department of Urology, Munich (Germany); University of Munich Medical School, Department of Urology, Muenchen (Germany); Coppenrath, Eva M.; Meindl, Thomas; Degenhart, Christoph; Scherr, Michael K.; Reiser, Maximilian F.; Mueller-Lisse, Ullrich G. [University of Munich, Department of Radiology, Munich (Germany); Stief, Christian G. [University of Munich, Department of Urology, Munich (Germany)

    2011-02-15

    Excretory-phase CT urography (CTU) may replace excretory urography in patients without urinary tumors. However, radiation exposure is a concern. We retrospectively compared upper urinary tract (UUT) delineation in low-dose and standard CTU. CTU (1-2 phases, 120 kV, 4 × 2.5 mm, pitch 0.875, i.v. non-ionic contrast media, iodine 36 g) was obtained with standard (14 patients, n = 27 UUTs, average 175.6 mAs/slice, average delay 16.8 min) or low-dose (26 patients, n = 86 UUTs, 29 mAs/slice, average delay 19.6 min) protocols. UUT was segmented into intrarenal collecting system (IRCS), upper, middle, and lower ureter (UU, MU, LU). Two independent readers (R1, R2) graded UUT segments as 1-not delineated, 2-partially delineated, 3-completely delineated (noisy margins), 4-completely delineated (clear margins). Chi-square statistics were calculated for partial versus complete delineation and complete delineation (clear margins), respectively. Complete delineation of UUT was similar in standard and low-dose CTU (R1, p > 0.15; R2, p > 0.2). IRCS, UU, and MU were clearly delineated similarly often in standard and low-dose CTU (R1, p > 0.25; R2, p > 0.1). The LU was clearly delineated more often with standard protocols (R1, 18/6 standard, 38/31 low-dose, p > 0.1; R2, 18/6 standard, 21/48 low-dose, p < 0.05). Low-dose CTU sufficiently delineated the course of the UUT and may locate obstruction/dilation, but appears unlikely to find intraluminal LU lesions. (orig.)

  4. Delineation of upper urinary tract segments at MDCT urography in patients with extra-urinary mass lesions: retrospective comparison of standard and low-dose protocols for the excretory phase of imaging.

    Science.gov (United States)

    Mueller-Lisse, Ulrike L; Coppenrath, Eva M; Meindl, Thomas; Degenhart, Christoph; Scherr, Michael K; Stief, Christian G; Reiser, Maximilian F; Mueller-Lisse, Ullrich G

    2011-02-01

    Excretory-phase CT urography (CTU) may replace excretory urography in patients without urinary tumors. However, radiation exposure is a concern. We retrospectively compared upper urinary tract (UUT) delineation in low-dose and standard CTU. CTU (1-2 phases, 120 kV, 4 × 2.5 mm, pitch 0.875, i.v. non-ionic contrast media, iodine 36 g) was obtained with standard (14 patients, n = 27 UUTs, average 175.6 mAs/slice, average delay 16.8 min) or low-dose (26 patients, n = 86 UUTs, 29 mAs/slice, average delay 19.6 min) protocols. UUT was segmented into intrarenal collecting system (IRCS), upper, middle, and lower ureter (UU, MU, LU). Two independent readers (R1, R2) graded UUT segments as 1-not delineated, 2-partially delineated, 3-completely delineated (noisy margins), 4-completely delineated (clear margins). Chi-square statistics were calculated for partial versus complete delineation and complete delineation (clear margins), respectively. Complete delineation of UUT was similar in standard and low-dose CTU (R1, p > 0.15; R2, p > 0.2). IRCS, UU, and MU were clearly delineated similarly often in standard and low-dose CTU (R1, p > 0.25; R2, p > 0.1). The LU was clearly delineated more often with standard protocols (R1, 18/6 standard, 38/31 low-dose, p > 0.1; R2, 18/6 standard, 21/48 low-dose, p < 0.05). Low-dose CTU sufficiently delineated the course of the UUT and may locate obstruction/dilation, but appears unlikely to find intraluminal LU lesions.

  5. Dyke thicknesses follow a Weibull distribution controlled by host-rock strength and magmatic overpressure

    Science.gov (United States)

    Krumbholz, M.; Hieronymus, C.; Burchardt, S.; Troll, V. R.; Tanner, D. C.; Friese, N.

    2012-04-01

    Dykes are the primary transport channels of magma through the crust and form large parts of volcanic edifices and the oceanic crust. Their dimensions are primary parameters that control magma transport rates and therefore influence, e.g. the size of fissure eruptions and crustal growth. Since the mechanics of dyke emplacement are essentially similar and independent of the tectonic setting, dyke properties should generally follow the same statistical laws. The measurement of dyke thicknesses is, of all parameters, least affected by censoring and truncation effects and therefore most accurately accessible. Nevertheless, dyke thicknesses have been ascribed to follow many different statistical distributions, such as negative exponential and power law. We tested large datasets of dyke thicknesses from different tectonic settings (mid-ocean ridge, oceanic intra-plate) for different statistical distributions (log-normal, exponential, power law (with fixed or variable lower cut-off), Rayleigh, Chi-square, and Weibull). For this purpose, we first converted the probability density functions of each dataset to cumulative distribution functions, thus avoiding arbitrariness in bin size. A non-linear, least-squares fit was then used to compute the parameter(s) of the distribution function. The goodness-of-fit was evaluated using three methods: (1) the residual sum of squares, (2) the Kolmogorov-Smirnov statistics, and (3) p-values using 10,000 synthetic datasets. The results show that, in general, dyke thickness is best described by a Weibull distribution. This suggests material strength is a function of the dimensions of included weaknesses (e.g. fractures), following the "weakest link of a chain" principle. Our datasets may be further subdivided according to dyke lithology (magma type) and type (regional dyke vs. inclined sheet), which leads to an increasingly better fit of the Weibull distribution. Weibull is hence the statistical distribution that universally describes dyke thicknesses.
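
    The fitting procedure described above (empirical cumulative distribution function, non-linear least-squares fit, goodness-of-fit checks) can be sketched as follows. The thickness sample is simulated from a Weibull distribution, so a good fit is expected by construction; of the three reported criteria, only the residual sum of squares and the Kolmogorov-Smirnov statistic are computed here.

```python
# Fit a two-parameter Weibull CDF to the empirical CDF of (synthetic)
# dyke thicknesses by non-linear least squares, as described in the abstract.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import kstest

def weibull_cdf(x, k, lam):
    """Two-parameter Weibull cumulative distribution function."""
    return 1.0 - np.exp(-(x / lam) ** k)

rng = np.random.default_rng(0)
thickness = rng.weibull(1.5, size=500) * 2.0   # synthetic thicknesses (m)

# empirical CDF (avoids arbitrariness in bin size, as in the study)
x = np.sort(thickness)
ecdf = np.arange(1, x.size + 1) / x.size

# non-linear least-squares fit of the Weibull CDF parameters
(k_hat, lam_hat), _ = curve_fit(weibull_cdf, x, ecdf,
                                p0=(1.0, 1.0), bounds=(1e-6, np.inf))

# goodness-of-fit: residual sum of squares and Kolmogorov-Smirnov statistic
rss = np.sum((weibull_cdf(x, k_hat, lam_hat) - ecdf) ** 2)
ks = kstest(thickness, lambda v: weibull_cdf(v, k_hat, lam_hat))
print(k_hat, lam_hat, rss, ks.statistic)
```

The third criterion in the study, p-values from 10,000 synthetic datasets, would repeat the fit on samples drawn from the fitted distribution and compare their KS statistics to the observed one.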

  6. Prevalence and geographic distribution of herniated intervertebral disc in Korean 19-year-old male from 2008 to 2009: a study based on Korean conscription -national and geographic prevalence of herniated intervertebral disc in Korean 19YO male-.

    Science.gov (United States)

    Lee, Sang Hun; Oh, Chang Hyun; Yoon, Seung Hwan; Park, Hyeong-chun; Park, Chong Oon

    2013-09-01

    This study aimed to determine the prevalence of herniated intervertebral disc (HIVD) among Korean 19-year-old males in a large national sample and to compare the prevalence across geographic regions based on conscription data. We analyzed the conscription data of 615,508 19-year-old males given an examination for conscription at the nationwide Korean Military Manpower Administration from January 2008 to December 2009. Prevalence was determined by dividing the number of cases by the number of persons enrolled over the 2 years. The analyses included cross-tabulations and nonparametric chi-square tests to compare the prevalence according to geographic region, disc severity, and conscription year. The prevalence of HIVD among 19-year-old males was 0.47%. Seoul had the highest prevalence of HIVD (total HIVD 0.60%; severe HIVD 0.44%). The prevalence of HIVD was lower in Jeollabuk-do and Jeollanam-do (total HIVD 0.25-0.27%; severe HIVD 0.16-0.17%). The annual prevalence of HIVD decreased slightly in 2009, but the geographic distribution did not differ between years. In Korean 19-year-old males, the national prevalence of adolescent HIVD was 0.60%, but a differing geographic distribution was observed. It is quite possible that secondary contributing factor(s) underlie the differing geographic prevalence of HIVD.

  7. STUDY OF THE PATTERN AND DISTRIBUTION OF BRONCHOGENIC CARCINOMA IN COMPUTED TOMOGRAPHY OF CHEST

    Directory of Open Access Journals (Sweden)

    Harsha D. S

    2017-06-01

    Full Text Available BACKGROUND Bronchogenic carcinoma is a leading cause of cancer-related deaths, more than colon cancer, breast cancer, and prostate cancer combined. Chest computed tomography (CT) is widely used for diagnosis, staging, treatment planning, and monitoring. The type and distribution of a lesion on chest CT may give a fair idea of its nature and histology. Aims and Objectives: To study the chest CT patterns of bronchogenic carcinoma and to correlate the patterns with histological cell type. MATERIALS AND METHODS This was a hospital-based retrospective study involving 101 patients aged 35-80 years with histologically diagnosed bronchogenic carcinoma over a period of five years. Chest CT patterns were studied and compared with histology. Statistical analysis was done by chi-square test. RESULTS Mass lesions formed 88.1% of cases (p value 0.0001), which was significant. This was followed by solitary pulmonary nodule (5.9%), consolidation (2.97%), and cavitatory lesion (2.97%). 52% of mass lesions were located in the upper lobes, and this was significant (p value 0.0001). Adenocarcinoma was the most common cell type. There were 6 (5.94%) solitary pulmonary nodules, the majority of which were adenocarcinoma (83.33%). In the 2.97% with cavitating malignancy, all were squamous cell carcinoma. CONCLUSION An upper lobe mass lesion is the most common presentation of bronchogenic carcinoma on chest computed tomography. Solitary pulmonary nodules are commonly located in the upper lobes. Adenocarcinoma is overall the most common cell type. Squamous cell carcinoma is the most common cause of cavitating bronchogenic carcinoma and is more common on the right side.

  8. Detection of Antibiotic Resistant Listeria spp. in Beef Burgers Distributed in Ahvaz City, Iran

    Directory of Open Access Journals (Sweden)

    Maktabi

    2016-03-01

    Full Text Available Background Listeria spp. are able to survive in many foods during frozen storage. One particular species, Listeria monocytogenes, is one of the most important food-borne pathogens globally. The antimicrobial resistance of pathogenic microorganisms is a worldwide public health concern because of increasing global trade and travel. Objectives The aim of this study was to evaluate the occurrence and antibiotic resistance of Listeria spp. in Iranian beef burgers distributed in Ahvaz city. Materials and Methods During a five-month period, 150 frozen burgers were purchased from local markets in Ahvaz city and tested for the presence of Listeria spp. The experimental procedure consisted of one-step enrichment in Listeria enrichment broth, followed by plating on Oxford agar. Suspected colonies were subjected to subsequent biochemical tests and a polymerase chain reaction (PCR) assay. The susceptibility of the isolates to various antibiotics was investigated using the Kirby-Bauer disk diffusion method, and the results were analyzed via the chi-square test and Fisher's exact test using SPSS 16.0 software. Results Out of 150 samples, only two were contaminated with Listeria innocua, and the statistical analysis showed no significant differences in the prevalence of Listeria between companies (P > 0.05). One of the isolates was resistant to tetracycline and the other to co-trimoxazole. Both isolates showed intermediate susceptibility to chloramphenicol; however, they were sensitive to the other tested antibiotics. Conclusions L. innocua is not a pathogen, but the presence of the bacterium could be an indicator of probable contamination with L. monocytogenes. Moreover, there is a potential risk to public health from the consumption of raw or undercooked burgers, which may increase the possibility of the acquisition of antibiotic resistance.

  9. Distribution of Molar Incisor Hypomineralization in Malaysian Children Attending University Dental Clinic.

    Science.gov (United States)

    Hussein, A S; Faisal, M; Haron, M; Ghanim, A M; Abu-Hassan, M I

    2015-01-01

    Molar-Incisor Hypomineralization (MIH) is a condition of hypomineralized enamel of systemic origin affecting first permanent molars and, frequently, permanent incisors. It is considered a global problem, and data from South-East Asian countries, including Malaysia, are lacking. Hence, the aim of this study was to investigate the distribution and severity of MIH in a group of children aged 7-12 years attending the pediatric dental clinic at the Faculty of Dentistry, Universiti Teknologi MARA (UiTM), Malaysia. One hundred and fifty-four children aged 7-12 years (mean age 9.14 ± 1.682) had their first permanent molars and permanent incisors examined at the Faculty of Dentistry, UiTM, using the European Academy of Paediatric Dentistry 2003 (EAPD) criteria for the diagnosis of MIH. Children with at least one first permanent molar affected were considered as having MIH. Data were recorded and statistically analysed using descriptive analysis and the chi-square test. Twenty-six of the 154 examined children had MIH (16.9%). There was no statistical difference between males and females in the prevalence of MIH; however, a statistically significant difference was found by age group. The first permanent molars were more frequently affected (58%) than the permanent incisors. Mandibular molars had the highest rate of MIH (15.5%). The right and left sides were equally affected. Mild defects were the most frequent lesion type (96.6%). This study revealed that MIH is a common condition (16.9%). Molars were more frequently affected than incisors, with mild defects being the most common lesion status. Further studies on this defect amongst Malaysian children are worthwhile.

  10. Assessment of hemostatic changes after crystalloid and colloid fluid preloading in trauma patients using standard coagulation parameters and thromboelastography

    Directory of Open Access Journals (Sweden)

    Chhavi Sawhney

    2013-01-01

    Full Text Available Background: The choice of an ideal fluid administered post trauma and its subsequent influence on coagulation still poses a clinical dilemma. Hence, this study was designed to assess the influence of in vivo hemodilution with various fluid preparations (4% gelatin, 6% hydroxyethyl starch (HES), Ringer's lactate, 0.9% normal saline) on coagulation, using standard coagulation parameters and real-time thromboelastography (TEG), in patients undergoing elective surgery post trauma. Methods: In a randomized, double-blind study, 100 patients of either sex and any age, belonging to ASA Grades I and II and scheduled for elective surgeries, were allocated into four groups of 25 each according to the type of fluid infused. Group G (4% gelatin), Group N (0.9% normal saline), Group R (Ringer's lactate), and Group H (6% HES) received preloading with 1 L of the corresponding fluid. The coagulation status of the patients was assessed during the perioperative period (before surgery, after fluid preloading, and at the end of the surgery) using both conventional coagulation analysis and TEG. Statistical Analysis: Analysis of variance (ANOVA), post hoc and Pearson chi-square tests were used. Results: In all the patients preloaded with gelatin, there was a significant increase in prothrombin time index (PTI; 14.88±0.90 vs. 13.78±3.01, P < 0.05 […] 14 min state in the postoperative period. Conclusion: Crystalloids are optimal volume expanders in trauma, with RL having beneficial effects on the coagulation system (decrease in k time and increase in MA and A20). Among the colloids, HES 6% (130/0.4) affects coagulation parameters (increase in PTI, INR, R time, and k time) more than gelatin. Trial registration (protocol number): IEC/NP-189/2011.

  11. Maximal standard dose of parenteral iron for hemodialysis patients: an MRI-based decision tree learning analysis.

    Directory of Open Access Journals (Sweden)

    Guy Rostoker

    Full Text Available Iron overload used to be considered rare among hemodialysis patients after the advent of erythropoiesis-stimulating agents, but recent MRI studies have challenged this view. The aim of this study, based on decision-tree learning and on MRI determination of hepatic iron content, was to identify a noxious pattern of parenteral iron administration in hemodialysis patients. We performed a prospective cross-sectional study from 31 January 2005 to 31 August 2013 in the dialysis centre of a French community-based private hospital. A cohort of 199 fit hemodialysis patients free of overt inflammation and malnutrition were treated for anemia with parenteral iron-sucrose and an erythropoiesis-stimulating agent (darbepoetin), in keeping with current clinical guidelines. Patients had blinded measurements of hepatic iron stores by means of T1 and T2* contrast MRI, without gadolinium, together with Chi-squared Automatic Interaction Detection (CHAID) analysis. The CHAID algorithm first split the patients according to their monthly infused iron dose, with a single cutoff of 250 mg/month. In the node comprising the 88 hemodialysis patients who received more than 250 mg/month of IV iron, 78 patients had iron overload on MRI (88.6%, 95% CI: 80% to 93%). The odds ratio for hepatic iron overload on MRI was 3.9 (95% CI: 1.81 to 8.4) with >250 mg/month of IV iron as compared to <250 mg/month. Age, gender (female sex), and the hepcidin level also influenced liver iron content on MRI. The standard maximal amount of iron infused per month should be lowered to 250 mg in order to lessen the risk of dialysis iron overload and to allow safer use of parenteral iron products.
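
    The first CHAID split reported above, on monthly infused iron dose, can be illustrated with a simplified chi-square cutoff search; real CHAID also handles multiway splits and Bonferroni-adjusted p-values. The dose and overload data below are simulated to loosely mimic the reported pattern and are not the study's dataset.

```python
# CHAID-style first split: scan candidate cutoffs of monthly IV iron dose and
# keep the one whose 2x2 table (dose group vs. MRI iron overload) yields the
# smallest chi-square p-value. Data are simulated for illustration.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(1)
dose = rng.uniform(50, 450, size=199)              # mg/month, synthetic
p_overload = np.where(dose > 250, 0.85, 0.35)      # assumed risk jump at 250
overload = rng.random(199) < p_overload

best = None
for cut in range(100, 401, 25):
    hi = dose > cut
    table = np.array([[np.sum(hi & overload),  np.sum(hi & ~overload)],
                      [np.sum(~hi & overload), np.sum(~hi & ~overload)]])
    if table.min() == 0:                           # skip degenerate tables
        continue
    chi2, p, _, _ = chi2_contingency(table)
    if best is None or p < best[1]:
        best = (cut, p)

print("best cutoff:", best[0], "mg/month, p =", best[1])
```

With a genuine risk discontinuity in the data, the selected cutoff lands near the true threshold, which is the behaviour the abstract reports for the 250 mg/month split.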

  12. Control of grid user payment. Antitrust legal standards of control for the examination of grid user payments of the german operators of electricity distribution networks in the system of the negotiated grid access; Netznutzungsentgeltkontrolle. Kartellrechtliche Kontrollmassstaebe fuer die Ueberpruefung von Netznutzungsentgelten der deutschen Elektrizitaetsverteilungsnetzbetreiber im System des verhandelten Netzzungangs

    Energy Technology Data Exchange (ETDEWEB)

    Stappert, H.

    2007-07-01

    For years there has been controversy over the permissible level of payments for the use of distribution networks in electricity supply under the system of negotiated grid access. Against this background, the author of the contribution under consideration reports on antitrust legal standards of control for the examination of grid user payments of the German operators of electricity distribution networks. The main aspects are: test standard; relation to energy law; market demarcation; position of the norm addressee; control methods; spatial comparison of internal prices; control of costs.

  13. Frequency standards

    CERN Document Server

    Riehle, Fritz

    2006-01-01

    Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency. This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards.

  14. Response rates to standard interferon treatment in HCV genotype 3a.

    Science.gov (United States)

    Qureshi, Saleem; Batool, Uzma; Iqbal, Musarrat; Qureshi, Omarah; Kaleem, Rao; Aziz, Hina; Azhar, Muhammad

    2009-01-01

    Chronic hepatitis C infects almost 130 to 170 million people, or approximately 2.2-3% of the world's population. HCV is one of the main causes of chronic liver disease, leading to progressive liver injury, fibrosis, cirrhosis, and liver cancer. It is also one of the leading indications for liver transplantation worldwide. The objective of the study was to determine the response to treatment with standard interferon and Ribazole in treatment-naïve hepatitis C infected patients. This quasi-experimental study was carried out at the Department of Medicine, KRL General Hospital Islamabad, from January 2003 to January 2005. A total of 250 patients were enrolled. All patients were anti-HCV positive, PCR positive for HCV RNA, and had genotype 3a. A non-probability purposive sampling technique was applied to collect data. After written informed consent was taken, a specially designed proforma containing the patient profile, family transmission, and baseline laboratory values was filled in. Patients were treated with a set protocol of interferon plus ribavirin therapy (IFN alpha 2a, 3 mIU thrice weekly for 24 weeks, plus ribavirin 1,000 to 1,200 mg/day) for six months. Chi-square tests were used to analyse the data. The primary end point was a sustained virological response (SVR), assessed six months after completion of treatment. Response rates to standard interferon plus Ribazole therapy were studied over a two-year period. Of the total of 250 patients, 60 were excluded: 30 did not meet the inclusion criteria, 23 were lost to follow-up, and 7 declined treatment. Of the remaining 190 patients, 155 (81.6%) achieved an end-of-treatment complete response (EOTCR), whereas 35 (18.4%) were non-responders (NR). The 155 patients who showed a complete response were followed for six months after treatment to assess sustained viral response, which was seen in 112 (72.25%) patients, whereas 43 (27.7%) were relapsers.

  15. Parturition intervals and distributions of parturitions by months of buffalo in Southern and South-eastern Brazil

    Directory of Open Access Journals (Sweden)

    J.C. De Souza

    2010-02-01

    Full Text Available Reproductive rate is an important component of economic success in livestock production. Parturition interval (IEP) is a direct measure of the productivity of the animal; long IEPs reduce the number of calves produced per year. The objective of this study was to determine the distribution of parturitions across months and to evaluate factors affecting IEP. The data included 7,588 parturitions of Murrah, Mediterranean, and Carabobo buffalo from 10 herds in Southern and South-eastern Brazil. The analysis of the distribution of parturitions evaluated the effects of month, year, and their interaction on the birth date of calves, using a chi-square test in SAS PROC FREQ (SAS Institute, Cary, NC, USA). Parturition intervals (n = 2,630) were evaluated using analysis of variance in SAS PROC GLM. The model for IEP included the fixed effects of season (December to May = 1, June to November = 2), year, season × year, sex of the preceding parturition, age at weaning of the previous calf, and herd. All sources of variation were significant (P < 0.0001) except sex of the preceding parturition (P < 0.85). The mean IEP was 446.7 ± 10.4 days; for seasons 1 and 2, IEP was 419.8 ± 11.3 and 473.6 ± 40.7 days, respectively, a difference of 54 days. As weaning age increased, there was a lengthening of IEP. Buffalo in Brazil showed seasonal parturition, with calving concentrated from January to April, although the frequency by month differed across years (P < 0.0001). These months also had the lowest calving interval.

  16. Relevant Standards

    Indian Academy of Sciences (India)

    X.86: Ethernet over LAPS. Standard in China and India. G.7041: Generic Framing Procedure (GFP). Supports Ethernet as well as other data formats (e.g., Fibre Channel); Protocol of ... IEEE 802.3x for flow control of incoming Ethernet data ...

  17. Chapter 1: Standard Model processes

    OpenAIRE

    Becher, Thomas

    2017-01-01

    This chapter documents the production rates and typical distributions for a number of benchmark Standard Model processes, and discusses new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  18. Willingness to Venture into Agriculture-related Enterprises after ...

    African Journals Online (AJOL)

    NTL NG

    Chi-square analysis showed a significant association between students' marital ... 2008 leaving Nigeria far behind Malaysia that ... Table 1: Socioeconomic characteristics of the agriculture students in FUNAAB (n = 112).

  19. Quality of life for post-polio syndrome: a patient derived, Rasch standard scale.

    Science.gov (United States)

    Young, Carolyn A; Quincey, Anne-Marie C; Wong, Samantha M; Tennant, Alan

    2018-03-01

    To design a disease-specific quality of life (QoL) questionnaire for people with post-polio syndrome (PPS). Qualitative interviews were conducted with 45 people with PPS to identify themes and derive potential items reflecting impact upon QoL. After cognitive debriefing, these were made into a questionnaire pack along with comparative questionnaires and posted to 319 patients. The 271 (85%) returned questionnaires were subjected to exploratory factor analysis (EFA) and Rasch analysis. A 25-item scale, the post-polio quality of life scale (PP-QoL), showed good fit to the Rasch model (conditional chi-square p = 0.156), unidimensionality (% t-tests 2.0: CI 0.7-3.8), and a Cronbach's alpha of 0.87. With the latent estimate transformed to a 0-100 scale, the mean score was 56.9 (SD 18.5), with only 3.3% of respondents at the floor or ceiling of the scale. Test-retest reliability showed an intraclass correlation coefficient ICC(2,1) of 0.916, and a correlation of 0.85. The disease-specific PP-QoL demonstrated excellent reliability, appropriate concurrent validity, and satisfied the standards of the Rasch model. It enables examination of the impact of health status upon perceived QoL, and of the impact of rehabilitation interventions. The scale is freely available for academic or not-for-profit users to improve research in this neglected, disabling condition. Implications for Rehabilitation: In post-polio syndrome (PPS), existing work examines aspects of health-related quality of life (HRQoL), such as activity limitations. A disease-specific QoL measure would enable researchers to model the impact of health status, such as fatigue or mobility restrictions, upon QoL in PPS. The post-polio quality of life scale (PP-QoL) is based on patients' lived experience, meets Rasch standards and is free for use by academic and not-for-profit researchers. The raw score is reliable for individual use in clinical settings, and interval scale transformation is available for parametric

  20. [Standardization of hospital feeding].

    Science.gov (United States)

    Caracuel García, Ángel Manuel

    2015-05-07

    Standardization can be understood as the establishment of measures for repetitive situations through the development, dissemination, and application of technical documents called standards. In Andalusia there are 45 public hospitals with 14,606 beds, in which 11,700 full-board meal services per day are provided. The Working Group on Hospital Food Standardization of the Andalusian Society for Clinical Nutrition and Dietetics started in 2010, working on the certification of suppliers, product specifications, and technical sheets for meals. - Develop a specific tool to help improve food safety through the certification of suppliers. - Develop a standardized document of technical specifications for the foodstuffs needed to prepare the menus of the diet codes established in Andalusian hospitals. - Develop a catalog of technical sheets for hospital meal dishes, to homogenize menus, respecting local products and unifying qualitative and quantitative criteria for ingredients. - Gathering and studying documentation from several public hospitals in Andalusia: • Product specifications and certification of suppliers. • International standards for certification and distribution companies. • Legislation. • Data sheets for the menu items. • Specifications from different product procurement procedures. - Development of the draft standard HOSPIFOOD®, and approval of version “0.0”. - Training course for auditors of this standard. - Development of a raw-materials catalog as technical cards. - Review of the meal technical cards and selection of the ones to be included in the document. After nearly three years of work, we have achieved the following products: - Standardized database of technical specifications for the production of the dietary codes' food items: fish, seafood, meat and meat products, cold meats and pâtés, ready meals, bread and pastries, preserves, milk and dairy products, oils, cereals, legumes, vegetables, fruits, fresh and frozen vegetables, condiments and spices. - Standardized database of

  1. Overlapping distribution of Plasmodium falciparum and soil transmitted helminths in a malaria hyperendemic region, North-Central Nigeria

    Directory of Open Access Journals (Sweden)

    Olalere Shittu

    2017-11-01

    Full Text Available Objective: Malaria and soil transmitted helminths (STH) are endemic in many resource-poor communities in sub-Saharan Africa (SSA), and there appears to be a synergistic relationship between the two, culminating in an overlap in prevalence and intensities. Methods: Peripheral blood smears and fresh stool samples were obtained from consenting individuals in the study population. Routine microscopy was conducted on valid samples. Malaria parasitaemia in thick film was estimated by counting the number of parasites per 200 white blood cells (WBC), and the parasite count/μL was determined using a fixed value of 8,000 WBC. Fresh stool samples were fixed in 10% formol-saline solution and immediately processed for intestinal parasite egg identification. The intensity of helminth eggs in stool samples was assessed using the Kato-Katz technique. Prevalence and intensity of infections between ages and sexes were tested using the chi-square (χ2) test and one-way analysis of variance (ANOVA), respectively. For each value, the 95% confidence interval (95% CI) was calculated (P < 0.05). The association between STH prevalence and malaria mean parasitaemia load was assessed with Student's independent t-test. Results: Seven hundred and thirty-seven (737) individuals comprising 287 (38.9%) males and 450 (61.1%) females participated in the study. Ascaris lumbricoides determined the increased prevalence of Plasmodium falciparum (n = 198; OR = 2.59; 95% CI: 1.894-3.545). The intensities in the associations were highly significant (P < 0.001). Malaria, ascariasis and trichuriasis prevalence decreased with age and therefore exhibited marked age-dependency patterns. However, only hookworm spread and prevalence increased with age. Overlapping distribution occurred in all infections with respect to the different age groups. Conclusions: In endemic communities like the present study area, stable but mild infection intensities are observed all year round. Intervention and mass
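The thick-film parasitaemia estimate described in the Methods (parasites counted against 200 WBC, scaled by an assumed 8,000 WBC/μL) is a simple proportion; a minimal standard-library sketch:

```python
# Thick-film parasitaemia: parasites are counted against 200 white blood
# cells (WBC), then scaled to parasites per microlitre of blood assuming
# a fixed 8,000 WBC/uL, as described in the Methods above.
WBC_COUNTED = 200
ASSUMED_WBC_PER_UL = 8_000

def parasitaemia_per_ul(parasites_counted: int) -> float:
    """Convert a parasites-per-200-WBC count to parasites/uL of blood."""
    return parasites_counted * ASSUMED_WBC_PER_UL / WBC_COUNTED

print(parasitaemia_per_ul(50))  # 50 parasites per 200 WBC -> 2000.0 parasites/uL
```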

  2. Differential distribution of age and HBV serological markers in liver cirrhosis and non-cirrhotic patients with primary liver cancer

    Directory of Open Access Journals (Sweden)

    XU Xiuhua

    2013-03-01

    Full Text Available Objective: To compare the age distributions and presence of hepatitis B virus (HBV) serological markers between primary hepatic cancer (PHC) patients with and without liver cirrhosis. Methods: A total of 547 PHC cases were analyzed retrospectively. After dividing into two groups according to liver cirrhosis status, the between-group differences in age and HBV serological markers, such as hepatitis B e antigen (HBeAg) status, were statistically compared using the chi-squared test. Results: The numbers of cirrhotic and non-cirrhotic PHC patients were 265 and 282, respectively. HBV infection was present in 221 cirrhotic PHC patients and 256 non-cirrhotic PHC patients (83.4% vs. 90.8%). There was a substantial bias in the proportion of males to females in the cirrhotic PHC patients (7.83∶1). The number of PHC patients <60 years old was similar between the cirrhotic and non-cirrhotic groups, but the non-cirrhotic group had significantly more patients >60 years old (P<0.005). In cirrhotic PHC patients, the HBV infection rate was highest in the <40 years old age group (96.7%) and the HBeAg seroconversion rate was highest in the 40-60 years old age group (89.5%). In non-cirrhotic PHC patients, the 40-60 years old age group showed the highest HBV infection rate (90.3%) but the lowest HBeAg seroconversion rate (80.0%). Conclusion: PHC with liver cirrhosis mainly occurred in males, with the HBV infection rate being higher in individuals <60 years old. Non-cirrhotic PHC patients were more often >60 years old. Many of the HBV-infected PHC patients with cirrhosis had a high HBeAg seroconversion rate.

  3. Phenotypic and allelic distribution of the ABO and Rhesus (D) blood groups in the Cameroonian population.

    Science.gov (United States)

    Ndoula, S T; Noubiap, J J N; Nansseu, J R N; Wonkam, A

    2014-06-01

    Data on blood group phenotypes are important for blood transfusion programs, disease association studies, and population genetics. This study aimed to report the phenotypic and allelic distribution of the ABO and Rhesus (Rh) groups in various ethnolinguistic groups in the Cameroonian population. We obtained ABO and Rhesus blood groups and self-identified ethnicity from 14,546 Cameroonian students. Ethnicity was classified into seven major ethnolinguistic groups: Afro-Asiatic, Nilo-Saharan, Niger-Kordofanian/West Atlantic, Niger-Kordofanian/Adamawa-Ubangui, Niger-Kordofanian/Benue-Congo/Bantu/Grassfield, Niger-Kordofanian/Benue-Congo/Bantu/Mbam and Niger-Kordofanian/Benue-Congo/Bantu/Equatorial. ABO allelic frequencies were determined using the Bernstein method. Differences in the phenotypic distribution of blood groups were assessed using the chi-square test; a P value < 0.05 was considered significant. The frequencies of blood groups O, A, B and AB were 48.62%, 25.07%, 21.86% and 4.45%, respectively. The Rhesus-positive phenotype frequency was 96.32%. The allelic frequencies of the O, A and B genes were 0.6978, 0.1605 and 0.1416, respectively. Phenotypic frequencies of the blood groups in the general study population and in the different ethnolinguistic groups were in agreement with Hardy-Weinberg equilibrium expectations (P > 0.05). The frequencies of the O, A, and B blood phenotypes were significantly lower, respectively, in the Nilo-Saharan group (P = 0.009), the Niger-Kordofanian/Benue-Congo/Bantu groups (P = 0.021) and the Niger-Kordofanian/West-Atlantic group. The AB blood group was most frequent in the Niger-Kordofanian/Adamawa-Ubangui group (P = 0.024). Our study provides the first data on the ethnic distribution of ABO and Rhesus blood groups in the Cameroonian population and suggests that its general profile is similar to those of several sub-Saharan African populations. We found some significant differences in phenotypic distribution amongst major ethnolinguistic groups. These data may be important for blood donor recruitment policy and blood transfusion
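The Bernstein method mentioned above estimates ABO allele frequencies from phenotype frequencies by square roots; a quick standard-library check using the frequencies reported in this abstract (O = 0.4862, A = 0.2507, B = 0.2186) recovers the published allele frequencies (0.6978, 0.1605, 0.1416) to within rounding:

```python
# Bernstein square-root estimators for ABO allele frequencies,
# applied to the phenotype frequencies reported in the abstract.
import math

freq_O, freq_A, freq_B = 0.4862, 0.2507, 0.2186  # phenotype frequencies

def bernstein_abo(f_o, f_a, f_b):
    """First-pass Bernstein estimators for allele frequencies r (O), p (A), q (B)."""
    r = math.sqrt(f_o)              # O allele: freq(O) = r^2
    p = 1 - math.sqrt(f_o + f_b)    # A allele: freq(O) + freq(B) = (q + r)^2
    q = 1 - math.sqrt(f_o + f_a)    # B allele: freq(O) + freq(A) = (p + r)^2
    return r, p, q

r, p, q = bernstein_abo(freq_O, freq_A, freq_B)
# Bernstein's correction spreads the small deviation d = 1 - (p + q + r)
# back over the three estimates; the first pass already sums to ~1 here.
print(f"r = {r:.4f}, p = {p:.4f}, q = {q:.4f}")
```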

  4. [Evaluation of the quality of life in patients with breast cancer at different TNM stages after standardized treatment].

    Science.gov (United States)

    Huang, Rong; Huang, Yuan; Tao, Ping; Li, Hui; Wang, Qiong; Li, Hui; Li, Jia-yuan

    2013-01-01

    To evaluate the quality of life (QOL) in patients with breast cancer at different TNM stages and to estimate the value of the EuroQol Five Dimension indicator (EQ-5D) in measuring QOL among Chinese breast cancer patients. A survey with the Quality of Life Instruments for Cancer Patients-Breast Cancer (QLICP-BR) and EQ-5D was undertaken in breast cancer patients who had completed their standardized treatment (except for endocrine treatment) six months earlier. Chi-square tests, one-way ANOVA, and covariance analysis were used to evaluate possible factors influencing the QOL of breast cancer patients. Simultaneously, with the results of the Quality of Life Instruments for Cancer Patients-General Module (QLICP-GM, which is included in QLICP-BR) and the total scores of QLICP-BR as standard, we conducted Pearson correlation analysis to evaluate the value of EQ-5D. A total of 178 female breast cancer survivors were recruited from March 2010 to September 2010. There were 47 cases (26.4%) at stages 0 and I, 81 cases (45.5%) at stage II, and 50 cases (28.1%) at stages III and IV. The total standardized score of QLICP-BR was 72.55 ± 3.10 in patients at stages 0 and I, 64.09 ± 2.69 in patients at stage II and 58.21 ± 3.00 in patients at stages III and IV. The total standardized score of QLICP-BR and the social domain score of patients at stages 0 and I were higher than in patients at stage II (all P < 0.05), and those of patients at stages 0 and I were higher than in patients at stages III and IV (all P < 0.05). Scores differed across stages when age, degree of education, birth place (metropolis or rural), occupation, domestic income, and medical insurance were controlled (P = 0.002). Correlation analysis indicated that EQ-5D has a positive correlation with QLICP-GM and QLICP-BR (all P < 0.05). The QOL of patients with early-stage breast cancer is better than that of those at a late stage. Early diagnosis and treatment can improve the QOL of breast cancer patients. The Chinese version of EQ-5D can detect the differences in QOL among patients at different TNM stages well, and can be used for evaluating QOL in Chinese

  5. A multicenter study of the family educational rights and privacy act and the standardized letter of recommendation: impact on emergency medicine residency applicant and faculty behaviors.

    Science.gov (United States)

    Diab, Jessica; Riley, Stephanie; Downes, Andrew; Gaeta, Theodore; Hern, H Gene; Hwang, Eric; Kass, Lawrence; Kelly, Michael; Luber, Samuel D; Martel, Marc; Minns, Alicia; Patterson, Leigh; Pazderka, Philip; Sayan, Osman; Thurman, Jason; Vallee, Phyllis; Overton, David

    2014-06-01

    Residency applicants have the right to see letters of recommendation written on their behalf. It is not known whether applicants are affected by waiving this right. Our multicenter study assessed how frequently residency applicants waived their FERPA rights to view their letters of recommendation, and whether this affected the ratings they were given by faculty. We reviewed all ERAS-submitted letters of recommendation to 14 ACGME-accredited programs in 2006-2007. We collected ERAS ID, program name, FERPA declaration, standardized letter of recommendation (SLOR) use, and SLOR Global Assessment ranking. The percentage of applicants who waived their FERPA rights was determined. Chi-square tests of independence assessed whether applicants' decision to waive their FERPA rights was associated with their SLOR Global Assessment. We examined 1776 applications containing 6424 letters of recommendation. Of 2736 letters that specified a Global Assessment, 2550 (93%) were from applicants who waived their FERPA rights, while 186 were not. Of the applicants who chose not to waive their rights, 45.6% received a ranking of Outstanding, 35.5% Excellent, 18.3% Very Good, and 1.6% Good. Of applicants who waived their FERPA rights, 35.1% received a ranking of Outstanding, 49.6% Excellent, 13.7% Very Good, and 1.6% Good. Applicants who did not waive their FERPA rights were more likely to receive an Outstanding Assessment (P = .003). The majority (93%) of residency applicants waived their FERPA rights. Those who did not waive their rights had a statistically higher chance of receiving an Outstanding Assessment than those who did.

  6. Distribution of Parasitic Cestod

    Directory of Open Access Journals (Sweden)

    S Karimi

    2008-04-01

    Full Text Available Background: Ligula intestinalis is a parasitic cestode of economic and health importance in the fishery industry. The aim of this study was to determine the prevalence of this parasite in Mazandaran. The effects of habitat temperature and kind of pool (sandy or cement) were considered as well. Methods: In this study, 103 fish samples were obtained at all stages; the samples (male and female) were divided into 3 groups based on fish length, temperature, origin of the cultured fish, kind of pool, height above sea level, and sex. Macroscopic and microscopic observations were carried out at all stages of the parasite (procercoid, plerocercoid and adult). Chi-square and Pearson's double square tests (P < 0.05) were conducted in order to evaluate the prevalence and to determine reliability in the six sampling areas, respectively. Results: The total infection rate was 9.7% across all groups. There was a significant association between parasitism rate and height above sea level, kind of pool (maximum in sandy pools) and high temperature. The multivariate analyses regarding the above-mentioned three criteria also indicated a significant relationship between these criteria and parasitism rate. Seasonal conditions enhance the prevalence of Ligula intestinalis. Conclusion: Flexibility in the parasite's life cycle and its choice of different hosts make it a challenging case for the fishery industry; moreover, its prevalence can be predicted from environmental conditions, allowing the least at-risk places to be chosen for salmonid farming. Further studies are recommended for evaluating problems in fish fertility and the probable risk for consumers of infected fish.

  7. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with the approximation of an empirical distribution to the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions are derived for the random variable obtained by backward transformation of the standard normal ...
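As a concrete illustration of the translation system discussed here, the S_U member of the Johnson family maps x to an approximately standard normal z via z = gamma + delta * asinh((x - xi) / lam); the parameter values in this sketch are hypothetical, chosen only to show the forward and backward transforms.

```python
# Johnson S_U translation: forward transform to an (approximately)
# standard normal z, and the backward transform that recovers x.
import math

def johnson_su_forward(x, gamma, delta, xi, lam):
    """Map x to z via the Johnson S_U system: z = gamma + delta*asinh((x-xi)/lam)."""
    return gamma + delta * math.asinh((x - xi) / lam)

def johnson_su_backward(z, gamma, delta, xi, lam):
    """Inverse transform: x = xi + lam*sinh((z-gamma)/delta)."""
    return xi + lam * math.sinh((z - gamma) / delta)

params = dict(gamma=-0.5, delta=1.2, xi=10.0, lam=3.0)  # hypothetical fitted parameters
z = johnson_su_forward(14.0, **params)
x_back = johnson_su_backward(z, **params)
print(f"z = {z:.4f}, round-trip x = {x_back:.4f}")
```

In practice the four parameters would be estimated from percentiles of the empirical distribution, as the article describes; the round trip above only demonstrates that the two transforms are exact inverses.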

  8. The Dynamics of Standardization

    DEFF Research Database (Denmark)

    Brunsson, Nils; Rasche, Andreas; Seidl, David

    2012-01-01

    This paper suggests that when the phenomenon of standards and standardization is examined from the perspective of organization studies, three aspects stand out: the standardization of organizations, standardization by organizations and standardization as (a form of) organization. Following a comp...

  9. Distributional Inference

    NARCIS (Netherlands)

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  10. Standard biological parts knowledgebase.

    Science.gov (United States)

    Galdzicki, Michal; Rodriguez, Cesar; Chandran, Deepak; Sauro, Herbert M; Gennari, John H

    2011-02-24

    We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publically accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web based data access to perform searches for parts that are not currently possible.

  11. Standard biological parts knowledgebase.

    Directory of Open Access Journals (Sweden)

    Michal Galdzicki

    2011-02-01

    Full Text Available We have created the Knowledgebase of Standard Biological Parts (SBPkb) as a publically accessible Semantic Web resource for synthetic biology (sbolstandard.org). The SBPkb allows researchers to query and retrieve standard biological parts for research and use in synthetic biology. Its initial version includes all of the information about parts stored in the Registry of Standard Biological Parts (partsregistry.org). SBPkb transforms this information so that it is computable, using our semantic framework for synthetic biology parts. This framework, known as SBOL-semantic, was built as part of the Synthetic Biology Open Language (SBOL), a project of the Synthetic Biology Data Exchange Group. SBOL-semantic represents commonly used synthetic biology entities, and its purpose is to improve the distribution and exchange of descriptions of biological parts. In this paper, we describe the data, our methods for transformation to SBPkb, and finally, we demonstrate the value of our knowledgebase with a set of sample queries. We use RDF technology and SPARQL queries to retrieve candidate "promoter" parts that are known to be both negatively and positively regulated. This method provides new web based data access to perform searches for parts that are not currently possible.

  12. Acceptability of condom promotion and distribution among 10-19 year-old adolescents in Mpwapwa and Mbeya rural districts, Tanzania.

    Science.gov (United States)

    Exavery, Amon; Mubyazi, Godfrey M; Rugemalila, Jovitha; Mushi, Adiel K; Massaga, Julius J; Malebo, Hamisi M; Tenu, Filemon; Ikingura, Joyce K; Malekia, Sia; Makundi, Emmanuel A; Ruta, Acleus Sm; Ogondiek, John W; Wiketye, Victor; Malecela, Mwelecele N

    2012-07-29

    The HIV/AIDS pandemic remains a leading challenge for global health. Although condoms are acknowledged for their key role in preventing HIV transmission, low and inappropriate use of condoms persists in Tanzania and elsewhere in Africa. This study assesses factors affecting acceptability of condom promotion and distribution among adolescents in Mpwapwa and Mbeya rural districts of Tanzania. Data were collected in 2011 as part of a larger cross-sectional survey on condom use among 10-19 year-olds in Mpwapwa and Mbeya rural districts of Tanzania using a structured questionnaire. Associations between acceptability of condom promotion and distribution and each of the explanatory variables were tested using Chi Square. A multivariate logistic regression model was used to examine independent predictors of the acceptability of condom promotion and distribution using STATA (11) statistical software at the 5% significance level. Mean age of the 1,327 adolescent participants (50.5% being males) was 13.5 years (SD = 1.4). Acceptance of condom promotion and distribution was found among 37% (35% in Mpwapwa and 39% in Mbeya rural) of the adolescents. Being sexually active and aged 15-19 was the strongest predictor of the acceptability of condom promotion and distribution (OR = 7.78, 95% CI 4.65-12.99). Others were: not agreeing that a condom is effective in preventing transmission of STIs including HIV (OR = 0.34, 95% CI 0.20-0.56), being a resident of Mbeya rural district (OR = 1.67, 95% CI 1.28-2.19), feeling comfortable being seen by parents/guardians holding/buying condoms (OR = 2.20, 95% CI 1.40-3.46) and living with a guardian (OR = 1.48, 95% CI 1.08-2.04). Acceptability of condom promotion and distribution among adolescents in Mpwapwa and Mbeya rural is low. Effect of sexual activity on the acceptability of condom promotion and distribution is age-dependent and was the strongest.
    Feeling comfortable being seen by parents/guardians buying or holding condoms, perceived ability
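The odds ratios with 95% confidence intervals reported above (e.g. OR = 7.78, 95% CI 4.65-12.99) are conventionally computed from 2x2 cell counts via a Wald interval on log(OR); the cell counts in this sketch are hypothetical, not the study's data.

```python
# Odds ratio with a Wald 95% confidence interval from a 2x2 table.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR = (a*d)/(b*c); CI from log(OR) +/- z * SE, SE = sqrt(1/a+1/b+1/c+1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# exposed/accepting, exposed/not accepting, unexposed/accepting, unexposed/not accepting
or_, lo, hi = odds_ratio_ci(60, 40, 30, 80)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

A CI that excludes 1, as in the study's reported predictors, indicates a statistically significant association at the 5% level.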

  13. Acceptability of condom promotion and distribution among 10–19 year-old adolescents in Mpwapwa and Mbeya rural districts, Tanzania

    Directory of Open Access Journals (Sweden)

    Exavery Amon

    2012-07-01

    Full Text Available Abstract Background The HIV/AIDS pandemic remains a leading challenge for global health. Although condoms are acknowledged for their key role in preventing HIV transmission, low and inappropriate use of condoms persists in Tanzania and elsewhere in Africa. This study assesses factors affecting acceptability of condom promotion and distribution among adolescents in Mpwapwa and Mbeya rural districts of Tanzania. Methods Data were collected in 2011 as part of a larger cross-sectional survey on condom use among 10–19 year-olds in Mpwapwa and Mbeya rural districts of Tanzania using a structured questionnaire. Associations between acceptability of condom promotion and distribution and each of the explanatory variables were tested using Chi Square. A multivariate logistic regression model was used to examine independent predictors of the acceptability of condom promotion and distribution using STATA (11) statistical software at the 5% significance level. Results Mean age of the 1,327 adolescent participants (50.5% being males) was 13.5 years (SD = 1.4). Acceptance of condom promotion and distribution was found among 37% (35% in Mpwapwa and 39% in Mbeya rural) of the adolescents. Being sexually active and aged 15–19 was the strongest predictor of the acceptability of condom promotion and distribution (OR = 7.78, 95% CI 4.65-12.99). Others were: not agreeing that a condom is effective in preventing transmission of STIs including HIV (OR = 0.34, 95% CI 0.20-0.56), being a resident of Mbeya rural district (OR = 1.67, 95% CI 1.28-2.19), feeling comfortable being seen by parents/guardians holding/buying condoms (OR = 2.20, 95% CI 1.40-3.46) and living with a guardian (OR = 1.48, 95% CI 1.08-2.04). Conclusion Acceptability of condom promotion and distribution among adolescents in Mpwapwa and Mbeya rural is low. Effect of sexual activity on the acceptability of condom promotion and distribution is age-dependent and was the strongest. Feeling

  14. Acceptability of condom promotion and distribution among 10–19 year-old adolescents in Mpwapwa and Mbeya rural districts, Tanzania

    Science.gov (United States)

    2012-01-01

    Background The HIV/AIDS pandemic remains a leading challenge for global health. Although condoms are acknowledged for their key role in preventing HIV transmission, low and inappropriate use of condoms persists in Tanzania and elsewhere in Africa. This study assesses factors affecting acceptability of condom promotion and distribution among adolescents in Mpwapwa and Mbeya rural districts of Tanzania. Methods Data were collected in 2011 as part of a larger cross-sectional survey on condom use among 10–19 year-olds in Mpwapwa and Mbeya rural districts of Tanzania using a structured questionnaire. Associations between acceptability of condom promotion and distribution and each of the explanatory variables were tested using Chi Square. A multivariate logistic regression model was used to examine independent predictors of the acceptability of condom promotion and distribution using STATA (11) statistical software at the 5% significance level. Results Mean age of the 1,327 adolescent participants (50.5% being males) was 13.5 years (SD = 1.4). Acceptance of condom promotion and distribution was found among 37% (35% in Mpwapwa and 39% in Mbeya rural) of the adolescents. Being sexually active and aged 15–19 was the strongest predictor of the acceptability of condom promotion and distribution (OR = 7.78, 95% CI 4.65-12.99). Others were: not agreeing that a condom is effective in preventing transmission of STIs including HIV (OR = 0.34, 95% CI 0.20-0.56), being a resident of Mbeya rural district (OR = 1.67, 95% CI 1.28-2.19), feeling comfortable being seen by parents/guardians holding/buying condoms (OR = 2.20, 95% CI 1.40-3.46) and living with a guardian (OR = 1.48, 95% CI 1.08-2.04). Conclusion Acceptability of condom promotion and distribution among adolescents in Mpwapwa and Mbeya rural is low. Effect of sexual activity on the acceptability of condom promotion and distribution is age-dependent and was the strongest. Feeling comfortable being

  15. Distribution of the ACME-arcA gene among meticillin-resistant Staphylococcus haemolyticus and identification of a novel ccr allotype in ACME-arcA-positive isolates.

    Science.gov (United States)

    Pi, Borui; Yu, Meihong; Chen, Yagang; Yu, Yunsong; Li, Lanjuan

    2009-06-01

    The aim of this study was to investigate the prevalence and characteristics of ACME (arginine catabolic mobile element)-arcA-positive isolates among meticillin-resistant Staphylococcus haemolyticus (MRSH). ACME-arcA, native arcA and SCCmec elements were detected by PCR. Susceptibilities to 10 antimicrobial agents were compared between ACME-arcA-positive and -negative isolates by chi-square test. PFGE was used to investigate the clonal relatedness of ACME-arcA-positive isolates. The phylogenetic relationships of ACME-arcA and native arcA were analysed using the neighbour-joining method of MEGA software. A total of 42 (47.7%) of 88 isolates, distributed across 13 PFGE types, were positive for the ACME-arcA gene. There were no significant differences in antimicrobial susceptibility between ACME-arcA-positive and -negative isolates. A novel ccr allotype (ccrAB(SHP)) was identified in ACME-arcA-positive isolates. Among the 42 ACME-arcA-positive isolates: 8 isolates harboured SCCmec V; 8 isolates harboured the class C1 mec complex and ccrAB(SHP); 22 isolates harbouring the class C1 mec complex and 4 isolates harbouring the class C2 mec complex were negative for all known ccr allotypes. ACME-arcA-positive isolates were found in MRSH for the first time, with high prevalence and clonal diversity, which suggests mobility of ACME within MRSH. The results from this study revealed that MRSH is likely to be one of the potential reservoirs of ACME for Staphylococcus aureus.

  16. [Analysis of computed tomography-based distribution of metastatic cervical nodes in 218 cases of nasopharyngeal carcinoma].

    Science.gov (United States)

    Wang, Xiao-Shen; Hu, Chao-Su; Wu, Yong-Ru; Qiu, Xing-Xian; Feng, Yan

    2004-09-01

    The application of intensity-modulated radiation therapy (IMRT) for nasopharyngeal carcinoma (NPC) requires a precise delineation of the nodal area and nodal clinical target volume (CTV) on computed tomography (CT) images, and the prerequisite is to establish the rules of CT-based distribution of metastatic lymph nodes of NPC. This study was designed to analyze the rules of CT-based distribution of nodal involvement of NPC according to the guidelines on nodal levels proposed by the Radiation Therapy Oncology Group (RTOG). From Jul. 2003 to Nov. 2003, 259 newly diagnosed NPC patients received radiotherapy at Fudan University Affiliated Cancer Hospital. All patients had a transversal contrast-enhanced CT scan from the base of skull to the clavicle before treatment. Diagnostic radiologists and radiation oncologists together assessed the nodal distribution in each RTOG nodal level. The chi-square test was used to analyze the correlation between T stage and nodal metastasis rate. The neck was further divided into 3 regions by the verge of the hyoid bone and the inferior border of the cricoid cartilage to assess leap metastasis of nodes. A total of 218 patients (84.2%) had nodal involvement. The distribution was as follows: 0 in level Ia, 6 (2.8%) in level Ib, 115 (52.8%) in level IIa, 192 (88.1%) in level IIb, 78 (35.8%) in level III, 20 (9.2%) in level IV, 65 (29.9%) in level V, 0 in level VI, 157 (72.0%) in the retropharynx, and 2 (0.9%) in the preauricular area. Leap metastases were found in only 5 patients (2.3%). No significant correlation was found between T stage and nodal involvement. NPC has a high probability of nodal metastases; nodes in levels IIa, IIb, and the retropharynx are most likely to be involved. Nodes metastasized mostly from the upper to the lower level, and from the proximal to the distal part, with a very low leap metastasis rate. The relationship between T stage and nodal involvement has no statistical significance.

  17. Infection control and practice of standard precautions among healthcare workers in northern Nigeria

    Directory of Open Access Journals (Sweden)

    O E Amoran

    2013-01-01

    Background: Healthcare-associated infections (HAIs) have been reported to be a serious problem in healthcare services, as they are common causes of illness and mortality among hospitalized patients, including healthcare workers (HCWs). Compliance with standard precautions has been shown to reduce the risk of exposure to blood and body fluids. Aims: This study therefore assesses the level of knowledge of, and compliance with, standard precautions among the various cadres of HCWs, and the factors influencing compliance in the hospital environment in Nasarawa State, Northern Nigeria. Settings and Design: Nasarawa State has a current human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) prevalence rate of 10.0%, higher than most states in Nigeria, with a high level of illiteracy and ignorance. The majority of the people reside in rural areas while a few are found in the towns; informal settlements with no direct access to healthcare facilities are common. Materials and Methods: This is an analytical, cross-sectional study. A proportional sampling technique was used to obtain a representative sample, and a structured self-administered questionnaire was used to collect relevant information from healthcare providers working in Nasarawa State from January to February 2009. Statistical analysis used: To describe patient characteristics, we calculated proportions and medians. For categorical variables, we compared proportions using chi-square tests. A logistic regression model was fitted with infection control as the outcome variable to identify associated factors. Results: A total of 421 HCWs were interviewed. The majority (77.9%) correctly described universal precautions and infection control, with 19.2%, 19.2%, and 28.0%, respectively, unable to recognize vaccination, postexposure prophylaxis, and surveillance for emerging diseases as standard precautions for infection control.
About 70.1% usually wear gloves before handling patients or

  18. Distributed Visualization

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed Visualization allows anyone, anywhere, to see any simulation, at any time. Development focuses on algorithms, software, data formats, data systems and...

  19. The exponentiated generalized Pareto distribution | Adeyemi | Ife ...

    African Journals Online (AJOL)

    Recently Gupta et al. (1998) introduced the exponentiated exponential distribution as a generalization of the standard exponential distribution. In this paper, we introduce a three-parameter generalized Pareto distribution, the exponentiated generalized Pareto distribution (EGP). We present a comprehensive treatment of the ...

  20. Dyadic distributions

    International Nuclear Information System (INIS)

    Golubov, B I

    2007-01-01

    On the basis of the concept of the pointwise dyadic derivative, dyadic distributions are introduced as continuous linear functionals on the linear space D_d(R_+) of infinitely differentiable functions compactly supported in the positive half-axis R_+ together with all dyadic derivatives. The completeness of the space D'_d(R_+) of dyadic distributions is established. It is shown that a locally integrable function on R_+ generates a dyadic distribution. In addition, the space S_d(R_+) of infinitely dyadically differentiable functions on R_+ rapidly decreasing in the neighbourhood of +∞ is defined. The space S'_d(R_+) of dyadic distributions of slow growth is introduced as the space of continuous linear functionals on S_d(R_+). The completeness of the space S'_d(R_+) is established; it is proved that each integrable function on R_+ with polynomial growth at +∞ generates a dyadic distribution of slow growth. Bibliography: 25 titles.

  1. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…

  2. Momentum distributions

    International Nuclear Information System (INIS)

    Simmons, R.O.

    1984-01-01

    The content of the portion of the workshop concerned with momentum distributions in condensed matter is outlined and the neutron scattering approach to their measurement is briefly described. Results concerning helium systems are reviewed. Some theoretical aspects are briefly mentioned.

  3. Spatial and Time Pattern Distribution of Water Birds Community at Mangrove Ecosystem of Bengawan Solo Estuary - Gresik Regency

    Directory of Open Access Journals (Sweden)

    Sutopo .

    2018-01-01

    The mangrove ecosystem in the Bengawan Solo estuary is part of an essential ecosystem and also an important and endemic bird area. The aims of this study were to analyze the parameters of habitat condition, analyze differences in temporal and spatial patterns, and provide a management strategy for water birds and their habitat. Research was carried out from January to May 2017 (two observation periods). The methods used were concentration count, single and unit plot, point count, interview, and field observation. Data were analyzed using chi-square, grid-line point and mark point, beak-type, and vegetation analysis. There are 41 (forty-one) species of water birds (23 migrant species and 17 native species). Chi-square analysis showed significant differences in both time and space and also in type of feed, with chi-square values χ²_calc(2; 0.95) > χ²_tab(2; 0.95). Migrant birds occupy the mudflat as feeding and resting ground, while the native birds use pond areas. Invertebrate species such as crustaceans are common feed for migrants, while native birds tend to feed on fish and shrimp. Feeding and resting activities of migrant birds were influenced by the tidal condition. The total water bird population is 112,100+ individuals. A total of 15 (fifteen) mangrove species were identified, dominated in three habitus by Avicennia alba. Keywords: Bengawan Solo Estuary, mangrove ecosystem, spatial and time, water birds

  4. Malaysian NDT standards

    International Nuclear Information System (INIS)

    Khazali Mohd Zin

    2001-01-01

    In order to become a developed country, Malaysia needs to develop her own national standards. It has been projected that by the year 2020 Malaysia will require about 8,000 standards (Department of Standards Malaysia). Currently more than 2,000 Malaysian Standards have been gazetted by the government, which is considerably too low to meet the year 2020 target. NDT standards have been identified by the standards working group as one of the areas in which to promote our national standards. In this paper the author describes the steps taken to establish Malaysia's own NDT standards. The project starts with the establishment of radiographic standards. (Author)

  5. The allelic distribution of -308 Tumor Necrosis Factor-alpha gene polymorphism in South African women with cervical cancer and control women

    Directory of Open Access Journals (Sweden)

    Williamson Anna-Lise

    2006-01-01

    Abstract Background: Cervical cancer is due to infection with specific high-risk types of human papillomavirus (HPV). Although the incidence of genital HPV infection in various population groups is high, most of these infections regress without intervention. Investigating genetic host factors and cellular immune responses, particularly cytokines, could help in understanding the association between genital HPV infection and carcinogenesis. The tumor necrosis factor alpha (TNF-α) cytokine plays an important role in all stages of cervical cancer and has the ability to induce the regression of human tumors. Therefore, the aim of the study was to investigate the allelic distribution of the -308 TNF-α gene polymorphism in South African women with cervical cancer compared to control women. Methods: Included in our study were women with histologically proven cancer of the cervix (n = 244) and hospital-based controls (n = 228). All patients and controls were from mixed-race and black population groups in South Africa. The bi-allelic -308 (A/G) polymorphism in the promoter region of TNF-α was detected using the amplification refractory mutation system-polymerase chain reaction (ARMS-PCR) technique. The distributions of the allelic frequencies were stratified, in both patients and controls, into two South African ethnic population groups. Results: In this study we observed no association between the distribution of the -308 TNF-α polymorphism and the risk of developing cervical cancer, even after combining the data from the two ethnic populations (χ² = 2.26). In addition, using the chi-squared test we found no significant association between the known risk factors for cervical cancer and the allele distribution of -308 TNF-α. However, the frequency of the rare high-producing allele -308A of TNF-α was significantly lower in the South African population when compared to Caucasian and Chinese population groups. Conclusion: We demonstrated no association between -308 TNF

  6. The allelic distribution of -308 Tumor Necrosis Factor-alpha gene polymorphism in South African women with cervical cancer and control women

    International Nuclear Information System (INIS)

    Govan, Vandana A; Constant, Debbie; Hoffman, Margaret; Williamson, Anna-Lise

    2006-01-01

    Cervical cancer is due to infection with specific high-risk types of human papillomavirus (HPV). Although the incidence of genital HPV infection in various population groups is high, most of these regress without intervention. Investigating genetic host factors and cellular immune responses, particularly cytokines, could help to understand the association between genital HPV infection and carcinogenesis. The tumor necrosis factor alpha (TNF-α) cytokine plays an important role in all stages of cervical cancer and has the ability to induce the regression of human tumors. Therefore the aim of the study was to investigate the allelic distribution of -308 TNF-α gene polymorphism in South African women with cervical cancer compared to control women. Included in our study were women with histologically proven cancer of the cervix (n = 244) and hospital-based controls (n = 228). All patients and controls were from mixed race and black population groups in South Africa. The detection of a bi-allelic -308 (A/G) polymorphism in the promoter region of TNF-α was investigated using the amplification refractory mutation system-polymerase chain reaction (ARMS-PCR) technique. The distributions of the allelic frequencies were stratified in both patients and controls into two South African ethnic population groups. In this study we observed no association between the distribution of -308 TNF-α polymorphism and the risk of developing cervical cancer even after combining the data from the two ethnic populations (χ² = 2.26). In addition, using the chi-squared test we found no significant association between the known risk factors for cervical cancer and the allele distribution of -308 TNF-α. However, the frequency of the rare high-producing allele -308A of TNF-α was significantly lower in the South African population when compared to Caucasians and Chinese population groups. We demonstrated no association between -308 TNF-α polymorphism and the risk of cervical cancer among two

  7. Standard of Living in the European Union

    OpenAIRE

    Stávková, Jana; Žufan, Pavel; Birčiaková, Naďa

    2012-01-01

    This chapter focuses on the measurement of standard of living and on the factors influencing its level. Specific attention is paid to the indicators of standard of living, and the frequent employment of GDP is discussed and compared with possible alternatives. Household income is also a factor of central importance in determining standard of living, and the chapter assesses this factor in terms of household income distribution, the setting of poverty limits, the measurement of income disparit...

  8. Standard Populations (Millions) for Age-Adjustment - SEER Population Datasets

    Science.gov (United States)

    Download files containing standard population data for use in statistical software. The files contain the same data distributed with SEER*Stat software. You can also view the standard populations, either 19 age groups or single ages.

  9. The International Standards Organisation offshore structures standard

    International Nuclear Information System (INIS)

    Snell, R.O.

    1994-01-01

    The International Standards Organisation has initiated a program to develop a suite of ISO Codes and Standards for the Oil Industry. The Offshore Structures Standard is one of seven topics being addressed. The scope of the standard will encompass fixed steel and concrete structures, floating structures, Arctic structures and the site specific assessment of mobile drilling and accommodation units. The standard will use as base documents the existing recommended practices and standards most frequently used for each type of structure, and will develop them to incorporate best published and recognized practice and knowledge where it provides a significant improvement on the base document. Work on the Code has commenced under the direction of an internationally constituted sub-committee comprising representatives from most of the countries with a substantial offshore oil and gas industry. This paper outlines the background to the code and the format, content and work program

  10. What is "Standard" About the Standard Deviation

    OpenAIRE

    Newberger, Florence; Safer, Alan M.; Watson, Saleem

    2010-01-01

    The choice of the formula for standard deviation is explained in elementary statistics textbooks in various ways. We give an explanation for this formula by representing the data as a vector in $\mathbb{R}^n$ and considering its distance from a central tendency vector. In this setting the "standard" formula represents a shortest distance in the standard metric. We also show that different metrics lead to different measures of central tendency.
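    The vector view described in this abstract can be sketched numerically: the population standard deviation equals the Euclidean distance between the data vector and the constant mean vector, divided by sqrt(n). The data values below are arbitrary illustrations.

    ```python
    import math

    data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
    n = len(data)
    mean = sum(data) / n

    # Distance in R^n between the data vector and the central tendency
    # vector (mean, mean, ..., mean), in the standard Euclidean metric.
    distance = math.sqrt(sum((x - mean) ** 2 for x in data))

    # Population standard deviation as a scaled shortest distance.
    std = distance / math.sqrt(n)
    print(std)  # 2.0 for this data set
    ```

    Rescaling by sqrt(n - 1) instead gives the sample standard deviation; the geometric picture is the same, only the normalization changes.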

  11. Standard Reference Tables -

    Data.gov (United States)

    Department of Transportation — The Standard Reference Tables (SRT) provide consistent reference data for the various applications that support Flight Standards Service (AFS) business processes and...

  12. Peculiarity of the temporal distributions of seismic events in the Central America and Mexico.

    Science.gov (United States)

    Sasorova, E.; Levin, B.

    2010-03-01

    First, the interannual earthquake distributions and their peculiarities in the predetermined region are considered. The hypothesis that within-year variability exists for events of various energy levels was tested. The worldwide catalogs ISC (International Seismic Catalog) and NEIC (USGS) were used. All earthquakes (EQs) with Mb >= 4.0 from 1964 onward were extracted for the Pacific part of the given region. The entire set of events under analysis was divided into several magnitude ranges (MRs). An analysis of the completeness of events in the defined MRs was carried out. Aftershocks were removed from the list. Further analysis was performed separately for each MR. The events at each magnitude level were then subdivided into two groups: shallow events (H < Htr) and deep events (H > Htr), where Htr is a depth threshold value. We then checked whether the distributions of events over the course of the year are uniform or non-uniform, testing the data separately for each magnitude level and each depth level. The null hypothesis of uniform EQ distribution over the course of the year was rejected for most samples of shallow EQs (95%), but was confirmed for deep earthquakes. We used the chi-square test for well-filled sequences and the method of statistical testing for poorly filled sequences. The Htr value determines the boundary that divides the seismic events into two groups: if the EQ sources are located above this boundary, the EQs are distributed non-uniformly over the year, while if the sources are located below this boundary, the distribution over the year is uniform. Using a special software procedure, it was found that the Htr boundary between the shallow and deep events in most cases lay at a depth of 60-80 km. A noticeable increase in the number of seismic events in short time intervals, as a rule two times a year, and a significant reduction of seismic activity in the rest of the year was shown. It was

  13. Distributed systems

    CERN Document Server

    Van Steen, Maarten

    2017-01-01

    For this third edition of "Distributed Systems," the material has been thoroughly revised and extended, integrating principles and paradigms into nine chapters: 1. Introduction 2. Architectures 3. Processes 4. Communication 5. Naming 6. Coordination 7. Replication 8. Fault tolerance 9. Security A separation has been made between basic material and more specific subjects. The latter have been organized into boxed sections, which may be skipped on first reading. To assist in understanding the more algorithmic parts, example programs in Python have been included. The examples in the book leave out many details for readability, but the complete code is available through the book's Website, hosted at www.distributed-systems.net.

  14. Spatial distribution

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger

    2008-01-01

    populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns...

  15. Standardization and the European Standards Organisations

    Directory of Open Access Journals (Sweden)

    Marta Orviska

    2014-01-01

    Standardization is a relatively neglected aspect of the EU regulatory process, and yet it is fundamental to that process and arguably has recently been the key vehicle in making the single market an economic reality. Yet the key standardization bodies in the EU, the ESOs, are scarcely known to the public and seldom discussed in the literature. In this article we redress this imbalance, arguing that standardization and integration are closely related concepts. We also argue that the ESOs have developed a degree of autonomy in expanding the boundaries of standardization and even in developing their own links with the rest of the world. Recent proposals put forward by the European Commission can be seen as an attempt to reduce that autonomy. These proposals emphasize the speed of, and stakeholder involvement in, standards production, which we further suggest are somewhat conflicting aims.

  16. Seroprevalence of transfusion transmitted infection among blood donors at Jijiga blood bank, Eastern Ethiopia: retrospective 4 years study.

    Science.gov (United States)

    Mohammed, Yusuf; Bekele, Alemayehu

    2016-02-27

    A transfusion-transmissible infection (TTI) is any infection that is transmissible from person to person through parenteral administration of blood or blood products. The magnitude of TTIs varies from country to country depending on the TTI load in the particular population. Given their severity, the WHO (World Health Organization) has recommended mandatory pre-transfusion blood tests for human immunodeficiency virus (HIV), hepatitis B virus (HBV), hepatitis C virus (HCV), and syphilis. The aim of the current study was to assess the trend and prevalence of TTIs among blood donors at Jijiga Blood Bank between 2010 and 2013. A retrospective cross-sectional study was conducted by reviewing the records from 2010 to 2013 at Jijiga Blood Bank. All blood donors who presented to the blood bank and were screened for TTIs during the study period were included. The data were collected, entered, and analyzed using Epi Info 3.5.1 and Microsoft Excel 2007. Descriptive statistics were expressed as percentages. Chi-square was used for trend analysis, and the p-value was used to declare statistical significance between the variables. A total of 4224 people donated blood during the study period. Males formed the majority of the donor population, accounting for 4171 (98.7%). The majority, 4139 (98%), of donors were replacement donors. The overall prevalence of transfusion-transmitted infection was 487/4224 (11.5%). The prevalence of HBsAg, HCV, HIV, and syphilis antibodies was 460 (10.9%), 17 (0.4%), 6 (0.1%), and 4 (0.1%), respectively. The majority, 460/487 (94.5%), of infections were HBsAg. A statistically significant difference was observed in the number of donations as well as sero-positivity from 2010 to 2013 (chi-square 9.24, p-value = 0.02), in the trend of HBsAg from year to year (chi-square 11.14, p-value = 0.01), and in HIV as the age of donors increased (chi-square 8.37, p-value = 0.01), and there was also statistical significance
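    The chi-square trend analysis reported in this record can be sketched as a Pearson test on a year-by-status table. The counts below are illustrative assumptions, not the study's data, and the p-value uses the closed-form chi-square survival function for three degrees of freedom.

    ```python
    import math

    # Hypothetical screening results by year (illustrative counts only):
    # rows = years 2010..2013, columns = (TTI-positive, TTI-negative) donors.
    by_year = [(110, 900), (125, 950), (130, 880), (122, 1007)]

    rows = [sum(r) for r in by_year]
    cols = [sum(c) for c in zip(*by_year)]
    total = sum(rows)

    # Pearson chi-square statistic over the 4x2 table.
    chi2 = sum((obs - rows[i] * cols[j] / total) ** 2 / (rows[i] * cols[j] / total)
               for i, r in enumerate(by_year) for j, obs in enumerate(r))

    # Degrees of freedom = (4 rows - 1) * (2 cols - 1) = 3; for df = 3 the
    # chi-square survival function has this closed form.
    p = math.erfc(math.sqrt(chi2 / 2)) + math.sqrt(2 * chi2 / math.pi) * math.exp(-chi2 / 2)
    print(f"chi2 = {chi2:.2f}, df = 3, p = {p:.3f}")
    ```

    A p-value below 0.05 would indicate, as in the study, that sero-positivity differs significantly across the years.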

  17. Standards for holdup measurement

    International Nuclear Information System (INIS)

    Zucker, M.S.

    1982-01-01

    Holdup measurements, needed for material balance, depend intensively on standards and on the interpretation of the calibration procedure. More than in other measurements, the calibration procedure using the standard becomes part of the standard. Standards practical for field use and calibration techniques have been developed. While accuracy in holdup measurements is comparatively poor, avoidance of bias is a necessary goal.

  18. Distributed processor systems

    International Nuclear Information System (INIS)

    Zacharov, B.

    1976-01-01

    In recent years, there has been a growing tendency in high-energy physics and in other fields to solve computational problems by distributing tasks among the resources of inter-coupled processing devices and associated system elements. This trend has gained further momentum more recently with the increased availability of low-cost processors and with the development of the means of data distribution. In two lectures, the broad question of distributed computing systems is examined and the historical development of such systems reviewed. An attempt is made to examine the reasons for the existence of these systems and to discern the main trends for the future. The components of distributed systems are discussed in some detail and particular emphasis is placed on the importance of standards and conventions in certain key system components. The ideas and principles of distributed systems are discussed in general terms, but these are illustrated by a number of concrete examples drawn from the context of the high-energy physics environment. (Auth.)

  19. Collaboration Between Multistakeholder Standards

    DEFF Research Database (Denmark)

    Rasche, Andreas; Maclean, Camilla

    Public interest in corporate social responsibility (CSR) has resulted in a wide variety of multistakeholder CSR standards in which companies can choose to participate. While such standards reflect collaborative governance arrangements between public and private actors, the market for corporate responsibility is unlikely to support a great variety of partly competing and overlapping standards. Increased collaboration between these standards would enhance both their impact and their adoption by firms. This report examines the nature, benefits, and shortcomings of existing multistakeholder standards...

  20. Fluorescent standards for photodynamic therapy

    Science.gov (United States)

    Belko, N.; Kavalenka, S.; Samtsov, M.

    2016-08-01

    Photodynamic therapy is an evolving technique for the treatment of various oncological diseases. This method employs photosensitizers, species that lead to the death of tumor cells after photoactivation. Further development and novel applications of photodynamic therapy require new photosensitizers. After synthesis of a new photosensitizer, it is important to know its concentration in different biological tissues following administration and distribution. The concentration is frequently measured by the extraction method, which has some disadvantages, e.g. it requires many biological test subjects that are euthanized during the measurement. We propose to measure the photosensitizer concentration in tissue by its fluorescence. For this purpose, fluorescent standards were developed. The standards are robust and simple to produce, and their fluorescence signal does not change with time. The fluorescence intensity of the fluorescent standards appears to depend linearly on the dye concentration. A set of standards thus allows the calibration of a spectrometer. Finally, the photosensitizer concentration can be determined from the fluorescence intensity by comparing the corresponding spectrum with the spectra of the set of fluorescent standards. A biological test subject is not euthanized during this kind of experiment. We hope this more humane technique can be used in the future instead of the extraction method.

  1. Evaluation of the overlapping of posterior teeth in two techniques of improved interproximal panoramic program and standard panoramic

    Directory of Open Access Journals (Sweden)

    Goodarzi pour D

    2010-06-01

    Background and Aims: Overlapping of the proximal surfaces of posterior teeth in panoramic radiography is a major concern. Therefore, an option has been developed in the Planmeca Promax panoramic unit, namely the improved interproximal mode. This mode produces a lower horizontal angle with the teeth contact region during the unit rotation, decreasing overlapping in panoramic images of the posterior teeth, especially the premolars. The present study was done to compare the overlapping of posterior teeth using two techniques: the improved interproximal panoramic program and standard panoramic. Materials and Methods: In this diagnostic study, 32 patients requiring panoramic radiographs of their posterior teeth during their routine diagnosis and treatment process, with a mean age of 27.3 years, participated. No patients showed crowding of posterior teeth or missing and restored posterior teeth. The participants' panoramic radiographs were randomly taken by the two techniques, improved interproximal panoramic and standard panoramic, using the Planmeca Promax device. The overlapping in the panoramic images was blindly assessed by an oral radiologist. The overlapping in both techniques was reported by frequency and percentage. Comparisons between the two techniques were made by the chi-square test, and the odds ratio of overlapping was estimated using regression analysis. Results: In the standard panoramic technique, 38.5% (148 of 384) of proximal surface contacts overlapped, while overlapping of the proximal surfaces was observed in 18.8% (72 of 384) of overall contacts in the improved interproximal technique. Significant differences were noted between the two techniques regarding overlapping (P<0.001). Also, 66.4% and 39.1% of contacts between teeth 4-5 overlapped in the standard and improved techniques, respectively. The values were 39.1% and 12.5% for contacts between teeth 5-6, and 10.2% and 4.7% for contacts between teeth 6-7, in the two techniques

  2. Quasihomogeneous distributions

    CERN Document Server

    von Grudzinski, O

    1991-01-01

    This is a systematic exposition of the basics of the theory of quasihomogeneous (in particular, homogeneous) functions and distributions (generalized functions). A major theme is the method of taking quasihomogeneous averages. It serves as the central tool for the study of the solvability of quasihomogeneous multiplication equations and of quasihomogeneous partial differential equations with constant coefficients. Necessary and sufficient conditions for solvability are given. Several examples are treated in detail, among them the heat and the Schrödinger equation. The final chapter is devoted to quasihomogeneous wave front sets and their application to the description of singularities of quasihomogeneous distributions, in particular to quasihomogeneous fundamental solutions of the heat and of the Schrödinger equation.

  3. MAIL DISTRIBUTION

    CERN Multimedia

    J. Ferguson

    2002-01-01

    Following discussions with the mail contractor and Mail Service personnel, an agreement has been reached which permits deliveries to each distribution point to be maintained, while still achieving a large proportion of the planned budget reduction in 2002. As a result, the service will revert to its previous level throughout the Laboratory as rapidly as possible. Outgoing mail will be collected from a single collection point at the end of each corridor. Further discussions are currently in progress between ST, SPL and AS divisions on the possibility of an integrated distribution service for internal mail, stores items and small parcels, which could lead to additional savings from 2003 onwards, without affecting service levels. J. Ferguson AS Division

  4. Distributed SLAM

    Science.gov (United States)

    Binns, Lewis A.; Valachis, Dimitris; Anderson, Sean; Gough, David W.; Nicholson, David; Greenway, Phil

    2002-07-01

    Previously, we have developed techniques for Simultaneous Localization and Map Building based on the augmented state Kalman filter. Here we report the results of experiments conducted over multiple vehicles each equipped with a laser range finder for sensing the external environment, and a laser tracking system to provide highly accurate ground truth. The goal is simultaneously to build a map of an unknown environment and to use that map to navigate a vehicle that otherwise would have no way of knowing its location, and to distribute this process over several vehicles. We have constructed an on-line, distributed implementation to demonstrate the principle. In this paper we describe the system architecture, the nature of the experimental set up, and the results obtained. These are compared with the estimated ground truth. We show that distributed SLAM has a clear advantage in the sense that it offers a potential super-linear speed-up over single vehicle SLAM. In particular, we explore the time taken to achieve a given quality of map, and consider the repeatability and accuracy of the method. Finally, we discuss some practical implementation issues.

  5. Hospital standardized mortality ratio: consequences of adjusting hospital mortality with indirect standardization.

    Directory of Open Access Journals (Sweden)

    Maurice E Pouw

    BACKGROUND: The hospital standardized mortality ratio (HSMR) was developed to evaluate and improve hospital quality. Different methods can be used to standardize the hospital mortality ratio. Our aim was to assess the validity and applicability of directly and indirectly standardized hospital mortality ratios. METHODS: Retrospective scenario analysis using routinely collected hospital data to compare deaths predicted by the indirectly standardized case-mix adjustment method with observed deaths. Discharges from Dutch hospitals in the period 2003-2009 were used to estimate the underlying prediction models. We analysed variation in indirectly standardized hospital mortality ratios (HSMRs) when changing the case-mix distributions under different scenarios. Sixty-one Dutch hospitals were included in our scenario analysis. RESULTS: A numerical example showed that when interaction between hospital and case-mix is present and case-mix differs between hospitals, indirectly standardized HSMRs vary between hospitals providing the same quality of care. In the empirical data analysis, the differences between directly and indirectly standardized HSMRs for individual hospitals were limited. CONCLUSION: Direct standardization is not affected by the presence of interaction between hospital and case-mix and is therefore theoretically preferable over indirect standardization. Since direct standardization is practically impossible when multiple predictors are included in the case-mix adjustment model, indirect standardization is the only available method to compute the HSMR. Before interpreting such indirectly standardized HSMRs, the case-mix distributions of individual hospitals and the presence of interactions between hospital and case-mix should be assessed.
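    The indirect standardization underlying the HSMR can be sketched as follows: expected deaths are accumulated by applying stratum-specific reference mortality rates to the hospital's own case mix, and the ratio of observed to expected deaths (times 100) is the indirectly standardized ratio. The strata, rates, and counts below are illustrative assumptions, not data from the study.

    ```python
    # Reference (benchmark) death rates per case-mix stratum,
    # e.g. by diagnosis/age group (illustrative values).
    reference_rates = {"low_risk": 0.01, "medium_risk": 0.05, "high_risk": 0.20}

    # One hospital's case mix (admissions per stratum) and its observed deaths.
    admissions = {"low_risk": 1000, "medium_risk": 400, "high_risk": 100}
    observed_deaths = 45

    # Indirect standardization: expected deaths under the reference rates,
    # weighted by this hospital's own case-mix distribution.
    expected_deaths = sum(reference_rates[s] * n for s, n in admissions.items())

    hsmr = 100 * observed_deaths / expected_deaths
    print(f"expected = {expected_deaths:.1f}, HSMR = {hsmr:.1f}")
    # expected = 50.0, HSMR = 90.0 (fewer deaths than expected)
    ```

    The paper's point is visible in this construction: because the expectation is weighted by each hospital's own case mix, two hospitals with identical stratum-specific quality can still receive different HSMRs when hospital-by-case-mix interaction is present.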

  6. Use of a web-based educational intervention to improve knowledge of healthy diet and lifestyle in women with Gestational Diabetes Mellitus compared to standard clinic-based education.

    Science.gov (United States)

    Sayakhot, Padaphet; Carolan-Olah, Mary; Steele, Cheryl

    2016-08-05

    This study introduced a web-based educational intervention for Australian women with gestational diabetes mellitus (GDM). The aim was to improve knowledge of healthy diet and lifestyle in GDM. Evaluation of the intervention explored women's knowledge and understanding of GDM, healthy diet, healthy food, and healthy lifestyle after using the web-based program, compared to women receiving standard clinic-based GDM education. A total of 116 women, aged 18-45 years, newly diagnosed with GDM, participated (intervention n = 56, control n = 60). Women were randomly allocated to the intervention or control group, and both groups attended a standard GDM education class. Group 1 (intervention) additionally used an online touch-screen/computer program. All women completed a questionnaire following the computer program and/or the education class. All questions evaluating levels of knowledge had more than one correct answer, and scores were graded from 0 to 1, with each correct component receiving a score, e.g. 0.25 per correct answer in a 4-answer question. A chi-square test was performed to compare the two groups regarding knowledge of GDM. Findings indicated that the majority of women in the intervention group reported correct answers for "types of carbohydrate foods" for pregnant women with GDM, compared to the control group (62.5 % vs 58.3 %, respectively). Most women in both groups had an excellent understanding of "fruits and vegetables" (98.2 % vs 98.3 %), and the majority of women in the intervention group understood that they should exercise daily for 30 min, compared to the control group (92.9 % vs 91.7 %). Both groups had a good understanding across all categories; however, the majority of women in the intervention group scored all correct answers (score = 1) in terms of foetal effects (17.9 % vs 13.3 %, respectively), maternal predictors (5.4 % vs 5 %), care requirements (39.3 % vs 23.3 %), GDM perceptions (48.2 % vs 46.7 %) and
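
The chi-square comparison described above can be sketched as follows. The 2x2 counts are inferred from the reported percentages (62.5 % of n = 56 and 58.3 % of n = 60 both correspond to 35 correct answers); they are a reconstruction for illustration, not counts taken directly from the paper.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Rows: intervention / control; columns: correct / incorrect, for the
# "types of carbohydrate foods" item (35/56 = 62.5 %, 35/60 = 58.3 %).
stat = chi2_2x2(35, 21, 35, 25)

# Critical value for alpha = 0.05 with 1 degree of freedom.
CHI2_CRIT_1DF = 3.841
significant = stat > CHI2_CRIT_1DF
```

With these counts the statistic is about 0.21, far below the 3.84 critical value, so the group difference on this item is not statistically significant, consistent with the similar percentages reported.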

  7. Standards, Standards, Standards: The Unintended Consequences of Widening Participation?

    Science.gov (United States)

    Stuart, Mary

    2002-01-01

    Debate over widening access to higher education is narrowing to a focus on preservation of standards. Examination of the discourses of school policy, classroom environment, and peer culture shows how these competing cultures can work against efforts to increase participation. (Contains 17 references.) (SK)

  8. THE DECOMPOSITION OF POVERTY: A DISTRIBUTIVE APPROACH TO LIVING STANDARDS

    Directory of Open Access Journals (Sweden)

    Mbu Daniel Tambi

    2012-04-01

    Full Text Available This study attempts to carry out a comprehensive analysis of the evolution of poverty trends using national household consumption surveys I and II, collected in 1996 and 2001 respectively. The theoretical decomposition frameworks propelling the study are motivated mainly by the Shapley value, while empirical estimates are obtained from DAD 4.4. From our findings, we observe that the Rural forest and Rural highlands regions were hardest hit by poverty and inequality trends in Cameroon. The results show that the within-region effects were more instrumental in accounting for changes in all the classes of poverty measures than the inter-sector population-shift effects in the period under review; the between-region effects, by contrast, systematically contributed to alleviating poverty in the Rural forest and Rural highlands while aggravating poverty in Yaounde and Douala. Based on our results, we suggest that policies and strategies for reducing poverty and inequality should place particular emphasis on the countryside and on a region-by-region approach, such as decentralization and increased provision of rural extension services (roads, electricity, markets, potable water).
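
The within-region versus population-shift split used above is commonly obtained as a Shapley decomposition: each factor's contribution is its marginal effect averaged over the possible update orders, so the contributions sum exactly to the total change. A minimal two-region, two-factor sketch with invented shares and poverty rates:

```python
def headcount(shares, rates):
    """National headcount poverty rate from regional shares and rates."""
    return sum(shares[r] * rates[r] for r in shares)

def shapley_two_factor(s0, r0, s1, r1):
    """Split the change in headcount poverty into a within-region (rate)
    effect and a population-shift (share) effect, Shapley style."""
    total = headcount(s1, r1) - headcount(s0, r0)
    # Rate effect: marginal contribution of updating rates, averaged over
    # whether shares have already been updated or not.
    within = 0.5 * ((headcount(s0, r1) - headcount(s0, r0)) +
                    (headcount(s1, r1) - headcount(s1, r0)))
    # Share effect: marginal contribution of updating shares, likewise.
    shift = 0.5 * ((headcount(s1, r0) - headcount(s0, r0)) +
                   (headcount(s1, r1) - headcount(s0, r1)))
    return total, within, shift

# Invented example: rural share falls, both regional poverty rates move.
s0 = {"rural": 0.6, "urban": 0.4}; r0 = {"rural": 0.50, "urban": 0.20}
s1 = {"rural": 0.5, "urban": 0.5}; r1 = {"rural": 0.45, "urban": 0.25}
total, within, shift = shapley_two_factor(s0, r0, s1, r1)
```

By construction `within + shift` reproduces the total change exactly, which is the property that makes the Shapley value attractive for such decompositions.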

  9. Decomposition of inequality in the distribution of living standards in ...

    African Journals Online (AJOL)

    This paper examines the characteristics of inequality and its decomposition via the Generalized Entropy Class of indices. It uses the 1996 Cameroon Household Consumption Survey generated by the national Statistics Office. Inequality is most pronounced in the urban areas, among the highly educated, women heads of ...
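
The Generalized Entropy class mentioned above decomposes exactly into within-group and between-group components. A minimal sketch for the Theil-T index (GE with alpha = 1), using invented urban/rural expenditure values:

```python
import math

def theil_t(y):
    """Theil-T inequality index of a list of positive expenditures."""
    mu = sum(y) / len(y)
    return sum((v / mu) * math.log(v / mu) for v in y) / len(y)

def theil_decompose(groups):
    """groups: name -> list of expenditures. Returns (total, within, between)."""
    all_y = [v for g in groups.values() for v in g]
    n, mu = len(all_y), sum(all_y) / len(all_y)
    within = between = 0.0
    for g in groups.values():
        share = sum(g) / (n * mu)              # group's share of total expenditure
        mu_g = sum(g) / len(g)
        within += share * theil_t(g)           # inequality inside the group
        between += share * math.log(mu_g / mu) # inequality across group means
    return theil_t(all_y), within, between

groups = {"urban": [300, 500, 900, 1500], "rural": [100, 150, 200, 250]}
total, within, between = theil_decompose(groups)
```

The identity total = within + between holds exactly for GE indices, which is why this class is the standard choice for subgroup decompositions of the kind described in the abstract.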

  10. Distribution switchgear

    CERN Document Server

    Stewart, Stan

    2004-01-01

    Switchgear plays a fundamental role within the power supply industry. It is required to isolate faulty equipment, divide large networks into sections for repair purposes, reconfigure networks in order to restore power supplies and control other equipment. This book begins with the general principles of the switchgear function and leads on to discuss topics such as interruption techniques, fault level calculations, switching transients and electrical insulation, making this an invaluable reference source. Solutions to practical problems associated with distribution switchgear are also included.

  11. Agent Standards Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the work herein proposed is the development of standards for software autonomous agents. These standards are essential to achieve software...

  12. Catalytic Functions of Standards

    NARCIS (Netherlands)

    K. Blind (Knut)

    2009-01-01

    textabstractThe three different areas and the examples have illustrated several catalytic functions of standards for innovation. First, the standardisation process reduces the time to market of inventions, research results and innovative technologies. Second, standards themselves promote the

  13. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  14. Dental Assisting Program Standards.

    Science.gov (United States)

    Georgia Univ., Athens. Dept. of Vocational Education.

    This publication contains statewide standards for the dental assisting program in Georgia. The standards are divided into 12 categories: foundations (philosophy, purpose, goals, program objectives, availability, evaluation); admissions (admission requirements, provisional admission requirements, recruitment, evaluation and planning); program…

  15. Protection of Distribution Systems with Distributed Energy Resources

    DEFF Research Database (Denmark)

    Bak-Jensen, Birgitte; Browne, Matthew; Calone, Roberto

    The usage of Distributed Energy Resources (DER) in utilities around the world is expected to increase significantly. The existing distribution systems have been generally designed for unidirectional power flow, and feeders are opened and locked out for any fault within. However, in the future...... of 17 months of work of the Joint Working Group B5/C6.26/CIRED “Protection of Distribution Systems with Distributed Energy Resources”. The working group used the CIGRE report TB421 “The impact of Renewable Energy Sources and Distributed Generation on Substation Protection and Automation”, published by WG B5.34 as the entry document for the work on this report. In doing so, the group aligned the content and the scope of this report, the network structures considered, possible islanding, standardized communication and adaptive protection, interface protection, connection schemes and protection...

  16. The role of food standards in development

    DEFF Research Database (Denmark)

    Trifkovic, Neda

    of the emergence of food standards on farmers’ wellbeing, (ii) the effects of various forms of vertical coordination on household welfare and (iii) the consequences of the concurrent emergence of food standards and vertical coordination in the Vietnamese pangasius sector. The first paper, Food Standards are Good — for Middle-Class Farmers, joint with Henrik Hansen, estimates the impact of food standards on farmers’ wellbeing using the data from the Vietnamese pangasius sector. In this paper we estimate both the average effect as well as the effects on poorer and richer farmers using the instrumental variable quantile regression. We find that large returns from food standards are possible but the gains are substantial only for the ‘middle-class’ farmers, occupying the range between the 50% and 85% quantiles of the expenditure distribution. Overall, this result points to an exclusionary impact of food standards for the poorest...

  17. The Distance Standard Deviation

    OpenAIRE

    Edelmann, Dominic; Richards, Donald; Vogel, Daniel

    2017-01-01

    The distance standard deviation, which arises in distance correlation analysis of multivariate data, is studied as a measure of spread. New representations for the distance standard deviation are obtained in terms of Gini's mean difference and in terms of the moments of spacings of order statistics. Inequalities for the distance variance are derived, proving that the distance standard deviation is bounded above by the classical standard deviation and by Gini's mean difference. Further, it is ...
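
A sample version of the distance standard deviation can be sketched from the double-centered pairwise distance matrix (the V-statistic form); the final quantities exercise the two upper bounds stated in the abstract. The data are simulated, and this is an illustrative reading of the standard definitions rather than code from the paper.

```python
import numpy as np

def distance_sd(x):
    """Sample distance standard deviation of a univariate sample
    (square root of the V-statistic distance variance)."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])                  # |x_i - x_j|
    # Double-center: subtract row and column means, add the grand mean.
    a = d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()
    return float(np.sqrt((a ** 2).mean()))

rng = np.random.default_rng(1)
x = rng.normal(size=500)

dsd = distance_sd(x)
sd = float(x.std())                                      # classical sd (ddof=0)
gini_md = float(np.abs(x[:, None] - x[None, :]).mean())  # Gini's mean difference
# Per the abstract, dsd <= sd and dsd <= gini_md.
```

Both inequalities hold for any distribution with finite second moment, and hence for the empirical distribution of the sample when the classical standard deviation is computed in its population (ddof=0) form.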

  18. Radiological Control Technician: Standardized technician Qualification Standard

    International Nuclear Information System (INIS)

    1992-10-01

    The Qualification Standard states and defines the knowledge and skill requirements necessary for successful completion of the Radiological Control Technician Training Program. The standard is divided into three phases. Phase I concerns RCT academic training. There are 13 lessons associated with the core academics program and 19 lessons associated with the site academics program. The staff member should sign the appropriate blocks upon successful completion of the examination for that lesson or group of lessons. In addition, facility-specific lesson plans may be added to meet the knowledge requirements in the Job Performance Measures (JPMs) of the practical program. Phase II concerns RCT core/site practical (JPM) training. There are thirteen generic tasks associated with the core practical program. Both the trainer/evaluator and the student should sign the appropriate block upon successful completion of each JPM. In addition, facility-specific tasks may be added or generic tasks deleted based on the results of the facility job evaluation. Phase III concerns the oral examination board; successful completion of the oral examination board is documented by the signature of the chairperson of the board. Upon completion of all of the standardized technician qualification requirements, final qualification is verified by the student and the manager of the Radiological Control Department and acknowledged by signatures on the qualification standard. The completed Qualification Standard shall be maintained as an official training record

  19. Quality of semantic standards

    NARCIS (Netherlands)

    Folmer, E.J.A.

    2012-01-01

    Little scientific literature addresses the issue of quality of semantic standards, albeit a problem with high economic and social impact. Our problem survey, including 34 semantic Standard Setting Organizations (SSOs), gives evidence that quality of standards can be improved, but for improvement a

  20. Automotive Technology Skill Standards

    Science.gov (United States)

    Garrett, Tom; Asay, Don; Evans, Richard; Barbie, Bill; Herdener, John; Teague, Todd; Allen, Scott; Benshoof, James

    2009-01-01

    The standards in this document are for Automotive Technology programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school automotive program. Minimally, the student will complete a three-year program to achieve all standards. Although these exit-level standards are designed…

  1. [Ophthalmology and standardization].

    Science.gov (United States)

    Heitz, R

    1989-01-01

    The standards are the references for quality and safety of materials, instruments and devices in ophthalmological use. The French standardisation association, "Association Française de Normalisation" (AFNOR), drafts its standards in consultation with the professionals concerned. Ophthalmologists are concerned by standards for diagnostic and therapeutic instruments, intraocular and orbital implants, contact lenses, spectacle frames and glasses, and ocular protectors.

  2. Evaluating Sustainability: a Need for Standards

    Directory of Open Access Journals (Sweden)

    Güler Aras

    2008-06-01

    of a business. We further argue that this is problematic in the present global environment when stewardship of resources is becoming paramount. We therefore argue that sustainability is actually based upon efficiency in the transformational process and equity in the distribution of effects. We therefore argue for the need for standards in analysing and measuring sustainability and outline a more complete model which recognises distributional implications, and is developed into a model of operationalisability

  3. Parton Distributions

    CERN Document Server

    Dittmar, M.; Glazov, A.; Moch, S.; Altarelli, G.; Anderson, J.; Ball, R.D.; Beuf, G.; Boonekamp, M.; Burkhardt, H.; Caola, F.; Ciafaloni, M.; Colferai, D.; Cooper-Sarkar, A.; de Roeck, A.; Del Debbio, L.; Feltesse, J.; Gelis, F.; Grebenyuk, J.; Guffanti, A.; Halyol, V.; Latorre, J.I.; Lendermann, V.; Li, G.; Motyka, L.; Petersen, T.; Piccione, A.; Radescu, V.; Rogal, M.; Rojo, J.; Royon, C.; Salam, G.P.; Salek, D.; Stasto, A.M.; Thorne, R.S.; Ubiali, M.; Vermaseren, J.A.M.; Vogt, A.; Watt, G.; White, C.D.

    2009-01-01

    We provide an assessment of the state of the art in various issues related to experimental measurements, phenomenological methods and theoretical results relevant for the determination of parton distribution functions (PDFs) and their uncertainties, with the specific aim of providing benchmarks of different existing approaches and results in view of their application to physics at the LHC. We discuss higher order corrections, we review and compare different approaches to small x resummation, and we assess the possible relevance of parton saturation in the determination of PDFs at HERA and its possible study in LHC processes. We provide various benchmarks of PDF fits, with the specific aim of studying issues of error propagation, non-gaussian uncertainties, choice of functional forms of PDFs, and combination of data from different experiments and different processes. We study the impact of combined HERA (ZEUS-H1) structure function data, their impact on PDF uncertainties, and their implications for the computa...

  4. Interconnection of Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Reiter, Emerson [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-04-19

    This is a presentation on interconnection of distributed energy resources, including the relationships between different aspects of interconnection, best practices and lessons learned from different areas of the U.S., and an update on technical advances and standards for interconnection.

  5. Measuring the arrival times of overlapped photo-events

    CERN Document Server

    Stoyanov, D V; Kolarov, G V

    2000-01-01

    We have developed and tested, experimentally and by computer simulations, a novel method for measuring the individual arrival times of temporally non-resolved (overlapped) single-photon detector pulses in high-intensity streams. The method is based on a set of linear transformations of photo-event pulses into a deconvolution, a transformation to standardized functions, a non-linear chi-square fitting, etc. The retrieval accuracy varies within the range of 2-4 ns at an ADC sampling interval of 50 ns. Simultaneous measurement, without dead-time effects, of the main photon statistics distributions, such as the distribution of intra-event intervals and the charge distributions, has been demonstrated. The method covers the intermediate scale of intensities, where other well-known techniques for processing of photo-events are ineffective. It can be effectively applied in nuclear experiments, time-resolved spectroscopy, optical remote sensing, etc.
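
Only the chi-square fitting step of the method lends itself to a compact illustration. The toy sketch below (pulse template, amplitudes, sampling and noise are all invented) recovers two overlapped arrival times by minimizing a chi-square between a sampled trace and a two-pulse model over a grid of candidate time pairs; the paper's actual pipeline additionally involves deconvolution and transformation to standardized functions.

```python
import numpy as np

DT = 50.0                                # ADC sampling interval, ns
t = np.arange(40) * DT                   # sampled time axis, 0..1950 ns

def pulse(t, t0, tau=120.0):
    """Invented single-photoelectron template: one-sided exponential."""
    return np.where(t >= t0, np.exp(-(t - t0) / tau), 0.0)

# Simulate two overlapped events 180 ns apart, plus mild noise.
rng = np.random.default_rng(3)
true_t1, true_t2 = 400.0, 580.0
trace = pulse(t, true_t1) + pulse(t, true_t2) + rng.normal(0.0, 0.01, t.size)

# Chi-square grid search over ordered pairs of candidate arrival times on
# a 10 ns grid, i.e. finer than the 50 ns sampling interval.
grid = np.arange(0.0, 1000.0, 10.0)
best, best_chi2 = (None, None), np.inf
for t1 in grid:
    for t2 in grid:
        if t2 < t1:
            continue
        resid = trace - pulse(t, t1) - pulse(t, t2)
        chi2 = float(np.sum(resid ** 2))
        if chi2 < best_chi2:
            best, best_chi2 = (t1, t2), chi2

t1_hat, t2_hat = best
```

Because the template is known, the chi-square surface has a sharp minimum and the two arrival times are separated at a resolution finer than the sampling interval, which is the essence of resolving "overlapped" photo-events.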

  6. Requirements of quality standards

    International Nuclear Information System (INIS)

    Mueller, J.

    1977-01-01

    The lecture traces the development of nuclear standards, codes, and Federal regulations on quality assurance (QA) for nuclear power plants and associated facilities. The technical evolution of the last twelve years, especially in the area of nuclear technology, led to different activities and regulatory initiatives, with the present result that several nations have their own home-grown standards. The lecture discusses the former and especially the current activities in standards development, and describes the requirements of the QA standards used in the USA and in Europe, especially West Germany. Furthermore, the lecture attempts to give a comparison and an evaluation of the international quality standards from the author's viewpoint. Finally, the lecture presents an outlook on the future international implications of QA standards. There is an urgent need within the nuclear industry for simplification and standardization of QA standards. The relationships between the various standards, and the applicability of the standards, need clarification and greater transparency. To point out these problems is the purpose of the lecture. (orig.) [de

  7. Electrical, instrumentation, and control codes and standards

    International Nuclear Information System (INIS)

    Kranning, A.N.

    1978-01-01

    During recent years numerous documents in the form of codes and standards have been developed and published to provide design, fabrication and construction rules and criteria applicable to instrumentation, control and power distribution facilities for nuclear power plants. The contents of this LTR were prepared by NUS Corporation under Subcontract K5108 and provide a consolidated index and listing of the documents selected for their application to procurement of materials and design of modifications and new construction at the LOFT facility. These codes and standards should be applied together with the National Electrical Code, the ID Engineering Standards and LOFT Specifications to all LOFT instrument and electrical design activities

  8. Optimizing Mexico’s Water Distribution Services

    Science.gov (United States)

    2011-10-28

    private participation in urban areas. Other federal organizations managing the water distribution sector include the Secretaría del Medio Ambiente y Recursos Naturales (SEMARNAT), to which CNA is responsible. SEMARNAT manages environmental standards and water discharge fees. The Secretary

  9. source apportionment and distribution of polycyclic aromatic

    African Journals Online (AJOL)


    pyrogenic - derived from incomplete combustion of recent (e.g., biomass) .... aromatic hydrocarbons (PAHs) used as internal standards and surrogates: ...

  10. Beyond the standard model

    International Nuclear Information System (INIS)

    Wilczek, F.

    1993-01-01

    The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs

  11. Governing through standards

    DEFF Research Database (Denmark)

    Brøgger, Katja

    This abstract addresses the ways in which new education standards have become integral to new modes of education governance. The paper explores the role of standards in accelerating the shift from national to transnational governance in higher education. Drawing on the case of higher education...... development in Scandinavia, the paper focuses on the unintended effects of the new international standards. The Bologna process was reframed and recontextualized in ways that undermined the very system it was set out to transform and govern....

  12. Standardization of splash device

    Science.gov (United States)

    Fernández Raga, María; Peters, Piet

    2017-04-01

    Erosion is a complex process that has been studied extensively by numerous researchers, requiring a prolonged time effort and a large economic investment. To be effective, measurements of erosion should be precise, controlled and replicable, and to assure effectiveness, measurement devices should be properly designed, constructed and well calibrated, and they should be operated by a trained person (Stroosnijder, 2005). Because researchers try to improve old devices, the equipment is constantly being redesigned, making measurements not comparable and, furthermore, producing a lack of available standardized devices. The lack of standardization of erosion equipment is most obvious in the case of local splash erosion, where the nature of the process makes it very difficult to isolate its effects. In this article we compare the results obtained from five of the most common splash erosion devices (selected from more than 16 different current types), under the same rain conditions, with the objective of facilitating the standardization of the method that is easiest to build while minimizing the error. A set of six splash devices was set up in well-known positions under simulated rain to measure the differences among devices and the accuracy of the data recovered after 10 minutes of rainfall simulation under different intensities (from 60 to 130 mm/h). The rainfall simulator of Wageningen was used, with sand as the splash erosion source. Differences in infiltration were also measured, and a calibration of the sizes and speeds of the raindrops was done using the photography method (Hamidreza-Sadeghi et al., 2013). The splash devices selected for this study were unbounded splash devices (like the funnel, the cup (Fernandez-Raga et al., 2010) and the splash flume (Jomaa et al., 2010)), and bounded devices that allow the calculation of splash rate (like the new cup (Scholten et al., 2011) and the Morgan tray).
The behaviour of different splash devices

  13. Consumo e padronização de fármacos antihipertensivos na farmácia-ensino da Universidade Estadual de Maringá (FEN-UEM Distribution and standardization of antihypertensive medications in the pharmacy of State University of Maringá (FEN-UEM

    Directory of Open Access Journals (Sweden)

    Roberto Kenji Nakamura Cuman

    1999-07-01

    Full Text Available With the objective of standardizing antihypertensive drugs at FEN-UEM, the acquisition and consumption of these drugs were analyzed. Data were collected from the purchase invoices for antihypertensive pharmaceutical specialties (EAH) of the pharmacy. The study covered the period from January 1st, 1990 through December 31st, 1994. The amount of EAH acquired, the active substance and the pharmacological class were analyzed. FEN-UEM acquired 4,751 EAH in this period. There was an increase in the acquisition of angiotensin-converting enzyme inhibitors (63.6% and a reduction for diuretics (54.2%, beta-blockers (36.3%, central alpha agonists (64.3% and combinations (28.2%. The acquisition of calcium antagonists was constant over this period. The data suggest that the acquisition and distribution of these medications are correlated with current therapeutics, allowing the profile of antihypertensive drugs at the teaching pharmacy of UEM to be evaluated and the EAH dispensed by the pharmacy to be standardized.

  14. The Standard Model course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    Suggested Readings: Aspects of Quantum Chromodynamics/A. Pich, arXiv:hep-ph/0001118; The Standard Model of Electroweak Interactions/A. Pich, arXiv:hep-ph/0502010; The Standard Model of Particle Physics/A. Pich. The Standard Model of Elementary Particle Physics will be described. A detailed discussion of the particle content, structure and symmetries of the theory will be given, together with an overview of the most important experimental facts which have established this theoretical framework as the Standard Theory of particle interactions.

  15. International hearing protector standardization

    DEFF Research Database (Denmark)

    Poulsen, Torben

    2002-01-01

    Hearing protectors shall fulfill some minimum requirements to their performance. As hearing protector manufacturers sell their products all over the world, the testing and certification of hearing protectors has become an international issue. The ISO working group WG17, under the headings Acoustics, Noise, produces hearing protector standards to be used at an international level. The presentation will cover the ongoing work in WG17, including the revision of existing standards (ISO 4869-1, ISO 4869-3), upcoming new standards (ISO 4869-7) and the plans and status for future standards (performance...

  16. Standardization: colorfull or dull?

    Science.gov (United States)

    van Nes, Floris L.

    2003-01-01

    After mentioning the necessity of standardization in general, this paper explains how human factors, or ergonomics standardization by ISO and the deployment of information technology were linked. Visual display standardization is the main topic; the present as well as the future situation in this field are treated, mainly from an ISO viewpoint. Some observations are made about the necessary and interesting co-operation between physicists and psychologists, of different nationality, who both may be employed by either private enterprise or governmental institutions, in determining visual display requirements. The display standard that is to succeed the present ISO standards in this area: ISO 9241-3, -7, -8 and ISO 13406-1, -2, will have a scope that is not restricted to office tasks. This means a large extension of the contexts for which display requirements have to be investigated and specified especially if mobile use of displays, under outdoor lighting conditions, is included. The new standard will be structured in such a way that it is better accessible than the present ones for different categories of standards users. The subject color in the new standard is elaborated here. A number of questions are asked as to which requirements on color rendering should be made, taking new research results into account, and how far the new standard should go in making recommendations to the display user.

  17. Standard NIM instrumentation system

    CERN Document Server

    1990-01-01

    NIM is a standard modular instrumentation system that is in wide use throughout the world. As the NIM system developed and accommodations were made to a dynamic instrumentation field and a rapidly advancing technology, additions, revisions and clarifications were made. These were incorporated into the standard in the form of addenda and errata. This standard is a revision of the NIM document, AEC Report TID-20893 (Rev 4) dated July 1974. It includes all the addenda and errata items that were previously issued as well as numerous additional items to make the standard current with modern technology and manufacturing practice.

  18. Position paper on standardization

    International Nuclear Information System (INIS)

    1991-04-01

    The ''NPOC Strategic Plan for Building New Nuclear Plants'' creates a framework within which new standardized nuclear plants may be built. The Strategic Plan is an expression of the nuclear energy industry's serious intent to create the necessary conditions for new plant construction and operation. One of the key elements of the Strategic Plan is a comprehensive industry commitment to standardization: through design certification, combined license, first-of-a-kind engineering, construction, operation and maintenance of nuclear power plants. The NPOC plan proposes four stages of standardization in advanced light water reactors (ALWRs). The first stage is established by the ALWR Utility Requirements Document which specifies owner/operator requirements at a functional level covering all elements of plant design and construction, and many aspects of operations and maintenance. The second stage of standardization is that achieved in the NRC design certification. This certification level includes requirements, design criteria and bases, functional descriptions and performance requirements for systems to assure plant safety. The third stage of standardization, commercial standardization, carries the design to a level of completion beyond that required for design certification to enable the industry to achieve potential increases in efficiency and economy. The final stage of standardization is enhanced standardization beyond design. A standardized approach is being developed in construction practices, operating, maintenance training, and procurement practices. This comprehensive standardization program enables the NRC to proceed with design certification with the confidence that standardization beyond the regulations will be achieved. This confidence should answer the question of design detail required for design certification, and demonstrate that the NRC should require no further regulatory review beyond that required by 10 CFR Part 52

  19. A gestão dos recursos naturais nas organizações certificadas pela norma NBR ISO 14001 Managementof natural resources in organizations certified by NBR ISO 14001 standard

    Directory of Open Access Journals (Sweden)

    Celso Machado Junior

    2013-03-01

    Full Text Available This study focuses on a set of variables associated with control processes and actions to mitigate the consumption of resources in operations, in order to determine whether companies certified by the NBR ISO 14001 standard have environmental management procedures significantly different from those adopted by companies which are not certified. It is a descriptive study based on a sample of 649 companies from different businesses which provided information for publication in the journal "Análise Gestão Ambiental" (2008). The following resources were the objects of study: water, electricity, fuel, wood/coal and mineral resources. Using statistical analysis, embodied in logistic regression and the chi-square test, it was observed that companies certified by the NBR ISO 14001 standard invest in a larger set of environmental factors in their management, through controls, actions, and programs structured to reduce the
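
The logistic-regression part of such an analysis can be sketched with a single certification dummy as predictor, fitted by Newton-Raphson on the log-likelihood; for one binary predictor the fitted slope is the log odds ratio of the 2x2 table. All counts below are invented for illustration and are not from the study.

```python
import math

# Invented counts: certified firms, 120 with a water-consumption control
# and 30 without; non-certified firms, 200 with and 299 without.
data = ([(1, 1)] * 120 + [(1, 0)] * 30 +
        [(0, 1)] * 200 + [(0, 0)] * 299)      # (certified, has_control)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(data, iters=25):
    """Newton-Raphson for the model logit P(y=1|x) = b0 + b1*x."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in data:
            p = sigmoid(b0 + b1 * x)
            g0 += y - p                        # score (gradient) components
            g1 += x * (y - p)
            w = p * (1.0 - p)                  # information (Hessian) weights
            h00 += w; h01 += w * x; h11 += w * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det      # Newton step: H^-1 @ g
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

b0, b1 = fit_logistic(data)
odds_ratio = math.exp(b1)
```

With these counts, exp(b1) reproduces the cross-tabulated odds ratio (120/30)/(200/299) = 5.98, illustrating how the regression quantifies the certified/non-certified contrast that the chi-square test only detects.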

  20. Reclaimed wastewater use alternatives and quality standards

    OpenAIRE

    Dalahmeh, Sahar; Baresel, Christian

    2014-01-01

    Reclaimed wastewater use is crucial for increasing water availability, improving water resources management, minimising environmental pollution and permitting sustainable nutrient recycling. However, wastewater also contains microbiological and chemical pollutants posing risks to human health and the environment, and these risks have to be handled. Successful use of reclaimed wastewater requires stringent standards for its treatment, disposal and distribution. This report summarises global an...

  1. A HISTORY OF ASHRAE STANDARDS 152P.

    Energy Technology Data Exchange (ETDEWEB)

    ANDREWS,J.W.

    2003-10-31

    The American Society of Heating, Refrigerating, and Air-conditioning Engineers (ASHRAE) has been developing a standard test method for evaluating the efficiency of ducts and other types of thermal distribution systems in single-family residential buildings. This report presents an overview of the structure, function, and historical development of this test method.

  2. 77 FR 43542 - Cost Accounting Standards: Cost Accounting Standards 412 and 413-Cost Accounting Standards...

    Science.gov (United States)

    2012-07-25

    ... Accounting Standards: Cost Accounting Standards 412 and 413--Cost Accounting Standards Pension Harmonization Rule AGENCY: Cost Accounting Standards Board, Office of Federal Procurement Policy, Office of... Policy (OFPP), Cost Accounting Standards Board (Board), is publishing technical corrections to the final...

  3. State Skill Standards: Photography

    Science.gov (United States)

    Howell, Frederick; Reed, Loretta; Jensen, Capra; Robison, Gary; Taylor, Susan; Pavesich, Christine

    2007-01-01

    The Department of Education has undertaken an ambitious effort to develop statewide skill standards for all content areas in career and technical education. The standards in this document are for photography programs and are designed to clearly state what the student should know and be able to do upon completion of an advanced high-school program.…

  4. Standards and Administration.

    Science.gov (United States)

    Gross, S. P.

    1978-01-01

    Presents a literature review of water quality standards and administration, covering publications of 1976-77. Consideration is given to municipal facilities, National Pollutant Discharge Elimination Systems, regional and international water quality management, and effluent standards. A list of 99 references is also presented. (HM)

  5. How many standards?

    DEFF Research Database (Denmark)

    Maegaard, Marie

    2009-01-01

    Discussions of standardisation and standard languages have a long history in linguistics. Tore Kristiansen has contributed to these discussions in various ways, and in this chapter I will focus on his claim that young Danes operate with two standards, one for the media and one for the school...

  6. Standard Weights and Measures

    Indian Academy of Sciences (India)

    The mass standard, represented by the prototype kilogram, is the only remaining artifact, but there are promising proposals to replace that in the near future. Ever since humans started living in community settlements, day to day activities have required the adoption of a set of standards for weights and measures. For ex-.

  7. Teachers Voices Interpreting Standards

    Directory of Open Access Journals (Sweden)

    Leo C. Rigsby

    2003-11-01

    Full Text Available The State of Virginia has adopted state-mandated testing that aims to raise the standards of performance for children in our schools in a manner that assigns accountability to schools and to teachers. In this paper we argue that the conditions under which the standards were created and the testing implemented undermine the professionalism of teachers. We believe this result has the further consequence of compromising the critical thinking and learning processes of children. We argue this has happened because teachers’ views and experiences have driven neither the setting of standards nor the assessment of their achievement. We use data from essays by teachers in an innovative masters program to compare teachers’ experiences involving the Virginia Standards of Learning with ideal standards for professional development adopted by the National Board for Professional Teaching Standards. We argue that there are serious negative consequences of the failure to include dialogue with K-12 teachers in setting standards and especially in the creation of assessments to measure performances relative to the standards. We believe the most successful, honest, and morally defensible processes must be built on the experience and wisdom of classroom teachers.

  8. Revisiting Professional Teacher Standards

    Science.gov (United States)

    Watson, Amanda

    2016-01-01

    The Australian Society for Music Education's (ASME) involvement in the development of professional standards for music educators was a significant and active research time in the history of the Society. As ASME celebrates its golden jubilee, it is appropriate to revisit that history and consider the future prospects of subject-specific standards.…

  9. Weston Standard battery

    CERN Multimedia

    This is a Weston AOIP standard battery with its calibration certificate (1956). Inside, the glassware forms an "H". Its name comes from the British physicist Edward Weston. A standard is the materialization of a given quantity whose value is known with great accuracy.

  10. Different approaches of high speed data transmission standards

    OpenAIRE

    M. Ehlert

    2004-01-01

    A number of standards address the problem of high-speed data transmission on serial or serial-parallel data lines. Serial-parallel data transmission means that the transmitted information is distributed over parallel data lines. Even though several standards exist, only a few basic techniques are used in most of them. This paper gives an overview of these different basic techniques used in the physical layer of today's data transmission standards, for exam...

  11. Handbook of distribution

    International Nuclear Information System (INIS)

    Mo, In Gyu

    1992-01-01

    This book covers business strategy and distribution innovation, the purpose of intelligent distribution, intelligent supply distribution, intelligent production distribution, intelligent sales distribution, software for intelligent distribution, and the future of distribution. It also introduces component technologies supporting intelligent distribution, such as bar codes, OCR, packaging, and intelligent automated warehouses, as well as system technologies and cases from America, Japan, and other countries.

  12. A case of standardization?

    DEFF Research Database (Denmark)

    Rod, Morten Hulvej; Høybye, Mette Terp

    2016-01-01

    Guidelines are increasingly used in an effort to standardize and systematize health practices at the local level and to promote evidence-based practice. The implementation of guidelines frequently faces problems, however, and standardization processes may in general have other outcomes than the ones envisioned by the makers of standards. In 2012, the Danish National Health Authorities introduced a set of health promotion guidelines that were meant to guide the decision making and priority setting of Denmark's 98 local governments. The guidelines provided recommendations for health promotion … and standardization. It remains an open question whether or not the guidelines lead to more standardized policies and interventions, but we suggest that the guidelines promote a risk factor-oriented approach as the dominant frame for knowledge, reasoning, decision making and priority setting in health promotion. We…

  13. Evaluating Living Standard Indicators

    Directory of Open Access Journals (Sweden)

    Birčiaková Naďa

    2015-09-01

    Full Text Available This paper deals with the evaluation of selected available indicators of living standards, divided into three groups, namely economic, environmental, and social. We have selected six countries of the European Union for analysis: Bulgaria, the Czech Republic, Hungary, Luxembourg, France, and Great Britain. The aim of this paper is to evaluate indicators measuring living standards and suggest the most important factors which should be included in the final measurement. We have tried to determine what factors influence each indicator and what factors affect living standards. We have chosen regression analysis as our main method. From the study of factors, we can deduce their impact on living standards, and thus the value of indicators of living standards. Factors with a high degree of reliability include population size and density, health care, and spending on education. Carbon dioxide emissions into the atmosphere carry a somewhat lower degree of reliability.

  14. On the excited state wave functions of Dirac fermions in the random ...

    Indian Academy of Sciences (India)

    The self-duality of the theory under the transformation g → 1/g is discussed. We also calculate the distribution functions of ρ0 = |ψ0(x)|², i.e. P(ρ0), where ψ0(x) is the ground state wave function, which behaves as the log-normal distribution function. It is also shown that for small ρ0, P(ρ0) behaves as a chi-square distribution.

  15. 1994 DOE Technical Standards Program Workshop: Proceedings

    International Nuclear Information System (INIS)

    Spellman, D.J.

    1994-01-01

    The DOE Technical Standards Program has been structured to provide guidance and assistance for the development, adoption, and use of voluntary standards within the Department. OMB Circular A-119, "Federal Participation in the Development and Use of Voluntary Standards", establishes the policy to be followed in working with voluntary standards bodies, and in adopting and using voluntary standards whenever feasible. The DOE Technical Standards Program is consistent with this policy and is dedicated to the task of promoting its implementation. The theme of this year's workshop is "Standards Initiatives in Environmental Management: fostering the development and use of industry standards for safe, environmentally responsible operations." The objective of the workshop is to increase the participants' awareness of the standardization activities taking place nationally and internationally and the impact of these activities on their efforts, and to facilitate the exchange of experiences, processes, and tools for implementing the program. Workshop sessions will include presentations by industry and Government notables in the environment, safety, and health arena with ample opportunity for everyone to ask questions and share experiences. There will be a breakout session which will concentrate on resolution of issues arising from the implementation of the DOE Technical Standards Program and a plenary session to discuss the plans developed by the breakout groups. Many organizations provide services and products which support the development, processing, distribution, and retrieval of standards. Those organizations listed at the end of the agenda will have exhibits available for your perusal throughout the workshop. Last year's workshop was very successful in stimulating an understanding of and interest in the standards program.
This year, we hope to build on that success and provide an environment for the synergism of ideas to enhance the program and advance its implementation

  16. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
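    The standard-error idea this abstract refers to can be illustrated with a short simulation: the standard deviation of many sample means approaches sigma/sqrt(n). The population parameters, sample size, and replicate count below are arbitrary choices for demonstration:

    ```python
    import math
    import random

    def sampling_distribution_of_mean(n, reps, mu=0.0, sigma=1.0, seed=42):
        """Draw `reps` samples of size n from N(mu, sigma^2) and return the
        mean and standard deviation of the resulting sample means."""
        rng = random.Random(seed)
        means = []
        for _ in range(reps):
            sample = [rng.gauss(mu, sigma) for _ in range(n)]
            means.append(sum(sample) / n)
        grand_mean = sum(means) / reps
        sd = math.sqrt(sum((m - grand_mean) ** 2 for m in means) / (reps - 1))
        return grand_mean, sd

    grand_mean, se_empirical = sampling_distribution_of_mean(n=25, reps=4000)
    se_theoretical = 1.0 / math.sqrt(25)  # sigma / sqrt(n) = 0.2
    print(grand_mean, se_empirical, se_theoretical)
    ```

    Rerunning with a larger n shows the empirical spread shrinking in proportion to 1/sqrt(n), which is the behavior the central limit theorem predicts for the sampling distribution of the mean.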

  17. Determining Normal-Distribution Tolerance Bounds Graphically

    Science.gov (United States)

    Mezzacappa, M. A.

    1983-01-01

    The graphical method requires some calculation and table lookup. The distribution is established from only three points: the upper and lower confidence bounds of the mean and the lower confidence bound of the standard deviation. The method requires only a few calculations with simple equations. The graphical procedure establishes a best-fit line for the measured data and bounds for a selected confidence level and any distribution percentile.

  18. Standard dilution analysis.

    Science.gov (United States)

    Jones, Willis B; Donati, George L; Calloway, Clifton P; Jones, Bradley T

    2015-02-17

    Standard dilution analysis (SDA) is a novel calibration method that may be applied to most instrumental techniques that will accept liquid samples and are capable of monitoring two wavelengths simultaneously. It combines the traditional methods of standard additions and internal standards. Therefore, it simultaneously corrects for matrix effects and for fluctuations due to changes in sample size, orientation, or instrumental parameters. SDA requires only 200 s per sample with inductively coupled plasma optical emission spectrometry (ICP OES). Neither the preparation of a series of standard solutions nor the construction of a universal calibration graph is required. The analysis is performed by combining two solutions in a single container: the first containing 50% sample and 50% standard mixture; the second containing 50% sample and 50% solvent. Data are collected in real time as the first solution is diluted by the second one. The results are used to prepare a plot of the analyte-to-internal standard signal ratio on the y-axis versus the inverse of the internal standard concentration on the x-axis. The analyte concentration in the sample is determined from the ratio of the slope and intercept of that plot. The method has been applied to the determination of FD&C dye Blue No. 1 in mouthwash by molecular absorption spectrometry and to the determination of eight metals in mouthwash, wine, cola, nitric acid, and water by ICP OES. Both the accuracy and precision for SDA are better than those observed for the external calibration, standard additions, and internal standard methods using ICP OES.
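    The slope/intercept calculation described above can be sketched numerically. Everything below (sensitivities, concentrations, the assumption that the standard mixture holds analyte and internal standard at equal concentration, in which case the sample concentration reduces exactly to slope/intercept) is invented for illustration, not taken from the paper:

    ```python
    def fit_line(xs, ys):
        """Ordinary least-squares fit; returns (slope, intercept)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                sum((x - mx) ** 2 for x in xs)
        return slope, my - slope * mx

    # Hypothetical experiment: sample contains analyte at 5 mg/L; the
    # standard mixture holds analyte and internal standard both at 10 mg/L.
    # f is the fraction of standard mixture remaining as solution 1
    # (sample + standard mix) is diluted by solution 2 (sample + solvent).
    m_a, m_is = 2.0, 1.0                     # invented instrument sensitivities
    c_sample, c_std, c_is_std = 5.0, 10.0, 10.0
    inv_cis, ratios = [], []
    for f in [0.5, 0.4, 0.3, 0.2, 0.1]:
        c_is = c_is_std * f                  # internal standard concentration
        c_a = c_sample + c_std * f           # total analyte concentration
        inv_cis.append(1.0 / c_is)           # x-axis: 1 / C_IS
        ratios.append((m_a * c_a) / (m_is * c_is))  # y-axis: signal ratio

    slope, intercept = fit_line(inv_cis, ratios)
    # With c_std == c_is_std, the analyte concentration is slope / intercept.
    recovered = slope / intercept
    print(recovered)  # ≈ 5.0
    ```

    When the analyte and internal-standard concentrations in the standard mixture differ, the recovered value would be scaled by their ratio; the plot itself is the same.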

  19. Operator licensing examiner standards

    International Nuclear Information System (INIS)

    1993-01-01

    The Operator Licensing Examiner Standards provide policy and guidance to NRC examiners and establish the procedures and practices for examining licensees and applicants for reactor operator and senior reactor operator licenses at power reactor facilities pursuant to Part 55 of Title 10 of the Code of Federal Regulations (10 CFR 55). The Examiner Standards are intended to assist NRC examiners and facility licensees to better understand the initial and requalification examination processes and to ensure the equitable and consistent administration of examinations to all applicants. These standards are not a substitute for the operator licensing regulations and are subject to revision or other internal operator licensing policy changes

  20. Right Ventricular Involvement in either Anterior or Inferior Myocardial Infarction

    Directory of Open Access Journals (Sweden)

    Firoozeh Abtahi

    2016-06-01

    Full Text Available Background: Unlike left ventricular function, less attention has been paid to Right Ventricular (RV) function after Myocardial Infarction (MI). Objectives: The current study aimed to compare RV function in patients with inferior and anterior MI. Patients and Methods: During the study period, 60 consecutive patients presenting to the Emergency Department with chest pain were divided into two groups based on their electrocardiographic findings. Accordingly, 25 patients had inferior MI (IMI group) and 35 had anterior MI (AMI group). Echocardiography was performed 48 hours after starting the standard therapy. Conventional echocardiographic parameters and Tissue Doppler Imaging (TDI) measurements were acquired from the standard views. Student's t-test and the chi-square test were used to compare the normally distributed continuous variables and the categorical variables, respectively, between the two groups. A P value < 0.05 was considered statistically significant.
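    The chi-square comparison of categorical variables described above can be computed directly from a 2x2 contingency table. A minimal sketch in Python follows; the counts are invented for illustration and are not taken from the study:

    ```python
    def chi_square_2x2(table):
        """Pearson chi-square statistic (no continuity correction)
        for a 2x2 contingency table given as [[a, b], [c, d]]."""
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        n = sum(row_totals)
        stat = 0.0
        for i, row in enumerate(table):
            for j, observed in enumerate(row):
                expected = row_totals[i] * col_totals[j] / n
                stat += (observed - expected) ** 2 / expected
        return stat

    # Hypothetical counts: rows = (IMI, AMI), columns = (RV involvement, none)
    stat = chi_square_2x2([[20, 5], [10, 25]])
    print(round(stat, 2))  # 15.43, which exceeds 3.84, the 0.05 critical value at 1 df
    ```

    In practice a library routine (with Yates' correction for small counts) would normally be preferred; the hand computation just makes the expected-count arithmetic explicit.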

  1. The Distributed Wind Cost Taxonomy

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, Trudy; Jimenez, Tony; Preus, Robert; Tegen, Suzanne; Baring-Gould, Ian

    2017-03-28

    To date, there has been no standard method or tool to analyze the installed and operational costs for distributed wind turbine systems. This report describes the development of a classification system, or taxonomy, for distributed wind turbine project costs. The taxonomy establishes a framework to help collect, sort, and compare distributed wind cost data that mirrors how the industry categorizes information. The taxonomy organizes costs so they can be aggregated from installers, developers, vendors, and other sources without losing cost details. Developing a peer-reviewed taxonomy is valuable to industry stakeholders because a common understanding of the details of distributed wind turbine costs and balance-of-station costs is a first step to identifying potential high-value cost reduction opportunities. Addressing cost reduction potential can help increase distributed wind's competitiveness and propel the U.S. distributed wind industry forward. The taxonomy can also be used to perform cost comparisons between technologies and track trends for distributed wind industry costs in the future. As an initial application and piloting of the taxonomy, preliminary cost data were collected for projects of different sizes and from different regions across the contiguous United States. Following the methods described in this report, these data are placed into the established cost categories.

  2. DOE Standard: Fire protection design criteria

    Energy Technology Data Exchange (ETDEWEB)

    1999-07-01

    The development of this Standard reflects the fact that national consensus standards and other design criteria do not comprehensively or, in some cases, adequately address fire protection issues at DOE facilities. This Standard provides supplemental fire protection guidance applicable to the design and construction of DOE facilities and site features (such as water distribution systems) that are also provided for fire protection. It is intended to be used in conjunction with the applicable building code, National Fire Protection Association (NFPA) Codes and Standards, and any other applicable DOE construction criteria. This Standard replaces certain mandatory fire protection requirements that were formerly in DOE 5480.7A, "Fire Protection", and DOE 6430.1A, "General Design Criteria". It also contains the fire protection guidelines from two (now canceled) draft standards: "Glove Box Fire Protection" and "Filter Plenum Fire Protection". (Note: This Standard does not supersede the requirements of DOE 5480.7A and DOE 6430.1A where these DOE Orders are currently applicable under existing contracts.) This Standard, along with the criteria delineated in Section 3, constitutes the basic criteria for satisfying DOE fire and life safety objectives for the design and construction or renovation of DOE facilities.

  3. DOE Standard: Fire protection design criteria

    International Nuclear Information System (INIS)

    1999-07-01

    The development of this Standard reflects the fact that national consensus standards and other design criteria do not comprehensively or, in some cases, adequately address fire protection issues at DOE facilities. This Standard provides supplemental fire protection guidance applicable to the design and construction of DOE facilities and site features (such as water distribution systems) that are also provided for fire protection. It is intended to be used in conjunction with the applicable building code, National Fire Protection Association (NFPA) Codes and Standards, and any other applicable DOE construction criteria. This Standard replaces certain mandatory fire protection requirements that were formerly in DOE 5480.7A, ''Fire Protection'', and DOE 6430.1A, ''General Design Criteria''. It also contains the fire protection guidelines from two (now canceled) draft standards: ''Glove Box Fire Protection'' and ''Filter Plenum Fire Protection''. (Note: This Standard does not supersede the requirements of DOE 5480.7A and DOE 6430.1A where these DOE Orders are currently applicable under existing contracts.) This Standard, along with the criteria delineated in Section 3, constitutes the basic criteria for satisfying DOE fire and life safety objectives for the design and construction or renovation of DOE facilities

  4. SOFG: Standards requirements

    International Nuclear Information System (INIS)

    Gerganov, T.; Grigorov, S.; Kozhukharov, V.; Brashkova, N.

    2005-01-01

    It is well-known that Solid Oxide Fuel Cells will have industrial application in the nearest future. In this context, the problem of SOFC materials and SOFC systems standardization is of high level of priority. In the present study the attention is focused on the methods for physical and chemical characterization of the materials for SOFC components fabrication and about requirements on single SOFC cells tests. The status of the CEN, ISO, ASTM (ANSI, ASSN) and JIS class of standards has been verified. Standards regarding the test methods for physical-chemical characterization of vitreous materials (as sealing SOFC component), ceramic materials (as electrodes and electrolyte components, including alternative materials used) and metallic materials (interconnect components) are subject of overview. It is established that electrical, mechanical, surface and interfacial phenomena, chemical durability and thermal corrosion behaviour are the key areas for standardization of the materials for SOFC components

  5. Scrutinising safety standards

    Energy Technology Data Exchange (ETDEWEB)

    Pietersen, C.M. [Safety Service Center BV (Netherlands)

    2001-01-01

    Until the introduction of the International Electrotechnical Commission's standard IEC 61508, there was no provision for relating risks to people and the environment, to the risks of financial loss. Although IEC 61508 fills most of the gaps in the process industry, there are still some loopholes. Four points covered by the standard are listed. It is expected that the standard will lead to an optimum cost-benefit situation with 'fit-for-purpose' safety and minimum nuisance shutdowns, or process plant disturbances. It should give clear guidance on 'how safe is safe enough.' IEC 61508 can be implemented through various routes. Insurance companies and regulators are starting to require implementation of the new standard. Five points which need to be ascertained for implementation are listed and diagrams illustrate the IEC 61508 safety lifecycle and the risk-reduction requirements.

  6. STELLA Standards Framework.

    Science.gov (United States)

    English in Australia, 2001

    2001-01-01

    Presents a provisional framework for the STELLA (Standards for Teachers of English Language and Literacy in Australia), which identifies broad dimensions of teaching together with groupings of related attributes derived from teachers' narratives. (RS)

  7. Ozone Standard Reference Photometer

    Data.gov (United States)

    Federal Laboratory Consortium — The Standard Reference Photometer (SRP) Program began in the early 1980s as collaboration between NIST and the U.S. Environmental Protection Agency (EPA) to design,...

  8. National Pesticide Standard Repository

    Science.gov (United States)

    EPA's National Pesticide Standards Repository collects and maintains an inventory of analytical “standards” of registered pesticides in the United States, as well as some that are not currently registered for food and product testing and monitoring.

  9. Beyond the standard model

    International Nuclear Information System (INIS)

    Gaillard, M.K.

    1990-04-01

    The unresolved issues of the standard model are reviewed, with emphasis on the gauge hierarchy problem. A possible mechanism for generating a hierarchy in the context of superstring theory is described. 24 refs

  10. Standardization in synthetic biology.

    Science.gov (United States)

    Müller, Kristian M; Arndt, Katja M

    2012-01-01

    Synthetic Biology is founded on the idea that complex biological systems are built most effectively when the task is divided in abstracted layers and all required components are readily available and well-described. This requires interdisciplinary collaboration at several levels and a common understanding of the functioning of each component. Standardization of the physical composition and the description of each part is required as well as a controlled vocabulary to aid design and ensure interoperability. Here, we describe standardization initiatives from several disciplines, which can contribute to Synthetic Biology. We provide examples of the concerted standardization efforts of the BioBricks Foundation comprising the request for comments (RFC) and the Registry of Standardized Biological parts as well as the international Genetically Engineered Machine (iGEM) competition.

  11. AKRO: Standard Prices

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Standard prices are generated for cost recovery programs in the Individual Fishing Quota (IFQ) halibut and sablefish, BSAI Rationalized crab, and Central Gulf of...

  12. Fiber optics standard dictionary

    CERN Document Server

    Weik, Martin H

    1997-01-01

    Fiber Optics Vocabulary Development In 1979, the National Communications System published Technical Information Bulletin TB 79-1, Vocabulary for Fiber Optics and Lightwave Communications, written by this author. Based on a draft prepared by this author, the National Communications System published Federal Standard FED-STD-1037, Glossary of Telecommunications Terms, in 1980 with no fiber optics terms. In 1981, the first edition of this dictionary was published under the title Fiber Optics and Lightwave Communications Standard Dictionary. In 1982, the then National Bureau of Standards, now the National Institute of Standards and Technology, published NBS Handbook 140, Optical Waveguide Communications Glossary, which was also published by the General Services Administration as PB82-166257 under the same title. Also in 1982, Dynamic Systems, Inc., published the Fiberoptic Sensor Technology Handbook, co-authored and edited by this author, with an extensive Fiberoptic Sensors Glossary. In 1989, the handbook w...

  13. FDA Recognized Consensus Standards

    Data.gov (United States)

    U.S. Department of Health & Human Services — This database consists of those national and international standards recognized by FDA which manufacturers can declare conformity to and is part of the information...

  14. RESIDUAL RISK ASSESSMENT: GAS DISTRIBUTION STAGE ...

    Science.gov (United States)

    This document describes the residual risk assessment for the Gas Distribution Stage 1 source category. For stationary sources, section 112(f) of the Clean Air Act requires EPA to assess risks to human health and the environment following implementation of technology-based control standards. If these technology-based control standards do not provide an ample margin of safety, then EPA is required to promulgate additional standards. This document describes the methodology and results of the residual risk assessment performed for the Gas Distribution Stage 1 source category. The results of this analysis will assist EPA in determining whether a residual risk rule for this source category is appropriate.

  15. Standard Firebrand Generator

    Data.gov (United States)

    Federal Laboratory Consortium — This apparatus, developed at EL, has been constructed to generate a controlled and repeatable size and mass distribution of glowing firebrands. The purpose of NIST...

  16. The Gold Standard Programme

    DEFF Research Database (Denmark)

    Neumann, Tim; Rasmussen, Mette; Ghith, Nermin

    2013-01-01

    To evaluate the real-life effect of an evidence-based Gold Standard Programme (GSP) for smoking cessation interventions in disadvantaged patients and to identify modifiable factors that consistently produce the highest abstinence rates.

  17. Standard software for CAMAC

    International Nuclear Information System (INIS)

    Lenkszus, F.R.

    1978-01-01

    The NIM Committee (National Instrumentation Methods Committee) of the U.S. Department of Energy and the ESONE Committee of European Laboratories have jointly specified standard software for use with CAMAC. Three general approaches were followed: the definition of a language called IML for use in CAMAC systems, the definition of a standard set of subroutine calls, and real-time extensions to the BASIC language. This paper summarizes the results of these efforts. 1 table

  18. Beyond the standard model

    International Nuclear Information System (INIS)

    Pleitez, V.

    1994-01-01

    The search for laws of physics beyond the standard model is discussed in a general way, along with some topics in supersymmetric theories. Recent possibilities arising in the leptonic sector are also addressed. Finally, models with SU(3)_C x SU(2)_L x U(1)_Y symmetry are considered as alternatives for extensions of the standard model of elementary particles. 36 refs., 1 fig., 4 tabs

  19. Impact factor distribution revisited

    Science.gov (United States)

    Huang, Ding-wei

    2017-09-01

    We explore the consistency of a new type of frequency distribution whose corresponding rank distribution is the Lavalette distribution. Empirical data on journal impact factors are well described by it. This distribution is distinct from the Poisson and negative binomial distributions suggested by a previous study. By a log transformation, we obtain a bell-shaped distribution, which is then compared to Gaussian and catenary curves. Possible mechanisms behind the shape of the impact factor distribution are suggested.
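    The Lavalette rank distribution referred to above is commonly written as J(r) = K [N r / (N - r + 1)]^(-b), a two-parameter fit over ranks r = 1..N. A minimal sketch, assuming that functional form and with arbitrary K and b chosen purely for illustration:

    ```python
    def lavalette(r, n_ranks, k=1.0, b=0.4):
        """Lavalette rank-frequency value for rank r out of n_ranks
        (assumed form: K * (n*r / (n - r + 1)) ** -b)."""
        return k * (n_ranks * r / (n_ranks - r + 1)) ** (-b)

    # Evaluate over all 100 ranks; the curve decreases monotonically,
    # falling off sharply at the tail, unlike a pure power law.
    values = [lavalette(r, 100) for r in range(1, 101)]
    print(values[0], values[-1])
    ```

    Fitting K and b to real impact-factor data would be done on the log-transformed values, which is where the bell-shaped distribution discussed in the abstract arises.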

  20. Technical standards in nuclear area

    International Nuclear Information System (INIS)

    Grimberg, M.

    1978-01-01

    Technical standardization in the nuclear area is discussed, as is the competence of CNEN in standardization matters. The process of drawing up technical standards is explained, and some kinds of technical standards are discussed. (author) [pt

  1. ISO radiation sterilization standards

    International Nuclear Information System (INIS)

    Lambert, Byron J.; Hansen, Joyce M.

    1998-01-01

    This presentation provides an overview of the current status of the ISO radiation sterilization standards. The ISO standards are voluntary standards which detail both the validation and routine control of the sterilization process. ISO 11137 was approved in 1994 and published in 1995. When reviewing the standard you will note that less than 20% of the standard is devoted to requirements and the remainder is guidance on how to comply with the requirements. Future standards developments in radiation sterilization are focused on providing additional guidance. The guidance currently provided in informative annexes of ISO 11137 includes device/packaging materials, dose setting methods, and dosimeters and dose measurement. Currently, four Technical Reports are being developed to provide additional guidance: 1. AAMI Draft TIR, 'Radiation Sterilization Material Qualification' 2. ISO TR 13409-1996, 'Sterilization of health care products - Radiation sterilization - Substantiation of 25 kGy as a sterilization dose for small or infrequent production batches' 3. ISO Draft TR, 'Sterilization of health care products - Radiation sterilization - Selection of a sterilization dose for a single production batch' 4. ISO Draft TR, 'Sterilization of health care products - Radiation sterilization - Product Families, Plans for Sampling and Frequency of Dose Audits'

  2. DOE technical standards list. Department of Energy standards index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    This document was prepared for use by personnel involved in the selection and use of DOE technical standards and other Government and non-Government standards. This TSL provides listings of current DOE technical standards, non-Government standards that have been adopted by DOE, other Government documents in which DOE has a recorded interest, and canceled DOE technical standards. Information on new DOE technical standards projects, technical standards released for coordination, recently published DOE technical standards, and activities of non-Government standards bodies that may be of interest to DOE is published monthly in Standards Actions.

  3. DOE technical standards list: Department of Energy standards index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-05-01

    This Department of Energy (DOE) technical standards list (TSL) has been prepared by the Office of Nuclear Safety Policy and Standards (EH-31) on the basis of currently available technical information. Periodic updates of this TSL will be issued as additional information is received on standardization documents being issued, adopted, or canceled by DOE. This document was prepared for use by personnel involved in the selection and use of DOE technical standards and other Government and non-Government standards. This TSL provides listings of current DOE technical standards, non-Government standards that have been adopted by DOE, other standards-related documents in which DOE has a recorded interest, and canceled DOE technical standards. Information on new DOE technical standards projects, technical standards released for coordination, recently published DOE technical standards, and activities of non-Government standards bodies that may be of interest to DOE is published monthly in Standards Actions.

  4. Product Distributions for Distributed Optimization. Chapter 1

    Science.gov (United States)

    Bieniawski, Stefan R.; Wolpert, David H.

    2004-01-01

    With connections to bounded rational game theory, information theory and statistical mechanics, Product Distribution (PD) theory provides a new framework for performing distributed optimization. Furthermore, PD theory extends and formalizes Collective Intelligence, thus connecting distributed optimization to distributed Reinforcement Learning (RL). This paper provides an overview of PD theory and details an algorithm for performing optimization derived from it. The approach is demonstrated on two unconstrained optimization problems, one with discrete variables and one with continuous variables. To highlight the connections between PD theory and distributed RL, the results are compared with those obtained using distributed reinforcement-learning-inspired optimization approaches. The inter-relationship of the techniques is discussed.

  5. SAMICS marketing and distribution model

    Science.gov (United States)

    1978-01-01

    SAMICS (Solar Array Manufacturing Industry Costing Standards) was formulated as a computer simulation model. Given a proper description of the manufacturing technology as input, this model computes the manufacturing price of solar arrays for a broad range of production levels. This report presents a model for computing the associated marketing and distribution costs, the end point of the model being the loading dock of the final manufacturer.

  6. IAEA Safety Standards

    International Nuclear Information System (INIS)

    2016-09-01

    The IAEA Safety Standards Series comprises publications of a regulatory nature covering nuclear safety, radiation protection, radioactive waste management, the transport of radioactive material, the safety of nuclear fuel cycle facilities and management systems. These publications are issued under the terms of Article III of the IAEA’s Statute, which authorizes the IAEA to establish “standards of safety for protection of health and minimization of danger to life and property”. Safety standards are categorized into: • Safety Fundamentals, stating the basic objective, concepts and principles of safety; • Safety Requirements, establishing the requirements that must be fulfilled to ensure safety; and • Safety Guides, recommending measures for complying with these requirements for safety. For numbering purposes, the IAEA Safety Standards Series is subdivided into General Safety Requirements and General Safety Guides (GSR and GSG), which are applicable to all types of facilities and activities, and Specific Safety Requirements and Specific Safety Guides (SSR and SSG), which are for application in particular thematic areas. This booklet lists all current IAEA Safety Standards, including those forthcoming.

  7. GISB: Efficiency through standardization

    International Nuclear Information System (INIS)

    White, B.

    1995-01-01

    For those who participated in the numerous day-long development sessions held in the dim, stale basement auditorium of the Department of Energy, the idea that the Gas Industry Standards Board (GISB) would be producing standards anytime soon seemed a distant dream. However, the hazy vision of just over a year ago has now become a reality. As summer turns to fall and young gas schedulers throughout this country dream of the gridiron, GISB will have already issued a model electronic-trading partner agreement and 12 standards for capacity-release transactions, as well as three standards for nomination-related transactions. Under the steady hand of Executive Director Rae McQuade and a board of directors that looks like a Who's Who of the gas industry, GISB has developed into an organization that will directly influence how gas is purchased, transported, accounted for, and paid for in the 21st century. The paper describes the background of the organization, standards that have been released, and issues still to be addressed

  8. Sports eyewear protective standards.

    Science.gov (United States)

    Dain, Stephen J

    2016-01-01

    Eye injuries sustained during sport comprise up to 20 per cent of all injuries to the eye serious enough for medical attention to be sought. The prevalence of eye injuries in sport is not easily assessed due to lack of authoritative participation rates, so most studies report total numbers in a time period. The evidence on the proportion of all ocular injuries that are from sport is reviewed. The relative frequencies in different sports are compared in a qualitative manner and the sports with greater numbers of ocular injuries are detailed. In common with occupational injuries to the eye, most sports eye injuries are considered preventable. The hierarchy of action for occupational risk is detailed and adapted to use in a sports scenario. All the available international, regional and national standards on sports eye protection are detailed and their provisions compared. The major function of the standards is to provide adequate protection against the hazard of the sport concerned. These are detailed and compared as a function of energy transfer. Eye protection must not introduce additional or secondary hazards (for instance, fracturing into sharp fragments on impact) and not introduce features that would deter the wearing of eye protection (for instance, restricting field of view to impede playing the sport). The provisions of the standards intended to limit secondary hazards are detailed and compared. The need for future work in standards writing and the activities of the International Standardization Organization in sports eye protection are detailed. © 2016 Optometry Australia.

  9. COMPUTER INTEGRATED MANUFACTURING: OVERVIEW OF MODERN STANDARDS

    Directory of Open Access Journals (Sweden)

    A. Рupena

    2016-09-01

    Full Text Available The article deals with the modern international standards ISA-95 and ISA-88 on the development of computer-integrated manufacturing. The scope of the standards is shown in the context of a hierarchical model of the enterprise. The article is structured so as to describe the essence of the standards in terms of the basic descriptive models: product definition, resources, schedules, and actual performance of industrial activity. The product definition is described through a hierarchical presentation of products at various levels of management. Much attention is given to describing equipment, the resource type that forms a logical chain through all these standards; for example, the standard for batch process control shows the relationship between the product definition and the equipment on which the product is made. The article shows the planning hierarchy ERP-MES/MOM-SCADA (in terms of the ISA-95 standard), which traces the decomposition of enterprise-wide production plans into specific jobs at the process-control level. The reporting of actual production performance at the MES/MOM level, including KPIs, is considered. A generalized picture of operational activity at the MES/MOM level is shown via diagrams of the relationships of activities and the information flows between functions. The article concludes with a substantiation of the need for distribution, approval, and development of the ISA-88 and ISA-95 standards in Ukraine. The article is an overview and can be useful to specialists in computer-integrated control and management systems for industrial enterprises, system integrators, and suppliers.
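The ERP-MES/MOM-SCADA hierarchy the record refers to corresponds to the functional levels of ISA-95; a minimal sketch of that mapping (level descriptions paraphrased from the standard's functional model, not taken from the record):

```python
# ISA-95 functional hierarchy levels (descriptions paraphrased, illustrative only).
ISA95_LEVELS = {
    4: "ERP - business planning and logistics",
    3: "MES/MOM - manufacturing operations management",
    2: "SCADA/DCS - supervisory monitoring and control",
    1: "PLCs and sensing - basic control",
    0: "Physical production process",
}

# Planning decomposes top-down: enterprise plans (level 4) become
# specific jobs at the control levels (2 and below).
for level in sorted(ISA95_LEVELS, reverse=True):
    print(f"Level {level}: {ISA95_LEVELS[level]}")
```

The point of the mapping is that each ISA-95 information model (product definition, resources, schedules, performance) is exchanged across the level 3/level 4 boundary.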

  10. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  11. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  12. Standards and quality

    CERN Document Server

    El-Tawil, Anwar

    2015-01-01

    The book brings together a number of subjects of prime importance for any practicing engineer and for students of engineering. It explains the concepts and functions of voluntary standards, mandatory technical regulations, conformity assessment (testing and measurement of products), certification, quality and quality management systems, as well as other management systems such as environmental, social responsibility, and food safety management systems. The book also gives a comprehensive description of the role of metrology systems that underpin conformity assessment. A description is given of typical national systems of standards, quality and metrology, and how they relate directly or through regional structures to international systems. The book also covers the relation between standards and trade and explains the context and stipulations of the Technical Barriers to Trade Agreement of the World Trade Organization (WTO).

  13. Operator licensing examiner standards

    International Nuclear Information System (INIS)

    1983-10-01

    The Operator Licensing Examiner Standards provide policy and guidance to NRC examiners and establish the procedures and practices for examining and licensing of applicants for NRC operator licenses pursuant to Part 55 of Title 10 of the Code of Federal Regulations (10 CFR 55). They are intended to assist NRC examiners and facility licensees to understand the examination process better and to provide for equitable and consistent administration of examinations to all applicants by NRC examiners. These standards are not a substitute for the operator licensing regulations and are subject to revision or other internal operator examination licensing policy changes. As appropriate, these standards will be revised periodically to accommodate comments and reflect new information or experience

  14. Standardization of depression measurement

    DEFF Research Database (Denmark)

    Wahl, Inka; Löwe, Bernd; Bjørner, Jakob

    2014-01-01

    OBJECTIVES: To provide a standardized metric for the assessment of depression severity to enable comparability among results of established depression measures. STUDY DESIGN AND SETTING: A common metric for 11 depression questionnaires was developed applying item response theory (IRT) methods. Data of 33,844 adults were used for secondary analysis, including routine assessments of 23,817 in- and outpatients with mental and/or medical conditions (46% with depressive disorders) and a general population sample of 10,027 randomly selected participants from three representative German household surveys. RESULTS: A standardized metric for depression severity was defined by 143 items, and scores were normed to a general population mean of 50 (standard deviation = 10) for easy interpretability. It covers the entire range of depression severity assessed by established instruments. The metric allows ...
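Norming scores to a general population mean of 50 with standard deviation 10 is the conventional T-score transformation; a minimal sketch, assuming the latent trait estimates theta are already standardized (mean 0, SD 1) in the reference population (the helper name is ours, not from the study):

```python
def t_score(theta):
    """Map a standardized IRT trait estimate (mean 0, SD 1) to the T-score metric (mean 50, SD 10)."""
    return 50.0 + 10.0 * theta

assert t_score(0.0) == 50.0   # population mean
assert t_score(1.0) == 60.0   # one SD above the mean
assert t_score(-1.5) == 35.0  # 1.5 SD below the mean
```

The transformation is purely linear, so it changes interpretability but not the ordering or relative spacing of respondents.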

  15. Conference: STANDARD MODEL @ LHC

    CERN Multimedia

    2012-01-01

    HCØ institute Universitetsparken 5 DK-2100 Copenhagen Ø Denmark Room: Auditorium 2 STANDARD MODEL @ LHC Niels Bohr International Academy and Discovery Center 10-13 April 2012 This four day meeting will bring together both experimental and theoretical aspects of Standard Model phenomenology at the LHC. The very latest results from the LHC experiments will be under discussion. Topics covered will be split into the following categories:     * QCD (Hard,Soft & PDFs)     * Vector Boson production     * Higgs searches     * Top Quark Physics     * Flavour physics

  16. Operator licensing examiner standards

    International Nuclear Information System (INIS)

    1987-05-01

    The Operator Licensing Examiner Standards provide policy and guidance to NRC examiners and establish the procedures and practices for examining and licensing of applicants for NRC operator licenses pursuant to Part 55 of Title 10 of the Code of Federal Regulations (10 CFR 55). They are intended to assist NRC examiners and facility licensees to understand the examination process better and to provide for equitable and consistent administration of examinations to all applicants by NRC examiners. These standards are not a substitute for the operator licensing regulations and are subject to revision or other internal operator examination licensing policy changes

  17. The Standard Model

    Science.gov (United States)

    Burgess, Cliff; Moore, Guy

    2012-04-01

    List of illustrations; List of tables; Preface; Acknowledgments; Part I. Theoretical Framework: 1. Field theory review; 2. The standard model: general features; 3. Cross sections and lifetimes; Part II. Applications: Leptons: 4. Elementary boson decays; 5. Leptonic weak interactions: decays; 6. Leptonic weak interactions: collisions; 7. Effective Lagrangians; Part III. Applications: Hadrons: 8. Hadrons and QCD; 9. Hadronic interactions; Part IV. Beyond the Standard Model: 10. Neutrino masses; 11. Open questions, proposed solutions; Appendix A. Experimental values for the parameters; Appendix B. Symmetries and group theory review; Appendix C. Lorentz group and the Dirac algebra; Appendix D. ξ-gauge Feynman rules; Appendix E. Metric convention conversion table; Select bibliography; Index.

  18. Standard for metric practice

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This standard gives guidance for application of the modernized metric system in the United States. The International System of Units, developed and maintained by the General Conference on Weights and Measures (abbreviated CGPM from the official French name Conférence Générale des Poids et Mesures), is intended as a basis for worldwide standardization of measurement units. The name International System of Units and the international abbreviation SI were adopted by the 11th CGPM in 1960. SI is a complete, coherent system that is being universally adopted

  19. Coordinate Standard Measurement Development

    Energy Technology Data Exchange (ETDEWEB)

    Hanshaw, R.A.

    2000-02-18

    A Shelton Precision Interferometer Base, which is used for calibration of coordinate standards, was improved through hardware replacement, software geometry error correction, and reduction of vibration effects. Substantial increases in resolution and reliability, as well as reduction in sampling time, were achieved through hardware replacement; vibration effects were reduced substantially through modification of the machine component dampening and software routines; and the majority of the machine's geometry error was corrected through software geometry error correction. Because of these modifications, the uncertainty of coordinate standards calibrated on this device has been reduced dramatically.

  20. Islam, Standards, and Technoscience

    DEFF Research Database (Denmark)

    Fischer, Johan

    Halal (literally, "permissible" or "lawful") production, trade, and standards have become essential to state-regulated Islam and to companies in contemporary Malaysia and Singapore, giving these two countries a special position in the rapidly expanding global market for halal products.

  1. Standard Weights and Measures

    Indian Academy of Sciences (India)

    Standard Weights and Measures. Vasant Natarajan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 8, August 2001, pp. 44-59. Permanent link: http://www.ias.ac.in/article/fulltext/reso/006/08/0044-0059

  2. International Financial Reporting Standards

    DEFF Research Database (Denmark)

    Pontoppidan, Caroline Aggestam

    2011-01-01

    The advance of International Financial Reporting Standards (IFRS) across the globe has accelerated over the last few years. This is placing increasing demands on educators to respond to these changes by an increased focus on IFRS in the curricula of accounting students. This paper reviews a range...

  3. Beyond the Standard Model

    CERN Document Server

    Csáki, Csaba

    2015-01-01

    We introduce aspects of physics beyond the Standard Model, focusing on supersymmetry, extra dimensions, and a composite Higgs as solutions to the hierarchy problem. Lectures given at the 2013 European School of High Energy Physics, Parádfürdő, Hungary, 5-18 June 2013.

  4. Standardized Curriculum for Cosmetology.

    Science.gov (United States)

    Mississippi State Dept. of Education, Jackson. Office of Vocational, Technical and Adult Education.

    Standardized curricula are provided for two courses for the secondary vocational education program in Mississippi: cosmetology I and II. The 18 units in cosmetology I are as follows: introduction to cosmetology; Vocational Industrial Clubs of America; the look you like; bacteriology; sterilization and sanitation; hair and disorders; draping,…

  5. Standardization of Sign Languages

    Science.gov (United States)

    Adam, Robert

    2015-01-01

    Over the years attempts have been made to standardize sign languages. This form of language planning has been tackled by a variety of agents, most notably teachers of Deaf students, social workers, government agencies, and occasionally groups of Deaf people themselves. Their efforts have most often involved the development of sign language books…

  6. Islam, Standards, and Technoscience

    DEFF Research Database (Denmark)

    Fischer, Johan

    Halal (literally, "permissible" or "lawful") production, trade, and standards have become essential to state-regulated Islam and to companies in contemporary Malaysia and Singapore, giving these two countries a special position in the rapidly expanding global market for halal products. This book provides an exploration of the role of halal production, trade, and standards. Fischer explains how the global markets for halal comprise divergent zones in which Islam, markets, regulatory institutions, and technoscience interact and diverge. Focusing on the "bigger institutional picture" that frames everyday halal consumption, Fischer provides a multisited ethnography of the overlapping technologies and techniques of production, trade, and standards that together warrant a product as "halal," and thereby help to format the market. Exploring global halal in networks, training, laboratories...

  7. Low Impact Development Standards

    Energy Technology Data Exchange (ETDEWEB)

    Loftin, Samuel R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-02

    The goal of the LID Standards is to provide guidance on the planning, design, construction and maintenance of green infrastructure (GI) features at Los Alamos National Laboratory. The success of LID at LANL is dependent on maintaining a consistent approach to achieve effective application, operation, and maintenance of these storm water control features.

  8. Elevating standards, improving safety.

    Science.gov (United States)

    Clarke, Richard

    2014-08-01

    In our latest 'technical guidance' article, Richard Clarke, sales and marketing director at one of the UK's leading lift and escalator specialists, Schindler, examines some of the key issues surrounding the specification, maintenance, and operation of lifts in hospitals to help ensure the highest standards of safety and reliability.

  9. Beyond the standard model

    International Nuclear Information System (INIS)

    Cuypers, F.

    1997-05-01

    These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs

  10. Standard hakkab tootma Mercedesele [Standard to begin production for Mercedes]

    Index Scriptorium Estoniae

    2005-01-01

    AS Standard signed a cooperation agreement with DaimlerChrysler AG, one of the world's largest car manufacturers, under which the company will manufacture and sell office and service-hall furniture to the European dealers of the group's Mercedes-Benz brand

  11. Mixed RIA standard

    International Nuclear Information System (INIS)

    Talan, P.; Mucha, J.; Krizan, J.

    1986-01-01

    For the radioimmunoassay of digoxin, 3,5,3'-triiodothyronine, 17β-estradiol, progesterone, testosterone and α1-fetoprotein, a mixed standard of these substances was prepared in a gamma globulin solution at a concentration of 0.8 to 1.4 wt.% in an aqueous buffer at pH in the range 6 to 9. The standard contains digoxin at a concentration of 10^-4 to 10 nmol/l, 17β-estradiol at 10^-4 to 2 nmol/l, progesterone at 10^-4 to 100 nmol/l, testosterone at 10^-4 to 21 nmol/l, and α1-fetoprotein at 10^-4 to 10 nmol/l, with at least two of these substances having concentrations higher than 10^-3 nmol/l. Examples are given of the preparation of the mixed standard with different concentrations of the components. The use of the standard has the following advantages: it is labor saving, reduces the risk of failure in the manufacture of RIA kits, eliminates mistakes in the selection of kits for the determination of different substances, and allows a more economical use of material. (E.S.)
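The composition constraints stated in the record (each analyte within its allowed range, and at least two analytes above 10^-3 nmol/l) can be sketched as a small validity check; the helper name and dictionary keys are hypothetical, while the ranges in nmol/l come from the abstract:

```python
# Allowed concentration ranges (nmol/l) as stated in the abstract.
RANGES = {
    "digoxin": (1e-4, 10.0),
    "estradiol": (1e-4, 2.0),
    "progesterone": (1e-4, 100.0),
    "testosterone": (1e-4, 21.0),
    "afp": (1e-4, 10.0),
}

def valid_mixture(conc):
    """Check a candidate mixed-standard composition against the stated constraints."""
    in_range = all(RANGES[k][0] <= v <= RANGES[k][1] for k, v in conc.items())
    enough_high = sum(v > 1e-3 for v in conc.values()) >= 2  # at least two > 10^-3 nmol/l
    return in_range and enough_high

assert valid_mixture({"digoxin": 5.0, "estradiol": 1.0, "progesterone": 1e-4,
                      "testosterone": 1e-4, "afp": 1e-4})
assert not valid_mixture({"digoxin": 1e-4, "estradiol": 1e-4, "progesterone": 1e-4,
                          "testosterone": 1e-4, "afp": 1e-4})  # no analyte above 10^-3
```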

  12. Beyond the Standard Model

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The necessity for new physics beyond the Standard Model will be motivated. Theoretical problems will be exposed and possible solutions will be described. The goal is to present the exciting new physics ideas that will be tested in the near future. Supersymmetry, grand unification, extra dimensions and string theory will be presented.

  13. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    The method involves non-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  14. 40 CFR 63.424 - Standards: Equipment leaks.

    Science.gov (United States)

    2010-07-01

    ... Standards for Gasoline Distribution Facilities (Bulk Gasoline Terminals and Pipeline Breakout Stations) § 63.424 Standards: Equipment leaks. (a) Each owner or operator of a bulk gasoline terminal or pipeline... shall be recorded in the log book. When a leak is detected, an initial attempt at repair shall be made...

  15. Job Grading Standard for Electric Power Controller WG-5407.

    Science.gov (United States)

    Civil Service Commission, Washington, DC. Bureau of Policies and Standards.

    The standard is used to grade nonsupervisory jobs involved in controlling the generation or distribution of electric power. The jobs are located at power generating plants, power distribution centers, and substations. The work requires ability to anticipate load changes due to work schedules, weather, and other variables, in order to engage or cut…

  16. DOE standard: Radiological control

    International Nuclear Information System (INIS)

    1999-07-01

    The Department of Energy (DOE) has developed this Standard to assist line managers in meeting their responsibilities for implementing occupational radiological control programs. DOE has established regulatory requirements for occupational radiation protection in Title 10 of the Code of Federal Regulations, Part 835 (10 CFR 835), ''Occupational Radiation Protection''. Failure to comply with these requirements may lead to appropriate enforcement actions as authorized under the Price Anderson Act Amendments (PAAA). While this Standard does not establish requirements, it does restate, paraphrase, or cite many (but not all) of the requirements of 10 CFR 835 and related documents (e.g., occupational safety and health, hazardous materials transportation, and environmental protection standards). Because of the wide range of activities undertaken by DOE and the varying requirements affecting these activities, DOE does not believe that it would be practical or useful to identify and reproduce the entire range of health and safety requirements in this Standard and therefore has not done so. In all cases, DOE cautions the user to review any underlying regulatory and contractual requirements and the primary guidance documents in their original context to ensure that the site program is adequate to ensure continuing compliance with the applicable requirements. To assist its operating entities in achieving and maintaining compliance with the requirements of 10 CFR 835, DOE has established its primary regulatory guidance in the DOE G 441.1 series of Guides. This Standard supplements the DOE G 441.1 series of Guides and serves as a secondary source of guidance for achieving compliance with 10 CFR 835

  17. DOE standard: Radiological control

    Energy Technology Data Exchange (ETDEWEB)

    1999-07-01

    The Department of Energy (DOE) has developed this Standard to assist line managers in meeting their responsibilities for implementing occupational radiological control programs. DOE has established regulatory requirements for occupational radiation protection in Title 10 of the Code of Federal Regulations, Part 835 (10 CFR 835), ``Occupational Radiation Protection``. Failure to comply with these requirements may lead to appropriate enforcement actions as authorized under the Price Anderson Act Amendments (PAAA). While this Standard does not establish requirements, it does restate, paraphrase, or cite many (but not all) of the requirements of 10 CFR 835 and related documents (e.g., occupational safety and health, hazardous materials transportation, and environmental protection standards). Because of the wide range of activities undertaken by DOE and the varying requirements affecting these activities, DOE does not believe that it would be practical or useful to identify and reproduce the entire range of health and safety requirements in this Standard and therefore has not done so. In all cases, DOE cautions the user to review any underlying regulatory and contractual requirements and the primary guidance documents in their original context to ensure that the site program is adequate to ensure continuing compliance with the applicable requirements. To assist its operating entities in achieving and maintaining compliance with the requirements of 10 CFR 835, DOE has established its primary regulatory guidance in the DOE G 441.1 series of Guides. This Standard supplements the DOE G 441.1 series of Guides and serves as a secondary source of guidance for achieving compliance with 10 CFR 835.

  18. Hg(+) Frequency Standards

    Science.gov (United States)

    Prestage, John D.; Tjoelker, Robert L.; Maleki, Lute

    2000-01-01

    In this paper we review the development of Hg(+) microwave frequency standards for use in high reliability and continuous operation applications. In recent work we have demonstrated short-term frequency stability of 3 x 10(exp -14)/sqrt(tau) when a cryogenic oscillator of stability 2-3 x 10(exp -15) was used as the local oscillator. The trapped ion frequency standard employs a Hg-202 discharge lamp to optically pump the trapped Hg(+)-199 clock ions and a helium buffer gas to cool the ions to near room temperature. We describe a small Hg(+) ion trap based frequency standard with an extended linear ion trap (LITE) architecture which separates the optical state selection region from the clock resonance region. This separation allows the use of novel trap configurations in the resonance region since no optical pumping is carried out there. A method for measuring the size of an ion cloud inside a linear trap with a 12-rod trap is currently being investigated. At approx. 10(exp -12), the 2nd order Doppler shift for trapped mercury ion frequency standards is one of the largest frequency offsets, and its measurement to the 1% level would represent an advance in ensuring the very long-term stability of these standards at the 10(exp -14) or better level. Finally, we describe atomic clock comparison experiments that can probe for a time variation of the fine structure constant, alpha = e(exp 2)/2(pi)hc, at the level of 10(exp -20)/year as predicted in some Grand Unified String Theories.
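
    The quoted short-term stability can be read as a white-frequency-noise Allan deviation, sigma_y(tau) = 3 x 10^-14 / sqrt(tau). A minimal sketch of what that averaging law implies; the 3 x 10^-14 coefficient is taken from the abstract, while the 1/sqrt(tau) model and the function names are assumptions for illustration:

```python
import math

SIGMA_1S = 3e-14  # reported stability coefficient: sigma_y(tau) = 3e-14 / sqrt(tau)

def allan_deviation(tau):
    """White-frequency-noise model: instability averages down as 1/sqrt(tau)."""
    return SIGMA_1S / math.sqrt(tau)

def tau_to_reach(target):
    """Averaging time (seconds) needed for sigma_y(tau) to fall to `target`."""
    return (SIGMA_1S / target) ** 2

# Under this model, reaching the 1e-15 level requires (3e-14 / 1e-15)^2 = 900 s.
tau = tau_to_reach(1e-15)
assert abs(tau - 900.0) < 1e-6
```

    Under this model, averaging to the 10^-15 level relevant to the long-term Doppler-shift budget takes on the order of fifteen minutes.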

  19. Department of Energy Standards Index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    This TSL, intended for use in selecting and using DOE technical standards and other Government and non-Government standards, provides listings of current and inactive DOE technical standards, non-Government standards adopted by DOE, other Government documents in which DOE has a recorded interest, and cancelled DOE technical standards.

  20. DOE technical standards list: Department of Energy standards index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    This technical standards list (TSL) was prepared for use by personnel involved in the selection and use of US DOE technical standards and other government and non-government standards. This TSL provides listings of current DOE technical standards, non-government standards that have been adopted by DOE, other government documents in which DOE has a recorded interest, and cancelled DOE technical standards. Standards are indexed by type in the appendices to this document. Definitions of and general guidance for the use of standards are also provided.

  1. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption for cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  2. Antibody reactions methods in safety standards

    International Nuclear Information System (INIS)

    Shubik, V.M.; Sirasdinov, V.G.; Zasedatelev, A.A.; Kal'nitskij, S.A.; Livshits, R.E.

    1978-01-01

    Results are presented of determinations of autoantibodies in white rats chronically administered the radionuclides 137Cs, 226Ra, and 90Sr, which show different distribution patterns in the body. Autoantibody production is found to increase when the absorbed doses are close to, or exceed seven- to tenfold, the maximum permissible values. The results obtained point to the desirability of autoantibody determination in studies aimed at setting hygienic standards for the absorption of radioactive substances

  3. An exponential distribution

    International Nuclear Information System (INIS)

    Anon

    2009-01-01

    In this presentation the author deals with the probabilistic evaluation of product life, using the example of the exponential distribution. The exponential distribution is a special one-parameter case of the Weibull distribution.
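
    The reduction mentioned here can be checked directly: with shape parameter k = 1, the Weibull density (k/lam)(x/lam)^(k-1) exp(-(x/lam)^k) collapses to the exponential density with rate 1/lam. A small sketch; the function names are illustrative:

```python
import math

def weibull_pdf(x, k, lam):
    """Weibull density: f(x) = (k/lam) * (x/lam)**(k-1) * exp(-(x/lam)**k)."""
    return (k / lam) * (x / lam) ** (k - 1) * math.exp(-((x / lam) ** k))

def exponential_pdf(x, rate):
    """Exponential density: f(x) = rate * exp(-rate * x)."""
    return rate * math.exp(-rate * x)

# With shape k = 1 and scale lam = 1/rate, the two densities coincide pointwise.
rate = 0.5
for x in (0.1, 1.0, 3.0, 10.0):
    assert abs(weibull_pdf(x, k=1, lam=1 / rate) - exponential_pdf(x, rate)) < 1e-12
```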

  4. 76 FR 4155 - National Emission Standards for Hazardous Air Pollutants for Source Categories: Gasoline...

    Science.gov (United States)

    2011-01-24

    40 CFR Part 63, National Emission Standards for Hazardous Air Pollutants for Source Categories: Gasoline Distribution Bulk Terminals, Bulk Plants, and Pipeline Facilities; and Gasoline Dispensing Facilities; Final Rule [RIN 2060-AP16].

  5. A general setting for symmetric distributions and their relationship to general distributions

    OpenAIRE

    Jupp, P.E.; Regoli, G.; Azzalini, A.

    2016-01-01

    A standard method of obtaining non-symmetrical distributions is that of modulating symmetrical distributions by multiplying the densities by a perturbation factor. This has been considered mainly for central symmetry of a Euclidean space about the origin. This paper enlarges the concept of modulation to the general setting of symmetry under the action of a compact topological group on the sample space. The main structural result relates the density of an arbitrary distribution to the density of ...
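
    The central-symmetry case described here can be illustrated with Azzalini's skew-normal construction, where the symmetric base density phi is multiplied by the perturbation factor 2*Phi(alpha*x); the modulation changes the shape but preserves total probability mass. A minimal numerical sketch (the grid size and integration bounds are arbitrary choices, not from this record):

```python
import math

def phi(x):
    """Standard normal density: the symmetric base density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal CDF, used here as the perturbation factor."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def skew_normal_pdf(x, alpha):
    """Azzalini's skew-normal: the base density phi modulated by 2*Phi(alpha*x)."""
    return 2 * phi(x) * Phi(alpha * x)

# The perturbation preserves total mass: numerical integration over a wide grid.
alpha = 3.0
n, lo, hi = 200_000, -12.0, 12.0
h = (hi - lo) / n
total = sum(skew_normal_pdf(lo + i * h, alpha) for i in range(n + 1)) * h
assert abs(total - 1.0) < 1e-6

# With alpha = 0 the perturbation factor is constant 1/2 and the base density is recovered.
assert skew_normal_pdf(1.0, 0.0) == phi(1.0)
```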

  6. 77 FR 59151 - Regional Reliability Standard PRC-006-NPCC-1-Automatic Underfrequency Load Shedding

    Science.gov (United States)

    2012-09-26

    ... This allows each smaller entity's respective planning coordinator to achieve the desired aggregate ... The proposed regional Reliability Standard applies to generator owners, planning coordinators, distribution providers, and transmission owners in the ...

  7. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decrease in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we present an overview of distributed computing, covering the difference between parallel and distributed computing, terminologies used in distributed computing, and task allocation in distribute...

  8. Method for secure electronic voting system: face recognition based approach

    Science.gov (United States)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost, secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used for face feature characterization in texture format, followed by the chi-square distribution for image classification. Two parallel systems, based on smart phone and web applications, are developed for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification, and a class-specific threshold is associated with each user to control the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
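
    The matching step described here, comparing LBP histograms with a chi-square measure against a class-specific threshold, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the histograms, threshold value, and function names are hypothetical:

```python
def chi_square_distance(h1, h2, eps=1e-12):
    """Chi-square distance between two normalized LBP histograms:
    sum_i (h1_i - h2_i)**2 / (h1_i + h2_i). eps guards empty bins."""
    return sum((a - b) ** 2 / (a + b + eps) for a, b in zip(h1, h2))

def verify(probe, enrolled, threshold):
    """Accept the probe if its chi-square distance to the enrolled
    template falls below a class-specific threshold."""
    return chi_square_distance(probe, enrolled) < threshold

enrolled = [0.30, 0.20, 0.25, 0.25]   # toy 4-bin LBP histogram (normalized)
genuine  = [0.28, 0.22, 0.24, 0.26]   # small deviations -> small distance
impostor = [0.05, 0.60, 0.05, 0.30]   # different texture -> large distance

assert verify(genuine, enrolled, threshold=0.05)
assert not verify(impostor, enrolled, threshold=0.05)
```

    Tightening the per-class threshold trades false accepts for false rejects, which is how the class-specific threshold mentioned in the abstract controls the security level.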

  9. Fissure sealants: Knowledge and practice of Yemeni dental practitioners.

    Science.gov (United States)

    Al-Maweri, Sadeq Ali; Al-Jamaei, Aisha Ahmed; Halboub, Esam Saleh; Al-Soneidar, Walid Ahmed; Tarakji, Bassel; Alsalhani, Anas

    2016-01-01

    This study was conducted to evaluate Yemeni dental practitioners' knowledge and practices concerning fissure sealants. A modified questionnaire consisting of 25 items was distributed to 500 dentists working in Sana'a City. Descriptive statistics and Chi-square/Fisher's exact tests were used for statistical analyses. The response rate was 74%. Most of the respondents were male (61.3%) and general practitioners (84.2%); the majority (88%) believed that there is strong scientific evidence for the effectiveness of fissure sealants, and around 90% showed a good understanding of sealant placement instructions. On the other hand, respondents showed insufficient knowledge about sealants in clinical practice. Although a high proportion of dental practitioners showed adequate knowledge about dental sealants, following guidelines and standardized procedures in clinical practice is lacking. These findings emphasize the need for regular continuing education courses for dental professionals.

  10. Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in ''Beyond the Standard Model'' for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e- colliders.

  11. Beyond the Standard Model

    International Nuclear Information System (INIS)

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in ''Beyond the Standard Model'' for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e- colliders

  12. Non_standard Wood

    DEFF Research Database (Denmark)

    Tamke, Martin

    Non-Standard elements in architecture bear the promise of a better, more specific performance (Oosterhuis 2003). A new understanding of design evolves, focusing on open-ended approaches able to negotiate between shifting requirements and to integrate knowledge on process and material. Using parametric design tools and computer-controlled production facilities, Copenhagen's Centre for IT and Architecture undertook practice-based research into performance-based non-standard element design and mass customization techniques. In close cooperation with wood construction software ... the integration of traditional wood craft techniques. The extensive use of self-adjusting, load-bearing wood-wood joints contributed to ease in production and assembly of a performance-based architecture.

  13. Standardization of Speech Corpus

    Directory of Open Access Journals (Sweden)

    Ai-jun Li

    2007-12-01

    Full Text Available Speech corpora are the basis for analyzing the characteristics of speech signals and developing speech synthesis and recognition systems. In China, almost all speech research and development affiliations are developing their own speech corpora. There are so many different kinds of Chinese speech corpora that it is important to be able to share them conveniently, to avoid wasting time and money and to make research work more efficient. The primary goal of this research is to find a standard scheme which allows a corpus to be established more efficiently and to be used or shared more easily. A huge speech corpus of 10 regional accents of Chinese, RASC863 (a Regional Accent Speech Corpus funded by the National 863 Project), will be exemplified to illuminate the standardization of speech corpus production.

  14. Testing the Standard Model

    CERN Document Server

    Riles, K

    1998-01-01

    The Large Electron-Positron (LEP) collider near Geneva, more than any other instrument, has rigorously tested the predictions of the Standard Model of elementary particles. LEP measurements have probed the theory from many different directions and, so far, the Standard Model has prevailed. The rigour of these tests has allowed LEP physicists to determine unequivocally the number of fundamental 'generations' of elementary particles. These tests also allowed physicists to ascertain the mass of the top quark in advance of its discovery. Recent increases in the accelerator's energy allow new measurements to be undertaken, measurements that may uncover directly or indirectly the long-sought Higgs particle, believed to impart mass to all other particles.

  15. Standard cartridges used in gamma spectrometry measurements of radioactive halogens

    International Nuclear Information System (INIS)

    Lepy, M.C.; Etcheverry, M.; Morel, J.; Chauvenet, B.

    1988-10-01

    Activated charcoal cartridges are used to trap radioactive halogens contained in gaseous effluents of nuclear facilities. Two types of standard cartridges, containing barium-133 or europium-152, are available: one model simulates a volume distribution and the other a surface distribution of the radionuclides inside the cartridge. They are characterized in terms of activity with an uncertainty lower than 5%. The conditions for using the standard cartridges are specified and the main causes of measurement error are analyzed. Proper routine use of these standards should yield results with an accuracy better than 10% [fr

  16. Standard Model physics

    CERN Multimedia

    Altarelli, Guido

    1999-01-01

    Introduction: structure of gauge theories. The QED and QCD examples. Chiral theories. The electroweak theory. Spontaneous symmetry breaking. The Higgs mechanism. Gauge boson and fermion masses. Yukawa couplings. Charged-current couplings. The Cabibbo-Kobayashi-Maskawa matrix and CP violation. Neutral-current couplings. The Glashow-Iliopoulos-Maiani mechanism. Gauge boson and Higgs couplings. Radiative corrections and loops. Cancellation of the chiral anomaly. Limits on the Higgs from comparison with data. Problems of the Standard Model. Outlook.

  17. Standard model and beyond

    International Nuclear Information System (INIS)

    Quigg, C.

    1984-09-01

    The SU(3)_c x SU(2)_L x U(1)_Y gauge theory of interactions among quarks and leptons is briefly described, and some recent notable successes of the theory are mentioned. Some shortcomings in our ability to apply the theory are noted, and the incompleteness of the standard model is exhibited. Experimental hints that Nature may be richer in structure than the minimal theory are discussed. 23 references

  18. DOE standard: Firearms safety

    International Nuclear Information System (INIS)

    1996-02-01

    Information in this document is applicable to all DOE facilities, elements, and contractors engaged in work that requires the use of firearms as provided by law or contract. The standard in this document provides principles and practices for implementing a safe and effective firearms safety program for protective forces and for non-security use of firearms. This document describes acceptable interpretations and methods for meeting Order requirements

  19. Determination of standard data

    International Nuclear Information System (INIS)

    Pychlau, P.

    1986-01-01

    The standard data used for diagnostic radiography refer to the filter system, the Bucky grid, the film type, exposure, and the film processing method. The same type of reference data is established for fluoroscopic screen devices, with data on the fluoroscopic exposure time and the area exposure product in addition. The measurements are done in compliance with section 29 of the X-Ray Ordinance and DIN 6868. (DG) [de

  1. DOE standard: Firearms safety

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    Information in this document is applicable to all DOE facilities, elements, and contractors engaged in work that requires the use of firearms as provided by law or contract. The standard in this document provides principles and practices for implementing a safe and effective firearms safety program for protective forces and for non-security use of firearms. This document describes acceptable interpretations and methods for meeting Order requirements.

  2. New standard environmental management

    International Nuclear Information System (INIS)

    Andriola, Luca; Luciani, Roberto

    2006-01-01

    The ISO 14001:2004 standard, like ISO 9001:2000 on quality management, transcends the preventive approach (based on a rigid and more or less adequate process-management model still mainly inspired by traditional production methods) and introduces in its stead a highly flexible approach applicable to any socio-economic activity. It is structured by processes rather than system elements, and is based on the quest for efficacy and ongoing improvement [it

  3. Telemetry Standards, Part 1

    Science.gov (United States)

    2015-07-01

    Excerpt from the standard's acronym list and attribute tables: RCC = Range Commanders Council; RFC = Request For Comment; RIU = remote interface unit; RMM = removable memory module; RS = Recommended Standard; RSCF ... Comment attributes include COMMENTS G\COM (allowed always) and COMMENTS M-x\COM (allowed when M\ID is specified; range: 6 characters), which provide the additional information requested or any other information desired, if applicable.

  4. Natural matrix standards

    International Nuclear Information System (INIS)

    Bowen, V.T.

    1976-01-01

    Environmental radiochemistry needs, for use in analytical intercomparison and as standard reference materials, very large homogeneous samples of a variety of matrices, each naturally contaminated by a variety of longer-lived radionuclides at several different ranges of concentrations. The reasons for this need are discussed, and the minimum assortment of matrices, radionuclides, and concentrations is established. Sources of suitable materials are suggested, and the international approach to meeting this need is emphasized

  5. Energy labels and standards

    International Nuclear Information System (INIS)

    Newman, J.

    2000-01-01

    Improving energy efficiency at the end-use level is increasingly important as Climate Change commitments force policy makers to look for areas where greenhouse gas emissions reduction can be achieved rapidly. Indeed, although much improvement has been made over the past 25 years, significant potential for improving energy efficiency still exists. Labelling and minimum efficiency standards for appliances and equipment have proven to be one of the most promising policy instruments. Used for many years in some IEA Member countries, they have delivered tangible results. They are among the cheapest and least intrusive of policies. Policy makers cannot afford to neglect them. This book examines current and past experiences of countries using labels and standards to improve energy end-use efficiency. It identifies successful policy approaches, focusing on what works best. It also provides insight into the opportunities ahead, including the widespread use of computer chips in appliances, cars and equipment. This book should be of great help not only to administrations planning to introduce labelling schemes, but also to those in the process of strengthening their current programmes. Policy makers in developing countries will also find here all necessary justification for implementing labelling and standards in their economy. 74 refs

  6. Operator licensing examiner standards

    International Nuclear Information System (INIS)

    1994-06-01

    The Operator Licensing Examiner Standards provide policy and guidance to NRC examiners and establish the procedures and practices for examining licensees and applicants for reactor operator and senior reactor operator licenses at power reactor facilities pursuant to Part 55 of Title 10 of the Code of Federal Regulations (10 CFR 55). The Examiner Standards are intended to assist NRC examiners and facility licensees to better understand the initial and requalification examination processes and to ensure the equitable and consistent administration of examinations to all applicants. These standards are not a substitute for the operator licensing regulations and are subject to revision or other internal operator licensing policy changes. Revision 7 was published in January 1993 and became effective in August 1993. Supplement 1 is being issued primarily to implement administrative changes to the requalification examination program resulting from the amendment to 10 CFR 55 that eliminated the requirement for every licensed operator to pass an NRC-conducted requalification examination as a condition for license renewal. The supplement does not substantially alter either the initial or requalification examination processes and will become effective 30 days after its publication is noticed in the Federal Register. The corporate notification letters issued after the effective date will provide facility licensees with at least 90 days notice that the examinations will be administered in accordance with the revised procedures

  7. Implementing PAT with Standards

    Science.gov (United States)

    Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.

    2016-02-01

    Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent to inconsistent representation of business processes, and interoperability issues in PAT like cap-and-trade mechanisms especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves including more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing & verifying energy-saving reports, and providing technical support & guidance to stakeholders); and how the aforesaid reasons affect them. Though current technologies can handle these challenges to an extent, standardization activities for implementation have been scanty for PAT and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems are addressed. This paper proposes the adoption of specifically two standards into PAT, namely Business Process Model and Notation for maintaining consistency in business process modelling, and Common Information Model (IEC 61970, 61968, 62325 combined) for information exchange. Detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.

  8. Emission- and product standards

    International Nuclear Information System (INIS)

    Jong, P. de

    1988-01-01

    This report is part of a series of eight reports drawn up on behalf of the Dutch Policy Notice Radiation Standards (BNS). In this report the results are presented of an inventory of the use of radioactive materials and ionizing-radiation emitting apparatus in the Netherlands. Ch. 2 deals with the various applications of radioactive materials in the Netherlands, including the numbers and locations by application and the amounts and character of the radioactive materials used; the various waste streams are considered separately. The use of ionizing-radiation emitting apparatus is treated in ch. 3. In ch. 4 the differences and similarities between the various applications are discussed further, concentrating on the emission and product standards to be drawn up. On the basis of these considerations, a number of starting points are formulated with regard to the way in which emission and product standards may be drawn up. Ch. 7 deals with the conclusions and indicates the most important gaps. (H.W.). 25 refs.; 5 figs.; 25 tabs

  9. ISO 50001 Energy Management Standard

    Energy Technology Data Exchange (ETDEWEB)

    None

    2013-08-12

    This powerful standard from the International Organization for Standardization (ISO) provides an internationally recognized framework for organizations to voluntarily implement an energy management system.

  10. Standardization as Spaces of Diversity

    Directory of Open Access Journals (Sweden)

    Allison Marie Loconto

    2017-08-01

    Full Text Available Standards have become an important object of investigation in social science, and STS scholars have called for a more systematic program of research to study standards or standardization (Busch 2011; Timmermans and Epstein 2010). In this considering concepts paper, we engage with their program for a sociology of standards and propose a new way to think about standards and standardization as "spaces of diversity" so as to push our thinking forward about how standards, standardization and innovation processes are linked. We consider standardization as the dynamic interaction in three spaces (standards in the making, standards in action, and standards in circulation) where diversity reemerges only to be tentatively reduced or limited through new rounds of standard setting. We illustrate how diversity is an integral part of standardization with the example of the Rainforest Alliance standard for tea production as it circulated from Costa Rica to Kenya, where it was made and put into action and then circulated again to other African, Asian, and Latin American countries. We end with a proposition for future research on standards to address these other spaces of standards as loci of standardization and innovation.

  11. Reactor power distribution monitor

    International Nuclear Information System (INIS)

    Hoizumi, Atsushi.

    1986-01-01

    Purpose: To grasp the margin to the limit value of the power distribution peaking factor inside the reactor under operation by using the reactor power distribution monitor. Constitution: The monitor is composed of the 'constant' file (to store in-reactor power distributions obtained from analysis), the TIP and thermocouples, a lateral power distribution calibrating apparatus, an axial power distribution synthesizer, and a peaking factor synthesizer. The lateral power distribution calibrating apparatus makes a calibration by comparing the power distribution obtained from the thermocouples with the power distribution obtained from the TIP, and then provides the lateral power distribution peaking factors. The axial power distribution synthesizer provides the axial power distribution peaking factors in accordance with the signals from the out-pile neutron flux detector. These axial and lateral peaking factors are synthesized with high precision in a three-dimensional format and can be monitored at any time. (Kamimura, M.)
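
    A common simplification behind synthesizing axial and lateral peaking factors into a total peaking factor is separability of the power shape, in which case the three-dimensional peaking factor is the product of the two one-dimensional ones. A minimal sketch; the profiles, the normalization convention, and the limit value are illustrative assumptions, not taken from this record:

```python
def total_peaking_factor(axial_profile, lateral_profile):
    """Under a separability assumption (power ~ axial(z) * lateral(x, y),
    each profile normalized to a core-average of 1.0), the 3-D peaking
    factor is the product of the two 1-D peaking factors."""
    f_axial = max(axial_profile)      # axial peaking factor
    f_lateral = max(lateral_profile)  # lateral (radial) peaking factor
    return f_axial * f_lateral

axial = [0.6, 1.0, 1.4, 1.2, 0.8]     # illustrative normalized axial shape
lateral = [0.9, 1.3, 1.1, 0.7]        # illustrative normalized lateral shape

fq = total_peaking_factor(axial, lateral)
assert abs(fq - 1.4 * 1.3) < 1e-12

# The monitored quantity is then the margin to an assumed limit value.
limit = 2.5
margin = limit - fq
assert margin > 0
```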

  12. ESDIS Standards Office (ESO): Requirements, Standards and Practices

    Science.gov (United States)

    Mitchell, Andrew E.; Mcinerney, Mark Allen; Enloe, Yonsok K.; Conover, Helen T.; Doyle, Allan

    2016-01-01

    The ESDIS Standards Office assists the ESDIS Project in formulating standards policy for NASA Earth Science Data Systems (ESDS), coordinates standards activities within ESDIS, and provides technical expertise and assistance with standards related tasks within the NASA Earth Science Data System Working Groups (ESDSWG). This poster summarizes information found on the earthdata.nasa.gov site that describes the ESO.

  13. 76 FR 75840 - Revising Standards Referenced in the Acetylene Standard

    Science.gov (United States)

    2011-12-05

    [Docket No. OSHA-2011-0183] RIN 1218-AC64, Revising Standards Referenced in the Acetylene Standard. AGENCY: Occupational Safety and Health Administration (OSHA), Department of Labor. ACTION: Notice of proposed rulemaking. OSHA proposes to update references in the acetylene standard to standards developed by standards developing organizations (''SDO standards''). OSHA also is publishing a direct final rule in today's Federal Register.

  14. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, the need for a balance between prescriptive and general standards, enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality, and appendices of government and non-governmental human factors standards.

  15. Beyond the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Lykken, Joseph D.; /Fermilab

    2010-05-01

    'BSM physics' is a phrase used in several ways. It can refer to physical phenomena established experimentally but not accommodated by the Standard Model, in particular dark matter and neutrino oscillations (technically also anything that has to do with gravity, since gravity is not part of the Standard Model). 'Beyond the Standard Model' can also refer to possible deeper explanations of phenomena that are accommodated by the Standard Model but only with ad hoc parameterizations, such as Yukawa couplings and the strong CP angle. More generally, BSM can be taken to refer to any possible extension of the Standard Model, whether or not the extension solves any particular set of puzzles left unresolved in the SM. In this general sense one sees reference to the BSM 'theory space' of all possible SM extensions, this being a parameter space of coupling constants for new interactions, new charges or other quantum numbers, and parameters describing possible new degrees of freedom or new symmetries. Despite decades of model-building it seems unlikely that we have mapped out most of, or even the most interesting parts of, this theory space. Indeed we do not even know what is the dimensionality of this parameter space, or what fraction of it is already ruled out by experiment. Since Nature is only implementing at most one point in this BSM theory space (at least in our neighborhood of space and time), it might seem an impossible task to map back from a finite number of experimental discoveries and measurements to a unique BSM explanation. Fortunately for theorists the inevitable limitations of experiments themselves, in terms of resolutions, rates, and energy scales, means that in practice there are only a finite number of BSM model 'equivalence classes' competing at any given time to explain any given set of results. BSM phenomenology is a two-way street: not only do experimental results test or constrain BSM models, they also suggest

  16. Renewable resource policy when distributional impacts matter

    International Nuclear Information System (INIS)

    Horan, R.D.; Shortle, J.S.; Bulte, E.H.

    1999-01-01

The standard assumption in bioeconomic resource models is that optimal policies maximize the present value of economic surplus to society. This assumption implies that regulatory agencies should not be concerned with the distributional consequences of management strategies. Both contemporary welfare-theoretic and rent-seeking approaches suggest distributional issues are important in designing resource management policies. This paper explores resource management when the managing agency has preferences defined over the economic welfare of various groups with a direct economic interest in the use of resources. Policy schemes consistent with this approach are derived and compared with standard results. 42 refs

  17. Standardization of biodosimetry operations

    International Nuclear Information System (INIS)

    Dainiak, Nicholas

    2016-01-01

Methods and procedures for generating, interpreting and scoring the frequency of dicentric chromosomes vary among cytogenetic biodosimetry laboratories (CBLs). This variation adds to the already considerable lack of precision inherent in the dicentric chromosome assay (DCA). Although variability in sample collection, cell preparation, equipment and dicentric frequency scoring can never be eliminated with certainty, it can be substantially minimized, resulting in reduced scatter and improved precision. Use of standard operating procedures and technician exchange may help to mitigate variation. Although the development and adoption of international standards (ISO 21243 and ISO 19238) has helped to reduce variation in standard operating procedures (SOPs), all CBLs must maintain process improvement, and those with challenges may require additional assistance. Sources of variation that may not be readily apparent in the SOPs for sample collection and processing include variability in ambient laboratory conditions, media, serum lot and quantity and the use of particular combinations of cytokines. Variability in maintenance and calibration of Metafer equipment, and in scoring criteria, reader proficiency and personal factors may need to be addressed. The calibration curve itself is a source of variation that requires control, using the same known-dose samples among CBLs, measurement of central tendency, and generation of common curves with periodic reassessment to detect drifts in dicentric yield. Finally, the dose estimate should be based on common scoring criteria, using the z-statistic. Although theoretically possible, it is practically impossible to propagate uncertainty over the entire calibration curve due to the many factors contributing to variance. Periodic re-evaluation of the curve is needed by comparison with newly published curves (using statistical analysis of differences) and determination of their potential causes. (author)
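
    The calibration curve discussed above is conventionally a linear-quadratic dose-response, Y = c + alpha*D + beta*D^2 (dicentric yield per cell as a function of absorbed dose), and the dose estimate is obtained by inverting it. A minimal sketch, assuming purely illustrative coefficients (a real estimate must use a laboratory's own fitted curve and its uncertainties):

    ```python
    import math

    def dose_from_yield(y, c=0.001, alpha=0.02, beta=0.06):
        """Invert the linear-quadratic calibration Y = c + alpha*D + beta*D**2
        to estimate absorbed dose D in Gy from a dicentric yield y
        (dicentrics per cell). Coefficients here are illustrative placeholders."""
        disc = alpha ** 2 + 4.0 * beta * (y - c)
        if disc < 0:
            raise ValueError("observed yield is below the background of the curve")
        return (-alpha + math.sqrt(disc)) / (2.0 * beta)

    # e.g. an observed yield of 0.281 dicentrics per cell on this curve
    dose = dose_from_yield(0.281)  # 2.0 Gy
    ```

    The positive root of the quadratic is taken because dose cannot be negative; yields at or below the background term c map to a dose of zero.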

  18. Status of conversion of NE standards to national consensus standards

    International Nuclear Information System (INIS)

    Jennings, S.D.

    1990-06-01

One major goal of the Nuclear Standards Program is to convert existing NE standards into national consensus standards (where possible). This means replacing an NE standard with a national consensus standard developed in the same subject area using the national consensus process. This report is a summary of the activities that have evolved to effect conversion of NE standards to national consensus standards, and the status of current conversion activities. In some cases, all requirements in an NE standard will not be incorporated into the published national consensus standard because these requirements may be considered too restrictive or too specific for broader application by the nuclear industry. If these requirements are considered necessary for nuclear reactor program applications, the program standard will be revised and issued as a supplement to the national consensus standard. The supplemental program standard will contain only those necessary requirements not reflected by the national consensus standard. Therefore, while complete conversion of program standards may not always be realized, the standards policy has been fully supported in attempting to make maximum use of the national consensus standard. 1 tab

  19. Quasi standard model physics

    International Nuclear Information System (INIS)

    Peccei, R.D.

    1986-01-01

Possible small extensions of the standard model are considered, which are motivated by the strong CP problem and by the baryon asymmetry of the Universe. Phenomenological arguments are given which suggest that imposing a PQ symmetry to solve the strong CP problem is only tenable if the scale of the PQ breakdown is much above M_W. Furthermore, an attempt is made to connect the scale of the PQ breakdown to that of the breakdown of lepton number. It is argued that in these theories the same intermediate scale may be responsible for the baryon number of the Universe, provided the Kuzmin-Rubakov-Shaposhnikov (B+L) erasing mechanism is operative. (orig.)

  20. Non-standard antennas

    CERN Document Server

    Le Chevalier, Francois; Staraj, Robert

    2013-01-01

    This book aims at describing the wide variety of new technologies and concepts of non-standard antenna systems - reconfigurable, integrated, terahertz, deformable, ultra-wideband, using metamaterials, or MEMS, etc. - and how they open the way to a wide range of applications, from personal security and communications to multifunction radars and towed sonars, or satellite navigation systems, with space-time diversity on transmit and receive. A reference book for designers in this lively scientific community linking antenna experts and signal processing engineers.

  1. Hazard Communication Standard

    International Nuclear Information System (INIS)

    Sichak, S.

    1991-01-01

    The current rate of technological advances has brought with it an overwhelming increase in the usage of chemicals in the workplace and in the home. Coupled to this increase has been a heightened awareness in the potential for acute and chronic injuries attributable to chemical insults. The Hazard Communication Standard has been introduced with the desired goal of reducing workplace exposures to hazardous substances and thereby achieving a corresponding reduction in adverse health effects. It was created and proclaimed by the US Department of Labor and regulated by the Occupational Safety and Health Administration. 1 tab

  2. Standard-model bundles

    CERN Document Server

    Donagi, Ron; Pantev, Tony; Waldram, Dan; Donagi, Ron; Ovrut, Burt; Pantev, Tony; Waldram, Dan

    2002-01-01

    We describe a family of genus one fibered Calabi-Yau threefolds with fundamental group ${\mathbb Z}/2$. On each Calabi-Yau $Z$ in the family we exhibit a positive dimensional family of Mumford stable bundles whose symmetry group is the Standard Model group $SU(3)\times SU(2)\times U(1)$ and which have $c_{3} = 6$. We also show that for each bundle $V$ in our family, $c_{2}(Z) - c_{2}(V)$ is the class of an effective curve on $Z$. These conditions ensure that $Z$ and $V$ can be used for a phenomenologically relevant compactification of Heterotic M-theory.

  3. Primary length standard adjustment

    Science.gov (United States)

    Ševčík, Robert; Guttenová, Jana

    2007-04-01

    This paper deals with problems and techniques connected with adjusting the primary length standard, which includes disassembling the device and successively reassembling the laser using a secondary laser with a collimated beam and the laws of diffraction. In the reassembling process the device was enhanced by replacing the thermal-grease cooling of the cold finger with a copper socket cooler. This improved external cooling system enables more effective cooling of the molecular iodine in the cell, which allows better pressure stability of the iodine vapor and easier readjustment of the system.

  4. Extended Poisson Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Anum Fatima

    2015-09-01

    A new mixture of the Modified Exponential (ME) and Poisson distributions has been introduced in this paper. Taking the maximum of Modified Exponential random variables when the sample size follows a zero-truncated Poisson distribution, we have derived the new distribution, named the Extended Poisson Exponential distribution. This distribution possesses increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential and Exponential distributions are special cases of this distribution. We have also investigated some mathematical properties of the distribution along with information entropies and order statistics of the distribution. The estimation of parameters has been obtained using the Maximum Likelihood Estimation procedure. Finally we have illustrated a real data application of our distribution.
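
    The construction described above can be sampled directly: draw a sample size N from a zero-truncated Poisson and return the maximum of N draws from the base distribution. The sketch below uses the plain Exponential special case noted in the abstract (the exact form of the Modified Exponential is not given here), with arbitrary parameter values:

    ```python
    import math
    import random

    def poisson(lam, rng):
        # Knuth's method: count uniforms until their running product
        # drops below exp(-lam)
        limit = math.exp(-lam)
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p < limit:
                return k
            k += 1

    def zero_truncated_poisson(lam, rng):
        # rejection sampling: redraw until a nonzero count appears
        while True:
            n = poisson(lam, rng)
            if n > 0:
                return n

    def epe_sample(lam, rate, rng):
        # maximum of N exponential variates, N ~ zero-truncated Poisson(lam)
        n = zero_truncated_poisson(lam, rng)
        return max(rng.expovariate(rate) for _ in range(n))

    rng = random.Random(42)
    x = epe_sample(2.0, 1.0, rng)  # one draw; always strictly positive
    ```

    For this special case the CDF is available in closed form, F(x) = (exp(lam*(1 - e^(-rate*x))) - 1)/(e^lam - 1), which makes the sampler easy to check by Monte Carlo.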

  5. Distributed user interfaces usability and collaboration

    CERN Document Server

    Lozano, María D; Tesoriero, Ricardo; Penichet, Victor MR

    2013-01-01

    Written by international researchers in the field of Distributed User Interfaces (DUIs), this book brings together important contributions regarding collaboration and usability in Distributed User Interface settings. Throughout the thirteen chapters authors address key questions concerning how collaboration can be improved by using DUIs, including: in which situations a DUI is suitable to ease the collaboration among users; how usability standards can be used to evaluate the usability of systems based on DUIs; and case studies and prototypes implementing these concerns

  6. Math: Basic Skills Content Standards

    Science.gov (United States)

    CASAS - Comprehensive Adult Student Assessment Systems (NJ1), 2008

    2008-01-01

    This document presents content standards tables for math. [CASAS content standards tables are designed for educators at national, state and local levels to inform the alignment of content standards, instruction and assessment. The Content Standards along with the CASAS Competencies form the basis of the CASAS integrated assessment and curriculum…

  7. Repeated Interaction in Standard Setting

    NARCIS (Netherlands)

    Larouche, Pierre; Schütt, Florian

    2016-01-01

    As part of the standard-setting process, certain patents become essential. This may allow the owners of these standard-essential patents to hold up implementers of the standard, who can no longer turn to substitute technologies. However, many real-world standards evolve over time, with several

  8. European standards for composite construction

    NARCIS (Netherlands)

    Stark, J.W.B.

    2000-01-01

    The European Standards Organisation (CEN) has planned to develop a complete set of harmonized European building standards. This set includes standards for composite steel and concrete buildings and bridges. The Eurocodes, being the design standards, form part of this total system of European

  9. Emergency Management Standards and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2009

    2009-01-01

    This publication discusses emergency management standards for school use and lists standards recommended by FEMA's National Incident Management System (NIMS). Schools are encouraged to review these standards carefully and to adopt, where applicable, those that meet their needs. The lists of standards, resources, and references contained herein…

  10. New Trends in Multimedia Standards: MPEG4 and JPEG2000

    Directory of Open Access Journals (Sweden)

    Jie Liang

    1999-01-01

    The dramatic increase in both computational power, brought on by the introduction of increasingly powerful chips, and the communications bandwidth, unleashed by the introduction of cable modem and ADSL, lays a solid foundation for the take-off of multimedia applications. Standards always play an important role in multimedia applications due to the need for wide distribution of multimedia contents. Standards have long played pivotal roles in the development of multimedia equipment and contents. MPEG4 and JPEG2000 are two recent multimedia standards under development under the auspices of the International Standards Organization (ISO). These new standards introduce new technology and new features that will become enabling technology for many emerging applications. In this paper, we describe the new trends and new developments that shape these new standards, and illustrate the potential impact of these new standards on multimedia applications.

  11. Distributed Data Management and Distributed File Systems

    CERN Document Server

    Girone, Maria

    2015-01-01

    The LHC program has been successful in part due to the globally distributed computing resources used for collecting, serving, processing, and analyzing the large LHC datasets. The introduction of distributed computing early in the LHC program spawned the development of new technologies and techniques to synchronize information and data between physically separated computing centers. Two of the most challenging services are the distributed file systems and the distributed data management systems. In this paper I will discuss how we have evolved from local site services to more globally independent services in the areas of distributed file systems and data management and how these capabilities may continue to evolve into the future. I will address the design choices, the motivations, and the future evolution of the computing systems used for High Energy Physics.

  12. XML Diagnostics Description Standard

    International Nuclear Information System (INIS)

    Neto, A.; Fernandes, H.; Varandas, C.; Lister, J.; Yonekawa, I.

    2006-01-01

    A standard for the self-description of fusion plasma diagnostics will be presented, based on the Extensible Markup Language (XML). The motivation is to maintain and organise the information on all the components of a laboratory experiment, from the hardware to the access security, to save time and money when problems arise. Since there is no existing standard to organise this kind of information, every Association stores and organises each experiment in different ways. This can lead to severe problems when the organisation schema is poorly documented or written in national languages. The exchange of scientists, researchers and engineers between laboratories is a common practice nowadays. Sometimes they have to install new diagnostics or to update existing ones and frequently they lose a great deal of time trying to understand the currently installed system. The most common problems are: no documentation available; the person who understands it has left; documentation written in the national language. Standardisation is the key to solving all the problems mentioned. From the commercial information on the diagnostic (component supplier; component price) to the hardware description (component specifications; drawings) to the operation of the equipment (finite state machines) through change control (who changed what and when) and internationalisation (information at least in the native language and in English), a common XML schema will be proposed. This paper will also discuss an extension of these ideas to the self-description of ITER plant systems, since the problems will be identical. (author)

  13. The standard model

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1994-03-01

    In these lectures, my aim is to provide a survey of the standard model with emphasis on its renormalizability and electroweak radiative corrections. Since this is a school, I will try to be somewhat pedagogical by providing examples of loop calculations. In that way, I hope to illustrate some of the commonly employed tools of particle physics. With those goals in mind, I have organized my presentations as follows: In Section 2, renormalization is discussed from an applied perspective. The technique of dimensional regularization is described and used to define running couplings and masses. The utility of the renormalization group for computing leading logs is illustrated for the muon anomalous magnetic moment. In Section 3 electroweak radiative corrections are discussed. Standard model predictions are surveyed and used to constrain the top quark mass. The S, T, and U parameters are introduced and employed to probe for "new physics". The effect of Z' bosons on low energy phenomenology is described. In Section 4, a detailed illustration of electroweak radiative corrections is given for atomic parity violation. Finally, in Section 5, I conclude with an outlook for the future

  14. Standards for scrotal ultrasonography

    Directory of Open Access Journals (Sweden)

    Janusz F. Tyloch

    2016-12-01

    The paper presents a description of essential equipment requirements for scrotal ultrasonography, including current ultrasound techniques, as well as a review of the most common scrotal pathologies. Patient preparation for the examination as well as ultrasound methodology for the assessment of scrotal and inguinal canal structures are discussed. The standard for scrotal ultrasound examination includes a precise B-mode evaluation, including testicular volumetric assessment performed using automatic measurement options based on the formula of a rotating ellipsoid or three measurements perpendicular to one another. Also, criteria for morphological assessment of abnormalities within testicular or epididymal parenchyma, including a precise evaluation of lesion size, delineation, shape and vascular pattern obtained with Doppler US, have been proposed. Standard assessment further includes epididymal evaluation, including epididymal size in the case of enlargement. The paper additionally discusses the method of ultrasonographic examination and describes the most common pathologies occurring within scrotal structures, including a quantitative analysis of hydrocele and other abnormal fluid reservoirs. We have also presented criteria for the assessment of varicocele as well as color and spectral Doppler flows in scrotal pathologies. Furthermore, we have proposed key components of scrotal ultrasound documentation, so that the contained data could be used to establish appropriate diagnosis, allowing for both adequate clinical management and the reproducibility of subsequent US evaluations performed by either the same or a different examiner. The most common causes of diagnostic errors have also been discussed.

  15. Standards for scrotal ultrasonography

    Science.gov (United States)

    Tyloch, Janusz F.

    2016-01-01

    The paper presents a description of essential equipment requirements for scrotal ultrasonography, including current ultrasound techniques, as well as a review of the most common scrotal pathologies. Patient preparation for the examination as well as ultrasound methodology for the assessment of scrotal and inguinal canal structures are discussed. The standard for scrotal ultrasound examination includes a precise B-mode evaluation, including testicular volumetric assessment performed using automatic measurement options based on the formula of a rotating ellipsoid or three measurements perpendicular to one another. Also, criteria for morphological assessment of abnormalities within testicular or epididymal parenchyma, including a precise evaluation of lesion size, delineation, shape and vascular pattern obtained with Doppler US, have been proposed. Standard assessment further includes epididymal evaluation, including epididymal size in the case of enlargement. The paper additionally discusses the method of ultrasonographic examination and describes the most common pathologies occurring within scrotal structures, including a quantitative analysis of hydrocele and other abnormal fluid reservoirs. We have also presented criteria for the assessment of varicocele as well as color and spectral Doppler flows in scrotal pathologies. Furthermore, we have proposed key components of scrotal ultrasound documentation, so that the contained data could be used to establish appropriate diagnosis, allowing for both adequate clinical management and the reproducibility of subsequent US evaluations performed by either the same or a different examiner. The most common causes of diagnostic errors have also been discussed. PMID:28138410

  16. Information architecture: Profile of adopted standards

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-09-01

    The Department of Energy (DOE), like other Federal agencies, is under increasing pressure to use information technology to improve efficiency in mission accomplishment as well as delivery of services to the public. Because users and systems have become interdependent, DOE has enterprise-wide needs for common application architectures, communication networks, databases, security, and management capabilities. Users need open systems that provide interoperability of products and portability of people, data, and applications that are distributed throughout heterogeneous computing environments. The level of interoperability necessary requires the adoption of DOE-wide standards, protocols, and best practices. The Department has developed an information architecture and a related standards adoption and retirement process to assist users in developing strategies and plans for acquiring information technology products and services based upon open systems standards that support application software interoperability, portability, and scalability. This set of Departmental Information Architecture standards represents guidance for achieving higher degrees of interoperability within the greater DOE community, business partners, and stakeholders. While these standards are not mandatory, due consideration of their application in contractual matters and their use in technology implementations Department-wide are goals of the Chief Information Officer.

  17. Earthrods: The need for an Australian standard

    Energy Technology Data Exchange (ETDEWEB)

    Lyle, W. [Transnorth Pty Ltd., Morphett Vale, SA (Australia)

    1995-12-31

    Earth-rods are an integral and important part of most electrical distribution systems but there is no Australian Standard covering their design and detail. This leads to a wide and diverse range of earth-rod types in use as many authorities prepare their own criteria which may differ from others and yet the basic performance of the earth-rods is expected to be the same. This paper presents a case for the preparation of an Australian Standard so that users can select standard earth-rods and manufacturers can optimize production costs. It identifies the many and various types, forms and materials used in Australia for earth-rods by power authorities, communications, consumers and other major users of earthing systems. Add-on points, couplings, driving heads and installation methods are also discussed. The paper concludes with a request that the Electricity Supply Association of Australia (ESAA) and/or Standards Australia become involved in establishing an Australian Standard for earth-rods covering such aspects as materials, dimensions, cladding thickness, coupling characteristics, mechanical and electrical properties and tests, and any other variables which control the effectiveness of the earth-rods. (author). 6 figs.

  18. Food Standards are Good – for Middle-Class Farmers

    DEFF Research Database (Denmark)

    Hansen, Henrik; Trifkovic, Neda

    2014-01-01

    We estimate the causal effect of food standards on Vietnamese pangasius farmers’ wellbeing measured by per capita consumption expenditure. We estimate both the average effects and the local average treatment effects on poorer and richer farmers by instrumental variable quantile regression. Our results indicate that large returns can be accrued from food standards, but only for the upper middle-class farmers, i.e., those between the 50% and 85% quantiles of the expenditure distribution. Overall, our result points to an exclusionary impact of standards for the poorest farmers while the richest do not apply standards because the added gain is too small.

  19. Food Standards are Good– for Middle-class Farmers

    DEFF Research Database (Denmark)

    Hansen, Henrik; Trifkovic, Neda

    We estimate the causal effect of food standards on Vietnamese pangasius farmers’ wellbeing measured by per capita consumption expenditure. We estimate both the average effects and the local average treatment effects on poorer and richer farmers by instrumental variable quantile regression. Our results indicate that large returns can be accrued from food standards, but only for the upper middle-class farmers, i.e., those between the 50% and 85% quantiles of the expenditure distribution. Overall, our result points to an exclusionary impact of standards for the poorest farmers while the richest do not apply standards because the added gain is too small.

  20. Parton distributions with LHC data

    DEFF Research Database (Denmark)

    Ball, R.D.; Deans, C.S.; Del Debbio, L.

    2013-01-01

    We present the first determination of parton distributions of the nucleon at NLO and NNLO based on a global data set which includes LHC data: NNPDF2.3. Our data set includes, besides the deep inelastic, Drell-Yan, gauge boson production and jet data already used in previous global PDF determinations, all the relevant LHC data for which experimental systematic uncertainties are currently available: ATLAS and LHCb W and Z rapidity distributions from the 2010 run, CMS W electron asymmetry data from the 2011 run, and ATLAS inclusive jet cross-sections from the 2010 run. We introduce an improved … with each other and with all the older data sets included in the fit. We present predictions for various standard candle cross-sections, and compare them to those obtained previously using NNPDF2.1, and specifically discuss the impact of ATLAS electroweak data on the determination of the strangeness …

  1. Current and emerging standards in document imaging and storage

    Science.gov (United States)

    Baronas, Jean M.

    1992-05-01

    Standards publications being developed by scientists, engineers, and business managers in the Association for Information and Image Management (AIIM) standards committees can be applied to 'electronic image management' (EIM) processes including: document image transfer, retrieval and evaluation; optical disk and document scanning; and document design and conversion. When combined with EIM system planning and operations, standards can help generate image databases that are interchangeable among a variety of systems. AIIM is an accredited American National Standards Institute (ANSI) standards developer with more than twenty committees. The committees are comprised of 300 volunteers representing users, vendors, and manufacturers. The standards publications that are developed in these committees have national acceptance. They provide the basis for international harmonization in the development of new International Organization for Standardization (ISO) standards. Until standard implementation parameters are established, the application of different approaches to image management causes uncertainty in EIM system compatibility, calibration, performance, and upward compatibility. The AIIM standards for these applications can be used to decrease the uncertainty, successfully integrate imaging processes, and promote 'open systems.' This paper describes AIIM's EIM standards and a new effort at AIIM, a database on standards projects in a wide framework, including image capture, recording, processing, duplication, distribution, display, evaluation, preservation, and media. The AIIM Imagery Database covers imaging standards being developed by many organizations in many different countries. It contains standards publications' dates, origins, related national and international projects, status, keywords, and abstracts. The ANSI Image Technology Standards Board (ITSB) requested that such a database be established, as did the International Standards Organization

  2. GENDER, SOCIAL TRUST AND POLITICAL SOCIALIZATION IN ...

    African Journals Online (AJOL)

    girls have for these agents were examined on the assumption that the higher the trust for an agent, the more effective that agent would be, and vice versa. Self-administered questionnaires were utilized to generate the empirical data, which were analyzed using the chi-square and standardized residuals non-parametric ...
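
    A chi-square test with standardized residuals of the kind used here can be sketched for a 2x2 table in plain Python. The counts below are hypothetical, and the statistic is compared against the 5% critical value 3.841 for one degree of freedom:

    ```python
    import math

    def chi_square_2x2(table):
        """Pearson chi-square statistic and standardized residuals for a
        2x2 contingency table (rows: groups, columns: outcomes)."""
        row_totals = [sum(r) for r in table]
        col_totals = [sum(c) for c in zip(*table)]
        n = sum(row_totals)
        stat, residuals = 0.0, []
        for i, r in enumerate(table):
            res_row = []
            for j, observed in enumerate(r):
                expected = row_totals[i] * col_totals[j] / n
                res_row.append((observed - expected) / math.sqrt(expected))
                stat += (observed - expected) ** 2 / expected
            residuals.append(res_row)
        return stat, residuals

    # hypothetical counts: boys vs girls reporting high/low trust in an agent
    stat, residuals = chi_square_2x2([[40, 60], [55, 45]])
    # df = (2-1)*(2-1) = 1; compare stat with the 5% critical value 3.841
    ```

    For this made-up table the statistic is about 4.51, exceeding 3.841, and the signs of the standardized residuals show which cells drive the association.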

  3. Research

    African Journals Online (AJOL)

    abp

    2017-06-28

    Jun 28, 2017 ... 3 "probable" neonatal sepsis groups. Chi-square was used to compare proportions of neonates with elevated PCT and CRP among the groups. Blood culture was used as the gold standard for receiver operating characteristic (ROC) analysis of PCT and CRP. Areas under the ROC curve (AUC) for PCT and CRP were ...
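
    The AUC used in this kind of analysis can be computed directly as the probability that a marker value from a culture-positive case exceeds one from a culture-negative case; this equals the Mann-Whitney U statistic scaled to [0, 1]. A sketch with hypothetical marker values (not data from the study above):

    ```python
    def roc_auc(positives, negatives):
        """AUC as the fraction of (positive, negative) score pairs in which
        the positive case scores higher; ties count one half. Equivalent to
        the Mann-Whitney U statistic divided by len(positives)*len(negatives)."""
        wins = 0.0
        for p in positives:
            for q in negatives:
                if p > q:
                    wins += 1.0
                elif p == q:
                    wins += 0.5
        return wins / (len(positives) * len(negatives))

    # hypothetical PCT values for culture-positive vs culture-negative neonates
    auc = roc_auc([2.1, 5.4, 0.9, 3.3], [0.2, 0.8, 1.0, 0.4])  # 15/16 = 0.9375
    ```

    An AUC of 0.5 means the marker does not discriminate at all; 1.0 means the two groups are perfectly separated.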

  4. Study of obesity associated proopiomelanocortin gene polymorphism

    African Journals Online (AJOL)

    Farida El-Baz Mohamed

    2016-03-10

    Mar 10, 2016 ... and eating habits in a sample of obese Egyptian children and adolescents. ... problems, sleep disorders) and family history of obesity, hypertension, diabetes mellitus, liver disease or ... mean, standard deviation and Student's t-test [unpaired], chi-square, Mann–Whitney test by ...

  5. Estimating structural equation models with non-normal variables by using transformations

    NARCIS (Netherlands)

    Montfort, van K.; Mooijaart, A.; Meijerink, F.

    2009-01-01

    We discuss structural equation models for non-normal variables. In this situation the maximum likelihood and the generalized least-squares estimates of the model parameters can give incorrect estimates of the standard errors and the associated goodness-of-fit chi-squared statistics. If the sample

  6. Potential Effects of the Adoption and Implementation of International ...

    African Journals Online (AJOL)

    It presents the results from a questionnaire survey of a sample of accounting lecturers, auditors and accountants. The data were analyzed using the chi-square test. The study found that International Financial Reporting Standards have the potential for yielding greater benefits than current GAAP, improve business performance ...

  7. Epidemiological and clinical description of Lassa fever in Jos, Nigeria

    African Journals Online (AJOL)

    The main laboratory variable of interest was the LF polymerase chain reaction (PCR) test. Categorical variables were compared using the chi-square test or Fisher's exact test. Continuous variables were expressed as mean ± standard deviation or as median with range. Means were compared by Student's t-test or the Mann-Whitney U test.
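
    Fisher's exact test, used here when expected counts are too small for the chi-square approximation, sums hypergeometric probabilities over tables at least as extreme as the one observed. A one-sided sketch for a 2x2 table with made-up counts:

    ```python
    from math import comb

    def fisher_exact_one_sided(a, b, c, d):
        """One-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
        the hypergeometric probability, with all margins fixed, of observing
        cell a this large or larger."""
        row1, col1, n = a + b, a + c, a + b + c + d
        p = 0.0
        for x in range(a, min(row1, col1) + 1):
            # math.comb returns 0 when the second argument exceeds the first,
            # so impossible tables contribute nothing
            p += comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
        return p

    # made-up counts: exposure vs outcome among a handful of patients
    p = fisher_exact_one_sided(8, 2, 1, 5)  # 196/8008, about 0.0245
    ```

    Summing over the whole support of cell a returns 1, which is a convenient sanity check on the implementation.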

  8. Morbidty pattern amongst the adult population in three levels of ...

    African Journals Online (AJOL)

    The data collected were analyzed using SPSS statistical software (version 16) and the values expressed as mean ± standard deviation. Tests of statistical significance were also performed using the chi-square and goodness-of-fit tools. Results showed that Malaria was the leading cause of morbidity followed by Hypertension ...

  9. Appraisal of the inherent socio-demographic dynamics of HIV/AIDS ...

    African Journals Online (AJOL)

    Data were collected with standard closed-ended semi-structured questionnaires self-administered to 605 consenting HIV/AIDS patients, selected using the multistage random sampling technique; logistic linear regression, randomized block design and Pearson's chi-square test (α=0.01) were used to analyse the data ...

  10. Distributed Wind Competitiveness Improvement Project

    Energy Technology Data Exchange (ETDEWEB)

    2016-05-01

    The Competitiveness Improvement Project (CIP) is a periodic solicitation through the U.S. Department of Energy and its National Renewable Energy Laboratory. Manufacturers of small and medium wind turbines are awarded cost-shared grants via a competitive process to optimize their designs, develop advanced manufacturing processes, and perform turbine testing. The goals of the CIP are to make wind energy cost competitive with other distributed generation technology and increase the number of wind turbine designs certified to national testing standards. This fact sheet describes the CIP and funding awarded as part of the project.

  11. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  12. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  13. Association of MTHFR polymorphisms with nsCL/P in Chinese ...

    African Journals Online (AJOL)

    The distribution of MTHFR genotypes and frequency of alleles were compared between patients and controls by chi-square test. The odds ratios (OR) and corresponding 95% confidence intervals (95% CIs) were calculated to estimate the strength of association of MTHFR gene (C677T and A1298C). Results: For rs1801131 ...

  14. Covariance Structure Model Fit Testing under Missing Data: An Application of the Supplemented EM Algorithm

    Science.gov (United States)

    Cai, Li; Lee, Taehun

    2009-01-01

    We apply the Supplemented EM algorithm (Meng & Rubin, 1991) to address a chronic problem with the "two-stage" fitting of covariance structure models in the presence of ignorable missing data: the lack of an asymptotically chi-square distributed goodness-of-fit statistic. We show that the Supplemented EM algorithm provides a…

  15. Abstract Résumé Background

    African Journals Online (AJOL)

    2010-12-04

    Dec 4, 2010 ... in the transport, communication and distribution sector; 31% of the companies in the manufacturing and construction sector; and 21% of the companies in the trading sector. The chi-square value for state of HIV and AIDS workplace policy by company sector was significant [χ² (df = 3, N = 152) = 12.021, ...

  16. The Effects of Selection Strategies for Bivariate Loglinear Smoothing Models on NEAT Equating Functions

    Science.gov (United States)

    Moses, Tim; Holland, Paul W.

    2010-01-01

    In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…

  17. Impact of Students' Financial Strength on their Academic Performance

    African Journals Online (AJOL)

    Toshiba

    designed questionnaires distributed by stratified random sampling with a proportional allocation scheme, to find out the impact of finance on students' academic performance with reference to Kaduna Polytechnic. The statistical tests we applied are chi-square, the phi coefficient and biserial correlation. It was found that ...

  18. A Note on the Power Provided by Sibships of Sizes 2, 3, and 4 in Genetic Covariance Modeling of a Codominant QTL.

    NARCIS (Netherlands)

    Dolan, C.V.; Boomsma, D.I.; Neale, M.C.

    1999-01-01

    The contribution of size 3 and size 4 sibships to power in covariance structure modeling of a codominant QTL is investigated. Power calculations are based on the noncentral chi-square distribution. Sixteen sets of parameter values are considered. Results indicate that size 3 and size 4 sibships
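    Power calculations of this kind reduce to evaluating a noncentral chi-square tail probability above the central chi-square critical value: the test statistic follows a central chi-square under the null and a noncentral chi-square under the alternative. A minimal sketch (SciPy assumed; the degrees of freedom and noncentrality parameter are illustrative, not taken from the paper):

```python
from scipy import stats

alpha, df, ncp = 0.05, 1, 7.85   # illustrative values, not from the paper

# Critical value under the null hypothesis (central chi-square) ...
crit = stats.chi2.ppf(1 - alpha, df)
# ... and power under the alternative (noncentral chi-square tail above it).
power = stats.ncx2.sf(crit, df, ncp)  # roughly 0.80 for these values
```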

  19. Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.

    Science.gov (United States)

    Yin, Guosheng; Ma, Yanyuan

    2013-01-01

    The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adjusts exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructing the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
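    The procedure described above — refit the model on a bootstrap resample, then form the Pearson bin counts on the original data using that bootstrap-sample MLE — can be sketched for a simple one-parameter model. Everything below (the exponential model, the equiprobable bins, the toy data) is an illustrative assumption, not the paper's setup:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=200)  # toy sample
n_bins = 10

def pearson_stat(sample, mle_scale, n_bins):
    # Equiprobable bin edges under the fitted exponential model
    qs = np.linspace(0, 1, n_bins + 1)
    edges = stats.expon.ppf(qs, scale=mle_scale)  # first edge 0, last edge inf
    observed, _ = np.histogram(sample, bins=edges)
    expected = len(sample) / n_bins
    return np.sum((observed - expected) ** 2 / expected)

# Bootstrap-sample MLE: fit the same model to a resample ...
boot = rng.choice(data, size=len(data), replace=True)
boot_mle = boot.mean()  # MLE of the exponential scale parameter
# ... then construct the bin counts from the ORIGINAL data.
x2 = pearson_stat(data, boot_mle, n_bins)
p_value = stats.chi2.sf(x2, df=n_bins - 1)  # chi-squared reference distribution
```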

  20. porphyrin with single strand DNAs

    Indian Academy of Sciences (India)

    for organization of porphyrin molecules into extended assemblies, providing opportunities for construction of supramolecular structures.6–8 Among the porphyrin .... and consequently the mono- and bi-exponential nature of the decays were judged by the reduced chi-square (χ²) values and distribution of the weighted ...

  1. A descriptive study of the morbidity pattern of older persons ...

    African Journals Online (AJOL)

    Data were analyzed using Stata version 13 (Texas, USA). Frequency distributions were used for descriptive analysis, and chi-square test was used to test associations. Results: More than a half, 2919 (59.7%), of the respondents were females and almost three quarters 3501 (71.7%) were aged between 60 and 74 years.

  2. Assessing model fit in latent class analysis when asymptotics do not hold

    NARCIS (Netherlands)

    van Kollenburg, Geert H.; Mulder, Joris; Vermunt, Jeroen K.

    2015-01-01

    The application of latent class (LC) analysis involves evaluating the LC model using goodness-of-fit statistics. To assess the misfit of a specified model, say with the Pearson chi-squared statistic, a p-value can be obtained using an asymptotic reference distribution. However, asymptotic p-values

  3. Physical Activity Patterns among U.S. Adults with Disabilities

    Science.gov (United States)

    Chiu, Chung-Yi; An, Ruopeng

    2016-01-01

    Purpose: To characterize physical activity patterns among people with disabilities using data from a nationally representative health survey. Method: Individual-level data came from the Behavioral Risk Factor Surveillance System 2011 survey. Pearson's chi-squared tests were conducted to assess the difference in the proportion distribution of…

  4. Where are the gaps in improving maternal and child health in ...

    African Journals Online (AJOL)

    Descriptive analyses were conducted using a Pearson's Chi-Squared test, assuming a binomial distribution and a confidence level of alpha=0.05. Results: Our results indicated that there were marked regional differences in maternal and child health indicators between these two rural sites, with M'bagne generally ...

  5. Counseling as a Stochastic Process: Fitting a Markov Chain Model to Initial Counseling Interviews

    Science.gov (United States)

    Lichtenberg, James W.; Hummel, Thomas J.

    1976-01-01

    The goodness of fit of a first-order Markov chain model to six counseling interviews was assessed by using chi-square tests of homogeneity and simulating sampling distributions of selected process characteristics against which the same characteristics in the actual interviews were compared. The model fit four of the interviews. Presented at AERA,…

  6. Internet Journal of Medical Update

    African Journals Online (AJOL)

    admin

    Statistical tests used were Student's t-test, the chi-squared test, and the Pearson correlation coefficient. .... aggressiveness, pessimism, lack of concentration, depression, and confusion. The distribution of psychological complaints in males and females is shown in Table 2. Association of complications with age, gender.

  7. Different responses of soybean cyst nematode resistance between ...

    Indian Academy of Sciences (India)

    2016-12-02

    Dec 2, 2016 ... Chi-square test of frequency distribution of families' female index (FI) showed that resistance to SCN was significantly different between NJ(RN)P7 and ... Jinan, Shandong 250100, People's Republic of China; Sericultural Research Institute, Chinese Academy of Agricultural Sciences, Zhenjiang, Jiangsu ...

  8. Socio-Economic analysis and fishing activites of lagoon and marine ...

    African Journals Online (AJOL)

    Fishermen have distinctive social and economic features which affect their fishing operations. The study was therefore designed to identify the socio-economic characteristics of fisher-folks living in lagoon and coastal communities and to assess their fishing activities. Frequency distribution, percentages, chi-square and ...

  9. Gender and area of specialization vis-à-vis students' enrolments in ...

    African Journals Online (AJOL)

    owner

    quantitative pathway using percentages and Chi Square (Gay, 1992) to portray the status and tendency in students' enrolment per area of specialization. Findings and discussion. A typical student distribution across platforms. In order to show the trends in proportions of students in platforms and degree programmes the ...

  10. road sector development and economic growth in ethiopia1

    African Journals Online (AJOL)

    Eyerusalem

    Ibrahim Worku: Road sector development and economic growth in Ethiopia. 134 identified, i.e., the excluded instruments are "relevant", meaning correlated with the endogenous regressors. Under the null that the equation is underidentified, the statistic is distributed as chi-squared with degrees of freedom (L1-K1+1).

  11. The applicability of Herzberg's two-factor theory on the junior non ...

    African Journals Online (AJOL)

    The data was analysed using frequency and percentage distribution as well as chi-square statistics. The study established that the junior non-academic staff at Mak. is not highly motivated despite the presence of high levels of Herzberg's satisfiers such as promotion and recognition in the University. It also established that ...

  12. Psychological Type and Preferences in the Academic Environment

    Science.gov (United States)

    1994-09-01

    [Front-matter fragments: List of Figures — 1. Preference Strengths; 2. Preference Strength Variations; 3. Chi-Square Analysis for Preferred Choices Within Distributions; 4. Chi... — and a table of the sixteen Myers-Briggs Type Indicator combinations (Sensing/Intuiting with Thinking/Feeling; Introversion, Judging: ISTJ, ISFJ, INFJ ...)]

  13. Statistical power of likelihood ratio and Wald tests in latent class models with covariates

    NARCIS (Netherlands)

    Gudicha, D.W.; Schmittmann, V.D.; Vermunt, J.K.

    2017-01-01

    This paper discusses power and sample-size computation for likelihood ratio and Wald testing of the significance of covariate effects in latent class models. For both tests, asymptotic distributions can be used; that is, the test statistic can be assumed to follow a central chi-square under the null
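    Because the noncentrality parameter of the alternative (noncentral chi-square) distribution scales linearly with sample size, the sample-size computation amounts to finding the smallest n whose noncentral chi-square tail probability reaches the target power. A sketch under that assumption (the per-observation noncentrality value is hypothetical):

```python
from scipy import stats

def required_n(ncp_per_obs, df, alpha=0.05, target_power=0.80):
    """Smallest n for which the chi-square test reaches the target power,
    assuming the noncentrality parameter grows as n * ncp_per_obs."""
    crit = stats.chi2.ppf(1 - alpha, df)  # null: central chi-square
    n = 1
    while stats.ncx2.sf(crit, df, n * ncp_per_obs) < target_power:
        n += 1
    return n

n = required_n(ncp_per_obs=0.05, df=1)  # hypothetical effect size
```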

  14. Prevalence and correlates of hunger among primary and secondary ...

    African Journals Online (AJOL)

    conducted in 2009 to estimate the prevalence of self-reported hunger within the last 30 days among the primary and secondary school age group. It also assessed the association between self-reported hunger and a selected list of independent variables using frequency distribution, chi-squared test and logistic regression.

  15. Different responses of soybean cyst nematode resistance between ...

    Indian Academy of Sciences (India)

    2016-12-02

    Dec 2, 2016 ... SCN race 1 (HG types 2.5.7), while 7605 is highly susceptible. Chi-square test of frequency distribution of families' female index (FI) showed that resistance to SCN was significantly different between NJ(RN)P7 and JN(RN)P7 populations. Three recessive genes conditioned the inheritance of resistance to ...

  16. Dexmedetomidine an adjuvant to levobupivacaine in ...

    African Journals Online (AJOL)

    Duration of motor and sensory block and time to first rescue analgesia were recorded. Data analysis was done by SPSS version 16.0 [SPSS Inc, Illinois, USA, 2008]. Categorical variables were analyzed using Pearson's chi-square test. Normally distributed numerical variables were analyzed using the unpaired t-test.

  17. Kin networks and migration in Sagbama Local Government Area of ...

    African Journals Online (AJOL)

    The non-probability (snowball technique) sampling technique was used in selecting 13 migrants' streams in Sagbama Town through the questionnaire instrument. Analysis of the data collected for the study was based on frequency distribution table and simple percentages; chi square and multiple linear regression at 0.05 ...

  18. Incorporating Skew into RMS Surface Roughness Probability Distribution

    Science.gov (United States)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.

  19. Telemetry Standards, IRIG Standard 106-17. Chapter 10. Digital Recording Standard

    Science.gov (United States)

    2017-07-01

    [Fragments from the standard's front matter and chapter heading: formatting characters representing the numeric date on which the file was created (e.g., BCS codes for the decimal digits 02092000...) ... Telemetry Standards, IRIG Standard 106-17, Chapter 10, July 2017: Digital Recording Standard; Acronyms ...]

  20. 76 FR 81295 - Cost Accounting Standards: Cost Accounting Standards 412 and 413-Cost Accounting Standards...

    Science.gov (United States)

    2011-12-27

    ... Office of Management and Budget, Office of Federal Procurement Policy, 48 CFR Part 9904, Cost Accounting Standards: Cost Accounting Standards 412 and 413 -- Cost Accounting Standards Pension Harmonization Rule; Final Rule. Federal Register / Vol. 76 ...