WorldWideScience

Sample records for minimum random testing

  1. 77 FR 75896 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2013

    Science.gov (United States)

    2012-12-26

    ...-11213, Notice No. 16] Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2013...., Washington, DC 20590, (telephone 202-493-1342); or Kathy Schnakenberg, FRA Alcohol/Drug Program Specialist... from FRA's Management Information System, the rail industry's random drug testing positive rate has...

  2. 75 FR 79308 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2011

    Science.gov (United States)

    2010-12-20

    ...-11213, Notice No. 14] Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2011... random testing positive rates were 0.037 percent for drugs and 0.014 percent for alcohol. Because the... effective December 20, 2010. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  3. 75 FR 1547 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2010

    Science.gov (United States)

    2010-01-12

    ...-11213, Notice No. 13] RIN 2130-AA81 Alcohol and Drug Testing: Determination of Minimum Random Testing... percent for alcohol. Because the industry-wide random drug testing positive rate has remained below 1.0... effective upon publication. FOR FURTHER INFORMATION CONTACT: Lamar Allen, Alcohol and Drug Program Manager...

  4. 76 FR 80781 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2012

    Science.gov (United States)

    2011-12-27

    ...-11213, Notice No. 15] RIN 2130-AA81 Alcohol and Drug Testing: Determination of Minimum Random Testing...: Lamar Allen, Alcohol and Drug Program Manager, Office of Safety Enforcement, Mail Stop 25, Federal... Kathy Schnakenberg, FRA Alcohol/Drug Program Specialist, (telephone (719) 633-8955). Issued in...

  5. 78 FR 78275 - Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2014

    Science.gov (United States)

    2013-12-26

    ...-11213, Notice No. 17] Alcohol and Drug Testing: Determination of Minimum Random Testing Rates for 2014... December 26, 2013. FOR FURTHER INFORMATION CONTACT: Jerry Powers, FRA Drug and Alcohol Program Manager, W38...-493-6313); or Sam Noe, FRA Drug and Alcohol Program Specialist, (telephone 615-719-2951). Issued in...

  6. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Contents: Statistical Tests That Do Not Require Random Sampling; Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References; Randomized Experiments; Unique Benefits of Experiments; Experimentation without Mani

  7. MINIMUM ENTROPY DECONVOLUTION OF ONE-AND MULTI-DIMENSIONAL NON-GAUSSIAN LINEAR RANDOM PROCESSES

    Institute of Scientific and Technical Information of China (English)

    程乾生

    1990-01-01

    The minimum entropy deconvolution is considered as one of the methods for decomposing non-Gaussian linear processes. The concept of peakedness of a system response sequence is presented and its properties are studied. With the aid of the peakedness, the convergence theory of the minimum entropy deconvolution is established. The problem of the minimum entropy deconvolution of multi-dimensional non-Gaussian linear random processes is first investigated and the corresponding theory is given. In addition, the relation between the minimum entropy deconvolution and parameter method is discussed.
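    The "peakedness" objective at the heart of minimum entropy deconvolution can be sketched briefly. The normalised fourth-moment (varimax-style) measure below is a common choice in the MED literature and is used here only as an illustration, not necessarily the exact quantity the paper defines.

```python
def peakedness(x):
    """Varimax-style peakedness: n * sum(x^4) / (sum(x^2))^2.
    Spiky (impulse-like) sequences score high; flat ones score low.
    MED chooses a deconvolution filter that maximises this value."""
    s2 = sum(v * v for v in x)
    return len(x) * sum(v ** 4 for v in x) / (s2 * s2)

# A single impulse attains the maximum, n; a constant sequence the minimum, 1.
```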

  8. Minimum spanning trees and random resistor networks in d dimensions.

    Science.gov (United States)

    Read, N

    2005-09-01

    We consider minimum-cost spanning trees, both in lattice and Euclidean models, in d dimensions. For the cost of the optimum tree in a box of size L, we show that there is a correction of order L^theta, where theta <= 1. The arguments all rely on the close relation of Kruskal's greedy algorithm for the minimum spanning tree, percolation, and (for some arguments) random resistor networks. The scaling of the entropy and free energy at small nonzero T, and hence of the number of near-optimal solutions, is also discussed. We suggest that the Steiner tree problem is in the same universality class as the minimum spanning tree in all dimensions, as is the traveling salesman problem in two dimensions. Hence all will have the same value of theta = -3/4 in two dimensions.
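    Kruskal's greedy algorithm, on which the arguments above rely, is short enough to sketch. The graph and edge costs below are invented for illustration.

```python
def kruskal(n, edges):
    """Minimum spanning tree by Kruskal's greedy algorithm:
    sort edges by cost, keep each edge unless it closes a cycle
    (cycles detected with a union-find structure).
    n nodes labelled 0..n-1; edges is a list of (cost, u, v)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    tree, total = [], 0
    for cost, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two components
            parent[ru] = rv
            tree.append((u, v))
            total += cost
    return tree, total

# Small example: a 4-cycle with one extra diagonal.
edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0), (5, 0, 2)]
tree, total = kruskal(4, edges)
```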

  9. 78 FR 77196 - Random Drug and Alcohol Testing Percentage Rates of Covered Aviation Employees for the Period of...

    Science.gov (United States)

    2013-12-20

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Random Drug and Alcohol Testing... the minimum random drug and alcohol testing percentage rates for the period January 1, 2014, through... Federal Regulations Title 14, section 120.109(b) (for drug testing), and 120.217(c) (for alcohol testing...

  10. 75 FR 76069 - Random Drug and Alcohol Testing Percentage Rates of Covered Aviation Employees for the Period of...

    Science.gov (United States)

    2010-12-07

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Random Drug and Alcohol Testing... minimum random drug and alcohol testing percentage rates for the period January 1, 2011, through December... Regulations Title 14, section 120.109(b) (for drug testing), and 120.217(c) (for alcohol testing). Issued in...

  11. 77 FR 71669 - Random Drug and Alcohol Testing Percentage Rates of Covered Aviation Employees for the Period of...

    Science.gov (United States)

    2012-12-03

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Random Drug and Alcohol Testing... the minimum random drug and alcohol testing percentage rates for the period January 1, 2013, through... Regulations Title 14, Sec. Sec. 120.109(b) (for drug testing), and 120.217(c) (for alcohol testing). Issued in...

  12. 76 FR 74843 - Random Drug and Alcohol Testing Percentage Rates of Covered Aviation Employees for the Period of...

    Science.gov (United States)

    2011-12-01

    ... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Random Drug and Alcohol Testing... the minimum random drug and alcohol testing percentage rates for the period January 1, 2012, through... Regulations Title 14, Sec. 120.109(b) (for drug testing), and 120.217(c) (for alcohol testing). Issued in...

  13. 25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?

    Science.gov (United States)

    2010-04-01

    ... random number generation? 547.14 Section 547.14 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF... CLASS II GAMES § 547.14 What are the minimum technical standards for electronic random number generation...) Unpredictability; and (3) Non-repeatability. (b) Statistical Randomness. (1) Numbers produced by an RNG shall be...
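    The "statistical randomness" requirement can be probed with standard goodness-of-fit checks. Below is a minimal sketch of a chi-square frequency (equidistribution) test; it is an illustration of the idea, not the certification procedure the regulation prescribes.

```python
import random

def chi_square_frequency(digits, bins=10):
    """Chi-square equidistribution statistic for digits in 0..bins-1.
    Under a uniform generator it follows a chi-square law with
    bins - 1 degrees of freedom."""
    expected = len(digits) / bins
    counts = [0] * bins
    for d in digits:
        counts[d] += 1
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(12345)
stat = chi_square_frequency([rng.randrange(10) for _ in range(10000)])
# Compare against chi-square(9): values persistently above ~21.7
# (the 99th percentile) would indicate a frequency-test failure.
```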

  14. 42 CFR 84.145 - Motor-operated blower test; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Motor-operated blower test; minimum requirements... Supplied-Air Respirators § 84.145 Motor-operated blower test; minimum requirements. (a) Motor-operated... connection between the motor and the blower shall be so constructed that the motor may be disengaged from the...

  15. A new reliability measure based on specified minimum distances before the locations of random variables in a finite interval

    International Nuclear Information System (INIS)

    Todinov, M.T.

    2004-01-01

    A new reliability measure is proposed and equations are derived which determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations. In the important special case where the random variables are failure times, models have been proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model has also been proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the models proposed, a new strategy, models and reliability tools have been developed for setting quantitative reliability requirements which consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) which deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level and a minimum required availability. It is demonstrated that setting reliability requirements solely based on an availability target does not necessarily mean a low risk of premature failure. Even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should involve a hazard rate envelope limiting the risk of failure below a maximum acceptable level
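    For a homogeneous Poisson process conditioned on n events in [0, L], the event locations are iid uniform, and the probability that every successive gap is at least s has the classic closed form (1 - (n-1)s/L)^n. The Monte Carlo sketch below checks that special case; it is an illustration, not Todinov's more general derivation.

```python
import random

def p_no_cluster_mc(n, L, s, trials=20000, seed=7):
    """Monte Carlo probability that n uniform points on [0, L]
    keep every neighbour-to-neighbour gap at least s
    (i.e. no clustering closer than s)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        pts = sorted(rng.uniform(0, L) for _ in range(n))
        if all(b - a >= s for a, b in zip(pts, pts[1:])):
            ok += 1
    return ok / trials

n, L, s = 5, 10.0, 0.5
est = p_no_cluster_mc(n, L, s)
exact = (1 - (n - 1) * s / L) ** n   # classic spacings formula
```

Even with these moderate numbers the clustering probability 1 - exact is about 0.67, illustrating the paper's point that clustering is substantial at moderate number densities.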

  16. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    Science.gov (United States)

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
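    A minimal sketch of the minimum p-value idea: compute several statistics, build their joint permutation distribution, and calibrate the observed min-p against the permutation distribution of min-p itself. The two statistics used here (difference of means and of medians) are illustrative choices, not those of the paper, and the quadratic-time ranking is written for clarity rather than speed.

```python
import random
import statistics

def min_p_test(x, y, n_perm=400, seed=1):
    """Permutation test combining several statistics through the
    minimum p-value, calibrated against the permutation
    distribution of the min-p so the type I error is controlled."""
    stats = [
        lambda a, b: abs(statistics.fmean(a) - statistics.fmean(b)),
        lambda a, b: abs(statistics.median(a) - statistics.median(b)),
    ]
    rng = random.Random(seed)
    pooled, m = list(x) + list(y), len(x)
    # Row 0 is the observed labelling; rows 1.. are random relabellings.
    draws = [[f(x, y) for f in stats]]
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:m], pooled[m:]
        draws.append([f(a, b) for f in stats])
    n = len(draws)
    # p-value of every draw under each statistic (larger = more extreme),
    # then take the minimum over statistics, per draw.
    min_p = [1.0] * n
    for j in range(len(stats)):
        col = [d[j] for d in draws]
        for i in range(n):
            p = sum(v >= col[i] for v in col) / n
            min_p[i] = min(min_p[i], p)
    # Overall p: how extreme the observed min-p is among all draws.
    return sum(mp <= min_p[0] for mp in min_p) / n

x = list(range(10))        # e.g. control responses
y = [v + 5 for v in x]     # treated responses, shifted upward
p = min_p_test(x, y)       # a clearly shifted sample gives a small p
```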

  17. Minimum intervention dentistry approach to managing early childhood caries: a randomized control trial.

    Science.gov (United States)

    Arrow, Peter; Klobas, Elizabeth

    2015-12-01

    A pragmatic randomized control trial was undertaken to compare the minimum intervention dentistry (MID) approach, based on the atraumatic restorative treatment procedures (MID-ART: Test), against the standard care approach (Control) to treat early childhood caries in a primary care setting. Consenting parent/child dyads were allocated to the Test or Control group using stratified block randomization. Inclusion and exclusion criteria were applied. Participants were examined at baseline and at follow-up by two calibrated examiners blind to group allocation status (κ = 0.77), and parents completed a questionnaire at baseline and follow-up. Dental therapists trained in MID-ART provided treatment to the Test group and dentists treated the Control group using standard approaches. The primary outcome of interest was the number of children who were referred for specialist pediatric care. Secondary outcomes were the number of teeth treated, changes in child oral health-related quality of life and dental anxiety and parental perceptions of care received. Data were analyzed on an intention to treat basis; risk ratio for referral for specialist care, test of proportions, Wilcoxon rank test and logistic regression were used. Three hundred and seventy parents/carers were initially screened; 273 children were examined at baseline and 254 were randomized (Test = 127; Control = 127): mean age = 3.8 years, SD 0.90; 59% male, mean dmft = 4.9, SD 4.0. There was no statistically significant difference in age, sex, baseline caries experience or child oral health-related quality of life between the Test and Control group. At follow-up (mean interval 11.4 months, SD 3.1 months), 220 children were examined: Test = 115, Control = 105. Case-notes review of 231 children showed Test = 6 (5%) and Control = 53 (49%) were referred for specialist care, P < 0.0001. More teeth were filled in the Test group (mean = 2.93, SD 2.48) than in the Control group (mean = 1.54, SD

  18. A 34 ampere-hour nickel-cadmium minimum trickle charge testing

    Science.gov (United States)

    Timmerman, P. J.

    1985-01-01

    The current rates used for trickle charging batteries are critical in maintaining a full charge and in preventing an overcharge condition. The importance of the trickle charge rate comes from the design, maintenance and operational requirements of an electrical power system. The results of minimum trickle charge testing performed on six 34 ampere-hour, nickel-cadmium cells manufactured by General Electric are described. The purpose of the testing was to identify the minimum trickle charge rates at temperatures of 15 C and 30 C.

  19. Minimum scale controlled topology optimization and experimental test of a micro thermal actuator

    DEFF Research Database (Denmark)

    Heo, S.; Yoon, Gil Ho; Kim, Y.Y.

    2008-01-01

    This paper is concerned with the optimal topology design, fabrication and test of a micro thermal actuator. Because the minimum scale was controlled during the design optimization process, the production yield rate of the actuator was improved considerably; alternatively, the optimization design without scale control resulted in a very low yield rate. Using the minimum scale controlling topology design method developed earlier by the authors, micro thermal actuators were designed and fabricated through a MEMS process. Moreover, both their performance and production yield were experimentally tested. The test showed that control over the minimum length scale in the design process greatly improves the yield rate and reduces the performance deviation.

  20. Heart rate-based lactate minimum test: a reproducible method.

    NARCIS (Netherlands)

    Strupler, M.; Muller, G.; Perret, C.

    2009-01-01

    OBJECTIVE: To find the individual intensity for aerobic endurance training, the lactate minimum test (LMT) seems to be a promising method. LMTs described in the literature consist of speed or work rate-based protocols, but for training prescription in daily practice mostly heart rate is used. The

  1. QUASI-RANDOM TESTING OF COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    S. V. Yarmolik

    2013-01-01

    Various modified random testing approaches have been proposed for computer system testing in the black box environment. Their effectiveness has been evaluated on the typical failure patterns by employing three measures, namely, P-measure, E-measure and F-measure. A quasi-random testing, being a modified version of the random testing, has been proposed and analyzed. The quasi-random Sobol sequences and modified Sobol sequences are used as the test patterns. Some new methods for Sobol sequence generation have been proposed and analyzed.

  2. The minimum sit-to-stand height test: reliability, responsiveness and relationship to leg muscle strength.

    Science.gov (United States)

    Schurr, Karl; Sherrington, Catherine; Wallbank, Geraldine; Pamphlett, Patricia; Olivetti, Lynette

    2012-07-01

    To determine the reliability of the minimum sit-to-stand height test, its responsiveness and its relationship to leg muscle strength among rehabilitation unit inpatients and outpatients. Reliability study using two measurers and two test occasions. Secondary analysis of data from two clinical trials. Inpatient and outpatient rehabilitation services in three public hospitals. Eighteen hospital patients and five others participated in the reliability study. Seventy-two rehabilitation unit inpatients and 80 outpatients participated in the clinical trials. The minimum sit-to-stand height test was assessed using a standard procedure. For the reliability study, a second tester repeated the minimum sit-to-stand height test on the same day. In the inpatient clinical trial the measures were repeated two weeks later. In the outpatient trial the measures were repeated five weeks later. Knee extensor muscle strength was assessed in the clinical trials using a hand-held dynamometer. The reliability for the minimum sit-to-stand height test was excellent (intraclass correlation coefficient (ICC) 0.91, 95% confidence interval (CI) 0.81-0.96). The standard error of measurement was 34 mm. Responsiveness was moderate in the inpatient trial (effect size: 0.53) but small in the outpatient trial (effect size: 0.16). A small proportion (8-17%) of variability in minimum sit-to-stand height test was explained by knee extensor muscle strength. The minimum sit-to-stand height test has excellent reliability and moderate responsiveness in an inpatient rehabilitation setting. Responsiveness in an outpatient rehabilitation setting requires further investigation. Performance is influenced by factors other than knee extensor muscle strength.
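    The standard error of measurement quoted above follows from the reliability coefficient by the usual formula SEM = SD x sqrt(1 - ICC). A quick check: the 113 mm between-subject SD below is back-calculated for illustration, not a figure reported by the study.

```python
import math

def sem(sd, icc):
    """Standard error of measurement from a reliability coefficient."""
    return sd * math.sqrt(1.0 - icc)

# With ICC = 0.91, a between-subject SD near 113 mm reproduces the
# reported SEM of roughly 34 mm: 113 * sqrt(0.09) = 33.9.
```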

  3. The minimum test battery to screen for binocular vision anomalies: report 3 of the BAND study.

    Science.gov (United States)

    Hussaindeen, Jameel Rizwana; Rakshit, Archayeeta; Singh, Neeraj Kumar; Swaminathan, Meenakshi; George, Ronnie; Kapur, Suman; Scheiman, Mitchell; Ramani, Krishna Kumar

    2018-03-01

    This study aims to report the minimum test battery needed to screen non-strabismic binocular vision anomalies (NSBVAs) in a community set-up. When large numbers are to be screened we aim to identify the most useful test battery when there is no opportunity for a more comprehensive and time-consuming clinical examination. The prevalence estimates and normative data for binocular vision parameters were estimated from the Binocular Vision Anomalies and Normative Data (BAND) study, following which cut-off estimates and receiver operating characteristic curves to identify the minimum test battery have been plotted. In the receiver operating characteristic phase of the study, children between nine and 17 years of age were screened in two schools in the rural arm using the minimum test battery, and the prevalence estimates with the minimum test battery were found. Receiver operating characteristic analyses revealed that near point of convergence with penlight and red filter (> 7.5 cm), monocular accommodative facility ( 1.25 prism dioptres) were significant factors with cut-off values for best sensitivity and specificity. This minimum test battery was applied to a cohort of 305 children. The mean (standard deviation) age of the subjects was 12.7 (two) years with 121 males and 184 females. Using the minimum battery of tests obtained through the receiver operating characteristic analyses, the prevalence of NSBVAs was found to be 26 per cent. Near point of convergence with penlight and red filter > 10 cm was found to have the highest sensitivity (80 per cent) and specificity (73 per cent) for the diagnosis of convergence insufficiency. For the diagnosis of accommodative infacility, monocular accommodative facility with a cut-off of less than seven cycles per minute was the best predictor for screening (92 per cent sensitivity and 90 per cent specificity). The minimum test battery of near point of convergence with penlight and red filter, difference between distance and near

  4. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    Science.gov (United States)

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potentials to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently attribute newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on design by changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or cost smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
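    The minimum-variance allocation this design targets is, in its simplest fixed (non-adaptive) form, Neyman allocation: randomize to each arm in proportion to its outcome standard deviation. A hedged sketch of that static optimum follows; the paper's adaptive version re-estimates the rates as patients accrue.

```python
def neyman_allocation(sd1, sd2):
    """Fixed randomization rate to arm 1 that minimises the variance
    of the difference-in-means test statistic: r* = sd1/(sd1 + sd2)."""
    return sd1 / (sd1 + sd2)

def diff_var(r, sd1, sd2, n):
    """Variance of xbar1 - xbar2 when n*r patients go to arm 1
    and n*(1-r) to arm 2."""
    return sd1 ** 2 / (n * r) + sd2 ** 2 / (n * (1.0 - r))

# If arm 1 is twice as variable, send two-thirds of patients there:
r_star = neyman_allocation(2.0, 1.0)
```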

  5. 42 CFR 84.157 - Airflow resistance test; Type C supplied-air respirator, pressure-demand class; minimum...

    Science.gov (United States)

    2010-10-01

    ... test; Type C supplied-air respirator, pressure-demand class; minimum requirements. (a) The static... 42 Public Health 1 2010-10-01 2010-10-01 false Airflow resistance test; Type C supplied-air respirator, pressure-demand class; minimum requirements. 84.157 Section 84.157 Public Health PUBLIC HEALTH...

  6. Personality Changes as a Function of Minimum Competency Test Success or Failure.

    Science.gov (United States)

    Richman, Charles L.; And Others

    1987-01-01

    The psychological effects of success and failure on the North Carolina Minimum Competency Test (MCT) were examined. Subjects were high school students, who were pre- and post-tested using the Rosenberg Self Esteem Scale and the High School Personality Questionnaire. Self-esteem decreased following knowledge of MCT failure. (LMO)

  7. The Distribution of Minimum of Ratios of Two Random Variables and Its Application in Analysis of Multi-hop Systems

    Directory of Open Access Journals (Sweden)

    A. Stankovic

    2012-12-01

    The distributions of random variables are of interest in many areas of science. In this paper, motivated by the importance of multi-hop transmission in contemporary wireless communications systems operating over fading channels in the presence of cochannel interference, the probability density functions (PDFs) of the minimum of an arbitrary number of ratios of Rayleigh, Rician, Nakagami-m, Weibull and α-µ random variables are derived. These expressions can be used to study the outage probability as an important multi-hop system performance measure. Various numerical results complement the proposed mathematical analysis.
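    The outage-probability use case can be checked by Monte Carlo in the simplest setting of iid equal-power Rayleigh envelopes, where the ratio of two envelopes has CDF t^2/(1+t^2), so P(outage) = 1 - (1+t^2)^(-n) over n hops. This is a sketch of that special case only, not the paper's general Rician/Nakagami/Weibull/α-µ expressions.

```python
import math
import random

def rayleigh(rng, sigma=1.0):
    """Draw a Rayleigh-distributed envelope via inverse transform."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))

def outage_probability(n_hops, threshold, trials=20000, seed=3):
    """Monte Carlo: outage occurs when the minimum over hops of the
    desired-to-interference envelope ratio drops below threshold."""
    rng = random.Random(seed)
    out = sum(
        min(rayleigh(rng) / rayleigh(rng) for _ in range(n_hops)) < threshold
        for _ in range(trials)
    )
    return out / trials

# Closed form for this iid equal-power case: 1 - (1 + t**2)**(-n)
est = outage_probability(2, 1.0)   # two hops, unit threshold -> 0.75
```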

  8. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
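    The core computation can be sketched for the simplest case: residuals from a null-model fit, with treatment labels re-randomized by permutation (i.e. conditioning on the observed group sizes). This is illustrative only; the paper also covers permuted blocks, biased coins, and conditional tests.

```python
import random
from statistics import fmean

def residual_randomization_test(residuals, treatment, n_mc=2000, seed=11):
    """Monte Carlo randomization test: re-randomize treatment labels
    and compare the observed difference in mean residuals with its
    re-randomization distribution."""
    def stat(labels):
        g1 = [r for r, t in zip(residuals, labels) if t == 1]
        g0 = [r for r, t in zip(residuals, labels) if t == 0]
        return abs(fmean(g1) - fmean(g0))

    rng = random.Random(seed)
    obs = stat(treatment)
    labels = list(treatment)
    hits = 0
    for _ in range(n_mc):
        rng.shuffle(labels)          # keeps the group sizes fixed
        if stat(labels) >= obs:
            hits += 1
    return (hits + 1) / (n_mc + 1)   # add-one to avoid p = 0

# A clear treatment effect in the residuals yields a small p-value:
p = residual_randomization_test(
    [1.0, 2.0, 3.0, 4.0, 11.0, 12.0, 13.0, 14.0],
    [0, 0, 0, 0, 1, 1, 1, 1],
)
```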

  9. 42 CFR 84.156 - Airflow resistance test; Type C supplied-air respirator, demand class; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... C supplied-air respirator, demand class; minimum requirements. (a) Inhalation resistance shall not... 42 Public Health 1 2010-10-01 2010-10-01 false Airflow resistance test; Type C supplied-air respirator, demand class; minimum requirements. 84.156 Section 84.156 Public Health PUBLIC HEALTH SERVICE...

  10. Split-plot fractional designs: Is minimum aberration enough?

    DEFF Research Database (Denmark)

    Kulahci, Murat; Ramirez, Jose; Tobias, Randy

    2006-01-01

    Split-plot experiments are commonly used in industry for product and process improvement. Recent articles on designing split-plot experiments concentrate on minimum aberration as the design criterion. Minimum aberration has been criticized as a design criterion for completely randomized fractional factorial design and alternative criteria, such as the maximum number of clear two-factor interactions, are suggested (Wu and Hamada (2000)). The need for alternatives to minimum aberration is even more acute for split-plot designs. In a standard split-plot design, there are several types of two-factor interactions... for completely randomized designs. Consequently, we provide a modified version of the maximum number of clear two-factor interactions design criterion to be used for split-plot designs.

  11. A step-up test procedure to find the minimum effective dose.

    Science.gov (United States)

    Wang, Weizhen; Peng, Jianan

    2015-01-01

    It is of great interest to find the minimum effective dose (MED) in dose-response studies. A sequence of decreasing null hypotheses to find the MED is formulated under the assumption of nondecreasing dose response means. A step-up multiple test procedure that controls the familywise error rate (FWER) is constructed based on the maximum likelihood estimators for the monotone normal means. When the MED is equal to one, the proposed test is uniformly more powerful than Hsu and Berger's (1999) test. Also, a simulation study shows a substantial power improvement for the proposed test over four competitors. Three R-codes are provided in the Supplemental Materials for this article. Go to the publisher's online edition of the Journal of Biopharmaceutical Statistics to view the files.
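    The logic of sequential MED testing can be illustrated with a fixed-sequence z-test sketch in the spirit of Hsu and Berger (1999), the comparator mentioned above: test from the highest dose down and stop at the first non-rejection. This is an illustration only; the paper's step-up procedure uses order-restricted MLEs and differs in detail.

```python
import math
from statistics import NormalDist, fmean, stdev

def fixed_sequence_med(control, dose_groups, alpha=0.05):
    """Fixed-sequence search for the minimum effective dose:
    one-sided z-tests from the highest dose down, stopping at the
    first non-rejection. Testing a pre-specified sequence, each at
    full alpha, controls the familywise error rate."""
    z_crit = NormalDist().inv_cdf(1 - alpha)
    med = None
    for level in range(len(dose_groups) - 1, -1, -1):
        d = dose_groups[level]
        se = math.sqrt(stdev(d) ** 2 / len(d)
                       + stdev(control) ** 2 / len(control))
        z = (fmean(d) - fmean(control)) / se
        if z > z_crit:
            med = level        # dose works; try the next lower one
        else:
            break              # stop: no claim for lower doses
    return med                 # index of the MED, or None

# Hypothetical data: the low dose matches control, higher doses respond.
control = [0.0, 0.2, -0.2, 0.1, -0.1]
doses = [[0.0, 0.2, -0.2, 0.1, -0.1],    # low: no effect
         [2.0, 2.2, 1.8, 2.1, 1.9],      # mid: clear effect
         [3.0, 3.2, 2.8, 3.1, 2.9]]      # high: clear effect
```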

  12. Long-Term Capital Goods Importation and Minimum Wage Relationship in Turkey: Bounds Testing Approach

    Directory of Open Access Journals (Sweden)

    Tastan Serkan

    2015-04-01

    In order to examine the long-term relationship between capital goods importation and the minimum wage, the autoregressive distributed lag (ARDL) bounds testing approach to cointegration is used in the study. According to the bounds test results, a cointegration relation exists between capital goods importation and the minimum wage. Therefore an ARDL(4,0) model is estimated in order to determine the long- and short-term relations between the variables. According to the empirical analysis, there is a positive and significant relationship between capital goods importation and the minimum wage in Turkey in the long term. A 1% increase in the minimum wage leads to a 0.8% increase in capital goods importation in the long term. The result is similar for the short-term coefficients. The relationship observed in the long term is preserved in the short term, though at a lower level. In terms of the error correction model, it can be concluded that the error correction mechanism works, as the error correction term is negative and significant. Short-term deviations are resolved through the error correction mechanism in the long term. Accordingly, approximately 75% of any deviation from equilibrium arising in the previous six-month period is resolved in the current six-month period. This means that the return to long-term equilibrium progresses rapidly.
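    The 75% adjustment speed can be checked with one line of arithmetic: with an error-correction coefficient of -0.75, a deviation from equilibrium shrinks to 25% of its size each (six-month) period.

```python
def remaining_deviation(initial, ect=-0.75, periods=1):
    """Deviation from long-run equilibrium left after a number of
    periods, given the error-correction coefficient `ect`."""
    dev = initial
    for _ in range(periods):
        dev += ect * dev       # dev_t = (1 + ect) * dev_{t-1}
    return dev

# A unit shock: 25% remains after one six-month period,
# 0.25**3 = 1.5625% after three.
```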

  13. Testing random number generators for Monte Carlo applications

    International Nuclear Information System (INIS)

    Sim, L.H.

    1992-01-01

    Central to any system for modelling radiation transport phenomena using Monte Carlo techniques is the method by which pseudo random numbers are generated. This method is commonly referred to as the Random Number Generator (RNG). It is usually a computer implemented mathematical algorithm which produces a series of numbers uniformly distributed on the interval [0,1]. If this series satisfies certain statistical tests for randomness, then for practical purposes the pseudo random numbers in the series can be considered to be random. Tests of this nature are important not only for new RNGs but also to test the implementation of known RNG algorithms in different computer environments. Six RNGs have been tested using six statistical tests and one visual test. The statistical tests are the moments, frequency (digit and number), serial, gap, and poker tests. The visual test is a simple two dimensional ordered pair display. In addition the RNGs have been tested in a specific Monte Carlo application. This type of test is often overlooked, however it is important that in addition to satisfactory performance in statistical tests, the RNG be able to perform effectively in the applications of interest. The RNGs tested here are based on a variety of algorithms, including multiplicative and linear congruential, lagged Fibonacci, and combination arithmetic and lagged Fibonacci. The effect of the Bays-Durham shuffling algorithm on the output of a known bad RNG has also been investigated. 18 refs., 11 tabs., 4 figs.
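    One of the tests named above, the gap test, is easy to sketch: record how many draws separate successive landings in a target subinterval. Under a good generator the gap lengths are geometric with p equal to the subinterval's width. A minimal illustration:

```python
import random

def gap_lengths(seq, a=0.0, b=0.5):
    """Gap test helper: lengths of the runs of draws falling outside
    [a, b), between successive landings inside it."""
    gaps, gap = [], 0
    for u in seq:
        if a <= u < b:
            gaps.append(gap)
            gap = 0
        else:
            gap += 1
    return gaps

rng = random.Random(99)
gaps = gap_lengths([rng.random() for _ in range(20000)])
mean_gap = sum(gaps) / len(gaps)
# Under H0, gap lengths are geometric with p = b - a = 0.5, so the
# mean gap should sit near (1 - p) / p = 1; a chi-square comparison
# of the gap-length histogram against the geometric law completes
# the test.
```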

  14. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.
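    The minimum distance idea can be sketched in one dimension: choose the parameter whose model CDF is closest, in a Cramér-von Mises sense, to the empirical CDF. This is a toy normal-location version; the paper estimates whole distributions nonparametrically across many small datasets.

```python
from statistics import NormalDist

def cvm_distance(data, mu, sigma=1.0):
    """Cramér-von Mises-type distance between the empirical CDF of
    `data` and the Normal(mu, sigma) CDF."""
    xs = sorted(data)
    n = len(xs)
    cdf = NormalDist(mu, sigma).cdf
    return sum((cdf(x) - (i + 0.5) / n) ** 2 for i, x in enumerate(xs)) / n

def minimum_distance_location(data, grid):
    """Minimum distance estimate of the location over a grid."""
    return min(grid, key=lambda mu: cvm_distance(data, mu))

# Symmetric toy data centred at 2.0:
data = [1.0, 1.5, 2.0, 2.5, 3.0]
mu_hat = minimum_distance_location(data, [i / 10 for i in range(41)])
```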

  15. A Study of Minimum Competency Testing Programs. Final Summary and Analysis Report.

    Science.gov (United States)

    Gorth, William Phillip; Perkins, Marcy R.

    To portray the status of minimum competency testing programs nationwide as of June 30, 1979, on-site visits were conducted with directors of all 31 state programs and 20 local district programs. Most programs were developed since 1976; 14 state and 13 local programs are fully implemented. The state board of education mandated 16 state programs;…

  16. Budget-aware random testing with T3: benchmarking at the SBST2016 testing tool contest

    NARCIS (Netherlands)

    Prasetya, S.W.B.

    2016-01-01

    Random testing has the advantage that it is usually fast. An interesting use case is to use it for bulk smoke testing, e.g. to smoke test a whole project. However, on a large project, even with random testing it may still take hours to complete. To optimize this, we have adapted an automated random

  17. The influence of random and systematic errors on a general definition of minimum detectable amount (MDA) applicable to all radiobioassay measurements

    International Nuclear Information System (INIS)

    Brodsky, A.

    1985-01-01

    An approach to defining the minimum detectable amount (MDA) of radioactivity in a sample will be discussed, with the aim of obtaining comments helpful in developing a formulation of MDA that will be broadly applicable to all kinds of radiobioassay measurements and acceptable to the scientists who make these measurements. The influence of random and systematic errors on the defined MDA is also examined

  18. Centered Differential Waveform Inversion with Minimum Support Regularization

    KAUST Repository

    Kazei, Vladimir

    2017-05-26

    Time-lapse full-waveform inversion has two major challenges. The first is the reconstruction of a reference model (the baseline model in most approaches). The second is inversion for the time-lapse changes in the parameters. The common-model approach uses the information contained in all available data sets to build a better reference model for time-lapse inversion. Differential (double-difference) waveform inversion reduces the artifacts introduced into estimates of time-lapse parameter changes by imperfect inversion for the baseline-reference model. We propose centered differential waveform inversion (CDWI), which combines these two approaches in order to benefit from both of their features. We apply minimum support regularization, commonly used with electromagnetic methods of geophysical exploration. We test the CDWI method on a synthetic dataset with random noise and show that, with minimum support regularization, it provides better resolution of velocity changes than total variation and Tikhonov regularizations in time-lapse full-waveform inversion.

  19. The Minimum Core for Numeracy Audit and Test

    CERN Document Server

    Patmore, Mark

    2008-01-01

    This book supports trainee teachers in the Lifelong Learning Sector in the assessment of their numeracy knowledge. A self-audit section is included to help trainees understand their level of competence and confidence in numeracy and will help them identify any gaps in their knowledge and skills. This is followed by exercises and activities to support and enhance learning. The book covers all the content of the LLUK standards for the minimum core for numeracy. Coverage and assessment of the minimum core have to be embedded in all Certificate and Diploma courses leading to QTLS and ATLS status.

  20. A Repetition Test for Pseudo-Random Number Generators

    OpenAIRE

    Gil, Manuel; Gonnet, Gaston H.; Petersen, Wesley P.

    2017-01-01

    A new statistical test for uniform pseudo-random number generators (PRNGs) is presented. The idea is that a sequence of pseudo-random numbers should have numbers reappear with a certain probability. The expectation time that a repetition occurs provides the metric for the test. For linear congruential generators (LCGs) failure can be shown theoretically. Empirical test results for a number of commonly used PRNGs are reported, showing that some PRNGs considered to have good statistical propert...
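    The expected waiting time until a repetition is the birthday-problem quantity, approximately sqrt(pi*m/2) draws for a uniform generator over m values. A hedged Python sketch of such a check (not the authors' exact test; the range m = 10**6 and the 200 seeded trials are arbitrary choices for illustration):

```python
# Repetition (birthday-type) check sketch: for a uniform generator over m
# values, the first repeat is expected after about sqrt(pi * m / 2) draws.
# Assumed parameters: m = 10**6 and 200 seeded trials, chosen arbitrarily.
import math
import random

def first_repeat_time(rng, m):
    seen = set()
    t = 0
    while True:
        t += 1
        v = rng.randrange(m)
        if v in seen:
            return t  # draws needed until the first repeated value
        seen.add(v)

m = 10**6
trials = [first_repeat_time(random.Random(s), m) for s in range(200)]
mean_t = sum(trials) / len(trials)
expected = math.sqrt(math.pi * m / 2)  # about 1253 for m = 10**6
```

    A generator whose mean first-repeat time deviates grossly from the expectation (for instance an LCG that never repeats within its full period) fails the test, which is the theoretical failure mode the abstract notes for LCGs.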

  1. Experimental tests on winter cereal: Sod seeding compared to minimum tillage and traditional plowing

    Directory of Open Access Journals (Sweden)

    Antoniotto Guidobono Cavalchini

    2013-09-01

    The sod seeding technique was compared to traditional plowing and minimum tillage in order to evaluate differences in energy consumption, labor and machinery requirements, and CO2 emission reduction. The experiments were conducted on winter cereal seeding in a Po valley farm in October 2011. The tests were carried out as follows: wheat variety seeding, over corn and alfalfa crops, in large plots with three repetitions for each thesis. They included: sod seeding preceded by Roundup weeding in the case of the plots over alfalfa; traditional plowing at 35 cm followed by rotary tillage and combined seeding (seeder plus rotary tiller); and minimum tillage based on ripping at the same depth (35 cm) with a combined seeder (seeder plus rotary tiller). The subsequent farm operations (fertilizer and other agrochemical distributions) were the same in all the considered theses. The results, statistically significant (P<0.001) in terms of yields, highlighted slight differences: the best figures for traditional plowing, both for the wheat crop over corn and over alfalfa (84.43 and 6.75 t/ha); slightly lower yields for sod seeding (6.23 and 79.9 t/ha over corn and alfalfa respectively); and lower still for minimum tillage (5.87 and 79.77 t/ha in the two situations). Huge differences in energy and oil consumption were recorded: in the case of succession to corn, 61.47, 35.31 and 4.27 kg oil/ha respectively for traditional plowing, minimum tillage and sod seeding; in the case of alfalfa, 61.2, 50.96 and 5.14 kg oil/ha respectively for traditional plowing, minimum tillage and sod seeding. The innovative technique highlighted huge energy savings, with an oil consumption equal to 92% and 89% (P<0.001) of that of traditional plowing and minimum tillage. Large differences concern labor and machine productivity. These parameters, together with oil consumption and machine size [power (kW) and weight (t)], lead to even greater differences in

  2. Testing, Selection, and Implementation of Random Number Generators

    National Research Council Canada - National Science Library

    Collins, Joseph C

    2008-01-01

    An exhaustive evaluation of state-of-the-art random number generators with several well-known suites of tests provides the basis for selection of suitable random number generators for use in stochastic simulations...

  3. 16 CFR Table 3 to Part 1512 - Minimum Acceptable Values for the Quantity A Defined in the Retroreflective Tire and Rim Test...

    Science.gov (United States)

    2010-01-01

    Table 3 to Part 1512 (16 CFR, Commercial Practices, 2010 edition) gives the minimum acceptable values for the quantity A defined in the Retroreflective Tire and Rim Test Procedure, tabulated by observation angle (degrees) and entrance angle (degrees)…

  4. Minimum bar size for flexure testing of irradiated SiC/SiC composite

    International Nuclear Information System (INIS)

    Youngblood, G.E.; Jones, R.H.

    1998-01-01

    This report covers material presented at the IEA/Jupiter Joint International Workshop on SiC/SiC Composites for Fusion Structural Applications held in conjunction with ICFRM-8, Sendai, Japan, Oct. 23-24, 1997. The minimum bar size for 4-point flexure testing of SiC/SiC composite recommended by PNNL for irradiation effects studies is 30 x 6 x 2 mm³ with a span-to-depth ratio of 10:1

  5. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    Science.gov (United States)

    2012-10-01

    A handout with tables presenting the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base, and details on aggregate and test methods employed, along with agency and co...

  6. Self-Testing Static Random-Access Memory

    Science.gov (United States)

    Chau, Savio; Rennels, David

    1991-01-01

    The proposed static random-access memory for computers features improved error-detecting and error-correcting capabilities. A new self-testing scheme provides for detection and correction of errors at any time during normal operation, even while data are being written into memory. Faults in equipment causing errors in output data are detected by repeatedly testing every memory cell to determine whether it can still store both "one" and "zero", without destroying data stored in memory.

  7. The Dickey-Fuller test for exponential random walks

    NARCIS (Netherlands)

    Davies, P.L.; Krämer, W.

    2003-01-01

    A common test in econometrics is the Dickey–Fuller test, which is based on the studentized least-squares test statistic. We investigate the behavior of the test statistic if the data yt are given by an exponential random walk exp(Zt), where Zt = Zt-1 + σεt and the εt are independent and identically
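    For illustration, a minimal Dickey-Fuller-type statistic (no constant, no lags; a sketch, not the paper's exact formulation) can be computed by regressing the step Δy_t on y_{t-1} and forming the t-ratio of the slope; here it is applied to a simulated plain and exponential random walk:

```python
# Minimal Dickey-Fuller-type statistic sketch (no constant, no lags; not the
# paper's exact formulation): regress dy_t on y_{t-1} and take the t-ratio
# of the slope. Applied to a simulated plain and an exponential random walk.
import math
import random

def df_statistic(y):
    x = y[:-1]
    dy = [b - a for a, b in zip(y[:-1], y[1:])]
    sxx = sum(v * v for v in x)
    rho = sum(a * b for a, b in zip(x, dy)) / sxx
    resid = [d - rho * v for d, v in zip(dy, x)]
    s2 = sum(r * r for r in resid) / (len(x) - 1)
    return rho / math.sqrt(s2 / sxx)  # t-ratio of the unit-root regression

rng = random.Random(0)
z = [0.0]
for _ in range(500):
    z.append(z[-1] + rng.gauss(0, 1))           # random walk Z_t
stat_rw = df_statistic(z)                       # unit-root case
stat_exp = df_statistic([math.exp(v / 10) for v in z])  # exponential walk
```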

  8. Minimum Competency Testing: An Analysis of Student Outcomes for Those Not Mastering Mandated Testing Standards.

    Science.gov (United States)

    Jonas, Edward D., Jr.; Hayes-Wallace, Lamarian

    The effects of failing to pass a high school exit exam were examined for the Georgia Basic Skills Test (GBST). Data were collected on a random sample of students who were tenth graders in 1983 and in 1984. The following issues were studied: (1) impact of failure on self-esteem, as measured by the Piers-Harris Children's Self-Concept Scale (P-H);…

  9. On tests of randomness for spatial point patterns

    International Nuclear Information System (INIS)

    Doguwa, S.I.

    1990-11-01

    New tests of randomness for spatial point patterns are introduced. These test statistics are then compared in a power study with the existing alternatives. The results of the power study suggest that one of the proposed tests is extremely powerful against both aggregated and regular alternatives. (author). 9 refs, 7 figs, 3 tabs

  10. A method for minimum risk portfolio optimization under hybrid uncertainty

    Science.gov (United States)

    Egorova, Yu E.; Yazenin, A. V.

    2018-03-01

    In this paper, we investigate a minimum risk portfolio model under hybrid uncertainty when the profitability of financial assets is described by fuzzy random variables. According to Feng, the variance of a portfolio is defined as a crisp value. To aggregate fuzzy information the weakest (drastic) t-norm is used. We construct an equivalent stochastic problem of the minimum risk portfolio model and specify the stochastic penalty method for solving it.

  11. Quantum random number generation for loophole-free Bell tests

    Science.gov (United States)

    Mitchell, Morgan; Abellan, Carlos; Amaya, Waldimar

    2015-05-01

    We describe the generation of quantum random numbers at multi-Gbps rates, combined with real-time randomness extraction, to give very high purity random numbers based on quantum events at most tens of ns in the past. The system satisfies the stringent requirements of quantum non-locality tests that aim to close the timing loophole. We describe the generation mechanism using spontaneous-emission-driven phase diffusion in a semiconductor laser, digitization, and extraction by parity calculation using multi-GHz logic chips. We pay special attention to experimental proof of the quality of the random numbers and analysis of the randomness extraction. In contrast to widely-used models of randomness generators in the computer science literature, we argue that randomness generation by spontaneous emission can be extracted from a single source.

  12. 9 CFR 147.51 - Authorized laboratory minimum requirements.

    Science.gov (United States)

    2010-01-01

    Section 147.51 (9 CFR, Animals and Animal Products; Animal and Plant Health Inspection Service; Authorized Laboratories and Approved Tests) sets out the authorized laboratory minimum requirements. These minimum...

  13. Applying the minimax principle to sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    2002-01-01

    The purpose of this paper is to derive optimal rules for sequential mastery tests. In a sequential mastery test, the decision is to classify a subject as a master or a nonmaster, or to continue sampling and administer another random item. The framework of minimax sequential decision theory (minimum

  14. The Minimum Core for Language and Literacy Audit and Test

    CERN Document Server

    Machin, Lynn

    2007-01-01

    This book supports trainee teachers in the Lifelong Learning Sector in the assessment of their literacy knowledge. A self-audit section is included to help trainees understand their level of competence and confidence in literacy and will help them identify any gaps in their knowledge and skills. This is followed by exercises and activities to support and enhance learning. The book covers all the content of the LLUK standards for the minimum core for literacy. Coverage and assessment of the minimum core have to be embedded in all Certificate and Diploma courses leading to QTLS and ATLS status.

  15. Random Vibration Testing of Advanced Wet Tantalum Capacitors

    Science.gov (United States)

    Teverovsky, Alexander

    2015-01-01

    Advanced wet tantalum capacitors allow for improved performance of power supply systems along with substantial reduction of the size and weight of the systems, which is especially beneficial for space electronics. Due to launch-related stresses, acceptance testing of all space systems includes a random vibration test (RVT). However, many types of advanced wet tantalum capacitors cannot consistently pass RVT at the conditions specified in MIL-PRF-39006, which impedes their use in space projects. This requires a closer look at the existing requirements, modes and mechanisms of failures, specifics of test conditions, and acceptance criteria. In this work, different lots of advanced wet tantalum capacitors from four manufacturers have been tested at step-stress random vibration conditions while their currents were monitored before, during, and after the testing. It has been shown that the robustness of the parts and their reliability are mostly due to effective self-healing processes, and that limited current spiking or minor scintillations caused by RVT do not increase the risk of failures during operation. A simple model for scintillation events has been used to simulate current spiking during RVT and optimize test conditions. The significance of scintillations and possible effects of gas generation are discussed, and test acceptance criteria for limited current spiking are suggested.

  16. 77 FR 2606 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2012-01-18

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2012-0004] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  17. 75 FR 9018 - Pipeline Safety: Random Drug Testing Rate

    Science.gov (United States)

    2010-02-26

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID PHMSA-2010-0034] Pipeline Safety: Random Drug Testing Rate AGENCY: Pipeline and Hazardous Materials... pipelines and operators of liquefied natural gas facilities must select and test a percentage of covered...

  18. Sequence-Based Prediction of RNA-Binding Proteins Using Random Forest with Minimum Redundancy Maximum Relevance Feature Selection

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2015-01-01

    The prediction of RNA-binding proteins is one of the most challenging problems in computational biology. Although some studies have investigated this problem, the accuracy of prediction is still not sufficient. In this study, a highly accurate method was developed to predict RNA-binding proteins from amino acid sequences using random forests with the minimum redundancy maximum relevance (mRMR) method, followed by incremental feature selection (IFS). We incorporated conjoint triad features and three novel features: binding propensity (BP), nonbinding propensity (NBP), and evolutionary information combined with physicochemical properties (EIPP). The results showed that these novel features play important roles in improving the performance of the predictor. Using the mRMR-IFS method, our predictor achieved its best performance (86.62% accuracy and 0.737 Matthews correlation coefficient). The high prediction accuracy suggests that our method can be a useful approach to identify RNA-binding proteins from sequence information.
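    The greedy mRMR selection step can be sketched as follows. This is an illustrative pure-Python version that uses absolute Pearson correlation as a stand-in for the mutual-information relevance and redundancy measures of the original method; the feature names and data are invented.

```python
# Greedy mRMR-style selection sketch. Illustrative only: absolute Pearson
# correlation stands in for the mutual-information relevance and redundancy
# measures of the original method; feature names and data are invented.
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def mrmr(features, target, k):
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        def score(name):
            relevance = abs(pearson(features[name], target))
            redundancy = (sum(abs(pearson(features[name], features[s]))
                              for s in selected) / len(selected)) if selected else 0.0
            return relevance - redundancy  # maximum relevance, minimum redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# f1 and f2 are exact duplicates; f3 carries the independent part of the target.
x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 1, 6, 2]
target = [a + b for a, b in zip(x1, x2)]
features = {"f1": x1, "f2": list(x1), "f3": x2}
ranking = mrmr(features, target, k=2)  # keeps f3 and only one duplicate
```

    The redundancy penalty is what keeps the duplicated feature out of the selection even though it is individually highly relevant.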

  19. The minimum wage in the Czech enterprises

    Directory of Open Access Journals (Sweden)

    Eva Lajtkepová

    2010-01-01

    Although the statutory minimum wage is not a new category, in the Czech Republic a definition and regulation of the minimum wage first appeared in the 1990 amendment to Act No. 65/1965 Coll., the Labour Code. The specific amount of the minimum wage and the conditions of its operation were subsequently determined by government regulation in February 1991. Since that time, the minimum wage has been adjusted fifteen times (the last increase was in January 2007). The aim of this article is to present selected results of two surveys on the acceptance of the statutory minimum wage by Czech enterprises. The first survey uses data collected by questionnaire research in 83 small and medium-sized enterprises in the South Moravia Region in 2005; the second uses data from 116 enterprises across the entire Czech Republic (in 2007). The data have been processed by means of standard methods of descriptive statistics and appropriate methods of statistical analysis (Spearman rank correlation coefficient, Kendall coefficient, χ² independence test, Kruskal-Wallis test, and others).

  20. Minimum Bias Trigger in ATLAS

    International Nuclear Information System (INIS)

    Kwee, Regina

    2010-01-01

    Since the restart of the LHC in November 2009, ATLAS has collected inelastic pp collisions to perform first measurements of charged particle densities. These measurements will help to constrain various models describing phenomenologically soft parton interactions. Understanding the trigger efficiencies for different event types is therefore crucial to minimize any possible bias in the event selection. ATLAS uses two main minimum bias triggers, featuring complementary detector components and trigger levels. While a hardware-based first-level trigger situated in the forward regions with 2.2 < |η| < 3.8 has proven to select pp collisions very efficiently, the Inner Detector based minimum bias trigger uses a random seed on filled bunches and central tracking detectors for the event selection. Both triggers were essential for the analysis of kinematic spectra of charged particles. Their performance and trigger efficiency measurements, as well as studies of possible bias sources, will be presented. We also highlight the advantage of these triggers for particle correlation analyses. (author)

  1. A Model for Random Student Drug Testing

    Science.gov (United States)

    Nelson, Judith A.; Rose, Nancy L.; Lutz, Danielle

    2011-01-01

    The purpose of this case study was to examine random student drug testing in one school district relevant to: (a) the perceptions of students participating in competitive extracurricular activities regarding drug use and abuse; (b) the attitudes and perceptions of parents, school staff, and community members regarding student drug involvement; (c)…

  2. Minimum Performance on Clinical Tests of Physical Function to Predict Walking 6,000 Steps/Day in Knee Osteoarthritis: An Observational Study.

    Science.gov (United States)

    Master, Hiral; Thoma, Louise M; Christiansen, Meredith B; Polakowski, Emily; Schmitt, Laura A; White, Daniel K

    2018-07-01

    Evidence of physical function difficulties, such as difficulty rising from a chair, may limit daily walking for people with knee osteoarthritis (OA). The purpose of this study was to identify minimum performance thresholds on clinical tests of physical function predictive of walking ≥6,000 steps/day. This benchmark is known to discriminate people with knee OA who develop functional limitation over time from those who do not. Using data from the Osteoarthritis Initiative, we quantified daily walking as average steps/day from an accelerometer (Actigraph GT1M) worn for ≥10 hours/day over 1 week. Physical function was quantified using 3 performance-based clinical tests: the 5 times sit-to-stand test, walking speed (tested over 20 meters), and the 400-meter walk test. To identify minimum performance thresholds for daily walking, we calculated physical function values corresponding to high specificity (80-95%) for predicting walking ≥6,000 steps/day. Among 1,925 participants (mean ± SD age 65.1 ± 9.1 years, mean ± SD body mass index 28.4 ± 4.8 kg/m², and 55% female) with valid accelerometer data, 54.9% walked ≥6,000 steps/day. High-specificity thresholds of physical function for walking ≥6,000 steps/day ranged from 11.4 to 14.0 seconds on the 5 times sit-to-stand test, from 1.13 to 1.26 meters/second for walking speed, or from 315 to 349 seconds on the 400-meter walk test. Not meeting these minimum performance thresholds on clinical tests of physical function may indicate inadequate physical ability to walk ≥6,000 steps/day for people with knee OA. Rehabilitation may be indicated to address underlying impairments limiting physical function. © 2017, American College of Rheumatology.

  3. Is the minimum enough? Affordability of a nutritious diet for minimum wage earners in Nova Scotia (2002-2012).

    Science.gov (United States)

    Newell, Felicia D; Williams, Patricia L; Watt, Cynthia G

    2014-05-09

    This paper aims to assess the affordability of a nutritious diet for households earning minimum wage in Nova Scotia (NS) from 2002 to 2012 using an economic simulation that includes food costing and secondary data. The cost of the National Nutritious Food Basket (NNFB) was assessed with a stratified, random sample of grocery stores in NS during six time periods: 2002, 2004/2005, 2007, 2008, 2010 and 2012. The NNFB's cost was factored into affordability scenarios for three different household types relying on minimum wage earnings: a household of four; a lone mother with three children; and a lone man. Essential monthly living expenses were deducted from monthly net incomes using methods that were standardized from 2002 to 2012 to determine whether adequate funds remained to purchase a basic nutritious diet across the six time periods. A 79% increase to the minimum wage in NS has resulted in a decrease in the potential deficit faced by each household scenario in the period examined. However, the household of four and the lone mother with three children would still face monthly deficits ($44.89 and $496.77, respectively, in 2012) if they were to purchase a nutritiously sufficient diet. As a social determinant of health, risk of food insecurity is a critical public health issue for low wage earners. While it is essential to increase the minimum wage in the short term, adequately addressing income adequacy in NS and elsewhere requires a shift in thinking from a focus on minimum wage towards more comprehensive policies ensuring an adequate livable income for everyone.

  4. 10 CFR 707.7 - Random drug testing requirements and identification of testing designated positions.

    Science.gov (United States)

    2010-01-01

    ... contractor, to have the potential to significantly affect the environment, public health and safety, or... evidence of the use of illegal drugs of employees in testing designated positions identified in this... section shall provide for random tests at a rate equal to 30 percent of the total number of employees in...

  5. Electronic Nose Testing Procedure for the Definition of Minimum Performance Requirements for Environmental Odor Monitoring

    Directory of Open Access Journals (Sweden)

    Lidia Eusebio

    2016-09-01

    Despite initial enthusiasm towards electronic noses and their possible application in different fields, and despite many promising results, several criticalities emerge from most published research studies; as a matter of fact, the diffusion of electronic noses in real-life applications is still very limited. In general, a first step towards large-scale diffusion of an analysis method is standardization. The aim of this paper is to describe the experimental procedure adopted to evaluate electronic nose performances, with the final purpose of establishing minimum performance requirements, which is considered a first crucial step towards standardization in the specific case of electronic nose application for environmental odor monitoring at receptors. Based on the experimental results of performance testing of a commercial electronic nose with respect to three criteria (i.e., response invariability under variable atmospheric conditions, instrumental detection limit, and odor classification accuracy), it was possible to propose a logic that could be adopted for the definition of minimum performance requirements, according to the idea that these should be technologically achievable.

  6. A pseudo-random number generator and its spectral test

    International Nuclear Information System (INIS)

    Wang Lai

    1998-01-01

    The author introduces a pseudo-random number generator and describes its algorithm and C-language implementation. The performance of the generator is tested and compared with some well-known LCG generators

  7. Test of Random Walk Behavior in Karachi Stock Exchange

    Directory of Open Access Journals (Sweden)

    Muhammad Mudassar

    2013-05-01

    A study was carried out to check the random behavior of the Karachi Stock Exchange (KSE) 100 Index during the past three financial years, to determine whether investors could generate abnormal profits during the period. The tests used were the Runs Test, ADF Test, PP Test and Autocorrelation Function Test. The study found that the KSE 100 Index remained weak-form inefficient and that investors were able to generate excessive returns on their investment most of the time.

  8. Same day ART initiation versus clinic-based pre-ART assessment and counselling for individuals newly tested HIV-positive during community-based HIV testing in rural Lesotho - a randomized controlled trial (CASCADE trial).

    Science.gov (United States)

    Labhardt, Niklaus Daniel; Ringera, Isaac; Lejone, Thabo Ishmael; Masethothi, Phofu; Thaanyane, T'sepang; Kamele, Mashaete; Gupta, Ravi Shankar; Thin, Kyaw; Cerutti, Bernard; Klimkait, Thomas; Fritz, Christiane; Glass, Tracy Renée

    2016-04-14

    Achievement of the UNAIDS 90-90-90 targets in Sub-Saharan Africa is challenged by a weak care cascade with poor linkage to care and retention in care. Community-based HIV testing and counselling (HTC) is widely used in African countries. However, rates of linkage to care and initiation of antiretroviral therapy (ART) among individuals who test HIV-positive are often very low. A frequently cited reason for non-linkage to care is the time-consuming pre-ART assessment, which often requires several clinic visits before ART initiation. This two-armed open-label randomized controlled trial compares, in individuals tested HIV-positive during community-based HTC, the offer of same-day community-based ART initiation with the standard-of-care pre-ART assessment at the clinic. Home-based HTC campaigns will be conducted in the catchment areas of six clinics in rural Lesotho. Households where at least one individual tested HIV-positive will be randomized. In the standard-of-care group, individuals receive post-test counselling and referral to the nearest clinic for pre-ART assessment and counselling. Once they have started ART, the follow-up schedule foresees monthly clinic visits. Individuals randomized to the intervention group receive on-the-spot point-of-care pre-ART assessment and adherence counselling with the offer to start ART that same day. Once they have started ART, follow-up clinic visits will be less frequent. The first primary outcome is linkage to care (the individual presents at the clinic at least once within 3 months after the HIV test). The second primary outcome is viral suppression 12 months after enrolment in the study. We plan to enrol a minimum of 260 households with 1:1 allocation and parallel assignment into both arms. This trial will show whether, in individuals tested HIV-positive during community-based HTC campaigns, the offer of same-day ART initiation in the community, combined with less frequent follow-up visits at the clinic, could be a pragmatic approach to

  9. Why the null matters: statistical tests, random walks and evolution.

    Science.gov (United States)

    Sheets, H D; Mitchell, C E

    2001-01-01

    A number of statistical tests have been developed to determine what type of dynamics underlie observed changes in morphology in evolutionary time series, based on the pattern of change within the time series. The theory of the 'scaled maximum', the 'log-rate-interval' (LRI) method, and the Hurst exponent all operate on the same principle of comparing the maximum change, or rate of change, in the observed dataset to the maximum change expected of a random walk. Less change in a dataset than expected of a random walk has been interpreted as indicating stabilizing selection, while more change implies directional selection. The 'runs test', in contrast, operates on the sequencing of steps rather than on excursion. Applications of these tests to computer-generated, simulated time series of known dynamical form and various levels of additive noise indicate that there is a fundamental asymmetry in the rate of type II errors of the tests based on excursion: they are all highly sensitive to noise in models of directional selection that result in a linear trend within a time series, but are largely noise-immune in the case of a simple model of stabilizing selection. Additionally, the LRI method has a lower sensitivity than originally claimed, owing to the large range of LRI rates produced by random walks. Examination of the published results of these tests shows that they have seldom produced the conclusion that an observed evolutionary time series was due to directional selection, a result which needs closer examination in light of the asymmetric response of these tests.
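    The runs test mentioned above can be illustrated with the Wald-Wolfowitz statistic on the signs of successive steps (normal approximation; a sketch, not the authors' exact procedure, applied to two synthetic series):

```python
# Sketch of a runs test on step signs (Wald-Wolfowitz normal approximation).
# Illustrative only: the series below are synthetic, not evolutionary data.
import math

def runs_z(series):
    # 1 for an up-step, 0 for a down-step (assumes no exactly-equal neighbors)
    signs = [1 if b > a else 0 for a, b in zip(series[:-1], series[1:])]
    n1 = sum(signs)
    n2 = len(signs) - n1
    runs = 1 + sum(1 for a, b in zip(signs[:-1], signs[1:]) if a != b)
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mu) / math.sqrt(var)

up_then_down = list(range(26)) + list(range(24, 0, -1))  # one long excursion
zigzag = [i % 2 for i in range(50)]                      # reverses every step

z_persistent = runs_z(up_then_down)   # far fewer runs than expected: z << 0
z_oscillating = runs_z(zigzag)        # far more runs than expected: z >> 0
```

    Because the statistic depends on step ordering rather than total excursion, added noise affects it differently from the excursion-based tests, which is the asymmetry the abstract describes.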

  10. Connectivity ranking of heterogeneous random conductivity models

    Science.gov (United States)

    Rizzo, C. B.; de Barros, F.

    2017-12-01

    To overcome the challenges associated with hydrogeological data scarcity, the hydraulic conductivity (K) field is often represented by a spatial random process. The state of the art provides several methods to generate 2D or 3D random K-fields, such as classic multi-Gaussian fields, non-Gaussian fields, training-image-based fields and object-based fields. We provide a systematic comparison of these models based on their connectivity. We use the minimum hydraulic resistance as a connectivity measure, which has been found to be strongly correlated with early arrival times of dissolved contaminants. A computationally efficient graph-based algorithm is employed, allowing a stochastic treatment of the minimum hydraulic resistance through a Monte Carlo approach and therefore enabling the computation of its uncertainty. The results show the impact of geostatistical parameters on the connectivity for each group of random fields, making it possible to rank the fields according to their minimum hydraulic resistance.
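    A graph-based computation of minimum hydraulic resistance can be sketched with Dijkstra's algorithm on a conductivity grid; the discretization below (edge weight taken as the mean cell resistance 1/K of the two neighboring cells) is an assumption for illustration, not necessarily the authors' exact scheme.

```python
# Minimum hydraulic resistance across a conductivity grid as a shortest-path
# problem (Dijkstra). Assumed discretization for illustration: source cells
# cost 1/K, and each edge weight is the mean resistance of its two cells.
import heapq

def min_hydraulic_resistance(K):
    rows, cols = len(K), len(K[0])
    # inject along the left edge; seek the cheapest path to the right edge
    dist = {(r, 0): 1.0 / K[r][0] for r in range(rows)}
    pq = [(d, rc) for rc, d in dist.items()]
    heapq.heapify(pq)
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        if c == cols - 1:
            return d  # first right-edge pop is the minimum resistance
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 0.5 * (1.0 / K[r][c] + 1.0 / K[nr][nc])
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

K = [[1.0, 0.1, 1.0],   # low-K (high-resistance) cells block the top row
     [1.0, 1.0, 1.0],   # an open channel through the middle row
     [0.1, 0.1, 0.5]]
R = min_hydraulic_resistance(K)  # follows the middle-row channel
```

    Repeating this over many realizations of the random K-field gives the Monte Carlo distribution of the minimum resistance mentioned in the abstract.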

  11. 30 CFR 18.97 - Inspection of machines; minimum requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Inspection of machines; minimum requirements... TESTING, EVALUATION, AND APPROVAL OF MINING PRODUCTS ELECTRIC MOTOR-DRIVEN MINE EQUIPMENT AND ACCESSORIES Field Approval of Electrically Operated Mining Equipment § 18.97 Inspection of machines; minimum...

  12. Minimum critical crack depths in pressure vessels guidelines for nondestructive testing

    International Nuclear Information System (INIS)

    Crossley, M.R.; Townley, C.H.A.

    1983-09-01

    Estimates of the minimum critical crack depths which can be expected in high quality vessels designed to certain British and American Code rules are given. A simple means of allowing for fatigue crack growth in service is included. The data which are presented can be used to decide what sensitivity and what reporting levels should be employed during an ultrasonic inspection of a pressure vessel. It is emphasised that the minimum crack depths are those which would be relevant to a vessel in which the material is stressed to its maximum permitted value during operation. Stresses may, in practice, be significantly less than this. Less restrictive inspection standards might be established if it were considered worthwhile to carry out a detailed stress analysis of the particular vessel under examination. (author)

  13. Humans can consciously generate random number sequences: a possible test for artificial intelligence.

    Science.gov (United States)

    Persaud, Navindra

    2005-01-01

    Computer algorithms can only produce seemingly random, or pseudorandom, numbers, whereas certain natural phenomena, such as the decay of radioactive particles, can be utilized to produce truly random numbers. In this study, the ability of humans to generate random numbers was tested in healthy adults. Subjects were simply asked to generate and dictate random numbers. The generated numbers were tested for uniformity, independence and information density. The results suggest that humans can generate random numbers that are uniformly distributed, independent of one another and unpredictable. If humans can generate sequences of random numbers, then neural networks or forms of artificial intelligence, which are purported to function in ways essentially the same as the human brain, should also be able to generate sequences of random numbers. Elucidating the precise mechanism by which humans generate random number sequences and the underlying neural substrates may have implications for the cognitive science of decision-making. It is possible that humans use their random-generating neural machinery to make difficult decisions in which all expected outcomes are similar. It is also possible that certain people, perhaps those with neurological or psychiatric impairments, are less able or unable to generate random numbers. If the random-generating neural machinery is employed in decision making, its impairment would have profound implications in matters of agency and free will.
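
    Uniformity and independence checks of the kind mentioned above can be illustrated with two elementary statistics. The study's actual test battery is not reproduced here, so the sketch below (with a machine-generated sequence standing in for dictated digits) is only indicative:

```python
import numpy as np

def uniformity_chi2(digits, k=10):
    """Pearson chi-square statistic against a uniform distribution over
    k symbols; compare with a chi-square law with k-1 degrees of freedom."""
    counts = np.bincount(digits, minlength=k)
    expected = len(digits) / k
    return float(np.sum((counts - expected) ** 2 / expected))

def lag1_serial_corr(digits):
    """Correlation between successive digits; near 0 for independent draws."""
    x = np.asarray(digits, dtype=float)
    return float(np.corrcoef(x[:-1], x[1:])[0, 1])

# Stand-in sequence; a dictated human sequence would be tested the same way.
seq = np.random.default_rng(0).integers(0, 10, 1000)
print(uniformity_chi2(seq), lag1_serial_corr(seq))
```

    A large chi-square value signals non-uniform digit preferences, while a lag-1 correlation far from zero signals predictable digit-to-digit dependence.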

  14. An Ultra-light PRNG Passing Strict Randomness Tests and Suitable for Low Cost Tags

    Directory of Open Access Journals (Sweden)

    OZCANHAN, M. H.

    2016-08-01

    Full Text Available A pseudo-random number generator for low-cost RFID tags is presented. The scheme is simple, sequential and secure, yet has high performance. Despite having the lowest hardware complexity, our proposal represents a better alternative than previous proposals for low-cost tags. The scheme is based on the well-founded pseudo-random number generator Mersenne Twister. The proposed generator takes low-entropy seeds extracted from a physical characteristic of the tag and produces outputs that pass popular randomness tests. By contrast, previous proposals were tested with random-number inputs from a popular online source, inputs that are simply unavailable to tags. The high performance and satisfactory randomness of the present work are supported by extensive test results and compared with similar previous works. Comparison using proven estimation formulae indicates that our proposal has the best hardware complexity, power consumption, and cost.

  15. Inficon Transpector MPH Mass Spectrometer Random Vibration Test Report

    Science.gov (United States)

    Santiago-Bond, Jo; Captain, Janine

    2015-01-01

    The purpose of this test report is to summarize results from the vibration testing of the INFICON Transpector MPH100M model mass spectrometer. It also identifies the requirements satisfied and the procedures used in the test. Because the mass spectrometer is a payload of Resource Prospector, it is necessary to determine its survivability under proto-qualification level random vibration. Changes in the sensitivity of the mass spectrometer can be interpreted as a change in the alignment of the instrument. The results of this test will be used to determine any necessary design changes as the team moves forward with flight design.

  16. An empirical test of pseudo random number generators by means of an exponential decaying process

    International Nuclear Information System (INIS)

    Coronel B, H.F.; Hernandez M, A.R.; Jimenez M, M.A.; Mora F, L.E.

    2007-01-01

    Empirical tests for pseudo random number generators based on the use of processes or physical models have been successfully used and are considered as complementary to theoretical tests of randomness. In this work a statistical methodology for evaluating the quality of pseudo random number generators is presented. The method is illustrated in the context of the so-called exponential decay process, using some pseudo random number generators commonly used in physics. (Author)
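
    A decay-based empirical test along these lines can be sketched as follows (an illustrative reconstruction using Python's standard generator, not the specific generators evaluated in the paper): each simulated nucleus decays in a time step with probability p, and the surviving population is compared with the analytic curve n0(1-p)^t.

```python
import random

def simulate_decay(n0, p, steps, rng):
    """Simulate exponential decay: each surviving nucleus decays in a
    step with probability p. A good generator should reproduce the
    analytic survival curve n0 * (1 - p) ** t."""
    alive = n0
    counts = [alive]
    for _ in range(steps):
        alive = sum(1 for _ in range(alive) if rng.random() > p)
        counts.append(alive)
    return counts

rng = random.Random(42)
counts = simulate_decay(100000, 0.1, 20, rng)
expected = [100000 * 0.9 ** t for t in range(21)]
max_rel_err = max(abs(c - e) / e for c, e in zip(counts, expected))
print(max_rel_err)
```

    A systematic departure of the simulated curve from the analytic one, beyond the expected binomial fluctuations, would flag a defective generator.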

  17. Theoretical and Empirical Analysis of the Craps Test from the Diehard Battery of Randomness Tests for Testing Pseudorandom Number Generators

    Directory of Open Access Journals (Sweden)

    Sari Agustini Hafman

    2013-05-01

    Full Text Available According to Kerckhoffs (1883), the security of a system should rely only on the cryptographic keys used in that system. Generally, key sequences are generated by a Pseudo Random Number Generator (PRNG) or a Random Number Generator (RNG). There are three types of random sequences generated by RNGs and PRNGs: pseudorandom sequences, cryptographically secure pseudorandom sequences, and truly random sequences. Several statistical tests, including the Diehard battery of randomness tests, are used to check which type of sequence a PRNG or RNG generates. Because the choice of test parameters and of the test statistic determines the validity of the conclusions a statistical test produces, a theoretical analysis applying a variety of statistical theory is performed to evaluate the craps test, one of the tests included in the Diehard battery. The craps test, inspired by the game of craps, examines whether a PRNG produces independent and identically distributed (iid) pseudorandom sequences. The theoretical analysis derives the test statistic equation and shows how the game of craps is applied in the test. Furthermore, empirical observations are made by applying the craps test to a PRNG in order to check the test's effectiveness in detecting the distribution and independence of the sequences it produces.
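
    The mechanics of the craps test can be sketched by playing the game with a candidate generator and comparing the observed win fraction with the theoretical probability 244/495 ≈ 0.4929; the Diehard version additionally applies a chi-square test to the distribution of the number of throws per game. The code below is an illustrative reconstruction, not the Diehard implementation:

```python
import random

def craps_game(rng):
    """Play one game of craps with the given generator.

    Returns (won, n_throws): 7 or 11 on the come-out roll wins, 2, 3 or
    12 loses; otherwise the roll becomes the 'point' and dice are thrown
    until the point recurs (win) or a 7 appears (loss)."""
    roll = rng.randint(1, 6) + rng.randint(1, 6)
    throws = 1
    if roll in (7, 11):
        return True, throws
    if roll in (2, 3, 12):
        return False, throws
    point = roll
    while True:
        roll = rng.randint(1, 6) + rng.randint(1, 6)
        throws += 1
        if roll == point:
            return True, throws
        if roll == 7:
            return False, throws

rng = random.Random(0)
games = 20000
wins = sum(craps_game(rng)[0] for _ in range(games))
# theoretical win probability is 244/495
print(wins / games)
```

    A generator whose win fraction drifts away from 244/495, or whose throws-per-game distribution fails the chi-square comparison, is flagged as non-iid.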

  18. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  19. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  20. The necessity of randomness in tests of Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Bednorz, Adam; Zielinski, Jakub

    2003-08-11

    The possibility that detectors may affect the input quantum entangled state is pointed out. It is suggested that experiments testing Bell inequalities should be repeated with more randomly oriented polarizers, both to close the communication loophole and to refute certain local variable theories with a low efficiency bound.

  1. Vaginal Swab Test Compared With the Urethral Q-tip Test for Urethral Mobility Measurement: A Randomized Controlled Trial.

    Science.gov (United States)

    Meyer, Isuzu; Szychowski, Jeff M; Illston, Jana D; Parden, Alison M; Richter, Holly E

    2016-02-01

    To assess whether use of a vaginal cotton-tipped swab is equivalent to the standard Q-tip test regarding urethral mobility and, secondarily, to examine whether both tests agree in hypermobility diagnosis, discomfort level, and patient preference. In this randomized crossover trial, women with stress urinary incontinence without prolapse beyond the hymen were randomized to undergo either a vaginal or urethral mobility test first, followed by the alternate approach. The primary outcome was the difference in rotation angle, from resting to maximum strain, between tests. The equivalence margin was ±10°. The secondary outcome was agreement in hypermobility diagnosis using two definitions: 1) maximum straining angle of 30° or greater from the horizontal plane; and 2) rotation angle of 30° or greater. Discomfort was assessed using a 0-10 visual analog scale. Using 90% power and assuming a standard deviation of 20°, 36 and 139 patients were needed for the 10° and 5° equivalence margins, respectively. From January 2014 to March 2015, 140 women were randomized. The mean difference between the two tests was 5.1° (95% confidence interval 3.2-6.9°), meeting the predefined equivalence criteria. In the hypermobility diagnosis, the urethral and vaginal tests had no disagreement using definition 1 (P=.23), whereas the two tests disagreed using definition 2 (P=.03). The urethral approach had a higher discomfort level. The vaginal swab test is equivalent to the standard Q-tip test in measuring urethral mobility, causes less discomfort, and is preferred by patients.

  2. Portfolio optimization and the random magnet problem

    Science.gov (United States)

    Rosenow, B.; Plerou, V.; Gopikrishnan, P.; Stanley, H. E.

    2002-08-01

    Diversification of an investment into independently fluctuating assets reduces its risk. In reality, movements of assets are mutually correlated, and therefore knowledge of the cross-correlations among asset price movements is of great importance. Our results support the possibility that the problem of finding an investment in stocks which exposes invested funds to a minimum level of risk is analogous to the problem of finding the magnetization of a random magnet. The interactions for this "random magnet problem" are given by the cross-correlation matrix C of stock returns. We find that random matrix theory allows us to make an estimate for C which outperforms the standard estimate in terms of constructing an investment which carries a minimum level of risk.
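
    The random matrix estimate rests on the Marchenko-Pastur result: for T observations of N uncorrelated series, the eigenvalues of the empirical correlation matrix fall (asymptotically) within [(1-√q)², (1+√q)²] with q = N/T, so eigenvalues outside this band carry genuine correlation information. A minimal sketch with synthetic, uncorrelated returns (illustrative data, not the authors' dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 1000, 100                       # observations, assets
returns = rng.standard_normal((T, N))  # uncorrelated stand-in for stock returns

C = np.corrcoef(returns, rowvar=False)          # empirical N x N correlation matrix
eigvals = np.linalg.eigvalsh(C)

q = N / T
lmin, lmax = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2   # Marchenko-Pastur band
print(eigvals.min(), eigvals.max(), (lmin, lmax))
```

    For real returns, eigenvalues above lmax identify the genuinely correlated modes; keeping only those modes when rebuilding C is the filtering step that improves the minimum-risk portfolio estimate.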

  3. Experimental investigations of the minimum ignition energy and the minimum ignition temperature of inert and combustible dust cloud mixtures.

    Science.gov (United States)

    Addai, Emmanuel Kwasi; Gabel, Dieter; Krause, Ulrich

    2016-04-15

    The risks associated with dust explosions still exist in industries that either process or handle combustible dust. This explosion risk can be prevented or mitigated by applying the principle of inherent safety (moderation), which is achieved by adding an inert material to a highly combustible material in order to decrease the ignition sensitivity of the combustible dust. The present paper deals with the experimental investigation of the influence of adding an inert dust on the minimum ignition energy and the minimum ignition temperature of combustible/inert dust mixtures. The experiments were carried out in two laboratory-scale apparatuses: the Hartmann apparatus and the Godbert-Greenwald furnace, for the minimum ignition energy and the minimum ignition temperature tests, respectively. Various amounts of three inert materials (magnesium oxide, ammonium sulphate and sand) were mixed with six combustible dusts (brown coal, lycopodium, toner, niacin, corn starch and high-density polyethylene). Generally, increasing the inert material concentration increases the minimum ignition energy as well as the minimum ignition temperature, until a threshold is reached at which no ignition is obtained. The permissible range for the inert fraction of the mixture to minimize the ignition risk lies between 60 and 80%. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Minimum load reduction for once-through boiler power plants

    International Nuclear Information System (INIS)

    Colombo, P.; Godina, G.; Manganelli, R.

    2001-01-01

    In Italy, the liberalization of the energy market is lending particular importance to the optimization of power plant performance, especially for those plants that will be called on to satisfy grid peak demands. Techniques for reducing the minimum load have been tried on some of these plants; these techniques, investigated and tested on an engineering dynamic simulator, were subsequently tested on the plant itself. The minimum load of the 320 MW units of the Tavazzano power plant has been reduced from 140 MW to 80 MW without plant modification.

  5. Experimental investigations of the minimum ignition energy and the minimum ignition temperature of inert and combustible dust cloud mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Addai, Emmanuel Kwasi, E-mail: emmanueladdai41@yahoo.com; Gabel, Dieter; Krause, Ulrich

    2016-04-15

    Highlights: • The ignition sensitivity of a highly flammable dust decreases upon addition of an inert dust. • The minimum ignition temperature of a highly flammable dust increases as the inert concentration increases. • The minimum ignition energy of a highly flammable dust increases as the inert concentration increases. • The permissible range for the inert fraction of the mixture to minimize the ignition risk lies between 60 and 80%. - Abstract: The risks associated with dust explosions still exist in industries that either process or handle combustible dust. This explosion risk can be prevented or mitigated by applying the principle of inherent safety (moderation), which is achieved by adding an inert material to a highly combustible material in order to decrease the ignition sensitivity of the combustible dust. The present paper deals with the experimental investigation of the influence of adding an inert dust on the minimum ignition energy and the minimum ignition temperature of combustible/inert dust mixtures. The experiments were carried out in two laboratory-scale apparatuses: the Hartmann apparatus and the Godbert-Greenwald furnace, for the minimum ignition energy and the minimum ignition temperature tests, respectively. Various amounts of three inert materials (magnesium oxide, ammonium sulphate and sand) were mixed with six combustible dusts (brown coal, lycopodium, toner, niacin, corn starch and high-density polyethylene). Generally, increasing the inert material concentration increases the minimum ignition energy as well as the minimum ignition temperature, until a threshold is reached at which no ignition is obtained. The permissible range for the inert fraction of the mixture to minimize the ignition risk lies between 60 and 80%.

  6. Experimental investigations of the minimum ignition energy and the minimum ignition temperature of inert and combustible dust cloud mixtures

    International Nuclear Information System (INIS)

    Addai, Emmanuel Kwasi; Gabel, Dieter; Krause, Ulrich

    2016-01-01

    Highlights: • The ignition sensitivity of a highly flammable dust decreases upon addition of an inert dust. • The minimum ignition temperature of a highly flammable dust increases as the inert concentration increases. • The minimum ignition energy of a highly flammable dust increases as the inert concentration increases. • The permissible range for the inert fraction of the mixture to minimize the ignition risk lies between 60 and 80%. - Abstract: The risks associated with dust explosions still exist in industries that either process or handle combustible dust. This explosion risk can be prevented or mitigated by applying the principle of inherent safety (moderation), which is achieved by adding an inert material to a highly combustible material in order to decrease the ignition sensitivity of the combustible dust. The present paper deals with the experimental investigation of the influence of adding an inert dust on the minimum ignition energy and the minimum ignition temperature of combustible/inert dust mixtures. The experiments were carried out in two laboratory-scale apparatuses: the Hartmann apparatus and the Godbert-Greenwald furnace, for the minimum ignition energy and the minimum ignition temperature tests, respectively. Various amounts of three inert materials (magnesium oxide, ammonium sulphate and sand) were mixed with six combustible dusts (brown coal, lycopodium, toner, niacin, corn starch and high-density polyethylene). Generally, increasing the inert material concentration increases the minimum ignition energy as well as the minimum ignition temperature, until a threshold is reached at which no ignition is obtained. The permissible range for the inert fraction of the mixture to minimize the ignition risk lies between 60 and 80%.

  7. Planetary tides during the Maunder sunspot minimum

    International Nuclear Information System (INIS)

    Smythe, C.M.; Eddy, J.A.

    1977-01-01

    Sun-centered planetary conjunctions and tidal potentials are here constructed for the AD 1645 to 1715 period of sunspot absence, referred to as the 'Maunder Minimum'. These are found to be effectively indistinguishable from patterns of conjunctions and power spectra of tidal potential in the present era of a well established 11 year sunspot cycle. This places a new and difficult constraint on any tidal theory of sunspot formation. Problems arise in any direct gravitational theory due to the apparently insufficient forces and tidal heights involved. Proponents of the tidal hypothesis usually revert to trigger mechanisms, which are difficult to criticise or test by observation. Any tidal theory rests on the evidence of continued sunspot periodicity and the substantiation of a prolonged period of solar anomaly in the historical past. The 'Maunder Minimum' was the most drastic change in the behaviour of solar activity in the last 300 years: sunspots virtually disappeared for a 70 year period and the 11 year cycle was probably absent. During that time, however, the nine planets were all in their orbits, and planetary conjunctions and tidal potentials were indistinguishable from those of the present era, in which the 11 year cycle is well established. This provides good evidence against the tidal theory. The pattern of planetary tidal forces during the Maunder Minimum was reconstructed to investigate the possibility that the multiple-planet forces somehow fortuitously cancelled at the time, that is, that the positions of the slower moving planets in the 17th and early 18th centuries were such that conjunctions and tidal potentials were reduced in number and force. There was no striking dissimilarity between the time of the Maunder Minimum and any other period investigated. The failure of planetary conjunction patterns to reflect the drastic drop in sunspots during the Maunder Minimum casts doubt on the tidal theory of solar activity, but a more quantitative test is still needed.

  8. Parameters Tuning of Model Free Adaptive Control Based on Minimum Entropy

    Institute of Scientific and Technical Information of China (English)

    Chao Ji; Jing Wang; Liulin Cao; Qibing Jin

    2014-01-01

    Dynamic-linearization-based model free adaptive control (MFAC) algorithms have been widely used in practical systems, in which some parameters must be tuned before the algorithm can be successfully applied in the process industries. Considering the random noise present in real processes, a parameter tuning method based on minimum entropy optimization is proposed, in which the feature of entropy is used to accurately describe the system uncertainty. For the cases of Gaussian and non-Gaussian stochastic noise, an entropy recursive optimization algorithm is derived based on an approximate or identified model. Extensive simulation results show the effectiveness of minimum entropy optimization for partial-form dynamic-linearization-based MFAC. When stochastic noise is present, the parameters tuned by the minimum entropy optimization index show stronger stability and more robustness than those tuned by traditional indices such as the integral of squared error (ISE) or the integral of time-weighted absolute error (ITAE).
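
    The entropy index used for tuning can be estimated from a histogram of the tracking error. The sketch below (synthetic Gaussian errors standing in for closed-loop data) shows that a concentrated error distribution yields a lower entropy estimate, which is what a minimum-entropy criterion rewards:

```python
import numpy as np

def error_entropy(errors, edges):
    """Plug-in estimate (in nats) of the Shannon entropy of the tracking
    error, from a histogram over fixed bin edges so that different error
    signals are compared on the same grid."""
    counts, _ = np.histogram(errors, bins=edges)
    p = counts[counts > 0] / len(errors)
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(0)
edges = np.linspace(-4.0, 4.0, 81)        # shared bins for a fair comparison
well_tuned = rng.normal(0.0, 0.1, 5000)   # concentrated tracking error
poorly_tuned = rng.normal(0.0, 1.0, 5000) # dispersed tracking error
print(error_entropy(well_tuned, edges), error_entropy(poorly_tuned, edges))
```

    A tuning loop would evaluate this index over candidate MFAC parameter settings and keep the setting with the smallest error entropy.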

  9. Correlation of finite element free vibration predictions using random vibration test data. M.S. Thesis - Cleveland State Univ.

    Science.gov (United States)

    Chambers, Jeffrey A.

    1994-01-01

    Finite element analysis is regularly used during the engineering cycle of mechanical systems to predict the response to static, thermal, and dynamic loads. The finite element model (FEM) used to represent the system is often correlated with physical test results to determine the validity of analytical results provided. Results from dynamic testing provide one means for performing this correlation. One of the most common methods of measuring accuracy is by classical modal testing, whereby vibratory mode shapes are compared to mode shapes provided by finite element analysis. The degree of correlation between the test and analytical mode shapes can be shown mathematically using the cross orthogonality check. A great deal of time and effort can be exhausted in generating the set of test acquired mode shapes needed for the cross orthogonality check. In most situations response data from vibration tests are digitally processed to generate the mode shapes from a combination of modal parameters, forcing functions, and recorded response data. An alternate method is proposed in which the same correlation of analytical and test acquired mode shapes can be achieved without conducting the modal survey. Instead a procedure is detailed in which a minimum of test information, specifically the acceleration response data from a random vibration test, is used to generate a set of equivalent local accelerations to be applied to the reduced analytical model at discrete points corresponding to the test measurement locations. The static solution of the analytical model then produces a set of deformations that once normalized can be used to represent the test acquired mode shapes in the cross orthogonality relation. The method proposed has been shown to provide accurate results for both a simple analytical model as well as a complex space flight structure.
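
    Once the mode shapes and a reduced mass matrix are available, the cross orthogonality check itself is a short computation. The sketch below uses illustrative matrices (not flight data): both shape sets are mass-normalized and the matrix Φ_testᵀ M Φ_FEM is formed, with values near 1 on the diagonal and near 0 off the diagonal indicating good test/analysis correlation.

```python
import numpy as np

def cross_orthogonality(phi_test, phi_fem, M):
    """Mass-weighted cross-orthogonality matrix between two sets of
    mode shapes (one mode per column), after mass-normalizing each set."""
    def normalize(phi):
        norms = np.sqrt(np.einsum('ji,jk,ki->i', phi, M, phi))
        return phi / norms
    return normalize(phi_test).T @ M @ normalize(phi_fem)

# Illustrative 3-DOF system: mass-orthogonal "FEM" shapes plus a perturbed
# "test" set standing in for shapes recovered from vibration response data.
rng = np.random.default_rng(0)
M = np.diag([2.0, 1.0, 3.0])
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
phi_fem = np.diag(1.0 / np.sqrt(np.diag(M))) @ Q   # satisfies phi^T M phi = I
phi_test = phi_fem + 0.01 * rng.standard_normal((3, 3))
print(np.round(cross_orthogonality(phi_test, phi_fem, M), 3))
```

    In the procedure described above, phi_test would be replaced by the normalized static deformations obtained from the equivalent accelerations, and M by the reduced analytical mass matrix.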

  10. A test for stationarity of spatio-temporal random fields on planar and spherical domains

    KAUST Repository

    Jun, Mikyoung

    2012-01-01

    A formal test for weak stationarity of spatial and spatio-temporal random fields is proposed. We consider the cases where the spatial domain is planar or spherical, and we do not require distributional assumptions for the random fields. The method can be applied to univariate or to multivariate random fields. Our test is based on the asymptotic normality of certain statistics that are functions of estimators of covariances at certain spatial and temporal lags under weak stationarity. Simulation results for spatial as well as spatio-temporal cases on the two types of spatial domains are reported. We describe the results of testing the stationarity of Pacific wind data, and of testing the axial symmetry of climate model errors for surface temperature using the NOAA GFDL model outputs and the observations from the Climate Research Unit in East Anglia and the Hadley Centre.

  11. Minimum Leakage Condenser Test Program

    International Nuclear Information System (INIS)

    1978-05-01

    This report presents the results and analysis of tests performed on four critical areas of large surface condensers: the tubes, tubesheets, tube/tubesheet joints and the water chambers. Significant changes in operation and service duty, together with reliability considerations, require that certain existing design criteria be verified and that improved design features be developed. The four critical areas were treated analytically and experimentally. The ANSYS finite element computer program was the basic analytical method, and strain gages were used for obtaining experimental data. The test and analytical results are compared, and recommendations are made regarding potential improvements in condenser design features and analytical techniques.

  12. On the correlation between minimum thickness and central deflection during small punch test

    International Nuclear Information System (INIS)

    Kumar, Pradeep; Chattopadhyay, J.; Dutta, B.K.

    2016-01-01

    The present paper deals with a detailed study of the correlation between the normalized minimum thickness (t/t_0) and the normalized central deflection (δ/t_0). Such data are obtained during the deformation of a miniaturized specimen in a small punch test. Finite element studies have been carried out to investigate the effect of various parameters expected to influence this correlation: material hardening, material yield stress, coefficient of friction and initial thickness of the specimen. It is shown that the correlation remains unaffected by changes in the material parameters. Similarly, a coefficient of friction beyond 0.2 does not affect the correlation. However, a change in thickness has a significant effect. A modification is suggested to the existing correlation to account for the influence of thickness change. The modified correlation is then used to calculate fracture toughness using experimental results quoted in the literature. It is shown that the modified correlation improves the fracture toughness prediction considerably.

  13. Blinded trials taken to the test: an analysis of randomized clinical trials that report tests for the success of blinding

    DEFF Research Database (Denmark)

    Hróbjartsson, A; Forfang, E; Haahr, M T

    2007-01-01

    Blinding can reduce bias in randomized clinical trials, but blinding procedures may be unsuccessful. Our aim was to assess how often randomized clinical trials test the success of blinding, the methods involved, and how often blinding is reported as being successful.

  14. 10 CFR 26.67 - Random drug and alcohol testing of individuals who have applied for authorization.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Random drug and alcohol testing of individuals who have... PROGRAMS Granting and Maintaining Authorization § 26.67 Random drug and alcohol testing of individuals who... other entity relies on drug and alcohol tests that were conducted before the individual applied for...

  15. Proposed minimum requirements for the operational characteristics and testing of closed circuit life support system control electronics.

    Science.gov (United States)

    Kirk, J C

    1998-01-01

    The popularization and transformation of scuba diving into a broadly practiced sport has served to ignite the interest of technically oriented divers into ever more demanding areas. This, along with the gradual release of military data, equipment, and techniques of closed circuit underwater breathing apparatus, has resulted in a virtual explosion of semiclosed and closed circuit systems for divers. Although many of these systems have been carefully thought out by capable designers, the impulse to rush to market with equipment that has not been fully developed and carefully tested is irresistible to marketers. In addition, the presence of systems developed by well-intentioned and otherwise competent designers who are, nonetheless, inexperienced in the field of life support can result in the sale of failure-prone equipment to divers who lack the knowledge and skills to identify deficiencies before disaster occurs. For this reason, a set of industry standards establishing minimum requirements and testing is needed to guide the designers of this equipment, and to protect the user community from incomplete or inadequate design. Many different technologies go into the development of closed circuit scuba. One key area is the design of electronics to monitor and maintain the critical gas mixtures of the closed circuit loop. Much of the system reliability and inherent danger is resident in the design of the circuitry and the software (if any) that runs it. This article will present a set of proposed minimum requirements, with the goal of establishing a dialog for the creation of guidelines for the classification, rating, design, and testing of embedded electronics for life support systems used in closed circuit applications. These guidelines will serve as the foundation for the later creation of a set of industry specifications.

  16. Reliability, standard error, and minimum detectable change of clinical pressure pain threshold testing in people with and without acute neck pain.

    Science.gov (United States)

    Walton, David M; Macdermid, Joy C; Nielson, Warren; Teasell, Robert W; Chiasson, Marco; Brown, Lauren

    2011-09-01

    Clinical measurement. To evaluate the intrarater, interrater, and test-retest reliability of an accessible digital algometer, and to determine the minimum detectable change in normal healthy individuals and a clinical population with neck pain. Pressure pain threshold testing may be a valuable assessment and prognostic indicator for people with neck pain. To date, most of this research has been completed using algometers that are too resource intensive for routine clinical use. Novice raters (physiotherapy students or clinical physiotherapists) were trained to perform algometry testing over 2 clinically relevant sites: the angle of the upper trapezius and the belly of the tibialis anterior. A convenience sample of normal healthy individuals and a clinical sample of people with neck pain were tested by 2 different raters (all participants) and on 2 different days (healthy participants only). Intraclass correlation coefficient (ICC), standard error of measurement, and minimum detectable change were calculated. A total of 60 healthy volunteers and 40 people with neck pain were recruited. Intrarater reliability was almost perfect (ICC = 0.94-0.97), interrater reliability was substantial to near perfect (ICC = 0.79-0.90), and test-retest reliability was substantial (ICC = 0.76-0.79). Smaller change was detectable in the trapezius compared to the tibialis anterior. This study provides evidence that novice raters can perform digital algometry with adequate reliability for research and clinical use in people with and without neck pain.
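
    The reliability statistics this abstract reports are linked by standard formulas: SEM = SD·√(1−ICC) and MDC95 = 1.96·√2·SEM. A minimal sketch with illustrative numbers (the study's exact variant and data are not given here):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the pooled SD and reliability (ICC)."""
    return sd * math.sqrt(1 - icc)

def mdc95(sd, icc):
    """Minimum detectable change at 95% confidence for a test-retest design."""
    return 1.96 * math.sqrt(2) * sem(sd, icc)

# Illustrative numbers only (not the study's data): SD = 10, ICC = 0.96
change_threshold = mdc95(10.0, 0.96)  # smallest change beyond measurement noise
```

    A change smaller than `change_threshold` between two sessions cannot be distinguished from measurement error at the stated confidence level.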

  17. Testing serial dependence by Random-shuffle surrogates and the Wayland method

    Energy Technology Data Exchange (ETDEWEB)

    Hirata, Yoshito [Department of Mathematical Informatics, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Aihara Complexity Modelling Project, ERATO, JST (Japan); Institute of Industrial Science, University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)], E-mail: yoshito@sat.t.u-tokyo.ac.jp; Horai, Shunsuke [Aihara Complexity Modelling Project, ERATO, JST (Japan); Institute of Industrial Science, University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan); Suzuki, Hideyuki [Department of Mathematical Informatics, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Institute of Industrial Science, University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan); Aihara, Kazuyuki [Department of Mathematical Informatics, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Aihara Complexity Modelling Project, ERATO, JST (Japan); Institute of Industrial Science, University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2007-10-22

    Given a time series, a primary concern is the existence of serial dependence and determinism. These are often tested with random-shuffle surrogates, which totally break serial dependence, and with the Wayland method. Since the statistic of the Wayland method is fundamentally smaller for a more deterministic time series, for real-world data we usually expect the statistic for the original data to be smaller than or equal to those of the random-shuffle surrogates. However, we show here an opposite result with wind data at high time resolution. We argue that this puzzling phenomenon can be produced by observational or dynamical noise, both of which may be produced by a low-dimensional deterministic system. Thus the one-sided test is dangerous.
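
    The test logic, comparing a statistic on the original series against its distribution over random-shuffle surrogates, can be sketched as follows. Lag-1 autocorrelation stands in here for the Wayland translation error; that substitution is a simplification of this sketch, not the paper's statistic:

```python
import math
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation: a simple stand-in for a serial-dependence statistic."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

def surrogate_rank(x, stat, n_surr=99, seed=0):
    """Rank of the original statistic among random-shuffle surrogates.
    Shuffling totally breaks serial dependence while keeping the amplitude
    distribution; extreme ranks in either tail reject the null."""
    rng = random.Random(seed)
    count_below = 0
    for _ in range(n_surr):
        s = x[:]
        rng.shuffle(s)
        if stat(s) < stat(x):
            count_below += 1
    return count_below, n_surr

series = [math.sin(0.2 * n) for n in range(300)]  # strongly serially dependent
rank, n_surr = surrogate_rank(series, lag1_autocorr)
```

    For this sinusoid every shuffle destroys the dependence, so the original statistic ranks above all surrogates; the paper's point is that a one-sided version of exactly this comparison can mislead.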

  18. Testing serial dependence by Random-shuffle surrogates and the Wayland method

    International Nuclear Information System (INIS)

    Hirata, Yoshito; Horai, Shunsuke; Suzuki, Hideyuki; Aihara, Kazuyuki

    2007-01-01

    Given a time series, a primary concern is the existence of serial dependence and determinism. These are often tested with random-shuffle surrogates, which totally break serial dependence, and with the Wayland method. Since the statistic of the Wayland method is fundamentally smaller for a more deterministic time series, for real-world data we usually expect the statistic for the original data to be smaller than or equal to those of the random-shuffle surrogates. However, we show here an opposite result with wind data at high time resolution. We argue that this puzzling phenomenon can be produced by observational or dynamical noise, both of which may be produced by a low-dimensional deterministic system. Thus the one-sided test is dangerous.

  19. Test case prioritization using Cuscuta search

    Directory of Open Access Journals (Sweden)

    Mukesh Mann

    2014-12-01

    Full Text Available Most companies are under heavy time and resource constraints when it comes to testing a software system. Test prioritization techniques allow the most useful tests to be executed first, exposing faults earlier in the testing process and thus making software testing more efficient and cost-effective by covering the maximum number of faults in minimum time. But test case prioritization is not an easy and straightforward process; it requires substantial effort and time. A number of approaches are available, each with proclaimed advantages and limitations, but their applicability is subject dependent. In this paper, an artificial Cuscuta search algorithm (CSA) inspired by real Cuscuta parasitism is used to solve the time-constrained prioritization problem. We have applied CSA to prioritize test cases in an order of maximum fault coverage with minimum test suite execution and compared its effectiveness with different prioritization orderings. Taking into account the experimental results, we conclude that the average percentage of faults detected (APFD) is 82.5% using the proposed CSA ordering, which equals the APFD of the optimal and ant-colony-based orderings, whereas no ordering, random ordering, and reverse ordering yield 76.25%, 75%, and 68.75% APFD, respectively.
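
    The APFD values compared above come from the standard formula APFD = 1 − (Σ TFi)/(n·m) + 1/(2n), where TFi is the position of the first test exposing fault i. A small sketch with made-up fault data (the suite and fault matrix below are hypothetical):

```python
def apfd(order, detects):
    """Average Percentage of Faults Detected for a test ordering.
    order: test ids in execution order; detects: test id -> set of fault ids."""
    n = len(order)
    faults = set().union(*detects.values())
    m = len(faults)
    first = {}  # 1-based position at which each fault is first detected
    for pos, t in enumerate(order, start=1):
        for f in detects.get(t, ()):
            first.setdefault(f, pos)
    return 1 - sum(first[f] for f in faults) / (n * m) + 1 / (2 * n)

# Hypothetical suite: t2 alone exposes both faults, so running it first wins
detects = {"t1": {"f1"}, "t2": {"f1", "f2"}, "t3": set()}
best = apfd(["t2", "t1", "t3"], detects)
worst = apfd(["t3", "t1", "t2"], detects)
```

    A higher APFD means faults are found earlier in the run, which is exactly what prioritization orderings are scored on.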

  20. Random number generators tested on quantum Monte Carlo simulations.

    Science.gov (United States)

    Hongo, Kenta; Maezono, Ryo; Miura, Kenichi

    2010-08-01

    We have tested and compared several (pseudo) random number generators (RNGs) applied to a practical application, ground state energy calculations of molecules using variational and diffusion Monte Carlo methods. A new multiple recursive generator with 8th-order recursion (MRG8) and the Mersenne twister generator (MT19937) are tested and compared with the RANLUX generator with five luxury levels (RANLUX-[0-4]). Both MRG8 and MT19937 are shown to give the same total energy as that evaluated with RANLUX-4 (highest luxury level) within the statistical error bars, with less computational cost to generate the sequence. We also tested the notoriously flawed linear congruential generator (LCG) RANDU for comparison. (c) 2010 Wiley Periodicals, Inc.
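
    The acceptance criterion used here, agreement of estimates within statistical error bars, can be illustrated with a toy Monte Carlo integral (Python's `random` module is itself MT19937). This is a sketch of the comparison logic, not the QMC energy calculation:

```python
import math
import random

def mc_estimate(rng, n=100_000):
    """Monte Carlo estimate of E[U^2] = 1/3 for U ~ Uniform(0, 1),
    returned together with its standard error."""
    vals = [rng.random() ** 2 for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, math.sqrt(var / n)

# Two independently seeded generators play the role of two RNGs under test
m1, se1 = mc_estimate(random.Random(1))
m2, se2 = mc_estimate(random.Random(2))
consistent = abs(m1 - m2) < 4 * math.hypot(se1, se2)  # error bars overlap
```

    Two sound generators should produce estimates whose difference is small compared to the combined standard errors; a systematic discrepancy flags a defective generator.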

  1. STEADY STATE MODELING OF THE MINIMUM CRITICAL CORE OF THE TRANSIENT REACTOR TEST FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Anthony L. Alberti; Todd S. Palmer; Javier Ortensi; Mark D. DeHart

    2016-05-01

    With the advent of next generation reactor systems and new fuel designs, the U.S. Department of Energy (DOE) has identified the need for the resumption of transient testing of nuclear fuels. The DOE has decided that the Transient Reactor Test Facility (TREAT) at Idaho National Laboratory (INL) is best suited for future testing. TREAT is a thermal neutron spectrum, air-cooled, nuclear test facility that is designed to test nuclear fuels in transient scenarios. These specific scenarios range from simple temperature transients to full fuel melt accidents. DOE has expressed a desire to develop a simulation capability that will accurately model the experiments before they are irradiated at the facility. This capability is intended to emphasize effective and safe operation while minimizing experimental time and cost. The multiphysics platform MOOSE has been selected as the framework for this project. The goals for this work are to identify the fundamental neutronics properties of TREAT and to develop an accurate steady state model for future multiphysics transient simulations. In order to minimize computational cost, the effects of spatial homogenization and angular discretization are investigated. It was found that significant anisotropy is present in TREAT assemblies, and to capture this effect, explicit modeling of cooling channels and inter-element gaps is necessary. For this modeling scheme, single-element calculations at 293 K gave power distributions with a root mean square difference of 0.076% from those of reference SERPENT calculations. The minimum critical core configuration with identical gap and channel treatment at 293 K resulted in a total-core radial power distribution with a root mean square difference of 2.423% from reference SERPENT solutions.

  2. Allocation of optimal distributed generation using GA for minimum ...

    African Journals Online (AJOL)

    user

    quality of supply and reliability, in turn extending equipment maintenance intervals and ... The performance of the method is tested on a 33-bus test system and ... minimum real power losses of the system by calculating DG size at different buses.

  3. Experience with High Voltage Tests of the W7-X Magnets in Paschen-Minimum Conditions

    International Nuclear Information System (INIS)

    Petersen-Zarling, B.M.; Risse, K.; Viebke, H.; Gustke, D.; Ehmler, H.; Baldzuhn, J.; Sborchia, C.; Scheller, H.

    2006-01-01

    The W7-X machine is a low-shear stellarator of the Wendelstein line, which is being assembled at the IPP Branch Institute of Greifswald, Germany. The machine features a superconducting magnet system with 50 non-planar and 20 planar magnets operated at about 6 T and discharged with peak voltage levels up to 6 kV. Following the factory tests, the magnets are delivered to CEA Saclay, France, for the final acceptance tests at cryogenic condition. A series of high voltage tests in air and vacuum are part of the final acceptance test. During these tests the quality of the insulation, especially the hand-wrapped ground insulation in the termination area, has proven not to be adequate. In order to improve the reliability of the insulation system and detect defects for early repair, high voltage tests in reduced pressure of air (Paschen-minimum conditions) have been added as part of the factory acceptance procedure. This has been implemented in the vacuum chambers of BNN/Ansaldo for the test of the 50 non-planar coils, while other tests have been carried out at CEA/Saclay after cold testing. IPP has also installed a vacuum tank to perform Paschen tests during the preparation of all the coils for assembly, including also the 20 planar coils which cannot be tested at the manufacturer Tesla. These tests have proven to be a powerful tool to detect hidden insulation defects and void/cavities in the primary impregnation system, which could not be detected otherwise with the standard high voltage tests. This paper will summarize the background and experience accumulated in about 2 years of Paschen tests on the W7-X coils, including a description of the equipment, main results and statistics, weak points detected and repaired on the coils, and possibilities of improvements in the development and production of the W7-X magnets. The importance and the need of Paschen tests as part of the acceptance procedure for superconducting magnets to be used in future projects will also be
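
    The "Paschen-minimum" condition these acceptance tests target is the minimum of the breakdown-voltage curve V(pd). A sketch using textbook air coefficients (the values of A, B, and γ below are generic textbook numbers, not W7-X parameters):

```python
import math

def paschen_voltage(pd, A=15.0, B=365.0, gamma=0.01):
    """Paschen breakdown voltage in volts for air, with pd in Torr*cm.
    A (1/(Torr*cm)) and B (V/(Torr*cm)) are textbook air coefficients;
    gamma is the secondary-electron emission coefficient.
    Only valid to the right of the curve's singularity."""
    return B * pd / (math.log(A * pd) - math.log(math.log(1 + 1 / gamma)))

# Scan pd above the singularity to locate the Paschen minimum
pds = [0.35 + 0.01 * i for i in range(300)]
v_min, pd_min = min((paschen_voltage(pd), pd) for pd in pds)
```

    The minimum sits at a few hundred volts, far below the 6 kV discharge levels quoted above, which is why testing at reduced pressure exposes insulation defects that survive standard high-voltage tests in air.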

  4. Self-Reported Drug and Alcohol Use and Attitudes toward Drug Testing in High Schools with Random Student Drug Testing

    Science.gov (United States)

    DuPont, Robert L.; Campbell, Michael D.; Campbell, Teresa G.; Shea, Corinne L.; DuPont, Helen S.

    2013-01-01

    Many schools implement random student drug testing (RSDT) programs as a drug prevention strategy. This study analyzes self-report surveys of students in eight secondary schools with well-established RSDT programs, comparing students who understood they were subject to testing and students who understood they were not subject to testing. Students…

  5. Offering self-administered oral HIV testing to truck drivers in Kenya to increase testing: a randomized controlled trial.

    Science.gov (United States)

    Kelvin, Elizabeth A; George, Gavin; Mwai, Eva; Nyaga, Eston; Mantell, Joanne E; Romo, Matthew L; Odhiambo, Jacob O; Starbuck, Lila; Govender, Kaymarlin

    2018-01-01

    We conducted a randomized controlled trial among 305 truck drivers from two North Star Alliance roadside wellness clinics in Kenya to see if offering HIV testing choices would increase HIV testing uptake. Participants were randomized to be offered (1) a provider-administered rapid blood (finger-prick) HIV test (i.e., standard of care [SOC]) or (2) a Choice between SOC or a self-administered oral rapid HIV test with provider supervision in the clinic. Participants in the Choice arm who refused HIV testing in the clinic were offered a test kit for home use with phone-based posttest counseling. We compared HIV test uptake using the Mantel Haenszel odds ratio (OR) adjusting for clinic. Those in the Choice arm had higher odds of HIV test uptake than those in the SOC arm (OR = 1.5), but the difference was not statistically significant (p = 0.189). When adding the option to take an HIV test kit for home use, the Choice arm had significantly greater odds of testing uptake (OR = 2.8, p = 0.002). Of those in the Choice arm who tested, 26.9% selected the SOC test, 64.6% chose supervised self-testing in the clinic, and 8.5% took a test kit for home use. Participants varied in the HIV test they selected when given choices. Importantly, when participants who refused HIV testing in the clinic were offered a test kit for home use, an additional 8.5% tested. Offering truck drivers a variety of HIV testing choices may increase HIV testing uptake in this key population.
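
    The Mantel-Haenszel odds ratio used above to adjust for clinic pools 2×2 tables across strata. A sketch with illustrative counts (the numbers below are hypothetical, not the trial's data):

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel common odds ratio across strata.
    Each table is (a, b, c, d): a = exposed with event, b = exposed without,
    c = unexposed with event, d = unexposed without."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# One hypothetical 2x2 table per clinic: (Choice tested, Choice refused,
# SOC tested, SOC refused)
clinic_tables = [(45, 30, 35, 40), (40, 35, 30, 45)]
or_mh = mantel_haenszel_or(clinic_tables)
```

    Pooling within strata keeps a clinic-level difference in baseline testing rates from confounding the arm comparison.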

  6. Statistical auditing and randomness test of lotto k/N-type games

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Rapallo, F.; Scalas, E.

    2008-11-01

    One of the most popular lottery games worldwide is the so-called “lotto k/N”. It considers N numbers 1,2,…,N from which k are drawn randomly, without replacement. A player selects k or more numbers and the first prize is shared amongst those players whose selected numbers match all of the k randomly drawn. Exact rules may vary in different countries. In this paper, mean values and covariances for the random variables representing the numbers drawn from this kind of game are presented, with the aim of using them to audit statistically the consistency of a given sample of historical results with theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
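
    One concrete audit in this spirit is checking observed number frequencies against the k/N expectation implied by the hypergeometric model. A minimal sketch on simulated lotto 6/49 draws (a simplified frequency check, not the paper's full mean/covariance method):

```python
import random
from collections import Counter

def frequency_chi2(draws, N, k):
    """Chi-square statistic of per-number frequencies over many lotto k/N draws.
    Under a fair draw each number appears with probability k/N per draw."""
    counts = Counter(n for draw in draws for n in draw)
    expected = len(draws) * k / N
    return sum((counts.get(i, 0) - expected) ** 2 / expected
               for i in range(1, N + 1))

rng = random.Random(0)
draws = [rng.sample(range(1, 50), 6) for _ in range(2000)]  # fair lotto 6/49
chi2 = frequency_chi2(draws, 49, 6)
# Counts are negatively correlated within a draw, so for a fair game the
# statistic sits around N - k; a biased drawing mechanism inflates it.
```

    The same statistic applied to a lottery's historical results, or to a pseudorandom number generator's output, flags deviations from the model.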

  7. A test for stationarity of spatio-temporal random fields on planar and spherical domains

    KAUST Repository

    Jun, Mikyoung; Genton, Marc G.

    2012-01-01

    A formal test for weak stationarity of spatial and spatio-temporal random fields is proposed. We consider the cases where the spatial domain is planar or spherical, and we do not require distributional assumptions for the random fields. The method

  8. Random Gap Detection Test (RGDT) performance of individuals with central auditory processing disorders from 5 to 25 years of age.

    Science.gov (United States)

    Dias, Karin Ziliotto; Jutras, Benoît; Acrani, Isabela Olszanski; Pereira, Liliane Desgualdo

    2012-02-01

    The aim of the present study was to assess the auditory temporal resolution ability in individuals with central auditory processing disorders, to examine the maturation effect and to investigate the relationship between the performance on a temporal resolution test and the performance on other central auditory tests. Participants were divided into two groups: 131 with Central Auditory Processing Disorder and 94 with normal auditory processing. They had pure-tone air-conduction thresholds no poorer than 15 dB HL bilaterally, normal admittance measures and presence of acoustic reflexes. Also, they were assessed with a central auditory test battery. Participants who failed one or more tests were included in the Central Auditory Processing Disorder group and those in the control group obtained normal performance on all tests. Following the auditory processing assessment, the Random Gap Detection Test was administered to the participants. A three-way ANOVA was performed. Correlation analyses were also done between the four Random Gap Detection Test subtests data as well as between Random Gap Detection Test data and the other auditory processing test results. There was a significant difference between the age-group performances in children with and without Central Auditory Processing Disorder. Also, 48% of children with Central Auditory Processing Disorder failed the Random Gap Detection Test and the percentage decreased as a function of age. The highest percentage (86%) was found in the 5-6 year-old children. Furthermore, results revealed a strong significant correlation between the four Random Gap Detection Test subtests. There was a modest correlation between the Random Gap Detection Test results and the dichotic listening tests. No significant correlation was observed between the Random Gap Detection Test data and the results of the other tests in the battery. Random Gap Detection Test should not be administered to children younger than 7 years old because

  9. Adoption of projected mortality table for the Slovenian market using the Poisson log-bilinear model to test the minimum standard for valuing life annuities

    Directory of Open Access Journals (Sweden)

    Darko Medved

    2015-01-01

    Full Text Available With the introduction of Solvency II, a consistent market approach to the valuation of insurance assets and liabilities is required. For the best estimate of life annuity provisions, one should estimate the longevity risk of the insured population in Slovenia. In this paper, the current minimum standard in Slovenia for calculating pension annuities is tested using the Lee-Carter model. In particular, the mortality of the Slovenian population is projected using the best-fitting stochastic mortality projection method. The projected mortality statistics are then corrected with the selection effect and compared with the current minimum standard.
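
    The Lee-Carter structure underlying such projections is log m(x,t) = a_x + b_x·k_t. It can be fitted with the common first-order approximation (row means, residual column sums, least squares), sketched below on exact synthetic data rather than Slovenian mortality tables:

```python
def lee_carter_fit(log_m):
    """First-order Lee-Carter fit of log m(x,t) = a_x + b_x * k_t.
    log_m: one row per age x, one column per year t. Uses the common
    approximation rather than the full SVD: a_x = row mean,
    k_t = column sum of residuals, b_x by least squares on k."""
    X, T = len(log_m), len(log_m[0])
    a = [sum(row) / T for row in log_m]
    resid = [[log_m[x][t] - a[x] for t in range(T)] for x in range(X)]
    k = [sum(resid[x][t] for x in range(X)) for t in range(T)]
    kk = sum(kt * kt for kt in k)
    b = [sum(resid[x][t] * k[t] for t in range(T)) / kk for x in range(X)]
    return a, b, k

# Synthetic surface already of Lee-Carter form (sum b = 1, sum k = 0),
# so the fit recovers the parameters exactly
a0, b0, k0 = [-3.0, -2.0], [0.4, 0.6], [1.0, 0.0, -1.0]
log_m = [[a0[x] + b0[x] * k0[t] for t in range(3)] for x in range(2)]
a, b, k = lee_carter_fit(log_m)
```

    In practice the fitted k_t series is then extrapolated (e.g. as a random walk with drift, or via the Poisson log-bilinear variant named in the title) to project future mortality.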

  10. Impact of HIPAA's minimum necessary standard on genomic data sharing.

    Science.gov (United States)

    Evans, Barbara J; Jarvik, Gail P

    2018-04-01

    This article provides a brief introduction to the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule's minimum necessary standard, which applies to sharing of genomic data, particularly clinical data, following 2013 Privacy Rule revisions. This research used the Thomson Reuters Westlaw database and law library resources in its legal analysis of the HIPAA privacy tiers and the impact of the minimum necessary standard on genomic data sharing. We considered relevant example cases of genomic data-sharing needs. In a climate of stepped-up HIPAA enforcement, this standard is of concern to laboratories that generate, use, and share genomic information. How data-sharing activities are characterized (whether for research, public health, or clinical interpretation and medical practice support) affects how the minimum necessary standard applies and its overall impact on data access and use. There is no clear regulatory guidance on how to apply HIPAA's minimum necessary standard when considering the sharing of information in the data-rich environment of genomic testing. Laboratories that perform genomic testing should engage with policy makers to foster sound, well-informed policies and appropriate characterization of data-sharing activities to minimize adverse impacts on day-to-day workflows.

  11. Do Minimum Wages Fight Poverty?

    OpenAIRE

    David Neumark; William Wascher

    1997-01-01

    The primary goal of a national minimum wage floor is to raise the incomes of poor or near-poor families with members in the work force. However, estimates of the employment effects of minimum wages tell us little about whether minimum wages can achieve this goal; even if the disemployment effects of minimum wages are modest, minimum wage increases could result in net income losses for poor families. We present evidence on the effects of minimum wages on family incomes from matched March CPS s...

  12. Research on the Random Shock Vibration Test Based on the Filter-X LMS Adaptive Inverse Control Algorithm

    Directory of Open Access Journals (Sweden)

    Wang Wei

    2016-01-01

    Full Text Available The related theory and algorithms of adaptive inverse control are presented, and the research shows that an adaptive inverse control strategy can effectively eliminate the influence of noise on system control. A frequency-domain filter-X LMS adaptive inverse control algorithm is proposed and applied to the random shock vibration control process of a two-exciter hydraulic vibration test system, and the realization of the adaptive inverse control strategy in random shock vibration testing is summarized. Closed-loop and field tests show that the frequency-domain filter-X LMS adaptive inverse control algorithm achieves high-precision control of random shock vibration tests.
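
    The filtered-x (filter-X) LMS update at the heart of this algorithm can be sketched in the time domain; the paper works in the frequency domain, and the signals and secondary path below are synthetic stand-ins:

```python
import math

def fxlms(x, d, s, L=8, mu=0.05):
    """Time-domain filtered-x LMS sketch.
    x: reference signal, d: disturbance at the error sensor,
    s: secondary-path impulse response (assumed known here).
    Returns the error history."""
    w = [0.0] * L            # adaptive controller taps
    xbuf = [0.0] * L         # reference history x(n), x(n-1), ...
    ybuf = [0.0] * len(s)    # controller-output history
    fbuf = [0.0] * L         # filtered-reference history
    errs = []
    for n in range(len(x)):
        xbuf = [x[n]] + xbuf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, xbuf))        # controller output
        ybuf = [y] + ybuf[:-1]
        cancel = sum(si * yi for si, yi in zip(s, ybuf))   # y through secondary path
        e = d[n] - cancel                                  # residual at error sensor
        fx = sum(si * xbuf[j] for j, si in enumerate(s))   # reference filtered by s
        fbuf = [fx] + fbuf[:-1]
        w = [wi + mu * e * fi for wi, fi in zip(w, fbuf)]  # LMS update
        errs.append(e)
    return errs

# Synthetic single-tone test: the disturbance is the reference, delayed,
# halved, and passed through the secondary path s
s = [0.5, 0.3]
x = [math.sin(0.3 * n) for n in range(4000)]
target = [0.0] + [0.5 * v for v in x[:-1]]
d = [s[0] * target[n] + s[1] * (target[n - 1] if n else 0.0)
     for n in range(len(x))]
errs = fxlms(x, d, s)
```

    Filtering the reference through the secondary-path model before the weight update is what distinguishes filtered-x LMS from plain LMS; without it, the phase lag of the secondary path can destabilize the adaptation.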

  13. Random non-proportional fatigue tests with planar tri-axial fatigue testing machine

    Directory of Open Access Journals (Sweden)

    T. Inoue

    2016-10-01

    Full Text Available Complex stresses, which occur on the mechanical surfaces of transport machinery in service, bring a drastic degradation in fatigue life. However, it is hard to reproduce such complex stress states for evaluating fatigue life with conventional multiaxial fatigue machines. We have developed a fatigue testing machine that enables reproduction of such complex stresses. The testing machine can reproduce arbitrary in-plane stress states by applying three independent loads to the test specimen using actuators which apply loads in the 0, 45, and 90 degree directions. The reproduction was tested with complex stress data obtained from the actual operation of transport machinery. As a result, it was found that the reproduced stress corresponded to the measured stress with an error of less than 10%. We then compared fatigue lives measured under random non-proportional loading conditions with predicted fatigue lives. Fatigue lives predicted from the stress on the critical plane differed from the measured fatigue lives by more than a factor of 10. On the other hand, fatigue lives predicted from a stress measure that accounts for the non-proportional loading level, evaluated using the amplitude and direction of the principal stress, were within a factor of 3 of the measured fatigue lives.

  14. Towards Finding the Global Minimum of the D-Wave Objective Function for Improved Neural Network Regressions

    Science.gov (United States)

    Dorband, J. E.

    2017-12-01

    The D-Wave 2X has successfully been used for regression analysis to derive carbon flux data from OCO-2 CO2 concentration using neural networks. The samples returned from the D-Wave should represent the minimum of an objective function presented to it. As accurate a minimum function value as possible is needed for this analysis. Samples from the D-Wave are near the minimum, but are seldom the global minimum of the function, due to quantum noise. Two methods for improving the accuracy of the minimized values represented by the samples returned from the D-Wave are presented. The first method finds a new sample with a minimum value near each returned D-Wave sample. The second method uses all the returned samples to find a more global minimum sample. We present three use cases performed using the former method. In the first use case, it is demonstrated that an objective function with random qubit and coupler coefficients had an improved minimum. In the second use case, the samples corrected by the first method can improve the training of a Boltzmann machine neural network. The third use case demonstrated that using the first method can improve virtual qubit accuracy. The latter method was also applied to the first use case.
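
    "Finding a minimum near each returned sample" can be read as classical post-processing of noisy samples on the QUBO objective; greedy single-bit-flip descent is one plausible reading (the paper's exact method may differ), sketched here:

```python
import random

def qubo_energy(x, Q):
    """Energy of binary vector x under QUBO coefficients Q[(i, j)]."""
    return sum(q * x[i] * x[j] for (i, j), q in Q.items())

def local_descent(x, Q):
    """Greedy single-bit-flip descent toward a nearby local minimum.
    A plausible reading of 'finding a new sample with a minimum value
    near each returned sample'; the paper's exact method may differ."""
    x, e = list(x), qubo_energy(x, Q)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            x[i] ^= 1
            e_new = qubo_energy(x, Q)
            if e_new < e:
                e, improved = e_new, True
            else:
                x[i] ^= 1  # revert: the flip did not help
    return x, e

rng = random.Random(0)
n = 6
Q = {(i, j): rng.uniform(-1.0, 1.0) for i in range(n) for j in range(i, n)}
noisy = [rng.randint(0, 1) for _ in range(n)]  # stands in for a D-Wave sample
fixed, e_fixed = local_descent(noisy, Q)
```

    The result is guaranteed to be no worse than the starting sample and to be a 1-flip local minimum, which matches the goal of pushing noisy samples closer to the true objective minimum.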

  15. A test to measure the minimum burning pressure of water-based commercial explosives and their precursors

    Energy Technology Data Exchange (ETDEWEB)

    Turcotte, R.; Feng, H.; Badeen, C.M.; Goldthorp, S.; Johnson, C. [Natural Resources Canada, Ottawa, ON (Canada). Canadian Explosives Research Laboratory; Chan, S.K. [Orica Canada Inc., Brownsburg-Chatham, PQ (Canada)

    2009-05-15

    This paper described a testing protocol developed to measure the minimum burning pressure (MBP) of ammonium nitrate water-based emulsions (AWEs). Oxidizer solutions were prepared in a stainless steel beaker. A modified commercial mixer was used to emulsify the oil-surfactant phase with the oxidizer solutions and blend dry ingredients. Five high water content AWEs were then prepared and placed in pressurized vessels. Samples were ignited using a straight length of nichrome wire. Emulsion samples were transferred into a cylindrical test cell painted with non-conductive paint. Copper conductor leg-wires were connected to electrodes passing through the body of the vessel. When samples were equilibrated to the desired initial pressure, a constant current was supplied to the hot wire. Solid state relays were used to switch the current power supply on and off. Hot wire voltage signals were used to obtain temperature profiles for onset and ignition temperatures. The procedure to perform the MBP measurements was based on 3 types of classifying events, namely (1) no reaction, (2) partial reaction, and (3) slow decomposition. Results of the tests demonstrated that the 5 emulsions exhibited large differences in respective MBP values. Data from the study will be used to develop standards for the authorization of high explosives in Canada. 15 refs., 1 tab., 3 figs.

  16. Randomization tests

    National Research Council Canada - National Science Library

    Edgington, Eugene S

    1980-01-01

    .... This book provides all the necessary theory and practical guidelines, such as instructions for writing computer programs, to permit experimenters to transform any statistical test into a distribution-free test...

  17. An Experimental study on a Method of Computing Minimum flow rate

    International Nuclear Information System (INIS)

    Cho, Yeon Sik; Kim, Tae Hyun; Kim, Chang Hyun

    2009-01-01

    Many pump reliability problems in Nuclear Power Plants (NPPs) are attributed to operating the pump at flow rates well below its best efficiency point (BEP). Generally, the manufacturer and the user try to avert such problems by specifying a minimum flow below which the pump should not be operated. Pump minimum flow usually involves two considerations. The first is normally termed the 'thermal minimum flow', the flow required to prevent the fluid inside the pump from reaching saturation conditions. The other is often referred to as the 'mechanical minimum flow', the flow required to prevent mechanical damage. However, the criteria for specifying such a minimum flow are not clearly understood by all parties concerned, and the various factors and information needed to compute minimum flow are not easily available, as they are considered pump manufacturers' proprietary information. The objective of this study is to obtain experimental data for computing minimum flow rate and to understand pump performance during low-flow operation. A test loop consisting of a pump to be used in NPPs, a water tank, flow rate measurements, and a piping system with flow control devices was established for this study.
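
    The thermal minimum flow consideration reduces to a heat balance: at low flow, nearly all of the pump's absorbed power is dissipated into the fluid, so the flow must carry that heat away without the fluid reaching saturation. A back-of-envelope sketch (the parameter values are illustrative, not from this test loop):

```python
def thermal_minimum_flow(power_kw, cp_kj_per_kg_k, dT_allow_k):
    """Mass flow (kg/s) needed so that the absorbed pump power raises the
    fluid temperature by no more than dT_allow_k (the margin to saturation).
    A rough heat-balance estimate, not the paper's measured method."""
    return power_kw / (cp_kj_per_kg_k * dT_allow_k)

# 200 kW absorbed, water cp = 4.18 kJ/(kg*K), 10 K margin to saturation
m_dot = thermal_minimum_flow(200.0, 4.18, 10.0)
```

    The mechanical minimum flow has no comparably simple formula, which is one reason experimental data like this study's are needed.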

  18. A test for the minimum scale of grooving on the Amatrice and Norcia earthquakes

    Science.gov (United States)

    Okamoto, K.; Brodsky, E. E.; Billi, A.

    2017-12-01

    As stress builds up along a fault, elastic strain energy accumulates until it can no longer be accommodated by small-scale ductile deformation, and the fault fails brittlely. This brittle failure is associated with the grooving process that produces slickensides along fault planes. Therefore, the scale at which slickensides disappear could be geological evidence of earthquake nucleation. Past studies found the minimum scale of grooving; however, the fault surfaces studied were not exposed by recent earthquakes, so those measurements could have been a product of chemical or mechanical weathering. On August 24th and October 30th of 2016, MW 6.0 and 6.5 earthquakes shook central Italy. The earthquakes caused decimeter- to meter-scale fault scarps along the Mt. Vettoretto Fault. Here, we analyze samples of a scarp using white light interferometry in order to determine whether the minimum scale of grooving is present. Results suggest that grooving begins around 100 μm for these samples, which is consistent with previous findings on faults without any direct evidence of earthquakes. The measurement is also consistent with typical values of the frictional weakening distance Dc, which is likewise associated with a transition between ductile and brittle behavior. The measurements show that the minimum scale of grooving is a useful measure of the behavior of faults.

  19. Laboratory test on maximum and minimum void ratio of tropical sand matrix soils

    Science.gov (United States)

    Othman, B. A.; Marto, A.

    2018-04-01

    Sand is generally known as a loose granular material with a grain size finer than gravel and coarser than silt, and its particles can be very angular to well rounded in shape. The presence of various amounts of fines, which also influences the loosest and densest states of sand in natural conditions, is well known to contribute to the deformation and loss of shear strength of soil. This paper presents the effect of a range of fines contents on the minimum void ratio e min and maximum void ratio e max of sand matrix soils. Laboratory tests to determine e min and e max of sand matrix soils were conducted using a non-standard method introduced by previous researchers. Clean sand was obtained from a natural mining site in Johor, Malaysia. A set of 3 different sizes of sand (fine sand, medium sand, and coarse sand) was mixed with 0% to 40% by weight of low-plasticity fines (kaolin). Results showed that e min and e max generally decreased with increasing fines content down to a minimum somewhere between 0% and 30% fines, and then increased thereafter.
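
    The void ratios themselves follow from the standard phase relation e = Gs·ρw/ρd − 1, evaluated at the loosest-state and densest-state dry densities. A sketch with illustrative numbers (not this paper's measurements or its non-standard test method):

```python
def void_ratio(Gs, rho_d, rho_w=1000.0):
    """Void ratio from specific gravity Gs and dry density rho_d (kg/m^3)."""
    return Gs * rho_w / rho_d - 1.0

Gs = 2.65                       # typical value for quartz sand
e_max = void_ratio(Gs, 1400.0)  # loosest state (gently poured)
e_min = void_ratio(Gs, 1750.0)  # densest state (vibrated/tamped)
relative_density_span = e_max - e_min
```

    The span e_max − e_min is the denominator of relative density, which is why shifts in e_min and e_max with fines content matter for interpreting field densities.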

  20. Can households earning minimum wage in Nova Scotia afford a nutritious diet?

    Science.gov (United States)

    Williams, Patricia L; Johnson, Christine P; Kratzmann, Meredith L V; Johnson, C Shanthi Jacob; Anderson, Barbara J; Chenhall, Cathy

    2006-01-01

    To assess the affordability of a nutritious diet for households earning minimum wage in Nova Scotia. Food costing data were collected in 43 randomly selected grocery stores throughout NS in 2002 using the National Nutritious Food Basket (NNFB). To estimate the affordability of a nutritious diet for households earning minimum wage, average monthly costs for essential expenses were subtracted from overall income to see if enough money remained for the cost of the NNFB. This was calculated for three types of household: 1) two parents and two children; 2) lone parent and two children; and 3) single male. Calculations were also made for the proposed 2006 minimum wage increase with expenses adjusted using the Consumer Price Index (CPI). The monthly cost of the NNFB priced in 2002 for the three types of household was $572.90, $351.68, and $198.73, respectively. Put into the context of basic living, these data showed that Nova Scotians relying on minimum wage could not afford to purchase a nutritious diet and meet their basic needs, placing their health at risk. These basic expenses do not include other routine costs, such as personal hygiene products, household and laundry cleaners, and prescriptions and costs associated with physical activity, education or savings for unexpected expenses. People working at minimum wage in Nova Scotia have not had adequate income to meet basic needs, including a nutritious diet. The 2006 increase in minimum wage to $7.15/hr is inadequate to ensure that Nova Scotians working at minimum wage are able to meet these basic needs. Wage increases and supplements, along with supports for expenses such as childcare and transportation, are indicated to address this public health problem.

  1. Ba 5s photoionization in the region of the second Cooper minimum

    International Nuclear Information System (INIS)

    Whitfield, S B; Wehlitz, R; Dolmatov, V K

    2011-01-01

    We investigate the 5s angular distribution parameter and partial photoionization cross section of atomic Ba in the region of the second Cooper minimum covering a photon energy region from 120 to 260 eV. We observe a strong drop in the Ba 5s β value from 2.0, reaching a minimum of 1.57 ± 0.07 at a photon energy of 150 eV. The β value then slowly rises back towards its nominal value of 2.0 at photon energies beyond the minimum. Our measured 5s partial cross section also shows a pronounced dip around 170 eV due to interchannel coupling with the Ba 4d photoelectrons. After combining our measurements with previous experimental values at lower photon energies, we obtain a consistent data set spanning the photon energy range prior to the onset of the partial cross section maximum and through the cross section minimum. We also calculate the 5s partial cross section under several different levels of approximation. We find that the generalized random-phase approximation with exchange calculation models the shape and position of the combined experimental cross section data set rather well after incorporating experimental ionization energies and a shift in the photon energy scale.

  2. Rising above the Minimum Wage.

    Science.gov (United States)

    Even, William; Macpherson, David

    An in-depth analysis was made of how quickly most people move up the wage scale from minimum wage, what factors influence their progress, and how minimum wage increases affect wage growth above the minimum. Very few workers remain at the minimum wage over the long run, according to this study of data drawn from the 1977-78 May Current Population…

  3. USING GENETIC ALGORITHM TO SOLVE STEINER MINIMUM SPANNING TREE PROBLEM

    Directory of Open Access Journals (Sweden)

    Öznur İŞÇİ

    2006-03-01

    Full Text Available Genetic algorithms (GA) are stochastic search methods that produce solutions at or near the optimum. In addition to GA's successful application to many combinatorial optimization problems, such as the traveling salesman problem, quadratic assignment, allocation, job-shop scheduling, preparation of lesson/examination timetables, planning of communication networks, assembly-line balancing, and minimum spanning trees, it is well suited to comparative studies in optimization. In this study a Java program is developed to solve the Steiner minimum spanning tree problem by genetic algorithm and its performance is examined. In tests on problems previously given in the literature, the GA approach recommended in this study obtains results that are close to the optimum. For the predetermined points in the study, length and gain are calculated for both the Steiner minimum spanning tree problem and the minimum spanning tree problem.
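
    The fitness evaluation at the heart of such a GA is a minimum spanning tree computed over the terminals plus a candidate set of Steiner points. As a minimal stdlib-only sketch (this is an illustration of the fitness idea, not the paper's Java implementation, and the point sets are invented), Prim's algorithm gives the tree length the GA would seek to minimize:

```python
import math

def mst_length(points):
    """Prim's algorithm: total edge length of the Euclidean minimum
    spanning tree over a list of (x, y) points."""
    n = len(points)
    if n < 2:
        return 0.0
    in_tree = [False] * n
    dist = [math.inf] * n
    dist[0] = 0.0
    total = 0.0
    for _ in range(n):
        # Pick the closest point not yet in the tree and attach it.
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        total += dist[u]
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v] = d
    return total

# Unit-square terminals: the MST has length 3.0; adding the centre as a
# candidate Steiner point shortens the tree to 4 * sqrt(0.5) ≈ 2.828.
terminals = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(mst_length(terminals))                  # 3.0
print(mst_length(terminals + [(0.5, 0.5)]))   # ~2.828
```

    For the unit square the true Steiner minimal tree, with two Steiner points, is shorter still (length 1 + sqrt(3) ≈ 2.732), which is exactly the kind of gain the GA searches for.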

  4. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners also find in the book a detailed presentation of practical data classifiers using MEE. These include multi‐layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using a MEE‐like concept is also presented. Examples, tests, evaluation experiments and comparison with similar machines using classic approaches complement the descriptions.

  5. [Intel random number generator-based true random number generator].

    Science.gov (United States)

    Huang, Feng; Shen, Hong

    2004-09-01

    To establish a true random number generator on the basis of certain Intel chips. The random numbers were acquired by programming using Microsoft Visual C++ 6.0 via register reading from the random number generator (RNG) unit of an Intel 815 chipset-based computer with the Intel Security Driver (ISD). We tested the generator with 500 random numbers using the NIST FIPS 140-1 and X(2) R-squared tests, and the results showed that the random numbers it generated satisfied the requirements of independence and uniform distribution. We also compared the random numbers generated by the Intel RNG-based true random number generator with those from a random number table statistically, using the same amount of 7500 random numbers in the same value domain, which showed that the SD, SE and CV of the Intel RNG-based random number generator were less than those of the random number table. A u test of the two CVs revealed no significant difference between the two methods. The Intel RNG-based random number generator can produce high-quality random numbers with good independence and uniform distribution, and solves some problems with random number tables in the acquisition of random numbers.
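
    A uniformity check of the kind used in such test batteries can be sketched with a chi-squared goodness-of-fit statistic. This stdlib-only example illustrates the test idea only; it is not the authors' Visual C++ code, and the sample source is Python's own generator:

```python
import random

def chi_squared_uniformity(samples, bins=10):
    """Chi-squared goodness-of-fit statistic for uniformity of
    samples drawn from [0, 1): sum over bins of (O - E)^2 / E."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(42)
stat = chi_squared_uniformity([random.random() for _ in range(5000)])
print(stat)
```

    For 10 bins the statistic has 9 degrees of freedom, so values below the 5% critical value of 16.92 are consistent with uniformity.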

  6. Maximum Kolmogorov-Sinai Entropy Versus Minimum Mixing Time in Markov Chains

    Science.gov (United States)

    Mihelich, M.; Dubrulle, B.; Paillard, D.; Kral, Q.; Faranda, D.

    2018-01-01

    We establish a link between the maximization of Kolmogorov-Sinai entropy (KSE) and the minimization of the mixing time for general Markov chains. Since the maximization of KSE is analytical and in general easier to compute than the mixing time, this link provides a new, faster method to approximate the minimum-mixing-time dynamics. This could be of interest in computer science and statistical physics, for computations that use random walks on graphs that can be represented as Markov chains.
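
    For a finite Markov chain with transition matrix P and stationary distribution pi, the KSE is h = -sum_i pi_i sum_j P_ij log P_ij. A stdlib-only sketch (the power-iteration routine and the example chains are illustrative, not from the paper):

```python
import math

def stationary(P, iters=10_000):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def ks_entropy(P):
    """Kolmogorov-Sinai entropy of an ergodic Markov chain:
    h = -sum_i pi_i * sum_j P_ij * log(P_ij)."""
    pi = stationary(P)
    return -sum(pi[i] * sum(p * math.log(p) for p in row if p > 0)
                for i, row in enumerate(P))

# A symmetric two-state chain with switching probability 0.5 maximizes
# the entropy at ln 2 and also mixes in a single step.
P = [[0.5, 0.5], [0.5, 0.5]]
print(ks_entropy(P))  # ≈ 0.6931 = ln 2
```

    By contrast the slower-mixing chain [[0.9, 0.1], [0.1, 0.9]] has KSE ≈ 0.325 < ln 2, illustrating the link between higher entropy and faster mixing.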

  7. Employment effects of minimum wages

    OpenAIRE

    Neumark, David

    2014-01-01

    The potential benefits of higher minimum wages come from the higher wages for affected workers, some of whom are in low-income families. The potential downside is that a higher minimum wage may discourage employers from using the low-wage, low-skill workers that minimum wages are intended to help. Research findings are not unanimous, but evidence from many countries suggests that minimum wages reduce the jobs available to low-skill workers.

  8. Expedite random structure searching using objects from Wyckoff positions

    Science.gov (United States)

    Wang, Shu-Wei; Hsing, Cheng-Rong; Wei, Ching-Ming

    2018-02-01

    Random structure searching has proved to be a powerful approach for finding the global minimum and metastable structures. A truly random sampling is in principle needed, yet it would be highly time-consuming and/or practically impossible to find the global minimum of a complicated system in its high-dimensional configuration space. Thus the implementation of reasonable constraints, such as adopting system symmetries to reduce the independent dimensions of the structural space and/or imposing chemical information to reach and relax into low-energy regions, is the most essential issue in the approach. In this paper, we propose the concept of an "object", which is either an atom or a set of atoms (such as a molecule or a carbonate group) carrying a symmetry defined by one of the Wyckoff positions of the space group; this allows the search for the global minimum of a complicated system to be confined to a greatly reduced structural space and become accessible in practice. We examined several representative materials, including the Cd3As2 crystal, solid methanol, high-pressure carbonates (FeCO3), and the Si(111)-7 × 7 reconstructed surface, to demonstrate the power and the advantages of using the "object" concept in random structure searching.

  9. Crowdsourcing HIV Test Promotion Videos: A Noninferiority Randomized Controlled Trial in China.

    Science.gov (United States)

    Tang, Weiming; Han, Larry; Best, John; Zhang, Ye; Mollan, Katie; Kim, Julie; Liu, Fengying; Hudgens, Michael; Bayus, Barry; Terris-Prestholt, Fern; Galler, Sam; Yang, Ligang; Peeling, Rosanna; Volberding, Paul; Ma, Baoli; Xu, Huifang; Yang, Bin; Huang, Shujie; Fenton, Kevin; Wei, Chongyi; Tucker, Joseph D

    2016-06-01

    Crowdsourcing, the process of shifting individual tasks to a large group, may enhance human immunodeficiency virus (HIV) testing interventions. We conducted a noninferiority, randomized controlled trial to compare first-time HIV testing rates among men who have sex with men (MSM) and transgender individuals who received a crowdsourced or a health marketing HIV test promotion video. Seven hundred twenty-one MSM and transgender participants (≥16 years old, never before tested for HIV) were recruited through 3 Chinese MSM Web portals and randomly assigned to 1 of 2 videos. The crowdsourced video was developed using an open contest and formal transparent judging while the evidence-based health marketing video was designed by experts. Study objectives were to measure HIV test uptake within 3 weeks of watching either HIV test promotion video and cost per new HIV test and diagnosis. Overall, 624 of 721 (87%) participants from 31 provinces in 217 Chinese cities completed the study. HIV test uptake was similar between the crowdsourced arm (37% [114/307]) and the health marketing arm (35% [111/317]). The estimated difference between the interventions was 2.1% (95% confidence interval, -5.4% to 9.7%). Among those tested, 31% (69/225) reported a new HIV diagnosis. The crowdsourced intervention cost substantially less than the health marketing intervention per first-time HIV test (US$131 vs US$238 per person) and per new HIV diagnosis (US$415 vs US$799 per person). Our nationwide study demonstrates that crowdsourcing may be an effective tool for improving HIV testing messaging campaigns and could increase community engagement in health campaigns. NCT02248558. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  10. The minimum yield in channeling

    International Nuclear Information System (INIS)

    Uguzzoni, A.; Gaertner, K.; Lulli, G.; Andersen, J.U.

    2000-01-01

    A first estimate of the minimum yield was obtained from Lindhard's theory, with the assumption of a statistical equilibrium in the transverse phase-space of channeled particles guided by a continuum axial potential. However, computer simulations have shown that this estimate should be corrected by a fairly large factor, C (approximately equal to 2.5), called the Barrett factor. We have shown earlier that the concept of a statistical equilibrium can be applied to understand this result, with the introduction of a constraint in phase-space due to planar channeling of axially channeled particles. Here we present an extended test of these ideas on the basis of computer simulation of the trajectories of 2 MeV α particles in Si. In particular, the gradual trend towards a full statistical equilibrium is studied. We also discuss the introduction of this modification of standard channeling theory into descriptions of the multiple scattering of channeled particles (dechanneling) by a master equation and show that the calculated minimum yields are in very good agreement with the results of a full computer simulation

  11. CR-Calculus and adaptive array theory applied to MIMO random vibration control tests

    Science.gov (United States)

    Musella, U.; Manzato, S.; Peeters, B.; Guillaume, P.

    2016-09-01

    Performing Multiple-Input Multiple-Output (MIMO) tests to reproduce the vibration environment in a user-defined number of control points of a unit under test is necessary in applications where a realistic environment replication has to be achieved. MIMO tests require vibration control strategies to calculate the required drive signal vector that gives an acceptable replication of the target. This target is a (complex) vector with magnitude and phase information at the control points for MIMO Sine Control tests while in MIMO Random Control tests, in the most general case, the target is a complete spectral density matrix. The idea behind this work is to tailor a MIMO random vibration control approach that can be generalized to other MIMO tests, e.g. MIMO Sine and MIMO Time Waveform Replication. In this work the approach is to use gradient-based procedures over the complex space, applying the so called CR-Calculus and the adaptive array theory. With this approach it is possible to better control the process performances allowing the step-by-step Jacobian Matrix update. The theoretical bases behind the work are followed by an application of the developed method to a two-exciter two-axis system and by performance comparisons with standard methods.

  12. Minimum Wages and Poverty

    OpenAIRE

    Fields, Gary S.; Kanbur, Ravi

    2005-01-01

    Textbook analysis tells us that in a competitive labor market, the introduction of a minimum wage above the competitive equilibrium wage will cause unemployment. This paper makes two contributions to the basic theory of the minimum wage. First, we analyze the effects of a higher minimum wage in terms of poverty rather than in terms of unemployment. Second, we extend the standard textbook model to allow for income-sharing between the employed and the unemployed. We find that there are situation...

  13. Estimating minimum polycrystalline aggregate size for macroscopic material homogeneity

    International Nuclear Information System (INIS)

    Kovac, M.; Simonovski, I.; Cizelj, L.

    2002-01-01

    During severe accidents the pressure boundary of the reactor coolant system can be subjected to extreme loadings, which might cause failure. Reliable estimation of the extreme deformations can be crucial to determining the consequences of severe accidents. An important drawback of classical continuum mechanics is its idealization of the inhomogeneous microstructure of materials. Classical continuum mechanics therefore cannot accurately predict the differences between the measured responses of specimens that differ in size but are geometrically similar (the size effect). A numerical approach, which models elastic-plastic behavior on the mesoscopic level, is proposed to estimate the minimum size of a polycrystalline aggregate above which it can be considered macroscopically homogeneous. The main idea is to divide the continuum into a set of sub-continua. Analysis of the macroscopic element is divided into modeling the random grain structure (using Voronoi tessellation and random orientation of the crystal lattice) and calculation of the strain/stress field. The finite element method is used to obtain numerical solutions of the strain and stress fields. The analysis is limited to 2D models.(author)

  14. Quantum Statistical Testing of a Quantum Random Number Generator

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL

    2014-01-01

    The unobservable elements in a quantum technology, e.g., the quantum state, complicate system verification against promised behavior. Using model-based system engineering, we present methods for verifying the operation of a prototypical quantum random number generator. We begin with the algorithmic design of the QRNG followed by the synthesis of its physical design requirements. We next discuss how quantum statistical testing can be used to verify device behavior as well as detect device bias. We conclude by highlighting how system design and verification methods must influence efforts to certify future quantum technologies.

  15. Cost-effective degradation test plan for a nonlinear random-coefficients model

    International Nuclear Information System (INIS)

    Kim, Seong-Joon; Bae, Suk Joo

    2013-01-01

    The determination of requisite sample size and the inspection schedule considering both testing cost and accuracy has been an important issue in the degradation test. This paper proposes a cost-effective degradation test plan in the context of a nonlinear random-coefficients model, while meeting some precision constraints for failure-time distribution. We introduce a precision measure to quantify the information losses incurred by reducing testing resources. The precision measure is incorporated into time-varying cost functions to reflect real circumstances. We apply a hybrid genetic algorithm to general cost optimization problem with reasonable constraints on the level of testing precision in order to determine a cost-effective inspection scheme. The proposed method is applied to the degradation data of plasma display panels (PDPs) following a bi-exponential degradation model. Finally, sensitivity analysis via simulation is provided to evaluate the robustness of the proposed degradation test plan.

  16. Child oral health-related quality of life and early childhood caries: a non-inferiority randomized control trial.

    Science.gov (United States)

    Arrow, P; Klobas, E

    2016-06-01

    The aim of this study was to compare changes in child oral health-related quality of life (COHRQoL) after treatment for early childhood caries (ECC) using two alternative treatment approaches. A randomized control trial was conducted with random allocation of parent/child dyads with ECC to test (minimum intervention) or control (standard care) groups. Participating parents completed the Early Childhood Oral Health Impact Scale (ECOHIS) at baseline and follow-up. Changes in ECOHIS scores and the extent of COHRQoL impacts between and within groups were tested using the chi-squared statistic for groups, Wilcoxon's rank-sum test, and the matched-pairs signed-rank test. Two hundred and fifty-four children were randomized (test = 127; control = 127). At baseline, the mean ECOHIS score was 11.1, sd 8.2; mean age = 3.8 years, sd 0.90; mean dmft = 4.9, sd 4.0; and 59% were male. After a mean interval of 11.4 months, 210 children were followed up and returned a completed questionnaire (test = 111; control = 99). There was no significant difference in COHRQoL changes between test and control. For all the children combined, there were significantly fewer impacts at follow-up in the child and family domains and the total ECOHIS (Wilcoxon signed-rank test), with no difference between test and control in the extent of the improvement. © 2016 Australian Dental Association.

  17. 75 FR 6151 - Minimum Capital

    Science.gov (United States)

    2010-02-08

    ... capital and reserve requirements to be issued by order or regulation with respect to a product or activity... minimum capital requirements. Section 1362(a) establishes a minimum capital level for the Enterprises... entities required under this section.\\6\\ \\3\\ The Bank Act's current minimum capital requirements apply to...

  18. A Pareto-Improving Minimum Wage

    OpenAIRE

    Eliav Danziger; Leif Danziger

    2014-01-01

    This paper shows that a graduated minimum wage, in contrast to a constant minimum wage, can provide a strict Pareto improvement over what can be achieved with an optimal income tax. The reason is that a graduated minimum wage requires high-productivity workers to work more to earn the same income as low-productivity workers, which makes it more difficult for the former to mimic the latter. In effect, a graduated minimum wage allows the low-productivity workers to benefit from second-degree pr...

  19. Minimum critical mass systems

    International Nuclear Information System (INIS)

    Dam, H. van; Leege, P.F.A. de

    1987-01-01

    An analysis is presented of thermal systems with minimum critical mass, based on the use of materials with optimum neutron moderating and reflecting properties. The optimum fissile material distributions in the systems are obtained by calculations with standard computer codes, extended with a routine for flat fuel importance search. It is shown that in the minimum critical mass configuration a considerable part of the fuel is positioned in the reflector region. For 239 Pu a minimum critical mass of 87 g is found, which is the lowest value reported hitherto. (author)

  20. SS Cygni: The accretion disk in eruption and at minimum light

    International Nuclear Information System (INIS)

    Kiplinger, A.L.

    1979-01-01

    Absolute spectrophotometric observations of the dwarf nova SS Cygni have been obtained at maximum light, during the subsequent decline, and at minimum light. In order to provide a critical test of accretion disk theory, a model for a steady-state α-model accretion disk has been constructed which utilizes a grid of stellar energy distributions to synthesize the disk flux. Physical parameters for the accretion disk at maximum light are set by estimates of the intrinsic luminosity of the system that result from a desynthesis of a composite minimum-light energy distribution. At maximum light, agreement between observational and theoretical continuum slopes and the Balmer jump is remarkably good. The model fails, however, during the eruption decline and at minimum light. It appears that the physical character of an accretion disk at minimum light must radically differ from the disk observed at maximum light.

  1. Digital servo control of random sound test excitation. [in reverberant acoustic chamber

    Science.gov (United States)

    Nakich, R. B. (Inventor)

    1974-01-01

    A digital servocontrol system for random noise excitation of a test object in a reverberant acoustic chamber employs a plurality of sensors spaced in the sound field to produce signals in separate channels which are decorrelated and averaged. The average signal is divided into a plurality of adjacent frequency bands cyclically sampled by a time division multiplex system, converted into digital form, and compared to a predetermined spectrum value stored in digital form. The results of the comparisons are used to control a time-shared up-down counter to develop gain control signals for the respective frequency bands in the spectrum of random sound energy picked up by the microphones.

  2. Persian randomized dichotic digits test: Development and dichotic listening performance in young adults

    Directory of Open Access Journals (Sweden)

    Mohammad Ebrahim Mahdavi

    2015-02-01

    Full Text Available Background and Aims: The dichotic listening subtest is an important component of the test battery for auditory processing assessment in both children and adults. The randomized dichotic digits test (RDDT) was created to compensate for the weak sensitivity of double digit pairs in detecting abnormal ear asymmetry during dichotic listening. The aim of this study was the development and initial evaluation of a Persian randomized dichotic digits test. Method: Persian digits 1-10 (except the bisyllabic digit 4), uttered by a native Persian speaker, were recorded in a studio. After alignment of the intensity and temporal characteristics of the digit waveforms, lists 1 and 2 of the RDDT were produced. List 1 of the test was administered at 55 dB HL to 50 right-handed, normal-hearing individuals (with an equal sex ratio) in the age group of 18-25 years with hearing thresholds of 15 dB HL or better at audiometric frequencies. Results: Mean (standard deviation) percent-correct scores for the right and left ears and the right-ear advantage were 94.3 (5.3), 84.8 (7.7), and 9.5 (7.0) percent, respectively. Sixty percent of the subjects showed normal results; unilateral and bilateral deficits were seen in 24 percent and 16 percent of the studied individuals, respectively. Conclusion: The Persian version of the RDDT appears equivalent to the original test in its ability to detect ear asymmetry and unilateral and bilateral deficits in dichotic listening.

  3. 5 CFR 551.301 - Minimum wage.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Minimum wage. 551.301 Section 551.301... FAIR LABOR STANDARDS ACT Minimum Wage Provisions Basic Provision § 551.301 Minimum wage. (a)(1) Except... employees wages at rates not less than the minimum wage specified in section 6(a)(1) of the Act for all...

  4. Estimation of Genetic Parameters for First Lactation Monthly Test-day Milk Yields using Random Regression Test Day Model in Karan Fries Cattle

    Directory of Open Access Journals (Sweden)

    Ajay Singh

    2016-06-01

    Full Text Available A single-trait linear mixed random regression test-day model was applied for the first time to analyze first-lactation monthly test-day milk yield records in Karan Fries cattle. The test-day milk yield data were modeled using a random regression model (RRM) considering different orders of Legendre polynomial for the additive genetic effect (4th order) and the permanent environmental effect (5th order). Data pertaining to 1,583 lactation records spread over a period of 30 years were recorded and analyzed in the study. The variance components, heritability and genetic correlations among test-day milk yields were estimated using the RRM. RRM heritability estimates of test-day milk yield varied from 0.11 to 0.22 across test-day records. The estimates of genetic correlations between different test-day milk yields ranged from 0.01 (between test-day 1 [TD-1] and TD-11) to 0.99 (between TD-4 and TD-5). The magnitude of the genetic correlations between test-day milk yields decreased as the interval between test-days increased, and adjacent test-days had higher correlations. Additive genetic and permanent environment variances were higher for test-day milk yields at both ends of lactation. The residual variance was observed to be lower than the permanent environment variance for all test-day milk yields.
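
    In such a random regression model, the lactation-curve covariates are Legendre polynomials evaluated at days in milk standardized to [-1, 1]. A minimal stdlib sketch of those covariates (the 5-305-day standardization range is a common convention assumed here, not a detail taken from this paper):

```python
def legendre(order, x):
    """Legendre polynomials P_0..P_order at x, using the three-term
    recurrence (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1}."""
    vals = [1.0, x]
    for n in range(1, order):
        vals.append(((2 * n + 1) * x * vals[n] - n * vals[n - 1]) / (n + 1))
    return vals[: order + 1]

def standardize(dim, dim_min=5, dim_max=305):
    """Map days in milk onto [-1, 1], the domain of the polynomials."""
    return -1.0 + 2.0 * (dim - dim_min) / (dim_max - dim_min)

# Covariates of a 4th-order fit at test-day 155 (mid-lactation, x = 0):
x = standardize(155)
print(legendre(4, x))  # [1.0, 0.0, -0.5, 0.0, 0.375]
```

    Each animal's additive genetic and permanent environmental curves are then linear combinations of these covariates with animal-specific random coefficients.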

  5. Do exchange rates follow random walks? A variance ratio test of the ...

    African Journals Online (AJOL)

    The random-walk hypothesis in foreign-exchange rates market is one of the most researched areas, particularly in developed economies. However, emerging markets in sub-Saharan Africa have received little attention in this regard. This study applies Lo and MacKinlay's (1988) conventional variance ratio test and Wright's ...
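
    The Lo-MacKinlay statistic compares the variance of q-period log returns with q times the variance of one-period returns; under a random walk the ratio is close to 1. A simplified stdlib sketch (without the heteroscedasticity-robust standard errors of the full test; the simulated walk is illustrative):

```python
import random

def variance_ratio(log_prices, q):
    """Lo-MacKinlay variance ratio: variance of q-period returns divided
    by q times the variance of one-period returns. VR ≈ 1 under a random
    walk; VR < 1 suggests mean reversion, VR > 1 momentum."""
    r1 = [log_prices[t] - log_prices[t - 1] for t in range(1, len(log_prices))]
    rq = [log_prices[t] - log_prices[t - q] for t in range(q, len(log_prices))]
    mu = sum(r1) / len(r1)
    var1 = sum((r - mu) ** 2 for r in r1) / len(r1)
    varq = sum((r - q * mu) ** 2 for r in rq) / len(rq)
    return varq / (q * var1)

random.seed(1)
# Simulated random-walk log prices: VR(5) should come out near 1.
walk = [0.0]
for _ in range(20_000):
    walk.append(walk[-1] + random.gauss(0, 0.01))
print(variance_ratio(walk, 5))
```

    A strongly mean-reverting series, by contrast, drives the ratio toward 0, which is the deviation from the random-walk hypothesis the test is designed to detect.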

  6. Minimum bias and underlying event studies at CDF

    International Nuclear Information System (INIS)

    Moggi, Niccolo

    2010-01-01

    Soft, non-perturbative interactions are poorly understood from the theoretical point of view even though they form a large part of the hadronic cross section at the energies now available. We review the CDF studies of minimum-bias and underlying-event physics in p(bar p) collisions at 2 TeV. After proposing an operative definition of 'underlying event', we present part of a systematic set of measurements carried out by the CDF Collaboration with the goal of providing data to test and improve the QCD models of hadron collisions. Different analysis strategies for the underlying event and possible event topologies are discussed. Part of the CDF minimum-bias results are also presented: in this sample, which represents the full inelastic cross section, we can simultaneously test our knowledge of all the components that combine to form hadronic interactions. Comparisons with Monte Carlo simulations are always shown along with the data. These measurements will also contribute to more precise estimates of the soft QCD background of high-p T observables.

  7. Nowcasting daily minimum air and grass temperature

    Science.gov (United States)

    Savage, M. J.

    2016-02-01

    Site-specific and accurate prediction of daily minimum air and grass temperatures, made available online several hours before their occurrence, would be of significant benefit to several economic sectors and for planning human activities. Site-specific and reasonably accurate nowcasts of the daily minimum temperature several hours before its occurrence, using measured sub-hourly temperatures from earlier in the morning as model inputs, were investigated. Various temperature models were tested for their ability to accurately nowcast daily minimum temperatures 2 or 4 h before sunrise. Temperature datasets used for the model nowcasts included sub-hourly grass and grass-surface (infrared) temperatures from one location in South Africa and air temperature from four subtropical sites varying in altitude (USA and South Africa) and from one site in central sub-Saharan Africa. The nowcast models employed either exponential or square-root functions to describe the rate of nighttime temperature decrease, inverted so as to determine the minimum temperature. The models were also applied in near real-time using an open web-based system to display the nowcasts. Extrapolation algorithms for the site-specific nowcasts were also implemented in a datalogger in an innovative and mathematically consistent manner. Comparison of model 1 (exponential) nowcasts vs measured daily minimum air temperatures yielded root mean square errors (RMSEs) <1 °C for the 2-h ahead nowcasts. Model 2 (also exponential), for which a constant model coefficient ( b = 2.2) was used, was usually slightly less accurate but still with RMSEs <1 °C. Use of model 3 (square root) yielded increased RMSEs for the 2-h ahead comparisons between nowcasted and measured daily minimum air temperature, increasing to 1.4 °C for some sites. For all sites and all models, the 4-h ahead air temperature nowcasts generally yielded increased RMSEs, <2.1 °C. Comparisons for all model nowcasts of the daily grass
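
    The square-root variant (model 3) can be illustrated with simple algebra: fit T(t) = T(t1) - a*(sqrt(t) - sqrt(t1)) to two pre-dawn readings and evaluate it at the expected time of the minimum. This is a schematic reconstruction, not the paper's calibrated model; the times and temperatures below are invented:

```python
import math

def nowcast_min_sqrt(t1, T1, t2, T2, t_sunrise):
    """Extrapolate the daily minimum with a square-root cooling model,
    T(t) = T1 - a*(sqrt(t) - sqrt(t1)), fitted to two early-morning
    readings (t in hours after the start of nighttime cooling)."""
    a = (T1 - T2) / (math.sqrt(t2) - math.sqrt(t1))
    return T1 - a * (math.sqrt(t_sunrise) - math.sqrt(t1))

# Readings of 8.0 °C at hour 4 and 6.5 °C at hour 9 of the night,
# extrapolated to a sunrise 12 h after cooling began:
print(nowcast_min_sqrt(4, 8.0, 9, 6.5, 12))  # ≈ 5.80
```

    The exponential models in the paper follow the same pattern but with an exponential rather than square-root decrease, which changes the fitted coefficient, not the extrapolation idea.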

  8. Primitive polynomials selection method for pseudo-random number generator

    Science.gov (United States)

    Anikin, I. V.; Alnajjar, Kh

    2018-01-01

    In this paper we suggest a method for selecting primitive polynomials of a special type. Such polynomials can be used efficiently as characteristic polynomials for the linear feedback shift registers in pseudo-random number generators. The proposed method consists of two basic steps: finding minimum-cost irreducible polynomials of the desired degree and applying primitivity tests to obtain the primitive ones. Finally, two primitive polynomials found by the proposed method were used in a pseudo-random number generator based on fuzzy logic (FRNG), which had been suggested earlier by the authors. The sequences generated by the new version of the FRNG have low correlation magnitude, high linear complexity and lower power consumption, and are more balanced with better statistical properties.
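
    The role of a primitive characteristic polynomial can be made concrete with a small Fibonacci LFSR: a primitive polynomial of degree n makes the register cycle through all 2^n - 1 nonzero states before repeating. This stdlib sketch uses x^4 + x + 1, a known primitive polynomial over GF(2); it illustrates the principle and is not the authors' FRNG:

```python
def lfsr(taps, state, nbits):
    """Fibonacci LFSR over GF(2). `taps` are the exponents of the
    characteristic polynomial (excluding x^0); a primitive polynomial
    yields the maximal period 2**nbits - 1."""
    while True:
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1  # XOR the tapped bits
        state = ((state << 1) | fb) & ((1 << nbits) - 1)
        yield state

# x^4 + x + 1 is primitive: the 4-bit register visits all 15 nonzero
# states before the sequence repeats.
gen = lfsr(taps=(4, 1), state=0b0001, nbits=4)
seen = []
for s in gen:
    if s in seen:
        break
    seen.append(s)
print(len(seen))  # 15
```

    An irreducible but non-primitive polynomial of the same degree would split the nonzero states into several shorter cycles, which is why the primitivity test in the second step of the method matters.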

  9. A Randomized Study of Incentivizing HIV Testing for Parolees in Community Aftercare.

    Science.gov (United States)

    Saxena, Preeta; Hall, Elizabeth A; Prendergast, Michael

    2016-04-01

    HIV risk behaviors are high in criminal justice populations, and more efforts are necessary to address them among criminal justice-involved substance abusers. This study examines the role of incentives in promoting HIV testing among parolees. Participants were randomly assigned to either an incentive group (n = 104) or an education group (control; n = 98), where the incentive group received a voucher for HIV testing. Bivariate comparisons showed that a larger proportion of those in the incentive group received HIV testing (59% versus 47%), but this was not statistically significant (p = .09). However, in a multivariate logistic regression model controlling for covariates likely to influence HIV-testing behavior, those in the incentive group had increased odds of HIV testing in comparison to those in the education group (OR = 1.99). These findings support the use of incentives to promote HIV testing and other healthy behaviors in criminal justice populations.

  10. Dose-Weighted Adjusted Mantel-Haenszel Tests for Numeric Scaled Strata in a Randomized Trial

    Science.gov (United States)

    Gansky, Stuart A.; Cheng, Nancy F.; Koch, Gary G.

    2011-01-01

    A recent three-arm parallel groups randomized clinical prevention trial had a protocol deviation causing participants to have fewer active doses of an in-office treatment than planned. The original statistical analysis plan stipulated a minimal assumption randomization-based extended Mantel-Haenszel (EMH) trend test of the high frequency, low frequency, and zero frequency treatment groups and a binary outcome. Thus a dose-weighted adjusted EMH (DWAEMH) test was developed with an extra set of weights corresponding to the number of active doses actually available, in the spirit of a pattern mixture model. The method can easily be implemented using standard statistical software. A set of Monte Carlo simulations using a logistic model was undertaken with (and without) actual dose-response effects through 1000 replicates for empirical power estimates (and 2100 for empirical size). Results showed size was maintained and power was improved for DWAEMH versus EMH and logistic regression Wald tests in the presence of a dose effect and treatment by dose interaction. PMID:21709814

  11. Parallel Monte Carlo Particle Transport and the Quality of Random Number Generators: How Good is Good Enough?

    International Nuclear Information System (INIS)

    Procassini, R J; Beck, B R

    2004-01-01

    It might be assumed that use of a "high-quality" random number generator (RNG), producing a sequence of "pseudo-random" numbers with a "long" repetition period, is crucial for producing unbiased results in Monte Carlo particle transport simulations. While several theoretical and empirical tests have been devised to check the quality (randomness and period) of an RNG, for many applications it is not clear what level of RNG quality is required to produce unbiased results. This paper explores the issue of RNG quality in the context of parallel Monte Carlo transport simulations in order to determine how "good" is "good enough". This study employs the MERCURY Monte Carlo code, which incorporates the CNPRNG library for the generation of pseudo-random numbers via linear congruential generator (LCG) algorithms. The paper outlines the usage of random numbers during parallel MERCURY simulations, and then describes the source and criticality transport simulations which comprise the empirical basis of this study. A series of calculations for each test problem, in which the quality of the RNG (period of the LCG) is varied, provides the empirical basis for determining the minimum repetition period which may be employed without producing a bias in the mean integrated results.
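    As a minimal illustration of the period question (not of the CNPRNG library itself, whose internals are not given here), a linear congruential generator and a brute-force period check can be written as:

```python
def lcg_stream(seed, a, c, m):
    """Linear congruential generator: x_{n+1} = (a * x_n + c) mod m,
    yielding uniforms in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def lcg_period(a, c, m, seed=1):
    """Brute-force the repetition period of a (small) LCG by iterating
    until the first state recurs."""
    x0 = (a * seed + c) % m
    x, n = (a * x0 + c) % m, 1
    while x != x0:
        x = (a * x + c) % m
        n += 1
    return n

# A toy generator satisfying the Hull-Dobell conditions attains the
# full period m; production transport codes need periods far beyond
# the number of random draws consumed per simulation.
full = lcg_period(a=5, c=3, m=16)
```

    Once the period is shorter than the number of draws a simulation consumes, samples repeat and the mean integrated results can acquire exactly the kind of bias the paper investigates.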

  12. EcmPred: Prediction of extracellular matrix proteins based on random forest with maximum relevance minimum redundancy feature selection

    KAUST Repository

    Kandaswamy, Krishna Kumar Umar

    2013-01-01

    The extracellular matrix (ECM) is a major component of the tissues of multicellular organisms. It consists of secreted macromolecules, mainly polysaccharides and glycoproteins. Malfunctions of ECM proteins lead to severe disorders such as Marfan syndrome, osteogenesis imperfecta, numerous chondrodysplasias, and skin diseases. In this work, we report a random forest approach, EcmPred, for the prediction of ECM proteins from protein sequences. EcmPred was trained on a dataset containing 300 ECM and 300 non-ECM proteins and tested on a dataset containing 145 ECM and 4187 non-ECM proteins. EcmPred achieved 83% accuracy on the training and 77% on the test dataset. EcmPred predicted 15 out of 20 experimentally verified ECM proteins. By scanning the entire human proteome, we predicted novel ECM proteins validated with gene ontology and InterPro. The dataset and a standalone version of the EcmPred software are available at http://www.inb.uni-luebeck.de/tools-demos/Extracellular_matrix_proteins/EcmPred. © 2012 Elsevier Ltd.
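    EcmPred's features and model settings are not specified in the abstract beyond "random forest", but the core mechanism, bootstrap-resampled training sets plus a random feature subset per tree, combined by majority vote, can be sketched in miniature with depth-1 trees on hypothetical two-feature data:

```python
import random

def gini(labels):
    """Gini impurity of a binary label list."""
    p1 = sum(labels) / len(labels)
    return 2.0 * p1 * (1.0 - p1)

def best_stump(X, y, features):
    """Depth-1 tree: among the given feature subset, pick the
    (feature, threshold) split minimizing weighted Gini impurity.
    Returns (feature, threshold, left_label, right_label)."""
    best = None
    for f in features:
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t,
                        round(sum(left) / len(left)),
                        round(sum(right) / len(right)))
    if best is None:  # degenerate bootstrap sample: constant stump
        maj = round(sum(y) / len(y))
        return (0, float("inf"), maj, maj)
    return best[1:]

def random_forest(X, y, n_trees=25, seed=0):
    """Train n_trees stumps, each on a bootstrap sample of rows and a
    random subset of ~sqrt(n_features) features; predict by majority vote."""
    rng = random.Random(seed)
    n_feat = len(X[0])
    k = max(1, int(n_feat ** 0.5))
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        Xb = [X[i] for i in idx]
        yb = [y[i] for i in idx]
        trees.append(best_stump(Xb, yb, rng.sample(range(n_feat), k)))
    def predict(row):
        votes = [l if row[f] <= t else r for f, t, l, r in trees]
        return 1 if sum(votes) * 2 > len(votes) else 0
    return predict

# Hypothetical, linearly separable toy data (two features per "protein").
X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 2], [3, 2], [2, 3], [3, 3]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
predict = random_forest(X, y)
```

    Real sequence-based predictors replace the toy rows with hundreds of sequence-derived descriptors and grow full-depth trees, but the resampling-and-voting structure is the same.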

  13. Minimum income protection in the Netherlands

    NARCIS (Netherlands)

    van Peijpe, T.

    2009-01-01

    This article offers an overview of the Dutch legal system of minimum income protection through collective bargaining, social security, and statutory minimum wages. In addition to collective agreements, the Dutch statutory minimum wage offers income protection to a small number of workers. Its

  14. Control method for multi-input multi-output non-Gaussian random vibration test with cross spectra consideration

    Directory of Open Access Journals (Sweden)

    Ronghui ZHENG

    2017-12-01

    A control method for Multi-Input Multi-Output (MIMO) non-Gaussian random vibration tests with cross spectra consideration is proposed in this paper. The aim of the proposed control method is to replicate the specified references, composed of auto spectral densities, cross spectral densities and kurtoses, on the test article in the laboratory. It is found that the cross spectral densities bring intractable coupling problems and induce difficulty in the control of the multi-output kurtoses. Hence, a sequential phase modification method is put forward to solve the coupling problems in the MIMO non-Gaussian random vibration test. To achieve the specified responses, an improved zero-memory nonlinear transformation is first utilized to modify the Fourier phases of the signals with the sequential phase modification method, obtaining one frame of reference response signals which satisfy the reference spectra and reference kurtoses. Then, an inverse system method is used in the frequency domain to obtain the continuous stationary drive signals. At the same time, the matrix power control algorithm is utilized to further control the spectra and kurtoses of the response signals. At the end of the paper, a simulation example with a cantilever beam and a vibration shaker test are implemented, and the results support the proposed method very well. Keywords: Cross spectra, Kurtosis control, Multi-input multi-output, Non-Gaussian, Random vibration test

  15. Binary pseudorandom test standard to determine the modulation transfer function of optical microscopes

    Science.gov (United States)

    Lacey, Ian; Anderson, Erik H.; Artemiev, Nikolay A.; Babin, Sergey; Cabrini, Stefano; Calafiore, Guiseppe; Chan, Elaine R.; McKinney, Wayne R.; Peroz, Christophe; Takacs, Peter Z.; Yashchuk, Valeriy V.

    2015-09-01

    This work reports on the development of a binary pseudo-random test sample optimized to calibrate the MTF of optical microscopes. The sample consists of a number of 1-D and 2-D patterns, with minimum spatial artifact sizes ranging from 300 nm to 2 microns. We describe the mathematical background, the fabrication process, and the data acquisition and analysis procedure used to return a spatial-frequency-based instrument calibration. We show that the developed samples satisfy the characteristics of a test standard: functionality, ease of specification and fabrication, reproducibility, and low sensitivity to manufacturing error.

  16. Testing self-regulation interventions to increase walking using factorial randomized N-of-1 trials.

    Science.gov (United States)

    Sniehotta, Falko F; Presseau, Justin; Hobbs, Nicola; Araújo-Soares, Vera

    2012-11-01

    To investigate the suitability of N-of-1 randomized controlled trials (RCTs) as a means of testing the effectiveness of behavior change techniques based on self-regulation theory (goal setting and self-monitoring) for promoting walking in healthy adult volunteers. A series of N-of-1 RCTs in 10 normal and overweight adults ages 19-67 (M = 36.9 years). We randomly allocated 60 days within each individual to text message-prompted daily goal-setting and/or self-monitoring interventions in accordance with a 2 (step-count goal prompt vs. alternative goal prompt) × 2 (self-monitoring: open vs. blinded Omron-HJ-113-E pedometer) factorial design. Aggregated data were analyzed using random intercept multilevel models. Single cases were analyzed individually. The primary outcome was daily pedometer step counts over 60 days. Single-case analyses showed that 4 participants significantly increased walking: 2 on self-monitoring days and 2 on goal-setting days, compared with control days. Six participants did not benefit from the interventions. In aggregated analyses, mean step counts were higher on goal-setting days (8,499.9 vs. 7,956.3) and on self-monitoring days (8,630.3 vs. 7,825.9). Multilevel analyses showed a significant effect of the self-monitoring condition (p = .01), the goal-setting condition approached significance (p = .08), and there was a small linear increase in walking over time (p = .03). N-of-1 randomized trials are a suitable means to test behavioral interventions in individual participants.
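    The 2 × 2 factorial allocation of days described above can be sketched as follows; the balanced-per-cell scheme and condition names are assumptions for illustration, since the abstract only states that the 60 days were randomly allocated:

```python
import random

def allocate_days(n_days=60, seed=7):
    """Randomly allocate each day of an N-of-1 trial to one cell of a
    2 (goal prompt: step-count vs. alternative) x 2 (pedometer: open
    vs. blinded) factorial design, balanced across the four cells."""
    cells = [(goal, monitor)
             for goal in ("step_goal", "alt_goal")
             for monitor in ("open", "blinded")]
    schedule = cells * (n_days // len(cells))
    random.Random(seed).shuffle(schedule)
    return schedule  # schedule[d] = condition for day d

schedule = allocate_days()
```

    Because randomization is over days within one person, each participant serves as their own control, which is what allows the single-case analyses reported above.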

  17. Force Limited Random Vibration Test of TESS Camera Mass Model

    Science.gov (United States)

    Karlicek, Alexandra; Hwang, James Ho-Jin; Rey, Justin J.

    2015-01-01

    The Transiting Exoplanet Survey Satellite (TESS) is a spaceborne instrument consisting of four wide field-of-view CCD cameras dedicated to the discovery of exoplanets around the brightest stars. As part of the environmental testing campaign, force limiting was used to simulate a realistic random vibration launch environment. While the force limit vibration test method is a standard approach used at multiple institutions including Jet Propulsion Laboratory (JPL), NASA Goddard Space Flight Center (GSFC), European Space Research and Technology Center (ESTEC), and Japan Aerospace Exploration Agency (JAXA), it is still difficult to find an actual implementation process in the literature. This paper describes the step-by-step process of how the force limit method was developed and applied to the TESS camera mass model. The process description includes the design of special fixtures to mount the test article for properly installing force transducers, development of the force spectral density using the semi-empirical method, estimation of the fuzzy factor (C2) based on the mass ratio between the supporting structure and the test article, subsequent validation of the C2 factor during the vibration test, and calculation of the C.G. accelerations using the Root Mean Square (RMS) reaction force in the spectral domain and the peak reaction force in the time domain.
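    The semi-empirical force limit referred to above is commonly written as S_FF(f) = C² M₀² S_AA(f) below the first resonance f₀, rolling off above it; a sketch under that assumption (the numbers are illustrative, not TESS values):

```python
def force_limit_psd(freqs, accel_psd, m0, c2, f0, rolloff=2.0):
    """Semi-empirical force specification: S_FF = C2 * M0^2 * S_AA for
    f <= f0, multiplied by (f0 / f)**rolloff above the first resonance
    f0. accel_psd in g^2/Hz; m0 is the test-article mass in units
    consistent with the desired force spectral density."""
    spec = []
    for f, s_aa in zip(freqs, accel_psd):
        s_ff = c2 * m0 ** 2 * s_aa
        if f > f0:
            s_ff *= (f0 / f) ** rolloff
        spec.append(s_ff)
    return spec

# Flat 0.04 g^2/Hz input, 10 kg test article, C^2 = 4, f0 = 100 Hz.
spec = force_limit_psd([50.0, 100.0, 200.0], [0.04] * 3,
                       m0=10.0, c2=4.0, f0=100.0)
```

    During the test, the shaker drive is notched wherever the measured interface force would exceed this specification, which is how force limiting prevents the over-test inherent in hard-mounted random vibration.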

  18. Do Higher Minimum Wages Benefit Health? Evidence From the UK.

    Science.gov (United States)

    Lenhart, Otto

    This study examines the link between minimum wages and health outcomes by using the introduction of the National Minimum Wage (NMW) in the United Kingdom in 1999 as an exogenous variation in earned income. Health effects were tested using longitudinal data from the British Household Panel Survey covering a period of ten years. The NMW was found to have significantly improved several measures of health, including self-reported health status and the presence of health conditions. An examination of potential mechanisms showed that changes in health behaviors, leisure expenditures, and financial stress can explain the observed improvements in health.

  19. Understanding the Minimum Wage: Issues and Answers.

    Science.gov (United States)

    Employment Policies Inst. Foundation, Washington, DC.

    This booklet, which is designed to clarify facts regarding the minimum wage's impact on marketplace economics, contains a total of 31 questions and answers pertaining to the following topics: relationship between minimum wages and poverty; impacts of changes in the minimum wage on welfare reform; and possible effects of changes in the minimum wage…

  20. Youth minimum wages and youth employment

    NARCIS (Netherlands)

    Marimpi, Maria; Koning, Pierre

    2018-01-01

    This paper performs a cross-country level analysis on the impact of the level of specific youth minimum wages on the labor market performance of young individuals. We use information on the use and level of youth minimum wages, as compared to the level of adult minimum wages as well as to the median

  1. Discretization of space and time: determining the values of minimum length and minimum time

    OpenAIRE

    Roatta , Luca

    2017-01-01

    Assuming that space and time can only take discrete values, we obtain expressions for the minimum length and the minimum time interval. These values are found to coincide exactly with the Planck length and the Planck time, except for the presence of h instead of ħ.
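    For comparison, the standard Planck length and time are defined with ħ; the abstract's claim is that the discreteness argument yields the same expressions with h in place of ħ:

```latex
l_P = \sqrt{\frac{\hbar G}{c^3}}, \qquad
t_P = \sqrt{\frac{\hbar G}{c^5}},
\qquad\text{versus}\qquad
l_{\min} = \sqrt{\frac{h G}{c^3}}, \qquad
t_{\min} = \sqrt{\frac{h G}{c^5}}.
```

    Since h = 2πħ, the proposed minima differ from the Planck values only by a factor of √(2π).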

  2. Minimum wage development in the Russian Federation

    OpenAIRE

    Bolsheva, Anna

    2012-01-01

    The aim of this paper is to analyze the effectiveness of the minimum wage policy at the national level in Russia and its impact on living standards in the country. The analysis showed that the national minimum wage in Russia does not serve its original purpose of protecting the lowest wage earners and has no substantial effect on poverty reduction. The national subsistence minimum is too low and cannot be considered an adequate criterion for the setting of the minimum wage. The minimum wage d...

  3. Unbiased All-Optical Random-Number Generator

    Science.gov (United States)

    Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja

    2017-10-01

    The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.

  4. A cluster randomized controlled trial testing the effectiveness of Houvast: A strengths-based intervention for homeless young adults

    NARCIS (Netherlands)

    Krabbenborg, M.A.M.; Boersma, S.N.; Veld, W.M. van der; Hulst, B. van; Vollebergh, W.A.M.; Wolf, J.R.L.M.

    2017-01-01

    Objective: To test the effectiveness of Houvast: a strengths-based intervention for homeless young adults. Method: A cluster randomized controlled trial was conducted with 10 Dutch shelter facilities randomly allocated to an intervention and a control group. Homeless young adults were interviewed

  5. Minimum emittance of three-bend achromats

    International Nuclear Information System (INIS)

    Li Xiaoyu; Xu Gang

    2012-01-01

    When the minimum emittance of three-bend achromats (TBAs) is calculated with mathematical software, the actual magnet lattice can be ignored in the matching condition for the dispersion function in phase space. The minimum scaling factors of two kinds of widely used TBA lattices are obtained. The relationship between the lengths and the radii of the three dipoles in the TBA is then derived, along with the minimum scaling factor, for the case in which the TBA lattice achieves its minimum emittance. The procedure of analysis and the results can be widely used in achromat lattices, because the calculation is not restricted by the actual lattice. (authors)

  6. Reliability assessment for safety critical systems by statistical random testing

    International Nuclear Information System (INIS)

    Mills, S.E.

    1995-11-01

    In this report we present an overview of reliability assessment for software, focusing on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then apply this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs.

  7. Reliability assessment for safety critical systems by statistical random testing

    Energy Technology Data Exchange (ETDEWEB)

    Mills, S E [Carleton Univ., Ottawa, ON (Canada). Statistical Consulting Centre

    1995-11-01

    In this report we present an overview of reliability assessment for software, focusing on some basic aspects of assessing reliability for safety critical systems by statistical random testing. We also discuss possible deviations from some essential assumptions on which the general methodology is based. These deviations appear quite likely in practical applications. We present and discuss possible remedies and adjustments and then apply this methodology to a portion of the SDS1 software. We also indicate shortcomings of the methodology and possible avenues to follow to address these problems. (author). 128 refs., 11 tabs., 31 figs.

  8. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
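    The taxonomy's two axes (with vs. without replacement; whole sample vs. subset) map directly onto short implementations; here is a sketch of all three resampling schemes for the sample mean (the data and statistic are chosen purely for illustration):

```python
import random

def bootstrap(data, stat, n_resamples=1000, seed=0):
    """With replacement, resampling the whole original sample size."""
    rng = random.Random(seed)
    return [stat([rng.choice(data) for _ in data]) for _ in range(n_resamples)]

def jackknife(data, stat):
    """Without replacement, replacing a subset: leave-one-out estimates."""
    return [stat(data[:i] + data[i + 1:]) for i in range(len(data))]

def randomization_test(x, y, stat, n_resamples=1000, seed=0):
    """Without replacement, reshuffling the whole pooled sample: the
    permutation distribution of stat(group1) - stat(group2)."""
    rng = random.Random(seed)
    pooled = list(x) + list(y)
    dist = []
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        dist.append(stat(pooled[:len(x)]) - stat(pooled[len(x):]))
    return dist

def mean(v):
    return sum(v) / len(v)

jk = jackknife([1.0, 2.0, 3.0], mean)  # [2.5, 2.0, 1.5]
```

    Each routine returns an empirical sampling distribution, which can then be used for standard errors, confidence intervals, or hypothesis tests, exactly as the taxonomy describes.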

  9. 30 CFR 57.19021 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ... feet: Minimum Value = Static Load × (7.0 − 0.001L). For rope lengths 3,000 feet or greater: Minimum Value = Static Load × 4.0. (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value = Static Load × (7.0 − 0.0005L). For rope lengths 4,000 feet or greater: Minimum Value = Static Load × 5.0. (c) Tail...
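    Read as a formula, the excerpted rule computes a required breaking strength from the static load and the rope length L in feet; a sketch, assuming the elided paragraph (a) covers the drum ropes quoted first:

```python
def minimum_rope_strength(static_load, length_ft, rope="drum"):
    """Minimum rope strength per the 30 CFR 57.19021 excerpt above.
    Drum ropes: Static Load x (7.0 - 0.001 L) under 3,000 ft, else x 4.0.
    Friction drum ropes: Static Load x (7.0 - 0.0005 L) under 4,000 ft,
    else x 5.0."""
    L = length_ft
    if rope == "drum":
        factor = 7.0 - 0.001 * L if L < 3000 else 4.0
    elif rope == "friction":
        factor = 7.0 - 0.0005 * L if L < 4000 else 5.0
    else:
        raise ValueError("rope must be 'drum' or 'friction'")
    return static_load * factor

strength = minimum_rope_strength(10_000, 1_000)  # 10,000 lb static load
```

    The safety factor shrinks linearly with rope length until it reaches the fixed floor (4.0 or 5.0) for long ropes.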

  10. 30 CFR 56.19021 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ... feet: Minimum Value = Static Load × (7.0 − 0.001L). For rope lengths 3,000 feet or greater: Minimum Value = Static Load × 4.0. (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value = Static Load × (7.0 − 0.0005L). For rope lengths 4,000 feet or greater: Minimum Value = Static Load × 5.0. (c) Tail ropes...

  11. Standardization of a broth microdilution susceptibility testing method to determine minimum inhibitory concentrations of aquatic bacteria

    DEFF Research Database (Denmark)

    Miller, R.A.; Walker, R.D.; Carson, J.

    2005-01-01

    (ampicillin, enrofloxacin, erythromycin, florfenicol, flumequine, gentamicin, ormetoprim/sulfadimethoxine, oxolinic acid, oxytetracycline and trimethoprim/sulfamethoxazole). Minimum inhibitory concentration (MIC) QC ranges were determined using dry- and frozen-form 96-well plates and cation-adjusted Mueller...

  12. Prevalence of alcohol-impaired drivers based on random breath tests in a roadside survey in Catalonia (Spain).

    Science.gov (United States)

    Alcañiz, Manuela; Guillén, Montserrat; Santolino, Miguel; Sánchez-Moscona, Daniel; Llatje, Oscar; Ramon, Lluís

    2014-04-01

    Sobriety checkpoints are not usually randomly located by traffic authorities. As such, information provided by non-random alcohol tests cannot be used to infer the characteristics of the general driving population. In this paper a case study is presented in which the prevalence of alcohol-impaired driving is estimated for the general population of drivers. A stratified probabilistic sample was designed to represent vehicles circulating in non-urban areas of Catalonia (Spain), a region characterized by its complex transportation network and dense traffic around the metropolis of Barcelona. Random breath alcohol concentration tests were performed during spring 2012 on 7596 drivers. The estimated prevalence of alcohol-impaired drivers was 1.29%, which is roughly a third of the rate obtained in non-random tests. Higher rates were found on weekends (1.90% on Saturdays and 4.29% on Sundays) and especially at night. The rate is higher for men (1.45%) than for women (0.64%) and it shows an increasing pattern with age. In vehicles with two occupants, the proportion of alcohol-impaired drivers is estimated at 2.62%, but when the driver was alone the rate drops to 0.84%, which might reflect the socialization of drinking habits. The results are compared with outcomes in previous surveys, showing a decreasing trend in the prevalence of alcohol-impaired drivers over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
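    The headline prevalence combines stratum-level positive rates with design weights; a minimal sketch of such a design-weighted estimate, with hypothetical weights and counts rather than the survey's actual strata:

```python
def stratified_prevalence(strata):
    """Design-weighted prevalence from a stratified random sample:
    sum over strata of W_h * p_h, where W_h is the stratum's share of
    the target (driving) population and p_h its sample positive rate."""
    assert abs(sum(w for w, _, _ in strata) - 1.0) < 1e-9
    return sum(w * positives / tested for w, tested, positives in strata)

# (population weight, drivers tested, positive tests) -- hypothetical
strata = [
    (0.5, 3000, 24),   # weekday daytime
    (0.3, 2500, 40),   # weekend daytime
    (0.2, 2096, 60),   # weekend night
]
prevalence = stratified_prevalence(strata)
```

    Weighting is what lets a probabilistic roadside sample generalize to all circulating drivers, which targeted (non-random) checkpoints cannot do.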

  13. 78 FR 71036 - Pipeline Safety: Random Drug Testing Rate; Contractor Management Information System Reporting...

    Science.gov (United States)

    2013-11-27

    ... PHMSA-2013-0248] Pipeline Safety: Random Drug Testing Rate; Contractor Management Information System Reporting; and Obtaining Drug and Alcohol Management Information System Sign-In Information AGENCY: Pipeline... Management Information System (MIS) Data; and New Method for Operators to Obtain User Name and Password for...

  14. 30 CFR 77.1431 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ... feet: Minimum Value = Static Load × (7.0 − 0.001L). For rope lengths 3,000 feet or greater: Minimum Value = Static Load × 4.0. (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value = Static Load × (7.0 − 0.0005L). For rope lengths 4,000 feet or greater: Minimum Value = Static Load × 5.0. (c) Tail ropes...

  15. Random matrix theories and chaotic dynamics

    International Nuclear Information System (INIS)

    Bohigas, O.

    1991-01-01

    A review of some of the main ideas, assumptions and results of the Wigner-Dyson type random matrix theories (RMT), relevant in the general context of 'Chaos and Quantum Physics', is presented. RMT provide interesting and unexpected clues connecting classical dynamics with quantum phenomena. It is this aspect which will be emphasised; concerning the main body of RMT, the author restricts himself to a minimum. However, emphasis will be put on some generalizations of the 'canonical' random matrix ensembles that increase their flexibility, rendering the incorporation of relevant physical constraints possible. (R.P.) 112 refs., 35 figs., 5 tabs

  16. A Phosphate Minimum in the Oxygen Minimum Zone (OMZ) off Peru

    Science.gov (United States)

    Paulmier, A.; Giraud, M.; Sudre, J.; Jonca, J.; Leon, V.; Moron, O.; Dewitte, B.; Lavik, G.; Grasse, P.; Frank, M.; Stramma, L.; Garcon, V.

    2016-02-01

    The Oxygen Minimum Zone (OMZ) off Peru is known to be associated with the advection of Equatorial SubSurface Waters (ESSW), rich in nutrients and poor in oxygen, through the Peru-Chile UnderCurrent (PCUC), but this circulation remains to be refined within the OMZ. During the Pelágico cruise in November-December 2010, measurements of phosphate revealed the presence of a phosphate minimum (Pmin) at various hydrographic stations, which could not be explained so far and could be associated with a specific water mass. This Pmin, localized in a relatively constant layer, shows a mean vertical phosphate decrease of 0.6 µM, although it is highly variable, ranging between 0.1 and 2.2 µM. On average, these Pmin are associated with a predominant mixing of SubTropical Under- and Surface Waters (STUW and STSW: ~20 and ~40%, respectively) within ESSW (~25%), complemented evenly by overlying (ESW, TSW: ~8%) and underlying waters (AAIW, SPDW: ~7%). The hypotheses and mechanisms leading to the Pmin formation in the OMZ are further explored and discussed, considering the regional physical contribution associated with the various circulation pathways ventilating the OMZ and the local biogeochemical contribution, including potential diazotrophic activity.

  17. Synthesis of Sine-on-Random vibration profiles for accelerated life tests based on fatigue damage spectrum equivalence

    Science.gov (United States)

    Angeli, Andrea; Cornelis, Bram; Troncossi, Marco

    2018-03-01

    In many real-life environments, mechanical and electronic systems are subjected to vibrations that may induce dynamic loads and potentially lead to early failure due to fatigue damage. Thus, qualification tests by means of shakers are advisable for the most critical components in order to verify their durability throughout the entire life cycle. Nowadays the trend is to tailor the qualification tests to the specific application of the tested component, taking measured field data as the reference for setting up the experimental campaign, for example through the so-called "Mission Synthesis" methodology. One of the main issues is to define the excitation profiles for the tests, which must have not only the (potentially scaled) frequency content but also the same damage potential as the field data, despite being applied for a limited duration. With this target, the current procedures generally provide the test profile as a stationary random vibration specified by a Power Spectral Density (PSD). In certain applications this output may prove inadequate to represent the nature of the reference signal, and the procedure could result in an unrealistic qualification test. For instance, when a rotating part is present in the system, the component under analysis may be subjected to Sine-on-Random (SoR) vibrations, namely excitations composed of sinusoidal contributions superimposed on random vibrations. In this case, the synthesized test profile should preserve not only the induced fatigue damage but also the deterministic components of the environmental vibration. In this work, the potential advantages of a novel procedure to synthesize SoR profiles instead of PSDs for qualification tests are presented and supported by the results of an experimental campaign.

  18. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.

  19. Digital-Analog Hybrid Scheme and Its Application to Chaotic Random Number Generators

    Science.gov (United States)

    Yuan, Zeshi; Li, Hongtao; Miao, Yunchi; Hu, Wen; Zhu, Xiaohua

    2017-12-01

    Practical random number generation (RNG) circuits are typically achieved with analog devices or digital approaches. Digital techniques, which use field programmable gate arrays (FPGA), graphics processing units (GPU), etc., usually perform better than analog methods, as they are programmable, efficient and robust. However, digital realizations suffer from the effect of finite precision. Accordingly, the generated random numbers (RNs) are actually periodic instead of truly random. To tackle this limitation, in this paper we propose a novel digital-analog hybrid scheme that employs a digital unit as the main body and minimal analog devices to generate physical RNs. Moreover, the possibility of realizing the proposed scheme with only one memory element is discussed. Without loss of generality, we use the capacitor and the memristor along with an FPGA to construct the proposed hybrid system, and a chaotic true random number generator (TRNG) circuit is realized, producing physical RNs at a throughput of Gbit/s scale. These RNs successfully pass all the tests in the NIST SP800-22 package, confirming the significance of the scheme in practical applications. In addition, the use of this new scheme is not restricted to RNGs, and it also provides a strategy to mitigate the effect of finite precision in other digital systems.
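    As a toy illustration of both halves of the idea, a chaotic map as the bit source and one NIST SP800-22 check, here is a logistic-map bit generator together with the frequency (monobit) test; the map, threshold, and parameters are illustrative, not the paper's circuit:

```python
import math

def logistic_bits(n, x0=0.6, r=4.0, burn=100):
    """Generate n bits from the chaotic logistic map x <- r*x*(1-x),
    thresholding at 0.5. Illustrative only: a finite-precision float
    orbit is eventually periodic, which is exactly the limitation the
    paper's analog seeding is meant to address."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    bits = []
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

def monobit_pvalue(bits):
    """NIST SP800-22 frequency (monobit) test: p = erfc(|S| / sqrt(2n)),
    where S is the sum of the bits mapped to +/-1."""
    s = sum(2 * b - 1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * len(bits)))

bits = logistic_bits(2000)
p = monobit_pvalue(bits)
```

    The full SP800-22 package applies fifteen such tests; a sequence "passes" a test when its p-value exceeds the chosen significance level (commonly 0.01).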

  20. Optimal design of degradation tests in presence of cost constraint

    International Nuclear Information System (INIS)

    Wu, S.-J.; Chang, C.-T.

    2002-01-01

    Degradation testing is a useful technique for providing information about the lifetime of highly reliable products. Degradation measurements are obtained over time in such a test. In general, the degradation data are modeled by a nonlinear regression model with random coefficients. If the estimates of the parameters under the model can be obtained, then the failure time distribution can be estimated. However, in order to obtain a precise estimate of a percentile of the failure time distribution, one needs to design an optimal degradation test. Therefore, this study proposes an approach to determine the number of units to test, the inspection frequency, and the termination time of a degradation test under a fixed experimental cost, such that the variance of the estimator of the percentile of the failure time distribution is minimized. The method is applied to a numerical example and a sensitivity analysis is discussed.

  1. Statistics for Ratios of Rayleigh, Rician, Nakagami-m, and Weibull Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Dragana Č. Pavlović

    2013-01-01

    The distributions of ratios of random variables are of interest in many areas of the sciences. In this brief paper, we present the joint probability density function (PDF) and the PDF of the maximum of the ratios μ1 = R1/r1 and μ2 = R2/r2 for the cases where R1, R2, r1, and r2 are Rayleigh, Rician, Nakagami-m, and Weibull distributed random variables. The random variables R1 and R2, as well as r1 and r2, are correlated. Given the suitability of the Weibull distribution for describing fading in both indoor and outdoor environments, special attention is dedicated to the case of Weibull random variables. For this case, analytical expressions are obtained for the joint PDF, the PDF of the maximum, the PDF of the minimum, and the product moments of an arbitrary number of ratios μi = Ri/ri, i = 1, …, L. The random variables in the numerator, Ri, as well as those in the denominator, ri, are exponentially correlated. To the best of the authors' knowledge, the analytical expressions for the PDF of the minimum and the product moments are novel in the open technical literature. The proposed mathematical analysis is complemented by various numerical results. An application of the presented theoretical results is illustrated with respect to the performance assessment of wireless systems.

  2. Optimization of the Dutch Matrix Test by Random Selection of Sentences From a Preselected Subset

    Directory of Open Access Journals (Sweden)

    Rolph Houben

    2015-04-01

    Matrix tests are available for speech recognition testing in many languages. For an accurate measurement, a steep psychometric function of the speech materials is required. For existing tests, it would be beneficial if the available materials could be further optimized by increasing the function's steepness. The objective is to show whether the steepness of the psychometric function of an existing matrix test can be increased by selecting a homogeneous subset of recordings with the steepest sentence-based psychometric functions. We took data from a previous multicenter evaluation of the Dutch matrix test (45 normal-hearing listeners). Based on half of the data set, the sentences (140 out of 311) with a similar speech reception threshold and with the steepest psychometric functions (≥9.7%/dB) were first selected. Subsequently, the steepness of the psychometric function for this selection was calculated from the remaining (unused) second half of the data set. The calculation showed that the slope increased from 10.2%/dB to 13.7%/dB. The resulting subset did not allow the construction of enough balanced test lists. Therefore, the measurement procedure was changed to randomly select the sentences during testing. Random selection may interfere with a representative occurrence of phonemes. However, in our material, the median phonemic occurrence remained close to that of the original test. This finding indicates that phonemic occurrence is not a critical factor. The work highlights the possibility that existing speech tests might be improved by selecting sentences with a steep psychometric function.
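    The slopes quoted above are gradients of a psychometric function at the speech reception threshold (SRT); a minimal sketch of a logistic form parameterized directly by its midpoint slope (the exact functional form used in the study is an assumption here):

```python
import math

def psychometric(snr_db, srt_db, slope):
    """Logistic psychometric function: intelligibility as a function of
    SNR (dB), parameterized so that `slope` (in proportion per dB, e.g.
    0.137 for 13.7%/dB) is the gradient at the SRT, where p = 0.5."""
    return 1.0 / (1.0 + math.exp(-4.0 * slope * (snr_db - srt_db)))

p_mid = psychometric(-8.0, -8.0, 0.137)
# A numerical gradient at the SRT recovers the nominal slope.
h = 1e-6
grad = (psychometric(-8.0 + h, -8.0, 0.137)
        - psychometric(-8.0 - h, -8.0, 0.137)) / (2 * h)
```

    A steeper slope means the recognition score changes faster per dB around the SRT, so an adaptive procedure converges on the threshold with fewer trials, which is why the 10.2 to 13.7 %/dB improvement matters.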

  3. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Aim: The properness of random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative Structure-Activity Relationship model using Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance was used, with maximization of the initial distance and a cross-validation with a v-fold of 10. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test sets. The observed activity of carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
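    The check described above (cluster the compounds, then verify that every cluster contains both training- and test-set members) can be sketched with a minimal stdlib K-means. The data points, tags, and the deterministic initialisation are illustrative assumptions, not the study's descriptors or software.

```python
# Minimal K-means sketch (stdlib only) to check whether every cluster
# contains both training- and test-set compounds. Data and the
# deterministic initialisation are illustrative assumptions.
import math

def kmeans(points, k, iters=20):
    # Deterministic initialisation: spread initial centers over the list.
    step = max(1, len(points) // k)
    centers = [points[i * step] for i in range(k)]
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
                  for p in points]
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = tuple(sum(x) / len(members)
                                   for x in zip(*members))
    return labels

# (descriptor pair, train/test tag) for six hypothetical compounds
data = [((0.1, 0.2), "train"), ((0.2, 0.1), "test"),
        ((5.0, 5.1), "train"), ((5.2, 4.9), "test"),
        ((9.8, 0.1), "train"), ((10.0, 0.3), "test")]
labels = kmeans([d[0] for d in data], k=3)
clusters = {}
for (point, tag), lab in zip(data, labels):
    clusters.setdefault(lab, set()).add(tag)
mixed = all(tags == {"train", "test"} for tags in clusters.values())
print("every cluster mixes train and test compounds:", mixed)
```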

  4. Minimum curvilinearity to enhance topological prediction of protein interactions by network embedding

    KAUST Repository

    Cannistraci, Carlo

    2013-06-21

    Motivation: Most functions within the cell emerge thanks to protein-protein interactions (PPIs), yet experimental determination of PPIs is both expensive and time-consuming. PPI networks present significant levels of noise and incompleteness. Predicting interactions using only PPI-network topology (topological prediction) is difficult but essential when prior biological knowledge is absent or unreliable. Methods: Network embedding emphasizes the relations between network proteins embedded in a low-dimensional space, in which protein pairs that are closer to each other represent good candidate interactions. To achieve network denoising, which boosts prediction performance, we first applied minimum curvilinear embedding (MCE), and then adopted shortest path (SP) in the reduced space to assign likelihood scores to candidate interactions. Furthermore, we introduce (i) a new valid variation of MCE, named non-centred MCE (ncMCE); (ii) two automatic strategies for selecting the appropriate embedding dimension; and (iii) two new randomized procedures for evaluating predictions. Results: We compared our method against several unsupervised and supervised embedding approaches and node neighbourhood techniques. Despite its computational simplicity, ncMCE-SP was the overall leader, outperforming the current methods in topological link prediction. Conclusion: Minimum curvilinearity is a valuable non-linear framework that we successfully applied to the embedding of protein networks for the unsupervised prediction of novel PPIs. The rationale for our approach is that biological and evolutionary information is imprinted in the non-linear patterns hidden behind the protein network topology, and can be exploited for predicting new protein links. The predicted PPIs represent good candidates for testing in high-throughput experiments or for exploitation in systems biology tools such as those used for network-based inference and prediction of disease-related functional modules.
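    The scoring step of the pipeline (after embedding, rank candidate protein pairs by shortest-path distance, shorter meaning more likely to interact) can be sketched with a small Dijkstra. The toy graph, its weights, and the candidate pairs are assumptions for illustration; this is not MCE itself, only the SP-scoring idea.

```python
# Sketch of shortest-path (SP) scoring of candidate interactions: lower
# distance in the (here: toy, pre-weighted) network suggests a more likely
# link. Graph, weights, and candidate pairs are illustrative assumptions.
import heapq

def dijkstra(adj, src):
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

adj = {
    "A": [("B", 1.0), ("C", 2.0)],
    "B": [("A", 1.0), ("D", 1.0)],
    "C": [("A", 2.0), ("D", 2.0)],
    "D": [("B", 1.0), ("C", 2.0)],
}
# Score non-adjacent pairs: lower distance => better candidate interaction.
candidates = [("A", "D"), ("B", "C")]
scores = {pair: dijkstra(adj, pair[0])[pair[1]] for pair in candidates}
print(scores)
```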

  5. Patterns of Cancer Genetic Testing: A Randomized Survey of Oregon Clinicians

    International Nuclear Information System (INIS)

    Cox, S. L.; Zlot, A. I.; Silvey, S. K.; Silvey, S. K.

    2012-01-01

    Introduction. Appropriate use of genetic tests for population-based cancer screening, diagnosis of inherited cancers, and guidance of cancer treatment can improve health outcomes. We investigated clinicians’ use and knowledge of eight breast, ovarian, and colorectal cancer genetic tests. Methods. We conducted a randomized survey of 2,191 Oregon providers, asking about their experience with fecal DNA, OncoVue, BRCA, MMR, CYP2D6, tumor gene expression profiling, UGT1A1, and KRAS. Results. Clinicians reported low confidence in their knowledge of medical genetics; most confident were OB-GYNs and specialists. Clinicians were more likely to have ordered/recommended BRCA and MMR than the other tests, and OB-GYNs were twice as likely to have ordered/recommended BRCA testing as primary care providers. Less than 10% of providers ordered/recommended OncoVue, fecal DNA, CYP2D6, or UGT1A1; less than 30% ordered/recommended tumor gene expression profiles or KRAS. The most common reason for not ordering/recommending these tests was lack of familiarity. Conclusions. Use of appropriate, evidence-based testing can help reduce incidence and mortality of certain cancers, but these tests need to be better integrated into clinical practice. Continued evaluation of emerging technologies, dissemination of findings, and an increase in provider confidence and knowledge are necessary to achieve this end.

  6. An Improved Minimum Error Interpolator of CNC for General Curves Based on FPGA

    Directory of Open Access Journals (Sweden)

    Jiye HUANG

    2014-05-01

    This paper presents an improved minimum-error interpolation algorithm for general curve generation in computer numerical control (CNC). Compared with conventional interpolation algorithms such as the By-Point Comparison method, the Minimum-Error method, and the Digital Differential Analyzer (DDA) method, the proposed improved Minimum-Error interpolation algorithm can find a balance between accuracy and efficiency. The new algorithm is applicable to linear, circular, elliptical, and parabolic curves. The proposed algorithm is realized on a field programmable gate array (FPGA) in the Verilog HDL language, simulated with the ModelSim software, and finally verified on a two-axis CNC lathe. The algorithm has the following advantages: firstly, the maximum interpolation error is only half of the minimum step-size; and secondly, the computing time is only two clock cycles of the FPGA. Simulations and actual tests have proved the high accuracy and efficiency of the algorithm, showing that it is highly suited for real-time applications.
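    The minimum-error interpolation idea can be illustrated in software for the simplest case, a straight line: at each step, move one unit along an axis, choosing the candidate pixel whose deviation from the ideal line is smallest. This sketch is an assumption-laden, software-only illustration of the general technique, not the paper's FPGA implementation (which also covers circles, ellipses, and parabolas).

```python
# Software sketch of minimum-error line interpolation: at each step pick the
# candidate point (straight or diagonal) with the smaller deviation from the
# ideal line. Illustrative only; not the paper's FPGA design.
def interpolate_line(x_end, y_end):
    """Integer steps from (0, 0) toward (x_end, y_end); first octant
    (0 <= y_end <= x_end) assumed."""
    points = [(0, 0)]
    x = y = 0
    while x < x_end:
        # F(x, y) = x*y_end - y*x_end is zero exactly on the ideal line.
        straight = abs((x + 1) * y_end - y * x_end)        # step in x only
        diagonal = abs((x + 1) * y_end - (y + 1) * x_end)  # diagonal step
        if diagonal < straight:
            x, y = x + 1, y + 1
        else:
            x += 1
        points.append((x, y))
    return points

path = interpolate_line(5, 3)
print(path)
```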

  7. A Cluster Randomized Controlled Trial Testing the Effectiveness of Houvast: A Strengths-Based Intervention for Homeless Young Adults

    Science.gov (United States)

    Krabbenborg, Manon A. M.; Boersma, Sandra N.; van der Veld, William M.; van Hulst, Bente; Vollebergh, Wilma A. M.; Wolf, Judith R. L. M.

    2017-01-01

    Objective: To test the effectiveness of Houvast: a strengths-based intervention for homeless young adults. Method: A cluster randomized controlled trial was conducted with 10 Dutch shelter facilities randomly allocated to an intervention and a control group. Homeless young adults were interviewed when entering the facility and when care ended.…

  8. Reliability and Minimum Detectable Change of Temporal-Spatial, Kinematic, and Dynamic Stability Measures during Perturbed Gait.

    Directory of Open Access Journals (Sweden)

    Christopher A Rábago

    Temporal-spatial, kinematic variability, and dynamic stability measures collected during perturbation-based assessment paradigms are often used to identify dysfunction associated with gait instability. However, it remains unclear which measures are most reliable for detecting and tracking responses to perturbations. This study systematically determined the between-session reliability and minimum detectable change values of temporal-spatial, kinematic variability, and dynamic stability measures during three types of perturbed gait. Twenty young healthy adults completed two identical testing sessions two weeks apart, comprised of an unperturbed and three perturbed (cognitive, physical, and visual) walking conditions in a virtual reality environment. Within each session, perturbation responses were compared to unperturbed walking using paired t-tests. Between-session reliability and minimum detectable change values were also calculated for each measure and condition. All temporal-spatial, kinematic variability, and dynamic stability measures demonstrated fair to excellent between-session reliability. Minimum detectable change values, normalized to mean values, ranged from 1% to 50%. Step width mean and variability measures demonstrated the greatest response to perturbations, with excellent between-session reliability and low minimum detectable change values. Orbital stability measures demonstrated specificity to perturbation direction and sensitivity, with excellent between-session reliability and low minimum detectable change values. We observed substantially greater between-session reliability and lower minimum detectable change values for local stability measures than previously described, which may be the result of averaging across trials within a session and using velocity versus acceleration data for reconstruction of state spaces. Across all perturbation types, temporal-spatial, orbital, and local measures were the most reliable measures.

  9. Trends in Mean Annual Minimum and Maximum Near Surface Temperature in Nairobi City, Kenya

    Directory of Open Access Journals (Sweden)

    George Lukoye Makokha

    2010-01-01

    This paper examines the long-term urban modification of mean annual near-surface temperature conditions in Nairobi City. Data from four weather stations situated in Nairobi were collected from the Kenya Meteorological Department for the period from 1966 to 1999 inclusive. The data included mean annual maximum and minimum temperatures and were first subjected to a homogeneity test before analysis. Both linear regression and the Mann-Kendall rank test were used to discern the mean annual trends. Results show that the change in temperature over the thirty-four-year study period is greater for minimum temperature than for maximum temperature. The warming trends began earlier and are more significant at the urban stations than at the sub-urban stations, an indication of the spread of urbanisation from the built-up Central Business District (CBD) to the suburbs. The established significant warming trends in minimum temperature, which are likely to reach higher proportions in future, pose serious challenges for climate and urban planning of the city. In particular, the effect of increased minimum temperature on human physiological comfort, building and urban design, wind circulation, and air pollution needs to be incorporated in future urban planning programmes of the city.
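    The Mann-Kendall rank test used in the study counts concordant minus discordant pairs in a time series; the sign of the statistic S gives the trend direction, and a normal approximation gives significance. A minimal sketch follows (the temperature series is illustrative, not the Nairobi data; ties are assumed absent).

```python
# Minimal Mann-Kendall trend test: statistic S and its normal approximation.
# The temperature series below is illustrative, not the study's data.
import math

def mann_kendall(series):
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    # Variance of S under the null hypothesis (no ties assumed here).
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

temps = [17.2, 17.3, 17.1, 17.5, 17.6, 17.8, 17.7, 18.0, 18.1, 18.3]
s, z = mann_kendall(temps)
print(f"S = {s}, Z = {z:.2f}")  # positive S indicates a warming trend
```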

  10. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, parameters to be checked on the waste barrels are defined, and the criteria on which these tests are based are given. It is also shown how faulty data on the properties of the waste, or faulty waste barrels, should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined on the basis of the conditioning process used.

  11. The impact of minimum wages on population health: evidence from 24 OECD countries.

    Science.gov (United States)

    Lenhart, Otto

    2017-11-01

    This study examines the relationship between minimum wages and several measures of population health by analyzing data from 24 OECD countries over a period of 31 years. Specifically, I test for health effects resulting from within-country variations in the generosity of minimum wages, which are measured by the Kaitz index. The paper finds that higher minimum wages are associated with significant reductions in overall mortality rates as well as in the number of deaths due to outcomes that have been shown to be more prevalent among individuals with low socioeconomic status (e.g., diabetes, diseases of the circulatory system, stroke). A 10-percentage-point increase in the Kaitz index is associated with significant declines in death rates and an increase in life expectancy of 0.44 years. Furthermore, I provide evidence for potential channels through which minimum wages impact population health by showing that more generous minimum wages affect outcomes such as poverty, the share of the population with unmet medical needs, the number of doctor consultations, tobacco consumption, calorie intake, and the likelihood of people being overweight.

  12. Isoflurane minimum alveolar concentration reduction by fentanyl.

    Science.gov (United States)

    McEwan, A I; Smith, C; Dyar, O; Goodman, D; Smith, L R; Glass, P S

    1993-05-01

    Isoflurane is commonly combined with fentanyl during anesthesia. Because of hysteresis between plasma and effect site, bolus administration of fentanyl does not accurately describe the interaction between these drugs. The purpose of this study was to determine the minimum alveolar concentration (MAC) reduction of isoflurane by fentanyl when both drugs had reached steady biophase concentrations. Seventy-seven patients were randomly allocated to receive either no fentanyl or fentanyl at several predetermined plasma concentrations. Fentanyl was administered using a computer-assisted continuous infusion device. Patients were also randomly allocated to receive a predetermined steady state end-tidal concentration of isoflurane. Blood samples for fentanyl concentration were taken at 10 min after initiation of the infusion and before and immediately after skin incision. A minimum of 20 min was allowed between the start of the fentanyl infusion and skin incision. The reduction in the MAC of isoflurane by the measured fentanyl concentration was calculated using a maximum likelihood solution to a logistic regression model. There was an initial steep reduction in the MAC of isoflurane by fentanyl, with 3 ng/ml resulting in a 63% MAC reduction. A ceiling effect was observed, with 10 ng/ml providing only a further 19% reduction in MAC. A 50% decrease in MAC was produced by a fentanyl concentration of 1.67 ng/ml. Defining the MAC reduction of isoflurane by all the opioids allows their more rational administration with inhalational anesthetics and provides a comparison of their relative anesthetic potencies.

  13. 12 CFR 564.4 - Minimum appraisal standards.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Minimum appraisal standards. 564.4 Section 564.4 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY APPRAISALS § 564.4 Minimum appraisal standards. For federally related transactions, all appraisals shall, at a minimum: (a...

  14. The minimum wage in the Czech enterprises

    OpenAIRE

    Eva Lajtkepová

    2010-01-01

    Although the statutory minimum wage is not a new category, in the Czech Republic we encounter the definition and regulation of a minimum wage for the first time in the 1990 amendment to Act No. 65/1965 Coll., the Labour Code. The specific amount of the minimum wage and the conditions of its operation were then subsequently determined by government regulation in February 1991. Since that time, the value of the minimum wage has been adjusted fifteen times (the last increase was in January 2007). ...

  15. Minimum Wages and Regional Disparity: An analysis on the evolution of price-adjusted minimum wages and their effects on firm profitability (Japanese)

    OpenAIRE

    MORIKAWA Masayuki

    2013-01-01

    This paper, using prefecture level panel data, empirically analyzes 1) the recent evolution of price-adjusted regional minimum wages and 2) the effects of minimum wages on firm profitability. As a result of rapid increases in minimum wages in the metropolitan areas since 2007, the regional disparity of nominal minimum wages has been widening. However, the disparity of price-adjusted minimum wages has been shrinking. According to the analysis of the effects of minimum wages on profitability us...

  16. 41 CFR 50-201.1101 - Minimum wages.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true Minimum wages. 50-201... Contracts PUBLIC CONTRACTS, DEPARTMENT OF LABOR 201-GENERAL REGULATIONS § 50-201.1101 Minimum wages. Determinations of prevailing minimum wages or changes therein will be published in the Federal Register by the...

  17. Minimum Wage Laws and the Distribution of Employment.

    Science.gov (United States)

    Lang, Kevin

    The desirability of raising the minimum wage long revolved around just one question: the effect of higher minimum wages on the overall level of employment. An even more critical effect of the minimum wage rests on the composition of employment--who gets the minimum wage job. An examination of employment in eating and drinking establishments…

  18. 29 CFR 505.3 - Prevailing minimum compensation.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Prevailing minimum compensation. 505.3 Section 505.3 Labor... HUMANITIES § 505.3 Prevailing minimum compensation. (a)(1) In the absence of an alternative determination...)(2) of this section, the prevailing minimum compensation required to be paid under the Act to the...

  19. Minimum Wages and Teenagers' Enrollment--Employment Outcomes: A Multinominal Logit Model.

    Science.gov (United States)

    Ehrenberg, Ronald G.; Marcus, Alan J.

    1982-01-01

    This paper tests the hypothesis that the effect of minimum wage legislation on teenagers' education decisions is asymmetrical across family income classes, with the legislation inducing children from low-income families to reduce their levels of schooling and children from higher-income families to increase their educational attainment. (Author)

  20. Do Some Workers Have Minimum Wage Careers?

    Science.gov (United States)

    Carrington, William J.; Fallick, Bruce C.

    2001-01-01

    Most workers who begin their careers in minimum-wage jobs eventually gain more experience and move on to higher paying jobs. However, more than 8% of workers spend at least half of their first 10 working years in minimum wage jobs. Those more likely to have minimum wage careers are less educated, minorities, women with young children, and those…

  1. Does the Minimum Wage Affect Welfare Caseloads?

    Science.gov (United States)

    Page, Marianne E.; Spetz, Joanne; Millar, Jane

    2005-01-01

    Although minimum wages are advocated as a policy that will help the poor, few studies have examined their effect on poor families. This paper uses variation in minimum wages across states and over time to estimate the impact of minimum wage legislation on welfare caseloads. We find that the elasticity of the welfare caseload with respect to the…

  2. 29 CFR 4.159 - General minimum wage.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true General minimum wage. 4.159 Section 4.159 Labor Office of... General minimum wage. The Act, in section 2(b)(1), provides generally that no contractor or subcontractor... a contract less than the minimum wage specified under section 6(a)(1) of the Fair Labor Standards...

  3. Testing sex-specific pathways from peer victimization to anxiety and depression in early adolescents through a randomized intervention trial

    NARCIS (Netherlands)

    Vuijk, P.; Lier, P.A.C. van; Crijnen, A.A.M.; Huizink, A.C.

    2007-01-01

    The aim of this study was to test for sex differences in the role of physical and relational victimization in anxiety and depression development through a randomized prevention trial. 448 seven-year-old boys and girls were randomly assigned to the Good Behavior Game intervention, a two-year

  4. Harbor porpoise clicks do not have conditionally minimum time bandwidth product

    DEFF Research Database (Denmark)

    Beedholm, Kristian

    2008-01-01

    The hypothesis that odontocete clicks have a minimal time-bandwidth product given their delay and center frequency values is tested by comparing an in-phase averaged porpoise click with a pure tone weighted with the same envelope. These signals have the same delay and the same center frequency values, but the time-bandwidth product of the artificial click is only 0.76 that of the original. Therefore, signals with the same parameters exist that have a lower time-bandwidth product. The observation that porpoise clicks are in fact minimum phase is confirmed, and this property is argued to be incompatible with optimal reception if auditory filters are also minimum phase.

  5. Hierarchical Solution of the Traveling Salesman Problem with Random Dyadic Tilings

    Science.gov (United States)

    Kalmár-Nagy, Tamás; Bak, Bendegúz Dezső

    We propose a hierarchical heuristic approach for solving the Traveling Salesman Problem (TSP) in the unit square. The points are partitioned with a random dyadic tiling and clusters are formed by the points located in the same tile. Each cluster is represented by its geometrical barycenter and a “coarse” TSP solution is calculated for these barycenters. Midpoints are placed at the middle of each edge in the coarse solution. Near-optimal (or optimal) minimum tours are computed for each cluster. The tours are concatenated using the midpoints yielding a solution for the original TSP. The method is tested on random TSPs (independent, identically distributed points in the unit square) up to 10,000 points as well as on a popular benchmark problem (att532 — coordinates of 532 American cities). Our solutions are 8-13% longer than the optimal ones. We also present an optimization algorithm for the partitioning to improve our solutions. This algorithm further reduces the solution errors (by several percent using 1000 iteration steps). The numerical experiments demonstrate the viability of the approach.
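    The hierarchical scheme above can be sketched in simplified form: partition the unit square into tiles, solve each cluster with a cheap heuristic, and chain the sub-tours. The sketch below uses a fixed regular 2×2 grid and a nearest-neighbour heuristic instead of the paper's random dyadic tilings and near-optimal cluster tours, and it builds an open path rather than a closed tour; all of these simplifications are assumptions for illustration.

```python
# Simplified sketch of the hierarchical TSP idea: regular tiling (the paper
# uses random dyadic tilings), nearest-neighbour sub-tours, concatenation.
import math
import random

def nearest_neighbour_tour(points):
    if not points:
        return []
    tour = [points[0]]
    remaining = set(points[1:])
    while remaining:
        last = tour[-1]
        nxt = min(remaining, key=lambda p: math.dist(last, p))
        remaining.remove(nxt)
        tour.append(nxt)
    return tour

def hierarchical_tour(points, grid=2):
    tiles = {}
    for p in points:
        key = (min(int(p[0] * grid), grid - 1),
               min(int(p[1] * grid), grid - 1))
        tiles.setdefault(key, []).append(p)
    # Visit tiles in a fixed boustrophedon order and chain the sub-tours.
    tour = []
    for tx in range(grid):
        ys = range(grid) if tx % 2 == 0 else reversed(range(grid))
        for ty in ys:
            tour.extend(nearest_neighbour_tour(tiles.get((tx, ty), [])))
    return tour

rng = random.Random(7)
pts = [(rng.random(), rng.random()) for _ in range(200)]
tour = hierarchical_tour(pts)
length = sum(math.dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
print(f"{len(tour)} points, open tour length = {length:.2f}")
```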

  6. An ILP based memetic algorithm for finding minimum positive influence dominating sets in social networks

    Science.gov (United States)

    Lin, Geng; Guan, Jian; Feng, Huibin

    2018-06-01

    The positive influence dominating set problem is a variant of the minimum dominating set problem, and has lots of applications in social networks. It is NP-hard, and receives more and more attention. Various methods have been proposed to solve the positive influence dominating set problem. However, most of the existing work focused on greedy algorithms, and the solution quality needs to be improved. In this paper, we formulate the minimum positive influence dominating set problem as an integer linear programming (ILP), and propose an ILP based memetic algorithm (ILPMA) for solving the problem. The ILPMA integrates a greedy randomized adaptive construction procedure, a crossover operator, a repair operator, and a tabu search procedure. The performance of ILPMA is validated on nine real-world social networks with nodes up to 36,692. The results show that ILPMA significantly improves the solution quality, and is robust.
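    The problem the ILPMA solves can be illustrated with the greedy baseline it improves upon: grow a set of nodes until every node has at least half of its neighbours (rounded up) inside the set. The small graph and the specific greedy rule below are illustrative assumptions, not the paper's algorithm or benchmark networks.

```python
# Greedy sketch of a positive influence dominating set (PIDS): repeatedly
# add the node that helps the most still-uncovered neighbours, until every
# node has at least ceil(deg/2) selected neighbours. Toy graph, not the
# paper's memetic algorithm.
import math

def greedy_pids(adj):
    need = {v: math.ceil(len(adj[v]) / 2) for v in adj}
    chosen = set()

    def deficit(v):
        return max(0, need[v] - sum(1 for u in adj[v] if u in chosen))

    while any(deficit(v) > 0 for v in adj):
        # Gain of adding w: number of neighbours that still lack coverage.
        best = max((v for v in adj if v not in chosen),
                   key=lambda w: sum(1 for u in adj[w] if deficit(u) > 0))
        chosen.add(best)
    return chosen

adj = {
    1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 5],
    4: [2, 5], 5: [3, 4],
}
pids = greedy_pids(adj)
print(sorted(pids))
```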

  7. Total Sulfur Amino Acid Requirements Are Not Altered in Children with Chronic Renal Insufficiency, but Minimum Methionine Needs Are Increased.

    Science.gov (United States)

    Elango, Rajavel; Humayun, Mohammad A; Turner, Justine M; Rafii, Mahroukh; Langos, Veronika; Ball, Ronald O; Pencharz, Paul B

    2017-10-01

    Background: The total sulfur amino acid (TSAA) and minimum Met requirements have been previously determined in healthy children. TSAA metabolism is altered in kidney disease. Whether TSAA requirements are altered in children with chronic renal insufficiency (CRI) is unknown. Objective: We sought to determine the TSAA (Met in the absence of Cys) requirements and minimum Met (in the presence of excess Cys) requirements in children with CRI. Methods: Five children (4 boys, 1 girl) aged 10 ± 2.6 y with CRI were randomly assigned to receive graded intakes of Met (0, 5, 10, 15, 25, and 35 mg·kg⁻¹·d⁻¹) with no Cys in the diet. Four of the children (3 boys, 1 girl) were then randomly assigned to receive graded dietary intakes of Met (0, 2.5, 5, 7.5, 10, and 15 mg·kg⁻¹·d⁻¹) with 21 mg·kg⁻¹·d⁻¹ Cys. The mean TSAA and minimum Met requirements were determined by measuring the oxidation of L-[1-¹³C]Phe to ¹³CO₂ (F¹³CO₂). A 2-phase linear-regression crossover analysis of the F¹³CO₂ data identified a breakpoint at minimal F¹³CO₂. Urine samples collected from all study days and from previous studies of healthy children were measured for sulfur metabolites. Results: The mean and population-safe (upper 95% CI) intakes of TSAA and minimum Met in children with CRI were determined to be 12.6 and 15.9 mg·kg⁻¹·d⁻¹ and 7.3 and 10.9 mg·kg⁻¹·d⁻¹, respectively. In healthy school-aged children the mean and upper 95% CI intakes of TSAA and minimum Met were determined to be 12.9 and 17.2 mg·kg⁻¹·d⁻¹ and 5.8 and 7.3 mg·kg⁻¹·d⁻¹, respectively. A comparison of the minimum Met requirements between healthy children and children with CRI indicated significant (P < 0.05) differences. Conclusion: These results suggest that children with CRI have a mean and population-safe TSAA requirement similar to that of healthy children, suggesting adequate Cys synthesis via transsulfuration, but a higher minimum Met requirement, suggesting reduced
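    The 2-phase linear-regression breakpoint analysis used above can be sketched as a grid search: try every candidate breakpoint and keep the one that minimises the combined squared error of the two line fits. The intake/oxidation numbers below are synthetic illustrations, not the study's F¹³CO₂ measurements.

```python
# Sketch of two-phase linear-regression breakpoint estimation on synthetic
# data: oxidation falls with intake, then plateaus once the requirement is
# met. Not the study's data or exact statistical procedure.
def line_sse(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def find_breakpoint(xs, ys):
    best = None
    for k in range(2, len(xs) - 1):  # each segment needs >= 2 points
        sse = line_sse(xs[:k], ys[:k]) + line_sse(xs[k:], ys[k:])
        if best is None or sse < best[0]:
            best = (sse, xs[k])
    return best[1]

intake = [0, 5, 10, 15, 25, 35]   # mg/kg/d, hypothetical
f13co2 = [30, 20, 10, 5, 5, 5]    # oxidation, hypothetical units
print("estimated breakpoint near intake =", find_breakpoint(intake, f13co2))
```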

  8. New Minimum Wage Research: A Symposium.

    Science.gov (United States)

    Ehrenberg, Ronald G.; And Others

    1992-01-01

    Includes "Introduction" (Ehrenberg); "Effect of the Minimum Wage [MW] on the Fast-Food Industry" (Katz, Krueger); "Using Regional Variation in Wages to Measure Effects of the Federal MW" (Card); "Do MWs Reduce Employment?" (Card); "Employment Effects of Minimum and Subminimum Wages" (Neumark,…

  9. Teaching the Minimum Wage in Econ 101 in Light of the New Economics of the Minimum Wage.

    Science.gov (United States)

    Krueger, Alan B.

    2001-01-01

    Argues that the recent controversy over the effect of the minimum wage on employment offers an opportunity for teaching introductory economics. Examines eight textbooks to determine topic coverage but finds little consensus. Describes how minimum wage effects should be taught. (RLH)

  10. Probability on graphs random processes on graphs and lattices

    CERN Document Server

    Grimmett, Geoffrey

    2018-01-01

    This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.

  11. A cluster-randomized trial of provider-initiated (opt-out) HIV counseling and testing of tuberculosis patients in South Africa.

    Science.gov (United States)

    Pope, Diana S; Deluca, Andrea N; Kali, Paula; Hausler, Harry; Sheard, Carol; Hoosain, Ebrahim; Chaudhary, Mohammad A; Celentano, David D; Chaisson, Richard E

    2008-06-01

    To determine whether implementation of provider-initiated human immunodeficiency virus (HIV) counseling would increase the proportion of tuberculosis (TB) patients who received HIV counseling and testing. Cluster-randomized trial with clinic as the unit of randomization. Twenty medium-sized primary care TB clinics in the Nelson Mandela Metropolitan Municipality, Port Elizabeth, Eastern Cape Province, South Africa. A total of 754 adults (18 years and older) newly registered as TB patients in the 20 study clinics. Implementation of provider-initiated HIV counseling and testing. Primary outcome: percentage of TB patients HIV counseled and tested. Secondary outcomes: percentage of patients with a positive HIV test, and percentage of those who received cotrimoxazole and who were referred for HIV care. A total of 754 adults newly registered as TB patients were enrolled. In clinics randomly assigned to implement provider-initiated HIV counseling and testing, 20.7% (73/352) of patients were counseled versus 7.7% (31/402) in the control clinics (P = 0.011), and 20.2% (n = 71) versus 6.5% (n = 26) underwent HIV testing (P = 0.009). Of those patients counseled, 97% in the intervention clinics accepted testing versus 79% in control clinics (P = 0.12). The proportion of patients identified as HIV infected in intervention clinics was 8.5% versus 2.5% in control clinics (P = 0.044). Fewer than 40% of patients with a positive HIV test were prescribed cotrimoxazole or referred for HIV care in either study arm. Provider-initiated HIV counseling significantly increased the proportion of adult TB patients who received HIV counseling and testing, but the magnitude of the effect was small. Additional interventions to optimize HIV testing for TB patients urgently need to be evaluated.

  12. 30 CFR 75.1431 - Minimum rope strength.

    Science.gov (United States)

    2010-07-01

    ..., including rotation resistant). For rope lengths less than 3,000 feet: Minimum Value = Static Load × (7.0 − 0.001L). For rope lengths 3,000 feet or greater: Minimum Value = Static Load × 4.0. (b) Friction drum ropes. For rope lengths less than 4,000 feet: Minimum Value = Static Load × (7.0 − 0.0005L). For rope lengths 4,000 feet...
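    The quoted formulas translate directly into code, with L the rope length in feet. Note that for drum ropes the two branches agree at L = 3,000 (7.0 − 3.0 = 4.0). The static-load and length values in the example are illustrative; the friction-drum factor for lengths of 4,000 feet or more is truncated in the excerpt above, so only the quoted branches are implemented.

```python
# Direct transcription of the rope-strength formulas quoted above.
# L is the rope length in feet; example inputs are illustrative.
def drum_rope_minimum(static_load, length_ft):
    # Drum ropes: < 3,000 ft uses 7.0 - 0.001*L; >= 3,000 ft uses 4.0.
    if length_ft < 3000:
        return static_load * (7.0 - 0.001 * length_ft)
    return static_load * 4.0

def friction_drum_minimum(static_load, length_ft):
    # Friction drum ropes, quoted branch only (< 4,000 ft).
    if length_ft >= 4000:
        raise NotImplementedError("factor for >= 4,000 ft not quoted above")
    return static_load * (7.0 - 0.0005 * length_ft)

print(drum_rope_minimum(10_000, 2000))      # static load * (7.0 - 2.0)
print(friction_drum_minimum(10_000, 2000))  # static load * (7.0 - 1.0)
```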

  13. A Randomized Comparative Study Evaluating Various Cough Stress Tests and 24-Hour Pad Test with Urodynamics in the Diagnosis of Stress Urinary Incontinence.

    Science.gov (United States)

    Henderson, Joseph W; Kane, Sarah M; Mangel, Jeffrey M; Kikano, Elias G; Garibay, Jorge A; Pollard, Robert R; Mahajan, Sangeeta T; Debanne, Sara M; Hijaz, Adonis K

    2018-06-01

    The cough stress test is a common and accepted tool to evaluate stress urinary incontinence, but there is no agreement on how the test should be performed. We assessed the diagnostic ability of different cough stress tests performed while varying patient position and bladder volume, using urodynamic stress urinary incontinence as the gold standard. The 24-hour pad test was also evaluated. We recruited women who presented to specialty outpatient clinics with the complaint of urinary incontinence and who were recommended to undergo urodynamic testing. A total of 140 patients were randomized to four cough stress test groups: group 1, a comfortably full bladder; group 2, an empty bladder; group 3, a bladder infused with 200 cc of saline; and group 4, a bladder filled to half functional capacity. The sequence of standing and sitting was randomly assigned. The groups were compared by 1-way ANOVA or the generalized Fisher exact test. The κ statistic was used to evaluate agreement between the sitting and standing positions. The 95% CIs of sensitivity and specificity were calculated using the Wilson method. ROC analysis was done to evaluate the performance of the 24-hour pad test. The cough stress test performed with a bladder filled to half functional capacity was the best performing test, with 83% sensitivity and 90% specificity. There was no statistically significant evidence that the sensitivity or specificity of one cough stress test differed from that of the others. The pad test had no significant predictive ability to diagnose urodynamic stress urinary incontinence (AUC 0.60, p = 0.08). Cough stress tests were accurate for diagnosing urodynamic stress urinary incontinence. The 24-hour pad test was not predictive of urodynamic stress urinary incontinence and not helpful when used in conjunction with the cough stress test.
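    The Wilson score interval mentioned above (used for the sensitivity and specificity CIs) is straightforward to compute. The counts in the example are hypothetical, chosen only to give a proportion near the reported 83% sensitivity; they are not the trial's data.

```python
# Wilson score confidence interval for a binomial proportion, the method
# the abstract cites for sensitivity/specificity CIs. Counts are
# hypothetical, not the trial's data.
import math

def wilson_interval(successes, n, z=1.96):
    phat = successes / n
    denom = 1 + z ** 2 / n
    centre = (phat + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(phat * (1 - phat) / n
                                   + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# e.g. a sensitivity near 83%, observed as 29 true positives out of 35
lo, hi = wilson_interval(29, 35)
print(f"sensitivity 95% CI: ({lo:.3f}, {hi:.3f})")
```

Unlike the naive Wald interval, the Wilson interval stays inside (0, 1) and behaves well for small samples, which is why it is often preferred for diagnostic-accuracy CIs.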

  14. Reduction of the number of parameters needed for a polynomial random regression test-day model

    NARCIS (Netherlands)

    Pool, M.H.; Meuwissen, T.H.E.

    2000-01-01

Legendre polynomials were used to describe the (co)variance matrix within a random regression test day model. The goodness of fit depended on the polynomial order of fit, i.e., the number of parameters to be estimated per animal, but is limited by computing capacity. Two aspects: incomplete lactation …
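The Legendre covariables used in such random regression test-day models can be evaluated with NumPy. A sketch under stated assumptions: the standardization of days in milk to [-1, 1] over a 5–305 day range is a common convention assumed here, not a detail taken from this paper:

```python
import numpy as np
from numpy.polynomial.legendre import legvander

def legendre_basis(dim, order, dim_min=5, dim_max=305):
    """Legendre polynomial covariables P_0..P_order for days in milk (dim),
    standardized to [-1, 1] as is usual in random regression test-day models.
    The 5-305 day range is an assumed convention, not from the paper."""
    x = 2 * (np.asarray(dim, float) - dim_min) / (dim_max - dim_min) - 1
    # One row per test day, one column per polynomial degree
    return legvander(x, order)

Phi = legendre_basis([5, 155, 305], order=2)
```

A higher `order` adds one column (one parameter per animal) per degree, which is exactly the model-size vs. goodness-of-fit trade-off the abstract describes.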

  15. Minimum deterrence and regional security. Section 2. Other regions

    International Nuclear Information System (INIS)

    Azikiwe, A.E.

    1993-01-01

Compared to the European political and security circumstances, minimum deterrence is less of an illusion in other regions where weapon-free zones already exist. It will continue to be relevant to the security of other regions. Strategic arms limitation should be pursued vigorously in a constructive and pragmatic manner, bearing in mind the need to readjust to new global challenges. The Comprehensive Test Ban Treaty is the linchpin on which the Non-proliferation Treaty rests.

  16. Features of Random Metal Nanowire Networks with Application in Transparent Conducting Electrodes

    KAUST Repository

    Maloth, Thirupathi

    2017-01-01

    in terms of sheet resistance and optical transmittance. However, as the electrical properties of such random networks are achieved thanks to a percolation network, a minimum size of the electrodes is needed so it actually exceeds the representative volume

  17. 30 CFR 281.30 - Minimum royalty.

    Science.gov (United States)

    2010-07-01

30 CFR 281.30 (2010-07-01 edition), Mineral Resources: MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR, OFFSHORE LEASING OF MINERALS OTHER THAN OIL, GAS, AND SULPHUR IN THE OUTER CONTINENTAL SHELF, Financial Considerations, § 281.30 Minimum royalty...

  18. Comparative Evaluations of Randomly Selected Four Point-of-Care Glucometer Devices in Addis Ababa, Ethiopia.

    Science.gov (United States)

    Wolde, Mistire; Tarekegn, Getahun; Kebede, Tedla

    2018-05-01

Point-of-care glucometer (PoCG) devices play a significant role in self-monitoring of the blood sugar level, particularly in the follow-up of high blood sugar therapeutic response. The aim of this study was to evaluate blood glucose test results obtained with four randomly selected glucometers on subjects with diabetes and control subjects versus the standard wet chemistry (hexokinase) method in Addis Ababa, Ethiopia. A prospective cross-sectional study was conducted on 200 randomly selected study participants (100 participants with diabetes and 100 healthy controls). Four randomly selected PoCG devices (CareSens N, DIAVUE Prudential, On Call Extra, i-QARE DS-W) were evaluated against the hexokinase method and the ISO 15197:2003 and ISO 15197:2013 standards. The minimum and maximum blood sugar values were recorded by CareSens N (21 mg/dl) and the hexokinase method (498.8 mg/dl), respectively. The mean sugar values of all PoCG devices except On Call Extra showed significant differences compared with the reference hexokinase method. Meanwhile, all four PoCG devices had a strong positive relationship (>80%) with the reference method (hexokinase). On the other hand, none of the four PoCG devices fulfilled the minimum accuracy requirements set by the ISO 15197:2003 and ISO 15197:2013 standards. In addition, the linear regression analysis revealed that all four selected PoCG devices overestimated the glucose concentrations. Overall, the measurements of the four selected PoCG devices correlated poorly with the standard reference method. Therefore, before introducing PoCG devices to the market, there should be a standardized evaluation platform for validation. Further similar large-scale studies on other PoCG devices also need to be undertaken.
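The ISO 15197:2013 accuracy criterion referred to above can be checked with a short sketch. The limits used here (at least 95% of meter readings within ±15 mg/dl of the reference below 100 mg/dl, otherwise within ±15%) are quoted from the published standard and should be verified against the current edition; the example readings are invented:

```python
import numpy as np

def iso15197_2013_pass(meter, reference):
    """Fraction of meter readings within the ISO 15197:2013 accuracy limits
    (assumed here: within ±15 mg/dl when the reference is below 100 mg/dl,
    otherwise within ±15%) and whether the >=95% criterion is met.
    Verify the limits against the current edition of the standard."""
    meter = np.asarray(meter, float)
    reference = np.asarray(reference, float)
    tol = np.where(reference < 100, 15.0, 0.15 * reference)
    within = np.abs(meter - reference) <= tol
    frac = within.mean()
    return frac, frac >= 0.95

# Invented example: the third reading is 40 mg/dl off a 210 mg/dl reference
frac, ok = iso15197_2013_pass([92, 110, 250], [80, 100, 210])
```

In practice the standard requires 100 paired measurements per lot spread over the glucose range; this sketch only evaluates the per-reading tolerance rule.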

  19. The graph-theoretic minimum energy path problem for ionic conduction

    Directory of Open Access Journals (Sweden)

    Ippei Kishida

    2015-10-01

A new computational method was developed to analyze the ionic conduction mechanism in crystals through graph theory. The graph was organized into nodes, which represent the crystal structures modeled by ionic site occupation, and edges, which represent structure transitions via ionic jumps. We proposed a minimum energy path problem, which is similar to the shortest path problem. An effective algorithm to solve the problem was established. Since our method uses neither a randomized algorithm nor time parameters, the computational cost to analyze conduction paths and a migration energy is very low. The power of the method was verified by applying it to α-AgI, and the ionic conduction mechanism in α-AgI was revealed. The analysis using single point calculations found the minimum energy path for long-distance ionic conduction, which consists of 12 steps of ionic jumps in a unit cell. From these results, the detailed theoretical migration energy was calculated as 0.11 eV by geometry optimization and the nudged elastic band method. Our method can refine candidates for possible jumps in crystals and can be adapted to other computational methods, such as the nudged elastic band method. We expect that our method will be a powerful tool for analyzing ionic conduction mechanisms, even for large complex crystals.
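A minimum energy path in this sense minimizes the highest barrier along a path rather than the sum of edge weights, i.e. a minimax (bottleneck) path problem. The sketch below is a generic modified-Dijkstra solution to that problem, not the authors' implementation, and the toy energy landscape is invented:

```python
import heapq

def min_energy_path(graph, src, dst):
    """Minimax path: among all src->dst paths, minimize the highest edge
    energy along the path (a modified Dijkstra search). `graph` maps
    node -> list of (neighbor, energy) pairs."""
    best = {src: 0.0}                      # lowest achievable max-energy so far
    heap = [(0.0, src, [src])]
    while heap:
        emax, u, path = heapq.heappop(heap)
        if u == dst:
            return emax, path
        if emax > best.get(u, float("inf")):
            continue                       # stale heap entry
        for v, e in graph.get(u, []):
            cand = max(emax, e)            # barrier of the extended path
            if cand < best.get(v, float("inf")):
                best[v] = cand
                heapq.heappush(heap, (cand, v, path + [v]))
    return float("inf"), []

# Toy energy landscape (invented): two routes from A to D
g = {"A": [("B", 0.3), ("C", 0.05)],
     "B": [("D", 0.11)],
     "C": [("D", 0.5)]}
emax, path = min_energy_path(g, "A", "D")  # A-B-D, barrier 0.3 eV
```

The direct analogue in the paper is that each edge energy would come from a single-point calculation of the jump barrier, and the returned `emax` corresponds to the migration energy of the conduction path.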

  20. State cigarette minimum price laws - United States, 2009.

    Science.gov (United States)

    2010-04-09

    Cigarette price increases reduce the demand for cigarettes and thereby reduce smoking prevalence, cigarette consumption, and youth initiation of smoking. Excise tax increases are the most effective government intervention to increase the price of cigarettes, but cigarette manufacturers use trade discounts, coupons, and other promotions to counteract the effects of these tax increases and appeal to price-sensitive smokers. State cigarette minimum price laws, initiated by states in the 1940s and 1950s to protect tobacco retailers from predatory business practices, typically require a minimum percentage markup to be added to the wholesale and/or retail price. If a statute prohibits trade discounts from the minimum price calculation, these laws have the potential to counteract discounting by cigarette manufacturers. To assess the status of cigarette minimum price laws in the United States, CDC surveyed state statutes and identified those states with minimum price laws in effect as of December 31, 2009. This report summarizes the results of that survey, which determined that 25 states had minimum price laws for cigarettes (median wholesale markup: 4.00%; median retail markup: 8.00%), and seven of those states also expressly prohibited the use of trade discounts in the minimum retail price calculation. Minimum price laws can help prevent trade discounting from eroding the positive effects of state excise tax increases and higher cigarette prices on public health.

  1. Three Phase Power Imbalance Decomposition into Systematic Imbalance and Random Imbalance

    DEFF Research Database (Denmark)

    Kong, Wangwei; Ma, Kang; Wu, Qiuwei

    2017-01-01

    Uneven load allocations and random load behaviors are two major causes for three-phase power imbalance. The former mainly cause systematic imbalance, which can be addressed by low-cost phase swapping; the latter contribute to random imbalance, which requires relatively costly demand...... minimum phase, or both. Then, this paper proposes a new method to decompose three-phase power series into a systematic imbalance component and a random imbalance component as the closed-form solutions of quadratic optimization models that minimize random imbalance. A degree of power imbalance...... is calculated based on the systematic imbalance component to guide phase swapping. Case studies demonstrate that 72.8% of 782 low voltage substations have systematic imbalance components. The degree of power imbalance results reveal the maximum need for phase swapping and the random imbalance components reveal...
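As a rough illustration of the idea of splitting persistent from fluctuating imbalance, the sketch below decomposes each phase series into its time average plus a zero-mean residual. This is NOT the paper's closed-form quadratic-optimization solution, only a crude stand-in for the concept:

```python
import numpy as np

def decompose_imbalance(p):
    """Crude decomposition of three-phase power series (shape 3 x T) into a
    constant systematic component (per-phase time averages) plus a zero-mean
    random component. Not the paper's closed-form quadratic-optimization
    solution; only illustrates separating persistent from fluctuating
    imbalance."""
    p = np.asarray(p, float)
    systematic = p.mean(axis=1, keepdims=True)   # persistent per-phase level
    rand = p - systematic                        # zero-mean fluctuation
    return systematic, rand

# Invented 3-phase demand series (kW), three time steps
p = np.array([[10.0, 12.0, 11.0],
              [ 6.0,  5.0,  7.0],
              [ 9.0,  9.0,  9.0]])
sys_c, rand_c = decompose_imbalance(p)
```

In the paper's terms, a degree of imbalance computed on the systematic component would indicate how much phase swapping can help, while the residual reflects what only demand-side measures could address.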

  2. Random close packing of hard spheres and disks

    International Nuclear Information System (INIS)

    Berryman, J.G.

    1983-01-01

A simple definition of random close packing of hard spheres is presented, and the consequences of this definition are explored. According to this definition, random close packing occurs at the minimum packing fraction η for which the median nearest-neighbor radius equals the diameter of the spheres. Using the radial distribution function at more dilute concentrations to estimate median nearest-neighbor radii, lower bounds on the critical packing fraction η_RCP are obtained and the value of η_RCP is estimated by extrapolation. Random close packing is predicted to occur for η_RCP = 0.64 ± 0.02 in three dimensions and η_RCP = 0.82 ± 0.02 in two dimensions. Both of these predictions are shown to be consistent with the available experimental data.
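The definition can be explored numerically: place random points at a given packing fraction and compute the median nearest-neighbor distance. The sketch below uses Poisson (ideal-gas) points, i.e. the dilute limit from which the paper extrapolates; all parameter choices are illustrative:

```python
import numpy as np

def median_nn_distance(n, eta, d=3, seed=0):
    """Median nearest-neighbor distance for n random points in a periodic
    box sized so that n unit-diameter spheres would give packing fraction
    eta. Poisson points, i.e. the dilute limit used for extrapolation in
    the paper; not a hard-sphere configuration."""
    rng = np.random.default_rng(seed)
    sphere_vol = np.pi / 6 if d == 3 else np.pi / 4   # unit-diameter sphere/disk
    L = (n * sphere_vol / eta) ** (1.0 / d)
    pts = rng.random((n, d)) * L
    diff = pts[:, None, :] - pts[None, :, :]
    diff -= L * np.round(diff / L)                    # minimum image convention
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)                    # ignore self-distances
    return np.median(dist.min(axis=1))

m = median_nn_distance(500, eta=0.64)
```

At the paper's predicted η_RCP = 0.64 the Poisson median nearest-neighbor distance is still well below the unit diameter, which is exactly why excluded-volume correlations (via the radial distribution function) are needed before the criterion is met.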

  3. A cluster randomized trial of provider-initiated (Opt-out) HIV counseling and testing of tuberculosis patients in South Africa

    Science.gov (United States)

    Pope, Diana S.; DeLuca, Andrea N.; Kali, Paula; Hausler, Harry; Sheard, Carol; Hoosain, Ebrahim; Chaudhary, Mohammed A.; Celentano, David D.; Chaisson, Richard E.

    2008-01-01

Objective: To determine whether implementation of provider-initiated HIV counseling would increase the proportion of tuberculosis patients that received HIV counseling and testing. Design: Cluster-randomized trial with clinic as the unit of randomization. Setting: Twenty medium-sized primary care TB clinics in the Nelson Mandela Metropolitan Municipality, Port Elizabeth, Eastern Cape Province, South Africa. Subjects: A total of 754 adults (≥ 18 years) newly registered as tuberculosis patients at the twenty study clinics. Intervention: Implementation of provider-initiated HIV counseling and testing. Main outcome measures: Percentage of TB patients HIV counseled and tested; secondarily, percentage of patients testing HIV positive and percentage of those who received cotrimoxazole and were referred for HIV care. Results: A total of 754 adults newly registered as tuberculosis patients were enrolled. In clinics randomly assigned to implement provider-initiated HIV counseling and testing, 20.7% (73/352) of patients were counseled versus 7.7% (31/402) in the control clinics (p = 0.011), and 20.2% (n = 71) versus 6.5% (n = 26) underwent HIV testing (p = 0.009). Of those patients counseled, 97% in the intervention clinics accepted testing versus 79% in control clinics (p = 0.12). The proportion of patients identified as HIV-infected in intervention clinics was 8.5% versus 2.5% in control clinics (p = 0.044). Fewer than 40% of patients with a positive HIV test were prescribed cotrimoxazole or referred for HIV care in either study arm. Conclusions: Provider-initiated HIV counseling significantly increased the proportion of adult TB patients that received HIV counseling and testing, but the magnitude of the effect was small. Additional interventions to optimize HIV testing for TB patients urgently need to be evaluated. PMID:18520677

  4. Minimum acceptable face velocities of laboratory fume hoods and guidelines for their classification

    International Nuclear Information System (INIS)

    Bolton, N.E.; Porter, W.E.; Alcorn, S.P.; Everett, W.S.; Hunt, J.B.; Morehead, J.F.; Higdon, H.F.

    1978-06-01

Data developed to support a 100 LFM minimum face velocity requirement for laboratory fume hoods are summarized. Also included are a description of the Y-12 test hood and guidelines for a hood classification scheme.

  5. Another Look at the Draft Mil-Std-1540E Unit Random Vibration Test Requirements

    Science.gov (United States)

Perl, E.; Peterson, A. J.; Davis, D.

    2012-07-01

    The draft Mil-Std-1540E has been updated to reflect lessons learned since its publication as an SMC Standard in 2008, [1], and an earlier Aerospace Corporation Technical Report released in 2006, [2]. This paper discusses the technical rationale supporting some of the unit random vibration test requirements to provide better insight into their derivation and application to programs. It is intended that these requirements be tailored for each program to reflect the customer risk profile. Several tailoring options are provided and a two phase test strategy is discussed to highlight its applicability to utilizing heritage hardware in new applications.

  6. Quantum random flip-flop and its applications in random frequency synthesis and true random number generation

    Energy Technology Data Exchange (ETDEWEB)

    Stipčević, Mario, E-mail: mario.stipcevic@irb.hr [Photonics and Quantum Optics Research Unit, Center of Excellence for Advanced Materials and Sensing Devices, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia)

    2016-03-15

    In this work, a new type of elementary logic circuit, named random flip-flop (RFF), is proposed, experimentally realized, and studied. Unlike conventional Boolean logic circuits whose action is deterministic and highly reproducible, the action of a RFF is intentionally made maximally unpredictable and, in the proposed realization, derived from a fundamentally random process of emission and detection of light quanta. We demonstrate novel applications of RFF in randomness preserving frequency division, random frequency synthesis, and random number generation. Possible usages of these applications in the information and communication technology, cryptographic hardware, and testing equipment are discussed.
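A toy software model conveys the randomness-preserving frequency division: on each input clock edge the output toggles with probability 1/2, halving the frequency on average while keeping the edge timing unpredictable. This is only a behavioral sketch, not the photonic circuit described in the paper:

```python
import random

def random_flipflop(clock_edges, seed=42):
    """Toy random flip-flop (RFF): on each input clock edge the output
    toggles with probability 1/2, so it divides the input frequency by two
    on average while the exact edge timing stays unpredictable.
    Behavioral sketch only; the paper's RFF derives its randomness from
    detection of light quanta, not a software PRNG."""
    rng = random.Random(seed)
    out, state = [], 0
    for _ in range(clock_edges):
        if rng.random() < 0.5:
            state ^= 1
        out.append(state)
    return out

bits = random_flipflop(10000)
# The output toggles on roughly half of the 9999 adjacent clock edges
toggles = sum(a != b for a, b in zip(bits, bits[1:]))
```

Chaining k such stages gives random division by 2^k on average, which is the building block of the random frequency synthesis application mentioned above.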

  7. The impact of a minimum pension on old age poverty and its budgetary cost. Evidence from Latin America

    Directory of Open Access Journals (Sweden)

    Jean-Jacques Dethier

    2011-12-01

This paper examines the impact on old age poverty and the fiscal cost of universal minimum old-age pensions in Latin America using recent household survey data for 18 countries. Alleviating old age poverty requires a different approach from other age groups and a minimum pension is likely to be the only alternative available. First, we measure old age poverty rates for all countries. Second, we discuss the design of minimum pension schemes, means-tested or not, as well as the disincentive effects that they are expected to have on the economic and social behavior of households, including labor supply, saving and family solidarity. Third, we use the household surveys to simulate the fiscal cost and the impact on poverty rates of alternative minimum pension schemes in the 18 countries. We show that a universal minimum pension would substantially reduce poverty among the elderly except in Argentina, Brazil, Chile and Uruguay, where minimum pension systems already exist and poverty rates are low. Such schemes have much to be commended in terms of incentives, spillover effects and administrative simplicity, but have a high fiscal cost. The latter is a function of the age at which benefits are awarded, the prevailing longevity, the generosity of benefits, the efficacy of means testing, and naturally the fiscal capacity of the country.

  8. Assessing the impacts of Saskatchewan's minimum alcohol pricing regulations on alcohol-related crime.

    Science.gov (United States)

    Stockwell, Tim; Zhao, Jinhui; Sherk, Adam; Callaghan, Russell C; Macdonald, Scott; Gatley, Jodi

    2017-07-01

    Saskatchewan's introduction in April 2010 of minimum prices graded by alcohol strength led to an average minimum price increase of 9.1% per Canadian standard drink (=13.45 g ethanol). This increase was shown to be associated with reduced consumption and switching to lower alcohol content beverages. Police also informally reported marked reductions in night-time alcohol-related crime. This study aims to assess the impacts of changes to Saskatchewan's minimum alcohol-pricing regulations between 2008 and 2012 on selected crime events often related to alcohol use. Data were obtained from Canada's Uniform Crime Reporting Survey. Auto-regressive integrated moving average time series models were used to test immediate and lagged associations between minimum price increases and rates of night-time and police identified alcohol-related crimes. Controls were included for simultaneous crime rates in the neighbouring province of Alberta, economic variables, linear trend, seasonality and autoregressive and/or moving-average effects. The introduction of increased minimum-alcohol prices was associated with an abrupt decrease in night-time alcohol-related traffic offences for men (-8.0%, P prices may contribute to reductions in alcohol-related traffic-related and violent crimes perpetrated by men. Observed lagged effects for violent incidents may be due to a delay in bars passing on increased prices to their customers, perhaps because of inventory stockpiling. [Stockwell T, Zhao J, Sherk A, Callaghan RC, Macdonald S, Gatley J. Assessing the impacts of Saskatchewan's minimum alcohol pricing regulations on alcohol-related crime. Drug Alcohol Rev 2017;36:492-501]. © 2016 Australasian Professional Society on Alcohol and other Drugs.

  9. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications
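The uniform random polygon model itself is easy to sketch: vertices are i.i.d. uniform points in the unit cube joined in order and closed up, and the crossing number of a planar projection can be counted with pairwise segment tests. An illustrative implementation, not the authors' code:

```python
import numpy as np

def uniform_random_polygon(n, seed=1):
    """n vertices i.i.d. uniform in the unit cube, joined in order and
    closed up: the uniform random polygon (URP) model."""
    return np.random.default_rng(seed).random((n, 3))

def _segments_cross(p, q, r, s):
    """True if the 2D segments pq and rs cross properly (interiors meet)."""
    def orient(a, b, c):
        return np.sign((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))
    return (orient(p, q, r) * orient(p, q, s) < 0 and
            orient(r, s, p) * orient(r, s, q) < 0)

def crossing_number(poly):
    """Crossings in the xy-projection of the closed polygon; the paper
    reports this grows on the order of n^2 on average."""
    n = len(poly)
    edge = [(poly[i, :2], poly[(i + 1) % n, :2]) for i in range(n)]
    count = 0
    for i in range(n):
        for j in range(i + 2, n):          # skip consecutive edges
            if i == 0 and j == n - 1:
                continue                   # these share the closing vertex
            if _segments_cross(*edge[i], *edge[j]):
                count += 1
    return count

c40 = crossing_number(uniform_random_polygon(40))
```

Computing the knot determinant from such a diagram (via the Goeritz or coloring matrix) is substantially more involved and is not attempted here.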

  10. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}) . We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  11. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  12. Embedded Platform for Automatic Testing and Optimizing of FPGA Based Cryptographic True Random Number Generators

    Directory of Open Access Journals (Sweden)

    M. Varchola

    2009-12-01

This paper deals with an evaluation platform for cryptographic True Random Number Generators (TRNGs) based on the hardware implementation of statistical tests for FPGAs. It was developed in order to provide an automatic tool that helps to speed up the TRNG design process and can provide new insights into TRNG behavior, as will be shown on a particular example in the paper. It makes it possible to test the statistical properties of various TRNG designs under various working conditions on the fly. Moreover, the tests are suitable for embedding into cryptographic hardware products in order to recognize TRNG output of weak quality and thus increase robustness and reliability. The tests are fully compatible with the FIPS 140 standard and are implemented in the VHDL language as an IP core for vendor-independent FPGAs. A recent Flash-based Actel Fusion FPGA was chosen for preliminary experiments. The Actel version of the tests possesses an interface to Actel's CoreMP7 softcore processor, which is fully compatible with the industry-standard ARM7TDMI. Moreover, an identical test suite was implemented on the Xilinx Virtex 2 and 5 in order to compare the performance of the proposed solution with that of an already published one based on the same FPGAs. Clock frequencies 25% and 65% greater, respectively, were achieved while consuming almost equal resources of the Xilinx FPGAs. On top of that, the proposed FIPS 140 architecture is capable of processing one random bit per clock cycle, which results in a 311.5 Mbps throughput on the Virtex 5 FPGA.
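Among the FIPS 140 statistical tests implemented in hardware here is the monobit test. A software sketch is below; the pass bounds (9725 < ones < 10275 on a 20,000-bit block) are as published in FIPS 140-2 but should be treated as an assumption to check against the standard:

```python
import random

def fips140_monobit(bits):
    """FIPS 140-2 style monobit test on a 20,000-bit block: the number of
    ones must lie strictly between 9725 and 10275. Bounds as published in
    FIPS 140-2; verify against the standard before relying on them."""
    if len(bits) != 20000:
        raise ValueError("monobit test is defined on 20,000-bit blocks")
    ones = sum(bits)
    return 9725 < ones < 10275

rng = random.Random(7)
good = fips140_monobit([rng.randrange(2) for _ in range(20000)])
biased = fips140_monobit([1] * 12000 + [0] * 8000)   # 60% ones: should fail
```

The hardware version in the paper evaluates this and the other FIPS 140 tests (poker, runs, long run) continuously at one bit per clock cycle; a software model like this is useful only for cross-checking the RTL.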

  13. Minimum Price Guarantees In a Consumer Search Model

    NARCIS (Netherlands)

    M.C.W. Janssen (Maarten); A. Parakhonyak (Alexei)

    2009-01-01

This paper is the first to examine the effect of minimum price guarantees in a sequential search model. Minimum price guarantees are not advertised and only known to consumers when they come to the shop. We show that in such an environment, minimum price guarantees increase the value of …

  14. The effectiveness of educational interventions to enhance the adoption of fee-based arsenic testing in Bangladesh: a cluster randomized controlled trial.

    Science.gov (United States)

    George, Christine Marie; Inauen, Jennifer; Rahman, Sheikh Masudur; Zheng, Yan

    2013-07-01

Arsenic (As) testing could help 22 million people, using drinking water sources that exceed the Bangladesh As standard, to identify safe sources. A cluster randomized controlled trial was conducted to evaluate the effectiveness of household education and local media in increasing demand for fee-based As testing. Randomly selected households (N = 452) were divided into three interventions implemented by community workers: 1) fee-based As testing with household education (HE); 2) fee-based As testing with household education and a local media campaign (HELM); and 3) fee-based As testing alone (Control). The fee for the As test was US$ 0.28, higher than the cost of the test (US$ 0.16). Of households with untested wells, 93% in both intervention groups HE and HELM purchased an As test, whereas only 53% in the control group did. In conclusion, fee-based As testing with household education is effective in increasing demand for As testing in rural Bangladesh.

  15. Testing Self-Similarity Through Lamperti Transformations

    KAUST Repository

    Lee, Myoungji

    2016-07-14

    Self-similar processes have been widely used in modeling real-world phenomena occurring in environmetrics, network traffic, image processing, and stock pricing, to name but a few. The estimation of the degree of self-similarity has been studied extensively, while statistical tests for self-similarity are scarce and limited to processes indexed in one dimension. This paper proposes a statistical hypothesis test procedure for self-similarity of a stochastic process indexed in one dimension and multi-self-similarity for a random field indexed in higher dimensions. If self-similarity is not rejected, our test provides a set of estimated self-similarity indexes. The key is to test stationarity of the inverse Lamperti transformations of the process. The inverse Lamperti transformation of a self-similar process is a strongly stationary process, revealing a theoretical connection between the two processes. To demonstrate the capability of our test, we test self-similarity of fractional Brownian motions and sheets, their time deformations and mixtures with Gaussian white noise, and the generalized Cauchy family. We also apply the self-similarity test to real data: annual minimum water levels of the Nile River, network traffic records, and surface heights of food wrappings. © 2016, International Biometric Society.
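The key idea, that the inverse Lamperti transform Y(t) = e^{-Ht} X(e^t) of an H-self-similar process X is strongly stationary, can be illustrated numerically for Brownian motion (H = 1/2), whose transform has constant unit variance. A numerical sketch, not the paper's test procedure:

```python
import numpy as np

def inverse_lamperti(X, times, H):
    """Inverse Lamperti transform Y(t) = e^{-Ht} X(e^t): if X is
    H-self-similar, Y is strongly stationary in the log-time variable,
    which is the property the paper's test checks."""
    t = np.log(times)
    return t, np.exp(-H * t) * X

# Brownian motion (H = 1/2) sampled on an exponential time grid, many paths
rng = np.random.default_rng(0)
times = np.exp(np.linspace(0.0, 4.0, 41))            # physical times in [1, e^4]
dt = np.diff(np.concatenate(([0.0], times)))
paths = np.cumsum(rng.normal(0, np.sqrt(dt), size=(5000, 41)), axis=1)
t, Y = inverse_lamperti(paths, times, H=0.5)
var_profile = Y.var(axis=0)   # ~1 at every log-time if Y is stationary
```

The paper's procedure goes further: it tests stationarity of the transformed process formally and, when self-similarity is not rejected, returns the set of H values for which the transform looks stationary.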

  16. Wage inequality, minimum wage effects and spillovers

    OpenAIRE

    Stewart, Mark B.

    2011-01-01

    This paper investigates possible spillover effects of the UK minimum wage. The halt in the growth in inequality in the lower half of the wage distribution (as measured by the 50:10 percentile ratio) since the mid-1990s, in contrast to the continued inequality growth in the upper half of the distribution, suggests the possibility of a minimum wage effect and spillover effects on wages above the minimum. This paper analyses individual wage changes, using both a difference-in-differences estimat...

  17. Greedy Local Search and Vertex Cover in Sparse Random Graphs

    DEFF Research Database (Denmark)

    Witt, Carsten

    2009-01-01

Recently, various randomized search heuristics have been studied for the solution of the minimum vertex cover problem, in particular for sparse random instances according to the G(n, c/n) model, where c > 0 is a constant. Methods from statistical physics suggest that the problem is easy if c … This work starts with a rigorous explanation for this claim based on the refined analysis of the Karp-Sipser algorithm by Aronson et al. Subsequently, theoretical supplements are given to experimental studies of search heuristics on random graphs. For c < 1, a greedy and randomized local-search heuristic finds an optimal cover in polynomial time with a probability arbitrarily close to 1. This behavior relies on the absence of a giant component. As an additional insight into the randomized search, it is shown that the heuristic fails badly also on graphs consisting of a single tree component of maximum …
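A generic randomized heuristic of the kind analyzed in such studies is the maximal-matching rule: repeatedly pick an uncovered edge and add both endpoints, which always yields a cover at most twice the minimum. This is a stand-in sketch, not the specific greedy local-search heuristic of the paper:

```python
import random

def greedy_vertex_cover(edges, seed=0):
    """Randomized maximal-matching heuristic for vertex cover: scan the
    edges in random order and, for each still-uncovered edge, add both
    endpoints. The result is a valid cover of size at most twice the
    minimum. A generic stand-in, not the paper's heuristic."""
    rng = random.Random(seed)
    edges = list(edges)
    rng.shuffle(edges)
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Path graph 0-1-2-3: the minimum cover has size 2 ({1, 2})
cover = greedy_vertex_cover([(0, 1), (1, 2), (2, 3)])
```

On sparse G(n, c/n) instances without a giant component, refinements of such greedy rules (as in the Karp-Sipser analysis cited above) typically recover an optimal cover rather than just a 2-approximation.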

  18. Establishment of Grain Farmers' Supply Response Model and Empirical Analysis under Minimum Grain Purchase Price Policy

    OpenAIRE

    Zhang, Shuang

    2012-01-01

Based on farmers' supply behavior theory and price expectations theory, this paper establishes a grain farmers' supply response model for two major grain varieties (early indica rice and mixed wheat) in the major producing areas, to test whether the minimum grain purchase price policy can have a price-oriented effect on grain production and supply in the major producing areas. Empirical analysis shows that the minimum purchase price published annually by the government has a significant positive imp...

  19. Minimum Variance Portfolios in the Brazilian Equity Market

    Directory of Open Access Journals (Sweden)

    Alexandre Rubesam

    2013-03-01

We investigate minimum variance portfolios in the Brazilian equity market using different methods to estimate the covariance matrix, from the simple model of using the sample covariance to multivariate GARCH models. We compare the performance of the minimum variance portfolios to those of the following benchmarks: (i) the IBOVESPA equity index, (ii) an equally-weighted portfolio, (iii) the maximum Sharpe ratio portfolio and (iv) the maximum growth portfolio. Our results show that the minimum variance portfolio has higher returns with lower risk compared to the benchmarks. We also consider long-short 130/30 minimum variance portfolios and obtain similar results. The minimum variance portfolio invests in relatively few stocks with low βs measured with respect to the IBOVESPA index, being easily replicable by individual and institutional investors alike.
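The unconstrained global minimum variance portfolio has the closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1). A sketch with an invented 3-asset covariance matrix; the paper's GARCH-based covariance estimators and long-short 130/30 constraints are not reproduced here:

```python
import numpy as np

def min_variance_weights(cov):
    """Global minimum variance portfolio: w = inv(S) 1 / (1' inv(S) 1),
    the unconstrained closed form. The constrained and 130/30 variants
    studied in the paper require a quadratic programming solver instead."""
    ones = np.ones(len(cov))
    w = np.linalg.solve(cov, ones)   # S^{-1} 1 without forming the inverse
    return w / w.sum()

# Invented 3-asset covariance matrix, for illustration only
cov = np.array([[0.040, 0.006, 0.002],
                [0.006, 0.090, 0.010],
                [0.002, 0.010, 0.160]])
w = min_variance_weights(cov)
```

At the optimum, Σw is proportional to the vector of ones (every asset contributes equally to marginal risk), which is a quick sanity check on any implementation.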

  20. Local Times of Galactic Cosmic Ray Intensity Maximum and Minimum in the Diurnal Variation

    Directory of Open Access Journals (Sweden)

    Su Yeon Oh

    2006-06-01

The diurnal variation of galactic cosmic ray (GCR) flux intensity observed by a ground Neutron Monitor (NM) shows a sinusoidal pattern with an amplitude of 1∼2% of the daily mean. We carried out a statistical study of tendencies in the local times of the GCR intensity daily maximum and minimum. To test the influences of the solar activity and the location (cut-off rigidity) on the distribution of the local times of maximum and minimum GCR intensity, we have examined the data of 1996 (solar minimum) and 2000 (solar maximum) at the low-latitude Haleakala (latitude: 20.72°N, cut-off rigidity: 12.91 GV) and the high-latitude Oulu (latitude: 65.05°N, cut-off rigidity: 0.81 GV) NM stations. The most frequent local times of the GCR intensity daily maximum and minimum come about 2∼3 hours later in the solar activity maximum year 2000 than in the solar activity minimum year 1996. Oulu NM station, whose cut-off rigidity is smaller, has the most frequent local times of the GCR intensity maximum and minimum later by 2∼3 hours than those of Haleakala station. This feature is more evident at the solar maximum. The phase of the daily variation in GCR is dependent upon the interplanetary magnetic field varying with the solar activity and the cut-off rigidity varying with the geographic latitude.
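The local times of the daily maximum and minimum can be extracted by least-squares fitting a 24-hour sinusoid to hourly count rates, as sketched below on synthetic data (illustrative only, not the authors' procedure):

```python
import numpy as np

def diurnal_extrema(hours, intensity):
    """Least-squares fit of a 24-hour sinusoid a + b*cos(wt) + c*sin(wt)
    to hourly GCR intensities; returns the local times of the fitted
    maximum and minimum (12 h apart by construction)."""
    w = 2 * np.pi / 24
    A = np.column_stack([np.ones_like(hours),
                         np.cos(w * hours),
                         np.sin(w * hours)])
    a, b, c = np.linalg.lstsq(A, intensity, rcond=None)[0]
    t_max = (np.arctan2(c, b) / w) % 24    # phase of the cosine maximum
    return t_max, (t_max + 12) % 24

hours = np.arange(24.0)
# Synthetic diurnal wave peaking at 15 h local time, ~1.5% amplitude
intensity = 100 * (1 + 0.015 * np.cos(2 * np.pi * (hours - 15) / 24))
t_max, t_min = diurnal_extrema(hours, intensity)
```

Applied to each station-day, histograms of `t_max` and `t_min` give exactly the "most frequent local time" distributions the abstract compares across solar activity levels and cut-off rigidities.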

  1. Minimum Covers of Fixed Cardinality in Weighted Graphs.

    Science.gov (United States)

    White, Lee J.

    Reported is the result of research on combinatorial and algorithmic techniques for information processing. A method is discussed for obtaining minimum covers of specified cardinality from a given weighted graph. By the indicated method, it is shown that the family of minimum covers of varying cardinality is related to the minimum spanning tree of…
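
Since the family of minimum covers is related above to the minimum spanning tree, a short sketch of Kruskal's MST algorithm with union-find may help; the graph here is illustrative, not from the report.

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm: edges given as (weight, u, v) tuples over nodes
    0..n-1; returns the list of MST edges. Union-find with path halving
    keeps each merge step nearly constant time."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):  # greedily take cheapest non-cycle edge
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

edges = [(4, 0, 1), (1, 1, 2), (3, 0, 2), (2, 2, 3), (5, 1, 3)]
mst = kruskal_mst(4, edges)
total = sum(w for w, _, _ in mst)  # weight of the minimum spanning tree
```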

  2. Minimum target prices for production of direct-acting antivirals and associated diagnostics to combat hepatitis C virus.

    Science.gov (United States)

    van de Ven, Nikolien; Fortunak, Joe; Simmons, Bryony; Ford, Nathan; Cooke, Graham S; Khoo, Saye; Hill, Andrew

    2015-04-01

    Combinations of direct-acting antivirals (DAAs) can cure hepatitis C virus (HCV) in the majority of treatment-naïve patients. Mass treatment programs to cure HCV in developing countries are only feasible if the costs of treatment and laboratory diagnostics are very low. This analysis aimed to estimate minimum costs of DAA treatment and associated diagnostic monitoring. Clinical trials of HCV DAAs were reviewed to identify combinations with consistently high rates of sustained virological response across hepatitis C genotypes. For each DAA, molecular structures, doses, treatment duration, and components of retrosynthesis were used to estimate costs of large-scale, generic production. Manufacturing costs per gram of DAA were based upon treating at least 5 million patients per year and a 40% margin for formulation. Costs of diagnostic support were estimated based on published minimum prices of genotyping, HCV antigen tests, plus full blood count/clinical chemistry tests. Predicted minimum costs for 12-week courses of combination DAAs with the most consistent efficacy results were: US$122 per person for sofosbuvir+daclatasvir; US$152 for sofosbuvir+ribavirin; US$192 for sofosbuvir+ledipasvir; and US$115 for MK-8742+MK-5172. Diagnostic testing costs were estimated at US$90 for genotyping, US$34 for two HCV antigen tests, and US$22 for two full blood count/clinical chemistry tests. Minimum costs of treatment and diagnostics to cure hepatitis C virus infection were estimated at US$171-360 per person without genotyping or US$261-450 per person with genotyping. These cost estimates assume that existing large-scale treatment programs can be established. © 2014 The Authors. Hepatology published by Wiley Periodicals, Inc., on behalf of the American Association for the Study of Liver Diseases.

  3. Binary pseudorandom test standard to determine the modulation transfer function of optical microscopes

    Energy Technology Data Exchange (ETDEWEB)

    Novak, Erik; Trolinger, James D.; Lacey, Ian; Anderson, Erik H.; Artemiev, Nikolay A.; Babin, Sergey; Cabrini, Stefano; Calafiore, Guiseppe; Chan, Elaine R.; McKinney, Wayne R.; Peroz, Christophe; Takacs, Peter Z.; Yashchuk, Valeriy V.

    2015-09-01

    This work reports on the development of a binary pseudo-random test sample optimized to calibrate the MTF of optical microscopes. The sample consists of a number of 1-D and 2-D patterns, with different minimum sizes of spatial artifacts from 300 nm to 2 microns. We describe the mathematical background, fabrication process, data acquisition and analysis procedure to return spatial frequency based instrument calibration. We show that the developed samples satisfy the characteristics of a test standard: functionality, ease of specification and fabrication, reproducibility, and low sensitivity to manufacturing error. © (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.

  4. Who Benefits from a Minimum Wage Increase?

    OpenAIRE

    John W. Lopresti; Kevin J. Mumford

    2015-01-01

    This paper addresses the question of how a minimum wage increase affects the wages of low-wage workers. Most studies assume that there is a simple mechanical increase in the wage for workers earning a wage between the old and the new minimum wage, with some studies allowing for spillovers to workers with wages just above this range. Rather than assume that the wages of these workers would have remained constant, this paper estimates how a minimum wage increase impacts a low-wage worker's wage...

  5. Testing Mediators of Intervention Effects in Randomized Controlled Trials: An Evaluation of Three Depression Prevention Programs

    Science.gov (United States)

    Stice, Eric; Rohde, Paul; Seeley, John R.; Gau, Jeff M.

    2010-01-01

    Objective: Evaluate a new 5-step method for testing mediators hypothesized to account for the effects of depression prevention programs. Method: In this indicated prevention trial, at-risk teens with elevated depressive symptoms were randomized to a group cognitive-behavioral (CB) intervention, group supportive expressive intervention, CB…

  6. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient-descent search, the overheads of the GA in computing the input distributions are larger.
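
The GA machinery described above (selection, crossover, mutation over vectors of signal probabilities) can be sketched in a few lines. The cost function below is a hypothetical toy (probability of exciting a fault that needs inputs (1, 1, 0)), standing in for the paper's COP-based testability measure.

```python
import random

def evolve(fitness, n_genes, pop_size=30, gens=60, seed=1):
    """Minimal GA over probability vectors in [0, 1]^n: tournament selection,
    uniform crossover, Gaussian mutation clamped to [0, 1]."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_genes)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            child = [g1 if rng.random() < 0.5 else g2 for g1, g2 in zip(p1, p2)]
            child = [min(1.0, max(0.0, g + rng.gauss(0, 0.1))) if rng.random() < 0.2 else g
                     for g in child]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)  # keep the best vector ever seen
    return best

# Toy stand-in cost: detection probability for a fault needing inputs (1, 1, 0).
cost = lambda p: p[0] * p[1] * (1 - p[2])
best = evolve(cost, 3)
```

On this trivial unimodal cost the GA drives the signal probabilities toward (1, 1, 0); real COP-based fitness landscapes are much less smooth, which is the motivation for using a GA over gradient descent.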

  7. Stochastic modelling of the monthly average maximum and minimum temperature patterns in India 1981-2015

    Science.gov (United States)

    Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.

    2018-04-01

    The paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns through a suitable seasonal autoregressive integrated moving average (SARIMA) model for the period 1981-2015 in India. The variations and distributions of monthly maximum and minimum temperatures are analyzed through box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all years, while this is not true for the minimum temperature series, so the two series are modelled separately. The candidate SARIMA model was chosen by inspecting the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA (1, 0, 0) × (0, 1, 1)₁₂ model is selected for the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion. The model parameters are obtained using the maximum-likelihood method, with standard errors of the residuals. The adequacy of the selected model is assessed using correlation diagnostics (ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals) and normality diagnostics (kernel and normal density curves over the histogram and a Q-Q plot). Finally, monthly maximum and minimum temperature patterns for India are forecast for the next 3 years with the selected model.
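
In backshift-operator notation, the selected SARIMA (1, 0, 0) × (0, 1, 1)₁₂ model for the log-transformed series can be written as follows (φ₁ and Θ₁ are the usual nonseasonal AR and seasonal MA coefficients, not values reported by the paper; the sign convention on the MA term varies between texts):

```latex
(1 - \phi_1 B)\,(1 - B^{12})\,\log x_t = (1 + \Theta_1 B^{12})\,\varepsilon_t,
\qquad B\,x_t = x_{t-1},
```

i.e., one seasonal difference at lag 12, an AR(1) term on the differenced series, and an MA(1) term at the seasonal lag driven by white noise εₜ.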

  8. How unprecedented a solar minimum was it?

    Science.gov (United States)

    Russell, C T; Jian, L K; Luhmann, J G

    2013-05-01

    The end of the last solar cycle was at least 3 years late, and to date, the new solar cycle has seen mainly weaker activity since the onset of the rising phase toward the new solar maximum. The newspapers now even report when auroras are seen in Norway. This paper is an update of our review paper written during the deepest part of the last solar minimum [1]. We update the records of solar activity and its consequent effects on the interplanetary fields and solar wind density. The arrival of solar minimum allows us to use two techniques that predict sunspot maximum from readings obtained at solar minimum. It is clear that the Sun is still behaving strangely compared to the last few solar minima even though we are well beyond the minimum phase of the cycle 23-24 transition.

  9. Minimum-Cost Reachability for Priced Timed Automata

    DEFF Research Database (Denmark)

    Behrmann, Gerd; Fehnker, Ansgar; Hune, Thomas Seidelin

    2001-01-01

    This paper introduces the model of linearly priced timed automata as an extension of timed automata, with prices on both transitions and locations. For this model we consider the minimum-cost reachability problem: i.e., given a linearly priced timed automaton and a target state, determine the minimum cost of executions from the initial state to the target state. This problem generalizes the minimum-time reachability problem for ordinary timed automata. We prove decidability of this problem by offering an algorithmic solution, which is based on a combination of branch-and-bound techniques…
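
The untimed core of minimum-cost reachability is a shortest-path search over priced states. The sketch below is uniform-cost (Dijkstra) search on a finite priced graph; it ignores clocks and zones entirely, so it only illustrates the search skeleton that the paper's branch-and-bound algorithm refines, and the state names are made up.

```python
import heapq

def min_cost_reach(start, target, edges):
    """Uniform-cost search: edges maps state -> [(cost, next_state), ...],
    all costs non-negative. Returns the minimum cost to reach target,
    or None if it is unreachable."""
    frontier = [(0, start)]
    best = {start: 0}
    while frontier:
        cost, state = heapq.heappop(frontier)
        if state == target:
            return cost
        if cost > best.get(state, float("inf")):
            continue  # stale queue entry
        for step, nxt in edges.get(state, []):
            new = cost + step
            if new < best.get(nxt, float("inf")):
                best[nxt] = new
                heapq.heappush(frontier, (new, nxt))
    return None

edges = {"init": [(2, "a"), (5, "b")], "a": [(3, "goal")], "b": [(1, "goal")]}
cost = min_cost_reach("init", "goal", edges)  # cheapest run: init -> a -> goal
```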

  10. Minimum Q Electrically Small Antennas

    DEFF Research Database (Denmark)

    Kim, O. S.

    2012-01-01

    Theoretically, the minimum radiation quality factor Q of an isolated resonance can be achieved in a spherical electrically small antenna by combining TM1m and TE1m spherical modes, provided that the stored energy in the antenna spherical volume is totally suppressed. Using closed-form expressions … for a multiarm spherical helix antenna confirm the theoretical predictions. For example, a 4-arm spherical helix antenna with a magnetic-coated perfectly electrically conducting core (ka=0.254) exhibits a Q of 0.66 times the Chu lower bound, or 1.25 times the minimum Q.

  11. Non-destructive Testing by Infrared Thermography Under Random Excitation and ARMA Analysis

    Science.gov (United States)

    Bodnar, J. L.; Nicolas, J. L.; Candoré, J. C.; Detalle, V.

    2012-11-01

    Photothermal thermography is a non-destructive testing (NDT) method which has many applications in the control and characterization of thin materials. This technique is usually implemented under CW or flash excitation. Such excitations are not suited to the control of fragile materials or to multi-frequency analysis. To allow these analyses, this article proposes a new control mode: infrared thermography under random excitation with autoregressive moving average (ARMA) analysis. First, the principle of this NDT method is presented. Then, the method is shown to permit detection, with low energy constraints, of detachments in mural paintings.

  12. The RANDOM computer program: A linear congruential random number generator

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
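
An LCG is the recurrence x_{n+1} = (a·x_n + c) mod m. The sketch below uses the well-known Numerical Recipes constants as an assumption; they are not necessarily the parameters selected by the RANDOM program.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator x_{n+1} = (a*x_n + c) mod m,
    yielded as uniform floats in [0, 1). Constants are the common
    Numerical Recipes choice (full period for m = 2**32)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(42)
sample = [next(gen) for _ in range(10000)]
mean = sum(sample) / len(sample)  # should be close to 0.5 for a good LCG
```

Parameter quality matters enormously for LCGs (hence the RANCYCLE/ARITH helper programs): a poor multiplier yields short cycles or strong lattice structure in higher dimensions.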

  13. Stochastic variational approach to minimum uncertainty states

    Energy Technology Data Exchange (ETDEWEB)

    Illuminati, F.; Viola, L. [Dipartimento di Fisica, Padova Univ. (Italy)

    1995-05-21

    We introduce a new variational characterization of Gaussian diffusion processes as minimum uncertainty states. We then define a variational method constrained by kinematics of diffusions and Schroedinger dynamics to seek states of local minimum uncertainty for general non-harmonic potentials. (author)

  14. Minimum entropy production principle

    Czech Academy of Sciences Publication Activity Database

    Maes, C.; Netočný, Karel

    2013-01-01

    Vol. 8, No. 7 (2013), pp. 9664-9677, ISSN 1941-6016. Institutional support: RVO:68378271. Keywords: MINEP. Subject RIV: BE - Theoretical Physics. http://www.scholarpedia.org/article/Minimum_entropy_production_principle

  15. Randomized controlled trial of supervised patient self-testing of warfarin therapy using an internet-based expert system.

    LENUS (Irish Health Repository)

    Ryan, F

    2009-08-01

    Increased frequency of prothrombin time testing, facilitated by patient self-testing (PST) of the International Normalized Ratio (INR) can improve the clinical outcomes of oral anticoagulation therapy (OAT). However, oversight of this type of management is often difficult and time-consuming for healthcare professionals. This study reports the first randomized controlled trial of an automated direct-to-patient expert system, enabling remote and effective management of patients on OAT.

  16. Examining the impact of genetic testing for type 2 diabetes on health behaviors: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Voils Corrine I

    2012-08-01

    Full Text Available Background: We describe the study design, procedures, and development of the risk counseling protocol used in a randomized controlled trial to evaluate the impact of genetic testing for diabetes mellitus (DM) on psychological, health behavior, and clinical outcomes. Methods/Design: Eligible patients are aged 21 to 65 years with a body mass index (BMI) ≥27 kg/m² and no prior diagnosis of DM. At baseline, conventional DM risk factors are assessed, and blood is drawn for possible genetic testing. Participants are randomized to receive conventional risk counseling for DM with either eye disease counseling or genetic test results. The counseling protocol was pilot tested to identify an acceptable graphical format for conveying risk estimates and to match the length of the eye disease counseling to the genetic counseling. Risk estimates are presented with a vertical bar graph denoting risk level with colors and descriptors. After receiving either genetic counseling regarding risk for DM or control counseling on eye disease, brief lifestyle counseling for prevention of DM is provided to all participants. Discussion: A standardized risk counseling protocol is being used in a randomized trial of 600 participants. Results of this trial will inform policy about whether risk counseling should include genetic counseling. Trial registration: ClinicalTrials.gov identifier NCT01060540.

  17. A Robust Statistics Approach to Minimum Variance Portfolio Optimization

    Science.gov (United States)

    Yang, Liusha; Couillet, Romain; McKay, Matthew R.

    2015-12-01

    We study the design of portfolios under a minimum risk criterion. The performance of the optimized portfolio relies on the accuracy of the estimated covariance matrix of the portfolio asset returns. For large portfolios, the number of available market returns is often of similar order to the number of assets, so that the sample covariance matrix performs poorly as a covariance estimator. Additionally, financial market data often contain outliers which, if not correctly handled, may further corrupt the covariance estimation. We address these shortcomings by studying the performance of a hybrid covariance matrix estimator based on Tyler's robust M-estimator and on Ledoit-Wolf's shrinkage estimator while assuming samples with heavy-tailed distribution. Employing recent results from random matrix theory, we develop a consistent estimator of (a scaled version of) the realized portfolio risk, which is minimized by optimizing online the shrinkage intensity. Our portfolio optimization method is shown via simulations to outperform existing methods both for synthetic and real market data.
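
The key ingredient above is a better-conditioned covariance estimate when n is of the same order as p. The sketch below is a deliberately simplified stand-in: linear shrinkage of the sample covariance toward a scaled identity with a fixed intensity, not the Tyler/Ledoit-Wolf hybrid with online-optimized intensity studied in the paper.

```python
import numpy as np

def shrunk_covariance(returns, intensity=0.2):
    """Linear shrinkage toward a scaled identity:
    S_shrunk = (1 - rho) * S + rho * (tr(S)/p) * I.
    A fixed rho is an assumption; Ledoit-Wolf derives an optimal rho."""
    S = np.cov(returns, rowvar=False)
    p = S.shape[0]
    target = np.trace(S) / p * np.eye(p)
    return (1 - intensity) * S + intensity * target

# Heavy-tailed returns with n (=60) of the same order as p (=50), as in the paper.
rng = np.random.default_rng(3)
rets = rng.standard_t(df=4, size=(60, 50))
S_shrunk = shrunk_covariance(rets)
cond_plain = np.linalg.cond(np.cov(rets, rowvar=False))
cond_shrunk = np.linalg.cond(S_shrunk)
```

Shrinking toward the identity pulls all eigenvalues toward their mean, so the shrunk estimator is strictly better conditioned than the raw sample covariance, which is what stabilizes the resulting minimum variance weights.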

  18. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption...

  19. Minimum emittance in TBA and MBA lattices

    Science.gov (United States)

    Xu, Gang; Peng, Yue-Mei

    2015-03-01

    For reaching a small emittance in a modern light source, triple bend achromats (TBA), theoretical minimum emittance (TME) lattices, and even multiple bend achromats (MBA) have been considered. This paper derives the necessary condition for achieving minimum emittance in TBA and MBA lattices theoretically, in which the bending angle of the inner dipoles is a factor of 3^(1/3) larger than that of the outer dipoles. We also calculate the conditions for attaining the minimum emittance of a TBA, related to the phase advance, in some special cases using a purely mathematical method. These results may give some direction for lattice design.

  20. Minimum emittance in TBA and MBA lattices

    International Nuclear Information System (INIS)

    Xu Gang; Peng Yuemei

    2015-01-01

    For reaching a small emittance in a modern light source, triple bend achromats (TBA), theoretical minimum emittance (TME) lattices, and even multiple bend achromats (MBA) have been considered. This paper derives the necessary condition for achieving minimum emittance in TBA and MBA lattices theoretically, in which the bending angle of the inner dipoles is a factor of 3^(1/3) larger than that of the outer dipoles. We also calculate the conditions for attaining the minimum emittance of a TBA, related to the phase advance, in some special cases using a purely mathematical method. These results may give some direction for lattice design. (authors)

  1. Minimum DNBR Prediction Using Artificial Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Su; Kim, Ju Hyun; Na, Man Gyun [Chosun University, Gwangju (Korea, Republic of)

    2011-05-15

    The minimum DNBR (MDNBR) for prevention of boiling crisis and fuel clad melting is a very important factor that should be consistently monitored from a safety standpoint. Artificial intelligence methods have been extensively and successfully applied to nonlinear function approximation, such as the problem in question of predicting DNBR values. In this paper, a support vector regression (SVR) model and a fuzzy neural network (FNN) model are developed to predict the MDNBR using a number of measured signals from the reactor coolant system. The two models are trained using a training data set and verified against a test data set that does not include the training data. The proposed MDNBR estimation algorithms were verified using nuclear and thermal data acquired from many numerical simulations of the Yonggwang Nuclear Power Plant Unit 3 (YGN-3).

  2. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    International Nuclear Information System (INIS)

    Yu, Zhiyong

    2013-01-01

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  3. Continuous-Time Mean-Variance Portfolio Selection with Random Horizon

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Zhiyong, E-mail: yuzhiyong@sdu.edu.cn [Shandong University, School of Mathematics (China)

    2013-12-15

    This paper examines the continuous-time mean-variance optimal portfolio selection problem with random market parameters and random time horizon. Treating this problem as a linearly constrained stochastic linear-quadratic optimal control problem, I explicitly derive the efficient portfolios and efficient frontier in closed forms based on the solutions of two backward stochastic differential equations. Some related issues such as a minimum variance portfolio and a mutual fund theorem are also addressed. All the results are markedly different from those in the problem with deterministic exit time. A key part of my analysis involves proving the global solvability of a stochastic Riccati equation, which is interesting in its own right.

  4. 41 CFR 50-202.2 - Minimum wage in all industries.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true Minimum wage in all... Public Contracts PUBLIC CONTRACTS, DEPARTMENT OF LABOR 202-MINIMUM WAGE DETERMINATIONS Groups of Industries § 50-202.2 Minimum wage in all industries. In all industries, the minimum wage applicable to...

  5. Inflation with a graceful exit in a random landscape

    International Nuclear Information System (INIS)

    Pedro, F.G.; Westphal, A.

    2016-11-01

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  6. Inflation with a graceful exit in a random landscape

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, F.G. [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica y Inst. de Fisica Teorica UAM/CSIC; Westphal, A. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group

    2016-11-15

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N ≪ 10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  7. Inflation with a graceful exit in a random landscape

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, F.G. [Departamento de Física Teórica and Instituto de Física Teórica UAM/CSIC,Universidad Autónoma de Madrid,Cantoblanco, 28049 Madrid (Spain); Westphal, A. [Deutsches Elektronen-Synchrotron DESY, Theory Group,D-22603 Hamburg (Germany)

    2017-03-30

    We develop a stochastic description of small-field inflationary histories with a graceful exit in a random potential whose Hessian is a Gaussian random matrix as a model of the unstructured part of the string landscape. The dynamical evolution in such a random potential from a small-field inflation region towards a viable late-time de Sitter (dS) minimum maps to the dynamics of Dyson Brownian motion describing the relaxation of non-equilibrium eigenvalue spectra in random matrix theory. We analytically compute the relaxation probability in a saddle point approximation of the partition function of the eigenvalue distribution of the Wigner ensemble describing the mass matrices of the critical points. When applied to small-field inflation in the landscape, this leads to an exponentially strong bias against small-field ranges and an upper bound N≪10 on the number of light fields N participating during inflation from the non-observation of negative spatial curvature.

  8. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate whether human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing. Women attending a colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real-time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were administered. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides, and a Hybrid Capture (HC) HPV DNA test (Qiagen) was performed on the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure the agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% confidence interval (CI) 59.3-77.2) by S-WET, 54.4% (95% CI 44.8-63.9) by S-DRY, and 53.8% (95% CI 43.8-63.7) by HC. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8-91.6) and the κ was substantial (0.70; 95% CI 0.57-0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2-84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse between the two Self-HPV tests were observed. Women reported the two Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120
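
The unweighted kappa used above for paired binary test results has a simple closed form, κ = (p_o − p_e)/(1 − p_e), where p_o is observed agreement and p_e chance agreement. The sketch uses small illustrative paired results, not the trial data.

```python
def cohens_kappa(x, y):
    """Unweighted Cohen's kappa for two paired binary results (0/1),
    e.g. positivity of two HPV sampling methods on the same women."""
    assert len(x) == len(y)
    n = len(x)
    po = sum(a == b for a, b in zip(x, y)) / n          # observed agreement
    p_yes = (sum(x) / n) * (sum(y) / n)                 # chance both positive
    p_no = (1 - sum(x) / n) * (1 - sum(y) / n)          # chance both negative
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

# Illustrative paired positivity results (hypothetical, not from the trial).
wet = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
dry = [1, 1, 0, 0, 0, 1, 0, 1, 1, 0]
kappa = cohens_kappa(wet, dry)  # 9/10 agreement, pe = 0.5 -> kappa = 0.8
```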

  9. 29 CFR 525.13 - Renewal of special minimum wage certificates.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Renewal of special minimum wage certificates. 525.13... minimum wage certificates. (a) Applications may be filed for renewal of special minimum wage certificates.... (c) Workers with disabilities may not continue to be paid special minimum wages after notice that an...

  10. An Empirical Analysis of the Relationship between Minimum Wage ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Relationship between Minimum Wage, Investment and Economic Growth in Ghana. ... In addition, the ratio of public investment to tax revenue must increase as minimum wage increases since such complementary changes are more likely to lead to economic growth. Keywords: minimum wage ...

  11. 12 CFR 3.6 - Minimum capital ratios.

    Science.gov (United States)

    2010-01-01

    ... should have well-diversified risks, including no undue interest rate risk exposure; excellent control... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Minimum capital ratios. 3.6 Section 3.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY MINIMUM CAPITAL RATIOS; ISSUANCE...

  12. 12 CFR 615.5330 - Minimum surplus ratios.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Minimum surplus ratios. 615.5330 Section 615.5330 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FUNDING AND FISCAL AFFAIRS, LOAN POLICIES AND OPERATIONS, AND FUNDING OPERATIONS Surplus and Collateral Requirements § 615.5330 Minimum...

  13. Non-random mating for selection with restricted rates of inbreeding and overlapping generations

    NARCIS (Netherlands)

    Sonesson, A.K.; Meuwissen, T.H.E.

    2002-01-01

    Minimum coancestry mating with a maximum of one offspring per mating pair (MC1) is compared with random mating schemes for populations with overlapping generations. Optimum contribution selection is used, whereby ΔF is restricted. For schemes with ΔF restricted to 0.25% per

  14. Application of quasi-random numbers for simulation

    International Nuclear Information System (INIS)

    Kazachenko, O.N.; Takhtamyshev, G.G.

    1985-01-01

    Application of the Monte-Carlo method to multidimensional integration is discussed. The main goal is to check the claim that using quasi-random numbers instead of regular pseudo-random numbers provides more rapid convergence. The Sobol, Richtmayer, and Halton algorithms for quasi-random sequences are described. Over 50 tests comparing these quasi-random numbers with pseudo-random numbers were carried out. In all cases the quasi-random numbers clearly demonstrated more rapid convergence than the pseudo-random ones. These positive test results make the use of quasi-random sequences in the Monte-Carlo method appear very promising.
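
The Halton sequence named above is built from van der Corput digit reversal in coprime bases. The sketch below integrates a smooth toy function over the unit square with 2-D Halton points versus pseudo-random points; the integrand and sample size are illustrative, not the paper's test suite.

```python
import math, random

def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in `base`:
    reverse the base-b digits of i across the radix point."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def integrate(points):
    # Monte-Carlo estimate of the integral of f(x, y) = x*y over [0,1]^2
    # (exact value 1/4).
    return sum(x * y for x, y in points) / len(points)

n = 2000
quasi = [(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)]
rng = random.Random(0)
pseudo = [(rng.random(), rng.random()) for _ in range(n)]
err_quasi = abs(integrate(quasi) - 0.25)
err_pseudo = abs(integrate(pseudo) - 0.25)
```

For smooth integrands the quasi-random error shrinks roughly like (log n)^d / n, versus the 1/√n rate of ordinary Monte-Carlo, which is the more rapid convergence the abstract refers to.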

  15. Effects of cognitive behavioral therapy with relaxation vs. imagery rescripting on test anxiety: A randomized controlled trial.

    Science.gov (United States)

    Reiss, Neele; Warnecke, Irene; Tolgou, Theano; Krampen, Dorothea; Luka-Krausgrill, Ursula; Rohrmann, Sonja

    2017-01-15

    Test anxiety is a common condition in students, which may lead to impaired academic performance as well as distress. The primary objective of this study was to evaluate the effectiveness of two cognitive-behavioral interventions designed to reduce test anxiety. Test anxiety in the participants was diagnosed as social or specific phobia according to DSM-IV. Subjects were then randomized to three groups: a moderated self-help group, which served as a control group, and two treatment groups, in which either relaxation techniques or imagery rescripting were applied. Students suffering from test anxiety were recruited at two German universities (n=180). The randomized controlled design comprised three groups which received test anxiety treatment in weekly three-hour sessions over a period of five weeks. Treatment outcome was assessed with a test anxiety questionnaire administered before and after treatment, as well as at a six-month follow-up. A repeated-measures ANOVA for participants with complete data (n=59) revealed a significant reduction of test anxiety from baseline to six-month follow-up in all three treatment groups (p …). Only students meeting diagnostic criteria were included; the sample may therefore represent only more severe forms of test anxiety. Moreover, the sample size in this study was small, the numbers of participants per group differed, and treatment results were based on self-report. Due to the length of the treatment, an implementation of the group treatments used in this study might not be feasible in all settings. Group treatments constitute an effective method of treating test anxiety, e.g., in university settings. Imagery rescripting may particularly contribute to treatment efficacy. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. The continuous reaction time test for minimal hepatic encephalopathy validated by a randomized controlled multi-modal intervention-A pilot study

    DEFF Research Database (Denmark)

    Lauridsen, M M; Mikkelsen, S; Svensson, T

    2017-01-01

    Background: Minimal hepatic encephalopathy (MHE) is clinically undetectable and the diagnosis requires psychometric tests. However, a lack of clarity exists as to whether the tests are in fact able to detect changes in cognition. Aim: To examine if the continuous reaction time test (CRT) can detect...... changes in cognition with anti-HE intervention in patients with cirrhosis and without clinically manifest hepatic encephalopathy (HE). Methods: Firstly, we conducted a reproducibility analysis and secondly measured change in CRT induced by anti-HE treatment in a randomized controlled pilot study: We...... stratified 44 patients with liver cirrhosis and without clinically manifest HE according to a normal (n = 22) or abnormal (n = 22) CRT. Each stratum was then block randomized to receive multimodal anti-HE intervention (lactulose+branched-chain amino acids+rifaximin) or triple placebos for 3 months...
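
    The design described above — stratify patients by normal vs. abnormal CRT, then block-randomize each stratum to multimodal anti-HE intervention or placebo — can be sketched as follows (function name, block size, and arm labels are illustrative, not taken from the study):

```python
import random

def block_randomize(n_patients, arms=("anti-HE", "placebo"), block_size=4, seed=1):
    """Assign patients to arms in shuffled blocks so arm sizes stay balanced."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_patients:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)  # each block holds equal numbers of every arm
        assignments.extend(block)
    return assignments[:n_patients]

# One call per stratum, 22 patients each as in the study
normal_stratum = block_randomize(22, seed=1)
abnormal_stratum = block_randomize(22, seed=2)
```

    Blocking keeps the arms balanced within each stratum even if recruitment stops early.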

  17. Ambient Modal Testing of the Vestvej Bridge using Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Rytter, A.

    This paper presents an ambient vibration study of the Vestvej Bridge. The bridge is a typical Danish two-span concrete bridge which crosses a highway. The purpose of the study is to perform a pre-investigation of the dynamic behavior to obtain information for the design of a demonstration project...... concerning application of vibration based inspection of bridges. The data analysis process of ambient vibration testing of bridges has traditionally been based on auto and cross spectral densities estimated using an FFT algorithm. In the pre-analysis state the spectral densities are all averaged to obtain...... measurements might have a low signal to noise ratio. Thus, it might be difficult to clearly identify physical modes from the spectral densities. The Random Decrement (RD) technique is another method to perform the data analysis process in the time domain only. It is basically a very simple and very easily...
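
    The Random Decrement idea mentioned above is simple to state: average many segments of the measured response that all start at the same triggering condition; the random content averages out and a free-decay-like signature remains, entirely in the time domain. A minimal sketch on synthetic data (trigger level and segment length are illustrative choices, not the paper's):

```python
import numpy as np

def random_decrement(x, trigger, seg_len):
    """Average all segments of x starting where x up-crosses `trigger`;
    the average approximates a free-decay signature of the system."""
    idx = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
    idx = idx[idx + seg_len <= len(x)]
    if len(idx) == 0:
        raise ValueError("no trigger points found")
    segs = np.stack([x[i:i + seg_len] for i in idx])
    return segs.mean(axis=0)

# Demo: noisy oscillation as a stand-in for an ambient bridge response
t = np.arange(0, 60, 0.01)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 1.5 * t) + 0.3 * rng.standard_normal(t.size)
sig = random_decrement(x, trigger=0.8, seg_len=200)
```

    Every segment starts at the trigger level by construction, so the averaged signature starts there too and then decays.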

  18. Ambient Modal Testing of the Vestvej Bridge using Random Decrement

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Rytter, A.

    1998-01-01

    This paper presents an ambient vibration study of the Vestvej Bridge. The bridge is a typical Danish two-span concrete bridge which crosses a highway. The purpose of the study is to perform a pre-investigation of the dynamic behavior to obtain information for the design of a demonstration project...... concerning application of vibration based inspection of bridges. The data analysis process of ambient vibration testing of bridges has traditionally been based on auto and cross spectral densities estimated using an FFT algorithm. In the pre-analysis state the spectral densities are all averaged to obtain...... measurements might have a low signal to noise ratio. Thus, it might be difficult to clearly identify physical modes from the spectral densities. The Random Decrement (RD) technique is another method to perform the data analysis process in the time domain only. It is basically a very simple and very easily...

  19. 5 CFR 551.601 - Minimum age standards.

    Science.gov (United States)

    2010-01-01

    ... ADMINISTRATION UNDER THE FAIR LABOR STANDARDS ACT Child Labor § 551.601 Minimum age standards. (a) 16-year... subject to its child labor provisions, with certain exceptions not applicable here. (b) 18-year minimum... occupation found and declared by the Secretary of Labor to be particularly hazardous for the employment of...

  20. 12 CFR 932.8 - Minimum liquidity requirements.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Minimum liquidity requirements. 932.8 Section... CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.8 Minimum liquidity requirements. In addition to meeting the deposit liquidity requirements contained in § 965.3 of this chapter, each Bank...

  1. [Hospitals failing minimum volumes in 2004: reasons and consequences].

    Science.gov (United States)

    Geraedts, M; Kühnen, C; Cruppé, W de; Blum, K; Ohmann, C

    2008-02-01

    In 2004 Germany introduced annual minimum volumes nationwide on five surgical procedures: kidney, liver, stem cell transplantation, complex oesophageal, and pancreatic interventions. Hospitals that fail to reach the minimum volumes are no longer allowed to perform the respective procedures unless they raise one of eight legally accepted exceptions. The goal of our study was to investigate how many hospitals fell short of the minimum volumes in 2004, whether and how this was justified, and whether hospitals that failed the requirements experienced any consequences. We analysed data on meeting the minimum volume requirements in 2004 that all German hospitals were obliged to publish as part of their biannual structured quality reports. We performed telephone interviews: a) with all hospitals not achieving the minimum volumes for complex oesophageal, and pancreatic interventions, and b) with the national umbrella organisations of all German sickness funds. In 2004, one quarter of all German acute care hospitals (N=485) performed 23,128 procedures where minimum volumes applied. 197 hospitals (41%) did not meet at least one of the minimum volumes. These hospitals performed N=715 procedures (3.1%) where the minimum volumes were not met. In 43% of these cases the hospitals raised legally accepted exceptions. In 33% of the cases the hospitals argued using reasons that were not legally acknowledged. 69% of those hospitals that failed to achieve the minimum volumes for complex oesophageal and pancreatic interventions did not experience any consequences from the sickness funds. However, one third of those hospitals reported that the sickness funds addressed the issue and partially announced consequences for the future. The sickness funds' umbrella organisations stated that there were only sparse activities related to the minimum volumes and that neither uniform registrations nor uniform proceedings in case of infringements of the standards had been agreed upon. In spite of the

  2. Relativistic effects in photoionization time delay near the Cooper minimum of noble-gas atoms

    Science.gov (United States)

    Saha, Soumyajit; Mandal, Ankur; Jose, Jobin; Varma, Hari R.; Deshmukh, P. C.; Kheifets, A. S.; Dolmatov, V. K.; Manson, S. T.

    2014-11-01

    Time delay of photoemission from valence ns, np3/2, and np1/2 subshells of noble-gas atoms is theoretically scrutinized within the framework of the dipole relativistic random phase approximation. The focus is on the variation of time delay in the vicinity of the Cooper minima in photoionization of the outer subshells of neon, argon, krypton, and xenon, where the corresponding dipole matrix element changes its sign while passing through a node. It is revealed that the presence of the Cooper minimum in one photoionization channel has a strong effect on time delay in other channels. This is shown to be due to interchannel coupling.
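
    The photoemission "time delay" here is the Wigner delay, the energy derivative of the phase of the complex dipole matrix element. The standard relation (a textbook sketch, not a formula quoted from the paper) is:

```latex
\tau(E) \;=\; \hbar\,\frac{\mathrm{d}}{\mathrm{d}E}\,\arg D(E),
\qquad
D(E) \;=\; \langle \psi_{E,\ell\pm1} \,|\, r \,|\, n\ell \rangle .
```

    Near a Cooper minimum D(E) passes through zero and changes sign, so arg D(E) varies rapidly with energy and the delay shows a pronounced structure, which interchannel coupling can transfer to other channels.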

  3. The Distribution of the Sample Minimum-Variance Frontier

    OpenAIRE

    Raymond Kan; Daniel R. Smith

    2008-01-01

    In this paper, we present a finite sample analysis of the sample minimum-variance frontier under the assumption that the returns are independent and multivariate normally distributed. We show that the sample minimum-variance frontier is a highly biased estimator of the population frontier, and we propose an improved estimator of the population frontier. In addition, we provide the exact distribution of the out-of-sample mean and variance of sample minimum-variance portfolios. This allows us t...

  4. Interplay of Determinism and Randomness: From Irreversibility to Chaos, Fractals, and Stochasticity

    Science.gov (United States)

    Tsonis, A.

    2017-12-01

    We will start our discussion into randomness by looking exclusively at our formal mathematical system to show that even in this pure and strictly logical system one cannot do away with randomness. By employing simple mathematical models, we will identify the three possible sources of randomness: randomness due to inability to find the rules (irreversibility), randomness due to inability to have infinite power (chaos), and randomness due to stochastic processes. Subsequently we will move from the mathematical system to our physical world to show that randomness, through the quantum mechanical character of small scales, through chaos, and because of the second law of thermodynamics, is an intrinsic property of nature as well. We will subsequently argue that the randomness in the physical world is consistent with the three sources of randomness suggested from the study of simple mathematical systems. Many examples ranging from purely mathematical to natural processes will be presented, which clearly demonstrate how the combination of rules and randomness produces the world we live in. Finally, the principle of least effort or the principle of minimum energy consumption will be suggested as the underlying principle behind this symbiosis between determinism and randomness.
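
    The "inability to have infinite power" source of randomness, chaos, fits in a two-line deterministic model. In the logistic map below (a standard example, not taken from the talk), a 1e-10 perturbation of the initial condition is amplified exponentially, so any finite-precision knowledge of the state forfeits long-term prediction; r = 3.9 is chosen to keep iterates safely inside [0, 1]:

```python
def logistic(x, r=3.9):
    """One step of the logistic map, chaotic for r = 3.9."""
    return r * x * (1 - x)

def orbit(x0, n, r=3.9):
    """Iterate the map n times from x0; fully deterministic."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic(xs[-1], r))
    return xs

a = orbit(0.2, 50)
b = orbit(0.2 + 1e-10, 50)  # almost identical start, diverging orbit
```

    The rule is fixed and simple, yet the two orbits decorrelate: determinism plus sensitive dependence yields effective randomness.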

  5. 24 CFR 891.145 - Owner deposit (Minimum Capital Investment).

    Science.gov (United States)

    2010-04-01

    ... General Program Requirements § 891.145 Owner deposit (Minimum Capital Investment). As a Minimum Capital... Investment shall be one-half of one percent (0.5%) of the HUD-approved capital advance, not to exceed $25,000. ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Owner deposit (Minimum Capital...

  6. Application of the Least Median Square-Minimum Covariance Determinant (LMS-MCD) Method in Principal Component Regression

    Directory of Open Access Journals (Sweden)

    I PUTU EKA IRAWAN

    2013-11-01

    Full Text Available Principal Component Regression is a method for overcoming multicollinearity by combining principal component analysis with regression analysis. The calculation of classical principal component analysis is based on the regular covariance matrix. The covariance matrix is optimal if the data originate from a multivariate normal distribution, but it is very sensitive to the presence of outliers. The Least Median Square-Minimum Covariance Determinant (LMS-MCD) method is used as an alternative to overcome this problem. The purpose of this research is to compare Principal Component Regression (RKU) and the Least Median Square-Minimum Covariance Determinant (LMS-MCD) method in dealing with outliers. In this study, the LMS-MCD method has smaller parameter bias and mean square error (MSE) than RKU. Based on the difference in parameter estimates, a test also shows a larger difference in parameter estimates for the LMS-MCD method than for the RKU method.
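
    The least-median-of-squares half of LMS-MCD replaces the sum of squared residuals with their median, which a minority of outliers cannot dominate. A simplified random-subsampling sketch for a line fit (illustrative only; the authors' method additionally uses the MCD robust covariance estimate):

```python
import numpy as np

def lms_line(x, y, n_trials=500, seed=0):
    """Least-median-of-squares line fit: try random 2-point candidate lines,
    keep the one minimizing the median squared residual (robust to outliers)."""
    rng = np.random.default_rng(seed)
    best_line, best_med = None, np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        slope = (y[j] - y[i]) / (x[j] - x[i])
        intercept = y[i] - slope * x[i]
        med = np.median((y - (slope * x + intercept)) ** 2)
        if med < best_med:
            best_line, best_med = (slope, intercept), med
    return best_line

# Line y = 2x + 1 with 20% gross outliers
x = np.arange(20.0)
y = 2 * x + 1
y[::5] += 50.0  # contaminate every fifth point
slope, intercept = lms_line(x, y)
```

    An ordinary least-squares fit would be pulled toward the outliers; the median criterion ignores them as long as they are a minority.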

  7. Empirical versus Random Item Selection in the Design of Intelligence Test Short Forms--The WISC-R Example.

    Science.gov (United States)

    Goh, David S.

    1979-01-01

    The advantages of using psychometric theory to design short forms of intelligence tests are demonstrated by comparing such usage to a systematic random procedure that has previously been used. The Wechsler Intelligence Scale for Children-Revised (WISC-R) Short Form is presented as an example. (JKS)

  8. Cost Analysis of the STONE Randomized Trial: Can Health Care Costs be Reduced One Test at a Time?

    Science.gov (United States)

    Melnikow, Joy; Xing, Guibo; Cox, Ginger; Leigh, Paul; Mills, Lisa; Miglioretti, Diana L; Moghadassi, Michelle; Smith-Bindman, Rebecca

    2016-04-01

    Decreasing the use of high-cost tests may reduce health care costs. To compare costs of care for patients presenting to the emergency department (ED) with suspected kidney stones randomized to 1 of 3 initial imaging tests. Patients were randomized to point-of-care ultrasound (POC US, least costly), radiology ultrasound (RAD US), or computed tomography (CT, most costly). Subsequent testing and treatment were the choice of the treating physician. A total of 2759 patients at 15 EDs were randomized to POC US (n=908), RAD US (n=893), or CT (n=958). Mean age was 40.4 years; 51.8% were male. All medical care documented in the trial database in the 7 days following enrollment was abstracted and coded to estimate costs using national average 2012 Medicare reimbursements. Costs for initial ED care and total 7-day costs were compared using a nonparametric bootstrap to account for clustering of patients within medical centers. Initial ED visit costs were modestly lower for patients assigned to RAD US: $423 ($411, $434) compared with patients assigned to CT: $448 ($438, $459). Total 7-day costs were not significantly different between groups: $1014 ($912, $1129) for POC US, $970 ($878, $1078) for RAD US, and $959 ($870, $1044) for CT. Hospital admissions contributed over 50% of total costs, though only 11% of patients were admitted. Mean total costs (and admission rates) varied substantially by site, from $749 to $1239. Assignment to a less costly test had no impact on overall health care costs for ED patients. System-level interventions addressing variation in admission rates from the ED might have greater impact on costs.
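
    The abstract compares costs "using nonparametric bootstrap to account for clustering of patients within medical centers": whole centers are resampled with replacement rather than individual patients, so within-center correlation is preserved. A minimal sketch with invented cost data (all numbers hypothetical):

```python
import random
import statistics

def cluster_bootstrap_ci(clusters, n_boot=2000, alpha=0.05, seed=0):
    """Percentile CI for a mean cost, resampling whole clusters
    (e.g. medical centers) with replacement."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        sample = [rng.choice(clusters) for _ in clusters]
        pooled = [v for cluster in sample for v in cluster]
        means.append(statistics.fmean(pooled))
    means.sort()
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical per-patient 7-day costs grouped by center
clusters = [[900, 1100, 950], [1300, 1250], [700, 800, 750, 820]]
lo, hi = cluster_bootstrap_ci(clusters)
```

    Resampling patients individually would understate the variance whenever costs within a center are correlated, as with the site-to-site variation the trial reports.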

  9. Optimization Strategies for Bruch's Membrane Opening Minimum Rim Area Calculation: Sequential versus Simultaneous Minimization.

    Science.gov (United States)

    Enders, Philip; Adler, Werner; Schaub, Friederike; Hermann, Manuel M; Diestelhorst, Michael; Dietlein, Thomas; Cursiefen, Claus; Heindl, Ludwig M

    2017-10-24

    To compare a simultaneously optimized continuous minimum rim surface parameter between Bruch's membrane opening (BMO) and the internal limiting membrane to the standard sequential minimization used for calculating the BMO minimum rim area in spectral domain optical coherence tomography (SD-OCT). In this case-control, cross-sectional study, 704 eyes of 445 participants underwent SD-OCT of the optic nerve head (ONH), visual field testing, and clinical examination. Globally and clock-hour sector-wise optimized BMO-based minimum rim area was calculated independently. Outcome parameters included BMO-globally optimized minimum rim area (BMO-gMRA) and sector-wise optimized BMO-minimum rim area (BMO-MRA). BMO area was 1.89 ± 0.05 mm². Mean global BMO-MRA was 0.97 ± 0.34 mm²; mean global BMO-gMRA was 1.01 ± 0.36 mm². Both parameters correlated with r = 0.995 (P < 0.001); the mean difference was 0.04 mm² (P < 0.001). In all sectors, parameters differed by 3.0-4.2%. In receiver operating characteristics, the calculated area under the curve (AUC) to differentiate glaucoma was 0.873 for BMO-MRA, compared to 0.866 for BMO-gMRA (P = 0.004). Among ONH sectors, the temporal inferior location showed the highest AUC. Optimization strategies to calculate BMO-based minimum rim area led to significantly different results. Imposing an additional adjacency constraint within calculation of BMO-MRA does not improve diagnostic power. Global and temporal inferior BMO-MRA performed best in differentiating glaucoma patients.

  10. Dispersive and Covalent Interactions between Graphene and Metal Surfaces from the Random Phase Approximation

    DEFF Research Database (Denmark)

    Olsen, Thomas; Yan, Jun; Mortensen, Jens Jørgen

    2011-01-01

    We calculate the potential energy surfaces for graphene adsorbed on Cu(111), Ni(111), and Co(0001) using density functional theory and the random phase approximation (RPA). For these adsorption systems covalent and dispersive interactions are equally important and while commonly used approximations...... for exchange-correlation functionals give inadequate descriptions of either van der Waals or chemical bonds, RPA accounts accurately for both. It is found that the adsorption is a delicate competition between a weak chemisorption minimum close to the surface and a physisorption minimum further from the surface....

  11. Minimum Wages and the Distribution of Family Incomes

    OpenAIRE

    Dube, Arindrajit

    2017-01-01

    Using the March Current Population Survey data from 1984 to 2013, I provide a comprehensive evaluation of how minimum wage policies influence the distribution of family incomes. I find robust evidence that higher minimum wages shift down the cumulative distribution of family incomes at the bottom, reducing the share of non-elderly individuals with incomes below 50, 75, 100, and 125 percent of the federal poverty threshold. The long run (3 or more years) minimum wage elasticity of the non-elde...

  12. Syndesmotic fixation in supination-external rotation ankle fractures: a prospective randomized study.

    Science.gov (United States)

    Pakarinen, Harri J; Flinkkilä, Tapio E; Ohtonen, Pasi P; Hyvönen, Pekka H; Lakovaara, Martti T; Leppilahti, Juhana I; Ristiniemi, Jukka Y

    2011-12-01

    This study was designed to assess whether transfixion of an unstable syndesmosis is necessary in supination-external rotation (Lauge-Hansen SE/Weber B)-type ankle fractures. A prospective study of 140 patients with unilateral Lauge-Hansen supination-external rotation type 4 ankle fractures was done. After bony fixation, the 7.5-Nm standardized external rotation (ER) stress test for both ankles was performed under fluoroscopy. A positive stress examination was defined as a difference of more than 2 mm side-to-side in the tibiotalar or tibiofibular clear spaces on mortise radiographs. If the stress test was positive, the patient was randomized to either syndesmotic transfixion with 3.5-mm tricortical screws or no syndesmotic fixation. Clinical outcome was assessed using the Olerud-Molander scoring system, RAND 36-Item Health Survey, and Visual Analogue Scale (VAS) to measure pain and function after a minimum of 1 year of follow-up. Twenty-four (17%) of 140 patients had positive standardized 7.5-Nm ER stress tests after malleolar fixation. The stress view was positive three times on tibiotalar clear space, seven times on tibiofibular clear space, and 14 times on both tibiotalar and tibiofibular clear spaces. There was no significant difference between the two randomization groups with regard to the Olerud-Molander functional score, the VAS scales measuring pain and function, or RAND 36-Item Health Survey pain or physical function at 1 year. Relevant syndesmotic injuries are rare in supination-external rotation ankle fractures, and syndesmotic transfixion with a screw did not influence the functional outcome or pain at the 1-year follow-up compared with no fixation.

  13. 7 CFR 1610.5 - Minimum Bank loan.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Minimum Bank loan. 1610.5 Section 1610.5 Agriculture Regulations of the Department of Agriculture (Continued) RURAL TELEPHONE BANK, DEPARTMENT OF AGRICULTURE LOAN POLICIES § 1610.5 Minimum Bank loan. A Bank loan will not be made unless the applicant qualifies for a Bank...

  14. Minimum Wage Effects in the Longer Run

    Science.gov (United States)

    Neumark, David; Nizalova, Olena

    2007-01-01

    Exposure to minimum wages at young ages could lead to adverse longer-run effects via decreased labor market experience and tenure, and diminished education and training, while beneficial longer-run effects could arise if minimum wages increase skill acquisition. Evidence suggests that as individuals reach their late 20s, they earn less the longer…

  15. Features of Random Metal Nanowire Networks with Application in Transparent Conducting Electrodes

    KAUST Repository

    Maloth, Thirupathi

    2017-05-01

    Among the alternatives to conventional Indium Tin Oxide (ITO) used in making transparent conducting electrodes, random metal nanowire (NW) networks are considered superior, offering performance on par with ITO. The performance is measured in terms of sheet resistance and optical transmittance. However, as the electrical properties of such random networks arise from a percolation network, a minimum electrode size is needed so that it exceeds the representative volume element (RVE) of the material and the macroscopic electrical properties are achieved. There is not much information about the compatibility of this minimum RVE size with the resolution actually needed in electronic devices. Furthermore, the efficiency of NWs in terms of electrical conduction is overlooked. In this work, we address the above industrially relevant questions: 1) the minimum size of electrodes that can be made, based on the dimensions of the NWs and the material coverage; for this, we propose a morphology-based classification for defining the RVE size and compare it with one based on the stabilization of macroscopic electrical properties; 2) the amount of NWs that do not participate in electrical conduction, hence of no practical use. The results presented in this thesis are a design guide for experimentalists to design transparent electrodes with more optimal usage of the material.

  16. 29 CFR 783.43 - Computation of seaman's minimum wage.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Computation of seaman's minimum wage. 783.43 Section 783.43...'s minimum wage. Section 6(b) requires, under paragraph (2) of the subsection, that an employee...'s minimum wage requirements by reason of the 1961 Amendments (see §§ 783.23 and 783.26). Although...

  17. 12 CFR 931.3 - Minimum investment in capital stock.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Minimum investment in capital stock. 931.3... CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL STOCK § 931.3 Minimum investment in capital stock. (a) A Bank shall require each member to maintain a minimum investment in the capital stock of the Bank, both...

  18. Misuse of randomization

    DEFF Research Database (Denmark)

    Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian

    2002-01-01

    The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. Search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B...... virus (HBV) infection were identified that tested Chinese medicinal herbs. They were published in 49 Chinese journals. Only 10% (18/176) of the studies reported the method by which they randomized patients. Only two reported allocation concealment and were considered as adequate. Twenty percent (30...

  19. The effect of fish oil on two-step tuberculin test in hospitalized patients: A randomized controlled trial

    Directory of Open Access Journals (Sweden)

    N. Arfa

    2018-02-01

    Full Text Available Background: According to the national tuberculosis control guide, a two-step tuberculin skin test (TST) should be done in the elderly if the initial test is negative. However, this raises questions about the usefulness of this approach. Objective: This study aimed to explore the effects of fish oil supplements on the two-step tuberculin test in hospitalized patients. Methods: In this randomized controlled clinical trial, 128 patients were randomly allocated to a control group (receiving placebo, n=64) and a treatment group (receiving fish oil supplements, n=64) during 2016. The fish oil supplement group was treated with 2 g daily for 4 consecutive days. The outcome was a change in two sequential TST induration sizes; an increase of 6 mm or more in the secondary induration compared with the primary was considered significant. Findings: There was a significant difference between the primary and secondary indurations of the two groups (higher in the treatment group; P=0.04). According to the results of analysis of variance and correlation tests, two effective factors were identified: initial induration and residence location (P=0.014 and P=0.002, respectively). In both groups, no clinically significant increase in the size of induration was observed. Conclusion: It seems that the number of cases considered as infected with tuberculosis does not increase with a two-step rather than a one-step tuberculin skin test. Also, short-term administration of fish oil supplements does not change this result.

  20. Minimum-Cost Reachability for Priced Timed Automata

    DEFF Research Database (Denmark)

    Behrmann, Gerd; Fehnker, Ansgar; Hune, Thomas Seidelin

    2001-01-01

    This paper introduces the model of linearly priced timed automata as an extension of timed automata, with prices on both transitions and locations. For this model we consider the minimum-cost reachability problem: i.e. given a linearly priced timed automaton and a target state, determine...... the minimum cost of executions from the initial state to the target state. This problem generalizes the minimum-time reachability problem for ordinary timed automata. We prove decidability of this problem by offering an algorithmic solution, which is based on a combination of branch-and-bound techniques...... and a new notion of priced regions. The latter allows symbolic representation and manipulation of reachable states together with the cost of reaching them....

  1. Quality pseudo-random number generator

    International Nuclear Information System (INIS)

    Tarasiuk, J.

    1996-01-01

    The pseudo-random number generator (RNG) was written to match the needs of nuclear and high-energy physics computations, which in some cases require very long and independent random number sequences. In this random number generator the repetition period is about 10^36, which should be sufficient for all computers in the world. In this article, test results for RNG correlations, speed, and identity of computations on PC, Sun4 and VAX computers are presented.
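
    The abstract quotes a repetition period of about 10^36 without specifying the algorithm. For scale, Marsaglia's xorshift128 — a standard generator shown here purely as an illustration, not the generator described — has period 2^128 − 1 ≈ 3.4 × 10^38:

```python
def xorshift128(seed=(1, 2, 3, 4)):
    """Marsaglia's xorshift128: 32-bit outputs, period 2**128 - 1.
    Illustrative only -- the article's generator is not specified."""
    x, y, z, w = seed
    mask = 0xFFFFFFFF
    while True:
        t = (x ^ ((x << 11) & mask)) & mask
        x, y, z = y, z, w
        w = (w ^ (w >> 19) ^ t ^ (t >> 8)) & mask
        yield w

gen = xorshift128()
sample = [next(gen) for _ in range(5)]
```

    The same seed always reproduces the same stream, which is exactly the "identity of computations" property tested across the PC, Sun4 and VAX machines.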

  2. Adaptive Mean Queue Size and Its Rate of Change: Queue Management with Random Dropping

    OpenAIRE

    Karmeshu; Patel, Sanjeev; Bhatnagar, Shalabh

    2016-01-01

    The Random early detection (RED) active queue management (AQM) scheme uses the average queue size to calculate the dropping probability in terms of minimum and maximum thresholds. The effect of heavy load enhances the frequency of crossing the maximum threshold value resulting in frequent dropping of the packets. An adaptive queue management with random dropping (AQMRD) algorithm is proposed which incorporates information not just about the average queue size but also the rate of change of th...
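
    RED's dropping rule is fully determined by the two thresholds and the EWMA-averaged queue size: no drops below the minimum threshold, a linearly growing drop probability between the thresholds, and certain drop above the maximum. A sketch of the standard formulas (parameter values illustrative):

```python
def ewma_queue(avg_q, inst_q, w=0.002):
    """RED's average queue size: exponentially weighted moving average."""
    return (1 - w) * avg_q + w * inst_q

def red_drop_probability(avg_q, min_th, max_th, p_max=0.1):
    """Drop probability as a function of the averaged queue size."""
    if avg_q < min_th:
        return 0.0   # below min threshold: never drop
    if avg_q >= max_th:
        return 1.0   # above max threshold: always drop
    return p_max * (avg_q - min_th) / (max_th - min_th)
```

    The AQMRD extension described above adds the rate of change of the average as a second input; the sketch covers only classic RED.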

  3. Employment Effects of Minimum and Subminimum Wages. Recent Evidence.

    Science.gov (United States)

    Neumark, David

    Using a specially constructed panel data set on state minimum wage laws and labor market conditions, Neumark and Wascher (1992) presented evidence that countered the claim that minimum wages could be raised with no cost to employment. They concluded that estimates indicating that minimum wages reduced employment on the order of 1-2 percent for a…

  4. Minimum Wage Effects on Educational Enrollments in New Zealand

    Science.gov (United States)

    Pacheco, Gail A.; Cruickshank, Amy A.

    2007-01-01

    This paper empirically examines the impact of minimum wages on educational enrollments in New Zealand. A significant reform to the youth minimum wage since 2000 has resulted in some age groups undergoing a 91% rise in their real minimum wage over the last 10 years. Three panel least squares multivariate models are estimated from a national sample…

  5. Zero forcing parameters and minimum rank problems

    NARCIS (Netherlands)

    Barioli, F.; Barrett, W.; Fallat, S.M.; Hall, H.T.; Hogben, L.; Shader, B.L.; Driessche, van den P.; Holst, van der H.

    2010-01-01

    The zero forcing number Z(G), which is the minimum number of vertices in a zero forcing set of a graph G, is used to study the maximum nullity/minimum rank of the family of symmetric matrices described by G. It is shown that for a connected graph of order at least two, no vertex is in every zero

  6. Minimum bias measurement at 13 TeV

    CERN Document Server

    Orlando, Nicola; The ATLAS collaboration

    2017-01-01

    The modelling of Minimum Bias (MB) is a crucial ingredient to learn about the description of soft QCD processes and to simulate the environment at the LHC with many concurrent pp interactions (pile-up). We summarise the ATLAS minimum bias measurements with proton-proton collisions at 13 TeV centre-of-mass energy at the Large Hadron Collider.

  7. Minimum area requirements for an at-risk butterfly based on movement and demography.

    Science.gov (United States)

    Brown, Leone M; Crone, Elizabeth E

    2016-02-01

    Determining the minimum area required to sustain populations has a long history in theoretical and conservation biology. Correlative approaches are often used to estimate minimum area requirements (MARs) based on relationships between area and the population size required for persistence or between species' traits and distribution patterns across landscapes. Mechanistic approaches to estimating MAR facilitate prediction across space and time but are few. We used a mechanistic MAR model to determine the critical minimum patch size (CMP) for the Baltimore checkerspot butterfly (Euphydryas phaeton), a locally abundant species in decline along its southern range, and sister to several federally listed species. Our CMP is based on principles of diffusion, where individuals in smaller patches encounter edges and leave with higher probability than those in larger patches, potentially before reproducing. We estimated a CMP for the Baltimore checkerspot of 0.7-1.5 ha, in accordance with trait-based MAR estimates. The diffusion rate on which we based this CMP was broadly similar when estimated at the landscape scale (comparing flight path vs. capture-mark-recapture data), and the estimated population growth rate was consistent with observed site trends. Our mechanistic approach to estimating MAR is appropriate for species whose movement follows a correlated random walk and may be useful where landscape-scale distributions are difficult to assess, but demographic and movement data are obtainable from a single site or the literature. Just as simple estimates of lambda are often used to assess population viability, the principles of diffusion and CMP could provide a starting place for estimating MAR for conservation. © 2015 Society for Conservation Biology.
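
    The diffusion argument above — individuals in small patches encounter edges and leave, possibly before reproducing — is the classic critical patch size problem. The 1-D KISS/Skellam formula below is the textbook version of such a threshold, shown for orientation only; the paper's own model differs in detail:

```python
import math

def critical_patch_length(D, r):
    """1-D KISS critical patch size L* = pi * sqrt(D / r): below L*,
    emigration through the edges outpaces population growth.
    D: diffusion coefficient (length^2/time), r: growth rate (1/time)."""
    return math.pi * math.sqrt(D / r)
```

    Faster diffusion (more edge encounters per unit time) raises the critical size, while faster reproduction lowers it — the same trade-off that sets the butterfly's 0.7-1.5 ha estimate.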

  8. A sequential logic circuit for coincidences randomly distributed in 'time' and 'duration', with selection and total sampling

    International Nuclear Information System (INIS)

    Carnet, Bernard; Delhumeau, Michel

    1971-06-01

    The principles of binary analysis applied to the investigation of sequential circuits were used to design a two-way coincidence circuit whose inputs may be random or periodic signals of constant or variable duration. The output signal strictly reproduces the characteristics of the input signal triggering the coincidence. A coincidence between input signals does not produce any output signal if one of the signals has already triggered the output signal. The characteristics of the output signal in relation to those of the input signal are: minimum time jitter, excellent duration reproducibility and maximum efficiency. Some rules are given for achieving these results. The symmetry, transitivity and non-transitivity characteristics of the edges on the primitive graph are analyzed and lead to some rules for positioning the states on a secondary graph. It is from this graph that the equations of the circuits can be calculated. The development of the circuit and its dynamic testing are discussed. For this testing, the functioning of the circuit is simulated by feeding randomly generated signals into the inputs.

  9. Minimum airflow reset of single-duct VAV terminal boxes

    Science.gov (United States)

    Cho, Young-Hum

    Single duct Variable Air Volume (VAV) systems are currently the most widely used type of HVAC system in the United States. When installing such a system, it is critical to determine the minimum airflow set point of the terminal box, as an optimally selected set point will improve the level of thermal comfort and indoor air quality (IAQ) while at the same time lower overall energy costs. In principle, this minimum rate should be calculated according to the minimum ventilation requirement based on ASHRAE standard 62.1 and maximum heating load of the zone. Several factors must be carefully considered when calculating this minimum rate. Terminal boxes with conventional control sequences may result in occupant discomfort and energy waste. If the minimum rate of airflow is set too high, the AHUs will consume excess fan power, and the terminal boxes may cause significant simultaneous room heating and cooling. At the same time, a rate that is too low will result in poor air circulation and indoor air quality in the air-conditioned space. Currently, many scholars are investigating how to change the algorithm of the advanced VAV terminal box controller without retrofitting. Some of these controllers have been found to effectively improve thermal comfort, indoor air quality, and energy efficiency. However, minimum airflow set points have not yet been identified, nor has controller performance been verified in confirmed studies. In this study, control algorithms were developed that automatically identify and reset terminal box minimum airflow set points, thereby improving indoor air quality and thermal comfort levels, and reducing the overall rate of energy consumption. A theoretical analysis of the optimal minimum airflow and discharge air temperature was performed to identify the potential energy benefits of resetting the terminal box minimum airflow set points. Applicable control algorithms for calculating the ideal values for the minimum airflow reset were developed and
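    The sizing principle stated above, that the minimum flow must satisfy both the ASHRAE 62.1 ventilation requirement and the zone's design heating load, can be sketched as a simple calculation. This is not the study's reset algorithm (which the abstract does not give); the air properties and zone numbers are assumed for illustration.

```python
def minimum_airflow_setpoint(vent_flow_m3s, heating_load_w,
                             discharge_temp_c, room_temp_c,
                             rho=1.2, cp=1006.0):
    """Terminal-box minimum airflow (m^3/s): the larger of the
    ventilation requirement and the flow needed to deliver the design
    heating load at the given discharge temperature, from
    Q = rho * cp * V * (T_discharge - T_room)."""
    dt = discharge_temp_c - room_temp_c
    if dt <= 0:
        raise ValueError("discharge temperature must exceed room temperature")
    heating_flow = heating_load_w / (rho * cp * dt)
    return max(vent_flow_m3s, heating_flow)

# Illustrative zone: 0.05 m^3/s ventilation requirement, 1.5 kW design
# heating load, 35 C discharge air into a 21 C room.
v_min = minimum_airflow_setpoint(0.05, 1500.0, 35.0, 21.0)
print(round(v_min, 3))
```

    Here the heating-load flow governs; if the ventilation requirement were the larger of the two, it would set the minimum instead, which is exactly the trade-off the reset logic has to manage.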

  10. Minimum wall pressure coefficient of orifice plate energy dissipater

    Directory of Open Access Journals (Sweden)

    Wan-zheng Ai

    2015-01-01

    Orifice plate energy dissipaters have been successfully used in large-scale hydropower projects due to their simple structure, convenient construction procedure, and high energy dissipation ratio. The minimum wall pressure coefficient of an orifice plate indirectly reflects its cavitation characteristics: the lower the minimum wall pressure coefficient, the better the orifice plate resists cavitation damage. It is therefore important to study this coefficient. In this study, the coefficient and related parameters, such as the contraction ratio, defined as the ratio of the orifice plate diameter to the flood-discharging tunnel diameter; the relative thickness, defined as the ratio of the orifice plate thickness to the tunnel diameter; and the Reynolds number of the flow through the orifice plate, were theoretically analyzed, and their relationships were obtained through physical model experiments. It can be concluded that the minimum wall pressure coefficient is dominated mainly by the contraction ratio and the relative thickness: the lower the contraction ratio and relative thickness, the larger the minimum wall pressure coefficient. The effect of the Reynolds number on the minimum wall pressure coefficient can be neglected when it is larger than 10^5. An empirical expression for calculating the minimum wall pressure coefficient is presented.

  11. Conditions of Minimum Wage Indexation in Czech and Slovak Legislation in the Context of Business Economics

    Directory of Open Access Journals (Sweden)

    Pernica Martin

    2016-12-01

    The aim of the article is to assess, on the basis of a comparison of Czech and Slovak legislation governing the conditions of minimum wage indexation, whether it would be appropriate to adopt certain aspects of Slovak legislation into Czech legislation and vice versa. Data were collected by addressing major employers in the South Moravian Region, and comparison served as the principal method. Analyses were processed using data from the Czech Statistical Office, the European Statistical Office, and the Ministry of Labour and Social Affairs of the Czech Republic. To evaluate the research, the percentage representation of positive and negative responses and Pearson's chi-square test were used. The paper presents the results of research whose aim was to obtain the views of entrepreneurs on the minimum wage level and the conditions of its indexation. Employers supported maintaining the institution of the minimum wage. A predominant portion of companies would welcome the minimum wage being derived from the average wage, and the vast majority would welcome annual indexation of the minimum wage by inflation.
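    Pearson's chi-square test mentioned above can be sketched in a few lines. The survey counts below are hypothetical (e.g. firm size versus support for annual indexation), not the study's data.

```python
def pearson_chi_square(table):
    """Pearson's chi-square statistic and degrees of freedom for a 2-D
    contingency table given as a list of rows of observed counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand   # expected count
            stat += (obs - exp) ** 2 / exp
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical responses: rows = small/large firms,
# columns = support / oppose annual indexation by inflation.
stat, df = pearson_chi_square([[40, 10], [30, 20]])
# df = 1, so independence is rejected at the 5% level if stat > 3.841.
print(round(stat, 3), df)
```

    The same statistic applies directly to the positive/negative response tabulations described in the abstract.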

  12. A permutation test to analyse systematic bias and random measurement errors of medical devices via boosting location and scale models.

    Science.gov (United States)

    Mayr, Andreas; Schmid, Matthias; Pfahlberg, Annette; Uter, Wolfgang; Gefeller, Olaf

    2017-06-01

    Measurement errors of medico-technical devices can be separated into systematic bias and random error. We propose a new method to address both simultaneously via generalized additive models for location, scale and shape (GAMLSS) in combination with permutation tests. More precisely, we extend a recently proposed boosting algorithm for GAMLSS to provide a test procedure to analyse potential device effects on the measurements. We carried out a large-scale simulation study to provide empirical evidence that our method is able to identify possible sources of systematic bias as well as random error under different conditions. Finally, we apply our approach to compare measurements of skin pigmentation from two different devices in an epidemiological study.
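    The boosting-GAMLSS machinery itself is not reproduced here, but the underlying idea, permutation tests that probe location (systematic bias) and scale (random error) separately, can be sketched as follows. The device readings are invented for illustration: device B reads about 2 units high with the same spread as device A.

```python
import random

def perm_test(a, b, stat, n_perm=2000, seed=0):
    """Two-sample permutation p-value for an arbitrary statistic,
    using the absolute between-group difference of stat."""
    rng = random.Random(seed)
    pooled = a + b
    observed = abs(stat(a) - stat(b))
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # relabel at random
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(stat(pa) - stat(pb)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)              # add-one p-value

mean = lambda x: sum(x) / len(x)
var = lambda x: sum((v - mean(x)) ** 2 for v in x) / (len(x) - 1)

dev_a = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2, 9.7, 10.1]
dev_b = [12.0, 12.3, 11.9, 12.1, 12.2, 11.8, 12.0, 12.4]
p_location = perm_test(dev_a, dev_b, mean)   # probes systematic bias
p_scale = perm_test(dev_a, dev_b, var)       # probes random error
print(p_location, p_scale)
```

    On these invented data the location test flags the bias while the scale test does not, mirroring the bias/random-error separation the paper formalises within GAMLSS.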

  13. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    The random walk hypothesis states that stock market prices do not follow a predictable trajectory but are simply random. Before trying to predict a series, one should test it for randomness: however powerful and complex the models used, their results cannot be trusted if the data are random. Several methods exist for testing this hypothesis, and the computational power provided by the R environment makes the researcher's work easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.

  14. An empirical test of pseudo random number generators by means of an exponential decaying process; Una prueba empirica de generadores de numeros pseudoaleatorios mediante un proceso de decaimiento exponencial

    Energy Technology Data Exchange (ETDEWEB)

    Coronel B, H.F.; Hernandez M, A.R.; Jimenez M, M.A. [Facultad de Fisica e Inteligencia Artificial, Universidad Veracruzana, A.P. 475, Xalapa, Veracruz (Mexico); Mora F, L.E. [CIMAT, A.P. 402, 36000 Guanajuato (Mexico)]. e-mail: hcoronel@uv.mx

    2007-07-01

    Empirical tests for pseudo random number generators based on the use of processes or physical models have been successfully used and are considered as complementary to theoretical tests of randomness. In this work a statistical methodology for evaluating the quality of pseudo random number generators is presented. The method is illustrated in the context of the so-called exponential decay process, using some pseudo random number generators commonly used in physics. (Author)
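    The abstract does not spell out the statistical methodology, but the general idea, driving a physical model with the generator under test and comparing the recovered parameter with the truth, can be sketched for exponential decay. Inverse-transform sampling is used; the sample size and decay constant are arbitrary choices.

```python
import math
import random

def estimate_decay_constant(lam, n, rng):
    """Simulate n exponential decay times via inverse-transform
    sampling, t = -ln(1 - U) / lam, using the supplied PRNG, and
    return the estimated decay constant (1 / mean lifetime)."""
    total = 0.0
    for _ in range(n):
        total += -math.log(1.0 - rng.random()) / lam
    return n / total

rng = random.Random(42)               # the generator under test
lam_hat = estimate_decay_constant(lam=0.5, n=20000, rng=rng)
print(lam_hat)
```

    A generator that fails such a test would produce a recovered decay constant that deviates from the true one by far more than the expected statistical error of about lam / sqrt(n).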

  15. 76 FR 15368 - Minimum Security Devices and Procedures

    Science.gov (United States)

    2011-03-21

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Minimum Security Devices and Procedures... concerning the following information collection. Title of Proposal: Minimum Security Devices and Procedures... security devices and procedures to discourage robberies, burglaries, and larcenies, and to assist in the...

  16. Some Tests of Random Walk Hypothesis for Bulgarian Foreign Exchange Rates

    OpenAIRE

    Nikolai Gueorguiev

    1993-01-01

    The objective of this paper is to check if the exchange rate in newly emerged, relatively thin foreign exchange markets, follows a random walk pattern. The findings of the current study cast doubts on random walk presence in Bulgarian exchange rates against major international currencies. It turns out that the series of daily returns are stationary but correlated and therefore can be modelled better by higher-order ARIMA processes than by random walk.
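    A basic diagnostic behind findings of this kind is the sample lag-1 autocorrelation of daily returns, which should be near zero (within roughly ±1.96/√n) under a random walk; correlated returns, as reported here, favour an ARIMA description. A minimal sketch:

```python
def lag1_autocorr(x):
    """Sample lag-1 autocorrelation. A value near zero (within about
    1.96/sqrt(n)) is consistent with uncorrelated, random-walk-style
    returns; larger magnitudes suggest serial correlation."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t + 1] - mean) for t in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return num / den

# A strictly alternating return series is strongly negatively
# correlated, the opposite of random-walk behaviour:
r1 = lag1_autocorr([1, -1] * 5)
print(r1)
```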

  17. A Hybrid Optimized Weighted Minimum Spanning Tree for the Shortest Intrapath Selection in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Matheswaran Saravanan

    2014-01-01

    A wireless sensor network (WSN) consists of sensor nodes that need energy-efficient routing techniques, as they have limited battery power, computing, and storage resources. WSN routing protocols should enable reliable multihop communication under energy constraints. Clustering is an effective way to reduce overheads, and when it is aided by effective resource allocation it results in reduced energy consumption. In this work, a novel hybrid evolutionary algorithm called Bee Algorithm-Simulated Annealing Weighted Minimal Spanning Tree (BASA-WMST) routing is proposed, in which randomly deployed sensor nodes are split into the best possible number of independent clusters, each with a cluster head and an optimal route. The cluster head gathers data from the sensors belonging to its cluster and forwards them to the sink. The shortest intrapath for each cluster is selected using a Weighted Minimum Spanning Tree (WMST). The proposed algorithm computes the distance-based Minimum Spanning Tree (MST) of the weighted graph for the multihop network. The weights are dynamically changed based on the energy level of each sensor during route selection and optimized using the proposed bee algorithm-simulated annealing algorithm.
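    The bee-algorithm and annealing components are specific to the paper, but the WMST building block it relies on can be sketched with plain Prim's algorithm. The toy graph below (node names and weights) is illustrative only; in the paper the weights would be energy-dependent and updated dynamically.

```python
import heapq

def prim_mst(adj):
    """Minimum spanning tree via Prim's algorithm.
    adj: {node: [(weight, neighbour), ...]} for an undirected graph.
    Returns (total weight, list of (u, v, weight) tree edges)."""
    start = next(iter(adj))
    visited = {start}
    frontier = [(w, start, v) for w, v in adj[start]]
    heapq.heapify(frontier)
    total, tree = 0, []
    while frontier and len(visited) < len(adj):
        w, u, v = heapq.heappop(frontier)   # cheapest edge out of tree
        if v in visited:
            continue
        visited.add(v)
        total += w
        tree.append((u, v, w))
        for w2, nxt in adj[v]:
            if nxt not in visited:
                heapq.heappush(frontier, (w2, v, nxt))
    return total, tree

# Toy 4-node network; weights might stand for energy-scaled distances.
adj = {
    'sink': [(1, 'a'), (4, 'b')],
    'a': [(1, 'sink'), (2, 'b'), (5, 'c')],
    'b': [(4, 'sink'), (2, 'a'), (3, 'c')],
    'c': [(5, 'a'), (3, 'b')],
}
total, tree = prim_mst(adj)
print(total, tree)
```

    Recomputing the MST after each weight update is one simple way to realise the dynamic, energy-aware route selection the abstract describes.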

  18. The potential for increased power from combining P-values testing the same hypothesis.

    Science.gov (United States)

    Ganju, Jitendra; Julie Ma, Guoguang

    2017-02-01

    The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one candidate test statistic for the null hypothesis of no treatment effect, and we do not know which is the most powerful. Rather than relying on a single p-value, p-values from several prespecified test statistics can be combined for inference. Combining functions include Fisher's combination test and the minimum p-value. With randomization-based tests, the increase in power over a single test or Simes's method can be remarkable. The method is versatile in that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to favor combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply when the model includes a treatment-by-covariate interaction.
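    The two combining functions named above are easy to sketch. Note that the closed-form chi-square calibration of Fisher's statistic and the Sidak form of the minimum p-value both assume independent p-values; the paper instead calibrates by randomization, which handles dependent statistics.

```python
import math

def fisher_combined(pvalues):
    """Fisher's combination: X = -2 * sum(ln p) is chi-square with
    2k degrees of freedom under the global null (independent tests).
    Returns (statistic, combined p-value); for even df the chi-square
    survival function has the closed form
    exp(-x/2) * sum_{i<k} (x/2)^i / i!."""
    k = len(pvalues)
    stat = -2.0 * sum(math.log(p) for p in pvalues)
    half = stat / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return stat, math.exp(-half) * total

def min_p_sidak(pvalues):
    """Minimum-p combination with a Sidak adjustment (valid for
    independent tests)."""
    k = len(pvalues)
    return 1.0 - (1.0 - min(pvalues)) ** k

stat, p_fisher = fisher_combined([0.5, 0.5])
print(round(p_fisher, 4), round(min_p_sidak([0.5, 0.5]), 4))
```

    In the randomization-based version, the combined statistic would be recomputed on each rerandomized dataset and the p-value taken from that reference distribution rather than from these closed forms.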

  19. Protocol for the "Michigan Awareness Control Study": A prospective, randomized, controlled trial comparing electronic alerts based on bispectral index monitoring or minimum alveolar concentration for the prevention of intraoperative awareness

    Directory of Open Access Journals (Sweden)

    Avidan Michael S

    2009-11-01

    Full Text Available Abstract Background The incidence of intraoperative awareness with explicit recall is 1-2/1000 cases in the United States. The Bispectral Index monitor is an electroencephalographic method of assessing anesthetic depth that has been shown in one prospective study to reduce the incidence of awareness in the high-risk population. In the B-Aware trial, the number needed to treat in order to prevent one case of awareness in the high-risk population was 138. Since the number needed to treat and the associated cost of treatment would be much higher in the general population, the efficacy of the Bispectral Index monitor in preventing awareness in all anesthetized patients needs to be clearly established. This is especially true given the findings of the B-Unaware trial, which demonstrated no significant difference between protocols based on the Bispectral Index monitor or minimum alveolar concentration for the reduction of awareness in high risk patients. Methods/Design To evaluate efficacy in the general population, we are conducting a prospective, randomized, controlled trial comparing the Bispectral Index monitor to a non-electroencephalographic gauge of anesthetic depth. The total recruitment for the study is targeted for 30,000 patients at both low and high risk for awareness. We have developed a novel algorithm that is capable of real-time analysis of our electronic perioperative information system. In one arm of the study, anesthesia providers will receive an electronic page if the Bispectral Index value is >60. In the other arm of the study, anesthesia providers will receive a page if the age-adjusted minimum alveolar concentration is Discussion Awareness during general anesthesia is a persistent problem and the role of the Bispectral Index monitor in its prevention is still unclear. The Michigan Awareness Control Study is the largest prospective trial of awareness prevention ever conducted. Trial Registration Clinical Trial NCT00689091

  20. Minimum Time Trajectory Optimization of CNC Machining with Tracking Error Constraints

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2014-01-01

    An off-line optimization approach for high-precision minimum-time feedrate in CNC machining is proposed. Besides the commonly considered velocity, acceleration, and jerk constraints, a dynamic performance constraint for each servo drive is included in the optimization problem to improve tracking precision along the optimized feedrate trajectory. Tracking error is used to indicate the servo dynamic performance of each axis. By using variable substitution, the tracking-error-constrained minimum-time trajectory planning problem is formulated as a nonlinear path-constrained optimal control problem. The bang-bang structure of the optimal trajectory is proved in this paper, and a novel constraint handling method is then proposed to enable a convex-optimization-based solution of the nonlinear constrained optimal control problem. A simple elliptical feedrate planning test demonstrates the effectiveness of the approach, and the practicality and robustness of the generated trajectory are demonstrated with a butterfly contour machining example.
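    The paper's tracking-error-constrained convex formulation is beyond a short snippet, but the bang-bang structure it proves has a familiar single-axis, rest-to-rest special case: accelerate at the limit, cruise at the velocity limit if the path is long enough, then decelerate at the limit. The limits below are assumed values for illustration.

```python
import math

def min_time_profile(length, v_max, a_max):
    """Minimum travel time over a path of the given length under
    velocity and acceleration limits, assuming rest-to-rest bang-bang
    motion (accelerate at a_max, optionally cruise at v_max,
    decelerate at a_max)."""
    accel_dist = v_max ** 2 / a_max   # distance to reach and shed v_max
    if length >= accel_dist:
        # Trapezoidal profile: accelerate, cruise, decelerate.
        return length / v_max + v_max / a_max
    # Triangular profile: the velocity limit is never reached.
    return 2.0 * math.sqrt(length / a_max)

print(min_time_profile(100.0, 10.0, 5.0))   # long path: trapezoid
print(min_time_profile(5.0, 10.0, 5.0))     # short path: triangle
```

    The paper's contribution is, in effect, to retain this bang-bang character while adding per-axis tracking-error constraints along an arbitrary tool path.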

  1. 76 FR 30243 - Minimum Security Devices and Procedures

    Science.gov (United States)

    2011-05-24

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Minimum Security Devices and Procedures.... Title of Proposal: Minimum Security Devices and Procedures. OMB Number: 1550-0062. Form Number: N/A... respect to the installation, maintenance, and operation of security devices and procedures to discourage...

  2. Does increasing the minimum wage reduce poverty in developing countries?

    OpenAIRE

    Gindling, T. H.

    2014-01-01

    Do minimum wage policies reduce poverty in developing countries? It depends. Raising the minimum wage could increase or decrease poverty, depending on labor market characteristics. Minimum wages target formal sector workers—a minority of workers in most developing countries—many of whom do not live in poor households. Whether raising minimum wages reduces poverty depends not only on whether formal sector workers lose jobs as a result, but also on whether low-wage workers live in poor househol...

  3. Genetic test feedback with weight control advice: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Meisel Susanne F

    2012-12-01

    Abstract Background Genetic testing for risk of weight gain is already available over the internet despite uncertain benefits and concerns about adverse emotional or behavioral effects. Few studies have assessed the effect of adding genetic test feedback to weight control advice, even though one of the proposed applications of genetic testing is to stimulate preventive action. This study will investigate the motivational effect of adding genetic test feedback to simple weight control advice in a situation where weight gain is relatively common. Methods/design First-year university students (n = 800) will be randomized to receive either (1) their personal genetic test result for a gene (FTO) related to weight gain susceptibility in addition to a leaflet with simple weight control advice (‘Feedback + Advice’ group, FA), or (2) only the leaflet containing simple weight control advice (‘Advice Only’ group, AO). Motivation to avoid weight gain and active use of weight control strategies will be assessed one month after receipt of the leaflet with or without genetic test feedback. Weight and body fat will be measured at baseline and eight months follow-up. We will also assess short-term psychological reactions to the genetic test result. In addition, we will explore interactions between feedback condition and gene test status. Discussion We hope to provide a first indication of the clinical utility of weight-related genetic test feedback in the prevention context. Trial registration Current controlled trials ISRCTN91178663

  4. The SME gauge sector with minimum length

    Science.gov (United States)

    Belich, H.; Louzada, H. L. C.

    2017-12-01

    We study the gauge sector of the Standard Model Extension (SME) with the Lorentz covariant deformed Heisenberg algebra associated to the minimum length. In order to find and estimate corrections, we clarify whether the violation of Lorentz symmetry and the existence of a minimum length are independent phenomena or are, in some way, related. With this goal, we analyze the dispersion relations of this theory.

  5. The SME gauge sector with minimum length

    Energy Technology Data Exchange (ETDEWEB)

    Belich, H.; Louzada, H.L.C. [Universidade Federal do Espirito Santo, Departamento de Fisica e Quimica, Vitoria, ES (Brazil)

    2017-12-15

    We study the gauge sector of the Standard Model Extension (SME) with the Lorentz covariant deformed Heisenberg algebra associated to the minimum length. In order to find and estimate corrections, we clarify whether the violation of Lorentz symmetry and the existence of a minimum length are independent phenomena or are, in some way, related. With this goal, we analyze the dispersion relations of this theory. (orig.)

  6. Akibat Hukum Bagi Bank Bila Kewajiban Modal Inti Minimum Tidak Terpenuhi

    Directory of Open Access Journals (Sweden)

    Indira Retno Aryatie

    2012-02-01

    As an implementation of the Indonesian Banking Architecture policy, the government issued Bank Indonesia Regulation No. 9/16/PBI/2007 on Minimum Tier One Capital, which raises the minimum tier one capital of commercial banks to 100 billion rupiah. This paper discusses the legal consequences a bank will face if it fails to meet this minimum capital requirement.

  7. The impact of minimum wage adjustments on Vietnamese wage inequality

    DEFF Research Database (Denmark)

    Hansen, Henrik; Rand, John; Torm, Nina

    Using Vietnamese Labour Force Survey data we analyse the impact of minimum wage changes on wage inequality. Minimum wages serve to reduce local wage inequality in the formal sectors by decreasing the gap between the median wages and the lower tail of the local wage distributions. In contrast, local wage inequality is increased in the informal sectors. Overall, the minimum wages decrease national wage inequality. Our estimates indicate a decrease in the wage distribution Gini coefficient of about 2 percentage points and an increase in the 10/50 wage ratio of 5-7 percentage points caused by the adjustment of the minimum wages from 2011 to 2012 that levelled the minimum wage across economic sectors.

  8. Risk control and the minimum significant risk

    International Nuclear Information System (INIS)

    Seiler, F.A.; Alvarez, J.L.

    1996-01-01

    Risk management implies that the risk manager can, by his actions, exercise at least a modicum of control over the risk in question. In the terminology of control theory, a management action is a control signal imposed as feedback on the system to bring about a desired change in the state of the system. In the terminology of risk management, an action is taken to bring a predicted risk to lower values. Even if it is assumed that the management action taken is 100% effective and that the projected risk reduction is infinitely well known, there is a lower limit to the desired effects that can be achieved. It is based on the fact that all risks, such as the incidence of cancer, exhibit a degree of variability due to a number of extraneous factors such as age at exposure, sex, location, and some lifestyle parameters such as smoking or the consumption of alcohol. If the control signal is much smaller than the variability of the risk, the signal is lost in the noise and control is lost. This defines a minimum controllable risk based on the variability of the risk over the population considered. This quantity is the counterpart of the minimum significant risk which is defined by the uncertainties of the risk model. Both the minimum controllable risk and the minimum significant risk are evaluated for radiation carcinogenesis and are shown to be of the same order of magnitude. For a realistic management action, the assumptions of perfectly effective action and perfect model prediction made above have to be dropped, resulting in an effective minimum controllable risk which is determined by both risk limits. Any action below that effective limit is futile, but it is also unethical due to the ethical requirement of doing more good than harm. Finally, some implications of the effective minimum controllable risk on the use of the ALARA principle and on the evaluation of remedial action goals are presented

  9. A unique concept for automatically controlling the braking action of wheeled vehicles during minimum distance stops

    Science.gov (United States)

    Barthlome, D. E.

    1975-01-01

    Test results of a unique automatic brake control system are outlined and a comparison is made of its mode of operation to that of an existing skid control system. The purpose of the test system is to provide automatic control of braking action such that hydraulic brake pressure is maintained at a near constant, optimum value during minimum distance stops.

  10. 42 CFR 84.197 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84.197... Cartridge Respirators § 84.197 Respirator containers; minimum requirements. Respirators shall be equipped with a substantial, durable container bearing markings which show the applicant's name, the type and...

  11. 42 CFR 84.174 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84.174... Air-Purifying Particulate Respirators § 84.174 Respirator containers; minimum requirements. (a) Except..., durable container bearing markings which show the applicant's name, the type of respirator it contains...

  12. The NO Regular Defibrillation testing In Cardioverter Defibrillator Implantation (NORDIC ICD) trial: concept and design of a randomized, controlled trial of intra-operative defibrillation testing during de novo defibrillator implantation.

    Science.gov (United States)

    Bänsch, Dietmar; Bonnemeier, Hendrik; Brandt, Johan; Bode, Frank; Svendsen, Jesper Hastrup; Felk, Angelika; Hauser, Tino; Wegscheider, Karl

    2015-01-01

    Although defibrillation (DF) testing is still considered a standard procedure during implantable cardioverter-defibrillator (ICD) insertion and has been an essential element of all trials that demonstrated the survival benefit of ICD therapy, there are no large randomized clinical trials demonstrating that DF testing improves clinical outcome and if the outcome would remain the same by omitting DF testing. Between February 2011 and July 2013, we randomly assigned 1077 patients to ICD implantation with (n = 540) or without (n = 537) DF testing. The intra-operative DF testing was standardized across all participating centres. After inducing a fast ventricular tachycardia (VT) with a heart rate ≥240 b.p.m. or ventricular fibrillation (VF) with a low-energy T-wave shock, DF was attempted with an initial 15 J shock. If the shock reversed the VT or VF, DF testing was considered successful and terminated. If unsuccessful, two effective 24 J shocks were administered. If DF was unsuccessful, the system was reconfigured and another DF testing was performed. An ICD shock energy of 40 J had to be programmed in all patients for treatment of spontaneous VT/VF episodes. The primary endpoint was the average efficacy of the first ICD shock for all true VT/VF episodes in each patient during follow-up. The secondary endpoints included the frequency of system revisions, total fluoroscopy, implantation time, procedural serious adverse events, and all-cause, cardiac, and arrhythmic mortality during follow-up. Home Monitoring was used in all patients to continuously monitor the system integrity, device programming and performance. The NO Regular Defibrillation testing In Cardioverter Defibrillator Implantation (NORDIC ICD) trial is one of two large prospective randomized trials assessing the effect of DF testing omission during ICD implantation. NCT01282918. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2014. For permissions please email

  13. Design of a minimum emittance nBA lattice

    Science.gov (United States)

    Lee, S. Y.

    1998-04-01

    An attempt to design a minimum-emittance n-bend achromat (nBA) lattice has been made. One distinct feature is that dipoles of two different lengths were used. As a multiple-bend achromat, five-bend achromat lattices with six superperiods were designed. The obtained emittance is three times larger than the theoretical minimum. Tunes were chosen to avoid third-order resonances. In order to correct first- and second-order chromaticities, eight families of sextupoles were placed. The obtained emittance of the five-bend achromat lattices is almost equal to the minimum emittance of a five-bend achromat lattice consisting of dipoles of equal length.

  14. Quantum mechanics the theoretical minimum

    CERN Document Server

    Susskind, Leonard

    2014-01-01

    From the bestselling author of The Theoretical Minimum, an accessible introduction to the math and science of quantum mechanicsQuantum Mechanics is a (second) book for anyone who wants to learn how to think like a physicist. In this follow-up to the bestselling The Theoretical Minimum, physicist Leonard Susskind and data engineer Art Friedman offer a first course in the theory and associated mathematics of the strange world of quantum mechanics. Quantum Mechanics presents Susskind and Friedman’s crystal-clear explanations of the principles of quantum states, uncertainty and time dependence, entanglement, and particle and wave states, among other topics. An accessible but rigorous introduction to a famously difficult topic, Quantum Mechanics provides a tool kit for amateur scientists to learn physics at their own pace.

  15. Minimum resolvable power contrast model

    Science.gov (United States)

    Qian, Shuai; Wang, Xia; Zhou, Jingjing

    2018-01-01

    Signal-to-noise ratio and MTF are important indices for evaluating the performance of optical systems. However, neither used alone nor assessed jointly can they intuitively describe the overall performance of the system. Therefore, an index reflecting comprehensive system performance is proposed: the Minimum Resolvable Radiation Performance Contrast (MRP) model. MRP is an evaluation model that does not involve the human eye. It starts from the radiance of the target and the background, transforms the target and background into equivalent strips, and accounts for attenuation by the atmosphere, the optical imaging system, and the detector. Combining these with the signal-to-noise ratio and the MTF yields the Minimum Resolvable Radiation Performance Contrast. Finally, the detection probability model of MRP is given.

  16. Invalid Permutation Tests

    Directory of Open Access Journals (Sweden)

    Mikel Aickin

    2010-01-01

    Full Text Available Permutation tests are often presented in a rather casual manner, in both introductory and advanced statistics textbooks. The appeal of the cleverness of the procedure seems to replace the need for a rigorous argument that it produces valid hypothesis tests. The consequence of this educational failing has been a widespread belief in a “permutation principle”, which is supposed invariably to give tests that are valid by construction, under an absolute minimum of statistical assumptions. Several lines of argument are presented here to show that the permutation principle itself can be invalid, concentrating on the Fisher-Pitman permutation test for two means. A simple counterfactual example illustrates the general problem, and a slightly more elaborate counterfactual argument is used to explain why the main mathematical proof of the validity of permutation tests is mistaken. Two modifications of the permutation test are suggested to be valid in a very modest simulation. In instances where simulation software is readily available, investigating the validity of a specific permutation test can be done easily, requiring only a minimum understanding of statistical technicalities.
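    For reference, the mechanics of the Fisher-Pitman test discussed above, in its exact (full-enumeration) form for two small groups. The validity questions raised in the abstract concern the assumptions behind the procedure, not these mechanics, which are standard.

```python
from itertools import combinations

def fisher_pitman_exact(a, b):
    """Exact two-sided Fisher-Pitman permutation test for a difference
    in means: enumerate every reassignment of the pooled values into
    groups of the original sizes and count how often the absolute mean
    difference is at least the observed one."""
    pooled = a + b
    n_a = len(a)
    observed = abs(sum(a) / n_a - sum(b) / len(b))
    hits = total = 0
    for idx in combinations(range(len(pooled)), n_a):
        grp_a = [pooled[i] for i in idx]
        grp_b = [pooled[i] for i in range(len(pooled)) if i not in idx]
        diff = abs(sum(grp_a) / n_a - sum(grp_b) / len(grp_b))
        total += 1
        if diff >= observed - 1e-12:   # tolerance for float ties
            hits += 1
    return hits / total

# 20 equally likely assignments; only the observed split and its mirror
# give a mean difference as large as 3, so p = 2/20 = 0.1.
p = fisher_pitman_exact([1, 2, 3], [4, 5, 6])
print(p)
```

    Following the abstract's advice, the validity of this test under a specific null and sampling scheme can itself be probed by simulation before it is trusted.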

  17. Multidimensional adaptive testing with a minimum error-variance criterion

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1997-01-01

    The case of adaptive testing under a multidimensional logistic response model is addressed. An adaptive algorithm is proposed that minimizes the (asymptotic) variance of the maximum-likelihood (ML) estimator of a linear combination of abilities of interest. The item selection criterion is a simple

  18. Calculation of the minimum critical mass of fissile nuclides

    International Nuclear Information System (INIS)

    Wright, R.Q.; Hopper, Calvin Mitchell

    2008-01-01

    The OB-1 method for the calculation of the minimum critical mass of fissile actinides in metal/water systems was described in a previous paper. A fit to the calculated minimum critical mass data using the extended criticality parameter is the basis of the revised method. The solution density (grams/liter) for the minimum critical mass is also obtained by a fit to calculated values. Input to the calculation consists of the Maxwellian averaged fission and absorption cross sections and the thermal values of nubar. The revised method gives more accurate values than the original method does for both the minimum critical mass and the solution densities. The OB-1 method has been extended to calculate the uncertainties in the minimum critical mass for 12 different fissile nuclides. The uncertainties for the fission and capture cross sections and the estimated nubar uncertainties are used to determine the uncertainties in the minimum critical mass, either in percent or grams. Results have been obtained for U-233, U-235, Pu-236, Pu-239, Pu-241, Am-242m, Cm-243, Cm-245, Cf-249, Cf-251, Cf-253, and Es-254. Eight of these 12 nuclides are included in the ANS-8.15 standard.

  19. Implications of the Deep Minimum for Slow Solar Wind Origin

    Science.gov (United States)

    Antiochos, S. K.; Mikic, Z.; Lionello, R.; Titov, V. S.; Linker, J. A.

    2009-12-01

    The origin of the slow solar wind has long been one of the most important problems in solar/heliospheric physics. Two observational constraints make this problem especially challenging. First, the slow wind has the composition of the closed-field corona, unlike the fast wind that originates on open field lines. Second, the slow wind has substantial angular extent, of order 30 degrees, which is much larger than the widths observed for streamer stalks or the widths expected theoretically for a dynamic heliospheric current sheet. We propose that the slow wind originates from an intricate network of narrow (possibly singular) open-field corridors that emanate from the polar coronal hole regions. Using topological arguments, we show that these corridors must be ubiquitous in the solar corona. The total solar eclipse in August 2008, near the lowest point of the Deep Minimum, affords an ideal opportunity to test this theory by using the ultra-high resolution Predictive Science's (PSI) eclipse model for the corona and wind. Analysis of the PSI eclipse model demonstrates that the extent and scales of the open-field corridors can account for both the angular width of the slow wind and its closed-field composition. We discuss the implications of our slow wind theory for the structure of the corona and heliosphere at the Deep Minimum and describe further observational and theoretical tests. This work has been supported by the NASA HTP, SR&T, and LWS programs.

  20. 42 CFR 84.134 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84.134... Respirators § 84.134 Respirator containers; minimum requirements. Supplied-air respirators shall be equipped with a substantial, durable container bearing markings which show the applicant's name, the type and...

  1. 42 CFR 84.1134 - Respirator containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Respirator containers; minimum requirements. 84... Combination Gas Masks § 84.1134 Respirator containers; minimum requirements. (a) Except as provided in paragraph (b) of this section each respirator shall be equipped with a substantial, durable container...

  2. 42 CFR 84.74 - Apparatus containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Apparatus containers; minimum requirements. 84.74...-Contained Breathing Apparatus § 84.74 Apparatus containers; minimum requirements. (a) Apparatus may be equipped with a substantial, durable container bearing markings which show the applicant's name, the type...

  3. Minimum K-S estimator using PH-transform technique

    Directory of Open Access Journals (Sweden)

    Somchit Boonthiem

    2016-07-01

    In this paper, we propose an improvement of the minimum Kolmogorov-Smirnov (K-S) estimator using the proportional hazards transform (PH-transform) technique. The data set consists of 47 fire accident records from an insurance company in Thailand. The experiment has two stages: in the first, we minimize the K-S statistic using a grid search over nine distributions (Rayleigh, gamma, Pareto, log-logistic, logistic, normal, Weibull, lognormal, and exponential); in the second, we improve the K-S statistic using the PH-transform. The results show that the PH-transform technique can improve the minimum K-S estimator for seven distributions (Rayleigh, gamma, Pareto, log-logistic, Weibull, lognormal, and exponential), while the minimum K-S estimators for the normal and logistic distributions are unchanged.
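The first stage described above, minimizing the K-S statistic by grid search, can be illustrated for one of the nine candidate families (the exponential); the PH-transform refinement is not shown, and the function names and grid are placeholders:

```python
import numpy as np

def ks_statistic(data, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF of `data`
    and a candidate model CDF."""
    x = np.sort(data)
    n = len(x)
    f = cdf(x)
    ecdf_hi = np.arange(1, n + 1) / n   # ECDF just after each point
    ecdf_lo = np.arange(0, n) / n       # ECDF just before each point
    return max(np.max(ecdf_hi - f), np.max(f - ecdf_lo))

def min_ks_exponential(data, rates):
    """Grid search for the exponential rate minimising the K-S statistic."""
    best_rate, best_d = None, np.inf
    for lam in rates:
        d = ks_statistic(data, lambda t: 1.0 - np.exp(-lam * t))
        if d < best_d:
            best_rate, best_d = lam, d
    return best_rate, best_d
```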

  4. MSEBAG: a dynamic classifier ensemble generation based on `minimum-sufficient ensemble' and bagging

    Science.gov (United States)

    Chen, Lei; Kamel, Mohamed S.

    2016-01-01

    In this paper, we propose a dynamic classifier system, MSEBAG, which is characterised by searching for the 'minimum-sufficient ensemble' and bagging at the ensemble level. It adopts an 'over-generation and selection' strategy and aims to achieve a good bias-variance trade-off. In the training phase, MSEBAG first searches for the 'minimum-sufficient ensemble', which maximises the in-sample fitness with the minimal number of base classifiers. Then, starting from the 'minimum-sufficient ensemble', a backward stepwise algorithm is employed to generate a collection of ensembles. The objective is to create a collection of ensembles with a descending fitness on the data, as well as a descending complexity in the structure. MSEBAG dynamically selects the ensembles from the collection for the decision aggregation. The extended adaptive aggregation (EAA) approach, a bagging-style algorithm performed at the ensemble level, is employed for this task. EAA searches for the competent ensembles using a score function, which takes into consideration both the in-sample fitness and the confidence of the statistical inference, and averages the decisions of the selected ensembles to label the test pattern. The experimental results show that the proposed MSEBAG outperforms the benchmarks on average.

  5. The Einstein-Hilbert gravitation with minimum length

    Science.gov (United States)

    Louzada, H. L. C.

    2018-05-01

    We study the Einstein-Hilbert gravitation with the deformed Heisenberg algebra leading to the minimum length, with the intention to find and estimate the corrections in this theory, clarifying whether or not it is possible to obtain, by means of the minimum length, a theory, in D=4, which is causal, unitary and provides a massive graviton. Therefore, we will calculate and analyze the dispersion relationships of the considered theory.

  6. Definition of the minimum insert length for the reconstitution of Charpy specimens for surveillance and life extension of reactor vessels in Mexico

    International Nuclear Information System (INIS)

    Romero C, J.; Hernandez C, R.; Rocamontes A, M.

    2011-11-01

    At the National Institute of Nuclear Research (Mexico), a welding system for the reconstitution of Charpy specimens has been developed, automated, qualified, and used for the surveillance of the mechanical properties (mainly embrittlement) of the reactor vessel. This system uses the halves of the tested Charpy specimens from the surveillance capsules extracted from the reactors to obtain, from each tested specimen, two reconstituted specimens. This reconstitution process is used both in the surveillance program and in the potential extension of the operating license of the vessel. A half of a tested Charpy specimen with the deformed portion machined off is called an 'insert', and in very general terms the reconstitution consists of welding two metallic end tabs to the ends of the insert with the Stud Welding process, to obtain a reconstituted specimen. The main characteristic of this welding is the small size achieved, both of the welded zones and of the heat-affected zones. The applicable standard establishes that the minimum insert length for the Stud Welding process should be 18 mm; however, according to the same standard, this length may be reduced if it is demonstrated analytically or experimentally that the central volume of 1 cm³ of the insert is not affected. In this work, the temperature profiles measured at different distances from the welding interface are presented, defining an equation for the maximum temperature reached as a function of distance; in addition, the actual affected length in the specimen is determined by metallography, and in this way the minimum insert length for this reconstitution system was determined. (Author)

  7. HIV self-testing among female sex workers in Zambia: A cluster randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Michael M Chanda

    2017-11-01

    HIV self-testing (HIVST) may play a role in addressing gaps in HIV testing coverage and as an entry point for HIV prevention services. We conducted a cluster randomized trial of 2 HIVST distribution mechanisms compared to the standard of care among female sex workers (FSWs) in Zambia. Trained peer educators in Kapiri Mposhi, Chirundu, and Livingstone, Zambia, each recruited 6 FSW participants. Peer educator-FSW groups were randomized to 1 of 3 arms: (1) delivery (direct distribution of an oral HIVST from the peer educator), (2) coupon (a coupon for collection of an oral HIVST from a health clinic/pharmacy), or (3) standard-of-care HIV testing. Participants in the 2 HIVST arms received 2 kits: 1 at baseline and 1 at 10 weeks. The primary outcome was any self-reported HIV testing in the past month at the 1- and 4-month visits, as HIVST can replace other types of HIV testing. Secondary outcomes included linkage to care, HIVST use in the HIVST arms, and adverse events. Participants completed questionnaires at 1 and 4 months following peer educator interventions. In all, 965 participants were enrolled between September 16 and October 12, 2016 (delivery, N = 316; coupon, N = 329; standard of care, N = 320); 20% had never tested for HIV. Overall HIV testing at 1 month was 94.9% in the delivery arm, 84.4% in the coupon arm, and 88.5% in the standard-of-care arm (delivery versus standard of care risk ratio [RR] = 1.07, 95% CI 0.99-1.15, P = 0.10; coupon versus standard of care RR = 0.95, 95% CI 0.86-1.05, P = 0.29; delivery versus coupon RR = 1.13, 95% CI 1.04-1.22, P = 0.005). Four-month rates were 84.1% for the delivery arm, 79.8% for the coupon arm, and 75.1% for the standard-of-care arm (delivery versus standard of care RR = 1.11, 95% CI 0.98-1.27, P = 0.11; coupon versus standard of care RR = 1.06, 95% CI 0.92-1.22, P = 0.42; delivery versus coupon RR = 1.05, 95% CI 0.94-1.18, P = 0.40). At 1 month, the majority of HIV tests were self-tests (88.4%).

  8. HIV self-testing among female sex workers in Zambia: A cluster randomized controlled trial.

    Science.gov (United States)

    Chanda, Michael M; Ortblad, Katrina F; Mwale, Magdalene; Chongo, Steven; Kanchele, Catherine; Kamungoma, Nyambe; Fullem, Andrew; Dunn, Caitlin; Barresi, Leah G; Harling, Guy; Bärnighausen, Till; Oldenburg, Catherine E

    2017-11-01

    HIV self-testing (HIVST) may play a role in addressing gaps in HIV testing coverage and as an entry point for HIV prevention services. We conducted a cluster randomized trial of 2 HIVST distribution mechanisms compared to the standard of care among female sex workers (FSWs) in Zambia. Trained peer educators in Kapiri Mposhi, Chirundu, and Livingstone, Zambia, each recruited 6 FSW participants. Peer educator-FSW groups were randomized to 1 of 3 arms: (1) delivery (direct distribution of an oral HIVST from the peer educator), (2) coupon (a coupon for collection of an oral HIVST from a health clinic/pharmacy), or (3) standard-of-care HIV testing. Participants in the 2 HIVST arms received 2 kits: 1 at baseline and 1 at 10 weeks. The primary outcome was any self-reported HIV testing in the past month at the 1- and 4-month visits, as HIVST can replace other types of HIV testing. Secondary outcomes included linkage to care, HIVST use in the HIVST arms, and adverse events. Participants completed questionnaires at 1 and 4 months following peer educator interventions. In all, 965 participants were enrolled between September 16 and October 12, 2016 (delivery, N = 316; coupon, N = 329; standard of care, N = 320); 20% had never tested for HIV. Overall HIV testing at 1 month was 94.9% in the delivery arm, 84.4% in the coupon arm, and 88.5% in the standard-of-care arm (delivery versus standard of care risk ratio [RR] = 1.07, 95% CI 0.99-1.15, P = 0.10; coupon versus standard of care RR = 0.95, 95% CI 0.86-1.05, P = 0.29; delivery versus coupon RR = 1.13, 95% CI 1.04-1.22, P = 0.005). Four-month rates were 84.1% for the delivery arm, 79.8% for the coupon arm, and 75.1% for the standard-of-care arm (delivery versus standard of care RR = 1.11, 95% CI 0.98-1.27, P = 0.11; coupon versus standard of care RR = 1.06, 95% CI 0.92-1.22, P = 0.42; delivery versus coupon RR = 1.05, 95% CI 0.94-1.18, P = 0.40). At 1 month, the majority of HIV tests were self-tests (88.4%). 
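The risk ratios quoted above compare proportions between arms. A minimal sketch of an unadjusted risk ratio with a Wald log-scale confidence interval follows; the counts below are made up for illustration, and a cluster randomized trial such as this one would additionally need to adjust the interval for clustering by peer-educator group:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Unadjusted risk ratio of group A versus group B with a Wald
    (log-scale) confidence interval; z = 1.96 gives a 95% CI."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rr = p_a / p_b
    # standard error of log(RR) for independent binomial proportions
    se = math.sqrt((1 - p_a) / events_a + (1 - p_b) / events_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi
```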

  9. Reducing tobacco use and access through strengthened minimum price laws.

    Science.gov (United States)

    McLaughlin, Ian; Pearson, Anne; Laird-Metke, Elisa; Ribisl, Kurt

    2014-10-01

    Higher prices reduce consumption and initiation of tobacco products. A minimum price law that establishes a high statutory minimum price and prohibits the industry's discounting tactics for tobacco products is a promising pricing strategy as an alternative to excise tax increases. Although some states have adopted minimum price laws on the basis of statutorily defined price "markups" over the invoice price, existing state laws have been largely ineffective at increasing the retail price. We analyzed 3 new variations of minimum price laws that hold great potential for raising tobacco prices and reducing consumption: (1) a flat rate minimum price law similar to a recent enactment in New York City, (2) an enhanced markup law, and (3) a law that incorporates both elements.
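The third variation above, combining an absolute floor price with a statutory markup over the invoice price, reduces to taking the larger of the two candidate prices. A minimal sketch; the rates and floor values are placeholders, not actual statutes:

```python
def minimum_legal_price(invoice_price, markup_rate=0.0, floor_price=0.0):
    """Lowest lawful retail price under a combined minimum price law:
    a statutory markup over the invoice price together with an absolute
    floor, whichever yields the higher price."""
    return max(invoice_price * (1.0 + markup_rate), floor_price)
```

With a low invoice price the flat floor binds; with a high invoice price the markup binds, so the combined law raises prices across the whole range.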

  10. Study of RNA structures with a connection to random matrix theory

    International Nuclear Information System (INIS)

    Bhadola, Pradeep; Deo, Nivedita

    2015-01-01

    This manuscript investigates the level of complexity and the thermodynamic properties of real RNA structures and compares them with those of random RNA sequences. A discussion of the similarities between the thermodynamic properties of the real structures and the nonlinear random matrix model of RNA folding is presented. The structural information contained in the PDB file is exploited to get the base pairing information. The complexity of an RNA structure is characterized by a topological quantity called the genus, which is calculated from the base pairing information. Thermodynamic analysis of the real structures is done numerically. The real structures have a minimum free energy that is very small compared to that of randomly generated sequences of the same length. This analysis suggests that there are specific patterns in the structures which are preserved during the evolution of the sequences, and that certain sequences are discarded by the evolutionary process. Further analysis of sequences of a fixed length reveals that RNA structures exist in ensembles: although all the sequences in an ensemble have different series of nucleotides, they fold into structures that have the same hydrogen-bonded pairs as well as the same minimum free energy. The specific heat of the RNA molecule is numerically estimated at different lengths. The specific heat curve shows a bump with temperature, and for some RNAs a double-peak behavior is observed. The same behavior is seen in the study of the random matrix model with the nonlinear interaction of RNA folding, where the bump can be controlled by changing the interaction strength.

  11. Inferring genetic parameters of lactation in Tropical Milking Criollo cattle with random regression test-day models.

    Science.gov (United States)

    Santellano-Estrada, E; Becerril-Pérez, C M; de Alba, J; Chang, Y M; Gianola, D; Torres-Hernández, G; Ramírez-Valverde, R

    2008-11-01

    This study inferred genetic and permanent environmental variation of milk yield in Tropical Milking Criollo cattle and compared 5 random regression test-day models using Wilmink's function and Legendre polynomials. Data consisted of 15,377 test-day records from 467 Tropical Milking Criollo cows that calved between 1974 and 2006 in the tropical lowlands of the Gulf Coast of Mexico and in southern Nicaragua. Estimated heritabilities of test-day milk yields ranged from 0.18 to 0.45, and repeatabilities ranged from 0.35 to 0.68 for the period spanning from 6 to 400 d in milk. Genetic correlation between days in milk 10 and 400 was around 0.50 but greater than 0.90 for most pairs of test days. The model that used first-order Legendre polynomials for additive genetic effects and second-order Legendre polynomials for permanent environmental effects gave the smallest residual variance and was also favored by the Akaike information criterion and likelihood ratio tests.
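As a sketch of the machinery behind such random regression test-day models, the following evaluates a Legendre polynomial basis on days in milk standardized to [-1, 1] over the 6-400 d range mentioned above; normalization conventions vary between implementations, and none is applied here:

```python
import numpy as np

def legendre_basis(dim, order, dim_min=6, dim_max=400):
    """Evaluate Legendre polynomials P_0..P_order at days in milk `dim`,
    after standardising DIM to [-1, 1] as random regression test-day
    models typically do. Returns an array with one column per polynomial."""
    t = 2.0 * (np.asarray(dim, dtype=float) - dim_min) / (dim_max - dim_min) - 1.0
    cols = [np.polynomial.legendre.Legendre.basis(k)(t) for k in range(order + 1)]
    return np.column_stack(cols)
```

In the model favored above, a first-order basis (intercept and slope) would carry the additive genetic regression and a second-order basis the permanent environmental regression.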

  12. 12 CFR 567.2 - Minimum regulatory capital requirement.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Minimum regulatory capital requirement. 567.2... Regulatory Capital Requirements § 567.2 Minimum regulatory capital requirement. (a) To meet its regulatory capital requirement a savings association must satisfy each of the following capital standards: (1) Risk...

  13. 29 CFR 525.24 - Advisory Committee on Special Minimum Wages.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Advisory Committee on Special Minimum Wages. 525.24 Section 525.24 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR... Special Minimum Wages. The Advisory Committee on Special Minimum Wages, the members of which are appointed...

  14. Random numbers from vacuum fluctuations

    International Nuclear Information System (INIS)

    Shi, Yicheng; Kurtsiefer, Christian; Chng, Brenda

    2016-01-01

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.
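The abstract does not specify the extractor's register length or tap positions, so the following is only a generic illustration of a linear feedback shift register: a 16-bit maximal-length Fibonacci LFSR with the commonly used tap set (16, 14, 13, 11).

```python
def lfsr_stream(seed, count=32):
    """Bits from a 16-bit maximal-length Fibonacci LFSR with tap set
    (16, 14, 13, 11), i.e. feedback polynomial
    x^16 + x^14 + x^13 + x^11 + 1. The output bit is the bit shifted
    out of the low end of the register."""
    state = seed & 0xFFFF
    assert state != 0, "the all-zero state is a fixed point"
    out = []
    for _ in range(count):
        # XOR of register bits 0, 2, 3 and 5 (tap positions 16, 14, 13, 11)
        feedback = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        out.append(state & 1)
        state = (state >> 1) | (feedback << 15)
    return out
```

Over one full period of 2^16 - 1 steps such a register visits every nonzero state exactly once, which is easy to check numerically.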

  15. Random numbers from vacuum fluctuations

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Yicheng; Kurtsiefer, Christian, E-mail: christian.kurtsiefer@gmail.com [Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117542 (Singapore); Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore); Chng, Brenda [Center for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543 (Singapore)

    2016-07-25

    We implement a quantum random number generator based on a balanced homodyne measurement of vacuum fluctuations of the electromagnetic field. The digitized signal is directly processed with a fast randomness extraction scheme based on a linear feedback shift register. The random bit stream is continuously read in a computer at a rate of about 480 Mbit/s and passes an extended test suite for random numbers.

  16. Generalized oscillator strength for the argon 3p6-3p5 4s transition: Correlation and exchange effects on the characteristic minimum

    International Nuclear Information System (INIS)

    Chen, Zhifan; Msezane, Alfred Z.; Amusia, M. Ya.

    1999-01-01

    We have investigated the generalized oscillator strength (GOS) for a transition of the type np→(n+1)s, where n is the principal quantum number of the outermost filled shell of the atomic ground state, using the random-phase approximation with exchange. We find that the influence of correlation and exchange effects on the position of the characteristic minimum in the GOS of Ar(n=3) is insignificant. Also, our first Born approximation predicts the position of the minimum accurately provided that accurate target wave functions are employed. Our results agree excellently with measurements and are expected to be applicable equally to the corresponding subshells of Ne(n=2), Kr(n=4), and Xe(n=5). (c) 1999 The American Physical Society

  17. Statistical physics when the minimum temperature is not absolute zero

    Science.gov (United States)

    Chung, Won Sang; Hassanabadi, Hassan

    2018-04-01

    In this paper, a nonzero minimum temperature is considered, based on the third law of thermodynamics and the existence of a minimal momentum. Starting from the assumption of a nonzero, positive minimum temperature in nature, we deform the definitions of some thermodynamic quantities and investigate the minimum-temperature corrections to well-known thermodynamic problems.

  18. 42 CFR 422.382 - Minimum net worth amount.

    Science.gov (United States)

    2010-10-01

    ... that CMS considers appropriate to reduce, control or eliminate start-up administrative costs. (b) After... section. (c) Calculation of the minimum net worth amount—(1) Cash requirement. (i) At the time of application, the organization must maintain at least $750,000 of the minimum net worth amount in cash or cash...

  19. Minimum Competencies in Undergraduate Motor Development. Guidance Document

    Science.gov (United States)

    National Association for Sport and Physical Education, 2004

    2004-01-01

    The minimum competency guidelines in Motor Development described herein at the undergraduate level may be gained in one or more motor development course(s) or through other courses provided in an undergraduate curriculum. The minimum guidelines include: (1) Formulation of a developmental perspective; (2) Knowledge of changes in motor behavior…

  20. Rigorously testing multialternative decision field theory against random utility models.

    Science.gov (United States)

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
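Of the two random utility benchmarks named above, the logit model has a closed-form choice rule: a softmax over deterministic utilities (the probit model, by contrast, requires integrating over correlated Gaussian errors). A minimal sketch:

```python
import math

def logit_choice_probs(utilities):
    """Multinomial logit choice probabilities: the softmax of the
    deterministic utilities, which is the closed form the random utility
    model yields when the error terms are i.i.d. extreme value."""
    m = max(utilities)                       # subtract the max for stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]
```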

  1. Random walk of passive tracers among randomly moving obstacles.

    Science.gov (United States)

    Gori, Matteo; Donato, Irene; Floriani, Elena; Nardecchia, Ilaria; Pettini, Marco

    2016-04-14

    This study is mainly motivated by the need to understand how the diffusion behavior of a biomolecule (or even of a larger object) is affected by other moving macromolecules, organelles, and so on, inside a living cell, and hence the possibility of understanding whether or not a randomly walking biomolecule is also subject to a long-range force field driving it to its target. By means of the Continuous Time Random Walk (CTRW) technique, the topic of random walk in a random environment is considered here for a passively diffusing particle among randomly moving and interacting obstacles. The relevant physical quantity worked out is the diffusion coefficient of the passive tracer, computed as a function of the average inter-obstacle distance. The results reported here suggest that if a biomolecule, call it a test molecule, moves towards its target in the presence of other independently interacting molecules, its motion can be considerably slowed down.
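A toy version of the CTRW setup conveys the basic quantity being estimated: the sketch below simulates a 1-D walker with exponential waiting times and Gaussian jumps (no obstacles or interactions, unlike the study itself) and recovers the diffusion coefficient from the mean squared displacement. All parameter values are arbitrary:

```python
import numpy as np

def ctrw_diffusion_coefficient(n_walkers=1000, t_max=20.0, rate=1.0,
                               step_sigma=1.0, seed=0):
    """Estimate the 1-D diffusion coefficient of a continuous time random
    walk with exponential waiting times (mean 1/rate) and Gaussian jumps
    (std step_sigma). Normal diffusion predicts D = step_sigma**2 * rate / 2."""
    rng = np.random.default_rng(seed)
    final = np.empty(n_walkers)
    for i in range(n_walkers):
        t, x = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / rate)
            if t > t_max:
                break          # next jump falls outside the observation window
            x += rng.normal(0.0, step_sigma)
        final[i] = x
    # for normal diffusion <x^2> = 2 * D * t
    return float(np.mean(final ** 2) / (2.0 * t_max))
```

With the defaults the theoretical value is D = 0.5; anomalous diffusion, which CTRW can also model via heavy-tailed waiting times, would make this ratio time dependent.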

  2. The Unusual Minimum of Cycle 23: Observations and Interpretation

    Science.gov (United States)

    Martens, Petrus C.; Nandy, D.; Munoz-Jaramillo, A.

    2009-05-01

    The current minimum of cycle 23 is unusual in its long duration, the very low level to which Total Solar Irradiance (TSI) has fallen, and the small flux of the open polar fields. The deep minimum of TSI seems to be related to an unprecedented dearth of polar faculae, and hence to the small amount of open flux. Based upon surface flux transport models it has been suggested that the causes of these phenomena may be an unusually vigorous meridional flow, or even a deviation from Joy's law resulting in smaller Joy angles than usual for emerging flux in cycle 23. There is also the possibility of a connection with the recently inferred emergence in polar regions of bipoles that systematically defy Hale's law. Much speculation has been going on as to the consequences of this exceptional minimum: are we entering another grand minimum, is this the end of the 80 year period of exceptionally high solar activity, or is this just a statistical hiccup? Dynamo simulations are underway that may help answer this question. As an aside it must be mentioned that the current minimum of TSI puts an upper limit on the TSI input for global climate simulations during the Maunder minimum, and that a possible decrease in future solar activity will result in a very small but not insignificant reduction in the pace of global warming.

  3. Minimum Wages and Skill Acquisition: Another Look at Schooling Effects.

    Science.gov (United States)

    Neumark, David; Wascher, William

    2003-01-01

    Examines the effects of minimum wage on schooling, seeking to reconcile some of the contradictory results in recent research using Current Population Survey data from the late 1970s through the 1980s. Findings point to negative effects of minimum wages on school enrollment, bolstering the findings of negative effects of minimum wages on enrollment…

  4. Minimum Information about T Regulatory Cells: A Step toward Reproducibility and Standardization

    Directory of Open Access Journals (Sweden)

    Anke Fuchs

    2018-01-01

    Cellular therapies with CD4+ T regulatory cells (Tregs) hold promise of efficacious treatment for a variety of autoimmune and allergic diseases, as well as posttransplant complications. Nevertheless, current manufacturing of Tregs as a cellular medicinal product varies between different laboratories, which in turn hampers precise comparisons of the results between the studies performed. While the number of clinical trials testing Tregs is already substantial, it seems crucial to provide some standardized characteristics of Treg products in order to minimize the problem. We have previously developed reporting guidelines called minimum information about tolerogenic antigen-presenting cells, which allow the comparison between different preparations of tolerance-inducing antigen-presenting cells. Drawing on this experience, here we describe the minimum information about Tregs (MITREG). It is important to note that MITREG does not dictate how investigators should generate or characterize Tregs, but it does require investigators to report their Treg data in a consistent and transparent manner. We hope this will, therefore, be a useful tool facilitating standardized reporting on the manufacturing of Tregs, either for research purposes or for clinical application. In this way, MITREG might also be an important step toward more standardized and reproducible testing of Treg preparations in clinical applications.

  5. Genetic Analysis of Milk Yield Using Random Regression Test Day Model in Tehran Province Holstein Dairy Cow

    Directory of Open Access Journals (Sweden)

    A. Seyeddokht

    2012-09-01

    In this research, a random regression test-day model was used to estimate heritabilities and genetic correlations between test-day milk records. A total of 140,357 monthly test-day milk records belonging to 28,292 first-lactation Holstein cattle (milked three times a day), distributed in 165 herds of Tehran province and calved from 2001 to 2010, were used. The fixed effects of herd-year-month of calving as contemporary group, and of age at calving and Holstein gene percentage as covariates, were fitted. A 4th-order orthogonal Legendre polynomial was implemented to take account of genetic and environmental aspects of milk production over the course of lactation. Random regression models (RRMs) using Legendre polynomials as base functions appear to be the most adequate to describe the covariance structure of the data. The results showed that the average heritability for the second half of the lactation was higher than that of the first half. The heritability was lowest (0.117) for the first month and highest (0.230) for the eighth month of lactation. Because genetic variation increased gradually while residual variance was high in the first months of lactation, heritabilities differed over the course of lactation. RRMs with a higher number of parameters were more useful for describing the genetic variation of test-day milk yield throughout the lactation. In this research, genetic parameters and genetic correlations were estimated with a random regression test-day model, which accounts for these parameters more precisely than alternative approaches.

  6. 14 CFR 91.155 - Basic VFR weather minimums.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Basic VFR weather minimums. 91.155 Section...) AIR TRAFFIC AND GENERAL OPERATING RULES GENERAL OPERATING AND FLIGHT RULES Flight Rules Visual Flight Rules § 91.155 Basic VFR weather minimums. (a) Except as provided in paragraph (b) of this section and...

  7. Minimum Wages and Teen Employment: A Spatial Panel Approach

    OpenAIRE

    Charlene Kalenkoski; Donald Lacombe

    2011-01-01

    The authors employ spatial econometrics techniques and Annual Averages data from the U.S. Bureau of Labor Statistics for 1990-2004 to examine how changes in the minimum wage affect teen employment. Spatial econometrics techniques account for the fact that employment is correlated across states. Such correlation may exist if a change in the minimum wage in a state affects employment not only in its own state but also in other, neighboring states. The authors show that state minimum wages negat...

  8. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination.

    Science.gov (United States)

    Koh, Bongyeun; Hong, Sunggi; Kim, Soon-Sim; Hyun, Jin-Sook; Baek, Milye; Moon, Jundong; Kwon, Hayran; Kim, Gyoungyong; Min, Seonggi; Kang, Gu-Hyun

    2016-01-01

    The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty index (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.
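The abstract does not state which statistic was used for the item comparisons, but a standard approach is the classical difficulty index (proportion of examinees passing an item) together with a Pearson chi-square test for equal pass rates across items; the sketch below is that generic approach, not the study's published analysis:

```python
def item_difficulty(passes, total):
    """Classical difficulty index: the proportion of examinees who
    passed the item."""
    return passes / total

def chi_square_equal_difficulty(passes, totals):
    """Pearson chi-square statistic for the 2 x k table of pass/fail
    counts, testing whether k items share a common difficulty index."""
    pooled = sum(passes) / sum(totals)
    stat = 0.0
    for p, n in zip(passes, totals):
        # pass and fail cells, each compared with its expected count
        for observed, expected in ((p, n * pooled), (n - p, n * (1 - pooled))):
            stat += (observed - expected) ** 2 / expected
    return stat
```

A large statistic relative to the chi-square distribution with k - 1 degrees of freedom indicates that randomly drawn items are not interchangeable in difficulty, which is the fairness concern the study raises.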

  9. Varying levels of difficulty index of skills-test items randomly selected by examinees on the Korean emergency medical technician licensing examination

    Directory of Open Access Journals (Sweden)

    Bongyeun Koh

    2016-01-01

    Purpose: The goal of this study was to characterize the difficulty index of the items in the skills test components of the class I and II Korean emergency medical technician licensing examination (KEMTLE), which requires examinees to select items randomly. Methods: The results of 1,309 class I KEMTLE examinations and 1,801 class II KEMTLE examinations in 2013 were subjected to analysis. Items from the basic and advanced skills test sections of the KEMTLE were compared to determine whether some were significantly more difficult than others. Results: In the class I KEMTLE, all 4 of the items on the basic skills test showed significant variation in difficulty index (P<0.01), as well as 4 of the 5 items on the advanced skills test (P<0.05). In the class II KEMTLE, 4 of the 5 items on the basic skills test showed significantly different difficulty indices (P<0.01), as well as all 3 of the advanced skills test items (P<0.01). Conclusion: In the skills test components of the class I and II KEMTLE, the procedure in which examinees randomly select questions should be revised to require examinees to respond to a set of fixed items in order to improve the reliability of the national licensing examination.
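
    The item-level analysis described above can be sketched numerically. The counts below are hypothetical (the study's raw data are not reproduced here); the sketch computes each item's difficulty index as the proportion of examinees passing it, and a chi-square statistic for homogeneity of pass rates across items, to be compared against a critical value with len(items) - 1 degrees of freedom.

```python
# Sketch of a difficulty-index comparison across randomly assigned
# skills-test items. All counts are hypothetical, for illustration only.

def difficulty_index(correct, attempted):
    """Difficulty index = proportion of examinees who passed the item."""
    return correct / attempted

def chi_square_homogeneity(items):
    """Chi-square statistic for 'all items share one pass rate'.

    items: list of (n_correct, n_attempted) pairs; df = len(items) - 1.
    """
    total_correct = sum(c for c, n in items)
    total_attempted = sum(n for c, n in items)
    p_pooled = total_correct / total_attempted
    stat = 0.0
    for c, n in items:
        exp_pass = n * p_pooled
        exp_fail = n * (1.0 - p_pooled)
        stat += (c - exp_pass) ** 2 / exp_pass
        stat += ((n - c) - exp_fail) ** 2 / exp_fail
    return stat

# Four hypothetical basic-skill items: (passed, attempted).
items = [(300, 330), (290, 320), (250, 330), (310, 329)]
indices = [difficulty_index(c, n) for c, n in items]
stat = chi_square_homogeneity(items)  # compare to chi2 critical value, df=3
```

    A statistic large relative to the chi-square critical value indicates that randomly drawn items are not interchangeable in difficulty, which is the study's argument for fixed item sets.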

  10. Quantum random number generator

    Science.gov (United States)

    Soubusta, Jan; Haderka, Ondrej; Hendrych, Martin

    2001-03-01

    Since reflection or transmission of a quantum particle at a beamsplitter is an inherently random quantum process, a device built on this principle does not suffer from the drawbacks of either pseudo-random computer generators or classical noise sources. Nevertheless, a number of physical conditions necessary for high-quality random number generation must be satisfied. Fortunately, in a quantum-optics realization they can be well controlled. We present a simple random number generator based on the division of weak light pulses at a beamsplitter. The randomness of the generated bit stream is supported by passing the data through a series of 15 statistical tests. The device generates at a rate of 109.7 kbit/s.
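
    As a flavour of the statistical testing mentioned in the abstract, the sketch below implements one classic test of a bit stream, the frequency (monobit) test; the paper's actual battery of 15 tests is not specified here, so this is only an illustrative stand-in.

```python
import math

def monobit_test(bits):
    """Frequency (monobit) test: p-value for the null hypothesis that
    ones and zeros are equally likely in the stream."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# A perfectly balanced stream is maximally consistent with randomness;
# a constant stream fails decisively.
print(monobit_test([0, 1] * 500))  # 1.0
print(monobit_test([1] * 1000) < 1e-6)  # True
```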

  11. A note on minimum-variance theory and beyond

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianfeng [Department of Informatics, Sussex University, Brighton, BN1 9QH (United Kingdom); Tartaglia, Giangaetano [Physics Department, Rome University 'La Sapienza', Rome 00185 (Italy); Tirozzi, Brunello [Physics Department, Rome University 'La Sapienza', Rome 00185 (Italy)

    2004-04-30

    We revisit the minimum-variance theory proposed by Harris and Wolpert (1998 Nature 394 780-4), discuss the implications of the theory for modelling the firing patterns of single neurons and analytically find the optimal control signals, trajectories and velocities. Under the rate coding assumption, input control signals employed in the minimum-variance theory should be Fitts processes rather than Poisson processes. Only if information is coded by interspike intervals are Poisson processes in agreement with the inputs employed in the minimum-variance theory. For the integrate-and-fire model with Fitts process inputs, interspike intervals of efferent spike trains are very irregular. We introduce diffusion approximations to approximate neural models with renewal process inputs and present theoretical results on calculating moments of interspike intervals of the integrate-and-fire model. Results in Feng et al (2002 J. Phys. A: Math. Gen. 35 7287-304) are generalized. In conclusion, we present a complete picture of the minimum-variance theory, ranging from input control signals to model outputs and to its implications for modelling the firing patterns of single neurons.

  12. A note on minimum-variance theory and beyond

    International Nuclear Information System (INIS)

    Feng Jianfeng; Tartaglia, Giangaetano; Tirozzi, Brunello

    2004-01-01

    We revisit the minimum-variance theory proposed by Harris and Wolpert (1998 Nature 394 780-4), discuss the implications of the theory for modelling the firing patterns of single neurons and analytically find the optimal control signals, trajectories and velocities. Under the rate coding assumption, input control signals employed in the minimum-variance theory should be Fitts processes rather than Poisson processes. Only if information is coded by interspike intervals are Poisson processes in agreement with the inputs employed in the minimum-variance theory. For the integrate-and-fire model with Fitts process inputs, interspike intervals of efferent spike trains are very irregular. We introduce diffusion approximations to approximate neural models with renewal process inputs and present theoretical results on calculating moments of interspike intervals of the integrate-and-fire model. Results in Feng et al (2002 J. Phys. A: Math. Gen. 35 7287-304) are generalized. In conclusion, we present a complete picture of the minimum-variance theory, ranging from input control signals to model outputs and to its implications for modelling the firing patterns of single neurons.
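
    A toy simulation (not the authors' model) illustrates the regularity argument above: a perfect integrate-and-fire unit driven by Poisson input spikes, needing many inputs per output spike, produces fairly regular output intervals, with a coefficient of variation near 1/sqrt(N) for threshold N, far below the irregularity reported for efferent spike trains.

```python
import random
import statistics

def iaf_isis(rate, weight, threshold, t_max, seed=1):
    """Perfect integrate-and-fire neuron with Poisson input spikes.

    Each input adds `weight` to the membrane; the neuron fires and resets
    when `threshold` is reached. Returns the interspike intervals.
    """
    rng = random.Random(seed)
    t, v, last_spike, isis = 0.0, 0.0, 0.0, []
    while t < t_max:
        t += rng.expovariate(rate)  # next Poisson input arrival
        v += weight
        if v >= threshold:
            isis.append(t - last_spike)
            last_spike, v = t, 0.0
    return isis

# 20 inputs needed per output spike -> CV close to 1/sqrt(20) ~ 0.22.
isis = iaf_isis(rate=100.0, weight=1.0, threshold=20.0, t_max=200.0)
cv = statistics.stdev(isis) / statistics.mean(isis)
```

    The low CV here is the point: Poisson inputs alone cannot account for highly irregular interspike intervals, motivating the alternative input processes discussed in the abstract.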

  13. Self-correcting random number generator

    Science.gov (United States)

    Humble, Travis S.; Pooser, Raphael C.

    2016-09-06

    A system and method for generating random numbers. The system may include a random number generator (RNG), such as a quantum random number generator (QRNG) configured to self-correct or adapt in order to substantially achieve randomness from the output of the RNG. By adapting, the RNG may generate a random number that may be considered random regardless of whether the random number itself is tested as such. As an example, the RNG may include components to monitor one or more characteristics of the RNG during operation, and may use the monitored characteristics as a basis for adapting, or self-correcting, to provide a random number according to one or more performance criteria.
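
    The patent abstract does not disclose the correction mechanism. As an illustration of the general idea of post-processing a raw stream toward unbiased output, here is the classic von Neumann corrector; treating it as the patented device's method would be an assumption, so it is offered only as a standard example of self-correction.

```python
def von_neumann_extract(bits):
    """Von Neumann corrector: pairs (0,1) -> 1 and (1,0) -> 0 are kept,
    (0,0) and (1,1) are discarded. Output is unbiased for any fixed
    per-bit bias, at the cost of throughput."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(b)
    return out

# A stream biased toward 1 still yields unbiased output bits.
print(von_neumann_extract([1, 1, 0, 1, 1, 0, 0, 0]))  # [1, 0]
```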

  14. More randomness from the same data

    International Nuclear Information System (INIS)

    Bancal, Jean-Daniel; Sheridan, Lana; Scarani, Valerio

    2014-01-01

    Correlations that cannot be reproduced with local variables certify the generation of private randomness. Usually, the violation of a Bell inequality is used to quantify the amount of randomness produced. Here, we show how private randomness generated during a Bell test can be directly quantified from the observed correlations, without the need to process these data into an inequality. The frequency with which the different measurement settings are used during the Bell test can also be taken into account. This improved analysis turns out to be very relevant for Bell tests performed with a finite collection efficiency. In particular, applying our technique to the data of a recent experiment (Christensen et al 2013 Phys. Rev. Lett. 111 130406), we show that about twice as much randomness as previously reported can be potentially extracted from this setup. (paper)
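
    The quantity ultimately extracted in such analyses is a min-entropy bound. The sketch below shows only that final conversion step under an assumed bound on an eavesdropper's guessing probability; the paper's contribution is deriving that bound directly from the observed correlations, which is not reproduced here.

```python
import math

def min_entropy_per_bit(p_guess):
    """Certified randomness per run: H_min = -log2(p_guess), where
    p_guess bounds the probability of guessing the outcome."""
    return -math.log2(p_guess)

# Hypothetical bound: guessing probability 0.75 certifies ~0.415 bits/run.
print(min_entropy_per_bit(0.75))
```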

  15. Applicability of the minimum entropy generation method for optimizing thermodynamic cycles

    Institute of Scientific and Technical Information of China (English)

    Cheng Xue-Tao; Liang Xin-Gang

    2013-01-01

    Entropy generation is often used as a figure of merit in thermodynamic cycle optimizations. In this paper, it is shown that the applicability of the minimum entropy generation method to optimizing output power is conditional. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power when the total heat into the system of interest is not prescribed. For the cycles whose working medium is heated or cooled by streams with prescribed inlet temperatures and prescribed heat capacity flow rates, it is theoretically proved that both the minimum entropy generation rate and the minimum entropy generation number correspond to the maximum output power when the virtual entropy generation induced by dumping the used streams into the environment is considered. However, the minimum principle of entropy generation is not tenable in the case that the virtual entropy generation is not included, because the total heat into the system of interest is not fixed. An irreversible Carnot cycle and an irreversible Brayton cycle are analysed. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power if the heat into the system of interest is not prescribed.

  16. Applicability of the minimum entropy generation method for optimizing thermodynamic cycles

    International Nuclear Information System (INIS)

    Cheng Xue-Tao; Liang Xin-Gang

    2013-01-01

    Entropy generation is often used as a figure of merit in thermodynamic cycle optimizations. In this paper, it is shown that the applicability of the minimum entropy generation method to optimizing output power is conditional. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power when the total heat into the system of interest is not prescribed. For the cycles whose working medium is heated or cooled by streams with prescribed inlet temperatures and prescribed heat capacity flow rates, it is theoretically proved that both the minimum entropy generation rate and the minimum entropy generation number correspond to the maximum output power when the virtual entropy generation induced by dumping the used streams into the environment is considered. However, the minimum principle of entropy generation is not tenable in the case that the virtual entropy generation is not included, because the total heat into the system of interest is not fixed. An irreversible Carnot cycle and an irreversible Brayton cycle are analysed. The minimum entropy generation rate and the minimum entropy generation number do not correspond to the maximum output power if the heat into the system of interest is not prescribed. (general)
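
    A toy numeric check of the conditional link stated above, with assumed reservoir temperatures (600 K and 300 K, illustrative only, not the paper's cycle models): when the heat input is fixed, lower entropy generation coincides with higher output work; when it is not fixed, the correspondence breaks down.

```python
# Irreversible cycle absorbing q_h at T_H and rejecting q_c at T_C.
T_H, T_C = 600.0, 300.0  # assumed reservoir temperatures, K

def cycle(q_h, q_c):
    """Return (output work, entropy generation) for one cycle."""
    work = q_h - q_c
    s_gen = q_c / T_C - q_h / T_H  # >= 0 for a feasible cycle
    return work, s_gen

# Fixed q_h = 100: less entropy generation <=> more work.
w1, s1 = cycle(100.0, 60.0)
w2, s2 = cycle(100.0, 80.0)
assert s1 < s2 and w1 > w2

# q_h free: the cycle with lower entropy generation yields LESS work.
w3, s3 = cycle(50.0, 26.0)
w4, s4 = cycle(100.0, 55.0)
assert s3 < s4 and w3 < w4
```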

  17. Minimum Moduli in Von Neumann Algebras | Gopalraj | Quaestiones ...

    African Journals Online (AJOL)

    In this paper we answer a question raised in [12] in the affirmative, namely that the essential minimum modulus of an element in a von Neumann algebra, relative to any norm closed two-sided ideal, is equal to the minimum modulus of the element perturbed by an element from the ideal. As a corollary of this result, we ...

  18. Random number generation and creativity.

    Science.gov (United States)

    Bains, William

    2008-01-01

    A previous paper suggested that humans can generate genuinely random numbers. I tested this hypothesis by repeating the experiment with a larger number of highly numerate subjects, asking them to call out a sequence of digits selected from 0 through 9. The resulting sequences were substantially non-random, with an excess of sequential pairs of numbers and a deficit of repeats of the same number, in line with previous literature. However, the previous literature suggests that humans generate random numbers with substantial conscious effort, and distractions which reduce that effort reduce the randomness of the numbers. I reduced my subjects' concentration by asking them to call out in another language, and with alcohol; neither affected the randomness of their responses. This suggests that the ability to generate random numbers is a 'basic' function of the human mind, even if those numbers are not mathematically 'random'. I hypothesise that there is a 'creativity' mechanism which, while not truly random, provides novelty as part of the mind's defence against closed programming loops, and that testing for the effects seen here in people more or less familiar with numbers or with spontaneous creativity could identify more features of this process. It is possible that training to perform better at simple random generation tasks could help to increase creativity, by training people to reduce the conscious mind's suppression of the 'spontaneous', creative response to new questions.
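
    The biases reported above (excess of sequential pairs, deficit of repeats) can be quantified with a simple pair statistic; for a truly random digit stream each fraction would be about 0.1. The example sequence is made up for illustration.

```python
def pair_statistics(digits):
    """Fractions of adjacent pairs that are repeats (d, d) and ascending
    steps (d, d+1 mod 10); each is ~0.1 for a uniform random stream."""
    pairs = list(zip(digits, digits[1:]))
    repeats = sum(1 for a, b in pairs if a == b)
    steps = sum(1 for a, b in pairs if b == (a + 1) % 10)
    n = len(pairs)
    return repeats / n, steps / n

# A human-like sequence: too many +1 steps, no repeats at all.
seq = [1, 2, 3, 7, 8, 2, 3, 4, 9, 0, 5, 6]
print(pair_statistics(seq))
```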

  19. Subjective well-being and minimum wages: Evidence from U.S. states.

    Science.gov (United States)

    Kuroki, Masanori

    2018-02-01

    This paper investigates whether increases in minimum wages are associated with higher life satisfaction by using monthly-level state minimum wages and individual-level data from the 2005-2010 Behavioral Risk Factor Surveillance System. The magnitude I find suggests that a 10% increase in the minimum wage is associated with a 0.03-point increase in life satisfaction for workers without a high school diploma, on a 4-point scale. Contrary to popular belief that higher minimum wages hurt business owners, I find little evidence that higher minimum wages lead to the loss of well-being among self-employed people. Copyright © 2017 John Wiley & Sons, Ltd.

  20. Development of an accelerated reliability test schedule for terrestrial solar cells

    Science.gov (United States)

    Lathrop, J. W.; Prince, J. L.

    1981-01-01

    An accelerated test schedule using a minimum number of tests and a minimum number of cells has been developed on the basis of stress test results obtained from more than 1500 cells of seven different cell types. The proposed tests, which include bias-temperature, bias-temperature-humidity, power cycle, thermal cycle, and thermal shock tests, use as few as 10 and up to 25 cells, depending on the test type.

  1. Six months into Myanmar's minimum wage: Reflecting on progress ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-25

    Apr 25, 2016 ... Participants examined recent results from an IDRC-funded enterprise survey, ... of a minimum wage, and how they have coped with the new situation.” ... Debate on the impact of minimum wages on employment continues ...

  2. Do minimum wages reduce poverty? Evidence from Central America ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-12-16

    Dec 16, 2010 ... Raising minimum wages has traditionally been considered a way to protect poor ... However, the effect of raising minimum wages remains an empirical question ...

  3. Decision trees with minimum average depth for sorting eight elements

    KAUST Repository

    AbouEisha, Hassan M.

    2015-11-19

    We prove that the minimum average depth of a decision tree for sorting 8 pairwise different elements is equal to 620160/8!. We show also that each decision tree for sorting 8 elements which has minimum average depth (the number of such trees is approximately equal to 8.548×10^326365) also has minimum depth. Both problems were considered by Knuth (1998). To obtain these results, we use tools based on extensions of dynamic programming which allow us to make sequential optimization of decision trees relative to depth and average depth, and to count the number of decision trees with minimum average depth.
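
    The headline figure can be checked directly: 620160/8! comparisons on average, just above the information-theoretic lower bound of log2(8!) bits per sorted permutation.

```python
from math import factorial, log2

min_avg_depth = 620160 / factorial(8)   # reported optimum
info_bound = log2(factorial(8))         # entropy lower bound

print(round(min_avg_depth, 3))  # ~15.381
print(round(info_bound, 3))     # ~15.299
```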

  4. Quantitative characterisation of an engineering write-up using random walk analysis

    Directory of Open Access Journals (Sweden)

    Sunday A. Oke

    2008-02-01

    This contribution reports on the investigation of correlation properties in an English scientific text (engineering write-up) by means of a random walk. Though the idea to use a random walk to characterise correlations is not new (it was used e.g. in genome analysis and in the analysis of texts), a random walk approach to the analysis of an English scientific text is still far from being exploited in its full strength, as demonstrated in this paper. A method of high-dimensional embedding is proposed. Case examples were drawn arbitrarily from four engineering write-ups (Ph.D. synopses) of three engineering departments in the Faculty of Technology, University of Ibadan, Nigeria. Thirteen additional analyses of non-engineering English texts were made and the results compared to the engineering English texts. Thus, a total of seventeen write-ups of eight Faculties and sixteen Departments of the University of Ibadan were considered. The characterising exponents, which relate the average distance of random walkers away from a known starting position to the elapsed time steps, were estimated for the seventeen cases according to the power law and in three different dimensional spaces. The average characteristic exponent obtained for the seventeen cases over the three different dimensional spaces studied was 1.42 to 2 decimals, with a minimum and a maximum coefficient of determination (R2) of 0.9495 and 0.9994 respectively. This is found to be 284% of the average characterising exponent value (0.5), as supported by the literature for random walkers based on the pseudo-random number generator. The average characteristic exponent obtained for the four cases that were engineering-based, over the three different dimensional spaces studied, was 1.41 to 2 decimals (99.3% of 1.42), with a minimum and a maximum coefficient of determination (R2) of 0.9507 and 0.9974 respectively. This is found to be 282% of the average characterising exponent value (0.5), as

  5. Coupling between minimum scattering antennas

    DEFF Research Database (Denmark)

    Andersen, J.; Lessow, H; Schjær-Jacobsen, Hans

    1974-01-01

    Coupling between minimum scattering antennas (MSA's) is investigated by the coupling theory developed by Wasylkiwskyj and Kahn. Only rotationally symmetric power patterns are considered, and graphs of relative mutual impedance are presented as a function of distance and pattern parameters. Crossed...

  6. Experimental vibroacoustic testing of plane panels using synthesized random pressure fields.

    Science.gov (United States)

    Robin, Olivier; Berry, Alain; Moreau, Stéphane

    2014-06-01

    The experimental reproduction of random pressure fields on a plane panel, and of the corresponding induced vibrations, is studied. An open-loop reproduction strategy is proposed that uses the synthetic array concept, in which a small array element is moved to create a large array by post-processing. Three possible approaches are suggested to define the complex amplitudes to be imposed on the reproduction sources distributed on a virtual plane facing the panel to be tested. Using a single acoustic monopole, a scanning laser vibrometer and a baffled simply supported aluminum panel, experimental vibroacoustic indicators such as the Transmission Loss are obtained for Diffuse Acoustic Field and for high-speed subsonic and supersonic Turbulent Boundary Layer excitations. Comparisons with simulation results obtained using commercial software show that the Transmission Loss estimation is possible under both excitations. Moreover, as a complement to frequency-domain indicators, the vibroacoustic behavior of the panel can be studied in the wavenumber domain.

  7. Predicting Minimum Control Speed on the Ground (VMCG) and Minimum Control Airspeed (VMCA) of Engine Inoperative Flight Using Aerodynamic Database and Propulsion Database Generators

    Science.gov (United States)

    Hadder, Eric Michael

    There are many computer-aided engineering tools and software packages used by aerospace engineers to design and predict specific parameters of an airplane. These tools help a design engineer predict and calculate parameters such as lift, drag, pitching moment, takeoff range, maximum takeoff weight, maximum flight range and much more. However, there are very limited ways to predict and calculate the minimum control speeds of an airplane in engine-inoperative flight. There are simple solutions, as well as complicated solutions, yet there is neither a standard technique nor consistency throughout the aerospace industry. To further complicate this subject, airplane designers have the option of using an Automatic Thrust Control System (ATCS), which directly alters the minimum control speeds of an airplane. This work addresses this issue with a tool used to predict and calculate the Minimum Control Speed on the Ground (VMCG) as well as the Minimum Control Airspeed (VMCA) of any existing or design-stage airplane. With simple line art of an airplane, a program called VORLAX is used to generate an aerodynamic database used to calculate the stability derivatives of an airplane. Using another program called Numerical Propulsion System Simulation (NPSS), a propulsion database is generated to use with the aerodynamic database to calculate both VMCG and VMCA. This tool was tested using two airplanes, the Airbus A320 and the Lockheed Martin C130J-30 Super Hercules. The A320 does not use an Automatic Thrust Control System (ATCS), whereas the C130J-30 does. The tool was able to properly calculate and match known values of VMCG and VMCA for both airplanes, which means it would also be able to predict the VMCG and VMCA of an airplane in the preliminary stages of design. This would allow design engineers the ability to use an Automatic Thrust Control System (ATCS) as part

  8. 29 CFR 510.23 - Agricultural activities eligible for minimum wage phase-in.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Agricultural activities eligible for minimum wage phase-in..., DEPARTMENT OF LABOR REGULATIONS IMPLEMENTATION OF THE MINIMUM WAGE PROVISIONS OF THE 1989 AMENDMENTS TO THE... eligible for minimum wage phase-in. Agriculture activities eligible for an extended phase-in of the minimum...

  9. Is a Minimum Wage an Appropriate Instrument for Redistribution?

    NARCIS (Netherlands)

    A.A.F. Gerritsen (Aart); B. Jacobs (Bas)

    2016-01-01

    textabstractWe analyze the redistributional (dis)advantages of a minimum wage over income taxation in competitive labor markets, without imposing assumptions on the (in)efficiency of labor rationing. Compared to a distributionally equivalent tax change, a minimum-wage increase raises involuntary

  10. A Randomized Controlled Trial to Test the Effectiveness of an Immersive 3D Video Game for Anxiety Prevention among Adolescents.

    Directory of Open Access Journals (Sweden)

    Hanneke Scholten

    Adolescent anxiety is debilitating, the most frequently diagnosed adolescent mental health problem, and leads to substantial long-term problems. A randomized controlled trial (n = 138) was conducted to test the effectiveness of a biofeedback video game (Dojo) for adolescents with elevated levels of anxiety. Adolescents (11-15 years old) were randomly assigned to play Dojo or a control game (Rayman 2: The Great Escape). Initial screening for anxiety was done on 1,347 adolescents in five high schools; only adolescents who scored above the "at-risk" cut-off on the Spence Children Anxiety Survey were eligible. Adolescents' anxiety levels were assessed at pre-test, post-test, and at three month follow-up to examine the extent to which playing Dojo decreased adolescents' anxiety. The present study revealed equal improvements in anxiety symptoms in both conditions at follow-up and no differences between Dojo and the closely matched control game condition. Latent growth curve models did reveal a steeper decrease of personalized anxiety symptoms (not of total anxiety symptoms) in the Dojo condition compared to the control condition. Moderation analyses did not show any differences in outcomes between boys and girls, nor did age differentiate outcomes. The present results are of importance for prevention science, as this was the first full-scale randomized controlled trial testing indicated prevention effects of a video game aimed at reducing anxiety. Future research should carefully consider the choice of control condition and outcome measurements, address the potentially high impact of participants' expectations, and take critical design issues into consideration, such as individual- versus group-based intervention and contamination issues.

  11. Minimum analytical quality specifications of inter-laboratory comparisons: agreement among Spanish EQAP organizers.

    Science.gov (United States)

    Ricós, Carmen; Ramón, Francisco; Salas, Angel; Buño, Antonio; Calafell, Rafael; Morancho, Jorge; Gutiérrez-Bassini, Gabriella; Jou, Josep M

    2011-11-18

    Four Spanish scientific societies organizing external quality assessment programs (EQAP) formed a working group to promote the use of common minimum quality specifications for clinical tests. Laboratories that do not meet the minimum specifications are encouraged to review the affected analytical procedure immediately and to implement corrective actions if necessary. The approach was to use, as the specification, the 95th percentile of the results sent to the EQAP (expressed as percentage deviation from the target value), computed over all results except outliers during a one-year cycle. The target value for a number of analytes of the basic biochemistry program was established as the overall mean. However, because of the substantial discrepancies between routine methods for basic hematology, hormones, proteins, therapeutic drugs and tumor markers, the target in these cases was the peer-group mean. The resulting specifications were quite similar to those established in the US (CLIA) and Germany (Richtlinie). The proposed specifications represent the minimum level of quality to be attained by laboratories, to assure harmonized service performance. They are not intended to satisfy clinical requirements, which constitute the final level of quality to be reached and which our organizations strongly promote by means of documents, courses, symposia and all types of educational activities.
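
    The percentile rule can be sketched as follows. The nearest-rank percentile convention is an assumption, since the societies' exact estimator is not stated in the abstract.

```python
import math

def minimum_spec(deviations, coverage=0.95):
    """Minimum quality specification: the 95th percentile (nearest rank)
    of absolute percentage deviations from the target, after outlier
    removal has already been applied upstream."""
    devs = sorted(abs(d) for d in deviations)
    k = max(0, math.ceil(coverage * len(devs)) - 1)
    return devs[k]

# 100 hypothetical deviations of 1%..100%: the specification is 95%.
print(minimum_spec([float(i) for i in range(1, 101)]))  # 95.0
```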

  12. The impact of the minimum wage on health.

    Science.gov (United States)

    Andreyeva, Elena; Ukert, Benjamin

    2018-03-07

    This study evaluates the effect of minimum wage on risky health behaviors, healthcare access, and self-reported health. We use data from the 1993-2015 Behavioral Risk Factor Surveillance System, and employ a difference-in-differences strategy that utilizes time variation in new minimum wage laws across U.S. states. Results suggest that the minimum wage increases the probability of being obese and decreases daily fruit and vegetable intake, but also decreases days with functional limitations while having no impact on healthcare access. Subsample analyses reveal that the increase in weight and decrease in fruit and vegetable intake are driven by the older population, married, and whites. The improvement in self-reported health is especially strong among non-whites, females, and married.
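
    In its simplest two-period form, the difference-in-differences strategy reduces to the estimator below; the numbers are hypothetical and only illustrate the mechanics, not the paper's full regression with state and time variation.

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences: the change in treated states minus the
    contemporaneous change in control states."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_post) - mean(treated_pre)) - \
           (mean(control_post) - mean(control_pre))

# Hypothetical obesity rates before/after a minimum wage increase.
effect = did_estimate([0.30, 0.32], [0.33, 0.35], [0.31, 0.29], [0.32, 0.30])
print(round(effect, 3))  # 0.02
```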

  13. Setting a minimum age for juvenile justice jurisdiction in California.

    Science.gov (United States)

    Barnert, Elizabeth S.; Abrams, Laura S.; Maxson, Cheryl; Gase, Lauren; Soung, Patricia; Carroll, Paul; Bath, Eraka

    2017-03-13

    Purpose: Despite the existence of minimum age laws for juvenile justice jurisdiction in 18 US states, California has no explicit law that protects children (i.e. youth less than 12 years old) from being processed in the juvenile justice system. In the absence of a minimum age law, California lags behind other states and international practice and standards. The paper aims to discuss these issues. Design/methodology/approach: In this policy brief, academics across the University of California campuses examine current evidence, theory, and policy related to the minimum age of juvenile justice jurisdiction. Findings: Existing evidence suggests that children lack the cognitive maturity to comprehend or benefit from formal juvenile justice processing, and diverting children from the system altogether is likely to be more beneficial for the child and for public safety. Research limitations/implications: Based on current evidence and theory, the authors argue that minimum age legislation that protects children from contact with the juvenile justice system and treats them as children in need of services and support, rather than as delinquents or criminals, is an important policy goal for California and for other national and international jurisdictions lacking a minimum age law. Originality/value: California has no law specifying a minimum age for juvenile justice jurisdiction, meaning that young children of any age can be processed in the juvenile justice system. This policy brief provides a rationale for a minimum age law in California and other states and jurisdictions without one.

  14. Performance Evaluation and Robustness Testing of Advanced Oscilloscope Triggering Schemes

    Directory of Open Access Journals (Sweden)

    Shakeb A. KHAN

    2010-01-01

    In this paper, the performance and robustness of two advanced oscilloscope triggering schemes are evaluated. The problem of time-period measurement of complex waveforms can be solved using algorithms that utilize an associative memory network based weighted Hamming distance (Whd) and autocorrelation-based techniques. The robustness of both advanced techniques is then evaluated by simulated addition of random noise of different levels to complex test-signal waveforms, and the minimum value of Whd (Whd min) and the peak value of the coefficient of correlation (COC max) are computed over 10,000 cycles of the selected test waveforms. The distance between the mean of the second-lowest value of Whd and Whd min, and the distance between the second-highest value of COC and COC max, are used as parameters to analyze the robustness of the considered techniques. From the results, it is found that both techniques are capable of producing trigger pulses efficiently, but the correlation-based technique is found to be better from the robustness point of view.
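
    The autocorrelation side of the comparison can be sketched as period detection via the peak of the normalized autocorrelation (the COC of the paper); the two-harmonic test waveform and the short-lag exclusion below are assumptions for illustration.

```python
import math

def autocorr_period(signal, min_lag=1):
    """Period estimate: the lag maximizing the normalized autocorrelation
    coefficient. min_lag skips the trivially high short-lag correlation
    of smooth signals."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]
    var = sum(v * v for v in x)
    best_lag, best_coc = min_lag, float('-inf')
    for lag in range(min_lag, n // 2):
        coc = sum(x[i] * x[i + lag] for i in range(n - lag)) / var
        if coc > best_coc:
            best_lag, best_coc = lag, coc
    return best_lag, best_coc

# Complex periodic waveform: fundamental period 25 samples plus a harmonic.
wave = [math.sin(2 * math.pi * i / 25) + 0.5 * math.sin(4 * math.pi * i / 25)
        for i in range(400)]
lag, coc = autocorr_period(wave, min_lag=5)
print(lag)  # 25
```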

  15. A method simulating random magnetic field in interplanetary space by an autoregressive method

    International Nuclear Information System (INIS)

    Kato, Masahito; Sakai, Takasuke

    1985-01-01

    With an autoregressive method, we tried to generate random noise fitting a power spectrum that can be analytically Fourier-transformed into an autocorrelation function. Although we cannot directly compare our method with the FFT method of Owens (1978), we point out the following: the FFT method must first fix the number of data points N, i.e., the total length to be generated, and cannot generate more than N points, because beyond NΔy the generated data repeats the same pattern as below NΔy, where Δy is the minimum interval for the random noise. So if one wants to change or increase N after generating the noise, the generation must be restarted from the first step. The characteristics of the generated random numbers may depend on N, judging from the generating method. Once the prediction-error filters are determined, our method can produce random numbers successively; that is, N can be extended indefinitely without any extra effort. (author)
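
    The key advantage claimed above, that an autoregressive generator can be extended indefinitely once its filter coefficients are fixed, is easy to see in the simplest case, an AR(1) process; the coefficient and noise level below are assumptions for illustration, not the paper's fitted filter.

```python
import random

def ar1_noise(a, sigma, n, seed=0):
    """AR(1) process x[t] = a*x[t-1] + e[t] with Gaussian innovations.

    Its autocorrelation decays as a**lag; unlike FFT synthesis, the
    stream can be extended forever without regenerating from scratch."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = a * x + rng.gauss(0.0, sigma)
        out.append(x)
    return out

# Sample lag-1 autocorrelation approaches the coefficient a = 0.9.
noise = ar1_noise(a=0.9, sigma=1.0, n=10000)
```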

  16. Memory for Random Time Patterns in Audition, Touch, and Vision.

    Science.gov (United States)

    Kang, HiJee; Lancelin, Denis; Pressnitzer, Daniel

    2018-03-22

    Perception deals with temporal sequences of events, like series of phonemes for audition, dynamic changes in pressure for touch textures, or moving objects for vision. Memory processes are thus needed to make sense of the temporal patterning of sensory information. Recently, we have shown that auditory temporal patterns could be learned rapidly and incidentally with repeated exposure [Kang et al., 2017]. Here, we tested whether rapid incidental learning of temporal patterns was specific to audition, or a more general property of sensory systems. We used the same behavioral task in three modalities: audition, touch, and vision, for stimuli having identical temporal statistics. Participants were presented with sequences of acoustic pulses for audition, motion pulses to the fingertips for touch, or light pulses for vision. Pulses were randomly and irregularly spaced, with all inter-pulse intervals in the sub-second range and all constrained to be longer than the temporal acuity in any modality. This led to pulse sequences with an average inter-pulse interval of 166 ms, a minimum inter-pulse interval of 60 ms, and a total duration of 1.2 s. Results showed that, if a random temporal pattern re-occurred at random times during an experimental block, it was rapidly learned, whatever the sensory modality. Moreover, patterns first learned in the auditory modality displayed transfer of learning to either touch or vision. This suggests that sensory systems may be exquisitely tuned to incidentally learn re-occurring temporal patterns, with possible cross-talk between the senses. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
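
    The stimulus statistics reported above (60 ms minimum gap, ~166 ms average gap, 1.2 s total duration) can be reproduced with a floor-plus-exponential interval model; the exact distribution used in the study is not stated, so this construction is an assumption.

```python
import random

def random_pulse_times(duration=1.2, min_gap=0.060, mean_gap=0.166, seed=0):
    """Irregular pulse sequence: each inter-pulse interval is a 60 ms
    floor plus an exponential part, so gaps are random but never shorter
    than the floor, with the requested mean."""
    rng = random.Random(seed)
    times, t = [], 0.0
    while True:
        t += min_gap + rng.expovariate(1.0 / (mean_gap - min_gap))
        if t >= duration:
            return times
        times.append(t)

pulses = random_pulse_times()
gaps = [b - a for a, b in zip(pulses, pulses[1:])]
assert all(g >= 0.060 for g in gaps)
```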

  17. Neutron slowing down and transport in monoisotopic media with constant cross sections or with a square-well minimum

    International Nuclear Information System (INIS)

    Peng, W.H.

    1977-01-01

    A specialized moments-method computer code was constructed for the calculation of the even spatial moments of the scalar flux, φ_2n, through 2n = 80. Neutron slowing-down and transport in a medium with constant cross sections was examined and the effect of a superimposed square-well cross section minimum on the penetrating flux was studied. In the constant cross section case, for nuclei that are not too light, the scalar flux is essentially independent of the nuclide mass. The numerical results obtained were used to test the validity of existing analytic approximations to the flux at both small and large lethargies relative to the source energy. As a result it was possible to define the regions in the lethargy--distance plane where these analytic solutions apply with reasonable accuracy. A parametric study was made of the effect of a square-well cross section minimum on neutron fluxes at energies below the minimum. It was shown that the flux at energies well below the minimum is essentially independent of the position of the minimum in lethargy. The results can be described by a convolution-of-sources model involving only the lethargy separation between detector and source, the width and the relative depth of the minimum. On the basis of the computations and the corresponding model, it is possible to predict, e.g., the conditions under which transport in the region of minimum completely determines the penetrating flux. At the other extreme, the model describes when the transport in the minimum can be treated in the same manner as in any comparable lethargy interval. With the aid of these criteria it is possible to understand the apparent paradoxical effects of certain minima in neutron penetration through such media as iron and sodium

  18. Ecological niche partitioning of the invasive dinoflagellate Prorocentrum minimum and its native congeners in the Baltic Sea.

    Science.gov (United States)

    Telesh, Irena V; Schubert, Hendrik; Skarlato, Sergei O

    2016-11-01

    This study analyses three decades of the peculiar bloom-formation history of the potentially toxic invasive planktonic dinoflagellates Prorocentrum minimum (Pavillard) Schiller in the SW Baltic Sea. We tested a research hypothesis that the unexpectedly long delay (nearly two decades) in population development of P. minimum prior to its first bloom was caused by competition with one or several closely related native dinoflagellate species due to ecological niche partitioning which hampered the spread and bloom-forming potential of the invader. We applied the ecological niche concept to a large, long-term phytoplankton database and analysed the invasion history and population dynamics of P. minimum in the SW Baltic Sea coastal waters using the data on phytoplankton composition, abundance and biomass. The ecological niche dimensions of P. minimum and its congener P. balticum were identified as the optimum environmental conditions for the species during the bloom events based on water temperature, salinity, pH, concentration of nutrients (PO₄³⁻; total phosphorus, TP; total nitrogen, TN; SiO₄⁴⁻), TN/TP-ratio and habitat type. The data on spatial distribution and ecological niche dimensions of P. minimum have contributed to the development of the "protistan species maximum concept". High microplankton diversity at critical salinities in the Baltic Sea may be considered as a possible reason for the significant niche overlap and strong competitive interactions among congeners leading to prolonged delay in population growth of P. minimum preceding its first bloom in the highly variable brackishwater environment. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. 47 CFR 25.205 - Minimum angle of antenna elevation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Minimum angle of antenna elevation. 25.205 Section 25.205 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES SATELLITE COMMUNICATIONS Technical Standards § 25.205 Minimum angle of antenna elevation. (a) Earth station...

  20. Road networks as collections of minimum cost paths

    Science.gov (United States)

    Wegner, Jan Dirk; Montoya-Zegarra, Javier Alexander; Schindler, Konrad

    2015-10-01

    We present a probabilistic representation of network structures in images. Our target application is the extraction of urban roads from aerial images. Roads appear as thin, elongated, partially curved structures forming a loopy graph, and this complex layout requires a prior that goes beyond standard smoothness and co-occurrence assumptions. In the proposed model the network is represented as a union of 1D paths connecting distant (super-)pixels. A large set of putative candidate paths is constructed in such a way that they include the true network as much as possible, by searching for minimum cost paths in the foreground (road) likelihood. Selecting the optimal subset of candidate paths is posed as MAP inference in a higher-order conditional random field. Each path forms a higher-order clique with a type of clique potential, which attracts the member nodes of cliques with high cumulative road evidence to the foreground label. That formulation induces a robust P^N-Potts model, for which a global MAP solution can be found efficiently with graph cuts. Experiments with two road data sets show that the proposed model significantly improves per-pixel accuracies as well as the overall topological network quality with respect to several baselines.
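The "minimum cost paths in the foreground likelihood" step can be illustrated with plain Dijkstra search on a per-pixel cost grid. This is a generic sketch, not the authors' implementation, and the cost values below are made up:

```python
import heapq

def min_cost_path(cost, start, goal):
    """Dijkstra's algorithm on a 2D cost grid: the path cost is the sum
    of per-cell costs, analogous to accumulating a negative-log road
    likelihood along a candidate path."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    heap = [(dist[start], start)]
    prev = {}
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Reconstruct the path from goal back to start.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Toy grid: low cost (road-like) down the left edge and along the bottom.
cost = [[1, 9, 9],
        [1, 9, 9],
        [1, 1, 1]]
path, total = min_cost_path(cost, (0, 0), (2, 2))
```

In the paper such paths become candidate cliques in the CRF; here the sketch only shows how a single cheapest path is found.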

  1. Parameterization of ion channeling half-angles and minimum yields

    Energy Technology Data Exchange (ETDEWEB)

    Doyle, Barney L.

    2016-03-15

    A MS Excel program has been written that calculates ion channeling half-angles and minimum yields in cubic bcc, fcc and diamond lattice crystals. All of the tables and graphs in the three Ion Beam Analysis Handbooks that previously had to be looked up and read manually were programmed into Excel as handy lookup tables or, in the case of the graphs, parameterized using rather simple exponential functions with different power functions of the arguments. The program then offers an extremely convenient way to calculate axial and planar half-angles and minimum yields, as well as the effects of amorphous overlayers on both. The program can calculate these half-angles and minimum yields for 〈u v w〉 axes and [h k l] planes up to (5 5 5). The program is open source and available at (http://www.sandia.gov/pcnsc/departments/iba/ibatable.html).

  2. 26 CFR 5c.168(f)(8)-4 - Minimum investment of lessor.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 14 2010-04-01 2010-04-01 false Minimum investment of lessor. 5c.168(f)(8)-4....168(f)(8)-4 Minimum investment of lessor. (a) Minimum investment. Under section 168(f)(8)(B)(ii), an... has a minimum at risk investment which, at the time the property is placed in service under the lease...

  3. Providing detailed information about latent tuberculosis and compliance with the PPD test among healthcare workers in Israel: a randomized controlled study.

    Science.gov (United States)

    Taubman, Danielle; Titler, Nava; Edelstein, Hana; Elias, Mazen; Saliba, Walid

    2013-12-01

    The compliance of screening for latent tuberculosis (TB) with the tuberculin purified protein derivative (PPD) test is very low among healthcare workers (HCWs) in Israel. This randomized controlled study uses the Health Belief Model (HBM) as a conceptual framework to examine whether providing more information about latent TB and the PPD test increases the response rate for PPD screening among HCWs. All candidate HCWs for latent TB screening were randomly allocated to one of the following two invitations to perform the PPD test: regular letter (control group, n=97), and a letter with information about latent TB and the PPD test (intervention group, n=196). 293 HCWs were included (185 nurses, and 108 physicians). Overall, 36 (12.3%) HCWs were compliant with the PPD test screening. Compliance with PPD testing in the intervention group was not statistically different from the control group, RR 0.87 (95% CI, 0.46-1.65). Compliance for latent TB screening is low among HCWs in northeastern Israel. Providing detailed information about latent TB was not associated with increased test compliance. Understanding existing disparities in screening rates and potential barriers to latent TB screening among HCWs is important in order to move forward and successfully increase screening rates. Copyright © 2013 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  4. Elemental GCR Observations during the 2009-2010 Solar Minimum Period

    Science.gov (United States)

    Lave, K. A.; Israel, M. H.; Binns, W. R.; Christian, E. R.; Cummings, A. C.; Davis, A. J.; deNolfo, G. A.; Leske, R. A.; Mewaldt, R. A.; Stone, E. C.

    2013-01-01

    Using observations from the Cosmic Ray Isotope Spectrometer (CRIS) onboard the Advanced Composition Explorer (ACE), we present new measurements of the galactic cosmic ray (GCR) elemental composition and energy spectra for the species B through Ni in the energy range approx. 50-550 MeV/nucleon during the record-setting 2009-2010 solar minimum period. These data are compared with our observations from the 1997-1998 solar minimum period, when solar modulation in the heliosphere was somewhat higher. For these species, we find that the intensities during the 2009-2010 solar minimum were approx. 20% higher than those in the previous solar minimum, and in fact were the highest GCR intensities recorded during the space age. Relative abundances for these species during the two solar minimum periods differed by small but statistically significant amounts, which are attributed to the combination of spectral shape differences between primary and secondary GCRs in the interstellar medium and differences between the levels of solar modulation in the two solar minima. We also present the secondary-to-primary ratios B/C and (Sc+Ti+V)/Fe for both solar minimum periods, and demonstrate that these ratios are reasonably well fit by a simple "leaky-box" galactic transport model that is combined with a spherically symmetric solar modulation model.

  5. 25 CFR 547.11 - What are the minimum technical standards for money and credit handling?

    Science.gov (United States)

    2010-04-01

    ... GAMES § 547.11 What are the minimum technical standards for money and credit handling? This section... interface is: (i) Involved in the play of a game; (ii) In audit mode, recall mode or any test mode; (iii...) For machine-readable vouchers and coupons, a bar code or other form of machine readable representation...

  6. A Randomized Controlled Trial Evaluating Efficacy of Promoting a Home-Based HIV Self-Testing with Online Counseling on Increasing HIV Testing Among Men Who Have Sex with Men.

    Science.gov (United States)

    Wang, Zixin; Lau, Joseph T F; Ip, Mary; Ho, Shara P Y; Mo, Phoenix K H; Latkin, Carl; Ma, Yee Ling; Kim, Yoona

    2018-01-01

    We developed an innovative home-based HIV self-testing (HIVST) service that included mailing of a free HIVST kit and provision of online real-time instructions and pre-test/post-test counseling (HIVST-OIC). The present parallel-group, non-blinded randomized controlled trial was conducted to evaluate the efficacy of promoting HIVST-OIC in increasing the HIV testing rate among 430 men who have sex with men (MSM) with access to online live-chat applications in Hong Kong. At month 6, as compared to the control group, the intervention group reported a significantly higher prevalence of HIV testing of any type (89.8 vs. 50.7%; relative risk (RR): 1.77). HIVST-OIC showed strong potential in increasing the prevalence of HIV testing and reducing sexual risk behaviors. Implementation research is warranted.

  7. Minimum qualifications for nuclear criticality safety professionals

    International Nuclear Information System (INIS)

    Ketzlach, N.

    1990-01-01

    A Nuclear Criticality Technology and Safety Training Committee has been established within the U.S. Department of Energy (DOE) Nuclear Criticality Safety and Technology Project to review and, if necessary, develop standards for the training of personnel involved in nuclear criticality safety (NCS). The committee is exploring the need for developing a standard or other mechanism for establishing minimum qualifications for NCS professionals. The development of standards and regulatory guides for nuclear power plant personnel may serve as a guide in developing the minimum qualifications for NCS professionals

  8. A minimum achievable PV electrical generating cost

    International Nuclear Information System (INIS)

    Sabisky, E.S.

    1996-01-01

    The role and share of photovoltaic (PV) generated electricity in our nation's future energy arsenal is primarily dependent on its future production cost. This paper provides a framework for obtaining a minimum achievable electrical generating cost (a lower bound) for fixed, flat-plate photovoltaic systems. A cost of 2.8 ¢/kWh (1990$) was derived for a plant located in the high-sunshine Southwestern USA, using a cost of money of 8%. In addition, a value of 22 ¢/Wp (1990$) was estimated as a minimum module manufacturing cost/price

  9. Causal Tracking Control of a Non-Minimum Phase HIL Transmission Test System

    OpenAIRE

    Wang, Pengfei

    2009-01-01

    The automotive industry has long relied on testing powertrain components in real vehicles, which makes the development process slow and expensive. Therefore, hardware in the loop (HIL) testing techniques are increasingly being adopted to develop electronic control units (ECU) for the engine and other components of a vehicle. In this thesis, an HIL testing system is developed to provide a laboratory testing environment for continuously variable transmissions (CVTs). Two induction motors were u...

  10. Evidence of significant bias in an elementary random number generator

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Brandl, V.

    1981-03-01

    An elementary pseudo random number generator for isotropically distributed unit vectors in 3-dimensional space has been tested for bias. This generator uses the IBM-supplied routine RANDU and a transparent rejection technique. The tests show clearly that non-randomness in the pseudo random numbers generated by the primary IBM generator leads to bias on the order of 1 percent in estimates obtained from the secondary random number generator. FORTRAN listings of 4 variants of the random number generator called by a simple test programme and output listings are included for direct reference. (orig.) [de
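For reference, the rejection technique itself is straightforward; the sketch below uses Python's standard generator in place of RANDU, since the bias reported above stems from the primary generator rather than from the rejection step:

```python
import random

def isotropic_unit_vector(rng=random.random):
    """Draw a unit vector uniformly on the sphere by rejection:
    sample points in the cube [-1, 1]^3, keep those inside the unit
    ball, then normalize. The quality of the result depends entirely
    on the underlying uniform generator -- with a flawed primary
    generator such as RANDU, the rejection step inherits (and can
    amplify) its correlations."""
    while True:
        x, y, z = (2.0 * rng() - 1.0 for _ in range(3))
        s = x * x + y * y + z * z
        if 0.0 < s <= 1.0:
            n = s ** 0.5
            return (x / n, y / n, z / n)
```

The acceptance rate is the ball-to-cube volume ratio, π/6 ≈ 0.52, so on average about two candidate triples are drawn per accepted vector.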

  11. The MIXMAX random number generator

    Science.gov (United States)

    Savvidy, Konstantin G.

    2015-11-01

    In this paper, we study the randomness properties of unimodular matrix random number generators. Under well-known conditions, these discrete-time dynamical systems have the highly desirable K-mixing properties which guarantee high quality random numbers. It is found that some widely used random number generators have poor Kolmogorov entropy and consequently fail in empirical tests of randomness. These tests show that the lowest acceptable value of the Kolmogorov entropy is around 50. Next, we provide a solution to the problem of determining the maximal period of unimodular matrix generators of pseudo-random numbers. We formulate the necessary and sufficient condition to attain the maximum period and present a family of specific generators in the MIXMAX family with superior performance and excellent statistical properties. Finally, we construct three efficient algorithms for operations with the MIXMAX matrix, which is a multi-dimensional generalization of the famous cat-map: the first computes multiplication by the MIXMAX matrix in O(N) operations; the second recursively computes its characteristic polynomial in O(N²) operations; and the third applies skips of a large number S of steps to the sequence in O(N² log(S)) operations.

  12. Setting a minimum age for juvenile justice jurisdiction in California

    Science.gov (United States)

    Barnert, Elizabeth S.; Abrams, Laura S.; Maxson, Cheryl; Gase, Lauren; Soung, Patricia; Carroll, Paul; Bath, Eraka

    2018-01-01

    Purpose Despite the existence of minimum age laws for juvenile justice jurisdiction in 18 US states, California has no explicit law that protects children (i.e. youth less than 12 years old) from being processed in the juvenile justice system. In the absence of a minimum age law, California lags behind other states and international practice and standards. The paper aims to discuss these issues. Design/methodology/approach In this policy brief, academics across the University of California campuses examine current evidence, theory, and policy related to the minimum age of juvenile justice jurisdiction. Findings Existing evidence suggests that children lack the cognitive maturity to comprehend or benefit from formal juvenile justice processing, and diverting children from the system altogether is likely to be more beneficial for the child and for public safety. Research limitations/implications Based on current evidence and theory, the authors argue that minimum age legislation that protects children from contact with the juvenile justice system and treats them as children in need of services and support, rather than as delinquents or criminals, is an important policy goal for California and for other national and international jurisdictions lacking a minimum age law. Originality/value California has no law specifying a minimum age for juvenile justice jurisdiction, meaning that young children of any age can be processed in the juvenile justice system. This policy brief provides a rationale for a minimum age law in California and other states and jurisdictions without one. Paper type Conceptual paper PMID:28299968

  13. Generalized oscillator strength for the argon 3p⁶-3p⁵ 4s transition: Correlation and exchange effects on the characteristic minimum

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Zhifan [Center for Theoretical Studies of Physical Systems, and Department of Physics, Clark Atlanta University, Atlanta, Georgia 30314 (United States); Msezane, Alfred Z. [Center for Theoretical Studies of Physical Systems, and Department of Physics, Clark Atlanta University, Atlanta, Georgia 30314 (United States); Amusia, M. Ya. [Racah Institute of Physics, The Hebrew University of Jerusalem, Jerusalem 91904, (Israel)

    1999-12-01

    We have investigated the generalized oscillator strength (GOS) for a transition of the type np → (n+1)s, where n is the principal quantum number of the outermost filled shell of the atomic ground state, using the random-phase approximation with exchange. We find that the influence of correlation and exchange effects on the position of the characteristic minimum in the GOS of Ar(n=3) is insignificant. Also, our first Born approximation predicts the position of the minimum accurately provided that accurate target wave functions are employed. Our results are in excellent agreement with measurements and are expected to be applicable equally to the corresponding subshells of Ne(n=2), Kr(n=4), and Xe(n=5). (c) 1999 The American Physical Society.

  14. Minimum inhibitory concentration distribution in environmental Legionella spp. isolates.

    Science.gov (United States)

    Sandalakis, Vassilios; Chochlakis, Dimosthenis; Goniotakis, Ioannis; Tselentis, Yannis; Psaroulaki, Anna

    2014-12-01

    In Greece, standard tests are performed on the watering and cooling systems of hotel units, either as part of the surveillance scheme or following a human infection. The purpose of this study was to establish the minimum inhibitory concentration (MIC) distributions of environmental Legionella isolates for six antimicrobials commonly used for the treatment of Legionella infections, by MIC-test methodology. Water samples were collected from 2004 to 2011 from 124 hotels from the four prefectures of Crete (Greece). Sixty-eight (68) Legionella isolates, comprising L. pneumophila serogroups 1, 2, 3, 5, 6, 8, 12, 13, 15, L. anisa, L. rubrilucens, L. maceachernii, L. quinlivanii, L. oakridgensis, and L. taurinensis, were included in the study. MIC-tests were performed on buffered charcoal yeast extract with α-ketoglutarate, L-cysteine, and ferric pyrophosphate. The MICs were read after 2 days of incubation at 36 ± 1 °C at 2.5% CO2. A wide distribution of MICs was recorded for each species and each antibiotic tested. Rifampicin proved to be the most potent antibiotic regardless of the Legionella spp.; tetracycline appeared to have the least activity against our environmental isolates. The MIC-test approach is an easy, although not so cost-effective, way to determine MICs in Legionella spp. These data should be kept in mind especially since these Legionella species may cause human disease.

  15. Kalman Filtering for Discrete Stochastic Systems with Multiplicative Noises and Random Two-Step Sensor Delays

    Directory of Open Access Journals (Sweden)

    Dongyan Chen

    2015-01-01

    This paper is concerned with the optimal Kalman filtering problem for a class of discrete stochastic systems with multiplicative noises and random two-step sensor delays. Three Bernoulli distributed random variables with known conditional probabilities are introduced to characterize the phenomena of the random two-step sensor delays which may happen during the data transmission. By using the state augmentation approach and innovation analysis technique, an optimal Kalman filter is constructed for the augmented system in the sense of the minimum mean square error (MMSE). Subsequently, the optimal Kalman filter is derived for the corresponding augmented system at the initial instants. Finally, a simulation example is provided to demonstrate the feasibility and effectiveness of the proposed filtering method.
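As context, the standard (no-delay) Kalman recursion that the paper's augmented-state filter extends can be sketched as follows; the scalar example values are illustrative assumptions, not the paper's simulation:

```python
import numpy as np

def kalman_filter(zs, A, C, Q, R, x0, P0):
    """Standard Kalman filter recursion for x' = A x + w, z = C x + v,
    with process noise covariance Q and measurement noise covariance R.
    Returns the MMSE state estimate after each measurement update."""
    x, P = x0, P0
    estimates = []
    for z in zs:
        # Predict.
        x = A @ x
        P = A @ P @ A.T + Q
        # Update: Kalman gain and MMSE correction.
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
        x = x + K @ (z - C @ x)
        P = (np.eye(len(x0)) - K @ C) @ P
        estimates.append(x.copy())
    return estimates

# Illustrative scalar example: estimate a constant level of 5.0
# from noisy measurements.
rng = np.random.default_rng(0)
zs = [np.array([5.0 + rng.normal(0.0, 1.0)]) for _ in range(200)]
estimates = kalman_filter(zs,
                          A=np.array([[1.0]]), C=np.array([[1.0]]),
                          Q=np.array([[1e-4]]), R=np.array([[1.0]]),
                          x0=np.array([0.0]), P0=np.array([[1.0]]))
```

The paper's contribution lies in augmenting the state so that this same MMSE recursion copes with multiplicative noises and Bernoulli-distributed two-step delays, which the plain recursion above does not handle.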

  16. An algorithm for seeking the optimum value of a function: 'random' method; Un algorithme de recherche de l'optimum d'une fonction: la methode random

    Energy Technology Data Exchange (ETDEWEB)

    Guais, J C [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

    After a brief survey of classical techniques for static optimization, we present a random seeking method for any function of an arbitrary number of variables, with constraints. The resulting program is shown and illustrated by some examples. The comparison with classical methods points out the advantages of Random in some cases where analytic procedures fail or require too much calculation time. (author) [French] After a quick review of current static-optimization procedures, we describe a method of random search for the minimum (or maximum) of an arbitrary function, defined on a theoretically unlimited number of independent parameters, with constraints. The resulting program is presented. It is illustrated by a few simple examples and compared with classical optimization methods; this shows in particular that the RANDOM program allows an easy search for extrema in certain cases where other programs do not lead to satisfactory solutions or require prohibitive computation time. (author)
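The random-search idea summarized in the abstract can be sketched as follows; this is a generic illustration with a made-up objective and constraint, not the RANDOM program itself:

```python
import random

def random_search(f, bounds, feasible=lambda x: True, n_iter=5000, rng=random):
    """Pure random search: sample points uniformly inside the box
    `bounds`, discard any that violate the constraint `feasible`,
    and keep the best objective value seen. No gradients are needed,
    which is the appeal of the method where analytic procedures fail."""
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if not feasible(x):
            continue
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Minimize a 2-D quadratic subject to the constraint x + y >= 1.
x, fx = random_search(
    lambda p: (p[0] - 1) ** 2 + (p[1] - 2) ** 2,
    bounds=[(-5, 5), (-5, 5)],
    feasible=lambda p: p[0] + p[1] >= 1,
)
```

The trade-off is exactly the one the abstract names: the method is robust and trivially handles constraints, but convergence is slow compared with analytic methods when those apply.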

  17. 29 CFR 510.22 - Industries eligible for minimum wage phase-in.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Industries eligible for minimum wage phase-in. 510.22... REGULATIONS IMPLEMENTATION OF THE MINIMUM WAGE PROVISIONS OF THE 1989 AMENDMENTS TO THE FAIR LABOR STANDARDS ACT IN PUERTO RICO Classification of Industries § 510.22 Industries eligible for minimum wage phase-in...

  18. 13 CFR 107.830 - Minimum duration/term of financing.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Minimum duration/term of financing... INVESTMENT COMPANIES Financing of Small Businesses by Licensees Structuring Licensee's Financing of An Eligible Small Business: Terms and Conditions of Financing § 107.830 Minimum duration/term of financing. (a...

  19. 42 CFR 84.117 - Gas mask containers; minimum requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Gas mask containers; minimum requirements. 84.117... SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Gas Masks § 84.117 Gas mask containers; minimum requirements. (a) Gas masks shall be equipped with a substantial...

  20. Do Minimum Wages in Latin America and the Caribbean Matter? Evidence from 19 Countries

    DEFF Research Database (Denmark)

    Kristensen, Nicolai; Cunningham, Wendy

    of if and how minimum wages affect wage distributions in LAC countries. Although there is no single minimum wage institution in the LAC region, we find regional trends. Minimum wages affect the wage distribution in both the formal and, especially, the informal sector, both at the minimum wage and at multiples...... of the minimum. The minimum does not uniformly benefit low-wage workers: in countries where the minimum wage is relatively low compared to mean wages, the minimum wage affects the more disadvantaged segments of the labor force, namely informal sector workers, women, young and older workers, and the low skilled...

  1. Random breath testing in Australia: getting it to work according to specifications.

    Science.gov (United States)

    Homel, R

    1993-01-01

    After reading the deterrence literature, particularly the work of H. Laurence Ross, I concluded in the late 1970s that many road accidents could be prevented through the wholehearted implementation of random breath testing (RBT). RBT is a system of drink-drive law enforcement which aims to increase the perceived likelihood of apprehension through the use of mass breath testing techniques at roadblocks which are highly visible, are unpredictable in their locations and give the impression of ubiquity. As the result of public pressure, RBT was introduced in NSW in December 1982, with spectacular results. The law was intensively enforced and extensively advertised, partly due to the advocacy of researchers such as myself, but also because there was an acute political need for instant results. Since RBT is a difficult enforcement technique for police to sustain in effective form, researchers must strive to improve their understanding of what works, and remain in close contact with police, policy makers and politicians. Although this process is costly in terms of time and, possibly, academic 'pay-off', it is essential if the fragile understanding of deterrence principles amongst these groups is not to lead to superficially attractive, but probably ineffective techniques such as low visibility mobile RBT.

  2. Quantifiers for randomness of chaotic pseudo-random number generators.

    Science.gov (United States)

    De Micco, L; Larrondo, H A; Plastino, A; Rosso, O A

    2009-08-28

    We deal with randomness quantifiers and concentrate on their ability to discern the hallmark of chaos in time series used in connection with pseudo-random number generators (PRNGs). Workers in the field are motivated to use chaotic maps for generating PRNGs because of the simplicity of their implementation. Although there exist very efficient general-purpose benchmarks for testing PRNGs, we feel that the analysis provided here sheds additional didactic light on the importance of the main statistical characteristics of a chaotic map, namely (i) its invariant measure and (ii) the mixing constant. This is of help in answering two questions that arise in applications: (i) which is the best PRNG among the available ones? and (ii) if a given PRNG turns out not to be good enough and a randomization procedure must still be applied to it, which is the best applicable randomization procedure? Our answer provides a comparative analysis of several quantifiers advanced in the extant literature.
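As a concrete illustration of the two statistical ingredients named above, the fully chaotic logistic map has a known non-uniform invariant measure that a simple change of variables renders uniform, and skipping iterates is one elementary randomization procedure. This is a generic sketch, not one of the generators analyzed in the paper:

```python
import math

def logistic_prng(seed, n, skip=1):
    """Iterate the fully chaotic logistic map x -> 4x(1-x) and pass each
    state through y = (2/pi) * asin(sqrt(x)), which transforms the map's
    non-uniform invariant density 1/(pi*sqrt(x(1-x))) into a uniform one.
    `skip` discards intermediate iterates, a simple randomization that
    reduces the serial correlation of successive outputs."""
    x = seed
    out = []
    for _ in range(n):
        for _ in range(skip):
            x = 4.0 * x * (1.0 - x)
        out.append(2.0 / math.pi * math.asin(math.sqrt(x)))
    return out

vals = logistic_prng(0.123456, 20000, skip=3)
```

Without the arcsine transformation the raw iterates cluster near 0 and 1 (the invariant measure), which is exactly why knowing that measure matters when turning a chaotic map into a PRNG.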

  3. 29 CFR 552.100 - Application of minimum wage and overtime provisions.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Application of minimum wage and overtime provisions. 552... § 552.100 Application of minimum wage and overtime provisions. (a)(1) Domestic service employees must receive for employment in any household a minimum wage of not less than that required by section 6(a) of...

  4. Minimum alveolar concentration of isoflurane in green iguanas and the effect of butorphanol on minimum alveolar concentration.

    Science.gov (United States)

    Mosley, Craig A E; Dyson, Doris; Smith, Dale A

    2003-06-01

    To determine minimum alveolar concentration (MAC) of isoflurane in green iguanas and effects of butorphanol on MAC. Prospective randomized trial. 10 healthy mature iguanas. In each iguana, MAC was measured 3 times: twice after induction of anesthesia with isoflurane and once after induction of anesthesia with isoflurane and IM administration of butorphanol (1 mg/kg [0.45 mg/lb]). A blood sample was collected from the tail vein for blood-gas analysis at the beginning and end of the anesthetic period. The MAC was determined with a standard bracketing technique; an electrical current was used as the supramaximal stimulus. Animals were artificially ventilated with a ventilator set to deliver a tidal volume of 30 mL/kg (14 mL/lb) at a rate of 4 breaths/min. Mean +/- SD MAC values during the 3 trials (2 without and 1 with butorphanol) were 2.0 +/- 0.6, 2.1 +/- 0.6, and 1.7 +/- 0.7%, respectively, which were not significantly different from each other. Heart rate and end-tidal partial pressure of CO2 were also not significantly different among the 3 trials. Mean +/- SD heart rate was 48 +/- 10 beats/min; mean end-tidal partial pressure of CO2 was 22 +/- 10 mm Hg. There were no significant differences in blood-gas values for samples obtained at the beginning versus the end of the anesthetic period. Results suggest that the MAC of isoflurane in green iguanas is 2.1% and that butorphanol does not have any significant isoflurane-sparing effects.

  5. Distinguishing mixed quantum states: Minimum-error discrimination versus optimum unambiguous discrimination

    International Nuclear Information System (INIS)

    Herzog, Ulrike; Bergou, Janos A.

    2004-01-01

    We consider two different optimized measurement strategies for the discrimination of nonorthogonal quantum states. The first is ambiguous discrimination with a minimum probability of inferring an erroneous result, and the second is unambiguous, i.e., error-free, discrimination with a minimum probability of getting an inconclusive outcome, where the measurement fails to give a definite answer. For distinguishing between two mixed quantum states, we investigate the relation between the minimum-error probability achievable in ambiguous discrimination, and the minimum failure probability that can be reached in unambiguous discrimination of the same two states. The latter turns out to be at least twice as large as the former for any two given states. As an example, we treat the case where the state of the quantum system is known to be, with arbitrary prior probability, either a given pure state, or a uniform statistical mixture of any number of mutually orthogonal states. For this case we derive an analytical result for the minimum probability of error and perform a quantitative comparison with the minimum failure probability
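In standard notation, the minimum-error probability for two mixed states ρ₁ and ρ₂ with prior probabilities η₁ and η₂ is given by the Helstrom formula, and the twofold relation stated above reads:

```latex
P_E^{\min} = \frac{1}{2}\left(1 - \left\lVert \eta_1\rho_1 - \eta_2\rho_2 \right\rVert_1\right),
\qquad
Q_F^{\min} \ge 2\,P_E^{\min},
```

where ‖A‖₁ = Tr√(A†A) is the trace norm and Q_F denotes the failure (inconclusive-outcome) probability of unambiguous discrimination.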

  6. Minimum alcohol pricing policies in practice: A critical examination of implementation in Canada.

    Science.gov (United States)

    Thompson, Kara; Stockwell, Tim; Wettlaufer, Ashley; Giesbrecht, Norman; Thomas, Gerald

    2017-02-01

    There is global interest in using Minimum Unit Pricing (MUP) of alcohol to promote public health. Canada is the only country to have both implemented and evaluated some forms of minimum alcohol prices, albeit in ways that fall short of MUP. To inform these international debates, we describe the degree to which minimum alcohol prices in Canada meet recommended criteria for being an effective public health policy. We collected data on the implementation of minimum pricing with respect to (1) breadth of application, (2) indexation to inflation and (3) adjustments for alcohol content. Some jurisdictions have implemented recommended practices with respect to minimum prices; however, the harm-reduction potential of minimum pricing is not fully realised due to incomplete implementation. Key concerns include the following: (1) the exclusion of several beverage categories from minimum pricing, (2) minimum prices below the recommended minima and (3) irregular adjustment of prices for inflation or alcohol content. We provide recommendations for best practices when implementing minimum pricing policy.

  7. 19 CFR 144.33 - Minimum quantities to be withdrawn.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Minimum quantities to be withdrawn. 144.33 Section 144.33 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT... Warehouse § 144.33 Minimum quantities to be withdrawn. Unless by special authority of the Commissioner of...

  8. Testing the Causal Direction of Mediation Effects in Randomized Intervention Studies.

    Science.gov (United States)

    Wiedermann, Wolfgang; Li, Xintong; von Eye, Alexander

    2018-05-21

    In a recent update of the standards for evidence in research on prevention interventions, the Society of Prevention Research emphasizes the importance of evaluating and testing the causal mechanism through which an intervention is expected to have an effect on an outcome. Mediation analysis is commonly applied to study such causal processes. However, these analytic tools are limited in their potential to fully understand the role of theorized mediators. For example, in a design where the treatment x is randomized and the mediator (m) and the outcome (y) are measured cross-sectionally, the causal direction of the hypothesized mediator-outcome relation is not uniquely identified. That is, both mediation models, x → m → y or x → y → m, may be plausible candidates to describe the underlying intervention theory. As a third explanation, unobserved confounders can still be responsible for the mediator-outcome association. The present study introduces principles of direction dependence which can be used to empirically evaluate these competing explanatory theories. We show that, under certain conditions, third-order moments of variables (i.e., skewness and co-skewness) can be used to uniquely identify the direction of a mediator-outcome relation. Significance procedures compatible with direction dependence are introduced and results of a simulation study are reported that demonstrate the performance of the tests. An empirical example is given for illustrative purposes and a software implementation of the proposed method is provided in SPSS.
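
    The skewness-based identification idea can be illustrated with a toy simulation: for a linear relation with normal error, the outcome's skewness is attenuated by the cubed correlation relative to the cause's, so the causally downstream variable is the less-skewed one. A hedged sketch of this single criterion (simplified; the paper's significance procedures and co-skewness tests are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)

def skewness(v):
    # Sample skewness of a 1-D array.
    z = (v - v.mean()) / v.std()
    return float((z**3).mean())

# Simulate the "true" model m -> y: a skewed mediator, a normal error.
n = 200_000
m = rng.exponential(1.0, n)              # skewed cause (skewness ~ 2)
y = 0.6 * m + rng.normal(0.0, 1.0, n)    # outcome: linear effect + noise

# For standardized variables, skew(y) ~= rho**3 * skew(m) under m -> y,
# so the downstream variable carries strictly less absolute skewness.
assert abs(skewness(y)) < abs(skewness(m))
```

    Reversing the roles of m and y would flip the inequality, which is what makes the asymmetry usable as a direction criterion.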

  9. A Minimum Spanning Tree Representation of Anime Similarities

    OpenAIRE

    Wibowo, Canggih Puspo

    2016-01-01

    In this work, a new way to represent Japanese animation (anime) is presented. We applied a minimum spanning tree to show the relations between anime. The distance between anime is calculated through three similarity measurements, namely crew, score histogram, and topic similarities. Finally, the centralities are also computed to reveal the most significant anime. The result shows that the minimum spanning tree can be used to determine anime similarity. Furthermore, by using centralities c...
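
    The representation itself reduces to a standard minimum spanning tree over a pairwise distance matrix. A minimal Kruskal sketch on a hypothetical five-title distance table (the distances are illustrative, not the paper's crew/score/topic data):

```python
# Kruskal's MST over a toy symmetric distance table between five
# hypothetical titles; smaller distance = more similar.
titles = ["A", "B", "C", "D", "E"]
dist = {
    ("A", "B"): 0.2, ("A", "C"): 0.9, ("A", "D"): 0.7, ("A", "E"): 0.8,
    ("B", "C"): 0.4, ("B", "D"): 0.6, ("B", "E"): 0.9,
    ("C", "D"): 0.3, ("C", "E"): 0.5,
    ("D", "E"): 0.1,
}

parent = {t: t for t in titles}

def find(t):
    # Union-find root lookup with path compression.
    while parent[t] != t:
        parent[t] = parent[parent[t]]
        t = parent[t]
    return t

mst = []
for (u, v), w in sorted(dist.items(), key=lambda kv: kv[1]):
    ru, rv = find(u), find(v)
    if ru != rv:            # edge joins two components: no cycle
        parent[ru] = rv
        mst.append((u, v, w))

# A spanning tree on n nodes always has n - 1 edges.
assert len(mst) == len(titles) - 1
```

    Centralities (e.g., degree within the tree) can then be read off the resulting edge list to flag the most connected titles.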

  10. Minimum Viable Product and the Importance of Experimentation in Technology Startups

    Directory of Open Access Journals (Sweden)

    Dobrila Rancic Moogk

    2012-03-01

    Full Text Available Entrepreneurs are often faced with limited resources in their quest to commercialize new technology. This article presents the model of a lean startup, which can be applied to an organization regardless of its size or environment. It also emphasizes the conditions of extreme uncertainty under which the commercialization of new technology is carried out. The lean startup philosophy advocates efficient use of resources by introducing a minimum viable product to the market as soon as possible in order to test its value and the entrepreneur’s growth projections. This testing is done by running experiments that examine the metrics relevant to three distinct types of growth. These experiments bring about accelerated learning to help reduce the uncertainty that accompanies commercialization projects, thereby bringing the resulting new technology to market faster.

  11. Start of Eta Car's X-ray Minimum

    Science.gov (United States)

    Corcoran, Michael F.; Liburd, Jamar; Hamaguchi, Kenji; Gull, Theodore; Madura, Thomas; Teodoro, Mairan; Moffat, Anthony; Richardson, Noel; Russell, Chris; Pollock, Andrew

    2014-01-01

    Analysis of Eta Car's X-ray spectrum in the 2-10 keV band using quicklook data from the X-Ray Telescope on Swift shows that the flux on July 30, 2014 was 4.9 ± 2.0 × 10^-12 ergs s^-1 cm^-2. This flux is nearly equal to the X-ray minimum flux seen by RXTE in 2009, 2003.5, and 1998, and indicates that Eta Car has reached its X-ray minimum, as expected based on the 2024-day period derived from previous 2-10 keV observations with RXTE.

  12. The Impact Of Minimum Wage On Employment Level And ...

    African Journals Online (AJOL)

    This research work has been carried out to analyze the critical impact of the minimum wage on employment level and productivity in Nigeria. A brief literature on wages and their determination was highlighted. Models of minimum wage effects are also looked into. This includes research work done by different economists analyzing it ...

  13. 30 CFR 77.606-1 - Rubber gloves; minimum requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Rubber gloves; minimum requirements. 77.606-1... COAL MINES Trailing Cables § 77.606-1 Rubber gloves; minimum requirements. (a) Rubber gloves (lineman's gloves) worn while handling high-voltage trailing cables shall be rated at least 20,000 volts and shall...

  14. Do minimum wages reduce poverty? Evidence from Central America ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    In all three countries, these multiple minimum wages are negotiated among representatives of the central government, labour unions and the chambers of commerce. Minimum wage legislation applies to all private-sector employees, but in all three countries a large part of the work force is self-employed or works as unpaid ...

  15. The Minimum Wage, Restaurant Prices, and Labor Market Structure

    Science.gov (United States)

    Aaronson, Daniel; French, Eric; MacDonald, James

    2008-01-01

    Using store-level and aggregated Consumer Price Index data, we show that restaurant prices rise in response to minimum wage increases under several sources of identifying variation. We introduce a general model of employment determination that implies minimum wage hikes cause prices to rise in competitive labor markets but potentially fall in…

  16. Program pseudo-random number generator for microcomputers

    International Nuclear Information System (INIS)

    Ososkov, G.A.

    1980-01-01

    Program pseudo-random number generators (PNG) intended for the testing of control equipment and communication channels are considered. In the case of 8-bit microcomputers it is necessary to assign 4 words of storage to allocate one random number. The proposed economical algorithms of random number generation are based on the idea of "mixing" quarters of the preceding random number to obtain the next one. Test results of the PNG are displayed for two such generators. A FORTRAN variant of the PNG is presented along with a program realizing the PNG made on the basis of the INTEL-8080 autocode
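
    The "mixing of quarters" idea can be sketched in a few lines: a 32-bit random value is held as four 8-bit quarters on an 8-bit machine, and the next value is built by shuffling and combining those quarters. This is an illustrative toy in the spirit of the abstract, not Ososkov's published algorithm:

```python
# Toy reconstruction of the "mixing" idea: the next 32-bit value is
# formed by permuting the four 8-bit quarters of the previous value,
# then xoring with a shift and adding an odd constant.
MASK = 0xFFFFFFFF

def next_value(x):
    b = [(x >> s) & 0xFF for s in (24, 16, 8, 0)]             # quarters
    mixed = (b[1] << 24) | (b[3] << 16) | (b[0] << 8) | b[2]  # shuffle
    return ((mixed ^ (x >> 3)) + 0x9E3779B9) & MASK

# Crude sanity check: the low byte should not get stuck on one value.
seq, x = [], 1
for _ in range(1000):
    x = next_value(x)
    seq.append(x & 0xFF)
assert len(set(seq)) > 1
```

    A real generator of this kind would be validated with statistical tests and a measured cycle length, which this sketch does not attempt.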

  17. Thermal-hydraulic study of fixed bed nuclear reactor (FBNR), in FCC, BCC and pseudo-random configurations of the core through CFD method

    International Nuclear Information System (INIS)

    Luna, M.; Chavez, I.; Cajas, D.; Santos, R.

    2015-01-01

    The thermal-hydraulic performance of a fixed bed nuclear reactor (FBNR) core and the effect of porosity were studied by the CFD method with 'SolidWorks' software. The representative sections of three different packed bed arrangements were analyzed: face-centered cubic (FCC), body-centered cubic (BCC), and pseudo-random, with porosity values of 0.28, 0.33 and 0.53 respectively. The minimum coolant flow required to avoid phase change in each of the configurations was determined. The results show that the heat transfer rate increases when the porosity value decreases, and consequently so does the minimum coolant flow in each configuration. The minimum coolant flows were: 728.51 kg/s for the FCC structure, 372.72 kg/s for the BCC, and 304.96 kg/s for the pseudo-random. Meanwhile, the heat transfer coefficients in each packed bed were 6480 W/(m²·K), 3718 W/(m²·K) and 3042 W/(m²·K) respectively. Finally the pressure drop was calculated, and the results were 0.588 MPa for the FCC configuration, 0.033 MPa for the BCC and 0.017 MPa for the pseudo-random one. This means that with a higher porosity, the fluid circulates more easily because there are fewer obstacles to cross, so there are fewer energy losses. (authors)

  18. A comparison between the original and Tablet-based Symbol Digit Modalities Test in patients with schizophrenia: Test-retest agreement, random measurement error, practice effect, and ecological validity.

    Science.gov (United States)

    Tang, Shih-Fen; Chen, I-Hui; Chiang, Hsin-Yu; Wu, Chien-Te; Hsueh, I-Ping; Yu, Wan-Hui; Hsieh, Ching-Lin

    2017-11-27

    We aimed to compare the test-retest agreement, random measurement error, practice effect, and ecological validity of the original and Tablet-based Symbol Digit Modalities Test (T-SDMT) over five serial assessments, and to examine the concurrent validity of the T-SDMT in patients with schizophrenia. Sixty patients with chronic schizophrenia completed five serial assessments (one week apart) of the SDMT and T-SDMT and one assessment of the Activities of Daily Living Rating Scale III at the first time point. Both measures showed high test-retest agreement and similar levels of random measurement error over the five serial assessments. Moreover, the practice effects of the two measures did not reach a plateau phase after five serial assessments in young and middle-aged participants. Nevertheless, only the practice effect of the T-SDMT became trivial after the first assessment. Like the SDMT, the T-SDMT had good ecological validity. The T-SDMT also had good concurrent validity with the SDMT. In addition, only the T-SDMT had discriminative validity to discriminate processing speed in young and middle-aged participants. Compared to the SDMT, the T-SDMT had overall slightly better psychometric properties, so it can be an alternative measure to the SDMT for assessing processing speed in patients with schizophrenia. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Science.gov (United States)

    Shen, Lujun; Yang, Lei; Zhang, Jing; Zhang, Meng

    2018-01-01

    To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts. Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students' writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones. Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  20. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Directory of Open Access Journals (Sweden)

    Lujun Shen

    Full Text Available To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts. Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students' writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones. Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  1. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples

    Science.gov (United States)

    Zhang, Jing; Zhang, Meng

    2018-01-01

    Purpose To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. Methods The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants’ writing manuscripts. Results Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students’ writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days’ manuscripts and the last 10 days’ ones. Conclusions Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study. PMID:29401473

  2. Construction of Protograph LDPC Codes with Linear Minimum Distance

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Sam; Jones, Christopher

    2006-01-01

    A construction method for protograph-based LDPC codes that simultaneously achieve low iterative decoding threshold and linear minimum distance is proposed. We start with a high-rate protograph LDPC code with variable node degrees of at least 3. Lower rate codes are obtained by splitting check nodes and connecting them by degree-2 nodes. This guarantees the linear minimum distance property for the lower-rate codes. Excluding checks connected to degree-1 nodes, we show that the number of degree-2 nodes should be at most one less than the number of checks for the protograph LDPC code to have linear minimum distance. Iterative decoding thresholds are obtained by using the reciprocal channel approximation. Thresholds are lowered by using either precoding or at least one very high-degree node in the base protograph. A family of high- to low-rate codes with minimum distance linearly increasing in block size and with capacity-approaching performance thresholds is presented. FPGA simulation results for a few example codes show that the proposed codes perform as predicted.
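
    The rate effect of the check-splitting step is simple bookkeeping: each split adds one check node plus one connecting degree-2 variable node, so the design rate (n - c)/n of the protograph drops step by step. A sketch with a hypothetical rate-3/4 base protograph (node counts are illustrative, not the codes in the abstract):

```python
# Design-rate bookkeeping for the check-splitting construction:
# splitting a check node and joining the two halves with a new
# degree-2 variable node adds one variable and one check.

def design_rate(n_vars, n_checks):
    return (n_vars - n_checks) / n_vars

n, c = 4, 1                    # hypothetical rate-3/4 base protograph
rates = [design_rate(n, c)]
for _ in range(3):             # three successive check splits
    n, c = n + 1, c + 1
    rates.append(design_rate(n, c))

# Each split lowers the rate: 3/4 -> 3/5 -> 3/6 -> 3/7
assert rates == [3/4, 3/5, 3/6, 3/7]
```

    The linear-minimum-distance guarantee itself depends on the degree-2 node count staying below the check count, which this arithmetic alone does not demonstrate.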

  3. Quantitative Research on the Minimum Wage

    Science.gov (United States)

    Goldfarb, Robert S.

    1975-01-01

    The article reviews recent research examining the impact of minimum wage requirements on the size and distribution of teenage employment and earnings. The studies measure income distribution, employment levels and effect on unemployment. (MW)

  4. Application of Minimum-time Optimal Control System in Buck-Boost Bi-linear Converters

    Directory of Open Access Journals (Sweden)

    S. M. M. Shariatmadar

    2017-08-01

    Full Text Available In this study, the theory of minimum-time optimal control of buck-boost bi-linear converters is described, so that output voltage regulation is carried out within minimum time. For this purpose, the Pontryagin's Minimum Principle is applied to find the optimal switching level under minimum-time optimal control rules. The results revealed that by utilizing an optimal switching level instead of classical switching patterns, output voltage regulation is carried out within minimum time, and the transient energy index of overvoltage is significantly reduced when minimum-time optimal control is applied at reduced output load. Laboratory results were used to verify the numerical simulations.

  5. Disk Density Tuning of a Maximal Random Packing.

    Science.gov (United States)

    Ebeida, Mohamed S; Rushdi, Ahmad A; Awad, Muhammad A; Mahmoud, Ahmed H; Yan, Dong-Ming; English, Shawn A; Owens, John D; Bajaj, Chandrajit L; Mitchell, Scott A

    2016-08-01

    We introduce an algorithmic framework for tuning the spatial density of disks in a maximal random packing, without changing the sizing function or radii of disks. Starting from any maximal random packing such as a Maximal Poisson-disk Sampling (MPS), we iteratively relocate, inject (add), or eject (remove) disks, using a set of three successively more-aggressive local operations. We may achieve a user-defined density, either more dense or more sparse, almost up to the theoretical structured limits. The tuned samples are conflict-free, retain coverage maximality, and, except in the extremes, retain the blue noise randomness properties of the input. We change the density of the packing one disk at a time, maintaining the minimum disk separation distance and the maximum domain coverage distance required of any maximal packing. These properties are local, and we can handle spatially-varying sizing functions. Using fewer points to satisfy a sizing function improves the efficiency of some applications. We apply the framework to improve the quality of meshes, removing non-obtuse angles; and to more accurately model fiber reinforced polymers for elastic and failure simulations.
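
    The input to such a tuning framework, a maximal random packing, can be approximated with a simple dart-throwing loop: accept a candidate disk center only if it keeps the minimum separation r from all accepted centers, and stop after a long run of rejections. A hedged stand-in for true MPS (the paper's relocate/inject/eject operations are not reproduced):

```python
import random

random.seed(1)
r = 0.1          # constant sizing function: minimum disk separation
pts = []

def far_enough(p):
    # Conflict-free test: candidate keeps distance >= r from every disk.
    return all((p[0] - q[0])**2 + (p[1] - q[1])**2 >= r * r for q in pts)

misses = 0
while misses < 5000:     # long rejection run ~ approximate maximality
    c = (random.random(), random.random())
    if far_enough(c):
        pts.append(c)
        misses = 0
    else:
        misses += 1

# By construction the packing is conflict-free.
assert len(pts) > 2
min_d2 = min((p[0] - q[0])**2 + (p[1] - q[1])**2
             for i, p in enumerate(pts) for q in pts[i + 1:])
assert min_d2 >= r * r
```

    Density tuning in the paper's sense would then add or remove disks one at a time while preserving exactly these separation and coverage invariants.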

  6. On the Minimum Cable Tensions for the Cable-Based Parallel Robots

    Directory of Open Access Journals (Sweden)

    Peng Liu

    2014-01-01

    Full Text Available This paper investigates the minimum cable tension distributions in the workspace for cable-based parallel robots to find out more information on their stability. First, the kinematic model of a cable-based parallel robot is derived based on the wrench matrix. Then, a noniterative polynomial-based optimization algorithm with a proper optimal objective function is presented based on convex optimization theory, in which the minimum cable tension at any pose is determined. Additionally, three performance indices are proposed to show the distributions of the minimum cable tensions in a specified region of the workspace. Importantly, the three performance indices can be used to evaluate the stability of cable-based parallel robots. Furthermore, a new workspace, the Specified Minimum Cable Tension Workspace (SMCTW), is introduced, within which all the minimum tensions exceed a specified value, therefore meeting the specified stability requirement. Finally, a camera robot driven in parallel by four cables for aerial panoramic photographing is selected to illustrate the distributions of the minimum cable tensions in the workspace and the relationship between the three performance indices and the stability.

  7. Energy and environmental norms on Minimum Vital Flux

    International Nuclear Information System (INIS)

    Maran, S.

    2008-01-01

    By the end of the year the recommendations on Minimum Vital Flux will come into force, and operators of hydroelectric power plants will be required to make available part of the water of their derivations in order to protect river ecosystems. In this article we discuss the major energy and environmental consequences of these rules, report some quantitative evaluations, and discuss proposals for overcoming the weaknesses of the approach used to estimate the Minimum Vital Flux [it]

  8. MINIMUM BRACING STIFFNESS FOR MULTI-COLUMN SYSTEMS: THEORY

    OpenAIRE

    ARISTIZÁBAL-OCHOA, J. DARÍO

    2011-01-01

    A method that determines the minimum bracing stiffness required by a multi-column elastic system to achieve non-sway buckling conditions is proposed. Equations that evaluate the required minimum stiffness of the lateral and torsional bracings and the corresponding "braced" critical buckling load for each column of the story level are derived using the modified stability functions. The following effects are included: 1) the types of end connections (rigid, semirigid, and simple); 2) the bluepr...

  9. Towards a mathematical foundation of minimum-variance theory

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianfeng [COGS, Sussex University, Brighton (United Kingdom); Zhang Kewei [SMS, Sussex University, Brighton (United Kingdom); Wei Gang [Mathematical Department, Baptist University, Hong Kong (China)

    2002-08-30

    The minimum-variance theory which accounts for arm and eye movements with noise signal inputs was proposed by Harris and Wolpert (1998 Nature 394 780-4). Here we present a detailed theoretical analysis of the theory and analytical solutions of the theory are obtained. Furthermore, we propose a new version of the minimum-variance theory, which is more realistic for a biological system. For the new version we show numerically that the variance is considerably reduced. (author)

  10. Minimum Wage Policy and Country’s Technical Efficiency

    OpenAIRE

    Karim, Mohd Zaini Abd; Chan, Sok-Gee; Hassan, Sallahuddin

    2016-01-01

    Recently, the government has decided that Malaysia would introduce a minimum wage policy. However, some quarters argued against the idea of a nationwide minimum wage asserting that it will lead to an increase in the cost of doing business and thus will hurt Malaysian competitiveness. Although standard economic theory unambiguously implies that wage floors have a negative impact on employment, the existing empirical literature is not so clear. Some studies have found the expected negative impa...

  11. Anesthesiologists' perceptions of minimum acceptable work habits of nurse anesthetists.

    Science.gov (United States)

    Logvinov, Ilana I; Dexter, Franklin; Hindman, Bradley J; Brull, Sorin J

    2017-05-01

    Work habits are non-technical skills that are an important part of job performance. Although non-technical skills are usually evaluated on a relative basis (i.e., "grading on a curve"), the validity of evaluation on an absolute basis (i.e., a "minimum passing score") needs to be determined. Survey and observational study. None. None. The theme of "work habits" was assessed using a modification of Dannefer et al.'s 6-item scale, with scores ranging from 1 (lowest performance) to 5 (highest performance). E-mail invitations were sent to all consultant and fellow anesthesiologists at Mayo Clinic in Florida, Arizona, and Minnesota. Because work habits expectations can be generational, the survey was designed for adjustment based on all invited (responding or non-responding) anesthesiologists' year of graduation from residency. The overall mean ± standard deviation of the score for anesthesiologists' minimum expectations of nurse anesthetists' work habits was 3.64 ± 0.66 (N=48). Minimum acceptable scores were correlated with the year of graduation from anesthesia residency (linear regression P=0.004). Adjusting for survey non-response using all N=207 anesthesiologists, the mean of the minimum acceptable work habits score adjusted for year of graduation was 3.69 (standard error 0.02). The minimum expectations for nurse anesthetists' work habits were compared with observational data obtained from the University of Iowa. Among 8940 individual nurse anesthetist work habits scores, only 2.6% were below 3.69; observed work habits scores were significantly greater than the Mayo estimate (3.69) for the minimum expectations. Work habits of nurse anesthetists within departments therefore should not be compared with an appropriate minimum score (i.e., 3.69). Instead, work habits scores should be analyzed based on relative reporting among anesthetists. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Fermat and the Minimum Principle

    Indian Academy of Sciences (India)

    Arguably, least action and minimum principles were offered or applied much earlier. These principles are among the fundamental, basic, unifying, or organizing principles used to describe a variety of natural phenomena. They consider the amount of energy expended in performing a given action to be the least required ...

  13. Second Law Analysis of the Optimal Fin by Minimum Entropy Generation

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Based on the entropy generation concept of thermodynamics, this paper established a general theoretical model for the analysis of entropy generation to optimize fins, in which the minimum entropy generation was selected as the objective to be studied. The irreversibility due to heat transfer and friction was taken into account, so that the minimum entropy generation number was analyzed with respect to the second law of thermodynamics in forced cross-flow. The optimum dimensions of cylindrical pins were discussed. It was found that the minimum entropy generation number depends on parameters related to the fluid and fin physical parameters. Variations of the minimum entropy generation number with different parameters were analyzed.

  14. The Effect of Minimum Wages on Adolescent Fertility: A Nationwide Analysis.

    Science.gov (United States)

    Bullinger, Lindsey Rose

    2017-03-01

    To investigate the effect of minimum wage laws on adolescent birth rates in the United States. I used a difference-in-differences approach and vital statistics data measured quarterly at the state level from 2003 to 2014. All models included state covariates, state and quarter-year fixed effects, and state-specific quarter-year nonlinear time trends, which provided plausibly causal estimates of the effect of minimum wage on adolescent birth rates. A $1 increase in minimum wage reduces adolescent birth rates by about 2%. The effects are driven by non-Hispanic White and Hispanic adolescents. Nationwide, increasing minimum wages by $1 would likely result in roughly 5000 fewer adolescent births annually.
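
    The difference-in-differences logic in the abstract reduces, in its simplest two-group two-period form, to one subtraction of subtractions. A toy sketch with made-up birth rates (synthetic numbers, not the vital statistics data; the paper's version adds state covariates, fixed effects, and nonlinear trends):

```python
# Birth rates per 1,000 adolescents, before/after a minimum wage increase.
treated_pre, treated_post = 30.0, 27.0   # state that raised its minimum wage
control_pre, control_post = 32.0, 31.0   # comparison state, no change

# DiD: change in treated minus change in control removes common trends.
did = (treated_post - treated_pre) - (control_post - control_pre)
assert did == -2.0   # policy associated with 2 fewer births per 1,000
```

    The causal reading rests on the parallel-trends assumption: absent the policy, both states' birth rates would have moved together.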

  15. The debate on the economic effects of minimum wage legislation

    Directory of Open Access Journals (Sweden)

    Santos Miguel Ruesga-Benito

    2017-12-01

    Full Text Available The minimum wage establishment has its origin in the first third of the last century. Since its creation has been a focus of continuing controversy and an unfinished debate on economics field. This work reviews the effects of the minimum wage on employment and other macroeconomic variables, from both theoretical and empirical perspectives. The method is based on the revision of the literature and the main economic indicators. The central contribution of this paper is providing a general reflection on theoretical and empirical analysis about the debate on minimum wage and its effects. The results showed that some labor policies are taking account the effects of austerity strategies, shifting the attention towards the implementation of minimum wages or their updating, in order to reduce the growing inequalities in the distribution of income, and even poverty levels.

  16. The Effects of Data Gaps on the Calculated Monthly Mean Maximum and Minimum Temperatures in the Continental United States: A Spatial and Temporal Study.

    Science.gov (United States)

    Stooksbury, David E.; Idso, Craig D.; Hubbard, Kenneth G.

    1999-05-01

    Gaps in otherwise regularly scheduled observations are often referred to as missing data. This paper explores the spatial and temporal impacts that data gaps in the recorded daily maximum and minimum temperatures have on the calculated monthly mean maximum and minimum temperatures. For this analysis 138 climate stations from the United States Historical Climatology Network Daily Temperature and Precipitation Data set were selected. The selected stations had no missing maximum or minimum temperature values during the period 1951-80. The monthly mean maximum and minimum temperatures were calculated for each station for each month. For each month 1-10 consecutive days of data from each station were randomly removed. This was performed 30 times for each simulated gap period. The spatial and temporal impact of the 1-10-day data gaps were compared. The influence of data gaps is most pronounced in the continental regions during the winter and least pronounced in the southeast during the summer. In the north central plains, 10-day data gaps during January produce a standard deviation value greater than 2°C about the "true" mean. In the southeast, 10-day data gaps in July produce a standard deviation value less than 0.5°C about the mean. The results of this study will be of value in climate variability and climate trend research as well as climate assessment and impact studies.
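
    The study's design can be mimicked in a few lines: take a month of synthetic daily temperatures, delete every possible gap of g consecutive days in turn, and measure the spread of the recomputed monthly mean. A sketch with made-up winter-like data (synthetic values, not the USHCN station records):

```python
import random
import statistics

random.seed(42)
# One month of synthetic daily maximum temperatures (high winter variance).
days = [5.0 + random.gauss(0.0, 8.0) for _ in range(31)]

def gap_spread(g):
    # Std. deviation of the monthly mean over all possible g-day gaps.
    means = []
    for start in range(31 - g + 1):
        kept = days[:start] + days[start + g:]
        means.append(statistics.mean(kept))
    return statistics.pstdev(means)

# Longer gaps spread the computed mean farther about the "true" mean.
assert gap_spread(10) > gap_spread(1) > 0.0
```

    The winter/summer contrast in the paper follows the same mechanism: higher day-to-day variance makes each deleted block more influential on the monthly mean.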

  17. Do minimum wages improve early life health? Evidence from developing countries.

    Science.gov (United States)

    Majid, Muhammad Farhan; Mendoza Rodríguez, José M; Harper, Sam; Frank, John; Nandi, Arijit

    2016-06-01

    The impact of legislated minimum wages on the early-life health of children living in low- and middle-income countries has not been examined. For our analyses, we used data from the Demographic and Health Surveys (DHS) from 57 countries conducted between 1999 and 2013. Our analyses focus on height-for-age z scores (HAZ) for children under 5 years of age who were surveyed as part of the DHS. To identify the causal effect of minimum wages, we utilized plausibly exogenous variation in legislated minimum wages during each child's year of birth, the identifying assumption being that mothers do not time their births around changes in the minimum wage. As a sensitivity exercise, we also made within-family comparisons (mother fixed-effect models). Our final analysis of 49 countries reveals that a 1% increase in minimum wages was associated with a 0.1% (95% CI = -0.2, 0) decrease in HAZ scores. Adverse effects of an increase in the minimum wage were observed among girls and for children of fathers who were less than 35 years old, mothers aged 20-29, parents who were married, parents who were less educated, and parents involved in manual work. We also explored heterogeneity by region and by GDP per capita at baseline (1999). Adverse effects were concentrated in lower-income countries and were most pronounced in South Asia. By contrast, increases in the minimum wage improved children's HAZ in Latin America and among children of parents working in a skilled sector. Our findings are inconsistent with the hypothesis that increases in the minimum wage unconditionally improve child health in lower-income countries, and they highlight heterogeneity in the impact of minimum wages around the globe. Future work should involve country- and occupation-specific studies that can explore not only different outcomes, such as infant mortality rates, but also the role of parental investments in shaping these effects. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. A new method of optimal capacitor switching based on minimum spanning tree theory in distribution systems

    Science.gov (United States)

    Li, H. W.; Pan, Z. Y.; Ren, Y. B.; Wang, J.; Gan, Y. L.; Zheng, Z. Z.; Wang, W.

    2018-03-01

    According to the radial operation characteristics of distribution systems, this paper proposes a new method for optimal capacitor switching based on minimum spanning trees. First, taking minimal active power loss as the objective function and ignoring the capacity constraints of the capacitors and the source, the Prim algorithm (a minimum spanning tree algorithm) is used to obtain the power supply ranges of the capacitors and the source. Then, with the capacity constraints of the capacitors taken into account, the capacitors are ranked by breadth-first search. In order of that ranking, from high to low, each capacitor's compensation capacity is calculated based on its power supply range. Finally, the IEEE 69-bus system is adopted to test the accuracy and practicality of the proposed algorithm.
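The first stage leans on Prim's algorithm. A generic, minimal Prim implementation is sketched below; the 5-bus toy network and its edge weights are illustrative and are not taken from the paper or from the IEEE 69-bus system:

```python
import heapq

def prim_mst(n, edges):
    """Prim's algorithm: minimum spanning tree of an undirected weighted graph.
    n: number of nodes (0..n-1); edges: list of (u, v, weight) tuples."""
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    visited = [False] * n
    mst, heap, total = [], [(0, 0, -1)], 0  # heap holds (weight, node, parent)
    while heap and len(mst) < n - 1:
        w, u, parent = heapq.heappop(heap)
        if visited[u]:
            continue
        visited[u] = True
        total += w
        if parent >= 0:
            mst.append((parent, u, w))
        for wv, v in adj[u]:
            if not visited[v]:
                heapq.heappush(heap, (wv, v, u))
    return mst, total

# Toy 5-bus network; the line "weights" stand in for active power losses.
edges = [(0, 1, 2), (0, 2, 3), (1, 2, 1), (1, 3, 4), (2, 4, 5), (3, 4, 1)]
tree, loss = prim_mst(5, edges)
print(tree, loss)
```

In the paper's setting the nodes would be buses and the tree edges would define which buses each capacitor (or the source) supplies.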

  19. Neutrino Mixing and Masses from a Minimum Principle

    CERN Document Server

    Alonso, R; Isidori, G; Maiani, L

    2013-01-01

    We analyze the structure of quark and lepton mass matrices under the hypothesis that they are determined from a minimum principle applied to a generic potential invariant under the $\left[SU(3)\right]^5\otimes \mathcal O(3)$ flavor symmetry, acting on Standard Model fermions and right-handed neutrinos. Unlike the quark case, we show that hierarchical masses for charged leptons are naturally accompanied by degenerate Majorana neutrinos with one mixing angle close to maximal, a second potentially large, a third one necessarily small, and one maximal relative Majorana phase. Adding small perturbations, the predicted structure for the neutrino mass matrix is in excellent agreement with present observations and could be tested in the near future via neutrino-less double beta decay and cosmological measurements. The generalization of these results to arbitrary see-saw models is also discussed.

  20. 77 FR 43196 - Minimum Internal Control Standards and Technical Standards

    Science.gov (United States)

    2012-07-24

    ... NATIONAL INDIAN GAMING COMMISSION 25 CFR Parts 543 and 547 Minimum Internal Control Standards [email protected] . SUPPLEMENTARY INFORMATION: Part 543 addresses minimum internal control standards (MICS) for Class II gaming operations. The regulations require tribes to establish controls and implement...

  1. Minimum-error discrimination of entangled quantum states

    International Nuclear Information System (INIS)

    Lu, Y.; Coish, N.; Kaltenbaek, R.; Hamel, D. R.; Resch, K. J.; Croke, S.

    2010-01-01

    Strategies to optimally discriminate between quantum states are critical in quantum technologies. We present an experimental demonstration of minimum-error discrimination between entangled states, encoded in the polarization of pairs of photons. Although the optimal measurement involves projection onto entangled states, we use a result of J. Walgate et al. [Phys. Rev. Lett. 85, 4972 (2000)] to design an optical implementation employing only local polarization measurements and feed-forward, which performs at the Helstrom bound. Our scheme can achieve perfect discrimination of orthogonal states and minimum-error discrimination of nonorthogonal states. Our experimental results show a definite advantage over schemes not using feed-forward.
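For two equiprobable pure states, the Helstrom bound referenced above has a closed form, P_err = (1/2)(1 - sqrt(1 - |&lt;psi|phi&gt;|^2)). A small sketch (the example states are illustrative, not the entangled photon states of the experiment):

```python
import numpy as np

def helstrom_error(psi, phi):
    """Minimum error probability for discriminating two equiprobable pure
    states: P_err = (1 - sqrt(1 - |<psi|phi>|^2)) / 2 (the Helstrom bound)."""
    overlap = abs(np.vdot(psi, phi)) ** 2
    return 0.5 * (1.0 - np.sqrt(1.0 - overlap))

# Orthogonal states can be discriminated perfectly (zero error).
print(helstrom_error(np.array([1.0, 0.0]), np.array([0.0, 1.0])))

# Nonorthogonal states (45 degrees apart) have a nonzero minimum error.
s = np.array([1.0, 0.0])
t = np.array([1.0, 1.0]) / np.sqrt(2.0)
print(helstrom_error(s, t))
```

This matches the abstract's two regimes: zero error for orthogonal states and a strictly positive minimum error for nonorthogonal ones.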

  2. Mechanical properties of banana/kenaf fiber-reinforced hybrid polyester composites: Effect of woven fabric and random orientation

    International Nuclear Information System (INIS)

    Alavudeen, A.; Rajini, N.; Karthikeyan, S.; Thiruchitrambalam, M.; Venkateshwaren, N.

    2015-01-01

    Highlights: • This paper presents the fabrication of kenaf/banana fiber hybrid composites. • The effect of weaving pattern and random orientation on mechanical properties was studied. • The role of interfacial adhesion due to chemical modifications was analyzed with the aid of SEM. • Hybridization of kenaf and banana fibers in plain woven composites exhibits maximum mechanical strength. - Abstract: The present work deals with the effect of weaving patterns and random orientation on the mechanical properties of banana, kenaf, and banana/kenaf fiber-reinforced hybrid polyester composites. Composites were prepared using the hand lay-up method with two different weaving patterns, namely plain and twill. Of the two weaving patterns, the plain type showed improved tensile properties compared to the twill type in all the fabricated composites. Furthermore, the maximum increase in mechanical strength was observed in the plain woven hybrid composites rather than in the randomly oriented composites, indicating minimum stress development at the interface of the composites due to the distribution of load transfer along the fiber direction. Moreover, alkali (NaOH) and sodium lauryl sulfate (SLS) treatments appear to provide an additional improvement in mechanical strength through enhanced interfacial bonding. Morphological studies of fractured mechanical testing samples were performed by scanning electron microscopy (SEM) to understand the de-bonding of fiber/matrix adhesion.

  3. UPGMA and the normalized equidistant minimum evolution problem

    OpenAIRE

    Moulton, Vincent; Spillner, Andreas; Wu, Taoyang

    2017-01-01

    UPGMA (Unweighted Pair Group Method with Arithmetic Mean) is a widely used clustering method. Here we show that UPGMA is a greedy heuristic for the normalized equidistant minimum evolution (NEME) problem, that is, finding a rooted tree that minimizes the minimum evolution score relative to the dissimilarity matrix among all rooted trees with the same leaf-set in which all leaves have the same distance to the root. We prove that the NEME problem is NP-hard. In addition, we present some heurist...
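To make UPGMA's greedy step concrete, here is a naive pure-Python sketch (O(n^3) agglomerative clustering with size-weighted average-linkage updates; the toy distance matrix is illustrative):

```python
def upgma(dist, labels):
    """Naive UPGMA: repeatedly merge the closest pair of clusters, updating
    inter-cluster distances as size-weighted arithmetic averages.
    dist: full symmetric distance matrix (list of lists)."""
    clusters = [((lab,), 1) for lab in labels]  # (members, leaf count)
    D = [row[:] for row in dist]
    while len(clusters) > 1:
        # greedy step: find the pair of clusters at minimum distance
        i, j = min(((a, b) for a in range(len(D)) for b in range(a + 1, len(D))),
                   key=lambda p: D[p[0]][p[1]])
        (mi, ni), (mj, nj) = clusters[i], clusters[j]
        merged = ((mi, mj), ni + nj)
        # average distance from the merged cluster to every other cluster
        new_row = [(ni * D[i][k] + nj * D[j][k]) / (ni + nj)
                   for k in range(len(D)) if k not in (i, j)]
        keep = [k for k in range(len(D)) if k not in (i, j)]
        D = [[D[a][b] for b in keep] for a in keep]
        for row, v in zip(D, new_row):
            row.append(v)
        D.append(new_row + [0.0])
        clusters = [clusters[k] for k in keep] + [merged]
    return clusters[0][0]

# Four taxa: a and b are closest, then c joins them, then d.
labels = ["a", "b", "c", "d"]
dist = [[0, 2, 6, 10],
        [2, 0, 6, 10],
        [6, 6, 0, 10],
        [10, 10, 10, 0]]
print(upgma(dist, labels))  # nested tuples giving the rooted topology
```

Each merge is locally optimal, which is exactly why the paper characterizes UPGMA as a greedy heuristic for the NEME problem rather than an exact solver.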

  4. Pay equity, minimum wage and equality at work

    OpenAIRE

    Rubery, Jill

    2003-01-01

    Reviews the underlying causes of pay discrimination embedded within the organization of the labour market and structures of pay and reward. Discusses the need to focus on pay equity as part of a general strategy of promoting equity and decent work and examines the case for using minimum wage policies in comparison to more targeted equal pay policies to reduce gender pay equity. Identifies potential obstacles to or support for such policies and describes experiences of the use of minimum wages...

  5. A theory of compliance with minimum wage legislation

    OpenAIRE

    Jellal, Mohamed

    2012-01-01

    In this paper, we introduce firm heterogeneity in the context of a model of non-compliance with minimum wage legislation. The introduction of heterogeneity in the ease with which firms can be monitored for non compliance allows us to show that non-compliance will persist in sectors which are relatively difficult to monitor, despite the government implementing non stochastic monitoring. Moreover, we show that the incentive not to comply is an increasing function of the level of the minimum wag...

  6. The Effects of Minimum Wage Throughout the Wage Distribution in Indonesia

    Directory of Open Access Journals (Sweden)

    Sri Gusvina Dewi

    2018-03-01

    Full Text Available The global financial crisis in 2007, followed by Indonesia's largest labor demonstration in 2013, caused turmoil in the Indonesian labor market. This paper examines the effect of the minimum wage on the wage distribution in 2007 and 2014, and how the minimum wage increase in 2014 affected the distribution of wage differences between 2007 and 2014. The study employs the recentered influence function (RIF) regression method to estimate the wage function using unconditional quantile regression. Furthermore, to measure the effect of the minimum wage increase in 2014 on the distribution of wage differences, it uses the Oaxaca-Blinder decomposition method. Using balanced panel data from the Indonesian Family Life Survey (IFLS), it finds that the minimum wage mitigated wage disparity in 2007 and 2014. The minimum wage policy in 2014 led to an increase in the wage difference between 2007 and 2014, with the largest wage difference being in the middle of the distribution. DOI: 10.15408/sjie.v7i2.6125

  7. A machine learning calibration model using random forests to improve sensor performance for lower-cost air quality monitoring

    Science.gov (United States)

    Zimmerman, Naomi; Presto, Albert A.; Kumar, Sriniwasa P. N.; Gu, Jason; Hauryliuk, Aliaksei; Robinson, Ellis S.; Robinson, Allen L.; Subramanian, R.

    2018-01-01

    Low-cost sensing strategies hold the promise of denser air quality monitoring networks, which could significantly improve our understanding of personal air pollution exposure. Additionally, low-cost air quality sensors could be deployed to areas where limited monitoring exists. However, low-cost sensors are frequently sensitive to environmental conditions and pollutant cross-sensitivities, which have historically been poorly addressed by laboratory calibrations, limiting their utility for monitoring. In this study, we investigated different calibration models for the Real-time Affordable Multi-Pollutant (RAMP) sensor package, which measures CO, NO2, O3, and CO2. We explored three methods: (1) laboratory univariate linear regression, (2) empirical multiple linear regression, and (3) machine-learning-based calibration models using random forests (RF). Calibration models were developed for 16-19 RAMP monitors (varied by pollutant) using training and testing windows spanning August 2016 through February 2017 in Pittsburgh, PA, US. The random forest models matched (CO) or significantly outperformed (NO2, CO2, O3) the other calibration models, and their accuracy and precision were robust over time for testing windows of up to 16 weeks. Following calibration, average mean absolute error on the testing data set from the random forest models was 38 ppb for CO (14 % relative error), 10 ppm for CO2 (2 % relative error), 3.5 ppb for NO2 (29 % relative error), and 3.4 ppb for O3 (15 % relative error), and Pearson r versus the reference monitors exceeded 0.8 for most units. Model performance is explored in detail, including a quantification of model variable importance, accuracy across different concentration ranges, and performance in a range of monitoring contexts including the National Ambient Air Quality Standards (NAAQS) and the US EPA Air Sensors Guidebook recommendations of minimum data quality for personal exposure measurement. 
A key strength of the RF approach is that
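The comparison between a univariate laboratory calibration and a random forest calibration can be sketched on synthetic data. The sensor response model, its coefficients, and all variable names below are hypothetical stand-ins, not the RAMP hardware or its data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 3000

# Synthetic "reference monitor" CO (ppm) plus co-located meteorology.
co = rng.uniform(0.1, 2.0, n)       # true concentration (ppm)
temp = rng.uniform(-5.0, 35.0, n)   # deg C
rh = rng.uniform(20.0, 95.0, n)     # percent

# Hypothetical low-cost sensor: responds to CO but also drifts nonlinearly
# with temperature and humidity (the cross-sensitivities described above).
raw = (0.8 * co + 0.02 * temp + 0.0005 * rh * temp
       + 0.01 * np.sqrt(rh) + rng.normal(0.0, 0.05, n))

tr, te = slice(0, 2000), slice(2000, None)  # train/test split in time

# (1) laboratory-style univariate linear calibration on the raw signal only.
lin = LinearRegression().fit(raw[tr].reshape(-1, 1), co[tr])
lin_pred = lin.predict(raw[te].reshape(-1, 1))

# (3) random forest calibration using the raw signal plus T and RH.
X = np.column_stack([raw, temp, rh])
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X[tr], co[tr])
rf_pred = rf.predict(X[te])

def mae(pred):
    return float(np.mean(np.abs(pred - co[te])))

print(f"linear MAE = {mae(lin_pred):.3f} ppm, RF MAE = {mae(rf_pred):.3f} ppm")
```

Because the forest can learn the temperature and humidity dependence that the univariate fit treats as noise, it recovers the true concentration with a much smaller error, mirroring the abstract's finding.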

  8. Microcomputer-Assisted Discoveries: Generate Your Own Random Numbers.

    Science.gov (United States)

    Kimberling, Clark

    1984-01-01

    Having students try to generate their own random numbers can lead to much discovery learning as one tries to create 'patternlessness' from formulas. Developing an equidistribution test and runs test, plus other ideas for generating random numbers, is discussed, with computer programs given. (MNS)
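The two checks mentioned, an equidistribution (chi-square) test and a runs test, can be sketched in Python; this is an independent sketch, not the article's own programs, and the patterned "formula" generator is an invented classroom example:

```python
import random

def chi_square_equidistribution(digits, bins=10):
    """Chi-square statistic for 'each digit appears equally often'."""
    expected = len(digits) / bins
    counts = [digits.count(d) for d in range(bins)]
    return sum((c - expected) ** 2 / expected for c in counts)

def runs_count(xs):
    """Number of runs above/below the median: a crude independence check."""
    med = sorted(xs)[len(xs) // 2]
    signs = [x >= med for x in xs]
    return 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)

random.seed(1)
good = [random.randrange(10) for _ in range(1000)]
bad = [(7 * i) % 10 for i in range(1000)]  # a formula with a hidden pattern

print("chi2 good:", chi_square_equidistribution(good))
print("chi2 bad :", chi_square_equidistribution(bad))
print("runs good:", runs_count(good))
print("runs bad :", runs_count(bad))
```

The patterned sequence passes the equidistribution test perfectly (chi-square of exactly 0, itself suspicious) yet fails the runs test badly, which is the kind of discovery the article aims for: 'patternlessness' requires more than equal digit frequencies.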

  9. 14 CFR 33.83 - Vibration test.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Vibration test. 33.83 Section 33.83... STANDARDS: AIRCRAFT ENGINES Block Tests; Turbine Aircraft Engines § 33.83 Vibration test. (a) Each engine... experience, analysis, and component test and shall address, as a minimum, blades, vanes, rotor discs, spacers...

  10. Home-Based HIV Testing and Counseling for Male Couples (Project Nexus): A Protocol for a Randomized Controlled Trial.

    Science.gov (United States)

    Stephenson, Rob; Freeland, Ryan; Sullivan, Stephen P; Riley, Erin; Johnson, Brent A; Mitchell, Jason; McFarland, Deborah; Sullivan, Patrick S

    2017-05-30

    HIV prevalence remains high among men who have sex with men (MSM) in the United States, yet the majority of research has focused on MSM as individuals, not as dyads, and has discussed HIV risks primarily in the context of casual sex. Nexus is an online prevention program that combines home-based HIV testing and couples HIV testing and counseling (CHTC). It allows partners in dyadic MSM relationships to receive HIV testing and care in the comfort of their designated residence, via video-based chat. By using video-based technologies (eg, VSee video chat), male couples receive counseling and support from a remote online counselor, while testing for HIV at home. This randomized control trial (RCT) aims to examine the effects of video-based counseling combined with home-based HIV testing on couples' management of HIV risk, formation and adherence to explicit sexual agreements, and sexual risk-taking. The research implements a prospective RCT of 400 online-recruited male couples: 200 self-reported concordant-negative couples and 200 self-reported discordant couples. Couples in the control arm will receive one or two home-based HIV self-testing kits and will be asked to report their results via the study's website. Couples in the experimental arm will receive one or two home-based HIV self-testing kits and will conduct these tests together under the facilitation of a remotely located counselor during a prescheduled VSee-based video CHTC session. Study assessments are taken at baseline, as well as at 3- and 6-month follow-up sessions. Project Nexus was launched in April 2016 and is ongoing. To date, 219 eligible couples have been enrolled and randomized. Combining home-based HIV testing with video-based counseling creates an opportunity to expand CHTC to male couples who (1) live outside metro areas, (2) live in rural areas without access to testing services or LGBTQ resources, or (3) feel that current clinic-based testing is not for them (eg, due to fears of

  11. Quality control of brachytherapy equipment in the Netherlands and Belgium: current practice and minimum requirements

    International Nuclear Information System (INIS)

    Elfrink, Robert J.M.; Kolkman-Deurloo, Inger-Karine K.; Kleffens, Herman J. van; Rijnders, Alex; Schaeken, Bob; Aalbers, Tony H.L.; Dries, Wim J.F.; Venselaar, Jack L.M.

    2002-01-01

    Background and purpose: Brachytherapy is applied in 39 radiotherapy institutions in The Netherlands and Belgium. Each institution has its own quality control (QC) programme to ensure safe and accurate dose delivery to the patient. The main goal of this work is to gain insight into the current practice of QC of brachytherapy in The Netherlands and Belgium and to reduce possible variations in test frequencies and tolerances by formulating a set of minimum QC-requirements. Materials and methods: An extensive questionnaire about QC of brachytherapy was distributed to and completed by the 39 radiotherapy institutions. A separate smaller questionnaire was sent to nine institutions performing intracoronary brachytherapy. The questions were related to safety systems, physical irradiation parameters and total time spent on QC. The results of the questionnaires were compared with recommendations given in international brachytherapy QC reports. Results: The answers to the questionnaires showed large variations in test frequencies and test methods. Furthermore, large variations in time spent on QC exist, which is mainly due to differences in QC-philosophy and differences in the available resources. Conclusions: Based on the results of the questionnaires and the comparison with the international recommendations, a set of minimum requirements for QC of brachytherapy has been formulated. These guidelines will be implemented in the radiotherapy institutions in The Netherlands and Belgium

  12. Decision trees with minimum average depth for sorting eight elements

    KAUST Repository

    AbouEisha, Hassan M.; Chikalov, Igor; Moshkov, Mikhail

    2015-01-01

    We prove that the minimum average depth of a decision tree for sorting 8 pairwise different elements is equal to 620160/8!. We show also that each decision tree for sorting 8 elements, which has minimum average depth (the number of such trees
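The quoted minimum average depth can be checked for consistency against the information-theoretic lower bound log2(8!) on the average depth of any comparison-based sorting tree:

```python
import math

# Minimum average depth of a decision tree sorting 8 distinct elements,
# as stated above: 620160 / 8!.
avg_depth = 620160 / math.factorial(8)

# Entropy (information-theoretic) lower bound: log2 of the number of
# possible orderings, since each comparison yields at most one bit.
info_bound = math.log2(math.factorial(8))

print(f"minimum average depth = {avg_depth:.4f}")
print(f"log2(8!) lower bound  = {info_bound:.4f}")
```

The stated optimum (about 15.38 comparisons on average) sits just above the entropy bound of about 15.30, as it must.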

  13. Nursing Minimum Data Set Based on EHR Archetypes Approach.

    Science.gov (United States)

    Spigolon, Dandara N; Moro, Cláudia M C

    2012-01-01

    The establishment of a Nursing Minimum Data Set (NMDS) can facilitate the use of health information systems. The adoption of such sets, and their representation based on archetypes, is a way of developing and supporting health systems. The objective of this paper is to describe the definition of a minimum data set for nursing in endometriosis, represented with archetypes. The study was divided into two steps: defining the Nursing Minimum Data Set for endometriosis, and developing archetypes related to the NMDS. The nursing data set for endometriosis was represented in the form of an archetype, using the whole perception of the evaluation item, organs, and senses. This form of representation is an important tool for semantic interoperability and knowledge representation in health information systems.

  14. Recent Immigrants as Labor Market Arbitrageurs: Evidence from the Minimum Wage*

    Science.gov (United States)

    Cadena, Brian C.

    2014-01-01

    This paper investigates the local labor supply effects of changes to the minimum wage by examining the response of low-skilled immigrants’ location decisions. Canonical models emphasize the importance of labor mobility when evaluating the employment effects of the minimum wage; yet few studies address this outcome directly. Low-skilled immigrant populations shift toward labor markets with stagnant minimum wages, and this result is robust to a number of alternative interpretations. This mobility provides behavior-based evidence in favor of a non-trivial negative employment effect of the minimum wage. Further, it reduces the estimated demand elasticity using teens; employment losses among native teens are substantially larger in states that have historically attracted few immigrant residents. PMID:24999288

  16. Distributive Effects of the Minimum Wage in the Ceará Labor Market

    Directory of Open Access Journals (Sweden)

    Joyciane Coelho Vasconcelos

    2015-11-01

    Full Text Available This paper analyses the contribution of the minimum wage (MW) to the deconcentration of labor-market income in Ceará over the period 2002-2012. The research was based on the National Household Sample Survey (PNAD) of the Brazilian Institute of Geography and Statistics (IBGE). It used the simulation methodology proposed by DiNardo, Fortin and Lemieux (1996), based on estimated counterfactual Kernel density functions. The simulations were performed for females and males. The decompositions revealed that the minimum wage, the degree of formalization, and personal attributes had de-concentrating impacts for both female and male workers. However, for women, the de-concentrating effect of the minimum wage is more intense than in the male sample. In summary, the simulations indicate the importance of the minimum wage in reducing the dispersion of labor income in recent years.

  17. Optimal random search for a single hidden target.

    Science.gov (United States)

    Snider, Joseph

    2011-01-01

    A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
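The square-root rule can be probed numerically. Under the stated close-range assumption, the expected number of trials to find a target at t scales like 1/q(t), so the mean of 1/q over sampled targets serves as a cost proxy. For a standard-normal target, the rule predicts an optimal Gaussian search width of sqrt(2), about 1.41 (the widths compared below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
targets = rng.standard_normal(200_000)  # hidden target locations ~ N(0, 1)

def expected_trials(sigma, targets):
    """Cost proxy: mean of 1/q(t) for a Gaussian search density of width
    sigma, since the trial count to hit a tiny window at t scales as 1/q(t)."""
    q = np.exp(-targets**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return float(np.mean(1.0 / q))

for sigma in (0.9, 1.5, 3.0):
    print(f"sigma={sigma}: relative cost {expected_trials(sigma, targets):.2f}")
```

The square root of the N(0, 1) density is proportional to exp(-x^2/4), a Gaussian of width sqrt(2), so among these three widths the one closest to it (1.5) should come out cheapest: too narrow a search pays heavily for tail targets, too wide a search wastes trials everywhere.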

  18. Semi-device-independent random-number expansion without entanglement

    International Nuclear Information System (INIS)

    Li Hongwei; Yin Zhenqiang; Wu Yuchun; Zou Xubo; Wang Shuang; Chen Wei; Guo Guangcan; Han Zhengfu

    2011-01-01

    By testing the classical correlation violation between two systems, true random numbers can be generated and certified without applying classical statistical methods. In this work, we propose a true random-number expansion protocol without entanglement, where the randomness can be guaranteed only by the two-dimensional quantum witness violation. Furthermore, we only assume that the dimensionality of the system used in the protocol has a tight bound, and the whole protocol can be regarded as a semi-device-independent black-box scenario. Compared with the device-independent random-number expansion protocol based on entanglement, our protocol is much easier to implement and test.

  19. 29 CFR 510.24 - Governmental entities eligible for minimum wage phase-in.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Governmental entities eligible for minimum wage phase-in..., DEPARTMENT OF LABOR REGULATIONS IMPLEMENTATION OF THE MINIMUM WAGE PROVISIONS OF THE 1989 AMENDMENTS TO THE... eligible for minimum wage phase-in. (a) The Commonwealth government of Puerto Rico has been determined to...

  20. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  1. On the road again: traffic fatalities and auto insurance minimums

    Directory of Open Access Journals (Sweden)

    Pavel A. Yakovlev

    2018-03-01

    Full Text Available Prior research on policy-induced moral hazard effects in the auto insurance market has focused on the impact of compulsory insurance, no-fault liability, and tort liability laws on traffic fatalities. In contrast, this paper examines the moral hazard effect of a previously overlooked policy variable: minimum auto insurance coverage. We hypothesize that state-mandated auto insurance minimums may “over-insure” some drivers, lowering their incentives to drive carefully. Using a longitudinal panel of American states from 1982 to 2006, we find that policy-induced increases in auto insurance minimums are associated with higher traffic fatality rates, ceteris paribus.

  2. What's So Peculiar about the Cycle 23/24 Solar Minimum?

    Science.gov (United States)

    Sheeley, N. R., Jr.

    2010-06-01

    Traditionally, solar physicists become anxious around solar minimum, as they await the high-latitude sunspot groups of the new cycle. Now, we are in an extended sunspot minimum with conditions not seen in recent memory, and interest in the sunspot cycle has increased again. In this paper, I will describe some of the characteristics of the current solar minimum, including its great depth, its extended duration, its weak polar magnetic fields, and its small amount of open flux. Flux transport simulations suggest that these characteristics are a consequence of temporal variations of the Sun's large-scale meridional circulation.

  3. Classification of periodic, chaotic and random sequences using ...

    Indian Academy of Sciences (India)

    We analyse the problem with multiple values of initial conditions and perform statistical hypothesis testing for differences in means to determine the minimum ... the datasets showed a significant main effect (F3,196 = 77.04, P < 0.001). Post hoc.

  4. Preference of small molecules for local minimum conformations when binding to proteins.

    Directory of Open Access Journals (Sweden)

    Qi Wang

    2007-09-01

    Full Text Available It is well known that small molecules (ligands do not necessarily adopt their lowest potential energy conformations when binding to proteins. Analyses of protein-bound ligand crystal structures have reportedly shown that many of them do not even adopt the conformations at local minima of their potential energy surfaces (local minimum conformations. The results of these analyses raise a concern regarding the validity of virtual screening methods that use ligands in local minimum conformations. Here we report a normal-mode-analysis (NMA study of 100 crystal structures of protein-bound ligands. Our data show that the energy minimization of a ligand alone does not automatically stop at a local minimum conformation if the minimum of the potential energy surface is shallow, thus leading to the folding of the ligand. Furthermore, our data show that all 100 ligand conformations in their protein-bound ligand crystal structures are nearly identical to their local minimum conformations obtained from NMA-monitored energy minimization, suggesting that ligands prefer to adopt local minimum conformations when binding to proteins. These results both support virtual screening methods that use ligands in local minimum conformations and caution about possible adverse effect of excessive energy minimization when generating a database of ligand conformations for virtual screening.
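The distinction between stopping at a local minimum and reaching the global one can be illustrated on a toy one-dimensional energy surface (not a molecular force field): plain gradient-descent minimization stays in the basin where it starts, which is why minimization of a protein-bound conformation tends to land at the nearby local minimum rather than the global one.

```python
# Toy tilted double-well "potential energy surface" with two minima near
# x = -1 (deeper) and x = +1 (shallower). Purely illustrative.
def energy(x):
    return x**4 - 2 * x**2 + 0.3 * x

def grad(x):
    return 4 * x**3 - 4 * x + 0.3

def minimize(x, lr=0.01, steps=5000):
    """Plain gradient descent: converges to the local minimum of the
    starting basin, never crossing the barrier between the wells."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

xa, xb = minimize(-0.8), minimize(0.8)
print(f"start -0.8 -> x = {xa:.3f}, E = {energy(xa):.3f}")
print(f"start +0.8 -> x = {xb:.3f}, E = {energy(xb):.3f}")
```

Starting on the right, the descent stops in the shallower right-hand well even though a lower-energy minimum exists on the left, the 1D analogue of a ligand relaxing to its local minimum conformation.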

  5. Minimum-B mirrors plus EBT principles

    International Nuclear Information System (INIS)

    Yoshikawa, S.

    1983-01-01

    Electrons are heated at the minimum-B location(s) created by the multipole field and the toroidal field. The resulting hot electrons can assist plasma confinement by (1) providing mirror confinement, (2) creating azimuthally symmetric toroidal confinement, or (3) creating a modified bumpy torus.

  6. Minimum Wage and Community College Attendance: How Economic Circumstances Affect Educational Choices

    Science.gov (United States)

    Williams, Betsy

    2013-01-01

    How do changes in minimum wages affect community college enrollment and employment? In particular, among adults without associate's or bachelor's degrees who may earn near the minimum wage, do endowment effects of a higher minimum wage encourage school attendance? Among adults without associate's or bachelor's degrees who may earn near the minimum…

  7. 22 CFR 151.4 - Minimum limits for motor vehicle insurance.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Minimum limits for motor vehicle insurance. 151.4 Section 151.4 Foreign Relations DEPARTMENT OF STATE DIPLOMATIC PRIVILEGES AND IMMUNITIES COMPULSORY LIABILITY INSURANCE FOR DIPLOMATIC MISSIONS AND PERSONNEL § 151.4 Minimum limits for motor vehicle...

  8. Minimum Probability of Error-Based Equalization Algorithms for Fading Channels

    Directory of Open Access Journals (Sweden)

    Janos Levendovszky

    2007-06-01

    Full Text Available Novel channel equalizer algorithms are introduced for wireless communication systems to combat channel distortions resulting from multipath propagation. The novel algorithms are based on newly derived bounds on the probability of error (PE and guarantee better performance than the traditional zero forcing (ZF or minimum mean square error (MMSE algorithms. The new equalization methods require channel state information which is obtained by a fast adaptive channel identification algorithm. As a result, the combined convergence time needed for channel identification and PE minimization still remains smaller than the convergence time of traditional adaptive algorithms, yielding real-time equalization. The performance of the new algorithms is tested by extensive simulations on standard mobile channels.
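
    To illustrate the ZF/MMSE contrast the abstract draws (this is a generic frequency-domain sketch, not the paper's PE-based equalizer; the channel taps and SNR are hypothetical), one-tap equalization of a channel with a deep spectral null:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-tap multipath channel with a deep spectral null,
# BPSK symbols (all parameters illustrative, not from the paper).
h = np.array([1.0, 0.95])
snr_db = 10.0
noise_var = 10 ** (-snr_db / 10)

n = 4096
symbols = rng.choice([-1.0, 1.0], size=n)
H = np.fft.fft(h, n)
# Circular convolution with the channel, plus white Gaussian noise.
received = np.real(np.fft.ifft(np.fft.fft(symbols) * H))
received += rng.normal(0.0, np.sqrt(noise_var), n)

R = np.fft.fft(received)
# Zero forcing inverts the channel exactly, amplifying noise near the
# null; MMSE regularizes the inversion by the noise variance.
zf_out = np.real(np.fft.ifft(R / H))
mmse_out = np.real(np.fft.ifft(R * np.conj(H) / (np.abs(H) ** 2 + noise_var)))

def ber(est):
    # Bit error rate of a hard sign decision against the sent symbols.
    return float(np.mean(np.sign(est) != symbols))

ber_zf, ber_mmse = ber(zf_out), ber(mmse_out)
```

ZF inverts the null blindly and amplifies the noise there; the MMSE denominator |H|^2 + sigma^2 regularizes the inversion, which is why its error rate is lower on this channel.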

  9. Reference respiratory waveforms by minimum jerk model analysis

    Energy Technology Data Exchange (ETDEWEB)

    Anetai, Yusuke, E-mail: anetai@radonc.med.osaka-u.ac.jp; Sumida, Iori; Takahashi, Yutaka; Yagi, Masashi; Mizuno, Hirokazu; Ogawa, Kazuhiko [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Yamadaoka 2-2, Suita-shi, Osaka 565-0871 (Japan); Ota, Seiichi [Department of Medical Technology, Osaka University Hospital, Yamadaoka 2-15, Suita-shi, Osaka 565-0871 (Japan)

    2015-09-15

    Purpose: The CyberKnife® robotic surgery system has the ability to deliver radiation to a tumor subject to respiratory movements using Synchrony® mode with less than 2 mm tracking accuracy. However, rapid and rough motion tracking causes mechanical tracking errors and puts mechanical stress on the robotic joint, leading to unexpected radiation delivery errors. During clinical treatment, patient respiratory motions are much more complicated, suggesting the need for patient-specific modeling of respiratory motion. The purpose of this study was to propose a novel method that provides a reference respiratory wave to enable smooth tracking for each patient. Methods: The minimum jerk model, which mathematically derives smoothness by means of jerk (the third derivative of position with respect to time, i.e., the derivative of acceleration, proportional to the time rate of change of force), was introduced to model a patient-specific respiratory motion wave to provide smooth motion tracking using CyberKnife®. To verify that patient-specific minimum jerk respiratory waves were being tracked smoothly by Synchrony® mode, a tracking laser projection from CyberKnife® was optically analyzed every 0.1 s using a webcam and a calibrated grid on a motion phantom whose motion followed three pattern waves (cosine, typical free-breathing, and minimum jerk theoretical wave models) for the clinically relevant superior–inferior directions from six volunteers assessed on the same node of the same isocentric plan. Results: Tracking discrepancy from the center of the grid to the beam projection was evaluated. The minimum jerk theoretical wave reduced the maximum-peak amplitude of radial tracking discrepancy compared with the waveforms modeled by the cosine and typical free-breathing models by 22% and 35%, respectively, and provided smooth tracking in the radial direction. Motion tracking constancy as indicated by radial tracking discrepancy

  10. Reference respiratory waveforms by minimum jerk model analysis

    International Nuclear Information System (INIS)

    Anetai, Yusuke; Sumida, Iori; Takahashi, Yutaka; Yagi, Masashi; Mizuno, Hirokazu; Ogawa, Kazuhiko; Ota, Seiichi

    2015-01-01

    Purpose: The CyberKnife® robotic surgery system has the ability to deliver radiation to a tumor subject to respiratory movements using Synchrony® mode with less than 2 mm tracking accuracy. However, rapid and rough motion tracking causes mechanical tracking errors and puts mechanical stress on the robotic joint, leading to unexpected radiation delivery errors. During clinical treatment, patient respiratory motions are much more complicated, suggesting the need for patient-specific modeling of respiratory motion. The purpose of this study was to propose a novel method that provides a reference respiratory wave to enable smooth tracking for each patient. Methods: The minimum jerk model, which mathematically derives smoothness by means of jerk (the third derivative of position with respect to time, i.e., the derivative of acceleration, proportional to the time rate of change of force), was introduced to model a patient-specific respiratory motion wave to provide smooth motion tracking using CyberKnife®. To verify that patient-specific minimum jerk respiratory waves were being tracked smoothly by Synchrony® mode, a tracking laser projection from CyberKnife® was optically analyzed every 0.1 s using a webcam and a calibrated grid on a motion phantom whose motion followed three pattern waves (cosine, typical free-breathing, and minimum jerk theoretical wave models) for the clinically relevant superior–inferior directions from six volunteers assessed on the same node of the same isocentric plan. Results: Tracking discrepancy from the center of the grid to the beam projection was evaluated. The minimum jerk theoretical wave reduced the maximum-peak amplitude of radial tracking discrepancy compared with the waveforms modeled by the cosine and typical free-breathing models by 22% and 35%, respectively, and provided smooth tracking in the radial direction. Motion tracking constancy as indicated by radial tracking discrepancy affected by respiratory
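
    The minimum jerk criterion used in the two records above has a classic closed-form solution for a rest-to-rest movement, the quintic 10*tau^3 - 15*tau^4 + 6*tau^5 (Flash-Hogan). The sketch below evaluates it for a hypothetical breathing excursion; the 10 mm amplitude and 2 s duration are illustrative, not patient data.

```python
import numpy as np

def minimum_jerk(x0, xf, T, t):
    """Classic minimum jerk trajectory between rest positions
    (Flash-Hogan quintic); shown here only to illustrate the
    smoothness criterion applied to respiratory waveforms."""
    tau = np.clip(t / T, 0.0, 1.0)
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

# One synthetic breathing half-cycle: 0 mm to 10 mm over 2 s
# (hypothetical amplitude and period).
t = np.linspace(0.0, 2.0, 201)
x = minimum_jerk(0.0, 10.0, 2.0, t)
v = np.gradient(x, t)  # numerical velocity
```

The defining property is visible in the numbers: position hits both endpoints exactly while velocity starts and ends at zero, so the tracked target is never asked to start or stop abruptly.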

  11. Does acetaminophen/hydrocodone affect cold pulpal testing in patients with symptomatic irreversible pulpitis? A prospective, randomized, double-blind, placebo-controlled study.

    Science.gov (United States)

    Fowler, Sara; Fullmer, Spencer; Drum, Melissa; Reader, Al

    2014-12-01

    The purpose of this prospective, randomized, double-blind, placebo-controlled study was to determine the effects of a combination dose of 1000 mg acetaminophen/10 mg hydrocodone on cold pulpal testing in patients experiencing symptomatic irreversible pulpitis. One hundred emergency patients in moderate to severe pain diagnosed with symptomatic irreversible pulpitis of a mandibular posterior tooth randomly received, in a double-blind manner, identical capsules of either a combination of 1000 mg acetaminophen/10 mg hydrocodone or placebo. Cold testing with Endo-Ice (1,1,1,2 tetrafluoroethane; Hygenic Corp, Akron, OH) was performed at baseline and every 10 minutes for 60 minutes. Pain to cold testing was recorded by the patient using a Heft-Parker visual analog scale. Patients' reaction to the cold application was also rated. Cold testing at baseline and at 10 minutes resulted in severe pain for both the acetaminophen/hydrocodone and placebo groups. Although pain ratings decreased from 20-60 minutes, the ratings still indicated moderate pain. Patient reaction to cold testing showed that 56%-62% had a severe reaction. Although the reactions decreased in severity over the 60 minutes, 20%-34% still had severe reactions at 60 minutes. Regarding pain and patients' reactions to cold testing, there were no significant differences between the combination acetaminophen/hydrocodone and placebo groups at any time period. A combination dose of 1000 mg of acetaminophen/10 mg of hydrocodone did not significantly affect cold pulpal testing in patients presenting with symptomatic irreversible pulpitis. Patients experienced moderate to severe pain and reactions to cold testing. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  12. 29 CFR 783.26 - The section 6(b)(2) minimum wage requirement.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false The section 6(b)(2) minimum wage requirement. 783.26... The section 6(b)(2) minimum wage requirement. Section 6(b), with paragraph (2) thereof, requires the... prescribed by” paragraph (1) of the subsection is the minimum wage rate applicable according to the schedule...

  13. Clinical trial: a randomized trial of early endoscopy, Helicobacter pylori testing and empirical therapy for the management of dyspepsia in primary care.

    Science.gov (United States)

    Duggan, A E; Elliott, C A; Miller, P; Hawkey, C J; Logan, R F A

    2009-01-01

    Early endoscopy, Helicobacter pylori eradication, and empirical acid suppression are commonly used dyspepsia management strategies in primary care but have not been directly compared in a single trial. The aim was to compare endoscopy, H. pylori test and refer, H. pylori test and treat, and empirical acid suppression for dyspepsia in primary care. Patients presenting to their general practitioner with dyspepsia were randomized to endoscopy, H. pylori 'test and treat', H. pylori test and endoscopy of positives, or empirical therapy, with symptoms, patient satisfaction, healthcare costs, and cost-effectiveness at 12 months as the outcomes. At 2 months, the proportion of patients reporting no or minimal dyspeptic symptoms ranged from 74% for those having early endoscopy to 55% for those on empirical therapy (P = 0.009), but at 1 year, there was little difference among the four strategies. Early endoscopy was associated with fewer subsequent consultations for dyspepsia (P = 0.003). 'Test and treat' resulted in fewer endoscopies overall and was most cost-effective over a range of cost assumptions. Empirical therapy resulted in the lowest initial costs but the highest rate of subsequent endoscopy. Gastro-oesophageal cancers were found in four patients randomized to the H. pylori testing strategies. While early endoscopy offered some advantages, 'test and treat' was the most cost-effective strategy. In older patients, early endoscopy may be an appropriate strategy in view of the greater risk of malignant disease. © 2008 The Authors. Journal compilation © 2008 Blackwell Publishing Ltd.

  14. All-optical fast random number generator.

    Science.gov (United States)

    Li, Pu; Wang, Yun-Cai; Zhang, Jian-Zhong

    2010-09-13

    We propose a scheme for an all-optical random number generator (RNG), which consists of an ultra-wide-bandwidth (UWB) chaotic laser, an all-optical sampler, and an all-optical comparator. Free from electronic-device bandwidth limitations, it can generate 10 Gbit/s random numbers in our simulation. The high-speed bit sequences can pass standard statistical tests for randomness after an all-optical exclusive-or (XOR) operation.
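
    The value of the XOR step can be sketched numerically: if each raw stream has bias e (P(1) = 0.5 + e), the XOR of two independent streams has bias 2e^2. The source bias of 0.1 below is a hypothetical figure, not one from the paper.

```python
import random

random.seed(1)

# Simulate a biased physical bit source: P(1) = 0.6, i.e. bias e = 0.1
# (hypothetical value for illustration).
def biased_bits(n, p=0.6):
    return [1 if random.random() < p else 0 for _ in range(n)]

n = 100_000
a = biased_bits(n)
b = biased_bits(n)

# XOR of two independent streams: residual bias is 2 * e^2 = 0.02,
# much closer to the ideal 0.5 ones-density.
xored = [x ^ y for x, y in zip(a, b)]

bias_raw = abs(sum(a) / n - 0.5)
bias_xor = abs(sum(xored) / n - 0.5)
```

This is only the bias-reduction aspect; the statistical test suites mentioned in the abstract also probe correlations that a single XOR does not address.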

  15. Patient perspectives with abbreviated versus standard pre-test HIV counseling in the prenatal setting: a randomized-controlled, non-inferiority trial.

    Science.gov (United States)

    Cohan, Deborah; Gomez, Elvira; Greenberg, Mara; Washington, Sierra; Charlebois, Edwin D

    2009-01-01

    In the US, an unacceptably high percentage of pregnant women do not undergo prenatal HIV testing. Previous studies have found increased uptake of prenatal HIV testing with abbreviated pre-test counseling; however, little is known about patient decision making, testing satisfaction, and knowledge in this setting. A randomized-controlled, non-inferiority trial was conducted from October 2006 through February 2008 at San Francisco General Hospital (SFGH), the public teaching hospital of the City and County of San Francisco. A total of 278 English- and Spanish-speaking pregnant women were randomized to receive either abbreviated or standard nurse-performed HIV test counseling at the initial prenatal visit. Patient decision making experience was compared between abbreviated versus standard HIV counseling strategies among a sample of low-income, urban, ethnically diverse prenatal patients. The primary outcome was the decisional conflict score (DCS) using O'Connor's low-literacy scale, and secondary outcomes included satisfaction with the test decision, basic HIV knowledge, and HIV testing uptake. We conducted an intention-to-treat analysis of 278 women--134 (48.2%) in the abbreviated arm (AA) and 144 (51.8%) in the standard arm (SA). There was no significant difference in the proportion of women with low decisional conflict (71.6% in AA vs. 76.4% in SA, p = .37), and the observed mean difference between the groups of 3.88 (95% CI: -0.65, 8.41) did not exceed the non-inferiority margin. HIV testing uptake was very high (97.8%) and did not differ significantly between the 2 groups (99.3% in AA vs. 96.5% in SA, p = .12). Likewise, there was no difference in satisfaction with the testing decision (97.8% in AA vs. 99.3% in SA, p = .36). However, women in AA had significantly lower mean HIV knowledge scores (78.4%) compared to women in SA (83.7%). The abbreviated counseling process, while associated with slightly lower knowledge, does not compromise patient decision making or satisfaction regarding HIV testing.
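
    The non-inferiority logic in the abstract reduces to a one-line check: the upper confidence bound of the decisional-conflict difference must stay below the pre-specified margin. The difference and CI below are those reported; the margin of 10 points is a hypothetical placeholder, since the trial's actual margin is not stated here.

```python
# Reported decisional conflict score difference (abbreviated minus
# standard arm) and its 95% confidence interval, from the abstract.
mean_diff = 3.88
ci_low, ci_high = -0.65, 8.41

# Hypothetical non-inferiority margin (the trial's actual pre-specified
# margin is not given in this record).
margin = 10.0

# Non-inferiority holds when the entire CI lies below the margin.
non_inferior = ci_high < margin
```

Note the asymmetry with a superiority test: the CI crossing zero (here, -0.65 to 8.41) means no significant difference was shown, but non-inferiority is judged only against the margin.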

  16. Comments on the 'minimum flux corona' concept

    International Nuclear Information System (INIS)

    Antiochos, S.K.; Underwood, J.H.

    1978-01-01

    Hearn's (1975) models of the energy balance and mass loss of stellar coronae, based on a 'minimum flux corona' concept, are critically examined. First, it is shown that the neglect of the relevant length scales for coronal temperature variation leads to an inconsistent computation of the total energy flux F. The stability arguments upon which the minimum flux concept is based are shown to be fallacious. Errors in the computation of the stellar wind contribution to the energy budget are identified. Finally we criticize Hearn's (1977) suggestion that the model, with a value of the thermal conductivity modified by the magnetic field, can explain the difference between solar coronal holes and quiet coronal regions. (orig.)

  17. Minimum wakefield achievable by waveguide damped cavity

    International Nuclear Information System (INIS)

    Lin, X.E.; Kroll, N.M.

    1995-01-01

    The authors use an equivalent circuit to model a waveguide damped cavity. Both exponentially damped and persistent (decaying as t^(-3/2)) components of the wakefield are derived from this model. The result shows that for a cavity with resonant frequency a fixed interval above the waveguide cutoff, the persistent wakefield amplitude is inversely proportional to the external Q value of the damped mode. The competition between the two terms results in an optimal Q value, which gives a minimum wakefield as a function of the distance behind the source particle. The minimum wakefield increases when the resonant frequency approaches the waveguide cutoff. The results agree very well with computer simulation of a real cavity-waveguide system.
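
    A toy numerical version of the trade-off described above (the functional forms and constants are illustrative stand-ins, not the paper's equivalent-circuit result): a damped term that shrinks as Q falls competes with a persistent term scaling as 1/Q, so the total wake at a fixed distance behind the source has an interior optimum in Q.

```python
import numpy as np

def wake(s, Q, w=1.0, a=1.0, b=0.05):
    """Toy wake envelope at distance s behind the source: a damped-mode
    term exp(-w*s/(2Q)) plus a persistent term proportional to 1/Q.
    All amplitudes and the frequency are illustrative placeholders."""
    return a * np.exp(-w * s / (2 * Q)) + b / Q

s = 50.0                       # fixed distance behind the source particle
Qs = np.arange(1.0, 200.0, 0.5)
wakes = np.array([wake(s, Q) for Q in Qs])
Q_opt = Qs[np.argmin(wakes)]   # interior optimum: neither extreme wins
```

Lowering Q damps the mode faster but inflates the persistent tail, and raising Q does the opposite, which is exactly why the total wake is minimized at an intermediate Q.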

  18. UBV photometry of dwarf novae in the brightness minimum

    International Nuclear Information System (INIS)

    Voloshina, I.B.; Lyutyj, V.M.

    1983-01-01

    Photoelectric one-night observations of the dwarf nova SS Cyg at minimum light give evidence for the existence of eclipses in this system at the moments of conjunction. The orbital inclination of the system is estimated to be i approximately 70 deg. The components of this system (a white dwarf and a red dwarf star) are low-mass, low-luminosity objects. Since the optical luminosity of a dwarf nova at minimum light comes from the accretion disk and the hot spot at its periphery, where the stream of matter flowing from the nondegenerate component falls, the eclipses should be associated with the disk and hot spot. The white dwarf contributes greatly to the luminosity at minimum light, but its eclipses are possible only at i approximately 90 deg.

  19. CONSEQUENCES OF INCREASING THE MINIMUM WAGE IN UKRAINE TWICE

    Directory of Open Access Journals (Sweden)

    Volodymyr Boreiko

    2017-03-01

    Full Text Available The article examines scientists' views on the role of the incomes of the poorest people in the economic development of a country and the consequences of doubling the minimum wage in Ukraine; it analyzes the dynamics of the Ukrainian minimum wage over a decade and shows the decline in living standards of the population during this period; it substantiates the measures that should be taken to achieve non-inflationary growth in household incomes; and it argues that such measures should include unifying the personal income tax with the single social contribution and restoring progressive taxation of the incomes of the working population. Key words: minimum wage, household income, the poorest part of the population, the economy of the country, tax system.

  20. Minimum Time Path Planning for Robotic Manipulator in Drilling/ Spot Welding Tasks

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2016-04-01

    Full Text Available In this paper, a minimum time path planning strategy is proposed for multi-point manufacturing problems in drilling/spot welding tasks. By optimizing the visiting schedule of the set points and the detailed transfer path between points, minimum manufacturing time is achieved while fully utilizing the dynamic performance of the robotic manipulator. Owing to the start-stop movement in drilling/spot welding tasks, the path planning problem can be converted into a traveling salesman problem (TSP) and a series of point-to-point minimum time transfer path planning problems. A cubic Hermite interpolation polynomial is used to parameterize the transfer path, and the path parameters are then optimized to obtain the minimum point-to-point transfer time. A new TSP with a minimum time index is constructed by using the point-to-point transfer time as the TSP parameter. The classical genetic algorithm (GA) is applied to obtain the optimal visiting schedule. Several minimum time drilling tasks of a 3-DOF robotic manipulator are used as examples to demonstrate the effectiveness of the proposed approach.
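
    The scheduling stage can be sketched as follows. The paper solves the TSP with a genetic algorithm over Hermite-optimized transfer times; the stand-in below uses plain Euclidean distance as the transfer-time matrix and a nearest-neighbour heuristic, with hypothetical drill-point coordinates.

```python
import math

# Hypothetical drill points in mm (illustrative coordinates). Euclidean
# distance stands in for the optimized point-to-point transfer times that
# the paper derives from Hermite-parameterized minimum-time trajectories.
points = [(0, 0), (30, 5), (10, 20), (25, 25), (5, 10)]

def transfer_time(a, b):
    # Stand-in cost: straight-line distance (the real cost would come
    # from the minimum-time trajectory between the two points).
    return math.dist(a, b)

def nearest_neighbour_tour(points):
    """Greedy TSP heuristic: always visit the cheapest unvisited point
    next (the paper uses a genetic algorithm for this step instead)."""
    unvisited = list(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited,
                  key=lambda j: transfer_time(points[last], points[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

tour = nearest_neighbour_tour(points)
total = sum(transfer_time(points[tour[i]], points[tour[i + 1]])
            for i in range(len(tour) - 1))
```

Swapping the greedy heuristic for a GA, and the distance function for the true point-to-point transfer times, recovers the two-level structure the abstract describes.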