WorldWideScience

Sample records for small sample sizes

  1. Decision Support on Small size Passive Samples

    Directory of Open Access Journals (Sweden)

    Vladimir Popukaylo

    2018-05-01

Full Text Available A technique is developed for constructing adequate mathematical models from small-size passive samples under conditions in which classical probabilistic-statistical methods do not yield valid conclusions.

  2. Test of a sample container for shipment of small size plutonium samples with PAT-2

    International Nuclear Information System (INIS)

    Kuhn, E.; Aigner, H.; Deron, S.

    1981-11-01

A light-weight container for the air transport of plutonium, to be designated PAT-2, has been developed in the USA and is presently undergoing licensing. The very limited effective space for bearing plutonium required the design of small size sample canisters to meet the needs of international safeguards for the shipment of plutonium samples. The applicability of a small canister for the sampling of small size powder and solution samples has been tested in an intralaboratory experiment. The results of the experiment, based on the concept of pre-weighed samples, show that the tested canister can successfully be used for the sampling of small size PuO2 powder samples of homogeneous source material, as well as for dried aliquands of plutonium nitrate solutions. (author)

  3. Overestimation of test performance by ROC analysis: Effect of small sample size

    International Nuclear Information System (INIS)

    Seeley, G.W.; Borgstrom, M.C.; Patton, D.D.; Myers, K.J.; Barrett, H.H.

    1984-01-01

New imaging systems are often observer-rated by ROC techniques. For practical reasons the number of different images, or sample size (SS), is kept small. Any systematic bias due to small SS would bias system evaluation. The authors set about to determine whether the area under the ROC curve (AUC) would be systematically biased by small SS. Monte Carlo techniques were used to simulate observer performance in distinguishing signal (SN) from noise (N) on a 6-point scale; P(SN) = P(N) = .5. Four sample sizes (15, 25, 50 and 100 each of SN and N), three ROC slopes (0.8, 1.0 and 1.25), and three intercepts (0.8, 1.0 and 1.25) were considered. In each of the 36 combinations of SS, slope and intercept, 2000 runs were simulated. Results showed a systematic bias: the observed AUC exceeded the expected AUC in every one of the 36 combinations for all sample sizes, with the smallest sample sizes having the largest bias. This suggests that evaluations of imaging systems using ROC curves based on small sample size systematically overestimate system performance. The effect is consistent but subtle (maximum 10% of AUC standard deviation), and is probably masked by the s.d. in most practical settings. Although there is a statistically significant effect (F = 33.34, P<0.0001) due to sample size, none was found for either the ROC curve slope or intercept. Overestimation of test performance by small SS seems to be an inherent characteristic of the ROC technique that has not previously been described.
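The Monte Carlo setup described above can be sketched in a few lines. This is a simplified illustration: it uses the nonparametric (Mann-Whitney) AUC on raw scores rather than the fitted binormal ROC of the study, and all parameter values below are assumptions for demonstration.

```python
import random
from statistics import NormalDist

def empirical_auc(noise, signal):
    # Mann-Whitney estimate of P(signal > noise); ties count as 1/2
    wins = sum((s > n) + 0.5 * (s == n) for s in signal for n in noise)
    return wins / (len(signal) * len(noise))

def mean_auc(n_per_class, d_prime=1.0, runs=200, seed=7):
    # Average empirical AUC over many repeated small-sample experiments
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        noise = [rng.gauss(0.0, 1.0) for _ in range(n_per_class)]
        signal = [rng.gauss(d_prime, 1.0) for _ in range(n_per_class)]
        total += empirical_auc(noise, signal)
    return total / runs

# Expected AUC for the equal-variance binormal model: Phi(d' / sqrt(2))
true_auc = NormalDist().cdf(1.0 / 2 ** 0.5)
```

Note that the Mann-Whitney AUC itself is an unbiased estimator of P(signal > noise); the overestimation reported in the study arises from fitting a binormal ROC to rating data, so this sketch reproduces only the simulation machinery, not the bias mechanism.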

  4. Estimating sample size for a small-quadrat method of botanical ...

    African Journals Online (AJOL)

    Reports the results of a study conducted to determine an appropriate sample size for a small-quadrat method of botanical survey for application in the Mixed Bushveld of South Africa. Species density and grass density were measured using a small-quadrat method in eight plant communities in the Nylsvley Nature Reserve.

  5. On the Structure of Cortical Microcircuits Inferred from Small Sample Sizes.

    Science.gov (United States)

    Vegué, Marina; Perin, Rodrigo; Roxin, Alex

    2017-08-30

    The structure in cortical microcircuits deviates from what would be expected in a purely random network, which has been seen as evidence of clustering. To address this issue, we sought to reproduce the nonrandom features of cortical circuits by considering several distinct classes of network topology, including clustered networks, networks with distance-dependent connectivity, and those with broad degree distributions. To our surprise, we found that all of these qualitatively distinct topologies could account equally well for all reported nonrandom features despite being easily distinguishable from one another at the network level. This apparent paradox was a consequence of estimating network properties given only small sample sizes. In other words, networks that differ markedly in their global structure can look quite similar locally. This makes inferring network structure from small sample sizes, a necessity given the technical difficulty inherent in simultaneous intracellular recordings, problematic. We found that a network statistic called the sample degree correlation (SDC) overcomes this difficulty. The SDC depends only on parameters that can be estimated reliably given small sample sizes and is an accurate fingerprint of every topological family. We applied the SDC criterion to data from rat visual and somatosensory cortex and discovered that the connectivity was not consistent with any of these main topological classes. However, we were able to fit the experimental data with a more general network class, of which all previous topologies were special cases. The resulting network topology could be interpreted as a combination of physical spatial dependence and nonspatial, hierarchical clustering. SIGNIFICANCE STATEMENT The connectivity of cortical microcircuits exhibits features that are inconsistent with a simple random network. Here, we show that several classes of network models can account for this nonrandom structure despite qualitative differences in

  6. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

Evaluation of the total uncertainty budget on a determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out either by relative NAA or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for the analysis of small and large samples of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples was evaluated and compared. (author)

  7. Addressing small sample size bias in multiple-biomarker trials: Inclusion of biomarker-negative patients and Firth correction.

    Science.gov (United States)

    Habermehl, Christina; Benner, Axel; Kopp-Schneider, Annette

    2018-03-01

    In recent years, numerous approaches for biomarker-based clinical trials have been developed. One of these developments are multiple-biomarker trials, which aim to investigate multiple biomarkers simultaneously in independent subtrials. For low-prevalence biomarkers, small sample sizes within the subtrials have to be expected, as well as many biomarker-negative patients at the screening stage. The small sample sizes may make it unfeasible to analyze the subtrials individually. This imposes the need to develop new approaches for the analysis of such trials. With an expected large group of biomarker-negative patients, it seems reasonable to explore options to benefit from including them in such trials. We consider advantages and disadvantages of the inclusion of biomarker-negative patients in a multiple-biomarker trial with a survival endpoint. We discuss design options that include biomarker-negative patients in the study and address the issue of small sample size bias in such trials. We carry out a simulation study for a design where biomarker-negative patients are kept in the study and are treated with standard of care. We compare three different analysis approaches based on the Cox model to examine if the inclusion of biomarker-negative patients can provide a benefit with respect to bias and variance of the treatment effect estimates. We apply the Firth correction to reduce the small sample size bias. The results of the simulation study suggest that for small sample situations, the Firth correction should be applied to adjust for the small sample size bias. Additional to the Firth penalty, the inclusion of biomarker-negative patients in the analysis can lead to further but small improvements in bias and standard deviation of the estimates. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations when samples are small. We used a pooled resampling method in the nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method against the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability for all conditions except the Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means, and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
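A minimal sketch of a bootstrap test with pooled resampling for two independent means, assuming the pooled-null formulation described above; the function names, the Welch statistic, and the default settings are illustrative choices, not the authors' exact implementation.

```python
import random
from statistics import mean, stdev

def welch_t(x, y):
    # Welch two-sample t statistic (unequal variances allowed)
    vx = stdev(x) ** 2 / len(x)
    vy = stdev(y) ** 2 / len(y)
    den = (vx + vy) ** 0.5
    return 0.0 if den == 0.0 else (mean(x) - mean(y)) / den

def pooled_bootstrap_test(x, y, n_boot=2000, seed=0):
    # Both bootstrap groups are drawn, with replacement, from the pooled
    # sample, which enforces the null of identical distributions.
    rng = random.Random(seed)
    pool = list(x) + list(y)
    t_obs = abs(welch_t(x, y))
    exceed = sum(
        abs(welch_t([rng.choice(pool) for _ in range(len(x))],
                    [rng.choice(pool) for _ in range(len(y))])) >= t_obs
        for _ in range(n_boot)
    )
    return exceed / n_boot  # two-sided bootstrap p-value
```

Because every bootstrap group is drawn from the combined pool, the reference distribution of the statistic is generated under the null even when the two observed groups are small.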

  9. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
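The departure from nominal error rates that the authors analyze with Edgeworth expansions is visible in a small Monte Carlo: for skewed data, the two tails of the t statistic miss the nominal level in opposite directions. The exponential null and the 1.645 cutoff below are illustrative assumptions.

```python
import random
from statistics import mean, stdev

def tail_error_rates(n, runs=6000, seed=3):
    # One-sided rejection rates of a nominal 5% normal-theory test of
    # H0: mu = 1 when sampling from a (right-skewed) exponential null
    rng = random.Random(seed)
    lo = hi = 0
    for _ in range(runs):
        x = [rng.expovariate(1.0) for _ in range(n)]
        t = (mean(x) - 1.0) / (stdev(x) / n ** 0.5)
        hi += t > 1.645
        lo += t < -1.645
    return lo / runs, hi / runs
```

At n = 10 the lower tail rejects far more often than 5% while the upper tail rejects far less: exactly the asymmetry that higher-order (Edgeworth) corrections to the normal approximation capture.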

  10. Evaluation of Approaches to Analyzing Continuous Correlated Eye Data When Sample Size Is Small.

    Science.gov (United States)

    Huang, Jing; Huang, Jiayan; Chen, Yong; Ying, Gui-Shuang

    2018-02-01

To evaluate the performance of commonly used statistical methods for analyzing continuous correlated eye data when the sample size is small. We simulated correlated continuous data under two designs: (1) two eyes of a subject in two comparison groups; (2) two eyes of a subject in the same comparison group, under various sample sizes (5-50), inter-eye correlations (0-0.75), and effect sizes (0-0.8). Simulated data were analyzed using the paired t-test, two-sample t-test, Wald test and score test using generalized estimating equations (GEE), and F-test using the linear mixed effects model (LMM). We compared type I error rates and statistical power, and demonstrated the analysis approaches by analyzing two real datasets. In design 1, the paired t-test and LMM perform better than GEE, with nominal type I error rates and higher statistical power. In design 2, no test performs uniformly well: the two-sample t-test (average of two eyes or a random eye) achieves better control of type I error but yields lower statistical power. In both designs, the GEE Wald test inflates the type I error rate and the GEE score test has lower power. When the sample size is small, some commonly used statistical methods do not perform well. The paired t-test and LMM perform best when the two eyes of a subject are in two different comparison groups, and the t-test using the average of two eyes performs best when the two eyes are in the same comparison group. The study design should be considered when selecting the appropriate analysis approach.

  11. Standard Deviation for Small Samples

    Science.gov (United States)

    Joarder, Anwar H.; Latif, Raja M.

    2006-01-01

    Neater representations for variance are given for small sample sizes, especially for 3 and 4. With these representations, variance can be calculated without a calculator if sample sizes are small and observations are integers, and an upper bound for the standard deviation is immediate. Accessible proofs of lower and upper bounds are presented for…
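The abstract does not reproduce the paper's exact representations, but a standard identity in the same spirit expresses the sample variance through pairwise differences, which is easy to evaluate mentally when n = 3 or 4 and the observations are integers:

```python
from itertools import combinations
from statistics import variance

def variance_from_pairs(xs):
    # Sample variance via pairwise differences:
    # s^2 = sum over i < j of (x_i - x_j)^2, divided by n * (n - 1)
    n = len(xs)
    return sum((a - b) ** 2 for a, b in combinations(xs, 2)) / (n * (n - 1))
```

For n = 3 this needs only three squared differences, e.g. for (1, 2, 4): (1 + 9 + 4) / 6 = 7/3, matching the usual deviations-from-the-mean formula.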

  12. Sensitivity and specificity of normality tests and consequences on reference interval accuracy at small sample size: a computer-simulation study.

    Science.gov (United States)

    Le Boedec, Kevin

    2016-12-01

According to international guidelines, parametric methods must be chosen for reference interval (RI) construction when the sample size is small and the distribution is Gaussian. However, normality tests may not be accurate at small sample size. The purpose of the study was to evaluate normality test performance to properly identify samples extracted from a Gaussian population at small sample sizes, and to assess the consequences on RI accuracy of applying parametric methods to samples that falsely identified the parent population as Gaussian. Samples of n = 60 and n = 30 values were randomly selected 100 times from simulated Gaussian, lognormal, and asymmetric populations of 10,000 values. The sensitivity and specificity of 4 normality tests were compared. Reference intervals were calculated using 6 different statistical methods from samples that falsely identified the parent population as Gaussian, and their accuracy was compared. The Shapiro-Wilk and D'Agostino-Pearson tests were the best performing normality tests. However, their specificity was poor at sample size n = 30. Applying a nonparametric method (or a robust method with Box-Cox transformation) on all samples regardless of their distribution, or adjusting the significance level of normality tests depending on sample size, would limit the risk of constructing inaccurate RIs. © 2016 American Society for Veterinary Clinical Pathology.

  13. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    Science.gov (United States)

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, even practically meaningless effects reached statistical significance. This implies that researchers who cannot obtain large effect sizes tend to use larger samples in order to obtain significant results.

  14. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications.

    Directory of Open Access Journals (Sweden)

    Elias Chaibub Neto

Full Text Available In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation on real and simulated data sets, bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was considerably faster for small and moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower due to the increased time spent generating weight matrices via multinomial sampling.
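The multinomial-weighting formulation can be sketched in a few lines. NumPy stands in here for the paper's R implementation, and the sample mean is used as the simplest member of the moment-based family of statistics the authors consider.

```python
import numpy as np

def multinomial_bootstrap_means(x, n_boot, seed=0):
    # counts[b, i] = number of copies of x[i] in bootstrap replication b,
    # so every replication of the mean comes from one matrix-vector
    # product instead of materialising each resampled data set.
    rng = np.random.default_rng(seed)
    n = len(x)
    counts = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot)
    return counts @ np.asarray(x, dtype=float) / n
```

A statistic such as Pearson's correlation is handled the same way: the weight matrix is reused to form weighted first and second moments of x, y, and xy, from which all bootstrap replications of the correlation follow by elementwise arithmetic.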

  15. Autoregressive Prediction with Rolling Mechanism for Time Series Forecasting with Small Sample Size

    Directory of Open Access Journals (Sweden)

    Zhihua Wang

    2014-01-01

Full Text Available Reasonable prediction is of significant practical value in the analysis of stochastic and unstable time series with small or limited sample size. Motivated by the rolling idea in grey theory and the practical relevance of very short-term forecasting or 1-step-ahead prediction, a novel autoregressive (AR) prediction approach with a rolling mechanism is proposed. In the modeling procedure, a newly developed AR equation, which can be used to model nonstationary time series, is constructed in each prediction step. Meanwhile, the data window for the next step-ahead forecast rolls forward by appending the most recent prediction result and deleting the first value of the previously used sample data set. This rolling mechanism is an efficient technique owing to its improved forecasting accuracy, applicability to limited and unstable data situations, and small computational effort. The general performance, influence of sample size, nonlinearity of the dynamic mechanism, and significance of the observed trends, as well as innovation variance, are illustrated and verified with Monte Carlo simulations. The proposed methodology is then applied to several practical data sets, including multiple building settlement sequences and two economic series.
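A minimal sketch of the rolling mechanism, using a least-squares AR(1) fit in place of the paper's more general AR formulation; the model order, window handling, and the trend data in the usage note are illustrative assumptions.

```python
from statistics import mean

def fit_ar1(series):
    # Least-squares AR(1) fit: x_t ≈ c + phi * x_{t-1}
    x_prev, x_next = series[:-1], series[1:]
    mp, mn = mean(x_prev), mean(x_next)
    sxx = sum((a - mp) ** 2 for a in x_prev)
    sxy = sum((a - mp) * (b - mn) for a, b in zip(x_prev, x_next))
    phi = sxy / sxx
    return mn - phi * mp, phi

def rolling_ar1_forecast(series, steps):
    # One-step-ahead forecasts with a rolling window: after each step the
    # newest forecast is appended and the oldest observation is dropped,
    # so the model is refitted on a window of constant (small) size.
    window = list(series)
    out = []
    for _ in range(steps):
        c, phi = fit_ar1(window)
        nxt = c + phi * window[-1]
        out.append(nxt)
        window = window[1:] + [nxt]
    return out
```

On a pure linear trend such as 1, 2, ..., 8 the fitted equation is x_t = 1 + x_{t-1}, so the rolling forecasts continue the trend: 9, 10, 11, ...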

  16. A novel approach for small sample size family-based association studies: sequential tests.

    Science.gov (United States)

    Ilk, Ozlem; Rajabli, Farid; Dungul, Dilay Ciglidag; Ozdag, Hilal; Ilk, Hakki Gokhan

    2011-08-01

    In this paper, we propose a sequential probability ratio test (SPRT) to overcome the problem of limited samples in studies related to complex genetic diseases. The results of this novel approach are compared with the ones obtained from the traditional transmission disequilibrium test (TDT) on simulated data. Although TDT classifies single-nucleotide polymorphisms (SNPs) to only two groups (SNPs associated with the disease and the others), SPRT has the flexibility of assigning SNPs to a third group, that is, those for which we do not have enough evidence and should keep sampling. It is shown that SPRT results in smaller ratios of false positives and negatives, as well as better accuracy and sensitivity values for classifying SNPs when compared with TDT. By using SPRT, data with small sample size become usable for an accurate association analysis.
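Wald's SPRT on a Bernoulli stream illustrates the three-way decision the authors exploit; p0, p1, and the error rates below are illustrative values, and the genetic application would replace the raw stream with transmission counts.

```python
import math

def sprt_bernoulli(observations, p0, p1, alpha=0.05, beta=0.05):
    # Wald's sequential probability ratio test of H0: p = p0 vs
    # H1: p = p1.  The third outcome ("continue sampling") is the group
    # the abstract assigns to SNPs with insufficient evidence so far.
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for x in observations:
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1"
        if llr <= lower:
            return "accept H0"
    return "continue sampling"
```

The test stops as soon as the accumulated log-likelihood ratio crosses either boundary, which is what keeps the expected sample size small.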

  17. Small sample whole-genome amplification

    Science.gov (United States)

    Hara, Christine; Nguyen, Christine; Wheeler, Elizabeth; Sorensen, Karen; Arroyo, Erin; Vrankovich, Greg; Christian, Allen

    2005-11-01

Many challenges arise when trying to amplify and analyze human samples collected in the field due to limitations in sample quantity and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large volume reactions; in cases where insufficient sample is present, whole genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step to WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples with the use of carrier DNA while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or can be used in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  18. The large sample size fallacy.

    Science.gov (United States)

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
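The fallacy is easy to quantify with a two-sample z approximation: the p-value depends on both the effect size and the sample size, so a trivial effect becomes "highly significant" once n is large enough. The numeric values below are illustrative assumptions.

```python
from statistics import NormalDist

def z_test_p(d, n_per_group):
    # Two-sided p-value for a standardized mean difference d (Cohen's d)
    # observed with n_per_group subjects per arm, unit-variance outcomes
    z = d * (n_per_group / 2) ** 0.5
    return 2 * (1 - NormalDist().cdf(abs(z)))
```

For example, a trivial effect of d = 0.05 yields p < 0.001 with 20,000 subjects per arm, while a large effect of d = 0.8 fails to reach p < 0.05 with 5 per arm: the p-value alone says nothing about practical importance.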

  19. Small-size low-temperature scanning tunnel microscope

    International Nuclear Information System (INIS)

    Al'tfeder, I.B.; Khajkin, M.S.

    1989-01-01

A small-size scanning tunnel microscope, designed for operation in transport helium-filled Dewar flasks, is described. The microscope design contains a device for bringing the tip to the surface of the sample under test and a piezoelectric fine-positioning device. High vibration protection of the microscope is provided by suspending it on silk threads. The small-size scanning tunnel microscope provides atomic resolution.

  20. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks.

    Science.gov (United States)

    Zhang, Cuicui; Liang, Xuefeng; Matsuyama, Takashi

    2014-12-08

Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a biometric trait and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the "small sample size" (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still raise two questions: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0-1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.

  1. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our objective was to guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so, balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
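The estimator itself is a one-liner, N = M / P. The sketch below adds a rough delta-method standard error using a binomial variance inflated by an assumed RDS design effect; the function name, the design-effect default, and all numeric values in the test are illustrative assumptions, not the authors' exact procedure.

```python
def multiplier_estimate(m_objects, k_reported, n_surveyed, design_effect=2.0):
    # Multiplier-method population size estimate: N = M / P, where M is a
    # known count (e.g. unique objects distributed) and P is the survey
    # proportion reporting receipt.
    p = k_reported / n_surveyed
    n_hat = m_objects / p
    # Rough SE of P: binomial variance inflated by the RDS design effect
    se_p = (design_effect * p * (1 - p) / n_surveyed) ** 0.5
    # Delta-method SE of N = M / P
    se_n = m_objects * se_p / p ** 2
    return n_hat, se_n
```

For example, 500 unique objects distributed and 100 of 400 survey respondents reporting receipt gives P = 0.25 and N = 500 / 0.25 = 2000; the standard error term makes visible why small P inflates the uncertainty, since P appears squared in the denominator.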

  2. Small-sized reverberation chamber for the measurement of sound absorption

    International Nuclear Information System (INIS)

    Rey, R. del; Alba, J.; Bertó, L.; Gregori, A.

    2017-01-01

This paper presents the design, construction, calibration and automation of a reverberation chamber for small samples. A balance has been sought between reducing sample size, to reduce the manufacturing costs of materials, and finding the appropriate volume of the chamber, to obtain reliable values at high and mid frequencies. The small-sized reverberation chamber that was built has a volume of 1.12 m3 and allows for the testing of samples of 0.3 m2. By using diffusers to improve the degree of diffusion, and by automating measurements, we were able to improve the reliability of the results, thus reducing test errors. Several comparison studies of measurements in the small-sized reverberation chamber and a standardised reverberation chamber are shown, and a good degree of agreement can be seen between them within the range of valid frequencies. This paper presents a small laboratory for comparing samples and making decisions before manufacturing larger sizes.
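Absorption in a reverberation chamber is derived from reverberation times measured with and without the sample. A sketch of the standard ISO 354-style (Sabine) relation follows; the speed of sound and the example reverberation times are assumptions, not values from the paper.

```python
def absorption_coefficient(volume, sample_area, t_empty, t_sample, c=343.0):
    # Equivalent absorption area added by the sample:
    # A = 55.3 * V / c * (1/T_sample - 1/T_empty)
    a_eq = 55.3 * volume / c * (1.0 / t_sample - 1.0 / t_empty)
    return a_eq / sample_area  # absorption coefficient of the sample
```

With the chamber dimensions quoted above (V = 1.12 m3, sample area 0.3 m2) and hypothetical reverberation times of 2.0 s empty and 1.0 s with the sample, the estimated coefficient is about 0.3.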

  3. A comparison of confidence/credible interval methods for the area under the ROC curve for continuous diagnostic tests with small sample size.

    Science.gov (United States)

    Feng, Dai; Cortese, Giuliana; Baumgartner, Richard

    2017-12-01

The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance, on CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of different methods through real life examples.
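One classic, simple interval of the kind compared in such studies is the Wald CI with the Hanley-McNeil variance approximation. The sketch below is our own illustration (including the truncation to [0, 1]), not necessarily the exact implementation of any of the 29 methods in the paper.

```python
def auc_hanley_mcneil_ci(neg, pos, z=1.96):
    # Empirical (Mann-Whitney) AUC with a Wald interval based on the
    # Hanley-McNeil (1982) variance approximation.
    n1, n2 = len(pos), len(neg)
    auc = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg) / (n1 * n2)
    q1 = auc / (2 - auc)
    q2 = 2 * auc ** 2 / (1 + auc)
    var = (auc * (1 - auc) + (n1 - 1) * (q1 - auc ** 2)
           + (n2 - 1) * (q2 - auc ** 2)) / (n1 * n2)
    se = var ** 0.5
    return auc, max(0.0, auc - z * se), min(1.0, auc + z * se)
```

With very small samples the Wald interval is known to misbehave (it can collapse or hit the boundaries), which is precisely why comparisons like the one in this paper matter.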

  4. Determination of the optimal sample size for a clinical trial accounting for the population size.

    Science.gov (United States)

    Stallard, Nigel; Miller, Frank; Day, Simon; Hee, Siew Wan; Madan, Jason; Zohar, Sarah; Posch, Martin

    2017-07-01

The problem of choosing a sample size for a clinical trial is a very common one. In some settings, such as rare diseases or other small populations, the large sample sizes usually associated with the standard frequentist approach may be infeasible, suggesting that the sample size chosen should reflect the size of the population under consideration. Incorporation of the population size is possible in a decision-theoretic approach either explicitly by assuming that the population size is fixed and known, or implicitly through geometric discounting of the gain from future patients reflecting the expected population size. This paper develops such approaches. Building on previous work, an asymptotic expression is derived for the sample size for single and two-arm clinical trials in the general case of a clinical trial with a primary endpoint with a distribution of one-parameter exponential family form that optimizes a utility function that quantifies the cost and gain per patient as a continuous function of this parameter. It is shown that as the size of the population, N, or expected size, N*, in the case of geometric discounting, becomes large, the optimal trial size is O(N^1/2) or O(N*^1/2). The sample size obtained from the asymptotic expression is also compared with the exact optimal sample size in examples with responses with Bernoulli and Poisson distributions, showing that the asymptotic approximations can be reasonable even for relatively small sample sizes. © 2016 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
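The square-root scaling can be reproduced in a toy decision-theoretic model (an assumption-laden simplification, not the paper's utility function): run a two-arm Bernoulli trial with n patients per arm, then give the remaining N - 2n patients whichever arm looked better, and choose n to maximize expected successes across all N patients.

```python
from math import erf, sqrt

def phi_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def optimal_trial_size(N, p0, p1):
    """Brute-force the per-arm trial size n maximizing expected successes
    over a fixed population of N patients, when the N - 2n patients outside
    the trial all receive the arm with the higher observed success rate.
    Uses a normal approximation for the probability of picking the better arm."""
    delta = p1 - p0
    var = p0 * (1 - p0) + p1 * (1 - p1)  # variance of the per-pair difference
    best_n, best_val = 1, float("-inf")
    for n in range(1, N // 2):
        p_correct = phi_cdf(delta * sqrt(n / var))
        expected = n * (p0 + p1) + (N - 2 * n) * (p0 + delta * p_correct)
        if expected > best_val:
            best_n, best_val = n, expected
    return best_n
```

Running this for increasing N shows the optimal per-arm size growing far more slowly than N itself, consistent with the O(N^1/2) result.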

  5. Concepts in sample size determination

    Directory of Open Access Journals (Sweden)

    Umadevi K Rao

    2012-01-01

Full Text Available Investigators involved in clinical, epidemiological or translational research aim to publish their results so that their findings can be extrapolated to the population. This begins with the preliminary step of deciding the topic to be studied, the subjects and the type of study design. In this context, the researcher must determine how many subjects would be required for the proposed study. Thus, the number of individuals to be included in the study, i.e., the sample size, is an important consideration in the design of many clinical studies. The sample size determination should be based on the difference in the outcome between the two groups studied, as in an analytical study, as well as on the accepted p value for statistical significance and the required statistical power to test the hypothesis. The accepted risk of type I error, or alpha value, which by convention is set at the 0.05 level in biomedical research, defines the cutoff point at which the p value obtained in the study is judged as significant or not. The power in clinical research is the likelihood of finding a statistically significant result when it exists and is typically set to >80%. This is necessary since even the most rigorously executed studies may fail to answer the research question if the sample size is too small. Alternatively, a study with too large a sample size will be difficult to conduct and will result in a waste of time and resources. Thus, the goal of sample size planning is to estimate an appropriate number of subjects for a given study design. This article describes the concepts involved in estimating the sample size.
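The quantities the abstract names (difference in outcome, alpha, power) combine into the standard normal-approximation formula for comparing two means, n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta)^2 per group. A minimal sketch:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sample comparison of means
    (normal approximation): n = 2 * ((z_{1-a/2} + z_power) * sd / delta)^2,
    rounded up to the next whole subject."""
    z = NormalDist().inv_cdf
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) * sd / delta) ** 2)
```

For a standardized effect of 0.5 at alpha = 0.05 and 80% power this gives 63 per group, matching the usual tables; an exact t-based calculation adds a subject or two.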

  6. Testing of Small Graphite Samples for Nuclear Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Julie Chapman

    2010-11-01

Accurately determining the mechanical properties of small irradiated samples is crucial to predicting the behavior of the overall irradiated graphite components within a Very High Temperature Reactor. The sample size allowed in a material test reactor, however, is limited, and this poses some difficulties with respect to mechanical testing. In the case of graphite with a larger grain size, a small sample may exhibit characteristics not representative of the bulk material, leading to inaccuracies in the data. A study to determine a potential size effect on the tensile strength was pursued under the Next Generation Nuclear Plant program. It focused first on optimizing the tensile testing procedure identified in the American Society for Testing and Materials (ASTM) Standard C 781-08. Once the testing procedure was verified, a size effect was assessed by gradually reducing the diameter of the specimens. By monitoring the material response, a size effect was successfully identified.

  7. Breaking Free of Sample Size Dogma to Perform Innovative Translational Research

    Science.gov (United States)

    Bacchetti, Peter; Deeks, Steven G.; McCune, Joseph M.

    2011-01-01

    Innovative clinical and translational research is often delayed or prevented by reviewers’ expectations that any study performed in humans must be shown in advance to have high statistical power. This supposed requirement is not justifiable and is contradicted by the reality that increasing sample size produces diminishing marginal returns. Studies of new ideas often must start small (sometimes even with an N of 1) because of cost and feasibility concerns, and recent statistical work shows that small sample sizes for such research can produce more projected scientific value per dollar spent than larger sample sizes. Renouncing false dogma about sample size would remove a serious barrier to innovation and translation. PMID:21677197

  8. The Effect of Small Sample Size on Measurement Equivalence of Psychometric Questionnaires in MIMIC Model: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Jamshid Jamali

    2017-01-01

Full Text Available Evaluating measurement equivalence (also known as differential item functioning (DIF)) is an important part of the process of validating psychometric questionnaires. This study aimed at evaluating the multiple indicators multiple causes (MIMIC) model for DIF detection when the latent construct distribution is nonnormal and the focal group sample size is small. In this simulation-based study, Type I error rates and power of the MIMIC model for detecting uniform DIF were investigated under different combinations of reference to focal group sample size ratio, magnitude of the uniform-DIF effect, scale length, the number of response categories, and latent trait distribution. Moderate and high skewness in the latent trait distribution led to decreases of 0.33% and 0.47%, respectively, in the power of the MIMIC model for detecting uniform DIF. The findings indicated that increasing the scale length, the number of response categories, and the magnitude of DIF improved the power of the MIMIC model by 3.47%, 4.83%, and 20.35%, respectively; it also decreased the Type I error of the MIMIC approach by 2.81%, 5.66%, and 0.04%, respectively. This study revealed that the power of the MIMIC model was at an acceptable level when latent trait distributions were skewed. However, the empirical Type I error rate was slightly greater than the nominal significance level. Consequently, the MIMIC model is recommended for detection of uniform DIF when the latent construct distribution is nonnormal and the focal group sample size is small.

  9. The Effect of Small Sample Size on Measurement Equivalence of Psychometric Questionnaires in MIMIC Model: A Simulation Study.

    Science.gov (United States)

    Jamali, Jamshid; Ayatollahi, Seyyed Mohammad Taghi; Jafari, Peyman

    2017-01-01

Evaluating measurement equivalence (also known as differential item functioning (DIF)) is an important part of the process of validating psychometric questionnaires. This study aimed at evaluating the multiple indicators multiple causes (MIMIC) model for DIF detection when the latent construct distribution is nonnormal and the focal group sample size is small. In this simulation-based study, Type I error rates and power of the MIMIC model for detecting uniform DIF were investigated under different combinations of reference to focal group sample size ratio, magnitude of the uniform-DIF effect, scale length, the number of response categories, and latent trait distribution. Moderate and high skewness in the latent trait distribution led to decreases of 0.33% and 0.47%, respectively, in the power of the MIMIC model for detecting uniform DIF. The findings indicated that increasing the scale length, the number of response categories, and the magnitude of DIF improved the power of the MIMIC model by 3.47%, 4.83%, and 20.35%, respectively; it also decreased the Type I error of the MIMIC approach by 2.81%, 5.66%, and 0.04%, respectively. This study revealed that the power of the MIMIC model was at an acceptable level when latent trait distributions were skewed. However, the empirical Type I error rate was slightly greater than the nominal significance level. Consequently, the MIMIC model is recommended for detection of uniform DIF when the latent construct distribution is nonnormal and the focal group sample size is small.

  10. Weighted piecewise LDA for solving the small sample size problem in face verification.

    Science.gov (United States)

    Kyperountas, Marios; Tefas, Anastasios; Pitas, Ioannis

    2007-03-01

    A novel algorithm that can be used to boost the performance of face-verification methods that utilize Fisher's criterion is presented and evaluated. The algorithm is applied to similarity, or matching error, data and provides a general solution for overcoming the "small sample size" (SSS) problem, where the lack of sufficient training samples causes improper estimation of a linear separation hyperplane between the classes. Two independent phases constitute the proposed method. Initially, a set of weighted piecewise discriminant hyperplanes are used in order to provide a more accurate discriminant decision than the one produced by the traditional linear discriminant analysis (LDA) methodology. The expected classification ability of this method is investigated throughout a series of simulations. The second phase defines proper combinations for person-specific similarity scores and describes an outlier removal process that further enhances the classification ability. The proposed technique has been tested on the M2VTS and XM2VTS frontal face databases. Experimental results indicate that the proposed framework greatly improves the face-verification performance.
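The "small sample size" problem the record describes can be demonstrated directly: with fewer training samples than feature dimensions, the within-class scatter matrix is rank-deficient, so Fisher's criterion cannot be applied as-is. A minimal numpy sketch (the ridge-style regularization shown here is one generic workaround, not the paper's weighted piecewise method):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_per_class = 50, 10                  # more dimensions than samples: the SSS regime
X0 = rng.normal(0.0, 1.0, size=(n_per_class, d))
X1 = rng.normal(0.5, 1.0, size=(n_per_class, d))

# Pooled within-class scatter; its rank is at most 2 * (n_per_class - 1) = 18 < d,
# so it is singular and plain LDA (which inverts it) breaks down.
Sw = (n_per_class - 1) * (np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False))

# Generic workaround: shrink toward the identity before solving for the
# discriminant direction (regularized LDA).
w = np.linalg.solve(Sw + 0.1 * np.eye(d), X1.mean(axis=0) - X0.mean(axis=0))
```

With proper class separation restored only by the regularizer, it is easy to see why methods such as the one in this record work on similarity scores instead of raw high-dimensional features.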

  11. Sample size determination in clinical trials with multiple endpoints

    CERN Document Server

    Sozu, Takashi; Hamasaki, Toshimitsu; Evans, Scott R

    2015-01-01

    This book integrates recent methodological developments for calculating the sample size and power in trials with more than one endpoint considered as multiple primary or co-primary, offering an important reference work for statisticians working in this area. The determination of sample size and the evaluation of power are fundamental and critical elements in the design of clinical trials. If the sample size is too small, important effects may go unnoticed; if the sample size is too large, it represents a waste of resources and unethically puts more participants at risk than necessary. Recently many clinical trials have been designed with more than one endpoint considered as multiple primary or co-primary, creating a need for new approaches to the design and analysis of these clinical trials. The book focuses on the evaluation of power and sample size determination when comparing the effects of two interventions in superiority clinical trials with multiple endpoints. Methods for sample size calculation in clin...

  12. Sample sizes and model comparison metrics for species distribution models

    Science.gov (United States)

    B.B. Hanberry; H.S. He; D.C. Dey

    2012-01-01

    Species distribution models use small samples to produce continuous distribution maps. The question of how small a sample can be to produce an accurate model generally has been answered based on comparisons to maximum sample sizes of 200 observations or fewer. In addition, model comparisons often are made with the kappa statistic, which has become controversial....
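For reference, the kappa statistic mentioned above is simple to compute from a confusion matrix; the controversy concerns its dependence on prevalence, not its arithmetic. A minimal sketch:

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a k x k confusion matrix (rows = one rater/model,
    columns = the other): (observed agreement - chance agreement) / (1 - chance)."""
    total = sum(sum(row) for row in confusion)
    p_obs = sum(confusion[i][i] for i in range(len(confusion))) / total
    p_chance = sum(
        (sum(confusion[i]) / total) * (sum(row[i] for row in confusion) / total)
        for i in range(len(confusion))
    )
    return (p_obs - p_chance) / (1 - p_chance)
```

Because p_chance is driven by the marginal totals, two models with identical accuracy can receive very different kappa values under different prevalences, which is the heart of the criticism alluded to in the record.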

  13. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  14. Estimation for small domains in double sampling for stratification ...

    African Journals Online (AJOL)

    In this article, we investigate the effect of randomness of the size of a small domain on the precision of an estimator of mean for the domain under double sampling for stratification. The result shows that for a small domain that cuts across various strata with unknown weights, the sampling variance depends on the within ...

  15. Effects of growth rate, size, and light availability on tree survival across life stages: a demographic analysis accounting for missing values and small sample sizes.

    Science.gov (United States)

    Moustakas, Aristides; Evans, Matthew R

    2015-02-28

Plant survival is a key factor in forest dynamics and survival probabilities often vary across life stages. Studies specifically aimed at assessing tree survival are unusual and so data initially designed for other purposes often need to be used; such data are more likely to contain errors than data collected for this specific purpose. We investigate the survival rates of ten tree species in a dataset designed to monitor growth rates. As some individuals were not included in the census at some time points we use capture-mark-recapture methods both to allow us to account for missing individuals, and to estimate relocation probabilities. Growth rates, size, and light availability were included as covariates in the model predicting survival rates. The study demonstrates that tree mortality is best described as constant between years, size-dependent at early life stages, and size-independent at later life stages for most UK hardwood species. We have demonstrated that even with a twenty-year dataset it is possible to discern variability both between individuals and between species. Our work illustrates the potential utility of the method applied here for calculating plant population dynamics parameters in time-replicated datasets with small sample sizes and missing individuals without any loss of sample size, and including explanatory covariates.

  16. Small renal size in newborns with spina bifida: possible causes.

    Science.gov (United States)

    Montaldo, Paolo; Montaldo, Luisa; Iossa, Azzurra Concetta; Cennamo, Marina; Caredda, Elisabetta; Del Gado, Roberto

    2014-02-01

Previous studies reported that children with neural tube defects, but without any history of intrinsic renal disease, have small kidneys when compared with age-matched standards of renal growth. The aim of this study was to investigate the possible causes of small renal size in children with spina bifida by comparing growth hormone deficiency, physical limitations and hyperhomocysteinemia. The sample included 187 newborns with spina bifida. Renal size in the patients was assessed by using the maximum measurement of renal length, and the measurements were compared by using the Sutherland nomogram. According to the results, the sample was divided into two groups: a group of 120 patients with small kidneys (under the third percentile) and a control group of 67 newborns with normal kidney size. Plasma total homocysteine was investigated in mothers and in their children. Serum insulin-like growth factor-1 (IGF-1) levels were measured. Serum IGF-1 levels were normal in both groups. Children and mothers with homocysteine levels >10 μmol/l were more than twice as likely to have small kidneys and to give birth to children with small kidneys, respectively, compared with newborns and mothers with homocysteine levels <10 μmol/l. An inverse correlation was also found between the homocysteine levels of mothers and the kidney sizes of their children (r = -0.6109, P ≤ 0.01). It is highly important for mothers with hyperhomocysteinemia to be educated about the benefits of folate supplementation in order to reduce the risk of small renal size and lower renal function in children.

  17. Sample size calculation in metabolic phenotyping studies.

    Science.gov (United States)

    Billoir, Elise; Navratil, Vincent; Blaise, Benjamin J

    2015-09-01

The number of samples needed to identify significant effects is a key question in biomedical studies, with consequences on experimental designs, costs and potential discoveries. In metabolic phenotyping studies, sample size determination remains a complex step. This is due particularly to the multiple hypothesis-testing framework and the top-down hypothesis-free approach, with no a priori known metabolic target. Until now, no standard procedure was available for this purpose. In this review, we discuss sample size estimation procedures for metabolic phenotyping studies. We release an automated implementation of the Data-driven Sample size Determination (DSD) algorithm for MATLAB and GNU Octave. Original research concerning DSD was published elsewhere. DSD allows the determination of an optimized sample size in metabolic phenotyping studies. The procedure uses analytical data only from a small pilot cohort to generate an expanded data set. The statistical recoupling of variables procedure is used to identify metabolic variables, and their intensity distributions are estimated by Kernel smoothing or log-normal density fitting. Statistically significant metabolic variations are evaluated using the Benjamini-Yekutieli correction and processed for data sets of various sizes. Optimal sample size determination is achieved in a context of biomarker discovery (at least one statistically significant variation) or metabolic exploration (a maximum of statistically significant variations). The DSD toolbox is encoded in MATLAB R2008A (Mathworks, Natick, MA) for Kernel and log-normal estimates, and in GNU Octave for log-normal estimates (Kernel density estimates are not robust enough in GNU Octave). It is available at http://www.prabi.fr/redmine/projects/dsd/repository, with a tutorial at http://www.prabi.fr/redmine/projects/dsd/wiki. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
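The Benjamini-Yekutieli correction used by DSD controls the false discovery rate under arbitrary dependence between tests, at the price of an extra harmonic-sum factor relative to Benjamini-Hochberg. A minimal step-up implementation (illustrative, not the DSD toolbox code):

```python
def benjamini_yekutieli(pvals, q=0.05):
    """Step-up Benjamini-Yekutieli FDR procedure: reject H_(i) for all
    ranks i <= k, where k is the largest rank with
    p_(k) <= k * q / (m * c_m) and c_m = sum_{i=1}^{m} 1/i.
    Returns a boolean rejection flag per input p value."""
    m = len(pvals)
    c_m = sum(1.0 / i for i in range(1, m + 1))
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank * q / (m * c_m):
            k_max = rank
    reject = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= k_max:
            reject[idx] = True
    return reject
```

The c_m factor grows like log(m), so for the large variable counts typical of metabolic phenotyping the BY threshold is noticeably more conservative than Benjamini-Hochberg, a key driver of the sample sizes DSD returns.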

  18. Small sample GEE estimation of regression parameters for longitudinal data.

    Science.gov (United States)

    Paul, Sudhir; Zhang, Xuemao

    2014-09-28

Longitudinal (clustered) response data arise in many biostatistical applications; in general, such responses cannot be assumed to be independent. Generalized estimating equation (GEE) is a widely used method to estimate marginal regression parameters for correlated responses. The advantage of the GEE is that the estimates of the regression parameters are asymptotically unbiased even if the correlation structure is misspecified, although their small sample properties are not known. In this paper, two bias adjusted GEE estimators of the regression parameters in longitudinal data are obtained when the number of subjects is small. One is based on a bias correction, and the other is based on a bias reduction. Simulations show that the performances of both the bias-corrected methods are similar in terms of bias, efficiency, coverage probability, average coverage length, impact of misspecification of correlation structure, and impact of cluster size on bias correction. Both these methods show superior properties over the GEE estimates for small samples. Further, analysis of data involving a small number of subjects also shows improvement in bias, MSE, standard error, and length of the confidence interval of the estimates by the two bias adjusted methods over the GEE estimates. For small to moderate sample sizes (N ≤ 50), either of the bias-corrected methods GEEBc and GEEBr can be used. However, the method GEEBc should be preferred over GEEBr, as the former is computationally easier. For large sample sizes, the GEE method can be used. Copyright © 2014 John Wiley & Sons, Ltd.

  19. Small head size after atomic irradiation

    International Nuclear Information System (INIS)

    Miller, R.W.; Mulvihill, J.J.

    1975-01-01

    A study of children exposed to nuclear explosions in Hiroshima and Nagasaki showed small head size and mental retardation when exposure occurred less than 18 weeks of gestational age. Increased frequency of small head size occurred when maternal exposure was 10 to 19 rad. Tables and graphs are presented to show relationships between dose, gestational age, and frequency of small head size

  20. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

Full Text Available Abstract Background Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness-of-fit measures. As a control, we used an un-weighted fitting method. Results A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 and 560 annotated samples to achieve mean absolute and root mean squared errors below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method. Conclusions This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine the annotation sample size for supervised machine learning.
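The inverse power law fit described above can be sketched without any optimization library: for a fixed exponent, the model acc ≈ a - b·n^(-c) is linear in (a, b), so a grid search over c combined with closed-form weighted least squares suffices (an illustrative simplification of the paper's nonlinear weighted least-squares fit; the grid range for c is an assumption):

```python
def fit_inverse_power_law(ns, accs, weights=None):
    """Fit acc ~ a - b * n^(-c): grid-search the exponent c, and for each
    candidate solve the weighted least-squares line in x = n^(-c) exactly.
    Returns the (a, b, c) triple with the smallest weighted SSE."""
    if weights is None:
        weights = [1.0] * len(ns)
    best = None
    for c in [i / 100 for i in range(1, 201)]:        # c in (0, 2]
        xs = [n ** (-c) for n in ns]
        sw = sum(weights)
        mx = sum(w * x for w, x in zip(weights, xs)) / sw
        my = sum(w * y for w, y in zip(weights, accs)) / sw
        sxx = sum(w * (x - mx) ** 2 for w, x in zip(weights, xs))
        if sxx == 0:
            continue
        sxy = sum(w * (x - mx) * (y - my) for w, x, y in zip(weights, xs, accs))
        b = -sxy / sxx                                 # model slope on x is -b
        a = my + b * mx
        sse = sum(w * (y - (a - b * x)) ** 2 for w, x, y in zip(weights, xs, accs))
        if best is None or sse < best[0]:
            best = (sse, a, b, c)
    _, a, b, c = best
    return a, b, c
```

The fitted asymptote a is the predicted performance ceiling, and inverting the model gives the annotated sample size needed to come within a tolerance of it.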

  1. Exploratory Factor Analysis With Small Samples and Missing Data.

    Science.gov (United States)

    McNeish, Daniel

    2017-01-01

Exploratory factor analysis (EFA) is an extremely popular method for determining the underlying factor structure for a set of variables. Due to its exploratory nature, EFA is notorious for being conducted with small sample sizes, and recent reviews of psychological research have reported that between 40% and 60% of applied studies have 200 or fewer observations. Recent methodological studies have addressed small sample size requirements for EFA models; however, these models have only considered complete data, which are the exception rather than the rule in psychology. Furthermore, the extant literature on missing data techniques with small samples is scant, and nearly all existing studies focus on topics that are not of primary interest to EFA models. Therefore, this article presents a simulation to assess the performance of various missing data techniques for EFA models with both small samples and missing data. Results show that deletion methods do not extract the proper number of factors and estimate the factor loadings with severe bias, even when data are missing completely at random. Predictive mean matching is the best method overall when considering extracting the correct number of factors and estimating factor loadings without bias, although 2-stage estimation was a close second.

  2. Small Mammal Sampling in Mortandad and Los Alamos Canyons, 2005

    International Nuclear Information System (INIS)

    Kathy Bennett; Sherri Sherwood; Rhonda Robinson

    2006-01-01

    As part of an ongoing ecological field investigation at Los Alamos National Laboratory, a study was conducted that compared measured contaminant concentrations in sediment to population parameters for small mammals in the Mortandad Canyon watershed. Mortandad Canyon and its tributary canyons have received contaminants from multiple solid waste management units and areas of concern since establishment of the Laboratory in the 1940s. The study included three reaches within Effluent and Mortandad canyons (E-1W, M-2W, and M-3) that had a spread in the concentrations of metals and radionuclides and included locations where polychlorinated biphenyls and perchlorate had been detected. A reference location, reach LA-BKG in upper Los Alamos Canyon, was also included in the study for comparison purposes. A small mammal study was initiated to assess whether potential adverse effects were evident in Mortandad Canyon due to the presence of contaminants, designated as contaminants of potential ecological concern, in the terrestrial media. Study sites, including the reference site, were sampled in late July/early August. Species diversity and the mean daily capture rate were the highest for E-1W reach and the lowest for the reference site. Species composition among the three reaches in Mortandad was similar with very little overlap with the reference canyon. Differences in species composition and diversity were most likely due to differences in habitat. Sex ratios, body weights, and reproductive status of small mammals were also evaluated. However, small sample sizes of some species within some sites affected the analysis. Ratios of males to females by species of each site (n = 5) were tested using a Chi-square analysis. No differences were detected. Where there was sufficient sample size, body weights of adult small mammals were compared between sites. No differences in body weights were found. Reproductive status of species appears to be similar across sites. However, sample

  3. Small Mammal Sampling in Mortandad and Los Alamos Canyons, 2005

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Kathy; Sherwood, Sherri; Robinson, Rhonda

    2006-08-15

    As part of an ongoing ecological field investigation at Los Alamos National Laboratory, a study was conducted that compared measured contaminant concentrations in sediment to population parameters for small mammals in the Mortandad Canyon watershed. Mortandad Canyon and its tributary canyons have received contaminants from multiple solid waste management units and areas of concern since establishment of the Laboratory in the 1940s. The study included three reaches within Effluent and Mortandad canyons (E-1W, M-2W, and M-3) that had a spread in the concentrations of metals and radionuclides and included locations where polychlorinated biphenyls and perchlorate had been detected. A reference location, reach LA-BKG in upper Los Alamos Canyon, was also included in the study for comparison purposes. A small mammal study was initiated to assess whether potential adverse effects were evident in Mortandad Canyon due to the presence of contaminants, designated as contaminants of potential ecological concern, in the terrestrial media. Study sites, including the reference site, were sampled in late July/early August. Species diversity and the mean daily capture rate were the highest for E-1W reach and the lowest for the reference site. Species composition among the three reaches in Mortandad was similar with very little overlap with the reference canyon. Differences in species composition and diversity were most likely due to differences in habitat. Sex ratios, body weights, and reproductive status of small mammals were also evaluated. However, small sample sizes of some species within some sites affected the analysis. Ratios of males to females by species of each site (n = 5) were tested using a Chi-square analysis. No differences were detected. Where there was sufficient sample size, body weights of adult small mammals were compared between sites. No differences in body weights were found. Reproductive status of species appears to be similar across sites. However, sample

  4. CT dose survey in adults: what sample size for what precision?

    International Nuclear Information System (INIS)

    Taylor, Stephen; Muylem, Alain van; Howarth, Nigel; Gevenois, Pierre Alain; Tack, Denis

    2017-01-01

To determine the variability of volume computed tomographic dose index (CTDIvol) and dose-length product (DLP) data, and to propose a minimum sample size to achieve an expected precision. CTDIvol and DLP values of 19,875 consecutive CT acquisitions of abdomen (7268), thorax (3805), lumbar spine (3161), cervical spine (1515) and head (4106) were collected in two centers. Their variabilities were investigated according to sample size (10 to 1000 acquisitions) and patient body weight categories (no weight selection, 67-73 kg and 60-80 kg). The 95 % confidence interval in percentage of their median (CI95/med) value was calculated for increasing sample sizes. We deduced the sample size that set a 95 % CI lower than 10 % of the median (CI95/med ≤ 10 %). The sample size ensuring CI95/med ≤ 10 % ranged from 15 to 900 depending on the body region and the dose descriptor considered. In sample sizes recommended by regulatory authorities (i.e., from 10-20 patients), the mean CTDIvol and DLP of one sample ranged from 0.50 to 2.00 times their actual values extracted from 2000 samples. The sampling error in CTDIvol and DLP means is high in dose surveys based on small samples of patients. Sample size should be increased at least tenfold to decrease this variability. (orig.)
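A CI95/med-style precision curve can be approximated by resampling: draw many samples of a given size from the pooled dose values, take the central 95% spread of their means, and normalize by the median. A hedged sketch using synthetic skewed values in place of real CTDIvol/DLP data (the exact CI construction in the paper may differ):

```python
import random

def ci95_over_median(values, sample_size, n_resamples=2000, seed=0):
    """Empirical width of the central 95% spread of sample means, expressed
    as a percentage of the pooled median, for a given sample size."""
    rng = random.Random(seed)
    values = sorted(values)
    median = values[len(values) // 2]
    means = sorted(
        sum(rng.sample(values, sample_size)) / sample_size
        for _ in range(n_resamples)
    )
    lo = means[int(0.025 * n_resamples)]
    hi = means[int(0.975 * n_resamples)]
    return 100.0 * (hi - lo) / median
```

Sweeping sample_size upward reproduces the paper's qualitative finding: the spread shrinks roughly with the square root of the sample size, so regulator-recommended samples of 10-20 patients leave a very wide interval around the mean dose.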

  5. CT dose survey in adults: what sample size for what precision?

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Stephen [Hopital Ambroise Pare, Department of Radiology, Mons (Belgium); Muylem, Alain van [Hopital Erasme, Department of Pneumology, Brussels (Belgium); Howarth, Nigel [Clinique des Grangettes, Department of Radiology, Chene-Bougeries (Switzerland); Gevenois, Pierre Alain [Hopital Erasme, Department of Radiology, Brussels (Belgium); Tack, Denis [EpiCURA, Clinique Louis Caty, Department of Radiology, Baudour (Belgium)

    2017-01-15

To determine the variability of volume computed tomographic dose index (CTDIvol) and dose-length product (DLP) data, and to propose a minimum sample size to achieve an expected precision. CTDIvol and DLP values of 19,875 consecutive CT acquisitions of abdomen (7268), thorax (3805), lumbar spine (3161), cervical spine (1515) and head (4106) were collected in two centers. Their variabilities were investigated according to sample size (10 to 1000 acquisitions) and patient body weight categories (no weight selection, 67-73 kg and 60-80 kg). The 95 % confidence interval in percentage of their median (CI95/med) value was calculated for increasing sample sizes. We deduced the sample size that set a 95 % CI lower than 10 % of the median (CI95/med ≤ 10 %). The sample size ensuring CI95/med ≤ 10 % ranged from 15 to 900 depending on the body region and the dose descriptor considered. In sample sizes recommended by regulatory authorities (i.e., from 10-20 patients), the mean CTDIvol and DLP of one sample ranged from 0.50 to 2.00 times their actual values extracted from 2000 samples. The sampling error in CTDIvol and DLP means is high in dose surveys based on small samples of patients. Sample size should be increased at least tenfold to decrease this variability. (orig.)

  6. The impact of sample size on the reproducibility of voxel-based lesion-deficit mappings.

    Science.gov (United States)

    Lorca-Puls, Diego L; Gajardo-Vidal, Andrea; White, Jitrachote; Seghier, Mohamed L; Leff, Alexander P; Green, David W; Crinion, Jenny T; Ludersdorfer, Philipp; Hope, Thomas M H; Bowman, Howard; Price, Cathy J

    2018-07-01

This study investigated how sample size affects the reproducibility of findings from univariate voxel-based lesion-deficit analyses (e.g., voxel-based lesion-symptom mapping and voxel-based morphometry). Our effect of interest was the strength of the mapping between brain damage and speech articulation difficulties, as measured in terms of the proportion of variance explained. First, we identified a region of interest by searching on a voxel-by-voxel basis for brain areas where greater lesion load was associated with poorer speech articulation using a large sample of 360 right-handed English-speaking stroke survivors. We then randomly drew thousands of bootstrap samples from this data set that included either 30, 60, 90, 120, 180, or 360 patients. For each resample, we recorded effect size estimates and p values after conducting exactly the same lesion-deficit analysis within the previously identified region of interest and holding all procedures constant. The results show (1) how often small effect sizes in a heterogeneous population fail to be detected; (2) how effect size and its statistical significance vary with sample size; (3) how low-powered studies (due to small sample sizes) can greatly over-estimate as well as under-estimate effect sizes; and (4) how large sample sizes (N ≥ 90) can yield highly significant p values even when effect sizes are so small that they become trivial in practical terms. The implications of these findings for interpreting the results from univariate voxel-based lesion-deficit analyses are discussed. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  7. Support vector regression to predict porosity and permeability: Effect of sample size

    Science.gov (United States)

    Al-Anazi, A. F.; Gates, I. D.

    2012-02-01

Porosity and permeability are key petrophysical parameters obtained from laboratory core analysis. Cores, obtained from drilled wells, are often few in number for most oil and gas fields. Porosity and permeability correlations based on conventional techniques such as linear regression or neural networks trained with core and geophysical logs suffer from poor generalization to wells with only geophysical logs. The generalization problem of correlation models often becomes pronounced when the training sample size is small. This is attributed to the underlying assumption that conventional techniques employing the empirical risk minimization (ERM) inductive principle converge asymptotically to the true risk values as the number of samples increases. In small sample size estimation problems, the available training samples must span the complexity of the parameter space so that the model is able both to match the available training samples reasonably well and to generalize to new data. This is achieved using the structural risk minimization (SRM) inductive principle by matching the capability of the model to the available training data. One method that uses SRM is the support vector regression (SVR) network. In this research, the capability of SVR to predict porosity and permeability in a heterogeneous sandstone reservoir under the effect of small sample size is evaluated. Particularly, the impact of Vapnik's ɛ-insensitivity loss function and least-modulus loss function on generalization performance was empirically investigated. The results are compared to the multilayer perceptron (MLP) neural network, a widely used regression method, which operates under the ERM principle. The mean square error and correlation coefficients were used to measure the quality of predictions. The results demonstrate that SVR yields consistently better predictions of the porosity and permeability with small sample size than the MLP method. 
Also, the performance of SVR depends on both kernel function

  8. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, confidence level, expected proportion of the outcome variable (for categorical variables) or standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied for health related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
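The sample-size calculation for a categorical outcome described in this record follows the standard formula n = z²·p(1−p)/d². A minimal sketch, with the usual z values and the textbook finite-population correction:

```python
import math

def sample_size_proportion(p, margin, confidence=0.95, population=None):
    """Minimum n to estimate a proportion p to within +/- margin."""
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]  # two-sided z value
    n = z ** 2 * p * (1 - p) / margin ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)   # finite population correction
    return math.ceil(n)

print(sample_size_proportion(0.5, 0.05))                   # 385
print(sample_size_proportion(0.5, 0.05, population=2000))  # 323
print(sample_size_proportion(0.5, 0.03))                   # 1068: more precision, larger n
```

Note how halving the margin of accuracy roughly quadruples the required n, which is the "greater precision, larger sample" trade-off stated in the abstract.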

  9. Basic distribution free identification tests for small size samples of environmental data

    International Nuclear Information System (INIS)

    Federico, A.G.; Musmeci, F.

    1998-01-01

Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples contain few data points, and the assumption of normal distributions is often unrealistic. On the other hand, the spread of today's powerful personal computers opens new opportunities based on a massive use of CPU resources. The paper reviews the problem and introduces the feasibility of two non-parametric approaches based on intrinsic equiprobability properties of the data samples. The first is based on full resampling, while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study based on the Chernobyl children contamination data is given.
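The "full resampling" approach mentioned in this record is, in essence, a permutation test: under the null hypothesis every relabeling of the pooled data is equiprobable. A minimal numpy sketch with invented small samples:

```python
import numpy as np

def permutation_test(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference of means; no normality
    assumption, suitable for small environmental samples."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = abs(np.mean(x) - np.mean(y))
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(pooled[:len(x)].mean() - pooled[len(x):].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)   # conservative add-one p-value

# Invented small samples (n = 5 each), purely for illustration
a = np.array([5.1, 4.8, 5.6, 5.0, 5.3])
b = np.array([6.0, 6.4, 5.9, 6.2, 6.1])
print(permutation_test(a, b))   # small p: the samples likely differ
```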

  10. Optimizing the triple-axis spectrometer PANDA at the MLZ for small samples and complex sample environment conditions

    Science.gov (United States)

    Utschick, C.; Skoulatos, M.; Schneidewind, A.; Böni, P.

    2016-11-01

The cold-neutron triple-axis spectrometer PANDA at the neutron source FRM II has been serving an international user community studying condensed matter physics problems. We report on a new setup, improving the signal-to-noise ratio for small samples and pressure cell setups. Analytical and numerical Monte Carlo methods are used for the optimization of elliptic and parabolic focusing guides. They are placed between the monochromator and sample positions, and the flux at the sample is compared to the one achieved by standard monochromator focusing techniques. The spot size is 25 times smaller, with a factor of 2 gain in intensity, within the same divergence limits of ±2°. This optional neutron focusing guide shall establish a top-class spectrometer for studying novel exotic properties of matter in combination with more stringent sample environment conditions such as extreme pressures associated with small sample sizes.

  11. 77 FR 30227 - Small Business Size Regulations, Small Business Innovation Research (SBIR) Program and Small...

    Science.gov (United States)

    2012-05-22

    ... Business Size Regulations, Small Business Innovation Research (SBIR) Program and Small Business Technology... public Webinar and Roundtable Meetings regarding its proposal to amend its regulations governing size and eligibility for the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR...

  12. Constrained statistical inference: sample-size tables for ANOVA and regression

    Directory of Open Access Journals (Sweden)

    Leonard eVanbrabant

    2015-01-01

Full Text Available Researchers in the social and behavioral sciences often have clear expectations about the order/direction of the parameters in their statistical model. For example, a researcher might expect that regression coefficient beta1 is larger than beta2 and beta3. The corresponding hypothesis is H: beta1 > {beta2, beta3}, and this is known as an order-constrained hypothesis. A major advantage of testing such a hypothesis is that power can be gained, so a smaller sample size is inherently needed. This article discusses this gain in sample size reduction when an increasing number of constraints is included in the hypothesis. The main goal is to present sample-size tables for constrained hypotheses. A sample-size table contains the necessary sample size at a prespecified power (say, 0.80) for an increasing number of constraints. To obtain sample-size tables, two Monte Carlo simulations were performed, one for ANOVA and one for multiple regression. Three results are salient. First, in an ANOVA the needed sample size decreases by 30% to 50% when complete ordering of the parameters is taken into account. Second, small deviations from the imposed order have only a minor impact on the power. Third, at the maximum number of constraints, the linear regression results are comparable with the ANOVA results. However, in the case of fewer constraints, ordering the parameters (e.g., beta1 > beta2) results in a higher power than assigning a positive or a negative sign to the parameters (e.g., beta1 > 0).
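Reproducing the constrained ANOVA/regression tables requires specialized software, but the simplest special case, a single order constraint on two groups (i.e., a directional test), already shows the power gain by Monte Carlo. A sketch assuming scipy is available; the effect size and group size are illustrative choices:

```python
import numpy as np
from scipy import stats

def power(n, delta, alpha=0.05, one_sided=False, n_sim=4000, seed=1):
    """Monte Carlo power of a two-group t-test, effect size delta in SD units."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(delta, 1.0, n)
        y = rng.normal(0.0, 1.0, n)
        t, p = stats.ttest_ind(x, y)
        if one_sided:                         # impose the constraint mu_x > mu_y
            p = p / 2 if t > 0 else 1 - p / 2
        hits += p < alpha
    return hits / n_sim

print(power(40, 0.5))                  # unconstrained (two-sided)
print(power(40, 0.5, one_sided=True))  # one constraint: noticeably higher power
```

Equivalently, the same power can be reached with fewer subjects once the direction is constrained, which is the sample-size reduction the article tabulates for richer designs.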

  13. 76 FR 63216 - Small Business Size Standards: Information

    Science.gov (United States)

    2011-10-12

    ... Federal small business assistance, SBA establishes small business definitions (referred to as size... business definition or size standard established by the SBA Administrator. The SBA considers as part of its... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG26 Small Business Size Standards...

  14. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous
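The method-of-moments (Matheron) estimator this record discusses can be sketched in a few lines. The simulated field, noise level, and bin edges below are arbitrary illustrative choices, not the study's throughfall data:

```python
import numpy as np

def empirical_variogram(coords, values, bins):
    """Method-of-moments (Matheron) semivariogram estimate with distance bins."""
    n = len(values)
    i, j = np.triu_indices(n, k=1)            # all unordered point pairs
    d = np.hypot(*(coords[i] - coords[j]).T)  # pairwise distances
    sq = 0.5 * (values[i] - values[j]) ** 2   # semivariance contributions
    which = np.digitize(d, bins)
    return np.array([sq[which == b].mean() for b in range(1, len(bins))])

rng = np.random.default_rng(5)
coords = rng.uniform(0, 50, size=(150, 2))    # 150 sampling points on a 50 m plot
values = np.sin(coords[:, 0] / 10) + 0.3 * rng.normal(size=150)  # correlated field + nugget
bins = np.array([0, 5, 10, 20, 40])
print(empirical_variogram(coords, values, bins))  # semivariance rises with distance
```

The robust (e.g., Cressie-Hawkins) and residual-maximum-likelihood estimators compared in the study replace the squared-difference average above with less outlier-sensitive alternatives.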

  15. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency
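The two ingredients of DS named in this record (deterministic selection of sample values plus random permutation) fit in a few lines. The Exp(1) target distribution below is an arbitrary choice for illustration:

```python
import numpy as np

def descriptive_sample(ppf, n, rng):
    """Descriptive sampling: deterministic stratum-midpoint values of the
    inverse CDF, then a random permutation (for pairing between variables)."""
    u = (np.arange(n) + 0.5) / n     # midpoints of n equal-probability strata
    values = ppf(u)                  # deterministic selection of sample values
    rng.shuffle(values)              # random permutation
    return values

rng = np.random.default_rng(0)
n = 200
exp_ppf = lambda u: -np.log1p(-u)    # inverse CDF of an Exp(1) variable
ds = descriptive_sample(exp_ppf, n, rng)
mc = -np.log1p(-rng.random(n))       # crude Monte Carlo sample for comparison
print(ds.mean(), mc.mean())          # true mean is 1; DS pins it almost exactly
```

Because the values themselves are fixed and only their order is random, repeated DS runs of the same size reproduce the marginal distribution exactly, which is why run-to-run variability enters mainly through the sample size choice the paper addresses.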

  16. Impact of sample size on principal component analysis ordination of an environmental data set: effects on eigenstructure

    Directory of Open Access Journals (Sweden)

    Shaukat S. Shahid

    2016-06-01

Full Text Available In this study, we used bootstrap simulation of a real data set to investigate the impact of sample size (N = 20, 30, 40 and 50) on the eigenvalues and eigenvectors resulting from principal component analysis (PCA). For each sample size, 100 bootstrap samples were drawn from an environmental data matrix pertaining to water quality variables (p = 22) of a small data set comprising 55 samples (stations) from which water samples were collected. Because in ecology and environmental sciences the data sets are invariably small owing to the high cost of collection and analysis of samples, we restricted our study to relatively small sample sizes. We focused attention on comparison of the first 6 eigenvectors and first 10 eigenvalues. Data sets were compared using agglomerative cluster analysis using Ward’s method, which does not require any stringent distributional assumptions.
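The bootstrap design of this record can be sketched directly: resample stations with replacement at each sample size and watch the spread of the leading eigenvalue. The data below are simulated (55 rows, 8 correlated variables are assumed dimensions, not the study's 22-variable matrix):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for the water-quality matrix: 55 stations,
# 8 variables driven by 2 latent factors plus noise (all assumed).
latent = rng.normal(size=(55, 2))
data = latent @ rng.normal(size=(2, 8)) + 0.3 * rng.normal(size=(55, 8))

def first_eigenvalue(x):
    """Leading eigenvalue of the correlation matrix (PCA on standardized data)."""
    r = np.corrcoef(x, rowvar=False)
    return np.linalg.eigvalsh(r)[-1]

def bootstrap_spread(x, n_sub, n_boot=300):
    """SD of the leading eigenvalue over bootstrap samples of size n_sub."""
    vals = [first_eigenvalue(x[rng.choice(len(x), n_sub, replace=True)])
            for _ in range(n_boot)]
    return np.std(vals)

for n in (20, 50):
    print(n, round(bootstrap_spread(data, n), 3))  # spread shrinks as n grows
```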

  17. Size Exclusion HPLC Detection of Small-Size Impurities as a Complementary Means for Quality Analysis of Extracellular Vesicles

    Directory of Open Access Journals (Sweden)

    Tao Huang

    2015-07-01

Full Text Available For extracellular vesicle research, whether for biomarker discoveries or therapeutic applications, it is critical to have high-quality samples. Both microscopy and NanoSight Tracking Analysis (NTA) for size distribution have been used to detect large vesicles. However, there is currently no well-established method that is convenient for routine quality analysis of small-size impurities in vesicle samples. In this paper we report a convenient method, called ‘size-exclusion high-performance liquid chromatography’ (SE-HPLC), alongside NTA and microscopy analysis, to guide and qualify the isolation and processing of vesicles. First, the SE-HPLC analysis was used to detect impurities of small-size proteins during the ultra-centrifugation process of vesicle isolation; it was then employed to test the changes of vesicles under different pH conditions and their integrity after storage. As SE-HPLC is generally accessible in most institutions, it could be used as a routine means to assist researchers in examining the integrity and quality of extracellular vesicles along with other techniques, either during isolation/preparation or for further engineering and storage.

  18. Statistical issues in reporting quality data: small samples and casemix variation.

    Science.gov (United States)

    Zaslavsky, A M

    2001-12-01

To present two key statistical issues that arise in analysis and reporting of quality data. Casemix variation is relevant to quality reporting when the units being measured have differing distributions of patient characteristics that also affect the quality outcome. When this is the case, adjustment using stratification or regression may be appropriate. Such adjustments may be controversial when the patient characteristic does not have an obvious relationship to the outcome. Stratified reporting poses problems for sample size and reporting format, but may be useful when casemix effects vary across units. Although there are no absolute standards of reliability, high reliabilities (interunit F ≥ 10 or reliability ≥ 0.9) are desirable for distinguishing above- and below-average units. When small or unequal sample sizes complicate reporting, precision may be improved using indirect estimation techniques that incorporate auxiliary information, and 'shrinkage' estimation can help to summarize the strength of evidence about units with small samples. With broader understanding of casemix adjustment and methods for analyzing small samples, quality data can be analysed and reported more accurately.
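The 'shrinkage' estimation mentioned in this record can be illustrated with a simplified empirical-Bayes style sketch: per-unit rates are pulled toward the overall rate in proportion to how unreliable they are. The counts are invented, and the between-unit variance estimate is a crude method-of-moments version, not the reference's procedure:

```python
import numpy as np

def shrink_rates(successes, trials):
    """Shrink per-unit rates toward the overall rate, weighting each unit by
    its reliability (between-unit variance vs. its own sampling variance)."""
    raw = successes / trials
    overall = successes.sum() / trials.sum()
    sampling_var = raw * (1 - raw) / trials
    # crude method-of-moments estimate of true between-unit variance
    between = max(raw.var() - sampling_var.mean(), 1e-9)
    weight = between / (between + sampling_var)   # reliability per unit
    return weight * raw + (1 - weight) * overall

successes = np.array([1.0, 45.0, 10.0])
trials = np.array([2.0, 50.0, 100.0])
print(shrink_rates(successes, trials))
# The unit with only 2 cases is pulled strongly toward the overall rate;
# the well-measured units barely move.
```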

  19. Sample size in psychological research over the past 30 years.

    Science.gov (United States)

    Marszalek, Jacob M; Barber, Carolyn; Kohlhart, Julie; Holmes, Cooper B

    2011-04-01

    The American Psychological Association (APA) Task Force on Statistical Inference was formed in 1996 in response to a growing body of research demonstrating methodological issues that threatened the credibility of psychological research, and made recommendations to address them. One issue was the small, even dramatically inadequate, size of samples used in studies published by leading journals. The present study assessed the progress made since the Task Force's final report in 1999. Sample sizes reported in four leading APA journals in 1955, 1977, 1995, and 2006 were compared using nonparametric statistics, while data from the last two waves were fit to a hierarchical generalized linear growth model for more in-depth analysis. Overall, results indicate that the recommendations for increasing sample sizes have not been integrated in core psychological research, although results slightly vary by field. This and other implications are discussed in the context of current methodological critique and practice.

  20. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks

    Directory of Open Access Journals (Sweden)

    Cuicui Zhang

    2014-12-01

Full Text Available Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a trial and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the “small sample size” (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0–1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.

  1. 77 FR 72702 - Small Business Size Standards: Information

    Science.gov (United States)

    2012-12-06

    ... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG26 Small Business Size Standards: Information AGENCY: U.S. Small Business Administration. ACTION: Final rule. SUMMARY: The United States Small Business Administration (SBA) is increasing the receipts based small business size standards for 15...

  2. Effects of sample size on estimates of population growth rates calculated with matrix models.

    Directory of Open Access Journals (Sweden)

    Ian J Fiske

Full Text Available BACKGROUND: Matrix models are widely used to study the dynamics and demography of populations. An important but overlooked issue is how the number of individuals sampled influences estimates of the population growth rate (lambda) calculated with matrix models. Even unbiased estimates of vital rates do not ensure unbiased estimates of lambda: Jensen's Inequality implies that even when the estimates of the vital rates are accurate, small sample sizes lead to biased estimates of lambda due to increased sampling variance. We investigated if sampling variability and the distribution of sampling effort among size classes lead to biases in estimates of lambda. METHODOLOGY/PRINCIPAL FINDINGS: Using data from a long-term field study of plant demography, we simulated the effects of sampling variance by drawing vital rates and calculating lambda for increasingly larger populations drawn from a total population of 3842 plants. We then compared these estimates of lambda with those based on the entire population and calculated the resulting bias. Finally, we conducted a review of the literature to determine the sample sizes typically used when parameterizing matrix models used to study plant demography. CONCLUSIONS/SIGNIFICANCE: We found significant bias at small sample sizes when survival was low (survival = 0.5), and that sampling with a more-realistic inverse J-shaped population structure exacerbated this bias. However our simulations also demonstrate that these biases rapidly become negligible with increasing sample sizes or as survival increases. For many of the sample sizes used in demographic studies, matrix models are probably robust to the biases resulting from sampling variance of vital rates. However, this conclusion may depend on the structure of populations or the distribution of sampling effort in ways that are unexplored. We suggest more intensive sampling of populations when individual survival is low and greater sampling of stages with high elasticities.

  3. Effects of sample size on estimates of population growth rates calculated with matrix models.

    Science.gov (United States)

    Fiske, Ian J; Bruna, Emilio M; Bolker, Benjamin M

    2008-08-28

Matrix models are widely used to study the dynamics and demography of populations. An important but overlooked issue is how the number of individuals sampled influences estimates of the population growth rate (lambda) calculated with matrix models. Even unbiased estimates of vital rates do not ensure unbiased estimates of lambda: Jensen's Inequality implies that even when the estimates of the vital rates are accurate, small sample sizes lead to biased estimates of lambda due to increased sampling variance. We investigated if sampling variability and the distribution of sampling effort among size classes lead to biases in estimates of lambda. Using data from a long-term field study of plant demography, we simulated the effects of sampling variance by drawing vital rates and calculating lambda for increasingly larger populations drawn from a total population of 3842 plants. We then compared these estimates of lambda with those based on the entire population and calculated the resulting bias. Finally, we conducted a review of the literature to determine the sample sizes typically used when parameterizing matrix models used to study plant demography. We found significant bias at small sample sizes when survival was low (survival = 0.5), and that sampling with a more-realistic inverse J-shaped population structure exacerbated this bias. However our simulations also demonstrate that these biases rapidly become negligible with increasing sample sizes or as survival increases. For many of the sample sizes used in demographic studies, matrix models are probably robust to the biases resulting from sampling variance of vital rates. However, this conclusion may depend on the structure of populations or the distribution of sampling effort in ways that are unexplored. We suggest more intensive sampling of populations when individual survival is low and greater sampling of stages with high elasticities.
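The Jensen's Inequality effect described in this record can be demonstrated with a toy two-stage model. The vital rates below are assumptions for illustration, not the study's estimates; lambda is concave in survival here, so estimating survival from few individuals biases lambda downward, and the bias shrinks as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative 2-stage (juvenile/adult) model; rates are assumed, not the study's.
s_j, s_a, f = 0.5, 0.5, 2.0   # juvenile survival, adult survival, fertility

def lam(sj):
    """Leading eigenvalue (lambda) of the 2x2 projection matrix."""
    a = np.array([[0.0, f],
                  [sj,  s_a]])
    return np.max(np.abs(np.linalg.eigvals(a)))

true_lambda = lam(s_j)

def mean_lambda_hat(n, n_sim=4000):
    """Average estimated lambda when juvenile survival is estimated from n plants."""
    return np.mean([lam(rng.binomial(n, s_j) / n) for _ in range(n_sim)])

for n in (10, 50, 250):
    print(n, round(mean_lambda_hat(n) - true_lambda, 4))
# Negative bias at small n (Jensen's Inequality, concave lambda),
# shrinking roughly like 1/n as sampling variance falls.
```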

  4. Exact association test for small size sequencing data.

    Science.gov (United States)

    Lee, Joowon; Lee, Seungyeoun; Jang, Jin-Young; Park, Taesung

    2018-04-20

    Recent statistical methods for next generation sequencing (NGS) data have been successfully applied to identifying rare genetic variants associated with certain diseases. However, most commonly used methods (e.g., burden tests and variance-component tests) rely on large sample sizes. Notwithstanding, due to its-still high cost, NGS data is generally restricted to small sample sizes, that cannot be analyzed by most existing methods. In this work, we propose a new exact association test for sequencing data that does not require a large sample approximation, which is applicable to both common and rare variants. Our method, based on the Generalized Cochran-Mantel-Haenszel (GCMH) statistic, was applied to NGS datasets from intraductal papillary mucinous neoplasm (IPMN) patients. IPMN is a unique pancreatic cancer subtype that can turn into an invasive and hard-to-treat metastatic disease. Application of our method to IPMN data successfully identified susceptible genes associated with progression of IPMN to pancreatic cancer. Our method is expected to identify disease-associated genetic variants more successfully, and corresponding signal pathways, improving our understanding of specific disease's etiology and prognosis.
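The GCMH statistic used in this record generalizes the classic Cochran-Mantel-Haenszel test. The basic 2x2xK form below (with invented strata counts) shows how evidence is pooled across strata; note the paper's exact test evaluates such a statistic against its exact conditional null distribution rather than the asymptotic chi-square implicitly assumed by the plain statistic:

```python
import numpy as np

def cmh_statistic(tables):
    """Cochran-Mantel-Haenszel chi-square (no continuity correction) for a
    list of 2x2 tables [[a, b], [c, d]], one per stratum."""
    num, var = 0.0, 0.0
    for t in tables:
        t = np.asarray(t, dtype=float)
        n = t.sum()
        row1, col1 = t[0].sum(), t[:, 0].sum()
        row2, col2 = t[1].sum(), t[:, 1].sum()
        num += t[0, 0] - row1 * col1 / n            # observed minus expected
        var += row1 * row2 * col1 * col2 / (n ** 2 * (n - 1))
    return num ** 2 / var

# Two small invented strata with the same direction of association
strata = [[[8, 2], [3, 7]],
          [[6, 1], [2, 5]]]
print(cmh_statistic(strata))
```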

  5. Shrinkage-based diagonal Hotelling’s tests for high-dimensional small sample size data

    KAUST Repository

    Dong, Kai

    2015-09-16

DNA sequencing techniques bring novel tools and also statistical challenges to genetic research. In addition to detecting differentially expressed genes, testing the significance of gene sets or pathway analysis has been recognized as an equally important problem. Owing to the “large p, small n” paradigm, the traditional Hotelling’s T2 test suffers from the singularity problem and therefore is not valid in this setting. In this paper, we propose a shrinkage-based diagonal Hotelling’s test for both one-sample and two-sample cases. We also suggest several different ways to derive the approximate null distribution under different scenarios of p and n for our proposed shrinkage-based test. Simulation studies show that the proposed method performs comparably to existing competitors when n is moderate or large, but it is better when n is small. In addition, we analyze four gene expression data sets and they demonstrate the advantage of our proposed shrinkage-based diagonal Hotelling’s test.

  6. Shrinkage-based diagonal Hotelling’s tests for high-dimensional small sample size data

    KAUST Repository

    Dong, Kai; Pang, Herbert; Tong, Tiejun; Genton, Marc G.

    2015-01-01

DNA sequencing techniques bring novel tools and also statistical challenges to genetic research. In addition to detecting differentially expressed genes, testing the significance of gene sets or pathway analysis has been recognized as an equally important problem. Owing to the “large p, small n” paradigm, the traditional Hotelling’s T2 test suffers from the singularity problem and therefore is not valid in this setting. In this paper, we propose a shrinkage-based diagonal Hotelling’s test for both one-sample and two-sample cases. We also suggest several different ways to derive the approximate null distribution under different scenarios of p and n for our proposed shrinkage-based test. Simulation studies show that the proposed method performs comparably to existing competitors when n is moderate or large, but it is better when n is small. In addition, we analyze four gene expression data sets and they demonstrate the advantage of our proposed shrinkage-based diagonal Hotelling’s test.
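A diagonal Hotelling statistic sidesteps the singular covariance matrix by using only per-gene variances; shrinking those variances toward a common target stabilizes them when n is small. A one-sample sketch with a fixed shrinkage weight (the paper estimates the weight from the data, and derives the null distribution, neither of which is attempted here):

```python
import numpy as np

def diagonal_hotelling(x, mu0, alpha=0.5):
    """One-sample diagonal Hotelling-type statistic with per-feature variances
    shrunk toward their common mean (alpha is a fixed shrinkage weight here)."""
    n, p = x.shape
    xbar = x.mean(axis=0)
    s2 = x.var(axis=0, ddof=1)
    s2_shrunk = alpha * s2.mean() + (1 - alpha) * s2   # shrink toward target
    return n * np.sum((xbar - mu0) ** 2 / s2_shrunk)

rng = np.random.default_rng(3)
n, p = 8, 100                      # "large p, small n" setting
null_data = rng.normal(0.0, 1.0, (n, p))
shift_data = rng.normal(0.5, 1.0, (n, p))
print(diagonal_hotelling(null_data, np.zeros(p)))
print(diagonal_hotelling(shift_data, np.zeros(p)))  # much larger under a shift
```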

  7. Mechanical stability of nanoporous metals with small ligament sizes

    International Nuclear Information System (INIS)

    Crowson, Douglas A.; Farkas, Diana; Corcoran, Sean G.

    2009-01-01

    Digital samples of nanoporous gold with small ligament sizes were studied by atomistic simulation using different interatomic potentials that represent varying surface stress values. We predict a surface relaxation driven mechanical instability for these materials. Plastic deformation is induced by the surface stress without external load, related to the combination of the surface stress value and the surface to volume ratio.

  8. Systematic studies of small scintillators for new sampling calorimeter

    Indian Academy of Sciences (India)

    A new sampling calorimeter using very thin scintillators and the multi-pixel photon counter (MPPC) has been proposed to produce better position resolution for the international linear collider (ILC) experiment. As part of this R & D study, small plastic scintillators of different sizes, thickness and wrapping reflectors are ...

  9. Transportable high sensitivity small sample radiometric calorimeter

    International Nuclear Information System (INIS)

    Wetzel, J.R.; Biddle, R.S.; Cordova, B.S.; Sampson, T.E.; Dye, H.R.; McDow, J.G.

    1998-01-01

A new small-sample, high-sensitivity transportable radiometric calorimeter, which can be operated in different modes, contains an electrical calibration method, and can be used to develop secondary standards, will be described in this presentation. The data taken from preliminary tests will be presented to indicate the precision and accuracy of the instrument. The calorimeter and temperature-controlled bath, at present, require only a 30-in. by 20-in. tabletop area. The calorimeter is operated from a laptop computer system using a unique measurement module capable of monitoring all necessary calorimeter signals. The calorimeter can be operated in the normal calorimeter equilibration mode or as a comparison instrument, using twin chambers and an external electrical calibration method. The sample chamber is 0.75 in. (1.9 cm) in diameter by 2.5 in. (6.35 cm) long. This size will accommodate most 238Pu heat standards manufactured in the past. The power range runs from 0.001 W to <20 W. The high end is only limited by sample size

  10. Sample size in qualitative interview studies

    DEFF Research Database (Denmark)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit Kristiane

    2016-01-01

    Sample sizes must be ascertained in qualitative studies like in quantitative studies but not by the same means. The prevailing concept for sample size in qualitative studies is “saturation.” Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose...... the concept “information power” to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the lower amount of participants is needed. We suggest that the size of a sample with sufficient information power...... and during data collection of a qualitative study is discussed....

  11. 76 FR 70667 - Small Business Size Standards: Educational Services

    Science.gov (United States)

    2011-11-15

    ... business assistance, SBA establishes small business size definitions (referred to as size standards) for... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG29 Small Business Size Standards: Educational Services AGENCY: U.S. Small Business Administration. ACTION: Proposed rule. SUMMARY: The U.S...

  12. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

Full Text Available Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result it requires a huge sample size, which makes it computationally expensive, time-consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small-signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (the Single Machine Infinite Bus and the IEEE 16-machine, 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their results at different sample sizes with the IDEAL (conventional) values. Robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS reproduce the IDEAL values starting from a sample size of 100, indicating that about 100 random variables generated using the LHS method are enough to produce reasonable results for practical purposes in small-signal stability applications. LHS also has the smallest variance when the experiment is repeated 100 times, signifying its robustness over the SRS technique. A 100-sample LHS run produces the same result as the conventional method with a 50,000-sample size; this reduced sample size gives LHS a computational speed advantage (about six times) over the conventional method.
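
The LHS-versus-SRS comparison above can be illustrated with a minimal one-dimensional sketch in plain Python: both estimators target the mean of a test function, and repeating the experiment shows the variance reduction LHS provides. The test function, sample size and repetition count are illustrative assumptions, not values from the article.

```python
import random

def srs_sample(n, rng):
    """Simple random sampling: n independent uniforms on [0, 1)."""
    return [rng.random() for _ in range(n)]

def lhs_sample(n, rng):
    """Latin hypercube sampling in 1-D: one point drawn from each of
    n equal-width strata, in shuffled order."""
    points = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(points)
    return points

def estimate_mean(sampler, f, n, reps, rng):
    """Repeat the n-sample Monte Carlo estimate `reps` times."""
    return [sum(f(x) for x in sampler(n, rng)) / n for _ in range(reps)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

rng = random.Random(42)
f = lambda x: x ** 2          # toy stand-in for an eigenvalue response
srs_est = estimate_mean(srs_sample, f, 100, 200, rng)
lhs_est = estimate_mean(lhs_sample, f, 100, 200, rng)
print(variance(srs_est), variance(lhs_est))  # LHS variance is far smaller
```

The stratification is what drives the effect: every stratum of the input range is hit exactly once per run, so run-to-run scatter of the estimate collapses for smooth responses.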

  13. 77 FR 28520 - Small Business Size Regulations, Small Business Innovation Research (SBIR) Program and Small...

    Science.gov (United States)

    2012-05-15

    ... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG46 Small Business Size Regulations, Small Business Innovation Research (SBIR) Program and Small Business Technology Transfer (STTR) Program AGENCY: Small Business Administration. ACTION: Proposed rule. SUMMARY: The U.S. Small Business Administration...

  14. Uniform fabrication of thick SU-8 patterns on small-sized wafers for micro-optics applications

    Science.gov (United States)

    Abada, S.; Reig, B.; Daran, E.; Doucet, JB; Camps, T.; Charlot, S.; Bardinal, V.

    2014-05-01

This paper reports on an alternative method for the precise and uniform fabrication of 100 μm-thick SU-8 microstructures on small-sized or non-circular samples. Standard spin-coating of high-viscosity resists is known to induce large edge beads, leading to an air gap between the mask and the SU-8 photoresist surface during UV photolithography. This results in non-uniform thickness deposition and poor pattern definition, a problem that becomes highly critical for small-sized samples. To overcome it, we have developed a soft thermal imprint method based on the use of nano-imprint equipment and applicable regardless of sample fragility, shape and size (from 2 cm to 6 inches). After final photolithography, the SU-8 pattern thickness variation profile is measured. Thickness uniformity is improved from 30% to 5%, with a 5 μm maximal deviation from the target value over 2 cm-long samples.

  15. 77 FR 39385 - Receipts-Based, Small Business Size Standard

    Science.gov (United States)

    2012-07-03

    .... The NRC is increasing its receipts-based, small business size standard from $6.5 million to $7 million...-based, small business size standard increasing from $6.5 million to $7.0 million. This adjustment is to... regulatory programs. The NRC is increasing its receipts-based, small business size standard from $6.5 million...

  16. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for the measurement uncertainty parameters, the simulation results support two conclusions: (1) the previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) both the optimal verification strategy and the falsification strategy are highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs
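
The attributes side of this problem has a simple closed form when verification is assumed error-free: the probability that a random sample of n items from a population of N catches at least one of d falsified items is hypergeometric. The sketch below (a generic illustration, not the paper's simulation; all numbers are invented) inverts that formula to find the smallest qualifying sample size.

```python
import math

def detection_probability(N, d, n):
    """P(a random sample of n from N contains >= 1 of the d falsified
    items), assuming error-free attributes verification."""
    if n > N - d:
        return 1.0
    return 1.0 - math.comb(N - d, n) / math.comb(N, n)

def min_sample_size(N, d, target):
    """Smallest n whose detection probability reaches `target`."""
    for n in range(1, N + 1):
        if detection_probability(N, d, n) >= target:
            return n
    return N

# e.g. 100 items, 5 falsified, 95% detection goal
print(min_sample_size(100, 5, 0.95))  # smallest n meeting the goal
```

Measurement error is what breaks this clean picture; once verification instruments are imperfect, the simulation approach of the paper takes over.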

  17. 76 FR 69154 - Small Business Size and Status Integrity

    Science.gov (United States)

    2011-11-08

    ... SMALL BUSINESS ADMINISTRATION 13 CFR Parts 121, 124, 125, 126, and 127 RIN 3245-AG23 Small Business Size and Status Integrity AGENCY: U.S. Small Business Administration (SBA). ACTION: Proposed rule... implement provisions of the Small Business Jobs Act of 2010 (Jobs Act) pertaining to small business size and...

  18. TableSim--A program for analysis of small-sample categorical data.

    Science.gov (United States)

    David J. Rugg

    2003-01-01

Documents a computer program for calculating correct P-values of 1-way and 2-way tables when sample sizes are small. The program is written in Fortran 90; the executable code runs in 32-bit Microsoft command-line environments.

  19. Reliable calculation in probabilistic logic: Accounting for small sample size and model uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ferson, S. [Applied Biomathematics, Setauket, NY (United States)

    1996-12-31

A variety of practical computational problems arise in risk and safety assessments, forensic statistics and decision analyses in which the probability of some event or proposition E is to be estimated from the probabilities of a finite list of related subevents or propositions F, G, H, .... In practice, the analyst's knowledge may be incomplete in two ways. First, the probabilities of the subevents may be imprecisely known from statistical estimations, perhaps based on very small sample sizes. Second, relationships among the subevents may be known imprecisely. For instance, there may be only limited information about their stochastic dependencies. Representing probability estimates as interval ranges has been suggested as a way to address the first source of imprecision. A suite of AND, OR and NOT operators defined with reference to the classical Fréchet inequalities permits these probability intervals to be used in calculations that address the second source of imprecision, in many cases in a best possible way. Using statistical confidence intervals as inputs unravels the closure properties of this approach, however, requiring that probability estimates be characterized by a nested stack of intervals for all possible levels of statistical confidence, from a point estimate (0% confidence) to the entire unit interval (100% confidence). The corresponding logical operations implied by convolutive application of the logical operators for every possible pair of confidence intervals reduce by symmetry to a manageably simple level-wise iteration. The resulting calculus can be implemented in software that allows users to compute comprehensive and often level-wise best possible bounds on probabilities for logical functions of events.
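
A minimal sketch of the interval logic described above, using the classical Fréchet bounds for conjunction and disjunction when nothing is known about dependence (function names and example intervals are mine, not from the paper):

```python
def frechet_and(a, b):
    """Bounds on P(A and B) given intervals a = (lo, hi), b = (lo, hi)
    for P(A) and P(B), with no assumption about dependence."""
    return (max(0.0, a[0] + b[0] - 1.0), min(a[1], b[1]))

def frechet_or(a, b):
    """Bounds on P(A or B) under the same assumptions."""
    return (max(a[0], b[0]), min(1.0, a[1] + b[1]))

def frechet_not(a):
    """Bounds on P(not A)."""
    return (1.0 - a[1], 1.0 - a[0])

# Imprecisely known subevent probabilities, e.g. from small samples:
pF = (0.2, 0.4)
pG = (0.5, 0.7)
print(frechet_and(pF, pG))  # (0.0, 0.4)
print(frechet_or(pF, pG))   # (0.5, 1.0)
```

The confidence-stack extension in the abstract would apply these same operators level-wise, once per confidence level, rather than to a single interval pair.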

  20. Test of methods for retrospective activity size distribution determination from filter samples

    International Nuclear Information System (INIS)

    Meisenberg, Oliver; Tschiersch, Jochen

    2015-01-01

Determining the activity size distribution of radioactive aerosol particles requires sophisticated and heavy equipment, which makes measurements at a large number of sites difficult and expensive. Therefore, three methods for retrospective determination of size distributions from aerosol filter samples in the laboratory were tested for their applicability. Extraction into a carrier liquid with subsequent nebulisation showed size distributions with a slight but correctable bias towards larger diameters compared with the original size distribution. Yields on the order of magnitude of 1% could be achieved. Sonication-assisted extraction into a carrier liquid caused a coagulation mode to appear in the size distribution. Sonication-assisted extraction into the air did not show acceptable results due to small yields. The method of extraction into a carrier liquid without sonication was applied to aerosol samples from Chernobyl in order to calculate inhalation dose coefficients for ¹³⁷Cs based on the individual size distribution. The effective dose coefficient is about half of that calculated with a default reference size distribution. - Highlights: • Activity size distributions can be recovered after aerosol sampling on filters. • Extraction into a carrier liquid and subsequent nebulisation is appropriate. • This facilitates the determination of activity size distributions for individuals. • Size distributions from this method can be used for individual dose coefficients. • Dose coefficients were calculated for the workers at the new Chernobyl shelter

  1. Experimental determination of size distributions: analyzing proper sample sizes

    International Nuclear Information System (INIS)

    Buffo, A; Alopaeus, V

    2016-01-01

The measurement of various particle size distributions is a crucial aspect of many applications in the process industry. Size distribution is often related to final product quality, as in crystallization or polymerization. In other cases it is related to the correct evaluation of heat and mass transfer, as well as reaction rates, depending on the interfacial area between the different phases, or to the assessment of yield stresses of polycrystalline metal/alloy samples. The experimental determination of such distributions often involves laborious sampling procedures, and the statistical significance of the outcome is rarely investigated. In this work, we propose a novel rigorous tool, based on inferential statistics, to determine the number of samples needed to obtain reliable measurements of size distribution, according to specific requirements defined a priori. Such methodology can be adopted regardless of the measurement technique used. (paper)
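
The abstract does not give the authors' statistical machinery, but the flavor of an a-priori sample-count requirement can be sketched with the textbook normal-approximation formula for estimating a mean within a stated tolerance (the spread estimate and tolerance below are invented inputs):

```python
import math
from statistics import NormalDist

def samples_needed(sigma, tolerance, confidence=0.95):
    """Normal-approximation sample count so that the sample mean lies
    within `tolerance` of the true mean at the given confidence.
    Assumes a known spread estimate `sigma`; a t-based version would
    iterate, because the quantile then depends on n."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2.0)
    return math.ceil((z * sigma / tolerance) ** 2)

# e.g. spread estimate of 10 units, mean wanted within +/-2 units at 95%
print(samples_needed(10.0, 2.0))  # 97
```

Tightening the tolerance or raising the confidence inflates the count quadratically, which is exactly why an a-priori check like this pays off before a laborious sampling campaign.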

  2. INITIAL PLANETESIMAL SIZES AND THE SIZE DISTRIBUTION OF SMALL KUIPER BELT OBJECTS

    International Nuclear Information System (INIS)

    Schlichting, Hilke E.; Fuentes, Cesar I.; Trilling, David E.

    2013-01-01

The Kuiper Belt is a remnant from the early solar system and its size distribution contains many important constraints that can be used to test models of planet formation and collisional evolution. We show, by comparing observations with theoretical models, that the observed Kuiper Belt size distribution is well matched by coagulation models, which start with an initial planetesimal population with radii of about 1 km, and subsequent collisional evolution. We find that the observed size distribution above R ∼ 30 km is primordial, i.e., it has not been modified by collisional evolution over the age of the solar system, and that the size distribution below R ∼ 30 km has been modified by collisions and that its slope is well matched by collisional evolution models that use published strength laws. We investigate in detail the resulting size distribution of bodies ranging from 0.01 km to 30 km and find that its slope changes several times as a function of radius before approaching the expected value for an equilibrium collisional cascade of material-strength-dominated bodies for R ≲ 0.1 km. Compared to a single power-law size distribution spanning the whole range from 0.01 km to 30 km, we find in general a strong deficit of bodies around R ∼ 10 km and a strong excess of bodies around 2 km in radius. This deficit and excess of bodies are caused by the planetesimal size distribution left over from the runaway growth phase, which left most of the initial mass in small planetesimals while only a small fraction of the total mass was converted into large protoplanets. This excess mass in small planetesimals leaves a permanent signature in the size distribution of small bodies that is not erased after 4.5 Gyr of collisional evolution. Observations of the small Kuiper Belt Object (KBO) size distribution can therefore test whether large KBOs grew as a result of runaway growth and constrain the initial planetesimal sizes. We find that results from recent KBO

  3. Testing of elastomer seals using small-size rigs

    International Nuclear Information System (INIS)

    Leeks, C.W.E.; Dunford, B.; Barnfield, J.H.; Gray, I.L.S.

    1997-01-01

This paper looks at the use of small-size seal leakage test rigs to demonstrate the compliance of full-size container seals with the IAEA Transport Regulations' limits on activity release under normal transport and accident conditions. The detailed requirements of the regulations are discussed, and it is concluded that an appropriate test programme to meet these requirements using only small-size test rigs can normally be set up and carried out on a relatively short time scale. It is important that any small test rig be designed to represent the relevant features of the seal arrangement, and the overall test programme should cover all of the conditions, specified by the regulations, for the type, classification and contents of the container under consideration. The parameters of elastomer O-rings that affect their sealing ability are considered, and those which are amenable to small-scale testing or have to be modelled at full size are identified. Generally, the seals used in leakage tests have to be modelled with a full-size cross-section but can have a reduced peripheral length. (Author)

  4. Sample size calculations for case-control studies

    Science.gov (United States)

This R package can be used to calculate the required sample size for unconditional multivariate analyses of unmatched case-control studies. The sample sizes are for a scalar exposure effect, such as binary, ordinal or continuous exposures. The sample sizes can also be computed for scalar interaction effects. The analyses account for the effects of potential confounder variables that are also included in the multivariate logistic model.
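
The abstract does not reproduce the package's formulas, but the standard two-group calculation for an unmatched case-control study with a binary exposure runs along these lines (a hedged sketch, not the package's code; the exposure prevalence and odds ratio below are invented inputs, and a multivariate analysis adjusting for confounders generally needs more subjects):

```python
import math
from statistics import NormalDist

def cases_needed(p0, odds_ratio, alpha=0.05, power=0.80):
    """Cases needed (= controls, 1:1 design) for an unmatched
    case-control study; p0 is the exposure prevalence in controls.
    Classic two-proportion formula."""
    p1 = odds_ratio * p0 / (1.0 + p0 * (odds_ratio - 1.0))
    pbar = (p0 + p1) / 2.0
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    z_b = NormalDist().inv_cdf(power)
    num = (z_a * math.sqrt(2.0 * pbar * (1.0 - pbar))
           + z_b * math.sqrt(p1 * (1.0 - p1) + p0 * (1.0 - p0))) ** 2
    return math.ceil(num / (p1 - p0) ** 2)

# 20% exposure among controls, target odds ratio of 2
print(cases_needed(0.2, 2.0))  # 172 cases and 172 controls
```

Note the first step converts the odds ratio into the implied exposure prevalence among cases; the rest is an ordinary two-proportion power calculation.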

  5. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using the t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure...
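
As a rough companion to the procedure described above, the familiar equal-cluster-size design effect can be used to inflate an individually randomized sample size and then back out the number of clusters (a simplified sketch using the standard design effect, not the authors' noncentrality-parameter measure; the inputs are illustrative):

```python
import math

def total_with_equal_clusters(n_individual, m, icc):
    """Inflate an individually randomized per-group sample size by the
    design effect DE = 1 + (m - 1) * icc for clusters of equal size m."""
    return math.ceil(n_individual * (1.0 + (m - 1.0) * icc))

def clusters_needed(n_individual, m, icc):
    """Clusters per group given mean cluster size m."""
    return math.ceil(total_with_equal_clusters(n_individual, m, icc) / m)

# e.g. 128 subjects per group without clustering, clusters of 20, ICC 0.05
print(total_with_equal_clusters(128, 20, 0.05))  # 250 subjects per group
print(clusters_needed(128, 20, 0.05))            # 13 clusters per group
```

The article's contribution is the reverse step for unequal clusters: holding the number of clusters fixed and solving for the mean cluster size that restores the equal-cluster power.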

  6. Basic distribution free identification tests for small size samples of environmental data

    Energy Technology Data Exchange (ETDEWEB)

    Federico, A.G.; Musmeci, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-01-01

Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples contain a small number of data points, and the assumption of normal distributions is often not realistic. On the other hand, the diffusion of today's powerful personal computers opens new opportunities based on massive use of CPU resources. The paper reviews the problem and introduces two feasible non-parametric approaches based on the intrinsic equiprobability properties of the data samples. The first is based on full resampling, while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given, based on the Chernobyl children contamination data.
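
The "full resampling" idea mentioned above can be sketched as an exact two-sample permutation test: with small samples, every regrouping of the pooled data can be enumerated, so no normality assumption is needed. (This is a generic illustration, not the program described in the report; the data are made up.)

```python
from itertools import combinations

def exact_permutation_test(a, b):
    """Exact two-sided permutation test on the difference of means.
    Enumerates every way to split the pooled data into groups of
    len(a) and len(b); feasible only for small samples."""
    pooled = a + b
    n_a = len(a)
    observed = abs(sum(a) / n_a - sum(b) / len(b))
    total = sum(pooled)
    splits = list(combinations(range(len(pooled)), n_a))
    count = 0
    for idx in splits:
        sa = sum(pooled[i] for i in idx)
        diff = abs(sa / n_a - (total - sa) / (len(pooled) - n_a))
        if diff >= observed - 1e-12:   # tolerance for float ties
            count += 1
    return count / len(splits)

# Two small, clearly separated samples: p = 2/70, about 0.029
print(exact_permutation_test([1, 2, 3, 4], [10, 11, 12, 13]))
```

The bootstrap variant in the report would instead resample with replacement, trading exactness for applicability to larger samples.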

  7. 77 FR 39442 - Receipts-Based, Small Business Size Standard

    Science.gov (United States)

    2012-07-03

    ... RIN 3150-AJ14 [NRC-2012-0062] Receipts-Based, Small Business Size Standard AGENCY: Nuclear Regulatory... Regulatory Flexibility Act of 1980, as amended. The NRC is proposing to increase its receipts-based, small business size standard from $6.5 million to $7 million to conform to the standard set by the Small Business...

  8. 78 FR 77334 - Small Business Size Standards: Construction

    Science.gov (United States)

    2013-12-23

    ... enrollment in the System of Award Management's (SAM) Dynamic Small Business Search database, and more firms... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG37 Small Business Size Standards: Construction AGENCY: U.S. Small Business Administration. ACTION: Final rule. SUMMARY: The United States Small...

  9. Use of the small gas proportional counters for the carbon-14 measurement of very small samples

    International Nuclear Information System (INIS)

    Sayre, E.V.; Harbottle, G.; Stoenner, R.W.; Otlet, R.L.; Evans, G.V.

    1981-01-01

Two recent developments are relevant: the first is the mass-spectrometric separation of ¹⁴C and ¹²C ions, followed by counting of the ¹⁴C, while the second is the extension of conventional proportional counter operation, using CO₂ as the counting gas, to very small counters and samples. Although the second method is slow (months of counting time are required for 10 mg of carbon), it does not require operator intervention and many samples may be counted simultaneously. It also costs only a fraction of the capital expense of an accelerator installation. The development, construction and operation of suitable small counters are described, and the results of three actual dating studies involving milligram-scale carbon samples are given. None of these could have been carried out if conventional, gram-sized samples had been needed. New installations based on the use of these counters are under construction or in the planning stages at Brookhaven Laboratory, the National Bureau of Standards (USA) and Harwell (UK). The Harwell installation, which is in an advanced stage of construction, is described in outline. The main significance of the small-counter method is that, although it will not suffice to measure the smallest (much less than 10 mg) or oldest samples, it will permit existing radiocarbon laboratories to extend their capability considerably in the direction of smaller samples, at modest expense

  10. Small-sample-worth perturbation methods

    International Nuclear Information System (INIS)

    1985-01-01

It has been assumed that the perturbed region, R_p, is large enough so that: (1) even without a great deal of biasing there is a substantial probability that an average source neutron will enter it; and (2) once having entered, the neutron is likely to make several collisions in R_p during its lifetime. Unfortunately, neither assumption is valid for the typical configurations one encounters in small-sample-worth experiments. In such experiments one measures the reactivity change induced when a very small void in a critical assembly is filled with a sample of some test material. Only a minute fraction of the fission-source neutrons ever gets into the sample and, of those that do, most emerge uncollided. Monte Carlo small-sample perturbation computations are described

  11. Volatile and non-volatile elements in grain-size separated samples of Apollo 17 lunar soils

    International Nuclear Information System (INIS)

    Giovanoli, R.; Gunten, H.R. von; Kraehenbuehl, U.; Meyer, G.; Wegmueller, F.; Gruetter, A.; Wyttenbach, A.

    1977-01-01

Three samples of Apollo 17 lunar soils (75081, 72501 and 72461) were separated into 9 grain-size fractions between 540 and 1 μm mean diameter. In order to detect mineral fractionations caused during the separation procedures, major elements were determined by instrumental neutron activation analyses performed on small aliquots of the separated samples. Twenty elements were measured in each size fraction using instrumental and radiochemical neutron activation techniques. The concentration of the main elements in sample 75081 does not change with grain size. Exceptions are Fe and Ti, which decrease slightly, and Al, which increases slightly, with decreasing grain size. These changes in main-element composition suggest a decrease in ilmenite and an increase in anorthite with decreasing grain size. However, it can be concluded that the mineral composition of the fractions changes by less than a factor of 2. Samples 72501 and 72461 have not yet been analyzed for the main elements. (Auth.)

  12. Three-year-olds obey the sample size principle of induction: the influence of evidence presentation and sample size disparity on young children's generalizations.

    Science.gov (United States)

    Lawson, Chris A

    2014-07-01

    Three experiments with 81 3-year-olds (M=3.62years) examined the conditions that enable young children to use the sample size principle (SSP) of induction-the inductive rule that facilitates generalizations from large rather than small samples of evidence. In Experiment 1, children exhibited the SSP when exemplars were presented sequentially but not when exemplars were presented simultaneously. Results from Experiment 3 suggest that the advantage of sequential presentation is not due to the additional time to process the available input from the two samples but instead may be linked to better memory for specific individuals in the large sample. In addition, findings from Experiments 1 and 2 suggest that adherence to the SSP is mediated by the disparity between presented samples. Overall, these results reveal that the SSP appears early in development and is guided by basic cognitive processes triggered during the acquisition of input. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Does increasing the size of bi-weekly samples of records influence results when using the Global Trigger Tool? An observational study of retrospective record reviews of two different sample sizes.

    Science.gov (United States)

    Mevik, Kjersti; Griffin, Frances A; Hansen, Tonje E; Deilkås, Ellen T; Vonen, Barthold

    2016-04-25

To investigate the impact of increasing the sample of records reviewed bi-weekly with the Global Trigger Tool method to identify adverse events in hospitalised patients. Retrospective observational study. A Norwegian 524-bed general hospital trust. 1920 medical records selected from 1 January to 31 December 2010. Rate, type and severity of adverse events identified in two different sample sizes of records, selected as 10 and 70 records bi-weekly. In the large sample, 1.45 (95% CI 1.07 to 1.97) times more adverse events per 1000 patient days (39.3 adverse events/1000 patient days) were identified than in the small sample (27.2 adverse events/1000 patient days). Hospital-acquired infections were the most common category of adverse events in both samples, and the distributions of the other categories of adverse events did not differ significantly between the samples. The distribution of severity level of adverse events did not differ between the samples. The findings suggest that while the distribution of categories and severity is not dependent on the sample size, the rate of adverse events is. Further studies are needed to conclude whether the optimal sample size may need to be adjusted based on the hospital size in order to detect a more accurate rate of adverse events. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  14. Transformational and transactional leadership: does it work in small to medium-sized enterprises?

    NARCIS (Netherlands)

    Mesu, J.K.

    2013-01-01

    Using a sample of 755 employees who rated 121 supervisors within 50 Dutch small and medium-sized enterprises (SMEs), this dissertation intends to fill in some of the existing gaps in the literature. Firstly, by investigating whether the impact of transformational and transactional leadership extends

  15. Investigation of small and modular-sized fast reactor

    International Nuclear Information System (INIS)

    Kubota, Kenichi; Kawasaki, Nobuchika; Umetsu, Yoichiro; Akatsu, Minoru; Kasai, Shigeo; Konomura, Mamoru; Ichimiya, Masakazu

    2000-06-01

In this paper, the feasibility of the multipurpose small fast reactor, which could meet requirements for various uses of electricity and energy and for flexible siting of power supplies, is discussed on the basis of a survey of the literature on various small reactors. A possibility of economic improvement through the learning effect on fabrication cost is also discussed for the modular-sized reactor, which is expected to serve as a base-load power supply with lower initial investment. (1) Multipurpose small reactor: (a) A small reactor of 10 MWe-150 MWe has potential as a power source for large co-generation, a large island, a middle-sized city, desalination and marine use. (b) Highly passive mechanisms, a long fuel exchange interval, and minimized maintenance activities are required in the design of the multipurpose small reactor. The reactor has high potential for a long fuel exchange interval, since it is relatively easy for an FR to obtain a long-life core. (c) Current designs of small FRs in Japan and the USA (NERI Project) are reviewed to obtain design requirements for the multipurpose small reactor. (2) Modular-sized reactor: (a) For the modular-sized reactor to be competitive with a 3200 MWe twin plant (two large monolithic reactors) at 200 kYen/kWe, the target capital cost of the first-of-a-kind (FOAK) unit is estimated to be 260 kYen/kWe for an 800 MWe module, 280 kYen/kWe for a 400 MWe module and 290 kYen/kWe for a 200 MWe module, taking the learning effect into account. (b) As a result of reviewing the current designs of modular-sized FRs in Japan and the USA (S-PRISM) from the viewpoint of economic improvement, further effort is needed to reach the FOAK target capital cost, since modular-sized FRs inherently require a large amount of material for shielding, vessels and heat exchangers. (author)
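
The learning effect invoked above is conventionally modeled with a Wright-style learning curve, in which each doubling of cumulative production multiplies unit cost by a fixed rate. The 90% rate below is an arbitrary placeholder, not a figure from the report; only the 260 FOAK cost echoes the abstract's units (kYen/kWe).

```python
import math

def unit_cost(foak_cost, n, learning_rate=0.90):
    """Wright learning curve: cost of the n-th unit when each doubling
    of cumulative output multiplies unit cost by `learning_rate`."""
    b = math.log(learning_rate) / math.log(2.0)
    return foak_cost * n ** b

# FOAK at 260; successive modules get cheaper as experience accumulates
for n in (1, 2, 4, 8):
    print(n, round(unit_cost(260.0, n), 1))
```

This is why a fleet of identical modules can undercut a one-off large plant even when the first module is more expensive per kWe.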

  16. 76 FR 27935 - Small Business Size Standards: Transportation and Warehousing

    Science.gov (United States)

    2011-05-13

    ... assistance, SBA establishes small business definitions (referred to as size standards) for private sector... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG08 Small Business Size Standards: Transportation and Warehousing AGENCY: U.S. Small Business Administration. ACTION: Proposed rule. SUMMARY: The U...

  17. Chironomid midges (Diptera, chironomidae) show extremely small genome sizes.

    Science.gov (United States)

    Cornette, Richard; Gusev, Oleg; Nakahara, Yuichi; Shimura, Sachiko; Kikawada, Takahiro; Okuda, Takashi

    2015-06-01

Chironomid midges (Diptera; Chironomidae) are found in various environments from the high Arctic to the Antarctic, including temperate and tropical regions. In many freshwater habitats, members of this family are among the most abundant invertebrates. In the present study, the genome sizes of 25 chironomid species were determined by flow cytometry, and the resulting C-values ranged from 0.07 to 0.20 pg DNA (i.e. from about 68 to 195 Mbp). These genome sizes were uniformly very small and included, to our knowledge, the smallest genome sizes recorded to date among insects. A small proportion of transposable elements and short introns were suggested to contribute to the reduction of genome size in chironomids. We discuss the possible developmental and physiological advantages of having a small genome size and the putative implications for the ecological success of the family Chironomidae.

  18. The causal effect of board size in the performance of small and medium-sized firms

    DEFF Research Database (Denmark)

    Bennedsen, Morten; Kongsted, Hans Christian; Meisner Nielsen, Kasper

    2008-01-01

Empirical studies of large publicly traded firms have shown a robust negative relationship between board size and firm performance. The evidence on small and medium-sized firms is less clear; we show that existing work has been incomplete in analyzing the causal relationship due to weak identification strategies. Using a rich data set of almost 7000 closely held corporations we provide a causal analysis of board size effects on firm performance: We use a novel instrument given by the number of children of the chief executive officer (CEO) of the firms. First, we find a strong positive correlation between family size and board size and show this correlation to be driven by firms where the CEO's relatives serve on the board. Second, we find empirical evidence of a small adverse board size effect driven by the minority of small and medium-sized firms that are characterized by having...

  19. Neuromuscular dose-response studies: determining sample size.

    Science.gov (United States)

    Kopman, A F; Lien, C A; Naguib, M

    2011-02-01

    Investigators planning dose-response studies of neuromuscular blockers have rarely used a priori power analysis to determine the minimal sample size their protocols require. Institutional Review Boards and peer-reviewed journals now generally ask for this information. This study outlines a proposed method for meeting these requirements. The slopes of the dose-response relationships of eight neuromuscular blocking agents were determined using regression analysis. These values were substituted for γ in the Hill equation. When this is done, the coefficient of variation (COV) around the mean value of the ED₅₀ for each drug is easily calculated. Using these values, we performed an a priori one-sample two-tailed t-test of the means to determine the required sample size when the allowable error in the ED₅₀ was varied from ±10% to ±20%. The COV averaged 22% (range 15-27%). We used a COV value of 25% in determining the sample size. If the allowable error in finding the mean ED₅₀ is ±15%, a sample size of 24 is needed to achieve a power of 80%. Increasing 'accuracy' beyond this point requires increasingly larger sample sizes (e.g. an 'n' of 37 for a ±12% error). On the basis of the results of this retrospective analysis, a total sample size of not less than 24 subjects should be adequate for determining a neuromuscular blocking drug's clinical potency with a reasonable degree of assurance.
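The abstract's a priori one-sample two-tailed t-test calculation can be sketched as follows. This is a reconstruction under stated assumptions (two-tailed α = 0.05, power 80%, COV treated as the relative standard deviation of the ED₅₀ estimate; the function name is chosen here), iterating because the t quantiles depend on the degrees of freedom:

```python
from scipy import stats

def n_for_ed50(cov, rel_error, alpha=0.05, power=0.80, max_n=1000):
    """Smallest n for which a one-sample two-tailed t-test pins down the
    mean ED50 to within +/- rel_error, given coefficient of variation cov.
    Iterative, since the t quantiles depend on df = n - 1."""
    for n in range(2, max_n + 1):
        df = n - 1
        t_crit = stats.t.ppf(1 - alpha / 2, df)  # two-tailed critical value
        t_pow = stats.t.ppf(power, df)           # quantile for the power target
        if n >= ((t_crit + t_pow) * cov / rel_error) ** 2:
            return n
    raise ValueError("no n <= max_n achieves the requested precision")

print(n_for_ed50(0.25, 0.15))  # 24 subjects for a +/-15% allowable error
print(n_for_ed50(0.25, 0.12))  # 37 subjects for a +/-12% allowable error
```

Both results match the figures quoted in the abstract.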

  20. Estimating Sample Size for Usability Testing

    Directory of Open Access Journals (Sweden)

    Alex Cazañas

    2017-02-01

    Full Text Available One strategy used to assure that an interface meets user requirements is to conduct usability testing. When conducting such testing, one of the unknowns is sample size. Since extensive testing is costly, minimizing the number of participants can contribute greatly to successful resource management of a project. Even though a significant number of models have been proposed to estimate sample size in usability testing, there is still no consensus on the optimal size. Several studies claim that 3 to 5 users suffice to uncover 80% of problems in a software interface. However, many other studies challenge this assertion. This study analyzed data collected from the user testing of a web application to verify the rule of thumb, commonly known as the “magic number 5”. The outcomes of the analysis showed that the 5-user rule significantly underestimates the required sample size to achieve reasonable levels of problem detection.
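The "magic number 5" rests on the classic binomial discovery model, which is easy to sketch. The p values below are illustrative figures from the usability literature, not from this study:

```python
def proportion_found(p: float, n: int) -> float:
    """Expected proportion of usability problems uncovered by n users,
    under the classic binomial model: 1 - (1 - p)**n, where p is the
    mean per-user probability of detecting any given problem."""
    return 1 - (1 - p) ** n

# With the often-quoted p = 0.31, five users find ~84% of problems;
# with harder-to-detect problems (p = 0.10), five users find only ~41%,
# which is one way the 5-user rule breaks down.
print(round(proportion_found(0.31, 5), 2))  # 0.84
print(round(proportion_found(0.10, 5), 2))  # 0.41
```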

  1. Involvement of small and medium-sized enterprises (SMEs) in elaborating and implementing public policies: Case study - Romanian small and medium-sized enterprises

    Directory of Open Access Journals (Sweden)

    Popescu Ruxandra

    2017-07-01

    Full Text Available Involvement and development of the private sector in boosting the economy nationwide is a main objective of the current government program, which means that good cooperation between small and medium-sized companies, private companies and multinationals, and the public environment, including both public institutions and the policies they develop and implement, becomes more than necessary. The paper summarizes the findings of a quantitative research study based on a self-administered questionnaire aimed at Romanian small and medium-sized enterprises, and of a qualitative study that gives an overview of the process of elaborating and implementing a public policy. The involvement of small and medium-sized enterprises in the process of designing and implementing a public policy can become indispensable, but there is, in fact, a well-known lack of initiative at this level from both parties. One of the main research questions of this paper is how much representatives of small and medium-sized enterprises get involved in the process of elaborating a public policy, and how much these actions and measures impact the organizational policies of the companies themselves. Good cooperation between the business environment and public institutions, and a strong correlation of their joint efforts, should become common practice for both parties, and it is crucial that this form of cooperation be initiated from the very beginning. The contribution of this paper is a practical one, given that it entails the direct responses of small and medium-sized enterprises on the current and future public policies that directly target them, and provides an analysis of the effects of public policies on small and medium-sized enterprises. The paper can thus also serve as a guide for small and medium-sized enterprises, providing examples and measures of involvement and favorable public policies

  2. Sample Size Determination for One- and Two-Sample Trimmed Mean Tests

    Science.gov (United States)

    Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng

    2008-01-01

    Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…

  3. 76 FR 14323 - Small Business Size Standards: Professional, Scientific and Technical Services

    Science.gov (United States)

    2011-03-16

    ... government small business assistance programs, SBA establishes small business size definitions (referred to... its field of operation and (3) within a specific small business definition or size standard... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG07 Small Business Size Standards...

  4. Sample size determination for mediation analysis of longitudinal data.

    Science.gov (United States)

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies, because sufficient statistical power is not only required in grant applications and peer-reviewed publications but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of a longitudinal design. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulations under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., the within-subject correlation): a larger ICC typically required a larger sample size. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the most commonly encountered scenarios in practice have also been published for convenient use. An extensive simulation study showed that the distribution of the product method and the bootstrapping method have superior performance to Sobel's method; the product method is recommended in practice because of its lower computational load compared with the bootstrapping method. An R package has been developed for sample size determination by the product method in longitudinal mediation study design.
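Of the three tests compared, Sobel's method is the simplest to write down; a minimal sketch with made-up path estimates (the function name and the numbers below are illustrative, not from the article):

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel's z statistic for the mediated (indirect) effect a*b,
    with the first-order standard error sqrt(b^2*se_a^2 + a^2*se_b^2)."""
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    return (a * b) / se_ab

# Hypothetical path estimates: X -> M (a, se_a) and M -> Y (b, se_b).
z = sobel_z(a=0.40, se_a=0.10, b=0.30, se_b=0.12)
print(round(z, 2))  # 2.12
```

The low power noted in the abstract stems partly from this normal-theory standard error, which is why the product-distribution and bootstrap approaches fare better.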

  5. 14CO2 analysis of soil gas: Evaluation of sample size limits and sampling devices

    Science.gov (United States)

    Wotte, Anja; Wischhöfer, Philipp; Wacker, Lukas; Rethemeyer, Janet

    2017-12-01

    Radiocarbon (14C) analysis of CO2 respired from soils or sediments is a valuable tool to identify different carbon sources. The collection and processing of the CO2, however, is challenging and prone to contamination. We thus continuously improve our handling procedures and present a refined method for the collection of even small amounts of CO2 in molecular sieve cartridges (MSCs) for accelerator mass spectrometry 14C analysis. Using a modified vacuum rig and an improved desorption procedure, we were able to increase the CO2 recovery from the MSC (95%) as well as the sample throughput compared to our previous study. By processing series of different sample sizes, we show that our MSCs can be used for CO2 samples as small as 50 μg C. The contamination by exogenous carbon determined in these laboratory tests was less than 2.0 μg C from fossil and less than 3.0 μg C from modern sources. Additionally, we tested two sampling devices for the collection of CO2 samples released from soils or sediments, including a respiration chamber and a depth sampler, which are connected to the MSC. We obtained very promising, low process blanks for the entire CO2 sampling and purification procedure of ∼0.004 F14C (equal to 44,000 yrs BP) and ∼0.003 F14C (equal to 47,000 yrs BP). In contrast to previous studies, we observed no isotopic fractionation towards lighter δ13C values during the passive sampling with the depth samplers.
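The quoted blank ages follow from the conventional radiocarbon age equation, age = −8033 · ln(F14C), with the Libby mean life of 8033 years; a quick sketch reproduces the abstract's figures:

```python
import math

def f14c_to_age(f14c: float) -> float:
    """Conventional radiocarbon age (yrs BP) from an F14C value,
    using the Libby mean life of 8033 years: age = -8033 * ln(F14C)."""
    return -8033 * math.log(f14c)

print(round(f14c_to_age(0.004), -3))  # ~44000 yrs BP
print(round(f14c_to_age(0.003), -3))  # ~47000 yrs BP
```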

  6. Development of small-size baking oven

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, Akihisa; Kuwabara, Shigeru; Yamazawa, Yoshitaka; Shigeta, Eiji

    1987-03-01

    In the bakery business, oven fresh bakeries selling fresh bread by installing their own baking ovens at their shops have become popular recently. This article reports the development of a small-size gas baking oven for oven fresh bakeries. The gas convection oven developed recently is based on the structure of the conventional electric convection oven, and uses low pressure gas. The gas oven has an advantage in that the combustion gas contains moisture. The convection oven bakes bread at a baking density approximately 2.5 times that of the radiation oven, so the oven itself can be made smaller. This oven can bake many kinds of bread ranging from croissants to bean-jam buns by gas combustion heat as well as radiation heat from the radiation plates installed at the top of each compartment in the oven. An ultra small air heat type burner was developed to provide stable short flames in order to make the size of the combustion chamber smaller. (20 figs, 2 tabs)

  7. Bivariate Probit Models for Analysing how “Knowledge” Affects Innovation and Performance in Small and Medium Sized Firms

    OpenAIRE

    FARACE, Salvatore; MAZZOTTA, Fernanda

    2011-01-01

    This paper examines the determinants of innovation and its effects on small- and medium-sized firms. We use the data from the OPIS databank, which provides a survey on a representative sample of firms from a province of Southern Italy. We want to study whether small and medium-sized firms can have a competitive advantage using their innovative capabilities, regardless of their sectoral and size limits. The main factor influencing the likelihood of innovation is knowledge, which is acquired...

  8. Small Scale Yielding Correction of Constraint Loss in Small Sized Fracture Toughness Test Specimens

    International Nuclear Information System (INIS)

    Kim, Maan Won; Kim, Min Chul; Lee, Bong Sang; Hong, Jun Hwa

    2005-01-01

    Fracture toughness data in the ductile-brittle transition region of ferritic steels show scatter produced by local sampling effects and specimen geometry dependence, which results from relaxation in crack tip constraint. The ASTM E1921 standard provides a test method to define the median toughness temperature curve, the so-called Master Curve, for the material corresponding to a 1T crack front length, and also defines a reference temperature, T0, at which the median toughness is 100 MPa√m for a 1T size specimen. The ASTM E1921 procedures assume that high constraint, small scale yielding (SSY) conditions prevail at fracture along the crack front. Violation of the SSY assumption occurs most often during tests of smaller specimens. Constraint loss in such cases leads to higher toughness values and thus lower T0 values. When applied to a structure with low constraint geometry, the standard fracture toughness estimates may be strongly over-conservative. Many efforts have been made to adjust for the constraint effect. In this work, we applied a small-scale yielding correction (SSYC) to adjust the constraint loss of 1/3PCVN and PCVN specimens, which are smaller than the 1T specimen, in the fracture toughness Master Curve test
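For reference, the ASTM E1921 Master Curve for the 1T median toughness, with the defining property that K_Jc(med) equals 100 MPa√m at T = T0, can be written as a one-liner (a sketch of the standard curve, not the authors' code):

```python
import math

def kjc_median(temp_c: float, t0_c: float) -> float:
    """ASTM E1921 Master Curve: median fracture toughness (MPa*sqrt(m))
    of a 1T specimen at temperature temp_c (deg C), given the reference
    temperature t0_c: K_Jc(med) = 30 + 70 * exp(0.019 * (T - T0))."""
    return 30 + 70 * math.exp(0.019 * (temp_c - t0_c))

print(kjc_median(-50.0, -50.0))  # 100.0 at T = T0, by definition
```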

  9. The value chain of small-sized energy wood

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, K.; Foehr, J.; Ranta, T. (Lappeenranta Univ. of Technology, Mikkeli (Finland), LUT Energy), Email: kalle.karttunen@lut.fi, Email: jarno.fohr@lut.fi, Email: tapio.ranta@lut.fi; Ahtikoski, A. (The Finnish Forest Research Institute, Rovaniemi (Finland)), Email: anssi.ahtikoski@metla.fi; Valsta, L. (Helsinki Univ. (Finland), Dept. of Forest Economics), Email: lauri.valsta@helsinki.fi

    2009-07-01

    Finland has agreed to increase the share of renewable energy to the level of 38% by the end of 2020. Most of the increase is to be based on bioenergy. According to the National Climate and Energy Strategy, the need for forest biomass will come to more than 20 TWh, or some 10 million cubic meters per year. Energy wood from young stand thinnings is the biomass resource with the most potential at the moment. The purpose of this study was to compare cost differences between forest management incorporating energy wood thinning and forest management based on traditional roundwood thinning. In addition, alternative supply chain costs for small-sized wood were studied. The results of the study show that it is worth considering the following points if the demand and average price for forest chips remain high. 1. Forest-owners: Forest management including energy wood thinning is financially feasible. 2. Supply chain: A terminal chipping chain enables large-scale procurement of small-sized energy wood. 3. Power plants: Currently, subsidies, emission trading, and decreasing pulpwood prices together enable large-scale use of small-sized wood for energy purposes. The value chain of small-sized energy wood in large-scale power plants could be mobilised. (orig.)

  10. Sample size of the reference sample in a case-augmented study.

    Science.gov (United States)

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Information Security in Small and Medium-Sized Companies

    OpenAIRE

    David Kral

    2011-01-01

    Information security doesn’t involve only large organizations. Small and medium-sized companies must closely examine this issue too, because they are increasingly threatened by cyber attacks. Many of them mistakenly believe that the security of their valuable data is sufficient, or that attackers are not interested in them. Existing standards and methodologies for the implementation and management of information security are often hard to transfer to the environment of small and medium-sized bus...

  12. 40 CFR 80.127 - Sample size guidelines.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Sample size guidelines. 80.127 Section 80.127 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Attest Engagements § 80.127 Sample size guidelines. In performing the...

  13. Pen size and parity effects on maternal behaviour of Small-Tail Han sheep.

    Science.gov (United States)

    Lv, S-J; Yang, Y; Dwyer, C M; Li, F-K

    2015-07-01

    The aim of this experiment was to study the effects of pen size and parity on maternal behaviour of twin-bearing Small-Tail Han ewes. A total of 24 ewes were allocated to a 2×2 design (six per pen), with parity (primiparous or multiparous) and pen size (large: 6.0×3.0 m; small: 6.0×1.5 m) as main effects at Linyi University, Shandong Province, China. Behaviour was observed from after parturition until weaning. All ewes were observed for 6 h every 5 days from 0700 to 1000 h and from 1400 to 1700 h. Continuous focal animal sampling was used to quantify the duration of maternal behaviours: sucking, grooming and following as well as the frequency of udder accepting, udder refusing and low-pitched bleating. Oestradiol and cortisol concentrations in the faeces (collected in the morning every 5 days) were detected using EIA kits. All lambs were weighed 24 h after parturition and again at weaning at 35 days of age. The small pen size significantly reduced following (P[…]) behaviour in sheep during lactation. The study is also the first to report on the maternal behaviour of Chinese native sheep breeds (Small-Tail Han sheep), with implications for the production of sheep in China.

  14. 78 FR 37404 - Small Business Size Standards: Support Activities for Mining

    Science.gov (United States)

    2013-06-20

    ... assistance programs, SBA establishes small business size definitions (referred to as size standards) for... million should be the limit of a small business definition and anything larger than that, such as that SBA... Business Act (15 U.S.C. 632(a)) (Act) requires that small business size definitions vary to reflect...

  15. 77 FR 72766 - Small Business Size Standards: Support Activities for Mining

    Science.gov (United States)

    2012-12-06

    ... eligibility for Federal small business assistance, SBA establishes small business size definitions (referred... operation; and (3) within a specific small business definition or size standard established by SBA... to SBA's Administrator the responsibility for establishing small business definitions. The Act also...

  16. Prognosis method to predict small-sized breast cancer affected by fibrocystic disease

    Directory of Open Access Journals (Sweden)

    S. A. Velichko

    2017-01-01

    Full Text Available The purpose of the study is to develop an effective radiological symptom-complex of small-sized breast cancer affected by fibrocystic breast disease by using multivariate statistical methods. Materials and methods. Radiological findings of small-sized breast cancer affected by fibrocystic mastopathy were analyzed in 100 patients with a histologically verified diagnosis. Results. It was revealed that the conventional approach to the analysis of mammograms, based on the detection of the primary, secondary and indirect mammographic signs of small-sized breast cancer, is not effective enough: the sensitivity of mammography is only 62%. Fibrocystic disease and moderate-to-severe sclerosing adenosis make small-sized breast cancer hard to visualize by mammography. A detailed analysis of mammograms allowed us to identify additional manifestations of small-sized breast cancer affected by mastopathy. A computer program for evaluating the risk of small-sized breast cancer and a diagnostic algorithm for detecting it with a sensitivity of 92% were developed.

  17. Small-size biofuel cell on paper.

    Science.gov (United States)

    Zhang, Lingling; Zhou, Ming; Wen, Dan; Bai, Lu; Lou, Baohua; Dong, Shaojun

    2012-05-15

    In this work, we demonstrated a novel paper-based mediator-less and compartment-less biofuel cell (BFC) with small size (1.5 cm × 1.5 cm). Ionic liquid functionalized carbon nanotubes (CNTs-IL) nanocomposite was used as support both for stably confining the anodic biocatalyst (i.e., NAD(+)-dependent glucose dehydrogenase, GDH) for glucose electrooxidation and for facilitating direct electrochemistry of the cathodic biocatalyst (i.e., bilirubin oxidase, BOD) for O(2) electroreduction. Such a BFC provided a simple approach to fabricate low-cost and portable power devices on small-size paper, which can harvest energy from a wide range of commercial beverages containing glucose (e.g., Nescafe instant coffee, Maidong vitamin water, Watermelon fresh juice, and Minute Maid grape juice). This makes the low-cost paper-based biodevice promising for broad energy applications. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size

    Science.gov (United States)

    Kühberger, Anton; Fritz, Astrid; Scherndl, Thomas

    2014-01-01

    Background The p value obtained from a significance test provides no information about the magnitude or importance of the underlying phenomenon. Therefore, additional reporting of effect size is often recommended. Effect sizes are theoretically independent of sample size. Yet this may not hold true empirically: non-independence could indicate publication bias. Methods We investigate whether effect size is independent of sample size in psychological research. We randomly sampled 1,000 psychological articles from all areas of psychological research. We extracted p values, effect sizes, and sample sizes of all empirical papers, calculated the correlation between effect size and sample size, and investigated the distribution of p values. Results We found a negative correlation of r = −.45 [95% CI: −.53; −.35] between effect size and sample size. In addition, we found an inordinately high number of p values just passing the boundary of significance. Additional data showed that neither implicit nor explicit power analysis could account for this pattern of findings. Conclusion The negative correlation between effect size and sample size, and the biased distribution of p values, indicate pervasive publication bias in the entire field of psychology. PMID:25192357
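A reported correlation of this kind is usually accompanied by a confidence interval computed via the Fisher z-transformation; a sketch of that computation follows (the n below is illustrative, and the paper's wider interval may come from a different procedure, e.g. bootstrapping over papers):

```python
import math

def fisher_ci(r: float, n: int, z_crit: float = 1.96):
    """Approximate 95% confidence interval for a Pearson correlation via
    the Fisher z-transformation, with standard error 1/sqrt(n - 3)."""
    z = math.atanh(r)                      # Fisher transform of r
    half = z_crit / math.sqrt(n - 3)       # half-width on the z scale
    return math.tanh(z - half), math.tanh(z + half)

# Illustrative only: r = -.45 with an assumed n of 1000 observations.
lo, hi = fisher_ci(-0.45, 1000)
print(round(lo, 2), round(hi, 2))  # roughly -0.5 to -0.4
```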

  19. [Practical aspects regarding sample size in clinical research].

    Science.gov (United States)

    Vega Ramos, B; Peraza Yanes, O; Herrera Correa, G; Saldívar Toraya, S

    1996-01-01

    Knowledge of the right sample size lets us judge whether the results published in medical papers rest on a suitable design and whether the conclusions follow properly from the statistical analysis. To estimate the sample size we must consider the type I error, the type II error, the variance, the size of the effect, and the significance level and power of the test. To decide which formula to use, we must define the type of study: a prevalence study, a study of means, or a comparative study. In this paper we explain some basic statistical concepts and describe four simple examples of sample size estimation.

  20. Examination on small-sized cogeneration HTGR for developing countries

    International Nuclear Information System (INIS)

    Sakaba, Nariaki; Tachibana, Yukio; Shimakawa, Satoshi; Ohashi, Hirofumi; Sato, Hiroyuki; Yan, Xing; Murakami, Tomoyuki; Ohashi, Kazutaka; Nakagawa, Shigeaki; Goto, Minoru; Ueta, Shohei; Mozumi, Yasuhiro; Imai, Yoshiyuki; Tanaka, Nobuyuki; Okuda, Hiroyuki; Iwatsuki, Jin; Kubo, Shinji; Takada, Shoji; Nishihara, Tetsuo; Kunitomi, Kazuhiko

    2008-03-01

    The small-sized and safe cogeneration High Temperature Gas-cooled Reactor (HTGR) that can be used not only for electric power generation but also for hydrogen production and district heating is considered one of the most promising nuclear reactors for developing countries where sufficient infrastructure such as power grids is not provided. Thus, the small-sized cogeneration HTGR, named High Temperature Reactor 50-Cogeneration (HTR50C), was studied assuming that it should be constructed in developing countries. Specification, equipment configuration, etc. of the HTR50C were determined, and economical evaluation was made. As a result, it was shown that the HTR50C is economically competitive with small-sized light water reactors. (author)

  1. 75 FR 61597 - Small Business Size Standards: Retail Trade

    Science.gov (United States)

    2010-10-06

    ... eligibility for Federal small business assistance programs, SBA establishes small business size definitions... authorizes the SBA Administrator to establish only one definition of small business for an industry. [[Page... SBA's Administrator the responsibility for establishing small business definitions. The Act also...

  2. 77 FR 42211 - Small Business Size Standards: Arts, Entertainment, and Recreation

    Science.gov (United States)

    2012-07-18

    ..., SBA establishes small business definitions (referred to as size standards) for private sector... operated; (2) not dominant in its field of operation; and (3) within a specific small business definition... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG36 Small Business Size Standards: Arts...

  3. Sample size determination and power

    CERN Document Server

    Ryan, Thomas P, Jr

    2013-01-01

    THOMAS P. RYAN, PhD, teaches online advanced statistics courses for Northwestern University and The Institute for Statistics Education in sample size determination, design of experiments, engineering statistics, and regression analysis.

  4. Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.

    Science.gov (United States)

    Fok, Carlotta Ching Ting; Henry, David; Allen, James

    2015-10-01

    Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek, whenever possible, to explore their implications in more general problems that appear in research with small samples but concern all areas of prevention research. This special section comprises two parts: the first aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.

  5. Small Sample Properties of the Wilcoxon Signed Rank Test with Discontinuous and Dependent Observations

    OpenAIRE

    Nadine Chlass; Jens J. Krueger

    2007-01-01

    This Monte-Carlo study investigates the sensitivity of the Wilcoxon signed rank test to certain assumption violations in small samples. Emphasis is put on within-sample dependence, between-sample dependence, and the presence of ties. Our results show that these assumption violations induce severe size distortions and entail power losses. Surprisingly, the consequences vary substantially with other properties the data may display. The results provided are particularly relevant for experimental set...

  6. 75 FR 1296 - Small Business Size Regulations; 8(a) Business Development/Small Disadvantaged Business Status...

    Science.gov (United States)

    2010-01-11

    ... SMALL BUSINESS ADMINISTRATION 13 CFR Parts 121 and 124 Small Business Size Regulations; 8(a) Business Development/Small Disadvantaged Business Status Determinations AGENCY: U.S. Small Business Administration. ACTION: Notice of public meetings; request for comments. SUMMARY: The U.S. Small Business...

  7. Quantifying predictability through information theory: small sample estimation in a non-Gaussian framework

    International Nuclear Information System (INIS)

    Haven, Kyle; Majda, Andrew; Abramov, Rafail

    2005-01-01

    Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short term climate and weather prediction, examples of these issues might involve the lack of information in the historical climate record compared with an ensemble prediction, or the lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify the predictive utility in this information, and recently a systematic computationally feasible hierarchical framework has been developed. In practical systems with many degrees of freedom, computational overhead limits ensemble predictions to relatively small sample sizes. Here the notion of predictive utility, in a relative entropy framework, is extended to small random samples by the definition of a sample utility, a measure of the unlikeliness that a random sample was produced by a given prediction strategy. The sample utility is the minimum predictability, with a statistical level of confidence, which is implied by the data. Two practical algorithms for measuring such a sample utility are developed here. The first technique is based on the statistical method of null-hypothesis testing, while the second is based upon a central limit theorem for the relative entropy of moment-based probability densities. These techniques are tested on known probability densities with parameterized bimodality and skewness, and then applied to the Lorenz '96 model, a recently developed 'toy' climate model with chaotic dynamics mimicking the atmosphere. The results show a detection of non-Gaussian tendencies of prediction densities at small ensemble sizes with between 50 and 100 members, with a 95% confidence level
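The lowest rung of such a moment-based hierarchy is the Gaussian case, for which the relative entropy has a closed form; a minimal sketch in one dimension (this is the standard formula, not the authors' code):

```python
import math

def kl_gaussian(mu_p, var_p, mu_q, var_q):
    """Relative entropy (KL divergence) of P = N(mu_p, var_p) from
    Q = N(mu_q, var_q) in one dimension:
    0.5 * (log(var_q/var_p) + (var_p + (mu_p - mu_q)**2)/var_q - 1)."""
    return 0.5 * (math.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q - 1)

print(kl_gaussian(0.0, 1.0, 0.0, 1.0))            # 0.0: identical densities
print(round(kl_gaussian(1.0, 1.0, 0.0, 1.0), 2))  # 0.5: unit mean shift
```

A prediction's utility relative to climatology is then the relative entropy of the forecast density from the climatological one, here evaluated at the level of first and second moments.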

  8. 78 FR 38811 - Small Business Size and Status Integrity

    Science.gov (United States)

    2013-06-28

    .... Firms will not be able submit offers for small business contracts based on their online representations... SMALL BUSINESS ADMINISTRATION 13 CFR Parts 121, 124, 125, 126, and 127 RIN 3245-AG23 Small Business Size and Status Integrity AGENCY: Small Business Administration. ACTION: Final rule. SUMMARY: This...

  9. Generalized procedures for determining inspection sample sizes (related to quantitative measurements). Vol. 1: Detailed explanations

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1986-11-01

    Generalized procedures have been developed to determine sample sizes in connection with the planning of inspection activities. These procedures are based on different measurement methods. They are applied mainly to Bulk Handling Facilities and Physical Inventory Verifications. The present report attempts (i) to assign to appropriate statistical testers (viz. testers for gross, partial and small defects) the measurement methods to be used, and (ii) to associate the measurement uncertainties with the sample sizes required for verification. Working papers are also provided to assist in the application of the procedures. This volume contains the detailed explanations concerning the above mentioned procedures
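The report's detailed procedures depend on the tester, but the classic simplified attribute-sampling formula n = N(1 − β^(1/r)) conveys the idea: the smallest sample giving detection probability at least 1 − β of catching one of r defective items among N. This is a common safeguards approximation sketched here as an illustration, not necessarily the exact formula tabulated in the report:

```python
import math

def attribute_sample_size(population: int, defects: int, beta: float) -> int:
    """Approximate attribute-sampling size n = ceil(N * (1 - beta**(1/r))):
    smallest n with probability >= 1 - beta of drawing at least one of the
    r defective items in a population of N (hypergeometric approximation)."""
    return math.ceil(population * (1 - beta ** (1.0 / defects)))

# 100 items, 5 assumed defects, 95% desired detection probability:
print(attribute_sample_size(100, 5, 0.05))  # 46
```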

  10. An Improvement to Interval Estimation for Small Samples

    Directory of Open Access Journals (Sweden)

    SUN Hui-Ling

    2017-02-01

Full Text Available Because the probability distribution underlying a small sample is difficult to determine, traditional probability theory is ill-suited to parameter estimation for small samples, and the Bayes Bootstrap method is commonly used in practice. The Bayes Bootstrap method, however, has its own limitations. This article presents an improvement to the Bayes Bootstrap method that enlarges the sample by numerical simulation without altering the conditions of the original small sample, yielding accurate interval estimates for small samples. Finally, Monte Carlo simulation is applied to specific small-sample problems, demonstrating the effectiveness and practicability of the improved Bootstrap method.
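The record does not give the details of the improved method, but the baseline it builds on — bootstrap interval estimation for a small sample — can be sketched as follows (standard percentile bootstrap for the mean; numpy assumed):

```python
import numpy as np

def bootstrap_ci(sample, stat=np.mean, n_boot=10000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for `stat` of a small sample."""
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample, dtype=float)
    # Resample with replacement and recompute the statistic many times.
    idx = rng.integers(0, len(sample), size=(n_boot, len(sample)))
    stats = stat(sample[idx], axis=1)
    return np.quantile(stats, alpha / 2), np.quantile(stats, 1 - alpha / 2)

# Example: a small sample of 8 (hypothetical) measurements.
data = [9.8, 10.1, 10.4, 9.6, 10.0, 10.3, 9.9, 10.2]
lo, hi = bootstrap_ci(data)
```

The improved method described in the abstract would additionally enlarge the sample by numerical simulation before resampling; that step is not reproduced here.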

  11. Sample size calculations for cluster randomised crossover trials in Australian and New Zealand intensive care research.

    Science.gov (United States)

    Arnup, Sarah J; McKenzie, Joanne E; Pilcher, David; Bellomo, Rinaldo; Forbes, Andrew B

    2018-06-01

    The cluster randomised crossover (CRXO) design provides an opportunity to conduct randomised controlled trials to evaluate low risk interventions in the intensive care setting. Our aim is to provide a tutorial on how to perform a sample size calculation for a CRXO trial, focusing on the meaning of the elements required for the calculations, with application to intensive care trials. We use all-cause in-hospital mortality from the Australian and New Zealand Intensive Care Society Adult Patient Database clinical registry to illustrate the sample size calculations. We show sample size calculations for a two-intervention, two 12-month period, cross-sectional CRXO trial. We provide the formulae, and examples of their use, to determine the number of intensive care units required to detect a risk ratio (RR) with a designated level of power between two interventions for trials in which the elements required for sample size calculations remain constant across all ICUs (unstratified design); and in which there are distinct groups (strata) of ICUs that differ importantly in the elements required for sample size calculations (stratified design). The CRXO design markedly reduces the sample size requirement compared with the parallel-group, cluster randomised design for the example cases. The stratified design further reduces the sample size requirement compared with the unstratified design. The CRXO design enables the evaluation of routinely used interventions that can bring about small, but important, improvements in patient care in the intensive care setting.
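A rough sketch of the kind of calculation the tutorial describes, for a two-intervention, two-period, cross-sectional CRXO trial with a binary outcome. The design effect DE = 1 + (m − 1)ρ − mη (cluster-period size m, within-period ICC ρ, between-period within-cluster correlation η) is a commonly quoted CRXO variance inflation factor, but the specific numbers below are hypothetical and the tutorial's own formulae should be preferred for real planning:

```python
import math
from statistics import NormalDist

def n_individual(p1, p2, alpha=0.05, power=0.8):
    """Subjects per arm for a two-proportion comparison (normal approximation)."""
    z = NormalDist().inv_cdf
    return ((z(1 - alpha / 2) + z(power)) ** 2
            * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2)

def crxo_clusters(p1, p2, m, icc, bpc, alpha=0.05, power=0.8):
    """ICUs needed for a two-period cross-sectional CRXO trial.
    m: patients per ICU per period; icc: within-period intracluster correlation;
    bpc: between-period within-cluster correlation."""
    de = 1 + (m - 1) * icc - m * bpc      # assumed CRXO design effect
    n_arm = n_individual(p1, p2, alpha, power) * de
    # every ICU contributes m patients to each intervention (one period each)
    return math.ceil(n_arm / m)

# Hypothetical example: in-hospital mortality 10% vs 9%, 300 patients/ICU/period.
icus = crxo_clusters(0.10, 0.09, m=300, icc=0.03, bpc=0.02)
```

Higher between-period correlation makes the within-ICU crossover comparison more efficient, which is why the CRXO design needs fewer clusters than a parallel cluster-randomised design (setting bpc = 0 recovers the usual cluster-randomised design effect).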

  12. Small and medium-sized nuclear power plants

    International Nuclear Information System (INIS)

    Schmidt, R.

    1986-01-01

    Small and medium-sized nuclear power plants have long been under discussion as possible applications of nuclear power in countries with small transmission grid systems, in threshold countries and developing countries, and under special local supply conditions. IAEA has condensed and promoted this interest and tried to establish the demand, and possibilities of meeting it, in special events and campaigns. In recent years, considerable interest was registered even in industrialized countries, but here specially for heating and process heat generation applications and for special purposes and, in medium-sized units, also for combined supplies of electricity and heat. This corresponds to special reactor and plant concepts, some of which have already been developed to a stage at which construction work could begin. The analysis presented deals with necessary preconditions on the sides of the users and the vendors, with problems of economy, infrastructure and financing and with the market prospects of small nuclear power plants. (orig./HP) [de

  13. The Relationship Between Training and Employment Growth in Small and Medium-Sized Enterprises

    OpenAIRE

    Andy Cosh; Alan Hughes; Melvyn Weeks

    2000-01-01

This paper provides a rigorous analysis of the impact of training upon the employment growth characteristics of small and medium-sized firms. Using appropriate statistical techniques to cope with sample selection biases and heterogeneous employment growth patterns, it reveals that training is positively related to employment growth, in particular when it is embedded in a wider range of human relations practices.

  14. Bank Size and Small- and Medium-sized Enterprise (SME) Lending: Evidence from China

    Science.gov (United States)

    SHEN, YAN; SHEN, MINGGAO; XU, ZHONG; BAI, YING

    2014-01-01

Summary Using panel data collected in 2005, we evaluate how bank size, discretion over credit, incentive schemes, competition, and the institutional environment affect lending to small- and medium-sized enterprises in China. We deal with the endogeneity problem using instrumental variables, and a reduced-form approach is also applied to allow for weak instruments in estimation. We find that total bank assets are an insignificant factor in banks' decisions on small- and medium-enterprise (SME) lending, but more local lending authority, more competition, carefully designed incentive schemes, and stronger law enforcement encourage commercial banks to lend to SMEs. PMID:26052179

  15. Bank Size and Small- and Medium-sized Enterprise (SME) Lending: Evidence from China.

    Science.gov (United States)

    Shen, Yan; Shen, Minggao; Xu, Zhong; Bai, Ying

    2009-04-01

Using panel data collected in 2005, we evaluate how bank size, discretion over credit, incentive schemes, competition, and the institutional environment affect lending to small- and medium-sized enterprises in China. We deal with the endogeneity problem using instrumental variables, and a reduced-form approach is also applied to allow for weak instruments in estimation. We find that total bank assets are an insignificant factor in banks' decisions on small- and medium-enterprise (SME) lending, but more local lending authority, more competition, carefully designed incentive schemes, and stronger law enforcement encourage commercial banks to lend to SMEs.

  16. The role of independent intermediaries. The case of small and medium-sized exporters

    DEFF Research Database (Denmark)

    Madsen, Tage Koed; Moen, Øystein; Hammervold, Randi

    2012-01-01

The article examines how small and medium-sized exporters collaborate with intermediaries in foreign markets by studying the level of control, i.e. the delegation of decision rights and task solution. It goes one step further than previous research, since it examines degrees of control and participation rather than just discrete types of intermediary modes (agents, importers, dealers, etc.). Associations with performance as well as the role of product and distributor characteristics are analyzed. Empirical data based on a sample of product-market ventures in 227 small and medium-sized Norwegian export firms are analysed by a structural equation modelling approach. The article provides empirical evidence that managers keep control of decision making to an extent that may have a negative impact on export performance. The empirical study indicates that firms should participate more in task solution.

  17. Internationalization of small and medium-sized enterprises

    Directory of Open Access Journals (Sweden)

    Zoran Paunović

    2010-06-01

Full Text Available Highly developed countries that are similar to Croatia in size and population highlight the importance of small and medium-sized enterprises (SMEs) as holders of export activities and key factors in raising the competitiveness of the entire economy. In this paper the authors examine the concept of internationalization of SMEs. Analyzing the influence of decision makers on the process of internationalization and showing its advantages and disadvantages for the respective company and country, this research introduces the most common models on the basis of which this process is implemented in practice. A case study of a small export company from Croatia illustrates the process of internationalization to the U.S. market and provides useful information to companies that are planning to enter new markets.

  18. Estimation of individual reference intervals in small sample sizes

    DEFF Research Database (Denmark)

    Hansen, Ase Marie; Garde, Anne Helene; Eller, Nanna Hurwitz

    2007-01-01

    In occupational health studies, the study groups most often comprise healthy subjects performing their work. Sampling is often planned in the most practical way, e.g., sampling of blood in the morning at the work site just after the work starts. Optimal use of reference intervals requires...... from various variables such as gender, age, BMI, alcohol, smoking, and menopause. The reference intervals were compared to reference intervals calculated using IFCC recommendations. Where comparable, the IFCC calculated reference intervals had a wider range compared to the variance component models...

  19. Gaseous radiocarbon measurements of small samples

    International Nuclear Information System (INIS)

    Ruff, M.; Szidat, S.; Gaeggeler, H.W.; Suter, M.; Synal, H.-A.; Wacker, L.

    2010-01-01

Radiocarbon dating by means of accelerator mass spectrometry (AMS) is a well-established method for samples containing carbon in the milligram range. However, the measurement of small samples containing less than 50 μg carbon often fails. It is difficult to graphitise these samples and the preparation is prone to contamination. To avoid graphitisation, a solution can be the direct measurement of carbon dioxide. The MICADAS, the smallest accelerator for radiocarbon dating in Zurich, is equipped with a hybrid Cs sputter ion source. It allows the measurement of both graphite targets and gaseous CO2 samples without any rebuilding. This work presents experiences dealing with small samples containing 1-40 μg carbon. So far, 500 unknown samples from different environmental research fields have been measured, most of them with the gas ion source. These data are compared with earlier measurements of small graphite samples. The performance of the two different techniques is discussed and the main contributions to the blank are determined. An analysis of blank and standard data measured over several years allowed a quantification of the contamination, which was found to be of the order of 55 ng and 750 ng carbon (50 pMC) for the gaseous and the graphite samples, respectively. For quality control, a number of certified standards were measured using the gas ion source to demonstrate the reliability of the data.
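The constant-contamination correction implied by the blank figures in this record can be sketched with a simple mass balance. This is an illustrative model, not necessarily the authors' exact data reduction; the 55 ng and 50 pMC values are taken from the abstract:

```python
def blank_correct(f_meas, m_sample_ug, m_blank_ug=0.055, f_blank=0.50):
    """Constant-contamination correction for a small radiocarbon sample.

    f_meas:      measured fraction modern (F14C) of the sample+blank mix
    m_sample_ug: sample carbon mass in micrograms
    m_blank_ug:  assumed constant contamination mass (55 ng, per the abstract)
    f_blank:     fraction modern of the contamination (50 pMC = 0.50)

    Mass balance: f_meas * (m_s + m_b) = f_true * m_s + f_blank * m_b.
    """
    m_total = m_sample_ug + m_blank_ug
    return (f_meas * m_total - f_blank * m_blank_ug) / m_sample_ug
```

The correction pulls the result away from the blank value, and its size grows rapidly as the sample mass shrinks toward the microgram range, which is why blank quantification dominates the uncertainty for the smallest samples.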

  20. 76 FR 70680 - Small Business Size Standards: Real Estate and Rental and Leasing

    Science.gov (United States)

    2011-11-15

    ... eligibility for Federal small business assistance, SBA establishes small business size definitions (referred... business definition or size standard established by the SBA Administrator. SBA considers as part of its... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG28 Small Business Size Standards: Real...

  1. 77 FR 76215 - Small Business Size Regulations, Small Business Innovation Research (SBIR) Program and Small...

    Science.gov (United States)

    2012-12-27

    ... overall goal of simplification and maximization of benefits for small businesses, SBA proposed amendments... franchisee. F. Section 121.704--When SBA Determines Size and Eligibility SBA's proposed regulations for the...

  2. 76 FR 8221 - Small Business Size Regulations; 8(a) Business Development/Small Disadvantaged Business Status...

    Science.gov (United States)

    2011-02-11

... Vol. 76 Friday, No. 29 February 11, 2011 Part VII Small Business Administration 13 CFR Parts 121 and 124 Small Business Size Regulations; 8(a) Business Development/Small Disadvantaged Business Status... Regulations

  3. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra and Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra and Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.
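The moment-matching idea behind such an adjustment can be sketched in a few lines. A chi-square reference with d degrees of freedom has mean d and skewness sqrt(8/d), so the estimated skewness of the obtained statistic suggests a reference df, onto which the statistic's mean is then rescaled. This illustrates the principle only and is not Lin and Bentler's exact estimator:

```python
import math

def third_moment_adjust(t_stat, mean_t, skew_t):
    """Moment-matched adjustment of a test statistic to a chi-square reference.

    Choose the reference df so its skewness sqrt(8/df) matches the statistic's
    estimated skewness, then rescale the statistic so its mean matches that df.
    (Illustrative moment matching, not the paper's exact estimator.)
    """
    df_adj = 8.0 / skew_t ** 2      # solve skew = sqrt(8/df) for df
    scale = df_adj / mean_t         # mean scaling onto the new reference
    return t_stat * scale, df_adj
```

When the statistic's skewness already matches a chi-square with the nominal df, the adjustment reduces to ordinary mean scaling.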

  4. PIXE–PIGE analysis of size-segregated aerosol samples from remote areas

    Energy Technology Data Exchange (ETDEWEB)

    Calzolai, G., E-mail: calzolai@fi.infn.it [Department of Physics and Astronomy, University of Florence and National Institute of Nuclear Physics (INFN), Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Chiari, M.; Lucarelli, F.; Nava, S.; Taccetti, F. [Department of Physics and Astronomy, University of Florence and National Institute of Nuclear Physics (INFN), Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Becagli, S.; Frosini, D.; Traversi, R.; Udisti, R. [Department of Chemistry, University of Florence, Via della Lastruccia 3, 50019 Sesto Fiorentino (Italy)

    2014-01-01

    The chemical characterization of size-segregated samples is helpful to study the aerosol effects on both human health and environment. The sampling with multi-stage cascade impactors (e.g., Small Deposit area Impactor, SDI) produces inhomogeneous samples, with a multi-spot geometry and a non-negligible particle stratification. At LABEC (Laboratory of nuclear techniques for the Environment and the Cultural Heritage), an external beam line is fully dedicated to PIXE–PIGE analysis of aerosol samples. PIGE is routinely used as a sidekick of PIXE to correct the underestimation of PIXE in quantifying the concentration of the lightest detectable elements, like Na or Al, due to X-ray absorption inside the individual aerosol particles. In this work PIGE has been used to study proper attenuation correction factors for SDI samples: relevant attenuation effects have been observed also for stages collecting smaller particles, and consequent implications on the retrieved aerosol modal structure have been evidenced.

  5. Consultant-Client Relationship and Knowledge Transfer in Small- and Medium-Sized Enterprises Change Processes.

    Science.gov (United States)

    Martinez, Luis F; Ferreira, Aristides I; Can, Amina B

    2016-04-01

Based on Szulanski's knowledge transfer model, this study examined how the communicational, motivational, and sharing of understanding variables influenced knowledge transfer and change processes in small- and medium-sized enterprises, particularly under projects developed by funded programs. The sample comprised 144 entrepreneurs, mostly male (65.3%) and mostly ages 35 to 45 years (40.3%), who completed an online questionnaire measuring the variables of "sharing of understanding," "motivation," "communication encoding competencies," "source credibility," "knowledge transfer," and "organizational change." Data were collected between 2011 and 2012 and measured the relationship between clients and consultants working in a Portuguese small- and medium-sized enterprise-oriented action learning program. To test the hypotheses, structural equation modeling was conducted to identify the antecedents of the sharing of understanding, motivational, and communicational variables, which were positively correlated with the knowledge transfer between consultants and clients. This transfer was also positively correlated with organizational change. Overall, the study provides important considerations for practitioners and academicians and establishes new avenues for future studies concerning the issues of the consultant-client relationship and the efficacy of Government-funded programs designed to improve the performance of small- and medium-sized enterprises. © The Author(s) 2016.

  6. Using Data-Dependent Priors to Mitigate Small Sample Bias in Latent Growth Models: A Discussion and Illustration Using M"plus"

    Science.gov (United States)

    McNeish, Daniel M.

    2016-01-01

    Mixed-effects models (MEMs) and latent growth models (LGMs) are often considered interchangeable save the discipline-specific nomenclature. Software implementations of these models, however, are not interchangeable, particularly with small sample sizes. Restricted maximum likelihood estimation that mitigates small sample bias in MEMs has not been…

  7. Estimation of sample size and testing power (Part 4).

    Science.gov (United States)

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-01-01

Sample size estimation is necessary for any experimental or survey research. An appropriate estimation of sample size based on known information and statistical knowledge is of great significance. This article introduces methods of sample size estimation for difference tests under a design of one factor with two levels, covering both quantitative and qualitative data: it presents the sample size estimation formulas and their realization, both by direct calculation and through the POWER procedure of SAS software. In addition, this article presents worked examples, which will help researchers implement the repetition principle during the research design phase.
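For the simplest case the article covers — one factor with two levels and a quantitative outcome — the standard normal-approximation sample size formula can be sketched directly (the numbers in the test values below are hypothetical; SAS PROC POWER additionally applies the t-distribution refinement):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.90):
    """Sample size per group to detect a mean difference `delta` between two
    groups with common SD `sigma` (two-sided test, normal approximation):
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * sigma^2 / delta^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)
```

For example, detecting a 5-unit difference with SD 10 at 90% power needs 85 subjects per group; dropping to 80% power reduces that to 63.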

  8. 77 FR 11001 - Small Business Size Standards: Health Care and Social Assistance

    Science.gov (United States)

    2012-02-24

    ..., and (3) within a specific small business definition or size standard established by the SBA... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG30 Small Business Size Standards: Health Care and Social Assistance AGENCY: U.S. Small Business Administration. ACTION: Proposed rule. SUMMARY...

  9. 77 FR 55755 - Small Business Size Standards: Agriculture, Forestry, Fishing, and Hunting

    Science.gov (United States)

    2012-09-11

    ... operation; and (3) within a specific small business definition or size standard established by SBA... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG43 Small Business Size Standards: Agriculture, Forestry, Fishing, and Hunting AGENCY: U.S. Small Business Administration. ACTION: Proposed rule...

  10. 76 FR 27952 - Small Business Size Standards: Professional, Scientific and Technical Services.

    Science.gov (United States)

    2011-05-13

    ... Administration (SBA or Agency) proposed to increase small business size standards for 35 industries and one sub... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG07 Small Business Size Standards: Professional, Scientific and Technical Services. AGENCY: U.S. Small Business Administration. ACTION: Proposed...

  11. In Situ Sampling of Relative Dust Devil Particle Loads and Their Vertical Grain Size Distributions.

    Science.gov (United States)

    Raack, Jan; Reiss, Dennis; Balme, Matthew R; Taj-Eddine, Kamal; Ori, Gian Gabriele

    2017-04-19

During a field campaign in the Sahara Desert in southern Morocco, spring 2012, we sampled the vertical grain size distribution of two active dust devils that exhibited different dimensions and intensities. With these in situ samples of grains in the vortices, it was possible to derive detailed vertical grain size distributions and measurements of the lifted relative particle load. Measurements of the two dust devils show that the majority of all lifted particles were only lifted within the first meter (∼46.5% and ∼61% of all particles; ∼76.5 wt % and ∼89 wt % of the relative particle load). Furthermore, ∼69% and ∼82% of all lifted sand grains occurred in the first meter of the dust devils, indicating the occurrence of "sand skirts." Both sampled dust devils were relatively small (∼15 m and ∼4-5 m in diameter) compared to dust devils in surrounding regions; nevertheless, measurements show that ∼58.5% to 73.5% of all lifted particles were small enough to go into suspension (based on grain size classification). This relatively high amount represents only ∼0.05 to 0.15 wt % of the lifted particle load. Larger dust devils probably entrain larger amounts of fine-grained material into the atmosphere, which can have an influence on the climate. Furthermore, our results indicate that the composition of the surface on which the dust devils evolved also had an influence on the particle load composition of the dust devil vortices. The internal particle load structure of both sampled dust devils was comparable in terms of their vertical grain size distribution and relative particle load, although the two dust devils differed in their dimensions and intensities. A general trend of decreasing grain sizes with height was also detected. Key Words: Mars-Dust devils-Planetary science-Desert soils-Atmosphere-Grain sizes. Astrobiology 17, xxx-xxx.

  12. Effects of sample size on estimation of rainfall extremes at high temperatures

    Science.gov (United States)

    Boessenkool, Berry; Bürger, Gerd; Heistermann, Maik

    2017-09-01

    High precipitation quantiles tend to rise with temperature, following the so-called Clausius-Clapeyron (CC) scaling. It is often reported that the CC-scaling relation breaks down and even reverts for very high temperatures. In our study, we investigate this reversal using observational climate data from 142 stations across Germany. One of the suggested meteorological explanations for the breakdown is limited moisture supply. Here we argue that, instead, it could simply originate from undersampling. As rainfall frequency generally decreases with higher temperatures, rainfall intensities as dictated by CC scaling are less likely to be recorded than for moderate temperatures. Empirical quantiles are conventionally estimated from order statistics via various forms of plotting position formulas. They have in common that their largest representable return period is given by the sample size. In small samples, high quantiles are underestimated accordingly. The small-sample effect is weaker, or disappears completely, when using parametric quantile estimates from a generalized Pareto distribution (GPD) fitted with L moments. For those, we obtain quantiles of rainfall intensities that continue to rise with temperature.
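The undersampling argument can be reproduced in a few lines. To stay dependency-free this sketch uses an exponential tail instead of the paper's GPD/L-moments fit, but the qualitative point is the same: from n = 50 values, the plotting-position (order-statistic) estimate of the 99.9% quantile is far too low, because the largest representable return period equals the sample size, while a parametric fit extrapolates with little bias.

```python
import numpy as np

rng = np.random.default_rng(42)
true_q = -np.log(1 - 0.999)       # exact 99.9% quantile of Exp(1), ~6.91

n, reps = 50, 4000
emp, par = [], []
for _ in range(reps):
    x = rng.exponential(1.0, n)
    emp.append(np.quantile(x, 0.999))            # empirical order-statistic estimate
    par.append(-np.mean(x) * np.log(1 - 0.999))  # parametric (fitted Exp) estimate

emp_bias = np.mean(emp) - true_q   # strongly negative: empirical underestimates
par_bias = np.mean(par) - true_q   # close to zero: parametric extrapolates
```

The empirical estimate is capped near the sample maximum (expected maximum of 50 Exp(1) draws is only about 4.5), whereas the fitted-distribution quantile is essentially unbiased here.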

  13. Effects of sample size on estimation of rainfall extremes at high temperatures

    Directory of Open Access Journals (Sweden)

    B. Boessenkool

    2017-09-01

Full Text Available High precipitation quantiles tend to rise with temperature, following the so-called Clausius–Clapeyron (CC) scaling. It is often reported that the CC-scaling relation breaks down and even reverts for very high temperatures. In our study, we investigate this reversal using observational climate data from 142 stations across Germany. One of the suggested meteorological explanations for the breakdown is limited moisture supply. Here we argue that, instead, it could simply originate from undersampling. As rainfall frequency generally decreases with higher temperatures, rainfall intensities as dictated by CC scaling are less likely to be recorded than for moderate temperatures. Empirical quantiles are conventionally estimated from order statistics via various forms of plotting position formulas. They have in common that their largest representable return period is given by the sample size. In small samples, high quantiles are underestimated accordingly. The small-sample effect is weaker, or disappears completely, when using parametric quantile estimates from a generalized Pareto distribution (GPD) fitted with L moments. For those, we obtain quantiles of rainfall intensities that continue to rise with temperature.

  14. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Hebel, S.; Hands, J.; Goering, F.; Kirchner, G.; Purtschert, R.

    2015-01-01

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. Prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples with high speed, high efficiency and in a fully automated manner using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and the application of Kr-85 measurements to novel Safeguards procedures. (author)

  15. 78 FR 45051 - Small Business Size Standards; Support Activities for Mining; Correction

    Science.gov (United States)

    2013-07-26

    ... Regulations by increasing small business size standards for three of the four industries in North American... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG44 Small Business Size Standards; Support Activities for Mining; Correction AGENCY: U.S. Small Business Administration. ACTION: Final rule; correction...

  16. Sample size determination for equivalence assessment with multiple endpoints.

    Science.gov (United States)

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

Equivalence assessment between a reference and test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from a joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach for sample size determination in this case would select the largest of the sample sizes required for the individual endpoints. However, such a method ignores the correlation among endpoints. With the objective of rejecting all endpoints, and when the endpoints are uncorrelated, the overall power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for such a correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size between the naive method and the correlation-adjusted method, and illustrate with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
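For a single endpoint, the TOST power under a normal approximation, and the naive independence product rule that the article refines with a correlation adjustment, can be sketched as follows (the margin and standard-error values are hypothetical, loosely in the style of a log-scale bioequivalence analysis):

```python
from statistics import NormalDist

def tost_power(delta, se, margin, alpha=0.05):
    """Approximate power of two one-sided tests (TOST) for equivalence:
    true difference `delta`, standard error `se`, equivalence margin +/-`margin`.
    TOST rejects when the estimate falls in (-margin + z*se, margin - z*se)."""
    z = NormalDist().inv_cdf(1 - alpha)
    phi = NormalDist().cdf
    return max(0.0, phi((margin - delta) / se - z) - phi((-margin - delta) / se + z))

# Naive multiple-endpoint power assuming independent endpoints
# (the article shows how correlation between AUC and Cmax changes this):
power_auc = tost_power(0.02, 0.04, 0.124)    # hypothetical log-AUC numbers
power_cmax = tost_power(0.05, 0.06, 0.124)   # hypothetical log-Cmax numbers
power_both_naive = power_auc * power_cmax
```

With positively correlated endpoints the joint power exceeds this product, which is why the correlation-adjusted sample size is smaller than the naive one.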

  17. Respondent-driven sampling and the recruitment of people with small injecting networks.

    Science.gov (United States)

    Paquette, Dana; Bryant, Joanne; de Wit, John

    2012-05-01

Respondent-driven sampling (RDS) is a form of chain-referral sampling, similar to snowball sampling, which was developed to reach hidden populations such as people who inject drugs (PWID). RDS is said to reach members of a hidden population that may not be accessible through other sampling methods. However, less attention has been paid to whether there are segments of the population that are more likely to be missed by RDS. This study examined the ability of RDS to capture people with small injecting networks. A study of PWID, using RDS, was conducted in 2009 in Sydney, Australia. The size of participants' injecting networks was examined by recruitment chain and wave. Participants' injecting network characteristics were compared to those of participants from a separate pharmacy-based study. A logistic regression analysis was conducted to examine the characteristics independently associated with having small injecting networks, using the combined RDS and pharmacy-based samples. In comparison with the pharmacy-recruited participants, RDS participants were almost 80% less likely to have small injecting networks, after adjusting for other variables. RDS participants were also more likely to have their injecting networks form a larger proportion of their social networks, and to have acquaintances as part of their injecting networks. Compared to those with larger injecting networks, individuals with small injecting networks were equally likely to engage in receptive sharing of injecting equipment, but less likely to have had contact with prevention services. These findings suggest that those with small injecting networks are an important group to recruit, and that RDS is less likely to capture these individuals.

  18. Preeminence and prerequisites of sample size calculations in clinical trials

    OpenAIRE

    Richa Singhal; Rakesh Rana

    2015-01-01

    The key components while planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation is different for different study designs. The article in detail describes the sample size calculation for a randomized controlled trial when the primary out...

  19. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
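The recipe described here for logistic regression can be turned into code directly: the equivalent two-sample problem uses two equal groups whose log-odds differ by the slope times twice the covariate SD, constrained so that the overall expected event probability is unchanged, after which a standard two-proportion formula applies. The sketch below follows those assumptions and is not the authors' published code:

```python
from math import exp, log, ceil
from statistics import NormalDist

def expit(x):
    return 1.0 / (1.0 + exp(-x))

def logistic_n(beta, sd_x, p_overall, alpha=0.05, power=0.8):
    """Total sample size for testing slope `beta` in a logistic regression with
    a covariate of SD `sd_x` and overall response probability `p_overall`,
    via the equivalent two-sample problem of Vaeth and Skovlund."""
    d = 2.0 * beta * sd_x  # log-odds difference between the two equivalent groups
    # Bisection: find group probabilities with logit(p2) - logit(p1) = d
    # and (p1 + p2)/2 = p_overall (overall event probability unchanged).
    lo, hi = -30.0, 30.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        p1, p2 = expit(mid), expit(mid + d)
        if (p1 + p2) / 2.0 < p_overall:
            lo = mid
        else:
            hi = mid
    # Standard two-proportion sample size, normal approximation.
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    n_arm = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2
    return 2 * ceil(n_arm)
```

For example, a slope of 0.5 per SD of the covariate with a 30% overall response rate yields a total sample size in the low hundreds at 80% power; halving the slope roughly quadruples the requirement.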

  20. Comparing interval estimates for small sample ordinal CFA models.

    Science.gov (United States)

    Natesan, Prathiba

    2015-01-01

    Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis (CFA) models for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors were common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more often positively than negatively biased; that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading.

  1. Optimum sample size allocation to minimize cost or maximize power for the two-sample trimmed mean test.

    Science.gov (United States)

    Guo, Jiin-Huarng; Luh, Wei-Ming

    2009-05-01

    When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
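
    Yuen's two-group trimmed-mean test, the test whose sample sizes the formulas above optimize, can be sketched as follows. This is a standard textbook construction (20% symmetric trimming, winsorized variances, Welch-type degrees of freedom); the variable names are ours.

```python
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    """Yuen's two-sample test for trimmed means: trimmed means compared
    using winsorized variances and Welch-type degrees of freedom."""
    def pieces(a):
        a = np.sort(np.asarray(a, dtype=float))
        g = int(trim * len(a))                    # values trimmed per tail
        t_mean = a[g:len(a) - g].mean()           # trimmed mean
        w = a.copy()
        w[:g] = a[g]                              # winsorize low tail
        w[len(a) - g:] = a[len(a) - g - 1]        # winsorize high tail
        h = len(a) - 2 * g                        # effective sample size
        d = (len(a) - 1) * w.var(ddof=1) / (h * (h - 1))
        return t_mean, d, h
    mx, dx, hx = pieces(x)
    my, dy, hy = pieces(y)
    t = (mx - my) / np.sqrt(dx + dy)
    df = (dx + dy) ** 2 / (dx ** 2 / (hx - 1) + dy ** 2 / (hy - 1))
    return t, df, 2 * stats.t.sf(abs(t), df)
```

    Identical samples give t = 0 and p = 1, while a clear location shift yields a small p-value, which is the behaviour the validation step of the study (Type I error and power under the determined sample sizes) checks.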

  2. Optimal sample size for probability of detection curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2013-01-01

    Highlights: • We investigate sample size requirements to develop probability of detection curves. • We develop simulations to determine effective inspection target sizes, number and distribution. • We summarize these findings and provide guidelines for the NDE practitioner. -- Abstract: The use of probability of detection curves to quantify the reliability of non-destructive examination (NDE) systems is common in the aeronautical industry, but relatively less so in the nuclear industry, at least in European countries. Due to the nature of the components being inspected, sample sizes tend to be much lower. This makes the manufacturing of test pieces with representative flaws, in sufficient numbers so as to draw statistical conclusions on the reliability of the NDT system under investigation, quite costly. The European Network for Inspection and Qualification (ENIQ) has developed an inspection qualification methodology, referred to as the ENIQ Methodology. It has become widely used in many European countries and provides assurance on the reliability of NDE systems, but only qualitatively. The need to quantify the output of inspection qualification has become more important as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. A measure of NDE reliability is necessary to quantify risk reduction after inspection, and probability of detection (POD) curves provide such a metric. The Joint Research Centre, Petten, The Netherlands, supported ENIQ by investigating the question of the sample size required to determine a reliable POD curve. As mentioned earlier, manufacturing of test pieces with defects that are typically found in nuclear power plants (NPPs) is usually quite expensive. Thus there is a tendency to reduce sample sizes, which in turn increases the uncertainty associated with the resulting POD curve. The main question in conjunction with POD curves is the appropriate sample size.

  3. Sample size for morphological traits of pigeonpea

    Directory of Open Access Journals (Sweden)

    Giovani Facco

    2015-12-01

    Full Text Available The objectives of this study were to determine the sample size (i.e., number of plants) required to accurately estimate the average of morphological traits of pigeonpea (Cajanus cajan L.) and to check for variability in sample size between evaluation periods and seasons. Two uniformity trials (i.e., experiments without treatment) were conducted for two growing seasons. In the first season (2011/2012), the seeds were sown by broadcast seeding, and in the second season (2012/2013), the seeds were sown in rows spaced 0.50 m apart. The ground area in each experiment was 1,848 m², and 360 plants were marked in the central area, in a 2 m × 2 m grid. Three morphological traits (number of nodes, plant height and stem diameter) were evaluated 13 times during the first season and 22 times in the second season. Measurements for all three morphological traits were normally distributed and confirmed through the Kolmogorov-Smirnov test. Randomness was confirmed using the Run Test, and the descriptive statistics were calculated. For each trait, the sample size (n) was calculated for semiamplitudes of the confidence interval (i.e., estimation error) equal to 2, 4, 6, ..., 20% of the estimated mean with a confidence coefficient (1-α) of 95%. Subsequently, n was fixed at 360 plants, and the estimation error of the estimated percentage of the average for each trait was calculated. Variability of the sample size for the pigeonpea culture was observed between the morphological traits evaluated, among the evaluation periods and between seasons. Therefore, to assess with an accuracy of 6% of the estimated average, at least 136 plants must be evaluated throughout the pigeonpea crop cycle to determine the sample size for the traits (number of nodes, plant height and stem diameter) in the different evaluation periods and between seasons.
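
    The sample-size rule used above, a CI semi-amplitude equal to a chosen percentage of the mean, can be sketched as an iteration on n = (t · CV / e)², since the t quantile itself depends on n. The coefficient of variation used in the usage example is hypothetical, not the paper's data.

```python
from math import ceil
from scipy import stats

def n_for_relative_error(cv_percent, error_percent, conf=0.95):
    """Sample size so the CI semi-amplitude equals error_percent of the
    mean: iterate n = (t * CV / e)^2 until n stabilizes (within 1)."""
    n = 30                                        # starting guess
    for _ in range(100):
        t = stats.t.ppf((1 + conf) / 2, df=n - 1)
        n_new = ceil((t * cv_percent / error_percent) ** 2)
        if abs(n_new - n) <= 1:                   # converged (or cycling)
            return max(n, n_new)
        n = n_new
    return n
```

    Tightening the allowed error from 10% to 6% of the mean roughly triples the required number of plants for a hypothetical CV of 30%, illustrating why the authors report different n across traits and evaluation periods.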

  4. Soybean yield modeling using bootstrap methods for small samples

    Energy Technology Data Exchange (ETDEWEB)

    Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.

    2016-11-01

    One of the problems that occurs when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not need to guess or know the probability distribution that generated the original sample. In this work we used a small set of soybean yield data and physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points and determination of confidence intervals of the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties which were significant in the construction of the soybean yield regression model, to construct the confidence intervals of the parameters and to identify the points that had great influence on the estimated parameters. (Author)

  5. 75 FR 61604 - Small Business Size Standards; Accommodation and Food Services Industries

    Science.gov (United States)

    2010-10-06

    ... business assistance programs, SBA establishes small business size definitions (referred to as size... Administrator the responsibility for establishing small business definitions. The Act also requires that small business definitions vary to reflect industry differences. The supplementary information section of this...

  6. In Good Company : When Small and Medium-sized Enterprises Acquire Multiplex Knowledge from Key Commercial Partners

    NARCIS (Netherlands)

    Bojica, Ana Maria; Estrada Vaquero, Isabel; del Mar Fuentes-Fuentes, Maria

    2018-01-01

    This study explores the specific conditions under which key strategic alliances of small and medium-sized enterprises (SMEs) with commercial partners can become multiplex in knowledge exchange. Using survey data from a sample of 150 Spanish SMEs in the information and communication technology (ICT)

  7. Small-Size High-Current Generators for X-Ray Backlighting

    Science.gov (United States)

    Chaikovsky, S. A.; Artyomov, A. P.; Zharova, N. V.; Zhigalin, A. S.; Lavrinovich, I. V.; Oreshkin, V. I.; Ratakhin, N. A.; Rousskikh, A. G.; Fedunin, A. V.; Fedushchak, V. F.; Erfort, A. A.

    2017-12-01

    The paper deals with the soft X-ray backlighting based on the X-pinch as a powerful tool for physical studies of fast processes. Proposed are the unique small-size pulsed power generators operating as a low-inductance capacitor bank. These pulse generators provide the X-pinch-based soft X-ray source (hν = 1-10 keV) of micron size at 2-3 ns pulse duration. The small size and weight of pulse generators allow them to be transported to any laboratory for conducting X-ray backlighting of test objects with micron space resolution and nanosecond exposure time. These generators also allow creating synchronized multi-frame radiographic complexes with frame delay variation in a broad range.

  8. EDXRF applied to the chemical element determination of small invertebrate samples

    International Nuclear Information System (INIS)

    Magalhaes, Marcelo L.R.; Santos, Mariana L.O.; Cantinha, Rebeca S.; Souza, Thomas Marques de; Franca, Elvis J. de

    2015-01-01

    Energy Dispersive X-Ray Fluorescence (EDXRF) is a fast analytical technique of easy operation, but it demands reliable analytical curves due to the intrinsic matrix dependence and interference during the analysis. By using biological materials of diverse matrices, multielemental analytical protocols can be implemented and a group of chemical elements can be determined in diverse biological matrices depending on the chemical element concentration. Particularly for invertebrates, EDXRF presents some advantages associated with the possibility of analyzing small size samples, in which a collimator can be used to direct the incidence of X-rays to a small surface of the analyzed samples. In this work, EDXRF was applied to determine Cl, Fe, P, S and Zn in invertebrate samples using collimators of 3 mm and 10 mm. For the assessment of the analytical protocol, the SRM 2976 Trace Elements in Mollusk and the SRM 8415 Whole Egg Powder, produced by the National Institute of Standards and Technology - NIST, were also analyzed. After sampling by using pitfall traps, invertebrates were lyophilized, milled and transferred to polyethylene vials covered by XRF polyethylene film. Analyses were performed at an atmosphere lower than 30 Pa, varying voltage and electric current according to the chemical element to be analyzed. For comparison, Zn in the invertebrate material was also quantified by graphite furnace atomic absorption spectrometry after acid treatment of the samples (mixture of nitric acid and hydrogen peroxide). Compared to the collimator of 10 mm, the SRM 2976 and SRM 8415 results obtained with the 3 mm collimator agreed well at the 95% confidence level, since the E_n numbers were in the range of -1 to 1. Results from GFAAS were in accordance with the EDXRF values for composite samples. Therefore, determination of some chemical elements by EDXRF can be recommended for very small invertebrate samples (lower than 100 mg), with the advantage of preserving the samples. (author)

  9. Determinants of capital structure in small and medium sized enterprises in Malaysia

    OpenAIRE

    Mat Nawi, Hafizah

    2015-01-01

    This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London This study aims to investigate the determinants of capital structure in small and medium-sized enterprises (SMEs) in Malaysia and their effect on firms’ performance. The study addresses the following primary question: What are the factors that influence the capital structure of SMEs in Malaysia? The sample of this research is SMEs in the east coast region of Malaysia. Adopting a posi...

  10. Prognostic Importance of Small Prostate Size in Men Receiving Definitive Prostate Brachytherapy

    International Nuclear Information System (INIS)

    Taira, Al V.; Merrick, Gregory S.; Galbreath, Robert W.; Butler, Wayne M.; Adamovich, Edward; Wallner, Kent E.

    2012-01-01

    Purpose: To assess whether small prostate size is an adverse prognostic factor in men undergoing brachytherapy in the same manner in which it seems to be for men undergoing radical prostatectomy. Methods and Materials: From April 1995 to June 2008, 2024 patients underwent brachytherapy by a single brachytherapist. Median follow-up was 7.4 years. The role of small prostate size (≤20 cm³) as a prognostic factor for biochemical progression-free survival, cause-specific survival, and all-cause mortality was investigated. The differences in survival between men with small and larger prostates were compared using Kaplan-Meier curves and log-rank tests. Results: Median prostate size for the entire cohort was 32.7 cm³. For the 167 men with small prostates, median prostate size was 17.4 cm³. There was no difference in biochemical progression-free survival (95.2% vs 96.2%, P=.603), cause-specific survival (97.7% vs 98.3%, P=.546), or all-cause mortality (78.0% vs 77.2%, P=.838) at 10 years for men with small prostates compared with men with larger prostates. On univariate and multivariate analysis, small prostate size was not associated with any of the primary outcome measures. Conclusion: Men with small prostates treated with brachytherapy have excellent outcomes and are at no higher risk of treatment failure than men with larger glands. High-quality implants with adequate margins seem sufficient to address the increased adverse risk factors associated with small prostate size.

  11. Sample size determination for a three-arm equivalence trial of Poisson and negative binomial responses.

    Science.gov (United States)

    Chang, Yu-Wei; Tsong, Yi; Zhao, Zhigen

    2017-01-01

    Assessing equivalence or similarity has drawn much attention recently as many drug products have lost or will lose their patents in the next few years, especially certain best-selling biologics. To claim equivalence between the test treatment and the reference treatment when assay sensitivity is well established from historical data, one has to demonstrate both superiority of the test treatment over placebo and equivalence between the test treatment and the reference treatment. Thus, there is urgency for practitioners to derive a practical way to calculate sample size for a three-arm equivalence trial. The primary endpoints of a clinical trial may not always be continuous, but may be discrete. In this paper, the authors derive power function and discuss sample size requirement for a three-arm equivalence trial with Poisson and negative binomial clinical endpoints. In addition, the authors examine the effect of the dispersion parameter on the power and the sample size by varying its coefficient from small to large. In extensive numerical studies, the authors demonstrate that required sample size heavily depends on the dispersion parameter. Therefore, misusing a Poisson model for negative binomial data may easily lose power up to 20%, depending on the value of the dispersion parameter.

  12. Preeminence and prerequisites of sample size calculations in clinical trials

    Directory of Open Access Journals (Sweden)

    Richa Singhal

    2015-01-01

    Full Text Available The key components while planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation is different for different study designs. The article in detail describes the sample size calculation for a randomized controlled trial when the primary outcome is a continuous variable and when it is a proportion or a qualitative variable.
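
    The two cases the article describes, a continuous primary outcome and a proportion, correspond to the standard textbook sample-size formulas, which can be sketched directly; the function names and defaults (5% two-sided alpha, 80% power) are ours.

```python
from math import ceil, sqrt
from statistics import NormalDist

_z = NormalDist().inv_cdf

def n_continuous(delta, sigma, alpha=0.05, power=0.8):
    """Per-group n for a two-arm RCT with a continuous primary outcome:
    n = 2 * ((z_{1-a/2} + z_{1-b}) * sigma / delta)^2."""
    return ceil(2 * ((_z(1 - alpha / 2) + _z(power)) * sigma / delta) ** 2)

def n_proportion(p1, p2, alpha=0.05, power=0.8):
    """Per-group n for a two-arm RCT with a binary primary outcome,
    using the pooled/unpooled normal-approximation formula."""
    p_bar = (p1 + p2) / 2
    num = (_z(1 - alpha / 2) * sqrt(2 * p_bar * (1 - p_bar))
           + _z(power) * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)
```

    For instance, detecting a difference of one standard deviation (delta = sigma) needs only 16 patients per group, while halving the detectable difference roughly quadruples the requirement, which is why sample size calculation differs so sharply across designs and outcome types.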

  13. GOVERNMENT SUPPORT FOR SMALL AND MEDIUM-SIZED BUSINESS AND INNOVATIVE ACTIVITIES

    Directory of Open Access Journals (Sweden)

    Pоlina Kolisnichenko

    2017-09-01

    Full Text Available The purpose of the paper is to reveal the conditions of the innovative development of small and medium-sized entrepreneurship in Ukraine; the problems that suppress innovative activity and the development of small and medium-sized enterprises; peculiarities of the tax incentives for the development of entrepreneurship in the advanced countries and in Ukraine; and the main kinds and characteristics of public support for small and medium-sized enterprises. Methodology. The methods of scientific research include: analysis and generalization for studying the main kinds and characteristics of the government support of small and medium-sized enterprises; dynamic and comparative analysis for studying the problems and factors influencing the development of small and medium enterprises and innovative activity, and the peculiarities of tax incentives for business development; and the system-analytical method for studying the conditions of the innovative development of small and medium-sized entrepreneurship. Results. The government's financial support priorities should be: optimal application of the fiscal regulation instruments (reduction of the amount of taxes, determination of the criteria for the maximum taxation amount, tax incentives, etc.), maintenance of the self-investment of small and medium-sized enterprises as well as investment, financial means of the public influence over the development of the enterprises, and an effective combination of both direct and indirect forms of innovative development support. Practical implications. The obtained results can be used in the process of formation and implementation of the small and medium enterprise sector development strategy and innovative activity in the long-term perspective. Value/originality. The obtained data can provide a better understanding of the direction of innovative business development in Ukraine.

  14. Considerations for Sample Preparation Using Size-Exclusion Chromatography for Home and Synchrotron Sources.

    Science.gov (United States)

    Rambo, Robert P

    2017-01-01

    The success of a SAXS experiment for structural investigations depends on two precise measurements, the sample and the buffer background. Buffer matching between the sample and background can be achieved using dialysis methods but in biological SAXS of monodisperse systems, sample preparation is routinely being performed with size exclusion chromatography (SEC). SEC is the most reliable method for SAXS sample preparation as the method not only purifies the sample for SAXS but also almost guarantees ideal buffer matching. Here, I will highlight the use of SEC for SAXS sample preparation and demonstrate using example proteins that SEC purification does not always provide for ideal samples. Scrutiny of the SEC elution peak using quasi-elastic and multi-angle light scattering techniques can reveal hidden features (heterogeneity) of the sample that should be considered during SAXS data analysis. In some cases, sample heterogeneity can be controlled using a small molecule additive and I outline a simple additive screening method for sample preparation.

  15. Revisiting sample size: are big trials the answer?

    Science.gov (United States)

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not only conditional to randomization. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability of the trial to detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.

  16. Small population size of Pribilof Rock Sandpipers confirmed through distance-sampling surveys in Alaska

    Science.gov (United States)

    Ruthrauff, Daniel R.; Tibbitts, T. Lee; Gill, Robert E.; Dementyev, Maksim N.; Handel, Colleen M.

    2012-01-01

    The Rock Sandpiper (Calidris ptilocnemis) is endemic to the Bering Sea region and unique among shorebirds in the North Pacific for wintering at high latitudes. The nominate subspecies, the Pribilof Rock Sandpiper (C. p. ptilocnemis), breeds on four isolated islands in the Bering Sea and appears to spend the winter primarily in Cook Inlet, Alaska. We used a stratified systematic sampling design and line-transect method to survey the entire breeding range of this population during springs 2001-2003. Densities were up to four times higher on the uninhabited and more northerly St. Matthew and Hall islands than on St. Paul and St. George islands, which both have small human settlements and introduced reindeer herds. Differences in density, however, appeared to be more related to differences in vegetation than to anthropogenic factors, raising some concern for prospective effects of climate change. We estimated the total population at 19 832 birds (95% CI 17 853–21 930), ranking it among the smallest of North American shorebird populations. To determine the vulnerability of C. p. ptilocnemis to anthropogenic and stochastic environmental threats, future studies should focus on determining the amount of gene flow among island subpopulations, the full extent of the subspecies' winter range, and the current trajectory of this small population.

  17. Modified FlowCAM procedure for quantifying size distribution of zooplankton with sample recycling capacity.

    Directory of Open Access Journals (Sweden)

    Esther Wong

    Full Text Available We have developed a modified FlowCAM procedure for efficiently quantifying the size distribution of zooplankton. The modified method offers the following new features: 1) it prevents animals from settling and clogging with constant bubbling in the sample container; 2) it prevents damage to sample animals and facilitates recycling by replacing the built-in peristaltic pump with an external syringe pump that generates negative pressure, creating a steady flow by drawing air from the receiving conical flask (i.e. a vacuum pump) and transferring plankton from the sample container through the main flowcell of the imaging system and finally into the receiving flask; 3) it aligns samples in advance of imaging and prevents clogging with an additional flowcell placed ahead of the main flowcell. These modifications were designed to overcome the difficulties of applying the standard FlowCAM procedure to studies where the number of individuals per sample is small, since the FlowCAM can only image a subset of a sample. Our effective recycling procedure allows users to pass the same sample through the FlowCAM many times (i.e. bootstrapping the sample) in order to generate a good size distribution. Although more advanced FlowCAM models are equipped with syringe pumps and Field of View (FOV) flowcells which can image all particles passing through the flow field, we note that these advanced setups are very expensive, offer limited syringe and flowcell sizes, and do not guarantee recycling. In contrast, our modifications are inexpensive and flexible. Finally, we compared the biovolumes estimated by automated FlowCAM image analysis versus conventional manual measurements, and found that the size of an individual zooplankter can be estimated by the FlowCAM image system after ground truthing.

  18. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
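
    The interdependence among the five components described above can be made concrete: fixing the design (two groups), the standardized effect size (Cohen's d, which folds in magnitude and variance), n, and alpha determines power. A normal-approximation sketch, with names of our choosing:

```python
from math import sqrt
from statistics import NormalDist

_nd = NormalDist()

def power_two_group(d, n_per_group, alpha=0.05):
    """Achieved power of a two-group comparison for standardized effect
    size d: Phi(d * sqrt(n/2) - z_{1-alpha/2}), normal approximation."""
    z_crit = _nd.inv_cdf(1 - alpha / 2)
    return _nd.cdf(d * sqrt(n_per_group / 2) - z_crit)
```

    With d = 0.5 and 64 per group, power is about 0.80, the conventional a priori target; increasing the effect size (or n) raises power, which is the isomorphism the article describes.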

  19. 78 FR 37397 - Small Business Size Standards: Agriculture, Forestry, Fishing and Hunting

    Science.gov (United States)

    2013-06-20

    ... responsibility for establishing small business size definitions (15 U.S.C. 632(a)). The Act also requires that small business size definitions vary to reflect industry differences. The Jobs Act requires the... definition, after consultation with the Office of Advocacy of the U.S. Small Business Administration (5 U.S.C...

  20. Synthesis and characterization of small size fluorescent LEEH-capped blue emission ZnTe quantum dots

    Directory of Open Access Journals (Sweden)

    Patnaik Sumanta Kumar

    2017-04-01

    Full Text Available We report here for the first time the synthesis of LEEH-capped very small size (2 nm) ZnTe quantum dots at low temperature (less than 100 °C) using a simple chemical route. The effects of aging and stirring time on the absorption spectra of the quantum dots were investigated. The synthesized nanocrystals (NCs) were characterized by PL, TEM and XRD, and the formation of very small size quantum dots having an FCC structure was confirmed. Further, blue emission from the prepared sample was observed during exposure to monochromatic UV radiation. The ZnTe NCs obtained in this study were found to be more stable compared to those presented in literature reports. ZnTe NCs may be considered as a new material in place of CdTe for optoelectronic devices.

  1. GOVERNMENT SUPPORT FOR SMALL AND MEDIUM-SIZED BUSINESS AND INNOVATIVE ACTIVITIES

    OpenAIRE

    Pоlina Kolisnichenko

    2017-01-01

    The purpose of the paper is to reveal the conditions of the innovative development of small and medium-sized entrepreneurship in Ukraine; the problems that suppress innovative activity and the development of small and medium-sized enterprises; peculiarities of the tax incentives for the development of entrepreneurship in the advanced countries and in Ukraine; the main kinds and characteristics of public support for small and medium-sized enterprises. Methodology. The methods of scientific...

  2. Sample-size dependence of diversity indices and the determination of sufficient sample size in a high-diversity deep-sea environment

    OpenAIRE

    Soetaert, K.; Heip, C.H.R.

    1990-01-01

    Diversity indices, although designed for comparative purposes, often cannot be used as such, due to their sample-size dependence. It is argued here that this dependence is more pronounced in high-diversity than in low-diversity assemblages and that indices more sensitive to rarer species require larger sample sizes to estimate diversity with reasonable precision than indices which put more weight on commoner species. This was tested for Hill's diversity numbers N0 to N∞ ...
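
    Hill's diversity numbers mentioned above form a family indexed by order: N0 is species richness, N1 is the exponential of Shannon entropy, and N2 is the inverse Simpson concentration, with weight shifting from rare to common species as the order rises. A minimal sketch (the function name is ours):

```python
import numpy as np

def hill_numbers(counts):
    """Hill's diversity numbers from a vector of species counts:
    N0 = richness, N1 = exp(Shannon H), N2 = inverse Simpson."""
    c = np.asarray([x for x in counts if x > 0], dtype=float)
    p = c / c.sum()                              # relative abundances
    n0 = float(len(p))                           # species richness
    n1 = float(np.exp(-(p * np.log(p)).sum()))   # exp of Shannon entropy
    n2 = float(1.0 / (p ** 2).sum())             # inverse Simpson
    return n0, n1, n2
```

    For a perfectly even assemblage all three numbers coincide, while a skewed assemblage gives N0 > N1 > N2; since N0 counts every rare species equally, it is the order most sensitive to undersampling, in line with the abstract's argument.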

  3. Sample size calculation for comparing two negative binomial rates.

    Science.gov (United States)

    Zhu, Haiyuan; Lakkis, Hassan

    2014-02-10

    Negative binomial model has been increasingly used to model the count data in recent clinical trials. It is frequently chosen over Poisson model in cases of overdispersed count data that are commonly seen in clinical trials. One of the challenges of applying negative binomial model in clinical trial design is the sample size estimation. In practice, simulation methods have been frequently used for sample size estimation. In this paper, an explicit formula is developed to calculate sample size based on the negative binomial model. Depending on different approaches to estimate the variance under null hypothesis, three variations of the sample size formula are proposed and discussed. Important characteristics of the formula include its accuracy and its ability to explicitly incorporate dispersion parameter and exposure time. The performance of the formula with each variation is assessed using simulations. Copyright © 2013 John Wiley & Sons, Ltd.
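
    The shape of such a formula can be sketched on the log rate-ratio scale, where the per-arm variance contribution is (1/(t·μ) + k)/n for dispersion k and exposure t. This mirrors one of the variance choices the abstract discusses, but it is our equal-allocation sketch, not the paper's exact formula.

```python
from math import ceil, log
from statistics import NormalDist

def nb_two_sample_n(mu1, mu2, k, t=1.0, alpha=0.05, power=0.8):
    """Per-arm n for comparing two negative binomial event rates:
    n = (z_{1-a/2} + z_{1-b})^2 * sum_i(1/(t*mu_i) + k) / log(mu1/mu2)^2,
    assuming equal allocation and common dispersion k."""
    z = NormalDist().inv_cdf
    var = (1 / (t * mu1) + k) + (1 / (t * mu2) + k)
    return ceil((z(1 - alpha / 2) + z(power)) ** 2 * var
                / log(mu1 / mu2) ** 2)
```

    Setting k = 0 recovers the Poisson case; the gap between k = 0 and k > 0 shows how strongly the requirement depends on the dispersion parameter, and why sizing overdispersed data with a Poisson model underpowers the trial.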

  4. Estimation of sample size and testing power (part 5).

    Science.gov (United States)

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2012-02-01

    Estimation of sample size and testing power is an important component of research design. This article introduced methods for sample size and testing power estimation of difference test for quantitative and qualitative data with the single-group design, the paired design or the crossover design. To be specific, this article introduced formulas for sample size and testing power estimation of difference test for quantitative and qualitative data with the above three designs, the realization based on the formulas and the POWER procedure of SAS software and elaborated it with examples, which will benefit researchers for implementing the repetition principle.

  5. A size exclusion-reversed phase two dimensional-liquid chromatography methodology for stability and small molecule related species in antibody drug conjugates.

    Science.gov (United States)

    Li, Yi; Gu, Christine; Gruenhagen, Jason; Zhang, Kelly; Yehl, Peter; Chetwyn, Nik P; Medley, Colin D

    2015-05-08

    Antibody drug conjugates (ADCs) are complex therapeutic agents combining the specific targeting properties of antibodies and highly potent cytotoxic small molecule drugs to selectively eliminate tumor cells while limiting the toxicity to normal healthy tissues. One unique critical quality attribute of ADCs is the content of unconjugated small molecule drug present from either incomplete conjugation or degradation of the ADC. In this work, size exclusion chromatography (SEC) was coupled with reversed-phase (RP) HPLC in an online 2-dimensional chromatography format for identification and quantitation of unconjugated small molecule drugs and related small molecule impurities in ADC samples directly, without sample preparation. The SEC method in the 1st dimension not only separated the small molecule impurities from the intact ADC, but also provided information about the size variants (monomer, dimer, aggregates, etc.) of the ADC. The small molecule peak from the SEC was trapped and sent to a RP-HPLC column in the 2nd dimension to further separate and quantify the different small molecule impurities present in the ADC sample. This SEC-RP 2D-LC method demonstrated excellent precision in terms of %RSD. Small molecule degradation products and aggregation of the conjugate were observed in the stability samples, and the degradation pathways of the ADC were investigated. This 2D-LC method offers a powerful tool for ADC characterization and provides valuable information for conjugation and formulation development. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Frictional behaviour of sandstone: A sample-size dependent triaxial investigation

    Science.gov (United States)

    Roshan, Hamid; Masoumi, Hossein; Regenauer-Lieb, Klaus

    2017-01-01

    Frictional behaviour of rocks from the initial stage of loading to final shear displacement along the formed shear plane has been widely investigated in the past. However, the effect of sample size on such frictional behaviour has not attracted much attention. This is mainly related to the limitations of rock testing facilities as well as the complex mechanisms involved in the sample-size dependent frictional behaviour of rocks. In this study, a suite of advanced triaxial experiments was performed on Gosford sandstone samples of different sizes and at different confining pressures. The post-peak response of the rock along the formed shear plane was captured for analysis, with particular interest in sample-size dependency. Several important phenomena were observed in the results of this study: a) the rate of transition from brittleness to ductility is sample-size dependent, with relatively smaller samples showing a faster transition toward ductility at any confining pressure; b) the sample size influences the angle of the formed shear band; and c) the friction coefficient of the formed shear plane is sample-size dependent, with relatively smaller samples exhibiting a lower friction coefficient than larger samples. We interpret our results in terms of a thermodynamic approach in which the frictional properties for finite deformation are viewed as encompassing a multitude of ephemeral slipping surfaces prior to the formation of the through-going fracture. The final fracture itself is seen as the result of the self-organisation of a sufficiently large ensemble of micro-slip surfaces, and is therefore consistent with the theory of thermodynamics. This assumption vindicates the use of classical rock mechanics experiments to constrain failure of pressure-sensitive rocks, and the future imaging of these micro-slips opens an exciting path for research into rock failure mechanisms.

  7. Effects of sample size and sampling frequency on studies of brown bear home ranges and habitat use

    Science.gov (United States)

    Arthur, Steve M.; Schwartz, Charles C.

    1999-01-01

    We equipped 9 brown bears (Ursus arctos) on the Kenai Peninsula, Alaska, with collars containing both conventional very-high-frequency (VHF) transmitters and global positioning system (GPS) receivers programmed to determine an animal's position at 5.75-hr intervals. We calculated minimum convex polygon (MCP) and fixed and adaptive kernel home ranges for randomly selected subsets of the GPS data to examine the effects of sample size on accuracy and precision of home range estimates. We also compared results obtained by weekly aerial radiotracking versus more frequent GPS locations to test for biases in conventional radiotracking data. Home ranges based on the MCP were 20-606 km2 (mean = 201) for aerial radiotracking data (n = 12-16 locations/bear) and 116-1,505 km2 (mean = 522) for the complete GPS data sets (n = 245-466 locations/bear). Fixed kernel home ranges were 34-955 km2 (mean = 224) for radiotracking data and 16-130 km2 (mean = 60) for the GPS data. Differences between means for radiotracking and GPS data were due primarily to the larger samples provided by the GPS data. Means did not differ between radiotracking data and equivalent-sized subsets of GPS data (P > 0.10). For the MCP, home range area increased and variability decreased asymptotically with number of locations. For the kernel models, both area and variability decreased with increasing sample size. Simulations suggested that the MCP and kernel models required >60 and >80 locations, respectively, for estimates to be both accurate and precise. Our results suggest that the usefulness of conventional radiotracking data may be limited by potential biases and variability due to small samples. Investigators who use home range estimates in statistical tests should consider the effects of variability of those estimates. Use of GPS-equipped collars can facilitate obtaining larger samples of unbiased data and improve accuracy and precision of home range estimates.

  8. Radiation-related small head sizes among prenatally exposed atomic bomb survivors

    International Nuclear Information System (INIS)

    Otaki, Masanori; Schull, William J.

    2004-01-01

    severely mentally retarded cases with small heads and 68.9 and 11.9 for the severely mentally retarded cases without small heads. These values are 96.4 and 19.8 for cases with small heads only. The mean IQ and standard deviation for the overall sample are 107.8 and 16.4, respectively. No significant difference exists between the first two IQ means identified above, but both are significantly less than the mean for individuals with small heads but without severe mental retardation. The mean IQ of individuals with small heads but without severe mental retardation does not differ significantly from the mean for the entire sample. The relationship of small head size to four other anthropometric measurements (standing height, body weight, sitting height, and chest circumference) is described. (author)

  9. Development of a measuring system for vapor-jet forms of small-sized fuel injectors; Kogata injector funmu keijo sokutei system no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Hibino, H; Komatsubara, H; Kawashima, O; Fujita, A [Aisan Industry Co. Ltd., Aichi (Japan)

    1997-10-01

    In the small-sized fuel injectors adapted to the United States' exhaust-gas regulations and the like, the vapor jet is very finely atomized, and the jet form, as one measure of product performance, has become more important than before. Accordingly, we have developed a measuring system in which the vapor jet of the small-sized fuel injector is irradiated with a flat laser light, the cross-sectional form of the jet shining due to diffusion is sampled, and the distribution and form of the sampled sections are determined by image processing. 2 refs., 15 figs., 4 tabs.

  10. Management of small and medium size enterprises as a carrier of economic growth

    Directory of Open Access Journals (Sweden)

    Ožegović Lazar

    2012-01-01

    Numerous studies in market economies show that the share of small and medium size enterprises is constantly increasing compared to large enterprises. This does not diminish the significance of large enterprises; rather, the interdependence between them grows every day. National economies which manage to find the optimal combination of small, medium size and large enterprises are more successful than the others. Management of small and medium size enterprises (SMEs) is, in conceptual terms, similar to the management of large enterprises. An organization must be managed in order to function, and this rule applies to small and medium size enterprises.

  11. Effects of sample size on the second magnetization peak in ...

    Indian Academy of Sciences (India)

    the sample size decreases – a result that could be interpreted as a size effect in the order–disorder vortex matter phase transition. However, local magnetic measurements trace this effect to metastable disordered vortex states, revealing the same order–disorder transition induction in samples of different size.

  12. EDXRF applied to the chemical element determination of small invertebrate samples

    Energy Technology Data Exchange (ETDEWEB)

    Magalhaes, Marcelo L.R.; Santos, Mariana L.O.; Cantinha, Rebeca S.; Souza, Thomas Marques de; Franca, Elvis J. de, E-mail: marcelo_rlm@hotmail.com, E-mail: marianasantos_ufpe@hotmail.com, E-mail: rebecanuclear@gmail.com, E-mail: thomasmarques@live.com.pt, E-mail: ejfranca@cnen.gov.br [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil)

    2015-07-01

    Energy Dispersive X-Ray Fluorescence (EDXRF) is a fast analytical technique of easy operation; however, it demands reliable analytical curves due to the intrinsic matrix dependence and interference during analysis. By using biological materials of diverse matrices, multielemental analytical protocols can be implemented and a group of chemical elements can be determined in diverse biological matrices depending on the chemical element concentration. Particularly for invertebrates, EDXRF presents some advantages associated with the possibility of analyzing small samples, in which a collimator can be used to direct the incidence of X-rays onto a small surface of the analyzed samples. In this work, EDXRF was applied to determine Cl, Fe, P, S and Zn in invertebrate samples using collimators of 3 mm and 10 mm. For the assessment of the analytical protocol, the SRM 2976 Trace Elements in Mollusk and the SRM 8415 Whole Egg Powder, produced by the National Institute of Standards and Technology (NIST), were also analyzed. After sampling by using pitfall traps, invertebrates were lyophilized, milled and transferred to polyethylene vials covered by XRF polyethylene film. Analyses were performed at an atmosphere lower than 30 Pa, varying voltage and electric current according to the chemical element to be analyzed. For comparison, Zn in the invertebrate material was also quantified by graphite furnace atomic absorption spectrometry (GFAAS) after acid treatment (a mixture of nitric acid and hydrogen peroxide) of the samples. Compared to the collimator of 10 mm, the SRM 2976 and SRM 8415 results obtained by the 3 mm collimator agreed well at the 95% confidence level, since the En numbers were in the range of -1 to 1. Results from GFAAS were in accordance with the EDXRF values for composite samples. Therefore, determination of some chemical elements by EDXRF can be recommended for very small invertebrate samples (lower than 100 mg), with the advantage of preserving the samples. (author)

  13. STATISTICAL EVALUATION OF SMALL SCALE MIXING DEMONSTRATION SAMPLING AND BATCH TRANSFER PERFORMANCE - 12093

    Energy Technology Data Exchange (ETDEWEB)

    GREER DA; THIEN MG

    2012-01-12

    The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste Treatment and Immobilization Plant (WTP) presents a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. DOE's Tank Operations Contractor, Washington River Protection Solutions (WRPS), has previously presented the results of mixing performance in two different sizes of small scale DSTs to support scale-up estimates of full scale DST mixing performance. Currently, sufficient sampling of DSTs is one of the largest programmatic risks that could prevent timely delivery of high level waste to the WTP. WRPS has performed small scale mixing and sampling demonstrations to study the ability to sufficiently sample the tanks. The statistical evaluation of the demonstration results, which leads to the conclusion that the two scales of small DST behave similarly and that full scale performance is predictable, will be presented. This work is essential to reduce the risk of requiring a new dedicated feed sampling facility and will guide future optimization work to ensure the waste feed delivery mission will be accomplished successfully. This paper will focus on the analytical data collected from mixing, sampling, and batch transfer testing in the small scale mixing demonstration tanks and how those data are being interpreted to begin to understand the relationship between samples taken prior to transfer and samples from the subsequent batches transferred. An overview of the types of data collected and examples of typical raw data will be provided. The paper will then discuss the processing and manipulation of the data which is necessary to begin evaluating sampling and batch transfer performance. This discussion will also include the evaluation of the analytical measurement capability with regard to the simulant material used in the demonstration tests.

  14. Analysis of femtogram-sized plutonium samples by thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Smith, D.H.; Duckworth, D.C.; Bostick, D.T.; Coleman, R.M.; McPherson, R.L.; McKown, H.S.

    1994-01-01

    The goal of this investigation was to extend the ability to perform isotopic analysis of plutonium to samples as small as possible. Plutonium ionizes thermally with quite good efficiency (first ionization potential 5.7 eV). Sub-nanogram-sized samples can be analyzed on a near-routine basis given the necessary instrumentation. Efforts in this laboratory have been directed at rhenium-carbon systems; solutions of carbon in rhenium provide surfaces with work functions higher than pure rhenium (5.8 vs. ∼ 5.4 eV). Using a single resin bead as a sample loading medium both concentrates the sample nearly to a point and, due to its interaction with rhenium, produces the desired composite surface. Earlier work in this area showed that a layer of rhenium powder slurried in a solution containing carbon substantially enhanced the precision of isotopic measurements for uranium. Isotopic fractionation was virtually eliminated, and ionization efficiencies 2-5 times better than previously measured were attained for both Pu and U (1.7 and 0.5%, respectively). The other side of this coin should be the ability to analyze smaller samples, which is the subject of this report.

  15. Sample Size in Qualitative Interview Studies: Guided by Information Power.

    Science.gov (United States)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit

    2015-11-27

    Sample sizes must be ascertained in qualitative studies, as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is "saturation." Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose the concept "information power" to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds that is relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning and during data collection of a qualitative study is discussed. © The Author(s) 2015.

  16. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  17. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies.

    Science.gov (United States)

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M; Rogers, Peter J; Hardman, Charlotte A

    2016-03-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate-sized portion of the same food. In study 3 participants were exposed to images of large or small portions of a snack food before selecting a portion size of snack food to consume. Across the three studies, visual exposure to larger as opposed to smaller portion sizes resulted in participants considering a normal portion of food to be larger than a reference intermediate-sized portion. In studies 1 and 2 visual exposure to larger portion sizes also increased the size of self-reported ideal meal size. In study 3 visual exposure to larger portion sizes of a snack food did not affect how much of that food participants subsequently served themselves and ate. Visual exposure to larger portion sizes may adjust visual perceptions of what constitutes a 'normal' sized portion. However, we did not find evidence that visual exposure to larger portions altered snack food intake. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Conservative Sample Size Determination for Repeated Measures Analysis of Covariance.

    Science.gov (United States)

    Morgan, Timothy M; Case, L Douglas

    2013-07-05

    In the design of a randomized clinical trial with one pre and multiple post randomized assessments of the outcome variable, one needs to account for the repeated measures in determining the appropriate sample size. Unfortunately, one seldom has a good estimate of the variance of the outcome measure, let alone the correlations among the measurements over time. We show how sample sizes can be calculated by making conservative assumptions regarding the correlations for a variety of covariance structures. The most conservative choice for the correlation depends on the covariance structure and the number of repeated measures. In the absence of good estimates of the correlations, the sample size is often based on a two-sample t-test, making the 'ultra' conservative and unrealistic assumption that there are zero correlations between the baseline and follow-up measures while at the same time assuming there are perfect correlations between the follow-up measures. Compared to the case of taking a single measurement, substantial savings in sample size can be realized by accounting for the repeated measures, even with very conservative assumptions regarding the parameters of the assumed correlation matrix. Assuming compound symmetry, the sample size from the two-sample t-test calculation can be reduced at least 44%, 56%, and 61% for repeated measures analysis of covariance by taking 2, 3, and 4 follow-up measures, respectively. The results offer a rational basis for determining a fairly conservative, yet efficient, sample size for clinical trials with repeated measures and a baseline value.
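The quoted reductions can be reproduced with the standard variance factor for ANCOVA on the mean of k follow-up measures adjusted for one baseline under compound symmetry (the Frison–Pocock form; using this particular formula is our assumption, since the abstract does not state it). Maximizing the factor over the correlation gives the conservative design the abstract describes:

```python
def ancova_variance_factor(rho, k):
    """Variance multiplier, relative to a single post-randomization
    measurement per subject, for ANCOVA on the mean of k follow-ups with
    one baseline covariate, assuming compound-symmetry correlation rho."""
    return (1 + (k - 1) * rho) / k - rho ** 2


def worst_case_reduction(k, steps=10000):
    """Sample size reduction vs. the two-sample t-test when rho takes its
    most conservative (variance-maximizing) value on [0, 1]."""
    worst = max(ancova_variance_factor(i / steps, k) for i in range(steps + 1))
    return 1 - worst


for k in (2, 3, 4):
    print(k, round(worst_case_reduction(k), 2))  # → 0.44, 0.56, 0.61
```

The worst-case correlations are interior points (e.g. rho = 0.25 for k = 2), which is why the conservative choice "depends on the covariance structure and the number of repeated measures" rather than being rho = 0 or rho = 1.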

  19. 77 FR 53769 - Receipts-Based, Small Business Size Standard; Confirmation of Effective Date

    Science.gov (United States)

    2012-09-04

    The NRC is increasing its receipts-based, small business size standard from $6.5 million to $7 million to conform to the standard set by the Small Business Administration under the Regulatory Flexibility Act of 1980, as amended.

  20. 78 FR 30384 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule

    Science.gov (United States)

    2013-05-22

    The U.S. Small Business Administration seeks comments on a waiver of the Nonmanufacturer Rule concerning small businesses, Service-Disabled Veteran-Owned (SDVO) small businesses, Women-Owned Small Businesses (WOSBs), and participants in SBA's 8(a) Business Development (BD) program.

  1. JAIF report on small- and medium-size LWRs

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    The Ministry of International Trade and Industry has published a report on the results of a study on the utilization of small and medium size LWRs. The Japan Atomic Industrial Forum undertook this research at the commission of the Ministry. It studied six hypothetical PWR and BWR designs with thermal output ranging from 20 MWt to 700 MWt, considering their siting conditions, reactor capacity and examples of utilization of the corresponding steam output. A large housing estate of 5,000 units, a local core city with a population of 100,000, industrial utilization for petrochemical plants and paper and pulp plants, and combined utilization for a factory and a housing estate of 5,000 units were examined. The report concluded that a city with a population of 100,000 would be the best candidate for heat supply from a nuclear reactor, and that it is important to develop nuclear reactors with 200 MWt capacity. This type is versatile in its siting conditions and can be constructed near a city; accordingly, it offers the possibility of extensive utilization. The significance of small and medium size LWRs as an alternative energy source to oil, the regional development accompanying their use, the technical development toward scale reduction, easy operation and versatile utilization, the economic evaluation of the utilization models and the tasks for the future are reported. (Kako, I.)

  2. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  3. Successful E-Learning in Small and Medium-Sized Enterprises

    Science.gov (United States)

    Paulsen, Morten Flate

    2009-01-01

    So far, e-learning has primarily been used when there are many learners involved. The up-front investments related to e-learning are relatively high, and may be perceived as prohibitive for small and medium-sized enterprises (SMEs). Some e-learning is, however, getting less expensive, and some e-learning models are more suited for small-scale…

  4. Context matters: volunteer bias, small sample size, and the value of comparison groups in the assessment of research-based undergraduate introductory biology lab courses.

    Science.gov (United States)

    Brownell, Sara E; Kloser, Matthew J; Fukami, Tadashi; Shavelson, Richard J

    2013-01-01

    The shift from cookbook to authentic research-based lab courses in undergraduate biology necessitates the need for evaluation and assessment of these novel courses. Although the biology education community has made progress in this area, it is important that we interpret the effectiveness of these courses with caution and remain mindful of inherent limitations to our study designs that may impact internal and external validity. The specific context of a research study can have a dramatic impact on the conclusions. We present a case study of our own three-year investigation of the impact of a research-based introductory lab course, highlighting how volunteer students, a lack of a comparison group, and small sample sizes can be limitations of a study design that can affect the interpretation of the effectiveness of a course.

  5. Sample size calculation while controlling false discovery rate for differential expression analysis with RNA-sequencing experiments.

    Science.gov (United States)

    Bi, Ran; Liu, Peng

    2016-03-31

    RNA-Sequencing (RNA-seq) experiments have been popularly applied to transcriptome studies in recent years. Such experiments are still relatively costly. As a result, RNA-seq experiments often employ a small number of replicates. Power analysis and sample size calculation are challenging in the context of differential expression analysis with RNA-seq data. One challenge is that there are no closed-form formulae to calculate power for the popularly applied tests for differential expression analysis. In addition, false discovery rate (FDR), instead of family-wise type I error rate, is controlled for the multiple testing error in RNA-seq data analysis. So far, there are very few proposals on sample size calculation for RNA-seq experiments. In this paper, we propose a procedure for sample size calculation while controlling FDR for RNA-seq experimental design. Our procedure is based on the weighted linear model analysis facilitated by the voom method which has been shown to have competitive performance in terms of power and FDR control for RNA-seq differential expression analysis. We derive a method that approximates the average power across the differentially expressed genes, and then calculate the sample size to achieve a desired average power while controlling FDR. Simulation results demonstrate that the actual power of several popularly applied tests for differential expression is achieved and is close to the desired power for RNA-seq data with sample size calculated based on our method. Our proposed method provides an efficient algorithm to calculate sample size while controlling FDR for RNA-seq experimental design. We also provide an R package ssizeRNA that implements our proposed method and can be downloaded from the Comprehensive R Archive Network (http://cran.r-project.org).
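The paper's voom-based procedure is implemented in the ssizeRNA package; as a rough stand-in for the core idea of converting an FDR target into a per-test significance level, the sketch below uses the common expected-counts approximation FDR ≈ m0·α / (m0·α + m1·power) together with a simple z-test sample size formula. The gene counts, effect size, and this simplification are our assumptions, not the authors' method.

```python
import math
from statistics import NormalDist


def alpha_for_fdr(fdr, m, m1, avg_power):
    """Per-test alpha so that the expected FDR is about `fdr` when m1 of
    m genes are truly differentially expressed at the given average power."""
    m0 = m - m1
    return fdr * m1 * avg_power / (m0 * (1 - fdr))


def n_per_group(effect, fdr, m, m1, avg_power=0.80):
    """Replicates per group to detect a standardized effect
    (log-fold-change in SD units) with a two-sided z-test at that alpha."""
    a = alpha_for_fdr(fdr, m, m1, avg_power)
    z = NormalDist().inv_cdf(1 - a / 2) + NormalDist().inv_cdf(avg_power)
    return math.ceil(2 * (z / effect) ** 2)


# 10,000 genes, 500 truly differentially expressed, 5% FDR, 1-SD effect
print(n_per_group(1.0, 0.05, 10000, 500))
```

The stringent per-test alpha (here on the order of 0.002 rather than 0.05) is what drives the larger replicate counts needed under FDR control.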

  6. The Power of Low Back Pain Trials: A Systematic Review of Power, Sample Size, and Reporting of Sample Size Calculations Over Time, in Trials Published Between 1980 and 2012.

    Science.gov (United States)

    Froud, Robert; Rajendran, Dévan; Patel, Shilpa; Bright, Philip; Bjørkli, Tom; Eldridge, Sandra; Buchbinder, Rachelle; Underwood, Martin

    2017-06-01

    A systematic review of nonspecific low back pain trials published between 1980 and 2012. To explore what proportion of trials have been powered to detect different bands of effect size; whether there is evidence that sample size in low back pain trials has been increasing; what proportion of trial reports include a sample size calculation; and whether the likelihood of reporting sample size calculations has increased. Clinical trials should have a sample size sufficient to detect a minimally important difference for a given power and type I error rate. An underpowered trial is one within which the probability of type II error is too high. Meta-analyses do not mitigate underpowered trials. Reviewers independently abstracted data on sample size at the point of analysis, whether a sample size calculation was reported, and year of publication. Descriptive analyses were used to explore the ability to detect effect sizes, and regression analyses to explore the relationship between sample size, or the reporting of sample size calculations, and time. We included 383 trials. One-third were powered to detect a standardized mean difference of less than 0.5, and 5% were powered to detect less than 0.3. The average sample size was 153 people, which increased only slightly (∼4 people/yr) from 1980 to 2000, and declined slightly (∼4.5 people/yr) from 2005 to 2011. The power of low back pain trials and the reporting of sample size calculations may need to be increased. It may be justifiable to power a trial to detect only large effects in the case of novel interventions. Level of Evidence: 3.
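For context, the smallest standardized mean difference (SMD) a two-arm trial can detect at a given power follows directly from the normal approximation. This sketch (our illustration, not the review's method) shows that a trial around the review's average size of 153 participants, roughly 76 per arm, is powered only for an SMD of about 0.45:

```python
from statistics import NormalDist


def detectable_smd(n_per_group, alpha=0.05, power=0.80):
    """Smallest standardized mean difference detectable by a two-sided,
    two-sample comparison with n_per_group subjects per arm."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return z * (2.0 / n_per_group) ** 0.5


print(round(detectable_smd(76), 2))  # → 0.45
```

Detecting an SMD of 0.3 at the same power would require roughly four times as many participants per arm, which illustrates why so few of the reviewed trials were powered for small effects.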

  7. Assessing learning in small sized physics courses

    Directory of Open Access Journals (Sweden)

    Emanuela Ene

    2018-01-01

    We describe the construction, validation, and testing of a concept inventory for an Introduction to Physics of Semiconductors course offered by the department of physics to undergraduate engineering students. By design, this inventory addresses both content knowledge and the ability to interpret content via different cognitive processes outlined in Bloom’s revised taxonomy. The primary challenge comes from the low number of test takers. We describe the Rasch modeling analysis for this concept inventory, and the results of the calibration on a small sample size, with the intention of providing a useful blueprint to other instructors. Our study involved 101 students from Oklahoma State University and fourteen faculty teaching or doing research in the field of semiconductors at seven universities. The items were written in four-option multiple-choice format. It was possible to calibrate a 30-item unidimensional scale precisely enough to characterize the student population enrolled each semester and, therefore, to allow the tailoring of the learning activities of each class. We show that this scale can be employed as an item bank from which instructors could extract short testlets and where we can add new items fitting the existing calibration.

  8. Assessing learning in small sized physics courses

    Science.gov (United States)

    Ene, Emanuela; Ackerson, Bruce J.

    2018-01-01

    We describe the construction, validation, and testing of a concept inventory for an Introduction to Physics of Semiconductors course offered by the department of physics to undergraduate engineering students. By design, this inventory addresses both content knowledge and the ability to interpret content via different cognitive processes outlined in Bloom's revised taxonomy. The primary challenge comes from the low number of test takers. We describe the Rasch modeling analysis for this concept inventory, and the results of the calibration on a small sample size, with the intention of providing a useful blueprint to other instructors. Our study involved 101 students from Oklahoma State University and fourteen faculty teaching or doing research in the field of semiconductors at seven universities. The items were written in four-option multiple-choice format. It was possible to calibrate a 30-item unidimensional scale precisely enough to characterize the student population enrolled each semester and, therefore, to allow the tailoring of the learning activities of each class. We show that this scale can be employed as an item bank from which instructors could extract short testlets and where we can add new items fitting the existing calibration.
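The Rasch model used for the calibration has a simple closed form relating student ability to item difficulty; a minimal sketch (function and variable names are illustrative):

```python
import math

def rasch_p(theta, b):
    """Rasch (one-parameter logistic) model: probability that a student
    with ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

A student whose ability equals the item's difficulty has exactly a 50% chance of success, which is what anchors persons and items on a common logit scale.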

  9. Sample size choices for XRCT scanning of highly unsaturated soil mixtures

    Directory of Open Access Journals (Sweden)

    Smith Jonathan C.

    2016-01-01

    Full Text Available Highly unsaturated soil mixtures (clay, sand and gravel) are used as building materials in many parts of the world, and there is increasing interest in understanding their mechanical and hydraulic behaviour. In the laboratory, x-ray computed tomography (XRCT) is becoming more widely used to investigate the microstructures of soils; however, a crucial issue for such investigations is the choice of sample size, especially concerning the scanning of soil mixtures where there will be a range of particle and void sizes. In this paper we present a discussion (centred around a new set of XRCT scans) on sample sizing for the scanning of samples comprising soil mixtures, where a balance has to be made between realistic representation of the soil components and the desire for high resolution scanning. We also comment on the appropriateness of differing sample sizes in comparison to sample sizes used for other geotechnical testing. Void size distributions for the samples are presented and from these some hypotheses are made as to the roles of inter- and intra-aggregate voids in the mechanical behaviour of highly unsaturated soils.

  10. The influence of CEO gender on market orientation and performance in small and medium-sized service businesses

    NARCIS (Netherlands)

    Davis, Peter S.; Babakus, Emin; Englis-Danskin, Paula; Pett, Tim

    2010-01-01

    This study examines the effects of CEO gender on market orientation and performance (growth and profitability) among a sample of small and medium-sized service businesses. Gender was found to have significant indirect effects (via market orientation) on both market performance (growth) and financial performance (profitability).

  11. Small is Beautiful? Firm's Size, Prevention & Food Safety.

    OpenAIRE

    Rouviere, Elodie; Soubeyran, Raphael

    2012-01-01

    The European General Food Law of 2005 and the newly promulgated FDA Food Safety Modernization Act (FFSMA) of 2010 ask all food operators to implement preventive efforts. In this article, we explore the link between firm size and preventive efforts. We show two main results. First, when there is no cross-contamination, small firms will provide higher preventive efforts than large firms. When there is cross-contamination, the effort-size curve may have an "inverted-U" shape. From our results we...

  12. Comparing Server Energy Use and Efficiency Using Small Sample Sizes

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Henry C.; Qin, Yong; Price, Phillip N.

    2014-11-01

    This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications; three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the other brands. The test results show that the power consumption variability caused by the key components as a

  13. Evaluation research of small and medium-sized enterprise informatization on big data

    Science.gov (United States)

    Yang, Na

    2017-09-01

    Against the background of big data, raising the informatization level of small and medium-sized enterprises is a key task: informatization is costly to build, but the investment can bring real benefits to these enterprises. This paper establishes an evaluation system for SME informatization covering hardware and software security, information organization, information technology application and profitability, and overall information capability. Rough set theory was used to reduce the index set, and evaluation was then carried out with a support vector machine (SVM) model. Finally, examples were used to verify the theory and demonstrate the effectiveness of the method.

  14. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
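The folded allele frequency spectrum used as a summary statistic is straightforward to compute from unpolarized SNP data. A minimal sketch under the simplifying assumption of haploid 0/1 allele calls (this is not PopSizeABC's actual code):

```python
from collections import Counter

def folded_sfs(genotypes):
    """Folded site frequency spectrum from unpolarized biallelic SNPs.
    genotypes[site][individual] is a 0/1 allele call (haploid for simplicity)."""
    n = len(genotypes[0])
    sfs = Counter()
    for site in genotypes:
        count = sum(site)
        sfs[min(count, n - count)] += 1  # fold: use the minor-allele count
    return sfs
```

Because folding keeps only the minor-allele count, no ancestral/derived polarization of the SNPs is needed, which is exactly why this statistic suits unpolarized data.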

  15. Small angle neutron scattering measurements of magnetic cluster sizes in magnetic recording disks

    CERN Document Server

    Toney, M

    2003-01-01

    We describe Small Angle Neutron Scattering measurements of the magnetic cluster size distributions for several longitudinal magnetic recording media. We find that the average magnetic cluster size is slightly larger than the average physical grain size, that there is a broad distribution of cluster sizes, and that the cluster size is inversely correlated to the media signal-to-noise ratio. These results show that intergranular magnetic coupling in these media is small and they provide empirical data for the cluster-size distribution that can be incorporated into models of magnetic recording.

  16. A regression-based differential expression detection algorithm for microarray studies with ultra-low sample size.

    Directory of Open Access Journals (Sweden)

    Daniel Vasiliu

    Full Text Available Global gene expression analysis using microarrays and, more recently, RNA-seq, has allowed investigators to understand biological processes at a system level. However, the identification of differentially expressed genes in experiments with small sample size, high dimensionality, and high variance remains challenging, limiting the usability of these tens of thousands of publicly available, and possibly many more unpublished, gene expression datasets. We propose a novel variable selection algorithm for ultra-low-n microarray studies using generalized linear model-based variable selection with a penalized binomial regression algorithm called penalized Euclidean distance (PED. Our method uses PED to build a classifier on the experimental data to rank genes by importance. In place of cross-validation, which is required by most similar methods but not reliable for experiments with small sample size, we use a simulation-based approach to additively build a list of differentially expressed genes from the rank-ordered list. Our simulation-based approach maintains a low false discovery rate while maximizing the number of differentially expressed genes identified, a feature critical for downstream pathway analysis. We apply our method to microarray data from an experiment perturbing the Notch signaling pathway in Xenopus laevis embryos. This dataset was chosen because it showed very little differential expression according to limma, a powerful and widely-used method for microarray analysis. Our method was able to detect a significant number of differentially expressed genes in this dataset and suggest future directions for investigation. Our method is easily adaptable for analysis of data from RNA-seq and other global expression experiments with low sample size and high dimensionality.

  17. 76 FR 42157 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule

    Science.gov (United States)

    2011-07-18

    ... (BD) Program, or Women-Owned Small Business (WOSB) concerns to provide the products of small business... SMALL BUSINESS ADMINISTRATION Small Business Size Standards: Waiver of the Nonmanufacturer Rule AGENCY: U.S. Small Business Administration. ACTION: Notice of Denial to Waive the Nonmanufacturer Rule...

  18. 76 FR 42157 - Small Business Size Standards; Waiver of the Nonmanufacturer Rule

    Science.gov (United States)

    2011-07-18

    ... Development (BD) Program, or Women- Owned Small Business (WOSB) concerns to provide the products of small... SMALL BUSINESS ADMINISTRATION Small Business Size Standards; Waiver of the Nonmanufacturer Rule AGENCY: U.S. Small Business Administration. ACTION: Notice of Retraction of a Class Waiver SUMMARY: The U...

  19. Significance assessment of small-medium sized reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kanno, Minoru [Japan Atomic Power Co., Research and Development Dept., Tokyo (Japan)

    2002-12-01

    Preliminary assessment for deployment of small-medium sized reactors (S and M reactors) as a future option has been conducted at JAPCO (Japan Atomic Power Company) in cooperation with CRIEPI (Central Research Institute of Electric Power Industry). The significance of S and M reactor introduction is as follows: lower investment cost, possible siting near the demand side, enlarged freedom of siting, shorter transmission lines, good compatibility with slow growth in demand, and plain explanation of safety using a simpler system such as an integral-type vessel without piping, natural-convection core cooling and a passive safety system. The deployment of a simpler plant system, modular shop fabrication, ship-shell structured building and a longer operation period can assure economics comparable with those of a large-sized reactor, offsetting the diseconomy of scale. The S and M reactor is also preferable in size for nuclear heat utilization such as hydrogen production. (T. Tanaka)

  20. Adaptive clinical trial designs with pre-specified rules for modifying the sample size: understanding efficient types of adaptation.

    Science.gov (United States)

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2013-04-15

    Adaptive clinical trial design has been proposed as a promising new approach that may improve the drug discovery process. Proponents of adaptive sample size re-estimation promote its ability to avoid 'up-front' commitment of resources, better address the complicated decisions faced by data monitoring committees, and minimize accrual to studies having delayed ascertainment of outcomes. We investigate aspects of adaptation rules, such as timing of the adaptation analysis and magnitude of sample size adjustment, that lead to greater or lesser statistical efficiency. Owing in part to the recent Food and Drug Administration guidance that promotes the use of pre-specified sampling plans, we evaluate alternative approaches in the context of well-defined, pre-specified adaptation. We quantify the relative costs and benefits of fixed sample, group sequential, and pre-specified adaptive designs with respect to standard operating characteristics such as type I error, maximal sample size, power, and expected sample size under a range of alternatives. Our results build on others' prior research by demonstrating in realistic settings that simple and easily implemented pre-specified adaptive designs provide only very small efficiency gains over group sequential designs with the same number of analyses. In addition, we describe optimal rules for modifying the sample size, providing efficient adaptation boundaries on a variety of scales for the interim test statistic for adaptation analyses occurring at several different stages of the trial. We thus provide insight into what are good and bad choices of adaptive sampling plans when the added flexibility of adaptive designs is desired. Copyright © 2012 John Wiley & Sons, Ltd.
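The "expected sample size" operating characteristic the authors compare across designs can be illustrated with a toy two-stage design that stops early for efficacy; the names and the drift parameterization below are illustrative assumptions, not the paper's actual designs:

```python
import math
from statistics import NormalDist

def expected_n_two_stage(n1, n2, theta, z_stop):
    """Expected sample size of a toy two-stage design that stops early for
    efficacy when the interim z-statistic exceeds z_stop. theta is the drift
    per unit sqrt(information); all names here are illustrative."""
    # Under the drift model, the interim z-statistic is Normal(theta*sqrt(n1), 1),
    # so the trial continues to the second stage with this probability:
    p_continue = NormalDist(mu=theta * math.sqrt(n1), sigma=1.0).cdf(z_stop)
    return n1 + p_continue * n2
```

A stronger true effect raises the chance of early stopping and so lowers the expected sample size, which is the trade-off that group sequential and adaptive designs exploit.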

  1. Algorithm for computing significance levels using the Kolmogorov-Smirnov statistic and valid for both large and small samples

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.E.; Fields, D.E.

    1983-10-01

    The KSTEST code presented here is designed to perform the Kolmogorov-Smirnov one-sample test. The code may be used as a stand-alone program or the principal subroutines may be excerpted and used to service other programs. The Kolmogorov-Smirnov one-sample test is a nonparametric goodness-of-fit test. A number of codes to perform this test are in existence, but they suffer from the inability to provide meaningful results in the case of small sample sizes (number of values less than or equal to 80). The KSTEST code overcomes this inadequacy by using two distinct algorithms. If the sample size is greater than 80, an asymptotic series developed by Smirnov is evaluated. If the sample size is 80 or less, a table of values generated by Birnbaum is referenced. Valid results can be obtained from KSTEST when the sample contains from 3 to 300 data points. The program was developed on a Digital Equipment Corporation PDP-10 computer using the FORTRAN-10 language. The code size is approximately 450 card images and the typical CPU execution time is 0.19 s.
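The statistic KSTEST computes, and the Smirnov asymptotic branch it uses for large samples, can be sketched as follows (a Python rendering with a common finite-n correction, not the original FORTRAN-10 code):

```python
import math

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic D = sup |F_n(x) - F(x)|."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        fx = cdf(x)
        # Check the empirical CDF just before and just after each jump.
        d = max(d, (i + 1) / n - fx, fx - i / n)
    return d

def ks_pvalue_asymptotic(d, n, terms=100):
    """Smirnov's asymptotic series for the significance level; KSTEST instead
    references Birnbaum's tables when the sample size is 80 or less."""
    lam = (math.sqrt(n) + 0.12 + 0.11 / math.sqrt(n)) * d
    return 2 * sum((-1) ** (k - 1) * math.exp(-2 * (k * lam) ** 2)
                   for k in range(1, terms + 1))
```

For three evenly spaced points tested against a uniform CDF, D works out to 0.25, far from significant, which illustrates why small samples need the tabulated branch rather than the asymptotic one.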

  2. A practical and theoretical definition of very small field size for radiotherapy output factor measurements.

    Science.gov (United States)

    Charles, P H; Cranmer-Sargison, G; Thwaites, D I; Crowe, S B; Kairn, T; Knight, R T; Kenny, J; Langton, C M; Trapp, J V

    2014-04-01

    This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology including the measurement of dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented. A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position had on OPFs and setting acceptable uncertainties on OPF at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom, and source occlusion. The dominant effect was established and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for various square field sizes of side length from 4 to 100 mm, using a nominal photon energy of 6 MV. According to the practical definition established in this project, field sizes ≤ 15 mm were considered to be very small for 6 MV beams for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or field size uncertainties are 0.5 mm, field sizes ≤ 12 mm were considered to be very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes. Thus the theoretical definition of very small field size coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect. This was found to occur at field sizes ≤ 12 mm. Source occlusion also caused a large change in OPF for field sizes ≤ 8 mm. Based on the results of this study, field sizes ≤ 12 mm were considered to be theoretically very small for 6 MV beams. Extremely

  3. Particle size distribution models of small angle neutron scattering patterns on ferrofluids

    International Nuclear Information System (INIS)

    Sistin Asri Ani; Darminto; Edy Giri Rachman Putra

    2009-01-01

    The Fe3O4 ferrofluid samples were synthesized by a co-precipitation method. The investigation of ferrofluid microstructure is known to be one of the most important problems because the presence of aggregates and their internal structure greatly influences the properties of ferrofluids. The size and the size dispersion of particles in the ferrofluids were determined assuming a log-normal distribution of particle radius. The scattering patterns measured by small angle neutron scattering were fitted by the theoretical scattering functions of two limiting models: a log-normal sphere distribution and a fractal aggregate. Two types of particle are detected: presumably primary particles of 30 Å radius and secondary fractal aggregates of 200 Å with polydispersity of 0.47 up to 0.53. (author)
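The log-normal radius distribution assumed in such fits has a standard closed form; a minimal sketch (parameter names are illustrative, not the authors'):

```python
import math

def lognormal_pdf(r, r0, sigma):
    """Log-normal number distribution of particle radius r, with median
    radius r0 and log-width sigma, as commonly assumed in SANS fitting."""
    return (1.0 / (r * sigma * math.sqrt(2 * math.pi))
            * math.exp(-(math.log(r / r0)) ** 2 / (2 * sigma ** 2)))
```

Fitting r0 and sigma against the measured scattering pattern is what yields reported quantities such as the median radius and the polydispersity.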

  4. Analyzing the management process in small and medium-sized enterprises in the Region of South Bohemia

    Directory of Open Access Journals (Sweden)

    Váchal Jan

    2017-09-01

    Full Text Available This paper analyzes the management process in small and medium-sized enterprises, specifically in the Region of South Bohemia. The testing sample included 180 enterprises. Fundamental statistical information about the SMEs is included, focusing on their numbers, size categories, and specializations in the Region of South Bohemia. The research examined the steepness of management structures and their extent at all management levels. The analysis indicates that micro-enterprises prefer one management level, while small and medium-sized enterprises prefer two management levels, with a statistical dependence on size category. Regarding the number of employees at individual management levels, staff numbers range from 6 in the top positions up to 30 at the operative level. The general business trend involves a transfer to the functional management structure. With respect to strategic management and decision-making, enterprises rely mainly on their own resources. A statistical correlation was proved between the elaborated strategy, the size of the enterprise, and the number of management levels. A strong correlation between the number of management levels and the type of organizational structure was not proved.

  5. SIMPLIFIED MATHEMATICAL MODEL OF SMALL SIZED UNMANNED AIRCRAFT VEHICLE LAYOUT

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available A strong reduction of the design period for new aircraft, using new technology based on artificial intelligence, is a key problem mentioned in forecasts of leading aerospace industry research centers. This article covers an approach to the development of quick aerodynamic design methods based on artificial neural networks. The problem is solved for the classical scheme of a small sized unmanned aircraft vehicle (UAV). The principal parts of the method are a mathematical model of the layout, a layout generator for this type of aircraft built on artificial neural networks, an automatic selection module for cleaning the variety of layouts generated in automatic mode, a robust direct computational fluid dynamics method, and aerodynamic characteristics approximators based on artificial neural networks. Methods based on artificial neural networks occupy an intermediate position between computational fluid dynamics methods or experiments and simplified engineering approaches. The use of ANNs for estimating aerodynamic characteristics puts limitations on the input data. For this task the layout must be presented as a vector with dimension not exceeding several hundred. Vector components must include all main parameters conventionally used for layout description and completely replicate the most important aerodynamic and structural properties. The first stage of the work is presented in this paper. A simplified mathematical model of a small sized UAV was developed. To estimate the range of geometrical parameters of layouts, a review of existing vehicles was done. The result of the work is the algorithm and computer software for generating layouts based on ANN technology. 10000 samples were generated and a dataset containing the geometrical and aerodynamic characteristics of each layout was created.
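The aerodynamic-characteristics approximator is, at its core, a feed-forward network mapping a layout vector to aerodynamic coefficients; a minimal one-hidden-layer sketch in plain Python (the abstract does not specify the actual architecture, so everything here is illustrative):

```python
import math

def mlp_forward(x, w1, b1, w2, b2):
    """One-hidden-layer feed-forward network: a toy stand-in for the ANN
    approximator mapping a layout vector x to output coefficients.
    w1/b1 are hidden-layer weights/biases, w2/b2 the output layer's."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]
```

Once trained on the generated dataset, evaluating such a network costs microseconds, which is what makes ANN surrogates attractive next to full CFD runs.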

  6. Perspective on small and medium size reactors

    International Nuclear Information System (INIS)

    Stahlkopf, K.E.; Braun, C.

    1985-01-01

    There has been renewed interest in the United States in small and medium size nuclear power plants (600 MWe or less). The reasons for this include: (1) there is a large uncertainty in load growth projections. Small to medium size nuclear power plants may be better suited to meet uncertain load growth projections. (2) It has been argued that a large economy of scale exists because of the nature of nuclear power installations. A recent examination of the French program shows that no economy of scale exists between the 900 MWe and 1300 MWe plants. Others have suggested that it is possible to reduce the economy of scale so it is not a prohibitive factor. (3) In the past in the United States, it has been customary for several smaller utilities to share the output of a large nuclear plant to take advantage of the perceived economy of scale. Difficulties have been encountered by some of these enterprises. (4) An examination of capacity factors for the United States shows that plants of smaller output appear to operate more reliably and economically than larger plants.

  7. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies

    OpenAIRE

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M.; Rogers, Peter J.; Hardman, Charlotte A.

    2016-01-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study ...

  8. Multi-actinide analysis with AMS for ultra-trace determination and small sample sizes: advantages and drawbacks

    Energy Technology Data Exchange (ETDEWEB)

    Quinto, Francesca; Lagos, Markus; Plaschke, Markus; Schaefer, Thorsten; Geckeis, Horst [Institute for Nuclear Waste Disposal, Karlsruhe Institute of Technology (Germany); Steier, Peter; Golser, Robin [VERA Laboratory, Faculty of Physics, University of Vienna (Austria)

    2016-07-01

    With the abundance sensitivities of AMS for U-236, Np-237 and Pu-239 relative to U-238 at levels lower than 1E-15, a simultaneous determination of several actinides without previous chemical separation from each other is possible. The actinides are extracted from the matrix elements via an iron hydroxide co-precipitation and the nuclides sequentially measured from the same sputter target. This simplified method allows for the use of non-isotopic tracers and consequently the determination of Np-237 and Am-243 for which isotopic tracers with the degree of purity required by ultra-trace mass-spectrometric analysis are not available. With detection limits of circa 1E+4 atoms in a sample, 1E+8 atoms are determined with circa 1 % relative uncertainty due to counting statistics. This allows for an unprecedented reduction of the sample size down to 100 ml of natural water. However, the use of non-isotopic tracers introduces a dominating uncertainty of up to 30 % related to the reproducibility of the results. The advantages and drawbacks of the novel method will be presented with the aid of recent results from the CFM Project at the Grimsel Test Site and from the investigation of global fallout in environmental samples.
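The quoted 1% uncertainty at 1E+8 atoms follows directly from Poisson counting statistics. The sketch below assumes an overall detection efficiency of about 1E-4; that efficiency is inferred for illustration from the quoted numbers and is not stated in the abstract:

```python
import math

def counting_rel_uncertainty(atoms, efficiency=1e-4):
    """Poisson counting uncertainty: sigma/N = 1/sqrt(detected counts).
    With an assumed ~1e-4 overall detection efficiency, 1e8 atoms in the
    sample give ~1e4 detected counts and hence ~1% relative uncertainty."""
    counts = atoms * efficiency
    return 1.0 / math.sqrt(counts)
```

This is why the non-isotopic-tracer reproducibility (up to 30%) dominates the overall uncertainty budget: counting statistics alone would allow far better precision.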

  9. Simple and multiple linear regression: sample size considerations.

    Science.gov (United States)

    Hanley, James A

    2016-11-01

    The suggested "two subjects per variable" (2SPV) rule of thumb in the Austin and Steyerberg article is a chance to bring out some long-established and quite intuitive sample size considerations for both simple and multiple linear regression. This article distinguishes two of the major uses of regression models that imply very different sample size considerations, neither served well by the 2SPV rule. The first is etiological research, which contrasts mean Y levels at differing "exposure" (X) values and thus tends to focus on a single regression coefficient, possibly adjusted for confounders. The second research genre guides clinical practice. It addresses Y levels for individuals with different covariate patterns or "profiles." It focuses on the profile-specific (mean) Y levels themselves, estimating them via linear compounds of regression coefficients and covariates. By drawing on long-established closed-form variance formulae that lie beneath the standard errors in multiple regression, and by rearranging them for heuristic purposes, one arrives at quite intuitive sample size considerations for both research genres. Copyright © 2016 Elsevier Inc. All rights reserved.
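One of the closed-form variance formulae alluded to, the standard error of a simple-regression slope, rearranges directly into a sample size requirement; a sketch with illustrative names, not the article's notation:

```python
import math

def n_for_slope(sigma_y, sd_x, slope_se_target):
    """Sample size n such that SE(beta) = sigma_y / (sd_x * sqrt(n))
    reaches the target, from the closed-form simple-regression slope variance.
    sigma_y is the residual SD of Y; sd_x is the SD of the exposure X."""
    return math.ceil((sigma_y / (sd_x * slope_se_target)) ** 2)
```

Rearranged this way, the formula makes the intuition concrete: doubling the spread of the exposure X quarters the required n for the same slope precision.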

  10. Context Matters: Volunteer Bias, Small Sample Size, and the Value of Comparison Groups in the Assessment of Research-Based Undergraduate Introductory Biology Lab Courses

    Directory of Open Access Journals (Sweden)

    Sara E. Brownell

    2013-08-01

    Full Text Available The shift from cookbook to authentic research-based lab courses in undergraduate biology creates a need for evaluation and assessment of these novel courses. Although the biology education community has made progress in this area, it is important that we interpret the effectiveness of these courses with caution and remain mindful of inherent limitations to our study designs that may impact internal and external validity. The specific context of a research study can have a dramatic impact on the conclusions. We present a case study of our own three-year investigation of the impact of a research-based introductory lab course, highlighting how volunteer students, a lack of a comparison group, and small sample sizes can be limitations of a study design that can affect the interpretation of the effectiveness of a course.

  11. Development of a small-sized radon data logger

    International Nuclear Information System (INIS)

    Tasaka, Shigeki; Sasaki, Yoshimi

    1996-01-01

    A small-sized radon data logger and an electrostatic collecting radon monitor were developed for the continuous monitoring of environmental radon and radon daughters. A PIN photodiode (PD), an alpha particle detector installed inside a container, attracts radon daughters when charged electrostatically. Alpha particles are completely separated from each other according to their energy. The new logger makes it possible to analyze and save the radon data. Alpha particle count data from radon daughters are automatically integrated over preset time intervals and energy regions. The desiccant P2O5 was placed in the bottom of the monitor, since the collection efficiency of 218Po atoms depends on the humidity of the air. Thirty days of continuous data logging at a 30 min sampling interval can be obtained at any place with a car battery. We observed the radon concentration of the air inside the Super-Kamiokande dome from January 30 to February 8, 1996. The average radon concentration was found to be (46±13) Bq/m3. (author)

  12. The Statistics and Mathematics of High Dimension Low Sample Size Asymptotics.

    Science.gov (United States)

    Shen, Dan; Shen, Haipeng; Zhu, Hongtu; Marron, J S

    2016-10-01

    The aim of this paper is to establish several deep theoretical properties of principal component analysis for multiple-component spike covariance models. Our new results reveal an asymptotic conical structure in critical sample eigendirections under the spike models with distinguishable (or indistinguishable) eigenvalues, when the sample size and/or the number of variables (or dimension) tend to infinity. The consistency of the sample eigenvectors relative to their population counterparts is determined by the ratio between the dimension and the product of the sample size with the spike size. When this ratio converges to a nonzero constant, the sample eigenvector converges to a cone, with a certain angle to its corresponding population eigenvector. In the High Dimension, Low Sample Size case, the angle between the sample eigenvector and its population counterpart converges to a limiting distribution. Several generalizations of the multi-spike covariance models are also explored, and additional theoretical results are presented.

  13. Export Marketing Involvement of African Small and Medium Size ...

    African Journals Online (AJOL)

    Export Marketing Involvement of African Small and Medium Size Enterprises: Tanzania's Fish Processing Industry. ... This study examined the Export marketing involvement of SMEs from the Least ...

  14. The attention-weighted sample-size model of visual short-term memory

    DEFF Research Database (Denmark)

    Smith, Philip L.; Lilburn, Simon D.; Corbett, Elaine A.

    2016-01-01

    exceeded that predicted by the sample-size model for both simultaneously and sequentially presented stimuli. Instead, the set-size effect and the serial position curves with sequential presentation were predicted by an attention-weighted version of the sample-size model, which assumes that one of the items...

  15. 75 FR 68394 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule

    Science.gov (United States)

    2010-11-05

    ..., Participants in SBA's 8(a) Business Development (BD) Program, or Women-Owned Small Business (WOSB) concerns... SMALL BUSINESS ADMINISTRATION Small Business Size Standards: Waiver of the Nonmanufacturer Rule AGENCY: U.S. Small Business Administration. ACTION: Notice of Waiver to the Nonmanufacturer Rule for...

  16. 78 FR 61443 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule

    Science.gov (United States)

    2013-10-03

    ... SBA's 8(a) Business Development Program, or Women-Owned Small Businesses to provide aerospace ball and... businesses, Women-Owned Small Businesses, or Participants in the SBA's 8(a) Business Development Program... SMALL BUSINESS ADMINISTRATION Small Business Size Standards: Waiver of the Nonmanufacturer Rule...

  17. Improvement of environmental management incentives for small and medium-sized enterprises

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Kee Bok; Lee, Seung Kyu; Lim, Chae Woon; Chung, Ho Sun [Korea Environment Institute, Seoul (Korea)

    1998-12-01

    Developed countries have long been preparing national innovation for sustainable development. Leading international firms, such as multinational enterprises, are strengthening active and strategic environmental management. In spite of such changes in the world, however, small and medium-sized enterprises do not show any substantial change. Most of these enterprises do not recognize the need for environmental management, and even when they intend to improve environmentally, they lack the capacity to maintain effective management. Moreover, since small and medium-sized enterprises account for over 99% of all enterprises in Korea, it is hardly possible to establish an effective regulatory system. In recognition of these problems, the purpose of this study is to develop a policy plan that induces the development of environmentally friendly small and medium-sized enterprises. 138 refs., 4 figs., 195 tabs.

  18. Role of Relationship Marketing in Small and Medium-Sized Entreprises

    OpenAIRE

    Ružica Butigan; Ivana Mahnić

    2011-01-01

    Along with marketing, small and medium enterprises (SMEs) are commonly associated in literature with relationship marketing which results in marketing networks. This paper examines the specific characteristics that differentiate large companies from small and medium-sized enterprises, and the reasons that prevent SMEs from engaging in traditional marketing within the scope of marketing mix. The paper also shows that the key characteristic which distinguishes small from large companies is a pr...

  19. Core and shell sizing of small silver-coated nanospheres by optical extinction spectroscopy

    International Nuclear Information System (INIS)

    Schinca, D C; Scaffardi, L B

    2008-01-01

    Silver metal nanoparticles (Nps) are extensively used in different areas of research and technology due to their interesting optical, thermal and electric properties, especially for bare core and core-shell nanostructures with sizes smaller than 10 nm. Since these properties are core-shell size-dependent, size measurement is important in manipulating their potential functionalization and applications. Bare and coated small silver Nps fabricated by physical and chemical methods present specific characteristics in their extinction spectra that are potentially useful for sizing purposes. This work presents a novel procedure to size mean core radius smaller than 10 nm and mean shell thickness of silver core-shell Nps based on a comparative study of the characteristics in their optical extinction spectra in different media as a function of core radii, shell thickness and coating refractive index. From the regularities derived from these relationships, it can be concluded that plasmon full width at half-maximum (FWHM) is sensitive to core size but not to coating thickness, while plasmon resonance wavelength (PRW) is related to shell thickness and mostly independent of core radius. These facts, which allow sizing simultaneously both mean core radius and shell thickness, can also be used to size bare silver Nps as a special case of core-shell Nps with zero shell thickness. The proposed method was applied to size experimental samples and the results show good agreement with conventional TEM microscopy.

  20. 78 FR 76886 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule

    Science.gov (United States)

    2013-12-19

    ..., Participants in SBA's 8(a) Business Development (BD) Program, or Women-Owned Small Business (WOSB) concerns... SMALL BUSINESS ADMINISTRATION Small Business Size Standards: Waiver of the Nonmanufacturer Rule AGENCY: U.S. Small Business Administration. ACTION: Notice of Final Action: Granting Class Waiver of the...

  1. Effects of Group Size on Students Mathematics Achievement in Small Group Settings

    Science.gov (United States)

    Enu, Justice; Danso, Paul Amoah; Awortwe, Peter K.

    2015-01-01

    An ideal group size is hard to obtain in small group settings; hence there are groups with more members than others. The purpose of the study was to find out whether group size has any effects on students' mathematics achievement in small group settings. Two third year classes of the 2011/2012 academic year were selected from two schools in the…

  2. Sample size re-assessment leading to a raised sample size does not inflate type I error rate under mild conditions.

    Science.gov (United States)

    Broberg, Per

    2013-07-19

    One major concern with adaptive designs, such as sample size adjustable designs, has been the fear of inflating the type I error rate. In (Stat Med 23:1023-1038, 2004) it is, however, proven that when observations follow a normal distribution and the interim results show promise, meaning that the conditional power exceeds 50%, the type I error rate is protected. This bound and the distributional assumptions may seem to impose undesirable restrictions on the use of these designs. In (Stat Med 30:3267-3284, 2011) the possibility of going below 50% is explored, and a region that permits an increased sample size without inflation is defined in terms of the conditional power at the interim. A criterion which is implicit in (Stat Med 30:3267-3284, 2011) is derived by elementary methods and expressed in terms of the test statistic at the interim to simplify practical use. Mathematical and computational details concerning this criterion are exhibited. Under very general conditions the type I error rate is preserved under sample size adjustable schemes that permit a raise. The main result states that for normally distributed observations, raising the sample size when the result looks promising, where the definition of promising depends on the amount of knowledge gathered so far, guarantees the protection of the type I error rate. Also, in the many situations where the test statistic approximately follows a normal law, the deviation from the main result remains negligible. This article provides details regarding the Weibull and binomial distributions and indicates how one may approach these distributions within the current setting. There is thus reason to consider such designs more often, since they offer a means of adjusting an important design feature at little or no cost in terms of error rate.
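    The 50% conditional-power bound can be made concrete with the standard Brownian-motion formula for conditional power under the current trend (a sketch under common assumptions; the paper's own criterion is expressed in terms of the interim test statistic and may differ):

```python
from math import sqrt
from statistics import NormalDist

def conditional_power(z_interim, info_frac, alpha=0.025):
    """Conditional power of a one-sided normal test, assuming the drift
    estimated at the interim continues for the rest of the trial."""
    Phi = NormalDist().cdf
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    drift = z_interim / sqrt(info_frac)   # estimated drift per unit information
    return Phi((drift - z_alpha) / sqrt(1 - info_frac))

# A promising interim result: conditional power above 50% at half information.
cp = conditional_power(z_interim=1.5, info_frac=0.5)
print(cp > 0.5)
```

In the result quoted above, a raise in sample size when such a computation looks promising does not inflate the one-sided type I error rate.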

  3. Pb isotope analysis of ng size samples by TIMS equipped with a 1013 Ω resistor using a 207Pb-204Pb double spike

    NARCIS (Netherlands)

    Klaver, M.; Smeets, R.J.; Koornneef, J.M.; Davies, G.R.; Vroon, P.Z.

    2016-01-01

    The use of the double spike technique to correct for instrumental mass fractionation has yielded high precision results for lead isotope measurements by thermal ionisation mass spectrometry (TIMS), but the applicability to ng size Pb samples is hampered by the small size of the

  4. Efficient inference of population size histories and locus-specific mutation rates from large-sample genomic variation data.

    Science.gov (United States)

    Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S

    2015-02-01

    With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.
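    The "analytic results on the expected frequency spectrum under the coalescent" reduce, in the simplest constant-size case, to the standard formula E[ξ_i] = θ/i; a minimal sketch of that baseline (my naming, not the authors' code):

```python
from math import isclose

def expected_sfs(n, theta=1.0):
    """Expected site frequency spectrum for a constant-size population
    under the standard coalescent: E[xi_i] = theta / i, i = 1..n-1."""
    return [theta / i for i in range(1, n)]

sfs = expected_sfs(10, theta=2.0)
# Summing the spectrum recovers Watterson's expected number of
# segregating sites, theta * H_{n-1}.
harmonic = sum(1.0 / i for i in range(1, 10))
print(isclose(sum(sfs), 2.0 * harmonic))
```

The inference method in the abstract generalizes such expectations to piecewise-exponential population histories and differentiates them with respect to the demographic parameters.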

  5. Human resources management within the process management in small and medium-sized enterprises

    Directory of Open Access Journals (Sweden)

    Marie Duspivová

    2013-01-01

    Full Text Available Sector of small and medium-sized enterprises is regarded as the backbone of the economy and a driving force of innovation, employment and social integration. Development of the sector of small and medium-sized enterprises in the Czech Republic has a substantial impact on economic and social development of the country and its various regions. This article deals with the human resources management in small and medium-sized enterprises, because it is more than obvious recently that the prosperity of the organization is depended on human resources and management of them can determine not only whether the organizations succeed, but whether it will be able to survive in turbulent conditions in the present world. The main aim of this paper is to analytical describe the monitoring the process of human resources management in selected categories of enterprises by business activity and number of employees including statistical analysis of causal effects. Further to analytical describe the evaluation the process of human resources management and indicators of this process, which are monitored by small and medium-sized enterprises. To achieve this aim were used selected primary data collected in the project GAJU 068/2010/S titled “Process management and its possible implementation in small and medium-sized enterprises”.

  6. Conditional estimation of local pooled dispersion parameter in small-sample RNA-Seq data improves differential expression test.

    Science.gov (United States)

    Gim, Jungsoo; Won, Sungho; Park, Taesung

    2016-10-01

    High throughput sequencing technology in transcriptomics studies contributes to the understanding of gene regulation mechanisms and their cellular function, but also increases the need for accurate statistical methods to assess quantitative differences between experiments. Many methods have been developed to account for the specifics of count data: non-normality, a dependence of the variance on the mean, and small sample size. Among these, the small number of samples in typical experiments is still a challenge. Here we present a method for differential analysis of count data, using conditional estimation of local pooled dispersion parameters. A comprehensive evaluation of our proposed method for differential gene expression analysis, using both simulated and real data sets, shows that it is more powerful than other existing methods while controlling the false discovery rate. By introducing conditional estimation of local pooled dispersion parameters, we successfully overcome the limitation of low power and enable a powerful quantitative analysis focused on the differential expression test with a small number of samples.
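    A generic way to see why pooling dispersions helps with few replicates is a method-of-moments negative-binomial estimator shrunk toward genes of similar mean expression (an illustrative sketch only; the paper's conditional local-pooling estimator is more sophisticated, and all names here are mine):

```python
import numpy as np

def moment_dispersion(counts):
    """Per-gene method-of-moments NB dispersion: var = mu + phi * mu^2."""
    mu = counts.mean(axis=1)
    var = counts.var(axis=1, ddof=1)
    return np.maximum((var - mu) / mu**2, 0.0)

def locally_pooled_dispersion(counts, window=5):
    """Average each gene's dispersion with genes of similar mean
    expression, shrinking noisy small-sample estimates."""
    phi = moment_dispersion(counts)
    order = np.argsort(counts.mean(axis=1))
    pooled = np.empty_like(phi)
    for rank, gene in enumerate(order):
        lo = max(0, rank - window)
        hi = min(len(phi), rank + window + 1)
        pooled[gene] = phi[order[lo:hi]].mean()
    return pooled

rng = np.random.default_rng(0)
counts = rng.poisson(lam=20.0, size=(100, 3))   # 100 genes, 3 replicates
pooled = locally_pooled_dispersion(counts)
print(pooled.shape)
```

With only 3 replicates the raw per-gene estimates are extremely noisy; the local average borrows strength across genes, which is the general idea behind pooled-dispersion tests.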

  7. Potential growth and opportunities for Kenyan small sized firms online : E-business adoption for small boutiques and the new customer online

    OpenAIRE

    Mbare, Candy

    2016-01-01

    The purpose of this thesis is to identify online growth opportunities for small-sized firms in Kenya. Kenya is the most developed nation in East Africa. E-commerce is a rapidly rising trend in developing nations that provides small-sized firms with growth opportunities. The e-commerce trend also comes with changes in consumer behavior. Sufficient use of e-commerce by small-sized firms in Kenya can open an opportunity to reach a wider audience and increase profits by expan...

  8. Development of electric discharge equipment for small specimen sampling

    International Nuclear Information System (INIS)

    Okamoto, Koji; Kitagawa, Hideaki; Kusumoto, Junichi; Kanaya, Akihiro; Kobayashi, Toshimi

    2009-01-01

    We have developed on-site electric discharge sampling equipment that can effectively take samples, such as small specimens, from the surface portion of plant components. Compared with conventional sampling equipment, our equipment can take samples that are thinner in depth and larger in area. In addition, the effect on the sampled component is kept to a minimum, and the thermally affected zone of the material due to electric discharge is small enough to be ignored. Therefore, our equipment is well suited to taking samples for various tests, such as residual life evaluation.

  9. Determination of a representative volume element based on the variability of mechanical properties with sample size in bread.

    Science.gov (United States)

    Ramírez, Cristian; Young, Ashley; James, Bryony; Aguilera, José M

    2010-10-01

    Quantitative analysis of food structure is commonly obtained by image analysis of a small portion of the material that may not be representative of the whole sample. In order to quantify structural parameters (air cells) of 2 types of bread (bread and bagel), the concept of a representative volume element (RVE) was employed. The RVE for bread, bagel, and gelatin-gel (used as control) was obtained from the relationship between sample size and the coefficient of variation, calculated from the apparent Young's modulus measured on 25 replicates. The RVE was obtained when the coefficient of variation for different sample sizes converged to a constant value. In the 2 types of bread tested, the coefficient of variation tended to decrease as the sample size increased, while in the homogeneous gelatin-gel it remained constant at around 2.3% to 2.4%. The RVE turned out to be cubes with sides of 45 mm for bread, 20 mm for bagels, and 10 mm for gelatin-gel (smallest sample tested). The quantitative image analysis as well as visual observation demonstrated that bread presented the largest dispersion of air-cell sizes. Moreover, both the ratio of maximum air-cell area/image area and maximum air-cell height/image height were greater for bread (values of 0.05 and 0.30, respectively) than for bagels (0.03 and 0.20, respectively). Therefore, the size and the size variation of air cells present in the structure determined the size of the RVE. It was concluded that the RVE is highly dependent on the heterogeneity of the structure of the types of baked products.
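    The convergence criterion — the coefficient of variation of the apparent modulus levelling off as sample size grows — can be sketched with a toy heterogeneous material (hypothetical gamma-distributed "cells", not the bread data; all names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)

def coefficient_of_variation(x):
    return np.std(x, ddof=1) / np.mean(x)

def apparent_modulus(side, n_reps=25):
    """Toy model: the apparent modulus of a cube of side `side` is the
    average over its side**3 random cells (25 replicates, as in the study)."""
    return rng.gamma(shape=2.0, scale=1.0, size=(n_reps, side**3)).mean(axis=1)

# The CV drops as the sample gets larger, then levels off; the RVE is the
# smallest size at which the CV has converged to a constant value.
cvs = [coefficient_of_variation(apparent_modulus(side)) for side in (2, 4, 8, 16)]
print(cvs[0] > cvs[-1])
```

A homogeneous material (like the gelatin-gel control) would show a flat CV from the start, which is why its RVE is the smallest sample tested.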

  10. Small-size pedestrian detection in large scene based on fast R-CNN

    Science.gov (United States)

    Wang, Shengke; Yang, Na; Duan, Lianghua; Liu, Lu; Dong, Junyu

    2018-04-01

    Pedestrian detection is a canonical sub-problem of object detection that has been in high demand in recent years. Although recent deep learning object detectors such as Fast/Faster R-CNN have shown excellent performance for general object detection, they have limited success for small-size pedestrian detection in large-view scenes. We find that the insufficient resolution of feature maps leads to unsatisfactory accuracy when handling small instances. In this paper, we investigate issues involving Fast R-CNN for pedestrian detection. Driven by these observations, we propose a very simple but effective baseline for pedestrian detection based on Fast R-CNN, employing the DPM detector to generate proposals for accuracy, and training a Fast R-CNN style network to jointly optimize small-size pedestrian detection, with skip connections concatenating features from different layers to address the coarseness of the feature maps. Accuracy for small-size pedestrian detection in real large scenes is thereby improved.

  11. Sample Size and Saturation in PhD Studies Using Qualitative Interviews

    Directory of Open Access Journals (Sweden)

    Mark Mason

    2010-08-01

    Full Text Available A number of issues can affect sample size in qualitative research; however, the guiding principle should be the concept of saturation. This has been explored in detail by a number of authors but is still hotly debated, and some say little understood. A sample of PhD studies using qualitative approaches, with qualitative interviews as the method of data collection, was taken from theses.com and content analysed for their sample sizes. Five hundred and sixty studies were identified that fitted the inclusion criteria. Results showed that the mean sample size was 31; however, the distribution was non-random, with a statistically significant proportion of studies presenting sample sizes that were multiples of ten. These results are discussed in relation to saturation. They suggest a pre-meditated approach that is not wholly congruent with the principles of qualitative research. URN: urn:nbn:de:0114-fqs100387
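    The "multiples of ten" finding can be checked with a one-sided exact binomial test on the last digit (a hedged sketch with made-up data; p0 = 0.1 assumes last digits are uniform under the null, which is itself an approximation):

```python
from math import comb

def pvalue_excess_multiples_of_ten(sizes, p0=0.1):
    """One-sided exact binomial test: are multiples of ten
    over-represented among the reported sample sizes?"""
    n = len(sizes)
    k = sum(1 for s in sizes if s % 10 == 0)
    # Upper tail: P(X >= k) for X ~ Binomial(n, p0)
    return sum(comb(n, j) * p0**j * (1 - p0)**(n - j) for j in range(k, n + 1))

# Hypothetical data: 20 of 50 studies report round sample sizes.
sizes = [30] * 20 + [27, 31, 33] * 10
p = pvalue_excess_multiples_of_ten(sizes)
print(p < 0.001)
```

A p-value this small would indicate digit preference, i.e. sample sizes chosen in advance rather than determined by saturation.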

  12. Variability of the raindrop size distribution at small spatial scales

    Science.gov (United States)

    Berne, A.; Jaffrain, J.

    2010-12-01

    Because of the interactions between atmospheric turbulence and cloud microphysics, the raindrop size distribution (DSD) is strongly variable in space and time. The spatial variability of the DSD at small spatial scales (below a few km) is not well documented and not well understood, mainly because of a lack of adequate measurements at the appropriate resolutions. A network of 16 disdrometers (Parsivels) has been designed and set up over EPFL campus in Lausanne, Switzerland. This network covers a typical operational weather radar pixel of 1x1 km2. The question of the significance of the variability of the DSD at such small scales is relevant for radar remote sensing of rainfall because the DSD is often assumed to be uniform within a radar sample volume and because the Z-R relationships used to convert the measured radar reflectivity Z into rain rate R are usually derived from point measurements. Thanks to the number of disdrometers, it was possible to quantify the spatial variability of the DSD at the radar pixel scale and to show that it can be significant. In this contribution, we show that the variability of the total drop concentration, of the median volume diameter and of the rain rate are significant, taking into account the sampling uncertainty associated with disdrometer measurements. The influence of this variability on the Z-R relationship can be non-negligible. Finally, the spatial structure of the DSD is quantified using a geostatistical tool, the variogram, and indicates high spatial correlation within a radar pixel.

  13. Accelerator mass spectrometry of small biological samples.

    Science.gov (United States)

    Salehpour, Mehran; Forsgard, Niklas; Possnert, Göran

    2008-12-01

    Accelerator mass spectrometry (AMS) is an ultra-sensitive technique for isotopic ratio measurements. In the biomedical field, AMS can be used to measure femtomolar concentrations of labeled drugs in body fluids, with direct applications in early drug development such as microdosing. Likewise, the regenerative properties of cells, which are of fundamental significance in stem-cell research, can be determined with an accuracy of a few years by AMS analysis of human DNA. However, AMS nominally requires about 1 mg of carbon per sample, which is not always available when dealing with specific body substances such as localized, organ-specific DNA samples. Consequently, it is of analytical interest to develop methods for the routine analysis of small samples in the range of a few tens of µg. We have used a 5 MV Pelletron tandem accelerator to study small biological samples using AMS. Different methods are presented and compared. A (12)C-carrier sample preparation method is described which is potentially more sensitive and less susceptible to contamination than the standard procedures.

  14. Self-navigation of a scanning tunneling microscope tip toward a micron-sized graphene sample.

    Science.gov (United States)

    Li, Guohong; Luican, Adina; Andrei, Eva Y

    2011-07-01

    We demonstrate a simple capacitance-based method to quickly and efficiently locate micron-sized conductive samples, such as graphene flakes, on insulating substrates in a scanning tunneling microscope (STM). By using edge recognition, the method is designed to locate and to identify small features when the STM tip is far above the surface, allowing for crash-free search and navigation. The method can be implemented in any STM environment, even at low temperatures and in strong magnetic field, with minimal or no hardware modifications.

  15. Performance and separation occurrence of binary probit regression estimator using maximum likelihood method and Firths approach under different sample size

    Science.gov (United States)

    Lusiana, Evellin Dewi

    2017-12-01

    The parameters of a binary probit regression model are commonly estimated by the maximum likelihood estimation (MLE) method. However, the MLE method has a limitation if the binary data contain separation. Separation is the condition where one or several independent variables exactly separate the categories of the binary response. It causes the MLE estimators to fail to converge, so that they cannot be used in modeling. One way to resolve separation is to use Firth's approach instead. This research has two aims. First, to identify the chance of separation occurring in the binary probit regression model under the MLE method and Firth's approach. Second, to compare the performance of the binary probit regression estimators obtained by the MLE method and Firth's approach using the RMSE criterion. Both are carried out by simulation under different sample sizes. The results showed that the chance of separation occurring under the MLE method for small sample sizes is higher than under Firth's approach. For larger sample sizes, the probability decreased and was nearly identical between the two methods. Meanwhile, Firth's estimators have smaller RMSE than the MLEs, especially for smaller sample sizes, while for larger sample sizes the RMSEs are not much different. This means that Firth's estimators outperformed the MLE estimators.
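    Complete separation, as defined above, is easy to detect for a single continuous covariate: every value observed with y = 0 lies strictly on one side of every value observed with y = 1 (a minimal sketch; with several covariates, detection in general requires solving a linear-programming feasibility problem):

```python
def has_complete_separation(x, y):
    """Check whether a single continuous predictor completely separates a
    binary response: all x with y=0 lie on one side of all x with y=1."""
    x0 = [xi for xi, yi in zip(x, y) if yi == 0]
    x1 = [xi for xi, yi in zip(x, y) if yi == 1]
    return max(x0) < min(x1) or max(x1) < min(x0)

print(has_complete_separation([1, 2, 3, 4], [0, 0, 1, 1]))   # separated
print(has_complete_separation([1, 3, 2, 4], [0, 1, 1, 0]))   # overlapping
```

When this check returns True, the probit (or logit) likelihood is maximized only as the coefficient tends to infinity, which is exactly the non-convergence the abstract describes and what Firth's penalized likelihood is designed to repair.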

  16. Unions in small and medium-sized enterprises

    DEFF Research Database (Denmark)

    Holten, Ann-Louise; Crouch, Colin

    2014-01-01

    Trade unions are commonly weak in small- and medium-sized enterprises, which constitute a majority of European firms and are often family-owned. We investigate the influence of family ownership on employee membership, perceptions and experience with unions in Danish and Italian firms in the textile and clothing sector. Family ownership reduces union membership; and within family firms, the number of family members employed is negatively associated with unionization rates and employee perceptions of unions.

  17. Traceable size determination of PMMA nanoparticles based on Small Angle X-ray Scattering (SAXS)

    Energy Technology Data Exchange (ETDEWEB)

    Gleber, G; Cibik, L; Mueller, P; Krumrey, M [Physikalisch-Technische Bundesanstalt (PTB), Abbestrasse 2-12, 10587 Berlin (Germany); Haas, S; Hoell, A, E-mail: gudrun.gleber@ptb.d [Helmholtz-Zentrum-Berlin fuer Materialien und Energie (HZB), Albert-Einstein-Strasse 15, 12489 Berlin (Germany)

    2010-10-01

    The size and size distribution of PMMA nanoparticles has been investigated with SAXS (small angle X-ray scattering) using monochromatized synchrotron radiation. The uncertainty has contributions from the wavelength or photon energy of the radiation, the scattering angle and the fit procedure for the obtained scattering curves. The wavelength can be traced back to the lattice constant of silicon, and the scattering angle is traceable via geometric measurements of the detector pixel size and the distance between the sample and the detector. SAXS measurements and data evaluations have been performed at different distances and photon energies for two PMMA nanoparticle suspensions with low polydispersity and nominal diameters of 108 nm and 192 nm, respectively, as well as for a mixture of both. The relative variation of the diameters obtained for different experimental conditions was below ± 0.3 %. The determined number-weighted mean diameters of (109.0 ± 0.7) nm and (188.0 ± 1.3) nm, respectively, are close to the nominal values.

  18. Traceable size determination of PMMA nanoparticles based on Small Angle X-ray Scattering (SAXS)

    Science.gov (United States)

    Gleber, G.; Cibik, L.; Haas, S.; Hoell, A.; Müller, P.; Krumrey, M.

    2010-10-01

    The size and size distribution of PMMA nanoparticles has been investigated with SAXS (small angle X-ray scattering) using monochromatized synchrotron radiation. The uncertainty has contributions from the wavelength or photon energy of the radiation, the scattering angle and the fit procedure for the obtained scattering curves. The wavelength can be traced back to the lattice constant of silicon, and the scattering angle is traceable via geometric measurements of the detector pixel size and the distance between the sample and the detector. SAXS measurements and data evaluations have been performed at different distances and photon energies for two PMMA nanoparticle suspensions with low polydispersity and nominal diameters of 108 nm and 192 nm, respectively, as well as for a mixture of both. The relative variation of the diameters obtained for different experimental conditions was below ± 0.3 %. The determined number-weighted mean diameters of (109.0 ± 0.7) nm and (188.0 ± 1.3) nm, respectively, are close to the nominal values.

  19. Traceable size determination of PMMA nanoparticles based on Small Angle X-ray Scattering (SAXS)

    International Nuclear Information System (INIS)

    Gleber, G; Cibik, L; Mueller, P; Krumrey, M; Haas, S; Hoell, A

    2010-01-01

    The size and size distribution of PMMA nanoparticles has been investigated with SAXS (small angle X-ray scattering) using monochromatized synchrotron radiation. The uncertainty has contributions from the wavelength or photon energy of the radiation, the scattering angle and the fit procedure for the obtained scattering curves. The wavelength can be traced back to the lattice constant of silicon, and the scattering angle is traceable via geometric measurements of the detector pixel size and the distance between the sample and the detector. SAXS measurements and data evaluations have been performed at different distances and photon energies for two PMMA nanoparticle suspensions with low polydispersity and nominal diameters of 108 nm and 192 nm, respectively, as well as for a mixture of both. The relative variation of the diameters obtained for different experimental conditions was below ± 0.3 %. The determined number-weighted mean diameters of (109.0 ± 0.7) nm and (188.0 ± 1.3) nm, respectively, are close to the nominal values.

  20. Sample size allocation in multiregional equivalence studies.

    Science.gov (United States)

    Liao, Jason J Z; Yu, Ziji; Li, Yulan

    2018-06-17

    With the increasing globalization of drug development, the multiregional clinical trial (MRCT) has gained extensive use. The data from MRCTs can be accepted by regulatory authorities across regions and countries as the primary source of evidence to support global marketing drug approval simultaneously. The MRCT can speed up patient enrollment and drug approval, and it makes effective therapies available to patients all over the world simultaneously. However, there are many challenges, both operational and scientific, in conducting drug development globally. One of many important questions to answer in the design of a multiregional study is how to partition the sample size among the individual regions. In this paper, two systematic approaches are proposed for sample size allocation in a multiregional equivalence trial. A numerical evaluation and a biosimilar trial are used to illustrate the characteristics of the proposed approaches. Copyright © 2018 John Wiley & Sons, Ltd.

  1. Mechanism of Interaction between Entrepreneurial Spirit and Growth of Small and Medium Sized Private Enterprises

    Institute of Scientific and Technical Information of China (English)

    Peng XIE

    2016-01-01

    In the context of the government's appeal for "mass entrepreneurship and innovation", all regions have launched movements of "mass entrepreneurship" and "grassroots entrepreneurship". As part of the vitality of China's market economy, small and medium-sized private enterprises have become the grassroots of market competition in talent, funding, and technology. This paper combined three levels of entrepreneurial spirit with small and medium-sized private enterprises, studied the acting mechanism of entrepreneurial spirit, discussed the competitive power of small and medium-sized private enterprises, and recommended that such enterprises cultivate core competitiveness, adapt to the external environment, and create external environmental support, in order to realize sound development.

  2. Sampling strategies for estimating brook trout effective population size

    Science.gov (United States)

    Andrew R. Whiteley; Jason A. Coombs; Mark Hudy; Zachary Robinson; Keith H. Nislow; Benjamin H. Letcher

    2012-01-01

    The influence of sampling strategy on estimates of effective population size (Ne) from single-sample genetic methods has not been rigorously examined, though these methods are increasingly used. For headwater salmonids, spatially close kin association among age-0 individuals suggests that sampling strategy (number of individuals and location from...

  3. The status of development of small and medium sized reactors

    International Nuclear Information System (INIS)

    Konstantinov, L.V; Kupitz, J.

    1987-01-01

    Several IAEA Member States have shown interest in reactor designs with a smaller power rating (100-500 MW(e) range) than those generally available on the international market. These small and medium sized power reactors are of interest either for domestic applications or for export to countries with less developed infrastructure. Various development efforts are under way to make these power reactors ready for offering in the nineties and beyond. The paper gives an overview of the status of, and trends in, the development of small and medium sized reactors in IAEA Member States for the 1990s, and provides an outlook on new reactor designs as a long-term option for nuclear power. (author)

  4. Analysis and comparison of fish growth from small samples of length-at-age data : Detection of sexual dimorphism in Eurasian perch as an example

    NARCIS (Netherlands)

    Mooij, WM; Van Rooij, JM; Wijnhoven, S

    A relatively simple approach is presented for the statistical analysis and comparison of fish growth patterns inferred from size-at-age data. It can be used with any growth model and with small sample sizes. Bootstrapping is used to generate confidence regions for the model parameters and for size and growth
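The bootstrap procedure described above can be sketched in a few lines: resample the size-at-age pairs with replacement, refit the growth model to each resample, and read confidence intervals from the percentiles of the refitted parameters. The sketch below uses a von Bertalanffy curve and synthetic data purely for illustration; the model choice, parameter values, and sample size are assumptions, not those of the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(42)

# Von Bertalanffy growth function: a standard fish-growth model,
# used here only as an example; any model could be substituted.
def vbgf(age, linf, k, t0):
    return linf * (1.0 - np.exp(-k * (age - t0)))

# Synthetic size-at-age data for a deliberately small sample (n = 20).
age = rng.uniform(0.5, 8.0, size=20)
length = vbgf(age, 45.0, 0.35, -0.2) + rng.normal(0.0, 1.5, size=20)

# Nonparametric bootstrap: resample (age, length) pairs, refit, collect parameters.
boot = []
for _ in range(500):
    idx = rng.integers(0, len(age), size=len(age))
    try:
        p, _ = curve_fit(vbgf, age[idx], length[idx], p0=[40.0, 0.3, 0.0], maxfev=5000)
        boot.append(p)
    except RuntimeError:
        continue  # skip resamples where the fit fails to converge
boot = np.array(boot)

# Percentile confidence intervals for each parameter.
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
for name, l, h in zip(["Linf", "k", "t0"], lo, hi):
    print(f"{name}: 95% CI [{l:.2f}, {h:.2f}]")
```

Comparing two groups (e.g., sexes) would repeat this per group and check whether the bootstrap confidence regions overlap.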

  5. Sample Size Induced Brittle-to-Ductile Transition of Single-Crystal Aluminum Nitride

    Science.gov (United States)

    2015-08-01

    ARL-RP-0528 ● AUG 2015 ● US Army Research Laboratory. Sample Size Induced Brittle-to-Ductile Transition of Single-Crystal Aluminum Nitride.

  6. 77 FR 7489 - Small Business Size Standards: Professional, Technical, and Scientific Services

    Science.gov (United States)

    2012-02-10

    ... would (1) provide a competitive advantage to larger firms over their truly small counterparts; (2) allow... Vol. 77, No. 28, Friday, February 10, 2012, Part V. Small Business Administration, 13 CFR Part 121. Small Business Size Standards: Professional, Technical, and Scientific Services; Final Rule.

  7. Statistical characterization of a large geochemical database and effect of sample size

    Science.gov (United States)

    Zhang, C.; Manheim, F.T.; Hinde, J.; Grossman, J.N.

    2005-01-01

    The authors investigated statistical distributions for concentrations of chemical elements from the National Geochemical Survey (NGS) database of the U.S. Geological Survey. At the time of this study, the NGS data set encompassed 48,544 stream sediment and soil samples from the conterminous United States analyzed by ICP-AES following a 4-acid near-total digestion. This report includes 27 elements: Al, Ca, Fe, K, Mg, Na, P, Ti, Ba, Ce, Co, Cr, Cu, Ga, La, Li, Mn, Nb, Nd, Ni, Pb, Sc, Sr, Th, V, Y and Zn. The goal and challenge for the statistical overview was to delineate chemical distributions in a complex, heterogeneous data set spanning a large geographic range (the conterminous United States) and many different geological provinces and rock types. After declustering to create a uniform spatial sample distribution with 16,511 samples, histograms and quantile-quantile (Q-Q) plots were employed to delineate subpopulations that have coherent chemical and mineral affinities. Probability groupings are discerned by changes in slope (kinks) on the plots. Major rock-forming elements, e.g., Al, Ca, K and Na, tend to display linear segments on normal Q-Q plots. These segments can commonly be linked to petrologic or mineralogical associations. For example, linear segments on K and Na plots reflect dilution of clay minerals by quartz sand (low in K and Na). Minor and trace element relationships are best displayed on lognormal Q-Q plots. These sensitively reflect discrete relationships in subpopulations within the wide range of the data. For example, small but distinctly log-linear subpopulations for Pb, Cu, Zn and Ag are interpreted to represent ore-grade enrichment of naturally occurring minerals such as sulfides. None of the 27 chemical elements could pass the test for either normal or lognormal distribution on the declustered data set. Part of the reason relates to the presence of mixtures of subpopulations and outliers. Random samples of the data set with successively
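The diagnostic workflow described above (quantile-quantile plots on log-transformed concentrations, with kinks marking subpopulations, and formal rejection of a single normal or lognormal model) can be illustrated on synthetic data. The mixture proportions and parameters below are invented for the sketch; `scipy.stats.probplot` supplies the Q-Q coordinates.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic trace-element data: a lognormal background plus a small
# "ore-grade" subpopulation, mimicking the mixtures described above.
background = rng.lognormal(mean=2.0, sigma=0.5, size=950)
enriched = rng.lognormal(mean=4.5, sigma=0.3, size=50)
conc = np.concatenate([background, enriched])

# Lognormal Q-Q plot coordinates: theoretical normal quantiles vs log10(conc).
# A kink (slope change) in these points marks the subpopulation boundary.
osm, osr = stats.probplot(np.log10(conc), dist="norm", fit=False)

# Neither a normal nor a lognormal model fits the mixture: a
# Kolmogorov-Smirnov test on the standardized log data rejects normality.
logc = np.log(conc)
stat, pval = stats.kstest((logc - logc.mean()) / logc.std(), "norm")
print(f"KS p-value on log data: {pval:.2e}")  # small p => not a single lognormal
```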

  8. Multi-SOM: an Algorithm for High-Dimensional, Small Size Datasets

    Directory of Open Access Journals (Sweden)

    Shen Lu

    2013-04-01

    Since it takes time to do experiments in bioinformatics, biological datasets are sometimes small but of high dimensionality. Probability theory tells us that, in order to discover knowledge from a set of data, we must have a sufficient number of samples; otherwise, the error bounds can become too large to be useful. For the SOM (Self-Organizing Map) algorithm, the initial map is based on the training data. To avoid the bias caused by insufficient training data, in this paper we present an algorithm called Multi-SOM. Multi-SOM builds a number of small self-organizing maps instead of just one big map, and Bayesian decision theory is used to make the final decision among similar neurons on different maps. In this way we can better ensure a truly random initial weight vector set, map size becomes less of a concern, and errors tend to average out. In our experiments on microarray datasets, which are highly intense data composed of genetic information, the precision of Multi-SOMs is 10.58% greater than that of SOMs, and their recall is 11.07% greater. Thus, the Multi-SOM algorithm is practical.

  9. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination ([Formula: see text]) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for [Formula: see text] for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.

  10. Sample size reassessment for a two-stage design controlling the false discovery rate.

    Science.gov (United States)

    Zehetmayer, Sonja; Graf, Alexandra C; Posch, Martin

    2015-11-01

    Sample size calculations for gene expression microarray and NGS-RNA-Seq experiments are challenging because the overall power depends on unknown quantities such as the proportion of true null hypotheses and the distribution of the effect sizes under the alternative. We propose a two-stage design with an adaptive interim analysis where these quantities are estimated from the interim data. The second-stage sample size is chosen based on these estimates to achieve a specific overall power. The proposed procedure controls the power in all considered scenarios except for very low first-stage sample sizes. The false discovery rate (FDR) is controlled despite the data-dependent choice of sample size. The two-stage design can be a useful tool to determine the sample size of high-dimensional studies if, in the planning phase, there is high uncertainty regarding the expected effect sizes and variability.

  11. Large- and small-size advantages in sneaking behaviour in the dusky frillgoby Bathygobius fuscus

    Science.gov (United States)

    Takegaki, Takeshi; Kaneko, Takashi; Matsumoto, Yukio

    2012-04-01

    Sneaking tactic, a male alternative reproductive tactic involving sperm competition, is generally adopted by small individuals because of its inconspicuousness. However, large size has an advantage when competition occurs between sneakers for fertilization of eggs. Here, we suggest that both large- and small-size advantages of sneaker males are present within the same species. Large sneaker males of the dusky frillgoby Bathygobius fuscus showed a high success rate in intruding into spawning nests because of their advantage in competition among sneaker males in keeping a suitable position to sneak, whereas small sneakers had few chances to sneak. However, small sneaker males were able to stay in the nests longer than large sneaker males when they succeeded in sneak intrusion. This suggests the possibility of an increase in their paternity. The findings of these size-specific behavioural advantages may be important in considering the evolution of size-related reproductive traits.

  12. Nomogram for sample size calculation on a straightforward basis for the kappa statistic.

    Science.gov (United States)

    Hong, Hyunsook; Choi, Yunhee; Hahn, Seokyung; Park, Sue Kyung; Park, Byung-Joo

    2014-09-01

    Kappa is a widely used measure of agreement. However, it may not be straightforward in some situations, such as sample size calculation, due to the kappa paradox: high agreement but low kappa. Hence, it seems reasonable in sample size calculation to consider the level of agreement under a certain marginal prevalence in terms of a simple proportion of agreement rather than a kappa value. Therefore, sample size formulae and nomograms using a simple proportion of agreement rather than kappa under certain marginal prevalences are proposed. A sample size formula was derived using the kappa statistic under the common correlation model and a goodness-of-fit statistic, and the nomogram for the sample size formula was developed using SAS 9.3. The sample size formulae using a simple proportion of agreement instead of a kappa statistic, and nomograms that eliminate the inconvenience of using a mathematical formula, were produced. A nomogram for sample size calculation based on a simple proportion of agreement should be useful in the planning stages when the focus of interest is on testing the hypothesis of interobserver agreement involving two raters and nominal outcome measures. Copyright © 2014 Elsevier Inc. All rights reserved.
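The paper's goodness-of-fit formula is not reproduced in the abstract, but the underlying idea of sizing a study on a simple proportion of agreement can be illustrated with the familiar Wald formula for estimating a proportion to a given half-width. This is a hedged stand-in for the kind of calculation the nomograms encode, not the authors' formula.

```python
from math import ceil
from statistics import NormalDist

def n_for_agreement(p_agree, half_width, conf=0.95):
    """Sample size to estimate a simple proportion of agreement to within
    +/- half_width, via the standard Wald formula for a proportion.
    (Illustrative only; not the goodness-of-fit formula of the paper.)"""
    z = NormalDist().inv_cdf(0.5 + conf / 2)
    return ceil(z**2 * p_agree * (1 - p_agree) / half_width**2)

# e.g. expecting 85% agreement, estimated to within +/- 5 points:
print(n_for_agreement(0.85, 0.05))  # -> 196
```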

  13. Sample size optimization in nuclear material control. 1

    International Nuclear Information System (INIS)

    Gladitz, J.

    1982-01-01

    Equations have been derived and exemplified which allow the determination of the minimum variables sample size for given false alarm and detection probabilities of nuclear material losses and diversions, respectively. (author)
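The abstract gives no equations, but the kind of derivation it describes (a minimum variables sample size for given false-alarm and detection probabilities) can be sketched with the textbook power calculation for detecting a shift in a measured mean. The function below is an illustrative stand-in, not the paper's derivation.

```python
from math import ceil
from statistics import NormalDist

def n_to_detect_shift(shift, sigma, alpha=0.05, beta=0.05):
    """Minimum sample size so a one-sided test of the mean with false-alarm
    rate alpha detects a shift of `shift` with probability 1 - beta, given
    per-item measurement standard deviation sigma. A textbook power
    calculation, offered as a sketch of the kind of sample size equation
    the abstract describes (not the paper's exact equations)."""
    nd = NormalDist()
    z_a, z_b = nd.inv_cdf(1 - alpha), nd.inv_cdf(1 - beta)
    return ceil(((z_a + z_b) * sigma / shift) ** 2)

# e.g. detecting a loss of 1 unit when per-item measurement sd is 2 units:
print(n_to_detect_shift(1.0, 2.0))  # alpha = beta = 0.05
```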

  14. Sustainability and Small to Medium Sized Enterprises--How to Engage Them

    Science.gov (United States)

    Condon, Linda

    2004-01-01

    Small and medium sized enterprises (SMEs) have a major advantage over larger organisations in regard to addressing sustainability issues--their size means they are able to react very quickly to changes in the business environment. They are disadvantaged, however, by lack of information on marketplace changes that make sustainability an opportunity…

  15. 77 FR 58747 - Small Business Size Standards: Real Estate and Rental and Leasing

    Science.gov (United States)

    2012-09-24

    ...: The United States Small Business Administration (SBA) is increasing the small business size standards... small businesses, the commenter suggested increasing it to $150 million. He contended that his business... industry makes it difficult for small businesses to grow and develop and increase their market share. To...

  16. Impact of shoe size in a sample of elderly individuals

    Directory of Open Access Journals (Sweden)

    Daniel López-López

    Summary. Introduction: The use of an improper shoe size is common in older people and is believed to have a detrimental effect on the quality of life related to foot health. The objective is to describe and compare, in a sample of participants, the impact of shoes that fit properly or improperly, as well as to analyze the scores related to foot health and overall health. Method: A sample of 64 participants, with a mean age of 75.3±7.9 years, attended an outpatient center where self-report data were recorded, the sizes of the feet and footwear were measured, and the scores of the group wearing the correct shoe size were compared with those of the group wearing an incorrect size, using the Spanish version of the Foot Health Status Questionnaire. Results: The group wearing an improper shoe size showed poorer quality of life regarding overall health and specifically foot health. Differences between groups, evaluated using a t-test for independent samples, were statistically significant (p<0.05) for the dimensions of pain, function, footwear, overall foot health, and social function. Conclusion: An inadequate shoe size has a significant negative impact on quality of life related to foot health. The degree of negative impact seems to be associated with age, sex, and body mass index (BMI).

  17. The use of secondary ion mass spectrometry in forensic analyses of ultra-small samples

    Science.gov (United States)

    Cliff, John

    2010-05-01

    It is becoming increasingly important in forensic science to perform chemical and isotopic analyses on very small sample sizes. Moreover, in some instances the signature of interest may be incorporated in a vast background, making analyses impossible by bulk methods. Recent advances in instrumentation make secondary ion mass spectrometry (SIMS) a powerful tool to apply to these problems. As an introduction, we present three types of forensic analyses in which SIMS may be useful. The causal organism of anthrax (Bacillus anthracis) chelates Ca and other metals during spore formation. Thus, the spores contain a trace element signature related to the growth medium that produced the organisms. Although other techniques have been shown to be useful in analyzing these signatures, the sample size requirements are generally relatively large. We have shown that time-of-flight SIMS (TOF-SIMS), combined with multivariate analysis, can clearly separate Bacillus sp. cultures prepared in different growth media using analytical spot sizes containing approximately one nanogram of spores. An important emerging field in forensic analysis is that of the provenance of fecal pollution. The strategy of choice for these analyses, developing host-specific nucleic acid probes, has met with considerable difficulty due to lack of specificity of the probes. One potentially fruitful strategy is to combine in situ nucleic acid probing with high-precision isotopic analyses. Bulk analyses of human and bovine fecal bacteria, for example, indicate a relative difference in δ13C content of about 4 per mil. We have shown that sample sizes of several nanograms can be analyzed with the IMS 1280 with precisions capable of separating two per mil differences in δ13C. The NanoSIMS 50 is capable of much better spatial resolution than the IMS 1280, albeit at a cost of analytical precision.
    Nevertheless, we have documented precision capable of separating five per mil differences in δ13C using analytical spots containing

  18. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI uses a constant high voltage to induce the generation of single-polarity pulsed electrospray remotely. This method significantly boosts sample economy, providing several minutes of MS signal from a merely picoliter-volume sample. The elongated MS signal duration enabled us to collect abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. The method was successfully applied to single-cell metabolomics, yielding a 2-D profile of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells: 1034 components for Allium cepa and 656 components for HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  19. Observations of Bright Massive Stars Using Small Size Telescopes

    Science.gov (United States)

    Beradze, Sopia; Kochiashvili, Nino

    2017-11-01

    The size of a telescope determines the goals and objects of its observations. In recent decades it has become more and more difficult to obtain photometric data for bright stars, because most small telescopes no longer operate. Yet there are interesting questions concerning the properties of, and evolutionary ties between, different types of massive stars, and multi-wavelength photometric data are needed to address some of them. We present our plans for observing bright massive X-ray binaries, WR, and LBV stars using a small telescope. All of the stars presented in the poster are observational targets of Sopia Beradze's future PhD thesis. We have already obtained very interesting results on the reddening and possible future eruption of the massive hypergiant star P Cygni, and therefore chose several additional interesting massive stars of different types for future observations. Massive stars play an important role in the chemical evolution of galaxies because of their very high mass-loss rates, up to 10^-4 M⊙ per year. Our targets are at different evolutionary stages, and three of them are members of massive binaries. We plan to carry out UBVRI photometric observations of these stars using the 48 cm Cassegrain telescope of the Abastumani Astrophysical Observatory.

  20. Small-for-Size Liver Transplantation Increases Pulmonary Injury in Rats: Prevention by NIM811

    Directory of Open Access Journals (Sweden)

    Qinlong Liu

    2012-01-01

    Pulmonary complications after liver transplantation (LT) often cause mortality. This study investigated whether small-for-size LT increases acute pulmonary injury and whether NIM811, which improves small-for-size liver graft survival, attenuates LT-associated lung injury. Rat livers were reduced to 50% of original size, stored in UW solution with and without NIM811 (5 μM) for 6 h, and implanted into recipients of the same or about twice the donor weight, resulting in half-size (HSG) and quarter-size grafts (QSG), respectively. Liver injury increased and regeneration was suppressed after QSG transplantation, as expected; NIM811 blunted these alterations by >75%. Pulmonary histological alterations were minimal at 5-18 h after LT. At 38 h, neutrophil and monocyte/macrophage infiltration, alveolar space exudation, alveolar septal thickening, oxidative/nitrosative protein adduct formation, and alveolar epithelial cell/capillary endothelial apoptosis became overt in the lungs of QSG recipients, but these alterations were mild in full-size and HSG recipients. Liver pretreatment with NIM811 markedly decreased pulmonary injury in QSG recipients. Hepatic TNFα and IL-1β mRNAs and pulmonary ICAM-1 expression were markedly higher after QSG transplantation, and all were decreased by NIM811. Together, dysfunctional small-for-size grafts produce toxic cytokines, leading to lung inflammation and injury. NIM811 decreased toxic cytokine formation, thus attenuating pulmonary injury after small-for-size LT.

  1. Fatigue-crack propagation in gamma-based titanium aluminide alloys at large and small crack sizes

    International Nuclear Information System (INIS)

    Kruzic, J.J.; Campbell, J.P.; Ritchie, R.O.

    1999-01-01

    Most evaluations of the fracture and fatigue-crack propagation properties of γ+α2 titanium aluminide alloys to date have been performed using standard large-crack samples, e.g., compact-tension specimens containing cracks on the order of tens of millimeters, i.e., large compared to microstructural dimensions. However, these alloys have been targeted for applications, such as blades in gas-turbine engines, where relevant crack sizes are much smaller. Accordingly, this work compares the behavior of large (> 5 mm) and small (c ≅ 25-300 μm) cracks in a γ-TiAl based alloy of composition Ti-47Al-2Nb-2Cr-0.2B (at.%), specifically for duplex (average grain size ≈ 17 μm) and refined lamellar (average colony size ≈ 150 μm) microstructures. It is found that, whereas the lamellar microstructure displays far superior fracture toughness and fatigue-crack growth resistance in the presence of large cracks, in small-crack testing the duplex microstructure exhibits a better combination of properties. The reasons for such contrasting behavior are examined in terms of the intrinsic and extrinsic (i.e., crack bridging) contributions to cyclic crack advance

  2. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    Science.gov (United States)

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, and amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for the rapid analysis of these polar small molecules in complex matrices. Some typical materials used in sample preparation, including silica, polymer, carbon, boric acid, and so on, are introduced in this paper. Meanwhile, the applications and development of analytical methods for polar small molecules, such as reversed-phase liquid chromatography and hydrophilic interaction chromatography, are also reviewed.

  3. Integrating sphere based reflectance measurements for small-area semiconductor samples

    Science.gov (United States)

    Saylan, S.; Howells, C. T.; Dahlem, M. S.

    2018-05-01

    This article describes a method that enables reflectance spectroscopy of small semiconductor samples using an integrating sphere, without the use of additional optical elements. We employed an inexpensive sample holder to measure the reflectance of different samples through 2-, 3-, and 4.5-mm-diameter apertures, and applied a mathematical formulation to remove from the measured spectra the bias caused by illumination of the holder. Using the proposed method, the reflectance of samples fabricated using expensive or rare materials and/or low-throughput processes can be measured. It can also be incorporated to infer the internal quantum efficiency of small-area, research-level solar cells. Moreover, small samples that reflect light at large angles or exhibit scattering may also be measured reliably, by virtue of the integrating sphere's insensitivity to directionality.
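The paper's exact bias-removal formulation is not given in the abstract; a generic area-weighted mixing correction conveys the idea. Here the measured reflectance is treated as a weighted sum of sample and holder reflectances and solved for the sample term; `fill_fraction` (the illuminated area fraction occupied by the sample) is a hypothetical parameter, not one from the paper.

```python
def holder_corrected_reflectance(r_measured, r_holder, fill_fraction):
    """Remove the holder's contribution from a measured reflectance,
    assuming the beam illuminates the sample over `fill_fraction` of the
    aperture and the holder over the rest (a generic area-weighted mixing
    model; the paper's actual formulation is not given in the abstract)."""
    if not 0.0 < fill_fraction <= 1.0:
        raise ValueError("fill_fraction must be in (0, 1]")
    return (r_measured - (1.0 - fill_fraction) * r_holder) / fill_fraction

# A sample filling 60% of the aperture, holder reflectance 0.9,
# measured composite reflectance 0.6:
print(round(holder_corrected_reflectance(0.6, 0.9, 0.6), 3))  # -> 0.4
```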

  4. Small and medium sized reactors: Status and prospects. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-08-01

    During the early years of nuclear power deployment, the plants entering service were dominated by what are now considered small (less than 300 MW(e)) and medium (300 to 700 MW(e)) reactors. In the late 1970s, the balance shifted to larger size plants to serve the requirements of industrialized countries. However, since the early 1990s, the increased interest of developing countries in nuclear power, mainly in Asia, has resulted in intensified efforts in development of small and medium sized reactors (SMRs). Also, in industrialized countries, electricity market deregulation is calling for power generation flexibility that SMRs may offer. Apart from electricity generation, SMRs are of particular interest for non-electrical applications of nuclear energy, such as desalination of seawater and district heating. In recognition of the current global interest in small and medium sized reactors this seminar was convened to provide a forum for the exchange of information by experts and policy makers from industrialized and developing countries on the technical, economic, environmental, and social aspects of SMR development and implementation in the 21st century, and to make this information available to all interested IAEA Member States. Keynote addresses also provided information on global energy demand and supply and international trends impacting the development and introduction of SMRs. Two hundred forty seven attendees from 39 countries and 5 international organizations participated in the seminar. The majority of the participants were from developing countries. The seminar not only provided valuable up to date information on SMRs, it also highlighted the importance of continued international co-operation in the development and application of nuclear power for peaceful uses throughout the world.

  5. Small and medium sized reactors: Status and prospects. Proceedings

    International Nuclear Information System (INIS)

    2002-01-01

    During the early years of nuclear power deployment, the plants entering service were dominated by what are now considered small (less than 300 MW(e)) and medium (300 to 700 MW(e)) reactors. In the late 1970s, the balance shifted to larger size plants to serve the requirements of industrialized countries. However, since the early 1990s, the increased interest of developing countries in nuclear power, mainly in Asia, has resulted in intensified efforts in development of small and medium sized reactors (SMRs). Also, in industrialized countries, electricity market deregulation is calling for power generation flexibility that SMRs may offer. Apart from electricity generation, SMRs are of particular interest for non-electrical applications of nuclear energy, such as desalination of seawater and district heating. In recognition of the current global interest in small and medium sized reactors this seminar was convened to provide a forum for the exchange of information by experts and policy makers from industrialized and developing countries on the technical, economic, environmental, and social aspects of SMR development and implementation in the 21st century, and to make this information available to all interested IAEA Member States. Keynote addresses also provided information on global energy demand and supply and international trends impacting the development and introduction of SMRs. Two hundred forty seven attendees from 39 countries and 5 international organizations participated in the seminar. The majority of the participants were from developing countries. The seminar not only provided valuable up to date information on SMRs, it also highlighted the importance of continued international co-operation in the development and application of nuclear power for peaceful uses throughout the world

  6. Supply Chain adoption in Small and Medium-Sized Enterprises (SMEs)

    DEFF Research Database (Denmark)

    Juhl, Mathias Thim; Bernon, Mike

    Purpose: The importance of having a competitive supply chain strategy is not to be underestimated (Underwood & Agg 2012; Aronow et al. 2014). Despite the importance of creating strong supply chain capabilities, small and medium-sized enterprises (SMEs) find it difficult to implement supply chain...... and customer needs. Research Approach: An exploratory case study of five small and medium sized manufacturing companies was undertaken using in-depth interviews and business reports. Combined with relevant literature, the case study interviews provide basis for a discussion on the current adoption of supply...... needs and a low interaction in the supply chain, to having an “outside-in” perspective (Day & Moorman 2013) and development of capabilities that support long-term competitive advantage. The case studies revealed two significant factors to support consistency between supply chain capabilities...

  7. Small-Sized Whole-Tree Harvesting in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Kaerhae, Kalle [Metsaeteho Oy, Helsinki (Finland)

    2006-07-15

    In Finland, there are two mechanized harvesting systems used for small-diameter (d1.3 = 10 cm) thinning wood: 1) the traditional two-machine (harvester and forwarder) system, and 2) the harwarder system (i.e., the same machine performs both felling and haulage to the roadside). At present, there are more than 20 energy wood harwarders in use in Finland, but no comprehensive studies of them have been carried out. This paper looks into the productivity results obtained with energy wood harwarders, and compares their energy wood harvesting costs with those of the two-machine system. The results clearly indicated what kind of machine resources can be profitably allocated to different whole-tree harvesting sites. Energy wood harwarders should be directed towards harvesting sites where the forwarding distances are short, the trees harvested are relatively small, and the total volume of energy wood removed is quite low. Conversely, when the stem size removed is relatively large in young stands and the forest haulage distances are long, the traditional two-machine system is more competitive.

  8. Threshold-dependent sample sizes for selenium assessment with stream fish tissue

    Science.gov (United States)

    Hitt, Nathaniel P.; Smith, David R.

    2015-01-01

    Natural resource managers are developing assessments of selenium (Se) contamination in freshwater ecosystems based on fish tissue concentrations. We evaluated the effects of sample size (i.e., number of fish per site) on the probability of correctly detecting mean whole-body Se values above a range of potential management thresholds. We modeled Se concentrations as gamma distributions with shape and scale parameters fitting an empirical mean-to-variance relationship in data from southwestern West Virginia, USA (63 collections, 382 individuals). We used parametric bootstrapping techniques to calculate statistical power as the probability of detecting true mean concentrations up to 3 mg Se/kg above management thresholds ranging from 4 to 8 mg Se/kg. Sample sizes required to achieve 80% power varied as a function of management thresholds and Type I error tolerance (α). Higher thresholds required more samples than lower thresholds because populations were more heterogeneous at higher mean Se levels. For instance, to assess a management threshold of 4 mg Se/kg, a sample of eight fish could detect an increase of approximately 1 mg Se/kg with 80% power (given α = 0.05), but this sample size would be unable to detect such an increase from a management threshold of 8 mg Se/kg with more than a coin-flip probability. Increasing α decreased sample size requirements to detect above-threshold mean Se concentrations with 80% power. For instance, at an α-level of 0.05, an 8-fish sample could detect an increase of approximately 2 units above a threshold of 8 mg Se/kg with 80% power, but when α was relaxed to 0.2, this sample size was more sensitive to increasing mean Se concentrations, allowing detection of an increase of approximately 1.2 units with equivalent power. Combining individuals into 2- and 4-fish composite samples for laboratory analysis did not decrease power because the reduced number of laboratory samples was compensated for by increased
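The parametric-bootstrap power calculation described above can be sketched as follows: draw repeated gamma-distributed samples at a hypothesized true mean, test each against the management threshold, and count detections. The fixed coefficient of variation below stands in for the paper's empirical mean-to-variance relationship, and the critical value is a plain one-sided normal cutoff, so the numbers are illustrative rather than a reproduction of the study's results.

```python
import numpy as np

rng = np.random.default_rng(7)

def power_above_threshold(true_mean, threshold, n_fish, cv=0.5,
                          alpha=0.05, n_boot=2000):
    """Probability that a one-sided test on a sample of n_fish detects a
    true mean Se concentration above `threshold`. Concentrations are
    modeled as gamma; the fixed coefficient of variation `cv` stands in
    for the paper's empirical mean-variance relationship (assumption)."""
    k = 1.0 / cv**2              # gamma shape: k = 1/cv^2
    detections = 0
    for _ in range(n_boot):
        sample = rng.gamma(k, true_mean / k, size=n_fish)
        # One-sided z-style test of H0: mean <= threshold.
        se = sample.std(ddof=1) / np.sqrt(n_fish)
        if (sample.mean() - threshold) / se > 1.645:  # ~95th pct of the null
            detections += 1
    return detections / n_boot

# Power to detect a true mean 1 mg/kg above a 4 mg/kg threshold with 8 fish:
print(power_above_threshold(5.0, 4.0, 8))
```

Sweeping `n_fish` over a grid and reading off where power crosses 0.8 reproduces the kind of sample size requirement the abstract reports.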

  9. Optimum sample size to estimate mean parasite abundance in fish parasite surveys

    Directory of Open Access Journals (Sweden)

    Shvydka S.

    2018-03-01

    Full Text Available To reach ethically and scientifically valid mean abundance values in parasitological and epidemiological studies, this paper considers analytic and simulation approaches for sample size determination. The sample size estimation was carried out by applying a mathematical formula with a predetermined precision level and the parameter of the negative binomial distribution estimated from the empirical data. A simulation approach to optimum sample size determination, aimed at the estimation of the true value of the mean abundance and its confidence interval (CI), was based on the Bag of Little Bootstraps (BLB). The abundance of two species of monogenean parasites, Ligophorus cephali and L. mediterraneus, from Mugil cephalus across Azov-Black Sea localities was subjected to the analysis. The dispersion pattern of both helminth species could be characterized as a highly aggregated distribution, with the variance being substantially larger than the mean abundance. The holistic approach applied here offers a wide range of appropriate methods for finding the optimum sample size and an understanding of the expected precision level of the mean. Given the superior performance of the BLB relative to the formula, and its few assumptions, the bootstrap procedure is the preferred method. Two important assessments were performed in the present study: (i) based on CI width, a reasonable precision level for the mean abundance in parasitological surveys of Ligophorus spp. could be chosen between 0.8 and 0.5, corresponding to 1.6× and 1× the mean CI width; and (ii) a sample size of 80 or more host individuals allows accurate and precise estimation of the mean abundance. For host sample sizes between 25 and 40 individuals, the median estimates showed minimal bias, but the sampling distribution was skewed toward low values; a sample size of 10 host individuals yielded unreliable estimates.
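For the formula-based branch of the approach above, the classical textbook sample size expression for negative binomial counts is n = (1/m + 1/k) / D², where m is the mean abundance, k the aggregation parameter, and D the desired relative precision (SE/mean). This is the standard formula for aggregated count data, not necessarily the exact variant used in the paper; the numbers below are illustrative.

```python
import math

def nb_sample_size(mean_abundance, k, precision):
    """Hosts needed so that SE/mean ~= precision for negative binomial counts.

    mean_abundance: expected parasites per host (m)
    k: negative binomial aggregation parameter (small k = strongly aggregated)
    precision: target relative standard error D
    """
    return math.ceil((1 / mean_abundance + 1 / k) / precision ** 2)
```

For strongly aggregated parasites (say m = 10, k = 0.5), a relative precision of 0.2 requires 53 hosts, while relaxing the precision to 0.5 requires only 9; the more aggregated the distribution (smaller k), the larger the sample needed.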

  10. SMALL AND MIDDLE-SIZED BUSINESS: LOGISTICAL AND SOCIAL AND ECONOMICAL FUNCTIONS

    Directory of Open Access Journals (Sweden)

    O. N. Pavlova

    2007-01-01

    Full Text Available Entrepreneurship is one of the factors that can ensure the sustainable development of Belarus. Present-day Belarusian entrepreneurship should select the required forms and determine development priorities. It is practically impossible to do so without taking into account modern global social and economic tendencies as well as the specificity of Belarusian mentality and traditions. In this sense, priority status is given to small and middle-sized business, because it is considered an element of the economic system that ensures stability to the economy. Small and middle-sized business is an ideal logistical model that provides harmonization of interests and optimization of management solutions.

  11. Sample size for post-marketing safety studies based on historical controls.

    Science.gov (United States)

    Wu, Yu-te; Makuch, Robert W

    2010-08-01

    As part of a drug's entire life cycle, post-marketing studies are an important part in the identification of rare, serious adverse events. Recently, the US Food and Drug Administration (FDA) has begun to implement new post-marketing safety mandates as a consequence of increased emphasis on safety. The purpose of this research is to provide exact sample size formula for the proposed hybrid design, based on a two-group cohort study with incorporation of historical external data. Exact sample size formula based on the Poisson distribution is developed, because the detection of rare events is our outcome of interest. Performance of exact method is compared to its approximate large-sample theory counterpart. The proposed hybrid design requires a smaller sample size compared to the standard, two-group prospective study design. In addition, the exact method reduces the number of subjects required in the treatment group by up to 30% compared to the approximate method for the study scenarios examined. The proposed hybrid design satisfies the advantages and rationale of the two-group design with smaller sample sizes generally required. 2010 John Wiley & Sons, Ltd.
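The exact Poisson construction described above can be sketched generically: under the null the event count in n subjects follows Poisson(n·λ0), the critical count is the smallest value keeping the one-sided Type I error at or below α, and the required n is the smallest one whose power against the elevated rate λ1 reaches the target. This is a generic one-sample exact calculation under assumed rates, not the paper's hybrid two-group formula; the rates used in the test are illustrative.

```python
import math

def pois_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), computed from the complementary CDF."""
    term, cdf = math.exp(-mu), 0.0
    for i in range(k):
        cdf += term
        term *= mu / (i + 1)
    return 1.0 - cdf

def exact_poisson_n(lam0, lam1, alpha=0.05, power=0.8, n_max=100000):
    """Smallest n detecting rate lam1 vs. null rate lam0 (one-sided exact test)."""
    for n in range(1, n_max):
        mu0, mu1 = n * lam0, n * lam1
        # Critical count: smallest c with P(X >= c | mu0) <= alpha
        c = 0
        while pois_sf(c, mu0) > alpha:
            c += 1
        if pois_sf(c, mu1) >= power:
            return n
    return None
```

As the abstract notes for rare events, the sample size is driven by the expected counts n·λ, so detecting a larger rate ratio requires fewer subjects.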

  12. Sample size computation for association studies using case–parents ...

    Indian Academy of Sciences (India)

    ...sample size needed to reach a given power (Knapp 1999; Schaid 1999; Chen and Deng 2001; Brown 2004). In their seminal paper, Risch and Merikangas (1996) showed that for a multiplicative mode of inheritance (MOI) for the susceptibility gene, sample size depends on two parameters: the frequency of the risk allele at the ...

  13. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, the algebra graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive the necessary and sufficient conditions guaranteeing heterogeneous multi-agent systems to asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)

  14. A Single IGF1 Allele Is a Major Determinant of Small Size in Dogs

    Science.gov (United States)

    Sutter, Nathan B.; Bustamante, Carlos D.; Chase, Kevin; Gray, Melissa M.; Zhao, Keyan; Zhu, Lan; Padhukasahasram, Badri; Karlins, Eric; Davis, Sean; Jones, Paul G.; Quignon, Pascale; Johnson, Gary S.; Parker, Heidi G.; Fretwell, Neale; Mosher, Dana S.; Lawler, Dennis F.; Satyaraj, Ebenezer; Nordborg, Magnus; Lark, K. Gordon; Wayne, Robert K.; Ostrander, Elaine A.

    2009-01-01

    The domestic dog exhibits greater diversity in body size than any other terrestrial vertebrate. We used a strategy that exploits the breed structure of dogs to investigate the genetic basis of size. First, through a genome-wide scan, we identified a major quantitative trait locus (QTL) on chromosome 15 influencing size variation within a single breed. Second, we examined genetic variation in the 15-megabase interval surrounding the QTL in small and giant breeds and found marked evidence for a selective sweep spanning a single gene (IGF1), encoding insulin-like growth factor 1. A single IGF1 single-nucleotide polymorphism haplotype is common to all small breeds and nearly absent from giant breeds, suggesting that the same causal sequence variant is a major contributor to body size in all small dogs. PMID:17412960

  15. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  16. Preparing Monodisperse Macromolecular Samples for Successful Biological Small-Angle X-ray and Neutron Scattering Experiments

    Science.gov (United States)

    Jeffries, Cy M.; Graewert, Melissa A.; Blanchet, Clément E.; Langley, David B.; Whitten, Andrew E.; Svergun, Dmitri I.

    2017-01-01

    Small-angle X-ray and neutron scattering (SAXS and SANS) are techniques used to extract structural parameters and determine the overall structures and shapes of biological macromolecules, complexes and assemblies in solution. The scattering intensities measured from a sample contain contributions from all atoms within the illuminated sample volume including the solvent and buffer components as well as the macromolecules of interest. In order to obtain structural information, it is essential to prepare an exactly matched solvent blank so that background scattering contributions can be accurately subtracted from the sample scattering to obtain the net scattering from the macromolecules in the sample. In addition, sample heterogeneity caused by contaminants, aggregates, mismatched solvents, radiation damage or other factors can severely influence and complicate data analysis so it is essential that the samples are pure and monodisperse for the duration of the experiment. This Protocol outlines the basic physics of SAXS and SANS and reveals how the underlying conceptual principles of the techniques ultimately ‘translate’ into practical laboratory guidance for the production of samples of sufficiently high quality for scattering experiments. The procedure describes how to prepare and characterize protein and nucleic acid samples for both SAXS and SANS using gel electrophoresis, size exclusion chromatography and light scattering. Also included are procedures specific to X-rays (in-line size exclusion chromatography SAXS) and neutrons, specifically preparing samples for contrast matching/variation experiments and deuterium labeling of proteins. PMID:27711050

  17. Determining Sample Size for Accurate Estimation of the Squared Multiple Correlation Coefficient.

    Science.gov (United States)

    Algina, James; Olejnik, Stephen

    2000-01-01

    Discusses determining sample size for estimation of the squared multiple correlation coefficient and presents regression equations that permit determination of the sample size for estimating this parameter for up to 20 predictor variables. (SLD)

  18. Test methodology and technology of fracture toughness for small size specimens

    Energy Technology Data Exchange (ETDEWEB)

    Wakai, E.; Takada, F.; Ishii, T.; Ando, M. [Japan Atomic Energy Agency, Naga-gun, Ibaraki-ken (Japan); Matsukawa, S. [JNE Techno-Research Co., Kanagawa-ken (Japan)

    2007-07-01

    Full text of publication follows: Small specimen test technology (SSTT) is required to investigate mechanical properties given the limited availability of effective irradiation volumes in test reactors and accelerator-based neutron and charged-particle sources. Test methodology guidelines and manufacturing processes for very small specimens have not yet been established and must be formulated. Technology to precisely control load and displacement is also required when testing under the high-dose radiation emitted by the specimens themselves. The objective of this study is to examine the test technology and methodology of fracture toughness for very small specimens. A new bend test machine installed in a hot cell was manufactured to obtain the fracture toughness and DBTT (ductile-brittle transition temperature) of reduced-activation ferritic/martensitic steels for small bend specimens of t/2-1/3PCCVN (pre-cracked 1/3 size Charpy V-notch) with 20 mm length and DFMB (deformation and fracture mini bend) specimens with 9 mm length. The new machine can operate at temperatures from -196 deg. C to 400 deg. C under the unloading compliance method. Neutron irradiation was performed at about 250 deg. C to about 2 dpa in JMTR. After the irradiation, fracture toughness and DBTT were examined using the machine. Displacement measurements from the crosshead linear gauge and the specimen DVRT were carefully cross-checked. Conditions for fatigue pre-cracking during specimen preparation were also examined and were found to depend on specimen shape and size. Fracture toughness and DBTT of F82H steel for t/2-1/3PCCVN, DFMB and 0.18DCT specimens before irradiation were examined as a function of temperature. The DBTT of the smaller DFMB specimens was lower than that of the larger t/2-1/3PCCVN and 0.18DCT specimens. The changes of fracture toughness and DBTT due to irradiation were also

  19. Measurements of accurate x-ray scattering data of protein solutions using small stationary sample cells

    Science.gov (United States)

    Hong, Xinguo; Hao, Quan

    2009-01-01

    In this paper, we report a method of precise in situ x-ray scattering measurements on protein solutions using small stationary sample cells. Although reduction in the radiation damage induced by intense synchrotron radiation sources is indispensable for the correct interpretation of scattering data, there is still a lack of effective methods to overcome radiation-induced aggregation and extract scattering profiles free from chemical or structural damage. It is found that radiation-induced aggregation mainly begins on the surface of the sample cell and grows along the beam path; the diameter of the damaged region is comparable to the x-ray beam size. Radiation-induced aggregation can be effectively avoided by using a two-dimensional scan (2D mode), with an interval as small as 1.5 times the beam size, at low temperature (e.g., 4 °C). A radiation sensitive protein, bovine hemoglobin, was used to test the method. A standard deviation of less than 5% in the small angle region was observed from a series of nine spectra recorded in 2D mode, in contrast to the intensity variation seen using the conventional stationary technique, which can exceed 100%. Wide-angle x-ray scattering data were collected at a standard macromolecular diffraction station using the same data collection protocol and showed a good signal/noise ratio (better than the reported data on the same protein using a flow cell). The results indicate that this method is an effective approach for obtaining precise measurements of protein solution scattering.

  20. Measurements of accurate x-ray scattering data of protein solutions using small stationary sample cells

    International Nuclear Information System (INIS)

    Hong Xinguo; Hao Quan

    2009-01-01

    In this paper, we report a method of precise in situ x-ray scattering measurements on protein solutions using small stationary sample cells. Although reduction in the radiation damage induced by intense synchrotron radiation sources is indispensable for the correct interpretation of scattering data, there is still a lack of effective methods to overcome radiation-induced aggregation and extract scattering profiles free from chemical or structural damage. It is found that radiation-induced aggregation mainly begins on the surface of the sample cell and grows along the beam path; the diameter of the damaged region is comparable to the x-ray beam size. Radiation-induced aggregation can be effectively avoided by using a two-dimensional scan (2D mode), with an interval as small as 1.5 times the beam size, at low temperature (e.g., 4 deg. C). A radiation sensitive protein, bovine hemoglobin, was used to test the method. A standard deviation of less than 5% in the small angle region was observed from a series of nine spectra recorded in 2D mode, in contrast to the intensity variation seen using the conventional stationary technique, which can exceed 100%. Wide-angle x-ray scattering data were collected at a standard macromolecular diffraction station using the same data collection protocol and showed a good signal/noise ratio (better than the reported data on the same protein using a flow cell). The results indicate that this method is an effective approach for obtaining precise measurements of protein solution scattering.

  1. The Effect of Sterilization on Size and Shape of Fat Globules in Model Processed Cheese Samples

    Directory of Open Access Journals (Sweden)

    B. Tremlová

    2006-01-01

    Full Text Available Model cheese samples from 4 independent productions were heat sterilized (117 °C, 20 minutes) after the melting process and packing, with the aim of prolonging their durability. The objective of the study was to assess changes in the size and shape of fat globules due to heat sterilization by using image analysis methods. The study included the selection of suitable methods of preparing mounts, taking microphotographs and making overlays for automatic processing of photographs by an image analyser, ascertaining parameters to determine the size and shape of fat globules, and statistical analysis of the results obtained. The results of the experiment suggest that changes in the shape of fat globules due to heat sterilization are not unequivocal. We found that the size of fat globules was significantly increased (p < 0.01) due to heat sterilization (117 °C, 20 min), and the shares of small fat globules (up to 500 μm2, or 100 μm2) in the samples of heat-sterilized processed cheese were decreased. The results imply that the image analysis method is very useful when assessing the effect of a technological process on processed cheese quality.

  2. Single Nucleotide Polymorphisms in the HIRA Gene Affect Litter Size in Small Tail Han Sheep

    Directory of Open Access Journals (Sweden)

    Mei Zhou

    2018-05-01

    Full Text Available Maintenance of appropriate levels of fecundity is critical for efficient sheep production. Opportunities to increase sheep litter size include identifying single gene mutations with major effects on ovulation rate and litter size. Whole-genome sequencing (WGS) data of 89 Chinese domestic sheep from nine different geographical locations and ten Australian sheep were analyzed to detect new polymorphisms affecting litter size. Comparative genomic analysis of sheep with contrasting litter size detected a novel set of candidate genes. Two SNPs, g.71874104G>A and g.71833755T>C, were genotyped in 760 Small Tail Han sheep and analyzed for association with litter size. The two SNPs were significantly associated with litter size, being in strong linkage disequilibrium in the region 71.80–71.87 Mb. This haplotype block contains one gene that may affect litter size, Histone Cell Cycle Regulator (HIRA). HIRA mRNA levels in sheep with different lambing ability were significantly higher in ovaries of Small Tail Han sheep (high fecundity) than in Sunite sheep (low fecundity). Moreover, the expression levels of HIRA in eight tissues of uniparous Small Tail Han sheep were significantly higher than in multiparous Small Tail Han sheep (p < 0.05). HIRA SNPs significantly affect litter size in sheep and are useful as genetic markers for litter size.

  3. A flexible method for multi-level sample size determination

    International Nuclear Information System (INIS)

    Lu, Ming-Shih; Sanborn, J.B.; Teichmann, T.

    1997-01-01

    This paper gives a flexible method to determine sample sizes for both systematic and random error models (this pertains to sampling problems in nuclear safeguards). In addition, the method allows different attribute rejection limits. The new method could assist in achieving a higher detection probability and enhance inspection effectiveness.

  4. The business environment and phases of development of small and medium-sized enterprises

    OpenAIRE

    Wach, Krzysztof

    2006-01-01

    The subject literature distinguishes several factors that determine the establishment, survival, operations and development of micro, small, and medium-sized enterprises in particular regions. In this article, the author presents a classification of these factors in terms of whether they are barriers or stimulators. On this basis, the author explains the influence of regional environment (meso-environment) factors on the development of small and medium-sized enterprises in various phases of ...

  5. Strategic management in micro, small and medium-sized businesses in relation to financial success of the enterprise

    Directory of Open Access Journals (Sweden)

    Monika Švárová

    2013-01-01

    Full Text Available Small and medium-sized enterprises play an essential role in the economy of the Czech Republic, as the 2010 report on the development and support of small and medium-sized enterprises notes. They are a source of development for towns, regions and the state itself. Small and medium-sized enterprises represent 99.84 % of all business entities, and statistical data underline this position, indicating that they employ two-thirds of all employees. Supporting small and medium-sized enterprises should therefore be treated as a priority. Strategic management at the level of small and medium-sized enterprises is, as a number of authors note, diverse. Small enterprises, including micro-enterprises, have less access to capital, cannot afford to employ specialists in the field, and deal primarily with administration. The aim of this follow-up GAJU project contribution, concerned with process analysis of small and medium-sized enterprises, is to summarize the results gained from comparing the level of strategic management in small and medium-sized enterprises. Nowadays, many enterprises are considering whether they can remain in the market as established companies and are exploring possibilities for improving their position. For the management of small and medium-sized enterprises the question arises: can strategic management be used as an instrument for increasing competitiveness? The results show that SMEs with a clearly defined strategy display better financial health (IN99) and ROA than companies without a defined strategy. In terms of business focus, a positive relationship between strategy and finances could be demonstrated only for companies in the construction industry, at a significance level of 0.05; for manufacturers and service-providing companies this relation is not statistically significant.

  6. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.

    Science.gov (United States)

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham

    2017-12-01

    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, blend stage and tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. Bayes success run theorem appeared to be the most appropriate approach among various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at low defect rate, the confidence to detect out-of-specification units would decrease which must be supplemented with an increase in sample size to enhance the confidence in estimation. Based on level of knowledge acquired during PPQ and the level of knowledge further required to comprehend process, sample size for CPV was calculated using Bayesian statistics to accomplish reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
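The success run theorem referenced above gives n = ln(1 − C) / ln(R), the smallest number of consecutive passing samples providing confidence C that the process reliability is at least R. A minimal sketch (assuming 95% confidence, which reproduces the 299/59/29 figures quoted for reliability levels of 99%, 95%, and 90%):

```python
import math

def success_run_n(reliability, confidence=0.95):
    """Smallest n consecutive passes giving `confidence` that the
    pass probability is at least `reliability` (success run theorem)."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))
```

Risk-based assignment then maps high-, medium-, and low-risk factors to reliability levels of 0.99, 0.95, and 0.90, yielding PPQ sample sizes of 299, 59, and 29 respectively.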

  7. Simplified analysis of passive residual heat removal systems for small size PWR's

    International Nuclear Information System (INIS)

    Botelho, D.A.

    1992-02-01

    The function and general objectives of a passive residual heat removal system for small size PWRs are defined. The characteristic configuration, components and operation modes of this system are concisely described. A preliminary conceptual specification of this system, for a small size PWR of 400 MW thermal, is made by analogy with the decay heat removal system of the AP-600 reactor. Analytic models show that such passive systems can dissipate 2% of nominal power within the thermal limits allowed for the reactor fuel elements. (author)

  8. Regulation of Small and Medium-Sized Business Development in Russia: Problems and Solutions

    Directory of Open Access Journals (Sweden)

    Lyudmila Yuryevna Bogachkova

    2015-12-01

    Full Text Available The authors prove that despite the active state policy carried out since the second half of the 2000s and aimed at supporting small and medium-sized business in the Russian Federation, the current level of development of this economic sector is insufficient. The present paper characterizes the modern structure of small and medium-sized business. The authors show that the main problems hindering its growth are conditioned by low market demand, large tax deductions, numerous administrative barriers, and lack of funding and state support. On the basis of official data from the Russian Federal State Statistics Service on the results of annual surveys of entrepreneurs, the authors reveal that the factors hindering innovation in the country have a stable negative impact on small and medium-sized business, while the impact of such factors as an imperfect legal and regulatory framework, investment risks, low profitability and the inadequate state of technological infrastructure is relatively insignificant. The authors describe system-wide and resource-based measures of state regulation of small and medium-sized business. The system-wide measures include preferential access to production facilities and equipment, special tax regimes, and administrative control. The measures of resource support to entrepreneurs consist in subsidizing lease payments and interest rates on loans for the modernization of production; grant support; the establishment of microfinance organizations and guarantee funds; and the development of business support infrastructure. The authors describe the forms in which these measures were implemented in 2013 and the main directions for improving the state regulation of small and medium-sized business, including the reduction of the tax burden and facilitation of taxation procedures, the reduction of administrative barriers, and ensuring access of small and medium-sized enterprises to government orders and technological infrastructure.

  9. Sample Size Calculation for Controlling False Discovery Proportion

    Directory of Open Access Journals (Sweden)

    Shulian Shang

    2012-01-01

    Full Text Available The false discovery proportion (FDP), the proportion of incorrect rejections among all rejections, is a direct measure of the abundance of false positive findings in multiple testing. Many methods have been proposed to control the FDP, but they are too conservative to be useful for power analysis. Study designs controlling the mean of the FDP, which is the false discovery rate (FDR), have been commonly used. However, there has been little attempt to design studies with direct FDP control so as to achieve a certain level of efficiency. We provide a sample size calculation method using the variance formula of the FDP under weak-dependence assumptions to achieve the desired overall power. The relationship between design parameters and sample size is explored. The adequacy of the procedure is assessed by simulation. We illustrate the method using estimated correlations from a prostate cancer dataset.

  10. The Small Nuclear Genomes of Selaginella Are Associated with a Low Rate of Genome Size Evolution.

    Science.gov (United States)

    Baniaga, Anthony E; Arrigo, Nils; Barker, Michael S

    2016-06-03

    The haploid nuclear genome size (1C DNA) of vascular land plants varies over several orders of magnitude. Much of this observed diversity in genome size is due to the proliferation and deletion of transposable elements. To date, all vascular land plant lineages with extremely small nuclear genomes represent recently derived states, having ancestors with much larger genome sizes. The Selaginellaceae represent an ancient lineage with extremely small genomes. It is unclear how small nuclear genomes evolved in Selaginella. We compared the rates of nuclear genome size evolution in Selaginella and major vascular plant clades in a comparative phylogenetic framework. For the analyses, we collected 29 new flow cytometry estimates of haploid genome size in Selaginella to augment publicly available data. Selaginella possess some of the smallest known haploid nuclear genome sizes, as well as the lowest rate of genome size evolution observed across all vascular land plants included in our analyses. Additionally, our analyses provide strong support for a history of haploid nuclear genome size stasis in Selaginella. Our results indicate that Selaginella, similar to other early diverging lineages of vascular land plants, has relatively low rates of genome size evolution. Further, our analyses highlight that a rapid transition to a small genome size is only one route to an extremely small genome. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  11. A normative inference approach for optimal sample sizes in decisions from experience

    Science.gov (United States)

    Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph

    2015-01-01

    “Decisions from experience” (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the “sampling paradigm,” which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide from which they would prefer to draw from in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the “optimal” sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE. PMID:26441720

  12. THE RELATIONSHIP MARKETING APPLICATION IN SMALL AND MEDIUM-SIZED ENTERPRISES

    Directory of Open Access Journals (Sweden)

    SABOU FELICIA

    2016-12-01

    Full Text Available The paper presents the importance of relationship marketing and of communication between customers and Romanian small and medium-sized enterprises, and analyzes how complaints received from customers are investigated and resolved. The paper reports a study analyzing whether small and medium-sized enterprises investigate the reasons why they lose customers. The study was conducted during January–February 2016; the main research methods used were observation, documentation, analysis and a market survey, with a questionnaire as the research tool. The study found that 79.31% of the companies surveyed investigated the reasons they lost customers. Customer satisfaction is very important to business success, so companies are advised to pay attention to customer reactions and to the consequences of customer dissatisfaction.

  13. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
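    The first study's setup, applying common clustering methods to binary code matrices from coded interviews, can be sketched with a plain k-means implementation. The code profiles and the sample of 50 below are invented for illustration; the paper also evaluates hierarchical clustering and latent class analysis, which are not shown here.

```python
import random

def kmeans_binary(data, k=2, iters=20):
    """Lloyd's k-means on 0/1 code vectors. Initial centers are the two
    most distant rows, which is deterministic and works well for k = 2."""
    dist2 = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v))
    # farthest-pair initialization (O(n^2), fine for small samples)
    pairs = [(i, j) for i in range(len(data)) for j in range(i + 1, len(data))]
    i0, j0 = max(pairs, key=lambda p: dist2(data[p[0]], data[p[1]]))
    centers = [list(data[i0]), list(data[j0])][:k]
    assign = [0] * len(data)
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        assign = [min(range(k), key=lambda c: dist2(row, centers[c])) for row in data]
        # update step: each center becomes the mean code profile of its members
        for c in range(k):
            members = [data[i] for i in range(len(data)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

# Simulated coding of 50 interviews with two planted response profiles
rng = random.Random(7)
draw = lambda probs: [1 if rng.random() < p else 0 for p in probs]
data = [draw([0.9, 0.9, 0.1, 0.1]) for _ in range(25)] + \
       [draw([0.1, 0.1, 0.9, 0.9]) for _ in range(25)]
labels = kmeans_binary(data, k=2)
```

With clearly separated profiles, even a sample of 50 is recovered almost perfectly, consistent with the simulation results the abstract reports.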

  14. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  15. A contemporary decennial global Landsat sample of changing agricultural field sizes

    Science.gov (United States)

    White, Emma; Roy, David

    2014-05-01

    Agriculture has caused significant human-induced Land Cover Land Use (LCLU) change, with dramatic cropland expansion in the last century and significant increases in productivity over the past few decades. Satellite data have been used for agricultural applications including cropland distribution mapping, crop condition monitoring, crop production assessment and yield prediction. Satellite-based agricultural applications are less reliable when the sensor spatial resolution is coarse relative to the field size. However, to date, studies of agricultural field size distributions and their change have been limited, even though this information is needed to inform the design of agricultural satellite monitoring systems. Moreover, the size of agricultural fields is a fundamental description of rural landscapes and provides an insight into the drivers of rural LCLU change. In many parts of the world field sizes may have increased. Increasing field sizes cause a subsequent decrease in the number of fields and therefore decreased landscape spatial complexity with impacts on biodiversity, habitat, soil erosion, plant-pollinator interactions, and impacts on the diffusion of herbicides, pesticides, disease pathogens, and pests. The Landsat series of satellites provides the longest record of global land observations, with 30m observations available since 1982. Landsat data are used to examine contemporary field size changes in a period (1980 to 2010) when significant global agricultural changes have occurred. A multi-scale sampling approach is used to locate global hotspots of field size change by examination of a recent global agricultural yield map and literature review. Nine hotspots are selected where significant field size change is apparent and where change has been driven by technological advancements (Argentina and U.S.), abrupt societal changes (Albania and Zimbabwe), government land use and agricultural policy changes (China, Malaysia, Brazil), and/or constrained by

  16. Targeting Triple Negative Breast Cancer with a Small-sized Paramagnetic Nanoparticle

    Science.gov (United States)

    Zhang, Li; Varma, Nadimpalli RS; Gang, Zhang Z.; Ewing, James R.; Arbab, Ali S; Ali, Meser M

    2016-01-01

    There is no targeted therapy or imaging agent available for triple negative breast cancer (TNBC). We developed a small-sized dendrimer-based nanoparticle containing a clinically relevant MRI contrast agent, GdDOTA, and a NIR fluorescent dye, DL680. Systemic delivery of the dual-modal nanoparticles led to accumulation of the agents in a flank mouse model of TNBC, detected by both optical and MR imaging. In-vivo fluorescence images, as well as ex-vivo fluorescence images of individual organs, demonstrated that the nanoparticles accumulated selectively in the tumor. The dual-modal strategy resulted in selective delivery of a small-sized (GdDOTA)42-G4-DL680 dendrimeric agent to TNBC tumors, avoiding other major organs. PMID:28018751

  17. Workplace Health Promotion within Small and Medium-Sized Enterprises

    Science.gov (United States)

    Moore, Ann; Parahoo, Kader; Fleming, Paul

    2010-01-01

    Purpose: The purpose of this study is to explore managers' understanding of workplace health promotion (WHP) and experiences of WHP activity within small and medium-sized enterprises (SMEs) in a Health and Social Care Trust area of Northern Ireland. The paper aims to focus on engagement with activities within the context of prevention of…

  18. Financing Of Small And Medium-Size Enterprises In Cameroon ...

    African Journals Online (AJOL)

    Financing Of Small And Medium-Size Enterprises In Cameroon. ... Available data from the banking sector shows that as much as 78.7% of all ... SMEs and large companies pay back their loans better than the other ... Even the SME loan repayment rate of 62.9% is still low by World Bank ... AJOL African Journals Online.

  19. Rock sampling. [method for controlling particle size distribution

    Science.gov (United States)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  20. The art of being small : brain-body size scaling in minute parasitic wasps

    NARCIS (Netherlands)

    Woude, van der Emma

    2017-01-01

    Haller’s rule states that small animals have relatively larger brains than large animals. This brain-body size relationship may enable small animals to maintain similar levels of brain performance as large animals. However, it also causes small animals to spend an exceptionally large proportion

  1. Effects of sample size on the second magnetization peak in ...

    Indian Academy of Sciences (India)

    8+ crystals are observed at low temperatures, above the temperature where the SMP totally disappears. In particular, the onset of the SMP shifts to lower fields as the sample size decreases - a result that could be interpreted as a size effect in ...

  2. [Monitoring microbiological safety of small systems of water distribution. Comparison of two sampling programs in a town in central Italy].

    Science.gov (United States)

    Papini, Paolo; Faustini, Annunziata; Manganello, Rosa; Borzacchi, Giancarlo; Spera, Domenico; Perucci, Carlo A

    2005-01-01

    To determine the frequency of sampling in small water distribution systems (distribution. We carried out two sampling programs to monitor the water distribution system in a town in Central Italy between July and September 1992; the Poisson distribution assumption implied 4 water samples, the assumption of negative binomial distribution implied 21 samples. Coliform organisms were used as indicators of water safety. The network consisted of two pipe rings and two wells fed by the same water source. The number of summer customers varied considerably from 3,000 to 20,000. The mean density was 2.33 coliforms/100 ml (sd= 5.29) for 21 samples and 3 coliforms/100 ml (sd= 6) for four samples. However the hypothesis of homogeneity was rejected (p-value samples (beta= 0.24) than with 21 (beta= 0.05). For this small network, determining the samples' size according to heterogeneity hypothesis strengthens the statement that water is drinkable compared with homogeneity assumption.

  3. Sample size for estimation of the Pearson correlation coefficient in cherry tomato tests

    Directory of Open Access Journals (Sweden)

    Bruno Giacomini Sari

    2017-09-01

    Full Text Available ABSTRACT: The aim of this study was to determine the required sample size for estimation of the Pearson coefficient of correlation between cherry tomato variables. Two uniformity tests were set up in a protected environment in the spring/summer of 2014. The variables observed in each plant were mean fruit length, mean fruit width, mean fruit weight, number of bunches, number of fruits per bunch, number of fruits, and total weight of fruits, with calculation of the Pearson correlation matrix between them. Sixty-eight sample sizes were planned for one greenhouse and 48 for the other, with an initial sample size of 10 plants and the subsequent sizes obtained by adding five plants. For each planned sample size, 3000 estimates of the Pearson correlation coefficient were obtained through bootstrap resampling with replacement. The sample size for each correlation coefficient was determined as the size at which the 95% confidence interval amplitude was less than or equal to 0.4. Obtaining estimates of the Pearson correlation coefficient with high precision is difficult for parameters with a weak linear relation; accordingly, a larger sample size is necessary to estimate them. Linear relations involving variables dealing with the size and number of fruits per plant have less precision. To estimate the coefficient of correlation between productivity variables of cherry tomato with a 95% confidence interval amplitude of 0.4, it is necessary to sample 275 plants in a 250m² greenhouse and 200 plants in a 200m² greenhouse.
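    The bootstrap procedure described above, resampling with replacement, re-estimating r, and checking whether the 95% interval amplitude falls to 0.4, can be sketched as follows. The simulated data and the true correlation of 0.3 are invented for the example; the study's actual variables are the cherry tomato traits.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def bootstrap_ci_width(x, y, boots=2000, seed=0):
    """Amplitude of the 95% percentile bootstrap interval for Pearson's r."""
    rng = random.Random(seed)
    n = len(x)
    rs = []
    for _ in range(boots):
        idx = [rng.randrange(n) for _ in range(n)]
        rs.append(pearson([x[i] for i in idx], [y[i] for i in idx]))
    rs.sort()
    return rs[int(0.975 * boots)] - rs[int(0.025 * boots)]

# Weakly correlated data (true r ~ 0.3): the interval only narrows to the
# 0.4 precision target once the sample is fairly large.
rng = random.Random(42)
def simulate(n, rho=0.3):
    xs = [rng.gauss(0, 1) for _ in range(n)]
    ys = [rho * x + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1) for x in xs]
    return xs, ys

for n in (30, 100, 300):
    print(n, round(bootstrap_ci_width(*simulate(n)), 3))
```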

  4. Effect of sample size on bias correction performance

    Science.gov (United States)

    Reiter, Philipp; Gutjahr, Oliver; Schefczyk, Lukas; Heinemann, Günther; Casper, Markus C.

    2014-05-01

    The output of climate models often shows a bias when compared to observed data, so that a preprocessing is necessary before using it as climate forcing in impact modeling (e.g. hydrology, species distribution). A common bias correction method is the quantile matching approach, which adapts the cumulative distribution function of the model output to the one of the observed data by means of a transfer function. Especially for precipitation we expect the bias correction performance to strongly depend on sample size, i.e. the length of the period used for calibration of the transfer function. We carry out experiments using the precipitation output of ten regional climate model (RCM) hindcast runs from the EU-ENSEMBLES project and the E-OBS observational dataset for the period 1961 to 2000. The 40 years are split into a 30 year calibration period and a 10 year validation period. In the first step, for each RCM transfer functions are set up cell-by-cell, using the complete 30 year calibration period. The derived transfer functions are applied to the validation period of the respective RCM precipitation output and the mean absolute errors in reference to the observational dataset are calculated. These values are treated as "best fit" for the respective RCM. In the next step, this procedure is redone using subperiods out of the 30 year calibration period. The lengths of these subperiods are reduced from 29 years down to a minimum of 1 year, only considering subperiods of consecutive years. This leads to an increasing number of repetitions for smaller sample sizes (e.g. 2 for a length of 29 years). In the last step, the mean absolute errors are statistically tested against the "best fit" of the respective RCM to compare the performances. In order to analyze if the intensity of the effect of sample size depends on the chosen correction method, four variations of the quantile matching approach (PTF, QUANT/eQM, gQM, GQM) are applied in this study. The experiments are further
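    The quantile matching approach, building a transfer function that maps each quantile of the model output onto the corresponding quantile of the observations over the calibration period, can be sketched in its simplest empirical form. The toy calibration samples below are invented; the study's variants (PTF, QUANT/eQM, gQM, GQM) are parametric or otherwise refined versions of this idea.

```python
def quantile_map(model_cal, obs_cal, value):
    """Empirical quantile matching: locate `value` on the model's
    calibration ECDF, then return the observed value at that quantile."""
    m, o = sorted(model_cal), sorted(obs_cal)
    rank = sum(1 for v in m if v <= value)   # ECDF position in the model sample
    if rank == 0:
        return o[0]
    j = (rank - 1) * len(o) // len(m)        # same quantile index in the obs sample
    return o[min(j, len(o) - 1)]

# Toy example: the "model" is biased high by a factor of two, so the
# transfer function roughly halves any model value from the same period.
obs_cal = list(range(1, 101))            # observed calibration sample
model_cal = [2 * v for v in obs_cal]     # model output with a 2x wet bias
corrected = quantile_map(model_cal, obs_cal, 100)
```

A longer calibration period gives a denser empirical CDF and hence a more stable transfer function, which is the sample-size effect the study quantifies.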

  5. Managers' Understanding of Workplace Health Promotion within Small and Medium-Sized Enterprises: A Phenomenological Study

    Science.gov (United States)

    Moore, Ann; Parahoo, Kader; Fleming, Paul

    2011-01-01

    Objective: This study aimed at exploring managers' understanding of workplace health promotion and experiences of workplace health promotion activity within small and medium-sized enterprises. Design: A Heideggerian interpretive phenomenological methodology was adopted. Setting: This study was undertaken with small and medium-sized enterprise…

  6. Financial Management Challenges In Small And Medium-Sized Enterprises: A Strategic Management Approach

    OpenAIRE

    Hande Karadag

    2015-01-01

    Abstract: Due to their significant role in the creation of new jobs, rise in GDP, entrepreneurship and innovation, small and medium-sized enterprises (SMEs) are recognized as the drivers of socio-economic growth, both in developed and developing economies. In Turkey, 99.9% of all enterprises fall into the SME category. Therefore, the significance of SMEs for the economy and society is much higher in Turkey compared to other emerging and developed countries. Small and medium-sized companies ...

  7. Multi-element analysis of small biological samples

    International Nuclear Information System (INIS)

    Rokita, E.; Cafmeyer, J.; Maenhaut, W.

    1983-01-01

    A method combining PIXE and INAA was developed to determine the elemental composition of small biological samples. The method needs virtually no sample preparation and less than 1 mg is sufficient for the analysis. The method was used for determining up to 18 elements in leaves taken from Cracow Herbaceous. The factors which influence the elemental composition of leaves and the possible use of leaves as an environmental pollution indicator are discussed

  8. Indoor particle levels in small- and medium-sized commercial buildings in California.

    Science.gov (United States)

    Wu, Xiangmei May; Apte, Michael G; Bennett, Deborah H

    2012-11-20

    This study monitored indoor and outdoor particle concentrations in 37 small and medium commercial buildings (SMCBs) in California with three buildings sampled on two occasions, resulting in 40 sampling days. Sampled buildings included offices, retail establishments, restaurants, dental offices, and hair salons, among others. Continuous measurements were made for both ultrafine and fine particulate matter as well as black carbon inside and outside of the building. Integrated PM(2.5), PM(2.5-10), and PM(10) samples were also collected inside and outside the building. The majority of the buildings had indoor/outdoor (I/O) particle concentration ratios less than 1.0, indicating that contributions from indoor sources are less than removal of outdoor particles. However, some of the buildings had I/O ratios greater than 1, indicating significant indoor particle sources. This was particularly true of restaurants, hair salons, and dental offices. The infiltration factor was estimated from a regression analysis of indoor and outdoor concentrations for each particle size fraction, finding lower values for ultrafine and coarse particles than for submicrometer particles, as expected. The I/O ratio of black carbon was used as a relative measure of the infiltration factor of particles among buildings, with a geometric mean of 0.62. The contribution of indoor sources to indoor particle levels was estimated for each building.
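    The regression step, estimating the infiltration factor as the slope of indoor on outdoor concentrations, can be sketched with ordinary least squares. The infiltration factor of 0.6 and the indoor-source term below are invented for the illustration, not values from the study.

```python
import random

def ols_slope_intercept(x, y):
    """Least-squares fit y = a*x + b. In an indoor-vs-outdoor regression,
    the slope a estimates the infiltration factor and the intercept b the
    contribution of indoor sources."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((v - mx) ** 2 for v in x)
    sxy = sum((u - mx) * (v - my) for u, v in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical building: indoor = 0.6 * outdoor + 2 (indoor sources) + noise
rng = random.Random(3)
outdoor = [rng.uniform(5, 40) for _ in range(40)]
indoor = [0.6 * c + 2 + rng.gauss(0, 1) for c in outdoor]
f_inf, indoor_source = ols_slope_intercept(outdoor, indoor)
```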

  9. Mechanical characteristics of historic mortars from tests on small-sample non-standard specimens

    Czech Academy of Sciences Publication Activity Database

    Drdácký, Miloš; Slížková, Zuzana

    2008-01-01

    Roč. 17, č. 1 (2008), s. 20-29 ISSN 1407-7353 R&D Projects: GA ČR(CZ) GA103/06/1609 Institutional research plan: CEZ:AV0Z20710524 Keywords : small-sample non-standard testing * lime * historic mortar Subject RIV: AL - Art, Architecture, Cultural Heritage

  10. The Role of Branding in Small and Medium-Sized Enterprises

    Directory of Open Access Journals (Sweden)

    Franc Vidic

    2013-12-01

    Full Text Available The purpose of this article is to show the relationship between branding and brand management in small and medium-sized enterprises (SMEs). Traditionally, branding was associated with large and global corporations, but small and medium-sized enterprises also deal with their own names (brands) in their own way. The study identified four types of businesses according to their relationship to brands, which we named: Ignorant; User; Low-Cost Producer; and Differentiation Producer. Whereas the first two types (Ignorant and User) differ primarily in the extent to which they use simple branding activities, mainly in the local markets where such enterprises tend to operate, the last two types (Low-Cost Producer and Differentiation Producer) design their branding strategies in accordance with their generic strategies and mode of growth.

  11. Customer acquisition plan for a small-size entrepreneur

    OpenAIRE

    Puotiniemi, Tiia

    2014-01-01

    The aim of this thesis was to draw up a customer acquisition plan and to improve marketing planning in the company; the final result was the market research summary. Development of the marketing strategy planning was based on internal and external situation analyses. The thesis concentrates on a small-size operator’s business, and therefore internet marketing, with its benefits and advantages, was brought up. The thesis was written as an auxiliary guide to build u...

  12. Caution regarding the choice of standard deviations to guide sample size calculations in clinical trials.

    Science.gov (United States)

    Chen, Henian; Zhang, Nanhua; Lu, Xiaosun; Chen, Sophie

    2013-08-01

    The method used to determine choice of standard deviation (SD) is inadequately reported in clinical trials. Underestimations of the population SD may result in underpowered clinical trials. This study demonstrates how using the wrong method to determine population SD can lead to inaccurate sample sizes and underpowered studies, and offers recommendations to maximize the likelihood of achieving adequate statistical power. We review the practice of reporting sample size and its effect on the power of trials published in major journals. Simulated clinical trials were used to compare the effects of different methods of determining SD on power and sample size calculations. Prior to 1996, sample size calculations were reported in just 1%-42% of clinical trials. This proportion increased from 38% to 54% after the initial Consolidated Standards of Reporting Trials (CONSORT) was published in 1996, and from 64% to 95% after the revised CONSORT was published in 2001. Nevertheless, underpowered clinical trials are still common. Our simulated data showed that all minimal and 25th-percentile SDs fell below 44 (the population SD), regardless of sample size (from 5 to 50). For sample sizes 5 and 50, the minimum sample SDs underestimated the population SD by 90.7% and 29.3%, respectively. If only one sample was available, there was less than 50% chance that the actual power equaled or exceeded the planned power of 80% for detecting a median effect size (Cohen's d = 0.5) when using the sample SD to calculate the sample size. The proportions of studies with actual power of at least 80% were about 95%, 90%, 85%, and 80% when we used the larger SD, 80% upper confidence limit (UCL) of SD, 70% UCL of SD, and 60% UCL of SD to calculate the sample size, respectively. When more than one sample was available, the weighted average SD resulted in about 50% of trials being underpowered; the proportion of trials with power of 80% increased from 90% to 100% when the 75th percentile and the
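    How the SD plugged into a power calculation drives the planned sample size can be made concrete with the usual normal-approximation formula. The population SD of 44 and the medium effect size (Cohen's d = 0.5, here a 22-unit difference) come from the abstract; the understated SD of 30 is an arbitrary stand-in for the underestimation the authors report.

```python
import math

def n_per_group(sd, delta, z_alpha=1.959964, z_power=0.841621):
    """Approximate per-group n for a two-sample comparison of means at
    two-sided alpha = 0.05 and 80% power:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / delta)^2."""
    return math.ceil(2 * ((z_alpha + z_power) * sd / delta) ** 2)

correct = n_per_group(sd=44, delta=22)    # planned with the true SD
too_small = n_per_group(sd=30, delta=22)  # planned with an understated SD
```

Planning with the understated SD roughly halves the sample size, which is exactly the underpowering described; substituting a larger SD, such as an upper confidence limit of the sample SD, inflates n and protects power.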

  13. The role of micro size computing clusters for small physics groups

    International Nuclear Information System (INIS)

    Shevel, A Y

    2014-01-01

    A small physics group (3-15 persons) might use a number of computing facilities for analysis/simulation, developing/testing, and teaching. Different types of computing facilities are discussed: collaboration computing facilities, a group local computing cluster (including colocation), and cloud computing. The author discusses the growing variety of computing options for small groups and emphasizes the role of a group-owned computing cluster of micro size.

  14. Analysis of near optimum design for small and medium size nuclear power plants

    International Nuclear Information System (INIS)

    Ahmed, A.A.

    1977-01-01

    Market surveys in recent years have shown that a significant market would exist among the developing nations of the world for nuclear power plants that would be classified as small to medium sized, provided that these small plants could produce electricity at a unit price comparable to that of equivalent sized fossil fired plants. Nuclear plants in the range of 100 MWe to 500 MWe would fit more effectively into the relatively smaller grids of most developing nations than would the 900 MWe to 1300 MWe units now being constructed in the large industrial nations. Worldwide re-evaluation of the worth of fossil fuels has prompted a re-examination of the competitive position of small to medium sized nuclear generating units compared to comparable fossil fired units, especially in the context of units specifically optimized for the size range of interest, rather than of designs that are simply scaled down versions of the currently available larger units. Since the absolute cost of electricity is more sensitive to external factors such as cost of money, national inflation rate and time required for licensing and construction than to details of design or perhaps even to choice of fuels, and since the cost of electricity generated in small to medium sized fossil fired units is periodically compared to that of scaled down versions of conventional large nuclear units, the point of view taken here is one of comparing the relative generating costs of smaller nuclear units of optimum design with the corresponding costs of scaled down versions of current large nuclear generating units

  15. International orientation and export commitment in fast small and medium size firms internationalization: scales validation and implications for the Brazilian case

    Directory of Open Access Journals (Sweden)

    Marcelo André Machado

    Full Text Available Abstract A set of changes in the competitive environment has recently provoked the emergence of a new kind of organization that, since its creation, derives a meaningful share of its revenue from international activities developed in more than one continent. Within this new reality, models that describe the internationalization of the firm in phases tied to its growth have lost their capacity to explain this process with regard to small- and medium-sized enterprises (SMEs). Thus, in this paper, the international orientation (IO) and export commitment (EC) constructs have been revised under a theoretical context of the fast internationalization of medium-sized companies, so as to identify scales that more accurately measure these dimensions in the Brazilian setting. After a literature review and an exploratory research phase, the IO and EC scales proposed by Knight and Cavusgil (2004) and Shamsuddoha and Ali (2006) were respectively applied to a sample of 398 small- and medium-sized exporting Brazilian companies. In spite of the conjunctural and situational differences inherent to Brazilian companies, the selected scales presented high measurement reliability. Furthermore, the field research outcomes provide evidence for the existence of a phenomenon of fast internationalization in medium-sized companies in Brazil, as well as support for some theoretical assumptions of other empirical investigations carried out with samples from developed countries.

  16. Small- and Medium-Sized Commercial Building Monitoring and Controls Needs: A Scoping Study

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Underhill, Ronald M.; Goddard, James K.; Taasevigen, Danny J.; Piette, M. A.; Granderson, J.; Brown, Rich E.; Lanzisera, Steven M.; Kuruganti, T.

    2012-10-31

    Buildings account for over 40% of the total energy consumption in the U.S. A significant portion of the energy consumed in buildings is wasted because of the lack of controls or the inability to use existing building automation systems (BASs) properly. Much of the waste occurs because of our inability to manage and control buildings efficiently. Over 90% of buildings are either small-sized (<5,000 sf) or medium-sized (between 5,000 sf and 50,000 sf); these buildings currently do not use BASs to monitor and control their building systems from a central location. According to the Commercial Building Energy Consumption Survey (CBECS), about 10% of the buildings in the U.S. use BASs or central controls to manage their building system operations. Buildings that use BASs are typically large (>100,000 sf). Lawrence Berkeley National Laboratory (LBNL), Oak Ridge National Laboratory (ORNL) and Pacific Northwest National Laboratory (PNNL) were asked by the U.S. Department of Energy’s (DOE’s) Building Technologies Program (BTP) to identify monitoring and control needs for small- and medium-sized commercial buildings and recommend possible solutions. This study documents those needs and solutions for small- and medium-sized buildings.

  17. Influence of Sample Size on Automatic Positional Accuracy Assessment Methods for Urban Areas

    Directory of Open Access Journals (Sweden)

    Francisco J. Ariza-López

    2018-05-01

    Full Text Available In recent years, new approaches aimed to increase the automation level of positional accuracy assessment processes for spatial data have been developed. However, in such cases, an aspect as significant as sample size has not yet been addressed. In this paper, we study the influence of sample size when estimating the planimetric positional accuracy of urban databases by means of an automatic assessment using polygon-based methodology. Our study is based on a simulation process, which extracts pairs of homologous polygons from the assessed and reference data sources and applies two buffer-based methods. The parameter used for determining the different sizes (which range from 5 km up to 100 km has been the length of the polygons’ perimeter, and for each sample size 1000 simulations were run. After completing the simulation process, the comparisons between the estimated distribution functions for each sample and population distribution function were carried out by means of the Kolmogorov–Smirnov test. Results show a significant reduction in the variability of estimations when sample size increased from 5 km to 100 km.
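    The comparison step, measuring how far each sample-based distribution estimate sits from the population distribution with the Kolmogorov-Smirnov statistic, can be sketched generically. The normal data and the sample sizes below are invented stand-ins for the study's simulated positional-error distributions.

```python
import random

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the two empirical distribution functions (two-pointer merge)."""
    a, b = sorted(a), sorted(b)
    i = j = 0
    d = 0.0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d

rng = random.Random(11)
population = [rng.gauss(0, 1) for _ in range(20000)]
small = [rng.gauss(0, 1) for _ in range(100)]
large = [rng.gauss(0, 1) for _ in range(5000)]
# The larger sample's ECDF sits much closer to the population ECDF,
# mirroring the reduced estimation variability reported for bigger samples.
```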

  18. Sample size determination for disease prevalence studies with partially validated data.

    Science.gov (United States)

    Qiu, Shi-Fang; Poon, Wai-Yin; Tang, Man-Lai

    2016-02-01

    Disease prevalence is an important topic in medical research, and its study is based on data that are obtained by classifying subjects according to whether a disease has been contracted. Classification can be conducted with high-cost gold standard tests or low-cost screening tests, but the latter are subject to the misclassification of subjects. As a compromise between the two, many research studies use partially validated datasets in which all data points are classified by fallible tests, and some of the data points are validated in the sense that they are also classified by the completely accurate gold-standard test. In this article, we investigate the determination of sample sizes for disease prevalence studies with partially validated data. We use two approaches. The first is to find sample sizes that can achieve a pre-specified power of a statistical test at a chosen significance level, and the second is to find sample sizes that can control the width of a confidence interval with a pre-specified confidence level. Empirical studies have been conducted to demonstrate the performance of various testing procedures with the proposed sample sizes. The applicability of the proposed methods is illustrated by a real-data example. © The Author(s) 2012.
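    For the special case where every subject is classified by the gold standard, the second approach, choosing n to control the confidence-interval width, reduces to the familiar Wald calculation sketched below. The 10% prevalence guess and 0.05 half-width are invented inputs, and the paper's actual procedures additionally model the fallible classifier, which this sketch omits.

```python
import math

def n_for_ci_halfwidth(p, halfwidth, z=1.959964):
    """Smallest n such that the 95% Wald interval for a prevalence p has
    half-width <= halfwidth: n = z^2 * p * (1 - p) / halfwidth^2."""
    return math.ceil(z ** 2 * p * (1 - p) / halfwidth ** 2)

n = n_for_ci_halfwidth(p=0.10, halfwidth=0.05)
```

The worst case is p = 0.5, which gives the familiar n = 385 for a 5-point margin; rarer diseases need fewer subjects for the same absolute precision.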

  19. Czech small and medium-sized enterprises and the success in foreign markets

    Directory of Open Access Journals (Sweden)

    Lenka Procházková

    2011-01-01

    Full Text Available The importance of small and medium-sized enterprises (SMEs) in national economies has been growing, and for this reason SMEs are receiving more attention. The paper deals with the success of Czech SMEs' activities in foreign markets. These enterprises exploited the opportunity of the extensive European Union market without internal borders and extended their activities to foreign markets. The objective of this paper is to determine characteristics related to the activities carried on by small and medium-sized Czech companies in foreign markets and then to describe those that fundamentally affect the success of these entities there. The characteristics are divided into five groups: characteristics relating to the company, the product, management, international experience, and the target market. The enterprises' success in foreign markets is assessed by an aggregate indicator of success, which combines multi-dimensional indicators of corporate success, both objective and subjective. In the last part of the article, relationship analysis is used to identify the characteristics affecting the success of Czech small and medium-sized enterprises in foreign markets.

  20. International Financial Reporting standard for Small and Medium-sized entities

    Directory of Open Access Journals (Sweden)

    Z Koppeschaar

    2012-12-01

    Full Text Available The International Financial Reporting Standard for Small and Medium-sized Entities (IFRS for SMEs) was published as a standard by the International Accounting Standards Board (IASB) during July 2009. During 2007, South Africa became one of the first countries, and the first country in Africa, to adopt the proposed accounting standard (the exposure draft of an IFRS for SMEs) early. The accounting standard will probably also be accepted by numerous other countries. The aim of this article is to investigate the applicability of this accounting standard. The results indicated that the IFRS for SMEs remains too comprehensive for the majority of small companies. The IFRS for SMEs does not satisfy the needs of South African users of small company financial statements, and as a result the accounting requirements should be simplified. KEYWORDS: Financial accounting; Financial reporting requirements; IFRS for SMEs; Small companies; Users of financial statements; Small company financial statements.

  1. 75 FR 9431 - Small and Medium-Sized Enterprises: U.S. and EU Export Activities, and Barriers and Opportunities...

    Science.gov (United States)

    2010-03-02

    ... to the USTR on the first investigation, No. 332-508, Small and Medium-Sized Enterprises: Overview of..., investigation No. 332-509, Small and Medium-Sized Enterprises: U.S. and EU Export Activities, and Barriers and Opportunities Experienced by U.S. Firms, and investigation No. 332- 510, Small and Medium-Sized Enterprises...

  2. Influence of secular trends and sample size on reference equations for lung function tests.

    Science.gov (United States)

    Quanjer, P H; Stocks, J; Cole, T J; Hall, G L; Stanojevic, S

    2011-03-01

    The aim of our study was to determine the contribution of secular trends and sample size to lung function reference equations, and to establish the number of local subjects required to validate published reference values. 30 spirometry datasets collected between 1978 and 2009 provided data on healthy, white subjects: 19,291 males and 23,741 females aged 2.5-95 yrs. The best fits for forced expiratory volume in 1 s (FEV(1)), forced vital capacity (FVC) and FEV(1)/FVC as functions of age, height and sex were derived from the entire dataset using GAMLSS. Mean z-scores were calculated for individual datasets to determine inter-centre differences. This was repeated by subdividing one large dataset (3,683 males and 4,759 females) into 36 smaller subsets (comprising 18-227 individuals) to preclude differences due to population/technique. No secular trends were observed, and differences between datasets comprising >1,000 subjects were small (maximum difference in FEV(1) and FVC from the overall mean: +0.30 to -0.22 z-scores). Subdividing one large dataset into smaller subsets reproduced the above sample size-related differences and revealed that at least 150 males and 150 females would be necessary to validate reference values and avoid spurious differences due to sampling error. Use of local controls to validate reference equations will rarely be practical due to the numbers required. Reference equations derived from large or collated datasets are recommended.

  3. Optimal Sample Size for Probability of Detection Curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2012-01-01

    The use of Probability of Detection (POD) curves to quantify NDT reliability is common in the aeronautical industry, but relatively less so in the nuclear industry. The European Network for Inspection Qualification's (ENIQ) Inspection Qualification Methodology is based on the concept of the Technical Justification, a document assembling all the evidence to assure that the NDT system in focus is indeed capable of finding the flaws for which it was designed. This methodology has become widely used in many countries, but the assurance it provides is usually of a qualitative nature. The need to quantify the output of inspection qualification has become more important, especially as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. To credit the inspections in structural reliability evaluations, a measure of NDT reliability is necessary, and a POD curve provides such a metric. In 2010 ENIQ developed a technical report on POD curves, reviewing the statistical models used to quantify inspection reliability. Further work was subsequently carried out to investigate the issue of optimal sample size for deriving a POD curve, so that adequate guidance could be given to the practitioners of inspection reliability. Manufacturing test pieces with cracks that are representative of real defects found in nuclear power plants (NPP) can be very expensive. Thus there is a tendency to reduce sample sizes, which in turn reduces the conservatism associated with the derived POD curve. Not much guidance on the correct sample size can be found in the published literature, where often qualitative statements are given with no further justification. The aim of this paper is to summarise the findings of such work. (author)

  4. On Using a Pilot Sample Variance for Sample Size Determination in the Detection of Differences between Two Means: Power Consideration

    Science.gov (United States)

    Shieh, Gwowen

    2013-01-01

    The a priori determination of a proper sample size necessary to achieve some specified power is an important problem encountered frequently in practical studies. To establish the needed sample size for a two-sample "t" test, researchers may conduct the power analysis by specifying scientifically important values as the underlying population means…
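    For context, the classical normal-approximation formula that such a power analysis starts from can be sketched as below. The paper's contribution concerns what happens when the variance is replaced by a pilot-sample estimate, which this sketch does not model:

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Per-group n for a two-sided two-sample comparison of means,
    normal approximation: n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)**2,
    where delta is the assumed mean difference and sigma the common SD."""
    z = NormalDist().inv_cdf
    return math.ceil(2 * ((z(1 - alpha / 2) + z(power)) * sigma / delta) ** 2)

print(n_per_group(delta=0.5, sigma=1.0))  # -> 63 per group for a medium effect
```

    The exact t-based answer is slightly larger (64 per group here), obtained by iterating with the noncentral t distribution; the gap is what pilot-variance adjustments such as Shieh's address.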

  5. The effects of parameter estimation on minimizing the in-control average sample size for the double sampling X bar chart

    Directory of Open Access Journals (Sweden)

    Michael B.C. Khoo

    2013-11-01

    Full Text Available The double sampling (DS) X bar chart, one of the most widely-used charting methods, is superior for detecting small and moderate shifts in the process mean. In a right-skewed run length distribution, the median run length (MRL) provides a more credible representation of the central tendency than the average run length (ARL), as the mean is greater than the median. In this paper, therefore, MRL is used as the performance criterion instead of the traditional ARL. Generally, the performance of the DS X bar chart is investigated under the assumption of known process parameters. In practice, these parameters are usually estimated from an in-control reference Phase-I dataset. Since the performance of the DS X bar chart is significantly affected by estimation errors, we study the effects of parameter estimation on the MRL-based DS X bar chart when the in-control average sample size is minimised. This study reveals that more than 80 samples are required for the MRL-based DS X bar chart with estimated parameters to perform more favourably than the corresponding chart with known parameters.

  6. Theory of flotation of small and medium-size particles

    Science.gov (United States)

    Derjaguin, B. V.; Dukhin, S. S.

    1993-08-01

    The paper describes a theory of flotation of small and medium-size particles (less than 50 μm in radius) when their precipitation on a bubble surface depends more on surface forces than on inertia forces, and deformation of the bubble due to collisions with the particles may be neglected. The approach of the mineral particle to the bubble surface is regarded as taking place in three stages corresponding to movement of the particles through zones 1, 2 and 3. Zone 3 is a liquid wetting layer of such thickness that a positive or negative disjoining pressure arises in this intervening layer between the particle and the bubble. By zone 2 is meant the diffusional boundary layer of the bubble. In zone 1, which comprises the entire liquid outside zone 2, there are no surface forces. Precipitation of the particles is calculated by considering the forces acting in zones 1, 2 and 3. The particles move through zone 1 under the action of gravity and inertia. Analysis of the movement of the particles under the action of these forces gives the critical particle size, below which contact with the bubble surface is impossible, if the surface forces acting in zones 2 and 3 be neglected. The forces acting in zone 2 are ‘diffusio-phoretic’ forces due to the concentration gradient in the diffusional boundary layer. The concentration and electric field intensity distribution in zone 2 is calculated, taking into account ion diffusion to the deformed bubble surface. An examination is made of the ‘equilibrium’ surface forces acting in zone 3 independent of whether the bubble is at rest or in motion. These forces, which determine the behaviour of the thin wetting intervening layer between the bubble and the mineral particle and the height of the force barrier against its rupture, may be represented as results of the disjoining pressure forces acting on various parts of the film. The main components of the disjoining pressure are van der Waals forces, forces of an iono

  7. Fruit size and sampling site effects on dormancy, viability and germination of teak (Tectona grandis L.) seeds

    International Nuclear Information System (INIS)

    Akram, M.; Aftab, F.

    2016-01-01

    In the present study, fruits (drupes) were collected from Changa Manga Forest Plus Trees (CMF-PT), Changa Manga Forest Teak Stand (CMF-TS) and Punjab University Botanical Gardens (PUBG) and categorized into very large (≥17 mm dia.), large (12-16 mm dia.), medium (9-11 mm dia.) or small (6-8 mm dia.) fruit size grades. Fresh water as well as mechanical scarification and stratification were tested for breaking seed dormancy. The viability status of seeds was estimated by cutting test, X-rays and in vitro seed germination. Out of 2595 fruits from CMF-PT, 500 fruits were of the very large grade. This fruit category also had the highest individual fruit weight (0.58 g), more 4-seeded fruits (5.29 percent) and fair germination potential (35.32 percent). Generally, most of the fruits were 1-seeded irrespective of size grades and sampling sites. Fresh water scarification had a strong effect on germination (44.30 percent) as compared to mechanical scarification and cold stratification after 40 days of sowing. Similarly, sampling sites and fruit size grades also had a significant influence on germination. The highest germination (82.33 percent) was obtained on MS (Murashige and Skoog) agar-solidified medium as compared to Woody Plant Medium (WPM) (69.22 percent). Seedlings from all the media were transferred to ex vitro conditions in the greenhouse, with the highest survival (28.6 percent) achieved by seedlings previously raised on MS agar-solidified medium after 40 days. There was an association between the studied parameters of teak seeds and the sampling sites and fruit size. (author)

  8. What is the optimum sample size for the study of peatland testate amoeba assemblages?

    Science.gov (United States)

    Mazei, Yuri A; Tsyganov, Andrey N; Esaulov, Anton S; Tychkov, Alexander Yu; Payne, Richard J

    2017-10-01

    Testate amoebae are widely used in ecological and palaeoecological studies of peatlands, particularly as indicators of surface wetness. To ensure data are robust and comparable it is important to consider methodological factors which may affect results. One significant question which has not been directly addressed in previous studies is how sample size (expressed here as number of Sphagnum stems) affects data quality. In three contrasting locations in a Russian peatland we extracted samples of differing size, analysed testate amoebae and calculated a number of widely-used indices: species richness, Simpson diversity, compositional dissimilarity from the largest sample and transfer function predictions of water table depth. We found that there was a trend for larger samples to contain more species across the range of commonly-used sample sizes in ecological studies. Smaller samples sometimes failed to produce counts of testate amoebae often considered minimally adequate. It seems likely that analyses based on samples of different sizes may not produce consistent data. Decisions about sample size need to reflect trade-offs between logistics, data quality, spatial resolution and the disturbance involved in sample extraction. For most common ecological applications we suggest that samples of more than eight Sphagnum stems are likely to be desirable. Copyright © 2017 Elsevier GmbH. All rights reserved.

  9. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    International Nuclear Information System (INIS)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K.

    2015-01-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample at a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  10. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K. [Canberra Industries Inc., 800 Research Parkway, Meriden, CT 06450 (United States)

    2015-07-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample at a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  11. A Geology Sampling System for Small Bodies

    Science.gov (United States)

    Naids, Adam J.; Hood, Anthony D.; Abell, Paul; Graff, Trevor; Buffington, Jesse

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a small body. Currently, the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in microgravity environment pose unique challenges. In preparation for such missions a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  12. [Sample size calculation in clinical post-marketing evaluation of traditional Chinese medicine].

    Science.gov (United States)

    Fu, Yingkun; Xie, Yanming

    2011-10-01

    In recent years, as the Chinese government and people pay more attention to post-marketing research on Chinese medicine, some traditional Chinese medicine products have begun, or are about to begin, post-marketing evaluation studies. In post-marketing evaluation design, sample size calculation plays a decisive role. It not only ensures the accuracy and reliability of the post-marketing evaluation, but also assures that the intended trials will have the desired power to correctly detect a clinically meaningful difference between the medicines under study if such a difference truly exists. Up to now, there has been no systematic method of sample size calculation tailored to traditional Chinese medicine. In this paper, according to the basic methods of sample size calculation and the characteristics of clinical evaluation of traditional Chinese medicine, sample size calculation methods for the efficacy and safety of Chinese medicine are discussed respectively. We hope the paper will be beneficial to medical researchers and pharmaceutical scientists who are engaged in Chinese medicine research.

  13. 77 FR 72691 - Small Business Size Standards: Administrative and Support, Waste Management and Remediation Services

    Science.gov (United States)

    2012-12-06

    ... importantly, the Small Business Act requires SBA to establish one definition of what is a small business... SMALL BUSINESS ADMINISTRATION 13 CFR Part 121 RIN 3245-AG27 Small Business Size Standards: Administrative and Support, Waste Management and Remediation Services AGENCY: U.S. Small Business Administration...

  14. Characterization and optimization of Silicon Photomultipliers and small size scintillator tiles for future calorimeter applications

    CERN Document Server

    Horváth, Ákos

    For the active layers of highly granular sampling calorimeters, small scintillator tiles read out by Silicon Photomultipliers (SiPM) can be an interesting and cost effective alternative to silicon sensors. At CERN a test setup was realized for the development of new generations of calorimeters to characterize new types of Silicon Photomultipliers in terms of gain, noise, afterpulses and crosstalk and to study the impact of scintillator wrappings and the tile size on the measured light yield and uniformity. In this thesis work, the experimental setup is described and the steps for commissioning the equipment are discussed. Then, the temperature dependence of the Silicon Photomultiplier response will be investigated, including the dependence of bare Silicon Photomultipliers as well as Silicon Photomultipliers coupled to scintillator tiles. Finally, the tile-photomultiplier response for different tile sizes and coating options will be evaluated. The experimental setup will be extended to allow for the characteri...

  15. Ultra-small-angle X-ray scattering characterization of diesel/gasoline soot: sizes and particle-packing conditions

    Science.gov (United States)

    Kameya, Yuki; Lee, Kyeong O.

    2013-10-01

    Regulations on particulate emissions from internal combustion engines tend to become more stringent; accordingly, the importance of particulate filters in the after-treatment system has been increasing. In this work, the applicability of ultra-small-angle X-ray scattering (USAXS) to diesel soot cake and gasoline soot was investigated. Gasoline-direct-injection engine soot was collected at different fuel injection timings. The unified fits method was applied to analyze the resultant scattering curves. The validity of the analysis was supported by comparison with carbon black and by imaging the samples with a transmission electron microscope, which revealed that the primary particle size ranged from 20 to 55 nm. In addition, the effects of particle-packing conditions on the USAXS measurement were demonstrated using samples suspended in acetone. The investigation was then extended to the characterization of diesel soot cake deposited on a diesel particulate filter (DPF). Diesel soot was trapped on a small piece of DPF at different deposition conditions, which were specified using the Peclet number. The dependence of the scattering curve on soot-deposition conditions was demonstrated. To support the interpretation of the USAXS results, soot cake samples were observed using a scanning electron microscope and the influence of particle-packing conditions on the scattering curve was discussed.

  16. Ultra-small-angle X-ray scattering characterization of diesel/gasoline soot: sizes and particle-packing conditions

    International Nuclear Information System (INIS)

    Kameya, Yuki; Lee, Kyeong O.

    2013-01-01

    Regulations on particulate emissions from internal combustion engines tend to become more stringent; accordingly, the importance of particulate filters in the after-treatment system has been increasing. In this work, the applicability of ultra-small-angle X-ray scattering (USAXS) to diesel soot cake and gasoline soot was investigated. Gasoline-direct-injection engine soot was collected at different fuel injection timings. The unified fits method was applied to analyze the resultant scattering curves. The validity of the analysis was supported by comparison with carbon black and by imaging the samples with a transmission electron microscope, which revealed that the primary particle size ranged from 20 to 55 nm. In addition, the effects of particle-packing conditions on the USAXS measurement were demonstrated using samples suspended in acetone. The investigation was then extended to the characterization of diesel soot cake deposited on a diesel particulate filter (DPF). Diesel soot was trapped on a small piece of DPF at different deposition conditions, which were specified using the Peclet number. The dependence of the scattering curve on soot-deposition conditions was demonstrated. To support the interpretation of the USAXS results, soot cake samples were observed using a scanning electron microscope and the influence of particle-packing conditions on the scattering curve was discussed.

  17. Ultra-small-angle X-ray scattering characterization of diesel/gasoline soot: sizes and particle-packing conditions

    Energy Technology Data Exchange (ETDEWEB)

    Kameya, Yuki, E-mail: ykameya@anl.gov; Lee, Kyeong O. [Argonne National Laboratory, Center for Transportation Research (United States)

    2013-10-15

    Regulations on particulate emissions from internal combustion engines tend to become more stringent; accordingly, the importance of particulate filters in the after-treatment system has been increasing. In this work, the applicability of ultra-small-angle X-ray scattering (USAXS) to diesel soot cake and gasoline soot was investigated. Gasoline-direct-injection engine soot was collected at different fuel injection timings. The unified fits method was applied to analyze the resultant scattering curves. The validity of the analysis was supported by comparison with carbon black and by imaging the samples with a transmission electron microscope, which revealed that the primary particle size ranged from 20 to 55 nm. In addition, the effects of particle-packing conditions on the USAXS measurement were demonstrated using samples suspended in acetone. The investigation was then extended to the characterization of diesel soot cake deposited on a diesel particulate filter (DPF). Diesel soot was trapped on a small piece of DPF at different deposition conditions, which were specified using the Peclet number. The dependence of the scattering curve on soot-deposition conditions was demonstrated. To support the interpretation of the USAXS results, soot cake samples were observed using a scanning electron microscope and the influence of particle-packing conditions on the scattering curve was discussed.

  18. Accounting standards for small and medium-sized entities. Evidence from Spain

    OpenAIRE

    Milanés Montero Patricia; Albarrán Lozano Irene; Texeira Quirós Joaquín; Esteban Pérez Calderón

    2011-01-01

    By approving Regulation 1606/2002, the European Commission entrusted the local European Union regulators with the difficult decision of whether or not to extend the International Accounting Standards (IAS/IFRS) to their small and medium-sized entities. The International Financial Reporting Standard for Small and Medium-sized Entities (hereinafter IFRS for SMEs) was published in 2009, and the European Commission had also decided to seek the opinion of EU stakeholders on this Standard. W...

  19. Graduate Student in Engineering, Hoped by Owners of Medium-and Small-Sized Companies

    Science.gov (United States)

    Kida, Yoshihiro

    Medium- and small-sized companies have sustained the Japanese technologies that have so far ranked at the top level in the world. However, for the technologies of medium- and small-sized companies to remain at the top level in the 21st century, their researchers and engineers must command advanced technologies. Such engineers are expected to have not only “knowledge”, but also “creativity, feeling and experience”. Therefore, in order to produce such professional engineers, educational institutes must have a function by which practical training and education can be conducted.

  20. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes.

    Science.gov (United States)

    Chen, Xiao; Lu, Bin; Yan, Chao-Gan

    2018-01-01

    Concerns regarding the reproducibility of resting-state functional magnetic resonance imaging (R-fMRI) findings have been raised. Little is known about how to operationally define R-fMRI reproducibility and to what extent it is affected by multiple comparison correction strategies and sample size. We comprehensively assessed two aspects of reproducibility, test-retest reliability and replicability, on widely used R-fMRI metrics in both between-subject contrasts of sex differences and within-subject comparisons of eyes-open and eyes-closed (EOEC) conditions. We noted that a permutation test with Threshold-Free Cluster Enhancement (TFCE), a strict multiple comparison correction strategy, reached the best balance between the family-wise error rate (under 5%) and test-retest reliability/replicability (e.g., 0.68 for test-retest reliability and 0.25 for replicability of the amplitude of low-frequency fluctuations (ALFF) for between-subject sex differences, 0.49 for replicability of ALFF for within-subject EOEC differences). Although R-fMRI indices attained moderate reliabilities, they replicated poorly in distinct datasets (replicability < 0.3 for between-subject sex differences, < 0.5 for within-subject EOEC differences). By randomly drawing different sample sizes from a single site, we found that reliability, sensitivity and positive predictive value (PPV) rose as sample size increased. Small sample sizes (e.g., < 80 [40 per group]) not only minimized power (sensitivity < 2%), but also decreased the likelihood that significant results reflect "true" effects (PPV < 0.26) in sex differences. Our findings have implications for how to select multiple comparison correction strategies and highlight the importance of sufficiently large sample sizes in R-fMRI studies to enhance reproducibility. Hum Brain Mapp 39:300-318, 2018. © 2017 Wiley Periodicals, Inc.

  1. Determining sample size for assessing species composition in ...

    African Journals Online (AJOL)

    Species composition is measured in grasslands for a variety of reasons. Commonly, observations are made using the wheel-point apparatus, but the problem of determining optimum sample size has not yet been satisfactorily resolved. In this study the wheel-point apparatus was used to record 2 000 observations in each of ...

  2. Large- and small-size advantages in sneaking behaviour in the dusky frillgoby Bathygobius fuscus

    OpenAIRE

    Takegaki, Takeshi; Kaneko, Takashi; Matsumoto, Yukio

    2012-01-01

    Sneaking tactic, a male alternative reproductive tactic involving sperm competition, is generally adopted by small individuals because of its inconspicuousness. However, large size has an advantage when competition occurs between sneakers for fertilization of eggs. Here, we suggest that both large- and small-size advantages of sneaker males are present within the same species. Large sneaker males of the dusky frillgoby Bathygobius fuscus showed a high success rate in intruding into spawning n...

  3. Towards traceable size determination of extracellular vesicles

    Directory of Open Access Journals (Sweden)

    Zoltán Varga

    2014-02-01

    Full Text Available Background: Extracellular vesicles (EVs have clinical importance due to their roles in a wide range of biological processes. The detection and characterization of EVs are challenging because of their small size, low refractive index, and heterogeneity. Methods: In this manuscript, the size distribution of an erythrocyte-derived EV sample is determined using state-of-the-art techniques such as nanoparticle tracking analysis, resistive pulse sensing, and electron microscopy, and novel techniques in the field, such as small-angle X-ray scattering (SAXS and size exclusion chromatography coupled with dynamic light scattering detection. Results: The mode values of the size distributions of the studied erythrocyte EVs reported by the different methods show only small deviations around 130 nm, but there are differences in the widths of the size distributions. Conclusion: SAXS is a promising technique with respect to traceability, as this technique was already applied for traceable size determination of solid nanoparticles in suspension. To reach the traceable measurement of EVs, monodisperse and highly concentrated samples are required.

  4. Sample size adjustments for varying cluster sizes in cluster randomized trials with binary outcomes analyzed with second-order PQL mixed logistic regression.

    Science.gov (United States)

    Candel, Math J J M; Van Breukelen, Gerard J P

    2010-06-30

    Adjustments of sample size formulas are given for varying cluster sizes in cluster randomized trials with a binary outcome when testing the treatment effect with mixed effects logistic regression using second-order penalized quasi-likelihood estimation (PQL). Starting from first-order marginal quasi-likelihood (MQL) estimation of the treatment effect, the asymptotic relative efficiency of unequal versus equal cluster sizes is derived. A Monte Carlo simulation study shows this asymptotic relative efficiency to be rather accurate for realistic sample sizes, when employing second-order PQL. An approximate, simpler formula is presented to estimate the efficiency loss due to varying cluster sizes when planning a trial. In many cases sampling 14 per cent more clusters is sufficient to repair the efficiency loss due to varying cluster sizes. Since current closed-form formulas for sample size calculation are based on first-order MQL, planning a trial also requires a conversion factor to obtain the variance of the second-order PQL estimator. In a second Monte Carlo study, this conversion factor turned out to be 1.25 at most. (c) 2010 John Wiley & Sons, Ltd.
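
The "sample more clusters" repair described above can be sketched with the standard design-effect approximation for unequal cluster sizes. This is a textbook formula, not the paper's exact second-order-PQL result, and the function names are illustrative:

```python
# Sketch of the cluster-number adjustment for varying cluster sizes,
# using the common design-effect approximation
#   DEFF = 1 + ((1 + CV^2) * m_bar - 1) * ICC
# (a textbook approximation, NOT the exact second-order-PQL efficiency
# formula derived in the paper above).

def design_effect(mean_cluster_size: float, cv: float, icc: float) -> float:
    """Design effect for cluster sizes with coefficient of variation cv."""
    return 1.0 + ((1.0 + cv ** 2) * mean_cluster_size - 1.0) * icc

def extra_clusters_fraction(mean_cluster_size: float, cv: float, icc: float) -> float:
    """Fraction of additional clusters needed versus equal cluster sizes."""
    deff_unequal = design_effect(mean_cluster_size, cv, icc)
    deff_equal = design_effect(mean_cluster_size, 0.0, icc)
    return deff_unequal / deff_equal - 1.0

if __name__ == "__main__":
    # e.g. 20 subjects per cluster on average, size CV of 0.7, ICC of 0.05
    print(round(extra_clusters_fraction(20, 0.7, 0.05), 3))
```

With this cruder approximation the required inflation can exceed the 14 per cent quoted above for the PQL-based derivation, which is the point of using the paper's exact formula when planning a real trial.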

  5. The effect of clustering on lot quality assurance sampling: a probabilistic model to calculate sample sizes for quality assessments.

    Science.gov (United States)

    Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello

    2013-10-26

    Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs.
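
The beta-binomial acceptance-probability calculation at the heart of C-LQAS can be sketched as follows. The parameterisation through an intracluster correlation rho is this sketch's assumption, and the paper's supplemental code remains the authoritative implementation:

```python
import math

# Acceptance probability of an LQAS decision rule (accept if at most d
# "failures" among n sampled) under a beta-binomial model, which inflates
# the variance relative to the binomial to reflect clustering.
# Mapping rho -> (a, b) below is an illustrative parameterisation.

def log_beta(a: float, b: float) -> float:
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def betabinom_pmf(k: int, n: int, a: float, b: float) -> float:
    return math.exp(math.lgamma(n + 1) - math.lgamma(k + 1)
                    - math.lgamma(n - k + 1)
                    + log_beta(k + a, n - k + b) - log_beta(a, b))

def accept_prob(n: int, d: int, p: float, rho: float) -> float:
    """P(at most d failures in n draws | prevalence p, clustering rho)."""
    a = p * (1 - rho) / rho
    b = (1 - p) * (1 - rho) / rho
    return sum(betabinom_pmf(k, n, a, b) for k in range(d + 1))

if __name__ == "__main__":
    # Clustering (rho = 0.1) fattens the tails relative to simple random
    # sampling (rho -> 0), changing the misclassification risks.
    print(accept_prob(19, 5, 0.5, 0.1), accept_prob(19, 5, 0.5, 1e-9))
```

Scanning `n` and `d` until both risk constraints fall below the user-specified limits reproduces the kind of search the paper describes.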

  6. Magneto dynamic study of small size superconductors

    International Nuclear Information System (INIS)

    Mamsurova, L.G.; Pigal'skij, K.S.; Sakun, V.P.

    1996-01-01

    Magnetic field dependences of the magnetic permeability in fine-grained YBaCuO samples of grain size D ≈ λ are investigated. The hysteresis of the oscillatory part of the magnetic permeability is evaluated for a system of loosely bound grains. Numerical calculations of vortex structures and the corresponding μv values for a superconducting plate of thickness d ≈ λ are made both for thermodynamic equilibrium and for metastable states. The computational results qualitatively agree with the experiment

  7. Subureteral Injection with Small-Size Dextranomer/Hyaluronic Acid Copolymer: Is It Really Efficient?

    Directory of Open Access Journals (Sweden)

    Iyimser Üre

    2016-01-01

    Full Text Available The aim of this study was to evaluate the clinical results of patients with vesicoureteral reflux who were treated with subureteral injection of small-size (80–120 μm) dextranomer/hyaluronic acid copolymer (Dx/HA). Data of 75 children (105 renal units) who underwent the STING procedure with small-size Dx/HA for the treatment of vesicoureteral reflux (VUR) in our clinic between 2008 and 2012 were retrospectively analyzed. Preoperative reflux grade and side, injection indication, postoperative urinary infections and urinary symptoms, voiding cystourethrogram, and renal scintigraphy results were evaluated. The success rate of the procedure was 100% in patients with grades 1 and 2 reflux, 91% in patients with grade 3 reflux, and 82.6% in patients with grade 4. The overall success rate of the treated patients was 97%. Endoscopic subureteric injection with Dx/HA has become a reasonable minimally invasive alternative to open surgery, long-term antibiotic prophylaxis, and surveillance in the treatment of VUR, owing to its easy application, low cost, low complication rate, and high success rate. Injection material composed of small-size dextranomer microspheres appears superior to normal-size Dx/HA, offering similar success at lower cost.

  8. The Impact of their Adoption on Small and Medium-sized Enterprises

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... of small and medium-sized enterprises (SMEs) as well as the impact of these ... of Mathematics and Industrial Engineering at the École Polytechique, Montréal. Dr Lefebvre's research focuses on technological innovation and the strategic ...

  9. International seminar on status and prospects for small and medium sized reactors. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-05-01

    Small and medium sized reactors have been defined by the IAEA in accordance with their net electrical power rating, or thermal equivalent power rating for reactors producing process heat. The role of nuclear power within global energy supply and demand, and specifically the role of small and medium sized reactors, is reviewed. Some new design prospects for dual purpose power plants, namely for heat production and desalination, are presented. Some proposals for new concepts of small size power reactors are described. Special attention is devoted to economic and energy policy issues as well as to safety and non-proliferation matters

  10. Predictors of Citation Rate in Psychology: Inconclusive Influence of Effect and Sample Size.

    Science.gov (United States)

    Hanel, Paul H P; Haase, Jennifer

    2017-01-01

    In the present article, we investigate predictors of how often a scientific article is cited. Specifically, we focus on the influence of two often neglected predictors of citation rate: effect size and sample size, using samples from two psychological topical areas. Both can be considered as indicators of the importance of an article and post hoc (or observed) statistical power, and should, especially in applied fields, predict citation rates. In Study 1, effect size did not have an influence on citation rates across a topical area, both with and without controlling for numerous variables that have been previously linked to citation rates. In contrast, sample size predicted citation rates, but only while controlling for other variables. In Study 2, sample and partly effect sizes predicted citation rates, indicating that the relations vary even between scientific topical areas. Statistically significant results had more citations in Study 2 but not in Study 1. The results indicate that the importance (or power) of scientific findings may not be as strongly related to citation rate as is generally assumed.

  11. Sample size calculation to externally validate scoring systems based on logistic regression models.

    Directory of Open Access Journals (Sweden)

    Antonio Palazón-Bru

    Full Text Available A sample size containing at least 100 events and 100 non-events has been suggested for validating a predictive model, regardless of the model being validated, even though certain factors (discrimination, parameterization and incidence) can influence calibration of the predictive model. Scoring systems based on binary logistic regression models are a specific type of predictive model. The aim of this study was to develop an algorithm to determine the sample size for validating a scoring system based on a binary logistic regression model and to apply it to a case study. The algorithm was based on bootstrap samples in which the area under the ROC curve, the observed event probabilities through smooth curves, and a measure to determine the lack of calibration (the estimated calibration index) were calculated. To illustrate its use for interested researchers, the algorithm was applied to a scoring system, based on a binary logistic regression model, to determine mortality in intensive care units. In the case study provided, the algorithm obtained a sample size with 69 events, which is lower than the value suggested in the literature. An algorithm is provided for finding the appropriate sample size to validate scoring systems based on binary logistic regression models. This could be applied to determine the sample size in other similar cases.
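
A minimal sketch of the bootstrap machinery the algorithm rests on: resample at a candidate validation-sample size and recompute the AUC each time, then inspect its stability. The estimated-calibration-index step of the paper is omitted, and all names are illustrative:

```python
import random

# Bootstrap resampling of (score, outcome) pairs at a candidate
# validation sample size n, recomputing the ROC AUC per resample.
# This only illustrates the resampling idea; the paper's full algorithm
# also tracks smoothed event probabilities and a calibration index.

def auc(scores, labels):
    """AUC via the Mann-Whitney statistic (ties counted as 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auc(scores, labels, n, n_boot=200, seed=1):
    """AUCs over n_boot bootstrap resamples of size n."""
    rng = random.Random(seed)
    aucs = []
    while len(aucs) < n_boot:
        idx = [rng.randrange(len(scores)) for _ in range(n)]
        ys = [labels[i] for i in idx]
        if 0 < sum(ys) < n:          # need both events and non-events
            aucs.append(auc([scores[i] for i in idx], ys))
    return aucs
```

The spread of the returned AUCs across candidate sizes n is what a sample-size search of this kind would monitor.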

  12. Determining the sample size for co-dominant molecular marker-assisted linkage detection for a monogenic qualitative trait by controlling the type-I and type-II errors in a segregating F2 population.

    Science.gov (United States)

    Hühn, M; Piepho, H P

    2003-03-01

    Tests for linkage are usually performed using the lod score method. A critical question in linkage analyses is the choice of sample size. The appropriate sample size depends on the desired type-I error and power of the test. This paper investigates the exact type-I error and power of the lod score method in a segregating F(2) population with co-dominant markers and a qualitative monogenic dominant-recessive trait. For illustration, a disease-resistance trait is considered, where the susceptible allele is recessive. A procedure is suggested for finding the appropriate sample size. It is shown that recessive plants have about twice the information content of dominant plants, so the former should be preferred for linkage detection. In some cases the exact alpha-values for a given nominal alpha may be rather small due to the discrete nature of the sampling distribution in small samples. We show that a gain in power is possible by using exact methods.
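
The lod statistic being tested can be illustrated for the simple two-class (recombinant vs. non-recombinant) case; the paper's exact F(2) co-dominant-marker likelihood is more involved than this sketch:

```python
import math

# Two-class illustration of the lod score:
#   LOD(theta) = log10 L(theta) - log10 L(theta = 0.5),
# with n_recomb recombinants and n_nonrecomb non-recombinants observed.
# The F2 co-dominant likelihood of the paper has more genotype classes.

def lod(n_recomb: int, n_nonrecomb: int, theta: float) -> float:
    n = n_recomb + n_nonrecomb
    return (n_recomb * math.log10(theta)
            + n_nonrecomb * math.log10(1 - theta)
            - n * math.log10(0.5))

if __name__ == "__main__":
    # 2 recombinants out of 20 evaluated at theta = 0.1
    print(round(lod(2, 18, 0.1), 3))
```

A sample-size search of the kind described above enumerates such counts, compares the lod against a critical value, and accumulates the exact type-I error and power from the multinomial probabilities of the counts.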

  13. A simple Bayesian approach to quantifying confidence level of adverse event incidence proportion in small samples.

    Science.gov (United States)

    Liu, Fang

    2016-01-01

    In both clinical development and post-marketing of a new therapy or a new treatment, incidence of an adverse event (AE) is always a concern. When sample sizes are small, large sample-based inferential approaches on an AE incidence proportion in a certain time period no longer apply. In this brief discussion, we introduce a simple Bayesian framework to quantify, in small sample studies and the rare AE case, (1) the confidence level that the incidence proportion of a particular AE p is over or below a threshold, (2) the lower or upper bounds on p with a certain level of confidence, and (3) the minimum required number of patients with an AE before we can be certain that p surpasses a specific threshold, or the maximum allowable number of patients with an AE after which we can no longer be certain that p is below a certain threshold, given a certain confidence level. The method is easy to understand and implement; the interpretation of the results is intuitive. This article also demonstrates the usefulness of simple Bayesian concepts when it comes to answering practical questions.
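
Point (1) above has a closed form under a uniform Beta(1, 1) prior, which is this sketch's assumption: the posterior for p is Beta(x + 1, n − x + 1), whose CDF for integer parameters reduces to a binomial sum:

```python
import math

# Posterior probability that the AE incidence proportion p is below a
# threshold, given x AEs among n patients and a uniform Beta(1, 1)
# prior (an assumption of this sketch). For integer parameters the
# Beta CDF equals a binomial tail sum, used directly below.

def prob_below(n: int, x: int, threshold: float) -> float:
    """Posterior P(p < threshold | x AEs in n patients), Beta(1,1) prior."""
    m = n + 1                      # a + b - 1 for Beta(x + 1, n - x + 1)
    return sum(math.comb(m, j) * threshold ** j * (1 - threshold) ** (m - j)
               for j in range(x + 1, m + 1))

if __name__ == "__main__":
    # No AEs in 10 patients: confidence that the true proportion is below 20%
    print(round(prob_below(10, 0, 0.2), 4))
```

Questions (2) and (3) of the abstract follow by inverting this function, solving for the threshold or for the AE count at a fixed confidence level.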

  14. 76 FR 63509 - Small Business Size Standards: Administrative and Support, Waste Management and Remediation Services

    Science.gov (United States)

    2011-10-12

    ... Federal small business assistance, SBA establishes small business definitions (referred to as size... operated, (2) not dominant in its field of operation, and (3) within a specific small business definition... Vol. 76 Wednesday, No. 197 October 12, 2011 Part VI Small Business Administration 13 CFR Part 121...

  15. Accurate EPR radiosensitivity calibration using small sample masses

    Science.gov (United States)

    Hayes, R. B.; Haskell, E. H.; Barrus, J. K.; Kenner, G. H.; Romanyukha, A. A.

    2000-03-01

    We demonstrate a procedure in retrospective EPR dosimetry which allows for virtually nondestructive sample evaluation in terms of sample irradiations. For this procedure to work, it is shown that corrections must be made for cavity response characteristics when using variable mass samples. Likewise, methods are employed to correct for empty tube signals, sample anisotropy and frequency drift while considering the effects of dose distribution optimization. A demonstration of the method's utility is given by comparing sample portions evaluated using both the described methodology and standard full sample additive dose techniques. The samples used in this study are tooth enamel from teeth removed during routine dental care. We show that by making all the recommended corrections, very small masses can be both accurately measured and correlated with measurements of other samples. Some issues relating to dose distribution optimization are also addressed.

  16. Accurate EPR radiosensitivity calibration using small sample masses

    International Nuclear Information System (INIS)

    Hayes, R.B.; Haskell, E.H.; Barrus, J.K.; Kenner, G.H.; Romanyukha, A.A.

    2000-01-01

    We demonstrate a procedure in retrospective EPR dosimetry which allows for virtually nondestructive sample evaluation in terms of sample irradiations. For this procedure to work, it is shown that corrections must be made for cavity response characteristics when using variable mass samples. Likewise, methods are employed to correct for empty tube signals, sample anisotropy and frequency drift while considering the effects of dose distribution optimization. A demonstration of the method's utility is given by comparing sample portions evaluated using both the described methodology and standard full sample additive dose techniques. The samples used in this study are tooth enamel from teeth removed during routine dental care. We show that by making all the recommended corrections, very small masses can be both accurately measured and correlated with measurements of other samples. Some issues relating to dose distribution optimization are also addressed

  17. Small Business Size Standards

    Data.gov (United States)

    Small Business Administration — Certain government programs, such as SBA loan programs and contracting opportunities, are reserved for small business concerns. In order to qualify, businesses must...

  18. Size selective isocyanate aerosols personal air sampling using porous plastic foams

    International Nuclear Information System (INIS)

    Cong Khanh Huynh; Trinh Vu Duc

    2009-01-01

    As part of a European project (SMT4-CT96-2137), various European institutions specialized in occupational hygiene (BGIA, HSL, IOM, INRS, IST, Ambiente e Lavoro) established a program of scientific collaboration to develop one or more prototypes of European personal samplers for the simultaneous collection of three dust fractions: inhalable, thoracic and respirable. These samplers, based on existing sampling heads (IOM, GSP and cassettes), use polyurethane foam (PUF) whose porosity serves both as the sampling substrate and as the particle size separator. In this study, the authors present an original application of size-selective personal air sampling using chemically impregnated PUF to capture and derivatize isocyanate aerosols in industrial spray-painting shops.

  19. Multislice helical CT analysis of small-sized airway wall thickness in smokers and patients with bronchial asthma

    International Nuclear Information System (INIS)

    Sekimura, Kenshi; Ito, Harumasa; Nakamura, Yutaka; Kobayashi, Hitoshi; Oikawa, Hirobumi; Inoue, Hiroshi; Ehara, Shigeru; Yamauchi, Kohei

    2010-01-01

    There is accumulating evidence that airway remodeling, which contributes to airway narrowing, plays a role in the pathogenesis of bronchial asthma (BA) and chronic obstructive pulmonary disease (COPD). Development of multislice helical CT (MSCT) with improved spatial resolution has made it possible to obtain more precise imaging of small-sized airways. Small-sized airway wall thickness was measured using MSCT to analyze the small-sized airways of smokers and BA patients, and to examine the effect of a β2 agonist on the small-sized airway wall thickness of BA patients. Thirty-six non-asthmatics who participated in the Health Check Program of Iwate Medical University and 25 patients with asthma were recruited. Amongst the 36 non-asthmatics were 20 healthy never-smokers and 15 smokers. The 25 asthmatics were recruited from the outpatient clinic at Iwate Medical University. MSCT was performed and the right B10 bronchus was chosen for dimensional analysis. Airway wall thickness was expressed as a percentage of wall area (WA%). WA% of 7 asthmatics before and 30 min after procaterol (20 μg) inhalation were compared. Small-sized airway wall thickness was significantly increased in smokers and patients with asthma compared to healthy never-smokers, when determined by MSCT. Both %V50 and %V25 had significant negative correlations with WA% among the healthy never-smokers and smokers. Procaterol inhalation reduced WA% in the small airways of patients with asthma. The increase of small-sized airway wall thickness measured by MSCT may reflect peripheral obstructive lesions in smokers and BA patients. (author)

  20. An integrated approach for multi-level sample size determination

    International Nuclear Information System (INIS)

    Lu, M.S.; Teichmann, T.; Sanborn, J.B.

    1997-01-01

    Inspection procedures involving the sampling of items in a population often require steps of increasingly sensitive measurements, with correspondingly smaller sample sizes; these are referred to as multilevel sampling schemes. In the case of nuclear safeguards inspections verifying that there has been no diversion of Special Nuclear Material (SNM), these procedures have been examined often and increasingly complex algorithms have been developed to implement them. The aim in this paper is to provide an integrated approach, and, in so doing, to describe a systematic, consistent method that proceeds logically from level to level with increasing accuracy. The authors emphasize that the methods discussed are generally consistent with those presented in the references mentioned, and yield comparable results when the error models are the same. However, because of its systematic, integrated approach the proposed method elucidates the conceptual understanding of what goes on, and, in many cases, simplifies the calculations. In nuclear safeguards inspections, an important aspect of verifying nuclear items to detect any possible diversion of nuclear fissile materials is the sampling of such items at various levels of sensitivity. The first step usually is sampling by "attributes" involving measurements of relatively low accuracy, followed by further levels of sampling involving greater accuracy. This process is discussed in some detail in the references given; also, the nomenclature is described. Here, the authors outline a coordinated step-by-step procedure for achieving such multilevel sampling, and they develop the relationships between the accuracy of measurement and the sample size required at each stage, i.e., at the various levels. The logic of the underlying procedures is carefully elucidated; the calculations involved, and their implications, are clearly described, and the process is put in a form that allows systematic generalization
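
The attribute-sampling step can be illustrated with the classic zero-acceptance safeguards formula n ≈ N(1 − β^(1/M)) for detecting at least one of M falsified items with probability 1 − β. This is a textbook approximation, not the integrated multi-level method the paper develops:

```python
import math

# Classic zero-acceptance attribute-sampling approximation used in
# safeguards planning: verify n of N items so that the probability of
# missing all M falsified items is at most beta. A textbook sketch,
# not the paper's integrated multi-level procedure.

def attribute_sample_size(N: int, M: int, beta: float) -> int:
    """Items to verify so that P(miss all M falsified items) <= beta."""
    return math.ceil(N * (1.0 - beta ** (1.0 / M)))

if __name__ == "__main__":
    # 500 items, diversion would require falsifying 20, 5% non-detection risk
    print(attribute_sample_size(500, 20, 0.05))
```

Each subsequent, more accurate measurement level repeats this kind of calculation with its own error model, which is the progression the integrated approach above systematizes.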

  1. High temperature salting of mince of small sized fish

    OpenAIRE

    Sorinmade, S.O.; Talabi, S.O.; Aliu, A.

    1982-01-01

    Freshly caught small sized fish were transported to the laboratory, then gutted and washed before mechanical separation into bone and mince. Duplicate batches of the mince were treated with seven different concentrations (wt/wt) of sodium chloride before cooking. The cooked mince was divided into two groups, pressed and unpressed. Percentage residual salt was determined in the salted cooked mince, in the free and press water, and in the salted cooked pressed mince. Also, the moisture contents of...

  2. Sales strategy in small and medium-sized food companies

    DEFF Research Database (Denmark)

    Nielsen, Renate; Stacey, Julia; Andersen, Lone Merete Schreiber

    2002-01-01

    All managing teams know that they have to deal with the company's strategy in handling the market. But how do they do it, and on the basis of which beliefs and interpretations? A nearly completed PhD project has examined the way in which three small and medium-sized food companies define their market, their competitors and customers, and how they create the world they perceive and act in through their own actions.

  3. Effects of corridors on home range sizes and interpatch movements of three small mammal species.

    Energy Technology Data Exchange (ETDEWEB)

    Mabry, Karen, E.; Barrett, Gary, W.

    2002-04-30

    Mabry, K.E., and G.W. Barrett. 2002. Effects of corridors on home range sizes and interpatch movements of three small mammal species. Landscape Ecol. 17:629-636. Corridors are predicted to benefit populations in patchy habitats by promoting movement, which should increase population densities, gene flow, and recolonization of extinct patch populations. However, few investigators have considered use of the total landscape, particularly the possibility of interpatch movement through matrix habitat, by small mammals. This study compares the home range sizes of three small mammal species (the cotton mouse, oldfield mouse, and cotton rat) between patches with and without corridors. Corridor presence did not have a statistically significant influence on average home range size. Habitat specialization and sex influenced the probability of an individual moving between two patches without corridors. The results of this study suggest that small mammals may be more capable of interpatch movement without corridors than is frequently assumed.

  4. HRM implications for network collaboration of small and medium sized companies

    DEFF Research Database (Denmark)

    Kesting, Peter; Müller, Sabine; Jørgensen, Frances

    Collaboration in innovation networks is particularly important for small and medium enterprises (SMEs) to remain competitive in a rapidly changing environment. Collaboration may however be particularly challenging for SMEs, due to their size and their inherent shortage of resources. In this paper...

  5. Local heterogeneity effects on small-sample worths

    International Nuclear Information System (INIS)

    Schaefer, R.W.

    1986-01-01

    One of the parameters usually measured in a fast reactor critical assembly is the reactivity associated with inserting a small sample of a material into the core (sample worth). Local heterogeneities introduced by the worth measurement techniques can have a significant effect on the sample worth. Unfortunately, the capability is lacking to model some of the heterogeneity effects associated with the experimental technique traditionally used at ANL (the radial tube technique). It has been suggested that these effects could account for a large portion of what remains of the longstanding central worth discrepancy. The purpose of this paper is to describe a large body of experimental data - most of which has never been reported - that shows the effect of radial tube-related local heterogeneities

  6. Computing Confidence Bounds for Power and Sample Size of the General Linear Univariate Model

    OpenAIRE

    Taylor, Douglas J.; Muller, Keith E.

    1995-01-01

    The power of a test, the probability of rejecting the null hypothesis in favor of an alternative, may be computed using estimates of one or more distributional parameters. Statisticians frequently fix mean values and calculate power or sample size using a variance estimate from an existing study. Hence computed power becomes a random variable for a fixed sample size. Likewise, the sample size necessary to achieve a fixed power varies randomly. Standard statistical practice requires reporting ...

  7. EXTERNAL FORCES DRIVING CHANGE IN THE ROMANIAN SMALL AND MEDIUM SIZED ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Roiban Roxana Nadina

    2012-07-01

    Full Text Available Change is a constant in everyday life, forcing organizations to continuously adapt their strategy, structure, processes, and culture in order to survive and stay competitive on the market. Implementing organizational change is one of the most important skills required of managers, and at the same time one of the most difficult. The forces driving change within an organization, which can be either external or internal, are those that propel a company forward towards change; in order to identify the need for change and make the proper changes, managers have to develop a tool that allows them to analyze how the environment influences their business activities. A vision for change will clarify the directions in which the organization needs to move, starting from its current state and taking into consideration the existing opportunities and threats from the environment that allow it to move to a future desired state. The purpose of this paper is to identify the concern for change in Romanian small and medium sized enterprises by presenting and explaining the past and present influences of the main external forces that have determined the need for change in the last 3-5 years, and to make recommendations about possible future changes that managers will have to perform for a better harmonization with the environment. The research method used for this study is the interview, on a sample containing some of the most relevant SMEs from the western side of Romania, from different industries. We analyzed the main external forces that had an impact on the small and medium sized enterprises and how they generated the need for organizational change, in order to see which present and future changes are required.

  8. Conceptual design of small-sized HTGR system (1). Major specifications and system designs

    International Nuclear Information System (INIS)

    Ohashi, Hirofumi; Sato, Hiroyuki; Tazawa, Yujiro; Yan, Xing L.; Tachibana, Yukio

    2011-06-01

    Japan Atomic Energy Agency (JAEA) has started a conceptual design of a 50 MWt small-sized high temperature gas cooled reactor (HTGR) for steam supply and electricity generation (HTR50S). It is a first-of-a-kind commercial plant or a demonstration plant of a small-sized HTGR system for steam supply to industry, district heating, and electricity generation by a steam turbine, to be deployed in developing countries in the 2030s. The design philosophy is that the HTR50S is a highly advanced reactor that reduces R&D risk by building on the HTTR design, upgrading performance and reducing cost for commercialization by utilizing the knowledge obtained from HTTR operation and the GTHTR300 design. The major specifications of the HTR50S were determined, and targets for technology demonstration using the HTR50S (e.g., increasing the power density, reducing the number of uranium enrichment levels in the fuel, increasing the burnup, and the side-by-side arrangement of the reactor pressure vessel and the steam generator) were identified. In addition, the system design of the HTR50S, which offers the capability of electricity generation and cogeneration of electricity and steam for district heating and industry, was performed. Furthermore, the market size of small-sized HTGR systems was investigated. (author)

  9. Estimation of sample size and testing power (Part 3).

    Science.gov (United States)

    Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo

    2011-12-01

    This article introduces the definition and sample size estimation of three special tests (namely, non-inferiority test, equivalence test and superiority test) for qualitative data with the design of one factor with two levels having a binary response variable. Non-inferiority test refers to the research design of which the objective is to verify that the efficacy of the experimental drug is not clinically inferior to that of the positive control drug. Equivalence test refers to the research design of which the objective is to verify that the experimental drug and the control drug have clinically equivalent efficacy. Superiority test refers to the research design of which the objective is to verify that the efficacy of the experimental drug is clinically superior to that of the control drug. By specific examples, this article introduces formulas of sample size estimation for the three special tests, and their SAS realization in detail.
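
For the non-inferiority case, a standard normal-approximation formula for two proportions can be sketched as follows. This is a generic textbook formula; the article's own SAS realizations may differ in detail:

```python
import math
from statistics import NormalDist

# Normal-approximation sample size per group for a non-inferiority test
# of two proportions with one-sided alpha and margin delta. A generic
# textbook sketch, not the article's exact SAS formulation.

def n_per_group(p_exp: float, p_ctrl: float, delta: float,
                alpha: float = 0.025, power: float = 0.8) -> int:
    """Patients per group for H0: p_exp - p_ctrl <= -delta."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(power)
    var = p_exp * (1 - p_exp) + p_ctrl * (1 - p_ctrl)
    distance = p_exp - p_ctrl + delta     # distance from the NI margin
    return math.ceil((z_a + z_b) ** 2 * var / distance ** 2)

if __name__ == "__main__":
    # equal true response rates of 80%, margin of 10 percentage points
    print(n_per_group(0.8, 0.8, 0.10))
```

Equivalence and superiority designs modify only the hypotheses and the distance term, which is why the three tests are treated together above.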

  10. Favouring Small and Medium Sized Enterprises with Directive 2014/24/EU

    DEFF Research Database (Denmark)

    Trybus, Martin; Andrecka, Marta

    2017-01-01

    This article argues that the four main measures introduced in the 2014 reform of the Procurement Directives to promote Small and Medium-Sized Enterprises (SMEs) cannot be classified as measures favouring SMEs. A measure favours SMEs when it compromises the main objectives of competition, non

  11. Risk assessment of small-sized HTR with pebble-bed core

    International Nuclear Information System (INIS)

    Kroeger, W.; Mertens, J.; Wolters, J.

    1987-01-01

    Two recent concepts of small-sized HTRs (HTR-Modul and HTR-100) were analysed with regard to their safety concepts and risk protection. In neither case do core cooling accidents contribute to the risk, because of the low induced core temperatures. Water ingress accidents dominate the risk in both cases by detaching deposited fission products, which can then be released into the environment. For these accident sequences no early fatalities and practically no lethal cases of cancer were computed. Both HTR concepts include adequate precautionary measures and carry a vanishingly small risk by the usual standards. The safety concepts make express use of the specific inherent safety features of pebble-bed HTRs. (orig.)

  12. Species richness in soil bacterial communities: a proposed approach to overcome sample size bias.

    Science.gov (United States)

    Youssef, Noha H; Elshahed, Mostafa S

    2008-09-01

    Estimates of species richness based on 16S rRNA gene clone libraries are increasingly utilized to gauge the level of bacterial diversity within various ecosystems. However, previous studies have indicated that regardless of the utilized approach, species richness estimates obtained are dependent on the size of the analyzed clone libraries. We here propose an approach to overcome sample size bias in species richness estimates in complex microbial communities. Parametric (Maximum likelihood-based and rarefaction curve-based) and non-parametric approaches were used to estimate species richness in a library of 13,001 near full-length 16S rRNA clones derived from soil, as well as in multiple subsets of the original library. Species richness estimates obtained increased with the increase in library size. To obtain a sample size-unbiased estimate of species richness, we calculated the theoretical clone library sizes required to encounter the estimated species richness at various clone library sizes, used curve fitting to determine the theoretical clone library size required to encounter the "true" species richness, and subsequently determined the corresponding sample size-unbiased species richness value. Using this approach, sample size-unbiased estimates of 17,230, 15,571, and 33,912 were obtained for the ML-based, rarefaction curve-based, and ACE-1 estimators, respectively, compared to bias-uncorrected values of 15,009, 11,913, and 20,909.
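
The sample-size dependence described above can be reproduced with the simple Chao1 estimator on nested subsamples of a simulated community. This is an illustration only; the paper's maximum-likelihood and rarefaction-based estimators are not implemented here, and the community model is invented:

```python
import random
from collections import Counter

# Chao1 richness estimates on nested subsamples of a simulated clone
# library, illustrating how estimates grow with library size in an
# undersampled community. The geometric abundance model is an
# assumption of this sketch, not taken from the paper.

def chao1(counts):
    """Bias-corrected Chao1: S_obs + f1*(f1-1) / (2*(f2+1))."""
    f1 = sum(1 for c in counts if c == 1)      # singletons
    f2 = sum(1 for c in counts if c == 2)      # doubletons
    return len(counts) + f1 * (f1 - 1) / (2 * (f2 + 1))

def simulate(seed=7, true_richness=2000):
    rng = random.Random(seed)
    # geometric-series abundances: many rare species, a few dominant ones
    weights = [0.996 ** i for i in range(true_richness)]
    species = rng.choices(range(true_richness), weights=weights, k=13000)
    ests = []
    for size in (1000, 5000, 13000):
        tally = Counter(species[:size])
        ests.append(chao1(list(tally.values())))
    return ests
```

Fitting a saturating curve to such estimates as a function of library size, and reading off its asymptote, is the sample-size-unbiasing step the abstract describes.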

  13. SMALL AND MEDIUM-SIZED ENTERPRISES IN REPUBLIC OF MOLDOVA

    Directory of Open Access Journals (Sweden)

    Igor G. SÎRODOEV

    2009-12-01

    Full Text Available The paper examines the role of the SME sector in Moldova's economy and its likely implications for R&D processes. Several key features of the relationship between the knowledge economy, R&D and SMEs, and their effect on regional and local communities, are emphasized. Then, a summary of the particular features of R&D in Moldova is presented. The role of the SME sector is analyzed in terms of its performance by ownership and activity type, as a whole and by main firm category (micro-, small and medium-sized). Small enterprises have generally been considered the most adequate solution for promoting R&D. The national-level analysis is complemented by a regional insight into the sector as a whole. Finally, some critical aspects of SMEs' role in R&D in Moldova are discussed.

  14. [Formal sample size calculation and its limited validity in animal studies of medical basic research].

    Science.gov (United States)

    Mayer, B; Muche, R

    2013-01-01

    Animal studies are highly relevant for basic medical research, although their use is controversially discussed in public. Thus, an optimal sample size for these projects should be aimed at from a biometrical point of view. Statistical sample size calculation is usually the appropriate methodology in planning medical research projects. However, the required information is often not valid or becomes available only during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.
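
    A formal sample size calculation of the kind discussed here typically rests on a standard normal-approximation formula for comparing two group means. The sketch below is a minimal illustration of that textbook formula, not the article's own procedure, and its inputs (effect size delta, standard deviation sigma) are exactly the quantities the authors note are often not validly known in advance.

```python
import math
from statistics import NormalDist

def two_group_sample_size(delta, sigma, alpha=0.05, power=0.80):
    """Per-group n for a two-sided, two-group comparison of means
    (normal approximation): n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2."""
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha / 2) + z(power)) * sigma / delta) ** 2
    return math.ceil(n)

# Detecting a one-standard-deviation effect (delta = sigma):
n = two_group_sample_size(delta=1.0, sigma=1.0)  # 16 animals per group
```

A t-distribution correction would add roughly one or two animals per group; either way, the result is only as good as the assumed delta and sigma.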

  15. More Differences or More Similarities Regarding Education in Big, Middle-sized and Small Companies

    Directory of Open Access Journals (Sweden)

    Marjana Merkač

    2001-12-01

    Full Text Available The article presents the results of research into the education and qualification of employees in small, middle-sized and big Slovenian companies. The research shows some differences regarding the attitude to the development of employees as part of a company's business strategy, some obstacles to developing their abilities, and connections between job satisfaction and motivation for learning. It also shows how much it matters for education and qualification whether an individual works for a big, middle-sized or small company.

  16. How big is big and how small is small the sizes of everything and why

    CERN Document Server

    Smith, Timothy Paul

    2013-01-01

    This book is about how big is the universe and how small are quarks, and what are the sizes of dozens of things between these two extremes. It describes the sizes of atoms and planets, quarks and galaxies, cells and sequoias. It is a romp through forty-five orders of magnitude from the smallest sub-nuclear particles we have measured, to the edge of the observed universe. It also looks at time, from the epic age of the cosmos to the fleeting lifetimes of ethereal particles. It is a narrative that trips its way from stellar magnitudes to the clocks on GPS satellites, from the nearly logarithmic scales of a piano keyboard through a system of numbers invented by Archimedes and on to the measurement of the size of an atom. Why do some things happen at certain scales? Why are cells a hundred thousandths of a meter across? Why are stars never smaller than about 100 million meters in diameter? Why are trees limited to about 120 meters in height? Why are planets spherical, but asteroids not? Often the size of an objec...

  17. Utilizing Content Marketing in Small and Medium-Sized Organizations

    OpenAIRE

    Parviainen, Ville

    2014-01-01

    The major objective of this study is to find out how and to what extent online content is currently utilized for marketing purposes among small and medium-sized organizations in Finland. Additionally, positive and negative future prospects concerning this type of content marketing were explored. The study is mainly qualitative by nature. The empirical part of this thesis was carried out between July 2013 and March 2014 and it consists of five semi-structured interviews with five professio...

  18. Design features to achieve defence-in-depth in small and medium sized reactors

    International Nuclear Information System (INIS)

    Kuznetsov, Vladimir

    2009-01-01

    Broader incorporation of inherent and passive safety design features has become a 'trademark' of many advanced reactor concepts, including several evolutionary designs and nearly all innovative small and medium sized design concepts. Ensuring adequate defence-in-depth is important for reactors of smaller output because many of them are being designed to allow more proximity to the user, specifically, when non-electrical energy products are targeted. Based on the activities recently performed by the International Atomic Energy Agency, the paper provides a summary description of the design features used to achieve defence in depth in the eleven representative concepts of small and medium sized reactors. (author)

  19. Method to make accurate concentration and isotopic measurements for small gas samples

    Science.gov (United States)

    Palmer, M. R.; Wahl, E.; Cunningham, K. L.

    2013-12-01

    Carbon isotopic ratio measurements of CO2 and CH4 provide valuable insight into carbon cycle processes. However, many of these studies, like soil gas, soil flux, and water head space experiments, provide very small gas sample volumes, too small for direct measurement by current constant-flow Cavity Ring-Down (CRDS) isotopic analyzers. Previously, we addressed this issue by developing a sample introduction module which enabled the isotopic ratio measurement of 40ml samples or smaller. However, the system, called the Small Sample Isotope Module (SSIM), does dilute the sample during delivery with inert carrier gas, which causes a ~5% reduction in concentration. The isotopic ratio measurements are not affected by this small dilution, but researchers are naturally interested in accurate concentration measurements. We present the accuracy and precision of a new method of using this delivery module which we call 'double injection.' Two portions of the 40ml sample (20ml each) are introduced to the analyzer; the first injection flushes out the diluting gas and the second injection is measured. The accuracy of this new method is demonstrated by comparing the concentration and isotopic ratio measurements for a gas sampled directly and that same gas measured through the SSIM. The data show that the CO2 concentration measurements were the same within instrument precision. The isotopic ratio precision (1σ) of repeated measurements was 0.16 permil for CO2 and 1.15 permil for CH4 at ambient concentrations. This new method provides a significant enhancement in the information provided by small samples.

  20. Improving small RNA-seq by using a synthetic spike-in set for size-range quality control together with a set for data normalization.

    Science.gov (United States)

    Locati, Mauro D; Terpstra, Inez; de Leeuw, Wim C; Kuzak, Mateusz; Rauwerda, Han; Ensink, Wim A; van Leeuwen, Selina; Nehrdich, Ulrike; Spaink, Herman P; Jonker, Martijs J; Breit, Timo M; Dekker, Rob J

    2015-08-18

    There is an increasing interest in complementing RNA-seq experiments with small-RNA (sRNA) expression data to obtain a comprehensive view of a transcriptome. Currently, two main experimental challenges concerning sRNA-seq exist: how to check the size distribution of isolated sRNAs, given the sensitive size-selection steps in the protocol; and how to normalize data between samples, given the low complexity of sRNA types. We here present two separate sets of synthetic RNA spike-ins for monitoring size-selection and for performing data normalization in sRNA-seq. The size-range quality control (SRQC) spike-in set, consisting of 11 oligoribonucleotides (10-70 nucleotides), was tested by intentionally altering the size-selection protocol and verified via several comparative experiments. We demonstrate that the SRQC set is useful to reproducibly track down biases in the size-selection in sRNA-seq. The external reference for data-normalization (ERDN) spike-in set, consisting of 19 oligoribonucleotides, was developed for sample-to-sample normalization in differential-expression analysis of sRNA-seq data. Testing and applying the ERDN set showed that it can reproducibly detect differential expression over a dynamic range of 2^18. Hence, biological variation in sRNA composition and content between samples is preserved while technical variation is effectively minimized. Together, both spike-in sets can significantly improve the technical reproducibility of sRNA-seq. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
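
    The core of spike-in-based normalization, scaling each sample by its spike-in counts rather than by total library size, can be sketched roughly as below. The function names and toy counts are invented for illustration; the actual ERDN workflow involves 19 specific oligoribonucleotides and a full differential-expression pipeline.

```python
import math

def spike_in_size_factors(spike_counts):
    """spike_counts: {sample: [counts of each spike-in oligo]}.
    Returns per-sample scaling factors relative to the geometric mean of
    total spike-in counts, so that normalised = raw / factor."""
    totals = {s: sum(c) for s, c in spike_counts.items()}
    log_mean = sum(math.log(t) for t in totals.values()) / len(totals)
    ref = math.exp(log_mean)
    return {s: t / ref for s, t in totals.items()}

def normalise(raw_counts, factors):
    """Divide each sample's raw sRNA counts by its spike-in size factor."""
    return {s: [c / factors[s] for c in counts]
            for s, counts in raw_counts.items()}

# Toy example: sample B was sequenced twice as deeply as sample A.
factors = spike_in_size_factors({"A": [50, 50], "B": [200, 200]})  # A: 0.5, B: 2.0
norm = normalise({"A": [10, 6], "B": [40, 24]}, factors)  # both become [20.0, 12.0]
```

Because the scaling reference is external to the biological sRNA content, a genuine global shift in sRNA abundance between samples is preserved rather than normalized away.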

  1. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)

  2. Continuing Education for Managers from Small and Medium Sized German Companies.

    Science.gov (United States)

    Fub, Jorg

    1995-01-01

    An international trade school in southern Germany, which is a highly export-oriented environment, has established a vocational and professional continuing education program for personnel of small- and medium-size companies. Offerings include a graduate course in international marketing, seminars for export companies, distance education in…

  3. Accelerating Innovation in Small and Medium-Sized Enterprises in the ICT Services Sector

    Directory of Open Access Journals (Sweden)

    Juan Vicente García Manjón

    2016-09-01

    Full Text Available This work aims to establish the key organizational strategies that push small and medium-sized enterprises (SMEs) in the information and communication technologies (ICT) services sector to be more innovative. Following an input-output approach, a theoretical model is built that helps determine such key factors. The model is then validated with fieldwork carried out on a representative sample in the Castile-Leon region (Spain). According to this model, the decision to innovate (and the trigger for research and development [R&D] activities) in SMEs in the ICT services sector is the result of needs presented by customers (public or private sector) in tailored projects. The factors leading to innovation are, in order of importance, the technological level of the company, the effort made by the company in R&D, and finally, systems of incentives.

  4. On the surveillance for animal diseases in small herds

    DEFF Research Database (Denmark)

    Greiner, Matthias; Dekker, Aldo

    2005-01-01

    Small herds may present a problem in surveillance for infectious animal diseases because typical levels of a within-herd design prevalence are not directly applicable. We suggest defining small herds as those smaller than 2/(within-herd design prevalence), on the basis that such herds would be expected to have fewer than two (i.e. only one) infected animals. Consequently, the probability of detecting disease in small herds cannot be improved by choosing a larger sample size within the herd. We derive the necessary sample sizes of herds and the probability ("confidence") of detecting disease within a stratum … conservative (lower) estimates of the confidence for a given sample size and should therefore be preferred….
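
    The small-herd threshold and the detection reasoning can be illustrated with the standard hypergeometric detection probability. This is a sketch of the definition quoted above, not the paper's stratum-level derivation; the herd sizes and prevalence values are hypothetical.

```python
from math import comb

def detection_probability(herd_size, n_infected, sample_size):
    """P(a sample of sample_size animals, drawn without replacement, contains
    at least one of n_infected infected animals): 1 - C(N-d, n)/C(N, n)."""
    n = min(sample_size, herd_size)
    return 1 - comb(herd_size - n_infected, n) / comb(herd_size, n)

def is_small_herd(herd_size, design_prevalence):
    """Definition suggested in the abstract: a herd smaller than
    2/(design prevalence) is expected to hold only one infected animal."""
    return herd_size < 2 / design_prevalence

# At a 10% design prevalence, a herd of 15 is 'small' (15 < 20) and is
# expected to hold at most one infected animal:
assert is_small_herd(15, 0.10)
p_all = detection_probability(15, 1, 15)   # sampling every animal: 1.0
p_ten = detection_probability(15, 1, 10)   # sampling 10 of 15: 2/3
```

With a single infected animal, detection probability simply equals the sampled fraction of the herd, which is why enlarging the within-herd sample cannot buy confidence beyond testing the whole herd.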

  5. Specifics of Management in Small and Medium-Size Enterprises in Serbia

    Directory of Open Access Journals (Sweden)

    Marija Lazarević-Moravčević

    2014-12-01

    Full Text Available Under modern business conditions, an enterprise, regardless of its size or activity, must be systematically directed and managed. Small and medium-sized enterprises have characteristics that make them considerably different from large systems, so it is not realistic to expect the management process in such organizations to develop in the same way as in large systems. A small business often does not mean a small investment, and the management of small enterprises constantly faces a "poverty of resources", which leads to the conclusion that the success of SMEs is predominantly determined by the managerial skills of their managers/owners. If the owners/managers of SMEs adequately perceive the capabilities of their enterprise, make the right decisions, find effective organizational solutions and apply modern approaches to control, the success of the company will follow. In this paper, theoretical and empirical research was carried out to identify the basic characteristics of successful management of SMEs in Serbia. The results indicate that the managerial capacity of the managers/owners of SMEs is the main strength of the company and one of the key sources of growth and development. When the influence of external factors is very unfavorable for business development, management as an internal resource of the organization gains importance in creating business success and competitive advantage.

  6. Development of design program for small-sized gas absorption chiller/heaters

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J.I.; Kwon, O.K.; Moon, C.K. [Pukyong National University, Pusan (Korea); Yang, Y.M.; Kim, H.Y. [R and D Center, Korea Gas Corporation, Ansan (Korea)

    1999-10-01

    Basic data were analysed for the development of a small, water-cooled household absorption chiller/heater using a non-CFC refrigerant. A simulation program for air-cooling performance was developed for systems with 1.5-10 RT of cooling capacity, and cycle analysis and numerical simulation were performed. We developed a performance-analysis simulation program to support the basic design of 1.5-10 RT gas-driven double-effect absorption chiller/heaters. The system working conditions and operating limits were decided from analysis of existing data and in consultation with KOGAS. After the basic input variables and operating conditions were established for the heat-cycle analysis, the simulation algorithm was set up and the performance simulation program was coded according to that algorithm. The basic design of the optimum system was completed through a parametric study using the developed simulation program and by establishing the design-variable ranges of the target model. 20 refs., 30 figs., 9 tabs.

  7. THE POSSIBILITY OF USING INTERNATIONAL EXPERIENCE IN MICRO-CREDIT FOR SMALL AND MEDIUM-SIZED INDUSTRIAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    I. N. Klyukin

    2016-01-01

    Full Text Available Purpose of the study. Increasing the availability of funding for small and medium-sized enterprises in the industrial sector is among the most important tasks of economic development in both the leading developed countries and developing countries. Accordingly, the purpose of this article is to study micro-credit as an efficient mechanism to stimulate the development of small and medium-sized industrial enterprises, and to analyze the possibility of using foreign experience to improve the micro-credit process. Research methodology. The study was conducted on published material covering various aspects of micro-credit for small and medium businesses, including international experience in micro-credit for small and medium-sized industrial enterprises. The article analyzes the functioning of various micro-credit models and technologies, describes the interoperability of commercial banks and microfinance institutions (MFIs) within country-specific micro-credit models, and formulates the immediate tasks and activities of the government and regulatory authorities of the Russian Federation aimed at improving the financing of small and medium-sized industrial enterprises. The findings suggest that funds to support small and medium-sized industrial enterprises should more actively attract private investment into the implementation of industrial and innovative development projects. In this case, the intensification of financial-credit and investment support to small and medium-sized industrial enterprises, together with the integration and optimization of different sources of financial resources, creates favorable conditions for their access to financial and credit resources, while improved financial and credit support mechanisms will enhance their responsibility for the use of granted resources and contribute to their development. At the same time, the main focus of the use of public

  8. Born small, die young: Intrinsic, size-selective mortality in marine larval fish

    KAUST Repository

    Garrido, S.

    2015-11-24

    Mortality during the early stages is a major cause of the natural variations in the size and recruitment strength of marine fish populations. In this study, the relation between size-at-hatch and early survival was assessed using laboratory experiments and field-caught larvae of the European sardine (Sardina pilchardus). Larval size-at-hatch was not related to the egg size but was significantly, positively related to the diameter of the otolith-at-hatch. Otolith diameter-at-hatch was also significantly correlated with survival-at-age in fed and unfed larvae in the laboratory. For sardine larvae collected in the Bay of Biscay during the spring of 2008, otolith radius-at-hatch was also significantly related to viability. Larval mortality has frequently been related to adverse environmental conditions and intrinsic factors affecting feeding ability and vulnerability to predators. Our study offers evidence indicating that a significant portion of fish mortality occurs during the endogenous (yolk) and mixed (yolk/prey) feeding period in the absence of predators, revealing that marine fish with high fecundity, such as small pelagics, can spawn a relatively large number of eggs resulting in small larvae with no chance to survive. Our findings help to better understand the mass mortalities occurring at early stages of marine fish.

  10. Shear-wave elastography contributes to accurate tumour size estimation when assessing small breast cancers

    International Nuclear Information System (INIS)

    Mullen, R.; Thompson, J.M.; Moussa, O.; Vinnicombe, S.; Evans, A.

    2014-01-01

    Aim: To assess whether the size of peritumoural stiffness (PTS) on shear-wave elastography (SWE) for small primary breast cancers (≤15 mm) was associated with size discrepancies between grey-scale ultrasound (GSUS) and final histological size, and whether the addition of PTS size to GSUS size might result in more accurate tumour size estimation when compared to final histological size. Materials and methods: A retrospective analysis of 86 consecutive patients between August 2011 and February 2013 who underwent breast-conserving surgery for tumours of size ≤15 mm at ultrasound was carried out. The size of PTS stiffness was compared to mean GSUS size, mean histological size, and the extent of size discrepancy between GSUS and histology. PTS size and GSUS were combined and compared to the final histological size. Results: PTS of >3 mm was associated with a larger mean final histological size (16 versus 11.3 mm, p < 0.001). PTS size of >3 mm was associated with a higher frequency of underestimation of final histological size by GSUS of >5 mm (63% versus 18%, p < 0.001). The combination of PTS and GSUS size led to accurate estimation of the final histological size (p = 0.03). The size of PTS was not associated with margin involvement (p = 0.27). Conclusion: PTS extending beyond 3 mm from the grey-scale abnormality is significantly associated with underestimation of tumour size of >5 mm for small invasive breast cancers. Taking into account the size of PTS also led to accurate estimation of the final histological size. Further studies are required to assess the relationship of the extent of SWE stiffness and margin status. - Highlights: • Peritumoural stiffness of greater than 3 mm was associated with larger tumour size. • Underestimation of tumour size by ultrasound was associated with peritumoural stiffness size. • Combining peritumoural stiffness size with ultrasound produced accurate tumour size estimation.

  11. Penetration of small and medium sized food companies on foreign markets

    Directory of Open Access Journals (Sweden)

    Ladislav Mura

    2010-01-01

    Full Text Available Throughout the world, national economies are becoming interconnected and economic processes internationalized. In a turbulently changing business environment, only those companies that understand the current trends in the global economy may survive, develop and prosper. The issue of internationalization is therefore becoming more important for most companies, and the process of internationalization is the way to withstand an increasingly competitive environment. This contribution examines the internationalization of Slovak food-industry enterprises: the ways they internationalize their business, the percentage share of foreign-trade activities in overall business activity, the companies' own evaluation of business internationalization, and the barriers to foreign-market penetration that the surveyed companies have to deal with. Small and medium-sized enterprises (SMEs) are the engine of the Slovak economy, a generator of development, innovation and growth, and they employ more than 60% of employees. Almost all businesses operating in the agri-food complex have the character of small and medium enterprises. The results of the research show that small and medium-sized companies are successful in operating on foreign markets in the surrounding European countries.

  12. Modeling, design and experimental validation of a small-sized magnetic gear

    NARCIS (Netherlands)

    Zanis, R.; Borisavljevic, A.; Jansen, J.W.; Lomonova, E.A.

    2013-01-01

    A magnetostatic analytical model is created to analyze and design a small-sized magnetic gear for a robotic application. Through a parameter variation study, it is found that the inner-rotor magnet height strongly influences the torque, and the design is performed on that basis. Several

  13. Understanding the cluster randomised crossover design: a graphical illustration of the components of variation and a sample size tutorial.

    Science.gov (United States)

    Arnup, Sarah J; McKenzie, Joanne E; Hemming, Karla; Pilcher, David; Forbes, Andrew B

    2017-08-15

    In a cluster randomised crossover (CRXO) design, a sequence of interventions is assigned to a group, or 'cluster', of individuals. Each cluster receives each intervention in a separate period of time, forming 'cluster-periods'. Sample size calculations for CRXO trials need to account for both the cluster randomisation and crossover aspects of the design. Formulae are available for the two-period, two-intervention, cross-sectional CRXO design; however, implementation of these formulae is known to be suboptimal. The aims of this tutorial are to illustrate the intuition behind the design and to provide guidance on performing sample size calculations. Graphical illustrations are used to describe the effect of the cluster randomisation and crossover aspects of the design on the correlation between individual responses in a CRXO trial. Sample size calculations for binary and continuous outcomes are illustrated using parameters estimated from the Australia and New Zealand Intensive Care Society - Adult Patient Database (ANZICS-APD) for patient mortality and length of stay (LOS). The similarity between individual responses in a CRXO trial can be understood in terms of three components of variation: variation in the cluster mean response; variation in the cluster-period mean response; and variation between individual responses within a cluster-period; or equivalently in terms of the correlation between individual responses in the same cluster-period (within-cluster within-period correlation, WPC), and between individual responses in the same cluster but in different periods (within-cluster between-period correlation, BPC). The BPC lies between zero and the WPC. When the WPC and BPC are equal, the precision gained by the crossover aspect of the CRXO design equals the precision lost by cluster randomisation. When the BPC is zero, there is no advantage in a CRXO over a parallel-group cluster randomised trial.
Sample size calculations illustrate that small changes in the specification of
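
    The WPC/BPC trade-off described above can be sketched with a design effect often quoted in the cluster-trial literature for the two-period, two-intervention, cross-sectional CRXO design. Treat the exact expression, and the m, WPC and BPC values below, as illustrative assumptions; for real calculations consult the tutorial's own formulae.

```python
def crxo_design_effect(m, wpc, bpc):
    """Design effect (variance inflation relative to an individually
    randomised parallel trial) for a two-period, two-intervention,
    cross-sectional CRXO design with m individuals per cluster-period:
    DE = 1 + (m - 1)*WPC - m*BPC."""
    return 1 + (m - 1) * wpc - m * bpc

# The two limiting cases described in the abstract, with hypothetical m = 50:
de_no_bpc = crxo_design_effect(50, 0.05, 0.00)  # 3.45: same as a parallel cluster trial
de_equal  = crxo_design_effect(50, 0.05, 0.05)  # 0.95: crossover gain offsets clustering
```

With BPC = 0 the expression collapses to the familiar parallel-CRT design effect 1 + (m-1)·WPC, and with BPC = WPC it drops to 1 - WPC, matching the abstract's two boundary statements.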

  14. On sample size and different interpretations of snow stability datasets

    Science.gov (United States)

    Schirmer, M.; Mitterer, C.; Schweizer, J.

    2009-04-01

    Interpretations of snow stability variations need an assessment of the stability itself, independent of the scale investigated in the study. Studies on stability variations at a regional scale have often chosen stability tests such as the Rutschblock test, or combinations of various tests, in order to detect differences in aspect and elevation. The question arose of how capable such stability interpretations are in drawing conclusions. There are at least three possible error sources: (i) the variance of the stability test itself; (ii) the stability variance at an underlying slope scale; and (iii) that the stability interpretation might not be directly related to the probability of skier triggering. Various stability interpretations have been proposed in the past that provide partly different results. We compared a subjective one based on expert knowledge with a more objective one based on a measure derived from comparing skier-triggered slopes vs. slopes that have been skied but not triggered. In this study, the uncertainties are discussed and their effects on regional-scale stability variations are quantified in a pragmatic way. An existing dataset with very large sample sizes was revisited. This dataset contained the variance of stability at a regional scale for several situations. The stability in this dataset was determined using the subjective interpretation scheme based on expert knowledge. The question to be answered was how many measurements are needed to obtain results (mainly stability differences in aspect or elevation) similar to those from the complete dataset. The optimal sample size was obtained in several ways: (i) assuming a nominal data scale, the sample size was determined for a given test, significance level and power, using the mean and standard deviation of the complete dataset; this method can also determine whether the complete dataset constitutes an adequate sample size. (ii) Smaller subsets were created with similar
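
    The subset approach (ii) can be sketched as a resampling check: repeatedly draw subsets of a given size from each group (e.g. aspect class) and record how often a test still detects the difference seen in the full dataset. Everything here, the two-proportion z-test, the 0/1 stability coding, and the example numbers, is an assumed illustration rather than the study's actual procedure.

```python
import random
from statistics import NormalDist

def reproduction_rate(group_a, group_b, subset_size, trials=1000,
                      alpha=0.05, seed=1):
    """Fraction of random subsets (subset_size per group) in which a
    two-proportion z-test still flags a difference between the groups.
    Data are 0/1 ratings, e.g. 1 = 'unstable'."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = rng.sample(group_a, subset_size)
        b = rng.sample(group_b, subset_size)
        p1, p2 = sum(a) / subset_size, sum(b) / subset_size
        p = (sum(a) + sum(b)) / (2 * subset_size)        # pooled proportion
        se = (2 * p * (1 - p) / subset_size) ** 0.5
        if se > 0 and abs(p1 - p2) / se > z_crit:
            hits += 1
    return hits / trials

# Hypothetical ratings: 40% 'unstable' on north aspects vs 15% on south.
north = [1] * 40 + [0] * 60
south = [1] * 15 + [0] * 85
rate = reproduction_rate(north, south, subset_size=30)
```

Plotting this rate against subset_size gives a pragmatic answer to "how many measurements are enough": the smallest subset size whose rate stays close to one.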

  15. The PowerAtlas: a power and sample size atlas for microarray experimental design and research

    Directory of Open Access Journals (Sweden)

    Wang Jelai

    2006-02-01

    Full Text Available Abstract Background Microarrays permit biologists to simultaneously measure the mRNA abundance of thousands of genes. An important issue facing investigators planning microarray experiments is how to estimate the sample size required for good statistical power. What is the projected sample size or number of replicate chips needed to address the multiple hypotheses with acceptable accuracy? Statistical methods exist for calculating power based upon a single hypothesis, using estimates of the variability in data from pilot studies. There is, however, a need for methods to estimate power and/or required sample sizes in situations where multiple hypotheses are being tested, such as in microarray experiments. In addition, investigators frequently do not have pilot data to estimate the sample sizes required for microarray studies. Results To address this challenge, we have developed the Microarray PowerAtlas. The atlas enables estimation of statistical power by allowing investigators to appropriately plan studies by building upon previous studies that have similar experimental characteristics. Currently, there are sample sizes and power estimates based on 632 experiments from Gene Expression Omnibus (GEO). The PowerAtlas also permits investigators to upload their own pilot data and derive power and sample size estimates from these data. This resource will be updated regularly with new datasets from GEO and other databases such as The Nottingham Arabidopsis Stock Center (NASC). Conclusion This resource provides a valuable tool for investigators who are planning efficient microarray studies and estimating required sample sizes.

  16. Requirement analysis to promote small-sized E-waste collection from consumers.

    Science.gov (United States)

    Mishima, Kuniko; Nishimura, Hidekazu

    2016-02-01

    The collection and recycling of small-sized waste electrical and electronic equipment is an emerging problem, since these products contain certain amounts of critical metals and rare earths. Even if the amounts are not large, having a few supply routes for such recycled resources could be a good strategy in a world of finite resources. Small-sized e-waste often contains personal information, so consumers are reluctant to put it into recycling bins. To promote the recycling of e-waste, the collection of used products from consumers becomes important, and effective methods involving incentives for consumers may be necessary; without such methods, it will be difficult to collect the critical amounts necessary for an efficient recycling system. This article focused on used mobile phones as a first case study, since they contain relatively large amounts of valuable metals compared with other small-sized waste electrical and electronic equipment, and a large number of them exist in the market. Surveys were carried out to determine what kinds of recycled-material collection services are preferred by consumers. The results clarify that incentive or reward money alone is not a driving force for recycling behaviour. The article discusses the types of effective services required to promote recycling behaviour and concludes that securing personal information, transferring data and providing proper information about resources and the environment can be effective tools to encourage recycling, along with potential discounts on new products associated with the return of used mobile phones. © The Author(s) 2015.

  17. Innovative potential of small and medium-sized enterprises in the Antofagasta-Chile region: an exploratory study

    Directory of Open Access Journals (Sweden)

    Gianni A. Romani Chocce

    2006-10-01

    Full Text Available The aim of this research is to analyze the innovative potential of Small and Medium Enterprises (SMEs) in the Region of Antofagasta, Chile. The paper makes a detailed bibliographic review of regional innovation processes and presents the results of a questionnaire applied to SMEs considered innovative according to the Organisation for Economic Co-operation and Development (OECD) criteria. Although these results have to be considered carefully due to the size of the sample, it is possible to observe that there is an increasing awareness of the importance of innovation in the region while, at the same time, many obstacles to innovation are recognized.

  18. Experimental investigation of a small-sized betatron with superposed magnetization

    International Nuclear Information System (INIS)

    Kas'yanov, V.A.; Rychkov, M.V.; Filimonov, A.A.; Furman, Eh.G.; Chakhlov, V.L.; Chertov, A.S.; Shtejn, M.M.

    2001-01-01

    The aim of the paper is to study the possibilities of small-sized betatrons (SSB) with direct current superposed magnetization (DSM). It is shown that DSM makes it possible to decrease the SSB weight and the cost of the electromagnet and capacitor storage, and to shape a prolonged beam dump. It is noted that DSM is most expedient in SSBs operating in a short-time mode

  19. Branding and outcomes in small and medium-sized enterprises (SMEs)

    DEFF Research Database (Denmark)

    Odoom, Raphael; Narteh, Bedman; Rand, John

    2017-01-01

    The study investigates the relationships of enterprise resources and branding capabilities with branding efforts and branding benefits. It examines the differential effect of physical resources and branding capabilities on enterprises' branding efforts and outcomes. Empirical data for the study were drawn from 304 small and medium-sized enterprises (SMEs) in Ghana. The hypothesized relationships were analyzed using Structural Equation Modeling. The study found that possession of resources and capabilities might not be enough to produce the optimum branding benefits for enterprises. A better result, however, emerges when these resources and capabilities are integrated with well-coordinated branding efforts of the enterprises. The study offers several implications for managers of small businesses based on these findings.

  20. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of information essential for replication of the calculation as well as the accuracy of the sample size calculation. We examined the current quality of reporting of the sample size calculation in randomized controlled trials (RCTs) published in PubMed and the variation in reporting across study design, study characteristics, and journal impact factor. We also reviewed the targeted sample size reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors for the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most of the papers reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and calculated sample size was 0.0% (IQR -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries, and about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) had no discrepancy with the number reported in the trial registries. The reporting of the sample size calculation in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
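    The three ingredients the review checks for — significance level, power, and minimum clinically important effect — are exactly what a reader needs to replicate a trial's sample size. A minimal sketch of the standard normal-approximation formula for a two-arm trial with a continuous outcome; the sigma and delta values are illustrative, not taken from any reviewed trial.

    ```python
    from math import ceil
    from statistics import NormalDist

    def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
        """Normal-approximation sample size per arm for a two-sided test of a
        mean difference delta, with common standard deviation sigma."""
        nd = NormalDist()
        z_alpha = nd.inv_cdf(1 - alpha / 2)   # critical value for the test
        z_beta = nd.inv_cdf(power)            # quantile for the desired power
        return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

    # A reviewer can replicate a reported calculation, e.g. a half-SD effect:
    print(n_per_arm(delta=0.5, sigma=1.0))
    ```

    Comparing this recomputed n with the number reported in the paper and in the registry is, in essence, the discrepancy check the review performed.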

  1. Meaningfulness for creation of growth in Small- and Medium-sized enterprises

    DEFF Research Database (Denmark)

    Brink, Tove; Neville, Mette

    2016-01-01

    The research in this paper reveals how meaningfulness of growth can be created in Small- and Medium-sized Enterprises (SMEs). The research was conducted over a four-year period with 24 SMEs participating from different industry branches. The research is now in the late part of the 3rd year, starting...

  2. Modern methods of evaluation existing suppliers and suppliers selected by customer for small and medium-sized companies

    Directory of Open Access Journals (Sweden)

    Marie Jurová

    2010-01-01

    Full Text Available Under existing conditions on the global market (almost identical quality, almost identical purchasing values), companies have to define the total rating of importance of evaluative criteria. In the post-crisis period the problem of supplier evaluation is one of the biggest, because companies have had to use all resources and all possibilities to develop their own business. Many authors have written about collaborative planning and warehousing as one of the possibilities for optimizing work inside the supply chain. This paper defines small and medium-sized enterprises (SMEs) as enterprises with a maximum of 250 employees. In the literature reviewed, we could not find information about a dedicated system of supplier evaluation for small and medium-sized enterprises. SMEs can engage in different types of business and in this case need a universal system of evaluation. The research of this paper is oriented toward small and medium-sized enterprises with different types of business. A new theoretical universal method of calculation for evaluating existing suppliers in small and medium-sized enterprises is presented in this paper. This theoretical method is based on average values. The method includes traditional evaluative criteria (quality, delivery time) and others (mobility of the supplier, possibilities of a new level of partnership). This method of evaluation cannot be used for continuous manufacturing. The new method can improve the overall evaluation of suppliers in small and medium-sized enterprises.

  3. A deployable mechanism concept for the collection of small-to-medium-size space debris

    Science.gov (United States)

    St-Onge, David; Sharf, Inna; Sagnières, Luc; Gosselin, Clément

    2018-03-01

    Current efforts in active debris removal strategies and mission planning focus on removing the largest, most massive debris. It can be argued, however, that small untrackable debris, specifically those smaller than 5 cm in size, also pose a serious threat. In this work, we propose and analyze a mission to sweep the most crowded Low Earth Orbit with a large cupola device to remove small-to-medium-size debris. The cupola consists of a deployable mechanism expanding to more than 25 times its storage size to extend a membrane covering its surface. The membrane is sufficiently stiff to capture most small debris and to slow down the medium-size objects, thus accelerating their fall. An overview of the design of a belt-driven rigid-link mechanism proposed to support the collecting cupola surface is presented, based on our previous work. Because of its large size, the cupola will be subject to significant aerodynamic drag; thus, orbit maintenance analysis is carried out using the DTM-2013 atmospheric density model and it predicts feasible requirements. While in operation, the device will also be subject to numerous hyper-velocity impacts which may significantly perturb its orientation from the desired attitude for debris collection. Thus, another important feature of the proposed debris removal device is a distributed array of flywheels mounted on the cupola for reorienting and stabilizing its attitude during the mission. Analysis using a stochastic modeling framework for hyper-velocity impacts demonstrates that three-axis attitude stabilization is achievable with the flywheel array. MASTER-2009 software is employed to provide relevant data for all debris related estimates, including the debris fluxes for the baseline mission design and for assessment of its expected performance. Space debris removal is a high priority for ensuring sustainability of space and continual launch and operation of man-made space assets. This manuscript presents the first analysis of a small

  4. Incremental change in cross sectional area in small endotracheal tubes: A call for more size options.

    Science.gov (United States)

    Mortelliti, Caroline L; Mortelliti, Anthony J

    2016-08-01

    To elucidate the relatively large incremental percent change (IPC) in cross sectional area (CSA) in currently available small endotracheal tubes (ETTs), and to recommend smaller incremental changes in CSA in these smaller ETTs, in order to minimize iatrogenic airway injury. The CSAs of a commercially available line of ETTs were calculated, and the IPC of the CSA between consecutive-size ETTs was calculated and graphed. The average IPC in CSA of the large ETTs was applied to calculate an identical IPC in the CSA for a theoretical, smaller ETT series, and the dimensions of a new theoretical series of proposed small ETTs were defined. The IPC of CSA in the larger (5.0-8.0 mm inner diameter (ID)) ETTs was 17.07%, and the IPC of CSA in the smaller ETTs (2.0-4.0 mm ID) was remarkably larger (38.08%). Applying the relatively smaller IPC of CSA from larger ETTs to a theoretical sequence of small ETTs, starting with the 2.5 mm ID ETT, suggests that intermediate sizes of small ETTs (ID 2.745 mm, 3.254 mm, and 3.859 mm) should exist. We recommend manufacturers produce additional small ETT size options at the intuitive intermediate sizes of 2.75 mm, 3.25 mm, and 3.75 mm ID in order to improve airway management for infants and small children. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
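    Because CSA scales with the square of the inner diameter, the π term cancels and the IPC between consecutive sizes reduces to (d2/d1)² − 1, which is why fixed 0.5 mm steps penalize small tubes so heavily. The sketch below uses nominal IDs, not any manufacturer's actual catalog dimensions, so the percentages differ slightly from the averages reported in the abstract.

    ```python
    from math import pi

    def csa(inner_diameter_mm):
        """Cross sectional area of an ETT lumen, in mm^2."""
        return pi * (inner_diameter_mm / 2) ** 2

    def ipc(d1, d2):
        """Incremental percent change in CSA going from ID d1 to ID d2.
        Algebraically equal to 100 * ((d2/d1)**2 - 1): pi cancels."""
        return 100 * (csa(d2) - csa(d1)) / csa(d1)

    # The same 0.5 mm step is a far larger relative jump at the small end:
    print(round(ipc(3.0, 3.5), 1), round(ipc(7.0, 7.5), 1))
    ```

    Conversely, solving (d2/d1)² − 1 = constant for d2 yields the geometric spacing behind the proposed intermediate sizes.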

  5. Regional statistical and economic analysis of small and medium-sized businesses development in Zhytomyr region

    Directory of Open Access Journals (Sweden)

    S.I. Pavlova

    2017-08-01

    Full Text Available Small and medium-sized businesses play an important role in the development of the regional economic system in particular and in solving a number of the following local problems: developing competition, developing the market for goods and services, providing jobs for the able-bodied population, raising living standards and improving the social environment in society. The purpose of this paper is to analyze the state and development of small and medium-sized businesses in the Zhytomyr region, to analyze their contribution to the economic development of the region, and to identify the main problems existing in the region. According to the indicators of state statistics, the author presents the general characteristics of enterprises in the Zhytomyr region from 2012 to 2016 in terms of the number of enterprises, the number of employed workers and the volume of products sold, highlighting the activities of small enterprises and assessing their share in the overall levels. In addition, the paper describes the activities of individual entrepreneurs. A structural comparison for the above-listed indicators of the distribution of influence on the economic system of the Zhytomyr region, by enterprise size, is presented. In terms of quantity, 93.5% are small enterprises, which provide work for 31.4% of the total number of employees and account for 23.1% of the total volume of sales. Medium-sized enterprises account for 6.4%, 62.0% and 54.8% on these indicators, respectively. A statistical and economic analysis of the structure of small enterprises by type of economic activity, by the number of registered enterprises, and by the volume of products sold is carried out. The uniformity of the distribution is estimated using the concentration coefficient. The indicators of revenues to budgets of different levels from small and medium-sized businesses are also determined.
The paper presents and summarizes the

  6. Use of sales and operations planning in small and medium-sized enterprises

    Directory of Open Access Journals (Sweden)

    Michał Adamczak

    2013-03-01

    Full Text Available Background: Increasing competitiveness in the market and customer expectations related to the shortening of deadlines and the reduction of prices of products and services force companies to improve the efficiency of internal processes. The integration of planning processes is one possible way to achieve this aim, and the SOP (Sales and Operations Planning) model is a method to implement this idea. The study allowed us to identify ways to implement the process of sales and operations planning in small and medium-sized enterprises. Material and methods: The study was conducted in companies from different industries. The research method was in-depth interviews conducted with managers of companies or persons occupying management positions in the organizational process of implementing sales and operations planning. Results: During the survey, 10 companies were asked about their use of sales and operations planning, its elements and the organizational aspects of its development. Conclusions: The use of a sales and operations plan is closely dependent on the size of the company and its position in the supply chain. Small enterprises are not interested in the integration of the planning process due to the small scale of operations and the centralization of decision-making. Medium-sized enterprises, due to the increased complexity of their planning processes, see the benefits of integrating them in the SOP model.

  7. Delivery of workshops on mobility monitoring in small to medium-sized communities.

    Science.gov (United States)

    2009-11-01

    This report summarizes the delivery and outcome of a series of workshops conducted in 13 cities across the state on performing mobility monitoring in small to medium-sized communities. The workshops served as implementation for research project 0...

  8. Conceptual design of small-sized HTGR system (3). Core thermal and hydraulic design

    International Nuclear Information System (INIS)

    Inaba, Yoshitomo; Sato, Hiroyuki; Goto, Minoru; Ohashi, Hirofumi; Tachibana, Yukio

    2012-06-01

    The Japan Atomic Energy Agency has started the conceptual designs of small-sized High Temperature Gas-cooled Reactor (HTGR) systems, aiming for deployment in developing countries in the 2030s. The small-sized HTGR systems can provide power generation by steam turbine, high temperature steam for industrial processes and/or low temperature steam for district heating. As one of the conceptual designs in the first stage, the core thermal and hydraulic design of the power generation and steam supply small-sized HTGR system with a thermal power of 50 MW (HTR50S), a reference reactor system positioned as a first commercial or demonstration reactor system, was carried out. HTR50S in the first stage has the same coated particle fuel as HTTR. The purpose of the design is to make sure that the maximum fuel temperature in normal operation does not exceed the design target. Following the design, safety analysis assuming a depressurization accident was carried out. The fuel temperature in normal operation and the fuel and reactor pressure vessel temperatures in the depressurization accident were evaluated. As a result, it was confirmed that the thermal integrity of the fuel and the reactor coolant pressure boundary is not damaged. (author)

  9. Radioenzymatic assay for trimethoprim in very small serum samples.

    OpenAIRE

    Yogev, R; Melick, C; Tan-Pong, L

    1985-01-01

    A modification of the methotrexate radioassay kit (supplied by New England Enzyme Center) enabled determination of trimethoprim levels in 5-microliter serum samples. An excellent correlation between this assay and high-pressure liquid chromatography assay was found. These preliminary results suggest that with this method rapid determination of trimethoprim levels in very small samples (5 to 10 microliters) can be achieved.

  10. Radioenzymatic assay for trimethoprim in very small serum samples

    International Nuclear Information System (INIS)

    Yogev, R.; Melick, C.; Tan-Pong, L.

    1985-01-01

    A modification of the methotrexate radioassay kit (supplied by New England Enzyme Center) enabled determination of trimethoprim levels in 5-microliter serum samples. An excellent correlation between this assay and high-pressure liquid chromatography assay was found. These preliminary results suggest that with this method rapid determination of trimethoprim levels in very small samples (5 to 10 microliters) can be achieved.

  11. Differentiating gold nanorod samples using particle size and shape distributions from transmission electron microscope images

    Science.gov (United States)

    Grulke, Eric A.; Wu, Xiaochun; Ji, Yinglu; Buhr, Egbert; Yamamoto, Kazuhiro; Song, Nam Woong; Stefaniak, Aleksandr B.; Schwegler-Berry, Diane; Burchett, Woodrow W.; Lambert, Joshua; Stromberg, Arnold J.

    2018-04-01

    Size and shape distributions of gold nanorod samples are critical to their physico-chemical properties, especially their longitudinal surface plasmon resonance. This interlaboratory comparison study developed methods for measuring and evaluating size and shape distributions for gold nanorod samples using transmission electron microscopy (TEM) images. The objective was to determine whether two different samples, which had different performance attributes in their application, were different with respect to their size and/or shape descriptor distributions. Touching particles in the captured images were identified using a ruggedness shape descriptor. Nanorods could be distinguished from nanocubes using an elongational shape descriptor. A non-parametric statistical test showed that cumulative distributions of an elongational shape descriptor, that is, the aspect ratio, were statistically different between the two samples for all laboratories. While the scale parameters of size and shape distributions were similar for both samples, the width parameters of size and shape distributions were statistically different. This protocol fulfills an important need for a standardized approach to measure gold nanorod size and shape distributions for applications in which quantitative measurements and comparisons are important. Furthermore, the validated protocol workflow can be automated, thus providing consistent and rapid measurements of nanorod size and shape distributions for researchers, regulatory agencies, and industry.
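    The non-parametric comparison of cumulative aspect-ratio distributions described above can be reproduced with a two-sample Kolmogorov-Smirnov D statistic (the maximum distance between the two empirical CDFs). A pure-Python sketch; the aspect-ratio lists are made-up toy data, not the interlaboratory measurements.

    ```python
    import bisect

    def ks_statistic(sample_a, sample_b):
        """Two-sample Kolmogorov-Smirnov D statistic: the maximum absolute
        distance between the two empirical cumulative distributions."""
        a, b = sorted(sample_a), sorted(sample_b)
        d = 0.0
        for x in a + b:  # the max is attained at an observed value
            f_a = bisect.bisect_right(a, x) / len(a)
            f_b = bisect.bisect_right(b, x) / len(b)
            d = max(d, abs(f_a - f_b))
        return d

    # Toy aspect ratios (length/width) for two hypothetical nanorod samples
    sample_1 = [3.1, 3.3, 3.0, 3.4, 3.2, 3.5, 2.9, 3.3]
    sample_2 = [3.9, 4.1, 3.8, 4.2, 4.0, 4.3, 3.7, 4.1]
    print(ks_statistic(sample_1, sample_2))
    ```

    In practice D is converted to a p-value (or a permutation test is run) to decide, as the study did, whether the two cumulative distributions differ significantly.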

  12. Position paper: Management of men complaining of a small penis despite an actually normal size.

    Science.gov (United States)

    Ghanem, Hussein; Glina, Sidney; Assalian, Pierre; Buvat, Jacques

    2013-01-01

    With the worldwide increase in penile augmentation procedures and claims of devices designed to elongate the penis, it becomes crucial to study the scientific basis of such procedures or devices, as well as the management of a complaint of a small penis in men with a normal penile size. The aim of this work is to study the scientific basis of opting for penile augmentation procedures and to develop guidelines, based on the best available evidence, for the management of men complaining of a small penis despite an actually normal size. We reviewed the literature and evaluated the evidence on what the normal penile size is, what patients complaining of a small penis usually suffer from, benefits vs. complications of surgery, penile stretching or traction devices, and outcomes with patient education and counseling. Repeated presentations and detailed discussions within the Standards Committee of the International Society for Sexual Medicine were performed. Recommendations are based on the evaluation of evidence-based medical literature, widespread standards committee discussion, public presentation, and debate. We propose a practical approach for evaluating and counseling patients complaining of a small-sized penis. Based on the current status of science, penile lengthening surgery is still considered experimental and should be limited to special circumstances within research or university institutions with supervising ethics committees. © 2012 International Society for Sexual Medicine.

  13. Bayesian sample size determination for cost-effectiveness studies with censored data.

    Directory of Open Access Journals (Sweden)

    Daniel P Beavers

    Full Text Available Cost-effectiveness models are commonly utilized to determine the combined clinical and economic impact of one treatment compared to another. However, most methods for sample size determination of cost-effectiveness studies assume fully observed costs and effectiveness outcomes, which presents challenges for survival-based studies in which censoring exists. We propose a Bayesian method for the design and analysis of cost-effectiveness data in which costs and effectiveness may be censored, and the sample size is approximated for both power and assurance. We explore two parametric models and demonstrate the flexibility of the approach to accommodate a variety of modifications to study assumptions.

  14. Determining the optimal size of small molecule mixtures for high throughput NMR screening

    International Nuclear Information System (INIS)

    Mercier, Kelly A.; Powers, Robert

    2005-01-01

    High-throughput screening (HTS) using NMR spectroscopy has become a common component of the drug discovery effort and is widely used throughout the pharmaceutical industry. NMR provides additional information about the nature of small molecule-protein interactions compared to traditional HTS methods. In order to achieve comparable efficiency, small molecules are often screened as mixtures in NMR-based assays. Nevertheless, an analysis of the efficiency of mixtures and a corresponding determination of the optimum mixture size (OMS) that minimizes the amount of material and instrumentation time required for an NMR screen has been lacking. A model for calculating OMS based on the application of the hypergeometric distribution function to determine the probability of a 'hit' for various mixture sizes and hit rates is presented. An alternative method for the deconvolution of large screening mixtures is also discussed. These methods have been applied in a high-throughput NMR screening assay using a small, directed library
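    The hypergeometric "hit" probability at the core of the OMS model is straightforward to evaluate. In the sketch below the library size, number of actives, and mixture size are invented for illustration, and the cost model is deliberately naive (one screen per mixture plus one singleton follow-up per compound in each hit mixture); the paper's actual model balances further factors such as instrument time.

    ```python
    from math import comb

    def p_hit(library_size, n_actives, mixture_size):
        """Probability that a random mixture contains at least one active
        compound (hypergeometric: compounds drawn without replacement)."""
        return 1 - comb(library_size - n_actives, mixture_size) / comb(library_size, mixture_size)

    def expected_experiments(N, K, m):
        """Naive screening cost: N/m mixture screens, plus m deconvolution
        experiments for each mixture expected to contain a hit."""
        screens = N / m
        return screens + screens * p_hit(N, K, m) * m

    N, K = 10000, 100  # assumed: 10k-compound library with a 1% hit rate
    for m in (5, 10, 20, 50):
        print(m, round(p_hit(N, K, m), 3), round(expected_experiments(N, K, m)))
    ```

    Scanning m for the minimum of the cost function is the kind of optimum-mixture-size calculation the abstract describes.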

  15. 13 CFR 121.412 - What are the size procedures for partial small business set-asides?

    Science.gov (United States)

    2010-01-01

    ... Requirements for Government Procurement § 121.412 What are the size procedures for partial small business set... portion of a procurement, and is not required to qualify as a small business for the unrestricted portion. ...

  16. Development of sample size allocation program using hypergeometric distribution

    International Nuclear Information System (INIS)

    Kim, Hyun Tae; Kwack, Eun Ho; Park, Wan Soo; Min, Kyung Soo; Park, Chan Sik

    1996-01-01

    The objective of this research is the development of a sample allocation program using the hypergeometric distribution, with an object-oriented method. When the IAEA (International Atomic Energy Agency) performs inspection, it simply applies a standard binomial distribution, which describes sampling with replacement, instead of a hypergeometric distribution, which describes sampling without replacement, in sample allocation to up to three verification methods. The objective of the IAEA inspection is the timely detection of diversion of significant quantities of nuclear material; therefore, game theory is applied to its sampling plan. It is necessary to use the hypergeometric distribution directly, or an approximate distribution, to secure statistical accuracy. The improved binomial approximation developed by Mr. J. L. Jaech and the correctly applied binomial approximation are closer to the hypergeometric distribution in sample size calculation than the simply applied binomial approximation of the IAEA. Object-oriented programs for 1. sample approximate-allocation with the correctly applied standard binomial approximation, 2. sample approximate-allocation with the improved binomial approximation, and 3. sample approximate-allocation with the hypergeometric distribution were developed with Visual C++, and corresponding programs were developed with EXCEL (using Visual Basic for Applications). 8 tabs., 15 refs. (Author)
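    The with/without-replacement distinction the abstract draws changes the sample size needed for a given detection probability. A sketch with invented stratum numbers (not IAEA inspection parameters), comparing the exact hypergeometric answer to the simple binomial approximation:

    ```python
    from math import comb, ceil, log

    def n_hypergeometric(N, D, detect=0.95):
        """Smallest n such that a sample drawn WITHOUT replacement from N items
        containing D defectives includes at least one defective w.p. >= detect."""
        for n in range(1, N + 1):
            if comb(N - D, n) / comb(N, n) <= 1 - detect:
                return n
        return N

    def n_binomial(N, D, detect=0.95):
        """Binomial approximation: each draw independently defective w.p. D/N."""
        return ceil(log(1 - detect) / log(1 - D / N))

    N, D = 100, 5  # assumed: 100 items, 5 defective/diverted
    print(n_hypergeometric(N, D), n_binomial(N, D))
    ```

    For small populations the binomial approximation, by ignoring the depletion of the population as items are drawn, demands a noticeably larger sample than the exact hypergeometric calculation.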

  17. Development of a small-sized generator of ozonated water using an electro-conductive diamond electrode.

    Science.gov (United States)

    Sekido, Kota; Kitaori, Noriyuki

    2008-12-01

    A small-sized generator of ozonated water was developed using an electro-conductive diamond. We studied the optimum conditions for producing ozonated water. As a result, we developed a small-sized generator of ozonated water driven by a dry-cell for use in the average household. This generator was easily able to produce ozonated water with an ozone concentration (over 4 mg/L) sufficient for disinfection. In addition, we verified the high disinfecting performance of the water produced in an actual hospital.

  18. The development of internet based ship design support system for small and medium sized shipyards

    Science.gov (United States)

    Shin, Sung-Chul; Lee, Soon-Sup; Kang, Dong-Hoon; Lee, Kyung-Ho

    2012-03-01

    In this paper, a prototype of a basic ship planning system for small and medium-sized shipyards is implemented, based on internet technology and the concurrent engineering concept. The system is designed from user requirements. Consequently, a standardized development environment and tools are selected. These tools are used for the system development to define and evaluate core application technologies. The system will contribute to increasing the competitiveness of small and medium-sized shipyards in the 21st century industrial environment.

  19. SMGE: strategic challenges and performance measurement in a small-sized company (case)

    NARCIS (Netherlands)

    MBA R. Vieveen; PhD B. Pesalj

    2015-01-01

    The case portrays Superior Manufacturing Group Europe (SMGE) of Barendrecht (Netherlands), a small-sized company doing business in the matting industry, facing the challenge of how to sustain and improve its performance by balancing entrepreneurial initiative with the need for strategic planning.

  20. Novel joint selection methods can reduce sample size for rheumatoid arthritis clinical trials with ultrasound endpoints.

    Science.gov (United States)

    Allen, John C; Thumboo, Julian; Lye, Weng Kit; Conaghan, Philip G; Chew, Li-Ching; Tan, York Kiat

    2018-03-01

    To determine whether novel methods of selecting joints through (i) ultrasonography (individualized-ultrasound [IUS] method), or (ii) ultrasonography and clinical examination (individualized-composite-ultrasound [ICUS] method) translate into smaller rheumatoid arthritis (RA) clinical trial sample sizes when compared to existing methods utilizing predetermined joint sites for ultrasonography. Cohen's effect size (ES) was estimated (ES^) and a 95% CI (ES^_L, ES^_U) calculated on the mean change in 3-month total inflammatory score for each method. A corresponding 95% CI [n_L(ES^_U), n_U(ES^_L)] was obtained on a post hoc sample size, reflecting the uncertainty in ES^. Sample size calculations were based on a one-sample t-test, as the patient numbers needed to provide 80% power at α = 0.05 to reject a null hypothesis H0: ES = 0 versus alternative hypotheses H1: ES = ES^, ES = ES^_L, and ES = ES^_U. We aimed to provide point and interval estimates of projected sample sizes for future studies, reflecting the uncertainty in our study's ES^ values. Twenty-four treated RA patients were followed up for 3 months. Utilizing the 12-joint approach and existing methods, the post hoc sample size (95% CI) was 22 (10-245). Corresponding sample sizes using ICUS and IUS were 11 (7-40) and 11 (6-38), respectively. Utilizing a seven-joint approach, the corresponding sample sizes using ICUS and IUS methods were nine (6-24) and 11 (6-35), respectively. Our pilot study suggests that sample size for RA clinical trials with ultrasound endpoints may be reduced using the novel methods, providing justification for larger studies to confirm these observations. © 2017 Asia Pacific League of Associations for Rheumatology and John Wiley & Sons Australia, Ltd.
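    The mapping from an effect size estimate and its CI limits to a point-and-interval sample size follows directly once the test is fixed. A sketch using the normal approximation to the abstract's one-sample t-test; the effect sizes below are illustrative placeholders, not the study's estimates.

    ```python
    from math import ceil
    from statistics import NormalDist

    def n_one_sample(effect_size, alpha=0.05, power=0.80):
        """Normal-approximation n for a two-sided one-sample test of H0: ES = 0.
        (The exact t-test answer is typically 1-2 patients larger at these sizes.)"""
        nd = NormalDist()
        z = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)
        return ceil((z / effect_size) ** 2)

    # Illustrative ES^ = 0.6 with CI (0.2, 1.0): the lower bound on n comes
    # from the UPPER ES limit and vice versa, mirroring [n_L(ES^_U), n_U(ES^_L)].
    print(n_one_sample(0.6), n_one_sample(1.0), n_one_sample(0.2))
    ```

    The wide interval produced by a small pilot (here 8 to 197 patients for a point estimate of 22) is exactly the uncertainty the abstract's interval estimates are meant to convey.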

  1. Analysis of methods commonly used in biomedicine for treatment versus control comparison of very small samples.

    Science.gov (United States)

    Ristić-Djurović, Jasna L; Ćirković, Saša; Mladenović, Pavle; Romčević, Nebojša; Trbovich, Alexander M

    2018-04-01

    A rough estimate indicated that the use of samples of size not larger than ten is not uncommon in biomedical research, and that many such studies are limited to strong effects due to sample sizes smaller than six. For data collected from biomedical experiments it is also often unknown whether the mathematical requirements incorporated in the sample comparison methods are satisfied. Computer-simulated experiments were used to examine the performance of methods for qualitative sample comparison and its dependence on the effectiveness of exposure, effect intensity, distribution of studied parameter values in the population, and sample size. The Type I and Type II errors, their average, as well as the maximal errors were considered. A sample size of 9 and the t-test method with p = 5% ensured error smaller than 5% even for weak effects. For sample sizes 6-8 the same method enabled detection of weak effects with errors smaller than 20%. If the sample sizes were 3-5, weak effects could not be detected with an acceptable error; however, the smallest maximal error in the most general case that includes weak effects is provided by the standard-error-of-the-mean method. The increase of sample size from 5 to 9 led to seven times more accurate detection of weak effects. Strong effects were detected regardless of the sample size and method used. The minimal recommended sample size for biomedical experiments is 9. Use of smaller sizes and the method of their comparison should be justified by the objective of the experiment. Copyright © 2018 Elsevier B.V. All rights reserved.
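    The simulation approach can be reproduced in miniature: draw many pairs of small samples, apply a pooled two-sample t-test, and count rejections. Everything below (normal populations, n = 9 per group, the hard-coded critical value t_{0.975,16} ≈ 2.120) is an illustrative reconstruction, not the authors' exact protocol.

    ```python
    import random
    from math import sqrt

    def pooled_t(a, b):
        """Pooled two-sample t statistic."""
        na, nb = len(a), len(b)
        ma, mb = sum(a) / na, sum(b) / nb
        ssa = sum((x - ma) ** 2 for x in a)
        ssb = sum((x - mb) ** 2 for x in b)
        sp2 = (ssa + ssb) / (na + nb - 2)      # pooled variance
        return (ma - mb) / sqrt(sp2 * (1 / na + 1 / nb))

    def rejection_rate(shift, n=9, reps=20000, t_crit=2.120, seed=1):
        """Fraction of simulated experiments in which |t| exceeds the
        two-sided 5% critical value for df = 2n - 2 (2.120 for n = 9)."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(reps):
            a = [rng.gauss(0.0, 1.0) for _ in range(n)]
            b = [rng.gauss(shift, 1.0) for _ in range(n)]
            if abs(pooled_t(a, b)) > t_crit:
                hits += 1
        return hits / reps

    print(rejection_rate(0.0))   # Type I error: should sit near 0.05
    print(rejection_rate(1.5))   # power against a strong (1.5 SD) effect
    ```

    Sweeping the shift from weak to strong effects and varying n recovers the kind of error surfaces on which the paper's sample size recommendations rest.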

  2. Linear models for airborne-laser-scanning-based operational forest inventory with small field sample size and highly correlated LiDAR data

    Science.gov (United States)

    Junttila, Virpi; Kauranne, Tuomo; Finley, Andrew O.; Bradford, John B.

    2015-01-01

    Modern operational forest inventory often uses remotely sensed data that cover the whole inventory area to produce spatially explicit estimates of forest properties through statistical models. The data obtained by airborne light detection and ranging (LiDAR) correlate well with many forest inventory variables, such as the tree height, the timber volume, and the biomass. To construct an accurate model over thousands of hectares, LiDAR data must be supplemented with several hundred field sample measurements of forest inventory variables. This can be costly and time consuming. Different LiDAR-data-based and spatial-data-based sampling designs can reduce the number of field sample plots needed. However, problems arising from the features of the LiDAR data, such as a large number of predictors compared with the sample size (overfitting) or a strong correlation among predictors (multicollinearity), may decrease the accuracy and precision of the estimates and predictions. To overcome these problems, a Bayesian linear model with the singular value decomposition of predictors, combined with regularization, is proposed. The model performance in predicting different forest inventory variables is verified in ten inventory areas from two continents, where the number of field sample plots is reduced using different sampling designs. The results show that, with an appropriate field plot selection strategy and the proposed linear model, the total relative error of the predicted forest inventory variables is only 5%–15% larger using 50 field sample plots than the error of a linear model estimated with several hundred field sample plots when we sum up the error due to both the model noise variance and the model’s lack of fit.
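    In its non-Bayesian core, the proposed combination of an SVD of the predictor matrix with regularization is ridge regression computed through the singular values, which directly tames the multicollinearity and overfitting problems described above. A sketch with synthetic data standing in for LiDAR metrics; λ and all dimensions are arbitrary choices, and the full model also involves Bayesian noise-variance estimation not shown here.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_plots, n_predictors = 50, 200          # few field plots, many LiDAR metrics
    X = rng.normal(size=(n_plots, n_predictors))
    beta_true = np.zeros(n_predictors)
    beta_true[:5] = 1.0                      # only a few metrics truly matter
    y = X @ beta_true + rng.normal(scale=0.1, size=n_plots)

    # Ridge solution through the SVD: each component is shrunk by s/(s^2 + lam),
    # which suppresses the small singular values produced by correlated predictors.
    lam = 1.0
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    beta_svd = Vt.T @ ((s / (s**2 + lam)) * (U.T @ y))

    # Same estimate from the normal equations: (X'X + lam*I) beta = X'y
    beta_direct = np.linalg.solve(X.T @ X + lam * np.eye(n_predictors), X.T @ y)
    print(np.allclose(beta_svd, beta_direct))
    ```

    The SVD route is the natural one here: with 50 plots and hundreds of predictors X'X is singular without the λI term, while the singular-value form makes the shrinkage explicit and cheap to recompute for different λ.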

  3. Use of care management practices in small- and medium-sized physician groups: do public reporting of physician quality and financial incentives matter?

    Science.gov (United States)

    Alexander, Jeffrey A; Maeng, Daniel; Casalino, Lawrence P; Rittenhouse, Diane

    2013-04-01

    To examine the effect of public reporting (PR) and financial incentives tied to quality performance on the use of care management practices (CMPs) among small- and medium-sized physician groups. Survey data from The National Study of Small and Medium-sized Physician Practices were used. Primary data collection was also conducted to assess community-level PR activities. The final sample included 643 practices engaged in quality reporting; about half of these practices were subject to PR. We used a treatment effects model. The instrumental variables were the community-level variables that capture the level of PR activity in each community in which the practices operate. (1) PR is associated with increased use of CMPs, but the estimate is not statistically significant; (2) financial incentives are associated with greater use of CMPs; (3) practices' awareness/sensitivity to quality reports is positively related to their use of CMPs; and (4) combined PR and financial incentives jointly affect CMP use to a greater degree than either of these factors alone. Small- to medium-sized practices appear to respond to PR and financial incentives by greater use of CMPs. Future research needs to investigate the appropriate mix and type of incentive arrangements and quality reporting. © Health Research and Educational Trust.

  4. Monitoring, Modeling, and Diagnosis of Alkali-Silica Reaction in Small Concrete Samples

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Vivek [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cai, Guowei [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gribok, Andrei V. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mahadevan, Sankaran [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This report describes alkali-silica reaction (ASR) degradation mechanisms and the factors that influence ASR. A fully coupled thermo-hydro-mechanical-chemical model developed by Saouma and Perotti, which takes into consideration the effects of stress on the reaction kinetics and anisotropic volumetric expansion, is presented in this report. This model is implemented in the GRIZZLY code based on the Multiphysics Object Oriented Simulation Environment. The model implemented in the GRIZZLY code is used to randomly initiate ASR in 2D and 3D lattices to study the percolation aspects of concrete. The percolation aspects help determine the transport properties of the material and therefore the durability and service life of concrete. This report summarizes the effort to develop small-size concrete samples with embedded glass to mimic ASR. The concrete samples were treated in water and sodium hydroxide solution at elevated temperature to study how ingress of sodium ions and hydroxide ions at elevated temperature impacts concrete samples embedded with glass. A thermal camera was used to monitor changes in the concrete samples, and the results are summarized.
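The percolation idea mentioned above (randomly initiated ASR sites forming a connected path that governs transport through the material) can be sketched with a toy site-percolation model. The grid size, the occupation probabilities, and the BFS spanning criterion below are illustrative choices, not the GRIZZLY implementation.

```python
import random
from collections import deque

def spans(grid):
    """True if open sites (True) connect the top row to the bottom row
    through 4-neighbour adjacency: a toy 'percolated' criterion."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    q = deque((0, c) for c in range(cols) if grid[0][c])
    for _, c in q:
        seen[0][c] = True
    while q:
        r, c = q.popleft()
        if r == rows - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] and not seen[nr][nc]:
                seen[nr][nc] = True
                q.append((nr, nc))
    return False

def random_grid(rows, cols, p, rng):
    """Each site is ASR-affected (open) independently with probability p."""
    return [[rng.random() < p for _ in range(cols)] for _ in range(rows)]

rng = random.Random(42)
for p in (0.2, 0.5, 0.8):
    frac = sum(spans(random_grid(30, 30, p, rng)) for _ in range(20)) / 20
    print(f"p={p}: spanning fraction {frac:.2f}")
```

On a square lattice the spanning probability jumps sharply near the site-percolation threshold (about 0.593), which is why a small change in the density of reacted sites can produce a large change in effective transport properties.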

  5. Effects of sample size on robustness and prediction accuracy of a prognostic gene signature

    Directory of Open Access Journals (Sweden)

    Kim Seon-Young

    2009-05-01

Full Text Available. Abstract. Background: Little overlap between independently developed gene signatures and poor inter-study applicability of gene signatures are two of the major concerns raised in the development of microarray-based prognostic gene signatures. One recent study suggested that thousands of samples are needed to generate a robust prognostic gene signature. Results: A data set of 1,372 samples was generated by combining eight breast cancer gene expression data sets produced using the same microarray platform and, using this data set, the effects of varying sample sizes on several performance measures of a prognostic gene signature were investigated. The overlap between independently developed gene signatures increased linearly with more samples, attaining an average overlap of 16.56% with 600 samples. The concordance between outcomes predicted by different gene signatures also increased with more samples, up to 94.61% with 300 samples. The accuracy of outcome prediction likewise increased with more samples. Finally, analysis using only Estrogen Receptor-positive (ER+) patients attained higher prediction accuracy than using all patients, suggesting that sub-type-specific analysis can lead to the development of better prognostic gene signatures. Conclusion: Increasing sample sizes generated a gene signature with better stability, better concordance in outcome prediction, and better prediction accuracy. However, the degree of performance improvement from the increased sample size differed between the degree of overlap and the degree of concordance in outcome prediction, suggesting that the sample size required for a study should be determined according to the specific aims of the study.
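The "overlap between independently developed gene signatures" that this abstract quantifies (16.56% at 600 samples) is just the fraction of genes shared between two top-k ranked lists. A small sketch, in which the gene counts, effect sizes, and normal-score noise model are all invented for illustration, shows why the overlap of two independently derived signatures tends to grow with sample size:

```python
import random

def top_k_overlap(scores_a, scores_b, k):
    """Fraction of genes shared by the top-k lists of two rankings."""
    top_a = set(sorted(range(len(scores_a)), key=lambda g: -scores_a[g])[:k])
    top_b = set(sorted(range(len(scores_b)), key=lambda g: -scores_b[g])[:k])
    return len(top_a & top_b) / k

def simulate_scores(effects, n, rng):
    """Toy model: each gene's observed score is its true effect plus noise
    whose s.d. shrinks as 1/sqrt(n) (a larger study ranks genes more cleanly)."""
    return [mu + rng.gauss(0.0, 10.0 / n ** 0.5) for mu in effects]

rng = random.Random(7)
effects = [1.0 if g < 100 else 0.0 for g in range(2000)]  # 100 prognostic genes

for n in (10, 100, 600):
    reps = [top_k_overlap(simulate_scores(effects, n, rng),
                          simulate_scores(effects, n, rng), 100)
            for _ in range(10)]
    print(f"n={n:4d}: mean top-100 overlap {sum(reps) / len(reps):.2f}")
```

The mechanism is the one the paper measures empirically: with few samples, sampling noise dominates the ranking, so two independent studies select largely disjoint gene lists even when the underlying signal is identical.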

  6. US market potential for small and medium-sized nuclear reactors

    International Nuclear Information System (INIS)

    Hardie, R.W.; Jackson, S.V.

    1988-01-01

    This paper presents the results of on-site interviews with representatives of 23 investor-owned utilities and 3 publicly-owned utilities. The purpose of the interviews was to obtain information on utilities' attitudes towards small and medium-sized power plants and towards building new nuclear plants in general. Most of the utilities interviewed preferred power plants smaller than the ones currently being offered, as long as the smaller plants did not incur a major economic penalty. However, according to the utilities interviewed, without changes in the current environment it is unlikely that there will be a significant market for new nuclear plants of any size. A trend in the utility industry towards generating companies appears to be a positive step. (orig.)

  7. US market potential for small and medium-sized nuclear reactors

    International Nuclear Information System (INIS)

    Hardie, R.W.; Jackson, S.V.

    1987-01-01

    This paper presents the results of on-site interviews with representatives of 23 investor-owned utilities and 3 publicly-owned utilities. The purpose of the interviews was to obtain information on utilities' attitudes toward small and medium-sized power plants and towards building new nuclear plants in general. Most of the utilities interviewed preferred power plants smaller than the ones currently being offered, as long as the smaller plants did not incur a major economic penalty. However, according to the utilities interviewed, without changes in the current environment it is unlikely that there will be a significant market for new nuclear plants of any size. A trend in the utility industry towards generating companies appears to be a positive step. (author)

  8. Nintendo Wii Fit as an adjunct to physiotherapy following lower limb fractures: preliminary feasibility, safety and sample size considerations.

    Science.gov (United States)

    McPhail, S M; O'Hara, M; Gane, E; Tonks, P; Bullock-Saxton, J; Kuys, S S

    2016-06-01

    The Nintendo Wii Fit integrates virtual gaming with body movement, and may be suitable as an adjunct to conventional physiotherapy following lower limb fractures. This study examined the feasibility and safety of using the Wii Fit as an adjunct to outpatient physiotherapy following lower limb fractures, and reports sample size considerations for an appropriately powered randomised trial. Ambulatory patients receiving physiotherapy following a lower limb fracture participated in this study (n=18). All participants received usual care (individual physiotherapy). The first nine participants also used the Wii Fit under the supervision of their treating clinician as an adjunct to usual care. Adverse events, fracture malunion or exacerbation of symptoms were recorded. Pain, balance and patient-reported function were assessed at baseline and discharge from physiotherapy. No adverse events were attributed to either the usual care physiotherapy or Wii Fit intervention for any patient. Overall, 15 (83%) participants completed both assessments and interventions as scheduled. For 80% power in a clinical trial, the number of complete datasets required in each group to detect a small, medium or large effect of the Wii Fit at a post-intervention assessment was calculated at 175, 63 and 25, respectively. The Nintendo Wii Fit was safe and feasible as an adjunct to ambulatory physiotherapy in this sample. When considering a likely small effect size and the 17% dropout rate observed in this study, 211 participants would be required in each clinical trial group. A larger effect size or multiple repeated measures design would require fewer participants. Copyright © 2015 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
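The per-group numbers reported above (175, 63, and 25 complete datasets for 80% power) are consistent with the standard two-sample normal-approximation formula n = 2(z_{1-α/2} + z_{1-β})²/d² evaluated at Cohen's d of 0.3, 0.5, and 0.8, and the 211 figure with inflating n = 175 for 17% dropout. A short sketch reproduces that arithmetic; note the specific d values are my inference from the reported numbers, not stated in the abstract.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample
    comparison: n = 2 * (z_{1-alpha/2} + z_{power})**2 / d**2."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)
    zb = z.inv_cdf(power)
    return ceil(2 * (za + zb) ** 2 / d ** 2)

def inflate_for_dropout(n, dropout):
    """Recruit enough so n complete datasets remain after dropout."""
    return ceil(n / (1 - dropout))

for label, d in (("small", 0.3), ("medium", 0.5), ("large", 0.8)):
    print(label, n_per_group(d))        # -> small 175, medium 63, large 25
print(inflate_for_dropout(n_per_group(0.3), 0.17))   # -> 211
```

This is the usual planning logic for a parallel-group trial: power the study for the smallest effect worth detecting, then inflate recruitment by the anticipated attrition observed in the feasibility phase.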

  9. Spatial resolution of 2D ionization chamber arrays for IMRT dose verification: single-detector size and sampling step width

    International Nuclear Information System (INIS)

    Poppe, Bjoern; Djouguela, Armand; Blechschmidt, Arne; Willborn, Kay; Ruehmann, Antje; Harder, Dietrich

    2007-01-01

The spatial resolution of 2D detector arrays equipped with ionization chambers or diodes, used for the dose verification of IMRT treatment plans, is limited by the size of the single detector and the centre-to-centre distance between the detectors. Optimization criteria with regard to these parameters have been developed by combining concepts of dosimetry and pattern analysis. The 2D-ARRAY Type 10024 (PTW-Freiburg, Germany), single-chamber cross section 5 × 5 mm², centre-to-centre distance between chambers in each row and column 10 mm, served as an example. Additional frames of given dose distributions can be taken by shifting the whole array parallel or perpendicular to the MLC leaves by, e.g., 5 mm. The size of the single detector is characterized by its lateral response function, a trapezoid with 5 mm top width and 9 mm base width. Therefore, values measured with the 2D array are regarded as sample values from the convolution product of the accelerator generated dose distribution and this lateral response function. Consequently, the dose verification, e.g., by means of the gamma index, is performed by comparing the measured values of the 2D array with the values of the convolution product of the treatment planning system (TPS) calculated dose distribution and the single-detector lateral response function. Sufficiently small misalignments of the measured dose distributions in comparison with the calculated ones can be detected since the lateral response function is symmetric with respect to the centre of the chamber, and the change of dose gradients due to the convolution is sufficiently small. The sampling step width of the 2D array should provide a set of sample values representative of the sampled distribution, which is achieved if the highest spatial frequency contained in this function does not exceed the 'Nyquist frequency', one half of the sampling frequency. Since the convolution products of IMRT-typical dose distributions and the single
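The measurement model described above (array readings as samples of the planned dose distribution convolved with the chamber's trapezoidal lateral response, 5 mm top width and 9 mm base width) can be sketched numerically. The 0.5 mm grid spacing and the step-like test profile are arbitrary choices for illustration:

```python
def trapezoid_kernel(top=5.0, base=9.0, dx=0.5):
    """Lateral response: 1 inside the top width, falling linearly to 0 at
    the base width; normalised so convolution preserves the dose scale."""
    half_top, half_base = top / 2, base / 2
    n = int(round(half_base / dx))
    k = []
    for i in range(-n, n + 1):
        ax = abs(i * dx)
        if ax <= half_top:
            k.append(1.0)
        elif ax < half_base:
            k.append((half_base - ax) / (half_base - half_top))
        else:
            k.append(0.0)
    s = sum(k)
    return [v / s for v in k]

def convolve_same(profile, kernel):
    """Discrete convolution evaluated at each profile point (zero padding)."""
    m = len(kernel) // 2
    out = []
    for i in range(len(profile)):
        acc = 0.0
        for j, kv in enumerate(kernel):
            idx = i + j - m
            if 0 <= idx < len(profile):
                acc += profile[idx] * kv
        out.append(acc)
    return out

kernel = trapezoid_kernel()
# A crude IMRT-like step profile on a 0.5 mm grid: convolution smooths the
# field edges, which is what the array "sees" before the 10 mm sampling.
profile = [1.0 if 40 <= i < 120 else 0.0 for i in range(160)]
blurred = convolve_same(profile, kernel)
print(round(blurred[80], 6))   # flat interior is preserved -> 1.0
```

Comparing the TPS dose convolved this way against the array readings, rather than against the raw TPS dose, is exactly the like-with-like comparison the abstract argues for; the blurring also lowers the highest spatial frequency present, which relaxes the Nyquist condition on the 10 mm sampling step.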

  10. Greening Small and Medium-Sized Enterprises: Evaluating Environmental Policy in Vietnam

    NARCIS (Netherlands)

    Van Khoa, Le

    2006-01-01

Small and Medium-sized Enterprises (SMEs) contribute considerably to the economic and social development of both Viet Nam and Ho Chi Minh City (HCMC). But at the same time this sector causes severe environmental

  11. [Application of health questionnaires for health management in small- and medium-sized enterprises].

    Science.gov (United States)

    Kishida, K; Saito, M; Hasegawa, T; Aoki, S; Suzuki, S

    1986-01-01

Two kinds of health questionnaires, the Todai Health Index (THI) and Cumulative Fatigue Index (CFI), were applied as a screening device for health management of workers belonging to small- and medium-sized enterprises. A total of 495 workers, composed of 452 male workers of a glass-bottle manufacturing factory and 43 male workers of a soft-drink bottling factory, were the subjects of the present study. It was found that the two kinds of health questionnaires differ from each other and have their own characteristics. Twelve scales of THI were grouped into two, the first consisting of ten scales (SUSY, RESP, EYSK, MOUT, DIGE, IMPU, MENT, DEPR, NERV, and LIFE) and the second consisting of two scales (AGGR and LISC). Nine categories of CFI were grouped into one by using principal factor analysis. It was confirmed that the twelve scale scores of THI obtained at small- and medium-sized factories differed from the scale scores of a reference group investigated at a large-sized enterprise. The difference could be evaluated on the basis of the scales of aggressiveness and lies, and also of the scale of mental instability, which characterize workers by locality, job (clerical or field work), and size of industry (large or small). Urban life, characterized by a life style of staying up late at night and waking up late in the morning, was reflected in the scale of life irregularity. Irregularity of life induced by transformation of working schedule, such as two or three shifts of work and overtime, was also reflected in this scale. Two scales of the THI test, i.e., many subjective symptoms and digestive organ complaints, seemed to be the representative scales indicating a close relation between work load and health level. The discriminant score for diagnosis of psychosomatic diseases is considered to be one of the most useful assessments of the individual's health condition. As mentioned above, THI is recommended as a convenient assessment method for health

  12. Small scattered fragments do not a dwarf make: biological and archaeological data indicate that prehistoric inhabitants of Palau were normal sized.

    Directory of Open Access Journals (Sweden)

    Scott M Fitzpatrick

Full Text Available UNLABELLED: Current archaeological evidence from Palau in western Micronesia indicates that the archipelago was settled around 3000-3300 BP by normal sized populations; contrary to recent claims, they did not succumb to insular dwarfism. BACKGROUND: Previous and ongoing archaeological research of both human burial and occupation sites throughout the Palauan archipelago during the last 50 years has produced a robust data set to test hypotheses regarding initial colonization and subsequent adaptations over the past three millennia. PRINCIPAL FINDINGS: Close examination of human burials at the early (ca. 3000 BP) and stratified site of Chelechol ra Orrak indicates that these were normal sized individuals. This is contrary to the recent claim of contemporaneous "small-bodied" individuals found at two cave sites by Berger et al. (2008). As we argue, their analyses are flawed on a number of different analytical levels. First, their sample size is too small and fragmentary to adequately address the variation inherent in modern humans within and outside of Palau. Second, the size and stature of all other prehistoric (both older and contemporaneous) skeletal assemblages found in Palau fall within the normal parameters of modern human variation in the region, indicating this was not a case of insular dwarfism or a separate migratory group. Third, measurements taken on several skeletal elements by Berger et al. may appear to be from smaller-bodied individuals, but the sizes of these people compare well with samples from Chelechol ra Orrak. Last, archaeological, linguistic, and historical evidence demonstrates a great deal of cultural continuity in Palau through time as expected if the same population was inhabiting the archipelago. CONCLUSIONS: Prehistoric Palauan populations were normal sized and exhibit traits that fall within the normal variation for Homo sapiens; they do not support the claims by Berger et al. (2008) that there were smaller

  13. A modified approach to estimating sample size for simple logistic regression with one continuous covariate.

    Science.gov (United States)

    Novikov, I; Fund, N; Freedman, L S

    2010-01-15

    Different methods for the calculation of sample size for simple logistic regression (LR) with one normally distributed continuous covariate give different results. Sometimes the difference can be large. Furthermore, some methods require the user to specify the prevalence of cases when the covariate equals its population mean, rather than the more natural population prevalence. We focus on two commonly used methods and show through simulations that the power for a given sample size may differ substantially from the nominal value for one method, especially when the covariate effect is large, while the other method performs poorly if the user provides the population prevalence instead of the required parameter. We propose a modification of the method of Hsieh et al. that requires specification of the population prevalence and that employs Schouten's sample size formula for a t-test with unequal variances and group sizes. This approach appears to increase the accuracy of the sample size estimates for LR with one continuous covariate.
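For orientation, the simplest of the approximations in this literature (a normal-approximation formula in the spirit of Hsieh's early work, with p the event prevalence at the mean of the standardised covariate and β₁ the log odds ratio per standard deviation of the covariate) can be written in a few lines. This is the crude starting point the article improves upon, not the modified Schouten-based method it proposes, and the example values are purely illustrative:

```python
from math import ceil, log
from statistics import NormalDist

def n_simple_logistic(p, odds_ratio, alpha=0.05, power=0.80):
    """Normal-approximation sample size for simple logistic regression
    with one standardised continuous covariate:
        n = (z_{1-a/2} + z_{power})**2 / (p * (1 - p) * beta1**2)
    where beta1 is the log odds ratio per 1 SD of the covariate."""
    z = NormalDist()
    za, zb = z.inv_cdf(1 - alpha / 2), z.inv_cdf(power)
    beta1 = log(odds_ratio)
    return ceil((za + zb) ** 2 / (p * (1 - p) * beta1 ** 2))

print(n_simple_logistic(0.5, 1.5))   # -> 191
print(n_simple_logistic(0.5, 2.0))   # -> 66  (larger effect, smaller n)
```

The abstract's warning applies directly to this formula: its accuracy degrades as the covariate effect grows, and it requires the prevalence at the covariate mean rather than the more natural population prevalence, which is precisely the usability gap the proposed modification addresses.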

  14. 77 FR 55737 - Small Business Size Standards: Finance and Insurance and Management of Companies and Enterprises

    Science.gov (United States)

    2012-09-11

    ... 3245-AG45 Small Business Size Standards: Finance and Insurance and Management of Companies and Enterprises AGENCY: U.S. Small Business Administration. ACTION: Proposed rule. SUMMARY: The U.S. Small... NAICS Sector 55, Management of Companies and Enterprises. In addition, SBA proposes to change the...

  15. Conversion of Small Algal Oil Sample to JP-8

    Science.gov (United States)

    2012-01-01

    cracking of Algal Oil to SPK Hydroprocessing Lab Plant uop Nitrogen Hydrogen Product ., __ Small Scale Lab Hydprocessing plant - Down flow trickle ... bed configuration - Capable of retaining 25 cc of catalyst bed Meter UOP ·CONFIDENTIAL File Number The catalytic deoxygenation stage of the...content which combined with the samples acidity, is a challenge to reactor metallurgy. None the less, an attempt was made to convert this sample to

  16. Sample size methods for estimating HIV incidence from cross-sectional surveys.

    Science.gov (United States)

    Konikoff, Jacob; Brookmeyer, Ron

    2015-12-01

    Understanding HIV incidence, the rate at which new infections occur in populations, is critical for tracking and surveillance of the epidemic. In this article, we derive methods for determining sample sizes for cross-sectional surveys to estimate incidence with sufficient precision. We further show how to specify sample sizes for two successive cross-sectional surveys to detect changes in incidence with adequate power. In these surveys biomarkers such as CD4 cell count, viral load, and recently developed serological assays are used to determine which individuals are in an early disease stage of infection. The total number of individuals in this stage, divided by the number of people who are uninfected, is used to approximate the incidence rate. Our methods account for uncertainty in the durations of time spent in the biomarker defined early disease stage. We find that failure to account for this uncertainty when designing surveys can lead to imprecise estimates of incidence and underpowered studies. We evaluated our sample size methods in simulations and found that they performed well in a variety of underlying epidemics. Code for implementing our methods in R is available with this article at the Biometrics website on Wiley Online Library. © 2015, The International Biometric Society.
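The estimator sketched in this abstract (the count in the biomarker-defined early stage divided by the uninfected count, scaled by the mean time spent in that stage) and a crude binomial precision calculation can be written as follows. The window duration, the counts, and the target coefficient of variation are invented for illustration, and the authors' actual methods additionally propagate uncertainty in the window duration itself:

```python
def incidence_estimate(n_early, n_uninfected, mean_window_years):
    """Cross-sectional incidence: the proportion of uninfected people found
    in the early stage, divided by the mean duration of that stage."""
    return (n_early / n_uninfected) / mean_window_years

def n_uninfected_for_cv(p_early, target_cv):
    """Approximate uninfected sample size so the binomial part of the
    incidence estimator reaches a target coefficient of variation:
        CV**2 ~ (1 - p) / (N * p)  =>  N ~ (1 - p) / (p * CV**2)."""
    return (1 - p_early) / (p_early * target_cv ** 2)

# Illustrative survey: 25 of 1000 uninfected respondents test "recent",
# with a 6-month (0.5 year) mean early-stage window.
lam = incidence_estimate(25, 1000, 0.5)
print(f"estimated incidence: {lam:.3f} per person-year")   # -> 0.050

# Uninfected sample size needed for a 25% CV at p_early = 0.025:
print(round(n_uninfected_for_cv(0.025, 0.25)))             # -> 624
```

Because p_early is typically small, the required survey size is driven almost entirely by how many early-stage cases must be observed, which is why ignoring the extra variance from uncertain window durations, as the abstract warns, yields underpowered designs.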

  17. exTAS - next-generation TAS for small samples and extreme conditions

    International Nuclear Information System (INIS)

    Kulda, J.; Hiess, A.

    2011-01-01

The currently used implementation of horizontally and vertically focusing optics in three-axis spectrometers (TAS) permits efficient studies of excitations in sub-cm³-sized single crystals. With the present proposal we wish to stimulate a further paradigm shift into the domain of mm³-sized samples. exTAS combines highly focused mm-sized focal spots, boosting the sensitivity limits, with a spectrometer layout down-scaled to a table-top size to provide high flexibility in optimizing acceptance angles and to achieve sub-millimeter positioning accuracy. (authors)

  18. Financial Management Challenges In Small And Medium-Sized Enterprises: A Strategic Management Approach

    Directory of Open Access Journals (Sweden)

    Hande Karadag

    2015-02-01

Full Text Available. Abstract: Due to their significant role in the creation of new jobs, rise in GDP, entrepreneurship and innovation, small and medium-sized enterprises (SMEs) are recognized as the drivers of socio-economic growth, both in developed and developing economies. In Turkey, 99.9% of all enterprises fall into the SME category, so the significance of SMEs for the Turkish economy and society is much higher than in other emerging and developed countries. Small and medium-sized companies are faced with a number of challenges, and the problems arising from "poor financial management" are reported as the major causes of business failures in SMEs. Strategic financial management (SFM), a research area that has attracted the interest of researchers after 2010, is one of the key managerial areas of SMEs, due to its vital role in the survival, growth and performance of SMEs. The purpose of this paper is to analyze the central role of financial management and identify the financial management challenges and practices that influence the organizational performance of Turkish SMEs, from a strategic management perspective. Within the course of this paper, the importance and challenges of SMEs in Turkey are presented in the first section, while the literature on strategic and financial management in SMEs is reviewed in the second part. In the third section, the recent strategic financial management concept, the implications of strategic financial management practices for SMEs in Turkey, and the relationships between strategic financial management practices and SME performance are discussed. Small and medium-sized enterprise finance in Turkey is a developing research area; therefore this paper aims to make a significant contribution to the existing literature by analyzing the major challenges in the conduct of financial management in Turkish SMEs and the influence of strategic financial management practices on the performances of small and

  19. Electron Emitter for small-size Electrodynamic Space Tether using MEMS Technology

    DEFF Research Database (Denmark)

    Fleron, René A. W.; Blanke, Mogens

    2004-01-01

Adjustment of the orbit of a spacecraft using the forces created by an electro-dynamic space tether has been shown as a theoretical possibility in recent literature. Practical implementation is being pursued for larger scale missions where a hot filament device controls electron emission... and the current flowing in the electrodynamic space tether. Applications to small spacecraft, or space debris in the 1–10 kg range, pose difficulties with electron emission technology, as low power emitting devices are needed. This paper addresses the system concepts of a small spacecraft electrodynamic tether... system with focus on electron emitter design and manufacture using micro-electro-mechanical-system (MEMS) technology. The paper addresses the system concepts of a small size electrodynamic tether mission and shows a novel electron emitter for the 1-2 mA range where altitude can be effectively affected...

  20. The role of small and medium size reactors in the future US nuclear market

    International Nuclear Information System (INIS)

    Twilley, R.C.

    2002-01-01

    This paper addresses the various aspects of siting and sizing new generating facilities in the USA. Environmental and licensing issues are discussed. The economic considerations and assumptions for new plants are presented. Also, the electric transmission and distribution system characteristics and constraints are described with the potential role for several small and medium size designs summarized. (authors)

  1. Small Size and Low Cost UHF RFID Tag Antenna Mountable on Metallic Objects

    Directory of Open Access Journals (Sweden)

    Sergio López-Soriano

    2015-01-01

Full Text Available. Reducing tag size while maintaining good performance is one of the major challenges in radio-frequency identification (RFID) applications, in particular when labeling metallic objects. In this contribution, a small size and low cost tag antenna for identifying metal objects in the European UHF band (865–868 MHz) is presented. The antenna consists of a transmission line mounted on an inexpensive thin dielectric which is proximity-coupled to a short-ended patch mounted on FR4 substrate. The overall dimensions of the tag are 33.5 × 30 × 3.1 mm. Experimental results show that, for an EIRP of 3.2 W (European regulations), such a small and cheap tag attains read ranges of about 5 m when attached to a metallic object.

  2. A scanning tunneling microscope capable of imaging specified micron-scale small samples

    Science.gov (United States)

    Tao, Wei; Cao, Yufei; Wang, Huafeng; Wang, Kaiyou; Lu, Qingyou

    2012-12-01

We present a home-built scanning tunneling microscope (STM) which allows us to precisely position the tip on any specified small sample or sample feature of micron scale. The core structure is a stand-alone soft junction mechanical loop (SJML), in which a small piezoelectric tube scanner is mounted on a sliding piece and a "U"-like soft spring strip has its one end fixed to the sliding piece and its opposite end holding the tip pointing to the sample on the scanner. Here, the tip can be precisely aligned to a specified small sample of micron scale by adjusting the position of the spring-clamped sample on the scanner in the field of view of an optical microscope. The aligned SJML can be transferred to a piezoelectric inertial motor for coarse approach, during which the U-spring is pushed towards the sample, causing the tip to approach the pre-aligned small sample. We have successfully approached a hand cut tip that was made from 0.1 mm thin Pt/Ir wire to an isolated individual 32.5 × 32.5 μm² graphite flake. Good atomic resolution images and high quality tunneling current spectra for that specified tiny flake are obtained in ambient conditions with high repeatability within one month showing high and long term stability of the new STM structure. In addition, frequency spectra of the tunneling current signals do not show outstanding tip mount related resonant frequency (low frequency), which further confirms the stability of the STM structure.

  4. Gastro-oesophageal reflux in large-sized, deep-chested versus small-sized, barrel-chested dogs undergoing spinal surgery in sternal recumbency.

    Science.gov (United States)

    Anagnostou, Tilemahos L; Kazakos, George M; Savvas, Ioannis; Kostakis, Charalampos; Papadopoulou, Paraskevi

    2017-01-01

The aim of this study was to investigate whether an increased frequency of gastro-oesophageal reflux (GOR) is more common in large-sized, deep-chested dogs undergoing spinal surgery in sternal recumbency than in small-sized, barrel-chested dogs. Prospective, cohort study. Nineteen small-sized, barrel-chested dogs (group B) and 26 large-sized, deep-chested dogs (group D). All animals were premedicated with intramuscular (IM) acepromazine (0.05 mg kg⁻¹) and pethidine (3 mg kg⁻¹). Anaesthesia was induced with intravenous sodium thiopental and maintained with halothane in oxygen. Lower oesophageal pH was monitored continuously after induction of anaesthesia. Gastro-oesophageal reflux was considered to have occurred whenever pH values > 7.5 or < 4 were recorded. If GOR was detected during anaesthesia, measures were taken to avoid aspiration of gastric contents into the lungs and to prevent the development of oesophagitis/oesophageal stricture. The frequency of GOR during anaesthesia was significantly higher in group D (6/26 dogs; 23.07%) than in group B (0/19 dogs; 0%) (p = 0.032). Signs indicative of aspiration pneumonia, oesophagitis or oesophageal stricture were not reported in any of the GOR cases. In large-sized, deep-chested dogs undergoing spinal surgery in sternal recumbency, it would seem prudent to consider measures aimed at preventing GOR and its potentially devastating consequences (oesophagitis/oesophageal stricture, aspiration pneumonia). Copyright © 2016 Association of Veterinary Anaesthetists and American College of Veterinary Anesthesia and Analgesia. Published by Elsevier Ltd. All rights reserved.

  5. Evaluation of pump pulsation in respirable size-selective sampling: part II. Changes in sampling efficiency.

    Science.gov (United States)

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M; Harper, Martin

    2014-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone with the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the
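The curve-and-bias machinery described above (a three-parameter sigmoid fitted to sampling efficiency versus aerodynamic diameter, then compared point by point against the reference sampler) can be sketched as follows. The logistic functional form and all parameter values are illustrative stand-ins, since the abstract does not reproduce the fitted coefficients:

```python
from math import exp

def efficiency(d, e_max, d50, width):
    """Three-parameter sigmoid: collection efficiency falls from e_max
    toward 0 around the cut-point d50 (aerodynamic diameter, um)."""
    return e_max / (1.0 + exp((d - d50) / width))

def bias(d, test_params, ref_params):
    """Relative bias of a test sampler's efficiency against the reference."""
    e_ref = efficiency(d, *ref_params)
    return (efficiency(d, *test_params) - e_ref) / e_ref

ref = (1.0, 4.0, 0.8)       # reference cyclone (illustrative values)
shifted = (1.0, 4.4, 0.8)   # e.g. a pulsation-shifted cut-point

for d in (1, 2, 3, 4, 5, 6, 7, 8, 9):
    print(f"d={d} um: bias {bias(d, shifted, ref):+.1%}")
```

Tabulating such relative differences across particle sizes is one plausible reading of the "bias maps" mentioned above: a pump whose pulsation shifts the cyclone's effective cut-point shows little bias for small particles but a growing relative bias near and beyond the cut-point, which is where the ±10% criterion bites.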

  6. Sample-size effects in fast-neutron gamma-ray production measurements: solid-cylinder samples

    International Nuclear Information System (INIS)

    Smith, D.L.

    1975-09-01

    The effects of geometry, absorption and multiple scattering in (n,Xγ) reaction measurements with solid-cylinder samples are investigated. Both analytical and Monte-Carlo methods are employed in the analysis. Geometric effects are shown to be relatively insignificant except in definition of the scattering angles. However, absorption and multiple-scattering effects are quite important; accurate microscopic differential cross sections can be extracted from experimental data only after a careful determination of corrections for these processes. The results of measurements performed using several natural iron samples (covering a wide range of sizes) confirm validity of the correction procedures described herein. It is concluded that these procedures are reliable whenever sufficiently accurate neutron and photon cross section and angular distribution information is available for the analysis. (13 figures, 5 tables) (auth)

  7. Subclinical delusional ideation and appreciation of sample size and heterogeneity in statistical judgment.

    Science.gov (United States)

    Galbraith, Niall D; Manktelow, Ken I; Morris, Neil G

    2010-11-01

Previous studies demonstrate that people high in delusional ideation exhibit a data-gathering bias on inductive reasoning tasks. The current study set out to investigate the factors that may underpin such a bias by examining healthy individuals, classified as either high or low scorers on the Peters et al. Delusions Inventory (PDI). More specifically, it examined whether high PDI scorers have a relatively poor appreciation of sample size and heterogeneity when making statistical judgments. In Expt 1, high PDI scorers made higher probability estimates when generalizing from a sample of 1 with regard to the heterogeneous human property of obesity. In Expt 2, this effect was replicated and was also observed in relation to the heterogeneous property of aggression. The findings suggest that delusion-prone individuals are less appreciative of the importance of sample size when making statistical judgments about heterogeneous properties; this may underpin the data-gathering bias observed in previous studies. There was some support for the hypothesis that threatening material would exacerbate high PDI scorers' indifference to sample size.

  8. Role of Relationship Marketing in Small and Medium-Sized Entreprises

    Directory of Open Access Journals (Sweden)

    Ružica Butigan

    2011-06-01

Full Text Available Along with marketing, small and medium enterprises (SMEs) are commonly associated in the literature with relationship marketing, which results in marketing networks. This paper examines the specific characteristics that differentiate large companies from small and medium-sized enterprises, and the reasons that prevent SMEs from engaging in traditional marketing within the scope of the marketing mix. The paper also shows that the key characteristic which distinguishes small from large companies is the prominent role of owner/managers. The owner/manager affects the marketing of SMEs by creating his or her own personal networks with stakeholders in the environment, and considers it an important strategic focus and a factor that increases the performance of marketing decision-making. After a thorough review of secondary data sources, this paper examines relationship marketing as a marketing framework for SMEs and, finally, proves that relationship marketing is more appropriate for SMEs than are traditional marketing concepts, that relationship marketing and network marketing create an important framework for SMEs, and that there is a link between relationship marketing, personal networks and SMEs.

  9. Page sample size in web accessibility testing: how many pages is enough?

    NARCIS (Netherlands)

    Velleman, Eric Martin; van der Geest, Thea

    2013-01-01

    Various countries and organizations use a different sampling approach and sample size of web pages in accessibility conformance tests. We are conducting a systematic analysis to determine how many pages is enough for testing whether a website is compliant with standard accessibility guidelines. This

  10. Size-exclusion chromatography-based enrichment of extracellular vesicles from urine samples

    Directory of Open Access Journals (Sweden)

    Inés Lozano-Ramos

    2015-05-01

Full Text Available Renal biopsy is the gold-standard procedure to diagnose most renal pathologies. However, this invasive method is of limited repeatability and often describes an irreversible renal damage. Urine is an easily accessible fluid and urinary extracellular vesicles (EVs) may be ideal to describe new biomarkers associated with renal pathologies. Several methods to enrich EVs have been described. Most of them contain a mixture of proteins, lipoproteins and cell debris that may be masking relevant biomarkers. Here, we evaluated size-exclusion chromatography (SEC) as a suitable method to isolate urinary EVs. Following a conventional centrifugation to eliminate cell debris and apoptotic bodies, urine samples were concentrated using ultrafiltration and loaded on a SEC column. Collected fractions were analysed by protein content and flow cytometry to determine the presence of tetraspanin markers (CD63 and CD9). The highest tetraspanin content was routinely detected in fractions well before the bulk of proteins eluted. These tetraspanin-peak fractions were analysed by cryo-electron microscopy (cryo-EM) and nanoparticle tracking analysis, revealing the presence of EVs. When analysed by sodium dodecyl sulphate–polyacrylamide gel electrophoresis, tetraspanin-peak fractions from urine concentrated samples contained multiple bands but the main urine proteins (such as Tamm–Horsfall protein) were absent. Furthermore, a preliminary proteomic study of these fractions revealed the presence of EV-related proteins, suggesting their enrichment in concentrated samples. In addition, RNA profiling also showed the presence of vesicular small RNA species. To summarize, our results demonstrated that concentrated urine followed by SEC is a suitable option to isolate EVs with low presence of soluble contaminants. This methodology could permit more accurate analyses of EV-related biomarkers when further characterized by -omics technologies compared with other approaches.

  11. VOLTTRON™: Tech-to-Market Best-Practices Guide for Small- and Medium-Sized Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Haack, Jereme N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nicholls, Andrew K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-07-11

    VOLTTRON™ is an open-source distributed control and sensing platform developed by Pacific Northwest National Laboratory for the U.S. Department of Energy. It was developed to be used by the Office of Energy Efficiency and Renewable Energy to support transactive controls research and deployment activities. VOLTTRON is designed to be an overarching integration platform that could be used to bring together vendors, users, and developers and enable rapid application development and testing. The platform is designed to support modern control strategies, including the use of agent- and transaction-based controls. It also is designed to support the management of a wide range of applications, including heating, ventilation, and air-conditioning systems; electric vehicles; and distributed-energy and whole-building loads. This report was completed as part of the Building Technologies Office’s Technology-to-Market Initiative for VOLTTRON’s Market Validation and Business Case Development efforts. The report provides technology-to-market guidance and best practices related to VOLTTRON platform deployments and commercialization activities for use by entities serving small- and medium-sized commercial buildings. The report characterizes the platform ecosystem within the small- and medium-sized commercial building market and articulates the value proposition of VOLTTRON for three core participants in this ecosystem: 1) platform owners/adopters, 2) app developers, and 3) end-users. The report also identifies key market drivers and opportunities for open platform deployments in the small- and medium-sized commercial building market. Possible pathways to the market are described—laboratory testing to market adoption to commercialization. We also identify and address various technical and market barriers that could hinder deployment of VOLTTRON. Finally, we provide “best practice” tech-to-market guidance for building energy-related deployment efforts serving small- and

  12. Engineering structure design and fabrication process of small sized China helium-cooled solid breeder test blanket module

    International Nuclear Information System (INIS)

    Wang Zeming; Chen Lu; Hu Gang

    2014-01-01

Preliminary design and analysis for the China helium-cooled solid breeder (CHHC-SB) test blanket module (TBM) have been carried out recently. As a partial verification that the original-size module was reasonable and the development process was feasible, targeted fabrication work on a small sized module was carried out. In this paper, detailed design and structure analysis of the small sized TBM were carried out based on the preliminary design work, and a fabrication process and integrated assembly process were proposed, laying a successful foundation for the trial engineering fabrication of the TBM. (authors)

  13. Wind and solar data for sizing small wind turbine and photovoltaic power plants

    Energy Technology Data Exchange (ETDEWEB)

    Grainger, W [Northumbrian Energy Workshop Ltd., Hexham (GB)

    1990-01-01

    Small renewable energy power systems have to be more carefully sized and installed than fossil fuelled alternatives. Accurate assessment of the energy resource available at the site is the first step in system design. This paper describes the sort of data available and how they are processed. When small systems are involved there is little money available for detailed meteorological investigations. This has led our company to develop the techniques described. (author).

  14. A Single IGF1 Allele Is a Major Determinant of Small Size in Dogs

    OpenAIRE

    Sutter, Nathan B.; Bustamante, Carlos D.; Chase, Kevin; Gray, Melissa M.; Zhao, Keyan; Zhu, Lan; Padhukasahasram, Badri; Karlins, Eric; Davis, Sean; Jones, Paul G.; Quignon, Pascale; Johnson, Gary S.; Parker, Heidi G.; Fretwell, Neale; Mosher, Dana S.

    2007-01-01

    The domestic dog exhibits greater diversity in body size than any other terrestrial vertebrate. We used a strategy that exploits the breed structure of dogs to investigate the genetic basis of size. First, through a genome-wide scan, we identified a major quantitative trait locus (QTL) on chromosome 15 influencing size variation within a single breed. Second, we examined genetic variation in the 15-megabase interval surrounding the QTL in small and giant breeds and found marked evidence for a...

  15. Status and prospects for small and medium sized reactors

    International Nuclear Information System (INIS)

    Kupitz, J.; Arkhipov, V.; Brey, H.L.

    2000-01-01

    Member States have evinced interest in the continued development of small and medium sized reactors (SMRs) as an energy source for the future. Support for this programme was recently reaffirmed at the IAEA 1997 General Conference. Although the generation of electricity is the predominant focus of existing SMRs, there is increasing interest in using these plants for desalination, district heating and high temperature industrial process heat applications. Here is a review of SMR development within selected Member States, an overview of the IAEA's SMR programme and a discussion of selected SMR designs with emphasis on safety attributes. (author)

  16. Sensitivity of Mantel Haenszel Model and Rasch Model as Viewed From Sample Size

    OpenAIRE

    ALWI, IDRUS

    2011-01-01

The aim of this research is to study the sensitivity comparison of the Mantel Haenszel and Rasch Model for detecting differential item functioning, observed from the sample size. These two differential item functioning (DIF) methods were compared using simulated binary item response data sets of varying sample size; 200 and 400 examinees were used in the analyses, with DIF detection based on gender difference. These test conditions were replication 4 tim...

  17. Prognostic Factors Affecting Rotator Cuff Healing After Arthroscopic Repair in Small to Medium-sized Tears.

    Science.gov (United States)

    Park, Ji Soon; Park, Hyung Jun; Kim, Sae Hoon; Oh, Joo Han

    2015-10-01

    Small and medium-sized rotator cuff tears usually have good clinical and anatomic outcomes. However, healing failure still occurs in some cases. To evaluate prognostic factors for rotator cuff healing in patients with only small to medium-sized rotator cuff tears. Case-control study; Level of evidence, 3. Data were prospectively collected from 339 patients with small to medium-sized rotator cuff tears who underwent arthroscopic repair by a single surgeon between March 2004 and August 2012 and who underwent magnetic resonance imaging or computed tomographic arthrography at least 1 year after surgery. The mean age of the patients was 59.8 years (range, 39-80 years), and the mean follow-up time was 20.8 months (range, 12-66 months). The functional evaluation included the visual analog scale (VAS) for pain, American Shoulder and Elbow Surgeons score, Constant-Murley score, and Simple Shoulder Test. Postoperative VAS for pain and functional scores improved significantly compared with preoperative values (P rotator cuff healing (P 2 cm in size (34.2%) compared with patients with a tear ≤2 cm (10.6%) (P rotator cuff tears, grade II fatty degeneration of the infraspinatus muscle according to the Goutallier classification could be a reference point for successful healing, and anatomic outcomes might be better if repair is performed before the patient is 69 years old and the tear size exceeds 2 cm. © 2015 The Author(s).

  18. Understanding Informal Learning in Small- and Medium-Sized Enterprises in South Korea

    Science.gov (United States)

    Jeong, Shinhee; McLean, Gary N.; Park, Soyoun

    2018-01-01

    Purpose: This paper aims to explore informal learning experiences among employees working in South Korean small- and medium-sized enterprises (SMEs) with fewer than 100 employees. This study specifically seeks to understand the characteristics of informal learning in Korean SMEs and culturally sensitive contextual factors that shape informal…

  19. Simultaneous analysis of small organic acids and humic acids using high performance size exclusion chromatography

    NARCIS (Netherlands)

    Qin, X.P.; Liu, F.; Wang, G.C.; Weng, L.P.

    2012-01-01

    An accurate and fast method for simultaneous determination of small organic acids and much larger humic acids was developed using high performance size exclusion chromatography. Two small organic acids, i.e. salicylic acid and 2,3-dihydroxybenzoic acid, and one purified humic acid material were used

  20. European Health Claims for Small and Medium-Sized Companies – Utopian Dream or Future Reality?

    Directory of Open Access Journals (Sweden)

    Sonja Brandenburger

    2015-02-01

Full Text Available Background: In December 2007, the European Regulation (EC) 1924/2006 on nutrition and health claims came into force. The European Union wanted to regulate the use of health claims on products. An online survey was carried out to evaluate the situation, particularly of small and medium-sized companies, dealing with the new regulation. Methods: The online survey on health claims was conducted with 16 enterprises. To underline the findings, a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis was made of the nutrition and health claims regulation regarding small and medium-sized companies in the European food and drink market. Results: The findings of this study indicated that the European Union took a step in the right direction. Most companies identified fair competition, the simplified trade within the inner-European market, and consumer protection as positive aspects. The biggest threat is seen in false investment, owing to the limited research and development budgets, especially of small and medium-sized enterprises, and the cost-intensive scientific evaluation required to reach an authorized health claim. Conclusions: Overall, there are several strengths and opportunities speaking for SMEs and health claims in the near future. The most promising ones are the publishing of the new European Union Register of Nutrition and Health Claims and the learning effects that will occur. The biggest threat is, and will remain, false investment and the possible loss of a lot of money. Nevertheless, health claims for small and medium-sized enterprises will inevitably be the future to keep the European food and drink market competitive.

  1. A miniature small size 3 MeV deuteron linear accelerator

    International Nuclear Information System (INIS)

    Baranov, L.N.; Bryzgalov, G.A.; Verbovskij, V.V.; Kovpak, N.E.; Onoprienko, V.T.; Papkovich, V.G.; Khizhnyak, N.A.; Shulika, N.G.; Yashin, V.P.

    1975-01-01

    Basic characteristics are presented of the small-size linear deuteron accelerator for 3 MeV, the accelerating system of which operates at H-wave. It is shown that the usage of such accelerating systems makes it possible to reduce the resonator volume by more than 30 times, whereas the capacity of the evacuating devices as well as the total HF supply power are decreased. Owing to a relatively large wave length, particle injection energy may be reduced to 100-150 keV

  2. The Suitability of Expert System Application in Czech Small and Medium‑Sized Enterprises

    Directory of Open Access Journals (Sweden)

    Ekaterina Khitilova

    2017-01-01

Full Text Available Small and medium-sized enterprises play an important role in the economy of the Czech Republic. Expert systems are one of the alternatives for evaluating the effectiveness of the relationship between supplier and customer. Currently, measuring and increasing the efficiency of supplier-customer relations is a very topical subject. The requirements for tools to improve the supplier-customer relationship can be specified on the basis of a literature analysis. The article defines the basic requirements for an expert system. The presentation of the expert system and its suitability in the setting of small and medium sized enterprises is part of this article. A comparison of the expert system's possibilities with the compiled requirements leads to a suitability analysis of the selected solution. Open questions and directions for future work are derived from the current results presented in the paper. The article also defines the directions for developing this expert system.

  3. Research Note Pilot survey to assess sample size for herbaceous ...

    African Journals Online (AJOL)

    A pilot survey to determine sub-sample size (number of point observations per plot) for herbaceous species composition assessments, using a wheel-point apparatus applying the nearest-plant method, was conducted. Three plots differing in species composition on the Zululand coastal plain were selected, and on each plot ...

  4. Size matters: pitch dimensions constrain inter-team distances and surface area difference in small-sided soccer games

    NARCIS (Netherlands)

    Frencken, Wouter; van der Plaats, Jorrit; Visscher, Chris; Lemmink, Koen

    2013-01-01

Pitch size varies in official soccer matches and differently sized pitches are adopted for tactical purposes in small-sided training games. Since interactive team behaviour emerges under constraints, the authors evaluate the effect of pitch size (task) manipulations on interactive team behaviour

  5. ASSESSING SMALL SAMPLE WAR-GAMING DATASETS

    Directory of Open Access Journals (Sweden)

    W. J. HURLEY

    2013-10-01

    Full Text Available One of the fundamental problems faced by military planners is the assessment of changes to force structure. An example is whether to replace an existing capability with an enhanced system. This can be done directly with a comparison of measures such as accuracy, lethality, survivability, etc. However this approach does not allow an assessment of the force multiplier effects of the proposed change. To gauge these effects, planners often turn to war-gaming. For many war-gaming experiments, it is expensive, both in terms of time and dollars, to generate a large number of sample observations. This puts a premium on the statistical methodology used to examine these small datasets. In this paper we compare the power of three tests to assess population differences: the Wald-Wolfowitz test, the Mann-Whitney U test, and re-sampling. We employ a series of Monte Carlo simulation experiments. Not unexpectedly, we find that the Mann-Whitney test performs better than the Wald-Wolfowitz test. Resampling is judged to perform slightly better than the Mann-Whitney test.
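The paper's Monte Carlo approach can be sketched in miniature. The group size, effect size, simulation count, and normal data model below are illustrative assumptions, not values taken from the study:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

def mw_power(n=8, shift=1.0, sims=500, alpha=0.05):
    """Monte Carlo power of the Mann-Whitney U test for two small
    samples whose population means differ by `shift` standard deviations."""
    hits = 0
    for _ in range(sims):
        x = rng.normal(0.0, 1.0, n)    # e.g. baseline force structure
        y = rng.normal(shift, 1.0, n)  # e.g. enhanced force structure
        _, p = mannwhitneyu(x, y, alternative="two-sided")
        hits += p < alpha
    return hits / sims

print(mw_power())  # modest power is typical with only 8 observations per side
```

Swapping `mannwhitneyu` for a permutation test on the difference in means gives the re-sampling comparator; with samples this small, enumerating the full permutation distribution is even feasible exactly.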

  6. Estimation of reference intervals from small samples: an example using canine plasma creatinine.

    Science.gov (United States)

    Geffré, A; Braun, J P; Trumel, C; Concordet, D

    2009-12-01

According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which is often impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were minimum and maximum, mean +/- 2 SD of native and Box-Cox-transformed values, 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.
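The comparison the authors describe can be sketched as follows. The lognormal stand-in for the creatinine distribution, the seed, and the specific numbers are illustrative assumptions; only the methods (nonparametric percentiles on the full sample versus mean +/- 2 SD after Box-Cox transformation on a 27-value subsample) follow the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Right-skewed stand-in for the 1439 healthy-dog creatinine values
# (a lognormal model is an assumption, not the real data).
population = rng.lognormal(mean=4.6, sigma=0.25, size=1439)

# Full-sample reference interval: nonparametric 2.5th/97.5th percentiles.
lo_ref, hi_ref = np.percentile(population, [2.5, 97.5])

# Small-sample estimate: mean +/- 2 SD after Box-Cox transformation,
# back-transformed to the original scale.
small = rng.choice(population, size=27, replace=False)
transformed, lam = stats.boxcox(small)
m, s = transformed.mean(), transformed.std(ddof=1)

def inverse_boxcox(y, lam):
    # invert y = (x**lam - 1) / lam; the lam == 0 case is the log transform
    return np.exp(y) if lam == 0 else (lam * y + 1) ** (1 / lam)

lo_small, hi_small = inverse_boxcox(m - 2 * s, lam), inverse_boxcox(m + 2 * s, lam)
print((lo_ref, hi_ref), (lo_small, hi_small))
```

Repeating the subsampling step many times reproduces the abstract's key observation: small-sample limits scatter widely around the full-sample interval.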

  7. Accelerator mass spectrometry of ultra-small samples with applications in the biosciences

    International Nuclear Information System (INIS)

    Salehpour, Mehran; Håkansson, Karl; Possnert, Göran

    2013-01-01

    An overview is presented covering the biological accelerator mass spectrometry activities at Uppsala University. The research utilizes the Uppsala University Tandem laboratory facilities, including a 5 MV Pelletron tandem accelerator and two stable isotope ratio mass spectrometers. In addition, a dedicated sample preparation laboratory for biological samples with natural activity is in use, as well as another laboratory specifically for 14 C-labeled samples. A variety of ongoing projects are described and presented. Examples are: (1) Ultra-small sample AMS. We routinely analyze samples with masses in the 5–10 μg C range. Data is presented regarding the sample preparation method, (2) bomb peak biological dating of ultra-small samples. A long term project is presented where purified and cell-specific DNA from various part of the human body including the heart and the brain are analyzed with the aim of extracting regeneration rate of the various human cells, (3) biological dating of various human biopsies, including atherosclerosis related plaques is presented. The average built up time of the surgically removed human carotid plaques have been measured and correlated to various data including the level of insulin in the human blood, and (4) In addition to standard microdosing type measurements using small pharmaceutical drugs, pre-clinical pharmacokinetic data from a macromolecular drug candidate are discussed.

  8. Accelerator mass spectrometry of ultra-small samples with applications in the biosciences

    Energy Technology Data Exchange (ETDEWEB)

    Salehpour, Mehran, E-mail: mehran.salehpour@physics.uu.se [Department of Physics and Astronomy, Ion Physics, PO Box 516, SE-751 20 Uppsala (Sweden); Hakansson, Karl; Possnert, Goeran [Department of Physics and Astronomy, Ion Physics, PO Box 516, SE-751 20 Uppsala (Sweden)

    2013-01-15

An overview is presented covering the biological accelerator mass spectrometry activities at Uppsala University. The research utilizes the Uppsala University Tandem laboratory facilities, including a 5 MV Pelletron tandem accelerator and two stable isotope ratio mass spectrometers. In addition, a dedicated sample preparation laboratory for biological samples with natural activity is in use, as well as another laboratory specifically for 14C-labeled samples. A variety of ongoing projects are described and presented. Examples are: (1) Ultra-small sample AMS. We routinely analyze samples with masses in the 5–10 μg C range. Data is presented regarding the sample preparation method, (2) bomb peak biological dating of ultra-small samples. A long term project is presented where purified and cell-specific DNA from various part of the human body including the heart and the brain are analyzed with the aim of extracting regeneration rate of the various human cells, (3) biological dating of various human biopsies, including atherosclerosis related plaques is presented. The average built up time of the surgically removed human carotid plaques have been measured and correlated to various data including the level of insulin in the human blood, and (4) In addition to standard microdosing type measurements using small pharmaceutical drugs, pre-clinical pharmacokinetic data from a macromolecular drug candidate are discussed.

  9. Maximum type 1 error rate inflation in multiarmed clinical trials with adaptive interim sample size modifications.

    Science.gov (United States)

    Graf, Alexandra C; Bauer, Peter; Glimm, Ekkehard; Koenig, Franz

    2014-07-01

Sample size modifications in the interim analyses of an adaptive design can inflate the type 1 error rate if test statistics and critical boundaries are used in the final analysis as if no modification had been made. While this is already true for designs with an overall change of the sample size in a balanced treatment-control comparison, the inflation can be much larger if a modification of allocation ratios is allowed as well. In this paper, we investigate adaptive designs with several treatment arms compared to a single common control group. Regarding modifications, we consider treatment arm selection as well as modifications of overall sample size and allocation ratios. The inflation is quantified for two approaches: a naive procedure that ignores not only all modifications, but also the multiplicity issue arising from the many-to-one comparison, and a Dunnett procedure that ignores modifications, but adjusts for the initially started multiple treatments. The maximum inflation of the type 1 error rate for such types of design can be calculated by searching for the "worst case" scenarios, that is, the sample size adaptation rules in the interim analysis that lead to the largest conditional type 1 error rate at any point of the sample space. To show the most extreme inflation, we initially assume unconstrained second stage sample size modifications, leading to a large inflation of the type 1 error rate. Furthermore, we investigate the inflation when putting constraints on the second stage sample sizes. It turns out that, for example, fixing the sample size of the control group leads to designs controlling the type 1 error rate. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
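The "worst case" search described above (at every interim outcome, pick the second-stage sample size that maximizes the conditional type 1 error of the naive pooled test) can be sketched numerically for the simplest one-sample case. The nominal one-sided level of 0.025 and the search grids are assumptions for illustration:

```python
import numpy as np
from scipy.stats import norm

alpha = 0.025                          # nominal one-sided level (assumption)
z_crit = norm.ppf(1 - alpha)           # naive final critical value, ~1.96
ratios = np.linspace(0.01, 50, 2000)   # candidate n2/n1 ratios to search

def conditional_error(z1, r):
    # P(naive pooled test rejects | interim statistic z1, ratio r = n2/n1)
    # under H0, when the final analysis pretends n2 was fixed in advance
    return 1 - norm.cdf((z_crit * np.sqrt(1 + r) - z1) / np.sqrt(r))

# Worst case: at every interim outcome z1, pick the second-stage size that
# maximizes the conditional type 1 error, then average over the H0
# distribution of z1.
z1_grid = np.linspace(-5, 5, 2001)
worst = np.array([conditional_error(z1, ratios).max() for z1 in z1_grid])
dz = z1_grid[1] - z1_grid[0]
alpha_max = float(np.sum(worst * norm.pdf(z1_grid)) * dz)
print(round(alpha_max, 3))  # well above the nominal 0.025
```

Restricting the ratio grid (for example, disallowing very small second stages) shrinks the integral back toward the nominal level, loosely analogous to the paper's constrained designs.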

  10. A simple nomogram for sample size for estimating sensitivity and specificity of medical tests

    Directory of Open Access Journals (Sweden)

    Malhotra Rajeev

    2010-01-01

    Full Text Available Sensitivity and specificity measure inherent validity of a diagnostic test against a gold standard. Researchers develop new diagnostic methods to reduce the cost, risk, invasiveness, and time. Adequate sample size is a must to precisely estimate the validity of a diagnostic test. In practice, researchers generally decide about the sample size arbitrarily either at their convenience, or from the previous literature. We have devised a simple nomogram that yields statistically valid sample size for anticipated sensitivity or anticipated specificity. MS Excel version 2007 was used to derive the values required to plot the nomogram using varying absolute precision, known prevalence of disease, and 95% confidence level using the formula already available in the literature. The nomogram plot was obtained by suitably arranging the lines and distances to conform to this formula. This nomogram could be easily used to determine the sample size for estimating the sensitivity or specificity of a diagnostic test with required precision and 95% confidence level. Sample size at 90% and 99% confidence level, respectively, can also be obtained by just multiplying 0.70 and 1.75 with the number obtained for the 95% confidence level. A nomogram instantly provides the required number of subjects by just moving the ruler and can be repeatedly used without redoing the calculations. This can also be applied for reverse calculations. This nomogram is not applicable for testing of the hypothesis set-up and is applicable only when both diagnostic test and gold standard results have a dichotomous category.
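The "formula already available in the literature" for such calculations is the standard one for estimating a proportion, scaled by the fraction of subjects who are diseased (for sensitivity) or disease-free (for specificity). A sketch, with illustrative inputs:

```python
from math import ceil

def n_for_sensitivity(se, precision, prevalence, z=1.96):
    # Total subjects needed so that the anticipated sensitivity `se` is
    # estimated within +/- `precision` at the confidence level implied
    # by `z` (1.96 for 95%), given the disease prevalence.
    return ceil(z * z * se * (1 - se) / (precision ** 2 * prevalence))

def n_for_specificity(sp, precision, prevalence, z=1.96):
    # Same formula; the denominator uses the disease-free fraction.
    return ceil(z * z * sp * (1 - sp) / (precision ** 2 * (1 - prevalence)))

# Anticipated sensitivity 0.90, absolute precision 0.05, prevalence 0.20:
print(n_for_sensitivity(0.90, 0.05, 0.20))  # -> 692
```

The abstract's shortcut multipliers follow directly from the z values: (1.645/1.96)^2 ≈ 0.70 for 90% confidence and (2.576/1.96)^2 ≈ 1.75 for 99%.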

  11. Relationship marketing of small to medium sized textile retailers in the Northwest Province / Fred Angels Amulike Musika

    OpenAIRE

    Musika, Fred Angels Amulike

    2002-01-01

    This study concentrated on the concept of relationship marketing and its implementation by small and medium sized textile retailers in the Northwest province of South Africa. This study addressed the gap between the relationship marketing theory and its implementation by small and medium sized textile retailers in the textile industry of the Northwest province. Researchers in relationship marketing have started to realise that there is a definite need for detailed empirical ...

  12. Personnel role by implementing anti-crisis programs in small and medium-size enterprises

    Directory of Open Access Journals (Sweden)

    Pak Andrey Borisovitch

    2011-08-01

Full Text Available To recover from a crisis, small and medium-size enterprises have to develop anti-crisis plans and programs and to monitor their implementation. Implementation efficiency is ensured by the personnel, who need appraisal, motivation and development.

  13. Norm Block Sample Sizes: A Review of 17 Individually Administered Intelligence Tests

    Science.gov (United States)

    Norfolk, Philip A.; Farmer, Ryan L.; Floyd, Randy G.; Woods, Isaac L.; Hawkins, Haley K.; Irby, Sarah M.

    2015-01-01

    The representativeness, recency, and size of norm samples strongly influence the accuracy of inferences drawn from their scores. Inadequate norm samples may lead to inflated or deflated scores for individuals and poorer prediction of developmental and academic outcomes. The purpose of this study was to apply Kranzler and Floyd's method for…

  14. Precision of quantization of the hall conductivity in a finite-size sample: Power law

    International Nuclear Information System (INIS)

    Greshnov, A. A.; Kolesnikova, E. N.; Zegrya, G. G.

    2006-01-01

    A microscopic calculation of the conductivity in the integer quantum Hall effect (IQHE) mode is carried out, and the precision of quantization is analyzed for finite-size samples. The precision of quantization shows a power-law dependence on the sample size, and a new scaling parameter describing this dependence is introduced. It is also demonstrated that the precision of quantization depends linearly on the ratio between the amplitude of the disorder potential and the cyclotron energy. The data obtained are compared with the results of magnetotransport measurements in mesoscopic samples.
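    A power-law dependence of this kind is conventionally extracted as the slope of a straight-line fit in log-log coordinates. A minimal sketch with hypothetical numbers, constructed here to follow delta ∝ L⁻² exactly (the real exponent is whatever the calculation or measurement gives):

```python
import math

# Hypothetical (sample size L, quantization deviation) pairs,
# constructed to follow delta = 10 * L**(-2) exactly
sizes = [50, 100, 200, 400, 800]
deviations = [10 * L**-2 for L in sizes]

# Ordinary least squares on (log L, log delta): the slope is -gamma
xs = [math.log(L) for L in sizes]
ys = [math.log(d) for d in deviations]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar)**2 for x in xs)
gamma = -slope
print(round(gamma, 6))  # 2.0
```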

  15. 78 FR 66950 - Trade Barriers That U.S. Small and Medium-Sized Enterprises Perceive as Affecting Exports to the...

    Science.gov (United States)

    2013-11-07

    INTERNATIONAL TRADE COMMISSION [Investigation No. 332-541] Trade Barriers That U.S. Small and Medium-Sized Enterprises Perceive as Affecting Exports to the European Union; Rescheduling of Washington... report that catalogs trade barriers that U.S. small and medium-sized enterprises (SMEs) perceive as...

  16. Effect of model choice and sample size on statistical tolerance limits

    International Nuclear Information System (INIS)

    Duran, B.S.; Campbell, K.

    1980-03-01

    Statistical tolerance limits are estimates of large (or small) quantiles of a distribution, quantities which are very sensitive to the shape of the tail of the distribution. The exact nature of this tail behavior cannot be ascertained from small samples, so statistical tolerance limits are frequently computed using a statistical model chosen on the basis of theoretical considerations or prior experience with similar populations. This report illustrates the effects of such choices on the computations.
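    For a normal model, a one-sided upper tolerance limit takes the form x̄ + k·s, where the factor k depends on sample size, coverage, and confidence. A sketch using Natrella's classical approximation for k (an assumption for illustration, not the report's own method):

```python
from math import sqrt
from statistics import NormalDist

def one_sided_tolerance_factor(n, coverage=0.95, confidence=0.95):
    """Approximate factor k such that xbar + k*s exceeds the `coverage`
    quantile with probability `confidence` (Natrella's approximation)."""
    zp = NormalDist().inv_cdf(coverage)
    zg = NormalDist().inv_cdf(confidence)
    a = 1 - zg**2 / (2 * (n - 1))
    b = zp**2 - zg**2 / n
    return (zp + sqrt(zp**2 - a * b)) / a

# Small samples force a much larger factor: the tail is poorly pinned down
print(one_sided_tolerance_factor(10), one_sided_tolerance_factor(100))
```

    The steep growth of k at small n is exactly the sensitivity to tail behavior that the report discusses: with few observations, the limit must be pushed far out to retain the stated confidence.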

  17. Sample size for monitoring sirex populations and their natural enemies

    Directory of Open Access Journals (Sweden)

    Susete do Rocio Chiarello Penteado

    2016-09-01

    Full Text Available The woodwasp Sirex noctilio Fabricius (Hymenoptera: Siricidae) was introduced into Brazil in 1988 and became the main pest of pine plantations. It has spread over about 1,000,000 ha, at different population levels, in the states of Rio Grande do Sul, Santa Catarina, Paraná, São Paulo and Minas Gerais. Control relies mainly on a nematode, Deladenus siricidicola Bedding (Nematoda: Neotylenchidae). Evaluating the efficiency of these natural enemies has been difficult because no appropriate sampling systems exist. This study tested a hierarchical sampling system to define the sample size needed to monitor the S. noctilio population and the efficiency of its natural enemies, and found the system to be fully adequate.

  18. Organisational culture as a part in the development of open innovation - the perspective of small and medium-sized enterprises

    Directory of Open Access Journals (Sweden)

    Szymańska Katarzyna

    2016-05-01

    Full Text Available The ability to introduce new concepts and business models is nowadays a prerequisite for creating a competitive advantage, and is closely linked to the ability of enterprises to create, implement and disseminate a variety of innovative solutions. Today the use of open innovation is a necessity, not only for large organisations but also for small and medium-sized enterprises. To implement open innovation, small and medium-sized enterprises need to manage their own growth effectively by preparing appropriate strategies and developing a model that encompasses all changes, taking into account the factors related to the growth dynamics of this sector. An appropriate organisational culture plays an important role in the implementation of innovation in this sector: there are many indications that cultural mismatch and misunderstanding are the main reasons for the low level of implementation of innovation by small and medium-sized enterprises. The aim of the paper is to outline the impact of organisational culture on the development of the concept of open innovation in the sector of small and medium-sized enterprises.

  19. Size of nuclear sources from measurements of proton-proton correlations at small relative momentum

    International Nuclear Information System (INIS)

    Rebreyend, D.; Kox, S.; Merchez, F.; Noren, B.; Perrin, C.; Khelfaoui, B.; Gondrand, J.C.; Bondorf, J.P.

    1990-01-01

    This contribution presents recent measurements performed on light heavy-ion reactions at intermediate energies. Nuclear source sizes were determined by measuring the correlation at small relative momentum between two protons detected in the EMRIC set-up. This technique determines the extent of the emitting source by constructing a correlation function for the coincident protons and analyzing it in the framework of a final-state interaction model. We found the apparent source size to be large compared with the dimensions of the studied system, and the extracted radii to be only weakly sensitive to the target mass and detection angle. We will show that simulations may be needed to fully account for the correlations induced by detectors with a small angular acceptance.
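    The correlation-function construction itself is simple to sketch: histogram the relative momentum of proton pairs from the same event, divide by a reference histogram built from pairs drawn from different events (event mixing), and normalize. A 1-D toy version with hypothetical, uncorrelated momenta, for which the ratio should sit near 1:

```python
import random
from collections import Counter

random.seed(1)

# Hypothetical events: four proton momenta each (1-D toy, arbitrary units)
events = [[random.gauss(0, 50) for _ in range(4)] for _ in range(200)]

def qbin(p1, p2, width=10):
    """Histogram bin of the relative momentum |p1 - p2|."""
    return int(abs(p1 - p2) // width)

same = Counter()   # pairs from the same event: carry the correlation
mixed = Counter()  # pairs from different events: uncorrelated reference
for ev in events:
    for i in range(len(ev)):
        for j in range(i + 1, len(ev)):
            same[qbin(ev[i], ev[j])] += 1
for ev1, ev2 in zip(events, events[1:]):
    for p1 in ev1:
        for p2 in ev2:
            mixed[qbin(p1, p2)] += 1

# Correlation function: normalized ratio of same-event to mixed-event pairs
norm = sum(mixed.values()) / sum(same.values())
C = {q: norm * same[q] / mixed[q] for q in same if mixed[q]}
```

    In the real analysis, C(q) for identical protons develops a characteristic structure at small q from the final-state interaction and quantum statistics, and the source radius is extracted by fitting the model prediction to that shape.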

  20. Nano-Scale Sample Acquisition Systems for Small Class Exploration Spacecraft

    Science.gov (United States)

    Paulsen, G.

    2015-12-01

    The paradigm for space exploration is changing. Large and expensive missions are rare, and the space community is turning to smaller, lighter and less expensive missions that can still accomplish substantial exploration. Such missions are also within reach of commercial companies, such as the Google Lunar X Prize teams developing small-scale lunar missions. Recent commercial ventures such as Planet Labs, Inc. and Skybox Imaging, Inc. show that there are new benefits and business models associated with the miniaturization of space hardware. The Nano-Scale Sample Acquisition System includes the NanoDrill for capturing small rock cores and PlanetVac for capturing surface regolith. These two systems are part of an ongoing effort to develop "Micro Sampling" systems for deployment by small spacecraft with limited payload capacities; ideal applications include prospecting missions to the Moon and asteroids. The MicroDrill is a rotary-percussive coring drill that captures cores 7 mm in diameter and up to 2 cm long. The drill weighs less than 1 kg and can capture a core from a rock of 40 MPa strength within a few minutes, using less than 10 W of power and less than 10 N of preload. The PlanetVac is a pneumatic regolith acquisition system that can capture a surface sample in a touch-and-go maneuver. Both sampling systems were integrated into the footpads of a commercial quadcopter for testing; as such, they could also be used by geologists on Earth to sample difficult-to-reach locations.