WorldWideScience

Sample records for samples show small

  1. Test of a sample container for shipment of small size plutonium samples with PAT-2

    International Nuclear Information System (INIS)

    Kuhn, E.; Aigner, H.; Deron, S.

    1981-11-01

    A light-weight container for the air transport of plutonium, to be designated PAT-2, has been developed in the USA and is presently undergoing licensing. The very limited effective space for bearing plutonium required the design of small size sample canisters to meet the needs of international safeguards for the shipment of plutonium samples. The applicability of a small canister for the sampling of small size powder and solution samples has been tested in an intralaboratory experiment. The results of the experiment, based on the concept of pre-weighed samples, show that the tested canister can successfully be used for the sampling of small size PuO2-powder samples of homogeneous source material, as well as for dried aliquands of plutonium nitrate solutions. (author)

  2. Estimation for small domains in double sampling for stratification ...

    African Journals Online (AJOL)

    In this article, we investigate the effect of randomness of the size of a small domain on the precision of an estimator of mean for the domain under double sampling for stratification. The result shows that for a small domain that cuts across various strata with unknown weights, the sampling variance depends on the within ...
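
    As a companion to this record, the following is a minimal numerical sketch of the classical (Cochran-style) double-sampling-for-stratification mean estimator with estimated stratum weights from a large first-phase sample. The toy population, the two-stratum setup, and all parameter values are illustrative assumptions, not taken from the article.

    ```python
    # Double sampling for stratification: phase 1 observes only the (cheap) stratum
    # label on a large sample; phase 2 subsamples each phase-1 stratum and observes y.
    # Estimator: sum_h w_hat_h * ybar_h, with w_hat_h the phase-1 stratum proportions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy population: stratum label and study variable y (illustrative values)
    N = 10_000
    stratum = rng.integers(0, 2, size=N)
    y = np.where(stratum == 0, rng.normal(10, 2, N), rng.normal(20, 4, N))

    def double_sampling_mean(n_phase1=500, n_phase2_per_stratum=30):
        # Phase 1: large simple random sample, stratum label only
        idx1 = rng.choice(N, size=n_phase1, replace=False)
        s1 = stratum[idx1]
        w_hat = np.array([(s1 == h).mean() for h in (0, 1)])   # estimated stratum weights

        # Phase 2: subsample within each phase-1 stratum and observe y
        means = []
        for h in (0, 1):
            pool = idx1[s1 == h]
            take = min(n_phase2_per_stratum, len(pool))
            idx2 = rng.choice(pool, size=take, replace=False)
            means.append(y[idx2].mean())
        return float(np.dot(w_hat, means))

    est = [double_sampling_mean() for _ in range(2000)]
    print("population mean  :", y.mean())
    print("mean of estimates:", np.mean(est), " sampling s.d.:", np.std(est))
    ```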

  3. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
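
    The sketch below is a small Monte Carlo in the spirit of this abstract: it compares the family-wise error rate of Bonferroni-adjusted one-sample t-tests at a nominal 5% level when the data are normal versus heavily skewed, with many tests and a small n per test. The sample size, number of tests, and distributions are arbitrary illustrative choices, not the settings used by the authors.

    ```python
    # Skewed sampling distributions plus many tests stress the extreme tails of the
    # test statistic, which is exactly where small-sample tail approximations fail.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def fwer(draw, n=10, m=2000, runs=500, alpha=0.05):
        """Fraction of runs in which at least one Bonferroni-adjusted one-sample
        t-test rejects, even though every null hypothesis (mean 0) is true."""
        hits = 0
        for _ in range(runs):
            x = draw(size=(m, n))                      # m independent tests, n obs each
            p = stats.ttest_1samp(x, 0.0, axis=1).pvalue
            hits += (p.min() * m) < alpha              # Bonferroni on the smallest p
        return hits / runs

    normal = lambda size: rng.normal(size=size)
    skewed = lambda size: rng.exponential(size=size) - 1.0   # mean 0, heavily skewed

    print("nominal FWER: 0.05")
    print("normal data :", fwer(normal))
    print("skewed data :", fwer(skewed))
    ```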

  4. Small-sample-worth perturbation methods

    International Nuclear Information System (INIS)

    1985-01-01

    It has been assumed that the perturbed region, R/sub p/, is large enough so that: (1) even without a great deal of biasing there is a substantial probability that an average source-neutron will enter it; and (2) once having entered, the neutron is likely to make several collisions in R/sub p/ during its lifetime. Unfortunately neither assumption is valid for the typical configurations one encounters in small-sample-worth experiments. In such experiments one measures the reactivity change which is induced when a very small void in a critical assembly is filled with a sample of some test-material. Only a minute fraction of the fission-source neutrons ever gets into the sample and, of those neutrons that do, most emerge uncollided. Monte Carlo small-sample perturbation computations are described.

  5. Standard Deviation for Small Samples

    Science.gov (United States)

    Joarder, Anwar H.; Latif, Raja M.

    2006-01-01

    Neater representations for variance are given for small sample sizes, especially for 3 and 4. With these representations, variance can be calculated without a calculator if sample sizes are small and observations are integers, and an upper bound for the standard deviation is immediate. Accessible proofs of lower and upper bounds are presented for…
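
    The article's exact representations are not reproduced in this record; the sketch below checks a closely related, well-known identity in the same spirit: the sample variance equals the sum of squared pairwise differences divided by n(n-1), which for n = 3 integer observations can be evaluated by hand. The example values are arbitrary.

    ```python
    # For n = 3: s^2 = ((x1-x2)^2 + (x1-x3)^2 + (x2-x3)^2) / 6, no mean needed.
    from itertools import combinations
    import numpy as np

    def variance_from_pairs(x):
        n = len(x)
        return sum((a - b) ** 2 for a, b in combinations(x, 2)) / (n * (n - 1))

    x = [4, 7, 9]                          # small integer sample, n = 3
    print(variance_from_pairs(x))          # ((4-7)^2 + (4-9)^2 + (7-9)^2) / 6 = 38/6
    print(np.var(x, ddof=1))               # agrees with the usual definition
    ```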

  6. Small sample GEE estimation of regression parameters for longitudinal data.

    Science.gov (United States)

    Paul, Sudhir; Zhang, Xuemao

    2014-09-28

    Longitudinal (clustered) response data arise in many biostatistical applications and, in general, cannot be assumed to be independent. Generalized estimating equation (GEE) is a widely used method to estimate marginal regression parameters for correlated responses. The advantage of the GEE is that the estimates of the regression parameters are asymptotically unbiased even if the correlation structure is misspecified, although their small sample properties are not known. In this paper, two bias-adjusted GEE estimators of the regression parameters in longitudinal data are obtained when the number of subjects is small. One is based on a bias correction, and the other is based on a bias reduction. Simulations show that the performances of both the bias-corrected methods are similar in terms of bias, efficiency, coverage probability, average coverage length, impact of misspecification of correlation structure, and impact of cluster size on bias correction. Both these methods show superior properties over the GEE estimates for small samples. Further, analysis of data involving a small number of subjects also shows improvement in bias, MSE, standard error, and length of the confidence interval of the estimates by the two bias-adjusted methods over the GEE estimates. For small to moderate sample sizes (N ≤ 50), either of the bias-corrected methods GEEBc and GEEBr can be used. However, the method GEEBc should be preferred over GEEBr, as the former is computationally easier. For large sample sizes, the GEE method can be used. Copyright © 2014 John Wiley & Sons, Ltd.
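
    The following is a minimal numpy sketch, not the paper's GEEBc/GEEBr estimators: for a Gaussian marginal model with identity link and an independence working correlation, the GEE coefficient estimate reduces to OLS with a cluster-robust ("sandwich") variance, and the simulation illustrates the small-sample problem the paper targets, namely that with few clusters the uncorrected sandwich standard error is too small on average. All simulation settings are assumed for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def one_dataset(n_clusters=10, m=5, beta=(1.0, 0.5), rho=0.5):
        X, y, cid = [], [], []
        for c in range(n_clusters):
            x = rng.normal(size=m)
            u = rng.normal()                                  # shared cluster effect
            eps = np.sqrt(rho) * u + np.sqrt(1 - rho) * rng.normal(size=m)
            X.append(np.column_stack([np.ones(m), x]))
            y.append(beta[0] + beta[1] * x + eps)
            cid.append(np.full(m, c))
        return np.vstack(X), np.concatenate(y), np.concatenate(cid)

    def ols_cluster_se(X, y, cid):
        XtX_inv = np.linalg.inv(X.T @ X)
        b = XtX_inv @ X.T @ y
        resid = y - X @ b
        meat = np.zeros((X.shape[1], X.shape[1]))
        for c in np.unique(cid):
            Xc, rc = X[cid == c], resid[cid == c]
            s = Xc.T @ rc
            meat += np.outer(s, s)
        V = XtX_inv @ meat @ XtX_inv                          # uncorrected sandwich
        return b[1], np.sqrt(V[1, 1])

    slopes, ses = [], []
    for _ in range(2000):
        b1, se = ols_cluster_se(*one_dataset())
        slopes.append(b1)
        ses.append(se)

    print("empirical s.d. of slope estimate:", np.std(slopes))
    print("average reported sandwich s.e. :", np.mean(ses))   # typically smaller
    ```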

  7. Maybe Small Is Too Small a Term: Introduction to Advancing Small Sample Prevention Science.

    Science.gov (United States)

    Fok, Carlotta Ching Ting; Henry, David; Allen, James

    2015-10-01

    Prevention research addressing health disparities often involves work with small population groups experiencing such disparities. The goals of this special section are to (1) address the question of what constitutes a small sample; (2) identify some of the key research design and analytic issues that arise in prevention research with small samples; (3) develop applied, problem-oriented, and methodologically innovative solutions to these design and analytic issues; and (4) evaluate the potential role of these innovative solutions in describing phenomena, testing theory, and evaluating interventions in prevention research. Through these efforts, we hope to promote broader application of these methodological innovations. We also seek, whenever possible, to explore their implications in more general problems that appear in research with small samples but concern all areas of prevention research. This special section includes two parts. The first aims to provide input for researchers at the design phase, while the second focuses on analysis. Each article describes an innovative solution to one or more challenges posed by the analysis of small samples, with special emphasis on testing for intervention effects in prevention research. A concluding article summarizes some of their broader implications, along with conclusions regarding future directions in research with small samples in prevention science. Finally, a commentary provides the perspective of the federal agencies that sponsored the conference that gave rise to this special section.

  8. An Improvement to Interval Estimation for Small Samples

    Directory of Open Access Journals (Sweden)

    SUN Hui-Ling

    2017-02-01

    Because it is difficult and complex to determine the probability distribution of small samples, traditional probability theory is not well suited to parameter estimation for small samples, and the Bayes Bootstrap method is commonly used in practice instead. The Bayes Bootstrap method, however, has its own limitations. In this article an improvement to the Bayes Bootstrap method is presented: the sample is extended by numerical simulation without changing the circumstances of the original small sample, and the new method gives accurate interval estimates for small samples. Finally, Monte Carlo simulation is applied to specific small-sample problems, and the effectiveness and practicability of the improved Bootstrap method are demonstrated.
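
    For orientation, the sketch below implements the classical Bayesian bootstrap (Rubin, 1981) interval for a mean from a small sample; the article's specific improvement, which extends the sample by numerical simulation, is not reproduced here, and the example data are invented.

    ```python
    # Each Bayesian bootstrap draw re-weights the observed values with
    # Dirichlet(1, ..., 1) weights; the percentile interval of the weighted means
    # serves as the interval estimate.
    import numpy as np

    rng = np.random.default_rng(3)

    def bayesian_bootstrap_interval(x, n_draws=10_000, level=0.95):
        x = np.asarray(x, dtype=float)
        w = rng.dirichlet(np.ones(len(x)), size=n_draws)
        means = w @ x
        lo, hi = np.quantile(means, [(1 - level) / 2, 1 - (1 - level) / 2])
        return lo, hi

    sample = [9.8, 10.4, 10.1, 9.6, 10.9, 10.2]   # a small sample of 6 observations
    print(bayesian_bootstrap_interval(sample))
    ```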

  9. Exploratory Factor Analysis With Small Samples and Missing Data.

    Science.gov (United States)

    McNeish, Daniel

    2017-01-01

    Exploratory factor analysis (EFA) is an extremely popular method for determining the underlying factor structure for a set of variables. Due to its exploratory nature, EFA is notorious for being conducted with small sample sizes, and recent reviews of psychological research have reported that between 40% and 60% of applied studies have 200 or fewer observations. Recent methodological studies have addressed small sample size requirements for EFA models; however, these models have only considered complete data, which are the exception rather than the rule in psychology. Furthermore, the extant literature on missing data techniques with small samples is scant, and nearly all existing studies focus on topics that are not of primary interest to EFA models. Therefore, this article presents a simulation to assess the performance of various missing data techniques for EFA models with both small samples and missing data. Results show that deletion methods do not extract the proper number of factors and estimate the factor loadings with severe bias, even when data are missing completely at random. Predictive mean matching is the best method overall when considering extracting the correct number of factors and estimating factor loadings without bias, although 2-stage estimation was a close second.

  10. Small sample whole-genome amplification

    Science.gov (United States)

    Hara, Christine; Nguyen, Christine; Wheeler, Elizabeth; Sorensen, Karen; Arroyo, Erin; Vrankovich, Greg; Christian, Allen

    2005-11-01

    Many challenges arise when trying to amplify and analyze human samples collected in the field due to limitations in sample quantity, and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large volume reactions; in cases where insufficient sample is present, whole genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step to WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples with the use of carrier DNA while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or can be used in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  11. Gaseous radiocarbon measurements of small samples

    International Nuclear Information System (INIS)

    Ruff, M.; Szidat, S.; Gaeggeler, H.W.; Suter, M.; Synal, H.-A.; Wacker, L.

    2010-01-01

    Radiocarbon dating by means of accelerator mass spectrometry (AMS) is a well-established method for samples containing carbon in the milligram range. However, the measurement of small samples containing less than 50 μg carbon often fails. It is difficult to graphitise these samples and the preparation is prone to contamination. To avoid graphitisation, a solution can be the direct measurement of carbon dioxide. The MICADAS, the smallest accelerator for radiocarbon dating in Zurich, is equipped with a hybrid Cs sputter ion source. It allows the measurement of both graphite targets and gaseous CO2 samples without any rebuilding. This work presents experience in dealing with small samples containing 1-40 μg carbon. 500 unknown samples from different environmental research fields have been measured so far. Most of the samples were measured with the gas ion source. These data are compared with earlier measurements of small graphite samples. The performance of the two different techniques is discussed and the main contributions to the blank are determined. An analysis of blank and standard data measured over several years allowed a quantification of the contamination, which was found to be of the order of 55 ng and 750 ng carbon (50 pMC) for the gaseous and the graphite samples, respectively. For quality control, a number of certified standards were measured using the gas ion source to demonstrate the reliability of the data.

  12. Local heterogeneity effects on small-sample worths

    International Nuclear Information System (INIS)

    Schaefer, R.W.

    1986-01-01

    One of the parameters usually measured in a fast reactor critical assembly is the reactivity associated with inserting a small sample of a material into the core (sample worth). Local heterogeneities introduced by the worth measurement techniques can have a significant effect on the sample worth. Unfortunately, the capability is lacking to model some of the heterogeneity effects associated with the experimental technique traditionally used at ANL (the radial tube technique). It has been suggested that these effects could account for a large portion of what remains of the longstanding central worth discrepancy. The purpose of this paper is to describe a large body of experimental data - most of which has never been reported - that shows the effect of radial tube-related local heterogeneities

  13. Accurate EPR radiosensitivity calibration using small sample masses

    Science.gov (United States)

    Hayes, R. B.; Haskell, E. H.; Barrus, J. K.; Kenner, G. H.; Romanyukha, A. A.

    2000-03-01

    We demonstrate a procedure in retrospective EPR dosimetry which allows for virtually nondestructive sample evaluation in terms of sample irradiations. For this procedure to work, it is shown that corrections must be made for cavity response characteristics when using variable mass samples. Likewise, methods are employed to correct for empty tube signals, sample anisotropy and frequency drift while considering the effects of dose distribution optimization. A demonstration of the method's utility is given by comparing sample portions evaluated using both the described methodology and standard full sample additive dose techniques. The samples used in this study are tooth enamel from teeth removed during routine dental care. We show that by making all the recommended corrections, very small masses can be both accurately measured and correlated with measurements of other samples. Some issues relating to dose distribution optimization are also addressed.

  14. Accurate EPR radiosensitivity calibration using small sample masses

    International Nuclear Information System (INIS)

    Hayes, R.B.; Haskell, E.H.; Barrus, J.K.; Kenner, G.H.; Romanyukha, A.A.

    2000-01-01

    We demonstrate a procedure in retrospective EPR dosimetry which allows for virtually nondestructive sample evaluation in terms of sample irradiations. For this procedure to work, it is shown that corrections must be made for cavity response characteristics when using variable mass samples. Likewise, methods are employed to correct for empty tube signals, sample anisotropy and frequency drift while considering the effects of dose distribution optimization. A demonstration of the method's utility is given by comparing sample portions evaluated using both the described methodology and standard full sample additive dose techniques. The samples used in this study are tooth enamel from teeth removed during routine dental care. We show that by making all the recommended corrections, very small masses can be both accurately measured and correlated with measurements of other samples. Some issues relating to dose distribution optimization are also addressed

  15. A scanning tunneling microscope capable of imaging specified micron-scale small samples

    Science.gov (United States)

    Tao, Wei; Cao, Yufei; Wang, Huafeng; Wang, Kaiyou; Lu, Qingyou

    2012-12-01

    We present a home-built scanning tunneling microscope (STM) which allows us to precisely position the tip on any specified small sample or sample feature of micron scale. The core structure is a stand-alone soft junction mechanical loop (SJML), in which a small piezoelectric tube scanner is mounted on a sliding piece and a "U"-like soft spring strip has one end fixed to the sliding piece and its opposite end holding the tip pointing to the sample on the scanner. Here, the tip can be precisely aligned to a specified small sample of micron scale by adjusting the position of the spring-clamped sample on the scanner in the field of view of an optical microscope. The aligned SJML can be transferred to a piezoelectric inertial motor for coarse approach, during which the U-spring is pushed towards the sample, causing the tip to approach the pre-aligned small sample. We have successfully approached a hand-cut tip, made from 0.1 mm thin Pt/Ir wire, to an isolated individual 32.5 × 32.5 μm² graphite flake. Good atomic resolution images and high quality tunneling current spectra for that specified tiny flake are obtained in ambient conditions with high repeatability within one month, showing high and long-term stability of the new STM structure. In addition, frequency spectra of the tunneling current signals do not show an outstanding tip-mount-related resonant frequency (low frequency), which further confirms the stability of the STM structure.

  16. A scanning tunneling microscope capable of imaging specified micron-scale small samples.

    Science.gov (United States)

    Tao, Wei; Cao, Yufei; Wang, Huafeng; Wang, Kaiyou; Lu, Qingyou

    2012-12-01

    We present a home-built scanning tunneling microscope (STM) which allows us to precisely position the tip on any specified small sample or sample feature of micron scale. The core structure is a stand-alone soft junction mechanical loop (SJML), in which a small piezoelectric tube scanner is mounted on a sliding piece and a "U"-like soft spring strip has one end fixed to the sliding piece and its opposite end holding the tip pointing to the sample on the scanner. Here, the tip can be precisely aligned to a specified small sample of micron scale by adjusting the position of the spring-clamped sample on the scanner in the field of view of an optical microscope. The aligned SJML can be transferred to a piezoelectric inertial motor for coarse approach, during which the U-spring is pushed towards the sample, causing the tip to approach the pre-aligned small sample. We have successfully approached a hand-cut tip, made from 0.1 mm thin Pt/Ir wire, to an isolated individual 32.5 × 32.5 μm² graphite flake. Good atomic resolution images and high quality tunneling current spectra for that specified tiny flake are obtained in ambient conditions with high repeatability within one month, showing high and long-term stability of the new STM structure. In addition, frequency spectra of the tunneling current signals do not show an outstanding tip-mount-related resonant frequency (low frequency), which further confirms the stability of the STM structure.

  17. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  18. Method to make accurate concentration and isotopic measurements for small gas samples

    Science.gov (United States)

    Palmer, M. R.; Wahl, E.; Cunningham, K. L.

    2013-12-01

    Carbon isotopic ratio measurements of CO2 and CH4 provide valuable insight into carbon cycle processes. However, many of these studies, like soil gas, soil flux, and water head space experiments, provide very small gas sample volumes, too small for direct measurement by current constant-flow Cavity Ring-Down (CRDS) isotopic analyzers. Previously, we addressed this issue by developing a sample introduction module which enabled the isotopic ratio measurement of 40 ml samples or smaller. However, the system, called the Small Sample Isotope Module (SSIM), does dilute the sample during the delivery with inert carrier gas, which causes a ~5% reduction in concentration. The isotopic ratio measurements are not affected by this small dilution, but researchers are naturally interested in accurate concentration measurements. We present the accuracy and precision of a new method of using this delivery module which we call 'double injection.' Two portions of the 40 ml sample (20 ml each) are introduced to the analyzer; the first injection flushes out the diluting gas and the second injection is measured. The accuracy of this new method is demonstrated by comparing the concentration and isotopic ratio measurements for a gas sampled directly and that same gas measured through the SSIM. The data show that the CO2 concentration measurements were the same within instrument precision. The isotopic ratio precision (1σ) of repeated measurements was 0.16 permil for CO2 and 1.15 permil for CH4 at ambient concentrations. This new method provides a significant enhancement in the information provided by small samples.

  19. A multi-dimensional sampling method for locating small scatterers

    International Nuclear Information System (INIS)

    Song, Rencheng; Zhong, Yu; Chen, Xudong

    2012-01-01

    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built with the most stable part of signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages compared to the conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multi-scatterers. Numerical simulations are presented to show the good performance of the proposed method. (paper)
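
    For context, the sketch below implements plain classical MUSIC imaging of point scatterers from a Born-approximation multi-static response matrix; it is not the paper's multi-dimensional sampling method (MDSM) with combinatorial sampling nodes. The array geometry, wavenumber, and scatterer positions are arbitrary illustrative assumptions.

    ```python
    import numpy as np
    from scipy.special import hankel1

    k = 2 * np.pi                                    # wavenumber (wavelength = 1)
    def G(a, b):
        """2-D free-space Green's function between array points a and a point b."""
        return 0.25j * hankel1(0, k * np.linalg.norm(a - b, axis=-1))

    # Coincident transmit/receive line array and two small (point) scatterers
    array_pts = np.column_stack([np.linspace(-2, 2, 21), np.full(21, 5.0)])
    scatterers = np.array([[-0.6, 0.0], [0.9, 0.3]])

    # Born-approximation multi-static response matrix: K = sum_j g(x_j) g(x_j)^T
    g_sc = np.array([G(array_pts, s) for s in scatterers])     # shape (2, 21)
    K = g_sc.T @ g_sc

    # Noise subspace from the SVD of K (the signal space has rank = #scatterers)
    U, s, _ = np.linalg.svd(K)
    U_noise = U[:, len(scatterers):]

    # MUSIC pseudo-spectrum: large where the steering vector g(x) is (numerically)
    # orthogonal to the noise subspace, i.e. at the scatterer positions
    xs = ys = np.linspace(-2, 2, 81)
    image = np.zeros((len(ys), len(xs)))
    for i, yv in enumerate(ys):
        for j, xv in enumerate(xs):
            gvec = G(array_pts, np.array([xv, yv]))
            image[i, j] = 1.0 / (np.linalg.norm(U_noise.conj().T @ gvec) + 1e-12)

    peak = np.unravel_index(image.argmax(), image.shape)
    print("strongest peak near:", (xs[peak[1]], ys[peak[0]]))
    ```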

  20. Overestimation of test performance by ROC analysis: Effect of small sample size

    International Nuclear Information System (INIS)

    Seeley, G.W.; Borgstrom, M.C.; Patton, D.D.; Myers, K.J.; Barrett, H.H.

    1984-01-01

    New imaging systems are often observer-rated by ROC techniques. For practical reasons the number of different images, or sample size (SS), is kept small. Any systematic bias due to small SS would bias system evaluation. The authors set about to determine whether the area under the ROC curve (AUC) would be systematically biased by small SS. Monte Carlo techniques were used to simulate observer performance in distinguishing signal (SN) from noise (N) on a 6-point scale; P(SN) = P(N) = .5. Four sample sizes (15, 25, 50 and 100 each of SN and N), three ROC slopes (0.8, 1.0 and 1.25), and three intercepts (0.8, 1.0 and 1.25) were considered. In each of the 36 combinations of SS, slope and intercept, 2000 runs were simulated. Results showed a systematic bias: the observed AUC exceeded the expected AUC in every one of the 36 combinations for all sample sizes, with the smallest sample sizes having the largest bias. This suggests that evaluations of imaging systems using ROC curves based on small sample size systematically overestimate system performance. The effect is consistent but subtle (maximum 10% of AUC standard deviation), and is probably masked by the s.d. in most practical settings. Although there is a statistically significant effect (F = 33.34, P<0.0001) due to sample size, none was found for either the ROC curve slope or intercept. Overestimation of test performance by small SS seems to be an inherent characteristic of the ROC technique that has not previously been described

  1. Cerebral Small Vessel Disease: Cognition, Mood, Daily Functioning, and Imaging Findings from a Small Pilot Sample

    Directory of Open Access Journals (Sweden)

    John G. Baker

    2012-04-01

    Cerebral small vessel disease, a leading cause of cognitive decline, is considered a relatively homogeneous disease process, and it can co-occur with Alzheimer’s disease. Clinical reports of magnetic resonance imaging (MRI)/computed tomography and single photon emission computed tomography (SPECT) imaging and neuropsychological testing for a small pilot sample of 14 patients are presented to illustrate disease characteristics through findings from structural and functional imaging and cognitive assessment. Participants showed some decreases in executive functioning, attention, processing speed, and memory retrieval, consistent with previous literature. An older subgroup showed lower age-corrected scores at a single time point compared to younger participants. Performance on a computer-administered cognitive measure showed a slight overall decline over a period of 8–28 months. For a case study with mild neuropsychological findings, the MRI report was normal while the SPECT report identified perfusion abnormalities. Future research can test whether advances in imaging analysis allow for identification of cerebral small vessel disease before changes are detected in cognition.

  2. Sensitivity study of micro four-point probe measurements on small samples

    DEFF Research Database (Denmark)

    Wang, Fei; Petersen, Dirch Hjorth; Hansen, Torben Mikael

    2010-01-01

    probes than near the outer ones. The sensitive area is defined for infinite film, circular, square, and rectangular test pads, and convergent sensitivities are observed for small samples. The simulations show that the Hall sheet resistance RH in micro Hall measurements with position error suppression...

  3. Development of electric discharge equipment for small specimen sampling

    International Nuclear Information System (INIS)

    Okamoto, Koji; Kitagawa, Hideaki; Kusumoto, Junichi; Kanaya, Akihiro; Kobayashi, Toshimi

    2009-01-01

    We have developed on-site electric discharge sampling equipment that can effectively take samples such as small specimens from the surface portion of plant components. Compared with conventional sampling equipment, our sampling equipment can take samples that are thinner in depth and larger in area. In addition, the effect on the component can be kept to a minimum, and the thermally affected zone of the material due to electric discharge is small enough to be ignored. Therefore, our equipment is well suited to taking samples for various tests such as residual life evaluation.

  4. Accelerator mass spectrometry of small biological samples.

    Science.gov (United States)

    Salehpour, Mehran; Forsgard, Niklas; Possnert, Göran

    2008-12-01

    Accelerator mass spectrometry (AMS) is an ultra-sensitive technique for isotopic ratio measurements. In the biomedical field, AMS can be used to measure femtomolar concentrations of labeled drugs in body fluids, with direct applications in early drug development such as Microdosing. Likewise, the regenerative properties of cells, which are of fundamental significance in stem-cell research, can be determined with an accuracy of a few years by AMS analysis of human DNA. However, AMS nominally requires about 1 mg of carbon per sample, which is not always available when dealing with specific body substances such as localized, organ-specific DNA samples. Consequently, it is of analytical interest to develop methods for the routine analysis of small samples in the range of a few tens of μg. We have used a 5 MV Pelletron tandem accelerator to study small biological samples using AMS. Different methods are presented and compared. A ¹²C-carrier sample preparation method is described which is potentially more sensitive and less susceptible to contamination than the standard procedures.

  5. Comparing interval estimates for small sample ordinal CFA models.

    Science.gov (United States)

    Natesan, Prathiba

    2015-01-01

    Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis (CFA) models for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors was common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more often positively biased than negatively biased, that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the nature of statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading.
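
    The sketch below is a generic illustration of the kind of bookkeeping the study describes, applied to a small-sample correlation rather than an ordinal CFA model: for each simulated data set it records not only whether a 95% interval covers the true value, but also the direction in which non-covering intervals miss. The sample size, true correlation, and interval constructions are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    rho, n, z = 0.6, 20, 1.96
    cov = np.array([[1.0, rho], [rho, 1.0]])

    def intervals(r):
        # naive Wald interval from the first-order (delta-method) standard error,
        # and the Fisher-z interval back-transformed with tanh
        se = (1 - r**2) / np.sqrt(n)
        wald = (r - z * se, r + z * se)
        zr = np.arctanh(r)
        fisher = tuple(np.tanh([zr - z / np.sqrt(n - 3), zr + z / np.sqrt(n - 3)]))
        return {"wald": wald, "fisher": fisher}

    tally = {m: {"cover": 0, "miss_low": 0, "miss_high": 0} for m in ("wald", "fisher")}
    for _ in range(20_000):
        x = rng.multivariate_normal([0, 0], cov, size=n)
        r = np.corrcoef(x[:, 0], x[:, 1])[0, 1]
        for m, (lo, hi) in intervals(r).items():
            if lo <= rho <= hi:
                tally[m]["cover"] += 1
            elif hi < rho:
                tally[m]["miss_low"] += 1      # whole interval below the true value
            else:
                tally[m]["miss_high"] += 1     # whole interval above the true value

    for m, t in tally.items():
        print(m, {k: v / 20_000 for k, v in t.items()})
    ```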

  6. Can recurrence networks show small-world property?

    International Nuclear Information System (INIS)

    Jacob, Rinku; Harikrishnan, K.P.; Misra, R.; Ambika, G.

    2016-01-01

    Recurrence networks are complex networks, constructed from time series data, having several practical applications. Though their properties when constructed with the threshold value ϵ chosen at or just above the percolation threshold of the network are quite well understood, what happens as the threshold increases beyond the usual operational window is still not clear from a complex network perspective. The present Letter is focused mainly on the network properties at intermediate-to-large values of the recurrence threshold, for which no systematic study has been performed so far. We argue, with numerical support, that recurrence networks constructed from chaotic attractors with ϵ equal to the usual recurrence threshold or slightly above cannot, in general, show small-world property. However, if the threshold is further increased, the recurrence network topology initially changes to a small-world structure and finally to that of a classical random graph as the threshold approaches the size of the strange attractor. - Highlights: • Properties of recurrence networks at intermediate-to-large values of recurrence threshold are analyzed from a complex network perspective. • Using a combined plot of characteristic path length and clustering coefficient, it is shown that the recurrence network constructed with recurrence threshold equal to or just above the percolation threshold cannot, in general, display small-world property. • As the recurrence threshold is increased from its usual operational window, the resulting network makes a smooth transition initially to a small-world network for an intermediate range of thresholds and finally to the classical random graph as the threshold becomes comparable to the size of the attractor.
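
    As a toy illustration of the objects discussed in this record, the sketch below builds a recurrence network from a chaotic time series (the logistic map) for several recurrence thresholds and computes the characteristic path length and clustering coefficient with networkx; the embedding, series length, and threshold values are arbitrary choices, not the Letter's settings.

    ```python
    import numpy as np
    import networkx as nx

    # Logistic map time series (r = 4), transients discarded
    x = np.empty(800)
    x[0] = 0.4
    for i in range(1, len(x)):
        x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
    x = x[500:]                                   # keep 300 points

    # Two-dimensional time-delay embedding and pairwise distances
    emb = np.column_stack([x[:-1], x[1:]])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)

    for eps in (0.05, 0.2, 0.5, 0.9):
        A = ((dist < eps) & ~np.eye(len(emb), dtype=bool)).astype(int)
        g = nx.from_numpy_array(A)                # recurrence network
        giant = g.subgraph(max(nx.connected_components(g), key=len))
        L = nx.average_shortest_path_length(giant)
        C = nx.average_clustering(giant)
        print(f"eps={eps:4.2f}  path length={L:6.2f}  clustering={C:5.2f}")
    ```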

  7. Can recurrence networks show small-world property?

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Rinku, E-mail: rinku.jacob.vallanat@gmail.com [Department of Physics, The Cochin College, Cochin, 682002 (India); Harikrishnan, K.P., E-mail: kp_hk2002@yahoo.co.in [Department of Physics, The Cochin College, Cochin, 682002 (India); Misra, R., E-mail: rmisra@iucaa.in [Inter University Centre for Astronomy and Astrophysics, Pune, 411007 (India); Ambika, G., E-mail: g.ambika@iiserpune.ac.in [Indian Institute of Science Education and Research, Pune, 411008 (India)

    2016-08-12

    Recurrence networks are complex networks, constructed from time series data, having several practical applications. Though their properties when constructed with the threshold value ϵ chosen at or just above the percolation threshold of the network are quite well understood, what happens as the threshold increases beyond the usual operational window is still not clear from a complex network perspective. The present Letter is focused mainly on the network properties at intermediate-to-large values of the recurrence threshold, for which no systematic study has been performed so far. We argue, with numerical support, that recurrence networks constructed from chaotic attractors with ϵ equal to the usual recurrence threshold or slightly above cannot, in general, show small-world property. However, if the threshold is further increased, the recurrence network topology initially changes to a small-world structure and finally to that of a classical random graph as the threshold approaches the size of the strange attractor. - Highlights: • Properties of recurrence networks at intermediate-to-large values of recurrence threshold are analyzed from a complex network perspective. • Using a combined plot of characteristic path length and clustering coefficient, it is shown that the recurrence network constructed with recurrence threshold equal to or just above the percolation threshold cannot, in general, display small-world property. • As the recurrence threshold is increased from its usual operational window, the resulting network makes a smooth transition initially to a small-world network for an intermediate range of thresholds and finally to the classical random graph as the threshold becomes comparable to the size of the attractor.

  8. On the Structure of Cortical Microcircuits Inferred from Small Sample Sizes.

    Science.gov (United States)

    Vegué, Marina; Perin, Rodrigo; Roxin, Alex

    2017-08-30

    The structure in cortical microcircuits deviates from what would be expected in a purely random network, which has been seen as evidence of clustering. To address this issue, we sought to reproduce the nonrandom features of cortical circuits by considering several distinct classes of network topology, including clustered networks, networks with distance-dependent connectivity, and those with broad degree distributions. To our surprise, we found that all of these qualitatively distinct topologies could account equally well for all reported nonrandom features despite being easily distinguishable from one another at the network level. This apparent paradox was a consequence of estimating network properties given only small sample sizes. In other words, networks that differ markedly in their global structure can look quite similar locally. This makes inferring network structure from small sample sizes, a necessity given the technical difficulty inherent in simultaneous intracellular recordings, problematic. We found that a network statistic called the sample degree correlation (SDC) overcomes this difficulty. The SDC depends only on parameters that can be estimated reliably given small sample sizes and is an accurate fingerprint of every topological family. We applied the SDC criterion to data from rat visual and somatosensory cortex and discovered that the connectivity was not consistent with any of these main topological classes. However, we were able to fit the experimental data with a more general network class, of which all previous topologies were special cases. The resulting network topology could be interpreted as a combination of physical spatial dependence and nonspatial, hierarchical clustering. SIGNIFICANCE STATEMENT The connectivity of cortical microcircuits exhibits features that are inconsistent with a simple random network. Here, we show that several classes of network models can account for this nonrandom structure despite qualitative differences in

  9. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  10. Nano-Scale Sample Acquisition Systems for Small Class Exploration Spacecraft

    Science.gov (United States)

    Paulsen, G.

    2015-12-01

    The paradigm for space exploration is changing. Large and expensive missions are very rare and the space community is turning to smaller, lighter, and less expensive missions that could still perform great exploration. These missions are also within reach of commercial companies such as the Google Lunar X Prize teams that develop small scale lunar missions. Recent commercial endeavors such as "Planet Labs Inc." and "Sky Box Imaging, Inc." show that there are new benefits and business models associated with miniaturization of space hardware. The Nano-Scale Sample Acquisition System includes NanoDrill for capture of small rock cores and PlanetVac for capture of surface regolith. These two systems are part of the ongoing effort to develop "Micro Sampling" systems for deployment by small spacecraft with limited payload capacities. The ideal applications include prospecting missions to the Moon and Asteroids. The MicroDrill is a rotary-percussive coring drill that captures cores 7 mm in diameter and up to 2 cm long. The drill weighs less than 1 kg and can capture a core from a 40 MPa strength rock within a few minutes, with less than 10 Watt power and less than 10 Newton of preload. The PlanetVac is a pneumatic based regolith acquisition system that can capture surface samples in a touch-and-go maneuver. These sampling systems were integrated within the footpads of a commercial quadcopter for testing. As such, they could also be used by geologists on Earth to explore difficult to get to locations.

  11. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small volume samples. Pulsed-dc-ESI utilizes a constant high voltage to remotely induce the generation of single-polarity pulsed electrospray. This method significantly boosts sample economy, so that several minutes of MS signal duration can be obtained from a sample of merely picoliter volume. The elongated MS signal duration enables the collection of abundant MS2 information on components of interest in a small volume sample for systematic analysis. This method has been successfully applied to single cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS2 data) from single plant and mammalian cells, concerning 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small volume samples.

  12. Suitability of small diagnostic peripheral-blood samples for cell-therapy studies.

    Science.gov (United States)

    Stephanou, Coralea; Papasavva, Panayiota; Zachariou, Myria; Patsali, Petros; Epitropou, Marilena; Ladas, Petros; Al-Abdulla, Ruba; Christou, Soteroulla; Antoniou, Michael N; Lederer, Carsten W; Kleanthous, Marina

    2017-02-01

    Primary hematopoietic stem and progenitor cells (HSPCs) are key components of cell-based therapies for blood disorders and are thus the authentic substrate for related research. We propose that ubiquitous small-volume diagnostic samples represent a readily available and as yet untapped resource of primary patient-derived cells for cell- and gene-therapy studies. In the present study we compare isolation and storage methods for HSPCs from normal and thalassemic small-volume blood samples, considering genotype, density-gradient versus lysis-based cell isolation and cryostorage media with different serum contents. Downstream analyses include viability, recovery, differentiation in semi-solid media and performance in liquid cultures and viral transductions. We demonstrate that HSPCs isolated either by ammonium-chloride potassium (ACK)-based lysis or by gradient isolation are suitable for functional analyses in clonogenic assays, high-level HSPC expansion and efficient lentiviral transduction. For cryostorage of cells, gradient isolation is superior to ACK lysis, and cryostorage in freezing media containing 50% fetal bovine serum demonstrated good results across all tested criteria. For assays on freshly isolated cells, ACK lysis performed similar to, and for thalassemic samples better than, gradient isolation, at a fraction of the cost and hands-on time. All isolation and storage methods show considerable variation within sample groups, but this is particularly acute for density gradient isolation of thalassemic samples. This study demonstrates the suitability of small-volume blood samples for storage and preclinical studies, opening up the research field of HSPC and gene therapy to any blood diagnostic laboratory with corresponding bioethics approval for experimental use of surplus material. Copyright © 2017 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  13. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    Science.gov (United States)

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Some typical materials in sample preparation, including silica, polymer, carbon, boric acid and so on, are introduced in this paper. Meanwhile, the applications and developments of analytical methods of polar small molecules, such as reversed-phase liquid chromatography, hydrophilic interaction chromatography, etc., are also reviewed.

  14. Integrating sphere based reflectance measurements for small-area semiconductor samples

    Science.gov (United States)

    Saylan, S.; Howells, C. T.; Dahlem, M. S.

    2018-05-01

    This article describes a method that enables reflectance spectroscopy of small semiconductor samples using an integrating sphere, without the use of additional optical elements. We employed an inexpensive sample holder to measure the reflectance of different samples through 2-, 3-, and 4.5-mm-diameter apertures and applied a mathematical formulation to remove the bias from the measured spectra caused by illumination of the holder. Using the proposed method, the reflectance of samples fabricated using expensive or rare materials and/or low-throughput processes can be measured. It can also be used to infer the internal quantum efficiency of small-area, research-level solar cells. Moreover, small samples that reflect light at large angles or scatter light may also be measured reliably, since the integrating sphere is insensitive to the directionality of the reflected light.
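
    The abstract does not give the authors' mathematical formulation, so the sketch below assumes, purely for illustration, a simple area-weighted linear mixing model in which the measured signal is f·R_sample + (1 − f)·R_holder, with f the fraction of the illuminated spot falling on the sample aperture; the function name and all numbers are hypothetical and may differ from the paper's treatment.

    ```python
    import numpy as np

    def correct_reflectance(R_measured, R_holder, f):
        """Invert the assumed linear mixing model to recover the sample reflectance."""
        R_measured, R_holder = np.asarray(R_measured), np.asarray(R_holder)
        return (R_measured - (1.0 - f) * R_holder) / f

    # Hypothetical spectra at three wavelengths and an aperture covering 60 % of
    # the illuminated area (both assumptions, not measured values)
    R_meas   = [0.42, 0.38, 0.35]
    R_holder = [0.80, 0.80, 0.80]
    print(correct_reflectance(R_meas, R_holder, f=0.60))
    ```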

  15. Decision Support on Small size Passive Samples

    Directory of Open Access Journals (Sweden)

    Vladimir Popukaylo

    2018-05-01

    A technique for constructing adequate mathematical models for small passive samples, under conditions where classical probabilistic-statistical methods do not allow valid conclusions to be obtained, was developed.

  16. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, the algebra graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive the necessary and sufficient conditions guaranteeing heterogeneous multi-agent systems to asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)

  17. Soybean yield modeling using bootstrap methods for small samples

    Energy Technology Data Exchange (ETDEWEB)

    Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.

    2016-11-01

    One of the problems that occur when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not require assuming or knowing the probability distribution that generated the original sample. In this work we used a set of soybean yield data and physical and chemical soil properties formed from a small number of samples to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points and for determination of confidence intervals of the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties which were significant in the construction of the soybean yield regression model, construct the confidence intervals of the parameters and identify the points that had great influence on the estimated parameters. (Author)
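
    The sketch below shows a minimal case-resampling (non-parametric) bootstrap for the coefficients of a small-sample multiple linear regression, in the spirit of this study; the soybean data and the authors' variable selection and influence diagnostics are not reproduced, and the synthetic data set is an assumption for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Small illustrative data set: yield as a function of two soil properties
    n = 25
    X = np.column_stack([np.ones(n), rng.normal(6.0, 0.5, n), rng.normal(20, 5, n)])
    beta_true = np.array([1.0, 0.8, 0.05])
    y = X @ beta_true + rng.normal(0, 0.3, n)

    def ols(Xm, ym):
        return np.linalg.lstsq(Xm, ym, rcond=None)[0]

    # Case resampling: draw rows (y_i, x_i) with replacement and refit
    B = 5000
    boot = np.empty((B, X.shape[1]))
    for b in range(B):
        idx = rng.integers(0, n, size=n)
        boot[b] = ols(X[idx], y[idx])

    ci = np.percentile(boot, [2.5, 97.5], axis=0)
    fit = ols(X, y)
    for j, name in enumerate(["intercept", "soil property 1", "soil property 2"]):
        print(f"{name:16s} estimate={fit[j]:6.3f}  95% CI=({ci[0, j]:.3f}, {ci[1, j]:.3f})")
    ```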

  18. Small Sample Properties of the Wilcoxon Signed Rank Test with Discontinuous and Dependent Observations

    OpenAIRE

    Nadine Chlass; Jens J. Krueger

    2007-01-01

    This Monte-Carlo study investigates sensitivity of the Wilcoxon signed rank test to certain assumption violations in small samples. Emphasis is put on within-sample-dependence, between-sample dependence, and the presence of ties. Our results show that both assumption violations induce severe size distortions and entail power losses. Surprisingly, these consequences do vary substantially with other properties the data may display. Results provided are particularly relevant for experimental set...
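
    The sketch below is a small illustrative Monte Carlo along the lines of this working paper: it estimates the empirical size of the Wilcoxon signed rank test at a nominal 5% level when the paired differences are i.i.d., serially dependent within the sample (AR(1)), or rounded so that ties occur. The sample size, dependence parameter, and rounding scheme are assumptions, not the paper's settings.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)

    def rejection_rate(make_diffs, n=15, runs=5000, alpha=0.05):
        rejects, valid = 0, 0
        for _ in range(runs):
            d = make_diffs(n)
            if np.count_nonzero(d) < 3:        # guard against degenerate samples
                continue
            valid += 1
            rejects += stats.wilcoxon(d).pvalue < alpha
        return rejects / valid

    def iid(n):
        return rng.normal(size=n)

    def ar1(n, phi=0.6):                       # within-sample dependence
        d = np.empty(n)
        d[0] = rng.normal()
        for t in range(1, n):
            d[t] = phi * d[t - 1] + rng.normal()
        return d

    def tied(n):
        return np.round(rng.normal(size=n))    # coarse rounding creates ties

    for name, gen in [("i.i.d.", iid), ("AR(1) dependent", ar1), ("with ties", tied)]:
        print(f"{name:16s} empirical size: {rejection_rate(gen):.3f}")
    ```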

  19. Payload specialist Reinhard Furrer shows evidence of previous blood sampling

    Science.gov (United States)

    1985-01-01

    Payload specialist Reinhard Furrer shows evidence of previous blood sampling while Wubbo J. Ockels, Dutch payload specialist (only partially visible), extends his right arm after a sample has been taken. Both men show bruises on their arms.

  20. Multi-element analysis of small biological samples

    International Nuclear Information System (INIS)

    Rokita, E.; Cafmeyer, J.; Maenhaut, W.

    1983-01-01

    A method combining PIXE and INAA was developed to determine the elemental composition of small biological samples. The method needs virtually no sample preparation and less than 1 mg is sufficient for the analysis. The method was used for determining up to 18 elements in leaves taken from Cracow Herbaceous. The factors which influence the elemental composition of leaves and the possible use of leaves as an environmental pollution indicator are discussed

  1. Mechanical characteristics of historic mortars from tests on small-sample non-standard specimens

    Czech Academy of Sciences Publication Activity Database

    Drdácký, Miloš; Slížková, Zuzana

    2008-01-01

    Roč. 17, č. 1 (2008), s. 20-29 ISSN 1407-7353 R&D Projects: GA ČR(CZ) GA103/06/1609 Institutional research plan: CEZ:AV0Z20710524 Keywords : small-sample non-standard testing * lime * historic mortar Subject RIV: AL - Art, Architecture, Cultural Heritage

  2. Comparative Analysis of Clinical Samples Showing Weak Serum Reaction on AutoVue System Causing ABO Blood Typing Discrepancies.

    Science.gov (United States)

    Jo, Su Yeon; Lee, Ju Mi; Kim, Hye Lim; Sin, Kyeong Hwa; Lee, Hyeon Ji; Chang, Chulhun Ludgerus; Kim, Hyung Hoi

    2017-03-01

    ABO blood typing in pre-transfusion testing is a major component of the high workload in blood banks and therefore requires automation. We often experienced discrepant results from an automated system, especially weak serum reactions. We evaluated the discrepant results by the reference manual method to confirm ABO blood typing. In total, 13,113 blood samples were tested with the AutoVue system; all samples were run in parallel with the reference manual method according to the laboratory protocol. The AutoVue system confirmed ABO blood typing of 12,816 samples (97.7%), and these results were concordant with those of the manual method. The remaining 297 samples (2.3%) showed discrepant results in the AutoVue system and were confirmed by the manual method. The discrepant results involved weak serum reactions (≤1+ reaction grade), samples from patients who had received stem cell transplants, ABO subgroups, and specific system error messages. Among the 98 samples showing a ≤1+ reaction grade in the AutoVue system, 70 samples (71.4%) showed a normal serum reaction (≥2+ reaction grade) with the manual method, and 28 samples (28.6%) showed a weak serum reaction in both methods. ABO blood typing of 97.7% of samples could be confirmed by the AutoVue system, and a small proportion (2.3%) needed to be re-evaluated by the manual method. Samples with a 2+ reaction grade in serum typing do not need to be evaluated manually, while those with a ≤1+ reaction grade do.

  3. Testing of Small Graphite Samples for Nuclear Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Julie Chapman

    2010-11-01

    Accurately determining the mechanical properties of small irradiated samples is crucial to predicting the behavior of the overall irradiated graphite components within a Very High Temperature Reactor. The sample size allowed in a material test reactor, however, is limited, and this poses some difficulties with respect to mechanical testing. In the case of graphite with a larger grain size, a small sample may exhibit characteristics not representative of the bulk material, leading to inaccuracies in the data. A study to determine a potential size effect on the tensile strength was pursued under the Next Generation Nuclear Plant program. It focuses first on optimizing the tensile testing procedure identified in the American Society for Testing and Materials (ASTM) Standard C 781-08. Once the testing procedure was verified, a size effect was assessed by gradually reducing the diameter of the specimens. By monitoring the material response, a size effect was successfully identified.

  4. The use of commercially available PC-interface cards for elemental mapping in small samples using XRF

    International Nuclear Information System (INIS)

    Abu Bakar bin Ghazali; Hoyes Garnet

    1991-01-01

    This paper demonstrates the use of ADC and reed relay cards to scan a small sample and acquire X-ray fluorescence data. The result shows the distribution of an element, such as zinc, in the sample by means of colours signifying the concentration.

  5. Identification of multiple mRNA and DNA sequences from small tissue samples isolated by laser-assisted microdissection.

    Science.gov (United States)

    Bernsen, M R; Dijkman, H B; de Vries, E; Figdor, C G; Ruiter, D J; Adema, G J; van Muijen, G N

    1998-10-01

    Molecular analysis of small tissue samples has become increasingly important in biomedical studies. Using a laser dissection microscope and modified nucleic acid isolation protocols, we demonstrate that multiple mRNA as well as DNA sequences can be identified from a single-cell sample. In addition, we show that the specificity of procurement of tissue samples is not compromised by smear contamination resulting from scraping of the microtome knife during sectioning of lesions. The procedures described herein thus allow for efficient RT-PCR or PCR analysis of multiple nucleic acid sequences from small tissue samples obtained by laser-assisted microdissection.

  6. A Geology Sampling System for Small Bodies

    Science.gov (United States)

    Naids, Adam J.; Hood, Anthony D.; Abell, Paul; Graff, Trevor; Buffington, Jesse

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a small body. Currently, the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in a microgravity environment pose unique challenges. In preparation for such missions, a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  7. Small Mammal Sampling in Mortandad and Los Alamos Canyons, 2005

    International Nuclear Information System (INIS)

    Kathy Bennett; Sherri Sherwood; Rhonda Robinson

    2006-01-01

    As part of an ongoing ecological field investigation at Los Alamos National Laboratory, a study was conducted that compared measured contaminant concentrations in sediment to population parameters for small mammals in the Mortandad Canyon watershed. Mortandad Canyon and its tributary canyons have received contaminants from multiple solid waste management units and areas of concern since establishment of the Laboratory in the 1940s. The study included three reaches within Effluent and Mortandad canyons (E-1W, M-2W, and M-3) that had a spread in the concentrations of metals and radionuclides and included locations where polychlorinated biphenyls and perchlorate had been detected. A reference location, reach LA-BKG in upper Los Alamos Canyon, was also included in the study for comparison purposes. A small mammal study was initiated to assess whether potential adverse effects were evident in Mortandad Canyon due to the presence of contaminants, designated as contaminants of potential ecological concern, in the terrestrial media. Study sites, including the reference site, were sampled in late July/early August. Species diversity and the mean daily capture rate were the highest for E-1W reach and the lowest for the reference site. Species composition among the three reaches in Mortandad was similar with very little overlap with the reference canyon. Differences in species composition and diversity were most likely due to differences in habitat. Sex ratios, body weights, and reproductive status of small mammals were also evaluated. However, small sample sizes of some species within some sites affected the analysis. Ratios of males to females by species of each site (n = 5) were tested using a Chi-square analysis. No differences were detected. Where there was sufficient sample size, body weights of adult small mammals were compared between sites. No differences in body weights were found. Reproductive status of species appears to be similar across sites. However, sample

  8. Small Mammal Sampling in Mortandad and Los Alamos Canyons, 2005

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Kathy; Sherwood, Sherri; Robinson, Rhonda

    2006-08-15

    As part of an ongoing ecological field investigation at Los Alamos National Laboratory, a study was conducted that compared measured contaminant concentrations in sediment to population parameters for small mammals in the Mortandad Canyon watershed. Mortandad Canyon and its tributary canyons have received contaminants from multiple solid waste management units and areas of concern since establishment of the Laboratory in the 1940s. The study included three reaches within Effluent and Mortandad canyons (E-1W, M-2W, and M-3) that had a spread in the concentrations of metals and radionuclides and included locations where polychlorinated biphenyls and perchlorate had been detected. A reference location, reach LA-BKG in upper Los Alamos Canyon, was also included in the study for comparison purposes. A small mammal study was initiated to assess whether potential adverse effects were evident in Mortandad Canyon due to the presence of contaminants, designated as contaminants of potential ecological concern, in the terrestrial media. Study sites, including the reference site, were sampled in late July/early August. Species diversity and the mean daily capture rate were the highest for E-1W reach and the lowest for the reference site. Species composition among the three reaches in Mortandad was similar with very little overlap with the reference canyon. Differences in species composition and diversity were most likely due to differences in habitat. Sex ratios, body weights, and reproductive status of small mammals were also evaluated. However, small sample sizes of some species within some sites affected the analysis. Ratios of males to females by species of each site (n = 5) were tested using a Chi-square analysis. No differences were detected. Where there was sufficient sample size, body weights of adult small mammals were compared between sites. No differences in body weights were found. Reproductive status of species appears to be similar across sites. However, sample

  9. Use of the small gas proportional counters for the carbon-14 measurement of very small samples

    International Nuclear Information System (INIS)

    Sayre, E.V.; Harbottle, G.; Stoenner, R.W.; Otlet, R.L.; Evans, G.V.

    1981-01-01

    Two recent developments are described: the first is the mass-spectrometric separation of ¹⁴C and ¹²C ions, followed by counting of the ¹⁴C, while the second is the extension of conventional proportional counter operation, using CO₂ as counting gas, to very small counters and samples. Although the second method is slow (months of counting time are required for 10 mg of carbon), it does not require operator intervention and many samples may be counted simultaneously. Also, it costs only a fraction of the capital expense of an accelerator installation. The development, construction and operation of suitable small counters are described, and results of three actual dating studies involving milligram-scale carbon samples will be given. None of these could have been carried out if conventional, gram-sized samples had been needed. New installations, based on the use of these counters, are under construction or in the planning stages. These are located at Brookhaven Laboratory, the National Bureau of Standards (USA) and Harwell (UK). The Harwell installation, which is in advanced stages of construction, will be described in outline. The main significance of the small-counter method is that, although it will not suffice to measure the smallest (much less than 10 mg) or oldest samples, it will permit existing radiocarbon laboratories to extend their capability considerably, in the direction of smaller samples, at modest expense

  10. Quantifying predictability through information theory: small sample estimation in a non-Gaussian framework

    International Nuclear Information System (INIS)

    Haven, Kyle; Majda, Andrew; Abramov, Rafail

    2005-01-01

    Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short term climate and weather prediction, examples of these issues might involve the lack of information in the historical climate record compared with an ensemble prediction, or the lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify the predictive utility in this information, and recently a systematic computationally feasible hierarchical framework has been developed. In practical systems with many degrees of freedom, computational overhead limits ensemble predictions to relatively small sample sizes. Here the notion of predictive utility, in a relative entropy framework, is extended to small random samples by the definition of a sample utility, a measure of the unlikeliness that a random sample was produced by a given prediction strategy. The sample utility is the minimum predictability, with a statistical level of confidence, which is implied by the data. Two practical algorithms for measuring such a sample utility are developed here. The first technique is based on the statistical method of null-hypothesis testing, while the second is based upon a central limit theorem for the relative entropy of moment-based probability densities. These techniques are tested on known probability densities with parameterized bimodality and skewness, and then applied to the Lorenz '96 model, a recently developed 'toy' climate model with chaotic dynamics mimicking the atmosphere. The results show a detection of non-Gaussian tendencies of prediction densities at small ensemble sizes with between 50 and 100 members, with a 95% confidence level
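
    The moment-based flavour of this idea can be illustrated with a short sketch (Python, not from the paper): estimate the relative entropy between a Gaussian prediction and the sample's first two moments, then compare it against the null distribution of that statistic obtained by repeatedly drawing equally small samples from the prediction itself. The function names and the Gaussian restriction are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def gaussian_relative_entropy(mu_p, var_p, mu_q, var_q):
    """KL divergence D(p||q) between two 1-D Gaussians given their moments."""
    return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def sample_utility_pvalue(sample, mu_pred, var_pred, n_null=5000, rng=None):
    """Hypothetical helper: fraction of same-size samples drawn from the
    prediction whose moment-based relative entropy exceeds the observed one.
    A small p-value suggests the sample is unlikely under the prediction."""
    rng = np.random.default_rng(rng)
    n = len(sample)
    d_obs = gaussian_relative_entropy(np.mean(sample), np.var(sample, ddof=1),
                                      mu_pred, var_pred)
    d_null = np.empty(n_null)
    for i in range(n_null):
        s = rng.normal(mu_pred, np.sqrt(var_pred), size=n)
        d_null[i] = gaussian_relative_entropy(np.mean(s), np.var(s, ddof=1),
                                              mu_pred, var_pred)
    return d_obs, np.mean(d_null >= d_obs)

# Example: a 50-member "ensemble" that is shifted relative to the prediction.
obs = np.random.default_rng(1).normal(0.5, 1.2, size=50)
d, p = sample_utility_pvalue(obs, mu_pred=0.0, var_pred=1.0)
print(f"moment-based relative entropy = {d:.3f}, null-test p-value = {p:.3f}")
```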

  11. Microdochium nivale and Microdochium majus in seed samples of Danish small grain cereals

    DEFF Research Database (Denmark)

    Nielsen, L. K.; Justesen, A. F.; Jensen, J. D.

    2013-01-01

    Microdochium nivale and Microdochium majus are two of the fungal species found in the Fusarium Head Blight (FHB) complex infecting small grain cereals. Quantitative real-time PCR assays were designed to separate the two Microdochium species based on the translation elongation factor 1a gene (TEF-1a......) and used to analyse a total of 374 seed samples of wheat, barley, triticale, rye and oat sampled from farmers’ fields across Denmark from 2003 to 2007. Both fungal species were detected in the five cereal species but M. majus showed a higher prevalence compared to M. nivale in most years in all cereal...... species except rye, in which M. nivale represented a larger proportion of the biomass and was more prevalent than M. majus in some samples. Historical samples of wheat and barley from 1957 to 2000 similarly showed a strong prevalence of M. majus over M. nivale indicating that M. majus has been the main...

  12. Method for Measuring Thermal Conductivity of Small Samples Having Very Low Thermal Conductivity

    Science.gov (United States)

    Miller, Robert A.; Kuczmarski, Maria A.

    2009-01-01

    This paper describes the development of a hot plate method capable of using air as a standard reference material for the steady-state measurement of the thermal conductivity of very small test samples having thermal conductivity on the order of air. As with other approaches, care is taken to ensure that the heat flow through the test sample is essentially one-dimensional. However, unlike other approaches, no attempt is made to use heated guards to block the flow of heat from the hot plate to the surroundings. It is argued that since large correction factors must be applied to account for guard imperfections when sample dimensions are small, it may be preferable to simply measure and correct for the heat that flows from the heater disc to directions other than into the sample. Experimental measurements taken in a prototype apparatus, combined with extensive computational modeling of the heat transfer in the apparatus, show that sufficiently accurate measurements can be obtained to allow determination of the thermal conductivity of low thermal conductivity materials. Suggestions are made for further improvements in the method based on results from regression analyses of the generated data.
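
    A minimal sketch of the underlying arithmetic, assuming the parasitic heat flow has already been measured or modelled separately (the function name and numbers are hypothetical, not from the apparatus described above):

```python
def thermal_conductivity(q_heater_w, q_parasitic_w, thickness_m, area_m2, delta_t_k):
    """Minimal sketch: 1-D Fourier conduction with an explicit correction for
    heat leaving the heater disc in directions other than through the sample.
    q_heater_w    - total electrical power supplied to the hot plate [W]
    q_parasitic_w - measured/modelled heat flow bypassing the sample [W]
    """
    q_sample_w = q_heater_w - q_parasitic_w                   # heat actually crossing the sample
    return q_sample_w * thickness_m / (area_m2 * delta_t_k)   # k in W/(m*K)

# Example with air-like numbers (illustrative only)
k = thermal_conductivity(q_heater_w=0.050, q_parasitic_w=0.030,
                         thickness_m=0.005, area_m2=0.0005, delta_t_k=8.0)
print(f"estimated k = {k:.4f} W/(m*K)")
```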

  13. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environment testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimated indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating the single flight testing of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The result shows that GBM has superiority for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
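
    GBM couples a grey GM(1,1) forecast with the bootstrap; the sketch below illustrates only the bootstrap ingredient, a percentile interval computed from a small sample (the function and the readings are made up for illustration):

```python
import numpy as np

def bootstrap_percentile_interval(x, stat=np.mean, n_boot=2000, alpha=0.05, rng=None):
    """Percentile bootstrap interval for a statistic of a small sample."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x)
    boot = np.array([stat(rng.choice(x, size=x.size, replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return stat(x), (lo, hi)

# Example: 10 vibration-amplitude readings (arbitrary units)
readings = [2.1, 2.4, 1.9, 2.8, 2.2, 2.0, 2.6, 2.3, 2.5, 1.8]
est, (lo, hi) = bootstrap_percentile_interval(readings)
print(f"mean = {est:.2f}, 95% bootstrap interval = [{lo:.2f}, {hi:.2f}]")
```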

  14. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Full Text Available Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result, it requires a huge sample size. This makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) values. The robustness is determined based on a significant variance reduction when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. Some of the results show that sample sizes generated from LHS for the small signal stability application produce the same result as the IDEAL values starting from a sample size of 100. This shows that about 100 samples of the random variables generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability applications. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, compared to the SRS technique, which signifies the robustness of LHS over SRS. An LHS sample size of 100 produces the same result as the conventional method with a sample size of 50,000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
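
    A minimal illustration of the sampling comparison, using a hand-rolled LHS design and a toy response function rather than the eigenvalue analysis of the study (all names and the response function are assumptions):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One LHS design on the unit hypercube: stratify each dimension into
    n_samples equal bins, sample once per bin, then shuffle bins per dimension."""
    u = rng.random((n_samples, n_dims))
    bins = np.arange(n_samples)
    lhs = np.empty_like(u)
    for d in range(n_dims):
        perm = rng.permutation(bins)
        lhs[:, d] = (perm + u[:, d]) / n_samples
    return lhs

def compare_variances(n_samples=100, n_repeats=100, seed=0):
    """Variance of the Monte Carlo estimate of E[f(U1, U2)] under SRS vs LHS."""
    f = lambda x: np.sin(np.pi * x[:, 0]) + x[:, 1] ** 2   # toy response
    rng = np.random.default_rng(seed)
    srs = [f(rng.random((n_samples, 2))).mean() for _ in range(n_repeats)]
    lhs = [f(latin_hypercube(n_samples, 2, rng)).mean() for _ in range(n_repeats)]
    return np.var(srs), np.var(lhs)

v_srs, v_lhs = compare_variances()
print(f"variance of estimate  SRS: {v_srs:.2e}   LHS: {v_lhs:.2e}")
```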

  15. Taking sputum samples from small children with cystic fibrosis: a matter of cooperation

    DEFF Research Database (Denmark)

    Pehn, Mette; Bregnballe, Vibeke

    2014-01-01

    Objectives: An important part of the disease control in Danish guidelines for care of patients with cystic fibrosis (CF) is a monthly sputum sample obtained by tracheal suction. Coping with this unpleasant procedure in small children depends heavily on the support from parents and nurse. The objective...... of this study was to develop a tool to help parents and children to cope with tracheal suctioning. Methods: Three short videos showing how nurses perform tracheal suctioning to get a sputum sample from small children with cystic fibrosis were made. The videos were shown to and discussed with parents...... and children to help them identify their own challenges in coping with the procedure. The study was carried out in the outpatient clinic at the CF centre, Aarhus University Hospital. Results: The videos are a useful tool to convince the parents, nurses and children from the age of about four years...

  16. Radioenzymatic assay for trimethoprim in very small serum samples.

    OpenAIRE

    Yogev, R; Melick, C; Tan-Pong, L

    1985-01-01

    A modification of the methotrexate radioassay kit (supplied by New England Enzyme Center) enabled determination of trimethoprim levels in 5-microliter serum samples. An excellent correlation between this assay and high-pressure liquid chromatography assay was found. These preliminary results suggest that with this method rapid determination of trimethoprim levels in very small samples (5 to 10 microliters) can be achieved.

  17. Radioenzymatic assay for trimethoprim in very small serum samples

    International Nuclear Information System (INIS)

    Yogev, R.; Melick, C.; Tan-Pong, L.

    1985-01-01

    A modification of the methotrexate radioassay kit (supplied by New England Enzyme Center) enabled determination of trimethoprim levels in 5-microliter serum samples. An excellent correlation between this assay and high-pressure liquid chromatography assay was found. These preliminary results suggest that with this method rapid determination of trimethoprim levels in very small samples (5 to 10 microliters) can be achieved

  18. Estimating sample size for a small-quadrat method of botanical ...

    African Journals Online (AJOL)

    Reports the results of a study conducted to determine an appropriate sample size for a small-quadrat method of botanical survey for application in the Mixed Bushveld of South Africa. Species density and grass density were measured using a small-quadrat method in eight plant communities in the Nylsvley Nature Reserve.

  19. Shrinkage-based diagonal Hotelling’s tests for high-dimensional small sample size data

    KAUST Repository

    Dong, Kai

    2015-09-16

    DNA sequencing techniques bring novel tools and also statistical challenges to genetic research. In addition to detecting differentially expressed genes, testing the significance of gene sets or pathway analysis has been recognized as an equally important problem. Owing to the “large p, small n” paradigm, the traditional Hotelling’s T² test suffers from the singularity problem and therefore is not valid in this setting. In this paper, we propose a shrinkage-based diagonal Hotelling’s test for both one-sample and two-sample cases. We also suggest several different ways to derive the approximate null distribution under different scenarios of p and n for our proposed shrinkage-based test. Simulation studies show that the proposed method performs comparably to existing competitors when n is moderate or large, but it is better when n is small. In addition, we analyze four gene expression data sets and they demonstrate the advantage of our proposed shrinkage-based diagonal Hotelling’s test.
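
    A rough one-sample sketch of a diagonal Hotelling-type statistic with shrunk variances is given below; the fixed shrinkage weight and the chi-square reference distribution are simplistic placeholders, not the null distributions derived in the paper:

```python
import numpy as np
from scipy import stats

def diagonal_hotelling_shrunk(X, mu0, lam=0.5):
    """One-sample diagonal Hotelling-type statistic for n x p data X.
    Variances are shrunk toward their median with weight lam (placeholder choice)."""
    n, p = X.shape
    xbar = X.mean(axis=0)
    s2 = X.var(axis=0, ddof=1)
    s2_shrunk = lam * np.median(s2) + (1.0 - lam) * s2   # diagonal shrinkage
    t2 = n * np.sum((xbar - mu0) ** 2 / s2_shrunk)
    # crude reference: chi-square with p degrees of freedom (approximation only)
    pval = stats.chi2.sf(t2, df=p)
    return t2, pval

rng = np.random.default_rng(0)
X = rng.normal(0.3, 1.0, size=(8, 200))        # "large p, small n" toy data
t2, p = diagonal_hotelling_shrunk(X, mu0=np.zeros(200))
print(f"T2 = {t2:.1f}, approx p-value = {p:.3g}")
```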

  20. Shrinkage-based diagonal Hotelling’s tests for high-dimensional small sample size data

    KAUST Repository

    Dong, Kai; Pang, Herbert; Tong, Tiejun; Genton, Marc G.

    2015-01-01

    DNA sequencing techniques bring novel tools and also statistical challenges to genetic research. In addition to detecting differentially expressed genes, testing the significance of gene sets or pathway analysis has been recognized as an equally important problem. Owing to the “large p, small n” paradigm, the traditional Hotelling’s T² test suffers from the singularity problem and therefore is not valid in this setting. In this paper, we propose a shrinkage-based diagonal Hotelling’s test for both one-sample and two-sample cases. We also suggest several different ways to derive the approximate null distribution under different scenarios of p and n for our proposed shrinkage-based test. Simulation studies show that the proposed method performs comparably to existing competitors when n is moderate or large, but it is better when n is small. In addition, we analyze four gene expression data sets and they demonstrate the advantage of our proposed shrinkage-based diagonal Hotelling’s test.

  1. Conditional estimation of local pooled dispersion parameter in small-sample RNA-Seq data improves differential expression test.

    Science.gov (United States)

    Gim, Jungsoo; Won, Sungho; Park, Taesung

    2016-10-01

    High throughput sequencing technology in transcriptomics studies contributes to the understanding of gene regulation mechanisms and their cellular function, but it also increases the need for accurate statistical methods to assess quantitative differences between experiments. Many methods have been developed to account for the specifics of count data: non-normality, a dependence of the variance on the mean, and small sample size. Among them, the small number of samples in typical experiments is still a challenge. Here we present a method for differential analysis of count data, using conditional estimation of local pooled dispersion parameters. A comprehensive evaluation of our proposed method in the context of differential gene expression analysis, using both simulated and real data sets, shows that the proposed method is more powerful than other existing methods while controlling the false discovery rates. By introducing conditional estimation of local pooled dispersion parameters, we successfully overcome the limitation of low power and enable a powerful quantitative analysis focused on the differential expression test with a small number of samples.
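
    The general idea of pooling dispersion information across genes with similar mean expression can be sketched with a method-of-moments estimate (this is an illustration, not the authors' conditional estimator; the window size and toy data are arbitrary):

```python
import numpy as np

def local_pooled_dispersions(counts, window=50):
    """Method-of-moments NB dispersion per gene, then pooled locally:
    each gene's dispersion is replaced by the median over the `window`
    genes whose mean expression is closest to its own."""
    counts = np.asarray(counts, dtype=float)      # genes x samples
    mean = counts.mean(axis=1)
    var = counts.var(axis=1, ddof=1)
    safe_mean = np.maximum(mean, 1e-8)
    disp = np.clip((var - mean) / safe_mean ** 2, 0.0, None)   # NB: var = mu + disp*mu^2
    order = np.argsort(mean)
    pooled = np.empty_like(disp)
    for rank, g in enumerate(order):
        lo = max(0, rank - window // 2)
        hi = min(len(order), rank + window // 2 + 1)
        pooled[g] = np.median(disp[order[lo:hi]])
    return pooled

rng = np.random.default_rng(1)
toy = rng.poisson(lam=rng.gamma(2.0, 5.0, size=(500, 1)), size=(500, 3))  # 500 genes, 3 samples
print(local_pooled_dispersions(toy)[:5])
```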

  2. Statistical issues in reporting quality data: small samples and casemix variation.

    Science.gov (United States)

    Zaslavsky, A M

    2001-12-01

    To present two key statistical issues that arise in analysis and reporting of quality data. Casemix variation is relevant to quality reporting when the units being measured have differing distributions of patient characteristics that also affect the quality outcome. When this is the case, adjustment using stratification or regression may be appropriate. Such adjustments may be controversial when the patient characteristic does not have an obvious relationship to the outcome. Stratified reporting poses problems for sample size and reporting format, but may be useful when casemix effects vary across units. Although there are no absolute standards of reliability, high reliabilities (interunit F ≥ 10 or reliability ≥ 0.9) are desirable for distinguishing above- and below-average units. When small or unequal sample sizes complicate reporting, precision may be improved using indirect estimation techniques that incorporate auxiliary information, and 'shrinkage' estimation can help to summarize the strength of evidence about units with small samples. With broader understanding of casemix adjustment and methods for analyzing small samples, quality data can be analysed and reported more accurately.
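
    The 'shrinkage' idea can be illustrated with a reliability-weighted estimate that pulls small-sample units toward the grand mean (a generic sketch; the variance components and data are made up, not from the article):

```python
import numpy as np

def shrink_unit_means(unit_means, unit_ns, within_var, between_var):
    """Shrink each unit's observed quality score toward the grand mean.
    reliability = between / (between + within/n); small units get pulled harder."""
    unit_means = np.asarray(unit_means, dtype=float)
    unit_ns = np.asarray(unit_ns, dtype=float)
    grand = np.average(unit_means, weights=unit_ns)
    reliability = between_var / (between_var + within_var / unit_ns)
    return reliability * unit_means + (1.0 - reliability) * grand, reliability

means = [0.92, 0.70, 0.85, 0.55]      # observed scores for four units
ns = [400, 12, 150, 8]                # very unequal sample sizes
shrunk, rel = shrink_unit_means(means, ns, within_var=0.04, between_var=0.01)
print(np.round(shrunk, 3), np.round(rel, 2))
```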

  3. Human small-cell lung cancers show amplification and expression of the N-myc gene

    International Nuclear Information System (INIS)

    Nau, M.M.; Brooks, B.J. Jr.; Carney, D.N.; Gazdar, A.F.; Battey, J.F.; Sausville, E.A.; Minna, J.D.

    1986-01-01

    The authors have found that 6 of 31 independently derived human small-cell lung cancer (SCLC) cell lines have 5- to 170-fold amplified N-myc gene sequences. The amplification is seen with probes from two separate exons of N-myc, which are homologous to either the second or the third exon of the c-myc gene. Amplified N-myc sequences were found in a tumor cell line started prior to chemotherapy, in SCLC tumor samples harvested directly from tumor metastases at autopsy, and from a resected primary lung cancer. Several N-myc-amplified tumor cell lines also exhibited N-myc hybridizing fragments not in the germ-line position. In one patient's tumor, an additional amplified N-myc DNA fragment was observed, and this fragment was heterogeneously distributed in liver metastases. In contrast to SCLC with neuroendocrine properties, no non-small-cell lung cancer lines examined were found to have N-myc amplification. Fragments encoding two N-myc exons also detect increased amounts of a 3.1-kilobase N-myc mRNA in N-myc-amplified SCLC lines and in one cell line that does not show N-myc gene amplification. Both DNA and RNA hybridization experiments, using a ³²P-labelled restriction probe, show that in any one SCLC cell line, only one myc-related gene is amplified and expressed. They conclude that N-myc amplification is both common and potentially significant in the tumorigenesis or tumor progression of SCLC

  4. Optimizing the triple-axis spectrometer PANDA at the MLZ for small samples and complex sample environment conditions

    Science.gov (United States)

    Utschick, C.; Skoulatos, M.; Schneidewind, A.; Böni, P.

    2016-11-01

    The cold-neutron triple-axis spectrometer PANDA at the neutron source FRM II has been serving an international user community studying condensed matter physics problems. We report on a new setup, improving the signal-to-noise ratio for small samples and pressure cell setups. Analytical and numerical Monte Carlo methods are used for the optimization of elliptic and parabolic focusing guides. They are placed between the monochromator and sample positions, and the flux at the sample is compared to the one achieved by standard monochromator focusing techniques. A 25 times smaller spot size is achieved, associated with a factor of 2 increased intensity, within the same divergence limits of ±2°. This optional neutron focusing guide shall establish a top-class spectrometer for studying novel exotic properties of matter in combination with more stringent sample environment conditions such as extreme pressures associated with small sample sizes.

  5. Conversion of Small Algal Oil Sample to JP-8

    Science.gov (United States)

    2012-01-01

    [Slide/figure residue removed; recoverable points retained.] Cracking of algal oil to SPK: a small-scale lab hydroprocessing plant with a down-flow trickle-bed configuration, capable of retaining 25 cc of catalyst bed, was used. The catalytic deoxygenation stage of the ... content which, combined with the sample's acidity, is a challenge to reactor metallurgy. Nonetheless, an attempt was made to convert this sample to JP-8.

  6. Evaluation applications of instrument calibration research findings in psychology for very small samples

    Science.gov (United States)

    Fisher, W. P., Jr.; Petry, P.

    2016-11-01

    Many published research studies document item calibration invariance across samples using Rasch's probabilistic models for measurement. A new approach to outcomes evaluation for very small samples was employed for two workshop series focused on stress reduction and joyful living conducted for health system employees and caregivers since 2012. Rasch-calibrated self-report instruments measuring depression, anxiety and stress, and the joyful living effects of mindfulness behaviors were identified in peer-reviewed journal articles. Items from one instrument were modified for use with a US population, other items were simplified, and some new items were written. Participants provided ratings of their depression, anxiety and stress, and the effects of their mindfulness behaviors before and after each workshop series. The numbers of participants providing both pre- and post-workshop data were low (16 and 14). Analysis of these small data sets produces results showing that, with some exceptions, the item hierarchies defining the constructs retained the same invariant profiles they had exhibited in the published research (correlations (not disattenuated) range from 0.85 to 0.96). In addition, comparisons of the pre- and post-workshop measures for the three constructs showed substantively and statistically significant changes. Implications for program evaluation comparisons, quality improvement efforts, and the organization of communications concerning outcomes in clinical fields are explored.

  7. Respondent-driven sampling and the recruitment of people with small injecting networks.

    Science.gov (United States)

    Paquette, Dana; Bryant, Joanne; de Wit, John

    2012-05-01

    Respondent-driven sampling (RDS) is a form of chain-referral sampling, similar to snowball sampling, which was developed to reach hidden populations such as people who inject drugs (PWID). RDS is said to reach members of a hidden population that may not be accessible through other sampling methods. However, less attention has been paid as to whether there are segments of the population that are more likely to be missed by RDS. This study examined the ability of RDS to capture people with small injecting networks. A study of PWID, using RDS, was conducted in 2009 in Sydney, Australia. The size of participants' injecting networks was examined by recruitment chain and wave. Participants' injecting network characteristics were compared to those of participants from a separate pharmacy-based study. A logistic regression analysis was conducted to examine the characteristics independently associated with having small injecting networks, using the combined RDS and pharmacy-based samples. In comparison with the pharmacy-recruited participants, RDS participants were almost 80% less likely to have small injecting networks, after adjusting for other variables. RDS participants were also more likely to have their injecting networks form a larger proportion of those in their social networks, and to have acquaintances as part of their injecting networks. Compared to those with larger injecting networks, individuals with small injecting networks were equally likely to engage in receptive sharing of injecting equipment, but less likely to have had contact with prevention services. These findings suggest that those with small injecting networks are an important group to recruit, and that RDS is less likely to capture these individuals.

  8. Assessing pesticide concentrations and fluxes in the stream of a small vineyard catchment - Effect of sampling frequency

    International Nuclear Information System (INIS)

    Rabiet, M.; Margoum, C.; Gouy, V.; Carluer, N.; Coquery, M.

    2010-01-01

    This study reports on the occurrence and behaviour of six pesticides and one metabolite in a small stream draining a vineyard catchment. Base flow and flood events were monitored in order to assess the variability of pesticide concentrations according to the season and to evaluate the role of sampling frequency on the evaluation of flux estimates. Results showed that dissolved pesticide concentrations displayed a strong temporal and spatial variability. A large mobilisation of pesticides was observed during floods, with total dissolved pesticide fluxes per event ranging from 5.7 × 10⁻³ g/ha to 0.34 g/ha. These results highlight the major role of floods in the transport of pesticides in this small stream, which contributed to more than 89% of the total load of diuron during August 2007. The evaluation of pesticide loads using different sampling strategies and calculation methods showed that grab sampling largely underestimated pesticide concentrations and fluxes transiting through the stream. - This work brings new insights about the fluxes of pesticides in the surface water of a vineyard catchment, notably during flood events.

  9. Assessing pesticide concentrations and fluxes in the stream of a small vineyard catchment - Effect of sampling frequency

    Energy Technology Data Exchange (ETDEWEB)

    Rabiet, M., E-mail: marion.rabiet@unilim.f [Cemagref, UR QELY, 3bis quai Chauveau, CP 220, F-69336 Lyon (France); Margoum, C.; Gouy, V.; Carluer, N.; Coquery, M. [Cemagref, UR QELY, 3bis quai Chauveau, CP 220, F-69336 Lyon (France)

    2010-03-15

    This study reports on the occurrence and behaviour of six pesticides and one metabolite in a small stream draining a vineyard catchment. Base flow and flood events were monitored in order to assess the variability of pesticide concentrations according to the season and to evaluate the role of sampling frequency on the evaluation of flux estimates. Results showed that dissolved pesticide concentrations displayed a strong temporal and spatial variability. A large mobilisation of pesticides was observed during floods, with total dissolved pesticide fluxes per event ranging from 5.7 × 10⁻³ g/ha to 0.34 g/ha. These results highlight the major role of floods in the transport of pesticides in this small stream, which contributed to more than 89% of the total load of diuron during August 2007. The evaluation of pesticide loads using different sampling strategies and calculation methods showed that grab sampling largely underestimated pesticide concentrations and fluxes transiting through the stream. - This work brings new insights about the fluxes of pesticides in the surface water of a vineyard catchment, notably during flood events.

  10. ASSESSING SMALL SAMPLE WAR-GAMING DATASETS

    Directory of Open Access Journals (Sweden)

    W. J. HURLEY

    2013-10-01

    Full Text Available One of the fundamental problems faced by military planners is the assessment of changes to force structure. An example is whether to replace an existing capability with an enhanced system. This can be done directly with a comparison of measures such as accuracy, lethality, survivability, etc. However this approach does not allow an assessment of the force multiplier effects of the proposed change. To gauge these effects, planners often turn to war-gaming. For many war-gaming experiments, it is expensive, both in terms of time and dollars, to generate a large number of sample observations. This puts a premium on the statistical methodology used to examine these small datasets. In this paper we compare the power of three tests to assess population differences: the Wald-Wolfowitz test, the Mann-Whitney U test, and re-sampling. We employ a series of Monte Carlo simulation experiments. Not unexpectedly, we find that the Mann-Whitney test performs better than the Wald-Wolfowitz test. Resampling is judged to perform slightly better than the Mann-Whitney test.
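
    For readers who want to reproduce the flavour of such a comparison, the sketch below applies the Mann-Whitney U test and a simple permutation (re-sampling) test to two tiny, invented score samples; it is not the paper's simulation design:

```python
import numpy as np
from scipy import stats

def permutation_pvalue(a, b, n_perm=20000, rng=None):
    """Two-sided permutation test on the difference in means."""
    rng = np.random.default_rng(rng)
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[: a.size].mean() - pooled[a.size:].mean()
        count += abs(diff) >= abs(observed)
    return count / n_perm

baseline = [12, 15, 9, 14, 11, 13]        # war-game scores, existing capability (made up)
enhanced = [16, 18, 14, 19, 15, 17]       # scores with proposed enhancement (made up)
u, p_mw = stats.mannwhitneyu(enhanced, baseline, alternative="two-sided")
p_perm = permutation_pvalue(enhanced, baseline)
print(f"Mann-Whitney p = {p_mw:.4f}, permutation p = {p_perm:.4f}")
```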

  11. Estimation of reference intervals from small samples: an example using canine plasma creatinine.

    Science.gov (United States)

    Geffré, A; Braun, J P; Trumel, C; Concordet, D

    2009-12-01

    According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which often is impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were minimum and maximum, mean ± 2 SD of native and Box-Cox-transformed values, 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.
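
    Two of the compared estimates are easy to sketch: the sample minimum/maximum and mean ± 2 SD computed on Box-Cox-transformed values and back-transformed (illustrative code on simulated, not canine, data):

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

def reference_interval_boxcox(values):
    """Mean +/- 2 SD computed on Box-Cox transformed values,
    back-transformed to the original scale."""
    values = np.asarray(values, dtype=float)
    transformed, lam = stats.boxcox(values)       # requires strictly positive data
    lo_t = transformed.mean() - 2 * transformed.std(ddof=1)
    hi_t = transformed.mean() + 2 * transformed.std(ddof=1)
    return inv_boxcox(lo_t, lam), inv_boxcox(hi_t, lam)

# 27 made-up plasma creatinine values (micromol/L), right-skewed
rng = np.random.default_rng(3)
creatinine = rng.lognormal(mean=np.log(90), sigma=0.25, size=27)
print("min/max        :", round(creatinine.min(), 1), round(creatinine.max(), 1))
print("Box-Cox +/-2 SD:", tuple(round(float(x), 1) for x in reference_interval_boxcox(creatinine)))
```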

  12. Accelerator mass spectrometry of ultra-small samples with applications in the biosciences

    International Nuclear Information System (INIS)

    Salehpour, Mehran; Håkansson, Karl; Possnert, Göran

    2013-01-01

    An overview is presented covering the biological accelerator mass spectrometry activities at Uppsala University. The research utilizes the Uppsala University Tandem laboratory facilities, including a 5 MV Pelletron tandem accelerator and two stable isotope ratio mass spectrometers. In addition, a dedicated sample preparation laboratory for biological samples with natural activity is in use, as well as another laboratory specifically for ¹⁴C-labeled samples. A variety of ongoing projects are described and presented. Examples are: (1) Ultra-small sample AMS. We routinely analyze samples with masses in the 5–10 μg C range. Data is presented regarding the sample preparation method, (2) bomb peak biological dating of ultra-small samples. A long-term project is presented where purified and cell-specific DNA from various parts of the human body, including the heart and the brain, are analyzed with the aim of extracting the regeneration rate of the various human cells, (3) biological dating of various human biopsies, including atherosclerosis-related plaques, is presented. The average build-up time of the surgically removed human carotid plaques has been measured and correlated to various data including the level of insulin in the human blood, and (4) in addition to standard microdosing-type measurements using small pharmaceutical drugs, pre-clinical pharmacokinetic data from a macromolecular drug candidate are discussed.

  13. Accelerator mass spectrometry of ultra-small samples with applications in the biosciences

    Energy Technology Data Exchange (ETDEWEB)

    Salehpour, Mehran, E-mail: mehran.salehpour@physics.uu.se [Department of Physics and Astronomy, Ion Physics, PO Box 516, SE-751 20 Uppsala (Sweden); Hakansson, Karl; Possnert, Goeran [Department of Physics and Astronomy, Ion Physics, PO Box 516, SE-751 20 Uppsala (Sweden)

    2013-01-15

    An overview is presented covering the biological accelerator mass spectrometry activities at Uppsala University. The research utilizes the Uppsala University Tandem laboratory facilities, including a 5 MV Pelletron tandem accelerator and two stable isotope ratio mass spectrometers. In addition, a dedicated sample preparation laboratory for biological samples with natural activity is in use, as well as another laboratory specifically for ¹⁴C-labeled samples. A variety of ongoing projects are described and presented. Examples are: (1) Ultra-small sample AMS. We routinely analyze samples with masses in the 5–10 μg C range. Data is presented regarding the sample preparation method, (2) bomb peak biological dating of ultra-small samples. A long-term project is presented where purified and cell-specific DNA from various parts of the human body, including the heart and the brain, are analyzed with the aim of extracting the regeneration rate of the various human cells, (3) biological dating of various human biopsies, including atherosclerosis-related plaques, is presented. The average build-up time of the surgically removed human carotid plaques has been measured and correlated to various data including the level of insulin in the human blood, and (4) in addition to standard microdosing-type measurements using small pharmaceutical drugs, pre-clinical pharmacokinetic data from a macromolecular drug candidate are discussed.

  14. Transportable high sensitivity small sample radiometric calorimeter

    International Nuclear Information System (INIS)

    Wetzel, J.R.; Biddle, R.S.; Cordova, B.S.; Sampson, T.E.; Dye, H.R.; McDow, J.G.

    1998-01-01

    A new small-sample, high-sensitivity transportable radiometric calorimeter, which can be operated in different modes, contains an electrical calibration method, and can be used to develop secondary standards, will be described in this presentation. The data taken from preliminary tests will be presented to indicate the precision and accuracy of the instrument. The calorimeter and temperature-controlled bath, at present, require only a 30-in. by 20-in. tabletop area. The calorimeter is operated from a laptop computer system using a unique measurement module capable of monitoring all necessary calorimeter signals. The calorimeter can be operated in the normal calorimeter equilibration mode, as a comparison instrument, using twin chambers and an external electrical calibration method. The sample chamber is 0.75 in. (1.9 cm) in diameter by 2.5 in. (6.35 cm) long. This size will accommodate most ²³⁸Pu heat standards manufactured in the past. The power range runs from 0.001 W to <20 W. The high end is only limited by sample size

  15. EDXRF applied to the chemical element determination of small invertebrate samples

    International Nuclear Information System (INIS)

    Magalhaes, Marcelo L.R.; Santos, Mariana L.O.; Cantinha, Rebeca S.; Souza, Thomas Marques de; Franca, Elvis J. de

    2015-01-01

    Energy Dispersive X-Ray Fluorescence (EDXRF) is a fast analytical technique of easy operation; however, it demands reliable analytical curves due to the intrinsic matrix dependence and interference during the analysis. By using biological materials of diverse matrices, multielemental analytical protocols can be implemented and a group of chemical elements can be determined in diverse biological matrices depending on the chemical element concentration. Particularly for invertebrates, EDXRF presents some advantages associated with the possibility of analyzing small samples, in which a collimator can be used to direct the incident X-rays onto a small surface of the analyzed samples. In this work, EDXRF was applied to determine Cl, Fe, P, S and Zn in invertebrate samples using collimators of 3 mm and 10 mm. For the assessment of the analytical protocol, the SRM 2976 Trace Elements in Mollusk and SRM 8415 Whole Egg Powder, produced by the National Institute of Standards and Technology (NIST), were also analyzed. After sampling by using pitfall traps, invertebrates were lyophilized, milled and transferred to polyethylene vials covered by XRF polyethylene. Analyses were performed at a pressure lower than 30 Pa, varying voltage and electric current according to the chemical element to be analyzed. For comparison, Zn in the invertebrate material was also quantified by graphite furnace atomic absorption spectrometry after acid treatment (a mixture of nitric acid and hydrogen peroxide) of the samples. Compared to the collimator of 10 mm, the SRM 2976 and SRM 8415 results obtained with the 3 mm collimator agreed well at the 95% confidence level, since the En numbers were in the range of -1 to 1. Results from GFAAS were in accordance with the EDXRF values for composite samples. Therefore, determination of some chemical elements by EDXRF can be recommended for very small invertebrate samples (lower than 100 mg), with the advantage of preserving the samples. (author)

  16. A thermostat for precise measurements of thermoresistance of small samples

    International Nuclear Information System (INIS)

    Rusinowski, Z.; Slowinski, B.; Winiewski, R.

    1996-01-01

    In this work, a simple experimental set-up is described in which special attention is paid to the important problem of the thermal stability of thermoresistance measurements of small manganin samples

  17. Auto-validating von Neumann rejection sampling from small phylogenetic tree spaces

    Directory of Open Access Journals (Sweden)

    York Thomas

    2009-01-01

    Full Text Available Abstract Background In phylogenetic inference one is interested in obtaining samples from the posterior distribution over the tree space on the basis of some observed DNA sequence data. One of the simplest sampling methods is the rejection sampler due to von Neumann. Here we introduce an auto-validating version of the rejection sampler, via interval analysis, to rigorously draw samples from posterior distributions over small phylogenetic tree spaces. Results The posterior samples from the auto-validating sampler are used to rigorously (i) estimate posterior probabilities for different rooted topologies based on mitochondrial DNA from human, chimpanzee and gorilla, (ii) conduct a non-parametric test of rate variation between protein-coding and tRNA-coding sites from three primates and (iii) obtain a posterior estimate of the human-neanderthal divergence time. Conclusion This solves the open problem of rigorously drawing independent and identically distributed samples from the posterior distribution over rooted and unrooted small tree spaces (3 or 4 taxa) based on any multiply-aligned sequence data.
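
    For reference, the classic (non-validated) von Neumann rejection sampler that the paper extends can be written in a few lines; the toy Beta-shaped target below stands in for a posterior over tree space and is purely illustrative:

```python
import numpy as np

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, bound_m, n, rng=None):
    """Classic von Neumann rejection sampling: propose x ~ g, accept with
    probability f(x) / (M * g(x)), where f <= M * g everywhere."""
    rng = np.random.default_rng(rng)
    out = []
    while len(out) < n:
        x = proposal_sample(rng)
        if rng.random() < target_pdf(x) / (bound_m * proposal_pdf(x)):
            out.append(x)
    return np.array(out)

# Toy example: sample a Beta(2, 5) target using a uniform proposal on [0, 1].
target = lambda x: 30.0 * x * (1.0 - x) ** 4          # Beta(2, 5) density, max ~2.46
proposal_pdf = lambda x: 1.0                          # Uniform(0, 1) density
proposal_sample = lambda rng: rng.random()
M = 2.5                                               # upper bound on target/proposal
draws = rejection_sample(target, proposal_sample, proposal_pdf, M, n=1000, rng=0)
print(f"{draws.size} draws, mean = {draws.mean():.3f} (theory 2/7 ≈ 0.286)")
```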

  18. Systematic studies of small scintillators for new sampling calorimeter

    Indian Academy of Sciences (India)

    A new sampling calorimeter using very thin scintillators and the multi-pixel photon counter (MPPC) has been proposed to produce better position resolution for the international linear collider (ILC) experiment. As part of this R & D study, small plastic scintillators of different sizes, thickness and wrapping reflectors are ...

  19. A General Linear Method for Equating with Small Samples

    Science.gov (United States)

    Albano, Anthony D.

    2015-01-01

    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…

  20. Biota dose assessment of small mammals sampled near uranium mines in northern Arizona

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Minter, K. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Kuhne, W. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Kubilius, W. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2018-01-09

    In 2015, the U.S. Geological Survey (USGS) collected approximately 50 small mammal carcasses from Northern Arizona uranium mines and other background locations. Based on the highest gross alpha results, 11 small mammal samples were selected for radioisotopic analyses. None of the background samples had significant gross alpha results. The 11 small mammals were identified relative to the three ‘indicator’ mines located south of Fredonia, AZ on the Kanab Plateau (Kanab North Mine, Pinenut Mine, and Arizona 1 Mine) (Figure 1-1), which are operated by Energy Fuels Resources Inc. (EFRI). EFRI annually reports soil analysis for uranium and radium-226 using Arizona Department of Environmental Quality (ADEQ)-approved Standard Operating Procedures for Soil Sampling (EFRI 2016a, 2016b, 2017). In combination with the USGS small mammal radioisotopic tissue analyses, a biota dose assessment was completed by Savannah River National Laboratory (SRNL) using the RESidual RADioactivity-BIOTA (RESRAD-BIOTA, V. 1.8) dose assessment tool provided by the Argonne National Laboratory (ANL 2017).

  1. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Hebel, S.; Hands, J.; Goering, F.; Kirchner, G.; Purtschert, R.

    2015-01-01

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. Prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples with high speed, high efficiency and in a fully automated manner using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and the application of Kr-85 measurements to novel Safeguards procedures. (author)

  2. EDXRF applied to the chemical element determination of small invertebrate samples

    Energy Technology Data Exchange (ETDEWEB)

    Magalhaes, Marcelo L.R.; Santos, Mariana L.O.; Cantinha, Rebeca S.; Souza, Thomas Marques de; Franca, Elvis J. de, E-mail: marcelo_rlm@hotmail.com, E-mail: marianasantos_ufpe@hotmail.com, E-mail: rebecanuclear@gmail.com, E-mail: thomasmarques@live.com.pt, E-mail: ejfranca@cnen.gov.br [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil)

    2015-07-01

    Energy Dispersive X-Ray Fluorescence (EDXRF) is a fast analytical technique of easy operation; however, it demands reliable analytical curves due to the intrinsic matrix dependence and interference during the analysis. By using biological materials of diverse matrices, multielemental analytical protocols can be implemented and a group of chemical elements can be determined in diverse biological matrices depending on the chemical element concentration. Particularly for invertebrates, EDXRF presents some advantages associated with the possibility of analyzing small samples, in which a collimator can be used to direct the incident X-rays onto a small surface of the analyzed samples. In this work, EDXRF was applied to determine Cl, Fe, P, S and Zn in invertebrate samples using collimators of 3 mm and 10 mm. For the assessment of the analytical protocol, the SRM 2976 Trace Elements in Mollusk and SRM 8415 Whole Egg Powder, produced by the National Institute of Standards and Technology (NIST), were also analyzed. After sampling by using pitfall traps, invertebrates were lyophilized, milled and transferred to polyethylene vials covered by XRF polyethylene. Analyses were performed at a pressure lower than 30 Pa, varying voltage and electric current according to the chemical element to be analyzed. For comparison, Zn in the invertebrate material was also quantified by graphite furnace atomic absorption spectrometry after acid treatment (a mixture of nitric acid and hydrogen peroxide) of the samples. Compared to the collimator of 10 mm, the SRM 2976 and SRM 8415 results obtained with the 3 mm collimator agreed well at the 95% confidence level, since the En numbers were in the range of -1 to 1. Results from GFAAS were in accordance with the EDXRF values for composite samples. Therefore, determination of some chemical elements by EDXRF can be recommended for very small invertebrate samples (lower than 100 mg), with the advantage of preserving the samples. (author)

  3. STATISTICAL EVALUATION OF SMALL SCALE MIXING DEMONSTRATION SAMPLING AND BATCH TRANSFER PERFORMANCE - 12093

    Energy Technology Data Exchange (ETDEWEB)

    GREER DA; THIEN MG

    2012-01-12

    The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste Treatment and Immobilization Plant (WTP) presents a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. DOE's Tank Operations Contractor, Washington River Protection Solutions (WRPS), has previously presented the results of mixing performance in two different sizes of small scale DSTs to support scale-up estimates of full scale DST mixing performance. Currently, sufficient sampling of DSTs is one of the largest programmatic risks that could prevent timely delivery of high level waste to the WTP. WRPS has performed small scale mixing and sampling demonstrations to study the ability to sufficiently sample the tanks. The statistical evaluation of the demonstration results, which leads to the conclusion that the two scales of small DST are behaving similarly and that full scale performance is predictable, will be presented. This work is essential to reduce the risk of requiring a new dedicated feed sampling facility and will guide future optimization work to ensure the waste feed delivery mission will be accomplished successfully. This paper will focus on the analytical data collected from mixing, sampling, and batch transfer testing from the small scale mixing demonstration tanks and how those data are being interpreted to begin to understand the relationship between samples taken prior to transfer and samples from the subsequent batches transferred. An overview of the types of data collected and examples of typical raw data will be provided. The paper will then discuss the processing and manipulation of the data which is necessary to begin evaluating sampling and batch transfer performance. This discussion will also include the evaluation of the analytical measurement capability with regard to the simulant material used in the demonstration tests. The

  4. Privacy problems in the small sample selection

    Directory of Open Access Journals (Sweden)

    Loredana Cerbara

    2013-05-01

    Full Text Available The side of social research that uses small samples for the production of micro data today faces some operating difficulties due to the privacy law. The privacy code is an important and necessary law because it guarantees Italian citizens' rights, as already happens in other countries of the world. However, it does not seem appropriate to limit once more the data-production possibilities of the national research centres. Those possibilities are already compromised by insufficient funds, a problem that is becoming more and more frequent in the research field. It would therefore be necessary to include in the law the possibility of using telephone lists to select samples useful for activities directly of interest and importance to the citizen, such as the collection of data carried out on the basis of opinion polls by the research centres of the Italian CNR and some universities.

  5. Radiocarbon measurements of small gaseous samples at CologneAMS

    Science.gov (United States)

    Stolz, A.; Dewald, A.; Altenkirch, R.; Herb, S.; Heinze, S.; Schiffer, M.; Feuerstein, C.; Müller-Gatermann, C.; Wotte, A.; Rethemeyer, J.; Dunai, T.

    2017-09-01

    A second SO-110 B (Arnold et al., 2010) ion source was installed at the 6 MV CologneAMS for the measurement of gaseous samples. For the gas supply, a dedicated device from Ionplus AG was connected to the ion source. Special effort was devoted to determining optimized operation parameters for the ion source, which give a high carbon current output and a high ¹⁴C⁻ yield. The latter is essential in cases when only small samples are available. Additionally, a modified immersion lens and modified target pieces were tested, and the target position was optimized.

  6. The Accuracy of Inference in Small Samples of Dynamic Panel Data Models

    NARCIS (Netherlands)

    Bun, M.J.G.; Kiviet, J.F.

    2001-01-01

    Through Monte Carlo experiments the small sample behavior is examined of various inference techniques for dynamic panel data models when both the time-series and cross-section dimensions of the data set are small. The LSDV technique and corrected versions of it are compared with IV and GMM

  7. inverse gaussian model for small area estimation via gibbs sampling

    African Journals Online (AJOL)

    For example, MacGibbon and Tomberlin (1989) have considered estimating small area rates and binomial parameters using empirical Bayes methods. Stroud (1991) used a hierarchical Bayes approach for univariate natural exponential families with quadratic variance functions in sample survey applications, while Chaubey ...

  8. Rules of attraction: The role of bait in small mammal sampling at ...

    African Journals Online (AJOL)

    Baits or lures are commonly used for surveying small mammal communities, not only because they attract large numbers of these animals, but also because they provide sustenance for trapped individuals. In this study we used Sherman live traps with five bait treatments to sample small mammal populations at three ...

  9. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

    Evaluation of the total uncertainty budget on a determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out by relative NAA or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for the analysis of small and large samples of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)
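
    A generic sketch of how such a budget is combined, propagating independent relative uncertainty components in quadrature to a combined and expanded uncertainty (the component names and values are hypothetical, not the paper's actual budget for La):

```python
import numpy as np

def combined_uncertainty(value, rel_components, k=2):
    """Combine independent relative standard uncertainties in quadrature
    and return the combined and expanded (coverage factor k) uncertainties."""
    rel_combined = np.sqrt(np.sum(np.square(list(rel_components.values()))))
    u_c = value * rel_combined
    return u_c, k * u_c

# Hypothetical components for a La concentration of 42.0 mg/kg
components = {
    "counting_statistics": 0.015,
    "peak_area_fitting":   0.010,
    "gamma_ray_abundance": 0.005,
    "detection_efficiency_ratio": 0.020,
}
u_c, U = combined_uncertainty(42.0, components)
print(f"combined u = {u_c:.2f} mg/kg, expanded U (k=2) = {U:.2f} mg/kg")
```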

  10. Simultaneous small-sample comparisons in longitudinal or multi-endpoint trials using multiple marginal models

    DEFF Research Database (Denmark)

    Pallmann, Philip; Ritz, Christian; Hothorn, Ludwig A

    2018-01-01

    Simultaneous inference in longitudinal, repeated-measures, and multi-endpoint designs can be onerous, especially when trying to find a reasonable joint model from which the interesting effects and covariances are estimated. A novel statistical approach known as multiple marginal models greatly simplifies the modelling process: the core idea is to "marginalise" the problem and fit multiple small models to different portions of the data, and then estimate the overall covariance matrix in a subsequent, separate step. Using these estimates guarantees strong control of the family-wise error rate, however only asymptotically. In this paper, we show how to make the approach also applicable to small-sample data problems. Specifically, we discuss the computation of adjusted P values and simultaneous confidence bounds for comparisons of randomised treatment groups as well as for levels...

  11. Evaluation of energy deposition by 153Sm in small samples

    International Nuclear Information System (INIS)

    Cury, M.I.C.; Siqueira, P.T.D.; Yoriyaz, H.; Coelho, P.R.P.; Da Silva, M.A.; Okazaki, K.

    2002-01-01

    Aim: This work presents evaluations of the dose absorbed by 'in vitro' blood cultures when mixed with 153Sm solutions of different concentrations. Although 153Sm is used as a radiopharmaceutical mainly due to its beta emission, which is short-range radiation, it also emits gamma radiation, which has a longer penetration range. It is therefore a difficult task to determine the absorbed dose in small samples, where the infinite-medium approximation is no longer valid. Materials and Methods: MCNP-4C (Monte Carlo N-Particle transport code) has been used to perform the evaluations. It is not a deterministic code that calculates the value of a specific quantity by solving the physical equations involved in the problem, but a virtual experiment in which the events related to the problem are simulated and the quantities of interest are tallied. MCNP also stands out for its ability to specify virtually any geometry. These features, among others, however, make MCNP a time-consuming code. The simulated problem consists of a cylindrical plastic tube with 1.5 cm internal diameter and 0.1 cm wall thickness. It has a 2.0 cm high conical bottom end, so that the represented sample has a volume of 4.0 ml (consisting of 1 ml of blood and 3 ml of culture medium). To evaluate the energy deposition in the blood culture for each 153Sm decay, the problem was divided into 3 steps to account for the β- emissions (which have a continuous spectrum), the gammas, and the conversion and Auger electron emissions. Afterwards each emission contribution was weighted and summed to give the final value. Besides this radiation 'fragmentation', simulations were performed for many different amounts of 153Sm solution added to the sample, covering a range from 1 μl to 0.5 ml. Results: The average energy per disintegration of 153Sm is 331 keV [1]. Gammas account for 63 keV and β-, conversion and Auger electrons account for 268 keV. The simulations performed showed an average energy deposition of 260 keV

  12. Thermal neutron absorption cross section of small samples

    International Nuclear Information System (INIS)

    Nghiep, T.D.; Vinh, T.T.; Son, N.N.; Vuong, T.V.; Hung, N.T.

    1989-01-01

    A modified steady method for determining the macroscopic thermal neutron absorption cross section of small samples 500 cm^3 in volume is described. The method uses a moderating block of paraffin, a Pu-Be neutron source emitting 1.1x10^6 n s^-1, an SNM-14 counter and ordinary counting equipment. The interval of cross sections from 2.6 to 1.3x10^4 (10^-3 cm^2 g^-1) was measured. The experimental data are described by calculation formulae. 7 refs.; 4 figs

  13. System for sampling liquids in small jugs obturated by screwed taps

    International Nuclear Information System (INIS)

    Besnier, J.

    1995-01-01

    This invention describes a machine which automatically samples liquids from small jugs obturated by screwed taps. The device can be located in an isolated room in order to work with radioactive liquids. The machine can be divided into three main parts: a module to catch the jug, in order to take hold of it and fix it; a module to open and close it; and a module to sample. The latter takes the liquid by means of a suction device and puts it into a container so that the sample can be analysed. (TEC)

  14. Research of pneumatic control transmission system for small irradiation samples

    International Nuclear Information System (INIS)

    Bai Zhongxiong; Zhang Haibing; Rong Ru; Zhang Tao

    2008-01-01

    In order to reduce the absorbed-dose burden on the operator, pneumatic control has been adopted to achieve rapid transmission of small irradiation samples. The on/off state of the pneumatic circuit and the directions of the rapid transmission system are controlled by the electrical control part. The main program initializes the system, detects the position of the manual/automatic change-over switch, and calls the corresponding subprogram to carry out automatic or manual operation. The automatic subprogram performs automatic sample transmission; the manual subprogram handles deflation and the back-and-forth movement of the irradiation samples. This paper introduces the implementation of the system in detail, in terms of both hardware and software design. (authors)

  15. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  16. TableSim--A program for analysis of small-sample categorical data.

    Science.gov (United States)

    David J. Rugg

    2003-01-01

    Documents a computer program for calculating correct P-values of 1-way and 2-way tables when sample sizes are small. The program is written in Fortran 90; the executable code runs in 32-bit Microsoft command-line environments.
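
    TableSim itself is a Fortran 90 program and its source is not reproduced in this record; as a rough, hypothetical stand-in for the same task (an exact small-sample P-value for a sparse table), SciPy's exact test for 2x2 tables can be used. The table values below are invented for illustration.

    from scipy import stats

    # Hypothetical sparse 2x2 table where the asymptotic chi-square P-value would be unreliable
    table = [[3, 1],
             [2, 6]]
    print(stats.fisher_exact(table, alternative="two-sided"))   # exact two-sided P-value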

  17. Small-kernel constrained-least-squares restoration of sampled image data

    Science.gov (United States)

    Hazra, Rajeeb; Park, Stephen K.

    1992-10-01

    Constrained least-squares image restoration, first proposed by Hunt twenty years ago, is a linear image restoration technique in which the restoration filter is derived by maximizing the smoothness of the restored image while satisfying a fidelity constraint related to how well the restored image matches the actual data. The traditional derivation and implementation of the constrained least-squares restoration filter is based on an incomplete discrete/discrete system model which does not account for the effects of spatial sampling and image reconstruction. For many imaging systems, these effects are significant and should not be ignored. In a recent paper Park demonstrated that a derivation of the Wiener filter based on the incomplete discrete/discrete model can be extended to a more comprehensive end-to-end, continuous/discrete/continuous model. In a similar way, in this paper, we show that a derivation of the constrained least-squares filter based on the discrete/discrete model can also be extended to this more comprehensive continuous/discrete/continuous model and, by so doing, an improved restoration filter is derived. Building on previous work by Reichenbach and Park for the Wiener filter, we also show that this improved constrained least-squares restoration filter can be efficiently implemented as a small-kernel convolution in the spatial domain.
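
    For reference, the following is a minimal NumPy sketch of the traditional discrete/discrete constrained least-squares filter (the baseline formulation the paper extends, not the improved continuous/discrete/continuous filter derived there); the Gaussian blur, the regularization parameter gamma and the test image are all hypothetical.

    import numpy as np

    def cls_restore(blurred, psf, gamma=0.01):
        """Traditional CLS filter: F = H* / (|H|^2 + gamma*|P|^2), with P the discrete Laplacian."""
        shape = blurred.shape
        H = np.fft.fft2(psf, s=shape)
        lap = np.zeros(shape)
        lap[:3, :3] = [[0, -1, 0], [-1, 4, -1], [0, -1, 0]]   # smoothness-constraint kernel, zero-padded
        P = np.fft.fft2(lap)
        F = np.conj(H) / (np.abs(H) ** 2 + gamma * np.abs(P) ** 2)
        return np.real(np.fft.ifft2(F * np.fft.fft2(blurred)))

    # Toy example: blur a random "image" with a small Gaussian PSF, then restore it
    rng = np.random.default_rng(0)
    img = rng.random((64, 64))
    yy, xx = np.mgrid[-2:3, -2:3]
    psf = np.exp(-(xx ** 2 + yy ** 2) / 2.0)
    psf /= psf.sum()
    blurred = np.real(np.fft.ifft2(np.fft.fft2(psf, s=img.shape) * np.fft.fft2(img)))
    print("mean abs error after restoration:", np.abs(cls_restore(blurred, psf) - img).mean())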

  18. Chironomid midges (Diptera, chironomidae) show extremely small genome sizes.

    Science.gov (United States)

    Cornette, Richard; Gusev, Oleg; Nakahara, Yuichi; Shimura, Sachiko; Kikawada, Takahiro; Okuda, Takashi

    2015-06-01

    Chironomid midges (Diptera; Chironomidae) are found in various environments from the high Arctic to the Antarctic, including temperate and tropical regions. In many freshwater habitats, members of this family are among the most abundant invertebrates. In the present study, the genome sizes of 25 chironomid species were determined by flow cytometry and the resulting C-values ranged from 0.07 to 0.20 pg DNA (i.e. from about 68 to 195 Mbp). These genome sizes were uniformly very small and included, to our knowledge, the smallest genome sizes recorded to date among insects. A small proportion of transposable elements and short introns were suggested to contribute to the reduction of genome sizes in chironomids. We discuss the possible developmental and physiological advantages of having a small genome size and the putative implications for the ecological success of the family Chironomidae.

  19. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    Science.gov (United States)

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  20. Measuring Blood Glucose Concentrations in Photometric Glucometers Requiring Very Small Sample Volumes.

    Science.gov (United States)

    Demitri, Nevine; Zoubir, Abdelhak M

    2017-01-01

    Glucometers present an important self-monitoring tool for diabetes patients and, therefore, must exhibit high accuracy as well as good usability features. Based on an invasive photometric measurement principle that drastically reduces the volume of the blood sample needed from the patient, we present a framework that is capable of dealing with small blood samples, while maintaining the required accuracy. The framework consists of two major parts: 1) image segmentation; and 2) convergence detection. Step 1 is based on iterative mode-seeking methods to estimate the intensity value of the region of interest. We present several variations of these methods and give theoretical proofs of their convergence. Our approach is able to deal with changes in the number and position of clusters without any prior knowledge. Furthermore, we propose a method based on sparse approximation to decrease the computational load, while maintaining accuracy. Step 2 is achieved by employing temporal tracking and prediction, herewith decreasing the measurement time, and, thus, improving usability. Our framework is tested on several real datasets with different characteristics. We show that we are able to estimate the underlying glucose concentration from much smaller blood samples than is currently state of the art with sufficient accuracy according to the most recent ISO standards and reduce measurement time significantly compared to state-of-the-art methods.

  1. Quantum superposition of the state discrete spectrum of mathematical correlation molecule for small samples of biometric data

    Directory of Open Access Journals (Sweden)

    Vladimir I. Volchikhin

    2017-06-01

    Full Text Available Introduction: The study aims to decrease the number of errors in calculating the correlation coefficient for small test samples. Materials and Methods: We used a simulation tool for the distribution functions of the density of correlation-coefficient values in small samples. A method for quantization of the data allows obtaining a discrete spectrum of states of one variety of correlation functional. This allows us to consider the proposed structure as a mathematical correlation molecule, described by an analogue of the continuous-quantum Schrödinger equation. Results: The chi-squared Pearson's molecule on small samples allows enhancing the power of the classical chi-squared test by up to 20 times. The mathematical correlation molecule described in the article has similar properties. In the future it should allow reducing the calculation errors of classical correlation coefficients in small samples. Discussion and Conclusions: The authors suggest that there are infinitely many mathematical molecules that are similar in their properties to actual physical molecules. Schrödinger equations are not unique; their analogues can be constructed for each mathematical molecule. One can expect a mathematical synthesis of molecules for a large number of known statistical tests and statistical moments. All this should make it possible to reduce calculation errors due to quantum effects that occur in small test samples.

  2. Addressing small sample size bias in multiple-biomarker trials: Inclusion of biomarker-negative patients and Firth correction.

    Science.gov (United States)

    Habermehl, Christina; Benner, Axel; Kopp-Schneider, Annette

    2018-03-01

    In recent years, numerous approaches for biomarker-based clinical trials have been developed. One of these developments is the multiple-biomarker trial, which aims to investigate multiple biomarkers simultaneously in independent subtrials. For low-prevalence biomarkers, small sample sizes within the subtrials have to be expected, as well as many biomarker-negative patients at the screening stage. The small sample sizes may make it unfeasible to analyze the subtrials individually. This imposes the need to develop new approaches for the analysis of such trials. With an expected large group of biomarker-negative patients, it seems reasonable to explore options to benefit from including them in such trials. We consider advantages and disadvantages of the inclusion of biomarker-negative patients in a multiple-biomarker trial with a survival endpoint. We discuss design options that include biomarker-negative patients in the study and address the issue of small sample size bias in such trials. We carry out a simulation study for a design where biomarker-negative patients are kept in the study and are treated with standard of care. We compare three different analysis approaches based on the Cox model to examine whether the inclusion of biomarker-negative patients can provide a benefit with respect to bias and variance of the treatment effect estimates. We apply the Firth correction to reduce the small sample size bias. The results of the simulation study suggest that for small sample situations, the Firth correction should be applied to adjust for the small sample size bias. In addition to the Firth penalty, the inclusion of biomarker-negative patients in the analysis can lead to further but small improvements in bias and standard deviation of the estimates. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Systematic studies of small scintillators for new sampling calorimeter

    International Nuclear Information System (INIS)

    Jacosalem, E.P.; Sanchez, A.L.C.; Bacala, A.M.; Iba, S.; Nakajima, N.; Ono, H.; Miyata, H.

    2007-01-01

    A new sampling calorimeter using very thin scintillators and the multi-pixel photon counter (MPPC) has been proposed to achieve better position resolution for the international linear collider (ILC) experiment. As part of this R and D study, small plastic scintillators of different sizes, thicknesses and wrapping reflectors are systematically studied. The scintillation light due to beta rays from a collimated 90Sr source is collected from the scintillator by a wavelength-shifting (WLS) fiber and converted into electrical signals at the PMT. The wrapping that gives the best light yield is determined by comparing the measured pulse height of each 10 x 40 x 2 mm strip scintillator covered with 3M reflective mirror film, teflon, white paint, black tape, gold, aluminum and white paint+teflon. The dependence of the pulse height on position, length and thickness is measured for the 3M reflective mirror film and teflon wrapped scintillators. Results show that the 3M radiant mirror film-wrapped scintillator has the greatest light yield, with an average of 9.2 photoelectrons. It is observed that the light yield increases slightly with scintillator length, but increases by about 100% when the WLS fiber diameter is increased from 1.0 mm to 1.6 mm. The position dependence measurement along the strip scintillator showed the uniformity of light transmission from the sensor to the PMT. A dip across the strip is observed which is 40% of the maximum pulse height. The pulse height of the block type scintillator, on the other hand, is found to be almost proportional to the scintillator thickness. (author)

  4. A compact time-of-flight SANS instrument optimised for measurements of small sample volumes at the European Spallation Source

    Energy Technology Data Exchange (ETDEWEB)

    Kynde, Søren, E-mail: kynde@nbi.ku.dk [Niels Bohr Institute, University of Copenhagen (Denmark); Hewitt Klenø, Kaspar [Niels Bohr Institute, University of Copenhagen (Denmark); Nagy, Gergely [SINQ, Paul Scherrer Institute (Switzerland); Mortensen, Kell; Lefmann, Kim [Niels Bohr Institute, University of Copenhagen (Denmark); Kohlbrecher, Joachim, E-mail: Joachim.kohlbrecher@psi.ch [SINQ, Paul Scherrer Institute (Switzerland); Arleth, Lise, E-mail: arleth@nbi.ku.dk [Niels Bohr Institute, University of Copenhagen (Denmark)

    2014-11-11

    The high flux at the European Spallation Source (ESS) will allow for performing experiments with relatively small beam-sizes while maintaining a high intensity of the incoming beam. The pulsed nature of the source makes the facility optimal for time-of-flight small-angle neutron scattering (ToF-SANS). We find that a relatively compact SANS instrument becomes the optimal choice in order to obtain the widest possible q-range in a single setting and the best possible exploitation of the neutrons in each pulse, and hence the highest possible flux at the sample position. The instrument proposed in the present article is optimised for performing fast measurements of small scattering volumes, typically down to 2×2×2 mm^3, while covering a broad q-range from about 0.005 1/Å to 0.5 1/Å in a single instrument setting. This q-range corresponds to that available at a typical good BioSAXS instrument and is relevant for a wide set of biomacromolecular samples. A central advantage of covering the whole q-range in a single setting is that each sample has to be loaded only once. This makes it convenient to use the fully automated high-throughput flow-through sample changers commonly applied at modern synchrotron BioSAXS-facilities. The central drawback of choosing a very compact instrument is that the resolution in terms of δλ/λ obtained with the short wavelength neutrons becomes worse than what is usually the standard at state-of-the-art SANS instruments. Our McStas based simulations of the instrument performance for a set of characteristic biomacromolecular samples show that the resulting smearing effects still have relatively minor effects on the obtained data and can be compensated for in the data analysis. However, in cases where a better resolution is required in combination with the large simultaneous q-range characteristic of the instrument, we show that this can be obtained by inserting a set of choppers.

  5. Determination of phosphorus in small amounts of protein samples by ICP-MS.

    Science.gov (United States)

    Becker, J Sabine; Boulyga, Sergei F; Pickhardt, Carola; Becker, J; Buddrus, Stefan; Przybylski, Michael

    2003-02-01

    Inductively coupled plasma mass spectrometry (ICP-MS) is used for phosphorus determination in protein samples. A small amount of solid protein sample (down to 1 µg) or digest (1-10 µL) protein solution was denatured in nitric acid and hydrogen peroxide by closed-microvessel microwave digestion. Phosphorus determination was performed with an optimized analytical method using a double-focusing sector field inductively coupled plasma mass spectrometer (ICP-SFMS) and quadrupole-based ICP-MS (ICP-QMS). For quality control of phosphorus determination a certified reference material (CRM), single cell proteins (BCR 273) with a high phosphorus content of 26.8 ± 0.4 mg g^-1, was analyzed. For studies on phosphorus determination in proteins while reducing the sample amount as low as possible, the homogeneity of CRM BCR 273 was investigated. Relative standard deviation and measurement accuracy in ICP-QMS was within 2%, 3.5%, 11% and 12% when using CRM BCR 273 sample weights of 40 mg, 5 mg, 1 mg and 0.3 mg, respectively. The lowest possible sample weight for an accurate phosphorus analysis in protein samples by ICP-MS is discussed. The analytical method developed was applied for the analysis of homogeneous protein samples in very low amounts [1-100 µg of solid protein sample, e.g. beta-casein, or down to 1 µL of protein or digest in solution (e.g., tau protein)]. A further reduction of the diluted protein solution volume was achieved by the application of flow injection in ICP-SFMS, which is discussed with reference to real protein digests after protein separation using 2D gel electrophoresis. The detection limits for phosphorus in biological samples were determined by ICP-SFMS down to the ng g^-1 level. The present work discusses the figure of merit for the determination of phosphorus in a small amount of protein sample with ICP-SFMS in comparison to ICP-QMS.

  6. Measurements of accurate x-ray scattering data of protein solutions using small stationary sample cells

    Science.gov (United States)

    Hong, Xinguo; Hao, Quan

    2009-01-01

    In this paper, we report a method of precise in situ x-ray scattering measurements on protein solutions using small stationary sample cells. Although reduction in the radiation damage induced by intense synchrotron radiation sources is indispensable for the correct interpretation of scattering data, there is still a lack of effective methods to overcome radiation-induced aggregation and extract scattering profiles free from chemical or structural damage. It is found that radiation-induced aggregation mainly begins on the surface of the sample cell and grows along the beam path; the diameter of the damaged region is comparable to the x-ray beam size. Radiation-induced aggregation can be effectively avoided by using a two-dimensional scan (2D mode), with an interval as small as 1.5 times the beam size, at low temperature (e.g., 4 °C). A radiation sensitive protein, bovine hemoglobin, was used to test the method. A standard deviation of less than 5% in the small angle region was observed from a series of nine spectra recorded in 2D mode, in contrast to the intensity variation seen using the conventional stationary technique, which can exceed 100%. Wide-angle x-ray scattering data were collected at a standard macromolecular diffraction station using the same data collection protocol and showed a good signal/noise ratio (better than the reported data on the same protein using a flow cell). The results indicate that this method is an effective approach for obtaining precise measurements of protein solution scattering.

  7. Measurements of accurate x-ray scattering data of protein solutions using small stationary sample cells

    International Nuclear Information System (INIS)

    Hong Xinguo; Hao Quan

    2009-01-01

    In this paper, we report a method of precise in situ x-ray scattering measurements on protein solutions using small stationary sample cells. Although reduction in the radiation damage induced by intense synchrotron radiation sources is indispensable for the correct interpretation of scattering data, there is still a lack of effective methods to overcome radiation-induced aggregation and extract scattering profiles free from chemical or structural damage. It is found that radiation-induced aggregation mainly begins on the surface of the sample cell and grows along the beam path; the diameter of the damaged region is comparable to the x-ray beam size. Radiation-induced aggregation can be effectively avoided by using a two-dimensional scan (2D mode), with an interval as small as 1.5 times the beam size, at low temperature (e.g., 4 deg. C). A radiation sensitive protein, bovine hemoglobin, was used to test the method. A standard deviation of less than 5% in the small angle region was observed from a series of nine spectra recorded in 2D mode, in contrast to the intensity variation seen using the conventional stationary technique, which can exceed 100%. Wide-angle x-ray scattering data were collected at a standard macromolecular diffraction station using the same data collection protocol and showed a good signal/noise ratio (better than the reported data on the same protein using a flow cell). The results indicate that this method is an effective approach for obtaining precise measurements of protein solution scattering.

  8. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks.

    Science.gov (United States)

    Zhang, Cuicui; Liang, Xuefeng; Matsuyama, Takashi

    2014-12-08

    Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a trait and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the "small sample size" (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0-1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.
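
    The paper's tailored 0-1 knapsack algorithm is not given in this record; the following is only a generic, hypothetical illustration of how a 0-1 knapsack dynamic program can select base classifiers whose total "redundancy cost" stays within a budget while the total validation accuracy is maximised. The accuracy and cost values are invented.

    def knapsack_select(values, weights, capacity):
        """Classic 0-1 knapsack dynamic program; returns indices of the selected items."""
        n = len(values)
        best = [[0.0] * (capacity + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            v, w = values[i - 1], weights[i - 1]
            for c in range(capacity + 1):
                best[i][c] = best[i - 1][c]
                if w <= c and best[i - 1][c - w] + v > best[i][c]:
                    best[i][c] = best[i - 1][c - w] + v
        chosen, c = [], capacity
        for i in range(n, 0, -1):          # backtrack to recover which classifiers were kept
            if best[i][c] != best[i - 1][c]:
                chosen.append(i - 1)
                c -= weights[i - 1]
        return sorted(chosen)

    accuracies = [0.71, 0.69, 0.74, 0.66, 0.72]   # hypothetical validation accuracies ("values")
    redundancy = [3, 2, 4, 1, 3]                  # hypothetical integer redundancy costs ("weights")
    print(knapsack_select(accuracies, redundancy, capacity=7))   # -> [0, 3, 4]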

  9. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks

    Directory of Open Access Journals (Sweden)

    Cuicui Zhang

    2014-12-01

    Full Text Available Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine if a given individual has already appeared over the camera network. Individual recognition often uses faces as a trait and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitation of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the “small sample size” (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0–1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.

  10. Rural and small-town attitudes about alcohol use during pregnancy: a community and provider sample.

    Science.gov (United States)

    Logan, T K; Walker, Robert; Nagle, Laura; Lewis, Jimmie; Wiesenhahn, Donna

    2003-01-01

    While there has been considerable research on prenatal alcohol use, there have been limited studies focused on women in rural and small-town environments. This 2-part study examines gender differences in attitudes and perceived barriers to intervention in a large community sample of persons living in rural and small-town environments in Kentucky (n = 3,346). The study also examines rural/small-town prenatal service providers' perceptions of barriers to assessment and intervention with pregnant substance abusers (n = 138). Surveys were administered to a convenience sample of employees and customers from 16 rural and small-town community outlets. There were 1503 males (45%) and 1843 females (55%) ranging in age from under 18 years old to over 66 years old. Surveys were also mailed to prenatal providers in county health departments of the 13-county study area, with 138 of 149 responding. Overall, results of the community sample suggest that neither males nor females were knowledgeable about the harmful effects of alcohol use during pregnancy. Results also indicate substantial gender differences in alcohol attitudes, knowledge, and perceived barriers. Further, prenatal care providers identified several barriers to assessment and treatment of pregnant women with alcohol use problems in rural and small-town communities, including lack of knowledge of and comfort with assessment as well as a lack of available and accessible treatment for referrals.

  11. Small angle neutron scattering and small angle X-ray scattering ...

    Indian Academy of Sciences (India)

    Abstract. The morphology of carbon nanofoam samples comprising platinum nanopar- ticles dispersed in the matrix was characterized by small angle neutron scattering (SANS) and small angle X-ray scattering (SAXS) techniques. Results show that the structure of pores of carbon matrix exhibits a mass (pore) fractal nature ...

  12. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has small-sample-size limitations. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method against the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability for all conditions except Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
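
    A minimal sketch of the general pooled-resampling idea (resampling both groups from the pooled data, which enforces the null hypothesis, and recomputing the t statistic) is given below. It is an illustration of the concept under stated assumptions, not the authors' implementation, and the data are simulated.

    import numpy as np
    from scipy import stats

    def pooled_bootstrap_t_test(x, y, n_boot=10_000, seed=None):
        """Two-sided bootstrap t-test that resamples both groups from the pooled data under H0."""
        rng = np.random.default_rng(seed)
        x, y = np.asarray(x, float), np.asarray(y, float)
        t_obs = stats.ttest_ind(x, y, equal_var=False).statistic
        pooled = np.concatenate([x, y])
        exceed = 0
        for _ in range(n_boot):
            xb = rng.choice(pooled, size=x.size, replace=True)
            yb = rng.choice(pooled, size=y.size, replace=True)
            exceed += abs(stats.ttest_ind(xb, yb, equal_var=False).statistic) >= abs(t_obs)
        return t_obs, (exceed + 1) / (n_boot + 1)   # add-one smoothing keeps p > 0

    # Small, skewed samples
    rng = np.random.default_rng(0)
    x = rng.lognormal(size=8)
    y = rng.lognormal(mean=0.8, size=8)
    print(pooled_bootstrap_t_test(x, y, seed=1))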

  13. Basic distribution free identification tests for small size samples of environmental data

    International Nuclear Information System (INIS)

    Federico, A.G.; Musmeci, F.

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples contain a small number of data points, and the assumption of normal distributions is often not realistic. On the other hand, the spread of today's powerful personal computers opens new opportunities based on a massive use of CPU resources. The paper reviews the problem and introduces the feasibility of two non-parametric approaches based on intrinsic equiprobability properties of the data samples. The first one is based on a full resampling while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given, based on the Chernobyl children contamination data.

  14. Mutational status of synchronous and metachronous tumor samples in patients with metastatic non-small-cell lung cancer

    International Nuclear Information System (INIS)

    Quéré, Gilles; Descourt, Renaud; Robinet, Gilles; Autret, Sandrine; Raguenes, Odile; Fercot, Brigitte; Alemany, Pierre; Uguen, Arnaud; Férec, Claude; Quintin-Roué, Isabelle; Le Gac, Gérald

    2016-01-01

    Despite reported discordance between the mutational status of primary lung cancers and their metastases, metastatic sites are rarely biopsied and targeted therapy is guided by genetic biomarkers detected in the primary tumor. This situation is mostly explained by the apparent stability of EGFR-activating mutations. Given the dramatic increase in the range of candidate drugs and high rates of drug resistance, rebiopsy or liquid biopsy may become widespread. The purpose of this study was to test genetic biomarkers used in clinical practice (EGFR, ALK) and candidate biomarkers identified by the French National Cancer Institute (KRAS, BRAF, PIK3CA, HER2) in patients with metastatic non-small-cell lung cancer for whom two tumor samples were available. A retrospective study identified 88 tumor samples collected synchronously or metachronously, from the same or two different sites, in 44 patients. Mutation analysis used SNaPshot (EGFR, KRAS, BRAF missense mutations), pyrosequencing (EGFR and PIK3CA missense mutations), sizing assays (EGFR and HER2 indels) and IHC and/or FISH (ALK rearrangements). About half the patients (52 %) harbored at least one mutation. Five patients had an activating mutation of EGFR in both the primary tumor and the metastasis. The T790M resistance mutation was detected in metastases in 3 patients with acquired resistance to EGFR tyrosine kinase inhibitors. FISH showed discordance in ALK status between a small biopsy sample and the surgical specimen. KRAS mutations were observed in 36 % of samples, six patients (14 %) having discordant genotypes; all discordances concerned sampling from different sites. Two patients (5 %) showed PI3KCA mutations. One metastasis harbored both PI3KCA and KRAS mutations, while the synchronously sampled primary tumor was mutation free. No mutations were detected in BRAF and HER2. This study highlighted noteworthy intra-individual discordance in KRAS mutational status, whereas EGFR status was stable. Intratumoral

  15. Decoder calibration with ultra small current sample set for intracortical brain-machine interface

    Science.gov (United States)

    Zhang, Peng; Ma, Xuan; Chen, Luyao; Zhou, Jin; Wang, Changyong; Li, Wei; He, Jiping

    2018-04-01

    Objective. Intracortical brain-machine interfaces (iBMIs) aim to restore efficient communication and movement ability for paralyzed patients. However, frequent recalibration is required for consistency and reliability, and every recalibration requires a relatively large current sample set. The aim of this study is to develop an effective decoder calibration method that can achieve good performance while minimizing recalibration time. Approach. Two rhesus macaques implanted with intracortical microelectrode arrays were trained separately on a movement and a sensory paradigm. Neural signals were recorded to decode reaching positions or grasping postures. A novel principal component analysis-based domain adaptation (PDA) method was proposed to recalibrate the decoder with only an ultra-small current sample set by taking advantage of large historical data, and the decoding performance was compared with three other calibration methods for evaluation. Main results. The PDA method closed the gap between historical and current data effectively, and made it possible to take advantage of large historical data for decoder recalibration in current data decoding. Using only an ultra-small current sample set (five trials of each category), the decoder calibrated using the PDA method achieved much better and more robust performance in all sessions than the other three calibration methods in both monkeys. Significance. (1) With this study, transfer learning theory was brought into iBMI decoder calibration for the first time. (2) Unlike most transfer learning studies, the target data in this study were an ultra-small sample set and were transferred to the source data. (3) By taking advantage of historical data, the PDA method was demonstrated to be effective in reducing recalibration time for both the movement paradigm and the sensory paradigm, indicating a viable generalization. By reducing the demand for large current training data, this new method may facilitate the application
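
    The paper's PDA algorithm is not reproduced in this record; below is only a generic, hypothetical sketch of one common PCA-based domain adaptation idea (subspace alignment between a large historical session and a tiny current sample) to illustrate the general concept. All dimensions, data and the Ridge decoder are invented and are not the authors' method.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    d, k = 60, 10                                 # channels, subspace dimension (hypothetical)
    W = rng.normal(size=(d, 2))                   # true mapping from features to 2-D reach position

    X_hist = rng.normal(size=(2000, d))                       # large historical session
    y_hist = X_hist @ W + rng.normal(scale=0.1, size=(2000, 2))
    drift = rng.normal(scale=0.3, size=d)                     # recording non-stationarity
    X_curr = rng.normal(size=(5, d)) + drift                  # ultra-small current sample, shifted statistics
    y_curr = (X_curr - drift) @ W + rng.normal(scale=0.1, size=(5, 2))

    Ps = PCA(k).fit(X_hist).components_.T                         # source (historical) principal subspace
    Pt = PCA(min(k, X_curr.shape[0])).fit(X_curr).components_.T   # target (current) principal subspace
    M = Ps.T @ Pt                                                 # alignment matrix between the two subspaces
    X_hist_aligned = X_hist @ Ps @ M                              # historical data expressed in target coordinates
    X_curr_proj = X_curr @ Pt

    decoder = Ridge(alpha=1.0).fit(np.vstack([X_hist_aligned, X_curr_proj]),
                                   np.vstack([y_hist, y_curr]))
    print("decoder trained on", X_hist_aligned.shape[0] + X_curr_proj.shape[0], "aligned trials")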

  16. Bayesian estimation of P(X > x) from a small sample of Gaussian data

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    2017-01-01

    The classical statistical uncertainty problem of estimation of upper tail probabilities on the basis of a small sample of observations of a Gaussian random variable is considered. Predictive posterior estimation is discussed, adopting the standard statistical model with diffuse priors of the two...
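
    For context, the standard textbook result for this setting (diffuse priors on the Gaussian mean and variance) is that the predictive distribution of a new observation is Student-t with n-1 degrees of freedom, location x̄ and scale s*sqrt(1+1/n). The short sketch below only illustrates that generic result with made-up numbers; it is not the paper's full treatment.

    import numpy as np
    from scipy import stats

    x = np.array([3.1, 2.7, 3.4, 2.9, 3.0, 3.3])   # hypothetical small Gaussian sample
    n, xbar, s = x.size, x.mean(), x.std(ddof=1)
    threshold = 4.0

    # Predictive posterior under diffuse priors: Student-t(n-1) at xbar with scale s*sqrt(1+1/n)
    p_pred = stats.t.sf(threshold, n - 1, loc=xbar, scale=s * np.sqrt(1 + 1 / n))
    # Naive plug-in estimate for comparison; it ignores parameter uncertainty
    p_plugin = stats.norm.sf(threshold, loc=xbar, scale=s)
    print(f"predictive P(X > {threshold}) = {p_pred:.4f}, plug-in = {p_plugin:.4f}")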

  17. Small-vessel Survey and Auction Sampling to Estimate Growth and Maturity of Eteline Snappers

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Small-vessel Survey and Auction Sampling to Estimate Growth and Maturity of Eteline Snappers and Improve Data-Limited Stock Assessments. This biosampling project...

  18. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
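
    As a rough illustration of the kind of analysis described above (not the authors' code), the sketch below clusters a simulated participant-by-code binary matrix with hierarchical clustering on Jaccard distances and with K-means; all data are synthetic.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    # 50 participants x 12 binary codes (1 = code present in the interview), two latent profiles
    profile_a = rng.random((25, 12)) < np.tile([0.8, 0.1], 6)
    profile_b = rng.random((25, 12)) < np.tile([0.1, 0.8], 6)
    X = np.vstack([profile_a, profile_b]).astype(int)

    # Hierarchical clustering on Jaccard distances between the binary code profiles
    Z = linkage(pdist(X.astype(bool), metric="jaccard"), method="average")
    hier_labels = fcluster(Z, t=2, criterion="maxclust")

    # K-means treats the 0/1 codes as coordinates; adequate here for a quick comparison
    km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    print("hierarchical cluster sizes:", np.bincount(hier_labels)[1:])
    print("k-means cluster sizes:     ", np.bincount(km_labels))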

  19. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  20. Interval estimation methods of the mean in small sample situation and the results' comparison

    International Nuclear Information System (INIS)

    Wu Changli; Guo Chunying; Jiang Meng; Lin Yuangen

    2009-01-01

    The methods of interval estimation for the sample mean, namely the classical method, the Bootstrap method, the Bayesian Bootstrap method, the Jackknife method and the spread method of the empirical characteristic distribution function, are described. Numerical calculations of the mean intervals are carried out for samples of size 4, 5 and 6. The results indicate that the Bootstrap method and the Bayesian Bootstrap method are much more appropriate than the others in small-sample situations. (authors)
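
    A minimal sketch of three of the interval methods named above (classical t, ordinary bootstrap percentile, and Bayesian bootstrap), applied to a hypothetical sample of size 5, is given below for illustration only.

    import numpy as np
    from scipy import stats

    x = np.array([2.1, 2.5, 1.9, 2.8, 2.3])   # hypothetical small sample
    n, alpha = x.size, 0.05
    rng = np.random.default_rng(0)

    # Classical t interval
    half = stats.t.ppf(1 - alpha / 2, n - 1) * x.std(ddof=1) / np.sqrt(n)
    print("classical t:         ", (x.mean() - half, x.mean() + half))

    # Ordinary bootstrap percentile interval
    boot_means = rng.choice(x, size=(20_000, n), replace=True).mean(axis=1)
    print("bootstrap percentile:", tuple(np.percentile(boot_means, [2.5, 97.5])))

    # Bayesian bootstrap: Dirichlet(1,...,1) weights instead of multinomial counts
    bb_means = rng.dirichlet(np.ones(n), size=20_000) @ x
    print("Bayesian bootstrap:  ", tuple(np.percentile(bb_means, [2.5, 97.5])))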

  1. Calculation of coincidence summing corrections for a specific small soil sample geometry

    Energy Technology Data Exchange (ETDEWEB)

    Helmer, R.G.; Gehrke, R.J.

    1996-10-01

    Previously, a system was developed at the INEL for measuring the γ-ray emitting nuclides in small soil samples for the purpose of environmental monitoring. These samples were counted close to a ~20% Ge detector and, therefore, it was necessary to take into account the coincidence summing that occurs for some nuclides. In order to improve the technical basis for the coincidence summing corrections, the authors have carried out a study of the variation in the coincidence summing probability with position within the sample volume. A Monte Carlo electron and photon transport code (CYLTRAN) was used to compute peak and total efficiencies for various photon energies from 30 to 2,000 keV at 30 points throughout the sample volume. The geometry for these calculations included the various components of the detector and source along with the shielding. The associated coincidence summing corrections were computed at these 30 positions in the sample volume and then averaged for the whole source. The influence of the soil and the detector shielding on the efficiencies was investigated.

  2. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications.

    Directory of Open Access Journals (Sweden)

    Elias Chaibub Neto

    Full Text Available In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compared its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and number of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably/considerably faster for small/moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling.
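
    The paper's implementation is in R; the following NumPy sketch only illustrates the multinomial-weighting idea for Pearson's correlation (bootstrap replications computed as weighted sample moments, with no explicit resampling of the data), using simulated values.

    import numpy as np

    def vectorized_bootstrap_corr(x, y, n_boot=10_000, seed=None):
        """Bootstrap replications of Pearson's r from multinomial weights, with no resampling loop."""
        rng = np.random.default_rng(seed)
        n = len(x)
        # Each row holds multinomial counts summing to n, rescaled to resampling weights
        W = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot) / n
        mx, my = W @ x, W @ y                      # weighted first moments
        sxy = W @ (x * y) - mx * my                # weighted (co)variances from weighted second moments
        sxx = W @ (x * x) - mx ** 2
        syy = W @ (y * y) - my ** 2
        return sxy / np.sqrt(sxx * syy)

    rng = np.random.default_rng(1)
    x = rng.normal(size=30)
    y = 0.5 * x + rng.normal(size=30)
    reps = vectorized_bootstrap_corr(x, y, seed=2)
    print("95% percentile CI for r:", np.percentile(reps, [2.5, 97.5]))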

  3. QNB: differential RNA methylation analysis for count-based small-sample sequencing data with a quad-negative binomial model.

    Science.gov (United States)

    Liu, Lian; Zhang, Shao-Wu; Huang, Yufei; Meng, Jia

    2017-08-31

    As a newly emerged research area, RNA epigenetics has drawn increasing attention recently because of the participation of RNA methylation and other modifications in a number of crucial biological processes. Thanks to high throughput sequencing techniques, such as MeRIP-Seq, transcriptome-wide RNA methylation profiles are now available in the form of count-based data, with which it is often of interest to study the dynamics at the epitranscriptomic layer. However, the sample size of an RNA methylation experiment is usually very small due to its cost; additionally, there usually exist a large number of genes whose methylation level cannot be accurately estimated due to their low expression level, making differential RNA methylation analysis a difficult task. We present QNB, a statistical approach for differential RNA methylation analysis with count-based small-sample sequencing data. Compared with previous approaches such as the DRME model, which is based on a statistical test covering the IP samples only with 2 negative binomial distributions, QNB is based on 4 independent negative binomial distributions with their variances and means linked by local regressions, and in this way the input control samples are also properly taken care of. In addition, unlike the DRME approach, which relies on the input control sample only for estimating the background, QNB uses a more robust estimator for gene expression by combining information from both input and IP samples, which could largely improve the testing performance for very lowly expressed genes. QNB showed improved performance on both simulated and real MeRIP-Seq datasets when compared with competing algorithms. The QNB model is also applicable to other datasets related to RNA modifications, including but not limited to RNA bisulfite sequencing, m1A-Seq, PAR-CLIP, RIP-Seq, etc.

  4. Show small close comfort and listen - How to overcome barriers in the use of social media

    NARCIS (Netherlands)

    Verjans, Steven

    2011-01-01

    Verjans, S. (2011, 23 November). Show small close comfort and listen - How to overcome barriers in the use of social media. Presentation at the SVEA Final Conference "Next Generation Learning - How to Integrate Social Media in Vocational and Adult Training", Brussels, Belgium.

  5. Investigation of Phase Transition-Based Tethered Systems for Small Body Sample Capture

    Science.gov (United States)

    Quadrelli, Marco; Backes, Paul; Wilkie, Keats; Giersch, Lou; Quijano, Ubaldo; Scharf, Daniel; Mukherjee, Rudranarayan

    2009-01-01

    This paper summarizes the modeling, simulation, and testing work related to the development of technology to investigate the potential that shape memory actuation has to provide mechanically simple and affordable solutions for delivering assets to a surface and for sample capture and possible return to Earth. We investigate the structural dynamics and controllability aspects of an adaptive beam carrying an end-effector which, by changing equilibrium phases is able to actively decouple the end-effector dynamics from the spacecraft dynamics during the surface contact phase. Asset delivery and sample capture and return are at the heart of several emerging potential missions to small bodies, such as asteroids and comets, and to the surface of large bodies, such as Titan.

  6. Correcting Model Fit Criteria for Small Sample Latent Growth Models with Incomplete Data

    Science.gov (United States)

    McNeish, Daniel; Harring, Jeffrey R.

    2017-01-01

    To date, small sample problems with latent growth models (LGMs) have not received the same amount of attention in the literature as those with related mixed-effect models (MEMs). Although many models can be interchangeably framed as a LGM or a MEM, LGMs uniquely provide criteria to assess global data-model fit. However, previous studies have demonstrated poor…

  7. Using Data-Dependent Priors to Mitigate Small Sample Bias in Latent Growth Models: A Discussion and Illustration Using M"plus"

    Science.gov (United States)

    McNeish, Daniel M.

    2016-01-01

    Mixed-effects models (MEMs) and latent growth models (LGMs) are often considered interchangeable save the discipline-specific nomenclature. Software implementations of these models, however, are not interchangeable, particularly with small sample sizes. Restricted maximum likelihood estimation that mitigates small sample bias in MEMs has not been…

  8. Vertical Sampling Scales for Atmospheric Boundary Layer Measurements from Small Unmanned Aircraft Systems (sUAS

    Directory of Open Access Journals (Sweden)

    Benjamin L. Hemingway

    2017-09-01

    Full Text Available The lowest portion of the Earth’s atmosphere, known as the atmospheric boundary layer (ABL), plays an important role in the formation of weather events. Simple meteorological measurements collected from within the ABL, such as temperature, pressure, humidity, and wind velocity, are key to understanding the exchange of energy within this region, but conventional surveillance techniques such as towers, radar, weather balloons, and satellites do not provide adequate spatial and/or temporal coverage for monitoring weather events. Small unmanned aircraft, or aerial, systems (sUAS) provide a versatile, dynamic platform for atmospheric sensing that can provide higher spatio-temporal sampling frequencies than available through most satellite sensing methods. They are also able to sense portions of the atmosphere that cannot be measured from ground-based radar, weather stations, or weather balloons and have the potential to fill gaps in atmospheric sampling. However, research on the vertical sampling scales for collecting atmospheric measurements from sUAS and the variabilities of these scales across atmospheric phenomena (e.g., temperature and humidity) is needed. The objective of this study is to use variogram analysis, a common geostatistical technique, to determine optimal spatial sampling scales for two atmospheric variables (temperature and relative humidity) captured from sUAS. Results show that vertical sampling scales of approximately 3 m for temperature and 1.5–2 m for relative humidity were sufficient to capture the spatial structure of these phenomena under the conditions tested. Future work is needed to model these scales across the entire ABL as well as under variable conditions.
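
    As a rough illustration of the variogram analysis mentioned above (not the authors' processing chain), the sketch below computes an empirical semivariogram for a simulated vertical temperature profile; the lag at which the semivariance levels off suggests the coarsest vertical sampling interval that still captures the spatial structure. All numbers are synthetic.

    import numpy as np

    def empirical_semivariogram(z, values, lags, tol):
        """gamma(h) = 0.5 * mean[(v_i - v_j)^2] over pairs whose vertical separation is within h +/- tol."""
        z, values = np.asarray(z, float), np.asarray(values, float)
        dz = np.abs(z[:, None] - z[None, :])            # pairwise vertical separations
        dv2 = (values[:, None] - values[None, :]) ** 2  # pairwise squared differences
        gamma = []
        for h in lags:
            mask = (dz > h - tol) & (dz <= h + tol)
            gamma.append(0.5 * dv2[mask].mean() if mask.any() else np.nan)
        return np.array(gamma)

    # Synthetic vertical profile: 0-120 m at 0.5 m spacing, smooth lapse-rate trend plus fine-scale noise
    rng = np.random.default_rng(0)
    z = np.arange(0, 120, 0.5)
    temp = 20.0 - 0.0098 * z + rng.normal(scale=0.2, size=z.size)

    lags = np.arange(0.5, 20.0, 0.5)
    gamma = empirical_semivariogram(z, temp, lags, tol=0.25)
    # The lag where gamma stops increasing (the "range") suggests the coarsest useful sampling interval
    print(np.column_stack([lags, gamma])[:10])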

  9. Polygenic scores predict alcohol problems in an independent sample and show moderation by the environment.

    Science.gov (United States)

    Salvatore, Jessica E; Aliev, Fazil; Edwards, Alexis C; Evans, David M; Macleod, John; Hickman, Matthew; Lewis, Glyn; Kendler, Kenneth S; Loukola, Anu; Korhonen, Tellervo; Latvala, Antti; Rose, Richard J; Kaprio, Jaakko; Dick, Danielle M

    2014-04-10

    Alcohol problems represent a classic example of a complex behavioral outcome that is likely influenced by many genes of small effect. A polygenic approach, which examines aggregate measured genetic effects, can have predictive power in cases where individual genes or genetic variants do not. In the current study, we first tested whether polygenic risk for alcohol problems-derived from genome-wide association estimates of an alcohol problems factor score from the age 18 assessment of the Avon Longitudinal Study of Parents and Children (ALSPAC; n = 4304 individuals of European descent; 57% female)-predicted alcohol problems earlier in development (age 14) in an independent sample (FinnTwin12; n = 1162; 53% female). We then tested whether environmental factors (parental knowledge and peer deviance) moderated polygenic risk to predict alcohol problems in the FinnTwin12 sample. We found evidence for both polygenic association and for additive polygene-environment interaction. Higher polygenic scores predicted a greater number of alcohol problems (range of Pearson partial correlations 0.07-0.08, all p-values ≤ 0.01). Moreover, genetic influences were significantly more pronounced under conditions of low parental knowledge or high peer deviance (unstandardized regression coefficients (b), p-values (p), and percent of variance (R2) accounted for by interaction terms: b = 1.54, p = 0.02, R2 = 0.33%; b = 0.94, p = 0.04, R2 = 0.30%, respectively). Supplementary set-based analyses indicated that the individual top single nucleotide polymorphisms (SNPs) contributing to the polygenic scores were not individually enriched for gene-environment interaction. Although the magnitude of the observed effects are small, this study illustrates the usefulness of polygenic approaches for understanding the pathways by which measured genetic predispositions come together with environmental factors to predict complex behavioral outcomes.
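
    A minimal, hypothetical sketch of the kind of polygene-by-environment moderation model described above (a polygenic score, an environmental moderator and their interaction predicting an alcohol-problems score) is shown below with simulated data; the variable names and effect sizes are invented and the ALSPAC/FinnTwin12 data are not used.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 1000
    df = pd.DataFrame({
        "prs": rng.normal(size=n),             # standardised polygenic risk score (hypothetical)
        "peer_deviance": rng.normal(size=n),   # environmental moderator (hypothetical)
    })
    # Simulate a stronger genetic effect when peer deviance is high (additive GxE) plus noise
    df["alc_problems"] = (0.10 * df.prs + 0.20 * df.peer_deviance
                          + 0.15 * df.prs * df.peer_deviance + rng.normal(size=n))

    model = smf.ols("alc_problems ~ prs * peer_deviance", data=df).fit()
    print(model.summary().tables[1])   # the prs:peer_deviance row is the moderation (GxE) test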

  10. Polygenic Scores Predict Alcohol Problems in an Independent Sample and Show Moderation by the Environment

    Directory of Open Access Journals (Sweden)

    Jessica E. Salvatore

    2014-04-01

    Full Text Available Alcohol problems represent a classic example of a complex behavioral outcome that is likely influenced by many genes of small effect. A polygenic approach, which examines aggregate measured genetic effects, can have predictive power in cases where individual genes or genetic variants do not. In the current study, we first tested whether polygenic risk for alcohol problems—derived from genome-wide association estimates of an alcohol problems factor score from the age 18 assessment of the Avon Longitudinal Study of Parents and Children (ALSPAC; n = 4304 individuals of European descent; 57% female—predicted alcohol problems earlier in development (age 14 in an independent sample (FinnTwin12; n = 1162; 53% female. We then tested whether environmental factors (parental knowledge and peer deviance moderated polygenic risk to predict alcohol problems in the FinnTwin12 sample. We found evidence for both polygenic association and for additive polygene-environment interaction. Higher polygenic scores predicted a greater number of alcohol problems (range of Pearson partial correlations 0.07–0.08, all p-values ≤ 0.01. Moreover, genetic influences were significantly more pronounced under conditions of low parental knowledge or high peer deviance (unstandardized regression coefficients (b, p-values (p, and percent of variance (R2 accounted for by interaction terms: b = 1.54, p = 0.02, R2 = 0.33%; b = 0.94, p = 0.04, R2 = 0.30%, respectively. Supplementary set-based analyses indicated that the individual top single nucleotide polymorphisms (SNPs contributing to the polygenic scores were not individually enriched for gene-environment interaction. Although the magnitude of the observed effects are small, this study illustrates the usefulness of polygenic approaches for understanding the pathways by which measured genetic predispositions come together with environmental factors to predict complex behavioral outcomes.

  11. Evaluation of Approaches to Analyzing Continuous Correlated Eye Data When Sample Size Is Small.

    Science.gov (United States)

    Huang, Jing; Huang, Jiayan; Chen, Yong; Ying, Gui-Shuang

    2018-02-01

    To evaluate the performance of commonly used statistical methods for analyzing continuous correlated eye data when sample size is small. We simulated correlated continuous data from two designs: (1) two eyes of a subject in two comparison groups; (2) two eyes of a subject in the same comparison group, under various sample sizes (5-50), inter-eye correlations (0-0.75) and effect sizes (0-0.8). Simulated data were analyzed using the paired t-test, two-sample t-test, Wald test and score test using generalized estimating equations (GEE), and F-test using a linear mixed effects model (LMM). We compared type I error rates and statistical powers, and demonstrated analysis approaches through analyzing two real datasets. In design 1, the paired t-test and LMM perform better than GEE, with nominal type I error rate and higher statistical power. In design 2, no test performs uniformly well: the two-sample t-test (average of two eyes or a random eye) achieves better control of type I error but yields lower statistical power. In both designs, the GEE Wald test inflates the type I error rate and the GEE score test has lower power. When sample size is small, some commonly used statistical methods do not perform well. The paired t-test and LMM perform best when two eyes of a subject are in two different comparison groups, and the t-test using the average of two eyes performs best when the two eyes are in the same comparison group. When selecting the appropriate analysis approach, the study design should be considered.
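
    A minimal simulation in the spirit of the comparison above: under the null, it checks the type I error of a paired t-test (two eyes of a subject in different groups) and of a two-sample t-test on subject means (two eyes in the same group) for correlated eye data. The parameters and the restriction to t-tests are assumptions for the sketch; the study also examined GEE and LMM.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        def type_i_error(n_subjects=20, rho=0.5, n_sim=5000, alpha=0.05):
            cov = np.array([[1.0, rho], [rho, 1.0]])
            reject_paired = reject_two_sample = 0
            for _ in range(n_sim):
                eyes = rng.multivariate_normal([0.0, 0.0], cov, size=n_subjects)  # (left, right) eye values
                # Design 1: the two eyes of each subject receive different treatments -> paired t-test
                if stats.ttest_rel(eyes[:, 0], eyes[:, 1]).pvalue < alpha:
                    reject_paired += 1
                # Design 2: both eyes in the same group; compare subject means between two groups
                other = rng.multivariate_normal([0.0, 0.0], cov, size=n_subjects)
                if stats.ttest_ind(eyes.mean(axis=1), other.mean(axis=1)).pvalue < alpha:
                    reject_two_sample += 1
            return reject_paired / n_sim, reject_two_sample / n_sim

        print(type_i_error())   # both rates should sit near the nominal 0.05 under the null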

  12. The use of secondary ion mass spectrometry in forensic analyses of ultra-small samples

    Science.gov (United States)

    Cliff, John

    2010-05-01

    less than 300 picograms of bacteria. Perhaps the most successful application of SIMS for forensic purposes to date is in the field of nuclear forensics. An example that has been used by laboratories associated with the International Atomic Energy Agency is the examination of environmental samples for enriched uranium particles indicative of clandestine weapons production activities.. The analytical challenge in these types of measurements is to search complex environmental matrices for U-bearing particles which must then be analyzed for 234U, 235U, and 236U content with high precision and accuracy. Older-generation SIMS instruments were hampered by small geometries that made resolution of significant interferences problematic. In addition, automated particle search software was proprietary and difficult to obtain. With the development of new search software, the IMS 1280 is capable of searching a sample in a matter of hours, flagging U-bearing particles for later analyses, and providing a rough 235U content. Particles of interest can be revisited for high precision analyses, and all U-isotopes can be measured simultaneously in multicollector mode, dramatically improving analysis time and internal precision. Further, the large geometry of the instrument allows complete resolution of isobaric interferences that have traditionally limited SIMS analyses of difficult samples. Examples of analyses of micron-sized standard particles indicate that estimates of 235U enrichment can be obtained with an external relative precision of 0.1% and 234U and 236U contents can be obtained with a relative precision of less than 1%. Analyses of 'real' samples show a dramatic improvement in the data quality obtained compared with small-geometry SIMS instruments making SIMS the method of choice for these high-profile samples when accurate, precise, and rapid results are required.

  13. Sensitivity and specificity of normality tests and consequences on reference interval accuracy at small sample size: a computer-simulation study.

    Science.gov (United States)

    Le Boedec, Kevin

    2016-12-01

    According to international guidelines, parametric methods must be chosen for RI construction when the sample size is small and the distribution is Gaussian. However, normality tests may not be accurate at small sample size. The purpose of the study was to evaluate normality test performance to properly identify samples extracted from a Gaussian population at small sample sizes, and assess the consequences on RI accuracy of applying parametric methods to samples that falsely identified the parent population as Gaussian. Samples of n = 60 and n = 30 values were randomly selected 100 times from simulated Gaussian, lognormal, and asymmetric populations of 10,000 values. The sensitivity and specificity of 4 normality tests were compared. Reference intervals were calculated using 6 different statistical methods from samples that falsely identified the parent population as Gaussian, and their accuracy was compared. Shapiro-Wilk and D'Agostino-Pearson tests were the best performing normality tests. However, their specificity was poor at sample size n = 30. Applying nonparametric methods (or a Box-Cox transformation) to all samples regardless of their distribution, or adjusting the significance level of normality tests depending on sample size, would limit the risk of constructing inaccurate RI. © 2016 American Society for Veterinary Clinical Pathology.
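
    The evaluation described above can be mimicked with a short simulation; the sketch below checks, for an illustrative Gaussian and an illustrative lognormal parent at n = 30, how often Shapiro-Wilk and D'Agostino-Pearson keep the Gaussian samples and reject the lognormal ones. The population parameters and the 0.05 level are assumptions, so the numbers will not reproduce the paper's.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n, n_sim, alpha = 30, 1000, 0.05
        tests = {"Shapiro-Wilk": stats.shapiro, "D'Agostino-Pearson": stats.normaltest}
        kept_gaussian = {name: 0 for name in tests}
        rejected_lognormal = {name: 0 for name in tests}

        for _ in range(n_sim):
            gaussian = rng.normal(50.0, 5.0, n)                      # sample from a Gaussian parent
            lognormal = rng.lognormal(mean=3.9, sigma=0.4, size=n)   # sample from a skewed parent
            for name, test in tests.items():
                if test(gaussian).pvalue >= alpha:
                    kept_gaussian[name] += 1
                if test(lognormal).pvalue < alpha:
                    rejected_lognormal[name] += 1

        for name in tests:
            print(f"{name}: Gaussian kept {kept_gaussian[name] / n_sim:.2f}, "
                  f"lognormal rejected {rejected_lognormal[name] / n_sim:.2f}")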

  14. A Rational Approach for Discovering and Validating Cancer Markers in Very Small Samples Using Mass Spectrometry and ELISA Microarrays

    Directory of Open Access Journals (Sweden)

    Richard C. Zangar

    2004-01-01

    Full Text Available Identifying useful markers of cancer can be problematic due to limited amounts of sample. Some samples such as nipple aspirate fluid (NAF) or early-stage tumors are inherently small. Other samples such as serum are collected in larger volumes but archives of these samples are very valuable and only small amounts of each sample may be available for a single study. Also, given the diverse nature of cancer and the inherent variability in individual protein levels, it seems likely that the best approach to screen for cancer will be to determine the profile of a battery of proteins. As a result, a major challenge in identifying protein markers of disease is the ability to screen many proteins using very small amounts of sample. In this review, we outline some technological advances in proteomics that greatly advance this capability. Specifically, we propose a strategy for identifying markers of breast cancer in NAF that utilizes mass spectrometry (MS) to simultaneously screen hundreds or thousands of proteins in each sample. The best potential markers identified by the MS analysis can then be extensively characterized using an ELISA microarray assay. Because the microarray analysis is quantitative and large numbers of samples can be efficiently analyzed, this approach offers the ability to rapidly assess a battery of selected proteins in a manner that is directly relevant to traditional clinical assays.

  15. Measurement of phthalates in small samples of mammalian tissue

    International Nuclear Information System (INIS)

    Acott, P.D.; Murphy, M.G.; Ogborn, M.R.; Crocker, J.F.S.

    1987-01-01

    Di-(2-ethylhexyl)-phthalate (DEHP) is a phthalic acid ester that is used as a plasticizer in polyvinyl chloride products, many of which have widespread medical application. DEHP has been shown to be leached from products used for storage and delivery of blood transfusions during procedures such as plasmapheresis, hemodialysis and open heart surgery. Results of studies in this laboratory have suggested that there is an association between the absorption and deposition of DEHP (and/or related chemicals) in the kidney and the acquired renal cystic disease (ACD) frequently seen in patients who have undergone prolonged dialysis treatment. In order to determine the relationship between the two, it has been necessary to establish a method for extracting and accurately quantitating minute amounts of these chemicals in small tissue samples. The authors have now established such a method using kidneys from normal rats and from a rat model for ACD.

  16. A novel approach for small sample size family-based association studies: sequential tests.

    Science.gov (United States)

    Ilk, Ozlem; Rajabli, Farid; Dungul, Dilay Ciglidag; Ozdag, Hilal; Ilk, Hakki Gokhan

    2011-08-01

    In this paper, we propose a sequential probability ratio test (SPRT) to overcome the problem of limited samples in studies related to complex genetic diseases. The results of this novel approach are compared with the ones obtained from the traditional transmission disequilibrium test (TDT) on simulated data. Although TDT classifies single-nucleotide polymorphisms (SNPs) to only two groups (SNPs associated with the disease and the others), SPRT has the flexibility of assigning SNPs to a third group, that is, those for which we do not have enough evidence and should keep sampling. It is shown that SPRT results in smaller ratios of false positives and negatives, as well as better accuracy and sensitivity values for classifying SNPs when compared with TDT. By using SPRT, data with small sample size become usable for an accurate association analysis.
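
    A compact sketch of the sequential idea described above: accumulate a log-likelihood ratio over transmission observations and stop when it crosses Wald's boundaries, otherwise keep sampling. The hypothesized transmission probabilities and error rates below are illustrative, not the values used in the paper.

        import math

        def sprt(transmissions, alpha=0.05, beta=0.05, p0=0.5, p1=0.6):
            lower = math.log(beta / (1 - alpha))     # accept H0 (no association) below this boundary
            upper = math.log((1 - beta) / alpha)     # accept H1 (association) above this boundary
            llr = 0.0
            for i, x in enumerate(transmissions, start=1):   # x = 1 if the risk allele was transmitted
                llr += x * math.log(p1 / p0) + (1 - x) * math.log((1 - p1) / (1 - p0))
                if llr >= upper:
                    return f"associated (decided after {i} observations)"
                if llr <= lower:
                    return f"not associated (decided after {i} observations)"
            return "keep sampling"

        print(sprt([1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1]))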

  17. Report of the advisory group meeting on elemental analysis of extremely small samples

    International Nuclear Information System (INIS)

    2002-01-01

    This publication contains a summary of the discussions held at the meeting, with a brief description and comparative characteristics of the most common nuclear analytical techniques used for the analysis of very small samples, as well as the conclusions of the meeting. Some aspects of reference materials and quality control are also discussed. The publication also contains individual contributions made by the participants; each of these papers is provided with an abstract and indexed separately

  18. Predicting Antitumor Activity of Peptides by Consensus of Regression Models Trained on a Small Data Sample

    Directory of Open Access Journals (Sweden)

    Ivanka Jerić

    2011-11-01

    Full Text Available Predicting antitumor activity of compounds using regression models trained on a small number of compounds with measured biological activity is an ill-posed inverse problem. Yet, it occurs very often within the academic community. To counteract, up to some extent, overfitting problems caused by a small training data set, we propose to use a consensus of six regression models for prediction of biological activity of a virtual library of compounds. The QSAR descriptors of 22 compounds related to the opioid growth factor (OGF, Tyr-Gly-Gly-Phe-Met) with known antitumor activity were used to train the regression models: the feed-forward artificial neural network, the k-nearest neighbor, sparseness constrained linear regression, and the linear and nonlinear (with polynomial and Gaussian kernel) support vector machine. The regression models were applied on a virtual library of 429 compounds, which resulted in six lists with candidate compounds ranked by predicted antitumor activity. The highly ranked candidate compounds were synthesized, characterized and tested for antiproliferative activity. Some of the prepared peptides showed more pronounced activity compared with the native OGF; however, they were less active than the highly ranked compounds selected previously by the radial basis function support vector machine (RBF SVM) regression model. The ill-posedness of the related inverse problem causes unstable behavior of trained regression models on test data. These results point to the high complexity of prediction based on regression models trained on a small data sample.
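
    The consensus step described above can be sketched as averaging the per-model rank of each virtual-library compound. In the sketch below, the descriptors, activities, and exact model settings are random stand-ins, and only some of the six model families are mirrored.

        import numpy as np
        from sklearn.linear_model import LinearRegression, Lasso
        from sklearn.neighbors import KNeighborsRegressor
        from sklearn.neural_network import MLPRegressor
        from sklearn.svm import SVR

        rng = np.random.default_rng(4)
        X_train = rng.normal(size=(22, 10))            # 22 known compounds, 10 QSAR descriptors
        y_train = rng.normal(size=22)                  # measured antitumor activity (simulated)
        X_library = rng.normal(size=(429, 10))         # virtual library to be ranked

        models = [
            LinearRegression(),
            Lasso(alpha=0.1),                          # sparseness-constrained linear regression
            KNeighborsRegressor(n_neighbors=3),
            SVR(kernel="linear"),
            SVR(kernel="rbf"),
            MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
        ]
        ranks = []
        for m in models:
            pred = m.fit(X_train, y_train).predict(X_library)
            ranks.append(np.argsort(np.argsort(-pred)))         # rank 0 = most active for this model
        consensus_rank = np.mean(ranks, axis=0)
        print("top-ranked library compounds (consensus):", np.argsort(consensus_rank)[:10])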

  19. Mass amplifying probe for sensitive fluorescence anisotropy detection of small molecules in complex biological samples.

    Science.gov (United States)

    Cui, Liang; Zou, Yuan; Lin, Ninghang; Zhu, Zhi; Jenkins, Gareth; Yang, Chaoyong James

    2012-07-03

    Fluorescence anisotropy (FA) is a reliable and excellent choice for fluorescence sensing. One of the key factors influencing the FA value for any molecule is the molar mass of the molecule being measured. As a result, the FA method with functional nucleic acid aptamers has been limited to macromolecules such as proteins and is generally not applicable for the analysis of small molecules because their molecular masses are relatively too small to produce observable FA value changes. We report here a molecular mass amplifying strategy to construct anisotropy aptamer probes for small molecules. The probe is designed in such a way that only when a target molecule binds to the probe does it activate its binding ability to an anisotropy amplifier (a high molecular mass molecule such as protein), thus significantly increasing the molecular mass and FA value of the probe/target complex. Specifically, a mass amplifying probe (MAP) consists of a targeting aptamer domain against a target molecule and molecular mass amplifying aptamer domain for the amplifier protein. The probe is initially rendered inactive by a small blocking strand partially complementary to both target aptamer and amplifier protein aptamer so that the mass amplifying aptamer domain would not bind to the amplifier protein unless the probe has been activated by the target. In this way, we prepared two probes that constitute a target (ATP and cocaine respectively) aptamer, a thrombin (as the mass amplifier) aptamer, and a fluorophore. Both probes worked well against their corresponding small molecule targets, and the detection limits for ATP and cocaine were 0.5 μM and 0.8 μM, respectively. More importantly, because FA is less affected by environmental interferences, ATP in cell media and cocaine in urine were directly detected without any tedious sample pretreatment. Our results established that our molecular mass amplifying strategy can be used to design aptamer probes for rapid, sensitive, and selective

  20. 40 CFR Appendix A to Subpart F of... - Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines

    Science.gov (United States)

    2010-07-01

    ... Enforcement Auditing of Small Nonroad Engines A Appendix A to Subpart F of Part 90 Protection of Environment...-IGNITION ENGINES AT OR BELOW 19 KILOWATTS Selective Enforcement Auditing Pt. 90, Subpt. F, App. A Appendix A to Subpart F of Part 90—Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines...

  1. Small Scale Mixing Demonstration Batch Transfer and Sampling Performance of Simulated HLW - 12307

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Jesse; Townson, Paul; Vanatta, Matt [EnergySolutions, Engineering and Technology Group, Richland, WA, 99354 (United States)

    2012-07-01

    The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste treatment Plant (WTP) has been recognized as a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. At the end of 2009 DOE's Tank Operations Contractor, Washington River Protection Solutions (WRPS), awarded a contract to EnergySolutions to design, fabricate and operate a demonstration platform called the Small Scale Mixing Demonstration (SSMD) to establish pre-transfer sampling capacity, and batch transfer performance data at two different scales. This data will be used to examine the baseline capacity for a tank mixed via rotational jet mixers to transfer consistent or bounding batches, and provide scale up information to predict full scale operational performance. This information will then in turn be used to define the baseline capacity of such a system to transfer and sample batches sent to WTP. The Small Scale Mixing Demonstration (SSMD) platform consists of 43'' and 120'' diameter clear acrylic test vessels, each equipped with two scaled jet mixer pump assemblies, and all supporting vessels, controls, services, and simulant make up facilities. All tank internals have been modeled including the air lift circulators (ALCs), the steam heating coil, and the radius between the wall and floor. The test vessels are set up to simulate the transfer of HLW out of a mixed tank, and collect a pre-transfer sample in a manner similar to the proposed baseline configuration. The collected material is submitted to an NQA-1 laboratory for chemical analysis. Previous work has been done to assess tank mixing performance at both scales. This work involved a combination of unique instruments to understand the three dimensional distribution of solids using a combination of Coriolis meter measurements, in situ chord length distribution

  2. [Monitoring microbiological safety of small systems of water distribution. Comparison of two sampling programs in a town in central Italy].

    Science.gov (United States)

    Papini, Paolo; Faustini, Annunziata; Manganello, Rosa; Borzacchi, Giancarlo; Spera, Domenico; Perucci, Carlo A

    2005-01-01

    To determine the appropriate frequency of sampling in small water distribution systems, we carried out two sampling programs to monitor the water distribution system in a town in Central Italy between July and September 1992; the Poisson distribution assumption implied 4 water samples, while the assumption of a negative binomial distribution implied 21 samples. Coliform organisms were used as indicators of water safety. The network consisted of two pipe rings and two wells fed by the same water source. The number of summer customers varied considerably from 3,000 to 20,000. The mean density was 2.33 coliforms/100 ml (sd = 5.29) for 21 samples and 3 coliforms/100 ml (sd = 6) for four samples. However, the hypothesis of homogeneity was rejected, and the probability of a type II error was higher with 4 samples (beta = 0.24) than with 21 (beta = 0.05). For this small network, determining the number of samples according to the heterogeneity hypothesis strengthens the statement that the water is drinkable, compared with the homogeneity assumption.
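
    The contrast between the two distributional assumptions can be illustrated with simulated counts. The sketch below uses the reported mean and standard deviation to parameterize a negative binomial by the method of moments and compares how often 4 versus 21 samples would pick up the overdispersed contamination; the detection rule and the dispersion handling are assumptions for the example, not the monitoring programme's procedure.

        import numpy as np

        mean, sd = 2.33, 5.29                       # coliforms per 100 ml, as reported above
        var = sd ** 2
        print("dispersion index var/mean =", round(var / mean, 2))   # far above 1: Poisson is a poor fit

        r = mean ** 2 / (var - mean)                # method of moments for a negative binomial: var = mean + mean^2 / r
        p = r / (r + mean)

        rng = np.random.default_rng(5)

        def detection_rate(n_samples, n_sim=20000):
            """Fraction of monitoring rounds in which at least one sample contains coliforms."""
            counts = rng.negative_binomial(r, p, size=(n_sim, n_samples))
            return np.mean((counts > 0).any(axis=1))

        for n in (4, 21):
            print(f"{n} samples per round: contamination seen in {detection_rate(n):.0%} of rounds")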

  3. A morphometric CT study of Down's syndrome showing small posterior fossa and calcification of basal ganglia

    International Nuclear Information System (INIS)

    Ieshima, A.; Yoshino, K.; Takashima, S.; Takeshita, K.; Kisa, T.

    1984-01-01

    We report characteristic and morphometric changes of cranial computed tomography (CT) with increasing age in 56 patients with Down's syndrome aged from 0 months to 37 years. Patients were compared with 142 normal controls aged 0 to 59 years. Widths of the ventricles, Sylvian fissures, posterior fossa, pons and cisterna magna were measured on CT. The incidences of the cavum septi pellucidi, cavum vergae and cavum veli interpositi and of high density in the basal ganglia were examined. There was a high incidence (10.7%) of bilateral calcification of the basal ganglia in Down's syndrome, although that of pineal body and choroid plexus calcification was similar in Down's syndrome and controls. Basal ganglia calcification is more frequently seen in young patients with Down's syndrome and may be related to the premature aging characteristic of Down's syndrome. The CT in Down's syndrome showed a relatively small posterior fossa, small cerebellum, small brain stem and relatively large Sylvian fissures in those under one year of age. There was a high frequency of midline cava and a large cisterna magna. There were no significant atrophic changes on CT except after the fifth decade, compared with controls. (orig.)

  4. Bootstrap-DEA analysis of BRICS’ energy efficiency based on small sample data

    International Nuclear Information System (INIS)

    Song, Ma-Lin; Zhang, Lin-Ling; Liu, Wei; Fisher, Ron

    2013-01-01

    Highlights: ► The BRICS economies have flourished, with increasing energy consumption. ► Analyses and comparisons of energy efficiency are conducted among the BRICS. ► As a whole, the energy efficiency of BRICS is low but shows a growing trend. ► The BRICS should adopt relevant energy policies based on their own conditions. - Abstract: As representatives of many emerging economies, the BRICS economies have developed greatly in recent years. Meanwhile, the proportion of world energy consumption accounted for by BRICS has increased. Therefore, it is important to analyze and compare energy efficiency among them. This paper first utilizes a Super-SBM model to measure and calculate the energy efficiency of BRICS, and then analyzes their present status and development trend. Further, a bootstrap is applied to correct the DEA values derived from small sample data, and finally the relationship between energy efficiency and carbon emissions is measured. Results show that the energy efficiency of BRICS as a whole is low but is increasing quickly. Also, the relationship between energy efficiency and carbon emissions varies from country to country because of their different energy structures. The governments of BRICS should make relevant energy policies according to their own conditions.
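
    DEA itself requires solving a linear program for each decision-making unit, so the sketch below substitutes a simple output/input ratio efficiency and only illustrates the bootstrap bias-correction step on made-up figures; it is not the Super-SBM model or the paper's bootstrap procedure.

        import numpy as np

        rng = np.random.default_rng(6)
        countries = ["Brazil", "Russia", "India", "China", "South Africa"]
        energy = np.array([300.0, 750.0, 900.0, 3000.0, 140.0])    # hypothetical energy inputs
        gdp = np.array([1.8, 1.7, 2.9, 14.0, 0.35])                # hypothetical economic outputs

        def efficiency(e, g):
            ratio = g / e
            return ratio / ratio.max()                             # 1.0 marks the best performer

        point = efficiency(energy, gdp)
        boot = np.empty((2000, len(countries)))
        for b in range(2000):                                      # resample by perturbing inputs/outputs
            boot[b] = efficiency(energy * rng.lognormal(0.0, 0.05, energy.size),
                                 gdp * rng.lognormal(0.0, 0.05, gdp.size))

        bias_corrected = 2 * point - boot.mean(axis=0)
        low, high = np.percentile(boot, [2.5, 97.5], axis=0)
        for c, pt, bc, lo, hi in zip(countries, point, bias_corrected, low, high):
            print(f"{c:12s} point={pt:.2f}  bias-corrected={bc:.2f}  95% interval=({lo:.2f}, {hi:.2f})")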

  5. Sensitive power compensated scanning calorimeter for analysis of phase transformations in small samples

    International Nuclear Information System (INIS)

    Lopeandia, A.F.; Cerdo, Ll.; Clavaguera-Mora, M.T.; Arana, Leonel R.; Jensen, K.F.; Munoz, F.J.; Rodriguez-Viejo, J.

    2005-01-01

    We have designed and developed a sensitive scanning calorimeter for use with microgram or submicrogram, thin film, or powder samples. Semiconductor processing techniques are used to fabricate membrane based microreactors with a small heat capacity of the addenda, 120 nJ/K at room temperature. At heating rates below 10 K/s the heat released or absorbed by the sample during a given transformation is compensated through a resistive Pt heater by a digital controller so that the calorimeter works as a power compensated device. Its use and dynamic sensitivity is demonstrated by analyzing the melting behavior of thin films of indium and high density polyethylene. Melting enthalpies in the range of 40-250 μJ for sample masses on the order of 1.5 μg have been measured with accuracy better than 5% at heating rates ∼0.2 K/s. The signal-to-noise ratio, limited by the electronic setup, is 200 nW

  6. Using the multi-objective optimization replica exchange Monte Carlo enhanced sampling method for protein-small molecule docking.

    Science.gov (United States)

    Wang, Hongrui; Liu, Hongwei; Cai, Leixin; Wang, Caixia; Lv, Qiang

    2017-07-10

    In this study, we extended the replica exchange Monte Carlo (REMC) sampling method to protein-small molecule docking conformational prediction using RosettaLigand. In contrast to the traditional Monte Carlo (MC) and REMC sampling methods, these methods use multi-objective optimization Pareto front information to facilitate the selection of replicas for exchange. The Pareto front information generated to select lower energy conformations as representative conformation structure replicas can facilitate the convergence of the available conformational space, including available near-native structures. Furthermore, our approach directly provides min-min scenario Pareto optimal solutions, as well as a hybrid of the min-min and max-min scenario Pareto optimal solutions with lower energy conformations for use as structure templates in the REMC sampling method. These methods were validated based on a thorough analysis of a benchmark data set containing 16 benchmark test cases. An in-depth comparison between MC, REMC, multi-objective optimization-REMC (MO-REMC), and hybrid MO-REMC (HMO-REMC) sampling methods was performed to illustrate the differences between the four conformational search strategies. Our findings demonstrate that the MO-REMC and HMO-REMC conformational sampling methods are powerful approaches for obtaining protein-small molecule docking conformational predictions based on the binding energy of complexes in RosettaLigand.

  7. Evaluating the biological potential in samples returned from planetary satellites and small solar system bodies: framework for decision making

    National Research Council Canada - National Science Library

    National Research Council Staff; Space Studies Board; Division on Engineering and Physical Sciences; National Research Council; National Academy of Sciences

    Evaluating the Biological Potential in Samples Returned from Planetary Satellites and Small Solar System Bodies: Framework for Decision Making. Task Group on Sample Return from Small Solar System Bodies, Space Studies Board, Commission on Physical Sciences, Mathematics, and Applications, National Research Council. National Academy Press, Washington, D.C., 1998.

  8. A simple Bayesian approach to quantifying confidence level of adverse event incidence proportion in small samples.

    Science.gov (United States)

    Liu, Fang

    2016-01-01

    In both clinical development and post-marketing of a new therapy or a new treatment, incidence of an adverse event (AE) is always a concern. When sample sizes are small, large sample-based inferential approaches on an AE incidence proportion in a certain time period no longer apply. In this brief discussion, we introduce a simple Bayesian framework to quantify, in small sample studies and the rare AE case, (1) the confidence level that the incidence proportion of a particular AE p is over or below a threshold, (2) the lower or upper bounds on p with a certain level of confidence, and (3) the minimum required number of patients with an AE before we can be certain that p surpasses a specific threshold, or the maximum allowable number of patients with an AE after which we can no longer be certain that p is below a certain threshold, given a certain confidence level. The method is easy to understand and implement; the interpretation of the results is intuitive. This article also demonstrates the usefulness of simple Bayesian concepts when it comes to answering practical questions.
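
    With a conjugate Beta prior, the three quantities listed above have closed forms. The sketch below shows them with an illustrative uniform prior, a 10% threshold, and a 90% confidence requirement; these values are assumptions for the example, since the article's priors and thresholds are not given here.

        from scipy import stats

        def ae_summary(x, n, threshold=0.10, conf=0.90, a0=1.0, b0=1.0):
            """Posterior summaries for the AE incidence proportion p with a Beta(a0, b0) prior."""
            post = stats.beta(a0 + x, b0 + n - x)
            return {
                "P(p > threshold)": 1.0 - post.cdf(threshold),
                "P(p < threshold)": post.cdf(threshold),
                "upper bound on p": post.ppf(conf),          # one-sided upper credible bound
                "lower bound on p": post.ppf(1.0 - conf),
            }

        # Example: 2 adverse events observed among 25 patients
        for name, value in ae_summary(x=2, n=25).items():
            print(f"{name}: {value:.3f}")

        # Maximum allowable AE count so that we remain >= 90% confident that p < 10% (n = 25)
        n = 25
        max_allowed = max((x for x in range(n + 1)
                           if stats.beta(1 + x, 1 + n - x).cdf(0.10) >= 0.90), default=None)
        print("maximum allowable number of patients with an AE:", max_allowed)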

  9. Small Sample Reactivity Measurements in the RRR/SEG Facility: Reanalysis using TRIPOLI-4

    Energy Technology Data Exchange (ETDEWEB)

    Hummel, Andrew [Idaho National Lab. (INL), Idaho Falls, ID (United States); Palmiotti, Guiseppe [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This work involved reanalyzing the RRR/SEG integral experiments performed at the Rossendorf facility in Germany throughout the 1970s and 80s. These small sample reactivity worth measurements were carried out using the pile oscillator technique for many different fission products, structural materials, and standards. The coupled fast-thermal system was designed such that the measurements would provide insight into elemental data, specifically the competing effects between neutron capture and scatter. Comparing the measured to calculated reactivity values can then provide adjustment criteria to ultimately improve nuclear data for fast reactor designs. Due to the extremely small reactivity effects measured (typically less than 1 pcm) and the specific heterogeneity of the core, the tool chosen for this analysis was TRIPOLI-4. This code allows for high fidelity 3-dimensional geometric modeling, and the most recent, unreleased version, is capable of exact perturbation theory.

  10. MaxEnt’s parameter configuration and small samples: are we paying attention to recommendations? A systematic review

    Directory of Open Access Journals (Sweden)

    Narkis S. Morales

    2017-03-01

    Full Text Available Environmental niche modeling (ENM) is commonly used to develop probabilistic maps of species distribution. Among available ENM techniques, MaxEnt has become one of the most popular tools for modeling species distribution, with hundreds of peer-reviewed articles published each year. MaxEnt’s popularity is mainly due to the use of a graphical interface and automatic parameter configuration capabilities. However, recent studies have shown that using the default automatic configuration may not be always appropriate because it can produce non-optimal models; particularly when dealing with a small number of species presence points. Thus, the recommendation is to evaluate the best potential combination of parameters (feature classes and regularization multiplier) to select the most appropriate model. In this work we reviewed 244 articles published between 2013 and 2015 to assess whether researchers are following recommendations to avoid using the default parameter configuration when dealing with small sample sizes, or if they are using MaxEnt as a “black box tool.” Our results show that in only 16% of analyzed articles authors evaluated best feature classes, in 6.9% evaluated best regularization multipliers, and in a meager 3.7% evaluated both parameters simultaneously before producing the definitive distribution model. We analyzed 20 articles to quantify the potential differences in resulting outputs when using software default parameters instead of the alternative best model. Results from our analysis reveal important differences between the use of default parameters and the best model approach, especially in the total area identified as suitable for the assessed species and the specific areas that are identified as suitable by both modelling approaches. These results are worrying, because publications are potentially reporting over-complex or over-simplistic models that can undermine the applicability of their results. Of particular importance
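
    MaxEnt itself is normally run through the Maxent software or dedicated packages, so the sketch below only illustrates the recommended practice of comparing feature complexities and regularization strengths by cross-validation, using a penalized logistic model on simulated presence/background data as a crude stand-in; everything in it is an assumption for the example.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures, StandardScaler

        rng = np.random.default_rng(7)
        n_presence, n_background = 25, 500               # small presence sample, as in the review
        env_presence = rng.normal([1.0, 0.5], 0.5, size=(n_presence, 2))
        env_background = rng.normal([0.0, 0.0], 1.0, size=(n_background, 2))
        X = np.vstack([env_presence, env_background])
        y = np.r_[np.ones(n_presence), np.zeros(n_background)]

        results = []
        for degree in (1, 2):                            # "feature classes": linear vs. quadratic terms
            for reg in (0.5, 1.0, 2.0, 4.0):             # analogue of the regularization multiplier
                model = make_pipeline(PolynomialFeatures(degree), StandardScaler(),
                                      LogisticRegression(C=1.0 / reg, max_iter=1000))
                score = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
                results.append((score, degree, reg))

        best = max(results)
        print("best AUC=%.3f with degree=%d, regularization=%.1f" % best)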

  11. Improvement of 137Cs analysis in small volume seawater samples using the Ogoya underground facility

    International Nuclear Information System (INIS)

    Hirose, K.; Komura, K.; Kanazawa University, Ishikawa; Aoyama, M.; Igarashi, Y.

    2008-01-01

    137 Cs in seawater is one of the most powerful tracers of water motion. Large volumes of samples have been required for the determination of 137 Cs in seawater. This paper describes improvements to the separation and purification processes for 137 Cs in seawater, which include purification of 137 Cs using hexachloroplatinic acid in addition to ammonium phosphomolybdate (AMP) precipitation. As a result, we succeeded in determining 137 Cs in seawater with a smaller sample volume of 10 liters by using ultra-low background gamma-spectrometry in the Ogoya underground facility. The 137 Cs detection limit was about 0.1 mBq (counting time: 10^6 s). This method is applied to determine 137 Cs in small samples of the South Pacific deep waters. (author)

  12. Modeling and Testing of Phase Transition-Based Deployable Systems for Small Body Sample Capture

    Science.gov (United States)

    Quadrelli, Marco; Backes, Paul; Wilkie, Keats; Giersch, Lou; Quijano, Ubaldo; Keim, Jason; Mukherjee, Rudranarayan

    2009-01-01

    This paper summarizes the modeling, simulation, and testing work related to the development of technology to investigate the potential that shape memory actuation has to provide mechanically simple and affordable solutions for delivering assets to a surface and for sample capture and return. We investigate the structural dynamics and controllability aspects of an adaptive beam carrying an end-effector which, by changing equilibrium phases is able to actively decouple the end-effector dynamics from the spacecraft dynamics during the surface contact phase. Asset delivery and sample capture and return are at the heart of several emerging potential missions to small bodies, such as asteroids and comets, and to the surface of large bodies, such as Titan.

  13. The small sample uncertainty aspect in relation to bullwhip effect measurement

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2009-01-01

    The bullwhip effect as a concept has been known for almost half a century, starting with the Forrester effect. The bullwhip effect is observed in many supply chains, and it is generally accepted as a potential malice. Despite this fact, the bullwhip effect still seems to be first and foremost a conceptual phenomenon. This paper intends primarily to investigate why this might be so and thereby investigate the various aspects, possibilities and obstacles that must be taken into account when considering the potential practical use and measure of the bullwhip effect in order to actually get the supply chain under control. This paper will put special emphasis on the unavoidable small-sample uncertainty aspects relating to the measurement or estimation of the bullwhip effect.
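
    One common way to measure the effect is the variance ratio Var(orders)/Var(demand). The sketch below shows how widely that estimate scatters when it is computed from short series, using an assumed AR(1) demand process and a simple exponential-smoothing order-up-to policy; both are illustrative choices, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(8)

        def bullwhip_ratio_spread(n_periods, n_sim=2000, phi=0.7, smoothing=0.3, lead_time=2):
            ratios = []
            for _ in range(n_sim):
                demand = np.empty(n_periods)
                demand[0] = 100.0
                for t in range(1, n_periods):                   # AR(1) demand process
                    demand[t] = 100 + phi * (demand[t - 1] - 100) + rng.normal(0, 10)
                forecast, orders = demand[0], []
                for d in demand:                                # order-up-to policy with exponential smoothing
                    new_forecast = smoothing * d + (1 - smoothing) * forecast
                    orders.append(d + lead_time * (new_forecast - forecast))
                    forecast = new_forecast
                ratios.append(np.var(orders, ddof=1) / np.var(demand, ddof=1))
            return np.percentile(ratios, [5, 50, 95])

        for n in (12, 52, 520):
            low, median, high = bullwhip_ratio_spread(n)
            print(f"n={n:4d} periods: median ratio={median:.2f}, 90% range=({low:.2f}, {high:.2f})")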

  14. Sampling Error in Relation to Cyst Nematode Population Density Estimation in Small Field Plots.

    Science.gov (United States)

    Župunski, Vesna; Jevtić, Radivoje; Jokić, Vesna Spasić; Župunski, Ljubica; Lalošević, Mirjana; Ćirić, Mihajlo; Ćurčić, Živko

    2017-06-01

    Cyst nematodes are serious plant-parasitic pests which could cause severe yield losses and extensive damage. Since there is still very little information about the error of population density estimation in small field plots, this study contributes to the broad issue of population density assessment. It was shown that there was no significant difference between cyst counts of five or seven bulk samples taken per 1-m² plot, if the average cyst count per examined plot exceeds 75 cysts per 100 g of soil. Goodness of fit of the data to probability distributions tested with the χ² test confirmed a negative binomial distribution of cyst counts for 21 out of 23 plots. The recommended measure of sampling precision of 17% expressed through the coefficient of variation (cv) was achieved if the plots of 1 m² contaminated with more than 90 cysts per 100 g of soil were sampled with 10-core bulk samples taken in five repetitions. If plots were contaminated with less than 75 cysts per 100 g of soil, 10-core bulk samples taken in seven repetitions gave a cv higher than 23%. This study indicates that more attention should be paid to estimation of sampling error in experimental field plots to ensure more reliable estimation of the population density of cyst nematodes.
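
    The precision argument can be reproduced qualitatively with simulated negative binomial cyst counts. The sketch below computes the coefficient of variation of the plot mean for different numbers of bulk samples; the dispersion parameter and the densities are assumptions for the example.

        import numpy as np

        rng = np.random.default_rng(9)

        def cv_of_plot_mean(true_density, k, n_bulk, n_sim=20000):
            """cv of the plot mean when n_bulk bulk samples are taken, counts ~ negative binomial."""
            p = k / (k + true_density)                          # mean/dispersion parameterization
            counts = rng.negative_binomial(k, p, size=(n_sim, n_bulk))
            plot_means = counts.mean(axis=1)
            return plot_means.std(ddof=1) / plot_means.mean()

        for density in (75, 90):                                # cysts per 100 g of soil
            for n_bulk in (5, 7, 10):
                cv = cv_of_plot_mean(true_density=density, k=2.0, n_bulk=n_bulk)
                print(f"density={density}, bulk samples={n_bulk}: cv={cv:.0%}")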

  15. Monitoring, Modeling, and Diagnosis of Alkali-Silica Reaction in Small Concrete Samples

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Vivek [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cai, Guowei [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gribok, Andrei V. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mahadevan, Sankaran [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This report describes alkali-silica reaction (ASR) degradation mechanisms and factors influencing the ASR. A fully coupled thermo-hydro-mechanical-chemical model developed by Saouma and Perotti by taking into consideration the effects of stress on the reaction kinetics and anisotropic volumetric expansion is presented in this report. This model is implemented in the GRIZZLY code based on the Multiphysics Object Oriented Simulation Environment. The implemented model in the GRIZZLY code is randomly used to initiate ASR in a 2D and 3D lattice to study the percolation aspects of concrete. The percolation aspects help determine the transport properties of the material and therefore the durability and service life of concrete. This report summarizes the effort to develop small-size concrete samples with embedded glass to mimic ASR. The concrete samples were treated in water and sodium hydroxide solution at elevated temperature to study how ingress of sodium ions and hydroxide ions at elevated temperature impacts concrete samples embedded with glass. Thermal camera was used to monitor the changes in the concrete sample and results are summarized.

  16. Automated microfluidic sample-preparation platform for high-throughput structural investigation of proteins by small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Lafleur, Josiane P.; Snakenborg, Detlef; Nielsen, Søren Skou

    2011-01-01

    A new microfluidic sample-preparation system is presented for the structural investigation of proteins using small-angle X-ray scattering (SAXS) at synchrotrons. The system includes hardware and software features for precise fluidic control, sample mixing by diffusion, automated X-ray exposure control, UV absorbance measurements and automated data analysis. As little as 15 μl of sample is required to perform a complete analysis cycle, including sample mixing, SAXS measurement, continuous UV absorbance measurements, and cleaning of the channels and X-ray cell with buffer. The complete analysis

  17. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds

    Science.gov (United States)

    Zhang, Zhenming; Zhou, Yunchao; Wang, Shijie

    2018-01-01

    Karst areas are typical ecologically fragile areas, and stony desertification has become the most serious ecological and economic problems in these areas worldwide as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km2, 4.50 km2, and 1.87 km2, respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content. PMID:29652811

  18. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds

    Directory of Open Access Journals (Sweden)

    Zhenming Zhang

    2018-04-01

    Full Text Available Karst areas are typical ecologically fragile areas, and stony desertification has become the most serious ecological and economic problems in these areas worldwide as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km², 4.50 km², and 1.87 km², respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content.

  19. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds.

    Science.gov (United States)

    Zhang, Zhenming; Zhou, Yunchao; Wang, Shijie; Huang, Xianfei

    2018-04-13

    Karst areas are typical ecologically fragile areas, and stony desertification has become the most serious ecological and economic problems in these areas worldwide as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km², 4.50 km², and 1.87 km², respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content.

  20. A simple technique for measuring the superconducting critical temperature of small (>= 10 μg) samples

    International Nuclear Information System (INIS)

    Pereira, R.F.R.; Meyer, E.; Silveira, M.F. da.

    1983-01-01

    A simple technique for measuring the superconducting critical temperature of small (≥ 10 μg) samples is described. The apparatus is built in the form of a probe, which can be introduced directly into a liquid He storage dewar and permits the determination of the critical temperature, with an imprecision of ± 0.05 K above 4.2 K, in about 10 minutes. (Author)

  1. Sophistication of 14C measurement at JAEA-AMS-MUTSU. Attempt on a small quantity of sample

    International Nuclear Information System (INIS)

    Tanaka, Takayuki; Kabuto, Shoji; Kinoshita, Naoki; Yamamoto, Nobuo

    2010-01-01

    In investigations of substance dynamics using molecular weight and chemical fractionation, the utilization of 14 C measurement by accelerator mass spectrometry (AMS) has started. As a result of the fractionation, the sample contents required for AMS measurement have been downsized. We expect that this trend toward a small quantity of sample will steadily accelerate in the future. As 14 C measurement by the AMS established at the Mutsu office requires about 2 mg of sample content at present, our AMS lags behind the others in this trend. We try to downsize the sample content needed for 14 C measurement by our AMS. In this study, we modified the shape of the target-piece in which the sample is packed and which is regularly needed for radiocarbon measurement by our AMS. Moreover, we improved the apparatus needed to pack the sample. As a result of the improvement, we revealed that it is possible to measure 14 C using our AMS even with a sample amount of about 0.5 mg. (author)

  2. Enrichment and determination of small amounts of 90Sr/90Y in water samples

    International Nuclear Information System (INIS)

    Mundschenk, H.

    1979-01-01

    Small amounts of 90 Sr/ 90 Y can be concentrated from large volumes of surface water (100 l) by precipitation of the phosphates, using bentonite as adsorber matrix. In the case of samples containing no or nearly no suspended matter (tap water, ground water, sea water), the daughter 90 Y can be extracted directly by using filter beds impregnated with HDEHP. The applicability of both techniques is demonstrated under realistic conditions. (orig.)

  3. Basic distribution free identification tests for small size samples of environmental data

    Energy Technology Data Exchange (ETDEWEB)

    Federico, A.G.; Musmeci, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-01-01

    Testing two or more data sets for the hypothesis that they are sampled from the same population is often required in environmental data analysis. Typically the available samples have a small number of data and often the assumption of normal distributions is not realistic. On the other hand, the diffusion of today's powerful personal computers opens new opportunities based on a massive use of CPU resources. The paper reviews the problem, introducing the feasibility of two non-parametric approaches based on intrinsic equiprobability properties of the data samples. The first one is based on a full resampling scheme while the second is based on a bootstrap approach. An easy-to-use program is presented. A case study is given, based on the Chernobyl children contamination data.
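
    A minimal sketch of the two approaches, assuming a simple difference-in-means statistic: a random permutation test and a bootstrap test under the null of equal means. The data are made up, and the program described in the report is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(10)

        def permutation_pvalue(a, b, n_perm=20000):
            observed = abs(a.mean() - b.mean())
            pooled = np.concatenate([a, b])
            count = 0
            for _ in range(n_perm):                         # random re-labelling of the pooled data
                perm = rng.permutation(pooled)
                count += abs(perm[:a.size].mean() - perm[a.size:].mean()) >= observed
            return (count + 1) / (n_perm + 1)

        def bootstrap_pvalue(a, b, n_boot=20000):
            observed = abs(a.mean() - b.mean())
            shifted_a = a - a.mean()                        # enforce the null of equal means
            shifted_b = b - b.mean()
            count = 0
            for _ in range(n_boot):
                ra = rng.choice(shifted_a, size=a.size, replace=True)
                rb = rng.choice(shifted_b, size=b.size, replace=True)
                count += abs(ra.mean() - rb.mean()) >= observed
            return (count + 1) / (n_boot + 1)

        a = np.array([1.2, 0.8, 1.5, 2.1, 0.9, 1.1])        # small environmental data sets (illustrative)
        b = np.array([1.9, 2.4, 1.7, 2.8, 2.2])
        print("permutation p-value:", permutation_pvalue(a, b))
        print("bootstrap p-value:  ", bootstrap_pvalue(a, b))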

  4. Autoregressive Prediction with Rolling Mechanism for Time Series Forecasting with Small Sample Size

    Directory of Open Access Journals (Sweden)

    Zhihua Wang

    2014-01-01

    Full Text Available Reasonable prediction makes significant practical sense to stochastic and unstable time series analysis with small or limited sample size. Motivated by the rolling idea in grey theory and the practical relevance of very short-term forecasting or 1-step-ahead prediction, a novel autoregressive (AR prediction approach with rolling mechanism is proposed. In the modeling procedure, a new developed AR equation, which can be used to model nonstationary time series, is constructed in each prediction step. Meanwhile, the data window, for the next step ahead forecasting, rolls on by adding the most recent derived prediction result while deleting the first value of the former used sample data set. This rolling mechanism is an efficient technique for its advantages of improved forecasting accuracy, applicability in the case of limited and unstable data situations, and requirement of little computational effort. The general performance, influence of sample size, nonlinearity dynamic mechanism, and significance of the observed trends, as well as innovation variance, are illustrated and verified with Monte Carlo simulations. The proposed methodology is then applied to several practical data sets, including multiple building settlement sequences and two economic series.
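
    The rolling mechanism can be sketched in a few lines: refit a low-order AR model on a fixed-length window, forecast one step ahead, append the forecast and drop the oldest value. The window length, AR order, and settlement-like series below are illustrative assumptions.

        import numpy as np

        def fit_ar(window, order=2):
            """Least-squares AR(order) fit; returns intercept and lag coefficients."""
            y = window[order:]
            X = np.column_stack([np.ones(y.size)] +
                                [window[order - k:-k] for k in range(1, order + 1)])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coef

        def rolling_forecast(series, window_size=8, steps=5, order=2):
            window = list(series[-window_size:])
            preds = []
            for _ in range(steps):
                coef = fit_ar(np.array(window), order)
                next_val = coef[0] + sum(coef[k] * window[-k] for k in range(1, order + 1))
                preds.append(next_val)
                window = window[1:] + [next_val]        # roll: drop the oldest value, append the prediction
            return preds

        settlement = [2.1, 2.6, 3.0, 3.3, 3.6, 3.8, 3.9, 4.0, 4.1, 4.15]   # illustrative settlement data
        print(rolling_forecast(settlement))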

  5. CA II TRIPLET SPECTROSCOPY OF SMALL MAGELLANIC CLOUD RED GIANTS. III. ABUNDANCES AND VELOCITIES FOR A SAMPLE OF 14 CLUSTERS

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, M. C.; Clariá, J. J.; Marcionni, N. [Observatorio Astronómico, Universidad Nacional de Córdoba, Laprida 854, Córdoba, CP 5000 (Argentina); Geisler, D.; Villanova, S. [Departamento de Astronomía, Universidad de Concepción Casilla 160-C, Concepción (Chile); Sarajedini, A. [Department of Astronomy, University of Florida P.O. Box 112055, Gainesville, FL 32611 (United States); Grocholski, A. J., E-mail: celeste@oac.uncor.edu, E-mail: claria@oac.uncor.edu, E-mail: nmarcionni@oac.uncor.edu, E-mail: dgeisler@astro-udec.cl, E-mail: svillanova@astro-udec.cl, E-mail: ata@astro.ufl.edu, E-mail: grocholski@phys.lsu.edu [Department of Physics and Astronomy, Louisiana State University 202 Nicholson Hall, Tower Drive, Baton Rouge, LA 70803-4001 (United States)

    2015-05-15

    We obtained spectra of red giants in 15 Small Magellanic Cloud (SMC) clusters in the region of the Ca ii lines with FORS2 on the Very Large Telescope. We determined the mean metallicity and radial velocity with mean errors of 0.05 dex and 2.6 km s{sup −1}, respectively, from a mean of 6.5 members per cluster. One cluster (B113) was too young for a reliable metallicity determination and was excluded from the sample. We combined the sample studied here with 15 clusters previously studied by us using the same technique, and with 7 clusters whose metallicities determined by other authors are on a scale similar to ours. This compilation of 36 clusters is the largest SMC cluster sample currently available with accurate and homogeneously determined metallicities. We found a high probability that the metallicity distribution is bimodal, with potential peaks at −1.1 and −0.8 dex. Our data show no strong evidence of a metallicity gradient in the SMC clusters, somewhat at odds with recent evidence from Ca ii triplet spectra of a large sample of field stars. This may be revealing possible differences in the chemical history of clusters and field stars. Our clusters show a significant dispersion of metallicities, whatever age is considered, which could be reflecting the lack of a unique age–metallicity relation in this galaxy. None of the chemical evolution models currently available in the literature satisfactorily represents the global chemical enrichment processes of SMC clusters.

  6. Retrospective biodosimetry with small tooth enamel samples using K-Band and X-Band

    International Nuclear Information System (INIS)

    Gomez, Jorge A.; Kinoshita, Angela; Leonor, Sergio J.; Belmonte, Gustavo C.; Baffa, Oswaldo

    2011-01-01

    In an attempt to make the in vitro electron spin resonance (ESR) retrospective dosimetry of the tooth enamel a lesser invasive method, experiments using X-Band and K-Band were performed, aiming to determine conditions that could be used in cases of accidental exposures. First, a small prism from the enamel was removed and ground with an agate mortar and pestle until particles reach a diameter of approximately less than 0.5 mm. This enamel extraction process resulted in lower signal artifact compared with the direct enamel extraction performed with a diamond burr abrasion. The manual grinding of the enamel does not lead to any induced ESR signal artifact, whereas the use of a diamond burr at low speed produces a signal artifact equivalent to the dosimetric signal induced by a dose of 500 mGy of gamma irradiation. A mass of 25 mg of enamel was removed from a sound molar tooth previously irradiated in vitro with a dose of 100 mGy. This amount of enamel was enough to detect the dosimetric signal in a standard X-Band spectrometer. However using a K-Band spectrometer, samples mass between 5 and 10 mg were sufficient to obtain the same sensitivity. An overall evaluation of the uncertainties involved in the process in this and other dosimetric assessments performed at our laboratory indicates that it is possible at K-Band to estimate a 100 mGy dose with 25% accuracy. In addition, the use of K-Band also presented higher sensitivity and allowed the use of smaller sample mass in comparison with X-Band. Finally, the restoration process performed on a tooth after extraction of the 25 mg of enamel is described. This was conducted by dental treatment using photopolymerizable resin which enabled complete recovery of the tooth from the functional and aesthetic viewpoint showing that this procedure can be minimally invasive.

  7. Retrospective biodosimetry with small tooth enamel samples using K-Band and X-Band

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Jorge A. [Departamento de Fisica, FFCLRP, Universidade de Sao Paulo, 14040-901 Ribeirao Preto, Sao Paulo (Brazil); Kinoshita, Angela [Departamento de Fisica, FFCLRP, Universidade de Sao Paulo, 14040-901 Ribeirao Preto, Sao Paulo (Brazil); Universidade Sagrado Coracao - USC, 17011-160 Bauru, Sao Paulo (Brazil); Leonor, Sergio J. [Departamento de Fisica, FFCLRP, Universidade de Sao Paulo, 14040-901 Ribeirao Preto, Sao Paulo (Brazil); Belmonte, Gustavo C. [Universidade Sagrado Coracao - USC, 17011-160 Bauru, Sao Paulo (Brazil); Baffa, Oswaldo, E-mail: baffa@usp.br [Departamento de Fisica, FFCLRP, Universidade de Sao Paulo, 14040-901 Ribeirao Preto, Sao Paulo (Brazil)

    2011-09-15

    In an attempt to make the in vitro electron spin resonance (ESR) retrospective dosimetry of the tooth enamel a lesser invasive method, experiments using X-Band and K-Band were performed, aiming to determine conditions that could be used in cases of accidental exposures. First, a small prism from the enamel was removed and ground with an agate mortar and pestle until particles reach a diameter of approximately less than 0.5 mm. This enamel extraction process resulted in lower signal artifact compared with the direct enamel extraction performed with a diamond burr abrasion. The manual grinding of the enamel does not lead to any induced ESR signal artifact, whereas the use of a diamond burr at low speed produces a signal artifact equivalent to the dosimetric signal induced by a dose of 500 mGy of gamma irradiation. A mass of 25 mg of enamel was removed from a sound molar tooth previously irradiated in vitro with a dose of 100 mGy. This amount of enamel was enough to detect the dosimetric signal in a standard X-Band spectrometer. However using a K-Band spectrometer, samples mass between 5 and 10 mg were sufficient to obtain the same sensitivity. An overall evaluation of the uncertainties involved in the process in this and other dosimetric assessments performed at our laboratory indicates that it is possible at K-Band to estimate a 100 mGy dose with 25% accuracy. In addition, the use of K-Band also presented higher sensitivity and allowed the use of smaller sample mass in comparison with X-Band. Finally, the restoration process performed on a tooth after extraction of the 25 mg of enamel is described. This was conducted by dental treatment using photopolymerizable resin which enabled complete recovery of the tooth from the functional and aesthetic viewpoint showing that this procedure can be minimally invasive.

  8. Filter Bank Regularized Common Spatial Pattern Ensemble for Small Sample Motor Imagery Classification.

    Science.gov (United States)

    Park, Sang-Hoon; Lee, David; Lee, Sang-Goog

    2018-02-01

    For the last few years, many feature extraction methods have been proposed based on biological signals. Among these, brain signals have the advantage that they can be obtained even from people with peripheral nervous system damage. Motor imagery electroencephalograms (EEG) are inexpensive to measure, offer a high temporal resolution, and are intuitive. Therefore, they have received a significant amount of attention in various fields, including signal processing, cognitive science, and medicine. The common spatial pattern (CSP) algorithm is a useful method for feature extraction from motor imagery EEG. However, performance degradation occurs in a small-sample setting (SSS), because the CSP depends on sample-based covariance. Since the active frequency range is different for each subject, it is also inconvenient to set the frequency range differently every time. In this paper, we propose a feature extraction method based on a filter bank to solve these problems. The proposed method consists of five steps. First, the motor imagery EEG is divided using a filter bank. Second, the regularized CSP (R-CSP) is applied to the divided EEG. Third, we select features according to the mutual-information-based individual feature algorithm. Fourth, parameter sets are selected for the ensemble. Finally, we classify using an ensemble based on the selected features. The brain-computer interface competition III data set IVa is used to evaluate the performance of the proposed method. The proposed method improves the mean classification accuracy by 12.34%, 11.57%, 9%, 4.95%, and 4.47% compared with CSP, SR-CSP, R-CSP, filter bank CSP (FBCSP), and SR-FBCSP. Compared with the filter bank R-CSP ( , ), which is a parameter selection version of the proposed method, the classification accuracy is improved by 3.49%. In particular, the proposed method shows a large improvement in performance in the SSS.
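
    To make the pipeline above concrete, the sketch below implements the band-pass filtering, a regularized CSP (covariance shrinkage toward the identity), and the classical log-variance features in Python. It is a minimal illustration under stated assumptions rather than the authors' implementation: the filter order, band edges, shrinkage weight gamma, and number of spatial filters are placeholders, and the mutual-information feature selection and ensemble steps are omitted.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, sosfiltfilt

def bandpass(trials, lo, hi, fs):
    """Band-pass filter every trial; trials has shape (n_trials, n_channels, n_samples)."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, trials, axis=-1)

def regularized_csp(trials_a, trials_b, n_filters=4, gamma=0.1):
    """R-CSP sketch: shrink each class covariance toward the identity, then keep the
    extreme generalized eigenvectors as spatial filters."""
    def shrunk_cov(trials):
        c = np.mean([x @ x.T / np.trace(x @ x.T) for x in trials], axis=0)
        return (1 - gamma) * c + gamma * np.trace(c) / c.shape[0] * np.eye(c.shape[0])
    ca, cb = shrunk_cov(trials_a), shrunk_cov(trials_b)
    vals, vecs = eigh(ca, ca + cb)                   # generalized eigenproblem
    order = np.argsort(vals)
    pick = np.r_[order[:n_filters // 2], order[-n_filters // 2:]]
    return vecs[:, pick].T                           # spatial filters W, one per row

def log_var_features(trials, W):
    """Classical CSP features: log of the normalized variance of the filtered signals."""
    z = np.einsum("fc,ncs->nfs", W, trials)
    var = z.var(axis=-1)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Usage sketch for one sub-band: filter both classes with bandpass(...), fit
# W = regularized_csp(band_a, band_b), then pass log_var_features(...) from each
# sub-band on to the feature-selection and ensemble stages (not shown here).
```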

  9. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    International Nuclear Information System (INIS)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K.

    2015-01-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample within a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  10. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K. [Canberra Industries Inc., 800 Research Parkway, Meriden, CT 06450 (United States)

    2015-07-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample within a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  11. Forecasting elections with mere recognition from small, lousy samples: A comparison of collective recognition, wisdom of crowds, and representative polls

    Directory of Open Access Journals (Sweden)

    Wolfgang Gaissmeier

    2011-02-01

    Full Text Available We investigated the extent to which the human capacity for recognition helps to forecast political elections: We compared naive recognition-based election forecasts computed from convenience samples of citizens' recognition of party names to (i) standard polling forecasts computed from representative samples of citizens' voting intentions, and to (ii) simple---and typically very accurate---wisdom-of-crowds-forecasts computed from the same convenience samples of citizens' aggregated hunches about election results. Results from four major German elections show that mere recognition of party names forecast the parties' electoral success fairly well. Recognition-based forecasts were most competitive with the other models when forecasting the smaller parties' success and for small sample sizes. However, wisdom-of-crowds-forecasts outperformed recognition-based forecasts in most cases. It seems that wisdom-of-crowds-forecasts are able to draw on the benefits of recognition while at the same time avoiding its downsides, such as lack of discrimination among very famous parties or recognition caused by factors unrelated to electoral success. Yet it seems that a simple extension of the recognition-based forecasts---asking people what proportion of the population would recognize a party instead of whether they themselves recognize it---is also able to eliminate these downsides.
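
    The two simple forecasting rules compared above can be written down in a few lines. The sketch below is only an illustration of the general idea: normalizing recognition counts into vote shares is an assumption of the sketch, and the toy numbers are invented.

```python
import numpy as np

def recognition_forecast(recognition):
    """Collective recognition turned into shares: each party's fraction of all
    recognitions in the convenience sample (rows = respondents, columns = parties)."""
    counts = np.asarray(recognition, dtype=float).sum(axis=0)
    return counts / counts.sum()

def wisdom_of_crowds_forecast(hunches):
    """Average of respondents' guessed vote shares, renormalized to sum to one."""
    mean = np.asarray(hunches, dtype=float).mean(axis=0)
    return mean / mean.sum()

# Invented toy sample: 5 respondents, 3 parties
recognition = [[1, 1, 0], [1, 1, 1], [1, 0, 0], [1, 1, 0], [1, 1, 1]]
hunches = [[0.45, 0.35, 0.20], [0.50, 0.30, 0.20], [0.40, 0.40, 0.20],
           [0.55, 0.30, 0.15], [0.45, 0.35, 0.20]]
print(recognition_forecast(recognition))    # about [0.45, 0.36, 0.18]
print(wisdom_of_crowds_forecast(hunches))   # about [0.47, 0.34, 0.19]
```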

  12. Preparing Monodisperse Macromolecular Samples for Successful Biological Small-Angle X-ray and Neutron Scattering Experiments

    Science.gov (United States)

    Jeffries, Cy M.; Graewert, Melissa A.; Blanchet, Clément E.; Langley, David B.; Whitten, Andrew E.; Svergun, Dmitri I

    2017-01-01

    Small-angle X-ray and neutron scattering (SAXS and SANS) are techniques used to extract structural parameters and determine the overall structures and shapes of biological macromolecules, complexes and assemblies in solution. The scattering intensities measured from a sample contain contributions from all atoms within the illuminated sample volume including the solvent and buffer components as well as the macromolecules of interest. In order to obtain structural information, it is essential to prepare an exactly matched solvent blank so that background scattering contributions can be accurately subtracted from the sample scattering to obtain the net scattering from the macromolecules in the sample. In addition, sample heterogeneity caused by contaminants, aggregates, mismatched solvents, radiation damage or other factors can severely influence and complicate data analysis so it is essential that the samples are pure and monodisperse for the duration of the experiment. This Protocol outlines the basic physics of SAXS and SANS and reveals how the underlying conceptual principles of the techniques ultimately ‘translate’ into practical laboratory guidance for the production of samples of sufficiently high quality for scattering experiments. The procedure describes how to prepare and characterize protein and nucleic acid samples for both SAXS and SANS using gel electrophoresis, size exclusion chromatography and light scattering. Also included are procedures specific to X-rays (in-line size exclusion chromatography SAXS) and neutrons, specifically preparing samples for contrast matching/variation experiments and deuterium labeling of proteins. PMID:27711050

  13. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Science.gov (United States)

    Burckhardt, Bjoern B.; Laeer, Stephanie

    2015-01-01

    In the USA and Europe, medicines agencies are pushing the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss permitted in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects. The latter hinder precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances, and experiences in solid-phase extraction are presented by way of example on the basis of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method comprising sample extraction by solid-phase extraction was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers. PMID:25873972

  14. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Directory of Open Access Journals (Sweden)

    Bjoern B. Burckhardt

    2015-01-01

    Full Text Available In the USA and Europe, medicines agencies are pushing the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss permitted in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects. The latter hinder precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances, and experiences in solid-phase extraction are presented by way of example on the basis of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method comprising sample extraction by solid-phase extraction was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers.

  15. A comparison of confidence/credible interval methods for the area under the ROC curve for continuous diagnostic tests with small sample size.

    Science.gov (United States)

    Feng, Dai; Cortese, Giuliana; Baumgartner, Richard

    2017-12-01

    The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance on the CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of different methods through real life examples.
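
    To ground the discussion, the sketch below shows one of the simplest non-parametric constructions: the AUC computed as a Mann-Whitney probability plus a percentile-bootstrap interval, with cases and controls resampled separately. It illustrates the general approach only and is not a reproduction of any of the 29 methods as the authors implemented them; the bootstrap size, seed, and example data are arbitrary.

```python
import numpy as np

def auc_mann_whitney(pos, neg):
    """AUC as P(score_pos > score_neg), counting ties as 1/2 (Mann-Whitney statistic)."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    diff = pos[:, None] - neg[None, :]
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / (pos.size * neg.size)

def auc_percentile_bootstrap_ci(pos, neg, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the AUC, resampling the two groups separately."""
    rng = np.random.default_rng(seed)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        p = rng.choice(pos, size=len(pos), replace=True)
        n = rng.choice(neg, size=len(neg), replace=True)
        boot[b] = auc_mann_whitney(p, n)
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return auc_mann_whitney(pos, neg), (lo, hi)

# Small-sample example with invented marker values (10 diseased, 10 healthy subjects)
rng = np.random.default_rng(1)
diseased, healthy = rng.normal(1.0, 1.0, 10), rng.normal(0.0, 1.0, 10)
print(auc_percentile_bootstrap_ci(diseased, healthy))
```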

  16. Adiponectin levels measured in dried blood spot samples from neonates born small and appropriate for gestational age

    DEFF Research Database (Denmark)

    Klamer, A; Skogstrand, Kristin; Hougaard, D M

    2007-01-01

    Adiponectin levels measured in neonatal dried blood spot samples (DBSS) might be affected by both prematurity and being born small for gestational age (SGA). The aim of the study was to measure adiponectin levels in routinely collected neonatal DBSS taken on day 5 (range 3-12) postnatal from...

  17. Report: Independent Environmental Sampling Shows Some Properties Designated by EPA as Available for Use Had Some Contamination

    Science.gov (United States)

    Report #15-P-0221, July 21, 2015. Some OIG sampling results showed contamination was still present at sites designated by the EPA as ready for reuse. This was unexpected and could signal a need to implement changes to ensure human health protection.

  18. Tools for Inspecting and Sampling Waste in Underground Radioactive Storage Tanks with Small Access Riser Openings

    International Nuclear Information System (INIS)

    Nance, T.A.

    1998-01-01

    Underground storage tanks with 2 inch to 3 inch diameter access ports at the Department of Energy's Savannah River Site have been used to store radioactive solvents and sludge. In order to close these tanks, their contents first need to be quantified in terms of volume and chemical and radioactive characteristics. To provide information on the volume of waste contained within the tanks, a small remote inspection system was needed. This inspection system was designed to provide lighting and pan and tilt capabilities in an inexpensive package with zoom capability and color video. The system also needed to be operated inside a plastic tent built over the access port to contain any contamination exiting from the port. It had to be built to travel into the small port opening, through the riser pipe, into the tank's evacuated space, and back out of the riser pipe and access port with no possibility of being caught and blocking the access riser. Long thin plates that blocked the inspection system from penetrating into the tank interiors were found in many access riser pipes. Retrieval tools were developed to clear the plates from the tanks, along with sampling devices that provide safe containment for the samples. This paper discusses the inspection systems, the tools for clearing access pipes, and the solvent sampling tools developed to evaluate the contents of the underground solvent storage tanks.

  19. Weighted piecewise LDA for solving the small sample size problem in face verification.

    Science.gov (United States)

    Kyperountas, Marios; Tefas, Anastasios; Pitas, Ioannis

    2007-03-01

    A novel algorithm that can be used to boost the performance of face-verification methods that utilize Fisher's criterion is presented and evaluated. The algorithm is applied to similarity, or matching error, data and provides a general solution for overcoming the "small sample size" (SSS) problem, where the lack of sufficient training samples causes improper estimation of a linear separation hyperplane between the classes. Two independent phases constitute the proposed method. Initially, a set of weighted piecewise discriminant hyperplanes are used in order to provide a more accurate discriminant decision than the one produced by the traditional linear discriminant analysis (LDA) methodology. The expected classification ability of this method is investigated throughout a series of simulations. The second phase defines proper combinations for person-specific similarity scores and describes an outlier removal process that further enhances the classification ability. The proposed technique has been tested on the M2VTS and XM2VTS frontal face databases. Experimental results indicate that the proposed framework greatly improves the face-verification performance.

  20. The activation analysis of gold in small refractory pebbles

    International Nuclear Information System (INIS)

    Bibby, D.M.; Chaix, R.P.

    1975-08-01

    The gold content of a suite of small pebbles, residual to the milling and leaching of a gold-bearing ore, has been investigated by means of neutron activation analysis (NAA). An NAA technique with a sensitivity of 0.02 μg of gold was used, as appropriate to the samples under investigation. An alternative NAA technique developed with the same sample suite showed a sensitivity of the order of 10⁻⁴ to 10⁻⁵ μg of gold. The NAA techniques developed are appropriate to the determination of gold in small samples of ore not normally amenable to milling and/or dissolution.

  1. Information in small neuronal ensemble activity in the hippocampal CA1 during delayed non-matching to sample performance in rats

    Directory of Open Access Journals (Sweden)

    Takahashi Susumu

    2009-09-01

    Full Text Available Background: The matrix-like organization of the hippocampus, with its several inputs and outputs, has given rise to several theories related to hippocampal information processing. Single-cell electrophysiological studies and studies of lesions or genetically altered animals using recognition memory tasks such as delayed non-matching-to-sample (DNMS) tasks support the theories. However, a complete understanding of hippocampal function necessitates knowledge of the encoding of information by multiple neurons in a single trial. The role of neuronal ensembles in the hippocampal CA1 for a DNMS task was assessed quantitatively in this study using multi-neuronal recordings and an artificial neural network classifier as a decoder. Results: The activity of small neuronal ensembles (6-18 cells) over brief time intervals (2-50 ms) contains accurate information specifically related to the matching/non-matching of continuously presented stimuli (stimulus comparison). The accuracy of the combination of neurons pooled over all the ensembles was markedly lower than those of the ensembles over all examined time intervals. Conclusion: The results show that the spatiotemporal patterns of spiking activity among cells in the small neuronal ensemble contain much information that is specifically useful for the stimulus comparison. Small neuronal networks in the hippocampal CA1 might therefore act as a comparator during recognition memory tasks.
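
    The decoding step can be illustrated with a small stand-in experiment: train a neural-network classifier on ensemble spike counts and check whether match/non-match labels can be read out above chance. The data below are synthetic Poisson counts, and scikit-learn's MLPClassifier is used purely as a generic artificial neural network; the ensemble size, firing rates, and network size are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: spike counts of a 12-cell ensemble in a brief window,
# for "match" (label 0) versus "non-match" (label 1) trials.
n_trials, n_cells = 200, 12
labels = rng.integers(0, 2, n_trials)
rates = np.where(labels[:, None] == 0, 3.0, 3.5)      # small firing-rate difference
counts = rng.poisson(rates, size=(n_trials, n_cells))

# A small ANN serves as the decoder; cross-validated accuracy above 0.5 indicates
# that the ensemble activity carries match/non-match information.
decoder = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
scores = cross_val_score(decoder, counts, labels, cv=5)
print("decoding accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```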

  2. Perspectives of an acoustic–electrostatic/electrodynamic hybrid levitator for small fluid and solid samples

    International Nuclear Information System (INIS)

    Lierke, E G; Holitzner, L

    2008-01-01

    The feasibility of an acoustic–electrostatic hybrid levitator for small fluid and solid samples is evaluated. A proposed design and its theoretical assessment are based on the optional implementation of simple hardware components (ring electrodes) and standard laboratory equipment into typical commercial ultrasonic standing wave levitators. These levitators allow precise electrical charging of drops during syringe- or ink-jet-type deployment. The homogeneous electric 'Millikan field' between the grounded ultrasonic transducer and the electrically charged reflector provides an axial compensation of the sample weight in an indifferent equilibrium, which can be balanced by using commercial optical position sensors in combination with standard electronic PID position control. Radial electrostatic repulsion forces between the charged sample and concentric ring electrodes of the same polarity provide stable positioning at the centre of the levitator. The levitator can be used in a pure acoustic or electrostatic mode or in a hybrid combination of both subsystems. Analytical evaluations of the radial–axial force profiles are verified with detailed numerical finite element calculations under consideration of alternative boundary conditions. The simple hardware modification with implemented double-ring electrodes in ac/dc operation is also feasible for an electrodynamic/acoustic hybrid levitator

  3. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments.

    Science.gov (United States)

    Bras, Wim; Koizumi, Satoshi; Terrill, Nicholas J

    2014-11-01

    Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  4. Patient Safety Outcomes in Small Urban and Small Rural Hospitals

    Science.gov (United States)

    Vartak, Smruti; Ward, Marcia M.; Vaughn, Thomas E.

    2010-01-01

    Purpose: To assess patient safety outcomes in small urban and small rural hospitals and to examine the relationship of hospital and patient factors to patient safety outcomes. Methods: The Nationwide Inpatient Sample and American Hospital Association annual survey data were used for analyses. To increase comparability, the study sample was…

  5. Ca II TRIPLET SPECTROSCOPY OF SMALL MAGELLANIC CLOUD RED GIANTS. I. ABUNDANCES AND VELOCITIES FOR A SAMPLE OF CLUSTERS

    International Nuclear Information System (INIS)

    Parisi, M. C.; Claria, J. J.; Grocholski, A. J.; Geisler, D.; Sarajedini, A.

    2009-01-01

    We have obtained near-infrared spectra covering the Ca II triplet lines for a large number of stars associated with 16 Small Magellanic Cloud (SMC) clusters using the VLT + FORS2. These data compose the largest available sample of SMC clusters with spectroscopically derived abundances and velocities. Our clusters span a wide range of ages and provide good areal coverage of the galaxy. Cluster members are selected using a combination of their positions relative to the cluster center as well as their location in the color-magnitude diagram, abundances, and radial velocities (RVs). We determine mean cluster velocities to typically 2.7 km s⁻¹ and metallicities to 0.05 dex (random errors), from an average of 6.4 members per cluster. By combining our clusters with previously published results, we compile a sample of 25 clusters on a homogeneous metallicity scale and with relatively small metallicity errors, and thereby investigate the metallicity distribution, metallicity gradient, and age-metallicity relation (AMR) of the SMC cluster system. For all 25 clusters in our expanded sample, the mean metallicity [Fe/H] = -0.96 with σ = 0.19. The metallicity distribution may possibly be bimodal, with peaks at ∼-0.9 dex and -1.15 dex. Similar to the Large Magellanic Cloud (LMC), the SMC cluster system gives no indication of a radial metallicity gradient. However, intermediate age SMC clusters are both significantly more metal-poor and have a larger metallicity spread than their LMC counterparts. Our AMR shows evidence for three phases: a very early (>11 Gyr) phase in which the metallicity reached ∼-1.2 dex, a long intermediate phase from ∼10 to 3 Gyr in which the metallicity only slightly increased, and a final phase from 3 to 1 Gyr ago in which the rate of enrichment was substantially faster. We find good overall agreement with the model of Pagel and Tautvaisiene, which assumes a burst of star formation at 4 Gyr. Finally, we find that the mean RV of the cluster system

  6. The polymorphic insertion of the luteinizing hormone receptor "insLQ" show a negative association to LHR gene expression and to the follicular fluid hormonal profile in human small antral follicles

    DEFF Research Database (Denmark)

    Borgbo, T; Chrudimska, J; Macek, M

    2018-01-01

    The luteinizing hormone receptor (LHCGR) has a little studied polymorphic 6 bp insertion (rs4539842/insLQ). This study has evaluated the insLQ polymorphism in relation to potential associations with hormonal characteristics of human small antral follicles (hSAFs). In total, 310 hSAFs were collected from 86 women undergoing fertility preservation. Analysis included the hormonal profile of 297 follicular fluid (FF) samples, and 148 corresponding granulosa cell samples were evaluated by qPCR for selected genes. Significantly reduced and non-detectable mRNA levels of anti-Müllerian hormone receptor II (AMHR2) and LHCGR, respectively, were observed for insLQ/insLQ compared to -/insLQ and the -/- genotypes. Moreover, LHCGR and CYP19a1 together with oestradiol and inhibin-B were significantly increased in -/insLQ compared to the -/- genotype. The homozygous insLQ genotype showed strong significant...

  7. Small Is Not Always Beautiful – A Comparative Study Of Working Conditions

    DEFF Research Database (Denmark)

    Sørensen, Ole Henning; Hasle, Peter; Bach, Elsa

    2004-01-01

    When considering working conditions, the myth sometimes holds that small is beautiful. Some studies of occupational accidents show that small companies have fewer accidents than large ones. This indicates that small companies have better working conditions than large ones. A Danish study based on a random sample of employees in the general industry rejects this myth. The figures show that physical working conditions become systematically worse the smaller the size of privately owned companies. The study also supports international data that small companies underreport accidents, and that this might be the reason why small seems beautiful.

  8. A TIMS-based method for the high precision measurements of the three-isotope potassium composition of small samples

    DEFF Research Database (Denmark)

    Wielandt, Daniel Kim Peel; Bizzarro, Martin

    2011-01-01

    A novel thermal ionization mass spectrometry (TIMS) method for the three-isotope analysis of K has been developed, and ion chromatographic methods for the separation of K have been adapted for the processing of small samples. The precise measurement of K-isotopes is challenged by the presence of ...

  9. Identification of potential small molecule allosteric modulator sites on IL-1R1 ectodomain using accelerated conformational sampling method.

    Directory of Open Access Journals (Sweden)

    Chao-Yie Yang

    Full Text Available The interleukin-1 receptor (IL-1R) is the founding member of the interleukin 1 receptor family, which activates the innate immune response by binding to cytokines. Reports have shown that dysregulation of cytokine production leads to aberrant immune cell activation, which contributes to auto-inflammatory disorders and diseases. Current therapeutic strategies focus on utilizing antibodies or chimeric cytokine biologics. The large protein-protein interaction interface between cytokine receptor and cytokine poses a challenge in identifying binding sites for small molecule inhibitor development. Based on the significant conformational change of the IL-1R type 1 (IL-1R1) ectodomain upon binding to different ligands observed in crystal structures, we hypothesized that transient small molecule binding sites may exist when IL-1R1 undergoes conformational transition and would thus be suitable for inhibitor development. Here, we employed accelerated molecular dynamics (MD) simulation to efficiently sample the conformational space of the IL-1R1 ectodomain. Representative IL-1R1 ectodomain conformations determined from the hierarchical cluster analysis were analyzed by the SiteMap program, which led to the identification of small molecule binding sites at the protein-protein interaction interface and allosteric modulator locations. The cosolvent mapping analysis using phenol as the probe molecule further confirms the allosteric modulator site as a binding hotspot. The eight highest ranked fragment molecules identified from in silico screening at the modulator site were evaluated by MD simulations. Four of them restricted the IL-1R1 dynamical motion to the inactive conformational space. The strategy from this study, subject to in vitro experimental validation, can be useful for identifying small molecule compounds that target the allosteric modulator sites of IL-1R and prevent IL-1R from binding to cytokines by trapping IL-1R in inactive conformations.

  10. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments

    Directory of Open Access Journals (Sweden)

    Wim Bras

    2014-11-01

    Full Text Available Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  11. A combined approach of generalized additive model and bootstrap with small sample sets for fault diagnosis in fermentation process of glutamate.

    Science.gov (United States)

    Liu, Chunbo; Pan, Feng; Li, Yun

    2016-07-29

    Glutamate is of great importance in the food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, a statistical approach based on the generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less in the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6 % of the variance in glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for online fault diagnosis in abnormal fermentation processes of glutamate, and a fault was defined as the estimated production of glutamate falling outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of the fault in the fermentation process, but also the end of the fault when the fermentation conditions were back to normal. The proposed approach only used a small sample set from normal fermentation experiments to establish the model, and then only required online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for fault diagnosis in the fermentation process of glutamate with small sample sets.
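
    A minimal sketch of the GAM-plus-bootstrap fault flag is shown below, assuming the pygam package as a stand-in for the GAM fit described above; the smooth terms, bootstrap size, and confidence level mirror the abstract but are otherwise illustrative, and no attempt is made to reproduce the authors' data handling.

```python
import numpy as np
from pygam import LinearGAM, s   # assumed available; any GAM implementation would do

def fit_gam(X, y):
    """One smooth term per parameter: time, dissolved oxygen, OUR, CER (columns 0-3)."""
    return LinearGAM(s(0) + s(1) + s(2) + s(3)).fit(X, y)

def bootstrap_band(X, y, X_new, n_boot=200, alpha=0.05, seed=1):
    """Case-resampling bootstrap of the fitted GAM giving a 95 % band for predictions.
    X, y are NumPy arrays built from the normal fermentation runs."""
    rng = np.random.default_rng(seed)
    preds = np.empty((n_boot, len(X_new)))
    for b in range(n_boot):
        idx = rng.integers(0, len(X), len(X))
        preds[b] = fit_gam(X[idx], y[idx]).predict(X_new)
    lo, hi = np.quantile(preds, [alpha / 2, 1 - alpha / 2], axis=0)
    return lo, hi

def flag_faults(y_observed, lo, hi):
    """A time point is flagged as a fault when observed production leaves the band."""
    return (y_observed < lo) | (y_observed > hi)
```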

  12. High-speed imaging upgrade for a standard sample scanning atomic force microscope using small cantilevers

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Jonathan D.; Nievergelt, Adrian; Erickson, Blake W.; Yang, Chen; Dukic, Maja; Fantner, Georg E., E-mail: georg.fantner@epfl.ch [Ecole Polytechnique Fédérale de Lausanne, Lausanne (Switzerland)

    2014-09-15

    We present an atomic force microscope (AFM) head for optical beam deflection on small cantilevers. Our AFM head is designed to be small in size, easily integrated into a commercial AFM system, and has a modular architecture facilitating exchange of the optical and electronic assemblies. We present two different designs for both the optical beam deflection and the electronic readout systems, and evaluate their performance. Using small cantilevers with our AFM head on an otherwise unmodified commercial AFM system, we are able to take tapping mode images approximately 5–10 times faster compared to the same AFM system using large cantilevers. By using additional scanner turnaround resonance compensation and a controller designed for high-speed AFM imaging, we show tapping mode imaging of lipid bilayers at line scan rates of 100–500 Hz for scan areas of several micrometers in size.

  13. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
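
    As a rough illustration of how non-depletive SPME readings translate into binding figures, the snippet below converts the amount extracted by a fiber into a free concentration through an externally calibrated fiber constant and then computes percent protein binding. The function names, calibration constant, and example numbers are hypothetical and only sketch the arithmetic; they are not part of the published workflow.

```python
def free_concentration(amount_on_fiber_ng, fiber_constant_ml):
    """Free concentration (ng/mL) from a negligible-depletion fiber, where the fiber
    constant (mL) is the amount extracted per unit free concentration, from calibration."""
    return amount_on_fiber_ng / fiber_constant_ml

def percent_protein_binding(c_free_ng_ml, c_total_ng_ml):
    """Bound fraction expressed as percent protein binding."""
    return 100.0 * (1.0 - c_free_ng_ml / c_total_ng_ml)

# Hypothetical numbers: 0.8 ng extracted, fiber constant 0.02 mL, total concentration 100 ng/mL
c_free = free_concentration(0.8, 0.02)             # 40 ng/mL free
print(percent_protein_binding(c_free, 100.0))      # 60.0 % bound
```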

  14. Gravimetric and volumetric approaches adapted for hydrogen sorption measurements with in situ conditioning on small sorbent samples

    International Nuclear Information System (INIS)

    Poirier, E.; Chahine, R.; Tessier, A.; Bose, T.K.

    2005-01-01

    We present high sensitivity (0 to 1 bar, 295 K) gravimetric and volumetric hydrogen sorption measurement systems adapted for in situ sample conditioning at high temperature and high vacuum. These systems are designed especially for experiments on sorbents available in small masses (mg) and requiring thorough degassing prior to sorption measurements. Uncertainty analysis from instrumental specifications and hydrogen absorption measurements on palladium are presented. The gravimetric and volumetric systems yield cross-checkable results within about 0.05 wt % on samples weighing from (3 to 25) mg. Hydrogen storage capacities of single-walled carbon nanotubes measured at 1 bar and 295 K with both systems are presented

  15. Inverse Gaussian model for small area estimation via Gibbs sampling

    African Journals Online (AJOL)

    We present a Bayesian method for estimating small area parameters under an inverse Gaussian model. The method is extended to estimate small area parameters for finite populations. The Gibbs sampler is proposed as a mechanism for implementing the Bayesian paradigm. We illustrate the method by application to ...

  16. Determination of Organic Pollutants in Small Samples of Groundwaters by Liquid-Liquid Extraction and Capillary Gas Chromatography

    DEFF Research Database (Denmark)

    Harrison, I.; Leader, R.U.; Higgo, J.J.W.

    1994-01-01

    A method is presented for the determination of 22 organic compounds in polluted groundwaters. The method includes liquid-liquid extraction of the base/neutral organics from small, alkaline groundwater samples, followed by derivatisation and liquid-liquid extraction of phenolic compounds after neutralisation. The extracts were analysed by capillary gas chromatography. Dual detection by flame ionisation and electron capture was used to reduce analysis time.

  17. Present status of NMCC and sample preparation method for bio-samples

    International Nuclear Information System (INIS)

    Futatsugawa, S.; Hatakeyama, S.; Saitou, S.; Sera, K.

    1993-01-01

    At NMCC (Nishina Memorial Cyclotron Center) we are conducting research on PET (Positron Emission Computed Tomography) in nuclear medicine and PIXE (Particle Induced X-ray Emission) analysis using a compactly designed small cyclotron. The NMCC facilities have been open to researchers from other institutions since April 1993. The present status of NMCC is described. Bio-samples (medical samples, plants, animals and environmental samples) have mainly been analyzed by PIXE at NMCC. Small amounts of bio-samples for PIXE are decomposed quickly and easily in a sealed PTFE (polytetrafluoroethylene) vessel with a microwave oven. This sample preparation method for bio-samples is also described. (author)

  18. Use of aspiration method for collecting brain samples for rabies diagnosis in small wild animals.

    Science.gov (United States)

    Iamamoto, K; Quadros, J; Queiroz, L H

    2011-02-01

    In developing countries such as Brazil, where canine rabies is still a considerable problem, samples from wildlife species are infrequently collected and submitted for rabies screening. A collaborative study involving environmental biologists and veterinarians was established for rabies epidemiological research in a specific ecological area located in Sao Paulo State, Brazil. The wild animals' brains must be collected without skull damage because skull measurements are important for identifying the captured animal species. For this purpose, samples from bats and small mammals were collected using an aspiration method, by inserting a plastic pipette into the brain through the foramen magnum. While there is a progressive increase in the use of the plastic pipette technique in various studies, it is also appreciated that this method could foster collaborative research between wildlife scientists and rabies epidemiologists, thus improving rabies surveillance. © 2009 Blackwell Verlag GmbH.

  19. An Inset CT Specimen for Evaluating Fracture in Small Samples of Material

    Science.gov (United States)

    Yahyazadehfar, M.; Nazari, A.; Kruzic, J.J.; Quinn, G.D.; Arola, D.

    2013-01-01

    In evaluations of the fracture behavior of hard tissues and many biomaterials, the volume of material available to study is not always sufficient to apply a standard method of practice. In the present study an inset Compact Tension (inset CT) specimen is described, which uses a small cube of material (approximately 2×2×2 mm³) that is molded within a secondary material to form the compact tension geometry. A generalized equation describing the Mode I stress intensity was developed for the specimen using the solutions from a finite element model that was defined over permissible crack lengths, variations in specimen geometry, and a range of elastic properties of the inset and mold materials. A validation of the generalized equation was performed using estimates of the fracture toughness of a commercial dental composite via the “inset CT” specimen and the standard geometry defined by ASTM E399. Results showed that the average fracture toughness obtained from the new specimen (1.23 ± 0.02 MPa·m^0.5) was within 2% of that from the standard. Applications of the inset CT specimen are presented for experimental evaluations of the crack growth resistance of dental enamel and root dentin, including their fracture resistance curves. Potential errors in adopting this specimen are then discussed, including the effects of debonding between the inset and molding material on the estimated stress intensity distribution. Results of the investigation show that the inset CT specimen offers a viable approach for studying the fracture behavior of small volumes of structural materials. PMID:24268892

  20. Small area estimation (SAE) model: Case study of poverty in West Java Province

    Science.gov (United States)

    Suhartini, Titin; Sadik, Kusman; Indahwati

    2016-02-01

    This paper presents a comparison of direct estimation and the indirect/small area estimation (SAE) model. Model selection included resolving the multicollinearity problem in the auxiliary variables, either by choosing only non-collinear variables or by implementing principal components (PC). The parameters of concern in this paper were the proportion of agricultural-venture poor households and of agricultural poor households at the area level in West Java Province. These parameters can be estimated either directly or through SAE. The problem with direct estimation is that three areas even had a sample size of zero, so direct estimation could not be conducted because of the small sample size. The proportion of agricultural-venture poor households was 19.22% and that of agricultural poor households was 46.79%. The best model for agricultural-venture poor households was obtained by choosing only non-collinear variables, and the best model for agricultural poor households by implementing PC. SAE proved better than direct estimation for both the proportion of agricultural-venture poor households and that of agricultural poor households at the area level in West Java Province. To overcome the small sample sizes and obtain estimates for small areas, the small area estimation method was implemented, giving evidence of higher accuracy and better precision than the direct estimator.
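
    A minimal sketch of the principal-component route for dealing with collinear auxiliary variables is shown below. It uses scikit-learn and an ordinary least-squares fit as a stand-in for the area-level SAE model, so the variable names, invented data, and number of retained components are illustrative assumptions, not the authors' specification.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)

# Hypothetical area-level data: 30 small areas, 6 highly collinear auxiliary variables
n_areas = 30
base = rng.normal(size=(n_areas, 2))
X = np.hstack([base, base @ rng.normal(size=(2, 4)) + 0.05 * rng.normal(size=(n_areas, 4))])
y = 0.3 + base @ np.array([0.2, -0.1]) + 0.02 * rng.normal(size=n_areas)  # direct estimates

# Standardize, keep the leading principal components, then regress on the PC scores;
# the fitted values play the role of the synthetic (model-based) part of an SAE estimator.
model = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
model.fit(X, y)
print("synthetic estimates for the first five areas:", model.predict(X)[:5].round(3))
```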

  1. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2013-01-01

    Full Text Available Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties and with real-time relative humidity and temperature control; hence, the technique should be well-suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.
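
    The calibration step lends itself to a short illustration: below, a power-law relationship between bead circulation rate and viscosity is fitted on log-log axes and then inverted to predict the viscosity of a new sample. The numbers are synthetic and the power-law form is an assumption of the sketch, not the calibration reported by the authors.

```python
import numpy as np

# Synthetic calibration data: known viscosities (Pa s) and measured bead circulation rates (1/s)
viscosity = np.array([1e-2, 1e-1, 1e0, 1e1, 1e2])
circulation_rate = np.array([8.0, 1.1, 0.12, 1.3e-2, 1.1e-3])

# Fit log10(rate) = a * log10(viscosity) + b, i.e. an assumed power-law calibration curve
a, b = np.polyfit(np.log10(viscosity), np.log10(circulation_rate), deg=1)

def predict_viscosity(rate):
    """Invert the fitted calibration curve to estimate viscosity from a measured rate."""
    return 10 ** ((np.log10(rate) - b) / a)

print(predict_viscosity(0.05))  # viscosity estimate (Pa s) for a new sample's circulation rate
```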

  2. Static, Mixed-Array Total Evaporation for Improved Quantitation of Plutonium Minor Isotopes in Small Samples

    Science.gov (United States)

    Stanley, F. E.; Byerly, Benjamin L.; Thomas, Mariam R.; Spencer, Khalil J.

    2016-06-01

    Actinide isotope measurements are a critical signature capability in the modern nuclear forensics "toolbox", especially when interrogating anthropogenic constituents in real-world scenarios. Unfortunately, established methodologies, such as traditional total evaporation via thermal ionization mass spectrometry, struggle to confidently measure low abundance isotope ratios (evaporation techniques as a straightforward means of improving plutonium minor isotope measurements, which have been resistant to enhancement in recent years because of elevated radiologic concerns. Results are presented for small sample (~20 ng) applications involving a well-known plutonium isotope reference material, CRM-126a, and compared with traditional total evaporation methods.

  3. A passive guard for low thermal conductivity measurement of small samples by the hot plate method

    International Nuclear Information System (INIS)

    Jannot, Yves; Godefroy, Justine; Degiovanni, Alain; Grigorova-Moutiers, Veneta

    2017-01-01

    Hot plate methods under steady state conditions are based on a 1D model to estimate the thermal conductivity, using measurements of the temperatures T₀ and T₁ of the two sides of the sample and of the heat flux crossing it. To be consistent with the hypothesis of 1D heat flux, either a guarded hot plate apparatus is used, or the temperature is measured at the centre of the sample. On the one hand, the latter method can be used only if the thickness/width ratio of the sample is sufficiently low; on the other hand, the guarded hot plate method requires large-width samples (typical cross section of 0.6 × 0.6 m²). That is why neither method can be used for small-width samples. The method presented in this paper is based on an optimal choice of the temperatures T₀ and T₁ relative to the ambient temperature Tₐ, enabling the estimation of the thermal conductivity with a centered hot plate method by applying the 1D heat flux model. It will be shown that these optimal values do not depend on the size or on the thermal conductivity of the samples (in the range 0.015–0.2 W m⁻¹ K⁻¹), but only on Tₐ. The experimental results obtained validate the method for several reference samples for values of the thickness/width ratio up to 0.3, thus enabling the measurement of the thermal conductivity of samples having a small cross-section, down to 0.045 × 0.045 m². (paper)
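
    For reference, the 1D steady-state relation that underlies the hot plate estimate can be written as a one-line function. This is the generic textbook form, lambda = (Q/A) · e / (T₀ − T₁), with Q/A the heat flux density and e the sample thickness; it is not the authors' corrected model, and the example numbers are made up.

```python
def thermal_conductivity(heat_flow_w, area_m2, thickness_m, t_hot_c, t_cold_c):
    """1D steady-state hot plate estimate: lambda = (Q / A) * e / (T0 - T1), in W m^-1 K^-1."""
    heat_flux_density = heat_flow_w / area_m2           # W m^-2
    return heat_flux_density * thickness_m / (t_hot_c - t_cold_c)

# Made-up example: 0.1 W through a 45 mm x 45 mm, 10 mm thick sample with a 5 K difference
print(thermal_conductivity(0.1, 0.045 * 0.045, 0.010, 30.0, 25.0))  # ~0.099 W/(m K)
```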

  4. 7 CFR 201.42 - Small containers.

    Science.gov (United States)

    2010-01-01

    Sampling in the Administration of the Act, § 201.42 Small containers: In sampling seed in small containers that it is not practical to sample as required in § 201.41, a portion of one unopened container or...

  5. Fungal communities in wheat grain show significant co-existence patterns among species

    DEFF Research Database (Denmark)

    Nicolaisen, M.; Justesen, A. F.; Knorr, K.

    2014-01-01

    identified as ‘core’ OTUs as they were found in all or almost all samples and accounted for almost 99 % of all sequences. The remaining OTUs were only sporadically found and only in small amounts. Cluster and factor analyses showed patterns of co-existence among the core species. Cluster analysis grouped the 21 core OTUs into three clusters: cluster 1 consisting of saprotrophs, cluster 2 consisting mainly of yeasts and saprotrophs and cluster 3 consisting of wheat pathogens. Principal component extraction showed that the Fusarium graminearum group was inversely related to OTUs of clusters 1 and 2.

  6. Safety evaluation of small samples for isotope production

    International Nuclear Information System (INIS)

    Sharma, Archana; Singh, Tej; Varde, P.V.

    2015-09-01

    Radioactive isotopes are widely used in basic and applied science and engineering, most notably as environmental and industrial tracers and for medical imaging procedures. The production of radioisotopes constitutes an important activity of the Indian nuclear program. Since its initial criticality, the DHRUVA reactor has facilitated the regular supply of most of the radioisotopes required in the country for applications in the fields of medicine, industry and agriculture. In-pile irradiation of samples requires a prior estimation of the sample reactivity load, heating rate, activity developed and shielding thickness required for post-irradiation handling. This report is an attempt to highlight the contributions of the DHRUVA reactor, as well as to explain in detail the methodologies used in the safety evaluation of in-pile irradiation samples. (author)

  7. Algorithm for computing significance levels using the Kolmogorov-Smirnov statistic and valid for both large and small samples

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.E.; Fields, D.E.

    1983-10-01

    The KSTEST code presented here is designed to perform the Kolmogorov-Smirnov one-sample test. The code may be used as a stand-alone program or the principal subroutines may be excerpted and used to service other programs. The Kolmogorov-Smirnov one-sample test is a nonparametric goodness-of-fit test. A number of codes to perform this test are in existence, but they suffer from the inability to provide meaningful results in the case of small sample sizes (number of values less than or equal to 80). The KSTEST code overcomes this inadequacy by using two distinct algorithms. If the sample size is greater than 80, an asymptotic series developed by Smirnov is evaluated. If the sample size is 80 or less, a table of values generated by Birnbaum is referenced. Valid results can be obtained from KSTEST when the sample contains from 3 to 300 data points. The program was developed on a Digital Equipment Corporation PDP-10 computer using the FORTRAN-10 language. The code size is approximately 450 card images and the typical CPU execution time is 0.19 s.
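
    The two-branch logic the report describes (an asymptotic series for larger samples, a tabulated/exact distribution for small ones) can be sketched in a few lines of modern code. This is a hedged re-implementation of the idea, not the original FORTRAN-10 KSTEST: the Stephens-style correction factor and the use of SciPy's exact small-sample distribution in place of Birnbaum's table are assumptions of the sketch.

```python
import numpy as np
from scipy import stats

def ks_one_sample(data, cdf=stats.norm.cdf, n_switch=80):
    """One-sample Kolmogorov-Smirnov test: exact p-value for small n, asymptotic otherwise."""
    x = np.sort(np.asarray(data, dtype=float))
    n = x.size
    ecdf_hi = np.arange(1, n + 1) / n        # ECDF just after each observation
    ecdf_lo = np.arange(0, n) / n            # ECDF just before each observation
    f = cdf(x)
    d = max(np.max(ecdf_hi - f), np.max(f - ecdf_lo))   # two-sided D statistic

    if n > n_switch:
        # Asymptotic Kolmogorov series with a finite-sample correction factor
        lam = (np.sqrt(n) + 0.12 + 0.11 / np.sqrt(n)) * d
        k = np.arange(1, 101)
        p = 2.0 * np.sum((-1.0) ** (k - 1) * np.exp(-2.0 * k ** 2 * lam ** 2))
    else:
        # Small-sample branch: exact distribution of D_n (in place of Birnbaum's table)
        p = stats.kstwo.sf(d, n)
    return d, float(np.clip(p, 0.0, 1.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print(ks_one_sample(rng.normal(size=40)))   # small-sample branch
    print(ks_one_sample(rng.normal(size=200)))  # asymptotic branch
```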

  8. A rheo-optical apparatus for real time kinetic studies on shear-induced alignment of self-assembled soft matter with small sample volumes

    Science.gov (United States)

    Laiho, Ari; Ikkala, Olli

    2007-01-01

    In soft materials, self-assembled nanoscale structures can allow new functionalities but a general problem is to align such local structures aiming at monodomain overall order. In order to achieve shear alignment in a controlled manner, a novel type of rheo-optical apparatus has here been developed that allows small sample volumes and in situ monitoring of the alignment process during the shear. Both the amplitude and orientation angles of low level linear birefringence and dichroism are measured while the sample is subjected to large amplitude oscillatory shear flow. The apparatus is based on a commercial rheometer where we have constructed a flow cell that consists of two quartz teeth. The lower tooth can be set in oscillatory motion whereas the upper one is connected to the force transducers of the rheometer. A custom made cylindrical oven allows the operation of the flow cell at elevated temperatures up to 200 °C. Only a small sample volume is needed (from 9 to 25 mm³), which makes the apparatus suitable especially for studying new materials which are usually obtainable only in small quantities. Using this apparatus the flow alignment kinetics of a lamellar polystyrene-b-polyisoprene diblock copolymer is studied during shear under two different conditions which lead to parallel and perpendicular alignment of the lamellae. The open device geometry allows even combined optical/x-ray in situ characterization of the alignment process by combining small-angle x-ray scattering using concepts shown by Polushkin et al. [Macromolecules 36, 1421 (2003)].

  9. The Effect of Small Sample Size on Measurement Equivalence of Psychometric Questionnaires in MIMIC Model: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Jamshid Jamali

    2017-01-01

    Full Text Available Evaluating measurement equivalence (also known as differential item functioning, DIF) is an important part of the process of validating psychometric questionnaires. This study aimed at evaluating the multiple indicators multiple causes (MIMIC) model for DIF detection when the latent construct distribution is nonnormal and the focal group sample size is small. In this simulation-based study, Type I error rates and power of the MIMIC model for detecting uniform DIF were investigated under different combinations of reference-to-focal group sample size ratio, magnitude of the uniform-DIF effect, scale length, number of response categories, and latent trait distribution. Moderate and high skewness in the latent trait distribution reduced the power of the MIMIC model to detect uniform DIF by 0.33% and 0.47%, respectively. Increasing the scale length, the number of response categories, and the DIF magnitude improved the power of the MIMIC model by 3.47%, 4.83%, and 20.35%, respectively, and decreased its Type I error rate by 2.81%, 5.66%, and 0.04%, respectively. This study revealed that the power of the MIMIC model was at an acceptable level when latent trait distributions were skewed; however, the empirical Type I error rate was slightly greater than the nominal significance level. Consequently, the MIMIC model is recommended for detection of uniform DIF when the latent construct distribution is nonnormal and the focal group sample size is small.

  10. The Effect of Small Sample Size on Measurement Equivalence of Psychometric Questionnaires in MIMIC Model: A Simulation Study.

    Science.gov (United States)

    Jamali, Jamshid; Ayatollahi, Seyyed Mohammad Taghi; Jafari, Peyman

    2017-01-01

    Evaluating measurement equivalence (also known as differential item functioning (DIF)) is an important part of the process of validating psychometric questionnaires. This study aimed at evaluating the multiple indicators multiple causes (MIMIC) model for DIF detection when the latent construct distribution is nonnormal and the focal group sample size is small. In this simulation-based study, Type I error rates and power of the MIMIC model for detecting uniform DIF were investigated under different combinations of reference-to-focal group sample size ratio, magnitude of the uniform-DIF effect, scale length, number of response categories, and latent trait distribution. Moderate and high skewness in the latent trait distribution reduced the power of the MIMIC model to detect uniform DIF by 0.33% and 0.47%, respectively. Increasing the scale length, the number of response categories, and the DIF magnitude improved the power of the MIMIC model by 3.47%, 4.83%, and 20.35%, respectively, and decreased its Type I error rate by 2.81%, 5.66%, and 0.04%, respectively. This study revealed that the power of the MIMIC model was at an acceptable level when latent trait distributions were skewed; however, the empirical Type I error rate was slightly greater than the nominal significance level. Consequently, the MIMIC model is recommended for detection of uniform DIF when the latent construct distribution is nonnormal and the focal group sample size is small.
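
    As a rough illustration of the kind of simulation harness such a study relies on, the sketch below estimates empirical Type I error and power for a uniform-DIF detector. A simple logistic-regression DIF test is used as a stand-in for the MIMIC model (which would require a structural equation modelling package), and all sample sizes, item parameters and effect sizes are illustrative only.

        # Generic Monte Carlo harness for empirical Type I error / power of a
        # uniform-DIF detector; logistic-regression DIF test stands in for MIMIC.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        def simulate_item(n_ref, n_foc, dif=0.0, skew=0.0):
            """Dichotomous responses for one studied item under a 2PL-like model."""
            group = np.r_[np.zeros(n_ref), np.ones(n_foc)]   # 0 = reference, 1 = focal
            theta = stats.skewnorm.rvs(a=skew, size=group.size, random_state=rng)
            logit = 1.2 * theta - dif * group                # uniform DIF shifts difficulty
            y = rng.random(group.size) < 1.0 / (1.0 + np.exp(-logit))
            return y.astype(float), theta, group

        def lr_dif_pvalue(y, theta, group):
            """Wald p-value for the group term in logit(y) ~ theta + group."""
            X = np.column_stack([np.ones_like(theta), theta, group])
            beta = np.zeros(3)
            for _ in range(25):                              # Newton-Raphson for the MLE
                p = 1.0 / (1.0 + np.exp(-X @ beta))
                W = p * (1 - p)
                H = X.T @ (X * W[:, None])
                beta += np.linalg.solve(H, X.T @ (y - p))
            se = np.sqrt(np.linalg.inv(H)[2, 2])
            return 2 * stats.norm.sf(abs(beta[2] / se))

        def rejection_rate(dif, n_rep=500, alpha=0.05, skew=0.0):
            hits = 0
            for _ in range(n_rep):
                y, theta, group = simulate_item(500, 100, dif=dif, skew=skew)
                hits += lr_dif_pvalue(y, theta, group) < alpha
            return hits / n_rep

        print("empirical Type I error:", rejection_rate(dif=0.0))
        print("empirical power       :", rejection_rate(dif=0.6))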

  11. Clear as daylight: analysis of diurnal raptor pellets for small mammal studies

    Directory of Open Access Journals (Sweden)

    Matos, M.

    2015-04-01

    Full Text Available Non-invasive approaches are increasingly investigated and applied in studies of small mammal assemblages because they are more cost-effective and bypass conservation and animal welfare issues. However, pellets of diurnal raptors have rarely been used for these purposes. We evaluated the potential of marsh harrier (Circus aeruginosus) pellets as a non-invasive method to sample small mammal assemblages, by comparing the results with those of sampling using Sherman live-traps and pitfalls. The three methods were applied simultaneously in an agricultural-wetland complex in NW Portugal. Estimates of species richness, diversity, evenness, abundance, and proportion of each species within the assemblage showed significant differences between the three methods. Our results suggest that the use of marsh harrier pellets is more effective in inventorying small mammal species than either of the two kinds of traps, while also avoiding any involuntary fatalities associated with the sampling of small non-volant mammals. Moreover, the analysis of pellets was the most cost-effective method. Comparison of the two trapping methodologies showed involuntary fatalities were higher in pitfalls than in Sherman traps. We discuss the advantages and flaws of the three methods, both from technical and conservational perspectives.

  12. Small population size of Pribilof Rock Sandpipers confirmed through distance-sampling surveys in Alaska

    Science.gov (United States)

    Ruthrauff, Daniel R.; Tibbitts, T. Lee; Gill, Robert E.; Dementyev, Maksim N.; Handel, Colleen M.

    2012-01-01

    The Rock Sandpiper (Calidris ptilocnemis) is endemic to the Bering Sea region and unique among shorebirds in the North Pacific for wintering at high latitudes. The nominate subspecies, the Pribilof Rock Sandpiper (C. p. ptilocnemis), breeds on four isolated islands in the Bering Sea and appears to spend the winter primarily in Cook Inlet, Alaska. We used a stratified systematic sampling design and line-transect method to survey the entire breeding range of this population during springs 2001-2003. Densities were up to four times higher on the uninhabited and more northerly St. Matthew and Hall islands than on St. Paul and St. George islands, which both have small human settlements and introduced reindeer herds. Differences in density, however, appeared to be more related to differences in vegetation than to anthropogenic factors, raising some concern for prospective effects of climate change. We estimated the total population at 19,832 birds (95% CI 17,853–21,930), ranking it among the smallest of North American shorebird populations. To determine the vulnerability of C. p. ptilocnemis to anthropogenic and stochastic environmental threats, future studies should focus on determining the amount of gene flow among island subpopulations, the full extent of the subspecies' winter range, and the current trajectory of this small population.
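
    For context, a stratified line-transect abundance estimate of the kind reported above combines per-stratum densities with stratum areas. The sketch below uses the conventional estimator D = n / (2 L w), with entirely illustrative counts, effort and effective strip half-widths rather than the survey's data.

        # Hedged sketch of a stratified line-transect abundance calculation;
        # all numbers are illustrative, not the study's data.
        import numpy as np

        def stratum_density(n_detected, line_km, esw_km):
            """Birds per km^2 for one stratum: n / (2 * L * effective strip half-width)."""
            return n_detected / (2.0 * line_km * esw_km)

        strata = {   # name: (n detected, line length km, ESW km, stratum area km^2)
            "St. Matthew": (410, 120.0, 0.05, 357.0),
            "Hall":        ( 90,  30.0, 0.05,  62.0),
            "St. Paul":    ( 60,  80.0, 0.05, 104.0),
            "St. George":  ( 45,  60.0, 0.05,  90.0),
        }

        total = 0.0
        for name, (n, L, esw, area) in strata.items():
            D = stratum_density(n, L, esw)
            total += D * area
            print(f"{name:12s} density {D:6.1f} birds/km^2")
        print("total abundance estimate:", round(total))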

  13. Breaking Free of Sample Size Dogma to Perform Innovative Translational Research

    Science.gov (United States)

    Bacchetti, Peter; Deeks, Steven G.; McCune, Joseph M.

    2011-01-01

    Innovative clinical and translational research is often delayed or prevented by reviewers’ expectations that any study performed in humans must be shown in advance to have high statistical power. This supposed requirement is not justifiable and is contradicted by the reality that increasing sample size produces diminishing marginal returns. Studies of new ideas often must start small (sometimes even with an N of 1) because of cost and feasibility concerns, and recent statistical work shows that small sample sizes for such research can produce more projected scientific value per dollar spent than larger sample sizes. Renouncing false dogma about sample size would remove a serious barrier to innovation and translation. PMID:21677197

  14. A rapid procedure for the determination of thorium, uranium, cadmium and molybdenum in small sediment samples by inductively coupled plasma-mass spectrometry: application in Chesapeake Bay

    International Nuclear Information System (INIS)

    Zheng, Y.; Weinman, B.; Cronin, T.; Fleisher, M.Q.; Anderson, R.F.

    2003-01-01

    This paper describes a rapid procedure that allows precise analysis of Mo, Cd, U and Th in sediment samples as small as 10 mg by using a novel approach that utilizes a 'pseudo' isotope dilution for Th and conventional isotope dilution for Mo, Cd and U by ICP-MS. Long-term reproducibility of the method is between 2.5 and 5% with an advantage of rapid analysis on a single digestion of sediment sample and the potential of adding other elements of interest if so desired. Application of this method to two piston cores collected near the mouth of the Patuxent River in Chesapeake Bay showed that the accumulation of authigenic Mo and Cd varied in response to the changing bottom water redox conditions, with anoxia showing consistent oscillations throughout both pre-industrial and industrial times. Accumulation of authigenic U shows consistent oscillations as well, without any apparent increase in productivity related to anoxic trends. Degrees of Mo and Cd enrichment also inversely correlate to halophilic microfaunal assemblages already established as paleoclimate proxies within the bay indicating that bottom water anoxia is driven in part by the amount of freshwater discharge that the area receives
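
    The conventional isotope-dilution step can be written compactly. The sketch below gives the standard two-isotope equation used for Mo, Cd and U (the 'pseudo' isotope dilution applied to Th is not reproduced), with made-up spike abundances and a made-up measured ratio rather than values from the paper.

        # Hedged sketch of conventional isotope dilution; numbers are illustrative.
        def isotope_dilution_moles(n_spike, a_ref_spike, a_spk_spike,
                                   a_ref_nat, a_spk_nat, r_measured):
            """
            Moles of analyte in the sample from a spike addition.

            n_spike     : moles of spike added
            a_ref_spike : abundance of the reference isotope in the spike
            a_spk_spike : abundance of the enriched (spike) isotope in the spike
            a_ref_nat   : natural abundance of the reference isotope
            a_spk_nat   : natural abundance of the spike isotope
            r_measured  : measured reference/spike isotope ratio in the mixture
            """
            return n_spike * (a_ref_spike - r_measured * a_spk_spike) / \
                   (r_measured * a_spk_nat - a_ref_nat)

        # Example with invented numbers for a U measurement (238U reference, 236U spike):
        n = isotope_dilution_moles(n_spike=1.0e-9,
                                   a_ref_spike=0.002, a_spk_spike=0.998,
                                   a_ref_nat=0.9927, a_spk_nat=0.0,
                                   r_measured=5.0)
        print(f"analyte in sample: {n:.3e} mol")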

  15. Antibiotic Resistance in Animal and Environmental Samples Associated with Small-Scale Poultry Farming in Northwestern Ecuador.

    Science.gov (United States)

    Braykov, Nikolay P; Eisenberg, Joseph N S; Grossman, Marissa; Zhang, Lixin; Vasco, Karla; Cevallos, William; Muñoz, Diana; Acevedo, Andrés; Moser, Kara A; Marrs, Carl F; Foxman, Betsy; Trostle, James; Trueba, Gabriel; Levy, Karen

    2016-01-01

    The effects of animal agriculture on the spread of antibiotic resistance (AR) are cross-cutting and thus require a multidisciplinary perspective. Here we use ecological, epidemiological, and ethnographic methods to examine populations of Escherichia coli circulating in the production poultry farming environment versus the domestic environment in rural Ecuador, where small-scale poultry production employing nontherapeutic antibiotics is increasingly common. We sampled 262 "production birds" (commercially raised broiler chickens and laying hens) and 455 "household birds" (raised for domestic use) and household and coop environmental samples from 17 villages between 2010 and 2013. We analyzed data on zones of inhibition from Kirby-Bauer tests, rather than established clinical breakpoints for AR, to distinguish between populations of organisms. We saw significantly higher levels of AR in bacteria from production versus household birds; resistance to amoxicillin-clavulanate, cephalothin, cefotaxime, or gentamicin was found in 52.8% of production bird isolates and 16% of household ones. A strain jointly resistant to the 4 drugs was exclusive to a subset of isolates from production birds (7.6%) and coop surfaces (6.5%) and was associated with a particular purchase site. The prevalence of AR in production birds declined with bird age. This study examined the prevalence of antibiotic resistance (AR) in E. coli isolates from small-scale poultry production environments versus domestic environments in rural Ecuador, where such backyard poultry operations have become established over the past decade. Our previous research in the region suggests that introduction of AR bacteria through travel and commerce may be an important source of AR in villages of this region. This report extends the prior analysis by examining small-scale production chicken farming as a potential source of resistant strains. Our results suggest that AR strains associated with poultry production likely originate from sources outside the study

  16. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    Science.gov (United States)

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence.

  17. Small polaron hopping conduction in samples of ceramic La1.4Sr1.6Mn2O7.06

    International Nuclear Information System (INIS)

    Nakatsugawa, H.; Iguchi, E.; Jung, W.H.; Munakata, F.

    1999-01-01

    The ceramic sample of La1.4Sr1.6Mn2O7.06 exhibits the metal-insulator transition and a negative magnetoresistance in the vicinity of the Curie temperature (T_C ∼ 100 K). The dc magnetic susceptibility between 100 K and 280 K is nearly constant and decreases gradually with increasing temperature above 280 K. The measurements of dc resistivity and the thermoelectric power indicate that small polaron hopping conduction takes place at T > 280 K. The spin ordering due to the two-dimensional d(x²-y²) state occurring at T > 280 K is directly related to the hopping conduction above 280 K, although the spin ordering due to the one-dimensional d(3z²-r²) state takes place at T > T_C. The two-dimensional d(x²-y²) state extending within the MnO2 sheets starts to narrow and leads to the carrier localisation at 280 K. The effective number of holes in this sample estimated from the thermoelectric power is considerably smaller than the nominal value. This indicates that the small polaron hopping conduction takes place predominantly within the in-plane MnO2 sheets. A discussion is given of the experimental results of the ceramic sample of La2/3Ca1/3MnO2.98. Copyright (1999) CSIRO Australia

  18. A semi-nested real-time PCR method to detect low chimerism percentage in small quantity of hematopoietic stem cell transplant DNA samples.

    Science.gov (United States)

    Aloisio, Michelangelo; Bortot, Barbara; Gandin, Ilaria; Severini, Giovanni Maria; Athanasakis, Emmanouil

    2017-02-01

    Chimerism status evaluation of post-allogeneic hematopoietic stem cell transplantation samples is essential to predict post-transplant relapse. The most commonly used technique capable of detecting small increments of chimerism is quantitative real-time PCR. Although this method is already used in several laboratories, previously described protocols often lack sensitivity and the amount of the DNA required for each chimerism analysis is too high. In the present study, we compared a novel semi-nested allele-specific real-time PCR (sNAS-qPCR) protocol with our in-house standard allele-specific real-time PCR (gAS-qPCR) protocol. We selected two genetic markers and analyzed technical parameters (slope, y-intercept, R2, and standard deviation) useful to determine the performances of the two protocols. The sNAS-qPCR protocol showed better sensitivity and precision. Moreover, the sNAS-qPCR protocol requires, as input, only 10 ng of DNA, which is at least 10-fold less than the gAS-qPCR protocols described in the literature. Finally, the proposed sNAS-qPCR protocol could prove very useful for performing chimerism analysis with a small amount of DNA, as in the case of blood cell subsets.
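
    For orientation only, a recipient-chimerism percentage is commonly obtained from allele-specific qPCR data by referring a delta-Ct value to a dilution standard curve. The sketch below illustrates that generic calculation rather than the authors' exact sNAS-qPCR computation; the Ct values and curve parameters are invented.

        # Hedged sketch: chimerism fraction from delta-Ct via a standard curve
        # dCt = slope * log10(fraction) + intercept; illustrative numbers only.
        def percent_from_standard_curve(ct_marker, ct_reference,
                                        slope=-3.32, intercept=0.0):
            d_ct = ct_marker - ct_reference
            return 100.0 * 10 ** ((d_ct - intercept) / slope)

        # marker allele amplifies 6.6 cycles later than the reference gene
        print(f"chimerism: {percent_from_standard_curve(33.2, 26.6):.2f} %")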

  19. Success and failure rates of tumor genotyping techniques in routine pathological samples with non-small-cell lung cancer.

    Science.gov (United States)

    Vanderlaan, Paul A; Yamaguchi, Norihiro; Folch, Erik; Boucher, David H; Kent, Michael S; Gangadharan, Sidharta P; Majid, Adnan; Goldstein, Michael A; Huberman, Mark S; Kocher, Olivier N; Costa, Daniel B

    2014-04-01

    Identification of some somatic molecular alterations in non-small-cell lung cancer (NSCLC) has become evidence-based practice. The success and failure rate of using commercially available tumor genotyping techniques in routine day-to-day NSCLC pathology samples is not well described. We sought to evaluate the success and failure rate of EGFR mutation, KRAS mutation, and ALK FISH in a cohort of lung cancers subjected to routine clinical tumor genotype. Clinicopathologic data, tumor genotype success and failure rates were retrospectively compiled and analyzed from 381 patient-tumor samples. From these 381 patients with lung cancer, the mean age was 65 years, 61.2% were women, 75.9% were white, 27.8% were never smokers, 73.8% had advanced NSCLC and 86.1% had adenocarcinoma histology. The tumor tissue was obtained from surgical specimens in 48.8%, core needle biopsies in 17.9%, and as cell blocks from aspirates or fluid in 33.3% of cases. Anatomic sites for tissue collection included lung (49.3%), lymph nodes (22.3%), pleura (11.8%), bone (6.0%), brain (6.0%), among others. The overall success rate for EGFR mutation analysis was 94.2%, for KRAS mutation 91.6% and for ALK FISH 91.6%. The highest failure rates were observed when the tissue was obtained from image-guided percutaneous transthoracic core-needle biopsies (31.8%, 27.3%, and 35.3% for EGFR, KRAS, and ALK tests, respectively) and bone specimens (23.1%, 15.4%, and 23.1%, respectively). In specimens obtained from bone, the failure rates were significantly higher for biopsies than resection specimens (40% vs. 0%, p=0.024 for EGFR) and for decalcified compared to non-decalcified samples (60% vs. 5.5%, p=0.021 for EGFR). Tumor genotype techniques are feasible in most samples, outside small image-guided percutaneous transthoracic core-needle biopsies and bone samples from core biopsies with decalcification, and therefore expansion of routine tumor genotype into the care of patients with NSCLC may not require special

  20. Investigation of the Effect of Small Hardening Spots Created on the Sample Surface by Laser Complex with Solid-State Laser

    Science.gov (United States)

    Nozdrina, O.; Zykov, I.; Melnikov, A.; Tsipilev, V.; Turanov, S.

    2018-03-01

    This paper describes the results of an investigation of the effect of small hardening spots (about 1 mm) created on the surface of a sample by a laser complex with a solid-state laser. The melted area of the steel sample does not exceed 5%. The change in steel microhardness in the region subjected to laser treatment is studied, and the dependence of sample deformation on tension is presented. Changes in the yield plateau and in the plastic properties were detected. Flow lines were tracked in a series of speckle photographs, showing how a millimetre-scale surface inhomogeneity can influence the deformation and strength properties of steel.

  1. Sampling versus systematic full lymphatic dissection in surgical treatment of non-small cell lung cancer.

    Science.gov (United States)

    Koulaxouzidis, Georgios; Karagkiouzis, Grigorios; Konstantinou, Marios; Gkiozos, Ioannis; Syrigos, Konstantinos

    2013-04-22

    The extent of mediastinal lymph node assessment during surgery for non-small cell cancer remains controversial. Different techniques are used, ranging from simple visual inspection of the unopened mediastinum to an extended bilateral lymph node dissection. Furthermore, different terms are used to define these techniques. Sampling is the removal of one or more lymph nodes under the guidance of pre-operative findings. Systematic (full) nodal dissection is the removal of all mediastinal tissue containing the lymph nodes systematically within anatomical landmarks. A Medline search was conducted to identify articles in the English language that addressed the role of mediastinal lymph node resection in the treatment of non-small cell lung cancer. Opinions as to the reasons for favoring full lymphatic dissection include complete resection, improved nodal staging and better local control due to resection of undetected micrometastasis. Arguments against routine full lymphatic dissection are increased morbidity, increase in operative time, and lack of evidence of improved survival. For complete resection of non-small cell lung cancer, many authors recommend a systematic nodal dissection as the standard approach during surgery, and suggest that this provides both adequate nodal staging and guarantees complete resection. Whether extending the lymph node dissection influences survival or recurrence rate is still not known. There are valid arguments in favor in terms not only of an improved local control but also of an improved long-term survival. However, the impact of lymph node dissection on long-term survival should be further assessed by large-scale multicenter randomized trials.

  2. Determination of 35S-aminoacyl-transfer ribonucleic acid specific radioactivity in small tissue samples

    International Nuclear Information System (INIS)

    Samarel, A.M.; Ogunro, E.A.; Ferguson, A.G.; Lesch, M.

    1981-01-01

    Rate determination of protein synthesis utilizing tracer amino acid incorporation requires accurate assessment of the specific radioactivity of the labeled precursor aminoacyl-tRNA pool. Previously published methods presumably useful for the measurement of any aminoacyl-tRNA were unsuccessful when applied to [35S]methionine, due to the unique chemical properties of this amino acid. Herein we describe modifications of these methods necessary for the measurement of 35S-aminoacyl-tRNA specific radioactivity from small tissue samples incubated in the presence of [35S]methionine. The use of [35S]methionine of high specific radioactivity enables analysis of the methionyl-tRNA from less than 100 mg of tissue. Conditions for optimal recovery of 35S-labeled dansyl-amino acid derivatives are presented and possible applications of this method are discussed.

  3. 14CO2 analysis of soil gas: Evaluation of sample size limits and sampling devices

    Science.gov (United States)

    Wotte, Anja; Wischhöfer, Philipp; Wacker, Lukas; Rethemeyer, Janet

    2017-12-01

    Radiocarbon (14C) analysis of CO2 respired from soils or sediments is a valuable tool to identify different carbon sources. The collection and processing of the CO2, however, is challenging and prone to contamination. We thus continuously improve our handling procedures and present a refined method for the collection of even small amounts of CO2 in molecular sieve cartridges (MSCs) for accelerator mass spectrometry 14C analysis. Using a modified vacuum rig and an improved desorption procedure, we were able to increase the CO2 recovery from the MSC (95%) as well as the sample throughput compared to our previous study. By processing series of samples of different sizes, we show that our MSCs can be used for CO2 samples as small as 50 μg C. The contamination by exogenous carbon determined in these laboratory tests was less than 2.0 μg C from fossil and less than 3.0 μg C from modern sources. Additionally, we tested two sampling devices for the collection of CO2 samples released from soils or sediments, including a respiration chamber and a depth sampler, which are connected to the MSC. We obtained a very promising, low process blank for the entire CO2 sampling and purification procedure of ∼0.004 F14C (equal to 44,000 yrs BP) and ∼0.003 F14C (equal to 47,000 yrs BP). In contrast to previous studies, we observed no isotopic fractionation towards lighter δ13C values during the passive sampling with the depth samplers.
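
    As a quick cross-check of the quoted process blanks, the conventional Libby relation age = -8033 * ln(F14C) reproduces the ~44,000 and ~47,000 yr BP figures:

        # Converting the quoted F14C process blanks to conventional 14C ages.
        import math

        def f14c_to_age_bp(f14c):
            return -8033.0 * math.log(f14c)

        for f in (0.004, 0.003):
            print(f"F14C = {f:.3f}  ->  {f14c_to_age_bp(f):,.0f} yr BP")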

  4. A yigP mutant strain is a small colony variant of E. coli and shows pleiotropic antibiotic resistance.

    Science.gov (United States)

    Xia, Hui; Tang, Qiongwei; Song, Jie; Ye, Jiang; Wu, Haizhen; Zhang, Huizhan

    2017-12-01

    Small colony variants (SCVs) are a commonly observed subpopulation of bacteria that have a small colony size and distinctive biochemical characteristics. SCVs are more resistant than the wild type to some antibiotics and usually cause persistent infections in the clinic. SCV studies have been very active during the past 2 decades, especially Staphylococcus aureus SCVs. However, fewer studies on Escherichia coli SCVs exist, so we studied an E. coli SCV during an experiment involving the deletion of the yigP locus. PCR and DNA sequencing revealed that the SCV was attributable to a defect in the yigP function. Furthermore, we investigated the antibiotic resistance profile of the E. coli SCV and it showed increased erythromycin, kanamycin, and d-cycloserine resistance, but collateral sensitivity to ampicillin, polymyxin, chloramphenicol, tetracycline, rifampin, and nalidixic acid. We tried to determine the association between yigP and the pleiotropic antibiotic resistance of the SCV by analyzing biofilm formation, cellular morphology, and coenzyme Q (Q8) production. Our results indicated that impaired Q8 biosynthesis was the primary factor that contributed to the increased resistance and collateral sensitivity of the SCV. This study offers a novel genetic basis for E. coli SCVs and an insight into the development of alternative antimicrobial strategies for clinical therapy.

  5. Small GSK-3 Inhibitor Shows Efficacy in a Motor Neuron Disease Murine Model Modulating Autophagy.

    Directory of Open Access Journals (Sweden)

    Estefanía de Munck

    Full Text Available Amyotrophic lateral sclerosis (ALS) is a progressive motor neuron degenerative disease that has no effective treatment to date. Drug discovery has been hampered by the lack of knowledge of its molecular etiology together with the limited animal models available for research. Recently, a motor neuron disease animal model has been developed using β-N-methylamino-L-alanine (L-BMAA), a neurotoxic amino acid linked to the appearance of ALS. In the present work, the neuroprotective role of VP2.51, a small heterocyclic GSK-3 inhibitor, is analysed in this novel murine model together with an analysis of autophagy. Daily administration of VP2.51 for two weeks, starting the first day after L-BMAA treatment, leads to total recovery of neurological symptoms and prevents the activation of autophagic processes in rats. These results show that the L-BMAA murine model can be used to test the efficacy of new drugs. In addition, the results confirm the therapeutic potential of GSK-3 inhibitors, and specially VP2.51, for future disease-modifying treatment of motor neuron disorders like ALS.

  6. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    Science.gov (United States)

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.
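
    As an illustration of the kind of power calculation underlying such a survey, the sketch below computes the power of a two-sample t test from a sample effect size (Cohen's d) and the group sizes via the noncentral t distribution. The effect sizes and sample sizes are arbitrary examples, not figures from the journal survey.

        # Hedged sketch: post-hoc power of a two-sample t test for a given Cohen's d.
        from scipy import stats

        def t_test_power(d, n1, n2, alpha=0.05):
            df = n1 + n2 - 2
            ncp = d * (n1 * n2 / (n1 + n2)) ** 0.5      # noncentrality parameter
            t_crit = stats.t.ppf(1 - alpha / 2, df)
            # P(|T| > t_crit) under the noncentral t with parameter ncp
            return stats.nct.sf(t_crit, df, ncp) + stats.nct.cdf(-t_crit, df, ncp)

        # large effect with small samples vs. small effect with large samples
        print(round(t_test_power(d=0.8, n1=15, n2=15), 2))
        print(round(t_test_power(d=0.2, n1=200, n2=200), 2))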

  7. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    International Nuclear Information System (INIS)

    Echard, B.; Gayton, N.; Lemaire, M.; Relun, N.

    2013-01-01

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is fortunately designed with codified rules leading to a large safety margin which means that failure is a small probability event. Such a probability level is difficult to assess efficiently. Second, the structure mechanical behaviour is modelled numerically in an attempt to reproduce the real response and numerical model tends to be more and more time-demanding as its complexity is increased to improve accuracy and to consider particular mechanical behaviour. As a consequence, performing a large number of model computations cannot be considered in order to assess the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS for active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145–54]. It associates the Kriging metamodel and its advantageous stochastic property with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only a very few mechanical model computations. The efficiency of the method is, first, proved on two academic applications. It is then conducted for assessing the reliability of a challenging aerospace case study submitted to fatigue.
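
    The Importance Sampling building block of AK-IS can be illustrated with a toy limit state. The sketch below estimates a small failure probability by sampling around the FORM design point and reweighting, with an analytic g(u) standing in for the time-demanding mechanical model and no Kriging metamodel involved.

        # Hedged sketch of Importance Sampling centred at the FORM design point.
        import math
        import numpy as np

        rng = np.random.default_rng(1)

        def g(u):                                   # toy limit state: failure when g(u) <= 0
            return 5.5 - u.sum(axis=1) / np.sqrt(u.shape[1])

        n_dim, n_samples = 2, 10_000
        u_star = np.full(n_dim, 5.5 / np.sqrt(n_dim))   # FORM design point of this toy g

        # sample from a standard normal density centred at the design point
        u = rng.standard_normal((n_samples, n_dim)) + u_star

        # importance weights: standard normal density / shifted sampling density
        log_w = -0.5 * (u ** 2).sum(axis=1) + 0.5 * ((u - u_star) ** 2).sum(axis=1)
        wi = np.exp(log_w) * (g(u) <= 0)

        pf_hat = wi.mean()
        cov = wi.std(ddof=1) / (pf_hat * np.sqrt(n_samples))   # coefficient of variation
        pf_exact = 0.5 * math.erfc(5.5 / math.sqrt(2))         # P(Z > 5.5) for this toy case
        print(f"P_f ~ {pf_hat:.2e} (c.o.v. {cov:.2f}); exact {pf_exact:.2e}")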

  8. Continuous-flow liquid microjunction surface sampling probe connected on-line with high-performance liquid chromatography/mass spectrometry for spatially resolved analysis of small molecules and proteins.

    Science.gov (United States)

    Van Berkel, Gary J; Kertesz, Vilmos

    2013-06-30

    A continuous-flow liquid microjunction surface sampling probe extracts soluble material from surfaces for direct ionization and detection by mass spectrometry. Demonstrated here is the on-line coupling of such a probe with high-performance liquid chromatography/mass spectrometry (HPLC/MS) enabling extraction, separation and detection of small molecules and proteins from surfaces in a spatially resolved (~0.5 mm diameter spots) manner. A continuous-flow liquid microjunction surface sampling probe was connected to a six-port, two-position valve for extract collection and injection to an HPLC column. A QTRAP® 5500 hybrid triple quadrupole linear ion trap equipped with a Turbo V™ ion source operated in positive electrospray ionization (ESI) mode was used for all experiments. The system operation was tested with the extraction, separation and detection of propranolol and associated metabolites from drug dosed tissues, caffeine from a coffee bean, cocaine from paper currency, and proteins from dried sheep blood spots on paper. Confirmed in the tissue were the parent drug and two different hydroxypropranolol glucuronides. The mass spectrometric response for these compounds from different locations in the liver showed an increase with increasing extraction time (5, 20 and 40 s). For on-line separation and detection/identification of extracted proteins from dried sheep blood spots, two major protein peaks dominated the chromatogram and could be correlated with the expected masses for the hemoglobin α and β chains. Spatially resolved sampling, separation, and detection of small molecules and proteins from surfaces can be accomplished using a continuous-flow liquid microjunction surface sampling probe coupled on-line with HPLC/MS detection. Published in 2013. This article is a U.S. Government work and is in the public domain in the USA.

  9. Preparation of highly multiplexed small RNA sequencing libraries.

    Science.gov (United States)

    Persson, Helena; Søkilde, Rolf; Pirona, Anna Chiara; Rovira, Carlos

    2017-08-01

    MicroRNAs (miRNAs) are ~22-nucleotide-long small non-coding RNAs that regulate the expression of protein-coding genes by base pairing to partially complementary target sites, preferentially located in the 3´ untranslated region (UTR) of target mRNAs. The expression and function of miRNAs have been extensively studied in human disease, as well as the possibility of using these molecules as biomarkers for prognostication and treatment guidance. To identify and validate miRNAs as biomarkers, their expression must be screened in large collections of patient samples. Here, we develop a scalable protocol for the rapid and economical preparation of a large number of small RNA sequencing libraries using dual indexing for multiplexing. Combined with the use of off-the-shelf reagents, more samples can be sequenced simultaneously on large-scale sequencing platforms at a considerably lower cost per sample. Sample preparation is simplified by pooling libraries prior to gel purification, which allows for the selection of a narrow size range while minimizing sample variation. A comparison with publicly available data from benchmarking of miRNA analysis platforms showed that this method captures absolute and differential expression as effectively as commercially available alternatives.

  10. DOE-2 sample run book: Version 2.1E

    Energy Technology Data Exchange (ETDEWEB)

    Winkelmann, F.C.; Birdsall, B.E.; Buhl, W.F.; Ellington, K.L.; Erdem, A.E. [Lawrence Berkeley Lab., CA (United States); Hirsch, J.J.; Gates, S. [Hirsch (James J.) and Associates, Camarillo, CA (United States)

    1993-11-01

    The DOE-2 Sample Run Book shows inputs and outputs for a variety of building and system types. The samples start with a simple structure and continue to a high-rise office building, a medical building, three small office buildings, a bar/lounge, a single-family residence, a small office building with daylighting, a single-family residence with an attached sunspace, a "parameterized" building using input macros, and a metric input/output example. All of the samples use Chicago TRY weather. The main purpose of the Sample Run Book is instructional. It shows the relationship of LOADS-SYSTEMS-PLANT-ECONOMICS inputs, displays various input styles, and illustrates many of the basic and advanced features of the program. Many of the sample runs are preceded by a sketch of the building showing its general appearance and the zoning used in the input. In some cases we also show a 3-D rendering of the building as produced by the program DrawBDL. Descriptive material has been added as comments in the input itself. We find that a number of users have loaded these samples onto their editing systems and use them as "templates" for creating new inputs. Another way of using them would be to store various portions as files that can be read into the input using the ##include command, which is part of the Input Macro feature introduced in version DOE-2.1D. Note that the energy rate structures here are the same as in the DOE-2.1D samples, but have been rewritten using the new DOE-2.1E commands and keywords for ECONOMICS. The samples contained in this report are the same as those found on the DOE-2 release files. However, the output numbers that appear here may differ slightly from those obtained from the release files. The output on the release files can be used as a check set to compare results on your computer.

  11. Hes1-deficient mice show precocious differentiation of Paneth cells in the small intestine

    International Nuclear Information System (INIS)

    Suzuki, Katsumasa; Fukui, Hirokazu; Kayahara, Takahisa; Sawada, Mitsutaka; Seno, Hiroshi; Hiai, Hiroshi; Kageyama, Ryoichiro; Okano, Hideyuki; Chiba, Tsutomu

    2005-01-01

    We have previously shown that Hes1 is expressed both in putative epithelial stem cells just above Paneth cells and in the crypt base columnar cells between Paneth cells, while Hes1 is completely absent in Paneth cells. This study was undertaken to clarify the role of Hes1 in Paneth cell differentiation, using Hes1-knockout (KO) newborn (P0) mice. Electron microscopy revealed premature appearance of distinct cells containing cytoplasmic granules in the intervillous region in Hes1-KO P0 mice, whereas those cells were absent in wild-type (WT) P0 mice. In Hes1-KO P0 mice, the gene expressions of cryptdins, exclusively present in Paneth cells, were all enhanced compared with WT P0 mice. Immunohistochemistry demonstrated increased number of both lysozyme-positive and cryptdin-4-positive cells in the small intestinal epithelium of Hes1-KO P0 mice as compared to WT P0 mice. Thus, Hes1 appears to have an inhibitory role in Paneth cell differentiation in the small intestine

  12. A high-efficiency neutron coincidence counter for small samples

    International Nuclear Information System (INIS)

    Miller, M.C.; Menlove, H.O.; Russo, P.A.

    1991-01-01

    The inventory sample coincidence counter (INVS) has been modified to enhance its performance. The new design is suitable for use with a glove box sample-well (in-line application) as well as for use in the standard at-line mode. The counter has been redesigned to count more efficiently and be less sensitive to variations in sample position. These factors lead to a higher degree of precision and accuracy in a given counting period and allow for the practical use of the INVS counter with gamma-ray isotopics to obtain a plutonium assay independent of operator declarations and time-consuming chemical analysis. A calculation study was performed using the Los Alamos transport code MCNP to optimize the design parameters. 5 refs., 7 figs., 8 tabs

  13. Interstitial water studies on small core samples, Deep Sea Drilling Project, Leg 5

    Science.gov (United States)

    Manheim, F. T.; Chan, K.M.; Sayles, F.L.

    1970-01-01

    Leg 5 samples fall into two categories with respect to interstitial water composition: 1) rapidly deposited terrigenous or appreciably terrigenous deposits, such as in Hole 35 (western Escanaba trough, off Cape Mendocino, California); and, 2) slowly deposited pelagic clays and biogenic muds and oozes. Interstitial waters in the former show modest to slight variations in chloride and sodium, but drastic changes in non-conservative ions such as magnesium and sulfate. The pelagic deposits show only relatively minor changes in both conservative and non-conservative pore fluid constituents. As was pointed out in earlier Leg Reports, it is believed that much of the variation in chloride in pore fluids within individual holes is attributable to the manipulation of samples on board ship and in the laboratory. On the other hand, the scatter in sodium is due in part to analytical error (on the order of 2 to 3 per cent, in terms of a standard deviation), and it probably accounts for most of the discrepancies in total anion and cation balance. All constituents reported here, with the exception of bulk water content, were analyzed on water samples which were sealed in plastic tubes aboard ship and were subsequently opened and divided into weighed aliquots in the laboratory. Analytical methods follow the atomic absorption, wet chemical and emission spectrochemical techniques briefly summarized in previous reports, e.g. Manheim et al., 1969, and Chan and Manheim, 1970. The authors acknowledge assistance from W. Sunda, D. Kerr, C. Lawson and H. Richards, and thank D. Spencer, P. Brewer and E. Degens for allowing the use of equipment and laboratory facilities.

  14. Temporal variation of phytoplankton in a small tropical crater lake, Costa Rica

    Directory of Open Access Journals (Sweden)

    Gerardo Umaña-Villalobos

    2010-12-01

    Full Text Available The temporal variation in a lake's phytoplankton is important for understanding its general biodiversity. For tropical lakes, it has been hypothesized that they follow a similar pattern to temperate ones, on a much accelerated pace; nevertheless, few case studies have tried to elucidate this. Most studies in Costa Rica have used a monthly sampling scheme and failed to show the expected changes. In this study, the phytoplankton of the small Barva crater lake was followed for more than three years, first with monthly and later with weekly samplings that covered almost two years. Additional information on temperature and oxygen vertical profiles was obtained on a monthly basis, and surface temperature was measured during weekly samplings around noon. Results showed that in spite of its shallow condition (max. depth 7 m) and low surface temperature (11 to 19°C), the lake stratifies at least for brief periods. The phytoplankton showed both rapid-change periods and prolonged ones of relative stasis. The plankton composition fluctuated between three main phases: one characterized by the abundance of small-sized desmids (Staurastrum paradoxum, Cosmarium asphaerosporum), a second phase dominated by equally small cryptomonads (Chryptochrysis minor, Chroomonas sp.), and a third phase dominated by the green alga Eutetramorus tetrasporus. Although the data showed that monthly sampling could miss short-term events, the temporal variation did not follow the typical dry and rainy seasons of the region, or any particular annual pattern. Year-to-year variation was high. As this small lake is located at the summit of Barva Volcano and receives the influence of both Caribbean and Pacific weather, seasonality at the lake is not as clearly defined as in the rest of the country, and short-term variations in the local weather might have a stronger effect than broad seasonal trends. The occurrence of these short-term changes in the phytoplankton of small tropical

  15. Split Hopkinson Resonant Bar Test for Sonic-Frequency Acoustic Velocity and Attenuation Measurements of Small, Isotropic Geologic Samples

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, S.

    2011-04-01

    Mechanical properties (seismic velocities and attenuation) of geological materials are often frequency dependent, which necessitates measurements of the properties at frequencies relevant to a problem at hand. Conventional acoustic resonant bar tests allow measuring seismic properties of rocks and sediments at sonic frequencies (several kilohertz) that are close to the frequencies employed for geophysical exploration of oil and gas resources. However, the tests require a long, slender sample, which is often difficult to obtain from the deep subsurface or from weak and fractured geological formations. In this paper, an alternative measurement technique to conventional resonant bar tests is presented. This technique uses only a small, jacketed rock or sediment core sample mediating a pair of long, metal extension bars with attached seismic source and receiver - the same geometry as the split Hopkinson pressure bar test for large-strain, dynamic impact experiments. Because of the length and mass added to the sample, the resonance frequency of the entire system can be lowered significantly, compared to the sample alone. The experiment can be conducted under elevated confining pressures up to tens of MPa and temperatures above 100 C, and concurrently with x-ray CT imaging. The described Split Hopkinson Resonant Bar (SHRB) test is applied in two steps. First, extension and torsion-mode resonance frequencies and attenuation of the entire system are measured. Next, numerical inversions for the complex Young's and shear moduli of the sample are performed. One particularly important step is the correction of the inverted Young's moduli for the effect of sample-rod interfaces. Examples of the application are given for homogeneous, isotropic polymer samples and a natural rock sample.
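
    For orientation, the sketch below applies the textbook free-free resonant-bar relations (bar velocity 2·L·f, modulus rho·v²) that the SHRB inversion generalises to the composite bar-sample-bar geometry. The frequencies, length and density used here are illustrative, not the paper's data.

        # Hedged sketch of conventional resonant-bar modulus estimates.
        def youngs_modulus(f_ext_hz, length_m, density_kg_m3):
            v = 2.0 * length_m * f_ext_hz          # fundamental extensional velocity
            return density_kg_m3 * v ** 2

        def shear_modulus(f_tor_hz, length_m, density_kg_m3):
            v = 2.0 * length_m * f_tor_hz          # fundamental torsional (shear) velocity
            return density_kg_m3 * v ** 2

        E = youngs_modulus(f_ext_hz=4000.0, length_m=0.3, density_kg_m3=2300.0)
        G = shear_modulus(f_tor_hz=2500.0, length_m=0.3, density_kg_m3=2300.0)
        print(f"E ~ {E/1e9:.1f} GPa, G ~ {G/1e9:.1f} GPa")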

  16. Small-scale impacts as potential trigger for landslides on small Solar system bodies

    Science.gov (United States)

    Hofmann, Marc; Sierks, Holger; Blum, Jürgen

    2017-07-01

    We conducted a set of experiments to investigate whether millimetre-sized impactors impinging on a granular material at several m s^-1 are able to trigger avalanches on small, atmosphereless planetary bodies. These experiments were carried out at the Zentrum für angewandte Raumfahrttechnologie und Mikrogravitation (ZARM) drop tower facility in Bremen, Germany to facilitate a reduced gravity environment. Additional data were gathered at Earth gravity levels in the laboratory. As sample materials we used a ground Howardites, Eucrites and Diogenites (HED) meteorite and the Johnson Space Center (JSC) Mars-1 Martian soil simulant. We found that this type of small-scale impact can trigger avalanches with a moderate probability, if the target material is tilted to an angle close to the angle of repose. We additionally simulated a small-scale impact using the discrete element method code esys-particle. These simulations show that energy transfer from impactor to the target material is most efficient at low and moderate impactor inclinations and the transferred energy is retained in particles close to the surface due to a rapid dissipation of energy in lower material layers driven by inelastic collisions. Through Monte Carlo simulations we estimate the time-scale on which small-scale impacts with the observed characteristics will trigger avalanches covering all steep slopes on the surface of a small planetary body to be of the order of 10^5 yr.

  17. The effect of albedo neutrons on the neutron multiplication of small plutonium oxide samples in a PNCC chamber

    CERN Document Server

    Bourva, L C A; Weaver, D R

    2002-01-01

    This paper describes how to evaluate the effect of neutrons reflected from parts of a passive neutron coincidence chamber on the neutron leakage self-multiplication, M_L, of a fissile sample. It is shown that albedo neutrons contribute, in the case of small plutonium bearing samples, to a significant part of M_L, and that their effect has to be taken into account in the relationship between the measured coincidence count rates and the 240Pu effective mass of the sample. A simple one-interaction model has been used to write the balance of neutron gains and losses in the material when exposed to the re-entrant neutron flux. The energy and intensity profiles of the re-entrant flux have been parameterised using Monte Carlo MCNP(TM) calculations. This technique has been implemented for the On Site Laboratory neutron/gamma counter within the existing MEPL 1.0 code for the determination of the neutron leakage self-multiplication. Benchmark tests of the resulting MEPL 2.0 code with MC...

  18. Application of inductively coupled plasma mass spectrometry for multielement analysis in small sample amounts of thyroid tissue from Chernobyl area

    International Nuclear Information System (INIS)

    Becker, J.S.; Dietze, H.J.; Boulyga, S.F.; Bazhanova, N.N.; Kanash, N.V.; Malenchenko, A.F.

    2000-01-01

    As a result of the Chernobyl nuclear power plant accident in 1986, thyroid pathologies occurred among children in some regions of Belarus. Besides the irradiation of children's thyroids by radioactive iodine and caesium nuclides, toxic elements from fallout are a direct risk to health. Inductively coupled plasma quadrupole-based mass spectrometry (ICP-MS) and instrumental neutron activation analysis (INAA) were used for multielement determination in small amounts (1-10 mg) of human thyroid tissue samples. The accuracy of the applied analytical technique for small biological sample amounts was checked using NIST standard reference material oyster tissue (SRM 1566b). Almost all essential elements as well as a number of toxic elements such as Cd, Pb, Hg, U, etc. were determined in a multitude of human thyroid tissues by quadrupole-based ICP-MS using micro nebulization. In general, the thyroid tissue affected by pathology is characterized by higher calcium content. Some other elements, among them Sr, Zn, Fe, Mn, V, As, Cr, Ni, Pb, U, Ba and Sb, were also accumulated in such tissue. The results obtained will be used as initial material for further specific studies of the role of particular elements in thyroid pathology development.

  19. Sample types applied for molecular diagnosis of therapeutic management of advanced non-small cell lung cancer in the precision medicine.

    Science.gov (United States)

    Han, Yanxi; Li, Jinming

    2017-10-26

    In this era of precision medicine, molecular biology is becoming increasingly significant for the diagnosis and therapeutic management of non-small cell lung cancer. The specimen as the primary element of the whole testing flow is particularly important for maintaining the accuracy of gene alteration testing. Presently, the main sample types applied in routine diagnosis are tissue and cytology biopsies. Liquid biopsies are considered as the most promising alternatives when tissue and cytology samples are not available. Each sample type possesses its own strengths and weaknesses, pertaining to the disparity of sampling, preparation and preservation procedures, the heterogeneity of inter- or intratumors, the tumor cellularity (percentage and number of tumor cells) of specimens, etc., and none of them can individually be a "one size to fit all". Therefore, in this review, we summarized the strengths and weaknesses of different sample types that are widely used in clinical practice, offered solutions to reduce the negative impact of the samples and proposed an optimized strategy for choice of samples during the entire diagnostic course. We hope to provide valuable information to laboratories for choosing optimal clinical specimens to achieve comprehensive functional genomic landscapes and formulate individually tailored treatment plans for NSCLC patients that are in advanced stages.

  20. Desmoplastic small round cell tumour: Cytological and immunocytochemical features

    Directory of Open Access Journals (Sweden)

    Filho Adhemar

    2005-01-01

    Full Text Available Abstract Background Desmoplastic small round cell tumor (DSRCT) is a rare and highly aggressive neoplasm. The cytological diagnosis of these tumors can be difficult because they show morphological features quite similar to other small round blue cell tumors. We describe four cases of DSRCT with cytological sampling: one obtained by fine needle aspiration biopsy (FNAB) and three from serous effusions. The corresponding immunocytochemical panel was also reviewed. Methods Papanicolaou-stained samples from FNAB and effusions were morphologically described. Immunoreaction with WT1 antibody was performed in all cytological samples. An immunohistochemical panel including the following antibodies was performed on the corresponding biopsies: 34BE12, AE1/AE3, Chromogranin A, CK20, CK7, CK8, Desmin, EMA, NSE, Vimentin and WT1. Results The smears showed high cellularity with minor size alteration. Nuclei were round to oval, some of them with inconspicuous nucleoli. Tumor cells were clustered, showing a rosette-like feature. Tumor cells in effusions and FNA were positive for WT1 in 3 of 4 cytology specimens (2 out of 3 effusions and one FNA). Immunohistochemical reactions for vimentin, NSE, AE1/AE3 and WT1 were positive in all cases in tissue sections. Conclusion The use of an adjunct immunocytochemical panel coupled with the cytomorphological characteristics allows the diagnosis of DSRCT in cytological specimens.

  1. Sample preparation strategies for food and biological samples prior to nanoparticle detection and imaging

    DEFF Research Database (Denmark)

    Larsen, Erik Huusfeldt; Löschner, Katrin

    2014-01-01

    microscopy (TEM) proved to be necessary for trouble shooting of results obtained from AFFF-LS-ICP-MS. Aqueous and enzymatic extraction strategies were tested for thorough sample preparation aiming at degrading the sample matrix and to liberate the AgNPs from chicken meat into liquid suspension. The resulting...... AFFF-ICP-MS fractograms, which corresponded to the enzymatic digests, showed a major nano-peak (about 80 % recovery of AgNPs spiked to the meat) plus new smaller peaks that eluted close to the void volume of the fractograms. Small, but significant shifts in retention time of AFFF peaks were observed...... for the meat sample extracts and the corresponding neat AgNP suspension, and rendered sizing by way of calibration with AgNPs as sizing standards inaccurate. In order to gain further insight into the sizes of the separated AgNPs, or their possible dissolved state, fractions of the AFFF eluate were collected...

  2. Gender Segregation in Small Firms

    OpenAIRE

    Kenneth R Troske; William J Carrington

    1992-01-01

    This paper studies interfirm gender segregation in a unique sample of small employers. We focus on small firms because previous research on interfirm segregation has studied only large firms and because it is easier to link the demographic characteristics of employers and employees in small firms. This latter feature permits an assessment of the role of employer discrimination in creating gender segregation. Our first finding is that interfirm segregation is prevalent among small employers. I...

  3. Determination of sampling constants in NBS geochemical standard reference materials

    International Nuclear Information System (INIS)

    Filby, R.H.; Bragg, A.E.; Grimm, C.A.

    1986-01-01

    Recently Filby et al. showed that, for several elements, National Bureau of Standards (NBS) Fly Ash standard reference material (SRM) 1633a was a suitable reference material for microanalysis (sample weights of about 2 mg). The sampling constant K_s, defined in terms of the relative subsampling standard deviation, S_s(%), and the mean sample weight, W̄, as K_s = (S_s%)²·W̄, could not be determined from these data because it was not possible to quantitate other sources of error in the experimental variances. K_s values for certified elements in geochemical SRMs provide important homogeneity information for microanalysis. For mineralogically homogeneous SRMs (i.e., small K_s values for associated elements) such as the proposed clays, it is necessary to determine K_s by analysis of very small sample aliquots to maximize the subsampling variance relative to other sources of error. This source of error and the blank correction for the sample container can be eliminated by determining K_s from radionuclide activities of weighed subsamples of a preirradiated SRM.
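
    A short numerical illustration of how such sampling constants are used follows. K_s is treated here in the Ingamells sense as the subsample mass giving a 1% relative subsampling standard deviation, so S_s% = sqrt(K_s / w) for a subsample of mass w; the numbers are invented rather than certified values.

        # Hedged sketch of the sampling-constant relation K_s = (S_s%)^2 * W_bar.
        import math

        def sampling_constant(rel_sd_percent, mean_weight_g):
            """K_s = (S_s%)^2 * W_bar."""
            return rel_sd_percent ** 2 * mean_weight_g

        def expected_rel_sd(k_s, weight_g):
            """Relative subsampling SD (%) expected for a subsample of given mass."""
            return math.sqrt(k_s / weight_g)

        k_s = sampling_constant(rel_sd_percent=5.0, mean_weight_g=0.002)  # 5% SD at 2 mg
        for w in (0.0005, 0.002, 0.010):
            print(f"{w*1000:5.1f} mg subsample -> {expected_rel_sd(k_s, w):4.1f} % subsampling SD")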

  4. Small sample approach, and statistical and epidemiological aspects

    NARCIS (Netherlands)

    Offringa, Martin; van der Lee, Hanneke

    2011-01-01

    In this chapter, the design of pharmacokinetic studies and phase III trials in children is discussed. Classical approaches and relatively novel approaches, which may be more useful in the context of drug research in children, are discussed. The burden of repeated blood sampling in pediatric

  5. Small-angle X-ray scattering on growth of AgCl crystallites in photochromic glasses

    International Nuclear Information System (INIS)

    Takatohi, U.E.; Bittencourt, D.R.S.; Watanabe, S.

    1997-01-01

    Reversible changes in the optical properties of photochromic glasses are observed owing to the presence of small silver halide crystals inside the glassy matrix. These crystals grow during the glass heat-treatment processing. Samples with molar composition of 40SiO2·10Al2O3·16.1K2O·33.9B2O3, doped with AgCl and CuO, were produced and submitted to different heat treatments: (i) for 0.5 h at temperatures from 753 to 893 K and (ii) at 873 K for periods of time from 0.25 to 1.25 h. Small-angle X-ray scattering (SAXS) was used to characterize the samples. The samples heat treated between 843 and 893 K presented an increasing growth rate of the Guinier radius (R_g). Samples heat treated at a fixed temperature of 873 K and different times t showed a law R_g³ = kt + c. Variations in the optical absorbance at 280 nm and the additional absorbance spectra of samples exposed to light showed correlation with the SAXS results. (orig.)
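
    The quoted R_g³ = kt + c coarsening law amounts to a linear fit of the cubed Guinier radius against heat-treatment time; the sketch below shows that fit on made-up (t, R_g) pairs rather than the measured SAXS radii.

        # Hedged illustration: least-squares fit of R_g^3 = k*t + c to invented data.
        import numpy as np

        t = np.array([0.25, 0.50, 0.75, 1.00, 1.25])        # heat-treatment time, h
        rg = np.array([2.1, 2.6, 2.95, 3.25, 3.5])           # Guinier radius, nm (illustrative)

        k, c = np.polyfit(t, rg ** 3, 1)                     # linear fit of R_g^3 vs t
        print(f"k = {k:.1f} nm^3/h, c = {c:.1f} nm^3")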

  6. A new CF-IRMS system for quantifying stable isotopes of carbon monoxide from ice cores and small air samples

    Directory of Open Access Journals (Sweden)

    Z. Wang

    2010-10-01

    Full Text Available We present a new analysis technique for stable isotope ratios (δ13C and δ18O) of atmospheric carbon monoxide (CO) from ice core samples. The technique is an online cryogenic vacuum extraction followed by continuous-flow isotope ratio mass spectrometry (CF-IRMS); it can also be used with small air samples. The CO extraction system includes two multi-loop cryogenic cleanup traps, a chemical oxidant for oxidation to CO2, a cryogenic collection trap, a cryofocusing unit, gas chromatography purification, and subsequent injection into a Finnigan Delta Plus IRMS. Analytical precision of 0.2‰ (±1σ) for δ13C and 0.6‰ (±1σ) for δ18O can be obtained for 100 mL (STP) air samples with CO mixing ratios ranging from 60 ppbv to 140 ppbv (~268-625 pmol CO). Six South Pole ice core samples from depths ranging from 133 m to 177 m were processed for CO isotope analysis after wet extraction. To our knowledge, this is the first measurement of stable isotopes of CO in ice core air.

  7. Assessment of real-time PCR method for detection of EGFR mutation using both supernatant and cell pellet of malignant pleural effusion samples from non-small-cell lung cancer patients.

    Science.gov (United States)

    Shin, Saeam; Kim, Juwon; Kim, Yoonjung; Cho, Sun-Mi; Lee, Kyung-A

    2017-10-26

    EGFR mutation is an emerging biomarker for treatment selection in non-small-cell lung cancer (NSCLC) patients. However, optimal mutation detection is hindered by complications associated with the biopsy procedure, tumor heterogeneity and the limited sensitivity of test methodology. In this study, we evaluated the diagnostic utility of real-time PCR using malignant pleural effusion samples. A total of 77 pleural fluid samples from 77 NSCLC patients were tested using the cobas EGFR mutation test (Roche Molecular Systems). Pleural fluid was centrifuged, and separated cell pellets and supernatants were tested in parallel. Results were compared with Sanger sequencing and/or peptide nucleic acid (PNA)-mediated PCR clamping of matched tumor tissue or pleural fluid samples. All samples showed valid real-time PCR results in one or more DNA samples extracted from cell pellets and supernatants. Compared with other molecular methods, the sensitivity of the real-time PCR method was 100%. The concordance rate of real-time PCR with Sanger sequencing plus PNA-mediated PCR clamping was 98.7%. We have confirmed that real-time PCR using pleural fluid had a high concordance rate compared to conventional methods, with no failed samples. Our data demonstrated that parallel real-time PCR testing using supernatant and cell pellet could offer a reliable and robust surrogate strategy when tissue is not available.

  8. Analytical Method for Carbon and Oxygen Isotope of Small Carbonate Samples with the GasBench Ⅱ-IRMS Device

    Directory of Open Access Journals (Sweden)

    LIANG Cui-cui

    2015-01-01

    An analytical method for measuring carbon and oxygen isotopic compositions of trace amounts of carbonate (>15 μg) was established using a Delta V Advantage isotope ratio MS coupled with a GasBench Ⅱ. Different trace amounts (5-50 μg) of the carbonate standard (IAEA-CO-1) were measured by GasBench Ⅱ with 12 mL and 3.7 mL vials. When the sample weight was less than 40 μg and the sample was acidified in 12 mL vials, most standard deviations of δ13C and δ18O were more than 0.1‰, which could not satisfy high-precision requirements. When the sample weight was greater than 15 μg and the sample was acidified in 3.7 mL vials, standard deviations for δ13C and δ18O were 0.01‰-0.07‰ and 0.01‰-0.08‰, respectively, which satisfied high-precision requirements. Therefore, with small 3.7 mL vials used to increase the concentration of carbon dioxide in the headspace, carbonate samples as small as 15 μg can be analyzed routinely by a GasBench Ⅱ continuous-flow IRMS. Meanwhile, the linear relationship between sample weight and peak area was strong (R² > 0.9932) and can be used to determine the carbon content of carbonate samples.

  9. Determination of ³⁵S-aminoacyl-transfer ribonucleic acid specific radioactivity in small tissue samples

    Energy Technology Data Exchange (ETDEWEB)

    Samarel, A.M.; Ogunro, E.A.; Ferguson, A.G.; Lesch, M.

    1981-11-15

    Rate determination of protein synthesis utilizing tracer amino acid incorporation requires accurate assessment of the specific radioactivity of the labeled precursor aminoacyl-tRNA pool. Previously published methods presumably useful for the measurement of any aminoacyl-tRNA were unsuccessful when applied to [³⁵S]methionine, due to the unique chemical properties of this amino acid. Herein we describe modifications of these methods necessary for the measurement of ³⁵S-aminoacyl-tRNA specific radioactivity from small tissue samples incubated in the presence of [³⁵S]methionine. The use of [³⁵S]methionine of high specific radioactivity enables analysis of the methionyl-tRNA from less than 100 mg of tissue. Conditions for optimal recovery of ³⁵S-labeled dansyl-amino acid derivatives are presented and possible applications of this method are discussed.

  10. Comparison of Time-of-flight and Multicollector ICP Mass Spectrometers for Measuring Actinides in Small Samples using single shot Laser Ablation

    International Nuclear Information System (INIS)

    R.S. Houk; D.B. Aeschliman; S.J. Bajic; D. Baldwin

    2005-01-01

    The objective of these experiments is to evaluate the performance of two types of ICP-MS device for measurement of actinide isotopes by laser ablation (LA) ICP-MS. The key advantage of ICP-MS compared to monitoring of radioactive decay is that the element need not decay during the measurement time. Hence ICP-MS is much faster for long-lived radionuclides. The LA process yields a transient signal. When spatially resolved analysis is required for small samples, the laser ablation sample pulse lasts only ∼10 seconds. It is difficult to measure signals at several isotopes with analyzers that are scanned for such a short sample transient. In this work, a time-of-flight (TOF) ICP-MS device, the GBC Optimass 8000 (Figure 1) is one instrument used. Strictly speaking, ions at different m/z values are not measured simultaneously in TOF. However, they are measured in very rapid sequence with little or no compromise between the number of m/z values monitored and the performance. Ions can be measured throughout the m/z range in single sample transients by TOF. The other ICP-MS instrument used is a magnetic sector multicollector MS, the NU Plasma 1700 (Figure 2). Up to 8 adjacent m/z values can be monitored at one setting of the magnetic field and accelerating voltage. Three of these m/z values can be measured with an electron multiplier. This device is usually used for high precision isotope ratio measurements with the Faraday cup detectors. The electron multipliers have much higher sensitivity. In our experience with the scanning magnetic sector instrument in Ames, these devices have the highest sensitivity and lowest background of any ICP-MS device. The ability to monitor several ions simultaneously, or nearly so, should make these devices valuable for the intended application: measurement of actinide isotopes at low concentrations in very small samples for nonproliferation purposes. The primary sample analyzed was an urban dust pellet reference material, NIST 1648. The

  11. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
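
    As an illustration of the sampling-error overdispersion the record describes, the following sketch (illustrative Python, not taken from the paper; the dimensions, sample size and replicate count are arbitrary) simulates data with an identity population covariance and shows that the leading sample eigenvalue is biased upward and the smallest downward.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    p, n, n_rep = 20, 50, 200          # traits, individuals, replicate simulations

    top, bottom = [], []
    for _ in range(n_rep):
        # True covariance is the identity: every population eigenvalue equals 1.
        x = rng.standard_normal((n, p))
        eigvals = np.linalg.eigvalsh(np.cov(x, rowvar=False))
        top.append(eigvals.max())
        bottom.append(eigvals.min())

    # Leading eigenvalues sit well above 1 and the smallest well below 1,
    # purely through sampling error -- the overdispersion discussed above.
    print(f"mean largest eigenvalue:  {np.mean(top):.2f}")
    print(f"mean smallest eigenvalue: {np.mean(bottom):.2f}")
    ```

    Comparing a scaled and centered leading eigenvalue against the Tracy-Widom critical value, as the record proposes, would then indicate whether an observed eigenvalue exceeds what sampling error alone can produce.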

  12. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in the reliability definition follows a normal distribution, the conjugate prior of its distribution parameters (μ, h) is a normal-gamma distribution. With the help of the maximum entropy and moments-equivalence principles, the subjective information on the parameter and the sampling data of its independent variables are transformed into a Bayesian prior of (μ, h). The desired estimates are obtained from either the prior or the posterior, which is formed by combining the prior and the sampling data. Computing methods are described and examples are presented to give demonstrations.
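
    A minimal sketch of the conjugate normal-gamma update mentioned above (illustrative Python; the prior hyperparameters and data are made up, and this is the textbook conjugate update, not the paper's maximum-entropy prior construction):

    ```python
    import numpy as np

    def normal_gamma_update(mu0, kappa0, alpha0, beta0, x):
        """Posterior hyperparameters of a normal-gamma prior on (mu, h),
        where h is the precision of a normal likelihood."""
        x = np.asarray(x, dtype=float)
        n, xbar = x.size, x.mean()
        ss = np.sum((x - xbar) ** 2)
        kappa_n = kappa0 + n
        mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
        alpha_n = alpha0 + n / 2.0
        beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n)
        return mu_n, kappa_n, alpha_n, beta_n

    # Made-up prior and small strength sample, for demonstration only.
    mu_n, kappa_n, alpha_n, beta_n = normal_gamma_update(
        mu0=10.0, kappa0=1.0, alpha0=2.0, beta0=2.0,
        x=[9.4, 10.1, 9.8, 10.6, 9.9])
    print(f"posterior mean of mu: {mu_n:.2f}")
    print(f"posterior mean of precision h: {alpha_n / beta_n:.2f}")
    ```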

  13. Estimation of individual reference intervals in small sample sizes

    DEFF Research Database (Denmark)

    Hansen, Ase Marie; Garde, Anne Helene; Eller, Nanna Hurwitz

    2007-01-01

    In occupational health studies, the study groups most often comprise healthy subjects performing their work. Sampling is often planned in the most practical way, e.g., sampling of blood in the morning at the work site just after the work starts. Optimal use of reference intervals requires...... from various variables such as gender, age, BMI, alcohol, smoking, and menopause. The reference intervals were compared to reference intervals calculated using IFCC recommendations. Where comparable, the IFCC calculated reference intervals had a wider range compared to the variance component models...

  14. Radioisotopic method for the measurement of lipolysis in small samples of human adipose tissue

    International Nuclear Information System (INIS)

    Leibel, R.L.; Hirsch, J.; Berry, E.M.; Gruen, R.K.

    1984-01-01

    To facilitate the study of adrenoreceptor response in small needle biopsy samples of human subcutaneous adipose tissue, we developed a dual radioisotopic technique for measuring lipolysis rate. Aliquots (20-75 mg) of adipose tissue fragments were incubated in a buffered albumin medium containing [³H]palmitate and [¹⁴C]glucose, each of high specific activity. In neutral glycerides synthesized in this system, [¹⁴C]glucose is incorporated exclusively into the glyceride-glycerol moiety and ³H appears solely in the esterified fatty acid. Alpha-2 and beta-1 adrenoreceptor activation of tissue incubated in this system does not alter rates of ¹⁴C-labeled glyceride accumulation, but does produce a respective increase or decrease in the specific activity of fatty acids esterified into newly synthesized glycerides. This alteration in esterified fatty acid specific activity is reflected in the ¹⁴C:³H ratio in newly synthesized triglycerides extracted from the incubated adipose tissue. There is a high correlation (r = 0.90) between the ¹⁴C:³H ratio in triglycerides and the rate of lipolysis as reflected in glycerol release into the incubation medium. The degree of adrenoreceptor activation by various concentrations of lipolytic and anti-lipolytic substances can be assessed by comparing this ratio in stimulated tissue to that characterizing unstimulated tissue or the incubation medium. This technique permits the study of very small, unweighed tissue biopsy fragments, the only limitation on sensitivity being the specific activity of the medium glucose and palmitate. It is, therefore, useful for serial examinations of adipose tissue adrenoreceptor dose-response characteristics under a variety of clinical circumstances.

  15. Fast micro Hall effect measurements on small pads

    DEFF Research Database (Denmark)

    Østerberg, Frederik Westergaard; Petersen, Dirch Hjorth; Nielsen, Peter F.

    2011-01-01

    Sheet resistance, carrier mobility, and sheet carrier density are important parameters in semiconductor production, and it is therefore important to be able to rapidly and accurately measure these parameters even on small samples or pads. The interpretation of four-point probe measurements on small... pads is non-trivial. In this paper we discuss how conformal mapping can be used to evaluate theoretically expected measurement values on small pads. Theoretical values calculated from analytical mappings of simple geometries are compared to the values found from the numerical conformal mapping... of a square onto the infinite half-plane, where well-established solutions are known. Hall effect measurements are performed to show, experimentally, that it is possible to measure Hall mobility in less than one minute on squares as small as 70×70 μm² with a deviation of ±6.5% on a 1σ level from accurate...

  16. Leadership styles and effectiveness: A study of small firms in Chile

    OpenAIRE

    Pedraja-Rejas, Liliana; Rodríguez-Ponce, Emilio; Rodríguez-Ponce, Juan

    2006-01-01

    The purpose of this work is to analyze the relationship between leadership styles and effectiveness in small firms. The work considers a sample of 126 top and medium level managers based in small firms in the north of Chile. The results obtained show that the supportive leadership style is prominent. It was also found that the participative leadership style is present, but to a lesser extent, and that the instrumental leadership style is infrequent. Supportive and participative leadership sty...

  17. An Improved Metabolism Grey Model for Predicting Small Samples with a Singular Datum and Its Application to Sulfur Dioxide Emissions in China

    Directory of Open Access Journals (Sweden)

    Wei Zhou

    2016-01-01

    This study proposes an improved metabolism grey model [IMGM(1,1)] to predict small samples with a singular datum, which is a common phenomenon in daily economic data. This new model combines the fitting advantage of the conventional GM(1,1) in small samples and the additional advantages of the MGM(1,1) in new real-time data, while overcoming the limitations of both the conventional GM(1,1) and MGM(1,1) when the predicted results are vulnerable to a singular datum. Thus, this model can be classified as an improved grey prediction model. Its improvements are illustrated through a case study of sulfur dioxide emissions in China from 2007 to 2013 with a singular datum in 2011. Some features of this model are presented based on the error analysis in the case study. Results suggest that if action is not taken immediately, sulfur dioxide emissions in 2016 will surpass the standard level required by the Twelfth Five-Year Plan proposed by the China State Council.
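
    For readers unfamiliar with grey prediction, the sketch below implements only the conventional GM(1,1) model (illustrative Python with made-up data; the paper's IMGM(1,1) adds a metabolism step and singular-datum handling that are not reproduced here).

    ```python
    import numpy as np

    def gm11_forecast(x0, steps_ahead=1):
        """Conventional GM(1,1): fit on the series x0 and forecast ahead."""
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                            # accumulated series (AGO)
        z1 = 0.5 * (x1[1:] + x1[:-1])                 # background values
        B = np.column_stack((-z1, np.ones_like(z1)))
        y = x0[1:]
        a, b = np.linalg.lstsq(B, y, rcond=None)[0]   # development and grey input coefficients

        def x1_hat(k):                                # k = 0, 1, 2, ...
            return (x0[0] - b / a) * np.exp(-a * k) + b / a

        n = len(x0)
        x1_fit = x1_hat(np.arange(n + steps_ahead))
        x0_fit = np.empty_like(x1_fit)
        x0_fit[0] = x1_fit[0]
        x0_fit[1:] = np.diff(x1_fit)                  # inverse AGO
        return x0_fit[n:]                             # forecasts beyond the data

    # Made-up annual emission figures (note the singular value), for demonstration only.
    series = [22.0, 21.4, 20.6, 20.2, 26.0, 19.1, 18.5]
    print(gm11_forecast(series, steps_ahead=3))
    ```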

  18. SampleCNN: End-to-End Deep Convolutional Neural Networks Using Very Small Filters for Music Classification

    Directory of Open Access Journals (Sweden)

    Jongpil Lee

    2018-01-01

    Convolutional Neural Networks (CNN) have been applied to diverse machine learning tasks for different modalities of raw data in an end-to-end fashion. In the audio domain, a raw waveform-based approach has been explored to directly learn hierarchical characteristics of audio. However, the majority of previous studies have limited their model capacity by taking a frame-level structure similar to short-time Fourier transforms. We previously proposed a CNN architecture which learns representations using sample-level filters beyond typical frame-level input representations. The architecture showed comparable performance to the spectrogram-based CNN model in music auto-tagging. In this paper, we extend the previous work in three ways. First, considering that the sample-level model requires much longer training time, we progressively downsample the input signals and examine how it affects the performance. Second, we extend the model using a multi-level and multi-scale feature aggregation technique and subsequently conduct transfer learning for several music classification tasks. Finally, we visualize filters learned by the sample-level CNN in each layer to identify hierarchically learned features and show that they are sensitive to log-scaled frequency.
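
    As a rough illustration of the sample-level idea (small 1-D filters applied directly to the raw waveform rather than to frames), here is a minimal PyTorch sketch; the layer count, channel widths and input length are arbitrary choices for demonstration, not the architecture from the paper.

    ```python
    import torch
    import torch.nn as nn

    class TinySampleCNN(nn.Module):
        """Toy sample-level CNN: small 1-D filters on the raw waveform."""
        def __init__(self, n_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 64, kernel_size=3, stride=3),   # sample-level front end, no framing
                nn.BatchNorm1d(64), nn.ReLU(),
                nn.Conv1d(64, 64, kernel_size=3), nn.BatchNorm1d(64), nn.ReLU(),
                nn.MaxPool1d(3),
                nn.Conv1d(64, 128, kernel_size=3), nn.BatchNorm1d(128), nn.ReLU(),
                nn.MaxPool1d(3),
                nn.AdaptiveAvgPool1d(1),                      # aggregate over time
            )
            self.classifier = nn.Linear(128, n_classes)

        def forward(self, waveform):                          # waveform: (batch, 1, samples)
            h = self.features(waveform).squeeze(-1)
            return self.classifier(h)

    # One batch of random 1-second clips at 16 kHz, for shape checking only.
    model = TinySampleCNN()
    print(model(torch.randn(2, 1, 16000)).shape)              # -> torch.Size([2, 10])
    ```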

  19. A low-volume cavity ring-down spectrometer for sample-limited applications

    Science.gov (United States)

    Stowasser, C.; Farinas, A. D.; Ware, J.; Wistisen, D. W.; Rella, C.; Wahl, E.; Crosson, E.; Blunier, T.

    2014-08-01

    In atmospheric and environmental sciences, optical spectrometers are used for the measurements of greenhouse gas mole fractions and the isotopic composition of water vapor or greenhouse gases. The large sample cell volumes (tens of milliliters to several liters) in commercially available spectrometers constrain the usefulness of such instruments for applications that are limited in sample size and/or need to track fast variations in the sample stream. In an effort to make spectrometers more suitable for sample-limited applications, we developed a low-volume analyzer capable of measuring mole fractions of methane and carbon monoxide based on a commercial cavity ring-down spectrometer. The instrument has a small sample cell (9.6 ml) and can selectively be operated at a sample cell pressure of 140, 45, or 20 Torr (effective internal volume of 1.8, 0.57, and 0.25 ml). We present the new sample cell design and the flow path configuration, which are optimized for small sample sizes. To quantify the spectrometer's usefulness for sample-limited applications, we determine the renewal rate of sample molecules within the low-volume spectrometer. Furthermore, we show that the performance of the low-volume spectrometer matches the performance of the standard commercial analyzers by investigating linearity, precision, and instrumental drift.
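
    To make the volume numbers concrete, the short calculation below (illustrative Python; the 30 sccm flow rate is an assumed value, not one from the paper, and temperature effects are ignored) converts the 9.6 mL cell at each operating pressure into an effective volume at standard pressure and an approximate renewal time for the gas in the cell.

    ```python
    # Effective (STP-equivalent) volume and renewal time of a 9.6 mL cell.
    cell_volume_ml = 9.6
    pressures_torr = [140, 45, 20]
    flow_sccm = 30.0                      # assumed sample flow, mL/min at standard pressure

    for p in pressures_torr:
        v_eff = cell_volume_ml * p / 760.0          # mL at standard pressure
        renewal_s = 60.0 * v_eff / flow_sccm        # time to exchange one cell volume
        print(f"{p:3d} Torr: effective volume {v_eff:4.2f} mL, "
              f"renewal time ~{renewal_s:4.1f} s")
    ```

    The effective volumes reproduce the 1.8, 0.57 and 0.25 mL figures quoted in the record; the renewal times scale inversely with whatever flow rate is actually used.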

  20. Standard Format for Chromatographic-polarimetric System small samples assessment

    International Nuclear Information System (INIS)

    Naranjo, S.; Fajer, V.; Fonfria, C.; Patinno, R.

    2012-01-01

    The treatment of samples containing optically active substances, to be evaluated as part of the quality control of raw material entering an industrial process and during the modifications applied to it to obtain the desired final composition, is still an unsolved problem for many industries. That is the case of the sugarcane industry. The problem is often compounded because the samples to be evaluated are no larger than one milliliter. Reduced gel beds in G-10 and G-50 chromatographic columns, with an inner diameter of 16 mm instead of 25 mm and bed heights adjustable to requirements by means of sliding stoppers to increase analytical power, were evaluated with glucose and sucrose standards at concentrations from 1 to 10 g/dL, using aliquots of 1 mL without undesirable dilutions that could affect either detection or the chromatographic profile. Assays with seaweed extracts gave good results, which are shown. The advantage of determining the concentration of a separated substance from its peak height, and the resulting savings in time and reagents, are established. The expanded uncertainty of samples in both systems is compared. Several programs for data acquisition, storage and processing are also presented. (Author)

  1. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    The sampling method has been paid much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small part of the vertices, those with high node degree, can possess most of the structure information of a complex network. The two proposed sampling methods are efficient in sampling high degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method is developed on the basis of the widely used stratified random sampling (SRS) method and the second one improves the well-known snowball sampling (SBS) method. In order to demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods on three commonly used simulated networks (a scale-free network, a random network and a small-world network) and also on two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing sampling methods in terms of recovering the true network structure characteristics reflected by the clustering coefficient, Bonacich centrality and average path length, especially when the sampling rate is low.
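
    The sketch below (illustrative Python using networkx; the graph, sampling rate and the simple "take the highest-degree nodes first" rule are stand-ins, not the paper's SRS- and SBS-based algorithms) shows the general idea of biasing a sample toward high-degree vertices and comparing a structural statistic of the induced subgraph with that of the full network.

    ```python
    import random
    import networkx as nx

    G = nx.barabasi_albert_graph(n=2000, m=3, seed=1)   # scale-free test network
    rate = 0.10                                          # low sampling rate
    k = int(rate * G.number_of_nodes())

    # Degree-biased sample: take the k highest-degree vertices.
    by_degree = sorted(G.degree, key=lambda kv: kv[1], reverse=True)
    hub_sample = [node for node, _ in by_degree[:k]]

    # Uniform random sample of the same size, for comparison.
    random.seed(1)
    uniform_sample = random.sample(list(G.nodes), k)

    for name, nodes in [("degree-biased", hub_sample), ("uniform", uniform_sample)]:
        H = G.subgraph(nodes)
        print(f"{name:13s}: clustering {nx.average_clustering(H):.3f} "
              f"(full graph {nx.average_clustering(G):.3f})")
    ```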

  2. Sample Reuse in Statistical Remodeling.

    Science.gov (United States)

    1987-08-01

    as the jackknife and bootstrap, is an expansion of the functional, T(Fn), or of its distribution function or both. Frangos and Schucany (1987a) used...accelerated bootstrap. In the same report Frangos and Schucany demonstrated the small sample superiority of that approach over the proposals that take...higher order terms of an Edgeworth expansion into account. In a second report Frangos and Schucany (1987b) examined the small sample performance of

  3. Gas-driven pump for ground-water samples

    Science.gov (United States)

    Signor, Donald C.

    1978-01-01

    Observation wells installed for artificial-recharge research and other wells used in different ground-water programs are frequently cased with small-diameter steel pipe. To obtain samples from these small-diameter wells in order to monitor water quality, and to calibrate solute-transport models, a small-diameter pump with unique operating characteristics is required that causes minimal alteration of samples during field sampling. A small-diameter gas-driven pump was designed and built to obtain water samples from wells of two-inch diameter or larger. The pump is a double-piston type with the following characteristics: (1) the water sample is isolated from the operating gas, (2) no source of electricity is necessary, (3) operation is continuous, (4) use of compressed gas is efficient, and (5) operation is reliable over extended periods of time. Principles of operation, actual operation techniques, gas-use analyses and operating experience are described. Complete working drawings and a component list are included. Recent modifications and pump construction for high-pressure applications also are described. (Woodard-USGS)

  4. Not Too Small To Care: Small Businesses and Child Care. National Advisory Panel Exchange #2.

    Science.gov (United States)

    Eichman, Caroline; Reisman, Barbara

    This report of the Child Care Action Committee's National Advisory Panel profiles 29 small businesses employing under 250 workers in 15 states which offer child care benefits to their employees. These businesses do not constitute a representative sample of small businesses. A series of factors for small businesses to consider when planning a child…

  5. Collateral Information for Equating in Small Samples: A Preliminary Investigation

    Science.gov (United States)

    Kim, Sooyeon; Livingston, Samuel A.; Lewis, Charles

    2011-01-01

    This article describes a preliminary investigation of an empirical Bayes (EB) procedure for using collateral information to improve equating of scores on test forms taken by small numbers of examinees. Resampling studies were done on two different forms of the same test. In each study, EB and non-EB versions of two equating methods--chained linear…

  6. Small-angle X-ray scattering on growth of AgCl crystallites in photochromic glasses

    Energy Technology Data Exchange (ETDEWEB)

    Takatohi, U.E. [Inst. Adventista de Ensino, Sao Paulo (Brazil); Bittencourt, D.R.S.; Watanabe, S.

    1997-10-01

    Reversible changes in the optical properties of photochromic glasses are observed owing to the presence of small silver halide crystals inside the glassy matrix. These crystals grow during the glass heat-treatment processing. Samples with molar composition 40SiO₂·10Al₂O₃·16.1K₂O·33.9B₂O₃, doped with AgCl and CuO, were produced and submitted to different heat treatments: (i) for 0.5 h at temperatures from 753 to 893 K and (ii) at 873 K for periods of time from 0.25 to 1.25 h. Small-angle X-ray scattering (SAXS) was used to characterize the samples. The samples heat treated between 843 and 893 K presented an increasing growth rate of the Guinier radius (R_g). Samples heat treated at a fixed temperature of 873 K and different times t showed a growth law R_g³ = kt + c. Variations in the optical absorbance at 280 nm and the additional absorbance spectra of samples exposed to light showed correlation with the SAXS results. (orig.). 16 refs.

  7. Capillary absorption spectrometer and process for isotopic analysis of small samples

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2018-04-24

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable absorption measurements of analytes in a sample gas that may include isotopologues of carbon and oxygen obtained from gas and biological samples. It further provides isotopic images of microbial communities that allow tracking of nutrients at the single cell level. It further targets naturally occurring variations in carbon and oxygen isotopes, which avoids the need for expensive isotopically labeled mixtures and allows study of samples taken from the field without modification. The process also permits in vivo sampling, enabling real-time ambient studies of microbial communities.

  8. Capillary absorption spectrometer and process for isotopic analysis of small samples

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2016-03-29

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable absorption measurements of analytes in a sample gas that may include isotopologues of carbon and oxygen obtained from gas and biological samples. It further provides isotopic images of microbial communities that allow tracking of nutrients at the single cell level. It further targets naturally occurring variations in carbon and oxygen isotopes, which avoids the need for expensive isotopically labeled mixtures and allows study of samples taken from the field without modification. The method also permits in vivo sampling, enabling real-time ambient studies of microbial communities.

  9. Finishing pigs that are divergent in feed efficiency show small differences in intestinal functionality and structure.

    Directory of Open Access Journals (Sweden)

    Barbara U Metzler-Zebeli

    Controversial information is available regarding the feed efficiency-related variation in intestinal size, structure and functionality in pigs. The present objective was therefore to investigate the differences in visceral organ size, intestinal morphology, mucosal enzyme activity, intestinal integrity and related gene expression in low and high RFI pigs which were reared at three different geographical locations (Austria, AT; Northern Ireland, NI; Republic of Ireland, ROI) using similar protocols. Pigs (n = 369) were ranked for their RFI between days 42 and 91 postweaning, and low and high RFI pigs (n = 16 from AT, n = 24 from NI, and n = 60 from ROI) were selected. Pigs were sacrificed and sampled on ~day 110 of life. In general, RFI-related variation in intestinal size, structure and function was small. Some energy saving mechanisms and enhanced digestive and absorptive capacity were indicated in low versus high RFI pigs by shorter crypts, higher duodenal lactase and maltase activity and greater mucosal permeability (P < 0.05), but differences were mainly seen in pigs from AT and to a lesser degree in pigs from ROI. Additionally, low RFI pigs from AT had more goblet cells in duodenum but fewer in jejunum compared to high RFI pigs (P < 0.05). Together with the lower expression of TLR4 and TNFA in low versus high RFI pigs from AT and ROI (P < 0.05), these results might indicate differences in the innate immune response between low and high RFI pigs. Results demonstrated that the variation in the size of visceral organs and intestinal structure and functionality was greater between geographic locations (local environmental factors) than between RFI ranks of pigs. In conclusion, present results support previous findings that the intestinal size, structure and functionality do not significantly contribute to variation in RFI of pigs.

  10. Ochratoxin A in raisins and currants: basic extraction procedure used in two small marketing surveys of the occurrence and control of the heterogeneity of the toxins in samples.

    Science.gov (United States)

    Möller, T E; Nyberg, M

    2003-11-01

    A basic extraction procedure for the analysis of ochratoxin A (OTA) in currants and raisins is described, together with the occurrence of OTA and a control of the heterogeneity of the toxin in samples bought for two small marketing surveys in 1999/2000 and 2001/02. Most samples in the surveys were divided into two subsamples that were individually prepared as slurries and analysed separately. The limit of quantification for the method was estimated as 0.1 μg kg(-1), and recoveries of 85, 90 and 115% were achieved in recovery experiments at 10, 5 and 0.1 μg kg(-1), respectively. Of all 118 subsamples analysed in the surveys, 96 (84%) contained ochratoxin A at levels above the quantification level and five samples (4%) contained more than the European Community limit of 10 μg kg(-1). The OTA concentrations found in the first survey were in the range … Big differences were often observed between individual subsamples of the original sample, which indicates a markedly heterogeneous distribution of the toxin. Data from the repeatability test as well as recovery experiments from the same slurries showed that preparation of slurries as described here seemed to give a homogeneous and representative sample. The extraction with the basic sodium bicarbonate-methanol mixture used in the surveys gave similar or somewhat higher OTA values for some samples tested in a comparison with a weak phosphoric acid water-methanol extraction mixture.

  11. Identification of Gene Biomarkers for Distinguishing Small-Cell Lung Cancer from Non-Small-Cell Lung Cancer Using a Network-Based Approach

    Directory of Open Access Journals (Sweden)

    Fei Long

    2015-01-01

    Lung cancer consists of two main subtypes: small-cell lung cancer (SCLC) and non-small-cell lung cancer (NSCLC), which are classified according to their physiological phenotypes. In this study, we have developed a network-based approach to identify molecular biomarkers that can distinguish SCLC from NSCLC. By identifying positive and negative coexpression gene pairs in normal lung tissues, SCLC, or NSCLC samples and using functional association information from the STRING network, we first construct a lung cancer-specific gene association network. From the network, we obtain gene modules in which genes are highly functionally associated with each other and are either positively or negatively coexpressed in the three conditions. Then, we identify gene modules that not only are differentially expressed between cancer and normal samples, but also show distinctive expression patterns between SCLC and NSCLC. Finally, we select genes inside those modules with discriminating coexpression patterns between the two lung cancer subtypes and predict them as candidate biomarkers that are of diagnostic use.

  12. Small-angle X-ray scattering tensor tomography: model of the three-dimensional reciprocal-space map, reconstruction algorithm and angular sampling requirements.

    Science.gov (United States)

    Liebi, Marianne; Georgiadis, Marios; Kohlbrecher, Joachim; Holler, Mirko; Raabe, Jörg; Usov, Ivan; Menzel, Andreas; Schneider, Philipp; Bunk, Oliver; Guizar-Sicairos, Manuel

    2018-01-01

    Small-angle X-ray scattering tensor tomography, which allows reconstruction of the local three-dimensional reciprocal-space map within a three-dimensional sample as introduced by Liebi et al. [Nature (2015), 527, 349-352], is described in more detail with regard to the mathematical framework and the optimization algorithm. For the case of trabecular bone samples from vertebrae it is shown that the model of the three-dimensional reciprocal-space map using spherical harmonics can adequately describe the measured data. The method enables the determination of nanostructure orientation and degree of orientation as demonstrated previously in a single momentum transfer q range. This article presents a reconstruction of the complete reciprocal-space map for the case of bone over extended ranges of q. In addition, it is shown that uniform angular sampling and advanced regularization strategies help to reduce the amount of data required.

  13. Mars ascent propulsion options for small sample return vehicles

    International Nuclear Information System (INIS)

    Whitehead, J. C.

    1997-01-01

    An unprecedented combination of high propellant fraction and small size is required for affordable-scale Mars return, regardless of the number of stages, or whether Mars orbit rendezvous or in-situ propellant options are used. Conventional space propulsion technology is too heavy, even without structure or other stage subsystems. The application of launch vehicle design principles to the development of new hardware on a tiny scale is therefore suggested. Miniature pump-fed rocket engines fed by low pressure tanks can help to meet this challenge. New concepts for engine cycles using piston pumps are described, and development issues are outlined

  14. Three-dimensional magnetic nanoparticle imaging using small field gradient and multiple pickup coils

    Energy Technology Data Exchange (ETDEWEB)

    Sasayama, Teruyoshi, E-mail: sasayama@sc.kyushu-u.ac.jp; Tsujita, Yuya; Morishita, Manabu; Muta, Masahiro; Yoshida, Takashi; Enpuku, Keiji

    2017-04-01

    We propose a magnetic particle imaging (MPI) method based on third harmonic signal detection using a small field gradient and multiple pickup coils. First, we developed a system using two pickup coils and performed three-dimensional detection of two magnetic nanoparticle (MNP) samples, which were spaced 15 mm apart. In the experiments, an excitation field strength of 1.6 mT was used at an operating frequency of 3 kHz. A DC gradient field with a typical value of 0.2 T/m was also used to produce the so-called field-free line. A third harmonic signal generated by the MNP samples was detected using the two pickup coils, and the samples were then mechanically scanned to obtain field maps. The field maps were subsequently analyzed using the nonnegative least squares method to obtain three-dimensional position information for the MNP samples. The results show that the positions of the two MNP samples were estimated with good accuracy, despite the small field gradient used. Further improvement in MPI performance will be achieved by increasing the number of pickup coils used. - Highlights: • 3D magnetic particle imaging system combining field-free line and two pickup coils. • Imaging method based on third harmonic signal detection and small field gradient. • Nonnegative least squares method for 3D magnetic nanoparticle image reconstruction. • High spatial resolution despite use of small field gradient.
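
    The reconstruction step described above amounts to a nonnegative least squares fit of the measured field maps against a forward model. The toy sketch below (illustrative Python; the 1-D system matrix, source positions and noise level are invented, not the actual coil and field geometry) shows that step with scipy.optimize.nnls.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)

    # Toy forward model: each column is the response of one candidate source
    # position, here a Gaussian point-spread response on a 1-D grid.
    positions = np.linspace(0.0, 30.0, 61)          # candidate positions, mm
    sensors = np.linspace(0.0, 30.0, 40)            # measurement positions, mm
    A = np.exp(-0.5 * ((sensors[:, None] - positions[None, :]) / 3.0) ** 2)

    # Two point-like sources 15 mm apart, plus measurement noise.
    x_true = np.zeros(positions.size)
    x_true[np.argmin(np.abs(positions - 8.0))] = 1.0
    x_true[np.argmin(np.abs(positions - 23.0))] = 0.7
    b = A @ x_true + 0.01 * rng.standard_normal(sensors.size)

    x_hat, residual = nnls(A, b)                    # nonnegativity keeps the map physical
    print("positions carrying most recovered amplitude:",
          positions[x_hat > 0.3 * x_hat.max()])
    ```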

  15. Treating Sample Covariances for Use in Strongly Coupled Atmosphere-Ocean Data Assimilation

    Science.gov (United States)

    Smith, Polly J.; Lawless, Amos S.; Nichols, Nancy K.

    2018-01-01

    Strongly coupled data assimilation requires cross-domain forecast error covariances; information from ensembles can be used, but limited sampling means that ensemble derived error covariances are routinely rank deficient and/or ill-conditioned and marred by noise. Thus, they require modification before they can be incorporated into a standard assimilation framework. Here we compare methods for improving the rank and conditioning of multivariate sample error covariance matrices for coupled atmosphere-ocean data assimilation. The first method, reconditioning, alters the matrix eigenvalues directly; this preserves the correlation structures but does not remove sampling noise. We show that it is better to recondition the correlation matrix rather than the covariance matrix as this prevents small but dynamically important modes from being lost. The second method, model state-space localization via the Schur product, effectively removes sample noise but can dampen small cross-correlation signals. A combination that exploits the merits of each is found to offer an effective alternative.
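
    A compact illustration of the two treatments compared above (illustrative Python with a random toy ensemble; the eigenvalue floor and localization length scale are arbitrary choices): reconditioning raises the smallest eigenvalues of the correlation matrix, while Schur-product localization tapers long-range sample correlations.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_state, n_ens = 40, 10                     # more state variables than members
    ens = rng.standard_normal((n_state, n_ens))
    B = np.cov(ens)                             # rank-deficient sample covariance

    # --- Reconditioning: work on the correlation matrix, floor its eigenvalues.
    sd = np.sqrt(np.diag(B))
    C = B / np.outer(sd, sd)
    w, V = np.linalg.eigh(C)
    w = np.maximum(w, 1e-2)                     # eigenvalue floor (tunable)
    C_recon = V @ np.diag(w) @ V.T
    B_recon = C_recon * np.outer(sd, sd)

    # --- Schur-product localization: element-wise taper with separation distance.
    idx = np.arange(n_state)
    dist = np.abs(idx[:, None] - idx[None, :])
    L = np.exp(-(dist / 5.0) ** 2)              # Gaussian taper, length scale 5
    B_loc = B * L

    for name, M in [("raw", B), ("reconditioned", B_recon), ("localized", B_loc)]:
        eig = np.linalg.eigvalsh(M)
        print(f"{name:13s}: rank {np.sum(eig > 1e-10):2d}, "
              f"condition number {eig.max() / max(eig.min(), 1e-12):.1e}")
    ```

    The trade-off the record describes shows up directly: reconditioning keeps the correlation pattern but the noise with it, while the taper restores full rank at the cost of damping weak long-range cross-correlations.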

  16. Small-mass AMS radiocarbon analysis at Nagoya University

    Energy Technology Data Exchange (ETDEWEB)

    Minami, Masayo, E-mail: minami@nendai.nagoya-u.ac.jp [Center for Chronological Research, Nagoya University, Chikusa, Nagoya 464-8602 (Japan); Kato, Tomomi [Faculty of Science, Nagoya University, Nagoya 464-8602 (Japan); Miyata, Yoshiki; Nakamura, Toshio [Center for Chronological Research, Nagoya University, Chikusa, Nagoya 464-8602 (Japan); Hua Quan [Australian Nuclear Science and Technology Organization, Locked Bag 2001, Kirrawee DC, NSW 2232 (Australia)

    2013-01-15

    As part of the ongoing development at the AMS facility of the Center for Chronological Research at Nagoya University to radiocarbon (¹⁴C) analyze samples smaller than 0.5 mg carbon (mgC), a compact graphitization manifold has been built. Tests with various reference materials show it performs well for samples as small as 0.1 mgC. Preparation with this new system is compared with the performance of the older protocol for regular-sized samples. Furthermore, it is shown that the addition of Cu and Ag before, and stepwise heating during, sealed-tube combustion of samples with high S content improve the degree of conversion to CO₂ without having to resort to special purification measures such as the use of Co₃O₄ + Ag reagent and an n-pentane/LN₂ trap before graphitization.

  17. Isothermal multiple displacement amplification: a methodical approach enhancing molecular routine diagnostics of microcarcinomas and small biopsies.

    Science.gov (United States)

    Mairinger, Fabian D; Walter, Robert Fh; Vollbrecht, Claudia; Hager, Thomas; Worm, Karl; Ting, Saskia; Wohlschläger, Jeremias; Zarogoulidis, Paul; Zarogoulidis, Konstantinos; Schmid, Kurt W

    2014-01-01

    Isothermal multiple displacement amplification (IMDA) can be a powerful tool in molecular routine diagnostics for homogeneous and sequence-independent whole-genome amplification of notably small tumor samples, eg, microcarcinomas and biopsies containing a small amount of tumor. Currently, this method is not well established in pathology laboratories. We designed a study to confirm the feasibility and convenience of this method for routine diagnostics with formalin-fixed, paraffin-embedded samples prepared by laser-capture microdissection. A total of 250 μg DNA (concentration 5 μg/μL) was generated by amplification over a period of 8 hours with a material input of approximately 25 cells, approximately equivalent to 175 pg of genomic DNA. In the generated DNA, a representation of all chromosomes could be shown and the presence of elected genes relevant for diagnosis in clinical samples could be proven. Mutational analysis of clinical samples could be performed without any difficulty and showed concordance with earlier diagnostic findings. We established the feasibility and convenience of IMDA for routine diagnostics. We also showed that small amounts of DNA, which were not analyzable with current molecular methods, could be sufficient for a wide field of applications in molecular routine diagnostics when they are preamplified with IMDA.

  18. Analysis and comparison of fish growth from small samples of length-at-age data : Detection of sexual dimorphism in Eurasian perch as an example

    NARCIS (Netherlands)

    Mooij, WM; Van Rooij, JM; Wijnhoven, S

    A relatively simple approach is presented for statistical analysis and comparison of fish growth patterns inferred from size-at-age data. It can be used for any growth model and small sample sizes. Bootstrapping is used to generate confidence regions for the model parameters and for size and growth
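
    As a sketch of the bootstrap step described above (illustrative Python; the von Bertalanffy growth form, the synthetic length-at-age data and the number of resamples are assumptions made for the example, since the truncated record does not fix them), resampling the size-at-age pairs and refitting gives percentile confidence intervals for the growth parameters even with small samples.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def von_bertalanffy(age, L_inf, k, t0):
        return L_inf * (1.0 - np.exp(-k * (age - t0)))

    rng = np.random.default_rng(3)

    # Small synthetic length-at-age sample (n = 15), for illustration only.
    age = rng.uniform(0.5, 6.0, size=15)
    length = von_bertalanffy(age, 30.0, 0.5, -0.2) + rng.normal(0.0, 1.5, size=15)

    boot_params = []
    for _ in range(1000):
        idx = rng.integers(0, age.size, size=age.size)   # resample pairs with replacement
        try:
            p, _ = curve_fit(von_bertalanffy, age[idx], length[idx],
                             p0=(30.0, 0.5, 0.0), maxfev=5000)
            boot_params.append(p)
        except RuntimeError:
            continue                                      # skip non-converged resamples

    boot_params = np.array(boot_params)
    lo, hi = np.percentile(boot_params, [2.5, 97.5], axis=0)
    for name, l, h in zip(("L_inf", "k", "t0"), lo, hi):
        print(f"{name}: 95% bootstrap CI [{l:.2f}, {h:.2f}]")
    ```

    Comparing two groups (e.g., the sexes) then reduces to checking whether their bootstrap confidence regions overlap.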

  19. New Generation Flask Sampling Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Smith, James R. [AOS, Inc., Colorado Springs, CO (United States)

    2017-11-09

    Scientists are turning their focus to the Arctic, site of one of the strongest climate change signals. A new generation of technologies is required to function within that harsh environment, chart evolution of its trace gases and provide new kinds of information for models of the atmosphere. Our response to the solicitation tracks how global atmospheric monitoring was launched more than a half century ago; namely, acquisition of discrete samples of air by flask and subsequent analysis in the laboratory. AOS is proposing to develop a new generation of flask sampling technology. It will enable the new Arctic programs to begin with objective high density sampling of the atmosphere by UAS. The Phase I program will build the prototype flask technology and show that it can acquire and store mol fractions of CH4 and CO2 and value of δ13C with good fidelity. A CAD model will be produced for the entire platform including a package with 100 flasks and the airframe with auto-pilot, electronic propulsion and ground-to-air communications. A mobile flask analysis station will be prototyped in Phase I and designed to final form in Phase II. It expends very small sample per analysis and will interface directly to the flask package integrated permanently into the UAS fuselage. Commercial Applications and Other Benefits: • The New Generation Flask Sampling Technology able to provide a hundred or more samples of air per UAS mission. • A mobile analysis station expending far less sample than the existing ones and small enough to be stationed at the remote sites of Arctic operations. • A new form of validation for continuous trace gas observations from all platforms including the small UAS. • Further demonstration to potential customers of the AOS capabilities to invent, build, deploy and exploit entire platforms for observations of Earth’s atmosphere and ocean. Key Words: Flask Sampler, Mobile Analysis Station, Trace Gas, CO2, CH4, δC13, UAS, Baseline Airborne Observatory

  20. Triacylglycerol Analysis in Human Milk and Other Mammalian Species: Small-Scale Sample Preparation, Characterization, and Statistical Classification Using HPLC-ELSD Profiles.

    Science.gov (United States)

    Ten-Doménech, Isabel; Beltrán-Iturat, Eduardo; Herrero-Martínez, José Manuel; Sancho-Llopis, Juan Vicente; Simó-Alfonso, Ernesto Francisco

    2015-06-24

    In this work, a method for the separation of triacylglycerols (TAGs) present in human milk and from other mammalian species by reversed-phase high-performance liquid chromatography using a core-shell particle packed column with UV and evaporative light-scattering detectors is described. Under optimal conditions, a mobile phase containing acetonitrile/n-pentanol at 10 °C gave an excellent resolution among more than 50 TAG peaks. A small-scale method for fat extraction in these milks (particularly of interest for human milk samples) using minimal amounts of sample and reagents was also developed. The proposed extraction protocol and the traditional method were compared, giving similar results, with respect to the total fat and relative TAG contents. Finally, a statistical study based on linear discriminant analysis on the TAG composition of different types of milks (human, cow, sheep, and goat) was carried out to differentiate the samples according to their mammalian origin.

  1. Fractal morphology in lignite coal: a small angle x-ray scattering investigation

    International Nuclear Information System (INIS)

    Chitra, R.; Sen, D.; Mazumder, S.; Chandrasekaran, K.S.

    1999-01-01

    Small angle x-ray scattering has been used to study the pore morphology in lignite coal from the Neyveli lignite mine (Tamilnadu, India). The samples were collected from three different locations in the same mine. SAXS profiles from all three samples show almost identical functional behaviour, irrespective of the locations from which the samples were collected. SAXS experiments using two different wavelengths also exhibit the same functional behaviour, indicating the absence of multiple scattering. The analysis indicates the surface fractal nature of the pore morphology. The surface fractal dimension is calculated to be 2.58. (author)
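
    For a surface fractal, the SAXS intensity in the power-law region follows I(q) ∝ q^-(6 - D_s), so the fractal dimension can be read off the log-log slope. The sketch below (illustrative Python on synthetic data; the q range, amplitude and noise are invented) recovers D_s from that slope.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic power-law scattering from a surface fractal with D_s = 2.58,
    # i.e. an exponent of -(6 - D_s) = -3.42.
    q = np.logspace(-1.5, -0.3, 60)                  # scattering vector, arbitrary units
    D_s_true = 2.58
    intensity = 1e3 * q ** (-(6.0 - D_s_true)) * np.exp(rng.normal(0.0, 0.03, q.size))

    # Slope of log I versus log q gives the power-law exponent.
    slope, _ = np.polyfit(np.log(q), np.log(intensity), 1)
    D_s_est = 6.0 + slope                            # slope = -(6 - D_s)
    print(f"fitted exponent {slope:.2f}  ->  surface fractal dimension {D_s_est:.2f}")
    ```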

  2. Conformation of comb liquid crystal polymers by neutron small angle scattering

    International Nuclear Information System (INIS)

    Noirez, L.; Pepy, G.

    1988-01-01

    A review is made of the direct information obtained by small-angle neutron scattering on the anisotropy of the radius of gyration of comb-like liquid crystal polymers, i.e. of its components parallel and perpendicular to the orienting magnetic field. The behaviour of the conformation versus temperature is reported for several samples. Until now, all samples show an oblate conformation in the smectic phase and probably over the whole range of the nematic phase. The results are compared with the available theoretical predictions.

  3. Test plan for the Sample Transfer Canister system

    International Nuclear Information System (INIS)

    Flanagan, B.D.

    1998-01-01

    The Sample Transfer Canister will be used by the Waste Receiving and Processing Facility (WRAP) for the transport of small quantity liquid samples that meet the definition of a limited quantity radioactive material, and may also be corrosive and/or flammable. These samples will be packaged and shipped in accordance with the US Department of Transportation (DOT) regulation 49 CFR 173.4, ''Exceptions for small quantities.'' The Sample Transfer Canister is of a ''French Can'' design, intended to be mated with a glove box for loading/unloading. Transport will typically take place north of the Wye Barricade between WRAP and the 222-S Laboratory. The Sample Transfer Canister will be shipped in an insulated ice chest, but the ice chest will not be a part of the small quantity package during prototype testing

  4. Estimation of Poverty in Small Areas

    Directory of Open Access Journals (Sweden)

    Agne Bikauskaite

    2014-12-01

    A qualitative technique of poverty estimation is needed to better implement, monitor and determine national areas where support is most required. The problem of small area estimation (SAE) is the production of reliable estimates in areas with small samples. The precision of estimates in strata deteriorates (i.e., the precision decreases as the standard deviation increases) if the sample size is smaller. In these cases traditional direct estimators may not be precise and are therefore of little use. Currently there are many indirect methods for SAE. The purpose of this paper is to analyze several different types of techniques which produce small area estimates of poverty.

  5. The Small Heat Shock Protein α-Crystallin B Shows Neuroprotective Properties in a Glaucoma Animal Model

    Directory of Open Access Journals (Sweden)

    Fabian Anders

    2017-11-01

    Glaucoma is a neurodegenerative disease that leads to irreversible retinal ganglion cell (RGC) loss and is one of the main causes of blindness worldwide. The pathogenesis of glaucoma remains unclear, and novel approaches for neuroprotective treatments are urgently needed. Previous studies have revealed significant down-regulation of α-crystallin B as an initial reaction to elevated intraocular pressure (IOP), followed by a clear but delayed up-regulation, suggesting that this small heat-shock protein plays a pathophysiological role in the disease. This study analyzed the neuroprotective effect of α-crystallin B in an experimental animal model of glaucoma. Significant IOP elevation induced by episcleral vein cauterization resulted in a considerable impairment of the RGCs and the retinal nerve fiber layer. An intravitreal injection of α-crystallin B at the time of the IOP increase was able to rescue the RGCs, as measured by a functional photopic electroretinogram, retinal nerve fiber layer thickness, and RGC counts. Mass-spectrometry-based proteomics and antibody-microarray measurements indicated that an α-crystallin injection distinctly up-regulated all of the subclasses (α, β, and γ) of the crystallin protein family. The creation of an interactive protein network revealed clear correlations between individual proteins, which showed a regulatory shift resulting from the crystallin injection. The neuroprotective properties of α-crystallin B further demonstrate the potential importance of crystallin proteins in developing therapeutic options for glaucoma.

  6. Application of In-Segment Multiple Sampling in Object-Based Classification

    Directory of Open Access Journals (Sweden)

    Nataša Đurić

    2014-12-01

    When object-based analysis is applied to very high-resolution imagery, pixels within the segments reveal large spectral inhomogeneity; their distribution can be considered complex rather than normal. When normality is violated, classification methods that rely on the assumption of normally distributed data are not as successful or accurate. It is hard to detect normality violations in small samples. The segmentation process produces segments that vary highly in size; samples can be very big or very small. This paper investigates whether the complexity within the segment can be addressed using multiple random sampling of segment pixels and multiple calculations of similarity measures. In order to analyze the effect sampling has on classification results, the statistics and probability value equations of the non-parametric two-sample Kolmogorov-Smirnov test and the parametric Student's t-test are selected as similarity measures in the classification process. The performance of both classifiers was assessed on a WorldView-2 image for four land cover classes (roads, buildings, grass and trees) and compared to two commonly used object-based classifiers—k-Nearest Neighbor (k-NN) and Support Vector Machine (SVM). Both proposed classifiers showed a slight improvement in the overall classification accuracies and produced more accurate classification maps when compared to the ground truth image.
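
    The core of the approach above is to draw many small random pixel samples from a segment and compare each against a class reference sample with a two-sample test. A minimal sketch of that idea (illustrative Python with synthetic single-band pixel values; the sample size, number of draws and aggregation rule are arbitrary):

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(5)

    # Synthetic single-band pixel values: a reference sample for one class
    # and one spectrally inhomogeneous segment to be classified.
    reference = rng.normal(100.0, 5.0, size=500)
    segment = np.concatenate([rng.normal(101.0, 5.0, size=900),
                              rng.normal(130.0, 8.0, size=100)])   # mixed-in outliers

    # In-segment multiple sampling: repeat the test on small random subsets
    # of the segment instead of using all pixels at once.
    p_values = []
    for _ in range(50):
        subset = rng.choice(segment, size=60, replace=False)
        p_values.append(ks_2samp(subset, reference).pvalue)

    # Aggregate the repeated similarity measures, e.g. by their median.
    print(f"median KS p-value over 50 draws: {np.median(p_values):.3f}")
    ```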

  7. The perception of small crime

    NARCIS (Netherlands)

    Douhou, S.; Magnus, J.R.; van Soest, A.H.O.

    2011-01-01

    In this paper we measure perceptions of incorrect behavior or ‘small crime’, based on a questionnaire administered to a large representative sample from the Dutch population. In the questionnaire we ask the respondents to rate the severity or justifiability of a number of small crimes. We present

  8. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample or a sample where preparation may not be any more complex than dissolution of the sample in a given solvent. The last process alone can remove insoluble materials, which is especially helpful with the samples in complex matrices if other interactions do not affect extraction. Here, it is very likely a large number of components will not dissolve and are, therefore, eliminated by a simple filtration process. In most cases, the process of sample preparation is not as simple as dissolution of the component interest. At times, enrichment is necessary, that is, the component of interest is present in very large volume or mass of material. It needs to be concentrated in some manner so a small volume of the concentrated or enriched sample can be injected into HPLC. 88 refs

  9. Sample preparation techniques of biological material for isotope analysis

    International Nuclear Information System (INIS)

    Axmann, H.; Sebastianelli, A.; Arrillaga, J.L.

    1990-01-01

    Sample preparation is an essential step in all isotope-aided experiments but often it is not given enough attention. The methods of sample preparation are very important to obtain reliable and precise analytical data and for further interpretation of results. The size of a sample required for chemical analysis is usually very small (10 mg-1500 mg). On the other hand, the amount of harvested plant material from plots in a field experiment is often bulky (several kilograms) and the entire sample is too large for processing. In addition, while approaching maturity many crops show not only differences in physical consistency but also a non-uniformity in ¹⁵N content among plant parts, requiring plant fractionation or separation into parts (vegetative and reproductive), e.g. shoots and spikes in the case of small grain cereals, shoots and pods in the case of grain legumes, and tops and roots or beets (including crown) in the case of sugar beet, etc. In any case the ultimate goal of these procedures is to obtain representative subsamples, harvested from greenhouse or field experiments, for chemical analysis. Before harvesting an isotope-aided experiment the method of sampling has to be selected. It should be based on the type of information required in relation to the objectives of the research and the availability of resources (staff, sample preparation equipment, analytical facilities, chemicals and supplies, etc.). 10 refs, 3 figs, 3 tabs

  10. Towards Cost-efficient Sampling Methods

    OpenAIRE

    Peng, Luo; Yongli, Li; Chong, Wu

    2014-01-01

    The sampling method has been paid much attention in the field of complex network in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small part of vertices with high node degree can possess the most structure information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and...

  11. Small sample analysis using sputter atomization/resonance ionization mass spectrometry

    International Nuclear Information System (INIS)

    Christie, W.H.; Goeringer, D.E.

    1986-01-01

    We have used secondary ion mass spectrometry (SIMS) to investigate the emission of ions via argon sputtering from U metal, UO₂, and U₃O₈ samples. We have also used laser resonance ionization techniques to study argon-sputtered neutral atoms and molecules emitted from these same samples. For the case of U metal, a significant enhancement in detection sensitivity for U is obtained via SA/RIMS. For U in the fully oxidized form (U₃O₈), SA/RIMS offers no improvement in U detection sensitivity over conventional SIMS when sputtering with argon. 9 refs., 1 fig., 2 tabs

  12. Raman spectroscopy for forensic examination of β-ketophenethylamine "legal highs": reference and seized samples of cathinone derivatives.

    Science.gov (United States)

    Stewart, Samantha P; Bell, Steven E J; Fletcher, Nicholas C; Bouazzaoui, Samira; Ho, Yen Cheng; Speers, S James; Peters, K Laota

    2012-01-20

    Raman spectra of a representative range of β-ketophenethylamine (β-KP), the rapidly growing family of cathinone-related "legal high" recreational drugs, have been recorded. These spectra showed characteristic changes that were associated with the pattern of substitution on the aromatic rings, for example, the compounds carrying substituents at the 4- position could be distinguished from 3,4-methylenedioxy "ecstasy" derivatives. They also showed small but detectable changes with differences in substitution on the ethylamine substituent. These features allowed the β-KPs present in seized casework samples to be identified. The seized samples typically contained only small amounts of bulking agents, which meant that the band intensities of these components within averaged data were very small. In contrast, grid sampling normally gave at least some spectra which had a higher than average proportion of the bulking agent(s), which allowed them to also be identified. This study therefore demonstrates that Raman spectroscopy can be used both to provide a rapid, non-destructive technique for identification of this class of drugs in seized samples and to detect minor constituents, giving a composition profile which can be used for drugs intelligence work. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  13. Simultaneous extraction and clean-up of polychlorinated biphenyls and their metabolites from small tissue samples using pressurized liquid extraction

    Science.gov (United States)

    Kania-Korwel, Izabela; Zhao, Hongxia; Norstrom, Karin; Li, Xueshu; Hornbuckle, Keri C.; Lehmler, Hans-Joachim

    2008-01-01

    A pressurized liquid extraction-based method for the simultaneous extraction and in situ clean-up of polychlorinated biphenyls (PCBs), hydroxylated (OH)-PCBs and methylsulfonyl (MeSO2)-PCBs from small (< 0.5 gram) tissue samples was developed and validated. Extraction of a laboratory reference material with hexane:dichloromethane:methanol (48:43:9, v/v) and Florisil as fat retainer allowed an efficient recovery of PCBs (78–112%; RSD: 13–37%), OH-PCBs (46±2%; RSD: 4%) and MeSO2-PCBs (89±21%; RSD: 24%). Comparable results were obtained with an established analysis method for PCBs, OH-PCBs and MeSO2-PCBs. PMID:19019378

  14. Number of core samples: Mean concentrations and confidence intervals

    International Nuclear Information System (INIS)

    Jensen, L.; Cromar, R.D.; Wilmarth, S.R.; Heasler, P.G.

    1995-01-01

    This document provides estimates of how well the mean concentration of analytes is known as a function of the number of core samples, composite samples, and replicate analyses. The estimates are based upon core composite data from nine recently sampled single-shell tanks. The results can be used when determining the number of core samples needed to ''characterize'' the waste from similar single-shell tanks. A standard way of expressing uncertainty in the estimate of a mean is with a 95% confidence interval (CI). The authors investigate how the width of a 95% CI on the mean concentration decreases as the number of observations increases. Specifically, the tables and figures show how the relative half-width (RHW) of a 95% CI decreases as the number of core samples increases. The RHW of a CI is a unit-less measure of uncertainty. The general conclusions are as follows: (1) the RHW decreases dramatically as the number of core samples is increased; the decrease is much smaller when the number of composite samples or the number of replicate analyses is increased; (2) if the mean concentration of an analyte needs to be estimated with a small RHW, then a large number of core samples is required. The estimated numbers of core samples given in the tables and figures were determined by specifying different sizes of the RHW. Four nominal sizes were examined: 10%, 25%, 50%, and 100% of the observed mean concentration. For a majority of analytes the number of core samples required to achieve an accuracy within 10% of the mean concentration is extremely large. In many cases, however, two or three core samples are sufficient to achieve a RHW of approximately 50 to 100%. Because many of the analytes in the data have small concentrations, this level of accuracy may be satisfactory for some applications
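
    As a rough illustration of the quantity described above, the following Python sketch computes the relative half-width of a 95% confidence interval on the mean for simulated sets of core-sample concentrations; the mean, relative standard deviation, and sample sizes are hypothetical, not values from the tank data.

```python
import numpy as np
from scipy import stats

def relative_half_width(sample, confidence=0.95):
    """Relative half-width of the CI on the mean: half-width divided by the sample mean."""
    x = np.asarray(sample, dtype=float)
    n = x.size
    t = stats.t.ppf(0.5 + confidence / 2.0, df=n - 1)
    half_width = t * x.std(ddof=1) / np.sqrt(n)
    return half_width / x.mean()

# Toy illustration: hypothetical analyte concentrations from n core samples
rng = np.random.default_rng(1)
for n in (2, 3, 5, 10, 30):
    cores = rng.normal(100.0, 40.0, n)   # mean 100, RSD 40% (assumed)
    print(n, round(100 * relative_half_width(cores), 1), "% RHW")
```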

  15. Small-kernel, constrained least-squares restoration of sampled image data

    Science.gov (United States)

    Hazra, Rajeeb; Park, Stephen K.

    1992-01-01

    Following the work of Park (1989), who extended a derivation of the Wiener filter based on the incomplete discrete/discrete model to a more comprehensive end-to-end continuous/discrete/continuous model, it is shown that a derivation of the constrained least-squares (CLS) filter based on the discrete/discrete model can also be extended to this more comprehensive continuous/discrete/continuous model. This results in an improved CLS restoration filter, which can be efficiently implemented as a small-kernel convolution in the spatial domain.

  16. Magnetization reversal of a Nd-Cu-infiltrated Nd-Fe-B nanocrystalline magnet observed with small-angle neutron scattering

    Energy Technology Data Exchange (ETDEWEB)

    Saito, Kotaro, E-mail: kotaro.saito@kek.jp; Ono, Kanta [Institute of Materials Structure Science, High Energy Accelerator Research Organization, 1-1 Oho, 305-0803 Tsukuba (Japan); Ueno, Tetsuro [Elements Strategy Initiative Center for Magnetic Materials, National Institute for Materials Science, 1-2-1 Sengen, 305-0047 Tsukuba (Japan); Yano, Masao; Shoji, Tetsuya; Sakuma, Noritsugu; Manabe, Akira; Kato, Akira [Toyota Motor Corporation, Toyota, Aichi 471-8571 (Japan); Harada, Masashi [Toyota Central R and D Labs, Inc., Aichi 480-1192 (Japan); Keiderling, Uwe [Helmholtz-Zentrum Berlin für Materialien and Energie, 14109 Berlin (Germany)

    2015-05-07

    The magnetization reversal process of Nd-Fe-B nanocrystalline magnets infiltrated with Nd-Cu alloy was examined using small-angle neutron scattering (SANS). The magnetic-field dependence of SANS intensity revealed a qualitative difference between Nd-Cu-infiltrated samples and as-deformed samples. Insufficient magnetic isolation along the direction perpendicular to the nominal c-axis is expected from comparable SANS intensities for different ranges of q values along this direction. For small q values near the coercivity field, Nd-Cu-infiltrated samples show a noticeable reduction in SANS intensity along the nominal c-axis, which is parallel to the external magnetic field. This indicates less spatial fluctuation of magnetic moments in Nd-Cu-infiltrated samples, owing to magnetically isolated Nd₂Fe₁₄B grains.

  17. Magnetization reversal of a Nd-Cu-infiltrated Nd-Fe-B nanocrystalline magnet observed with small-angle neutron scattering

    International Nuclear Information System (INIS)

    Saito, Kotaro; Ono, Kanta; Ueno, Tetsuro; Yano, Masao; Shoji, Tetsuya; Sakuma, Noritsugu; Manabe, Akira; Kato, Akira; Harada, Masashi; Keiderling, Uwe

    2015-01-01

    The magnetization reversal process of Nd-Fe-B nanocrystalline magnets infiltrated with Nd-Cu alloy was examined using small-angle neutron scattering (SANS). The magnetic-field dependence of SANS intensity revealed a qualitative difference between Nd-Cu-infiltrated samples and as-deformed samples. Insufficient magnetic isolation along the direction perpendicular to the nominal c-axis is expected from comparable SANS intensities for different ranges of q values along this direction. For small q values near the coercivity field, Nd-Cu-infiltrated samples show a noticeable reduction in SANS intensity along the nominal c-axis, which is parallel to the external magnetic field. This indicates less spatial fluctuation of magnetic moments in Nd-Cu-infiltrated samples, owing to magnetically isolated Nd₂Fe₁₄B grains

  18. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication about the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the transition probabilities and steady states to which the estimates converge differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
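
    The paper's MMSE and ML estimators are not reproduced here; the toy two-state sketch below (all numbers assumed) only illustrates the underlying point that transition probabilities recovered from noisy proportion measurements by a naive deterministic/least-squares treatment can scatter widely around the true values.

```python
import numpy as np

rng = np.random.default_rng(2)
P_true = np.array([[0.90, 0.10],
                   [0.05, 0.95]])          # assumed "true" 2-state transition matrix

def estimate_P(x, noise_sd):
    """Naive least-squares estimate of a 2-state transition matrix from noisy proportion
    measurements x[t] (rows sum to 1).  Uses x1[t+1] - x1[t] = -x1[t]*p12 + x2[t]*p21."""
    y = rng.normal(x, noise_sd)                       # noisy FACS-like measurements
    A = np.column_stack([-y[:-1, 0], y[:-1, 1]])
    b = y[1:, 0] - y[:-1, 0]
    p12, p21 = np.clip(np.linalg.lstsq(A, b, rcond=None)[0], 0.0, 1.0)
    return np.array([[1 - p12, p12], [p21, 1 - p21]])

# deterministic evolution of the proportions, then an estimate from one noisy read-out;
# repeated runs show how far the naive estimate can stray from P_true
x = np.empty((50, 2)); x[0] = [0.8, 0.2]
for t in range(49):
    x[t + 1] = x[t] @ P_true
print(estimate_P(x, noise_sd=0.02))
```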

  19. Moessbauer and XRD Comparative Study of Host Rock and Iron Rich Mineral Samples from Paz del Rio Iron Ore Mineral Mine in Colombia

    International Nuclear Information System (INIS)

    Fajardo, M.; Perez Alcazar, G. A.; Moreira, A. M.; Speziali, N. L.

    2004-01-01

    A comparative study between the host rock and the iron-rich mineral samples from the Paz del Rio iron ore mineral mine in Colombia was performed using X-ray diffraction and Moessbauer spectroscopy. Diffraction results of the iron-rich mineral sample show that goethite, hematite, quartz, kaolinite and siderite are the main phases, and that a small amount of illite is also present. By Moessbauer spectroscopy at room temperature (RT), all the above-mentioned phases except quartz were detected, together with a small amount of biotite. The goethite, which appears as four sextets with hyperfine fields of 33.5, 30.5, 27.5 and 18.5 T, respectively, is the majority phase. This result shows the different degrees of formation of this oxyhydroxide. The Moessbauer spectrum of this sample at 80 K presents the same phases obtained at RT without any superparamagnetic effect. In this case the goethite appears as two sextets. Diffraction results of the host rock sample show a large amount of quartz and kaolinite and small amounts of illite and biotite, whereas by Moessbauer spectroscopy illite, kaolinite and biotite were detected.

  20. Determinants of investment decisions in a crisis: Perspective of Croatian small firms

    Directory of Open Access Journals (Sweden)

    Jelena Đurkin

    2015-01-01

    The aim of the paper is to determine the investment activity of small firms in Croatia during the crisis year 2012 and to further analyse the effect of selected factors on capital investment decisions. For this purpose, an on-line survey of small businesses was conducted. The research sample consisted of small firms in Primorsko-Goranska County that earned the highest revenue in 2011, because it was assumed they would have greater investment potential in 2012. Research results show that almost half of the firms invested in new fixed assets in 2012. Their investment decisions were primarily motivated by mere "survival", since the investment activities were mostly oriented toward the replacement of worn-out assets.

  1. Evaluation of Respondent-Driven Sampling

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background: Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods: Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results: We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions: Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling

  2. Evaluation of respondent-driven sampling.

    Science.gov (United States)

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling method, and caution is required
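
    For reference, one commonly used RDS inference method (the Volz-Heckathorn or "RDS-II" estimator) weights each recruit by the inverse of their reported network degree; this sketch is generic and may not match the exact estimators compared in the study.

```python
import numpy as np

def rds_ii_estimate(trait, degree):
    """RDS-II (Volz-Heckathorn) proportion estimator: each recruit is weighted by the
    inverse of his or her reported network degree."""
    trait = np.asarray(trait, dtype=float)    # 1 if the recruit has the trait, else 0
    w = 1.0 / np.asarray(degree, dtype=float)
    return np.sum(w * trait) / np.sum(w)

# e.g. estimated prevalence from a hypothetical RDS sample:
# prev = rds_ii_estimate(hiv_status, network_degree)
```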

  3. Poor Man's Asteroid Sample Return Missions

    Science.gov (United States)

    Landis, R. R.; Graham, L. D.

    2018-02-01

    A cislunar platform at a Near-Rectilinear [Halo] Orbit in the vicinity of the Moon could provide an opportunity for a small NEA sample return mission at relatively low cost. There are a couple of potential dynamical groups of small (~1 m) target objects.

  4. Small Body GN and C Research Report: G-SAMPLE - An In-Flight Dynamical Method for Identifying Sample Mass [External Release Version

    Science.gov (United States)

    Carson, John M., III; Bayard, David S.

    2006-01-01

    G-SAMPLE is an in-flight dynamical method for use by sample collection missions to identify the presence and quantity of collected sample material. The G-SAMPLE method implements a maximum-likelihood estimator to identify the collected sample mass, based on onboard force sensor measurements, thruster firings, and a dynamics model of the spacecraft. With G-SAMPLE, sample mass identification becomes a computation rather than an extra hardware requirement; the added cost of cameras or other sensors for sample mass detection is avoided. Realistic simulation examples are provided for a spacecraft configuration with a sample collection device mounted on the end of an extended boom. In one representative example, a 1000 gram sample mass is estimated to within 110 grams (95% confidence) under realistic assumptions of thruster profile error, spacecraft parameter uncertainty, and sensor noise. For convenience to future mission design, an overall sample-mass estimation error budget is developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
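
    The following sketch is a drastically simplified, one-dimensional version of the idea behind G-SAMPLE, not the flight algorithm (which uses a full spacecraft dynamics model): given force-sensor readings taken during a sequence of thruster firings, the total moved mass is estimated by Gaussian maximum likelihood / least squares, and the collected sample mass is the excess over the known device mass. All numbers are hypothetical.

```python
import numpy as np

def estimate_sample_mass(force_meas, accel, m_device, force_sd):
    """Least-squares (Gaussian ML) estimate of the collected sample mass dm under the
    simplified model F_i = (m_device + dm) * a_i + noise, where a_i are known
    accelerations imposed by thruster firings and F_i are noisy force-sensor readings."""
    F = np.asarray(force_meas, float)
    a = np.asarray(accel, float)
    m_total = np.sum(F * a) / np.sum(a ** 2)      # ML estimate of total moved mass
    sd = force_sd / np.sqrt(np.sum(a ** 2))       # standard error of m_total
    return m_total - m_device, 1.96 * sd          # estimate and approximate 95% half-width

# hypothetical example: a 5 kg collection device holding a 1 kg sample
rng = np.random.default_rng(0)
a = rng.uniform(0.02, 0.05, 500)                  # accelerations (m/s^2) during 500 readings
F = 6.0 * a + rng.normal(0.0, 0.01, a.size)       # noisy force readings (N)
print(estimate_sample_mass(F, a, m_device=5.0, force_sd=0.01))
```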

  5. Miniaturizing 3D assay for high-throughput drug and genetic screens for small patient-derived tumor samples (Conference Presentation)

    Science.gov (United States)

    Rotem, Asaf; Garraway, Levi; Su, Mei-Ju; Basu, Anindita; Regev, Aviv; Struhl, Kevin

    2017-02-01

    Three-dimensional growth conditions reflect the natural environment of cancer cells and are crucial for drug screens. We developed a 3D assay for cellular transformation that involves growth in low attachment (GILA) conditions and is strongly correlated with the 50-year-old benchmark assay, soft agar. Using GILA, we performed high-throughput screens for drugs and genes that selectively inhibit or increase transformation, but not proliferation. This phenotypic approach is complementary to our genetic approach, which utilizes single-cell RNA-sequencing of a patient sample to identify putative oncogenes that confer sensitivity to drugs designed to specifically inhibit the identified oncoprotein. Currently, we are dealing with a major challenge in our field: the limited number of cells that can be extracted from a biopsy. Small patient-derived samples are hard to test in traditional multiwell plates, so it is helpful to minimize the culture area and the experimental system. We managed to design a suitable microfluidic device for limited numbers of cells and to perform the assay using image analysis. We aim to test drugs on tumor cells outside of the patient's body and recommend the ideal treatment tailored to the individual. This device will help to minimize biopsy-sampling volumes and minimize interventions in the patient's tumor.

  6. Bayesian hierarchical models for smoothing in two-phase studies, with application to small area estimation.

    Science.gov (United States)

    Ross, Michelle; Wakefield, Jon

    2015-10-01

    Two-phase study designs are appealing since they allow for the oversampling of rare sub-populations which improves efficiency. In this paper we describe a Bayesian hierarchical model for the analysis of two-phase data. Such a model is particularly appealing in a spatial setting in which random effects are introduced to model between-area variability. In such a situation, one may be interested in estimating regression coefficients or, in the context of small area estimation, in reconstructing the population totals by strata. The efficiency gains of the two-phase sampling scheme are compared to standard approaches using 2011 birth data from the research triangle area of North Carolina. We show that the proposed method can overcome small sample difficulties and improve on existing techniques. We conclude that the two-phase design is an attractive approach for small area estimation.

  7. Is a 'convenience' sample useful for estimating immunization coverage in a small population?

    Science.gov (United States)

    Weir, Jean E; Jones, Carrie

    2008-01-01

    Rapid survey methodologies are widely used for assessing immunization coverage in developing countries, approximating true stratified random sampling. Non-random ('convenience') sampling is not considered appropriate for estimating immunization coverage rates but has the advantages of low cost and expediency. We assessed the validity of a convenience sample of children presenting to a travelling clinic by comparing the coverage rate in the convenience sample to the true coverage established by surveying each child in three villages in rural Papua New Guinea. The rate of DTP immunization coverage as estimated by the convenience sample was within 10% of the true coverage when the proportion of children in the sample was two-thirds or when only children over the age of one year were counted, but differed by 11% when the sample included only 53% of the children and when all eligible children were included. The convenience sample may be sufficiently accurate for reporting purposes and is useful for identifying areas of low coverage.

  8. Health indicators: eliminating bias from convenience sampling estimators.

    Science.gov (United States)

    Hedt, Bethany L; Pagano, Marcello

    2011-02-28

    Public health practitioners are often called upon to make inference about a health indicator for a population at large when the sole available information are data gathered from a convenience sample, such as data gathered on visitors to a clinic. These data may be of the highest quality and quite extensive, but the biases inherent in a convenience sample preclude the legitimate use of powerful inferential tools that are usually associated with a random sample. In general, we know nothing about those who do not visit the clinic beyond the fact that they do not visit the clinic. An alternative is to take a random sample of the population. However, we show that this solution would be wasteful if it excluded the use of available information. Hence, we present a simple annealing methodology that combines a relatively small, and presumably far less expensive, random sample with the convenience sample. This allows us to not only take advantage of powerful inferential tools, but also provides more accurate information than that available from just using data from the random sample alone. Copyright © 2011 John Wiley & Sons, Ltd.
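
    The annealing methodology itself is not reproduced here; as a generic illustration of the idea that a small random sample can be used to correct a large convenience sample, the sketch below post-stratifies hypothetical convenience-sample data using stratum weights learned from a small random sample. All data, strata, and prevalences are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical population: a health indicator whose prevalence differs between two strata
N = 100_000
stratum = rng.integers(0, 2, N)                                   # stratum 0 or 1
y = rng.random(N) < np.where(stratum == 1, 0.30, 0.10)            # indicator of the condition

# Convenience sample heavily over-represents stratum 1 (e.g. clinic visitors)
conv_idx = np.concatenate([
    rng.choice(np.where(stratum == 1)[0], 4000, replace=False),
    rng.choice(np.where(stratum == 0)[0], 1000, replace=False),
])
# Small, presumably expensive random sample used only to learn the stratum weights
rand_idx = rng.choice(N, 200, replace=False)

w = np.bincount(stratum[rand_idx], minlength=2) / rand_idx.size   # estimated stratum shares
strat_means = [y[conv_idx][stratum[conv_idx] == s].mean() for s in (0, 1)]
combined = float(np.dot(w, strat_means))                          # post-stratified estimate

print("truth", y.mean(), "convenience", y[conv_idx].mean(),
      "random only", y[rand_idx].mean(), "combined", combined)
```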

  9. Air sampling with solid phase microextraction

    Science.gov (United States)

    Martos, Perry Anthony

    There is an increasing need for simple yet accurate air sampling methods. The acceptance of new air sampling methods requires compatibility with conventional chromatographic equipment, and the new methods have to be environmentally friendly, simple to use, yet with equal, or better, detection limits, accuracy and precision than standard methods. Solid phase microextraction (SPME) satisfies the conditions for new air sampling methods. Analyte detection limits, accuracy and precision of analysis with SPME are typically better than with any conventional air sampling methods. Yet, air sampling with SPME requires no pumps or solvents, is re-usable, extremely simple to use, completely compatible with current chromatographic equipment, and requires only a small capital investment. The first SPME fiber coating used in this study was poly(dimethylsiloxane) (PDMS), a hydrophobic liquid film, used to sample a large range of airborne hydrocarbons such as benzene and octane. Quantification without an external calibration procedure is possible with this coating. The physical and chemical properties of this coating are well understood and are quite similar to those of the siloxane stationary phase used in capillary columns. The logarithms of analyte distribution coefficients for PDMS are linearly related to chromatographic retention indices and to the inverse of temperature. Therefore, the actual chromatogram from the analysis of the PDMS air sampler will yield the calibration parameters which are used to quantify unknown airborne analyte concentrations (ppbv to ppmv range). The second fiber coating used in this study was PDMS/divinylbenzene (PDMS/DVB), onto which o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine (PFBHA) was adsorbed for the on-fiber derivatization of gaseous formaldehyde (ppbv range), with and without external calibration. The oxime formed from the reaction can be detected with conventional gas chromatographic detectors. Typical grab sampling times were as small as 5 seconds
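
    As a sketch of the calibration-free quantification idea described above (equilibrium absorption into the PDMS coating, with the distribution coefficient predicted from the retention index), the snippet below assumes n = K·Vf·C_air and a hypothetical linear log K versus LTPRI relation; the coefficients and fiber volume are illustrative, not the thesis values.

```python
# Hypothetical calibration constants for a PDMS coating at 25 degC (assumed, not from the thesis)
A, B = 0.0042, -0.19       # log10(K_fs) = A * LTPRI + B
V_F = 0.612e-6             # coating volume in litres (typical 100-um PDMS fiber, assumed)

def air_concentration(mass_ng, ltpri):
    """Airborne concentration (ng/L) from the mass absorbed at equilibrium,
    assuming n = K_fs * V_f * C_air for an absorptive PDMS coating."""
    k_fs = 10 ** (A * ltpri + B)
    return mass_ng / (k_fs * V_F)

# e.g. 25 ng extracted for an analyte with linear retention index 800 (octane-like)
print(air_concentration(25.0, 800))
```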

  10. Sample sizes and model comparison metrics for species distribution models

    Science.gov (United States)

    B.B. Hanberry; H.S. He; D.C. Dey

    2012-01-01

    Species distribution models use small samples to produce continuous distribution maps. The question of how small a sample can be to produce an accurate model generally has been answered based on comparisons to maximum sample sizes of 200 observations or fewer. In addition, model comparisons often are made with the kappa statistic, which has become controversial....
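
    For reference, the kappa statistic mentioned above measures agreement between predicted and observed presences/absences corrected for chance agreement; a minimal implementation (not tied to any particular species distribution modelling package) is:

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a k x k confusion matrix of observed vs. predicted classes."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    p_obs = np.trace(cm) / n                                  # observed agreement
    p_exp = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2    # agreement expected by chance
    return (p_obs - p_exp) / (1.0 - p_exp)

# e.g. 40 true presences, 10 missed presences, 15 false presences, 135 true absences
print(cohens_kappa([[40, 10], [15, 135]]))   # about 0.68
```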

  11. Reliable calculation in probabilistic logic: Accounting for small sample size and model uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ferson, S. [Applied Biomathematics, Setauket, NY (United States)

    1996-12-31

    A variety of practical computational problems arise in risk and safety assessments, forensic statistics and decision analyses in which the probability of some event or proposition E is to be estimated from the probabilities of a finite list of related subevents or propositions F, G, H, .... In practice, the analyst's knowledge may be incomplete in two ways. First, the probabilities of the subevents may be imprecisely known from statistical estimations, perhaps based on very small sample sizes. Second, relationships among the subevents may be known imprecisely. For instance, there may be only limited information about their stochastic dependencies. Representing probability estimates as interval ranges on [0,1] has been suggested as a way to address the first source of imprecision. A suite of AND, OR and NOT operators defined with reference to the classical Fréchet inequalities permits these probability intervals to be used in calculations that address the second source of imprecision, in many cases in a best possible way. Using statistical confidence intervals as inputs unravels the closure properties of this approach, however, requiring that probability estimates be characterized by a nested stack of intervals for all possible levels of statistical confidence, from a point estimate (0% confidence) to the entire unit interval (100% confidence). The corresponding logical operations implied by convolutive application of the logical operators for every possible pair of confidence intervals reduce by symmetry to a manageably simple level-wise iteration. The resulting calculus can be implemented in software that allows users to compute comprehensive and often level-wise best possible bounds on probabilities for logical functions of events.
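
    A minimal sketch of the interval operators described above, using the classical Fréchet bounds (no assumption about dependence between subevents); the nested-stack-of-confidence-levels machinery of the paper is not included here.

```python
def p_and(a, b):
    """Frechet bounds on P(A and B) from probability intervals a = (lo, hi), b = (lo, hi)."""
    return (max(0.0, a[0] + b[0] - 1.0), min(a[1], b[1]))

def p_or(a, b):
    """Frechet bounds on P(A or B)."""
    return (max(a[0], b[0]), min(1.0, a[1] + b[1]))

def p_not(a):
    """Exact interval for P(not A)."""
    return (1.0 - a[1], 1.0 - a[0])

# e.g. two imprecisely known subevents
F = (0.2, 0.4)
G = (0.5, 0.7)
print(p_and(F, G), p_or(F, G), p_not(F))
```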

  12. The study of penetration of energetic ions in botanic samples with transmission measurements

    International Nuclear Information System (INIS)

    Wang, Y.G.; Chen, Q.Z.; Xue, J.M.; Du, G.H.; Qin, H.L; Zhang, W.M.; Yan, S.; Zhao, W.J.

    2006-01-01

    Botanic samples (onion endocuticles, kidney bean slices) were exposed to energetic ions. By recording transmission spectra, we studied the energy loss in such samples. Individual protrusion-like damage produced in a highly oriented pyrolytic graphite (HOPG) substrate allowed us to analyze the mass density of the samples by scanning tunneling microscopy (STM). The experimental results showed that the botanic samples are inhomogeneous in mass density; some incident ions lose only a small part of their energy after passing through a layer of botanic sample. Additionally, about 10⁻⁷ of the incident ions with energies of tens of keV can penetrate through a botanic slice with a thickness of 50 μm. The dynamic change of the transmission spectrum of MeV heavy ions through a layer of botanic slice showed that the penetration ability of the incident ions increases with increasing ion fluence. These experimental results indicate that the inhomogeneity of the mass density of botanic samples and irradiation damage are the main reasons for the ultra-deep penetration of low-energy ions in this kind of botanic sample

  13. Detailed characterization of welding fumes in personal exposure samples

    International Nuclear Information System (INIS)

    Quémerais, B; Mino, James; Amin, M R; Golshahi, H; Izadi, H

    2015-01-01

    The objective of the project was to develop a method allowing for detailed characterization of welding particles, including particle number concentration, size distribution, surface chemistry and chemical composition of individual particles, as well as metal concentrations of various welding fumes in personal exposure samples using regular sampling equipment. A sampling strategy was developed to evaluate the variation of the collection methods on mass concentration. Samples were collected with various samplers and filters at two different locations using our collection system. The first location used a robotic welding system while the second involved manual welding. Collected samples were analysed for mass concentration using gravimetry and for metal concentration using ICP/OES. More advanced analysis was performed on selected filters using X-ray photoelectron spectroscopy to determine the surface composition of the particles, and X-ray diffraction to determine the chemical composition of the fumes. Results showed that the robotic system exhibited a lot of spatial variation when the collection system was located close to the weld. Collection efficiency was found to be quite variable depending upon the type of filter. As well, metal concentrations in blank filters were dependent upon the type of filter, with MCE presenting the highest blank values. Results obtained with the XRD and XPS systems showed that it was possible to analyse a small amount of powdered welding fume sample, but results on filters were not conclusive. (paper)

  14. Thermal neutron absorption cross-section for small samples (experiments in cylindrical geometry)

    International Nuclear Information System (INIS)

    Czubek, J.A.; Drozdowicz, K.; Igielski, A.; Krynicka-Drozdowicz, E.; Woznicka, U.

    1982-01-01

    Measurement results for the thermal neutron macroscopic absorption cross-section Σₐ obtained with the cylindrical sample-moderator system are presented. Experiments for liquid (water solutions of H₃BO₃) and solid (crushed basalts) samples are reported. Solid samples have been saturated with the H₃BO₃ ''poisoning'' solution. The accuracy obtained for the determination of the absorption cross-section of the solid material was σ(Σₘₐ) = 1.2-2.2 c.u. in the case when porosity was measured with an accuracy of σ(φ) = 0.001-0.002. The dispersion of the Σₘₐ data obtained for basalts (taken from different quarries) was higher than the accuracy of the measurement. All experimental data for the fundamental decay constants λ₀, together with the whole information about the samples, are given. (author)

  15. Clostridium botulinum Spores Found in Honey from Small Apiaries in Poland

    Directory of Open Access Journals (Sweden)

    Wojtacka Joanna

    2016-12-01

    A total of 102 honey samples collected from small apiaries (≤ 20 hives) in Poland were analysed for the presence of Clostridium botulinum spores. The samples were prepared using the dilution centrifugation method and cultured in parallel in cooked meat medium (CMM) and trypticase peptone glucose yeast (TPGY) enrichment broths. Identification of toxin types A, B, and E of Clostridium botulinum strains was performed with the use of the multiplex PCR method. Positive samples were also subjected to quantitative analysis with the use of Clostridium botulinum Isolation Agar Base (CBAB). The prevalence analysis showed 22 (21.6%) samples contaminated with C. botulinum spores. The major serotype detected was botulinum neurotoxin type A, found in 16 (72.7%) samples, whereas type B was found in 3 (13.6%) honey samples and type E also in only 3 (13.6%) honey samples. Dual-toxin-producing strains were noted. The average quantity of spores in C. botulinum PCR-positive samples was 190 per gram of honey.

  16. On the surveillance for animal diseases in small herds

    DEFF Research Database (Denmark)

    Greiner, Matthias; Dekker, Aldo

    2005-01-01

    Small herds may present a problem in surveillance for infectious animal diseases because typical levels of a within-herd design prevalence are not directly applicable. We suggest a definition of small herds as those smaller than 2/(within-herd design prevalence) on the basis that such herds would...... be expected to have less than two (i.e. only one) infected animals. Consequently, the probability of detecting small herds cannot be improved by choosing a larger sample size within the herd. We derive necessary sample sizes of herds and the probability ("confidence") of detecting disease within a stratum...... conservative (lower) estimates of the confidence for a given sample size and should therefore be preferred....
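
    As a standard illustration of the kind of calculation involved (not necessarily the exact formulation in the paper), the herd-level probability of detecting disease when d of N animals are infected and n are sampled can be computed from the hypergeometric distribution; for a "small herd" in the sense defined above, d = 1 and the detection probability is capped at (n/N)·Se no matter how large the within-herd sample is made.

```python
from math import comb

def herd_sensitivity(N, n, d, se=1.0):
    """P(at least one test-positive animal) when n of N animals are tested,
    d animals are truly infected and the test sensitivity is se (perfect specificity assumed)."""
    p_detect = 0.0
    for k in range(0, min(n, d) + 1):
        p_k = comb(d, k) * comb(N - d, n - k) / comb(N, n)   # P(k infected animals in the sample)
        p_detect += p_k * (1.0 - (1.0 - se) ** k)
    return p_detect

# Design prevalence 10% -> herds with N < 2/0.10 = 20 are expected to hold only one infected animal
print(herd_sensitivity(N=15, n=15, d=1, se=0.9))   # testing the whole herd still gives only 0.9
print(herd_sensitivity(N=15, n=8, d=1, se=0.9))    # roughly 0.48
```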

  17. A Bayesian Method for Weighted Sampling

    OpenAIRE

    Lo, Albert Y.

    1993-01-01

    Bayesian statistical inference for sampling from weighted distribution models is studied. Small-sample Bayesian bootstrap clone (BBC) approximations to the posterior distribution are discussed. A second-order property for the BBC in unweighted i.i.d. sampling is given. A consequence is that BBC approximations to a posterior distribution of the mean and to the sampling distribution of the sample average, can be made asymptotically accurate by a proper choice of the random variables that genera...

  18. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    Science.gov (United States)

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  19. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    Science.gov (United States)

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  20. Evaluating the use of gel-based sub-sampling for assessing responses of terrestrial microarthropods (Collembola and Acari) to different slurry applications and organic matter contents

    NARCIS (Netherlands)

    Jagers op Akkerhuis, G.A.J.M.; Dimmers, W.J.; Vliet, van P.C.J.; Goedhart, P.W.; Martakis, G.F.P.; Goede, de R.G.M.

    2008-01-01

    Microarthropods show considerable small-scale spatial variation, which on statistical grounds demands the use of numerous samples per experimental unit to obtain a representative density estimate. To avoid the identification of all individuals in these samples a sub-sampling technique is desirable.

  1. Detection of Small Numbers of Campylobacter jejuni and Campylobacter coli Cells in Environmental Water, Sewage, and Food Samples by a Seminested PCR Assay

    Science.gov (United States)

    Waage, Astrid S.; Vardund, Traute; Lund, Vidar; Kapperud, Georg

    1999-01-01

    A rapid and sensitive assay was developed for detection of small numbers of Campylobacter jejuni and Campylobacter coli cells in environmental water, sewage, and food samples. Water and sewage samples were filtered, and the filters were enriched overnight in a nonselective medium. The enrichment cultures were prepared for PCR by a rapid and simple procedure consisting of centrifugation, proteinase K treatment, and boiling. A seminested PCR based on specific amplification of the intergenic sequence between the two Campylobacter flagellin genes, flaA and flaB, was performed, and the PCR products were visualized by agarose gel electrophoresis. The assay allowed us to detect 3 to 15 CFU of C. jejuni per 100 ml in water samples containing a background flora consisting of up to 8,700 heterotrophic organisms per ml and 10,000 CFU of coliform bacteria per 100 ml. Dilution of the enriched cultures 1:10 with sterile broth prior to the PCR was sometimes necessary to obtain positive results. The assay was also conducted with food samples analyzed with or without overnight enrichment. As few as ≤3 CFU per g of food could be detected with samples subjected to overnight enrichment, while variable results were obtained for samples analyzed without prior enrichment. This rapid and sensitive nested PCR assay provides a useful tool for specific detection of C. jejuni or C. coli in drinking water, as well as environmental water, sewage, and food samples containing high levels of background organisms. PMID:10103261

  2. The Brine Sampling and Evaluation Program (BSEP) at WIPP

    International Nuclear Information System (INIS)

    Deal, D.E.; Roggenthen, W.M.

    1989-01-01

    The Permian salt beds of the WIPP facility are virtually dry. The amount of water present in the rocks exposed in the excavations that is free to migrate under pressure gradients was estimated by heating salt samples to 95 degrees C and measuring weight loss. Clear halite contains about 0.22 weight percent water and the more argillaceous units average about 0.75 percent. Measurements made since 1984 as part of the Brine Sampling and Evaluation Program (BSEP) indicate that small amounts of this brine can migrate into the excavations and do accumulate in the underground environment. Monitoring of drillholes since they were drilled shows that brine seepage decreases with time and that many holes have dried up entirely. Weeping of brine from the walls of the repository excavations also decreases after two or more years. Chemical analyses of the brines show that they are sodium-chloride saturated and magnesium-rich

  3. Quantification of Protozoa and Viruses from Small Water Volumes

    Directory of Open Access Journals (Sweden)

    J. Alfredo Bonilla

    2015-06-01

    Large sample volumes are traditionally required for the analysis of waterborne pathogens. The need for large volumes greatly limits the number of samples that can be processed. The aims of this study were to compare extraction and detection procedures for quantifying protozoan parasites and viruses from small volumes of marine water. The intent was to evaluate a logistically simpler method of sample collection and processing that would facilitate direct pathogen measures as part of routine monitoring programs. Samples were collected simultaneously using a bilayer device, with protozoa captured by size (top filter) and viruses captured by charge (bottom filter). Protozoan detection technologies utilized for recovery of Cryptosporidium spp. and Giardia spp. were qPCR and the more traditional immunomagnetic separation with IFA-microscopy, while virus (poliovirus) detection was based upon qPCR versus plaque assay. Filters were eluted using reagents consistent with the downstream detection technologies. Results showed higher mean recoveries using traditional detection methods over qPCR for Cryptosporidium (91% vs. 45%) and poliovirus (67% vs. 55%), whereas for Giardia the qPCR-based methods were characterized by higher mean recoveries (41% vs. 28%). Overall mean recoveries are considered high for all detection technologies. Results suggest that simultaneous filtration may be suitable for isolating different classes of pathogens from small marine water volumes. More research is needed to evaluate the suitability of this method for detecting pathogens at low ambient concentration levels.

  4. Advanced path sampling of the kinetic network of small proteins

    NARCIS (Netherlands)

    Du, W.

    2014-01-01

    This thesis is focused on developing advanced path sampling simulation methods to study protein folding and unfolding, and to build kinetic equilibrium networks describing these processes. In Chapter 1 the basic knowledge of protein structure and folding theories were introduced and a brief overview

  5. Spectroelectrochemical Sensing Based on Multimode Selectivity simultaneously Achievable in a Single Device. 11. Design and Evaluation of a Small Portable Sensor for the Determination of Ferrocyanide in Hanford Waste Samples

    International Nuclear Information System (INIS)

    Stegemiller, Michael L.; Heineman, William R.; Seliskar, Carl J.; Ridgway, Thomas H.; Bryan, Samuel A.; Hubler, Timothy L.; Sell, Richard L.

    2003-01-01

    Spectroelectrochemical sensing based on multimode selectivity simultaneously achievable in a single device. 11. Design and evaluation of a small portable sensor for the determination of ferrocyanide in Hanford waste samples

  6. Adopting Advertising and Communication Innovations in Small Firms

    Directory of Open Access Journals (Sweden)

    Fabricia Durieux Zucco

    2017-03-01

    When small firms adopt innovations for market penetration, they need to consider the effectiveness of their communication channels. In this study we investigate the adoption of communication and advertising innovations, namely the use of virtual social networks, in a sample of 227 small firms in the Brazilian countryside. We analyse how distinct communication objectives of a firm influence the choice of using virtual social networks to build links between firms and customers. We also look at the role of paid advertising in virtual social networks. The main results show that such firms tend to engage in virtual social networks when they are targeting a stronger brand image and when building deeper links with their public through brand information. Firms also tend to use paid advertising in virtual social networks when intending to raise brand recall for their brand portfolio.

  7. Small-sized reverberation chamber for the measurement of sound absorption

    International Nuclear Information System (INIS)

    Rey, R. del; Alba, J.; Bertó, L.; Gregori, A.

    2017-01-01

    This paper presents the design, construction, calibration and automation of a reverberation chamber for small samples. A balance has been sought between reducing the sample size, to reduce the manufacturing costs of materials, and finding the appropriate volume of the chamber, to obtain reliable values at high and mid frequencies. The small-sized reverberation chamber that was built has a volume of 1.12 m³ and allows for the testing of samples of 0.3 m². By using diffusers to improve the degree of diffusion, and by automating measurements, we were able to improve the reliability of the results, thus reducing test errors. Several comparison studies of the measurements of the small-sized reverberation chamber and the standardised reverberation chamber are shown, and a good degree of agreement can be seen between them within the range of valid frequencies. This paper presents a small laboratory for comparing samples and making decisions before manufacturing larger sizes.
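
    For context, the absorption coefficient in a reverberation chamber is normally obtained from reverberation times measured with and without the specimen (an ISO 354-style Sabine calculation); the sketch below uses the chamber volume and sample area quoted in the abstract, while the reverberation times are invented for illustration.

```python
def absorption_coefficient(V, S_sample, T_empty, T_sample, c=343.0):
    """Random-incidence absorption coefficient from reverberation times measured in the
    empty chamber (T_empty) and with the sample installed (T_sample); air absorption neglected."""
    A_empty = 55.3 * V / (c * T_empty)     # equivalent absorption area, empty room (m^2)
    A_with = 55.3 * V / (c * T_sample)     # equivalent absorption area with the test specimen
    return (A_with - A_empty) / S_sample

# Illustrative numbers for the small chamber described above (reverberation times assumed)
print(absorption_coefficient(V=1.12, S_sample=0.3, T_empty=1.8, T_sample=0.9))
```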

  8. Introducing the religio-centric positional advantage to Indonesian small businesses

    Directory of Open Access Journals (Sweden)

    Hendar Hendar

    2017-03-01

    With a focus on small religion-based businesses in Indonesia, this research examines whether marketing innovativeness, customer responsiveness (CuR) and competitor responsiveness (CoR) can improve marketing performance. A conceptual model of the correlation of these three variables with religio-centric positional advantage and marketing performance is examined using a structural equation model. For this purpose, 335 small Islamic fashion businesses were studied using purposive sampling from 11 regencies/cities in Central Java (Indonesia). The results showed that (1) small companies in these religion-based market segments are likely to have better marketing performance when they have a religio-centric positional advantage, (2) increases in marketing innovativeness, CuR and CoR are required to improve and maintain religio-centric positional advantage, and (3) religio-centric positional advantage is definitely a mediator in the correlation of marketing innovativeness, customer responsiveness and competitor responsiveness with marketing performance.

  9. EFFECT OF MARKET ORIENTATION ON SMALL BUSINESS PERORMANCE IN SMALL TOWN IN MALAYSIA: AN EMPRICAL STUDY ON MALAYSIAN SMALL FIRMS

    Directory of Open Access Journals (Sweden)

    Muhammad Masroor ALAM

    2010-01-01

    Most research on market orientation and performance has been related to big firms. In this study, based on the theoretical framework, a model was developed to investigate the effect of market orientation on business performance in small firms. To test the relationships among the variables, data from 53 small firms in the small town of Chunglun at Sintok, Kedah were used. The findings show that the three components of market orientation are positively related to the business performance of small firms. Further analysis also confirmed that customer orientation and competitor orientation are strong predictors of small firm performance. The findings of this study confirm that market orientation behavior also applies to small firms.

  10. Radionuclide contaminant analysis of small mammals at Area G, TA-54, 1994

    International Nuclear Information System (INIS)

    Biggs, J.; Bennett, K.; Fresquez, P.

    1995-09-01

    Small mammals were sampled at two waste burial sites (1 and 2) at Area G, TA-54, and a control site outside Area G (Site 3) to identify radionuclides that are present within surface and subsurface soils at waste burial sites, to compare the amount of radionuclide uptake by small mammals at waste burial sites to that at a control site, and to identify the primary mode of contamination to small mammals, either through surface contact or ingestion/inhalation. Three composite samples of at least five animals per sample were collected at each site. Pelts and carcasses of each animal were separated and analyzed independently. Samples were analyzed for ²⁴¹Am, ⁹⁰Sr, ²³⁸Pu, ²³⁹Pu, total U, and by gamma spectroscopy (including ¹³⁷Cs). Significantly higher (parametric t-test at p = 0.05) levels of total U, ²⁴¹Am, ²³⁸Pu, ²³⁹Pu, and ⁴⁰K were detected in pelts as compared to the carcasses of small mammals at TA-54. Concentrations of other measured radionuclides in carcasses were nearly equal to or exceeded the mean concentrations in the pelts. The authors' results show higher concentrations in pelts compared to carcasses, which is similar to what has been found at waste burial/contaminated sites outside of Los Alamos National Laboratory. Site 1 had significantly higher (alpha = 0.05, F = 0.0095) total U concentrations in carcasses than Sites 2 and 3. Site 2 had significantly higher (alpha = 0.05, F = 0.0195) ²³⁹Pu concentrations in carcasses than either Site 1 or Site 3. A significant difference in ⁹⁰Sr concentration existed between Sites 1 and 2 (alpha = 0.05, F = 0.0681), and concentrations of ⁴⁰K at Site 1 were significantly different from Site 3

  11. Drone Transport of Chemistry and Hematology Samples Over Long Distances.

    Science.gov (United States)

    Amukele, Timothy K; Hernandez, James; Snozek, Christine L H; Wyatt, Ryan G; Douglas, Matthew; Amini, Richard; Street, Jeff

    2017-11-02

    We addressed the stability of biological samples in prolonged drone flights by obtaining paired chemistry and hematology samples from 21 adult volunteers in a single phlebotomy event (84 samples in total). Half of the samples were held stationary, while the other samples were flown for 3 hours (258 km) in a custom active cooling box mounted on the drone. After the flight, 19 chemistry and hematology tests were performed. Seventeen analytes had small or no bias, but glucose and potassium in flown samples showed an 8% and 6.2% bias, respectively. The flown samples (mean, 24.8°C) were a mean of 2.5°C cooler than the stationary samples (mean, 27.3°C) during transportation to the flight field as well as during the flight. The changes in glucose and potassium are consistent with the magnitude and duration of the temperature difference between the flown and stationary samples. Long drone flights of biological samples are feasible but require stringent environmental controls to ensure consistent results. © American Society for Clinical Pathology, 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  12. Probing the Small-scale Structure in Strongly Lensed Systems via Transdimensional Inference

    Science.gov (United States)

    Daylan, Tansu; Cyr-Racine, Francis-Yan; Diaz Rivero, Ana; Dvorkin, Cora; Finkbeiner, Douglas P.

    2018-02-01

    Strong lensing is a sensitive probe of the small-scale density fluctuations in the Universe. We implement a pipeline to model strongly lensed systems using probabilistic cataloging, which is a transdimensional, hierarchical, and Bayesian framework to sample from a metamodel (union of models with different dimensionality) consistent with observed photon count maps. Probabilistic cataloging allows one to robustly characterize modeling covariances within and across lens models with different numbers of subhalos. Unlike traditional cataloging of subhalos, it does not require model subhalos to improve the goodness of fit above the detection threshold. Instead, it allows the exploitation of all information contained in the photon count maps—for instance, when constraining the subhalo mass function. We further show that, by not including these small subhalos in the lens model, fixed-dimensional inference methods can significantly mismodel the data. Using a simulated Hubble Space Telescope data set, we show that the subhalo mass function can be probed even when many subhalos in the sample catalogs are individually below the detection threshold and would be absent in a traditional catalog. The implemented software, Probabilistic Cataloger (PCAT) is made publicly available at https://github.com/tdaylan/pcat.

  13. Molecular ecology of Listeria monocytogenes and other Listeria species in small and very small ready-to-eat meat processing plants.

    Science.gov (United States)

    Williams, Shanna K; Roof, Sherry; Boyle, Elizabeth A; Burson, Dennis; Thippareddi, Harshavardhan; Geornaras, Ifigenia; Sofos, John N; Wiedmann, Martin; Nightingale, Kendra

    2011-01-01

    A longitudinal study was conducted to track Listeria contamination patterns in ready-to-eat meats from six small or very small meat processing plants located in three states over 1 year. A total of 688 environmental sponge samples were collected from nonfood contact surfaces during bimonthly visits to each plant. Overall, L. monocytogenes was isolated from 42 (6.1%) environmental samples, and its prevalence ranged from 1.7 to 10.8% across different plants. Listeria spp., other than L. monocytogenes, were isolated from 9.5% of samples overall, with the prevalence ranging from 1.5 to 18.3% across different plants. The prevalence of L. monocytogenes correlated well with that of other Listeria spp. for some but not all plants. One L. monocytogenes isolate representing each positive sample was characterized by molecular serotyping, EcoRI ribotyping, and pulsed-field gel electrophoresis typing. Seven sample sites tested positive for L. monocytogenes on more than one occasion, and the same ribotype was detected more than once at five of these sites. Partial sigB sequencing was used to speciate other Listeria spp. isolates and assign an allelic type to each isolate. Other Listeria spp. were isolated more than once from 14 sample sites, and the same sigB allelic type was recovered at least twice from seven of these sites. One plant was colonized by an atypical hemolytic L. innocua strain. Our findings indicate that small and very small meat processing plants that produce ready-to-eat meat products are characterized by a varied prevalence of Listeria, inconsistent correlation between contamination by L. monocytogenes and other Listeria spp., and a unique Listeria molecular ecology.

  14. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine if analytical bias is present when comparing methods, the use of commutable samples, or samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effect. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt to determine not only country specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that of the 27 general chemistry analytes studied, 19 showed sufficiently small between method biases as to not prevent harmonisation of reference intervals. Application of evidence based approaches including the determination of analytical bias using commutable material is necessary when seeking to harmonise reference intervals.

  15. The IGF1 small dog haplotype is derived from Middle Eastern grey wolves

    Directory of Open Access Journals (Sweden)

    Ostrander Elaine A

    2010-02-01

    Background: A selective sweep containing the insulin-like growth factor 1 (IGF1) gene is associated with size variation in domestic dogs. Intron 2 of IGF1 contains a SINE element and single nucleotide polymorphism (SNP) found in all small dog breeds that is almost entirely absent from large breeds. In this study, we surveyed a large sample of grey wolf populations to better understand the ancestral pattern of variation at IGF1, with a particular focus on the distribution of the small dog haplotype and its relationship to the origin of the dog. Results: We present DNA sequence data that confirm the absence of the derived small SNP allele in the intron 2 region of IGF1 in a large sample of grey wolves and further establish the absence of a small dog associated SINE element in all wild canids and most large dog breeds. Grey wolf haplotypes from the Middle East have higher nucleotide diversity, suggesting an origin there. Additionally, PCA and phylogenetic analyses suggest a closer kinship of the small domestic dog IGF1 haplotype with those from Middle Eastern grey wolves. Conclusions: The absence of both the SINE element and SNP allele in grey wolves suggests that the mutation for small body size post-dates the domestication of dogs. However, because all small dogs possess these diagnostic mutations, the mutations likely arose early in the history of domestic dogs. Our results show that the small dog haplotype is closely related to those in Middle Eastern wolves and is consistent with an ancient origin of the small dog haplotype there. Thus, in concordance with past archeological studies, our molecular analysis is consistent with the early evolution of small size in dogs from the Middle East. See associated opinion by Driscoll and Macdonald: http://jbiol.com/content/9/2/10

  16. Effect of Small Numbers of Test Results on Accuracy of Hoek-Brown Strength Parameter Estimations: A Statistical Simulation Study

    Science.gov (United States)

    Bozorgzadeh, Nezam; Yanagimura, Yoko; Harrison, John P.

    2017-12-01

    The Hoek-Brown (H-B) empirical strength criterion for intact rock is widely used as the basis for estimating the strength of rock masses. Estimates of the intact rock H-B parameters, namely the empirical constant m and the uniaxial compressive strength σc, are commonly obtained by fitting the criterion to triaxial strength data sets of small sample size. This paper investigates how such small sample sizes affect the uncertainty associated with the H-B parameter estimations. We use Monte Carlo (MC) simulation to generate data sets of different sizes and different combinations of H-B parameters, and then investigate the uncertainty in H-B parameters estimated from these limited data sets. We show that the uncertainties depend not only on the level of variability but also on the particular combination of parameters being investigated. As particular combinations of H-B parameters can informally be considered to represent specific rock types, we argue that the minimum number of required samples depends on rock type and should be chosen to meet an acceptable level of uncertainty in the estimations. Also, a comparison of the results from our analysis with actual rock strength data shows that the probability of obtaining reliable strength parameter estimations using small samples may be very low. We further discuss the impact of this on the ongoing implementation of reliability-based design protocols and conclude with suggestions for improvements in this respect.
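
    As a rough, hedged illustration of the kind of Monte Carlo experiment described above (not the authors' code), the Python sketch below repeatedly simulates small triaxial data sets from an assumed "true" Hoek-Brown intact-rock criterion (s = 1, a = 0.5) with hypothetical parameters σc = 100 MPa and m = 10 and an arbitrary strength scatter, refits the criterion by nonlinear least squares, and reports the spread of the resulting estimates.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)

def hoek_brown(sigma3, sigma_c, m):
    # Hoek-Brown criterion for intact rock (s = 1, a = 0.5):
    # sigma1 = sigma3 + sigma_c * sqrt(m * sigma3 / sigma_c + 1)
    return sigma3 + sigma_c * np.sqrt(m * sigma3 / sigma_c + 1.0)

# Hypothetical "true" parameters and test conditions (illustrative only)
sigma_c_true, m_true = 100.0, 10.0        # MPa, dimensionless
n_specimens, noise_sd = 5, 10.0           # small triaxial data set, MPa scatter

estimates = []
for _ in range(2000):
    sigma3 = rng.uniform(0.0, 30.0, n_specimens)
    sigma1 = (hoek_brown(sigma3, sigma_c_true, m_true)
              + rng.normal(0.0, noise_sd, n_specimens))
    popt, _ = curve_fit(hoek_brown, sigma3, sigma1, p0=[80.0, 8.0],
                        bounds=([1.0, 1.0], [500.0, 50.0]))
    estimates.append(popt)

estimates = np.array(estimates)
print("sigma_c estimates: mean %.1f MPa, sd %.1f MPa"
      % (estimates[:, 0].mean(), estimates[:, 0].std()))
print("m estimates:       mean %.1f,     sd %.1f"
      % (estimates[:, 1].mean(), estimates[:, 1].std()))
```

    Increasing n_specimens in the sketch narrows the spread of both estimates, which is the sample-size effect the paper quantifies.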

  17. Extraction of network topology from multi-electrode recordings: Is there a small-world effect?

    Directory of Open Access Journals (Sweden)

    Felipe Gerhard

    2011-02-01

    Full Text Available The simultaneous recording of the activity of many neurons poses challenges for multivariate data analysis. Here, we propose a general scheme for reconstructing the functional network from spike train recordings. Effective, causal interactions are estimated by fitting Generalized Linear Models (GLMs) on the neural responses, incorporating effects of the neurons' self-history, of input from other neurons in the recorded network and of modulation by an external stimulus. The coupling terms arising from synaptic input can be transformed by thresholding into a binary, directed connectivity matrix. Each link between two neurons represents a causal influence from one neuron to the other, given the observation of all other neurons from the population. The resulting graph is analyzed with respect to small-world and scale-free properties using quantitative measures for directed networks. Such graph-theoretic analyses have been performed on many complex dynamic networks, including the connectivity structure between different brain areas. Only a few studies have attempted to look at the structure of cortical neural networks on the level of individual neurons. Here, using multi-electrode recordings from the visual system of the awake monkey, we find that cortical networks lack scale-free behavior, but show a small yet significant small-world structure. Assuming a simple distance-dependent probabilistic wiring between neurons, we find that this connectivity structure can account for all of the networks' observed small-world-ness. Moreover, for multi-electrode recordings the sampling of neurons is not uniform across the population. We show that the small-world-ness obtained by such localized sub-sampling overestimates the strength of the true small-world structure of the network. This bias is likely to be present in all previous experiments based on multi-electrode recordings.
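
    As a hedged sketch of the thresholding step described above, the snippet below turns a coupling matrix into a binary, directed graph and computes a directed clustering measure; the random matrix stands in for fitted GLM coupling filters, and the cut-off value is an arbitrary illustration rather than the study's criterion.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)

# Hypothetical coupling matrix J[i, j]: influence of neuron j on neuron i.
# In the study these values come from fitted GLM coupling filters; here
# they are random numbers used purely for illustration.
n_neurons = 50
J = rng.normal(0.0, 1.0, (n_neurons, n_neurons))
np.fill_diagonal(J, 0.0)                  # self-history terms are kept separate

# Threshold the couplings into a binary, directed connectivity matrix
threshold = 2.0                           # arbitrary cut-off on |coupling|
A = (np.abs(J) > threshold).astype(int)

G = nx.from_numpy_array(A, create_using=nx.DiGraph)
print("edges:", G.number_of_edges())
print("mean directed clustering:", round(nx.average_clustering(G), 3))
```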

  18. Rate-distortion optimization for compressive video sampling

    Science.gov (United States)

    Liu, Ying; Vijayanagar, Krishna R.; Kim, Joohee

    2014-05-01

    The recently introduced compressed sensing (CS) framework enables low complexity video acquisition via sub-Nyquist rate sampling. In practice, the resulting CS samples are quantized and indexed by finitely many bits (bit-depth) for transmission. In applications where the bit-budget for video transmission is constrained, rate-distortion optimization (RDO) is essential for quality video reconstruction. In this work, we develop a double-level RDO scheme for compressive video sampling, where frame-level RDO is performed by adaptively allocating the fixed bit-budget per frame to each video block based on block-sparsity, and block-level RDO is performed by modelling the block reconstruction peak-signal-to-noise ratio (PSNR) as a quadratic function of quantization bit-depth. The optimal bit-depth and the number of CS samples are then obtained by setting the first derivative of the function to zero. In the experimental studies the model parameters are initialized with a small set of training data, which are then updated with local information in the model testing stage. Simulation results presented herein show that the proposed double-level RDO significantly enhances the reconstruction quality for a bit-budget constrained CS video transmission system.
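
    The block-level step described above amounts to maximising a fitted quadratic in the quantization bit-depth under a fixed per-block bit budget. The sketch below shows that calculation with made-up quadratic coefficients (in the paper they are fitted from training data); the function name and budget are illustrative, not the authors' API.

```python
import numpy as np

def optimal_bit_depth(a, b, c, bit_budget, depth_range=(2, 12)):
    # PSNR is modelled as a concave quadratic in the bit-depth d:
    #   PSNR(d) = a*d**2 + b*d + c,  with a < 0.
    # The stationary point of the quadratic (dPSNR/dd = 2*a*d + b = 0)
    # gives the candidate optimum, clipped to an admissible range; the
    # number of CS samples then follows from the fixed bit budget.
    d_star = -b / (2.0 * a)
    d_star = int(round(float(np.clip(d_star, *depth_range))))
    n_samples = bit_budget // d_star
    return d_star, n_samples

# Hypothetical coefficients for one block and a 4096-bit block budget
print(optimal_bit_depth(a=-0.15, b=2.4, c=18.0, bit_budget=4096))   # (8, 512)
```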

  19. Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures

    Science.gov (United States)

    Prodromou, Theodosia

    2016-01-01

    In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…

  20. The impact of sample size on the reproducibility of voxel-based lesion-deficit mappings.

    Science.gov (United States)

    Lorca-Puls, Diego L; Gajardo-Vidal, Andrea; White, Jitrachote; Seghier, Mohamed L; Leff, Alexander P; Green, David W; Crinion, Jenny T; Ludersdorfer, Philipp; Hope, Thomas M H; Bowman, Howard; Price, Cathy J

    2018-07-01

    This study investigated how sample size affects the reproducibility of findings from univariate voxel-based lesion-deficit analyses (e.g., voxel-based lesion-symptom mapping and voxel-based morphometry). Our effect of interest was the strength of the mapping between brain damage and speech articulation difficulties, as measured in terms of the proportion of variance explained. First, we identified a region of interest by searching on a voxel-by-voxel basis for brain areas where greater lesion load was associated with poorer speech articulation using a large sample of 360 right-handed English-speaking stroke survivors. We then randomly drew thousands of bootstrap samples from this data set that included either 30, 60, 90, 120, 180, or 360 patients. For each resample, we recorded effect size estimates and p values after conducting exactly the same lesion-deficit analysis within the previously identified region of interest and holding all procedures constant. The results show (1) how often small effect sizes in a heterogeneous population fail to be detected; (2) how effect size and its statistical significance vary with sample size; (3) how low-powered studies (due to small sample sizes) can greatly over-estimate as well as under-estimate effect sizes; and (4) how large sample sizes (N ≥ 90) can yield highly significant p values even when effect sizes are so small that they become trivial in practical terms. The implications of these findings for interpreting the results from univariate voxel-based lesion-deficit analyses are discussed. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
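
    A hedged sketch of the resampling logic, with synthetic data standing in for the patient cohort (the lesion-load/score relationship, the noise level, and the use of a simple Pearson correlation are all assumptions; the study's actual region-of-interest analysis is more involved):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-in for the data set: lesion load vs. articulation score
# for 360 "patients" with a weak true association (illustrative values).
n_total = 360
lesion_load = rng.uniform(0.0, 1.0, n_total)
score = -0.3 * lesion_load + rng.normal(0.0, 1.0, n_total)

def resample_effect(n, n_boot=2000):
    # Draw bootstrap samples of size n and record the variance explained
    # (r squared) and the p value of the lesion-deficit association.
    r2, p = np.empty(n_boot), np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n_total, size=n)
        r, pval = stats.pearsonr(lesion_load[idx], score[idx])
        r2[i], p[i] = r**2, pval
    return r2, p

for n in (30, 60, 90, 180, 360):
    r2, p = resample_effect(n)
    print(f"n={n:3d}  median R2={np.median(r2):.3f}  "
          f"fraction with p<.05={np.mean(p < .05):.2f}")
```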

  1. Impact of multicollinearity on small sample hydrologic regression models

    Science.gov (United States)

    Kroll, Charles N.; Song, Peter

    2013-06-01

    Often hydrologic regression models are developed with ordinary least squares (OLS) procedures. The use of OLS with highly correlated explanatory variables produces multicollinearity, which creates highly sensitive parameter estimators with inflated variances and improper model selection. It is not clear how to best address multicollinearity in hydrologic regression models. Here a Monte Carlo simulation is developed to compare four techniques to address multicollinearity: OLS, OLS with variance inflation factor screening (VIF), principal component regression (PCR), and partial least squares regression (PLS). The performance of these four techniques was observed for varying sample sizes, correlation coefficients between the explanatory variables, and model error variances consistent with hydrologic regional regression models. The negative effects of multicollinearity are magnified at smaller sample sizes, higher correlations between the variables, and larger model error variances (smaller R²). The Monte Carlo simulation indicates that if the true model is known, multicollinearity is present, and the estimation and statistical testing of regression parameters are of interest, then PCR or PLS should be employed. If the model is unknown, or if the interest is solely on model predictions, it is recommended that OLS be employed since using more complicated techniques did not produce any improvement in model performance. A leave-one-out cross-validation case study was also performed using low-streamflow data sets from the eastern United States. Results indicate that OLS with stepwise selection generally produces models across study regions with varying levels of multicollinearity that are as good as biased regression techniques such as PCR and PLS.
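
    The sketch below illustrates, under simplified assumptions (two standard-normal predictors with correlation 0.95, unit coefficients and unit error variance, a single principal component retained), how a Monte Carlo comparison can expose the inflated coefficient variance that OLS suffers under multicollinearity relative to principal component regression; it is not the authors' simulation design.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n, rho, beta = 30, 0.95, np.array([1.0, 1.0])     # small sample, high correlation
cov = np.array([[1.0, rho], [rho, 1.0]])

ols_coefs, pcr_coefs = [], []
for _ in range(1000):
    X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    y = X @ beta + rng.normal(0.0, 1.0, n)
    ols_coefs.append(LinearRegression().fit(X, y).coef_)
    # PCR: regress on the leading principal component, then map back to X space
    pca = PCA(n_components=1).fit(X)
    gamma = LinearRegression().fit(pca.transform(X), y).coef_
    pcr_coefs.append((gamma @ pca.components_).ravel())

print("OLS coefficient standard deviations:", np.std(ols_coefs, axis=0))
print("PCR coefficient standard deviations:", np.std(pcr_coefs, axis=0))
```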

  2. Bacteria isolated from abscesses of small ruminants inspected in the semiarid region of Brazil

    Directory of Open Access Journals (Sweden)

    Wellington Erasmo Lima e Silva

    2016-06-01

    Full Text Available Loss in the supply chain of small ruminants owing to condemnations of carcasses in the abattoirs and slaughterhouses is common in northeastern Brazil. This study aims to identify bacterial agents, including Mycobacterium spp., in the abscesses found in the postmortem analysis of the carcasses of sheep and goats bred in northeastern Brazil. Our analysis involved 679 goat and 1,838 sheep carcasses. Abscess samples were extracted and inoculated on blood agar and Lowenstein-Jensen medium with pyruvate or glycerol for bacterial isolation. We then performed polymerase chain reaction (PCR) targeting the hsp65 gene; samples positive for Mycobacterium spp. were subjected to DNA sequencing. Relative frequencies of abscesses in goats and sheep were 5.44 and 3.26%, respectively. Microbiological analysis revealed 87.7% bacterial growth in the inoculated samples. Among these, Corynebacterium pseudotuberculosis represented 67.7% of the isolates. We observed 1.9% mycobacteria growth in the abscess samples inoculated on Lowenstein-Jensen medium. PCR of DNA extracted from abscess samples showed amplification in 0.9% of samples. After sequencing, the Mycobacterium spp. isolate was identified as M. novocastrense. C. pseudotuberculosis was the main agent responsible for the formation of abscesses in the examined animals, and we did not identify any species of the M. tuberculosis complex in the examined small ruminants.

  3. Interference and k-point sampling in the supercell approach to phase-coherent transport - art. no. 0333401

    DEFF Research Database (Denmark)

    Thygesen, Kristian Sommer; Jacobsen, Karsten Wedel

    2005-01-01

    We present a systematic study of interference and k-point sampling effects in the supercell approach to phase-coherent electron transport. We use a representative tight-binding model to show that interference between the repeated images is a small effect compared to the error introduced by using only the Gamma-point for a supercell containing (3,3) sites in the transverse plane. An insufficient k-point sampling can introduce strong but unphysical features in the transmission function which can be traced to the presence of van Hove singularities in the lead. We present a first-principles calculation of the transmission through a Pt contact which shows that the k-point sampling is also important for realistic systems.

  4. Exploration of small RNA-seq data for small non-coding RNAs in Human Colorectal Cancer.

    Science.gov (United States)

    Koduru, Srinivas V; Tiwari, Amit K; Hazard, Sprague W; Mahajan, Milind; Ravnic, Dino J

    2017-01-01

    Background: Improved healthcare and recent breakthroughs in technology have substantially reduced cancer mortality rates worldwide. Recent advancements in next-generation sequencing (NGS) have allowed genomic analysis of the human transcriptome. Now, using NGS we can further look into small non-coding regions of RNAs (sncRNAs) such as microRNAs (miRNAs), Piwi-interacting RNAs (piRNAs), long non-coding RNAs (lncRNAs), and small nuclear/nucleolar RNAs (sn/snoRNAs) among others. Recent studies looking at sncRNAs indicate their role in important biological processes such as cancer progression and predict their role as biomarkers for disease diagnosis, prognosis, and therapy. Results: In the present study, we data mined publicly available small RNA sequencing data from colorectal tissue samples of eight matched patients (benign, tumor, and metastasis) and remapped the data for various small RNA annotations. We identified aberrant expression of 13 miRNAs in tumor and metastasis specimens [tumor vs benign group (19 miRNAs) and metastasis vs benign group (38 miRNAs)], of which five were upregulated and eight were downregulated during disease progression. Pathway analysis of aberrantly expressed miRNAs showed that the majority of miRNAs involved in colon cancer were also involved in other cancers. Analysis of piRNAs revealed six to be over-expressed in the tumor vs benign cohort and 24 in the metastasis vs benign group. Only two piRNAs were shared between the two cohorts. Examining other types of small RNAs [sn/snoRNAs, mt_rRNA, miscRNA, nonsense mediated decay (NMD), and rRNAs] identified 15 sncRNAs in the tumor vs benign group and 104 in the metastasis vs benign group, with only four others being commonly expressed. Conclusion: In summary, our comprehensive analysis of publicly available small RNA-seq data identified multiple differentially expressed sncRNAs during colorectal cancer progression at different stages compared to normal colon tissue. We speculate that

  5. Detection and genotyping of Chlamydia species responsible for reproductive disorders in Algerian small ruminants.

    Science.gov (United States)

    Merdja, Salah-Eddine; Khaled, Hamza; Aaziz, Rachid; Vorimore, Fabien; Bertin, Claire; Dahmani, Ali; Bouyoucef, Abdallah; Laroucau, Karine

    2015-02-01

    Chlamydiosis in small ruminants is a zoonotic disease mainly related to Chlamydia abortus. This bacterium is responsible for abortions and reproductive disorders in sheep and goats. Stillbirth and infertility, leading to important economic losses, are also associated with this pathology. In Algeria, abortion cases are frequently reported by veterinarians but, except for brucellosis which is a notifiable disease in this country, abortive diseases are in general poorly studied. In order to detect and genotype Chlamydia species in small ruminants in different areas of Algeria, a study was conducted on samples collected from females (164 blood samples and 199 vaginal swabs) between October 2011 and March 2013. Serum samples were tested with a C. abortus-specific indirect ELISA test. Fourteen samples (8.5%), from six farms (6/20, 30%), tested positive. Vaginal swabs were analysed with a real-time PCR targeting all Chlamydiaceae spp. Thirty samples (15%) were diagnosed positive in 16 farms (16/25, 64%). Positive samples were all re-tested with a C. abortus- and a C. pecorum-specific real-time PCR. Finally, 13/30 (43.3%) and 6/30 (20%) were identified as C. abortus and C. pecorum, respectively. Sufficiently concentrated C. abortus samples were genotyped by multi-locus variable number of tandem repeat (VNTR) analysis (MLVA), and all were related to the genotype [2] group which mainly includes French C. abortus isolates. C. pecorum-positive samples were genotyped by multi-locus sequence typing (MLST). Interestingly, two of them were successfully genotyped and showed MLST sequences identical to VB2, AB10, E58 and SBE, a group which includes C. pecorum isolates considered as highly pathogenic. These findings suggest a possible role of C. abortus and C. pecorum strains in the aetiology of abortion in Algerian small ruminants.

  6. Venus Surface Sampling and Analysis

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort is developing the technology to transfer particulate samples from a Venus drill (being developed by Honeybee Robotics in a Phase 2 Small Business...

  7. EVALUATION OF THE ULTRASTRUCTURE OF THE SMALL INTESTINE OF HIV INFECTED CHILDREN BY TRANSMISSION AND SCANNING ELECTRONIC MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Christiane Araujo Chaves LEITE

    2013-03-01

    Full Text Available Objectives To describe the small intestinal ultrastructural findings in HIV-infected children. Methods Descriptive, observational study of small intestine biopsies performed between August 1994 and May 1995 at São Paulo, SP, Brazil. This material pertained to 11 HIV-infected children and was stored in a laboratory in paraffin blocks. Scanning and transmission electron microscopy were used to view these intestinal samples, and ultrastructural findings were described by analyzing digitalized photos of this material. Ethical Committee approval was obtained. Results In most samples scanning microscopy showed various degrees of shortening and decreasing numbers of microvilli, as well as complete effacement in some areas. Derangement of the enterocytes was seen frequently, and at times the cells' well-defined borders seemed loosened. In some areas a mucous-fibrin-like membrane of variable thickness and extension appeared to partially or totally coat the epithelial surface. Fat drops were present in the intestinal lumen in various samples, and a bacterium morphologically resembling bacilli was seen on two occasions. Scanning microscopy confirmed the transmission microscopy microvilli findings and also showed little “tufts” of those structures. In addition, it showed an increased number of vacuoles and multivesicular bodies inside various enterocytes, an increased presence of intraepithelial lymphocytes, mitochondrial vacuolization and basement membrane enlargement in the majority of samples analyzed. However, some samples exhibited a normal aspect. Conclusions Our study showed the common occurrence of various important intestinal ultrastructural alterations, to variable degrees, among HIV-infected children, some of them to our knowledge not described before.

  8. Demonstration/Validation of Incremental Sampling at Two Diverse Military Ranges and Development of an Incremental Sampling Tool

    Science.gov (United States)

    2010-06-01

    Multi-Increment Sampling (MIS): • a technique of combining many increments of soil from a number of points within an exposure area • developed by Enviro Stat (trademarked) • the aim is demonstrating a reliable soil sampling strategy to accurately characterize contaminant concentrations in spatially extreme and heterogeneous areas divided into a set of decision (exposure) units • one or several discrete or small-scale composite soil samples are collected to represent each decision unit

  9. Small head size after atomic irradiation

    International Nuclear Information System (INIS)

    Miller, R.W.; Mulvihill, J.J.

    1975-01-01

    A study of children exposed to nuclear explosions in Hiroshima and Nagasaki showed small head size and mental retardation when exposure occurred at less than 18 weeks of gestational age. An increased frequency of small head size occurred when maternal exposure was 10 to 19 rad. Tables and graphs are presented to show the relationships between dose, gestational age, and frequency of small head size.

  10. IMPLEMENTING SMALL AND MEDIUM IT PROJECTS IN SMALL AND MEDIUM ENERPRISES

    Directory of Open Access Journals (Sweden)

    Felix DUMITRESCU

    2014-06-01

    Full Text Available Information technology is essential nowadays for all companies. Small enterprises are an important part of the economy, and this article aims to provide some useful insight into implementing modern IT projects to their benefit. Due to the limited funding available for the IT infrastructure in most start-ups and small businesses, the projects should be adapted to fulfill the needs of the company at the lowest cost. The paper will start by defining small and medium project management theory and outlining the target of the study, small and medium-sized companies. Next, it will show a number of case studies of IT projects implemented in different types of small companies in Romania. Based on these implementations the article will draw some conclusions relevant to most small companies which need to design or improve their IT infrastructure.

  11. Use of Monte Carlo Bootstrap Method in the Analysis of Sample Sufficiency for Radioecological Data

    International Nuclear Information System (INIS)

    Silva, A. N. C. da; Amaral, R. S.; Araujo Santos Jr, J.; Wilson Vieira, J.; Lima, F. R. de A.

    2015-01-01

    There are operational difficulties in obtaining samples for radioecological studies. Population data may no longer be available during the study, and obtaining new samples may not be possible. These problems sometimes force the researcher to work with a small number of data. It is therefore difficult to know whether the number of samples will be sufficient to estimate the desired parameter, and so an analysis of sample sufficiency is critical. Classical statistical methods are not well suited to analysing sample sufficiency in radioecology, because naturally occurring radionuclides have a random distribution in soil and outliers and gaps with missing values usually arise. The present work was developed with the aim of applying the Monte Carlo bootstrap method to the analysis of sample sufficiency, with quantitative estimation of a single variable such as the specific activity of a natural radioisotope present in plants. The pseudo population was a small sample with 14 values of specific activity of 226Ra in forage palm (Opuntia spp.). A computational procedure was implemented in the R software to calculate the required number of sample values. The resampling process with replacement took the 14 values of the original sample and produced 10,000 bootstrap samples for each round. The estimated average θ was then calculated for samples with 2, 5, 8, 11 and 14 randomly selected values. The results showed that if the researcher works with only 11 sample values, the average parameter will be within the confidence interval with 90% probability. (Author)
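
    The study implemented its procedure in the R software; the Python sketch below reproduces the same resampling-with-replacement idea on hypothetical activity values (the 14 measured 226Ra activities are not reproduced here, so the numbers are placeholders):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical specific activities of 226Ra in forage palm (placeholder values;
# the paper's 14 measured values are not reproduced here).
activity = rng.gamma(shape=4.0, scale=2.5, size=14)

def bootstrap_ci(sample_size, n_boot=10_000, level=0.90):
    # Resample `sample_size` values with replacement from the pseudo population
    # and return the bootstrap mean and its percentile confidence interval.
    means = np.empty(n_boot)
    for i in range(n_boot):
        means[i] = rng.choice(activity, size=sample_size, replace=True).mean()
    lo, hi = np.percentile(means, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
    return means.mean(), lo, hi

for k in (2, 5, 8, 11, 14):
    mean, lo, hi = bootstrap_ci(k)
    print(f"n={k:2d}  mean={mean:5.2f}  90% CI=({lo:5.2f}, {hi:5.2f})")
```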

  12. Heterogeneity in small aliquots of Apolllo 15 olivine-normative basalt: Implications for breccia clast studies

    Science.gov (United States)

    Lindstrom, Marilyn M.; Shervais, John W.; Vetter, Scott K.

    1993-01-01

    Most of the recent advances in lunar petrology are the direct result of breccia pull-apart studies, which have identified a wide array of new highland and mare basalt rock types that occur only as clasts within the breccias. These rocks show that the lunar crust is far more complex than suspected previously, and that processes such as magma mixing and wall-rock assimilation were important in its petrogenesis. These studies are based on the implicit assumption that the breccia clasts, which range in size from a few mm to several cm across, are representative of the parent rock from which they were derived. In many cases, the aliquot allocated for analysis may be only a few grain diameters across. While this problem is most acute for coarse-grained highland rocks, it can also cause considerable uncertainty in the analysis of mare basalt clasts. Similar problems arise with small aliquots of individual hand samples. Here we report our study of sample heterogeneity in 9 samples of Apollo 15 olivine-normative basalt (ONB) which exhibit a range in average grain size from coarse to fine. Seven of these samples have not been analyzed previously, one has been analyzed by INAA only, and one has been analyzed by XRF+INAA. Our goal is to assess the effects of small aliquot size on the bulk chemistry of large mare basalt samples, and to extend this assessment to analyses of small breccia clasts.

  13. Cd and Zn concentrations in small mammals and willow leaves on disposal facilities for dredged material

    NARCIS (Netherlands)

    Mertens, J.; Luyssaert, S.; Verbeeren, S.; Vervaeke, P; Lust, N

    2001-01-01

    Disposal sites for dredged material are often polluted with heavy metals. The uptake of Cd and Zn by small mammals and willow trees was assessed on three sites with a different pollution degree. Detailed soil sampling showed a huge variation in soil characteristics within the sites, typical for

  14. Comparing distribution models for small samples of overdispersed counts of freshwater fish

    Science.gov (United States)

    Vaudor, Lise; Lamouroux, Nicolas; Olivier, Jean-Michel

    2011-05-01

    The study of species abundance often relies on repeated abundance counts whose number is limited by logistic or financial constraints. The distribution of abundance counts is generally right-skewed (i.e. with many zeros and few high values) and needs to be modelled for statistical inference. We used an extensive dataset involving about 100,000 fish individuals of 12 freshwater fish species collected in electrofishing points (7 m²) during 350 field surveys made in 25 stream sites, in order to compare the performance and the generality of four distribution models of counts (Poisson, negative binomial and their zero-inflated counterparts). The negative binomial distribution was the best model (Bayesian Information Criterion) for 58% of the samples (species-survey combinations) and was suitable for a variety of life histories, habitat, and sample characteristics. The performance of the models was closely related to sample statistics such as total abundance and variance. Finally, we illustrated the consequences of a distribution assumption by calculating confidence intervals around the mean abundance, either based on the most suitable distribution assumption or on an asymptotic, distribution-free (Student's) method. Student's method generally corresponded to narrower confidence intervals, especially when there were few (≤3) non-null counts in the samples.
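
    As a small, hedged sketch of the model-comparison step: synthetic overdispersed counts are fitted with Poisson and negative binomial distributions and compared by BIC (the zero-inflated variants are omitted for brevity, and the data and starting values are illustrative):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
# Synthetic right-skewed fish counts (many zeros, a few large values)
counts = rng.negative_binomial(n=0.8, p=0.2, size=40)

def bic(loglik, n_params, n_obs):
    return -2.0 * loglik + n_params * np.log(n_obs)

# Poisson: the maximum likelihood estimate of the rate is the sample mean
lam = counts.mean()
ll_pois = stats.poisson.logpmf(counts, lam).sum()

# Negative binomial: fit the size (r) and probability (p) numerically
def nb_negloglik(params):
    r, p = params
    if r <= 0 or not 0 < p < 1:
        return np.inf
    return -stats.nbinom.logpmf(counts, r, p).sum()

res = optimize.minimize(nb_negloglik, x0=[1.0, 0.5], method="Nelder-Mead")
ll_nb = -res.fun

n_obs = len(counts)
print("BIC Poisson          :", round(bic(ll_pois, 1, n_obs), 1))
print("BIC negative binomial:", round(bic(ll_nb, 2, n_obs), 1))
```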

  15. Likelihood inference of non-constant diversification rates with incomplete taxon sampling.

    Science.gov (United States)

    Höhna, Sebastian

    2014-01-01

    Large-scale phylogenies provide a valuable source to study background diversification rates and investigate if the rates have changed over time. Unfortunately most large-scale, dated phylogenies are sparsely sampled (fewer than 5% of the described species) and taxon sampling is not uniform. Instead, taxa are frequently sampled to obtain at least one representative per subgroup (e.g. family) and thus to maximize diversity (diversified sampling). So far, such complications have been ignored, potentially biasing the conclusions that have been reached. In this study I derive the likelihood of a birth-death process with non-constant (time-dependent) diversification rates and diversified taxon sampling. Using simulations I test if the true parameters and the sampling method can be recovered when the trees are small or medium sized (fewer than 200 taxa). The results show that the diversification rates can be inferred and the estimates are unbiased for large trees but are biased for small trees (fewer than 50 taxa). Furthermore, model selection by means of Akaike's Information Criterion favors the true model if the true rates differ sufficiently from alternative models (e.g. the birth-death model is recovered if the extinction rate is large and compared to a pure-birth model). Finally, I applied six different diversification rate models--ranging from a constant-rate pure birth process to a decreasing speciation rate birth-death process but excluding any rate shift models--on three large-scale empirical phylogenies (ants, mammals and snakes with respectively 149, 164 and 41 sampled species). All three phylogenies were constructed by diversified taxon sampling, as stated by the authors. However only the snake phylogeny supported diversified taxon sampling. Moreover, a parametric bootstrap test revealed that none of the tested models provided a good fit to the observed data. The model assumptions, such as homogeneous rates across species or no rate shifts, appear to be

  16. Small Scale Yielding Correction of Constraint Loss in Small Sized Fracture Toughness Test Specimens

    International Nuclear Information System (INIS)

    Kim, Maan Won; Kim, Min Chul; Lee, Bong Sang; Hong, Jun Hwa

    2005-01-01

    Fracture toughness data in the ductile-brittle transition region of ferritic steels show scatter produced by local sampling effects and a specimen geometry dependence which results from relaxation in crack tip constraint. The ASTM E1921 standard provides a test method to define the median toughness temperature curve, the so-called Master Curve, for the material corresponding to a 1T crack front length, and also defines a reference temperature, T0, at which the median toughness value is 100 MPa√m for a 1T size specimen. The ASTM E1921 procedures assume that high-constraint, small-scale yielding (SSY) conditions prevail at fracture along the crack front. Violation of the SSY assumption occurs most often during tests of smaller specimens. Constraint loss in such cases leads to higher toughness values and thus lower T0 values. When applied to a structure with a low-constraint geometry, the standard fracture toughness estimates may lead to strongly over-conservative estimates. A lot of effort has been made to adjust for the constraint effect. In this work, we applied a small-scale yielding correction (SSYC) to adjust for the constraint loss of 1/3PCVN and PCVN specimens, which are relatively small compared to a 1T size specimen, in the fracture toughness Master Curve test.

  17. Analysis of industrial coke samples by activation with cyclotron protons

    International Nuclear Information System (INIS)

    Chaudhri, M.A.; Lee, M.M.; Spicer, B.M.

    1982-01-01

    A Melbourne Petrochemical Company was experiencing excessive coke formation in its ''cracking furnaces'', which was causing unnecessary stoppage of production and wastage of energy due to additional insulation. In order to investigate the possible causes of this excessive coke formation, we analyzed various coke samples and other coke-like materials obtained from these furnaces by activation with cyclotron protons. Our results showed that, of the two suspected coke promoters As and Sb, whose presence in the feed at even small concentrations would greatly accelerate coke formation, As could not be detected in any of the eight samples investigated, while Sb was present in only one sample. However, we did observe Ca, Cr and Fe in all the samples, with surprisingly high concentrations in some of them. It has, therefore, been suggested that Ca, and perhaps Cr and Fe, but not As or Sb, could have been responsible for the excessive coke formation in the ''cracking furnaces''

  18. Decomposition and forecasting analysis of China's energy efficiency: An application of three-dimensional decomposition and small-sample hybrid models

    International Nuclear Information System (INIS)

    Meng, Ming; Shang, Wei; Zhao, Xiaoli; Niu, Dongxiao; Li, Wei

    2015-01-01

    The coordinated actions of the central and the provincial governments are important in improving China's energy efficiency. This paper uses a three-dimensional decomposition model to measure the contribution of each province in improving the country's energy efficiency and a small-sample hybrid model to forecast this contribution. Empirical analysis draws the following conclusions which are useful for the central government to adjust its provincial energy-related policies. (a) There are two important areas for the Chinese government to improve its energy efficiency: adjusting the provincial economic structure and controlling the number of small-scale private industrial enterprises; (b) Except for a few outliers, the energy efficiency growth rates of the northern provinces are higher than those of the southern provinces; provinces with high growth rates tend to converge geographically; (c) With regard to the energy sustainable development level, Beijing, Tianjin, Jiangxi, and Shaanxi are the best performers and Heilongjiang, Shanxi, Shanghai, and Guizhou are the worst performers; (d) By 2020, China's energy efficiency may reach 24.75 thousand yuan per ton of standard coal; and (e) Three development scenarios are designed to forecast China's energy consumption in 2012–2020. - Highlights: • Decomposition and forecasting models are used to analyze China's energy efficiency. • China should focus on the small industrial enterprises and local protectionism. • The energy sustainable development level of each province is evaluated. • Geographic distribution characteristics of energy efficiency changes are revealed. • Future energy efficiency and energy consumption are forecasted

  19. Analysis of submicrogram samples by INAA

    Energy Technology Data Exchange (ETDEWEB)

    Lindstrom, D J [National Aeronautics and Space Administration, Houston, TX (USA). Lyndon B. Johnson Space Center

    1990-12-20

    Procedures have been developed to increase the sensitivity of instrumental neutron activation analysis (INAA) so that cosmic-dust samples weighing only 10⁻⁹-10⁻⁷ g are routinely analyzed for a sizable number of elements. The primary differences from standard techniques are: (1) irradiation of the samples is much more intense, (2) gamma ray assay of the samples is done using long counting times and large Ge detectors that are operated in an excellent low-background facility, (3) specially prepared glass standards are used, (4) samples are too small to be weighed routinely and concentrations must be obtained indirectly, (5) sample handling is much more difficult, and contamination of small samples with normally insignificant amounts of contaminants is difficult to prevent. In spite of the difficulties, INAA analyses have been done on 15 cosmic-dust particles and a large number of other stratospheric particles. Two-sigma detection limits for some elements are in the femtogram (10⁻¹⁵ g) range, e.g. Co=11, Sc=0.9, Sm=0.2. A particle weighing just 0.2 ng was analyzed, obtaining abundances with relative analytical uncertainties of less than 10% for four elements (Fe, Co, Ni and Sc), which were sufficient to allow identification of the particle as chondritic interplanetary dust. Larger samples allow abundances of twenty or more elements to be obtained. (orig.).

  20. Autonomy and job satisfaction for a sample of Greek teachers.

    Science.gov (United States)

    Koustelios, Athanasios D; Karabatzaki, Despina; Kousteliou, Ioanna

    2004-12-01

    Analysis of the relation between Job Satisfaction and Autonomy in a sample of 300 Greek teachers (114 men and 186 women, 28 to 59 years old) from primary and secondary schools showed statistically significant positive correlations between Job Satisfaction and Autonomy. In particular, Autonomy was correlated with the Job Itself (.21), Supervision (.22), and the Organization as a Whole (.27) aspects of Job Satisfaction. Findings are in line with previous studies conducted in different cultural contexts. The percentage of common variance accounted for is small.

  1. In situ microfluidic dialysis for biological small-angle X-ray scattering

    DEFF Research Database (Denmark)

    Skou, Magda; Skou, Soren; Jensen, Thomas Glasdam

    2014-01-01

    Owing to the demand for low sample consumption and automated sample changing capabilities at synchrotron small-angle X-ray (solution) scattering (SAXS) beamlines, X-ray microfluidics is receiving continuously increasing attention. Here, a remote-controlled microfluidic device is presented for sim... in incidental sample purification. Hence, this versatile microfluidic device enables investigation of experimentally induced structural changes under dynamically controllable sample conditions. (C) 2014 International Union of Crystallography

  2. Predicting Drug-Target Interactions Based on Small Positive Samples.

    Science.gov (United States)

    Hu, Pengwei; Chan, Keith C C; Hu, Yanxing

    2018-01-01

    evaluation of ODT shows that it can be potentially useful. It confirms that predicting potential or missing DTIs based on the known interactions is a promising direction to solve problems related to the use of uncertain and unreliable negative samples and those related to the great demand in computational resources. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  3. Vegetation patterns and abundances of amphibians and small mammals along small streams in a northwestern California watershed

    Science.gov (United States)

    Jeffrey R. Waters; Cynthia J. Zabel; Kevin S. McKelvey; Hartwell H. Welsh

    2001-01-01

    Our goal was to describe and evaluate patterns of association between stream size and abundances of amphibians and small mammals in a northwestern California watershed. We sampled populations at 42 stream sites and eight upland sites within a 100- watershed in 1995 and 1996. Stream reaches sampled ranged from poorly defined channels that rarely flowed to 10-m-wide...

  4. Network 'small-world-ness': a quantitative method for determining canonical network equivalence.

    Directory of Open Access Journals (Sweden)

    Mark D Humphries

    Full Text Available BACKGROUND: Many technological, biological, social, and information networks fall into the broad class of 'small-world' networks: they have tightly interconnected clusters of nodes, and a mean shortest path length that is similar to that of a matched random graph (same number of nodes and edges). This semi-quantitative definition leads to a categorical distinction ('small/not-small') rather than a quantitative, continuous grading of networks, and can lead to uncertainty about a network's small-world status. Moreover, systems described by small-world networks are often studied using an equivalent canonical network model--the Watts-Strogatz (WS) model. However, the process of establishing an equivalent WS model is imprecise and there is a pressing need to discover ways in which this equivalence may be quantified. METHODOLOGY/PRINCIPAL FINDINGS: We defined a precise measure of 'small-world-ness' S based on the trade-off between high local clustering and short path length. A network is now deemed a 'small-world' if S>1--an assertion which may be tested statistically. We then examined the behavior of S on a large data-set of real-world systems. We found that all these systems were linked by a linear relationship between their S values and the network size n. Moreover, we show a method for assigning a unique Watts-Strogatz (WS) model to any real-world network, and show analytically that the WS models associated with our sample of networks also show linearity between S and n. Linearity between S and n is not, however, inevitable, and neither is S maximal for an arbitrary network of given size. Linearity may, however, be explained by a common limiting growth process. CONCLUSIONS/SIGNIFICANCE: We have shown how the notion of a small-world network may be quantified. Several key properties of the metric are described and the use of WS canonical models is placed on a more secure footing.
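
    A hedged sketch of a small-world-ness calculation in the spirit of the measure described above: the clustering coefficient and mean shortest path length of a network are compared against matched Erdos-Renyi random graphs with the same numbers of nodes and edges. The number of random controls and the example Watts-Strogatz network are arbitrary choices, not the paper's exact protocol.

```python
import networkx as nx

def small_world_S(G, n_rand=20, seed=0):
    # Small-world-ness: ratio of (clustering / random clustering) to
    # (path length / random path length), with the random baselines taken
    # from matched Erdos-Renyi graphs (same numbers of nodes and edges).
    n, m = G.number_of_nodes(), G.number_of_edges()
    C, L = nx.average_clustering(G), nx.average_shortest_path_length(G)
    C_rand = L_rand = 0.0
    for i in range(n_rand):
        R = nx.gnm_random_graph(n, m, seed=seed + i)
        if not nx.is_connected(R):            # use the largest component if needed
            R = R.subgraph(max(nx.connected_components(R), key=len))
        C_rand += nx.average_clustering(R)
        L_rand += nx.average_shortest_path_length(R)
    C_rand, L_rand = C_rand / n_rand, L_rand / n_rand
    return (C / C_rand) / (L / L_rand)

# Canonical example: a Watts-Strogatz small-world graph
G = nx.connected_watts_strogatz_graph(200, k=6, p=0.1, seed=1)
print(f"S = {small_world_S(G):.2f}")       # S > 1 suggests small-world structure
```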

  5. Sampling of high molecular weight hydrocarbons with adsorbent tubes

    International Nuclear Information System (INIS)

    Stroemberg, B.

    1996-12-01

    Adsorption tubes have been used to determine the content of hydrocarbons in gas samples from small-scale combustion and gasification of biomass. Compounds from benzene (mw 78) to indeno(1,2,3-cd)pyrene (mw 276) have been examined. The results show that it is possible to analyze polyaromatic hydrocarbons (PAH) with 4 aromatic rings (mw 202). Detection limits for these compounds are 3 . PAH with higher molecular weight can be identified and quantified in samples with high amounts of PAH, e.g. from gasification of biomass. Sampling on adsorption tubes is extremely quick and easy. The tube is inserted in the gas of interest and the sample is sucked through the tube with a pump. Sampling times of 2-10 minutes are often sufficient. High moisture content in the gas may result in losses of the most volatile compounds during drying. Even very low concentrations of water in the tube may cause ice formation in the cold-trap, and the sample will be destroyed. The analysis is unfortunately time-consuming because the desorption oven must be cooled between analyses, which reduces the number of samples that can be analyzed per day. The tubes can be stored for several weeks before analysis without deterioration. 4 refs, 5 figs, 3 tabs

  6. A technique of evaluating most probable stochastic valuables from a small number of samples and their accuracies and degrees of confidence

    Energy Technology Data Exchange (ETDEWEB)

    Katoh, K [Ibaraki Pref. Univ. Health Sci., (Japan)

    1997-12-31

    A problem of estimating the stochastic characteristics of a population from a small number of samples is solved as an inverse problem, from the viewpoint of information theory and with Bayesian statistics. For both the Poisson process and the Bernoulli process, the most probable values of the characteristics of the mother population, together with their accuracies and degrees of confidence, are successfully obtained. Mathematical expressions are given for the general case, where a limited amount of information and/or knowledge about the stochastic characteristics is available, and for a special case, where no a priori information or knowledge is available. Mathematical properties of the solutions obtained and practical applications to radiation measurement problems are also discussed.
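
    The paper's full information-theoretic treatment is not reproduced here; as a hedged illustration of the underlying idea for the Poisson case, the sketch below computes the posterior for a count rate from a handful of hypothetical measurements using the standard Gamma conjugate form (the flat prior and the example counts are assumptions made for the sketch):

```python
import numpy as np
from scipy import stats

# A small number of hypothetical counting measurements from a Poisson process
counts = np.array([3, 1, 4, 2])          # counts in equal-length intervals
n_intervals = len(counts)

# With a flat prior on the rate, the posterior is Gamma with
# shape = total counts + 1 and rate = number of intervals.
shape, rate = counts.sum() + 1, n_intervals
posterior = stats.gamma(a=shape, scale=1.0 / rate)

mode = (shape - 1) / rate                # most probable rate per interval
lo, hi = posterior.ppf([0.05, 0.95])     # 90% credible interval
print(f"most probable rate = {mode:.2f} counts per interval")
print(f"90% credible interval = ({lo:.2f}, {hi:.2f})")
```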

  7. Drop-on-demand sample introduction system coupled with the flowing atmospheric-pressure afterglow for direct molecular analysis of complex liquid microvolume samples.

    Science.gov (United States)

    Schaper, J Niklas; Pfeuffer, Kevin P; Shelley, Jacob T; Bings, Nicolas H; Hieftje, Gary M

    2012-11-06

    One of the fastest developing fields in analytical spectrochemistry in recent years is ambient desorption/ionization mass spectrometry (ADI-MS). This burgeoning interest has been due to the demonstrated advantages of the method: simple mass spectra, little or no sample preparation, and applicability to samples in the solid, liquid, or gaseous state. One such ADI-MS source, the flowing atmospheric-pressure afterglow (FAPA), is capable of direct analysis of solids just by aiming the source at the solid surface and sampling the produced ions into a mass spectrometer. However, direct introduction of significant volumes of liquid samples into this source has not been possible, as solvent loads can quench the afterglow and, thus, the formation of reagent ions. As a result, the analysis of liquid samples is preferably carried out by analyzing dried residues or by desorbing small amounts of liquid samples directly from the liquid surface. In the former case, reproducibility of sample introduction is crucial if quantitative results are desired. In the present study, introduction of liquid samples as very small droplets helps overcome the issue of sample positioning while keeping the solvent load low. A recently developed "drop-on-demand" (DOD) aerosol generator is capable of reproducibly producing very small volumes of liquid (∼17 pL). In this paper, the coupling of FAPA-MS and DOD is reported and applications are suggested. Analytes representing different classes of substances were tested and limits of detection were determined. Matrix tolerance was investigated for drugs of abuse and their metabolites by analyzing raw urine samples, with quantification performed without the use of internal standards. Limits of detection below 2 μg/mL, without sample pretreatment, were obtained.

  8. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5--3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert so it is unnecessary to separate the insert from the sample for most analyses. The key technical advantage of using inserts to take DWPF samples versus filling sample vials is that it provides a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent with subsampling the heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicates that the insert does not significantly affect sample composition.

  9. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
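
    A hedged sketch of the logistic-regression case as described above: the covariate effect (slope times twice the covariate standard deviation) defines two hypothetical groups whose average event probability is kept at the overall rate, and the usual two-proportion formula then gives the total sample size. The function and argument names are illustrative, and this is an approximation in the spirit of the method rather than the authors' exact implementation.

```python
import numpy as np
from scipy import stats, optimize
from scipy.special import expit

def logistic_sample_size(beta, sd_x, p_overall, alpha=0.05, power=0.80):
    # Two "groups" differ on the log-odds scale by beta * 2 * sd_x, and the
    # centre is chosen so that their average event probability equals the
    # overall probability; the standard two-proportion formula then applies.
    delta = beta * 2.0 * sd_x
    f = lambda c: 0.5 * (expit(c - delta / 2) + expit(c + delta / 2)) - p_overall
    c = optimize.brentq(f, -20.0, 20.0)
    p1, p2 = expit(c - delta / 2), expit(c + delta / 2)

    z_a, z_b = stats.norm.ppf(1 - alpha / 2), stats.norm.ppf(power)
    pbar = (p1 + p2) / 2
    n_per_group = ((z_a * np.sqrt(2 * pbar * (1 - pbar))
                    + z_b * np.sqrt(p1 * (1 - p1) + p2 * (1 - p2)))
                   / (p2 - p1)) ** 2
    return int(np.ceil(2 * n_per_group))   # total sample size

# Example: odds ratio 1.5 per SD of the covariate, 20% overall event rate
print(logistic_sample_size(beta=np.log(1.5), sd_x=1.0, p_overall=0.20))
```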

  10. Major and trace elements in geological samples from Itingussu Basin in Coroa-Grande, RJ

    Energy Technology Data Exchange (ETDEWEB)

    Araripe, Denise R. [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Inst. de Quimica. Dept. de Quimica Analitica; E-mail: drararipe@vm.uff.br; Bellido, Alfredo V.B.; Canesim, Fatima [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Inst. de Quimica. Dept. de Fisico-Quimica; Patchineelam, Sambasiva R.; Machdo, Edimar [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Geoquimica; Bellido, Luis F. [Instituto de Engenharia Nuclear IEN, Rio de Janeiro, RJ (Brazil); Vasconcelos, Marina B.A. [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)

    2005-07-01

    The goal of the present work was to characterize soil and mangrove sediment samples from the Itingussu river drainage basin, with a view to investigating its lithological signature. This small drainage basin ends in an area not yet heavily impacted by other sources, such as industrial and domestic waste, with respect to the elements studied here. The results showed some enrichment of U, Th and some light rare earth elements in the Itingussu sediment sample. This represents the leucocratic rock signature, according to the data normalized to upper crustal mean values. (author)

  11. Major and trace elements in geological samples from Itingussu Basin in Coroa-Grande, RJ

    International Nuclear Information System (INIS)

    Araripe, Denise R.; Vasconcelos, Marina B.A.

    2005-01-01

    The goal of the present work was to characterize soil and mangrove sediment samples from the Itingussu river drainage basin, with a view to investigating its lithological signature. This small drainage basin ends in an area not yet heavily impacted by other sources, such as industrial and domestic waste, with respect to the elements studied here. The results showed some enrichment of U, Th and some light rare earth elements in the Itingussu sediment sample. This represents the leucocratic rock signature, according to the data normalized to upper crustal mean values. (author)

  12. Evaluation of complex gonioapparent samples using a bidirectional spectrometer.

    Science.gov (United States)

    Rogelj, Nina; Penttinen, Niko; Gunde, Marta Klanjšek

    2015-08-24

    Many applications use gonioapparent targets whose appearance depends on irradiation and viewing angles; the strongest effects are provided by light diffraction. These targets, optically variable devices (OVDs), are used in both security and authentication applications. This study introduces a bidirectional spectrometer, which enables the analysis of samples with highly complex angular and spectral properties. In our work, the spectrometer is evaluated with samples having very different types of reflection in terms of spectral and angular distributions. Furthermore, an OVD containing several different grating patches is evaluated. The device uses an automatically adjusted exposure time to provide maximum signal dynamics and is capable of making steps as small as 0.01°. However, even 2° steps for the detector movement showed that this device is more than capable of characterizing even the most complex reflecting surfaces. This study presents sRGB visualizations, a discussion of bidirectional reflection, and accurate grating period calculations for all of the grating samples used.

  13. NIF small optics laser damage test specifications

    International Nuclear Information System (INIS)

    Sheehan, L

    1999-01-01

    The Laser Damage Group is currently conducting tests on small optics samples supplied for initial evaluation of potential NIF suppliers. This document is meant to define the specification of laser-induced damage for small optics and the test methods used to collect the data. A rating system which will be applied for vendor selection is presented

  14. CHARACTERIZATION AND ACTUAL WASTE TEST WITH TANK 5F SAMPLES

    International Nuclear Information System (INIS)

    Fletcher, D.

    2007-01-01

    during the cleaning test with oxalic acid along with the iron, aluminum, and sodium in the sludge. During the neutralization of the oxalic acid, the majority of the uranium precipitates from solution along with the iron and other typical sludge elements. The CSEM results of the 75 °C neutralization test provide some evidence of uranium separation from other sludge elements. However, the CSEM analysis looked at a very small amount of sample, which might not be representative of the bulk material, and the sludge sample also showed areas of high uranium concentration. Additionally, how the test results will scale to the full-scale neutralization in a waste tank remains uncertain. The analysis of the oxalic acid filtrates indicates that only a small portion of the plutonium dissolved during the tank cleaning test. However, the analytical data from the solid residues filtered from the cleaning test contradict the solution data and indicate that approximately half of the plutonium dissolved.

  15. An immunoglobulin G-4 related sclerosing disease of the small bowel: CT and small bowel series findings

    International Nuclear Information System (INIS)

    Ko, Young Hwan; Hwang, Dae Hyun; Min, Seon Jeong; Woo, Ji Young; Kim, Jeong Won; Hong, Hye Sook; Yang, Ik; Lee, Yul

    2013-01-01

    Immunoglobulin G4 (IgG4)-related sclerosing disease is rare and is known to involve various organs. We present a case of histologically proven IgG4-related sclerosing disease of the small bowel with imaging findings on computed tomography (CT) and small bowel series. CT showed irregular wall thickening, loss of mural stratification and aneurysmal dilatation of the distal ileum. Small bowel series showed aneurysmal dilatations, interloop adhesion with traction and abrupt angulation.

  16. chemical and microbiological assessment of surface water samples

    African Journals Online (AJOL)

    PROF EKWUEME

    were investigated in this study: nine samples from different surface water bodies and two samples from two effluent sources ... The study area covers Ezeagu, Udi, Nkanu, Oji River and some parts of Awgu and Aninri ...

  17. Measuring helium bubble diameter distributions in tungsten with grazing incidence small angle x-ray scattering (GISAXS)

    Science.gov (United States)

    Thompson, M.; Kluth, P.; Doerner, R. P.; Kirby, N.; Riley, D.; Corr, C. S.

    2016-02-01

    Grazing incidence small angle x-ray scattering was performed on tungsten samples exposed to helium plasma in the MAGPIE and Pisces-A linear plasma devices to measure the size distributions of resulting helium nano-bubbles. Nano-bubbles were fitted assuming spheroidal particles and an exponential diameter distribution. These particles had mean diameters between 0.36 and 0.62 nm. Pisces-A exposed samples showed more complex patterns, which may suggest the formation of faceted nano-bubbles or nano-scale surface structures.

  18. Lung cancer - non-small cell

    Science.gov (United States)

    Cancer - lung - non-small cell; Non-small cell lung cancer; NSCLC; Adenocarcinoma - lung; Squamous cell carcinoma - lung ... Research shows that smoking marijuana may help cancer cells grow. But there is no direct link between ...

  19. Impact of habitat diversity on the sampling effort required for the assessment of river fish communities and IBI

    NARCIS (Netherlands)

    Van Liefferinge, C.; Simoens, I.; Vogt, C.; Cox, T.J.S.; Breine, J.; Ercken, D.; Goethals, P.; Belpaire, C.; Meire, P.

    2010-01-01

    The spatial variation in the fish communities of four small Belgian rivers with variable habitat diversity was investigated by electric fishing to define the minimum sampling distance required for optimal fish stock assessment and determination of the Index of Biotic Integrity. This study shows that

  20. HOT-DUST-POOR QUASARS IN MID-INFRARED AND OPTICALLY SELECTED SAMPLES

    International Nuclear Information System (INIS)

    Hao Heng; Elvis, Martin; Civano, Francesca; Lawrence, Andy

    2011-01-01

    We show that the hot-dust-poor (HDP) quasars, originally found in the X-ray-selected XMM-COSMOS type 1 active galactic nucleus (AGN) sample, are just as common in two samples selected at optical/infrared wavelengths: the Richards et al. Spitzer/SDSS sample (8.7% ± 2.2%) and the Palomar-Green-quasar-dominated sample of Elvis et al. (9.5% ± 5.0%). The properties of the HDP quasars in these two samples are consistent with the XMM-COSMOS sample, except that, at the 99% (∼2.5σ) significance level, a larger proportion of the HDP quasars in the Spitzer/SDSS sample have weak host galaxy contributions, probably due to the selection criteria used. Either the host dust is destroyed (dynamically or by radiation) or is offset from the central black hole due to recoil. Alternatively, the universality of HDP quasars in samples with different selection methods and the continuous distribution of dust covering factor in type 1 AGNs suggest that the range of spectral energy distributions could be related to the range of tilts in warped fueling disks, as in the model of Lawrence and Elvis, with HDP quasars having relatively small warps.

  1. Sampling large landscapes with small-scale stratification-User's Manual

    Science.gov (United States)

    Bart, Jonathan

    2011-01-01

    This manual explains procedures for partitioning a large landscape into plots, assigning the plots to strata, and selecting plots in each stratum to be surveyed. These steps are referred to as the "sampling large landscapes (SLL) process." We assume that users of the manual have a moderate knowledge of ArcGIS and Microsoft® Excel. The manual is written for a single user but in many cases, some steps will be carried out by a biologist designing the survey and some steps will be carried out by a quantitative assistant. Thus, the manual essentially may be passed back and forth between these users. The SLL process primarily has been used to survey birds, and we refer to birds as subjects of the counts. The process, however, could be used to count any objects.
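
    The SLL workflow itself runs in ArcGIS and Excel; purely as an illustration of the underlying idea (partition into plots, assign to strata, sample within each stratum), here is a minimal sketch in Python. The stratum labels and per-stratum sample sizes are invented.

    ```python
    import random
    from collections import defaultdict

    def stratified_plot_sample(plots, n_per_stratum, seed=42):
        """plots: list of (plot_id, stratum) pairs; returns selected plot ids per stratum."""
        rng = random.Random(seed)
        by_stratum = defaultdict(list)
        for plot_id, stratum in plots:
            by_stratum[stratum].append(plot_id)
        selected = {}
        for stratum, ids in by_stratum.items():
            k = min(n_per_stratum.get(stratum, 0), len(ids))
            selected[stratum] = rng.sample(ids, k)   # simple random sample within the stratum
        return selected

    # Illustrative 10 x 10 km landscape split into 1-km plots assigned to two habitat strata.
    plots = [((r, c), "wetland" if (r + c) % 3 == 0 else "upland")
             for r in range(10) for c in range(10)]
    print(stratified_plot_sample(plots, {"wetland": 5, "upland": 10}))
    ```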

  2. The UEA Small RNA Workbench: A Suite of Computational Tools for Small RNA Analysis.

    Science.gov (United States)

    Mohorianu, Irina; Stocks, Matthew Benedict; Applegate, Christopher Steven; Folkes, Leighton; Moulton, Vincent

    2017-01-01

    RNA silencing (RNA interference, RNAi) is a complex, highly conserved mechanism mediated by short, typically 20-24 nt in length, noncoding RNAs known as small RNAs (sRNAs). They act as guides for the sequence-specific transcriptional and posttranscriptional regulation of target mRNAs and play a key role in the fine-tuning of biological processes such as growth, response to stresses, or defense mechanisms. High-throughput sequencing (HTS) technologies are employed to capture the expression levels of sRNA populations. The processing of the resulting big data sets facilitated the computational analysis of the sRNA patterns of variation within biological samples such as time point experiments, tissue series or various treatments. Rapid technological advances enable larger experiments, often with biological replicates, leading to a vast amount of raw data. As a result, in this fast-evolving field, the existing methods for sequence characterization and prediction of interaction (regulatory) networks periodically require adapting or, in extreme cases, a complete redesign to cope with the data deluge. In addition, the presence of numerous tools focused only on particular steps of HTS analysis hinders the systematic parsing of the results and their interpretation. The UEA small RNA Workbench (v1-4), described in this chapter, provides a user-friendly, modular, interactive analysis in the form of a suite of computational tools designed to process and mine sRNA datasets for interesting characteristics that can be linked back to the observed phenotypes. First, we show how to preprocess the raw sequencing output and prepare it for downstream analysis. Then we review some quality checks that can be used as a first indication of sources of variability between samples. Next we show how the Workbench can provide a comparison of the effects of different normalization approaches on the distributions of expression, enhanced methods for the identification of differentially expressed
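
    The Workbench is a stand-alone tool suite, so nothing below reproduces its code; as a hedged sketch of one preprocessing step it mentions (comparing normalization approaches), this shows a simple reads-per-million scaling of an sRNA count table. The sample and sRNA names are invented.

    ```python
    import pandas as pd

    # Toy sRNA count table: rows are distinct small RNA sequences, columns are samples.
    counts = pd.DataFrame(
        {"leaf_rep1": [1200, 30, 5, 860], "leaf_rep2": [1100, 45, 2, 910]},
        index=["miR156", "miR172", "novel_sRNA_1", "miR166"],
    )

    def reads_per_million(count_table):
        """Scale each sample so its total mapped reads sum to one million (RPM)."""
        return count_table.div(count_table.sum(axis=0), axis=1) * 1e6

    print(reads_per_million(counts).round(1))
    ```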

  3. The Work Role Functioning Questionnaire v2.0 Showed Consistent Factor Structure Across Six Working Samples

    DEFF Research Database (Denmark)

    Abma, Femke I.; Bültmann, Ute; Amick, Benjamin C.

    2017-01-01

    Objective: The Work Role Functioning Questionnaire v2.0 (WRFQ) is an outcome measure linking a person's health to the ability to meet work demands in the twenty-first century. We aimed to examine the construct validity of the WRFQ in a heterogeneous set of working samples in the Netherlands...

  4. The Work Role Functioning Questionnaire v2.0 Showed Consistent Factor Structure Across Six Working Samples

    NARCIS (Netherlands)

    Abma, F.I.; Bultmann, U.; Amick III, B.C.; Arends, I.; Dorland, P.A.; Flach, P.A.; Klink, J.J.L van der; Ven H.A., van de; Bjørner, J.B.

    2017-01-01

    Objective The Work Role Functioning Questionnaire v2.0 (WRFQ) is an outcome measure linking a person's health to the ability to meet work demands in the twenty-first century. We aimed to examine the construct validity of the WRFQ in a heterogeneous set of working samples in the Netherlands with

  5. Small-size low-temperature scanning tunnel microscope

    International Nuclear Information System (INIS)

    Al'tfeder, I.B.; Khajkin, M.S.

    1989-01-01

    A small-size scanning tunnel microscope designed for operation in transport helium-filled Dewar flasks is described. The microscope design contains a device for advancing the tip to the surface of the tested sample and a piezoelectric fine-positioning device. High vibration protection of the microscope is provided by suspending it on silk threads. The small-size scanning tunnel microscope provides atomic resolution

  6. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We will show here that, under this scenario, simple random sampling can be given a Bayesian justification in survey sampling.
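
    As a small illustration of the sampling design the abstract discusses (not the paper's Bayesian argument itself), the sketch below draws a simple random sample without replacement and forms the usual estimate of the population mean with its finite-population standard error; the population values are simulated.

    ```python
    import math
    import random

    def srs_mean_estimate(population, n, seed=0):
        """Simple random sample without replacement; returns the estimated population
        mean and its standard error with the finite population correction."""
        rng = random.Random(seed)
        sample = rng.sample(population, n)
        mean = sum(sample) / n
        s2 = sum((y - mean) ** 2 for y in sample) / (n - 1)
        fpc = 1 - n / len(population)          # finite population correction
        return mean, math.sqrt(fpc * s2 / n)

    # Invented finite population of 1000 measurements.
    pop_rng = random.Random(1)
    population = [pop_rng.gauss(50, 10) for _ in range(1000)]
    print(srs_mean_estimate(population, n=50))
    ```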

  7. THE COMPUTER AND SMALL BUSINESS.

    Science.gov (United States)

    The place of the computer in small business is investigated with respect to what types of problems it can solve for small business and how the small firm can acquire time on one. The decision-making process and the importance of information are discussed in relation to small business. Several applications of computers are examined to show how the firm can use the computer in day-to-day business operations. The capabilities of a digital computer

  8. A new set-up for simultaneous high-precision measurements of CO2, δ13C-CO2 and δ18O-CO2 on small ice core samples

    Science.gov (United States)

    Jenk, Theo Manuel; Rubino, Mauro; Etheridge, David; Ciobanu, Viorela Gabriela; Blunier, Thomas

    2016-08-01

    Palaeoatmospheric records of carbon dioxide and its stable carbon isotope composition (δ13C) obtained from polar ice cores provide important constraints on the natural variability of the carbon cycle. However, the measurements are both analytically challenging and time-consuming; thus only data exist from a limited number of sampling sites and time periods. Additional analytical resources with high analytical precision and throughput are thus desirable to extend the existing datasets. Moreover, consistent measurements derived by independent laboratories and a variety of analytical systems help to further increase confidence in the global CO2 palaeo-reconstructions. Here, we describe our new set-up for simultaneous measurements of atmospheric CO2 mixing ratios and atmospheric δ13C and δ18O-CO2 in air extracted from ice core samples. The centrepiece of the system is a newly designed needle cracker for the mechanical release of air entrapped in ice core samples of 8-13 g operated at -45 °C. The small sample size allows for high resolution and replicate sampling schemes. In our method, CO2 is cryogenically and chromatographically separated from the bulk air and its isotopic composition subsequently determined by continuous flow isotope ratio mass spectrometry (IRMS). In combination with thermal conductivity measurement of the bulk air, the CO2 mixing ratio is calculated. The analytical precision determined from standard air sample measurements over ice is ±1.9 ppm for CO2 and ±0.09 ‰ for δ13C. In a laboratory intercomparison study with CSIRO (Aspendale, Australia), good agreement between CO2 and δ13C results is found for Law Dome ice core samples. Replicate analysis of these samples resulted in a pooled standard deviation of 2.0 ppm for CO2 and 0.11 ‰ for δ13C. These numbers are good, though they are rather conservative estimates of the overall analytical precision achieved for single ice sample measurements. Facilitated by the small sample requirement
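
    The pooled standard deviation quoted for the replicate Law Dome measurements can be computed as follows; this is a generic formula-level sketch with invented CO2 values, not the authors' data or uncertainty budget.

    ```python
    import math

    def pooled_standard_deviation(replicate_groups):
        """Pooled SD across groups of replicate measurements of the same sample depth."""
        num, dof = 0.0, 0
        for group in replicate_groups:
            n = len(group)
            mean = sum(group) / n
            num += sum((x - mean) ** 2 for x in group)   # (n_i - 1) * s_i^2
            dof += n - 1
        return math.sqrt(num / dof)

    # Invented CO2 replicate pairs (ppm) measured on neighbouring ice samples.
    co2_replicates = [[278.1, 280.9], [284.0, 281.6], [290.2, 292.9]]
    print(round(pooled_standard_deviation(co2_replicates), 2), "ppm")
    ```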

  9. Diversity of human small intestinal Streptococcus and Veillonella populations

    NARCIS (Netherlands)

    van den Bogert, Bartholomeus; Erkus, Oylum; Boekhorst, Jos; de Goffau, Marcus; Smid, Eddy J.; Zoetendal, Erwin G.; Kleerebezem, Michiel

    Molecular and cultivation approaches were employed to study the phylogenetic richness and temporal dynamics of Streptococcus and Veillonella populations in the small intestine. Microbial profiling of human small intestinal samples collected from four ileostomy subjects at four time points displayed

  10. miRge - A Multiplexed Method of Processing Small RNA-Seq Data to Determine MicroRNA Entropy.

    Directory of Open Access Journals (Sweden)

    Alexander S Baras

    Full Text Available Small RNA-seq for microRNAs (miRNAs) is a rapidly developing field where opportunities still exist to create better bioinformatics tools to process these large datasets and generate new, useful analyses. We built miRge to be a fast, smart small RNA-seq solution to process samples in a highly multiplexed fashion. miRge employs a Bayesian alignment approach, whereby reads are sequentially aligned against customized mature miRNA, hairpin miRNA, noncoding RNA and mRNA sequence libraries. miRNAs are summarized at the level of raw reads in addition to reads per million (RPM). Reads for all other RNA species (tRNA, rRNA, snoRNA, mRNA) are provided, which is useful for identifying potential contaminants and optimizing small RNA purification strategies. miRge was designed to optimally identify miRNA isomiRs and employs an entropy-based statistical measurement to identify differential production of isomiRs. This allowed us to identify decreasing entropy in isomiRs as stem cells mature into retinal pigment epithelial cells. Conversely, we show that pancreatic tumor miRNAs have similar entropy to matched normal pancreatic tissues. In a head-to-head comparison with other miRNA analysis tools (miRExpress 2.0, sRNAbench, omiRAs, miRDeep2, Chimira, UEA small RNA Workbench), miRge was faster (4- to 32-fold) and was among the top two methods in maximally aligning miRNA reads per sample. Moreover, miRge has no inherent limits to its multiplexing. miRge was capable of simultaneously analyzing 100 small RNA-Seq samples in 52 minutes, providing an integrated analysis of miRNA expression across all samples. As miRge was designed for analysis of single as well as multiple samples, miRge is an ideal tool for high- and low-throughput users. miRge is freely available at http://atlas.pathology.jhu.edu/baras/miRge.html.
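
    miRge's exact entropy statistic is not reproduced here; as a hedged sketch of the idea, the snippet below computes the Shannon entropy of the isomiR read distribution of a single miRNA, which drops as reads concentrate on one form (the pattern described for maturing retinal pigment epithelial cells). The read counts are invented.

    ```python
    import math

    def isomir_entropy(read_counts):
        """Shannon entropy (bits) of the isomiR read distribution for one miRNA."""
        total = sum(read_counts)
        probs = [c / total for c in read_counts if c > 0]
        return -sum(p * math.log2(p) for p in probs)

    # Invented read counts for the canonical miRNA and three of its isomiRs.
    stem_cell = [5200, 4100, 3800, 2900]     # reads spread across isomiRs -> higher entropy
    rpe_cell = [14000, 600, 250, 150]        # reads concentrated on one form -> lower entropy
    print(round(isomir_entropy(stem_cell), 3), round(isomir_entropy(rpe_cell), 3))
    ```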

  11. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), which would affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
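
    A minimal simulation (not the author's field design) illustrates the mechanism described: if capture probability rises with growth rate, the sampled mean growth rate exceeds the population mean even though individuals are otherwise drawn at random. The coefficients are invented.

    ```python
    import random

    rng = random.Random(7)

    # Population of fish with known growth rates (mm/week); trappability rises with growth.
    population = [rng.gauss(2.0, 0.5) for _ in range(10_000)]
    capture_prob = [min(1.0, max(0.0, 0.1 + 0.15 * g)) for g in population]

    caught = [g for g, p in zip(population, capture_prob) if rng.random() < p]

    print("population mean growth:", round(sum(population) / len(population), 3))
    print("sampled mean growth:   ", round(sum(caught) / len(caught), 3))
    ```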

  12. Use of a 137Cs re-sampling technique to investigate temporal changes in soil erosion and sediment mobilisation for a small forested catchment in southern Italy

    International Nuclear Information System (INIS)

    Porto, Paolo; Walling, Des E.; Alewell, Christine; Callegari, Giovanni; Mabit, Lionel; Mallimo, Nicola; Meusburger, Katrin; Zehringer, Markus

    2014-01-01

    Soil erosion and both its on-site and off-site impacts are increasingly seen as a serious environmental problem across the world. The need for an improved evidence base on soil loss and soil redistribution rates has directed attention to the use of fallout radionuclides, and particularly 137Cs, for documenting soil redistribution rates. This approach possesses important advantages over more traditional means of documenting soil erosion and soil redistribution. However, one key limitation of the approach is the time-averaged or lumped nature of the estimated erosion rates. In nearly all cases, these will relate to the period extending from the main period of bomb fallout to the time of sampling. Increasing concern for the impact of global change, particularly that related to changing land use and climate change, has frequently directed attention to the need to document changes in soil redistribution rates within this period. Re-sampling techniques, which should be distinguished from repeat-sampling techniques, have the potential to meet this requirement. As an example, the use of a re-sampling technique to derive estimates of the mean annual net soil loss from a small (1.38 ha) forested catchment in southern Italy is reported. The catchment was originally sampled in 1998 and samples were collected from points very close to the original sampling points again in 2013. This made it possible to compare the estimate of mean annual erosion for the period 1954–1998 with that for the period 1999–2013. The availability of measurements of sediment yield from the catchment for parts of the overall period made it possible to compare the results provided by the 137Cs re-sampling study with the estimates of sediment yield for the same periods. In order to compare the estimates of soil loss and sediment yield for the two different periods, it was necessary to establish the uncertainty associated with the individual estimates. In the absence of a generally accepted procedure

  13. Isothermal multiple displacement amplification: a methodical approach enhancing molecular routine diagnostics of microcarcinomas and small biopsies

    Directory of Open Access Journals (Sweden)

    Mairinger FD

    2014-08-01

    Full Text Available Fabian D Mairinger,1 Robert FH Walter,2 Claudia Vollbrecht,3 Thomas Hager,1 Karl Worm,1 Saskia Ting,1 Jeremias Wohlschläger,1 Paul Zarogoulidis,4 Konstantinos Zarogoulidis,4 Kurt W Schmid1 1Institute of Pathology, 2Ruhrlandklinik, West German Lung Center, University Hospital Essen, Essen, 3Institute of Pathology, University Hospital Cologne, Cologne, Germany; 4Pulmonary Department, Oncology Unit, G Papanikolaou General Hospital, Aristotle University of Thessaloniki, Thessaloniki, Greece Background and methods: Isothermal multiple displacement amplification (IMDA) can be a powerful tool in molecular routine diagnostics for homogeneous and sequence-independent whole-genome amplification of notably small tumor samples, eg, microcarcinomas and biopsies containing a small amount of tumor. Currently, this method is not well established in pathology laboratories. We designed a study to confirm the feasibility and convenience of this method for routine diagnostics with formalin-fixed, paraffin-embedded samples prepared by laser-capture microdissection. Results: A total of 250 µg DNA (concentration 5 µg/µL) was generated by amplification over a period of 8 hours with a material input of approximately 25 cells, approximately equivalent to 175 pg of genomic DNA. In the generated DNA, a representation of all chromosomes could be shown and the presence of selected genes relevant for diagnosis in clinical samples could be proven. Mutational analysis of clinical samples could be performed without any difficulty and showed concordance with earlier diagnostic findings. Conclusion: We established the feasibility and convenience of IMDA for routine diagnostics. We also showed that small amounts of DNA, which were not analyzable with current molecular methods, could be sufficient for a wide field of applications in molecular routine diagnostics when they are preamplified with IMDA. Keywords: isothermal multiple displacement amplification, isothermal, whole

  14. Treatability study sample exemption: update

    International Nuclear Information System (INIS)

    1997-01-01

    This document is an RCRA Information Brief intended to update the information in the 1991 Small-Scale Treatability Study Information Brief and to address questions about the waste and treatability study sample exemptions that have arisen since References 3 and 5 were published.

  15. Comparative Analysis of Clinical Samples Showing Weak Serum Reaction on AutoVue System Causing ABO Blood Typing Discrepancies

    OpenAIRE

    Jo, Su Yeon; Lee, Ju Mi; Kim, Hye Lim; Sin, Kyeong Hwa; Lee, Hyeon Ji; Chang, Chulhun Ludgerus; Kim, Hyung-Hoi

    2016-01-01

    Background ABO blood typing in pre-transfusion testing is a major component of the high workload in blood banks that therefore requires automation. We often experienced discrepant results from an automated system, especially weak serum reactions. We evaluated the discrepant results by the reference manual method to confirm ABO blood typing. Methods In total, 13,113 blood samples were tested with the AutoVue system; all samples were run in parallel with the reference manual method according to...

  16. Evaluation of biomass fuel sampling in heat and power plants; Utvaerdering av mekanisk och manuell provtagning av biobraensle vid energiverk

    Energy Technology Data Exchange (ETDEWEB)

    Werkelin, R [Norrkoeping Energi AB (Sweden); Ryk, L [SGS Solid Fuels AB (Sweden)

    1996-06-01

    This report describes bias tests of manual and mechanical sampling systems for biofuels at 6 Swedish plants. Two parallel series of 30 samples each were taken at each plant. One series of samples was taken by the normal procedure. The samples of the other series were taken by a reference method: either stopping the belt and taking a full cut using a sampling frame, or manual increment division of a whole truck-load with the assistance of a pay-loader. The rectangular pile formed was divided into 120 squares. Some of the sampling systems gave biased samples. This was due to drawing increments that were too small, sampling tools with openings that were too small, and errors in sample preparation. Determination of moisture was more accurate than determination of ash content. Ash is more heterogeneously distributed in biofuels than moisture. Ash-bearing components such as clay or stone segregate easily. Although the project was not able to show that mechanical sampling gives more representative moisture samples than manual sampling, it is recommended to invest in mechanical sampling where possible, to reduce the risk of human error and for safety reasons. 14 refs, figs, tabs, photos
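
    As a hedged illustration of the kind of paired comparison such a bias test involves (not the report's actual statistical protocol), the sketch below applies a paired t-test to routine-system versus reference-method moisture determinations; the values are invented.

    ```python
    from scipy import stats

    # Moisture content (%) of paired samples: routine sampling system vs. reference method.
    routine = [47.2, 50.1, 44.8, 52.3, 48.9, 46.5, 51.2, 49.0, 45.7, 50.8]
    reference = [48.0, 50.9, 45.1, 53.0, 49.6, 47.4, 51.8, 49.7, 46.9, 51.5]

    t_stat, p_value = stats.ttest_rel(routine, reference)
    bias = sum(r - f for r, f in zip(routine, reference)) / len(routine)
    print(f"mean bias = {bias:.2f} percentage points, t = {t_stat:.2f}, p = {p_value:.3f}")
    ```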

  17. IDMS analysis of blank swipe samples for uranium quantity and isotopic composition

    International Nuclear Information System (INIS)

    Ryjinski, M.; Donohue, D.

    2001-01-01

    In 1996 the IAEA started routine implementation of environmental sampling. During the last 5 years more than 1700 swipe samples were collected and analyzed in the Network of Analytical Laboratories (NWAL). One sensitive point in analyzing environmental samples is evidence of the presence of enriched uranium. The uranium content on swipes is extremely low and therefore there is a relatively high probability of a false positive, e.g. small contamination or a measurement bias. In order to avoid and/or control this, the IAEA systematically sends blind blank QC samples to the laboratories. In particular, more than 50 blank samples were analyzed during the last two years. A preliminary analysis of blank swipes showed that the swipe material itself contains up to 10 ng of natural uranium (NU) per swipe. However, about 50% of the blind blank swipes analyzed showed the presence of enriched uranium. The source of this bias has to be clarified and excluded. This paper presents the results of modeling IDMS analysis for the quantity and isotopic composition of uranium in order to identify the possible contributions of different factors to the final measurement uncertainty. This modeling was carried out based on IAEA Clean Laboratory measurement data and simulation technique

  18. Metagenes Associated with Survival in Non-Small Cell Lung Cancer

    Science.gov (United States)

    Urgard, Egon; Vooder, Tõnu; Võsa, Urmo; Välk, Kristjan; Liu, Mingming; Luo, Cheng; Hoti, Fabian; Roosipuu, Retlav; Annilo, Tarmo; Laine, Jukka; Frenz, Christopher M.; Zhang, Liqing; Metspalu, Andres

    2011-01-01

    NSCLC (non-small cell lung cancer) comprises about 80% of all lung cancer cases worldwide. Surgery is the most effective treatment for patients with early-stage disease. However, 30%–55% of these patients develop recurrence within 5 years. Therefore, markers that can be used to accurately classify early-stage NSCLC patients into different prognostic groups may be helpful in selecting patients who should receive specific therapies. A previously published dataset was used to evaluate gene expression profiles of different NSCLC subtypes. A moderated two-sample t-test was used to identify differentially expressed genes between all tumor samples and cancer-free control tissue, between SCC samples and AC/BC samples and between stage I tumor samples and all other tumor samples. Gene expression microarray measurements were validated using qRT-PCR. Bayesian regression analysis and Kaplan-Meier survival analysis were performed to determine metagenes associated with survival. We identified 599 genes which were down-regulated and 402 genes which were up-regulated in NSCLC compared to the normal lung tissue and 112 genes which were up-regulated and 101 genes which were down-regulated in AC/BC compared to the SCC. Further, for stage Ib patients, the metagenes potentially associated with survival were identified. Genes that were differentially expressed between normal lung tissue and cancer showed enrichment in gene ontology terms associated with mitosis and proliferation. Bayesian regression and Kaplan-Meier analysis showed that gene-expression patterns and metagene profiles can be applied to predict the probability of different survival outcomes in NSCLC patients. PMID:21695068
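
    The study's Bayesian metagene model is not reproduced here; as a sketch of the survival analysis component, the snippet below implements a plain Kaplan-Meier (product-limit) estimator on invented follow-up data.

    ```python
    def kaplan_meier(times, events):
        """Product-limit survival estimate; events[i] = 1 for death, 0 for censoring."""
        # Sort by time, with deaths processed before censorings at tied times.
        order = sorted(range(len(times)), key=lambda i: (times[i], -events[i]))
        at_risk, survival, curve = len(times), 1.0, []
        for i in order:
            if events[i] == 1:
                survival *= (at_risk - 1) / at_risk
                curve.append((times[i], survival))
            at_risk -= 1                      # the subject leaves the risk set either way
        return curve

    # Invented follow-up times (months) for a small NSCLC cohort; 0 = censored.
    times = [5, 8, 12, 12, 20, 26, 31, 40, 44, 60]
    events = [1, 1, 1, 0, 1, 0, 1, 0, 1, 0]
    for t, s in kaplan_meier(times, events):
        print(f"t = {t:>2} months  S(t) = {s:.2f}")
    ```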

  19. Study on porosity of ceramic SiC using small angle neutron scattering

    International Nuclear Information System (INIS)

    Li Jizhou; Yang Jilian; Kang Jian; Ye Chuntang

    1996-01-01

    The mechanical properties of the functional heat-resistant ceramic SiC are significantly influenced by the concentration and dimensions of pores. Small angle neutron scattering measurements of 3 SiC samples with different densities were performed on the C1-2 SANS instrument of the University of Tokyo. Two groups of neutron data were obtained using secondary flight paths of 8 and 16 m and neutron wavelengths of 1 and 0.7 nm, respectively. After background subtraction and transmission correction, the two data sets were merged. The scattering patterns of the 3 samples over the Q range 0.028-0.5 nm-1 are nearly axially symmetric, showing that the pores are almost spherical. Using a Mellin transform, the pore size distributions of the 3 samples were obtained. The average pore size of the higher-density hot-pressed SiC sample (∼19 nm) is smaller than that of the others (∼21 nm), which appears to explain why the density of the hot-pressed SiC sample is higher than that of the samples that were not hot-pressed

  20. Quantum tomography via compressed sensing: error bounds, sample complexity and efficient estimators

    International Nuclear Information System (INIS)

    Flammia, Steven T; Gross, David; Liu, Yi-Kai; Eisert, Jens

    2012-01-01

    Intuitively, if a density operator has small rank, then it should be easier to estimate from experimental data, since in this case only a few eigenvectors need to be learned. We prove two complementary results that confirm this intuition. Firstly, we show that a low-rank density matrix can be estimated using fewer copies of the state, i.e. the sample complexity of tomography decreases with the rank. Secondly, we show that unknown low-rank states can be reconstructed from an incomplete set of measurements, using techniques from compressed sensing and matrix completion. These techniques use simple Pauli measurements, and their output can be certified without making any assumptions about the unknown state. In this paper, we present a new theoretical analysis of compressed tomography, based on the restricted isometry property for low-rank matrices. Using these tools, we obtain near-optimal error bounds for the realistic situation where the data contain noise due to finite statistics, and the density matrix is full-rank with decaying eigenvalues. We also obtain upper bounds on the sample complexity of compressed tomography, and almost-matching lower bounds on the sample complexity of any procedure using adaptive sequences of Pauli measurements. Using numerical simulations, we compare the performance of two compressed sensing estimators—the matrix Dantzig selector and the matrix Lasso—with standard maximum-likelihood estimation (MLE). We find that, given comparable experimental resources, the compressed sensing estimators consistently produce higher fidelity state reconstructions than MLE. In addition, the use of an incomplete set of measurements leads to faster classical processing with no loss of accuracy. Finally, we show how to certify the accuracy of a low-rank estimate using direct fidelity estimation, and describe a method for compressed quantum process tomography that works for processes with small Kraus rank and requires only Pauli eigenstate preparations

  1. Experimental technique of small angle neutron scattering

    International Nuclear Information System (INIS)

    Xia Qingzhong; Chen Bo

    2006-03-01

    The main parts of a Small Angle Neutron Scattering (SANS) spectrometer, their functions and their parameters are introduced from an experimental point of view. Detailed information is also given for the SANS spectrometer 'Membrana-2'. Based on practical experiments, the fundamental requirements and working conditions for SANS experiments, including sample preparation, detector calibration, standard sample selection and preliminary data processing, are described. (authors)

  2. A small-angle neutron scattering investigation of coke deposits on catalysts

    International Nuclear Information System (INIS)

    Acharya, D.R.; Hughes, R.; Allen, A.J.

    1990-01-01

    Small-angle neutron scattering (SANS) has been used to characterize a silica-alumina catalyst before and after coke deposition. The reaction used to deactivate the catalyst was the isomerization of xylenes. The results showed that, while most of the surface area in this type of catalyst resides in the ultrafine pores of diameters less than 1 nm occupying about 7% of the sample volume, there appears to be no coke deposition in these pores. The coke seems to coat the solid structures of 3.3-nm diameter which are of capillary shape. Such structures occupy about 6% of the sample volume. The coke was found to correspond to a monolayer of composition CH0.3 with a density of 1660 kg/m3

  3. Assessment of Small-scale Buffalo Milk Dairy Production-A Premise for a Durable Development

    Directory of Open Access Journals (Sweden)

    Marian MIHAIU

    2012-05-01

    Full Text Available Buffalo husbandry is an important source of income for a number of small-scale producers in Romania, which is why an assessment of its products' quality is much needed for improvement and for evaluation of their vulnerability to international competition. In order to ascertain possible developments in the buffalo dairy sector and to broadly identify areas of intervention that favor small-scale dairy producers, the study examined the potential to improve buffalo milk production by evaluating its authenticity and hygienic quality. The methods used involved molecular testing (PCR technique) for identifying cow, sheep or goat DNA in the dairy product samples collected from the small-scale producers' market. The hygienic quality of these samples was determined through classical microbiology methods and highly developed techniques (Trek System and PCR) for bacterial species confirmation. The results showed that a high percentage (65%) of the products found were adulterated with milk of other species, mostly cow milk. The most commonly falsified buffalo dairy products were cheese and the traditional product 'telemea'. The most prevalent bacterial species identified were Listeria innocua and Listeria welshmeri. The conclusion of this study is the need for a durable development system in this particular dairy chain to improve and assure the authenticity and quality of the small-scale producers' products and their reliability for the consumers.

  5. Contamination risk of stable isotope samples during milling.

    Science.gov (United States)

    Isaac-Renton, M; Schneider, L; Treydte, K

    2016-07-15

    Isotope analysis of wood is an important tool in dendrochronology and ecophysiology. Prior to mass spectrometry analysis, wood must be homogenized, and a convenient method involves a ball mill capable of milling samples directly in sample tubes. However, sample-tube plastic can contaminate wood during milling, which could lead to biological misinterpretations. We tested possible contamination of whole wood and cellulose samples during ball-mill homogenization for carbon and oxygen isotope measurements. We used a multi-factorial design with two/three steel milling balls, two sample amounts (10 mg, 40 mg), and two milling times (5 min, 10 min). We further analyzed abrasion by milling empty tubes, and measured the isotope ratios of pure contaminants. A strong risk exists for carbon isotope bias through plastic contamination: the δ13C value of polypropylene deviated from the control by -6.77‰. Small fibers from PTFE filter bags used during cellulose extraction also present a risk, as the δ13C value of this plastic deviated by -5.02‰. Low sample amounts (10 mg) showed the highest contamination due to increased abrasion during milling (-1.34‰), which is further concentrated by cellulose extraction (-3.38‰). Oxygen isotope measurements were unaffected. A ball mill can be used to homogenize samples within test tubes prior to oxygen isotope analysis, but not prior to carbon or radiocarbon isotope analysis. There is still a need for a fast, simple and contamination-free sample preparation procedure. Copyright © 2016 John Wiley & Sons, Ltd.
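
    A rough two-end-member mixing calculation indicates why the reported shifts matter; this is my own back-of-the-envelope reading that assumes wood and plastic contribute carbon in proportion to their mass, which the paper does not state, so treat the numbers as illustrative only.

    ```python
    def implied_contamination_fraction(delta_shift, contaminant_offset):
        """Two-end-member mixing: d_obs = f*d_plastic + (1-f)*d_wood, so the carbon
        fraction f coming from the contaminant is shift / offset (both in per mil).
        Crude assumption (mine, not the paper's): equal carbon mass fractions."""
        return delta_shift / contaminant_offset

    # Numbers quoted in the abstract: polypropylene offset -6.77 per mil relative to the
    # control, observed whole-wood shift -1.34 per mil for 10 mg samples.
    print(f"{implied_contamination_fraction(-1.34, -6.77):.1%} of carbon from plastic")
    ```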

  6. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    The Monte Carlo method is commonly used to observe the overall distribution and to determine lower or upper bound values in a statistical approach when direct analytical calculation is unavailable. However, this method is not efficient when the tail area of a distribution is of concern. A new method entitled 'Two Step Tail Area Sampling' is developed, which assumes a discrete probability distribution and samples only the tail area without distorting the overall distribution. This method uses a two-step sampling procedure: first, sampling at points separated by large intervals; second, sampling at points separated by small intervals around check points determined in the first step. Comparison with the Monte Carlo method shows that the results obtained from the new method converge to the analytic value faster than those of the Monte Carlo method when the numbers of calculations of both methods are the same. This new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of a pressurized light water nuclear reactor
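
    The report's exact algorithm is not given in the abstract, so the sketch below is only one plausible reading of the two-step idea: a coarse scan of the discrete input distribution locates the cells that may map into the output tail, and only those cells are re-evaluated at fine intervals. The model function, threshold and grid are invented.

    ```python
    import numpy as np

    def tail_probability_two_step(x_grid, pmf, g, y_crit, coarse_step=20):
        """Estimate P(g(X) < y_crit) for a discrete X without evaluating g everywhere.
        A coarse scan flags cells whose endpoints straddle the threshold; only those
        cells are re-evaluated point by point. Assumes g varies smoothly enough that a
        cell with both endpoints on one side of the threshold lies entirely on that side."""
        coarse_idx = np.arange(0, len(x_grid), coarse_step)
        coarse_vals = np.array([g(x_grid[i]) for i in coarse_idx])

        prob = 0.0
        for k in range(len(coarse_idx)):
            lo = coarse_idx[k]
            hi = coarse_idx[k + 1] if k + 1 < len(coarse_idx) else len(x_grid)
            left_in = coarse_vals[k] < y_crit
            right_in = coarse_vals[k + 1] < y_crit if k + 1 < len(coarse_vals) else left_in
            if left_in and right_in:
                prob += pmf[lo:hi].sum()          # whole cell lies in the tail
            elif left_in or right_in:             # cell straddles the threshold: refine
                fine = np.array([g(x) for x in x_grid[lo:hi]])
                prob += pmf[lo:hi][fine < y_crit].sum()
        return prob

    # Illustration: X is a discretized standard normal, Y = g(X), tail defined by Y < 1.3.
    x_grid = np.linspace(-4.0, 4.0, 2001)
    pmf = np.exp(-0.5 * x_grid**2)
    pmf /= pmf.sum()
    g = lambda x: 1.5 + 0.15 * x
    print(round(tail_probability_two_step(x_grid, pmf, g, y_crit=1.3), 4))
    ```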

  7. Likelihood inference of non-constant diversification rates with incomplete taxon sampling.

    Directory of Open Access Journals (Sweden)

    Sebastian Höhna

    Full Text Available Large-scale phylogenies provide a valuable source to study background diversification rates and investigate if the rates have changed over time. Unfortunately most large-scale, dated phylogenies are sparsely sampled (fewer than 5% of the described species) and taxon sampling is not uniform. Instead, taxa are frequently sampled to obtain at least one representative per subgroup (e.g. family) and thus to maximize diversity (diversified sampling). So far, such complications have been ignored, potentially biasing the conclusions that have been reached. In this study I derive the likelihood of a birth-death process with non-constant (time-dependent) diversification rates and diversified taxon sampling. Using simulations I test if the true parameters and the sampling method can be recovered when the trees are small or medium sized (fewer than 200 taxa). The results show that the diversification rates can be inferred and the estimates are unbiased for large trees but are biased for small trees (fewer than 50 taxa). Furthermore, model selection by means of Akaike's Information Criterion favors the true model if the true rates differ sufficiently from alternative models (e.g. the birth-death model is recovered if the extinction rate is large and compared to a pure-birth model). Finally, I applied six different diversification rate models--ranging from a constant-rate pure birth process to a decreasing speciation rate birth-death process but excluding any rate shift models--on three large-scale empirical phylogenies (ants, mammals and snakes) with respectively 149, 164 and 41 sampled species. All three phylogenies were constructed by diversified taxon sampling, as stated by the authors. However only the snake phylogeny supported diversified taxon sampling. Moreover, a parametric bootstrap test revealed that none of the tested models provided a good fit to the observed data. The model assumptions, such as homogeneous rates across species or no rate shifts, appear

  8. Small ring testing of a creep resistant material

    International Nuclear Information System (INIS)

    Hyde, C.J.; Hyde, T.H.; Sun, W.; Nardone, S.; De Bruycker, E.

    2013-01-01

    Many components in conventional and nuclear power plants, aero-engines, chemical plants, etc., operate at temperatures which are high enough for creep to occur. These include steam pipes, pipe branches, gas and steam turbine blades, etc. The manufacture of such components may also require welds. In most cases, only nominal operating conditions (i.e. pressure, temperatures, system load, etc.) are known and hence precise life predictions for these components are not possible. Also, the proportion of life consumed will vary from position to position within a component. Hence, non-destructive techniques are adopted to assist in making decisions on whether to repair, continue operating or replace certain components. One such approach is to remove a small sample from the component and machine it into small creep test specimens, which can be tested to give information on the remaining creep life of the component. When such a small sample cannot be removed from the operating component, e.g. in the case of small components, the component can be taken out of operation in order to make small creep test specimens, the results from which can then be used to assist with making decisions regarding similar or future components. This paper presents a small creep test specimen which can be used for the testing of particularly strong and creep-resistant materials, such as nickel-based superalloys

  9. Equipment and techniques for remote sampling of stored radioactive waste

    International Nuclear Information System (INIS)

    Nance, T.A.

    1996-01-01

    Several tools have been developed at the Savannah River Site (SRS) to remotely sample stored radioactive waste. These sampling tools have been developed to determine the chemical characteristics of the waste prior to processing. The processing of waste material varies according to the chemical characteristics of the waste, which change due to additions, settling, mixing, and chemical reactions during storage. Once the waste has been sampled to identify its characteristics, the chemical composition of the waste can then be altered if needed to prepare for processing. Various types of waste material in several types of containment must be sampled at SRS. Stored waste materials consist of liquids, floating organics, sludge, salt and solids. Waste is stored in four basic types of tanks with different means of access and interior obstructions. The waste tanks can only be accessed by small openings: access ports, risers and downcomers. Requirements for sampling depend on the type of tank being accessed, the waste within the tank, and the particular location in the tank desired for taking the sample. Sampling devices have been developed to sample all of the waste material forms found in the SRS tank farms. The fluid type samplers are capable of sampling surface liquid, subsurface liquid at varying depth, surface sludge, subsurface sludge, and floating organics. The solid type samplers are capable of sampling salt, sampling a solid layer on the bottom of the tank, and capturing a small solid mass on the tank bottom. The sampling devices are all designed to access the tanks through small access ports. The samplers are reusable and are designed to allow quick transfer of the samples to shielded packaging for transport, reducing the amount of radiation exposure to sampling personnel. The samplers weigh less than 100 lb. and are designed in sections to allow easy disassembly for storage and transport by personnel. (Abstract Truncated)

  10. Nanoparticle assisted laser desorption/ionization mass spectrometry for small molecule analytes.

    Science.gov (United States)

    Abdelhamid, Hani Nasser

    2018-03-01

    Nanoparticle assisted laser desorption/ionization mass spectrometry (NPs-ALDI-MS) shows remarkable characteristics and has a promising future in terms of real sample analysis. The incorporation of NPs can advance several methods including surface assisted LDI-MS, and surface enhanced LDI-MS. These methods have advanced the detection of many thermally labile and nonvolatile biomolecules. Nanoparticles circumvent the drawbacks of conventional organic matrices for the analysis of small molecules. In most cases, NPs offer a clear background without interfering peaks, absence of fragmentation of thermally labile molecules, and allow the ionization of species with weak noncovalent interactions. Furthermore, an enhancement in sensitivity and selectivity can be achieved. NPs enable straightforward analysis of target species in a complex sample. This review (with 239 refs.) covers the progress made in laser-based mass spectrometry in combination with the use of metallic NPs (such as AuNPs, AgNPs, PtNPs, and PdNPs), NPs consisting of oxides and chalcogenides, silicon-based NPs, carbon-based nanomaterials, quantum dots, and metal-organic frameworks. Graphical abstract An overview is given on nanomaterials for use in surface-assisted laser desorption/ionization mass spectrometry of small molecules.

  11. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail; Genton, Marc G.; Ronchetti, Elvezio

    2015-01-01

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.
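
    As a compact reference point for the classical (non-robust) baseline the abstract starts from, here is a sketch of Heckman's two-step estimator on simulated data: a probit selection equation, the inverse Mills ratio, and an augmented OLS on the selected subsample. Variable names and parameter values are invented, and this is not the authors' robustified procedure.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 5000

    # Simulated data: z drives selection, x drives the outcome; errors are correlated.
    x = rng.normal(size=n)
    z = rng.normal(size=n)
    u, e = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=n).T
    selected = (0.5 + 1.0 * z + u) > 0                 # selection equation
    y = 1.0 + 2.0 * x + e                              # outcome, observed only if selected

    # Step 1: probit of the selection indicator on z, then the inverse Mills ratio.
    Z = sm.add_constant(z)
    probit = sm.Probit(selected.astype(float), Z).fit(disp=0)
    xb = Z @ probit.params
    mills = norm.pdf(xb) / norm.cdf(xb)

    # Step 2: OLS on the selected subsample, augmented with the inverse Mills ratio.
    X2 = sm.add_constant(np.column_stack([x[selected], mills[selected]]))
    ols = sm.OLS(y[selected], X2).fit()
    print(ols.params)                                  # [const, beta_x, coef on lambda]
    ```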

  12. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  13. Modeling small angle scattering data using FISH

    International Nuclear Information System (INIS)

    Elliott, T.; Buckely, C.E.

    2002-01-01

    Full text: Small angle neutron scattering (SANS) and small angle x-ray scattering (SAXS) are important techniques for the characterisation of samples on the nanometer scale. From the scattered intensity pattern, information about the sample, such as the particle size distribution, concentration and particle interactions, can be determined. Since the experimental data are in reciprocal space and the information needed is about real space, modeling of the scattering data to obtain parameters is extremely important, and several paradigms are available. The use of computer programs to analyze the data is imperative for a robust description of the sample to be obtained. This presentation gives an overview of the SAS process and describes the data-modeling program FISH, written by R. Heenan 1983-2000. The results of using FISH to obtain the particle size distribution of bubbles in the aluminum-hydrogen system and other systems of interest are described. Copyright (2002) Australian X-ray Analytical Association Inc

  14. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

    Full Text Available Abstract Background Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness of fit measures. As control we used an un-weighted fitting method. Results A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 to 560 annotated samples to achieve mean average and root mean squared error below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method (p Conclusions This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine annotation sample size for supervised machine learning.
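
    As a hedged sketch of the approach described (the paper's exact model form and weighting may differ), the snippet below fits an inverse power law to a few learning-curve points by weighted nonlinear least squares and extrapolates to larger annotation sizes; the data points are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def inverse_power_law(x, a, b, c):
        """Learning curve: expected accuracy at training-set size x."""
        return a - b * np.power(x, -c)

    # Observed learning-curve points: (annotated sample size, mean accuracy, std dev).
    sizes = np.array([50, 100, 150, 200, 300, 400])
    accuracy = np.array([0.71, 0.76, 0.79, 0.81, 0.83, 0.845])
    stdev = np.array([0.040, 0.030, 0.025, 0.022, 0.018, 0.015])

    # Weighted nonlinear least squares: points with smaller variance get more weight.
    popt, pcov = curve_fit(inverse_power_law, sizes, accuracy, p0=[0.9, 1.0, 0.5],
                           sigma=stdev, absolute_sigma=True, maxfev=10000)

    for n in (600, 1000, 2000):
        print(n, round(inverse_power_law(n, *popt), 3))   # predicted accuracy at larger n
    ```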

  15. The challenge of NSCLC diagnosis and predictive analysis on small samples. Practical approach of a working group

    DEFF Research Database (Denmark)

    Thunnissen, Erik; Kerr, Keith M; Herth, Felix J F

    2012-01-01

    Until recently, the division of pulmonary carcinomas into small cell lung cancer (SCLC) and non-small cell lung cancer (NSCLC) was adequate for therapy selection. Due to the emergence of new treatment options subtyping of NSCLC and predictive testing have become mandatory. A practical approach to...

  16. Calculation code of heterogeneity effects for analysis of small sample reactivity worth

    International Nuclear Information System (INIS)

    Okajima, Shigeaki; Mukaiyama, Takehiko; Maeda, Akio.

    1988-03-01

    The discrepancy between experimental and calculated central reactivity worths has been one of the most significant issues in the analysis of fast reactor critical experiments. Two effects have been pointed out as possible causes of the discrepancy that should be taken into account in the calculation: one is the local heterogeneity effect associated with the measurement geometry; the other is the heterogeneity effect on the distribution of the intracell adjoint flux. In order to evaluate these effects in the analysis of FCA actinide sample reactivity worths, a calculation code based on the collision probability method was developed. The code can handle the sample size effect, which is one of the local heterogeneity effects, as well as the intracell adjoint heterogeneity effect. (author)

  17. Cannabis, motivation, and life satisfaction in an internet sample

    Directory of Open Access Journals (Sweden)

    Wilcox Rand

    2006-01-01

    Full Text Available Abstract Although little evidence supports cannabis-induced amotivational syndrome, sources continue to assert that the drug saps motivation [1], which may guide current prohibitions. Few studies report low motivation in chronic users; another reveals that they have higher subjective wellbeing. To assess differences in motivation and subjective wellbeing, we used a large sample (N = 487) and strict definitions of cannabis use (7 days/week) and abstinence (never). Standard statistical techniques showed no differences. Robust statistical methods controlling for heteroscedasticity, non-normality and extreme values found no differences in motivation but a small difference in subjective wellbeing. Medical users of cannabis reporting health problems tended to account for a significant portion of subjective wellbeing differences, suggesting that illness decreased wellbeing. All p-values were above p = .05. Thus, daily use of cannabis does not impair motivation. Its impact on subjective wellbeing is small and may actually reflect lower wellbeing due to medical symptoms rather than actual consumption of the plant.

  18. Design and evaluation of a new Peltier-cooled laser ablation cell with on-sample temperature control.

    Science.gov (United States)

    Konz, Ioana; Fernández, Beatriz; Fernández, M Luisa; Pereiro, Rosario; Sanz-Medel, Alfredo

    2014-01-27

    A new custom-built Peltier-cooled laser ablation cell is described. The proposed cryogenic cell combines a small internal volume (20 cm(3)) with a unique and reliable on-sample temperature control. The use of a flexible temperature sensor, directly located on the sample surface, ensures a rigorous sample temperature control throughout the entire analysis time and allows instant response to any possible fluctuation. In this way sample integrity and, therefore, reproducibility can be guaranteed during the ablation. The refrigeration of the proposed cryogenic cell combines an internal refrigeration system, controlled by a sensitive thermocouple, with an external refrigeration system. Cooling of the sample is directly carried out by 8 small (1 cm×1 cm) Peltier elements placed in a circular arrangement in the base of the cell. These Peltier elements are located below a copper plate where the sample is placed. Due to the small size of the cooling electronics and their circular allocation it was possible to maintain a peephole under the sample for illumination allowing a much better visualization of the sample, a factor especially important when working with structurally complex tissue sections. The analytical performance of the cryogenic cell was studied using a glass reference material (SRM NIST 612) at room temperature and at -20°C. The proposed cell design shows a reasonable signal washout (signal decay within less than 10 s to background level), high sensitivity and good signal stability (in the range 6.6-11.7%). Furthermore, high precision (0.4-2.6%) and accuracy (0.3-3.9%) in the isotope ratio measurements were also observed operating the cell both at room temperature and at -20°C. Finally, experimental results obtained for the cell application to qualitative elemental imaging of structurally complex tissue samples (e.g. eye sections from a native frozen porcine eye and fresh flower leaves) demonstrate that working in cryogenic conditions is critical in such

  19. Variability of the raindrop size distribution at small spatial scales

    Science.gov (United States)

    Berne, A.; Jaffrain, J.

    2010-12-01

    Because of the interactions between atmospheric turbulence and cloud microphysics, the raindrop size distribution (DSD) is strongly variable in space and time. The spatial variability of the DSD at small spatial scales (below a few km) is not well documented and not well understood, mainly because of a lack of adequate measurements at the appropriate resolutions. A network of 16 disdrometers (Parsivels) has been designed and set up over EPFL campus in Lausanne, Switzerland. This network covers a typical operational weather radar pixel of 1x1 km2. The question of the significance of the variability of the DSD at such small scales is relevant for radar remote sensing of rainfall because the DSD is often assumed to be uniform within a radar sample volume and because the Z-R relationships used to convert the measured radar reflectivity Z into rain rate R are usually derived from point measurements. Thanks to the number of disdrometers, it was possible to quantify the spatial variability of the DSD at the radar pixel scale and to show that it can be significant. In this contribution, we show that the variability of the total drop concentration, of the median volume diameter and of the rain rate are significant, taking into account the sampling uncertainty associated with disdrometer measurements. The influence of this variability on the Z-R relationship can be non-negligible. Finally, the spatial structure of the DSD is quantified using a geostatistical tool, the variogram, and indicates high spatial correlation within a radar pixel.

  20. Bayesian posterior sampling via stochastic gradient Fisher scoring

    NARCIS (Netherlands)

    Ahn, S.; Korattikara, A.; Welling, M.; Langford, J.; Pineau, J.

    2012-01-01

    In this paper we address the following question: "Can we approximately sample from a Bayesian posterior distribution if we are only allowed to touch a small mini-batch of data-items for every sample we generate?". An algorithm based on the Langevin equation with stochastic gradients (SGLD) was
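
    The abstract is cut off here, and the paper's stochastic gradient Fisher scoring update is not reproduced; as a sketch of the mini-batch posterior-sampling idea it builds on, below is plain stochastic gradient Langevin dynamics (SGLD) for the posterior mean of a Gaussian, with invented data and step size.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=3.0, scale=1.0, size=10_000)   # likelihood: x ~ N(theta, 1)
    N, batch_size, eps = len(data), 100, 1e-4
    prior_var = 100.0                                    # prior: theta ~ N(0, prior_var)

    theta, samples = 0.0, []
    for t in range(5_000):
        batch = rng.choice(data, size=batch_size, replace=False)
        # Mini-batch estimate of the gradient of the log posterior.
        grad = -theta / prior_var + (N / batch_size) * np.sum(batch - theta)
        theta += 0.5 * eps * grad + rng.normal(scale=np.sqrt(eps))   # SGLD update
        samples.append(theta)

    print("posterior mean estimate:", round(float(np.mean(samples[1_000:])), 3))  # ~3.0
    ```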

  1. Turbidity-controlled sampling for suspended sediment load estimation

    Science.gov (United States)

    Jack Lewis

    2003-01-01

    Abstract - Automated data collection is essential to effectively measure suspended sediment loads in storm events, particularly in small basins. Continuous turbidity measurements can be used, along with discharge, in an automated system that makes real-time sampling decisions to facilitate sediment load estimation. The Turbidity Threshold Sampling method distributes...

  2. Can we trust small x resummation?

    Energy Technology Data Exchange (ETDEWEB)

    Forte, Stefano [Dipartimento di Fisica, Universita di Milano and INFN, Sezione di Milano, Via Celoria 16, I-20133 Milan (Italy); Altarelli, Guido [Dipartimento di Fisica ' E.Amaldi' , Universita Roma Tre and INFN, Sezione di Roma Tre, Via della Vasca Navale 84, I-00146 Roma, Italy, CERN, Department of Physics, Theory Division, CH-1211 Geneve 23 (Switzerland); Ball, Richard D. [School of Physics, University of Edinburgh, Edinburgh EH9 3JZ, Scotland (United Kingdom)

    2009-06-15

    We review the current status of small x resummation of evolution of parton distributions and of deep-inelastic coefficient functions. We show that the resummed perturbative expansion is stable, robust upon different treatments of subleading terms, and that it matches smoothly to the unresummed perturbative expansions, with corrections which are of the same order as the typical NNLO ones in the HERA kinematic region. We discuss different approaches to small x resummation: we show that the ambiguities in the resummation procedure are small, provided all parametrically enhanced terms are included in the resummation and properly matched.

  3. Can we trust small x resummation?

    CERN Document Server

    Forte, Stefano; Ball, Richard D

    2009-01-01

    We review the current status of small x resummation of evolution of parton distributions and of deep-inelastic coefficient functions. We show that the resummed perturbative expansion is stable, robust upon different treatments of subleading terms, and that it matches smoothly to the unresummed perturbative expansions, with corrections which are of the same order as the typical NNLO ones in the HERA kinematic region. We discuss different approaches to small x resummation: we show that the ambiguities in the resummation procedure are small, provided all parametrically enhanced terms are included in the resummation and properly matched.

  4. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous
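
    As background to the method-of-moments (Matheron) estimator mentioned above, the sketch below computes an empirical variogram from irregularly spaced sample points. It is only an illustration: the synthetic field, bin width and names are assumptions, and the robust and residual-maximum-likelihood estimators used in the study are not reproduced.

```python
import numpy as np

def empirical_variogram(coords, values, bin_width, max_lag):
    """Matheron (method-of-moments) estimator:
    gamma(h) = 1/(2*N(h)) * sum of (z_i - z_j)^2 over pairs whose distance falls in bin h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq_diff = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)            # use each pair only once
    dist, sq = d[iu], sq_diff[iu]
    bins = np.arange(0.0, max_lag + bin_width, bin_width)
    lags, gamma = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (dist >= lo) & (dist < hi)
        if mask.sum() > 0:
            lags.append(dist[mask].mean())
            gamma.append(0.5 * sq[mask].mean())
    return np.array(lags), np.array(gamma)

# Toy example: 200 random sampling locations on a 100 m x 100 m plot.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))
# A crude spatially correlated surrogate field (not a proper geostatistical simulation).
values = np.sin(coords[:, 0] / 15.0) + 0.3 * rng.standard_normal(200)

lags, gamma = empirical_variogram(coords, values, bin_width=5.0, max_lag=50.0)
for h, g in zip(lags, gamma):
    print(f"lag {h:5.1f} m   gamma {g:.3f}")
```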

  5. Low Cost Mars Sample Return Utilizing Dragon Lander Project

    Science.gov (United States)

    Stoker, Carol R.

    2014-01-01

    We studied a Mars sample return (MSR) mission that lands a SpaceX Dragon Capsule on Mars carrying sample collection hardware (an arm, drill, or small rover) and a spacecraft stack consisting of a Mars Ascent Vehicle (MAV) and Earth Return Vehicle (ERV) that collectively carry the sample container from Mars back to Earth orbit.

  6. Empirical evidences of owners’ managerial behaviour - the case of small companies

    Science.gov (United States)

    Lobontiu, G.; Banica, M.; Ravai-Nagy, S.

    2017-05-01

    In a small firm, the founder or the owner-manager often leaves his or her own personal “stamp” on the way things are done, finding solutions for the multitude of problems the firm faces, and maintaining control over the firm’s operations. The paper aims to investigate the degree to which owner-managers control the operations of their firm on a day-to-day basis or even get involved in the management of the functional areas. Our empirical research, conducted on a sample of 200 small and medium-sized enterprises (SMEs) from North-Western Romania, Maramures (NUTS3 level - RO114), shows that owner-managers tend to be all-powerful, making decisions based on their experience. Furthermore, the survey highlights the focus of owner-managers on two functional areas, namely production, and sales and marketing. Finally, the correlation analysis shows that in small firms the owner-manager is more involved in managing the functional areas of the firm than in medium-sized ones.

  7. Prevalence of intestinal and haemoprotozoan parasites of small ruminants in Tamil Nadu, India.

    Science.gov (United States)

    Velusamy, R; Rani, N; Ponnudurai, G; Anbarasi, P

    2015-10-01

    The aim of the present study is to assess the prevalence of intestinal and haemoprotozoan parasites of small ruminants (sheep and goats) in the North Western part of Tamil Nadu, India. A total of 630 faecal samples (251 sheep, 379 goats) and 554 blood smears (242 sheep, 312 goats) were examined for the presence of eggs of intestinal parasites and for haemoprotozoan parasites, respectively. The samples were received from the Veterinary college hospital and Veterinary dispensaries in the North Western part of Tamil Nadu. Faecal samples were processed by the sedimentation technique and examined under a low power objective (×10), and blood smears were stained using Giemsa's technique and examined under oil immersion (×100). Analysis of the data for the period from 2004 to 2013 showed an overall prevalence of intestinal parasites of 67% in sheep and 35% in goats, whereas only 11% of sheep and 3% of goats had a haemoprotozoan infection. A highly significant difference was observed. The North Western part of Tamil Nadu is highly endemic for intestinal parasites such as coccidia and strongyles and for haemoprotozoans such as Anaplasma and Theileria species in small ruminants.

  8. A high sensitivity SQUID-method for the measurement of magnetic susceptibility of small samples in the temperature range 1.5 K-40 K and application on small palladium particles

    International Nuclear Information System (INIS)

    Tu Nguyen Quang.

    1979-01-01

    In this paper a method is developed for magnetic susceptibility measurements which is superior to the common methods. The method is based on the SQUID principle (Superconducting Quantum Interference Device), using the tunnel effect of a superconducting point contact and magnetic flux quantization for measuring electric and magnetic quantities. Due to this refined method, susceptibility changes of very small palladium particles with respect to the bulk could be detected in the temperature range 1.5 K-40 K. In addition, susceptibility differences of particle distributions with different mean diameters (81 Angstroem and 65 Angstroem) have been measured for the first time. A quantitative comparison of the measurements with theoretical results shows satisfactory agreement. (orig./WBU) [de

  9. Age distribution of human gene families shows significant roles of both large- and small-scale duplications in vertebrate evolution.

    Science.gov (United States)

    Gu, Xun; Wang, Yufeng; Gu, Jianying

    2002-06-01

    The classical (two-round) hypothesis of vertebrate genome duplication proposes two successive whole-genome duplication(s) (polyploidizations) predating the origin of fishes, a view now being seriously challenged. As the debate largely concerns the relative merits of the 'big-bang mode' theory (large-scale duplication) and the 'continuous mode' theory (constant creation by small-scale duplications), we tested whether a significant proportion of paralogous genes in the contemporary human genome was indeed generated in the early stage of vertebrate evolution. After an extensive search of major databases, we dated 1,739 gene duplication events from the phylogenetic analysis of 749 vertebrate gene families. We found a pattern characterized by two waves (I, II) and an ancient component. Wave I represents a recent gene family expansion by tandem or segmental duplications, whereas wave II, a rapid paralogous gene increase in the early stage of vertebrate evolution, supports the idea of genome duplication(s) (the big-bang mode). Further analysis indicated that large- and small-scale gene duplications both make a significant contribution during the early stage of vertebrate evolution to build the current hierarchy of the human proteome.

  10. Thermal transfer and apparent-dose distributions in poorly bleached mortar samples: results from single grains and small aliquots of quartz

    International Nuclear Information System (INIS)

    Jain, M.; Thomsen, K.J.; Boetter-Jensen, L.; Urray, A.S.

    2004-01-01

    In the assessment of doses received from a nuclear accident, considerable attention has been paid to retrospective dosimetry using the optically stimulated luminescence (OSL) of heated materials such as bricks and tiles. Quartz extracted from these artefacts was heated during manufacture; this process releases all the prior trapped charge and simultaneously sensitises the quartz. Unfortunately unheated materials such as mortar and concrete are more common in industrial sites and particularly in nuclear installations. These materials are usually exposed to daylight during quarrying and construction, but in general this exposure is insufficient to completely empty (bleach) any geological trapped charge. This leads to a distribution of apparent doses in the sample at the time of construction, with only some (if any) grains exposed to sufficient light to be considered well bleached for OSL dosimetry. The challenge in using such materials as retrospective dosemeters is in identifying these well-bleached grains when an accident dose has been superimposed on the original dose distribution. We investigate here, using OSL, the background dose in three different mortar samples: render, whitewash and inner wall plaster from a building built in 1964. These samples are found to be both poorly bleached and weakly sensitive (only 0.3% of grains giving a detectable dose response). We study thermal transfer in single grains of quartz, investigate the grain-size dependence of bleaching in the size range 90-300 μm and compare the dose distributions obtained from small aliquots and single-grain procedures. A comparison of three different methods, viz. (a) first 5%, (b) probability plot and (c) comparison of internal and external uncertainties, is made for equivalent dose estimation. The results have implications for accident dosimetry, archaeological studies and dating of poorly bleached sediments

  11. The significance of PIWI family expression in human lung embryogenesis and non-small cell lung cancer.

    Science.gov (United States)

    Navarro, Alfons; Tejero, Rut; Viñolas, Nuria; Cordeiro, Anna; Marrades, Ramon M; Fuster, Dolors; Caritg, Oriol; Moises, Jorge; Muñoz, Carmen; Molins, Laureano; Ramirez, Josep; Monzo, Mariano

    2015-10-13

    The expression of Piwi-interacting RNAs, small RNAs that bind to PIWI proteins, was until recently believed to be limited to germinal stem cells. We have studied the expression of PIWI genes during human lung embryogenesis and in paired tumor and normal tissue prospectively collected from 71 resected non-small-cell lung cancer patients. The mRNA expression analysis showed that PIWIL1 was highly expressed in 7-week embryos and downregulated during the subsequent weeks of development. PIWIL1 was expressed in 11 of the tumor samples but in none of the normal tissue samples. These results were validated by immunohistochemistry, showing faint cytoplasmic reactivity in the PIWIL1-positive samples. Interestingly, the patients expressing PIWIL1 had a shorter time to relapse (TTR) (p = 0.006) and overall survival (OS) (p = 0.0076) than those without PIWIL1 expression. PIWIL2 and 4 were downregulated in tumor tissue in comparison to the normal tissue (p < 0.001) and the patients with lower levels of PIWIL4 had shorter TTR (p = 0.048) and OS (p = 0.033). In the multivariate analysis, PIWIL1 expression emerged as an independent prognostic marker. Using 5-Aza-dC treatment and bisulfite sequencing, we observed that PIWIL1 expression could be regulated in part by methylation. Finally, an in silico study identified a stem-cell expression signature associated with PIWIL1 expression.

  12. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    Science.gov (United States)

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process is dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small samples; the method allows researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
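
    Under a normality assumption the quantity described above has a closed form: since the sample mean is distributed as N(mu, sigma²/n), P(|x_bar - mu| <= k*sigma) = 2*Phi(k*sqrt(n)) - 1. The sketch below evaluates this for a few small sample sizes; it illustrates the general idea only and is not the authors' exact procedure.

```python
from math import sqrt
from scipy.stats import norm

def prob_within_k_sigma(n, k):
    """P(|sample mean - true mean| <= k * sigma) for i.i.d. normal data,
    using the fact that x_bar ~ N(mu, sigma^2 / n)."""
    return 2.0 * norm.cdf(k * sqrt(n)) - 1.0

for n in (5, 10, 15, 25):
    for k in (0.25, 0.5):
        print(f"n={n:2d}, k={k}: P = {prob_within_k_sigma(n, k):.3f}")
```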

  13. Origins of collectivity in small systems

    Science.gov (United States)

    Schenke, Björn

    2017-11-01

    We review recent developments in the theoretical description and understanding of multi-particle correlation measurements in collisions of small projectiles (p/d/3He) with heavy nuclei (Au, Pb) as well as proton+proton collisions. We focus on whether the physical processes responsible for the observed long range rapidity correlations and their azimuthal structure are the same in small systems as in heavy ion collisions. In the latter they are interpreted as generated by the initial spatial geometry being transformed into momentum correlations by strong final state interactions. However, explicit calculations show that also initial state momentum correlations are present and could contribute to observables in small systems. If strong final state interactions are present in small systems, recent developments show that results are sensitive to the shape of the proton and its fluctuations.

  14. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: (1) transfer of a raw sample from the tool to the SHEC subsystem and (2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  15. Shielded scanning electron microscope for radioactive samples

    International Nuclear Information System (INIS)

    Crouse, R.S.; Parsley, W.B.

    1977-01-01

    A small commercial SEM has been successfully shielded for examining radioactive materials transferred directly from a remote handling facility. Relatively minor mechanical modifications were required to achieve excellent operation. Two inches of steel provide adequate shielding for most samples encountered. However, samples reading 75 rad/hr γ have been examined by adding extra shielding in the form of tungsten sample holders and external lead shadow shields. Some degradation of secondary electron imaging was seen but was adequately compensated for by changing operating conditions

  16. Audit sampling: A qualitative study on the role of statistical and non-statistical sampling approaches on audit practices in Sweden

    OpenAIRE

    Ayam, Rufus Tekoh

    2011-01-01

    PURPOSE: The two approaches to audit sampling; statistical and nonstatistical have been examined in this study. The overall purpose of the study is to explore the current extent at which statistical and nonstatistical sampling approaches are utilized by independent auditors during auditing practices. Moreover, the study also seeks to achieve two additional purposes; the first is to find out whether auditors utilize different sampling techniques when auditing SME´s (Small and Medium-Sized Ente...

  17. Mutual Guarantee Institutions and Small Business Finance

    NARCIS (Netherlands)

    Columba, F.; Gambacorta, L.; Mistrulli, P.E.

    2009-01-01

    A large literature showed that small firms experience difficulties in accessing the credit market due to informational asymmetries; these may be mitigated by collateral or relationship lending, possibilities often precluded to small business. We investigate the effect on small business finance of an

  18. Radionuclide contaminant analysis of small mammals at Area G, TA-54, Los Alamos National Laboratory, 1995

    International Nuclear Information System (INIS)

    Bennett, K.; Biggs, J.; Fresquez, P.

    1997-01-01

    At Los Alamos National Laboratory, small mammals were sampled in 1995 at two waste burial sites (Site 1, recently disturbed, and Site 2, partially disturbed) at Area G, Technical Area 54, and at a control site on Frijoles Mesa (Site 4). Our objectives were (1) to identify radionuclides that are present within surface and subsurface soils at waste burial sites, (2) to compare the amount of radionuclide uptake by small mammals at waste burial sites to a control site, and (3) to identify whether the primary mode of contamination of small mammals is surface contact or ingestion/inhalation. Three composite samples of at least five animals per sample were collected at each site. Pelts and carcasses of each animal were separated and analyzed independently. Samples were analyzed for 241Am, 90Sr, 238Pu, 239Pu, total U, 137Cs, and 3H. Significantly higher (parametric test at p=0.05) levels of total U, 241Am, 238Pu and 239Pu were detected in pelts than in carcasses of small mammals at TA-54. Concentrations of other measured radionuclides in carcasses were nearly equal to or exceeded the mean concentrations in the pelts. Our results show higher concentrations in pelts compared to carcasses, which is similar to what has been found at waste burial/contaminated sites outside of Los Alamos National Laboratory. Site 1 had a significantly higher (alpha=0.05, P=0.0125) mean tritium concentration in carcasses than Site 2 or Site 4. In addition, Site 1 had a significantly higher (alpha=0.05, p=0.0024) mean tritium concentration in pelts than Site 2 or Site 4. Site 2 had a significantly higher (alpha=0.05, P=0.0499) mean 239Pu concentration in carcasses than either Site 1 or Site 4

  19. Comparing Server Energy Use and Efficiency Using Small Sample Sizes

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Henry C.; Qin, Yong; Price, Phillip N.

    2014-11-01

    This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications; three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the other brands. The test results show that the power consumption variability caused by the key components as a

  20. Use of Unlabeled Samples for Mitigating the Hughes Phenomenon

    Science.gov (United States)

    Landgrebe, David A.; Shahshahani, Behzad M.

    1993-01-01

    The use of unlabeled samples in improving the performance of classifiers is studied. When the number of training samples is fixed and small, additional feature measurements may reduce the performance of a statistical classifier. It is shown that by using unlabeled samples, estimates of the parameters can be improved and therefore this phenomenon may be mitigated. Various methods for using unlabeled samples are reviewed and experimental results are provided.
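
    A minimal sketch of the principle described above: a two-class Gaussian classifier is first fitted on a handful of labeled points and then refined with EM updates that also use unlabeled points, improving the parameter estimates. The data, dimensionality and update scheme below are illustrative assumptions, not the estimators studied in the record.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Ground truth: two 1-D Gaussian classes with a small labeled set and a large unlabeled set.
mu_true, sd = (-1.0, 1.0), 1.0
x_lab = np.concatenate([rng.normal(mu_true[0], sd, 5), rng.normal(mu_true[1], sd, 5)])
y_lab = np.array([0] * 5 + [1] * 5)
x_unl = np.concatenate([rng.normal(mu_true[0], sd, 500), rng.normal(mu_true[1], sd, 500)])

# Initial estimates from the (small) labeled set only.
mu = np.array([x_lab[y_lab == 0].mean(), x_lab[y_lab == 1].mean()])
pi = np.array([0.5, 0.5])

# EM: labeled points keep hard memberships, unlabeled points get soft responsibilities.
for _ in range(50):
    dens = np.stack([pi[c] * norm.pdf(x_unl, mu[c], sd) for c in (0, 1)])
    resp = dens / dens.sum(axis=0)                      # E-step on the unlabeled data
    for c in (0, 1):                                    # M-step combining both data sets
        w_lab = (y_lab == c).astype(float)
        weight_sum = w_lab.sum() + resp[c].sum()
        mu[c] = (np.dot(w_lab, x_lab) + np.dot(resp[c], x_unl)) / weight_sum
        pi[c] = weight_sum / (len(x_lab) + len(x_unl))

print("labeled-only means  :", x_lab[y_lab == 0].mean(), x_lab[y_lab == 1].mean())
print("semi-supervised means:", mu)
```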

  1. Scalability on LHS (Latin Hypercube Sampling) samples for use in uncertainty analysis of large numerical models

    International Nuclear Information System (INIS)

    Baron, Jorge H.; Nunez Mac Leod, J.E.

    2000-01-01

    The present paper deals with the utilization of advanced statistical sampling methods to perform uncertainty and sensitivity analysis on numerical models. Such models may represent physical phenomena, logical structures (such as boolean expressions) or other systems, and several of their intrinsic parameters and/or input variables are usually treated simultaneously as random variables. In the present paper a simple method to scale up Latin Hypercube Sampling (LHS) samples is presented, starting with a small sample and duplicating its size at each step, making it possible to reuse the numerical model results already obtained with the smaller sample. The method does not distort the statistical properties of the random variables and does not add any bias to the samples. The result is that a significant reduction in numerical model running time can be achieved (by re-using the previously run samples), while keeping all the advantages of LHS, until an acceptable representation level is achieved in the output variables. (author)
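
    A minimal sketch of the kind of scale-up described above, under the assumption that doubling works by refining each stratum: an existing Latin Hypercube sample of size n is kept, and only n new points are added so that the combined 2n points again form a valid LHS (one point per stratum in every dimension). The paper's exact scheme may differ, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def lhs(n, d):
    """Plain Latin Hypercube sample of n points in [0, 1)^d."""
    u = rng.random((n, d))
    perms = np.array([rng.permutation(n) for _ in range(d)]).T  # shape (n, d)
    return (perms + u) / n

def double_lhs(old):
    """Add n new points to an existing n-point LHS so that the 2n points
    form an LHS on the refined grid of 2n strata per dimension."""
    n, d = old.shape
    new = np.empty_like(old)
    for j in range(d):
        occupied = np.floor(old[:, j] * 2 * n).astype(int)   # fine strata already used
        free = np.setdiff1d(np.arange(2 * n), occupied)       # exactly n free strata remain
        strata = rng.permutation(free)                        # random pairing across dimensions
        new[:, j] = (strata + rng.random(n)) / (2 * n)
    return np.vstack([old, new])

sample = lhs(8, 2)           # run the model on these 8 points first ...
sample = double_lhs(sample)  # ... then reuse them and only evaluate 8 new points
# sanity check: each dimension now has exactly one point per stratum of width 1/16
assert all(len(np.unique(np.floor(sample[:, j] * 16).astype(int))) == 16 for j in range(2))
print(sample.shape)
```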

  2. Quantum Metropolis sampling.

    Science.gov (United States)

    Temme, K; Osborne, T J; Vollbrecht, K G; Poulin, D; Verstraete, F

    2011-03-03

    The original motivation to build a quantum computer came from Feynman, who imagined a machine capable of simulating generic quantum mechanical systems--a task that is believed to be intractable for classical computers. Such a machine could have far-reaching applications in the simulation of many-body quantum physics in condensed-matter, chemical and high-energy systems. Part of Feynman's challenge was met by Lloyd, who showed how to approximately decompose the time evolution operator of interacting quantum particles into a short sequence of elementary gates, suitable for operation on a quantum computer. However, this left open the problem of how to simulate the equilibrium and static properties of quantum systems. This requires the preparation of ground and Gibbs states on a quantum computer. For classical systems, this problem is solved by the ubiquitous Metropolis algorithm, a method that has basically acquired a monopoly on the simulation of interacting particles. Here we demonstrate how to implement a quantum version of the Metropolis algorithm. This algorithm permits sampling directly from the eigenstates of the Hamiltonian, and thus evades the sign problem present in classical simulations. A small-scale implementation of this algorithm should be achievable with today's technology.
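
    For contrast with the quantum algorithm described above, the classical Metropolis algorithm it builds on can be sketched for a small 1-D Ising chain; the temperature, chain length and names below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)

def ising_energy(spins, J=1.0):
    """Energy of a 1-D Ising chain with periodic boundary conditions."""
    return -J * np.sum(spins * np.roll(spins, 1))

def metropolis(n_spins=32, beta=0.8, n_sweeps=2000):
    spins = rng.choice([-1, 1], size=n_spins)
    energies = []
    for _ in range(n_sweeps):
        for _ in range(n_spins):
            i = rng.integers(n_spins)
            # Energy change from flipping spin i (periodic neighbours).
            dE = 2.0 * spins[i] * (spins[(i - 1) % n_spins] + spins[(i + 1) % n_spins])
            # Metropolis rule: accept if dE <= 0, else with probability exp(-beta*dE).
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i] = -spins[i]
        energies.append(ising_energy(spins))
    return np.mean(energies[n_sweeps // 2:])   # average over the second half (after burn-in)

print("mean energy per spin:", metropolis() / 32)
```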

  3. Development of an evaluation method for fracture mechanical tests on small samples based on a cohesive zone model

    International Nuclear Information System (INIS)

    Mahler, Michael

    2016-01-01

    The safety and reliability of nuclear power plants of the fourth generation is an important issue. It is based on a reliable design of the components, for which, among others, fracture mechanical material properties are required. The irradiation present in the power plants significantly affects the material properties, which therefore need to be determined on irradiated material. Often only small amounts of irradiated material are available for characterization. In that case it is not possible to manufacture the sufficiently large specimens that are necessary for fracture mechanical testing in agreement with the standard, and small specimens must be used. From this follows the idea of this study, in which the fracture toughness is predicted with the developed method based on tests of small specimens. For this purpose, the fracture process including crack growth is described with a continuum mechanical approach using the finite element method and the cohesive zone model. The experiments on small specimens are used for parameter identification of the cohesive zone model. The two parameters of the cohesive zone model are determined by tensile tests on notched specimens (cohesive stress) and by parameter fitting to the fracture behavior of small specimens (cohesive energy). To account for the different triaxialities of the specimens, the cohesive stress is used as a function of the triaxiality. After parameter identification, a large specimen can be simulated with the cohesive zone parameters derived from small specimens. The predicted fracture toughness of this large specimen fulfills the size requirements of the standard (ASTM E1820 or ASTM E399), in contrast to the small specimen. This method can be used for ductile and brittle material behavior and was validated in this work. In summary, this method offers the possibility to determine the fracture toughness indirectly based on small specimen testing. Its main advantage is the low required specimen volume. Thereby massively

  4. Neutron spectrum in small iron pile surrounded by lead reflector

    International Nuclear Information System (INIS)

    Kimura, Itsuro; Hayashi, S.A.; Kobayashi, Katsuhei; Matsumura, Tetsuo; Nishihara, Hiroshi.

    1978-01-01

    In order to reduce the quantity of sample material required, the possibility of assessing group constants of a reactor material through measurement and analysis of the neutron spectrum in a small sample pile surrounded by a reflector of heavy moderator was investigated. As the sample and the reflector, we chose iron and lead, respectively. Although the time dispersion in the moderation of neutrons was considerably prolonged by the lead reflector, this hardly interferes with the assessment of group constants. Theoretical calculation revealed that both the neutron flux spectrum and the sensitivity coefficient of group constants in an iron sphere 35 cm in diameter surrounded by a 25 cm thick lead reflector were close to those of a bare iron sphere 108 cm in diameter. The neutron spectra in a small iron pile surrounded by a lead reflector were experimentally obtained by the time-of-flight method with an electron linear accelerator and the result was compared with the predicted values. It could be confirmed that a small sample pile surrounded by a reflector, such as lead, was as useful as a much larger bulk pile for the assessment of group constants of a reactor material. (auth.)

  5. Forward selection two sample binomial test

    Science.gov (United States)

    Wong, Kam-Fai; Wong, Weng-Kee; Lin, Miao-Shan

    2016-01-01

    Fisher’s exact test (FET) is a conditional method that is frequently used to analyze data in a 2 × 2 table for small samples. This test is conservative and attempts have been made to modify the test to make it less conservative. For example, Crans and Shuster (2008) proposed adding more points in the rejection region to make the test more powerful. We provide another way to modify the test to make it less conservative by using two independent binomial distributions as the reference distribution for the test statistic. We compare our new test with several methods and show that our test has advantages over existing methods in terms of control of the type 1 and type 2 errors. We reanalyze results from an oncology trial using our proposed method and our software which is freely available to the reader. PMID:27335577
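
    The authors' forward-selection test itself is not available in standard libraries, but the contrast it addresses, a conditional Fisher's exact test versus an unconditional test that uses two independent binomials as the reference distribution, can be illustrated with Barnard's exact test in SciPy; the 2x2 counts below are made up for illustration.

```python
from scipy.stats import fisher_exact, barnard_exact

# Hypothetical 2x2 table: rows = treatment arms, columns = responders / non-responders.
table = [[7, 3],
         [2, 8]]

odds_ratio, p_fisher = fisher_exact(table, alternative="two-sided")
p_barnard = barnard_exact(table, alternative="two-sided").pvalue

print(f"Fisher's exact (conditional)    p = {p_fisher:.4f}")
print(f"Barnard's exact (unconditional) p = {p_barnard:.4f}")
```

    Unconditional tests such as Barnard's are typically less conservative than Fisher's exact test for small samples, which is the general direction of improvement the record above is concerned with.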

  6. Small target pre-detection with an attention mechanism

    Science.gov (United States)

    Wang, Yuehuan; Zhang, Tianxu; Wang, Guoyou

    2002-04-01

    We introduce the concept of predetection based on an attention mechanism to improve the efficiency of small-target detection by limiting the image region of detection. According to the characteristics of small-target detection, local contrast is taken as the only feature in predetection and a nonlinear sampling model is adopted to make the predetection adaptive to small targets of different sizes. To simplify the predetection itself and decrease the false-alarm probability, neighboring nodes in the sampling grid are used to generate a saliency map, and a short-term memory is adopted to accelerate the 'pop-out' of targets. We show that the proposed approach has low computational complexity. In addition, even against a cluttered background, attention can be directed to targets within a satisfyingly small number of iterations, which ensures that detection efficiency is not reduced by false alarms. Experimental results are presented to demonstrate the applicability of the approach.
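
    The paper's nonlinear sampling model is not reproduced here, but the single feature it relies on, local contrast, can be sketched as a simple centre-surround difference map; the window sizes and the synthetic image below are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(5)

# Synthetic IR-like frame: smooth clutter plus one small bright target.
frame = uniform_filter(rng.normal(100.0, 5.0, size=(128, 128)), size=9)
frame[60:63, 80:83] += 25.0   # a 3x3 "small target"

# Local contrast: mean of a small centre window minus mean of a larger surround window.
center = uniform_filter(frame, size=3)
surround = uniform_filter(frame, size=15)
contrast = center - surround

# Flag candidate regions to hand over to the (more expensive) detection stage.
candidates = np.argwhere(contrast > contrast.mean() + 4 * contrast.std())
print("candidate pixels near the target:", candidates[:5])
```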

  7. Understanding the statistics of small risks

    International Nuclear Information System (INIS)

    Siddall, E.

    1983-10-01

    Monte Carlo analyses are used to show what inferences can and cannot be drawn either when a very small number of accidents results from a considerable exposure or when a very small number of people, down to a single individual, are exposed to small added risks. The distinction between relative and absolute uncertainty is illustrated. No new statistical principles are involved
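
    A minimal sketch, not taken from the report, of the kind of Monte Carlo reasoning it describes: for several assumed true accident rates, repeated simulation of a fixed exposure shows how often zero or one accident would be observed, i.e. how little such sparse observations constrain the underlying rate; the rate, exposure and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

exposure = 2000.0            # e.g. unit-years of operation (assumed)
n_trials = 100_000

for true_rate in (1e-4, 5e-4, 1e-3):   # accidents per unit exposure (assumed)
    counts = rng.poisson(true_rate * exposure, size=n_trials)
    p0 = np.mean(counts == 0)
    p_le1 = np.mean(counts <= 1)
    print(f"rate {true_rate:.0e}: P(0 accidents) = {p0:.3f}, P(<=1 accident) = {p_le1:.3f}")
```

    The output shows that observing zero accidents is quite probable for a wide range of underlying rates, which is exactly why such observations place only weak constraints on the true risk.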

  8. Road Zone Effects in Small-Mammal Communities

    Directory of Open Access Journals (Sweden)

    John A. Bissonette

    2009-06-01

    Our study focused on the putative effects of roads on small-mammal communities in a high desert region of southern Utah. Specifically, we tested whether or not roads create adjacent zones characterized by lower small-mammal densities, abundance, and diversity. We sampled abundance of small mammals at increasing distances from Interstate 15 during two summers. We recorded 11 genera and 13 species. We detected no clear abundance, density, or diversity effects relative to distance from the road. Only two of 13 species were never captured near roads. The abundance of the remaining 11 small mammal species was either similar at different distances from the road or higher closer to the road. We conclude that although roads may act as barriers and possible sources of mortality, adjacent zones of vegetation often provide favorable microhabitat in the desert landscape for many small mammals.

  9. Comparison of POCIS passive samplers vs. composite water sampling: A case study.

    Science.gov (United States)

    Criquet, Justine; Dumoulin, David; Howsam, Michael; Mondamert, Leslie; Goossens, Jean-François; Prygiel, Jean; Billon, Gabriel

    2017-12-31

    The relevance of Polar Organic Chemical Integrative Samplers (POCIS) was evaluated for the assessment of concentrations of 46 pesticides and 19 pharmaceuticals in a small, peri-urban river with multi-origin inputs. Throughout the period of POCIS deployment, 24 h-average water samples were collected automatically, and showed the rapid temporal evolution of concentrations of several micropollutants, as well as permitting the calculation of average concentrations in the water phase for comparison with those estimated from POCIS passive samplers. In the daily water samples, cyproconazol, epoxyconazol and imidacloprid showed high temporal variations, with concentrations ranging from under the limit of detection up to several hundred ng L⁻¹. Erythromycin, cyprofloxacin and iopromide also increased rapidly up to tens of ng L⁻¹ within a few days. Conversely, atrazine, caffeine, diclofenac, and to a lesser extent carbamazepine and sucralose, were systematically present in the water samples and showed limited variation in concentrations. For most of the substances studied here, the passive samplers gave reliable average concentrations between the minimal and maximal daily concentrations during the time of deployment. For pesticides, a relatively good correlation was clearly established (R² = 0.89) between the concentrations obtained by POCIS and those gained from average water samples. A slight underestimation of the concentration by POCIS can be attributed to sampling rates extracted from the literature that are inappropriate for our system, and new values are proposed. Considering the whole data set, 75% of the results indicate a relatively good agreement between the POCIS and the average water sample concentrations (values of the ratio ranging between 0.33 and 3). Note further that this agreement remains valid when considering different sampling rates extracted from the literature. Copyright © 2017 Elsevier B.V. All rights reserved.
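
    The comparison above rests on converting the mass accumulated in a POCIS into a time-weighted average water concentration, commonly via C_TWA = M / (Rs * t) in the integrative (linear-uptake) regime. The sketch below applies this relation with made-up numbers; the paper's compound-specific sampling rates are not reproduced.

```python
def pocis_twa_concentration(mass_ng, sampling_rate_l_per_day, days):
    """Time-weighted average concentration (ng/L) in the linear-uptake regime:
    C_TWA = M / (Rs * t)."""
    return mass_ng / (sampling_rate_l_per_day * days)

# Hypothetical example: 42 ng accumulated over a 14-day deployment with Rs = 0.24 L/day.
c_twa = pocis_twa_concentration(mass_ng=42.0, sampling_rate_l_per_day=0.24, days=14)
print(f"estimated TWA concentration: {c_twa:.1f} ng/L")   # about 12.5 ng/L
```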

  10. Efficient free energy calculations by combining two complementary tempering sampling methods.

    Science.gov (United States)

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun

    2017-01-14

    Although energy barriers can be efficiently crossed in reaction coordinate (RC) guided sampling, this type of method suffers from the difficulty of identifying the correct RCs or from the high dimensionality of the RCs required for a given system. If only approximate RCs with significant barriers are used in the simulations, hidden energy barriers of small to medium height would exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause the problem of insufficient sampling. To address the sampling in this so-called hidden barrier situation, here we propose an effective approach to combine temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with the integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD and the sampling of the rest of the DOFs with lower but not negligible barriers is enhanced by ITS. The performance of ITS-TAMD has been examined on three systems involving processes with hidden barriers. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least fivefold even in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of necessary RCs can be reduced. Our work shows more potential applications of the ITS-TAMD method as an efficient and powerful tool for the investigation of a broad range of interesting cases.

  11. Robust weak measurements on finite samples

    International Nuclear Information System (INIS)

    Tollaksen, Jeff

    2007-01-01

    A new weak measurement procedure is introduced for finite samples which yields accurate weak values that are outside the range of eigenvalues and which do not require an exponentially rare ensemble. This procedure provides a unique advantage in the amplification of small nonrandom signals by minimizing uncertainties in determining the weak value and by minimizing sample size. This procedure can also extend the strength of the coupling between the system and measuring device to a new regime

  12. Canadian small wind market

    International Nuclear Information System (INIS)

    Moorhouse, E.

    2010-01-01

    This PowerPoint presentation discussed initiatives and strategies adopted by the Canadian Wind Energy Association (CanWEA) to support the development of Canada's small wind market. The general public has shown a significant interest in small wind projects of 300 kW. Studies have demonstrated that familiarity and comfort with small wind projects can help to ensure the successful implementation of larger wind projects. Small wind markets include residential, farming and commercial, and remote community applications. The results of the CanWEA market survey show that the small wind market grew by 78 percent in 2008 over 2007, and by a further 32 percent in 2009 over 2008. The average turbine size is 1 kW. A total of 11,000 turbines were purchased in 2007 and 2008. Global small wind market growth increased by 110 percent in 2008, and the average turbine size was 2.4 kW. Eighty-seven percent of the turbines made by Canadian mid-size wind turbine manufacturers are exported, and there is now a significant risk that Canada will lose its competitive advantage in small wind manufacturing as financial incentives have not been implemented. American and Canadian-based small wind manufacturers were listed, and small wind policies were reviewed. The presentation concluded with a set of recommendations for future incentives, educational programs and legislation. tabs., figs.

  13. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The

  14. Risk management strategies utilized by small scale poultry farmers ...

    African Journals Online (AJOL)

    Birds can only tolerate narrow temperature changes; therefore, poultry flocks are vulnerable to climate induced risk. This study investigated risk management strategies utilized by small scale poultry farmers in Oyo state. A total of 118 respondents were sampled using multi stage sampling procedure. Interview schedule was ...

  15. Comparison of three dehydration scales showed that they were of limited or no value for assessing small children with acute diarrhoea.

    Science.gov (United States)

    Pomorska, Dominika; Dziechciarz, Piotr; Mduma, Esto; Gidion, Joshua; Falszewska, Anna; Szajewska, Hania

    2018-02-20

    We explored the diagnostic accuracy of the clinical dehydration scale (CDS), the World Health Organization (WHO) scale and the Gorelick scale for assessing dehydration in children admitted to a Tanzanian referral hospital. This was a prospective, observational study, carried out from April 2015 to January 2017 on children aged one month to five years admitted to the hospital with acute diarrhoea lasting less than five days. Before rehydration therapy, each patient's weight was recorded and the degree of dehydration was assessed based on the three scales. The reference standard was the percentage weight change between admission and discharge. The main outcomes were the sensitivity, specificity and positive and negative likelihood ratios (LRs) of the scales. Data from 124 eligible patients were available. The CDS showed limited value for ruling in cases with some dehydration (LR 1.9, 95% confidence interval 1.1-2.8), but was of no value in assessing no and moderate to severe dehydration. The WHO and Gorelick scales were of no value in evaluating any degree of dehydration. The WHO and Gorelick dehydration scales were no use for assessing dehydration in small children, and the CDS was of limited use for predicting cases with some dehydration. ©2018 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
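
    For reference, the diagnostic measures reported above can be computed from a 2x2 table of scale result against the weight-change reference standard; the counts below are hypothetical and chosen only to show the arithmetic.

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)          # LR+: how much a positive result raises the odds
    lr_neg = (1 - sens) / spec          # LR-: how much a negative result lowers the odds
    return sens, spec, lr_pos, lr_neg

# Hypothetical counts: positive/negative scale result vs. dehydrated/not dehydrated
# according to the weight-change reference standard.
sens, spec, lr_pos, lr_neg = diagnostic_measures(tp=30, fp=25, fn=20, tn=49)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}")
```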

  16. Analysis of femtogram-sized plutonium samples by thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Smith, D.H.; Duckworth, D.C.; Bostick, D.T.; Coleman, R.M.; McPherson, R.L.; McKown, H.S.

    1994-01-01

    The goal of this investigation was to extend the ability to perform isotopic analysis of plutonium to samples as small as possible. Plutonium ionizes thermally with quite good efficiency (first ionization potential 5.7 eV). Sub-nanogram sized samples can be analyzed on a near-routine basis given the necessary instrumentation. Efforts in this laboratory have been directed at rhenium-carbon systems; solutions of carbon in rhenium provide surfaces with work functions higher than pure rhenium (5.8 vs. ∼ 5.4 eV). Using a single resin bead as a sample loading medium both concentrates the sample nearly to a point and, due to its interaction with rhenium, produces the desired composite surface. Earlier work in this area showed that a layer of rhenium powder slurried in solution containing carbon substantially enhanced precision of isotopic measurements for uranium. Isotopic fractionation was virtually eliminated, and ionization efficiencies 2-5 times better than previously measured were attained for both Pu and U (1.7 and 0.5%, respectively). The other side of this coin should be the ability to analyze smaller samples, which is the subject of this report

  17. [Acidity and microbial flora contaminating Senegalese reconstituted curdled milk produced on small scale].

    Science.gov (United States)

    Seydi, M; Ndiaye, M

    1993-01-01

    The present study was carried out to assess the acidity and survey the microbiological quality of reconstituted curdled milk (R.C.M.) produced on a small scale. For this, a total of 100 samples collected from Dakar's sellers were examined. The results show that: the pH of R.C.M. is low and very variable (4.17 +/- 0.30); titratable acidity is high (152.6 +/- 26.5 degrees D); 5% of samples are contaminated by fecal coliforms, with counts of 20 germs/g; 19% of samples are contaminated by Staphylococcus aureus at levels of 10² germs/g or higher; yeasts and moulds are present in all samples; salmonella were absent. The survival of Staphylococcus aureus in curdled milk despite a dysgenesic acidity may be the result of very unhygienic conditions of preparation. Sellers and consumers must be encouraged to respect hygienic rules of preparation; it is the only way to prevent food poisoning.

  18. Obtaining Self-Samples to Diagnose Curable Sexually Transmitted Infections: A Systematic Review of Patients’ Experiences

    Science.gov (United States)

    Paudyal, Priyamvada; Llewellyn, Carrie; Lau, Jason; Mahmud, Mohammad; Smith, Helen

    2015-01-01

    Background Routine screening is key to sexually transmitted infection (STI) prevention and control. Previous studies suggest that clinic-based screening programmes capture only a small proportion of people with STIs. Self-sampling using non- or minimally invasive techniques may be beneficial for those reluctant to actively engage with conventional sampling methods. We systematically reviewed studies of patients’ experiences of obtaining self-samples to diagnose curable STIs. Methods We conducted an electronic search of MEDLINE, EMBASE, CINAHL, PsychINFO, BNI, and the Cochrane Database of Systematic Reviews to identify relevant articles published in English between January 1980 and March 2014. Studies were included if participants self-sampled for the diagnosis of a curable STI and had specifically sought participants’ opinions of their experience, acceptability, preferences, or willingness to self-sample. Results The initial search yielded 558 references. Of these, 45 studies met the inclusion criteria. Thirty-six studies assessed patients’ acceptability and experiences of self-sampling. Pooled results from these studies show that self-sampling is a highly acceptable method, with 85% of patients reporting the method to be well received and acceptable. Twenty-eight studies reported on ease of self-sampling; the majority of patients (88%) in these studies found self-sampling an “easy” procedure. Self-sampling was favoured compared to clinician sampling, and home sampling was preferred to clinic-based sampling. Females and older participants were more accepting of self-sampling. Only a small minority of participants (13%) reported pain during self-sampling. Participants were willing to undergo self-sampling and to recommend it to others. Privacy and safety were the most common concerns. Conclusion Self-sampling for diagnostic testing is well accepted, with the majority having a positive experience and a willingness to use it again. Standardization of self-sampling procedures

  19. Influences of sampling effort on detected patterns and structuring processes of a Neotropical plant-hummingbird network.

    Science.gov (United States)

    Vizentin-Bugoni, Jeferson; Maruyama, Pietro K; Debastiani, Vanderlei J; Duarte, L da S; Dalsgaard, Bo; Sazima, Marlies

    2016-01-01

    Virtually all empirical ecological interaction networks to some extent suffer from undersampling. However, how limitations imposed by sampling incompleteness affect our understanding of ecological networks is still poorly explored, which may hinder further advances in the field. Here, we use a plant-hummingbird network with unprecedented sampling effort (2716 h of focal observations) from the Atlantic Rainforest in Brazil, to investigate how sampling effort affects the description of network structure (i.e. widely used network metrics) and the relative importance of distinct processes (i.e. species abundances vs. traits) in determining the frequency of pairwise interactions. By dividing the network into time slices representing a gradient of sampling effort, we show that quantitative metrics, such as interaction evenness, specialization (H2 '), weighted nestedness (wNODF) and modularity (Q; QuanBiMo algorithm) were less biased by sampling incompleteness than binary metrics. Furthermore, the significance of some network metrics changed along the sampling effort gradient. Nevertheless, the higher importance of traits in structuring the network was apparent even with small sampling effort. Our results (i) warn against using very poorly sampled networks as this may bias our understanding of networks, both their patterns and structuring processes, (ii) encourage the use of quantitative metrics little influenced by sampling when performing spatio-temporal comparisons and (iii) indicate that in networks strongly constrained by species traits, such as plant-hummingbird networks, even small sampling is sufficient to detect their relative importance for the frequencies of interactions. Finally, we argue that similar effects of sampling are expected for other highly specialized subnetworks. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.

  20. SmallSat Database

    Science.gov (United States)

    Petropulos, Dolores; Bittner, David; Murawski, Robert; Golden, Bert

    2015-01-01

    The SmallSat has an unrealized potential in both private industry and the federal government. Currently over 70 companies, 50 universities and 17 governmental agencies are involved in SmallSat research and development. In 1994, the U.S. Army Missile and Defense mapped the moon using smallSat imagery. Since then smart phones have introduced this imagery to the people of the world as diverse industries watched this trend. The deployment cost of smallSats is also greatly reduced compared to traditional satellites due to the fact that multiple units can be deployed in a single mission. Imaging payloads have become more sophisticated, smaller and lighter. In addition, the growth of small technology obtained from private industries has led to the more widespread use of smallSats. This includes greater revisit rates in imagery, significantly lower costs, the ability to update technology more frequently and the ability to decrease vulnerability to enemy attacks. The popularity of smallSats shows a changing mentality in this fast-paced world of tomorrow. What impact has this created on the NASA communication networks now and in future years? In this project, we are developing the SmallSat Relational Database which can support a simulation of smallSats within the NASA SCaN Compatibility Environment for Networks and Integrated Communications (SCENIC) Modeling and Simulation Lab. The NASA Space Communications and Networks (SCaN) Program can use this modeling to project required network support needs in the next 10 to 15 years. The SmallSat Relational Database could model smallSats just as the other SCaN databases model the more traditional larger satellites, with a few exceptions. One is that the smallSat database is designed to be built-to-order. The SmallSat database holds various hardware configurations that can be used to model a smallSat. It will require significant effort to develop as the research material can only be populated by hand to obtain the unique data

  1. Composition and abundance of small mammal communities in forest fragments and vegetation corridors in Southern Minas Gerais, Brazil

    Directory of Open Access Journals (Sweden)

    Andréa O. Mesquita

    2012-09-01

    Habitat fragmentation leads to isolation and reduced habitat areas, in addition to a series of negative effects on natural populations, affecting richness, abundance and distribution of animal species. In this context, habitat corridors serve as an alternative for connectivity in fragmented landscapes, minimizing the effects of structural isolation of different habitat areas. This study evaluated the richness, composition and abundance of small mammal communities in forest fragments and in the relevant vegetation corridors that connect these fragments, located in Southern Minas Gerais, Southeastern Brazil. Ten sites were sampled (five forest fragments and five vegetation corridors) using the capture-mark-recapture method, from April 2007 to March 2008. A total sampling effort of 6,300 trap-nights resulted in 656 captures of 249 individuals. Across the 10 sites sampled, 11 small mammal species were recorded. Multidimensional scaling (MDS) ordinations and ANOSIM based on the composition of small mammal communities within the corridor and fragment environments revealed a qualitative difference between the two. Regarding abundance, there was no significant difference between corridors and fragments. In comparing mean values of abundance per species in each environment, only Cerradomys subflavus showed a significant difference, being more abundant in the corridor environment. Results suggest that the presence of several small mammal species in the corridor environment, in relatively high abundances, could indicate the use of corridors as habitat, though corridors might also facilitate and/or allow the movement of individuals using different habitat patches (fragments).

  2. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.

  3. Determination of the optimal sample size for a clinical trial accounting for the population size.

    Science.gov (United States)

    Stallard, Nigel; Miller, Frank; Day, Simon; Hee, Siew Wan; Madan, Jason; Zohar, Sarah; Posch, Martin

    2017-07-01

    The problem of choosing a sample size for a clinical trial is a very common one. In some settings, such as rare diseases or other small populations, the large sample sizes usually associated with the standard frequentist approach may be infeasible, suggesting that the sample size chosen should reflect the size of the population under consideration. Incorporation of the population size is possible in a decision-theoretic approach either explicitly by assuming that the population size is fixed and known, or implicitly through geometric discounting of the gain from future patients reflecting the expected population size. This paper develops such approaches. Building on previous work, an asymptotic expression is derived for the sample size for single and two-arm clinical trials in the general case of a clinical trial with a primary endpoint with a distribution of one parameter exponential family form that optimizes a utility function that quantifies the cost and gain per patient as a continuous function of this parameter. It is shown that as the size of the population, N, or expected size, N∗ in the case of geometric discounting, becomes large, the optimal trial size is O(N^{1/2}) or O(N∗^{1/2}). The sample size obtained from the asymptotic expression is also compared with the exact optimal sample size in examples with responses with Bernoulli and Poisson distributions, showing that the asymptotic approximations can also be reasonable in relatively small sample sizes. © 2016 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
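
    A toy numerical check of the O(N^{1/2}) scaling, sketched under simplifying assumptions that are not the paper's utility function: a two-arm trial with normally distributed outcomes and a normal prior on the treatment effect, where after enrolling n patients per arm the remaining N - 2n patients all receive whichever arm had the higher sample mean. The expected gain is maximised over n and the optimiser is compared with sqrt(N).

```python
import numpy as np

# Assumed toy model: outcome SD sigma, treatment effect delta ~ N(0, tau^2) a priori.
sigma, tau = 1.0, 0.25

def expected_total_gain(n, N):
    """Expected benefit to the remaining N - 2n patients if, after the trial,
    they all receive whichever arm had the higher sample mean."""
    a = np.sqrt(n) / (sigma * np.sqrt(2.0))     # precision of the observed mean difference
    # E[delta * Phi(a * delta)] for delta ~ N(0, tau^2): expected gain per future patient.
    per_patient = tau**2 * a / np.sqrt(2.0 * np.pi * (1.0 + (a * tau) ** 2))
    return (N - 2.0 * n) * per_patient

for N in (1_000, 10_000, 100_000):
    n_grid = np.arange(1, N // 2)
    n_opt = n_grid[np.argmax(expected_total_gain(n_grid, N))]
    print(f"N = {N:7d}: optimal n per arm = {n_opt:5d},  n_opt / sqrt(N) = {n_opt / np.sqrt(N):.2f}")
```

    In this toy model the ratio n_opt / sqrt(N) settles near a constant as N grows, which is consistent with the square-root scaling reported in the record.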

  4. Distribution and abundance of small plastic debris on beaches in the SE Pacific (Chile): a study supported by a citizen science project.

    Science.gov (United States)

    Hidalgo-Ruz, Valeria; Thiel, Martin

    2013-01-01

    The accumulation of large and small plastic debris is a problem throughout the world's oceans and coastlines. Abundances and types of small plastic debris have only been reported for some isolated beaches in the SE Pacific, but these data are insufficient to evaluate the situation in this region. The citizen science project "National Sampling of Small Plastic Debris" was supported by schoolchildren from all over Chile who documented the distribution and abundance of small plastic debris on Chilean beaches. Thirty-nine schools and nearly 1000 students from continental Chile and Easter Island participated in the activity. To validate the data obtained by the students, all samples were recounted in the laboratory. The results of the present study showed that the students were able to follow the instructions and generate reliable data. The average abundance obtained was 27 small plastic pieces per m² for the continental coast of Chile, but the samples from Easter Island had extraordinarily higher abundances (>800 items per m²). The abundance of small plastic debris on the continental coast could be associated with coastal urban centers and their economic activities. The high abundance found on Easter Island can be explained mainly by the transport of plastic debris via the surface currents in the South Pacific Subtropical Gyre, resulting in the accumulation of small plastic debris on the beaches of the island. This first report of the widespread distribution and abundance of small plastic debris on Chilean beaches underscores the need to extend plastic debris research to ecological aspects of the problem and to improve waste management. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. The marketing strategy of a small enterprise

    OpenAIRE

    Trandafilović Igor

    2008-01-01

    Small enterprises show specific characteristics that must be taken into account when creating a marketing strategy. The goal of this work is to analyse the marketing strategy of small enterprises. A necessary prerequisite of the analysis is to define the small enterprise. The specific requirements placed on small entrepreneurs are also analysed. One part of the work is dedicated to the management strategy of small enterprises and another to their marketing communications. There are mentioned...

  6. Factors related to the capital structure of small new ventures

    NARCIS (Netherlands)

    Harms, Rainer; Breitenecker, R.; Schwartz, E.J.; Wdowiak, M.A.

    2012-01-01

    In the literature, there exists evidence on the capital structure determinants for small ventures, but empirical research for new ventures is limited. We seek to address this gap by presenting a confirmatory analysis of determinants of capital structure of a sample of small new ventures in Austria.

  7. Sampling point selection for energy estimation in the quasicontinuum method

    NARCIS (Netherlands)

    Beex, L.A.A.; Peerlings, R.H.J.; Geers, M.G.D.

    2010-01-01

    The quasicontinuum (QC) method reduces computational costs of atomistic calculations by using interpolation between a small number of so-called repatoms to represent the displacements of the complete lattice and by selecting a small number of sampling atoms to estimate the total potential energy of

  8. An inexpensive and portable microvolumeter for rapid evaluation of biological samples.

    Science.gov (United States)

    Douglass, John K; Wcislo, William T

    2010-08-01

    We describe an improved microvolumeter (MVM) for rapidly measuring volumes of small biological samples, including live zooplankton, embryos, and small animals and organs. Portability and low cost make this instrument suitable for widespread use, including at remote field sites. Beginning with Archimedes' principle, which states that immersing an arbitrarily shaped sample in a fluid-filled container displaces an equivalent volume, we identified procedures that maximize measurement accuracy and repeatability across a broad range of absolute volumes. Crucial steps include matching the overall configuration to the size of the sample, using reflected light to monitor fluid levels precisely, and accounting for evaporation during measurements. The resulting precision is at least 100 times higher than in previous displacement-based methods. Volumes are obtained much faster than by traditional histological or confocal methods and without shrinkage artifacts due to fixation or dehydration. Calibrations using volume standards confirmed accurate measurements of volumes as small as 0.06 microL. We validated the feasibility of evaluating soft-tissue samples by comparing volumes of freshly dissected ant brains measured with the MVM and by confocal reconstruction.
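
    A minimal sketch of the displacement arithmetic behind such a microvolumeter, assuming a cylindrical fluid column of known inner radius and a linear evaporation correction estimated from blank readings; the function name, numbers and correction scheme are illustrative, not the instrument's actual calibration.

```python
import math

def sample_volume_ul(radius_mm, level_before_mm, level_after_mm,
                     evaporation_rate_mm_per_s=0.0, elapsed_s=0.0):
    """Volume of an immersed sample from the rise of fluid in a cylindrical column
    (Archimedes' principle), corrected for evaporation during the measurement.
    1 mm^3 equals 1 microliter."""
    corrected_rise_mm = (level_after_mm - level_before_mm) + evaporation_rate_mm_per_s * elapsed_s
    return math.pi * radius_mm ** 2 * corrected_rise_mm

# Illustrative reading: 1.5 mm inner radius, 0.012 mm net level rise over 30 s,
# with evaporation lowering the level by 0.0001 mm/s during the measurement.
print(f"{sample_volume_ul(1.5, 10.000, 10.012, 0.0001, 30.0):.3f} uL")
```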

  9. The marketing strategy of a small enterprise

    Directory of Open Access Journals (Sweden)

    Trandafilović Igor

    2008-01-01

    Full Text Available Small enterprises show specific characteristics that must be taken into account when creating a marketing strategy. The goal of this work is to analyse the marketing strategy of small enterprises. A necessary prerequisite of the analysis is to define the small enterprise. The specific requirements placed on small entrepreneurs are also analysed. One part of the work is dedicated to the management strategy of small enterprises and another to their marketing communications. Some specific advantages and problems of small enterprises are also mentioned.

  10. Linear models for airborne-laser-scanning-based operational forest inventory with small field sample size and highly correlated LiDAR data

    Science.gov (United States)

    Junttila, Virpi; Kauranne, Tuomo; Finley, Andrew O.; Bradford, John B.

    2015-01-01

    Modern operational forest inventory often uses remotely sensed data that cover the whole inventory area to produce spatially explicit estimates of forest properties through statistical models. The data obtained by airborne light detection and ranging (LiDAR) correlate well with many forest inventory variables, such as the tree height, the timber volume, and the biomass. To construct an accurate model over thousands of hectares, LiDAR data must be supplemented with several hundred field sample measurements of forest inventory variables. This can be costly and time consuming. Different LiDAR-data-based and spatial-data-based sampling designs can reduce the number of field sample plots needed. However, problems arising from the features of the LiDAR data, such as a large number of predictors compared with the sample size (overfitting) or a strong correlation among predictors (multicollinearity), may decrease the accuracy and precision of the estimates and predictions. To overcome these problems, a Bayesian linear model with the singular value decomposition of predictors, combined with regularization, is proposed. The model performance in predicting different forest inventory variables is verified in ten inventory areas from two continents, where the number of field sample plots is reduced using different sampling designs. The results show that, with an appropriate field plot selection strategy and the proposed linear model, the total relative error of the predicted forest inventory variables is only 5%–15% larger using 50 field sample plots than the error of a linear model estimated with several hundred field sample plots when we sum up the error due to both the model noise variance and the model’s lack of fit.
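
    A compact sketch of the kind of regularized, SVD-based linear prediction described above, run on synthetic data with many correlated LiDAR-like predictors and few field plots. This is plain ridge regression computed through the SVD, a deliberate simplification of the Bayesian model in the record; all variable names and constants are made up.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "LiDAR" predictors: 50 field plots, 30 highly correlated metrics.
n_plots, n_metrics = 50, 30
base = rng.normal(size=(n_plots, 1))
X = base + 0.1 * rng.normal(size=(n_plots, n_metrics))            # strong multicollinearity
y = 120.0 + 35.0 * base[:, 0] + 5.0 * rng.normal(size=n_plots)    # e.g. stem volume, m3/ha

def ridge_svd(X, y, lam):
    """Ridge regression computed through the SVD of the centered predictors."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(X - x_mean, full_matrices=False)
    shrink = s / (s ** 2 + lam)                          # regularized inverse singular values
    beta = Vt.T @ (shrink * (U.T @ (y - y_mean)))
    return beta, y_mean - x_mean @ beta                  # coefficients and intercept

beta, intercept = ridge_svd(X, y, lam=10.0)
pred = X @ beta + intercept
rel_rmse = np.sqrt(np.mean((pred - y) ** 2)) / y.mean()
print(f"in-sample relative RMSE: {rel_rmse:.2%}")
```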

  11. Universal quantum dot-based sandwich-like immunoassay strategy for rapid and ultrasensitive detection of small molecules using portable and reusable optofluidic nano-biosensing platform

    International Nuclear Information System (INIS)

    Zhou, Liping; Zhu, Anna; Lou, Xuening; Song, Dan; Yang, Rong; Shi, Hanchang; Long, Feng

    2016-01-01

    A universal sandwich-like immunoassay strategy based on a quantum-dot immunoprobe (QD-labeled anti-mouse IgG antibody) was developed for rapid and ultrasensitive detection of small molecules. A portable and reusable optofluidic nano-biosensing platform was applied to investigate the sandwich-like immunoassay mechanism and format for small molecules, as well as the binding kinetics between the QD immunoprobe and the anti-small molecule antibody. A two-step immunoassay method, which involves pre-incubation of a mixture of different concentrations of the small molecule and the anti-small molecule antibody and subsequent introduction of the QD immunoprobe into the optofluidic cell, was used for small molecule determination. Compared with the one-step immunoassay method, the two-step method obtains a higher fluorescence signal and a higher sensitivity index, thus improving the nano-biosensing performance. Based on the proposed strategy, two model targets, namely microcystin-LR (MC-LR) and bisphenol A (BPA), were tested with high sensitivity, rapidity, and ease of use. A higher concentration of small molecules in the sample led to less anti-small molecule antibody bound to the antigen-carrier protein conjugate immobilized onto the sensor surface, and fewer QD immunoprobes bound to the anti-small molecule antibody. This phenomenon lowered the fluorescence signal detected by the nano-biosensing platform. Under optimal operating conditions, MC-LR and BPA exhibited limits of detection of 0.003 and 0.04 μg/L, respectively. These LODs were better than those of the indirect competitive immunoassay method for small molecules via a Cy5.5-labeled anti-small molecule antibody. The proposed QD-based sandwich-like immunoassay strategy was evaluated in spiked water samples and showed good recovery, precision and accuracy without complicated sample pretreatments. All these results demonstrate that the new detection strategy could be readily applied to other trace small molecules in real water samples.

  12. Universal quantum dot-based sandwich-like immunoassay strategy for rapid and ultrasensitive detection of small molecules using portable and reusable optofluidic nano-biosensing platform

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Liping; Zhu, Anna; Lou, Xuening; Song, Dan; Yang, Rong [School of Environment and Natural Resources, Renmin University of China, Beijing (China); Shi, Hanchang [School of Environment, Tsinghua University, Beijing (China); Long, Feng, E-mail: longf04@ruc.edu.cn [School of Environment and Natural Resources, Renmin University of China, Beijing (China)

    2016-01-28

    A universal sandwich-like immunoassay strategy based on a quantum-dot immunoprobe (QD-labeled anti-mouse IgG antibody) was developed for rapid and ultrasensitive detection of small molecules. A portable and reusable optofluidic nano-biosensing platform was applied to investigate the sandwich-like immunoassay mechanism and format for small molecules, as well as the binding kinetics between the QD immunoprobe and the anti-small molecule antibody. A two-step immunoassay method, which involves pre-incubation of a mixture of different concentrations of the small molecule and the anti-small molecule antibody and subsequent introduction of the QD immunoprobe into the optofluidic cell, was used for small molecule determination. Compared with the one-step immunoassay method, the two-step method obtains a higher fluorescence signal and a higher sensitivity index, thus improving the nano-biosensing performance. Based on the proposed strategy, two model targets, namely microcystin-LR (MC-LR) and bisphenol A (BPA), were tested with high sensitivity, rapidity, and ease of use. A higher concentration of small molecules in the sample led to less anti-small molecule antibody bound to the antigen-carrier protein conjugate immobilized onto the sensor surface, and fewer QD immunoprobes bound to the anti-small molecule antibody. This phenomenon lowered the fluorescence signal detected by the nano-biosensing platform. Under optimal operating conditions, MC-LR and BPA exhibited limits of detection of 0.003 and 0.04 μg/L, respectively. These LODs were better than those of the indirect competitive immunoassay method for small molecules via a Cy5.5-labeled anti-small molecule antibody. The proposed QD-based sandwich-like immunoassay strategy was evaluated in spiked water samples and showed good recovery, precision and accuracy without complicated sample pretreatments. All these results demonstrate that the new detection strategy could be readily applied to other trace small molecules in real water samples.

  13. Analysis of methods commonly used in biomedicine for treatment versus control comparison of very small samples.

    Science.gov (United States)

    Ristić-Djurović, Jasna L; Ćirković, Saša; Mladenović, Pavle; Romčević, Nebojša; Trbovich, Alexander M

    2018-04-01

    A rough estimate indicated that use of samples of size not larger than ten is not uncommon in biomedical research and that many of such studies are limited to strong effects due to sample sizes smaller than six. For data collected from biomedical experiments it is also often unknown if mathematical requirements incorporated in the sample comparison methods are satisfied. Computer simulated experiments were used to examine performance of methods for qualitative sample comparison and its dependence on the effectiveness of exposure, effect intensity, distribution of studied parameter values in the population, and sample size. The Type I and Type II errors, their average, as well as the maximal errors were considered. The sample size 9 and the t-test method with p = 5% ensured error smaller than 5% even for weak effects. For sample sizes 6-8 the same method enabled detection of weak effects with errors smaller than 20%. If the sample sizes were 3-5, weak effects could not be detected with an acceptable error; however, the smallest maximal error in the most general case that includes weak effects is granted by the standard error of the mean method. The increase of sample size from 5 to 9 led to seven times more accurate detection of weak effects. Strong effects were detected regardless of the sample size and method used. The minimal recommended sample size for biomedical experiments is 9. Use of smaller sizes and the method of their comparison should be justified by the objective of the experiment. Copyright © 2018 Elsevier B.V. All rights reserved.
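
    The kind of computer-simulated experiment described can be sketched as follows: Monte Carlo estimates of the two-sample t-test's Type I error (no effect) and Type II error (a shift of one standard deviation, standing in for a "weak" effect) for group sizes 3 to 9. The effect size, repetition count and normal populations are illustrative assumptions, not the paper's protocol.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(123)

def error_rates(n, shift, reps=5_000, alpha=0.05):
    """Estimate Type I error (true shift = 0) and Type II error (true shift > 0)
    of the two-sample t-test at significance level alpha, for groups of size n."""
    rejected_null_true = 0
    rejected_effect_present = 0
    for _ in range(reps):
        control = rng.normal(0.0, 1.0, n)
        rejected_null_true += ttest_ind(control, rng.normal(0.0, 1.0, n)).pvalue < alpha
        rejected_effect_present += ttest_ind(control, rng.normal(shift, 1.0, n)).pvalue < alpha
    return rejected_null_true / reps, 1.0 - rejected_effect_present / reps

for n in range(3, 10):
    t1, t2 = error_rates(n, shift=1.0)   # one standard deviation as the "weak" effect
    print(f"n = {n}: Type I ~ {t1:.3f}, Type II ~ {t2:.3f}")
```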

  14. Sample processing procedures and radiocarbon dating

    International Nuclear Information System (INIS)

    Svetlik, Ivo; Tomaskova, Lenka; Dreslerova, Dagmar

    2010-01-01

    The article outlines radiocarbon dating routines and highlights the potential and limitations of this method. The author's institutions have been jointly running a conventional radiocarbon dating laboratory using the international CRL code. A procedure based on the synthesis of benzene is used. Small samples are sent abroad for dating because no AMS instrumentation is available in the Czech Republic so far. Our laboratory plans to introduce routines for the processing of milligram samples and preparation of graphitized targets for AMS

  15. Porosity of the Marcellus Shale: A contrast matching small-angle neutron scattering study

    Science.gov (United States)

    Bahadur, Jitendra; Ruppert, Leslie F.; Pipich, Vitaliy; Sakurovs, Richard; Melnichenko, Yuri B.

    2018-01-01

    Neutron scattering techniques were used to determine the effect of mineral matter on the accessibility of water and toluene to pores in the Devonian Marcellus Shale. Three Marcellus Shale samples, representing quartz-rich, clay-rich, and carbonate-rich facies, were examined using contrast matching small-angle neutron scattering (CM-SANS) at ambient pressure and temperature. Contrast matching compositions of H2O/D2O and toluene/deuterated toluene were used to probe open and closed pores of these three shale samples. Results show that although the mean pore radius was approximately the same for all three samples, the fractal dimension of the quartz-rich sample was higher than for the clay-rich and carbonate-rich samples, indicating different pore size distributions among the samples. The number density of pores was highest in the clay-rich sample and lowest in the quartz-rich sample. Contrast matching with water and toluene mixtures shows that the accessibility of pores to water and toluene also varied among the samples. In general, water accessed approximately 70–80% of the larger pores (>80 nm radius) in all three samples. At smaller pore sizes (~5–80 nm radius), the fraction of accessible pores decreases. The lowest accessibility to both fluids is at a pore throat size of ~25 nm radius, with the quartz-rich sample exhibiting lower accessibility than the clay- and carbonate-rich samples. The mechanism for this behaviour is unclear, but because the mineralogy of the three samples varies, it is likely that the inaccessible pores in this size range are associated with organics and not a specific mineral within the samples. At even smaller pore sizes, accessibility to toluene generally follows that of water; however, in the smallest pores (<2.5 nm radius), accessibility to toluene decreases, especially in the clay-rich sample, which contains about 30% more closed pores than the quartz- and carbonate-rich samples. Results from this study show that mineralogy of producing intervals

  16. Rhesus monkeys (Macaca mulatta) show robust primacy and recency in memory for lists from small, but not large, image sets.

    Science.gov (United States)

    Basile, Benjamin M; Hampton, Robert R

    2010-02-01

    The combination of primacy and recency produces a U-shaped serial position curve typical of memory for lists. In humans, primacy is often thought to result from rehearsal, but there is little evidence for rehearsal in nonhumans. To further evaluate the possibility that rehearsal contributes to primacy in monkeys, we compared memory for lists of familiar stimuli (which may be easier to rehearse) to memory for unfamiliar stimuli (which are likely difficult to rehearse). Six rhesus monkeys saw lists of five images drawn from either large, medium, or small image sets. After presentation of each list, memory for one item was assessed using a serial probe recognition test. Across four experiments, we found robust primacy and recency with lists drawn from small and medium, but not large, image sets. This finding is consistent with the idea that familiar items are easier to rehearse and that rehearsal contributes to primacy, warranting further study of the possibility of rehearsal in monkeys. However, alternative interpretations are also viable and are discussed. Copyright 2009 Elsevier B.V. All rights reserved.

  17. Phobos Sample Return: Next Approach

    Science.gov (United States)

    Zelenyi, Lev; Martynov, Maxim; Zakharov, Alexander; Korablev, Oleg; Ivanov, Alexey; Karabadzak, George

    The Martian moons still remain a mystery after numerous studies by Mars-orbiting spacecraft. Their study covers three major topics related to (1) the Solar system in general (formation and evolution, origin of planetary satellites, origin and evolution of life); (2) small bodies (a captured asteroid, remnants of Mars formation, or reaccreted Mars ejecta); (3) Mars itself (formation and evolution of Mars; Mars ejecta at the satellites). As reviewed by Galimov [2010], most of the above questions require sample return from a Martian moon, while some (e.g. the characterization of the organic matter) could also be answered by in situ experiments. There is the possibility to obtain a sample of Mars material by sampling Phobos: following Chappaz et al. [2012], a 200-g sample could contain 10⁻⁷ g of Mars surface material launched during the past 1 million years, or 5×10⁻⁵ g of Mars material launched during the past 10 million years, or 5×10¹⁰ individual particles from Mars, quantities suitable for accurate laboratory analyses. The studies of Phobos have been of high priority in the Russian program on planetary research for many years. The Phobos-88 mission consisted of two spacecraft (Phobos-1, Phobos-2) and aimed at an approach to Phobos at 50 m with remote studies, as well as the release of small landers (long-living DAS stations). This mission implemented its program only incompletely, returning information about the Martian environment and atmosphere. The next project, Phobos Sample Return (Phobos-Grunt), initially planned in the early 2000s, was delayed several times owing to budget difficulties; the spacecraft failed to leave NEO in 2011. The recovery of the science goals of this mission and the delivery of samples of Phobos to Earth remain of highest priority for the Russian scientific community. The next Phobos sample return mission, named Boomerang, was postponed following the ExoMars cooperation, but is considered next in the line of planetary exploration, suitable for launch around 2022.

  18. exTAS - next-generation TAS for small samples and extreme conditions

    International Nuclear Information System (INIS)

    Kulda, J.; Hiess, A.

    2011-01-01

    The currently used implementation of horizontally and vertically focusing optics in three-axis spectrometers (TAS) permits efficient studies of excitations in sub-cm³-sized single crystals. With the present proposal we wish to stimulate a further paradigm shift into the domain of mm³-sized samples. exTAS combines highly focused mm-sized focal spots, boosting the sensitivity limits, with a spectrometer layout down-scaled to a table-top size to provide high flexibility in optimizing acceptance angles and to achieve sub-millimeter positioning accuracy. (authors)

  19. Environmental certification for small hydropower plants

    International Nuclear Information System (INIS)

    Truffer, B.; Meier, W.; Vollenweider, S.; Seiler, B.; Dettli, R.

    2001-01-01

    This report for the Swiss Federal Institute for Environmental Science and Technology describes product-differentiation options for small hydropower plant in Switzerland and proposes a form of differentiation based on ecological characteristics as a promising market strategy. The labels created in various countries to assure customers of the environmental compatibility of 'green' power production are looked at. In particular, the implications for small hydropower plant associated with the Swiss green power labelling procedure introduced by the Association for the Promotion of Environmentally Sound Electricity (VUE) are discussed. The report proposes a simplified procedure for these small power stations and presents a sample calculation for the overall costs of certification. The report is rounded off with four detailed case studies in which the necessary upgrades to the plant and associated costs are discussed in detail

  20. Elucidating the Small Regulatory RNA Repertoire of the Sea Anemone Anemonia viridis Based on Whole Genome and Small RNA Sequencing.

    Science.gov (United States)

    Urbarova, Ilona; Patel, Hardip; Forêt, Sylvain; Karlsen, Bård Ove; Jørgensen, Tor Erik; Hall-Spencer, Jason M; Johansen, Steinar D

    2018-02-01

    Cnidarians harbor a variety of small regulatory RNAs that include microRNAs (miRNAs) and PIWI-interacting RNAs (piRNAs), but detailed information is limited. Here, we report the identification and expression of novel miRNAs and putative piRNAs, as well as their genomic loci, in the symbiotic sea anemone Anemonia viridis. We generated a draft assembly of the A. viridis genome with putative size of 313 Mb that appeared to be composed of about 36% repeats, including known transposable elements. We detected approximately equal fractions of DNA transposons and retrotransposons. Deep sequencing of small RNA libraries constructed from A. viridis adults sampled at a natural CO2 gradient off Vulcano Island, Italy, identified 70 distinct miRNAs. Eight were homologous to previously reported miRNAs in cnidarians, whereas 62 appeared novel. Nine miRNAs were recognized as differentially expressed along the natural seawater pH gradient. We found a highly abundant and diverse population of piRNAs, with a substantial fraction showing ping-pong signatures. We identified nearly 22% putative piRNAs potentially targeting transposable elements within the A. viridis genome. The A. viridis genome appeared similar in size to that of other hexacorals with a very high divergence of transposable elements resembling that of the sea anemone genus Exaiptasia. The genome encodes and expresses a high number of small regulatory RNAs, which include novel miRNAs and piRNAs. Differentially expressed small RNAs along the seawater pH gradient indicated regulatory gene responses to environmental stressors. © The Author(s) 2018. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  1. FACTORS CRISIS OF SMALL BUSINESS DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    A. I. Khorev

    2015-01-01

    Full Text Available Today it has become clear that Russia needs not only economic growth, but economic growth of a certain quality. Regions are no exception, since they are the most vulnerable to changes in market conditions for resource exports. Entrepreneurs should become a main driver of the economic indicators of the country's regions. Not accidentally, a retrospective analysis of the evolutionary development of small business shows that small business has played, and continues to play, a most important role in the economic development of Europe and America. The results of small business activity are generally positive; the country saw an increase in the main indicators: the number of small enterprises, the average number of employees in small enterprises, the turnover of small businesses, and investment in fixed assets. However, the statistics of recent years show a decline of interest in entrepreneurship among the population. The number of small businesses created in the last few years has shown a downward trend, while the proportion surviving for three years after registration is low. The greatest negative impact on business comes from the factors of regulatory and administrative regulation of small business, corruption and bribery of officials, and the activities of control and inspection bodies. Addressing these challenges will require proper coordination of regional and local resources in building infrastructure, forming investment mechanisms to support small enterprises, and developing the entrepreneurial potential of the population.

  2. Accounting for thermodynamic non-ideality in the Guinier region of small-angle scattering data of proteins.

    Science.gov (United States)

    Scott, David J

    2016-12-01

    Hydrodynamic studies of the solution properties of proteins and other biological macromolecules are often hard to interpret when the sample is present as a reasonably concentrated solution. The reason for this is that solutions exhibit deviations from ideal behaviour, which are manifested as thermodynamic non-ideality. The range of concentrations at which this behaviour typically is exhibited is as low as 1-2 mg/ml, well within the range of concentrations used for analysis by techniques such as small-angle scattering. Here we discuss thermodynamic non-ideality as used previously in the context of light scattering and sedimentation equilibrium analytical ultracentrifugation, and apply it to the Guinier region of small-angle scattering data. The results show that there is a complementarity between the radially averaged structure factor derived from small-angle X-ray scattering/small-angle neutron scattering studies and the second virial coefficient derived from sedimentation equilibrium analytical ultracentrifugation experiments.
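
    A hedged sketch of how concentration-dependent forward scattering is often handled in this spirit: Guinier fits of ln I(q) versus q² at several concentrations give apparent I(0) values, and a linear fit of c/I(0) against c returns the lumped virial term 2A₂M from the slope-to-intercept ratio. The synthetic intensities, constants and the simple structure-factor model below are assumptions for illustration, not the paper's treatment.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic Guinier-region data: I(q) = I0(c) * exp(-q^2 Rg^2 / 3), with
# I0(c) proportional to c / (1 + 2*A2*M*c)  (simple structure-factor model).
Rg = 2.5                                   # nm
A2M = 8.0                                  # lumped 2*A2*M term, mL/mg -- illustrative
q = np.linspace(0.05, 0.4, 30)             # 1/nm, inside the Guinier region (q*Rg <= 1)
concs = np.array([1.0, 2.0, 4.0, 8.0])     # mg/mL

def apparent_i0(c):
    i = (c / (1.0 + A2M * c)) * np.exp(-(q * Rg) ** 2 / 3.0)
    i *= 1.0 + 0.01 * rng.normal(size=q.size)            # 1% noise
    slope, intercept = np.polyfit(q ** 2, np.log(i), 1)  # Guinier fit: ln I = ln I0 - Rg^2 q^2 / 3
    return np.exp(intercept)

i0 = np.array([apparent_i0(c) for c in concs])
b, a = np.polyfit(concs, concs / i0, 1)                  # c/I0 = a + b*c
print(f"recovered 2*A2*M ~ {b / a:.2f} (true value {A2M})")
```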

  3. Sampling methods for amphibians in streams in the Pacific Northwest.

    Science.gov (United States)

    R. Bruce Bury; Paul Stephen. Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  4. Classification of natural formations based on their optical characteristics using small volumes of samples

    Science.gov (United States)

    Abramovich, N. S.; Kovalev, A. A.; Plyuta, V. Y.

    1986-02-01

    A computer algorithm has been developed to classify the spectral bands of natural scenes on Earth according to their optical characteristics. The algorithm is written in FORTRAN-IV and can be used in spectral data processing programs requiring small data loads. The spectral classifications of some different types of green vegetation canopies are given in order to illustrate the effectiveness of the algorithm.
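
    The original FORTRAN-IV routine is not reproduced in the record; as a rough modern stand-in, the sketch below classifies reflectance spectra by nearest centroid after normalization, which captures the flavour of grouping scenes by their optical characteristics. All band values, class names and helper functions are hypothetical.

```python
import numpy as np

def normalize(spectra):
    """Scale each spectrum to unit energy so classification depends on shape, not brightness."""
    spectra = np.asarray(spectra, dtype=float)
    return spectra / np.linalg.norm(spectra, axis=-1, keepdims=True)

def train_centroids(spectra, labels):
    spectra = normalize(spectra)
    return {lab: spectra[np.asarray(labels) == lab].mean(axis=0) for lab in set(labels)}

def classify(spectrum, centroids):
    s = normalize(spectrum)
    return min(centroids, key=lambda lab: np.linalg.norm(s - centroids[lab]))

# Hypothetical 4-band reflectances for two kinds of canopy.
training = [[0.05, 0.08, 0.04, 0.45], [0.06, 0.09, 0.05, 0.50],   # dense canopy
            [0.10, 0.14, 0.12, 0.30], [0.11, 0.15, 0.13, 0.28]]   # sparse canopy
labels = ["dense", "dense", "sparse", "sparse"]

centroids = train_centroids(training, labels)
print(classify([0.06, 0.08, 0.05, 0.48], centroids))   # expected: "dense"
```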

  5. Decellularized ovine arteries as small-diameter vascular grafts

    International Nuclear Information System (INIS)

    Mancuso, L; Cao, G; Gualerzi, A; Boschetti, F; Loy, F

    2014-01-01

    Atherosclerosis and its complications still represent the leading cause of death in the developed countries. While autologous blood vessels may be regarded as the best solution for peripheral and coronary bypass, they are unavailable in most patients. Even though tissue engineering techniques are often applied to the development of small-diameter vascular grafts, limiting factors of this approach are represented by the lack of essential extracellular matrix proteins and/or poor biomechanical properties of the scaffolds used. Along these lines, the aim of this study was to develop a decellularization protocol for ovine carotids to be used as suitable small-diameter vascular grafts. Samples were treated either with sodium dodecyl sulphate (SDS) or with Trypsin and Triton X-100; a final nuclease digestion was performed for both protocols. Morphological analyses demonstrate complete removal of nuclei and cellular components in treated vessels, also confirmed by significant reduction in wall thickness and DNA content. Essential extracellular matrix proteins such as collagen, elastin, and fibronectin are well preserved after decellularization. From a mechanical point of view, Trypsin and Triton X-100 treated arteries show elastic moduli and compliance comparable to native carotids, whereas the use of SDS makes samples stiffer, with a significant decrease in the mean compliance value and an increase in the longitudinal and circumferential Young’s moduli. It is demonstrated that the treatment where Trypsin and Triton X-100 are combined guarantees complete decellularization of carotids, with no significant alteration of biomechanical and structural properties, thus preserving a suitable environment for adhesion, proliferation, and migration of cells. (paper)

  6. Pseudolisthesis secondary to small L5

    International Nuclear Information System (INIS)

    Bagheri, Mohammad Hadi; Hosseini, Mehrdad Mohammad; Foroughi, Amin Aboulhassani

    2011-01-01

    Rationale and objective: A small AP diameter of L5 can mimic spondylolisthesis in lumbosacral MRI. The aim of the present study was to evaluate the prevalence of a small L5 mimicking spondylolisthesis in lumbosacral MRI of patients referred to an imaging center with any symptoms. The association between a small anteroposterior diameter of L5 and the presence or absence of unilateral or bilateral spondylolysis at L5 was also evaluated. Materials and methods: 2000 lumbosacral MRI examinations of patients referred to an imaging center for any reason were evaluated. The posterior step of L5 was measured and the cases with a posterior step of more than 2 mm were selected. These cases were evaluated for the presence or absence of spondylolysis. Results: Small anteroposterior diameters of L5 were detected in 38 cases (2%), mimicking spondylolisthesis in lumbosacral MRI. The prevalence of a small L5 in lumbosacral MRI of patients referred with any symptoms was therefore about 2%. Of these selected cases, 6 (15.8%) showed unilateral spondylolysis and 5 (13.1%) showed bilateral spondylolysis. Conclusion: Pseudospondylolisthesis secondary to a small AP diameter of L5 has a prevalence of about 2% in lumbosacral MRI of cases referred for any reason and can be overlooked in the reporting of LS MRI. There was an association between a small anteroposterior diameter of L5 and the presence of spondylolysis at L5.

  7. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…
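
    Since the record is truncated, here is a minimal, generic sketch of the simpler Warner randomized-response estimator rather than the Dirichlet-multinomial generalization it describes: with design probability p the respondent answers the sensitive statement and otherwise its negation, so the prevalence can be unmasked from the observed "yes" rate. The simulated survey below is entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)

def warner_estimate(answers, p):
    """Warner (1965) randomized response: with probability p the respondent is asked the
    sensitive statement, otherwise its negation, so P(yes) = p*pi + (1-p)*(1-pi)."""
    lam = np.mean(answers)
    pi_hat = (lam - (1.0 - p)) / (2.0 * p - 1.0)
    se = np.sqrt(lam * (1.0 - lam) / len(answers)) / abs(2.0 * p - 1.0)
    return pi_hat, se

# Simulate 2000 masked interviews with true prevalence 0.15 and design probability p = 0.7.
true_pi, p = 0.15, 0.7
has_trait = rng.random(2000) < true_pi
asked_sensitive = rng.random(2000) < p
answers = np.where(asked_sensitive, has_trait, ~has_trait)

pi_hat, se = warner_estimate(answers, p)
print(f"estimated prevalence {pi_hat:.3f} +/- {se:.3f} (true {true_pi})")
```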

  8. Small scale temporal variability in the phytoplankton of Independencia Bay, Pisco, Perú

    Directory of Open Access Journals (Sweden)

    Noemí Ochoa

    2013-06-01

    Full Text Available Small-scale temporal variations of coastal marine phytoplankton assemblages were studied. Water samples were collected at a fixed station in Bahia Independencia (Pisco, Peru). Sampling took place in the morning (08:00 h) and afternoon (15:00 h) over a period of 29 days (March 28 to April 25, 1988). Surface temperatures were also recorded, fluctuating between 15.4 °C and 17.2 °C. Diatoms were the principal component of the phytoplankton community and were most closely related to total phytoplankton. Other groups such as dinoflagellates, coccolithophorids, silicoflagellates and small flagellates were present but less important. Skeletonema costatum was the dominant species during the first nine days of sampling; after that it was replaced by Thalassionema nitzschioides, which remained dominant until the end of the study. Small variations in species composition but large fluctuations in phytoplankton density were recorded over periods of a few hours. Small increases in temperature influenced the phytoplankton assemblages.

  9. Towards a representative periphytic diatom sample

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The need to acquire a representative periphytic diatom sample for river water quality monitoring has been recognised in the development of existing diatom indices, important in the development and employment of diatom monitoring tools for the Water Framework Directive. In this study, a nested design with replication is employed to investigate the magnitude of variation in diatom biomass, composition and Trophic Diatom Index at varying scales within a small chalk river. The study shows that the use of artificial substrates may not result in diatom communities that are typical of the surrounding natural substrates. Periphytic diatom biomass and composition varies between artificial and natural substrates, riffles and glides and between two stretches of the river channel. The study also highlights the existence of high variation in diatom frustule frequency and biovolume at the individual replicate scale which may have implications for the use of diatoms in routine monitoring.

  10. An improved correlated sampling method for calculating correction factor of detector

    International Nuclear Information System (INIS)

    Wu Zhen; Li Junli; Cheng Jianping

    2006-01-01

    In the case of a small detector lying inside a bulk medium, there are two problems in calculating detector correction factors. One is that the detector is too small for particles to reach and collide in; the other is that the ratio of the two quantities is not accurate enough. The method discussed in this paper, which combines correlated sampling with modified particle-collision auto-importance sampling and has been implemented on the MCNP-4C platform, can solve both problems. In addition, three other variance reduction techniques are each combined with correlated sampling to calculate a simple model of the detector correction factors. The results show that, although all the variance reduction techniques combined with correlated sampling improve the calculation efficiency, the method combining modified particle-collision auto-importance sampling with correlated sampling is the most efficient. (authors)
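
    A minimal illustration of the correlated-sampling idea, not the MCNP implementation: estimate the ratio of two nearly identical integrals once with independent random numbers and once with common (correlated) random numbers, and compare the spread of the ratio estimates. The integrands and sample sizes are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2024)

# Two nearly identical responses; their ratio is the quantity of interest
# (analogous to a correction factor = perturbed / unperturbed detector response).
def f(x):
    return np.exp(-x)            # unperturbed response

def g(x):
    return np.exp(-1.02 * x)     # slightly perturbed response

def ratio_estimates(n_samples, correlated, reps=2000):
    est = np.empty(reps)
    for i in range(reps):
        u1 = rng.random(n_samples)
        u2 = u1 if correlated else rng.random(n_samples)   # common random numbers or not
        est[i] = g(u2).mean() / f(u1).mean()
    return est

for correlated in (False, True):
    est = ratio_estimates(1000, correlated)
    label = "correlated" if correlated else "independent"
    print(f"{label:>11}: ratio = {est.mean():.5f} +/- {est.std():.5f}")
```

    With common random numbers the fluctuations of the two estimates largely cancel in the ratio, which is the same reason correlated sampling helps when the perturbed and unperturbed detector responses are computed from the same particle histories.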

  11. A New Attribute Control Chart using Multiple Dependent State Repetitive Sampling

    KAUST Repository

    Aldosari, Mansour Sattam; Aslam, Muhammad; Jun, Chi-Hyuck

    2017-01-01

    In this manuscript, a new attribute control chart using multiple dependent state repetitive sampling is designed. The operational procedure and structure of the proposed control chart is given. The required measures to determine the average run length (ARL) for in-control and out-of-control processes are given. Tables of ARLs are reported for various control chart parameters. The proposed control chart is more sensitive in detecting a small shift in the process as compared to the existing attribute control charts. The simulation study shows the efficiency of the proposed chart over the existing charts. An example is given for the illustration purpose.
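
    The ARL measure itself can be illustrated with a hedged sketch for a plain np attribute chart; the proposed multiple dependent state repetitive sampling chart is more involved and is not reproduced here. Subgroups are simulated until the first out-of-control signal and the run lengths are averaged; all chart parameters below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(9)

def average_run_length(p, n=50, p0=0.10, L=3.0, reps=1000, max_run=100_000):
    """Monte Carlo ARL of a classical np chart with L-sigma limits set for in-control level p0."""
    center = n * p0
    sigma = np.sqrt(n * p0 * (1.0 - p0))
    lcl, ucl = max(0.0, center - L * sigma), center + L * sigma
    runs = []
    for _ in range(reps):
        for t in range(1, max_run + 1):
            d = rng.binomial(n, p)           # nonconforming count in the subgroup
            if d > ucl or d < lcl:
                runs.append(t)
                break
        else:
            runs.append(max_run)
    return np.mean(runs)

print(f"in-control ARL (p = 0.10): {average_run_length(0.10):.0f}")
print(f"small shift   (p = 0.13): {average_run_length(0.13):.0f}")
```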

  12. A New Attribute Control Chart using Multiple Dependent State Repetitive Sampling

    KAUST Repository

    Aldosari, Mansour Sattam

    2017-03-25

    In this manuscript, a new attribute control chart using multiple dependent state repetitive sampling is designed. The operational procedure and structure of the proposed control chart is given. The required measures to determine the average run length (ARL) for in-control and out-of-control processes are given. Tables of ARLs are reported for various control chart parameters. The proposed control chart is more sensitive in detecting a small shift in the process as compared to the existing attribute control charts. The simulation study shows the efficiency of the proposed chart over the existing charts. An example is given for the illustration purpose.

  13. A preliminary study of breast cancer diagnosis using laboratory based small angle x-ray scattering

    Science.gov (United States)

    Round, A. R.; Wilkinson, S. J.; Hall, C. J.; Rogers, K. D.; Glatter, O.; Wess, T.; Ellis, I. O.

    2005-09-01

    Breast tissue collected from tumour samples and normal tissue from bi-lateral mastectomy procedures were examined using small angle x-ray scattering. Previous work has indicated that breast tissue disease diagnosis could be performed using small angle x-ray scattering (SAXS) from a synchrotron radiation source. The technique would be more useful to health services if it could be made to work using a conventional x-ray source. Consistent and reliable differences in x-ray scatter distributions were observed between samples from normal and tumour tissue samples using the laboratory based 'SAXSess' system. Albeit from a small number of samples, a sensitivity of 100% was obtained. This result encourages us to pursue the implementation of SAXS as a laboratory based diagnosis technique.

  14. A preliminary study of breast cancer diagnosis using laboratory based small angle x-ray scattering

    Energy Technology Data Exchange (ETDEWEB)

    Round, A R [Daresbury Laboratories, Warrington, WA4 4AD (United Kingdom); Wilkinson, S J [Daresbury Laboratories, Warrington, WA4 4AD (United Kingdom); Hall, C J [Daresbury Laboratories, Warrington, WA4 4AD (United Kingdom); Rogers, K D [Department of Materials and Medical Sciences, Cranfield University, Swindon, SN6 8LA (United Kingdom); Glatter, O [Department of Chemistry, University of Graz (Austria); Wess, T [School of Optometry and Vision Sciences, Cardiff University, Cardiff CF10 3NB, Wales (United Kingdom); Ellis, I O [Nottingham City Hospital, Nottingham (United Kingdom)

    2005-09-07

    Breast tissue collected from tumour samples and normal tissue from bi-lateral mastectomy procedures were examined using small angle x-ray scattering. Previous work has indicated that breast tissue disease diagnosis could be performed using small angle x-ray scattering (SAXS) from a synchrotron radiation source. The technique would be more useful to health services if it could be made to work using a conventional x-ray source. Consistent and reliable differences in x-ray scatter distributions were observed between samples from normal and tumour tissue samples using the laboratory based 'SAXSess' system. Albeit from a small number of samples, a sensitivity of 100% was obtained. This result encourages us to pursue the implementation of SAXS as a laboratory based diagnosis technique.

  15. A preliminary study of breast cancer diagnosis using laboratory based small angle x-ray scattering

    International Nuclear Information System (INIS)

    Round, A R; Wilkinson, S J; Hall, C J; Rogers, K D; Glatter, O; Wess, T; Ellis, I O

    2005-01-01

    Breast tissue collected from tumour samples and normal tissue from bi-lateral mastectomy procedures were examined using small angle x-ray scattering. Previous work has indicated that breast tissue disease diagnosis could be performed using small angle x-ray scattering (SAXS) from a synchrotron radiation source. The technique would be more useful to health services if it could be made to work using a conventional x-ray source. Consistent and reliable differences in x-ray scatter distributions were observed between samples from normal and tumour tissue samples using the laboratory based 'SAXSess' system. Albeit from a small number of samples, a sensitivity of 100% was obtained. This result encourages us to pursue the implementation of SAXS as a laboratory based diagnosis technique

  16. Small data in the era of big data

    OpenAIRE

    Kitchin, Rob; Lauriault, Tracey P.

    2015-01-01

    Academic knowledge building has progressed for the past few centuries using small data studies characterized by sampled data generated to answer specific questions. It is a strategy that has been remarkably successful, enabling the sciences, social sciences and humanities to advance in leaps and bounds. This approach is presently being challenged by the development of big data. Small data studies will however, we argue, continue to be popular and valuable in the fut...

  17. Analysis of the Touch-And-Go Surface Sampling Concept for Comet Sample Return Missions

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Bayard, David S.; Blackmore, Lars

    2012-01-01

    This paper studies the Touch-and-Go (TAG) concept for enabling a spacecraft to take a sample from the surface of a small primitive body, such as an asteroid or comet. The idea behind the TAG concept is to let the spacecraft descend to the surface, make contact with the surface for several seconds, and then ascend to a safe location. Sampling would be accomplished by an end-effector that is active during the few seconds of surface contact. The TAG event is one of the most critical events in a primitive body sample-return mission. The purpose of this study is to evaluate the dynamic behavior of a representative spacecraft during the TAG event, i.e., immediately prior, during, and after surface contact of the sampler. The study evaluates the sample-collection performance of the proposed sampling end-effector, in this case a brushwheel sampler, while acquiring material from the surface during the contact. A main result of the study is a guidance and control (G&C) validation of the overall TAG concept, in addition to specific contributions to demonstrating the effectiveness of using nonlinear clutch mechanisms in the sampling arm joints, and increasing the length of the sampling arms to improve robustness.

  18. Performance of next-generation sequencing on small tumor specimens and/or low tumor content samples using a commercially available platform.

    Directory of Open Access Journals (Sweden)

    Scott Morris

    Full Text Available Next generation sequencing (NGS) tests are usually performed on relatively small core biopsy or fine needle aspiration (FNA) samples. Data are limited on what amount of tumor by volume or minimum number of FNA passes is needed to yield sufficient material for running NGS. We sought to identify the amount of tumor required for running the PCDx NGS platform. 2,723 consecutive tumor tissues of all cancer types were queried and reviewed for inclusion. Information on tumor volume, success of performing NGS, and results of NGS were compiled. Assessment of sequence analysis, mutation calling and sensitivity, quality control, drug associations, and data aggregation and analysis was performed. 6.4% of samples were rejected from all testing due to insufficient tumor quantity. The number of genes with insufficient sensitivity to make definitive mutation calls increased as the percentage of tumor decreased, reaching statistical significance below 5% tumor content. The number of drug associations also decreased with a lower percentage of tumor, but this difference only became significant between 1-3%. The number of drug associations did decrease with smaller tissue size, as expected. Neither specimen size nor percentage of tumor affected the ability to pass mRNA quality control. A tumor area of 10 mm² provides a good margin of error for specimens to yield adequate drug association results. Specimen suitability remains a major obstacle to clinical NGS testing. We determined that PCR-based library creation methods allow the use of smaller specimens, and those with a lower percentage of tumor cells, to be run on the PCDx NGS platform.

  19. Bartonella Prevalence and Genetic Diversity in Small Mammals from Ethiopia

    DEFF Research Database (Denmark)

    Meheretu, Yonas; Leirs, Herwig E.l.; Welegerima, Kiros

    2013-01-01

    More than 500 small mammals were trapped at 3 localities in northern Ethiopia to investigate Bartonella infection prevalence and the genetic diversity of the Bartonella spp. We extracted total DNA from liver samples and performed PCR using the primers 1400F and 2300R targeting 852 bp of the Barto...

  20. Adenocarcinoma of the small bowel

    International Nuclear Information System (INIS)

    Savli, M.; Jamar, B.

    2007-01-01

    Adenocarcinoma of the small bowel is generally a rather rare primary tumour, with a prevalence of 0.5-3.0 per 100,000 population, yet it is the most frequent tumour of the small intestine. It more often involves the duodenum and jejunum than the ileum. The aim of this paper is also to point out the value of small bowel follow-through (SBFT) in the diagnosis of stenosing lesions. An 83-year-old male patient suffered from abdominal pain, malaise, vomiting, cachexia and diarrhoea for 3 months. The result of occult blood testing was negative. The haemoglobin level was normal. Proctoscopy, colonoscopy, upper gastrointestinal (GI) endoscopy, and ultrasonography (US) did not explain the patient's problems. Ileus of the small bowel was established with an abdominal plain film. Small bowel follow-through (SBFT) and computed tomography (CT) showed a stenosing tumour in the jejunum. Adenocarcinoma of the small bowel was established by histological examination after resection of the tumour. SBFT, with manual compression of all segments of the small bowel, can be a very accurate diagnostic investigation for the evaluation of stenosing lesions in this part of the intestine. (author)