WorldWideScience

Sample records for space sampling improves

  1. Development of improved space sampling strategies for ocean chemical properties: Total carbon dioxide and dissolved nitrate

    Science.gov (United States)

    Goyet, Catherine; Davis, Daniel; Peltzer, Edward T.; Brewer, Peter G.

    1995-01-01

    Large-scale ocean observing programs today, such as the Joint Global Ocean Flux Study (JGOFS) and the World Ocean Circulation Experiment (WOCE), must face the problem of designing an adequate sampling strategy. For ocean chemical variables, the goals and observing technologies are quite different from those for ocean physical variables (temperature, salinity, pressure). We have recently acquired data on ocean CO2 properties on WOCE cruises P16c and P17c that are sufficiently dense to test for sampling redundancy. We use linear and quadratic interpolation methods on the sampled field to investigate the minimum number of samples required to define the deep-ocean total inorganic carbon (TCO2) field within the limits of experimental accuracy (+/- 4 micromol/kg). Within the limits of current measurements, these lines were oversampled in the deep ocean. Should the precision of the measurement be improved, then a denser sampling pattern may be desirable in the future. This approach rationalizes the efficient use of resources for field work and for estimating the gridded TCO2 fields needed to constrain geochemical models.
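
    The redundancy test described above can be sketched in a few lines: drop samples from a dense profile, interpolate back, and check whether the reconstruction stays within the ±4 micromol/kg accuracy. The snippet below is a hedged illustration with a synthetic profile, not the authors' code or data.

```python
# Hedged sketch (not the authors' code): subsample a dense TCO2 depth
# profile, reconstruct it by linear interpolation, and check whether the
# worst-case error stays within the stated accuracy of +/- 4 micromol/kg.
import numpy as np

def max_interp_error(depth, tco2, step):
    """Keep every `step`-th sample, interpolate back onto the full grid,
    and return the largest reconstruction error."""
    idx = np.arange(0, len(depth), step)
    if idx[-1] != len(depth) - 1:            # always keep the deepest point
        idx = np.append(idx, len(depth) - 1)
    recon = np.interp(depth, depth[idx], tco2[idx])
    return np.max(np.abs(recon - tco2))

# Synthetic profile standing in for a WOCE station (illustrative only).
depth = np.linspace(200.0, 4000.0, 200)                  # metres
tco2 = 2300.0 - 150.0 * np.exp(-depth / 800.0)           # micromol/kg
for step in (2, 4, 8, 16):
    err = max_interp_error(depth, tco2, step)
    verdict = "OK" if err <= 4.0 else "too sparse"
    print(f"keep 1 in {step:2d}: max error {err:5.2f} micromol/kg ({verdict})")
```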

  2. Improved abdominal MRI in non-breath-holding children using a radial k-space sampling technique

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Hyuk; Choi, Young Hun; Cheon, Jung Eun; Lee, So Mi; Cho, Hyun Hae; Kim, Woo Sun; Kim, In One [Seoul National University Children's Hospital, Department of Radiology, Seoul (Korea, Republic of); Shin, Su Mi [SMG-SNU Boramae Medical Center, Department of Radiology, Seoul (Korea, Republic of)]

    2015-06-15

    Radial k-space sampling techniques have been shown to reduce motion artifacts in adult abdominal MRI. Our objective was to compare a T2-weighted radial k-space sampling MRI pulse sequence (BLADE) with standard respiratory-triggered T2-weighted turbo spin echo (TSE) in pediatric abdominal imaging. Axial BLADE and respiratory-triggered TSE sequences were performed without fat suppression in 32 abdominal MR examinations in children. We retrospectively assessed overall image quality, the presence of respiratory, peristaltic and radial artifact, and lesion conspicuity. We evaluated signal uniformity of each sequence. BLADE showed improved overall image quality (3.35 ± 0.85 vs. 2.59 ± 0.59, P < 0.001), reduced respiratory motion artifact (0.51 ± 0.56 vs. 1.89 ± 0.68, P < 0.001), and improved lesion conspicuity (3.54 ± 0.88 vs. 2.92 ± 0.77, P = 0.006) compared to respiratory-triggered TSE sequences. The bowel motion artifact scores were similar for both sequences (1.65 ± 0.77 vs. 1.79 ± 0.74, P = 0.691). BLADE introduced a radial artifact that was not observed on the respiratory-triggered TSE images (1.10 ± 0.85 vs. 0, P < 0.001). BLADE was associated with diminished signal variation compared with respiratory-triggered TSE in the liver, spleen and air (P < 0.001). The radial k-space sampling technique improved image quality and reduced respiratory motion artifacts in young children compared with conventional respiratory-triggered turbo spin-echo sequences. (orig.)

  3. Particle System Based Adaptive Sampling on Spherical Parameter Space to Improve the MDL Method for Construction of Statistical Shape Models

    Directory of Open Access Journals (Sweden)

    Rui Xu

    2013-01-01

    Minimum description length (MDL) based group-wise registration was a state-of-the-art method to determine the corresponding points of 3D shapes for the construction of statistical shape models (SSMs). However, it suffered from the problem that the determined corresponding points did not spread uniformly on the original shapes, since corresponding points were obtained by uniformly sampling the aligned shape on the parameterized space of the unit sphere. We proposed a particle-system based method to obtain adaptive sampling positions on the unit sphere to resolve this problem. Here, a set of particles was placed on the unit sphere to construct a particle system whose energy was related to the distortions of the parameterized meshes. By minimizing this energy, each particle was moved on the unit sphere. When the system became steady, the particles were treated as vertices to build a spherical mesh, which was then relaxed to slightly adjust the vertices to obtain optimal sampling positions. We used 47 cases of (left and right) lungs and 50 cases of livers, (left and right) kidneys, and spleens for evaluation. Experiments showed that the proposed method was able to resolve the problem of the original MDL method, and the proposed method performed better in the generalization and specificity tests.
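
    The core of the particle-system idea can be illustrated with a plain repulsion model: particles on the unit sphere push each other apart and are re-projected after each step. The sketch below assumes a uniform repulsive energy; the paper's energy is additionally weighted by the distortion of the parameterized meshes, which is not reproduced here.

```python
# Minimal sketch of a particle system on the unit sphere: particles repel
# each other (1/r^2 force) and are re-projected onto the sphere after each
# step. The paper's energy additionally accounts for parameterization
# distortion; this uniform-repulsion version is for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 100
p = rng.normal(size=(n, 3))
p /= np.linalg.norm(p, axis=1, keepdims=True)          # random points on the sphere

for _ in range(200):
    diff = p[:, None, :] - p[None, :, :]                # pairwise differences
    dist = np.linalg.norm(diff, axis=-1) + np.eye(n)    # avoid division by zero on diagonal
    force = (diff / dist[..., None] ** 3).sum(axis=1)   # repulsion ~ 1/r^2
    p += 0.005 * force                                  # small relaxation step
    p /= np.linalg.norm(p, axis=1, keepdims=True)       # project back onto the sphere

# p now holds roughly uniform adaptive sampling positions on the unit sphere
print(p.shape)
```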

  4. Improving sample recovery

    International Nuclear Information System (INIS)

    Blanchard, R.J.

    1995-09-01

    This Engineering Task Plan (ETP) describes the tasks, i.e., tests, studies, external support and modifications, planned to increase the recovery of waste tank contents using combinations of improved techniques, equipment, knowledge, experience and testing to better the recovery rates presently being experienced.

  5. Stochastic sampling of the RNA structural alignment space.

    Science.gov (United States)

    Harmanci, Arif Ozgun; Sharma, Gaurav; Mathews, David H

    2009-07-01

    A novel method is presented for predicting the common secondary structures and alignment of two homologous RNA sequences by sampling the 'structural alignment' space, i.e. the joint space of their alignments and common secondary structures. The structural alignment space is sampled according to a pseudo-Boltzmann distribution based on a pseudo-free energy change that combines base pairing probabilities from a thermodynamic model and alignment probabilities from a hidden Markov model. By virtue of the implicit comparative analysis between the two sequences, the method offers an improvement over single sequence sampling of the Boltzmann ensemble. A cluster analysis shows that the samples obtained from joint sampling of the structural alignment space cluster more closely than samples generated by the single sequence method. On average, the representative (centroid) structure and alignment of the most populated cluster in the sample of structures and alignments generated by joint sampling are more accurate than single sequence sampling and alignment based on sequence alone, respectively. The 'best' centroid structure that is closest to the known structure among all the centroids is, on average, more accurate than structure predictions of other methods. Additionally, cluster analysis identifies, on average, a few clusters, whose centroids can be presented as alternative candidates. The source code for the proposed method can be downloaded at http://rna.urmc.rochester.edu.

  6. An improved selective sampling method

    International Nuclear Information System (INIS)

    Miyahara, Hiroshi; Iida, Nobuyuki; Watanabe, Tamaki

    1986-01-01

    The coincidence methods which are currently used for the accurate activity standardisation of radionuclides require dead time and resolving time corrections which tend to become increasingly uncertain as count rates exceed about 10 K. To reduce the dependence on such corrections, Muller, in 1981, proposed the selective sampling method using a fast multichannel analyser (50 ns ch⁻¹) for measuring the count rates. It is, in many ways, more convenient and possibly potentially more reliable to replace the MCA with scalers, and a circuit is described employing five scalers, two of them serving to measure the background correction. Results of comparisons using our new method and the coincidence method for measuring the activity of ⁶⁰Co sources yielded agreement within statistical uncertainties. (author)

  7. Studying Space: Improving Space Planning with User Studies

    Science.gov (United States)

    Pierard, Cindy; Lee, Norice

    2011-01-01

    How can libraries best assess and improve user space, even if they are not in a position to undertake new construction or a major renovation? Staff at New Mexico State University used a variety of ethnographic methods to learn how our spaces were being used as well as what our users considered to be ideal library space. Our findings helped us make…

  8. Improvement in Space Food Packaging Methods

    Data.gov (United States)

    National Aeronautics and Space Administration — The Space Food Systems Laboratory's (SFSL) current Bulk Overwrap Bag (BOB) package, while simple and effective, leaves room for improvement. Currently, BOBs are...

  9. National Space Agencies vs. Commercial Space: Towards Improved Space Safety

    Science.gov (United States)

    Pelton, J.

    2013-09-01

    Traditional space policies as developed at the national level include many elements, but they are most typically driven by economic and political objectives. Legislatively administered programs apportion limited public funds to achieve "gains" that can involve employment, stimulus to the economy, national defense or other advancements. Yet political advantage is seldom far from the picture. Within the context of traditional space policies, safety issues cannot truly be described as "afterthoughts", but they are usually, at best, a secondary or even tertiary consideration. "Space safety" is often simply assumed to be "in there" somewhere. The current key question is whether "safety and risk minimization" within new commercial space programs can actually be elevated in importance and effectively be "designed in" at the outset. This has long been the case with commercial aviation and there is at least reasonable hope that this could also be the case for the commercial space industry in coming years. The cooperative role that the insurance industry has played for centuries in the shipping industry and for decades in aviation can perhaps now also be constructive for risk minimization in the commercial space domain. This paper begins by examining two historical case studies in the context of traditional national space policy development to see how major space policy decisions involving "manned space programs" have given undue primacy to "political considerations" over "safety" and other factors. The specific case histories examined here are, first, the decision to undertake the Space Shuttle Program (i.e. 1970-1972) and, second, the International Space Station. In both cases the key and overarching decisions were driven by political, schedule and cost considerations, and safety seems absent as a prime consideration. In publicly funded space programs—whether in the United States, Europe, Russia, Japan, China, India or elsewhere—it seems realistic to

  10. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and ¹⁵N-edited NOESY-HSQC spectra of a ¹³C,¹⁵N-labeled ubiquitin sample. The results revealed the general applicability of the proposed method and a significant improvement in resolution in comparison with conventional spectra recorded in the same time
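
    The key computational idea, evaluating the 2D Fourier transform directly at pairs of frequencies from randomly placed evolution times, can be sketched as an explicit non-uniform Fourier sum. The signal model, time ranges and frequencies below are illustrative only, not the NMR data of the paper.

```python
# Hedged sketch: with randomly sampled evolution times (t1, t2), the 2D
# spectrum can be evaluated directly at any frequency pair (f1, f2) as an
# explicit Fourier sum, instead of two nested FFTs on a regular grid.
import numpy as np

rng = np.random.default_rng(1)
n_samples = 400
t1 = rng.uniform(0.0, 0.05, n_samples)        # s, random evolution times
t2 = rng.uniform(0.0, 0.05, n_samples)
f1_true, f2_true = 120.0, -80.0               # Hz, one synthetic cross peak
signal = np.exp(2j * np.pi * (f1_true * t1 + f2_true * t2))

def spectrum_2d(f1_axis, f2_axis, t1, t2, signal):
    """Direct 2D Fourier transform evaluated at pairs of frequencies."""
    e1 = np.exp(-2j * np.pi * np.outer(f1_axis, t1))   # (F1, N)
    e2 = np.exp(-2j * np.pi * np.outer(f2_axis, t2))   # (F2, N)
    return np.einsum("an,bn,n->ab", e1, e2, signal) / len(signal)

f_axis = np.linspace(-200.0, 200.0, 81)
spec = np.abs(spectrum_2d(f_axis, f_axis, t1, t2, signal))
i, j = np.unravel_index(np.argmax(spec), spec.shape)
print("peak at", f_axis[i], "Hz,", f_axis[j], "Hz")    # recovered near 120, -80
```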

  11. Judgment sampling: a health care improvement perspective.

    Science.gov (United States)

    Perla, Rocco J; Provost, Lloyd P

    2012-01-01

    Sampling plays a major role in quality improvement work. Random sampling (assumed by most traditional statistical methods) is the exception in improvement situations. In most cases, some type of "judgment sample" is used to collect data from a system. Unfortunately, judgment sampling is not well understood. Judgment sampling relies upon those with process and subject matter knowledge to select useful samples for learning about process performance and the impact of changes over time. In many cases, where the goal is to learn about or improve a specific process or system, judgment samples are not merely the most convenient and economical approach, they are technically and conceptually the most appropriate approach. This is because improvement work is done in the real world in complex situations involving specific areas of concern and focus; in these situations, the assumptions of classical measurement theory neither can be met nor should an attempt be made to meet them. The purpose of this article is to describe judgment sampling and its importance in quality improvement work and studies with a focus on health care settings.

  12. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    Science.gov (United States)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
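
    A one-dimensional toy version of the idea is sketched below: new samples are repeatedly placed where a local curvature estimate is largest. The real method also folds in statistical uncertainty and sample space-filling and works on 2D LDA data; this sketch keeps only the curvature term and is not the authors' implementation.

```python
# Illustrative sketch of curvature-driven adaptive sampling in 1D.
import numpy as np

def refine(x, y, f, n_new):
    """Add n_new samples where the local second difference is largest."""
    for _ in range(n_new):
        curv = np.abs(np.diff(y, 2))              # curvature proxy per interior point
        k = np.argmax(curv) + 1                   # interior index with the sharpest bend
        # bisect the wider of the two neighbouring intervals
        if x[k + 1] - x[k] > x[k] - x[k - 1]:
            x_new = 0.5 * (x[k] + x[k + 1])
        else:
            x_new = 0.5 * (x[k - 1] + x[k])
        x = np.sort(np.append(x, x_new))
        y = f(x)                                  # re-evaluate ("take new measurements")
    return x, y

f = lambda x: np.tanh(20.0 * (x - 0.5))           # sharp feature near x = 0.5
x0 = np.linspace(0.0, 1.0, 8)
x, y = refine(x0, f(x0), f, 20)
print(np.round(x, 3))                             # samples cluster around the feature
```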

  13. MO-FG-CAMPUS-JeP2-01: 4D-MRI with 3D Radial Sampling and Self-Gating-Based K-Space Sorting: Image Quality Improvement by Slab-Selective Excitation

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Z; Pang, J; Tuli, R; Fraass, B; Fan, Z [Cedars Sinai Medical Center, Los Angeles, CA (United States); Yang, W [Cedars-Sinai Medical Center, Los Angeles, CA (United States); Bi, X [Siemens Healthcare, Los Angeles, CA (United States); Hakimian, B [Cedars Sinai Medical Center, Los Angeles CA (United States); Li, D [Cedars Sinai Medical Center, Los Angeles, California (United States)

    2016-06-15

    Purpose: A recent 4D MRI technique based on 3D radial sampling and self-gating-based k-space sorting has shown promising results in characterizing respiratory motion. However, due to continuous acquisition and potentially drastic k-space undersampling, the resultant images could suffer from low blood-to-tissue contrast and streaking artifacts. In this study, 3D radial sampling with slab-selective excitation (SS) was proposed in an attempt to enhance blood-to-tissue contrast by exploiting the in-flow effect and to suppress the excess signal from peripheral structures, particularly in the superior-inferior direction. The feasibility of improving image quality with this approach was investigated through a comparison with the previously developed non-selective excitation (NS) approach. Methods: The two excitation approaches, SS and NS, were compared in 5 cancer patients (1 lung, 1 liver, 2 pancreas and 1 esophagus) at 3 Tesla. Image artifact was assessed in all patients on a 4-point scale (0: poor; 3: excellent). Signal-to-noise ratio (SNR) of the blood vessel (aorta) at the center of the field of view and its nearby tissue was measured in 3 of the 5 patients (1 liver, 2 pancreas), and the blood-to-tissue contrast-to-noise ratio (CNR) was then determined. Results: Compared with NS, the image quality of SS was visually improved, with overall higher signal in all patients (2.6±0.55 vs. 3.4±0.55). SS showed an approximately 2-fold increase of SNR in the blood (aorta: 16.39±1.95 vs. 32.19±7.93) and a slight increase in the surrounding tissue (liver/pancreas: 16.91±1.82 vs. 22.31±3.03). As a result, the blood-to-tissue CNR was dramatically higher with the SS method (1.20±1.20 vs. 9.87±6.67). Conclusion: The proposed 3D radial sampling with slab-selective excitation allows for reduced image artifact and improved blood SNR and blood-to-tissue CNR. The success of this technique could potentially benefit patients with cancerous tumors that have invaded the surrounding blood vessels where radiation

  14. MO-FG-CAMPUS-JeP2-01: 4D-MRI with 3D Radial Sampling and Self-Gating-Based K-Space Sorting: Image Quality Improvement by Slab-Selective Excitation

    International Nuclear Information System (INIS)

    Deng, Z; Pang, J; Tuli, R; Fraass, B; Fan, Z; Yang, W; Bi, X; Hakimian, B; Li, D

    2016-01-01

    Purpose: A recent 4D MRI technique based on 3D radial sampling and self-gating-based k-space sorting has shown promising results in characterizing respiratory motion. However, due to continuous acquisition and potentially drastic k-space undersampling, the resultant images could suffer from low blood-to-tissue contrast and streaking artifacts. In this study, 3D radial sampling with slab-selective excitation (SS) was proposed in an attempt to enhance blood-to-tissue contrast by exploiting the in-flow effect and to suppress the excess signal from peripheral structures, particularly in the superior-inferior direction. The feasibility of improving image quality with this approach was investigated through a comparison with the previously developed non-selective excitation (NS) approach. Methods: The two excitation approaches, SS and NS, were compared in 5 cancer patients (1 lung, 1 liver, 2 pancreas and 1 esophagus) at 3 Tesla. Image artifact was assessed in all patients on a 4-point scale (0: poor; 3: excellent). Signal-to-noise ratio (SNR) of the blood vessel (aorta) at the center of the field of view and its nearby tissue was measured in 3 of the 5 patients (1 liver, 2 pancreas), and the blood-to-tissue contrast-to-noise ratio (CNR) was then determined. Results: Compared with NS, the image quality of SS was visually improved, with overall higher signal in all patients (2.6±0.55 vs. 3.4±0.55). SS showed an approximately 2-fold increase of SNR in the blood (aorta: 16.39±1.95 vs. 32.19±7.93) and a slight increase in the surrounding tissue (liver/pancreas: 16.91±1.82 vs. 22.31±3.03). As a result, the blood-to-tissue CNR was dramatically higher with the SS method (1.20±1.20 vs. 9.87±6.67). Conclusion: The proposed 3D radial sampling with slab-selective excitation allows for reduced image artifact and improved blood SNR and blood-to-tissue CNR. The success of this technique could potentially benefit patients with cancerous tumors that have invaded the surrounding blood vessels where radiation

  15. An Improved Seabed Surface Sand Sampling Device

    Science.gov (United States)

    Luo, X.

    2017-12-01

    In marine geology research it is necessary to obtain a sufficient quantity of seabed surface samples, while also ensuring that the samples are in their original state. Currently, there are a number of seabed surface sampling devices available, but we find it is very difficult to obtain sand samples using these devices, particularly when dealing with fine sand. Machine-controlled seabed surface sampling devices are also available, but generally unable to dive into deeper regions of water. To obtain larger quantities of seabed surface sand samples in their original states, many researchers have tried to improve upon sampling devices, but these efforts have generally produced ambiguous results, in our opinion. To resolve this issue, we have designed an improved and highly effective seabed surface sand sampling device that incorporates the strengths of a variety of sampling devices. It is capable of diving into deep water to obtain fine sand samples and is also suited for use in streams, rivers, lakes and seas with varying levels of depth (up to 100 m). This device can be used for geological mapping, underwater prospecting, geological engineering and ecological and environmental studies in both marine and terrestrial waters.

  16. Improving blood sample logistics using simulation

    DEFF Research Database (Denmark)

    Jørgensen, Pelle Morten Thomas; Jacobsen, Peter

    2012-01-01

    Using simulation as an approach to display and improve internal logistics and handling at hospitals has great potential. This research will show how a simulation model can be used to evaluate changes made to two different cases of transportation of blood samples at a hospital, by evaluating...

  17. Space Shuttle main engine product improvement

    Science.gov (United States)

    Lucci, A. D.; Klatt, F. P.

    1985-01-01

    The current design of the Space Shuttle Main Engine has passed 11 certification cycles, amassed approximately a quarter million seconds of engine test time in 1200 tests, and successfully launched the Space Shuttle 17 times (51 engine launches) through May 1985. Building on this extensive background, two development programs are underway at Rocketdyne to improve the flow of hot gas through the powerhead and to evaluate changes that increase the performance margins of the engine. These two programs, called Phase II+ and the Technology Test Bed Precursor program, are described. Phase II+ develops a two-tube hot-gas manifold that improves the component environment. The Precursor program will evaluate a larger-throat main combustion chamber, conduct combustion stability testing of a baffleless main injector, fabricate an experimental weld-free heat exchanger tube, fabricate and test a high-pressure oxidizer turbopump with an improved inlet, and develop and test methods for reducing temperature transients at start and shutdown.

  18. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material

  19. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
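
    The alias method mentioned above is a standard O(1) sampler for discrete distributions. The following is a generic Python illustration of table construction and sampling, not the MCNP subroutine described in the paper.

```python
# Generic sketch of Walker's alias method: build the alias tables in O(n),
# then draw each index (e.g. a voxel) in O(1).
import random

def build_alias(probs):
    n = len(probs)
    scaled = [p * n for p in probs]
    alias, cutoff = [0] * n, [0.0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        cutoff[s], alias[s] = scaled[s], l       # s keeps scaled[s], overflows to l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in large + small:                      # numerical leftovers
        cutoff[i] = 1.0
    return cutoff, alias

def draw(cutoff, alias):
    i = random.randrange(len(cutoff))            # pick a column uniformly
    return i if random.random() < cutoff[i] else alias[i]

cutoff, alias = build_alias([0.1, 0.2, 0.3, 0.4])
counts = [0, 0, 0, 0]
for _ in range(100000):
    counts[draw(cutoff, alias)] += 1
print([c / 100000 for c in counts])              # roughly [0.1, 0.2, 0.3, 0.4]
```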

  20. Less is more: Sampling chemical space with active learning

    Science.gov (United States)

    Smith, Justin S.; Nebgen, Ben; Lubbers, Nicholas; Isayev, Olexandr; Roitberg, Adrian E.

    2018-06-01

    The development of accurate and transferable machine learning (ML) potentials for predicting molecular energetics is a challenging task. The process of data generation to train such ML potentials is a task neither well understood nor researched in detail. In this work, we present a fully automated approach for the generation of datasets with the intent of training universal ML potentials. It is based on the concept of active learning (AL) via Query by Committee (QBC), which uses the disagreement between an ensemble of ML potentials to infer the reliability of the ensemble's prediction. QBC allows the presented AL algorithm to automatically sample regions of chemical space where the ML potential fails to accurately predict the potential energy. AL improves the overall fitness of ANAKIN-ME (ANI) deep learning potentials in rigorous test cases by mitigating human biases in deciding what new training data to use. AL also reduces the training set size to a fraction of the data required when using naive random sampling techniques. To provide validation of our AL approach, we develop the COmprehensive Machine-learning Potential (COMP6) benchmark (publicly available on GitHub) which contains a diverse set of organic molecules. Active learning-based ANI potentials outperform the original random sampled ANI-1 potential with only 10% of the data, while the final active learning-based model vastly outperforms ANI-1 on the COMP6 benchmark after training to only 25% of the data. Finally, we show that our proposed AL technique develops a universal ANI potential (ANI-1x) that provides accurate energy and force predictions on the entire COMP6 benchmark. This universal ML potential achieves a level of accuracy on par with the best ML potentials for single molecules or materials, while remaining applicable to the general class of organic molecules composed of the elements CHNO.
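
    The Query by Committee selection step can be illustrated with a toy regression problem: an ensemble is retrained on the current data, and the candidate points with the largest disagreement (spread of ensemble predictions) are labeled and added. The sketch below uses small scikit-learn regressors in place of ANI neural network potentials; the target function, sizes and batch counts are made up.

```python
# Toy Query-by-Committee active learning loop (illustrative only; not the
# ANI/COMP6 pipeline). Requires numpy and scikit-learn.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
f = lambda x: np.sin(3.0 * x) * x                     # stand-in "true" energy surface
X_train = rng.uniform(-3, 3, (20, 1))
y_train = f(X_train).ravel()
X_pool = rng.uniform(-3, 3, (500, 1))                 # unlabeled candidate inputs

for _ in range(5):                                    # five acquisition rounds
    committee = [
        MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                     random_state=seed).fit(X_train, y_train)
        for seed in range(4)
    ]
    preds = np.stack([m.predict(X_pool) for m in committee])
    disagreement = preds.std(axis=0)                  # committee spread per candidate
    pick = np.argsort(disagreement)[-10:]             # 10 most uncertain points
    X_train = np.vstack([X_train, X_pool[pick]])
    y_train = np.concatenate([y_train, f(X_pool[pick]).ravel()])
    X_pool = np.delete(X_pool, pick, axis=0)

print("final training set size:", len(X_train))       # 20 + 5 * 10 = 70
```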

  1. Improved backward ray tracing with stochastic sampling

    Science.gov (United States)

    Ryu, Seung Taek; Yoon, Kyung-Hyun

    1999-03-01

    This paper presents a new technique that enhances diffuse interreflection within the framework of backward ray tracing. In this research, we have modeled the diffuse rays under the following conditions. First, as reflection from diffuse surfaces occurs in all directions, it is impossible to trace all of the reflected rays. We confined the diffuse rays by sampling a spherical angle around the normal vector out of the reflected rays. Second, the distance traveled by reflected energy from a diffuse surface differs according to the object's properties and is comparatively short. Considering that rays created on diffuse surfaces affect a relatively small area, it is very inefficient to trace all of the sampled diffuse rays. Therefore, we set a fixed distance as the critical distance, and all rays beyond this distance are ignored. Because the improved backward ray tracing can model illumination effects such as color bleeding, it can replace the radiosity algorithm in limited environments.
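
    The two ingredients described above, restricting diffuse rays to a spherical angle around the normal and cutting them off at a fixed critical distance, can be sketched as follows. The basis construction and constants are illustrative and are not the authors' renderer.

```python
# Illustrative sketch: sample diffuse secondary rays only within a cone
# (spherical angle) around the surface normal; rays would then be ignored
# once they travel beyond a fixed critical distance. Not the authors' code.
import numpy as np

rng = np.random.default_rng(0)

def sample_cone(normal, half_angle, n):
    """Sample n unit directions uniformly (in solid angle) within
    `half_angle` radians of the unit vector `normal`."""
    cos_t = rng.uniform(np.cos(half_angle), 1.0, n)
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    sin_t = np.sqrt(1.0 - cos_t ** 2)
    local = np.stack([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t], axis=1)
    # orthonormal basis (t1, t2, normal), then rotate local directions into it
    helper = np.array([1.0, 0.0, 0.0]) if abs(normal[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t1 = np.cross(normal, helper)
    t1 /= np.linalg.norm(t1)
    t2 = np.cross(normal, t1)
    return local @ np.stack([t1, t2, normal])

CRITICAL_DISTANCE = 5.0    # scene units; contributions beyond this are dropped

dirs = sample_cone(np.array([0.0, 0.0, 1.0]), np.radians(60.0), 8)
print(np.round(dirs, 3))
# In the tracer, a diffuse ray whose travelled distance exceeds
# CRITICAL_DISTANCE would simply not be followed any further.
```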

  2. Improved sample size determination for attributes and variables sampling

    International Nuclear Information System (INIS)

    Stirpe, D.; Picard, R.R.

    1985-01-01

    Earlier INMM papers have addressed the attributes/variables problem and, under conservative/limiting approximations, have reported analytical solutions for the attributes and variables sample sizes. Through computer simulation of this problem, we have calculated attributes and variables sample sizes as a function of falsification, measurement uncertainties, and required detection probability without using approximations. Using realistic assumptions for uncertainty parameters of measurement, the simulation results support the conclusions: (1) previously used conservative approximations can be expensive because they lead to larger sample sizes than needed; and (2) the optimal verification strategy, as well as the falsification strategy, are highly dependent on the underlying uncertainty parameters of the measurement instruments. 1 ref., 3 figs

  3. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling subnets is an important topic of complex network research. Sampling methods influence the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method can keep the similarity between the sampled subnet and the original network in degree distribution, connectivity rate and average shortest path. This method is applicable to situations where prior knowledge about the degree distribution of the original network is not sufficient.
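
    As a rough illustration of combining random seeding with snowball expansion, the sketch below starts from several randomly chosen nodes and grows the subnet by repeatedly adding random neighbours. It uses networkx and a synthetic graph, and it does not reproduce the exact RMSC/Cohen process described in the paper.

```python
# Hedged sketch of random seeding + snowball expansion on a synthetic graph.
import random
import networkx as nx

def snowball_sample(G, n_seeds, rounds, seed=0):
    """Start from several random seed nodes, then repeatedly add one random
    unvisited neighbour of each frontier node (snowball expansion)."""
    rng = random.Random(seed)
    nodes = set(rng.sample(list(G.nodes), n_seeds))
    frontier = set(nodes)
    for _ in range(rounds):
        new = set()
        for u in frontier:
            nbrs = [v for v in G.neighbors(u) if v not in nodes]
            if nbrs:
                new.add(rng.choice(nbrs))
        nodes |= new
        frontier = new
    return G.subgraph(nodes).copy()

G = nx.barabasi_albert_graph(2000, 3, seed=1)
S = snowball_sample(G, n_seeds=20, rounds=5)
print(S.number_of_nodes(), "nodes,", S.number_of_edges(), "edges in the subnet")
```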

  4. An Improved Nested Sampling Algorithm for Model Selection and Assessment

    Science.gov (United States)

    Zeng, X.; Ye, M.; Wu, J.; WANG, D.

    2017-12-01

    A multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, and each alternative conceptual model is assigned a weight that represents its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. The implementation of NSE comprises searching the parameter space gradually from the low-likelihood area to the high-likelihood area, and this evolution is carried out iteratively via a local sampling procedure. Thus, the efficiency of NSE is dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it is feasible to integrate a more efficient and elaborate sampling algorithm, DREAMzs, into the local sampling. In addition, in order to overcome the computational burden of the large number of repeated model executions in marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
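
    A bare-bones nested sampling estimator for the marginal likelihood is sketched below on a one-dimensional toy problem. The NSE described above replaces the naive resampling step with DREAMzs and adds a sparse-grid surrogate for the groundwater model; none of that is reproduced here, and all numbers are illustrative.

```python
# Bare-bones nested sampling sketch for the marginal likelihood (evidence)
# of a 1D toy problem: uniform prior on [0, 1], Gaussian likelihood.
# The replacement step uses naive rejection sampling for clarity; a real
# implementation would use MCMC (e.g. Metropolis-Hastings or DREAMzs).
import numpy as np

rng = np.random.default_rng(0)
loglike = lambda x: -0.5 * ((x - 0.3) / 0.1) ** 2 - np.log(0.1 * np.sqrt(2.0 * np.pi))
prior_draw = lambda n: rng.uniform(0.0, 1.0, n)

n_live, n_iter = 200, 1200
live = prior_draw(n_live)
live_ll = loglike(live)
log_z = -np.inf
log_w = np.log(1.0 - np.exp(-1.0 / n_live))        # width of the first prior shell

for i in range(n_iter):
    worst = np.argmin(live_ll)
    log_z = np.logaddexp(log_z, log_w + live_ll[worst])   # accumulate evidence
    threshold = live_ll[worst]
    while True:                                    # replacement with likelihood > threshold
        x = prior_draw(1)[0]
        if loglike(x) > threshold:
            live[worst], live_ll[worst] = x, loglike(x)
            break
    log_w -= 1.0 / n_live                          # shrink the remaining prior volume

# add the contribution of the remaining live points
log_x_remaining = -n_iter / n_live
log_z = np.logaddexp(log_z, log_x_remaining - np.log(n_live) + np.logaddexp.reduce(live_ll))
print("estimated log evidence:", log_z)            # close to 0 for this toy problem
```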

  5. Interpolation and sampling in spaces of analytic functions

    CERN Document Server

    Seip, Kristian

    2004-01-01

    The book is about understanding the geometry of interpolating and sampling sequences in classical spaces of analytic functions. The subject can be viewed as arising from three classical topics: Nevanlinna-Pick interpolation, Carleson's interpolation theorem for H^\\infty, and the sampling theorem, also known as the Whittaker-Kotelnikov-Shannon theorem. The book aims at clarifying how certain basic properties of the space at hand are reflected in the geometry of interpolating and sampling sequences. Key words for the geometric descriptions are Carleson measures, Beurling densities, the Nyquist rate, and the Helson-Szegő condition. The book is based on six lectures given by the author at the University of Michigan. This is reflected in the exposition, which is a blend of informal explanations with technical details. The book is essentially self-contained. There is an underlying assumption that the reader has a basic knowledge of complex and functional analysis. Beyond that, the reader should have some familiari...

  6. Rotary Mode Core Sample System availability improvement

    International Nuclear Information System (INIS)

    Jenkins, W.W.; Bennett, K.L.; Potter, J.D.; Cross, B.T.; Burkes, J.M.; Rogers, A.C.

    1995-01-01

    The Rotary Mode Core Sample System (RMCSS) is used to obtain stratified samples of the waste deposits in single-shell and double-shell waste tanks at the Hanford Site. The samples are used to characterize the waste in support of ongoing and future waste remediation efforts. Four sampling trucks have been developed to obtain these samples. Truck 1 was the first in operation and is currently being used to obtain samples where the push mode is appropriate (i.e., no rotation of the drill). Truck 2 is similar to Truck 1, except for added safety features, and is in operation to obtain samples using either a push mode or rotary drill mode. Trucks 3 and 4 are now being fabricated to be essentially identical to Truck 2

  7. An improved ashing procedure for biologic sample

    Energy Technology Data Exchange (ETDEWEB)

    Zongmei, Wu [Zhejiang Province Environmental Radiation Monitoring Centre (China)]

    1992-07-01

    The classical ashing procedure in a muffle furnace was modified for biologic samples. In the modified procedure the door of the muffle furnace was kept open during the ashing process; the ashing was accelerated and the quality of the ashing product was comparable to that of the classical procedure. The modified procedure is suitable for ashing biologic samples in large batches.

  8. An improved ashing procedure for biologic sample

    International Nuclear Information System (INIS)

    Wu Zongmei

    1992-01-01

    The classical ashing procedure in a muffle furnace was modified for biologic samples. In the modified procedure the door of the muffle furnace was kept open during the ashing process; the ashing was accelerated and the quality of the ashing product was comparable to that of the classical procedure. The modified procedure is suitable for ashing biologic samples in large batches

  9. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    Science.gov (United States)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

    There is a need to autonomously acquire cryogenic hydrocarbon liquid samples from remote planetary locations, such as the lakes of Titan, for instruments such as mass spectrometers. Several problems had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft, such as not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal, mass, and power-limited spacecraft interfaces; and reducing risk. Prior art inlets for similar instruments in spaceflight were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collection of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and accommodate various liquid sample levels. It is capable of collecting repeated samples autonomously in the difficult low-temperature conditions often found in planetary missions. It is capable of collecting samples for use by instruments from difficult sample types such as cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates such as those found on Titan. The design, with a warm actuated valve, is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  10. Improved Rock Core Sample Break-off, Retention and Ejection System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort advances the design of an innovative core sampling and acquisition system with improved core break-off, retention and ejection features. The...

  11. Improved Rock Core Sample Break-off, Retention and Ejection System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort advances the design of an innovative core sampling and acquisition system with improved core break-off, retention and ejection features. Phase 1...

  12. An improved sampling system installed for reprocessing

    International Nuclear Information System (INIS)

    Finsterwalder, L.; Zeh, H.

    1979-03-01

    Sampling devices are needed for taking representative samples from individual process containers during the reprocessing of irradiated fuel. The aqueous process stream in a reprocessing plant frequently contains, in addition to the dissolved radioactive materials, small quantities of solid matter: a fraction of fuel material still remaining undissolved; insoluble fission, corrosion, or degradation products; and, in exceptional cases, ion exchange resin or silica gel. The solid matter is deposited partly on the upper surfaces of the sampling system, and the resulting radiation makes maintenance and repair of the sampler more difficult. The purpose of the development work was to reduce the chance of accident and the maintenance costs and to lower the radiation exposure of the personnel. A new sampling system was developed and is described. (author)

  13. Extra-large letter spacing improves reading in dyslexia

    Science.gov (United States)

    Zorzi, Marco; Barbiero, Chiara; Facoetti, Andrea; Lonciari, Isabella; Carrozzi, Marco; Montico, Marcella; Bravar, Laura; George, Florence; Pech-Georgel, Catherine; Ziegler, Johannes C.

    2012-01-01

    Although the causes of dyslexia are still debated, all researchers agree that the main challenge is to find ways that allow a child with dyslexia to read more words in less time, because reading more is undisputedly the most efficient intervention for dyslexia. Sophisticated training programs exist, but they typically target the component skills of reading, such as phonological awareness. After the component skills have improved, the main challenge remains (that is, reading deficits must be treated by reading more—a vicious circle for a dyslexic child). Here, we show that a simple manipulation of letter spacing substantially improved text reading performance on the fly (without any training) in a large, unselected sample of Italian and French dyslexic children. Extra-large letter spacing helps reading, because dyslexics are abnormally affected by crowding, a perceptual phenomenon with detrimental effects on letter recognition that is modulated by the spacing between letters. Extra-large letter spacing may help to break the vicious circle by rendering the reading material more easily accessible. PMID:22665803

  14. Sampling Indoor Aerosols on the International Space Station

    Science.gov (United States)

    Meyer, Marit E.

    2016-01-01

    In a spacecraft cabin environment, the size range of indoor aerosols is much larger and they persist longer than on Earth because they are not removed by gravitational settling. A previous aerosol experiment in 1991 documented that over 90% of the mass concentration of particles in the NASA Space Shuttle air was between 10 μm and 100 μm, based on measurements with a multi-stage virtual impactor and a nephelometer (Liu et al. 1991). While the now-retired Space Shuttle had short-duration missions (less than two weeks), the International Space Station (ISS) has been continually inhabited by astronauts for over a decade. High concentrations of inhalable particles on ISS are potentially responsible for crew complaints of respiratory and eye irritation and comments about 'dusty' air. Air filtration is the current control strategy for airborne particles on the ISS, and filtration modeling, performed for engineering and design validation of the air revitalization system in ISS, predicted that PM requirements would be met. However, aerosol monitoring has never been performed on the ISS to verify PM levels. A flight experiment is in preparation which will provide data on particulate matter in ISS ambient air. Particles will be collected with a thermophoretic sampler as well as with passive samplers which will extend the particle size range of sampling. Samples will be returned to Earth for chemical and microscopic analyses, providing the first aerosol data for ISS ambient air.

  15. Improved sample holders for the PMMA dosimeters

    International Nuclear Information System (INIS)

    Kobayashi, Toshikazu; Sone, Koji; Iso, Katsuaki

    1994-01-01

    PMMA dosimeters are widely used for high-dose dosimetry. Dose is determined by measuring the change in optical density of the irradiated PMMA dosimeter element. Measurement precision depends on the method of mounting a dosimeter element in the sample compartment of a spectrophotometer. We prepared three types of holders (holders A, B and C in Figs. 1-3), according to the shape of the PMMA dosimeter elements, and measured the optical density of the irradiated PMMA dosimeter elements using the three types of holders. It is revealed that the type A holder gives more precise results for the Red 4034 or Gammachrome YR dosimeter than the type B holder. Measurements with a spectrophotometer using the type C holder give better results for the Red acrylic dosimeter than measurements with the dedicated reader. (author)

  16. Improved optical ranging for space based gravitational wave detection

    International Nuclear Information System (INIS)

    Sutton, Andrew J; Shaddock, Daniel A; McKenzie, Kirk; Ware, Brent; De Vine, Glenn; Spero, Robert E; Klipstein, W

    2013-01-01

    The operation of 10⁶ km scale laser interferometers in space will permit the detection of gravitational waves in previously inaccessible frequency regions. Multi-spacecraft missions, such as the Laser Interferometer Space Antenna (LISA), will use time delay interferometry to suppress the otherwise dominant laser frequency noise from their measurements. This is accomplished by performing sub-sample interpolation of the optical phase measurements recorded at each spacecraft for synchronization and cancellation of the otherwise dominant laser frequency noise. These sub-sample interpolation time shifts depend upon the inter-spacecraft range and will be measured using a pseudo-random noise ranging modulation upon the science laser. One limit to the ranging performance is mutual interference between the outgoing and incoming ranging signals upon each spacecraft. This paper reports on the demonstration of a noise cancellation algorithm which is shown to provide a factor of ∼8 suppression of the mutual interference noise. Demonstration of the algorithm in an optical test bed showed an rms ranging error of 0.06 m, improved from 0.19 m in previous results, surpassing the 1 m RMS LISA specification and potentially improving the cancellation of laser frequency noise. (paper)

  17. Improved space bandwidth product in image upconversion

    DEFF Research Database (Denmark)

    Dam, Jeppe Seidelin; Pedersen, Christian; Tidemand-Lichtenberg, Peter

    2012-01-01

    We present a technique increasing the space bandwidth product of a nonlinear image upconversion process used for spectral imaging. The technique exploits the strong dependency of the phase-matching condition in sum frequency generation (SFG) on the angle of propagation of the interacting fields...

  18. International Space Station Future Correlation Analysis Improvements

    Science.gov (United States)

    Laible, Michael R.; Pinnamaneni, Murthy; Sugavanam, Sujatha; Grygier, Michael

    2018-01-01

    Ongoing modal analyses and model correlation are performed on different configurations of the International Space Station (ISS). These analyses utilize on-orbit dynamic measurements collected using four main ISS instrumentation systems: External Wireless Instrumentation System (EWIS), Internal Wireless Instrumentation System (IWIS), Space Acceleration Measurement System (SAMS), and Structural Dynamic Measurement System (SDMS). Remote Sensor Units (RSUs) are network relay stations that acquire flight data from sensors. Measured data is stored in the Remote Sensor Unit (RSU) until it receives a command to download data via RF to the Network Control Unit (NCU). Since each RSU has its own clock, it is necessary to synchronize measurements before analysis. Imprecise synchronization impacts analysis results. A study was performed to evaluate three different synchronization techniques: (i) measurements visually aligned to analytical time-response data using model comparison, (ii) Frequency Domain Decomposition (FDD), and (iii) lag from cross-correlation to align measurements. This paper presents the results of this study.
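
    Technique (iii), aligning two records by the lag of their cross-correlation peak, can be sketched as follows. The sample rate, signal and clock offset below are synthetic placeholders, not ISS flight data.

```python
# Hedged sketch: estimate the clock offset between two sensor records from
# the lag that maximizes their cross-correlation, then shift one record.
import numpy as np

fs = 100.0                                   # Hz, assumed common sample rate
t = np.arange(0, 20, 1 / fs)
truth = np.sin(2 * np.pi * 0.8 * t) * np.exp(-0.05 * t)
rng = np.random.default_rng(0)
a = truth + 0.05 * rng.standard_normal(t.size)
b = np.roll(truth, 37) + 0.05 * rng.standard_normal(t.size)   # 0.37 s clock skew

xcorr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
lag = np.argmax(xcorr) - (len(b) - 1)        # negative when b is delayed relative to a
print("estimated offset: %.2f s" % (-lag / fs))   # ~0.37 s
b_aligned = np.roll(b, lag)                  # shift b back into alignment with a
```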

  19. An alternative phase-space distribution to sample initial conditions for classical dynamics simulations

    International Nuclear Information System (INIS)

    Garcia-Vela, A.

    2002-01-01

    A new quantum-type phase-space distribution is proposed in order to sample initial conditions for classical trajectory simulations. The phase-space distribution is obtained as the modulus of a quantum phase-space state of the system, defined as the direct product of the coordinate and momentum representations of the quantum initial state. The distribution is tested by sampling initial conditions which reproduce the initial state of the Ar-HCl cluster prepared by ultraviolet excitation, and by simulating the photodissociation dynamics by classical trajectories. The results are compared with those of a wave packet calculation, and with a classical simulation using an initial phase-space distribution recently suggested. A better agreement is found between the classical and the quantum predictions with the present phase-space distribution, as compared with the previous one. This improvement is attributed to the fact that the phase-space distribution propagated classically in this work resembles more closely the shape of the wave packet propagated quantum mechanically

  20. Heroic Reliability Improvement in Manned Space Systems

    Science.gov (United States)

    Jones, Harry W.

    2017-01-01

    System reliability can be significantly improved by a strong continued effort to identify and remove all the causes of actual failures. Newly designed systems often have unexpectedly high failure rates, which can be reduced by successive design improvements until the final operational system has an acceptable failure rate. There are many causes of failures and many ways to remove them. New systems may have poor specifications, design errors, or mistaken operations concepts. Correcting unexpected problems as they occur can produce large early gains in reliability. Improved technology in materials, components, and design approaches can increase reliability. Reliability growth is achieved by repeatedly operating the system until it fails, identifying the failure cause, and fixing the problem. The failure rate reduction that can be obtained depends on the number and the failure rates of the correctable failures. Under the strong assumption that the failure causes can be removed, the decline in overall failure rate can be predicted. If a failure occurs at a rate of λ per unit time, the expected time before the failure occurs and can be corrected is 1/λ, the Mean Time Before Failure (MTBF). Finding and fixing a less frequent failure with a rate of λ/2 per unit time requires twice as long, a time of 1/(λ/2) = 2/λ. Cutting the failure rate in half requires doubling the test and redesign time needed to find and eliminate the failure causes. Reducing the failure rate significantly requires a heroic reliability improvement effort.
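
    The arithmetic in the last few sentences can be made concrete with a toy calculation: each correctable failure mode with rate λ is expected to first show up after about 1/λ time units of testing, so halving the residual failure rate roughly doubles the required test time. The rates below are made up for illustration.

```python
# Toy illustration of the MTBF arithmetic above. Each failure mode occurs
# independently at rate lam (failures per unit time); the expected test time
# needed to observe it once is 1/lam, after which its cause is assumed removed.
failure_modes = [0.1, 0.05, 0.02, 0.01, 0.005, 0.002]   # made-up rates

total_rate = sum(failure_modes)
print(f"initial failure rate: {total_rate:.3f} per unit time")
for lam in sorted(failure_modes, reverse=True):          # frequent modes appear first
    expected_first = 1.0 / lam       # MTBF of this mode: test time to see it once
    total_rate -= lam                # assume its cause is then found and removed
    print(f"mode with rate {lam:.3f}: ~{expected_first:6.0f} time units of testing; "
          f"residual rate {total_rate:.3f}")
```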

  1. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    Science.gov (United States)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.

  2. Improving the Acquisition and Management of Sample Curation Data

    Science.gov (United States)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.

  3. Improvement of fuel sampling device for STACY and TRACY

    International Nuclear Information System (INIS)

    Hirose, Hideyuki; Sakuraba, Koichi; Onodera, Seiji

    1998-05-01

    STACY and TRACY, the static and transient experiment facilities in NUCEF, use solution fuel. It is important to accurately analyze the fuel composition (uranium enrichment, uranium concentration, nitric acid molarity, amount of impurities, radioactivity of FP) for their safe operation and for improvement of experimental accuracy. Both STACY and TRACY have sampling devices to sample fuel solution for that purpose. The previous sampling devices of STACY and TRACY had been designed to dilute the fuel sample with nitric acid: their sampling mechanism poured the fuel sample into the sampling vessel by a piston drive of nitric acid in the burette. It was, however, sometimes found that the sampled fuel solution was diluted by mixing with nitric acid in the burette. Therefore, the sampling mechanism was changed to a fixed-quantity pump drive which does not use nitric acid. The authors confirmed that the performance of the new sampling device was improved by changing the sampling mechanism. It was confirmed through the function test that the uncertainty in uranium concentration measurement using the improved sampling device was 0.14%, less than the design value of 0.2% (coefficient of variation). (author)

  4. CM Process Improvement and the International Space Station Program (ISSP)

    Science.gov (United States)

    Stephenson, Ginny

    2007-01-01

    This viewgraph presentation reviews the Configuration Management (CM) process improvements planned and undertaken for the International Space Station Program (ISSP). It reviews the 2004 findings and recommendations and the progress towards their implementation.

  5. Simulating and assessing boson sampling experiments with phase-space representations

    Science.gov (United States)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.

    2018-04-01

    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  6. An Improvement to Interval Estimation for Small Samples

    Directory of Open Access Journals (Sweden)

    SUN Hui-Ling

    2017-02-01

    Because it is difficult and complex to determine the probability distribution of small samples, it is improper to use traditional probability theory for parameter estimation with small samples. The Bayes Bootstrap method is commonly used in practice, but it has its own limitations. In this article an improvement to the Bayes Bootstrap method is given: the method extends the number of samples by numerical simulation without changing the circumstances of the original small sample, and it can give accurate interval estimates for small samples. Finally, Monte Carlo simulation is used to model specific small-sample problems, and the effectiveness and practicability of the Improved-Bootstrap method are demonstrated.
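
    As background for the method being improved, the classical Bayesian bootstrap interval for a small sample can be sketched in a few lines: resample the observations with Dirichlet weights and read the interval off the distribution of resampled means. The article's specific Improved-Bootstrap extension is not reproduced, and the data below are made up.

```python
# Hedged sketch of a Bayesian bootstrap interval for the mean of a small sample.
import numpy as np

rng = np.random.default_rng(0)
sample = np.array([9.8, 10.4, 10.1, 9.6, 10.3, 9.9])        # a small sample, n = 6

n_rep = 20000
weights = rng.dirichlet(np.ones(sample.size), size=n_rep)    # Bayesian bootstrap weights
means = weights @ sample                                     # weighted mean per replicate

lo, hi = np.percentile(means, [2.5, 97.5])
print(f"point estimate {sample.mean():.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```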

  7. Sample selection via angular distance in the space of the arguments of an artificial neural network

    Science.gov (United States)

    Fernández Jaramillo, J. M.; Mayerle, R.

    2018-05-01

    In the construction of an artificial neural network (ANN), a proper data split of the available samples plays a major role in the training process. The selection of subsets for training, testing and validation affects the generalization ability of the neural network. The number of samples also has an impact on the time required for the design of the ANN and for training. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the time required to calculate the network coefficients, while keeping the diversity and avoiding overtraining of the ANN due to the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each one representing one input of the neural network. When the angle formed between samples is smaller than a defined threshold, only one input is accepted for training. The accepted inputs are scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the quality of the outputs is not accurate and depends on the selection of the first sample, but as the number of inputs increases the accuracy improves and the differences among scenarios with different starting samples are greatly reduced. A comparison with the K-means clustering algorithm shows that, for this application, the proposed method produces a more accurate network with a smaller number of samples.
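
    The selection rule itself is simple to sketch: a candidate input vector is accepted for training only if its angle to every already-accepted input exceeds a threshold. The sketch below uses random 3-dimensional inputs and an arbitrary threshold purely for illustration; the paper applies the rule to tidal-record inputs.

```python
# Greedy angular-distance selection: keep a sample only if it is more than
# `min_angle_deg` away from every sample kept so far. As noted in the
# abstract, the result depends on which sample is taken first.
import numpy as np

def select_by_angle(X, min_angle_deg):
    min_cos = np.cos(np.radians(min_angle_deg))
    unit = X / np.linalg.norm(X, axis=1, keepdims=True)
    kept = [0]                                    # start from the first sample
    for i in range(1, len(X)):
        cos_sim = unit[kept] @ unit[i]
        if np.all(cos_sim < min_cos):             # farther than the threshold from all kept
            kept.append(i)
    return np.array(kept)

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))                    # 3-D inputs purely for illustration
idx = select_by_angle(X, min_angle_deg=20.0)
print(f"kept {len(idx)} of {len(X)} samples for training")
```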

  8. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,; Thomas, S.; Coleman, P.; Amato, N. M.

    2010-01-01

    reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom

  9. Fabrication Techniques of Stretchable and Cloth Electroadhesion Samples for Implementation on Devices with Space Application

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this study is to determine materials and fabrication techniques for efficient space-rated electroadhesion (EA) samples. Liquid metals, including...

  10. Space discretization in SN methods: Features, improvements and convergence patterns

    International Nuclear Information System (INIS)

    Coppa, G.G.M.; Lapenta, G.; Ravetto, P.

    1990-01-01

    A comparative analysis of the space discretization schemes currently used in SN methods is performed and special attention is devoted to direct integration techniques. Some improvements are proposed in one- and two-dimensional applications, which are based on suitable choices for the spatial variation of the collision source. A study of the convergence pattern is carried out for eigenvalue calculations within the space asymptotic approximation by means of both analytical and numerical investigations. (orig.)

  11. Within-session spacing improves delayed recall in children.

    Science.gov (United States)

    Zigterman, Jessica R; Simone, Patricia M; Bell, Matthew C

    2015-01-01

    Multiple spaced retrievals of a memory improve long-term memory performance in infants, children, and younger and older adults; however, few studies have examined spacing effects in young school-age children. To expand the understanding of the spacing benefit in children, the current study presented weakly associated English word-pairs to children aged 7-11 and cued their recall twice: immediately (massed), after a delay of 5 or 10 items (spaced), or not at all (control). After this encoding session with or without two retrievals, participants were tested twice for memory of all word-pairs: immediately and 30 minutes after the encoding session. Multiple retrievals significantly improved memory on the tests. However, words repeated in a spaced design were remembered at higher rates than those that were massed, while gap size between repetitions (5 or 10) did not differentially impact performance. The data show that a within-session spacing strategy can benefit children's ability to remember word-pairs after 30 minutes. Thus, asking students to recall what they have learned within a lesson is a technique that can be used in a classroom to improve long-term recall.

  12. Improvements to robotics-inspired conformational sampling in rosetta.

    Directory of Open Access Journals (Sweden)

    Amelie Stein

    Full Text Available To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion and omega-angle sampling, and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new "next-generation KIC" method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions.

  13. Communications Relay and Human-Assisted Sample Return from the Deep Space Gateway

    Science.gov (United States)

    Cichan, T.; Hopkins, J. B.; Bierhaus, B.; Murrow, D. W.

    2018-02-01

    The Deep Space Gateway can enable or enhance exploration of the lunar surface through two capabilities: 1. communications relay, opening up access to the lunar farside, and 2. sample return, enhancing the ability to return large sample masses.

  14. Universal Sample Preparation Module for Molecular Analysis in Space, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Lynntech proposes to develop and demonstrate the ability of a compact, light-weight, and automated universal sample preparation module (USPM) to process samples from...

  15. Improved Space Surveillance Network (SSN) Scheduling using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, D.

    There are close to 20,000 cataloged manmade objects in space, the large majority of which are not active, functioning satellites. These are tracked by phased array and mechanical radars and ground and space-based optical telescopes, collectively known as the Space Surveillance Network (SSN). A better SSN schedule of observations could, using exactly the same legacy sensor resources, improve space catalog accuracy through more complementary tracking, provide better responsiveness to real-time changes, better track small debris in low earth orbit (LEO) through efficient use of applicable sensors, efficiently track deep space (DS) frequent revisit objects, handle increased numbers of objects and new types of sensors, and take advantage of future improved communication and control to globally optimize the SSN schedule. We have developed a scheduling algorithm that takes as input the space catalog and the associated covariance matrices and produces a globally optimized schedule for each sensor site as to what objects to observe and when. This algorithm is able to schedule more observations with the same sensor resources and have those observations be more complementary, in terms of the precision with which each orbit metric is known, to produce a satellite observation schedule that, when executed, minimizes the covariances across the entire space object catalog. If used operationally, the results would be significantly increased accuracy of the space catalog with fewer lost objects with the same set of sensor resources. This approach can also inherently trade off fewer high-priority tasks against more lower-priority tasks when there is benefit in doing so. To date, the project has completed a prototyping and feasibility study, using open source data on the SSN's sensors, that showed significant reduction in orbit metric covariances. The algorithm techniques and results will be discussed along with future directions for the research.

  16. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  17. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  18. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  19. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    Science.gov (United States)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
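
    The abstract does not give the estimator itself; the sketch below shows one common way such inexpensive sensitivity derivatives can be exploited, namely as a first-order Taylor control variate whose mean is known exactly, which reduces the variance of a Monte Carlo estimate. The stand-in functions, input distribution, and sample size are assumptions for illustration, not the paper's structural-analysis example.

        import numpy as np

        def f(x):                          # stand-in for the expensive analysis code
            return np.sin(x) + 0.1 * x**2

        def df(x):                         # inexpensive sensitivity derivative of f
            return np.cos(x) + 0.2 * x

        rng = np.random.default_rng(0)
        mu, sigma, n = 1.0, 0.3, 2000
        x = rng.normal(mu, sigma, n)       # samples of the uncertain input

        plain = f(x).mean()                             # plain Monte Carlo estimate of E[f(X)]
        surrogate = f(mu) + df(mu) * (x - mu)           # first-order Taylor control variate
        control = (f(x) - surrogate).mean() + f(mu)     # unbiased, since E[surrogate] = f(mu)

        print(f"plain MC: {plain:.5f}   with sensitivity control variate: {control:.5f}")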

  20. Dielectric sample with two-layer charge distribution for space charge calibration purposes

    DEFF Research Database (Denmark)

    Holbøll, Joachim; Henriksen, Mogens; Rasmussen, C.

    2002-01-01

    In the present paper, a dielectric test sample is described with two very narrow concentrations of bulk charge, achieved by two internal electrodes that do not affect the acoustical properties of the sample, a fact important for optimal application of most space charge measuring systems. Space charge

  1. Human-Robot Site Survey and Sampling for Space Exploration

    Science.gov (United States)

    Fong, Terrence; Bualat, Maria; Edwards, Laurence; Flueckiger, Lorenzo; Kunz, Clayton; Lee, Susan Y.; Park, Eric; To, Vinh; Utz, Hans; Ackner, Nir

    2006-01-01

    NASA is planning to send humans and robots back to the Moon before 2020. In order for extended missions to be productive, high quality maps of lunar terrain and resources are required. Although orbital images can provide much information, many features (local topography, resources, etc) will have to be characterized directly on the surface. To address this need, we are developing a system to perform site survey and sampling. The system includes multiple robots and humans operating in a variety of team configurations, coordinated via peer-to-peer human-robot interaction. In this paper, we present our system design and describe planned field tests.

  2. Improvements and experience in the analysis of reprocessing samples

    International Nuclear Information System (INIS)

    Koch, L.; Cricchio, A.; Meester, R. de; Romkowski, M.; Wilhelmi, M.; Arenz, H.J.; Stijl, E. van der; Baeckmann, A. von

    1976-01-01

    Improvements in the analysis of input samples for reprocessing were obtained. To cope with the decomposition of reprocessing input solutions owing to the high radioactivity, an aluminium capsule technique was developed. A known amount of the dissolver solution was weighed into an aluminium can, dried, and the capsule was sealed. In this form, the sample could be stored over a long period and could be redissolved later for the analysis. The isotope correlation technique offers an attractive alternative for measuring the plutonium isotopic content in the dissolver solution. Moreover, this technique allows for consistency checks of analytical results. For this purpose, a data bank of correlated isotopic data is in use. To improve the efficiency of analytical work, four automatic instruments have been developed. The conditioning of samples for the U-Pu isotopic measurement was achieved by an automatic ion exchanger. A mass spectrometer, to which a high vacuum lock is connected, allows the automatic measurement of U-Pu samples. A process-computer controls the heating, focusing and scanning processes during the measurement and evaluates the data. To ease the data handling, alpha-spectrometry as well as a balance have been automated. (author)

  3. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate the sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 replicates of n samples was adopted as the convergence criterion for the method. Estimation of a 75 pcm uncertainty on the reactor k_eff was accomplished by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found for the example under investigation that it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)
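
    A minimal, hedged sketch of the sampling-based propagation idea summarized above: an uncertain input is drawn repeatedly from its distribution, a model maps each draw to an output, and the spread of the outputs estimates the propagated uncertainty, which tightens as the sample size grows. The stand-in model, input distribution, and numbers are illustrative only, not the MCNPX benchmark calculation.

        import numpy as np

        def model(radius):
            # stand-in for the expensive transport calculation (returns a notional "k_eff")
            return 1.0 + 0.05 * np.log(radius)

        rng = np.random.default_rng(1)
        radius_mean, radius_sigma = 0.5, 0.01          # 1-sigma uncertainty of the input

        for n_samples in (20, 93, 500):
            radii = rng.normal(radius_mean, radius_sigma, n_samples)
            keff = np.array([model(r) for r in radii])
            print(f"n = {n_samples:4d}: propagated sigma(k_eff) ~ {keff.std(ddof=1):.5f}")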

  4. NASA IDEAS to Improve Instruction in Astronomy and Space Science

    Science.gov (United States)

    Malphrus, B.; Kidwell, K.

    1999-12-01

    The IDEAS to Improve Instructional Competencies in Astronomy and Space Science project is intended to develop and/or enhance the astronomy and space science competencies of teacher participants (Grades 5-12) in Kentucky. The project is being implemented through a two-week summer workshop, a series of five follow-up meetings, and an academic year research project. The resources of Kentucky's only radio astronomy observatory - the Morehead Radio Telescope (MRT) - the Goldstone Apple Valley Radio Telescope (GAVRT) (via remote observing using the Internet), and the Kentucky Department of Education regional service centers are combined to provide a unique educational experience. The project is designed to improve science teachers' instructional methodologies by providing pedagogical assistance, content training, involving the teachers and their students in research in radio astronomy, providing access to the facilities of the Morehead Astrophysical Observatory, and by working closely with a NASA-JOVE research astronomer. Participating teachers will ultimately produce curriculum units and research projects, the results of which will be published on the WWW. A major goal of this project is to share with teachers and ultimately students the excitement and importance of scientific research. The project represents a partnership of five agencies, each matching the commitment financially and/or with personnel. This project is funded by the NASA IDEAS initiative administered by the Space Telescope Science Institute and the National Aeronautics and Space Administration (NASA).

  5. Improved mixing and sampling systems for vitrification melter feeds

    International Nuclear Information System (INIS)

    Ebadian, M.A.

    1998-01-01

    This report summarizes the methods used and results obtained during the progress of the study of waste slurry mixing and sampling systems during fiscal year 1997 (FY97) at the Hemispheric Center for Environmental Technology (HCET) at Florida International University (FIU). The objective of this work is to determine optimal mixing configurations and operating conditions as well as improved sampling technology for defense waste processing facility (DWPF) waste melter feeds at US Department of Energy (DOE) sites. Most of the research on this project was performed experimentally by using a tank mixing configuration with different rotating impellers. The slurry simulants for the experiments were prepared in-house based on the properties of the DOE sites' typical waste slurries. A sampling system was designed to withdraw slurry from the mixing tank. To obtain insight into the waste mixing process, the slurry flow in the mixing tank was also simulated numerically by applying computational fluid dynamics (CFD) methods. The major parameters investigated in both the experimental and numerical studies included power consumption of the mixer, mixing time to reach slurry uniformity, slurry type, solids concentration, impeller type, impeller size, impeller rotating speed, sampling tube size, and sampling velocities. Application of the results to the DWPF melter feed preparation process will enhance and modify the technical base for designing slurry transportation equipment and pipeline systems. These results will also serve as an important reference for improving waste slurry mixing performance and melter operating conditions. These factors will contribute to an increase in the capability of the vitrification process and the quality of the waste glass.

  6. Combined Space and Water Heating: Next Steps to Improved Performance

    Energy Technology Data Exchange (ETDEWEB)

    Schoenbauer, B. [NorthernSTAR Building America Partnership, Minneapolis, MN (United States); Bohac, D. [NorthernSTAR Building America Partnership, Minneapolis, MN (United States); Huelman, P. [NorthernSTAR Building America Partnership, Minneapolis, MN (United States)

    2016-07-13

    A combined space- and water-heating (combi) system uses a high-efficiency direct-vent burner that eliminates safety issues associated with natural draft appliances. Past research with these systems shows that using condensing water heaters or boilers with hydronic air handling units can provide both space and water heating with efficiencies of 90% or higher. Improved controls have the potential to reduce complexity and improve upon the measured performance. This project demonstrates that controls can significantly benefit these first-generation systems. Laboratory tests and daily load/performance models showed that the set point temperature reset control produced a 2.1%-4.3% (20-40 therms/year) savings for storage and hybrid water heater combi systems operated in moderate-load homes.

  7. Increasing fMRI sampling rate improves Granger causality estimates.

    Directory of Open Access Journals (Sweden)

    Fa-Hsuan Lin

    Full Text Available Estimation of causal interactions between brain areas is necessary for elucidating large-scale functional brain networks underlying behavior and cognition. Granger causality analysis of time series data can quantitatively estimate directional information flow between brain regions. Here, we show that such estimates are significantly improved when the temporal sampling rate of functional magnetic resonance imaging (fMRI) is increased 20-fold. Specifically, healthy volunteers performed a simple visuomotor task during blood oxygenation level dependent (BOLD) contrast-based whole-head inverse imaging (InI). Granger causality analysis based on raw InI BOLD data sampled at 100-ms resolution detected the expected causal relations, whereas when the data were downsampled to the temporal resolution of 2 s typically used in echo-planar fMRI, the causality could not be detected. An additional control analysis, in which we SINC interpolated additional data points to the downsampled time series at 0.1-s intervals, confirmed that the improvements achieved with the real InI data were not explainable by the increased time-series length alone. We therefore conclude that the high temporal resolution of InI improves the Granger causality connectivity analysis of the human brain.
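
    A hedged sketch of the effect described above, using synthetic data rather than the InI fMRI recordings: a Granger causality test is run on two coupled series at fine temporal sampling and again after downsampling, where the lagged influence becomes much harder to detect. The coupling model, lag, and downsampling factor are assumptions for illustration.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        n = 4000
        x = rng.standard_normal(n)
        y = np.zeros(n)
        for t in range(2, n):
            # x drives y with a two-step lag, plus some autoregression and noise
            y[t] = 0.6 * x[t - 2] + 0.3 * y[t - 1] + rng.standard_normal()

        fine = np.column_stack([y, x])      # tests whether x Granger-causes y
        coarse = fine[::20]                 # emulate a 20-fold slower sampling rate

        for name, data in [("fine sampling", fine), ("coarse sampling", coarse)]:
            res = grangercausalitytests(data, maxlag=4, verbose=False)
            p = res[2][0]["ssr_ftest"][1]   # p-value of the F-test at lag 2
            print(f"{name}: p = {p:.3g}")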

  8. Improved phylogenomic taxon sampling noticeably affects nonbilaterian relationships.

    Science.gov (United States)

    Pick, K S; Philippe, H; Schreiber, F; Erpenbeck, D; Jackson, D J; Wrede, P; Wiens, M; Alié, A; Morgenstern, B; Manuel, M; Wörheide, G

    2010-09-01

    Despite expanding data sets and advances in phylogenomic methods, deep-level metazoan relationships remain highly controversial. Recent phylogenomic analyses depart from classical concepts in recovering ctenophores as the earliest branching metazoan taxon and propose a sister-group relationship between sponges and cnidarians (e.g., Dunn CW, Hejnol A, Matus DQ, et al. (18 co-authors). 2008. Broad phylogenomic sampling improves resolution of the animal tree of life. Nature 452:745-749). Here, we argue that these results are artifacts stemming from insufficient taxon sampling and long-branch attraction (LBA). By increasing taxon sampling from previously unsampled nonbilaterians and using an identical gene set to that reported by Dunn et al., we recover monophyletic Porifera as the sister group to all other Metazoa. This suggests that the basal position of the fast-evolving Ctenophora proposed by Dunn et al. was due to LBA and that broad taxon sampling is of fundamental importance to metazoan phylogenomic analyses. Additionally, saturation in the Dunn et al. character set is comparatively high, possibly contributing to the poor support for some nonbilaterian nodes.

  9. Improvement of the mechanical properties of reinforced aluminum foam samples

    Science.gov (United States)

    Formisano, A.; Barone, A.; Carrino, L.; De Fazio, D.; Langella, A.; Viscusi, A.; Durante, M.

    2018-05-01

    Closed-cell aluminum foam has attracted increasing attention due to its very interesting properties, thanks to which it is expected to be used as both a structural and a functional material. A research challenge is the improvement of the mechanical properties of foam-based structures through a reinforcement approach that does not compromise their lightness. Consequently, the aim of this research is the fabrication of enhanced aluminum foam samples without significantly increasing their original weight. To this end, cylindrical samples with a core of closed-cell aluminum foam and a skin of fabrics and grids of different materials were fabricated in a one-step process and were mechanically characterized, in order to investigate their behaviour and to compare their mechanical properties with those of the traditional foam.

  10. Two-dimensional T2 distribution mapping in rock core plugs with optimal k-space sampling.

    Science.gov (United States)

    Xiao, Dan; Balcom, Bruce J

    2012-07-01

    Spin-echo single point imaging has been employed for 1D T2 distribution mapping, but a simple extension to 2D is challenging since the time increase is n-fold, where n is the number of pixels in the second dimension. Nevertheless 2D T2 mapping in fluid saturated rock core plugs is highly desirable because the bedding plane structure in rocks often results in different pore properties within the sample. The acquisition time can be improved by undersampling k-space. The cylindrical shape of rock core plugs yields well defined intensity distributions in k-space that may be efficiently determined by new k-space sampling patterns that are developed in this work. These patterns acquire 22.2% and 11.7% of the k-space data points. Companion density images may be employed, in a keyhole imaging sense, to improve image quality. T2 weighted images are fit to extract T2 distributions, pixel by pixel, employing an inverse Laplace transform. Images reconstructed with compressed sensing, with similar acceleration factors, are also presented. The results show that restricted k-space sampling, in this application, provides high quality results. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Sample preparation combined with electroanalysis to improve simultaneous determination of antibiotics in animal derived food samples.

    Science.gov (United States)

    da Silva, Wesley Pereira; de Oliveira, Luiz Henrique; Santos, André Luiz Dos; Ferreira, Valdir Souza; Trindade, Magno Aparecido Gonçalves

    2018-06-01

    A procedure based on liquid-liquid extraction (LLE) and phase separation using magnetically stirred salt-induced high-temperature liquid-liquid extraction (PS-MSSI-HT-LLE) was developed to extract and pre-concentrate ciprofloxacin (CIPRO) and enrofloxacin (ENRO) from animal food samples before electroanalysis. Firstly, simple LLE was used to extract the fluoroquinolones (FQs) from animal food samples, in which dilution was performed to reduce interference effects to below a tolerable threshold. Then, adapted PS-MSSI-HT-LLE protocols allowed re-extraction and further pre-concentration of target analytes in the diluted acid samples for simultaneous electrochemical quantification at low concentration levels. To improve the peak separation in simultaneous detection, a baseline-corrected second-order derivative approach was applied. These approaches allowed quantification of target FQs from animal food samples spiked at levels of 0.80 to 2.00 µmol L^-1 in chicken meat, with recovery values always higher than 80.5%, as well as in milk samples spiked at 4.00 µmol L^-1, with recovery values close to 70.0%. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Improved Space Object Orbit Determination Using CMOS Detectors

    Science.gov (United States)

    Schildknecht, T.; Peltonen, J.; Sännti, T.; Silha, J.; Flohrer, T.

    2014-09-01

    a sensor in a sun-synchronous LEO orbit, always pointing in the anti-sun direction to achieve optimum illumination conditions for small LEO debris, was simulated. For the space-based scenario the simulations showed a 20% to 130% improvement of the accuracy of all orbital parameters when varying the frame rate from 1/3 fps, which is the fastest rate for a typical CCD detector, to 50 fps, which represents the highest rate of scientific CMOS cameras. Changing the epoch registration accuracy from a typical 20.0 ms for a mechanical shutter to 0.025 ms, the theoretical value for the electronic shutter of a CMOS camera, improved the orbit accuracy by 4% to 190%. The ground-based scenario also benefits from the specific CMOS characteristics, but to a lesser extent.

  13. Maritime Activities: Requirements for Improving Space Based Solutions

    Science.gov (United States)

    Cragnolini, A.; Miguel-Lago, M.

    2005-03-01

    Maritime initiatives cannot be pursued only within their own perimeter. Sector endeavours and the policies which rule over them have wide-ranging implications and several links with other sectors of activity. A well-balanced relationship of sea exploitation, maritime transportation, environmental protection and security, ruled by national or international laws, will be a main issue for the future of all kinds of maritime activities. Scientific research and technology development, along with enlightened and appropriate institutional regulations, are relevant to ensure maritime sustainability. The use of satellite technology for monitoring international agreements should have close co-ordination and be based on institutional consensus. Frequently, rules and new regulations set by policy makers are not demanding enough due to lack of knowledge about the possibilities offered by available technologies. Law enforcement actions could bring space technology new opportunities to offer solutions for monitoring and verification. Operators should aim at offering space data in a more operational and user-friendly way, providing users with useful and timely information. This paper will analyse the contribution of satellite technology to the specificity of the maritime sector, stressing the conditions for both adequate technology improvement and effective policy implementation. After analysing the links between maritime activities, space technologies and the institutional environment, the paper identifies some boundary conditions for future developments. Conclusions are basically a checklist for improving the present situation, while a road map is suggested as a way to proceed.

  14. Improved sampling and analysis of images in corneal confocal microscopy.

    Science.gov (United States)

    Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R

    2017-10-01

    Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is in constant progress, there are methodological limitations in terms of sampling of images and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method of the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. 23 idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined in both a manual and automatic manner. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was no statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling method indicating the importance of clear guidelines for the image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. This study underlines the importance of improving the analysis of the

  15. Improved grand canonical sampling of vapour-liquid transitions.

    Science.gov (United States)

    Wilding, Nigel B

    2016-10-19

    Simulation within the grand canonical ensemble is the method of choice for accurate studies of first order vapour-liquid phase transitions in model fluids. Such simulations typically employ sampling that is biased with respect to the overall number density in order to overcome the free energy barrier associated with mixed phase states. However, at low temperature and for large system size, this approach suffers a drastic slowing down in sampling efficiency. The culprits are geometrically induced transitions (stemming from the periodic boundary conditions) which involve changes in droplet shape from sphere to cylinder and cylinder to slab. Since the overall number density does not discriminate sufficiently between these shapes, it fails as an order parameter for biasing through the transitions. Here we report two approaches to ameliorating these difficulties. The first introduces a droplet shape based order parameter that generates a transition path from vapour to slab states for which spherical and cylindrical droplets are suppressed. The second simply biases with respect to the number density in a tetragonal subvolume of the system. Compared to the standard approach, both methods offer improved sampling, allowing estimates of coexistence parameters and vapor-liquid surface tension for larger system sizes and lower temperatures.

  16. Enabling Global Lunar Sample Return and Life-Detection Studies Using a Deep-Space Gateway

    Science.gov (United States)

    Cohen, B. A.; Eigenbrode, J. A.; Young, K. E.; Bleacher, J. E.; Trainer, M. E.

    2018-02-01

    The Deep Space Gateway could uniquely enable a lunar robotic sampling campaign that would provide incredible science return as well as feed forward to Mars and Europa by testing instrument sterility and ability to distinguish biogenic signals.

  17. Description of European Space Agency (ESA) Concept Development for a Mars Sample Receiving Facility (MSRF)

    Science.gov (United States)

    Vrublevskis, J.; Berthoud, L.; Guest, M.; Smith, C.; Bennett, A.; Gaubert, F.; Schroeven-Deceuninck, H.; Duvet, L.; van Winnendael, M.

    2018-04-01

    This presentation gives an overview of the several studies conducted for the European Space Agency (ESA) since 2007, which progressively developed layouts for a potential implementation of a Mars Sample Receiving Facility (MSRF).

  18. Classifier-guided sampling for discrete variable, discontinuous design space exploration: Convergence and computational performance

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shahan, David W. [HRL Labs., LLC, Malibu, CA (United States); Seepersad, Carolyn Conner [Univ. of Texas, Austin, TX (United States)

    2014-04-22

    A classifier-guided sampling (CGS) method is introduced for solving engineering design optimization problems with discrete and/or continuous variables and continuous and/or discontinuous responses. The method merges concepts from metamodel-guided sampling and population-based optimization algorithms. The CGS method uses a Bayesian network classifier for predicting the performance of new designs based on a set of known observations or training points. Unlike most metamodeling techniques, however, the classifier assigns a categorical class label to a new design, rather than predicting the resulting response in continuous space, and thereby accommodates nondifferentiable and discontinuous functions of discrete or categorical variables. The CGS method uses these classifiers to guide a population-based sampling process towards combinations of discrete and/or continuous variable values with a high probability of yielding preferred performance. Accordingly, the CGS method is appropriate for discrete/discontinuous design problems that are ill-suited for conventional metamodeling techniques and too computationally expensive to be solved by population-based algorithms alone. In addition, the rates of convergence and computational properties of the CGS method are investigated when applied to a set of discrete variable optimization problems. Results show that the CGS method significantly improves the rate of convergence towards known global optima, on average, when compared to genetic algorithms.
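
    A minimal sketch of the classifier-guided idea summarized above, with assumed details: a Gaussian naive Bayes classifier (standing in for the Bayesian network classifier of the paper) is trained on already-evaluated designs labelled good or bad, and a large pool of random candidates is then screened so that only predicted-good designs are forwarded for expensive evaluation. The objective function, labelling rule, and dimensions are illustrative.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        def objective(x):                        # stand-in for the expensive response
            return np.sum((x - 0.3) ** 2, axis=1)

        rng = np.random.default_rng(0)
        X_train = rng.random((200, 5))           # designs that have already been evaluated
        labels = objective(X_train) < np.median(objective(X_train))   # "good" = best half

        clf = GaussianNB().fit(X_train, labels)

        candidates = rng.random((10_000, 5))
        promising = candidates[clf.predict(candidates)]   # keep only predicted-good designs
        print(f"{len(promising)} of 10000 candidates forwarded for expensive evaluation")

    In the actual CGS method the classifier guides an iterative, population-based sampling process; the sketch shows only a single screening step.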

  19. Multitask Classification Hypothesis Space With Improved Generalization Bounds.

    Science.gov (United States)

    Li, Cong; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2015-07-01

    This paper presents a pair of hypothesis spaces (HSs) of vector-valued functions intended to be used in the context of multitask classification. While both are parameterized on the elements of reproducing kernel Hilbert spaces and impose a feature mapping that is common to all tasks, one of them assumes this mapping as fixed, while the more general one learns the mapping via multiple kernel learning. For these new HSs, empirical Rademacher complexity-based generalization bounds are derived, and are shown to be tighter than the bound of a particular HS, which has appeared recently in the literature, leading to improved performance. As a matter of fact, the latter HS is shown to be a special case of ours. Based on an equivalence to Group-Lasso type HSs, the proposed HSs are utilized toward corresponding support vector machine-based formulations. Finally, experimental results on multitask learning problems underline the quality of the derived bounds and validate this paper's analysis.

  20. Improving science literacy and education through space life sciences

    Science.gov (United States)

    MacLeish, M. Y.; Moreno, N. P.; Tharp, B. Z.; Denton, J. J.; Jessup, G.; Clipper, M. C.

    2001-01-01

    The National Space Biomedical Research Institute (NSBRI) encourages open involvement by scientists and the public at large in the Institute's activities. Through its Education and Public Outreach Program, the Institute is supporting national efforts to improve Kindergarten through grade twelve (K-12) and undergraduate education and to communicate knowledge generated by space life science research to lay audiences. Three academic institutions (Baylor College of Medicine, Morehouse School of Medicine and Texas A&M University) are designing, producing, field-testing, and disseminating a comprehensive array of programs and products to achieve this goal. The objectives of the NSBRI Education and Public Outreach program are to: promote systemic change in elementary and secondary science education; attract undergraduate students--especially those from underrepresented groups--to careers in space life sciences, engineering and technology-based fields; increase scientific literacy; and develop public and private sector partnerships that enhance and expand NSBRI efforts to reach students and families. © 2001 Elsevier Science Ltd. All rights reserved.

  1. Accuracy of micro four-point probe measurements on inhomogeneous samples: A probe spacing dependence study

    DEFF Research Database (Denmark)

    Wang, Fei; Petersen, Dirch Hjorth; Østerberg, Frederik Westergaard

    2009-01-01

    In this paper, we discuss a probe spacing dependence study in order to estimate the accuracy of micro four-point probe measurements on inhomogeneous samples. Based on sensitivity calculations, both sheet resistance and Hall effect measurements are studied for samples (e.g. laser annealed samples) with periodic variations of sheet resistance, sheet carrier density, and carrier mobility. With a variation wavelength of λ, probe spacings from 0.001λ to 100λ have been applied to characterize the local variations. The calculations show that the measurement error is highly dependent on the probe spacing. When the probe spacing is smaller than 1/40 of the variation wavelength, micro four-point probes can provide an accurate record of local properties with less than 1% measurement error. All the calculations agree well with previous experimental results.

  2. Combined Space and Water Heating: Next Steps to Improved Performance

    Energy Technology Data Exchange (ETDEWEB)

    Schoenbauer, B. [NorthernSTAR Building America Partnership, Minneapolis, MN (United States); Bohac, D. [NorthernSTAR Building America Partnership, Minneapolis, MN (United States); Huelman, P. [NorthernSTAR Building America Partnership, Minneapolis, MN (United States)

    2016-07-13

    A combined space- and water-heating (combi) system uses a high-efficiency direct-vent burner that eliminates safety issues associated with natural draft appliances. Past research with these systems shows that using condensing water heaters or boilers with hydronic air handling units can provide both space and water heating with efficiencies of 90% or higher. Improved controls have the potential to reduce complexity and improve upon the measured performance. This project demonstrates that controls can significantly benefit these first-generation systems. Laboratory tests and daily load/performance models showed that the set point temperature reset control produced a 2.1%–4.3% (20–40 therms/year) savings for storage and hybrid water heater combi systems operated in moderate-load homes. The full modulation control showed additional savings over set point control (in high-load homes almost doubling the savings: 4%–5% over the no-control case). At the time of installation the reset control can be implemented for $200–$400, which would provide paybacks of 6–25 years for low-load houses and 3–15 years for high-load houses. Full modulation implementation costs would be similar to the outdoor reset and would provide paybacks of 5-½–20 years for low-load houses and 2-½–10 years for high-load houses.

  3. Comparison of Directionally Solidified Samples Solidified Terrestrially and Aboard the International Space Station

    Science.gov (United States)

    Angart, S.; Lauer, M.; Tewari, S. N.; Grugel, R. N.; Poirier, D. R.

    2014-01-01

    This article reports research that has been carried out under the aegis of NASA as part of a collaboration between ESA and NASA for solidification experiments on the International Space Station (ISS). The focus has been on the effect of convection on the microstructural evolution and macrosegregation in hypoeutectic Al-Si alloys during directional solidification (DS). Terrestrial DS experiments have been carried out at Cleveland State University (CSU), and microgravity experiments on the International Space Station (ISS). The thermal processing history of the experiments is well defined for both the terrestrially processed samples and the ISS-processed samples. As of this writing, two dendritic metrics have been measured: primary dendrite arm spacings and primary dendrite trunk diameters. We have observed that these dendrite metrics of two samples grown in the microgravity environment show good agreement with models based on diffusion-controlled growth and diffusion-controlled ripening, respectively. The gravity-driven convection (i.e., thermosolutal convection) in terrestrially grown samples decreases the primary dendrite arm spacings and causes macrosegregation. Dendrite trunk diameters also show differences between the earth- and space-grown samples. In order to process DS samples aboard the ISS, the dendritic seed crystals were partially remelted in a stationary thermal gradient before the DS was carried out. Microstructural changes and macrosegregation effects during this period are described and have been modeled.

  4. Primary Dendrite Array Morphology: Observations from Ground-based and Space Station Processed Samples

    Science.gov (United States)

    Tewari, Surendra; Rajamure, Ravi; Grugel, Richard; Erdmann, Robert; Poirier, David

    2012-01-01

    Influence of natural convection on primary dendrite array morphology during directional solidification is being investigated under a collaborative European Space Agency-NASA joint research program, "Microstructure Formation in Castings of Technical Alloys under Diffusive and Magnetically Controlled Convective Conditions (MICAST)". Two Aluminum-7 wt pct Silicon alloy samples, MICAST6 and MICAST7, were directionally solidified in microgravity on the International Space Station. Terrestrially grown dendritic monocrystal cylindrical samples were remelted and directionally solidified at 18 K/cm (MICAST6) and 28 K/cm (MICAST7). Directional solidification involved a growth speed step increase (MICAST6-from 5 to 50 micron/s) and a speed decrease (MICAST7-from 20 to 10 micron/s). Distribution and morphology of primary dendrites is currently being characterized in these samples, and also in samples solidified on earth under nominally similar thermal gradients and growth speeds. Primary dendrite spacing and trunk diameter measurements from this investigation will be presented.

  5. Reduced aliasing artifacts using shaking projection k-space sampling trajectory

    Science.gov (United States)

    Zhu, Yan-Chun; Du, Jiang; Yang, Wen-Chao; Duan, Chai-Jie; Wang, Hao-Yu; Gao, Song; Bao, Shang-Lian

    2014-03-01

    Radial imaging techniques, such as projection-reconstruction (PR), are used in magnetic resonance imaging (MRI) for dynamic imaging, angiography, and short-T2 imaging. They are less sensitive to flow and motion artifacts, and support fast imaging with short echo times. However, aliasing and streaking artifacts are two main sources which degrade radial imaging quality. For a given fixed number of k-space projections, data distributions along radial and angular directions will influence the level of aliasing and streaking artifacts. Conventional radial k-space sampling trajectory introduces an aliasing artifact at the first principal ring of point spread function (PSF). In this paper, a shaking projection (SP) k-space sampling trajectory was proposed to reduce aliasing artifacts in MR images. SP sampling trajectory shifts the projection alternately along the k-space center, which separates k-space data in the azimuthal direction. Simulations based on conventional and SP sampling trajectories were compared with the same number projections. A significant reduction of aliasing artifacts was observed using the SP sampling trajectory. These two trajectories were also compared with different sampling frequencies. A SP trajectory has the same aliasing character when using half sampling frequency (or half data) for reconstruction. SNR comparisons with different white noise levels show that these two trajectories have the same SNR character. In conclusion, the SP trajectory can reduce the aliasing artifact without decreasing SNR and also provide a way for undersampling reconstruction. Furthermore, this method can be applied to three-dimensional (3D) hybrid or spherical radial k-space sampling for a more efficient reduction of aliasing artifacts.
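
    A hedged sketch of generating radial spokes with the alternating ("shaking") shift of projections about the k-space centre described above. The shift size, spoke count, and readout length are illustrative assumptions, not the values used in the study.

        import numpy as np

        def radial_trajectory(n_spokes=64, n_readout=128, shake=0.0):
            """Return an (n_spokes, n_readout, 2) array of k-space sample coordinates."""
            kmax = 0.5
            radii = np.linspace(-kmax, kmax, n_readout)
            spokes = []
            for i in range(n_spokes):
                theta = np.pi * i / n_spokes                  # spoke angle in [0, pi)
                shift = shake * (1 if i % 2 == 0 else -1) / n_readout
                kx = (radii + shift) * np.cos(theta)          # shift each spoke along its own direction,
                ky = (radii + shift) * np.sin(theta)          # alternating the sign from spoke to spoke
                spokes.append(np.column_stack([kx, ky]))
            return np.array(spokes)

        conventional = radial_trajectory(shake=0.0)   # every spoke passes through the k-space centre
        shaking = radial_trajectory(shake=0.5)        # alternately shifted ("shaking") spokes
        print(conventional.shape, shaking.shape)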

  6. Reduced aliasing artifacts using shaking projection k-space sampling trajectory

    International Nuclear Information System (INIS)

    Zhu Yan-Chun; Yang Wen-Chao; Wang Hao-Yu; Gao Song; Bao Shang-Lian; Du Jiang; Duan Chai-Jie

    2014-01-01

    Radial imaging techniques, such as projection-reconstruction (PR), are used in magnetic resonance imaging (MRI) for dynamic imaging, angiography, and short-T2 imaging. They are less sensitive to flow and motion artifacts, and support fast imaging with short echo times. However, aliasing and streaking artifacts are two main sources which degrade radial imaging quality. For a given fixed number of k-space projections, data distributions along radial and angular directions will influence the level of aliasing and streaking artifacts. Conventional radial k-space sampling trajectory introduces an aliasing artifact at the first principal ring of point spread function (PSF). In this paper, a shaking projection (SP) k-space sampling trajectory was proposed to reduce aliasing artifacts in MR images. SP sampling trajectory shifts the projection alternately along the k-space center, which separates k-space data in the azimuthal direction. Simulations based on conventional and SP sampling trajectories were compared with the same number projections. A significant reduction of aliasing artifacts was observed using the SP sampling trajectory. These two trajectories were also compared with different sampling frequencies. A SP trajectory has the same aliasing character when using half sampling frequency (or half data) for reconstruction. SNR comparisons with different white noise levels show that these two trajectories have the same SNR character. In conclusion, the SP trajectory can reduce the aliasing artifact without decreasing SNR and also provide a way for undersampling reconstruction. Furthermore, this method can be applied to three-dimensional (3D) hybrid or spherical radial k-space sampling for a more efficient reduction of aliasing artifacts

  7. A phoswich detector design for improved spatial sampling in PET

    Science.gov (United States)

    Thiessen, Jonathan D.; Koschan, Merry A.; Melcher, Charles L.; Meng, Fang; Schellenberg, Graham; Goertzen, Andrew L.

    2018-02-01

    Block detector designs, utilizing a pixelated scintillator array coupled to a photosensor array in a light-sharing design, are commonly used for positron emission tomography (PET) imaging applications. In practice, the spatial sampling of these designs is limited by the crystal pitch, which must be large enough for individual crystals to be resolved in the detector flood image. Replacing the conventional 2D scintillator array with an array of phoswich elements, each consisting of an optically coupled side-by-side scintillator pair, may improve spatial sampling in one direction of the array without requiring resolving smaller crystal elements. To test the feasibility of this design, a 4 × 4 phoswich array was constructed, with each phoswich element consisting of two optically coupled, 3.17 × 1.58 × 10 mm³ LSO crystals co-doped with cerium and calcium. The amount of calcium doping was varied to create a 'fast' LSO crystal with decay time of 32.9 ns and a 'slow' LSO crystal with decay time of 41.2 ns. Using a Hamamatsu R8900U-00-C12 position-sensitive photomultiplier tube (PS-PMT) and a CAEN V1720 250 MS/s waveform digitizer, we were able to show effective discrimination of the fast and slow LSO crystals in the phoswich array. Although a side-by-side phoswich array is feasible, reflections at the crystal boundary due to a mismatch between the refractive index of the optical adhesive (n = 1.5) and LSO (n = 1.82) caused it to behave optically as an 8 × 4 array rather than a 4 × 4 array. Direct coupling of each phoswich element to individual photodetector elements may be necessary with the current phoswich array design. Alternatively, in order to implement this phoswich design with a conventional light sharing PET block detector, a high refractive index optical adhesive is necessary to closely match the refractive index of LSO.

  8. On the improvement of blood sample collection at clinical laboratories.

    Science.gov (United States)

    Grasas, Alex; Ramalhinho, Helena; Pessoa, Luciana S; Resende, Mauricio G C; Caballé, Imma; Barba, Nuria

    2014-01-09

    Blood samples are usually collected daily from different collection points, such as hospitals and health centers, and transported to a core laboratory for testing. This paper presents a project to improve the collection routes of two of the largest clinical laboratories in Spain. These routes must be designed in a cost-efficient manner while satisfying two important constraints: (i) two-hour time windows between collection and delivery, and (ii) vehicle capacity. A heuristic method based on a genetic algorithm has been designed to solve the problem of blood sample collection. The user enters the following information for each collection point: postal address, average collecting time, and average demand (in thermal containers). After implementing the algorithm using C programming, it is run and, in a few seconds, it obtains optimal (or near-optimal) collection routes that specify the collection sequence for each vehicle. Different scenarios using various types of vehicles have been considered. Unless new collection points are added or problem parameters are changed substantially, routes need to be designed only once. The two laboratories in this study previously planned routes manually for 43 and 74 collection points, respectively. These routes were covered by an external carrier company. With the implementation of this algorithm, the number of routes could be reduced from ten to seven in one laboratory and from twelve to nine in the other, which represents significant annual savings in transportation costs. The algorithm presented can be easily implemented in other laboratories that face this type of problem, and it is particularly interesting and useful as the number of collection points increases. The method designs blood collection routes with reduced costs that meet the time and capacity constraints of the problem.
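
    The record names a genetic algorithm but not its exact operators, so the sketch below is only a hedged illustration of evolving the visiting order of collection points to shorten a single route. The two-hour time windows, vehicle capacities, and multiple vehicles of the real problem are omitted, and the coordinates, population size, and operators are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        points = rng.random((15, 2)) * 50          # 15 collection points, coordinates in km

        def route_length(order):
            path = points[order]
            return float(np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1)))

        def crossover(a, b):
            cut = int(rng.integers(1, len(a) - 1))
            head = a[:cut]
            return head + [p for p in b if p not in head]   # keeps the child a valid permutation

        def mutate(order):
            i, j = rng.integers(0, len(order), size=2)
            order[i], order[j] = order[j], order[i]         # swap two stops
            return order

        population = [list(rng.permutation(len(points))) for _ in range(60)]
        for _ in range(300):
            population.sort(key=route_length)
            survivors = population[:20]                     # keep the shortest routes
            children = [mutate(crossover(survivors[int(rng.integers(0, 20))],
                                         survivors[int(rng.integers(0, 20))]))
                        for _ in range(40)]
            population = survivors + children

        best = min(population, key=route_length)
        print("best route length (km):", round(route_length(best), 1))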

  9. Improving Early Career Science Teachers' Ability to Teach Space Science

    Science.gov (United States)

    Schultz, G. R.; Slater, T. F.; Wierman, T.; Erickson, J. G.; Mendez, B. J.

    2012-12-01

    The GEMS Space Science Sequence is a high quality, hands-on curriculum for elementary and middle schools, created by a national team of astronomers and science educators with NASA funding and support. The standards-aligned curriculum includes 24 class sessions for upper elementary grades targeting the scale and nature of Earth's shape, motion, and gravity, and 36 class sessions for middle school grades focusing on the interactions between our Sun and Earth and the nature of the solar system and beyond. These materials feature extensive teacher support materials, which result in pre-test to post-test content gains for students averaging 22%. Despite the materials being highly successful, there has been a less than desired uptake by teachers in using these materials, largely due to a lack of professional development training. Responding to the need to improve the quantity and quality of space science education, a collaborative of space scientists and science educators - from the University of California, Berkeley's Lawrence Hall of Science (LHS) and Center for Science Education at the Space Sciences Laboratory (CSE@SSL), the Astronomical Society of the Pacific (ASP), the University of Wyoming, and the CAPER Center for Astronomy & Physics Education - experimented with a unique professional development model focused on helping master teachers work closely with pre-service teachers during their student teaching internship field experience. Research on the exodus of young teachers from the teaching profession clearly demonstrates that early career teachers often leave teaching because of a lack of mentoring support and classroom ready curriculum materials. The Advancing Mentor and Novice Teachers in Space Science (AMANTISS) team first identified master teachers who supervise novice, student teachers in middle school, and trained these master teachers to use the GEMS Space Science Sequence for Grades 6-8. Then, these master teachers were mentored in how to coach their

  10. Improving Satellite Compatible Microdevices to Study Biology in Space

    Science.gov (United States)

    Kalkus, Trevor; Snyder, Jessica; Paulino-Lima, Ivan; Rothschild, Lynn

    2017-01-01

    The technology for biology in space lags far behind the gold standard for biological experiments on Earth. To remedy this disparity, the Rothschild lab works on proof of concept, prototyping, and development of new sensors and devices to further the capabilities of biology research on satellites. One such device is the PowerCell Payload System. One goal for synthetic biology in aiding space travel and colonization is to genetically engineer living cells to produce biochemicals in space. However, such farming in space presupposes that bacteria retain their functionality post-launch, bombarded by radiation, and without the 1 G of Earth. Our questions are: Does a co-culture of cyanobacteria and protein-synthesizing bacteria produce Earth-like yields of target proteins? Is the yield sensitive to variable gravitational forces? To answer these questions, a PowerCell Payload System will spend 1 year aboard the German Aerospace Center's Euglena and Combined Regenerative Organic-food Production In Space (Eu:CROPIS) mission satellite. The PowerCell system is a pair of 48-well microfluidic cards, each well seeded with bacteria. The system integrates fluidic, thermal, optical, electronic, and control systems to germinate bacterial spores, then measure the protein synthesized for comparison to parallel experiments conducted on Earth. In developing the PowerCell Payload, we gained insight into the shortcomings of biology experiments on satellites. To address these issues, we have started three new prototyping projects: (1) the development of an extremely stable and radiation-resistant cell-free system, allowing for the construction of proteins utilizing only cell components instead of living cells; this can be lyophilized on a substrate, like paper. (2) Using paper as a microfluidic platform that is flexible, stable, cheap, and wicking; the capillary action eliminates the need for pumps, reducing volume, mass, and potential failing points. Electrodes can be printed on the paper to

  11. Improved Ionic Liquids as Space Lubricants, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Ionic liquids are candidate lubricant materials. However for application in low temperature space mechanisms their lubrication performance needs to be enhanced. UES...

  12. Histological and Transcriptomic Analysis of Adult Japanese Medaka Sampled Onboard the International Space Station.

    Directory of Open Access Journals (Sweden)

    Yasuhiko Murata

    To understand how humans adapt to the space environment, many experiments can be conducted on astronauts as they work aboard the Space Shuttle or the International Space Station (ISS). We also need animal experiments that can apply to human models and help prevent or solve the health issues we face in space travel. The Japanese medaka (Oryzias latipes) is a suitable model fish for studying space adaptation, as evidenced by adults of the species having mated successfully in space during the 15-day flight of the second International Microgravity Laboratory mission in 1994. The eggs laid by the fish developed normally and hatched as juveniles in space. In 2012, another space experiment ("Medaka Osteoclast") was conducted. Six-week-old male and female Japanese medaka (Cab strain osteoblast transgenic fish) were maintained in the Aquatic Habitat system for two months in the ISS. Fish of the same strain and age were used as the ground controls. Six fish were fixed with paraformaldehyde or kept in RNA stabilization reagent (n = 4) and dissected for tissue sampling after being returned to the ground, so that several principal investigators working on the project could share samples. Histology indicated no significant changes except in the ovary. However, the RNA-seq analysis of 5345 genes from six tissues revealed highly tissue-specific space responsiveness after a two-month stay in the ISS. Similar responsiveness was observed among the brain and eye, ovary and testis, and the liver and intestine. Among these six tissues, the intestine showed the highest space response, with 10 genes categorized as oxidation-reduction processes (gene ontology term GO:0055114), and the expression levels of choriogenin precursor genes were suppressed in the ovary. Eleven genes including klf9, klf13, odc1, hsp70 and hif3a were upregulated in more than four of the tissues examined, thus suggesting common immunoregulatory and stress responses during space adaptation.

  13. On the asymptotic improvement of supervised learning by utilizing additional unlabeled samples - Normal mixture density case

    Science.gov (United States)

    Shahshahani, Behzad M.; Landgrebe, David A.

    1992-01-01

    The effect of additional unlabeled samples in improving the supervised learning process is studied in this paper. Three learning processes (supervised, unsupervised, and combined supervised-unsupervised) are compared by studying the asymptotic behavior of the estimates obtained under each process. Upper and lower bounds on the asymptotic covariance matrices are derived. It is shown that under a normal mixture density assumption for the probability density function of the feature space, the combined supervised-unsupervised learning is always superior to the supervised learning in achieving better estimates. Experimental results are provided to verify the theoretical concepts.
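
    The combined supervised-unsupervised estimation idea studied in the paper can be illustrated with a toy example: a two-component normal mixture in one dimension, where labeled samples keep fixed class memberships and unlabeled samples enter through EM posterior probabilities. This is a hedged sketch under invented data, not the estimator analyzed in the article.

```python
# Minimal 1-D sketch of combined supervised-unsupervised estimation for a
# two-component normal mixture: labeled samples contribute hard memberships,
# unlabeled samples contribute EM posterior probabilities (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data (assumed): two labeled classes plus many unlabeled draws.
x_lab = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(3.0, 1.0, 30)])
y_lab = np.concatenate([np.zeros(30, int), np.ones(30, int)])
x_unl = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])

def normal_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Initialize from the labeled data alone (the purely supervised estimate).
pi = np.array([np.mean(y_lab == k) for k in (0, 1)])
mu = np.array([x_lab[y_lab == k].mean() for k in (0, 1)])
var = np.array([x_lab[y_lab == k].var() for k in (0, 1)])

for _ in range(50):
    # E-step: posterior class probabilities for the unlabeled samples only.
    lik = np.stack([pi[k] * normal_pdf(x_unl, mu[k], var[k]) for k in (0, 1)])
    resp_unl = lik / lik.sum(axis=0)
    # Labeled samples keep fixed (0/1) responsibilities.
    resp_lab = np.stack([(y_lab == k).astype(float) for k in (0, 1)])
    # M-step over the pooled labeled + unlabeled sample.
    for k in (0, 1):
        w = np.concatenate([resp_lab[k], resp_unl[k]])
        x = np.concatenate([x_lab, x_unl])
        pi[k] = w.mean()
        mu[k] = (w * x).sum() / w.sum()
        var[k] = (w * (x - mu[k]) ** 2).sum() / w.sum()

print("mixing:", pi.round(3), "means:", mu.round(3), "variances:", var.round(3))
```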

  14. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species of conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  15. Greater vertical spot spacing to improve femtosecond laser capsulotomy quality.

    Science.gov (United States)

    Schultz, Tim; Joachim, Stephanie C; Noristani, Rozina; Scott, Wendell; Dick, H Burkhard

    2017-03-01

    To evaluate the effect of adapted capsulotomy laser settings on the cutting quality in femtosecond laser-assisted cataract surgery. Ruhr-University Eye Clinic, Bochum, Germany. Prospective randomized case series. Eyes were treated with 1 of 2 laser settings. In Group 1, the regular standard settings were used (incisional depth 600 μm, pulse energy 4 μJ, horizontal spot spacing 5 μm, vertical spot spacing 10 μm, treatment time 1.2 seconds). In Group 2, vertical spot spacing was increased to 15 μm and the treatment time was 1.0 seconds. Light microscopy was used to evaluate the cut quality of the capsule edge. The size and number of tags (misplaced laser spots, which form a second cut of the capsule with high tear risk) were evaluated in a blinded manner. Groups were compared using the Mann-Whitney U test. The study comprised 100 eyes (50 eyes in each group). Cataract surgery was successfully completed in all eyes, and no anterior capsule tear occurred during the treatment. Histologically, significantly fewer tags were observed with the new capsulotomy laser setting, and the mean score for the number and size of free tags was significantly lower in this group than with the standard settings. The adapted laser settings improved cut quality and reduced the number of tags. The modification has the potential to reduce the risk for radial capsule tears in femtosecond laser-assisted cataract surgery. With the new settings, no tags and no capsule tears were observed under the operating microscope in any eye. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  16. Improved Nuclear Reactor and Shield Mass Model for Space Applications

    Science.gov (United States)

    Robb, Kevin

    2004-01-01

    New technologies are being developed to explore the distant reaches of the solar system. Beyond Mars, solar energy is inadequate to power advanced scientific instruments. One technology that can meet the energy requirements is the space nuclear reactor. The nuclear reactor is used as a heat source for which a heat-to-electricity conversion system is needed. Examples of such conversion systems are the Brayton, Rankine, and Stirling cycles. Since launch cost is proportional to the amount of mass to lift, mass is always a concern in designing spacecraft. Estimations of system masses are an important part of determining the feasibility of a design. I worked under Michael Barrett in the Thermal Energy Conversion Branch of the Power & Electric Propulsion Division. An in-house Closed Cycle Engine Program (CCEP) is used for the design and performance analysis of closed-Brayton-cycle energy conversion systems for space applications. This program also calculates the system mass including the heat source. CCEP uses the subroutine RSMASS, which has been updated to RSMASS-D, to estimate the mass of the reactor. RSMASS was developed in 1986 at Sandia National Laboratories to quickly estimate the mass of multi-megawatt nuclear reactors for space applications. In response to an emphasis on lower power reactors, RSMASS-D was developed in 1997 and is based on the SP-100 liquid metal cooled reactor. The subroutine calculates the mass of reactor components such as the safety systems, instrumentation and control, radiation shield, structure, reflector, and core. The major improvements in RSMASS-D are that it uses higher fidelity calculations, is easier to use, and automatically optimizes the system mass. RSMASS-D is accurate within 15% of actual data while RSMASS is only accurate within 50%. My goal this summer was to learn the FORTRAN 77 programming language and update the CCEP program with the RSMASS-D model.

  17. Forecasting space weather: Can new econometric methods improve accuracy?

    Science.gov (United States)

    Reikard, Gordon

    2011-06-01

    Space weather forecasts are currently used in areas ranging from navigation and communication to electric power system operations. The relevant forecast horizons can range from as little as 24 h to several days. This paper analyzes the predictability of two major space weather measures using new time series methods, many of them derived from econometrics. The data sets are the Ap geomagnetic index and the solar radio flux at 10.7 cm. The methods tested include nonlinear regressions, neural networks, frequency domain algorithms, GARCH models (which utilize the residual variance), state transition models, and models that combine elements of several techniques. While combined models are complex, they can be programmed using modern statistical software. The data frequency is daily, and forecasting experiments are run over horizons ranging from 1 to 7 days. Two major conclusions stand out. First, the frequency domain method forecasts the Ap index more accurately than any time domain model, including both regressions and neural networks. This finding is very robust, and holds for all forecast horizons. Combining the frequency domain method with other techniques yields a further small improvement in accuracy. Second, the neural network forecasts the solar flux more accurately than any other method, although at short horizons (2 days or less) the regression and the neural net yield similar results. The neural net does best when it includes measures of the long-term component in the data.
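
    As a rough illustration of what a frequency-domain forecaster does, the sketch below fits a harmonic regression at the dominant periodogram frequencies of a synthetic daily series and extrapolates it over a 1-7 day horizon. It is a toy stand-in under invented data, not the specific algorithm evaluated in the paper.

```python
# Illustrative frequency-domain forecast: harmonic regression at the dominant
# periodogram frequencies of a daily series, extrapolated 1-7 days ahead.
import numpy as np

rng = np.random.default_rng(2)
n, horizon = 730, 7
t = np.arange(n + horizon)
# Synthetic daily "geomagnetic activity" series with 27-day and annual cycles.
series = (10 + 4 * np.sin(2 * np.pi * t / 27.0)
             + 2 * np.sin(2 * np.pi * t / 365.25)
             + rng.normal(0, 1.5, n + horizon))
train, future = series[:n], series[n:]

# Periodogram of the demeaned training data; keep the strongest frequencies.
freqs = np.fft.rfftfreq(n, d=1.0)
power = np.abs(np.fft.rfft(train - train.mean())) ** 2
top = freqs[np.argsort(power)[-3:]]            # three dominant frequencies

# Harmonic regression: least squares on sine/cosine pairs at those frequencies.
tt = np.arange(n)
cols = [np.ones(n)]
for f in top:
    cols += [np.sin(2 * np.pi * f * tt), np.cos(2 * np.pi * f * tt)]
X = np.column_stack(cols)
beta, *_ = np.linalg.lstsq(X, train, rcond=None)

# Extrapolate the fitted harmonics over the forecast horizon.
tf = np.arange(n, n + horizon)
cols_f = [np.ones(horizon)]
for f in top:
    cols_f += [np.sin(2 * np.pi * f * tf), np.cos(2 * np.pi * f * tf)]
forecast = np.column_stack(cols_f) @ beta

print("MAE over 7 days:", np.abs(forecast - future).mean().round(2))
```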

  18. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,

    2010-01-25

    Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end-effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom. In addition to supporting efficient sampling of configurations, we show that the RD-space formulation naturally supports planning and, in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end-effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1,000 links in time comparable to open chain sampling, and we can generate samples for 1,000-link multi-loop systems of varying topologies in less than a second. © 2010 The Author(s).

  19. Directionally Solidified Aluminum - 7 wt% Silicon Alloys: Comparison of Earth and International Space Station Processed Samples

    Science.gov (United States)

    Grugel, Richard N,; Tewari, Surendra; Rajamure, R. S.; Erdman, Robert; Poirier, David

    2012-01-01

    Primary dendrite arm spacings of Al-7 wt% Si alloy directionally solidified in the low-gravity environment of space (MICAST-6 and MICAST-7: thermal gradient approx. 19 to 26 K/cm, growth speeds varying from 5 to 50 microns/s) show good agreement with the Hunt-Lu model. Primary dendrite trunk diameters of the ISS-processed samples show a good fit with a simple analytical model based on Kirkwood's approach, proposed here. Natural convection (a) decreases primary dendrite arm spacing and (b) appears to increase primary dendrite trunk diameter.

  20. Improving the image of student-recruited samples : a commentary

    NARCIS (Netherlands)

    Demerouti, E.; Rispens, S.

    2014-01-01

    This commentary argues that the quality and usefulness of student-recruited data can be evaluated by examining the external validity and generalization issues related to this sampling method. Therefore, we discuss how the sampling methods of student- and non-student-recruited samples can enhance or

  1. Improvements in and relating to the incubation of samples

    International Nuclear Information System (INIS)

    Bagshawe, K.D.

    1978-01-01

    Apparatus is described for incubating a plurality of biological samples and particularly as part of an analysis, e.g. radioimmunoassay or enzyme assay, of the samples. The apparatus is comprised of an incubation station with a plurality of containers to which samples together with diluent and reagents are supplied. The containers are arranged in rows in two side-by-side columns and are circulated sequentially. Sample removal means is provided either at a fixed location or at a movable point relative to the incubator. Circulation of the containers and the length of sample incubation time is controlled by a computer. The incubation station may include a plurality of sections with the columns in communication so that rows of samples can be moved from the column of one section to the column of an adjacent section, to provide alternative paths for circulation of the samples. (author)

  2. Correlation between k-space sampling pattern and MTF in compressed sensing MRSI.

    Science.gov (United States)

    Heikal, A A; Wachowicz, K; Fallone, B G

    2016-10-01

    To investigate the relationship between the k-space sampling patterns used for compressed sensing MR spectroscopic imaging (CS-MRSI) and the modulation transfer function (MTF) of the metabolite maps. This relationship may allow the desired frequency content of the metabolite maps to be quantitatively tailored when designing an undersampling pattern. Simulations of a phantom were used to calculate the MTF of Nyquist sampled (NS) 32 × 32 MRSI, and four-times undersampled CS-MRSI reconstructions. The dependence of the CS-MTF on the k-space sampling pattern was evaluated for three sets of k-space sampling patterns generated using different probability distribution functions (PDFs). CS-MTFs were also evaluated for three more sets of patterns generated using a modified algorithm where the sampling ratios are constrained to adhere to PDFs. Strong visual correlation as well as high R² was found between the MTF of CS-MRSI and the product of the frequency-dependent sampling ratio and the NS 32 × 32 MTF. Also, PDF-constrained sampling patterns led to higher reproducibility of the CS-MTF, and stronger correlations to the above-mentioned product. The relationship established in this work provides the user with a theoretical solution for the MTF of CS-MRSI that is both predictable and customizable to the user's needs.
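
    The distinction between patterns drawn freely from a PDF and patterns constrained to adhere to it can be sketched as follows: a 32 × 32, four-fold undersampled mask is generated either by independent Bernoulli draws from a radially decaying density or by keeping, ring by ring, exactly the number of points the density prescribes. The density shape, grid size, and ring binning below are assumptions for illustration, not the authors' design.

```python
# Hedged sketch: two ways to build a 32 x 32, four-fold undersampled k-space
# mask from a variable-density PDF -- unconstrained Bernoulli draws versus
# draws constrained to match the density ring by ring.
import numpy as np

rng = np.random.default_rng(3)
N, accel = 32, 4
ky, kx = np.meshgrid(np.arange(N) - N // 2, np.arange(N) - N // 2, indexing="ij")
r = np.sqrt(kx ** 2 + ky ** 2)

# Variable-density PDF: high near the k-space centre, decaying outward,
# scaled so the expected number of samples is roughly N*N/accel.
pdf = (1.0 - r / r.max()) ** 2
pdf *= (N * N / accel) / pdf.sum()
pdf = np.clip(pdf, 0, 1)

# (a) Unconstrained: each location kept independently with probability pdf.
mask_bernoulli = rng.random((N, N)) < pdf

# (b) PDF-constrained: within each radial ring, keep exactly the expected count.
mask_constrained = np.zeros((N, N), bool)
for lo in range(0, int(r.max()) + 1):
    ring = (r >= lo) & (r < lo + 1)
    idx = np.flatnonzero(ring)
    if idx.size == 0:
        continue
    n_keep = int(round(pdf[ring].sum()))
    keep = rng.choice(idx, size=min(n_keep, idx.size), replace=False)
    mask_constrained.flat[keep] = True

print("samples (Bernoulli):  ", mask_bernoulli.sum())
print("samples (constrained):", mask_constrained.sum())
```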

  3. Action Research to Improve the Learning Space for Diagnostic Techniques

    Directory of Open Access Journals (Sweden)

    Ellen Ariel

    2015-08-01

    The module described and evaluated here was created in response to perceived learning difficulties in diagnostic test design and interpretation for students in third-year Clinical Microbiology. Previously, the activities in lectures and laboratory classes in the module fell into the lower cognitive operations of "knowledge" and "understanding." The new approach was to exchange part of the traditional activities with elements of interactive learning, where students had the opportunity to engage in deep learning using a variety of learning styles. The effectiveness of the new curriculum was assessed by means of on-course student assessment throughout the module, a final exam, an anonymous questionnaire on student evaluation of the different activities and a focus group of volunteers. Although the new curriculum enabled a major part of the student cohort to achieve higher pass grades (p < 0.001), it did not meet the requirements of the weaker students, and the proportion of the students failing the module remained at 34%. The action research applied here provided a number of valuable suggestions from students on how to improve future curricula from their perspective. Most importantly, an interactive online program that facilitated flexibility in the learning space for the different reagents and their interaction in diagnostic tests was proposed. The methods applied to improve and assess a curriculum refresh by involving students as partners in the process, as well as the outcomes, are discussed.

  4. Action Research to Improve the Learning Space for Diagnostic Techniques.

    Science.gov (United States)

    Ariel, Ellen; Owens, Leigh

    2015-12-01

    The module described and evaluated here was created in response to perceived learning difficulties in diagnostic test design and interpretation for students in third-year Clinical Microbiology. Previously, the activities in lectures and laboratory classes in the module fell into the lower cognitive operations of "knowledge" and "understanding." The new approach was to exchange part of the traditional activities with elements of interactive learning, where students had the opportunity to engage in deep learning using a variety of learning styles. The effectiveness of the new curriculum was assessed by means of on-course student assessment throughout the module, a final exam, an anonymous questionnaire on student evaluation of the different activities and a focus group of volunteers. Although the new curriculum enabled a major part of the student cohort to achieve higher pass grades (p < 0.001), it did not meet the requirements of the weaker students, and the proportion of the students failing the module remained at 34%. The action research applied here provided a number of valuable suggestions from students on how to improve future curricula from their perspective. Most importantly, an interactive online program that facilitated flexibility in the learning space for the different reagents and their interaction in diagnostic tests was proposed. The methods applied to improve and assess a curriculum refresh by involving students as partners in the process, as well as the outcomes, are discussed. Journal of Microbiology & Biology Education.

  5. A Transmission Electron Microscope Investigation of Space Weathering Effects in Hayabusa Samples

    Science.gov (United States)

    Keller, Lindsay P.; Berger, Eve L.

    2014-01-01

    The Hayabusa mission to asteroid 25143 Itokawa successfully returned the first direct samples of the regolith from the surface of an asteroid. The Hayabusa samples thus present a special opportunity to directly investigate the evolution of asteroidal surfaces, from the development of the regolith to the study of the more complex effects of space weathering. Here we describe the mineralogy, microstructure and composition of three Hayabusa mission particles using transmission electron microscope (TEM) techniques

  6. Space Surveillance Network and Analysis Model (SSNAM) Performance Improvements

    National Research Council Canada - National Science Library

    Butkus, Albert; Roe, Kevin; Mitchell, Barbara L; Payne, Timothy

    2007-01-01

    ... capacity by sensor, models for sensors yet to be created, user defined weather conditions, National Aeronautical and Space Administration catalog growth model including space debris, and solar flux just to name a few...

  7. Architectural Design Space Exploration of an FPGA-based Compressed Sampling Engine

    DEFF Research Database (Denmark)

    El-Sayed, Mohammad; Koch, Peter; Le Moullec, Yannick

    2015-01-01

    We present the architectural design space exploration of a compressed sampling engine for use in a wireless heart-rate monitoring system. We show how parallelism affects execution time at the register transfer level. Furthermore, two example solutions (modified semi-parallel and full...

  8. An extended sampling of the configurational space of HPr from E-coli

    NARCIS (Netherlands)

    de Groot, B.L.; Amadei, A; Scheek, R.M.; van Nuland, N.A.J.; Berendsen, H.J.C.

    1996-01-01

    Recently, we developed a method (Amadei et al., J. Biomol. Str. Dyn. 13: 815-626; de Groot et al., J. Biomol. Str. Dyn. 13: 741-751, 1996) to obtain an extended sampling of the configurational space of proteins, using an adapted form of molecular dynamics (MD) simulations, based on the essential

  9. Numerically Accelerated Importance Sampling for Nonlinear Non-Gaussian State Space Models

    NARCIS (Netherlands)

    Koopman, S.J.; Lucas, A.; Scharth, M.

    2015-01-01

    We propose a general likelihood evaluation method for nonlinear non-Gaussian state-space models using the simulation-based method of efficient importance sampling. We minimize the simulation effort by replacing some key steps of the likelihood estimation procedure by numerical integration. We refer

  10. An extended sampling of the configurational space of HPr from E-coli

    NARCIS (Netherlands)

    de Groot, B.L.; Amadei, A; Scheek, R.M.; van Nuland, N.A.J.; Berendsen, H.J.C.

    Recently, we developed a method (Amadei et al., J. Biomol. Str. Dyn. 13: 815-626; de Groot et al., J. Biomol. Str. Dyn. 13: 741-751, 1996) to obtain an extended sampling of the configurational space of proteins, using an adapted form of molecular dynamics (MD) simulations, based on the essential

  11. Improved orientation sampling for indexing diffraction patterns of polycrystalline materials

    DEFF Research Database (Denmark)

    Larsen, Peter Mahler; Schmidt, Søren

    2017-01-01

    Orientation mapping is a widely used technique for revealing the microstructure of a polycrystalline sample. The crystalline orientation at each point in the sample is determined by analysis of the diffraction pattern, a process known as pattern indexing. A recent development in pattern indexing, while robust in the presence of noise, has very high computational requirements. In this article, the computational burden is reduced by developing a method for nearly optimal sampling of orientations. By using the quaternion representation of orientations, it is shown that the optimal sampling problem is equivalent to that of optimally distributing points on a four-dimensional sphere. In doing so, the number of orientation samples needed to achieve a desired indexing accuracy is significantly reduced. Orientation sets at a range of sizes are generated in this way for all Laue groups and are made available online for easy use.
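
    The quaternion picture described above can be made concrete with a small experiment: draw uniformly distributed unit quaternions (Shoemake's method, equivalent to uniform random rotations) and measure nearest-neighbour misorientation angles as a crude indicator of how finely SO(3) is covered. This random baseline only illustrates the representation; it is not the near-optimal deterministic orientation sets constructed in the article.

```python
# Uniform random unit quaternions and pairwise misorientation angles.
# A random baseline for orientation sampling, not the optimal point sets.
import numpy as np

rng = np.random.default_rng(4)

def random_unit_quaternions(n):
    """Uniform samples on S^3, i.e. uniform random 3-D rotations (Shoemake)."""
    u1, u2, u3 = rng.random((3, n))
    return np.stack([np.sqrt(1 - u1) * np.sin(2 * np.pi * u2),
                     np.sqrt(1 - u1) * np.cos(2 * np.pi * u2),
                     np.sqrt(u1) * np.sin(2 * np.pi * u3),
                     np.sqrt(u1) * np.cos(2 * np.pi * u3)], axis=1)

def misorientation_deg(q1, q2):
    """Rotation angle between two orientations (q and -q are identified)."""
    d = np.clip(np.abs(np.sum(q1 * q2, axis=-1)), 0.0, 1.0)
    return np.degrees(2 * np.arccos(d))

Q = random_unit_quaternions(2000)
# Nearest-neighbour misorientation: a proxy for how finely SO(3) is covered.
nn = []
for i in range(len(Q)):
    ang = misorientation_deg(Q[i], np.delete(Q, i, axis=0))
    nn.append(ang.min())
print("mean nearest-neighbour misorientation: %.2f deg" % np.mean(nn))
```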

  12. Auto-validating von Neumann rejection sampling from small phylogenetic tree spaces

    Directory of Open Access Journals (Sweden)

    York Thomas

    2009-01-01

    Background: In phylogenetic inference one is interested in obtaining samples from the posterior distribution over the tree space on the basis of some observed DNA sequence data. One of the simplest sampling methods is the rejection sampler due to von Neumann. Here we introduce an auto-validating version of the rejection sampler, via interval analysis, to rigorously draw samples from posterior distributions over small phylogenetic tree spaces. Results: The posterior samples from the auto-validating sampler are used to rigorously (i) estimate posterior probabilities for different rooted topologies based on mitochondrial DNA from human, chimpanzee and gorilla, (ii) conduct a non-parametric test of rate variation between protein-coding and tRNA-coding sites from three primates and (iii) obtain a posterior estimate of the human-neanderthal divergence time. Conclusion: This solves the open problem of rigorously drawing independent and identically distributed samples from the posterior distribution over rooted and unrooted small tree spaces (3 or 4 taxa) based on any multiply-aligned sequence data.
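
    For readers unfamiliar with the underlying sampler, the plain (non-validated) von Neumann rejection step looks like the sketch below: draw a proposal, then accept it with probability proportional to the target density under a bounding envelope. The target function and envelope constant here are arbitrary stand-ins; the auto-validating version in the article additionally encloses these quantities in interval arithmetic, which this floating-point sketch does not attempt.

```python
# Plain von Neumann rejection sampling from an unnormalised target density.
import math
import random

random.seed(5)

def target(x):
    """Unnormalised target on [0, 1] (an arbitrary stand-in posterior)."""
    return math.exp(-20 * (x - 0.3) ** 2) + 0.5 * math.exp(-50 * (x - 0.75) ** 2)

M = 1.05          # envelope constant: target(x) <= M * uniform pdf on [0, 1]
samples, proposals = [], 0
while len(samples) < 10_000:
    proposals += 1
    x = random.random()               # proposal from Uniform(0, 1)
    if random.random() * M <= target(x):
        samples.append(x)             # accepted with probability target(x) / M

print("acceptance rate: %.3f" % (len(samples) / proposals))
print("sample mean: %.3f" % (sum(samples) / len(samples)))
```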

  13. Space charge and steady state current in LDPE samples containing a permittivity/conductivity gradient

    DEFF Research Database (Denmark)

    Holbøll, Joachim; Bambery, K. R.; Fleming, R. J.

    2000-01-01

    Electromagnetic theory predicts that a dielectric sample in which a steady DC current of density j is flowing, and in which the ratio of permittivity ε to conductivity σ varies with position, will acquire a space charge density j·grad(ε/σ). A simple and convenient way to generate an ε/σ gradient in a homogeneous sample is to establish a temperature gradient across it. The resulting spatial variation in ε is usually small in polymeric insulators, but the variation in σ can be appreciable. Laser induced pressure pulse (LIPP) measurements were made on 1.5 mm thick plaques of ultra pure LDPE equipped with vacuum-evaporated aluminium electrodes. Temperature differences up to 27°C were maintained across the samples, which were subjected to DC fields up to 20 kV/mm. Current density was measured as a function of temperature and field. Negligible thermally generated space charge was observed. The charge
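
    For reference, the quoted expression for the thermally generated space charge follows from Gauss's law, Ohm's law and steady-state current continuity; a brief derivation (not part of the original abstract) reads:

```latex
% Derivation sketch of the space-charge expression used above.
\[
\nabla\cdot\mathbf{D}=\rho,\qquad
\mathbf{D}=\varepsilon\mathbf{E},\qquad
\mathbf{J}=\sigma\mathbf{E},\qquad
\nabla\cdot\mathbf{J}=0 \;\;\text{(steady state)}
\]
\[
\rho=\nabla\cdot\!\left(\varepsilon\,\mathbf{E}\right)
     =\nabla\cdot\!\left(\frac{\varepsilon}{\sigma}\,\mathbf{J}\right)
     =\mathbf{J}\cdot\nabla\!\left(\frac{\varepsilon}{\sigma}\right)
      +\frac{\varepsilon}{\sigma}\,\nabla\cdot\mathbf{J}
     =\mathbf{J}\cdot\nabla\!\left(\frac{\varepsilon}{\sigma}\right).
\]
```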

  14. A novel heterogeneous training sample selection method on space-time adaptive processing

    Science.gov (United States)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The ground target detection performance of space-time adaptive processing (STAP) degrades when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts the training sample selection into a convex optimization problem. Firstly, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance so as to reject the contaminated training samples. Thirdly, the cell under test (CUT) and the residual training samples are projected into the orthogonal subspace of the target in the CUT, and mean-Hausdorff distances between the projected CUT and training samples are calculated. Fourthly, the distances are sorted in order of value, and the training samples with the largest values are preferentially selected to realize the dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
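
    The generalized inner product (GIP) screen that the method above takes as its baseline can be sketched as follows: compute x_i^H R^{-1} x_i for every training snapshot against the covariance estimated from all snapshots, and discard snapshots whose GIP is anomalously large. The clutter model, contamination, dimensions, and threshold below are synthetic assumptions, and this is the conventional baseline rather than the proposed mean-Hausdorff selection itself.

```python
# Toy generalized-inner-product (GIP) screen for non-homogeneous STAP training
# data: snapshots with outlying GIP values are flagged as contaminated.
import numpy as np

rng = np.random.default_rng(6)
dim, n_train = 16, 60

# Homogeneous clutter snapshots share one covariance; a few are contaminated
# by an additional target-like rank-one component (assumed toy model).
L = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
R_true = L @ L.conj().T / dim + np.eye(dim)
chol = np.linalg.cholesky(R_true)
noise = rng.normal(size=(dim, n_train)) + 1j * rng.normal(size=(dim, n_train))
X = chol @ noise / np.sqrt(2)
steer = np.exp(1j * np.pi * np.arange(dim) * 0.4) / np.sqrt(dim)
contaminated = [3, 17, 42]
X[:, contaminated] += 12.0 * steer[:, None]      # inject target-like signals

# GIP of each snapshot against the covariance estimated from all snapshots.
R_hat = X @ X.conj().T / n_train
R_inv = np.linalg.inv(R_hat)
gip = np.real(np.einsum("ij,jk,ki->i", X.conj().T, R_inv, X))

# Keep the snapshots whose GIP lies below a simple data-driven threshold.
threshold = np.median(gip) * 1.5
selected = np.flatnonzero(gip < threshold)
print("flagged as contaminated:", sorted(set(range(n_train)) - set(selected)))
```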

  15. NASA Lunar Sample Education Disk Program - Space Rocks for Classrooms, Museums, Science Centers and Libraries

    Science.gov (United States)

    Allen, J. S.

    2009-12-01

    NASA is eager for students and the public to experience lunar Apollo rocks and regolith soils first hand. Lunar samples embedded in plastic are available for educators to use in their classrooms, museums, science centers, and public libraries for education activities and display. The sample education disks are valuable tools for engaging students in the exploration of the Solar System. Scientific research conducted on the Apollo rocks has revealed the early history of our Earth-Moon system. The rocks help educators make the connections to this ancient history of our planet as well as connections to the basic lunar surface processes - impact and volcanism. With these samples educators in museums, science centers, libraries, and classrooms can help students and the public understand the key questions pursued by missions to the Moon. The Office of the Curator at Johnson Space Center is in the process of reorganizing and renewing the Lunar and Meteorite Sample Education Disk Program to increase reach, security and accountability. The new program expands the reach of these exciting extraterrestrial rocks through increased access to training and educator borrowing. One of the expanded opportunities is that trained certified educators from science centers, museums, and libraries may now borrow the extraterrestrial rock samples. Previously the loan program was only open to classroom educators, so the expansion will increase public access to the samples and allow educators to make the critical connections of the rocks to the exciting exploration missions taking place in our solar system. Each Lunar Disk contains three lunar rocks and three regolith soils embedded in Lucite. The anorthosite sample is a part of the magma ocean formed on the surface of the Moon in the early melting period, the basalt is part of the extensive lunar mare lava flows, and the breccia sample is an important example of the violent impact history of the Moon. The disks also include two regolith soils and

  16. Grouped fuzzy SVM with EM-based partition of sample space for clustered microcalcification detection.

    Science.gov (United States)

    Wang, Huiya; Feng, Jun; Wang, Hongyu

    2017-07-20

    Detection of clustered microcalcification (MC) from mammograms plays an essential role in computer-aided diagnosis of early stage breast cancer. To tackle problems associated with the diversity of data structures of MC lesions and the variability of normal breast tissues, multi-pattern sample space learning is required. In this paper, a novel grouped fuzzy Support Vector Machine (SVM) algorithm with sample space partition based on Expectation-Maximization (EM) (called G-FSVM) is proposed for clustered MC detection. The diversified pattern of training data is partitioned into several groups based on the EM algorithm. Then a series of fuzzy SVMs are integrated for classification, with each group of samples from the MC lesions and normal breast tissues. From the DDSM database, a total of 1,064 suspicious regions are selected from 239 mammograms, and the measured Accuracy, True Positive Rate (TPR), False Positive Rate (FPR) and EVL = TPR*(1-FPR) are 0.82, 0.78, 0.14 and 0.72, respectively. The proposed method incorporates the merits of fuzzy SVM and multi-pattern sample space learning, decomposing the MC detection problem into a series of simple two-class classification problems. Experimental results from synthetic data and the DDSM database demonstrate that our integrated classification framework reduces the false positive rate significantly while maintaining the true positive rate.
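
    The partition-then-classify structure can be illustrated with scikit-learn building blocks: an EM-fitted Gaussian mixture splits the training set into groups, one SVM is trained per group, and a new sample is routed to the SVM of its most probable group. The fuzzy membership weighting of the actual G-FSVM algorithm is omitted, and the data, cluster count, and kernel settings below are invented for illustration.

```python
# Sketch of EM-based sample-space partition followed by per-group SVMs.
# Synthetic data; the fuzzy weighting of G-FSVM is intentionally left out.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(7)
centers = np.array([[0.0, 0.0], [4.0, 4.0], [0.0, 5.0]])
X = np.vstack([rng.normal(c, 1.0, size=(200, 2)) for c in centers])
cluster_id = np.repeat(np.arange(3), 200)
# A class boundary that changes from cluster to cluster (heterogeneous patterns).
y = np.where(cluster_id == 0, X[:, 0] > 0.0,
             np.where(cluster_id == 1, X[:, 1] > 4.0,
                      X[:, 0] + X[:, 1] > 5.0)).astype(int)

# 1) EM partition of the sample space.
gm = GaussianMixture(n_components=3, random_state=0).fit(X)
groups = gm.predict(X)

# 2) One SVM trained inside each EM group.
svms = {g: SVC(kernel="rbf", C=1.0, gamma="scale").fit(X[groups == g], y[groups == g])
        for g in np.unique(groups)}

# 3) Route a new sample to its most probable group and use that group's SVM.
def classify(x_new):
    x_new = np.asarray(x_new, float).reshape(1, -1)
    g = int(gm.predict(x_new)[0])
    return int(svms[g].predict(x_new)[0])

print("prediction for (4.2, 4.1):", classify([4.2, 4.1]))
print("prediction for (-1.0, 0.5):", classify([-1.0, 0.5]))
```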

  17. Performance samples on academic tasks : improving prediction of academic performance

    NARCIS (Netherlands)

    Tanilon, Jenny

    2011-01-01

    This thesis is about the development and validation of a performance-based test, labeled as Performance Samples on academic tasks in Education and Child Studies (PSEd). PSEd is designed to identify students who are most able to perform the academic tasks involved in an Education and Child Studies

  18. Sampling in Qualitative Research: Improving the Quality of ...

    African Journals Online (AJOL)

    Sampling consideration in qualitative research is very important, yet in practice this appears not to be given the prominence and the rigour it deserves among Higher Education researchers. Accordingly, the quality of research outcomes in Higher Education has suffered from low utilisation. This has motivated the production ...

  19. Improved sample capsule for determination of oxygen in hemolyzed blood

    Science.gov (United States)

    Malik, W. M.

    1967-01-01

    Sample capsule for determination of oxygen in hemolyzed blood consists of a measured section of polytetrafluoroethylene tubing equipped at each end with a connector and a stopcock valve. This method eliminates errors from air entrainment or from the use of mercury or syringe lubricant.

  20. Predictive Sampling of Rare Conformational Events in Aqueous Solution: Designing a Generalized Orthogonal Space Tempering Method.

    Science.gov (United States)

    Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei

    2016-01-12

    In aqueous solution, solute conformational transitions are governed by intimate interplays of the fluctuations of solute-solute, solute-water, and water-water interactions. To promote molecular fluctuations to enhance sampling of essential conformational changes, a common strategy is to construct an expanded Hamiltonian through a series of Hamiltonian perturbations and thereby broaden the distribution of certain interactions of focus. Due to a lack of active sampling of configuration response to Hamiltonian transitions, it is challenging for common expanded Hamiltonian methods to robustly explore solvent mediated rare conformational events. The orthogonal space sampling (OSS) scheme, as exemplified by the orthogonal space random walk and orthogonal space tempering methods, provides a general framework for synchronous acceleration of slow configuration responses. To more effectively sample conformational transitions in aqueous solution, in this work, we devised a generalized orthogonal space tempering (gOST) algorithm. Specifically, in the Hamiltonian perturbation part, a solvent-accessible-surface-area-dependent term is introduced to implicitly perturb near-solute water-water fluctuations; more importantly, in the orthogonal space response part, the generalized force order parameter is extended to a two-dimensional order parameter set, in which essential solute-solvent and solute-solute components are treated separately. The gOST algorithm is evaluated through a molecular dynamics simulation study on the explicitly solvated deca-alanine (Ala10) peptide. On the basis of a fully automated sampling protocol, the gOST simulation enabled repetitive folding and unfolding of the solvated peptide within a single continuous trajectory and allowed for detailed constructions of Ala10 folding/unfolding free energy surfaces. The gOST result reveals that solvent cooperative fluctuations play a pivotal role in Ala10 folding/unfolding transitions. In addition, our assessment

  1. Dynamic Sampling of Trace Contaminants During the Mission Operations Test of the Deep Space Habitat

    Science.gov (United States)

    Monje, Oscar; Valling, Simo; Cornish, Jim

    2013-01-01

    The atmospheric composition inside spacecraft during long duration space missions is dynamic due to changes in the living and working environment of crew members, crew metabolism, and payload operations. A portable FTIR gas analyzer was used to monitor the atmospheric composition within the Deep Space Habitat (DSH) during the Mission Operations Test (MOT) conducted at the Johnson Space Center (JSC). The FTIR monitored up to 20 gases in near-real time. The procedures developed for operating the FTIR were successful, and data were collected with the FTIR at 5 minute intervals. Not all of the 20 gases sampled were detected in all the modules, but it was possible to measure dynamic changes in trace contaminant concentrations that were related to crew activities involving exercise and meal preparation.

  2. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Science.gov (United States)

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
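
    A loose schematic of the weighted binary matrix sampling idea: every variable carries an inclusion probability, random binary sub-models are drawn from those probabilities, the sub-models are scored by cross-validated error, and the probabilities are refreshed from the best-performing fraction so the variable space shrinks from step to step. This paraphrases the general mechanism only; the synthetic data, model counts, and the plain least-squares scoring are assumptions, not details of the published VISSA algorithm.

```python
# Schematic weighted binary matrix sampling (WBMS) loop for variable selection.
# Synthetic regression data; scoring and update rules are simplified assumptions.
import numpy as np

rng = np.random.default_rng(8)
n_samples, n_vars, n_informative = 80, 30, 5
X = rng.normal(size=(n_samples, n_vars))
beta = np.zeros(n_vars)
beta[:n_informative] = rng.uniform(1, 3, n_informative)
y = X @ beta + rng.normal(0, 0.5, n_samples)

def cv_rmse(cols):
    """5-fold cross-validated RMSE of ordinary least squares on chosen columns."""
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, 5)
    err = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        A = X[np.ix_(train, cols)]
        coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        pred = X[np.ix_(f, cols)] @ coef
        err.append(np.mean((pred - y[f]) ** 2))
    return np.sqrt(np.mean(err))

weights = np.full(n_vars, 0.5)               # initial inclusion probabilities
for it in range(15):
    # Draw a weighted binary sub-model matrix (rows = candidate sub-models).
    B = rng.random((200, n_vars)) < weights
    B[:, weights >= 0.999] = True            # variables already "locked in"
    scores = np.array([cv_rmse(np.flatnonzero(row)) if row.any() else np.inf
                       for row in B])
    best = B[np.argsort(scores)[:20]]        # best 10% of sub-models
    weights = best.mean(axis=0)              # new inclusion frequencies

print("selected variables:", np.flatnonzero(weights > 0.5))
```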

  3. Improvements in PIXE analysis of hourly particulate matter samples

    Energy Technology Data Exchange (ETDEWEB)

    Calzolai, G., E-mail: calzolai@fi.infn.it [Department of Physics and Astronomy, University of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); National Institute of Nuclear Physics (INFN), Division of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Lucarelli, F. [Department of Physics and Astronomy, University of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); National Institute of Nuclear Physics (INFN), Division of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Chiari, M.; Nava, S. [National Institute of Nuclear Physics (INFN), Division of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Giannoni, M. [National Institute of Nuclear Physics (INFN), Division of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Department of Chemistry, University of Florence, Via della Lastruccia 3, 50019 Sesto Fiorentino (Italy); Carraresi, L. [Department of Physics and Astronomy, University of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); National Institute of Nuclear Physics (INFN), Division of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Prati, P. [Department of Physics, University of Genoa and INFN Division of Genoa, Via Dodecaneso 33, 16146 Genoa (Italy); Vecchi, R. [Department of Physics, Università degli Studi di Milano and INFN Division of Milan, Via Celoria 16, 20133 Milan (Italy)

    2015-11-15

    Most air quality studies on particulate matter (PM) are based on 24-h averaged data; however, many PM emissions as well as their atmospheric dilution processes change within a few hours. Samplings of PM with 1-h resolution can be performed by the streaker sampler (PIXE International Corporation), which is designed to separate the fine (aerodynamic diameter less than 2.5 μm) and the coarse (aerodynamic diameter between 2.5 and 10 μm) fractions of PM. These samples are efficiently analyzed by Particle Induced X-ray Emission (PIXE) at the LABEC laboratory of INFN in Florence (Italy), equipped with a 3 MV Tandetron accelerator, thanks to an optimized external-beam set-up, a convenient choice of the beam energy and suitable collecting substrates. A detailed description of the adopted set-up and results from a methodological study on the detection limits for the selection of the optimal beam energy are shown; the outcomes of the research on alternative collecting substrates, which produce a lower background during the measurements, and with lower contaminations, are also discussed.

  4. Improved gamma spectrometry of very low level radioactive samples

    International Nuclear Information System (INIS)

    Pineira, T.H.

    1989-01-01

    Today, many laboratories face the need to perform measurements of very low level activities using gamma spectroscopy. The techniques in use are identical to those applicable for higher levels of activities, but there is a need to use better adapted materials and modify the measurement conditions to minimize the background noise around the area. This paper presents the design of a very low level activity laboratory which has addressed the laboratory itself, the measuring chamber and the detector. The lab is constructed underground using specially selected materials of construction. The lab atmosphere is filtered and recycled with frequent changeovers. The rate of make-up fresh air is reduced and is sampled high above ground and filtered

  5. Improvement of correlated sampling Monte Carlo methods for reactivity calculations

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Asaoka, Takumi

    1978-01-01

    Two correlated Monte Carlo methods, the similar flight path and the identical flight path methods, have been improved to evaluate up to the second order change of the reactivity perturbation. Secondary fission neutrons produced by neutrons having passed through perturbed regions in both unperturbed and perturbed systems are followed in a way to have a strong correlation between secondary neutrons in both the systems. These techniques are incorporated into the general purpose Monte Carlo code MORSE, so as to be able to estimate also the statistical error of the calculated reactivity change. The control rod worths measured in the FCA V-3 assembly are analyzed with the present techniques, which are shown to predict the measured values within the standard deviations. The identical flight path method has revealed itself more useful than the similar flight path method for the analysis of the control rod worth. (auth.)

  6. Extended phase-space methods for enhanced sampling in molecular simulations: a review

    Directory of Open Access Journals (Sweden)

    Hiroshi eFujisaki

    2015-09-01

    Molecular Dynamics simulations are a powerful approach to study biomolecular conformational changes or protein-ligand, protein-protein and protein-DNA/RNA interactions. Straightforward applications, however, are often hampered by incomplete sampling, since in a typical simulated trajectory the system will spend most of its time trapped by high energy barriers in restricted regions of the configuration space. Over the years, several techniques have been designed to overcome this problem and enhance phase-space sampling. Here, we review a class of methods that rely on the idea of extending the set of dynamical variables of the system by adding extra ones associated to functions describing the process under study. In particular, we illustrate the Temperature Accelerated Molecular Dynamics (TAMD), Logarithmic Mean Force Dynamics (LogMFD), and Multiscale Enhanced Sampling (MSES) algorithms. We also discuss combinations with techniques for searching reaction paths. We show the advantages presented by this approach and how it allows one to quickly sample important regions of the free energy landscape via automatic exploration.

  7. Space density and clustering properties of a new sample of emission-line galaxies

    International Nuclear Information System (INIS)

    Wasilewski, A.J.

    1982-01-01

    A moderate-dispersion objective-prism survey for low-redshift emission-line galaxies has been carried out in an 825 sq. deg. region of sky with the Burrell Schmidt telescope of Case Western Reserve University. A 4° prism (300 Å/mm at Hβ) was used with the IIIa-J emulsion to show that a new sample of emission-line galaxies is available even in areas already searched with the excess uv-continuum technique. The new emission-line galaxies occur quite commonly in systems with peculiar morphology indicating gravitational interaction with a close companion or other disturbance. About 10 to 15% of the sample are Seyfert galaxies. It is suggested that tidal interactions involving matter infall play a significant role in the generation of an emission-line spectrum. The space density of the new galaxies is found to be similar to the space density of the Markarian galaxies. Like the Markarian sample, the galaxies in the present survey represent about 10% of all galaxies in the absolute magnitude range M_p = -16 to -22. The observations also indicate that current estimates of dwarf galaxy space densities may be too low. The clustering properties of the new galaxies have been investigated using two approaches: cluster contour maps and the spatial correlation function. These tests suggest that there is weak clustering and possibly superclustering within the sample itself and that the galaxies considered here are about as common in clusters of ordinary galaxies as in the field.

  8. Galaxy power-spectrum responses and redshift-space super-sample effect

    Science.gov (United States)

    Li, Yin; Schmittfull, Marcel; Seljak, Uroš

    2018-02-01

    As a major source of cosmological information, galaxy clustering is susceptible to long-wavelength density and tidal fluctuations. These long modes modulate the growth and expansion rate of local structures, shifting them in both amplitude and scale. These effects are often named the growth and dilation effects, respectively. In particular the dilation shifts the baryon acoustic oscillation (BAO) peak and breaks the assumption of the Alcock-Paczynski (AP) test. This cannot be removed with reconstruction techniques because the effect originates from long modes outside the survey. In redshift space, the long modes generate a large-scale radial peculiar velocity that affects the redshift-space distortion (RSD) signal. We compute the redshift-space response functions of the galaxy power spectrum to long density and tidal modes at leading order in perturbation theory, including both the growth and dilation terms. We validate these response functions against measurements from simulated galaxy mock catalogs. As one application, long density and tidal modes beyond the scale of a survey correlate various observables leading to an excess error known as the super-sample covariance, and thus weaken their constraining power. We quantify the super-sample effect on BAO, AP, and RSD measurements, and study its impact on current and future surveys.

  9. The Space Situational Assessment Report to Improve Public Awareness in China

    Science.gov (United States)

    Li, Hongbo; Zhang, Qi; Xie, Zebing; Wei, Xiangwang; Wang, Tao

    To improve public awareness of the impact of space activities in China, a Space Situational Assessment Report 2013 will be issued in March 2014. More than ten major Chinese media outlets have been invited to a special press conference. The Space Situational Assessment Report aims to introduce international space activities to the Chinese public and to provide a common, comprehensive knowledge base to support the development of national policies and international security cooperation in outer space. The full report organizes international space activities through 2013 into three parts - Foundations, Strategies and Environment - comprising nine chapters, such as Space laws and policies; Space facility and equipment; Institutions and Human Resources; Military space, Civil space and Commercial space; Natural space environment; Space situational awareness, etc. A Space Situational Assessment Index System is presented as a globally-focused analytic framework that defines, measures, and ranks national space activity. To serve a variety of public themes, different assessment indexes are constructed from scores of individual qualitative and quantitative metrics based on the Index System. Three research organizations in space science and technology collaborated on the Space Situational Assessment Report. It is a scholarly and non-governmental work.

  10. Lunar and Meteorite Sample Education Disk Program - Space Rocks for Classrooms, Museums, Science Centers, and Libraries

    Science.gov (United States)

    Allen, Jaclyn; Luckey, M.; McInturff, B.; Huynh, P.; Tobola, K.; Loftin, L.

    2010-01-01

    NASA is eager for students and the public to experience lunar Apollo samples and meteorites first hand. Lunar rocks and soil, embedded in Lucite disks, are available for educators to use in their classrooms, museums, science centers, and public libraries for education activities and display. The sample education disks are valuable tools for engaging students in the exploration of the Solar System. Scientific research conducted on the Apollo rocks reveals the early history of our Earth-Moon system, and meteorites reveal much of the history of the early solar system. The rocks help educators make the connections to this ancient history of our planet and solar system and to the basic processes of accretion, differentiation, impact and volcanism. With these samples, educators in museums, science centers, libraries, and classrooms can help students and the public understand the key questions pursued by many NASA planetary missions. The Office of the Curator at Johnson Space Center is in the process of reorganizing and renewing the Lunar and Meteorite Sample Education Disk Program to increase reach, security and accountability. The new program expands the reach of these exciting extraterrestrial rocks through increased access to training and educator borrowing. One of the expanded opportunities is that trained certified educators from science centers, museums, and libraries may now borrow the extraterrestrial rock samples. Previously the loan program was only open to classroom educators, so the expansion will increase public access to the samples and allow educators to make the critical connections to the exciting exploration missions taking place in our solar system. Each Lunar Disk contains three lunar rocks and three regolith soils embedded in Lucite. The anorthosite sample is a part of the magma ocean formed on the surface of the Moon in the early melting period, the basalt is part of the extensive lunar mare lava flows, and the breccia sample is an important example of the

  11. Endodontic pathogens causing deep neck space infections: clinical impact of different sampling techniques and antibiotic susceptibility.

    Science.gov (United States)

    Poeschl, Paul W; Crepaz, Valentina; Russmueller, Guenter; Seemann, Rudolf; Hirschl, Alexander M; Ewers, Rolf

    2011-09-01

    The aims of the present study were to compare microbial populations in patients suffering from deep neck space abscesses caused by primary endodontic infections by sampling the infections with aspiration or swabbing techniques and to determine the susceptibility rates of the isolated bacteria to commonly used antibiotics. A total of 89 patients with deep neck space abscesses caused by primary endodontic infections requiring extraoral incision and drainage under general anesthesia were included. Either aspiration or swabbing was used to sample microbial pus specimens. The culture of the microbial specimens and susceptibility testing were performed following standard procedures. A total of 142 strains were recovered from 76 patients. In 13 patients, no bacteria were found. The predominant bacteria observed were streptococci (36%), staphylococci (13%), Prevotella (8%), and Peptostreptococcus (6%). A statistically significant greater number of obligate anaerobes were found in the aspiration group. The majority of patients presented a mixed aerobic-anaerobic population of bacterial flora (62%). The antibiotic resistance rates for the predominant bacteria were 10% for penicillin G, 9% for amoxicillin, 0% for amoxicillin clavulanate, 24% for clindamycin, and 24% for erythromycin. The results of our study indicated that a greater number of anaerobes were found when sampling using the aspiration technique. Penicillin G and aminopenicillins alone are not always sufficient for the treatment of severe deep neck space abscesses; beta-lactamase inhibitor combinations are more effective. Bacteria showed significant resistant rates to clindamycin. Thus, its single use in penicillin-allergic patients has to be carefully considered. Copyright © 2011 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  12. An improved standard total dose test for CMOS space electronics

    International Nuclear Information System (INIS)

    Fleetwood, D.M.; Winokur, P.S.; Riewe, L.C.; Pease, R.L.

    1989-01-01

    The postirradiation response of hardened and commercial CMOS devices is investigated as a function of total dose, dose rate, and annealing time and temperature. Cobalt-60 irradiation at ≅ 200 rad(SiO₂)/s followed by a 1-week 100 degrees C biased anneal and testing is shown to be an effective screen of hardened devices for space use. However, a similar screen and single-point test performed after Co-60 irradiation and elevated temperature anneal cannot be generally defined for commercial devices. In the absence of detailed knowledge about device and circuit radiation response, a two-point standard test is proposed to ensure space survivability of CMOS circuits: a Co-60 irradiation and test to screen against oxide-trapped charge related failures, and an additional rebound test to screen against interface-trap related failures. Testing implications for bipolar technologies are also discussed

  13. Goddard Technology Efforts to Improve Space Borne Laser Reliability

    Science.gov (United States)

    Heaps, William S.

    2006-01-01

    In an effort to reduce the risk, perceived and actual, of employing instruments containing space borne lasers, NASA initiated the Laser Risk Reduction Program (LRRP) in 2001. This program, managed jointly by NASA Langley and NASA Goddard and employing laser researchers from government, university and industrial labs, is nearing the conclusion of its planned 5-year duration. This paper will describe some of the efforts and results obtained by the Goddard half of the program.

  14. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    OpenAIRE

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through T...

  15. Improvements in Space Geodesy Data Discovery at the CDDIS

    Science.gov (United States)

    Noll, C.; Pollack, N.; Michael, P.

    2011-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. Several activities are currently under development at the CDDIS to aid users in data discovery, both within the current community and beyond. The CDDIS is cooperating in the development of Geodetic Seamless Archive Centers (GSAC) with colleagues at UNAVCO and SIO. This activity will provide web services to facilitate data discovery within and across participating archives. In addition, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. These enhancements will permit information about CDDIS archive holdings to be made available through other data portals such as the Earth Observing System (EOS) Clearinghouse (ECHO) and integration into the Global Geodetic Observing System (GGOS) portal.

  16. Improving primary health care facility performance in Ghana: efficiency analysis and fiscal space implications.

    Science.gov (United States)

    Novignon, Jacob; Nonvignon, Justice

    2017-06-12

    Health centers in Ghana play an important role in health care delivery, especially in deprived communities. They usually serve as the first line of service and meet basic health care needs. Unfortunately, these facilities are faced with inadequate resources. While health policy makers seek to increase resources committed to primary healthcare, it is important to understand the nature of inefficiencies that exist in these facilities. Therefore, the objectives of this study are threefold: (i) estimate efficiency among primary health facilities (health centers), (ii) examine the potential fiscal space from improved efficiency, and (iii) investigate the efficiency disparities in public and private facilities. Data were from the 2015 Access Bottlenecks, Cost and Equity (ABCE) project conducted by the Institute for Health Metrics and Evaluation. Stochastic Frontier Analysis (SFA) was used to estimate the efficiency of health facilities. Efficiency scores were then used to compute potential savings from improved efficiency. Outpatient visits were used as the output, while the number of personnel, hospital beds, and expenditure on other capital items and administration were used as inputs. Disparities in efficiency between public and private facilities were estimated using the Nopo matching decomposition procedure. The average efficiency score across all health centers included in the sample was estimated to be 0.51. Average efficiency was estimated to be about 0.65 and 0.50 for private and public facilities, respectively. Significant disparities in efficiency were identified across the various administrative regions. With regard to potential fiscal space, we found that, on average, facilities could save about GH₵11,450.70 (US$7633.80) if efficiency was improved. We also found that fiscal space from efficiency gains varies across rural/urban as well as private/public facilities, if best practices are followed. The matching decomposition showed an efficiency gap of 0.29 between private and public facilities.

  17. Feature-space transformation improves supervised segmentation across scanners

    DEFF Research Database (Denmark)

    van Opbroek, Annegreet; Achterberg, Hakim C.; de Bruijne, Marleen

    2015-01-01

    Image-segmentation techniques based on supervised classification generally perform well on the condition that training and test samples have the same feature distribution. However, if training and test images are acquired with different scanners or scanning parameters, their feature distributions...

  18. HI-STAR. Health Improvements Through Space Technologies and Resources: Final Report

    Science.gov (United States)

    Finarelli, Margaret G.

    2002-01-01

    The purpose of this document is to describe a global strategy to integrate the use of space technology in the fight against malaria. Given the well-documented relationship between the vector and its environment, and the ability of existing space technologies to monitor environmental factors, malaria is a strong candidate for the application of space technology. The concept of a malaria early warning system has been proposed in the past, and pilot studies have been conducted. The HI-STAR project (Health Improvement through Space Technologies and Resources) seeks to build on this concept and enhance the space elements of the suggested framework. As such, the mission statement for this International Space University design project has been defined as follows: "Our mission is to develop and promote a global strategy to help combat malaria using space technology". A general overview of malaria, aspects of how space technology can be useful, and an outline of the HI-STAR strategy are presented.

  19. Axiomatic method of partitions in the theory of Noebeling spaces. I. Improvement of partition connectivity

    International Nuclear Information System (INIS)

    Ageev, S M

    2007-01-01

    The Noebeling space N_k^{2k+1}, a k-dimensional analogue of the Hilbert space, is considered; this is a topologically complete separable (that is, Polish) k-dimensional absolute extensor in dimension k (that is, AE(k)) and a strongly k-universal space. The conjecture that the above-listed properties characterize the Noebeling space N_k^{2k+1} in an arbitrary finite dimension k is proved. In the first part of the paper a full axiom system of the Noebeling spaces is presented and the problem of the improvement of a partition connectivity is solved on its basis. Bibliography: 29 titles.

  20. Improved Sampling Algorithms in the Risk-Informed Safety Margin Characterization Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The RISMC approach is developing an advanced set of methodologies and algorithms to perform Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes coupled with stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) the timing and sequencing of events and the internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. Applied to complex systems such as nuclear power plants, this approach requires a series of computationally expensive simulation runs over a large set of uncertain parameters. These types of analysis are affected by two issues. First, the space of possible solutions (a.k.a. the issue space or the response surface) can be sampled only very sparsely, which precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Second, large amounts of data are generated, and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and in particular employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we present how adaptive sampling can be performed using the RISMC toolkit and highlight the advantages compared with more classical sampling approaches such as Monte Carlo. We employ RAVEN to perform these statistical analyses, using both analytical cases and another RISMC code: RELAP-7.
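    The adaptive loop described above (build a cheap surrogate from the samples already run, score candidate points, then run the expensive code only at the most informative one) can be illustrated with a minimal sketch. The surrogate, the scoring rule, and the toy response function f below are illustrative placeholders, not the RISMC/RAVEN implementation:

        # Minimal sketch of surrogate-guided adaptive sampling (all names are
        # illustrative assumptions; this is not the RAVEN interface). f() stands in
        # for an expensive simulator run such as a RELAP-7 calculation.
        import numpy as np

        def f(x):
            return np.sin(3 * x) + 0.5 * x

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 2.0, size=5)              # initial space-filling samples
        y = np.array([f(x) for x in X])

        for _ in range(10):                            # adaptive refinement loop
            candidates = np.linspace(0.0, 2.0, 201)
            order = np.argsort(X)
            pred = np.interp(candidates, X[order], y[order])   # cheap surrogate
            # Exploration term: distance to the nearest existing sample.
            dist = np.min(np.abs(candidates[:, None] - X[None, :]), axis=1)
            # Exploitation term: local variation of the surrogate response.
            grad = np.abs(np.gradient(pred, candidates))
            score = dist * (1.0 + grad)
            x_new = candidates[np.argmax(score)]       # most informative next run
            X, y = np.append(X, x_new), np.append(y, f(x_new))

        print(f"{len(X)} samples placed, densest where the response changes fastest")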

  1. Improved Sampling Algorithms in the Risk-Informed Safety Margin Characterization Toolkit

    International Nuclear Information System (INIS)

    Mandelli, Diego; Smith, Curtis Lee; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua Joseph

    2015-01-01

    The RISMC approach is developing an advanced set of methodologies and algorithms to perform Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes coupled with stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) the timing and sequencing of events and the internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. Applied to complex systems such as nuclear power plants, this approach requires a series of computationally expensive simulation runs over a large set of uncertain parameters. These types of analysis are affected by two issues. First, the space of possible solutions (a.k.a. the issue space or the response surface) can be sampled only very sparsely, which precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Second, large amounts of data are generated, and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and in particular employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we present how adaptive sampling can be performed using the RISMC toolkit and highlight the advantages compared with more classical sampling approaches such as Monte Carlo. We employ RAVEN to perform these statistical analyses, using both analytical cases and another RISMC code: RELAP-7.

  2. Space weather at Low Latitudes: Considerations to improve its forecasting

    Science.gov (United States)

    Chau, J. L.; Goncharenko, L.; Valladares, C. E.; Milla, M. A.

    2013-05-01

    In this work we present a summary of space weather events that are unique to low-latitude regions. Special emphasis will be devoted to events that occur during so-called magnetically quiet conditions. One of these events is the occurrence of nighttime F-region irregularities, also known as Equatorial Spread F (ESF). When such irregularities occur, navigation and communication systems are disrupted or perturbed. After more than 70 years of studies, many features of ESF irregularities (climatology, physical mechanisms, longitudinal dependence, time dependence, etc.) are well known, but so far they cannot be forecast on time scales of minutes to hours. We present a summary of some of these features and some of the efforts being conducted to contribute to their forecasting. In addition to ESF, we have recently identified a clear connection between lower atmospheric forcing and low-latitude variability, particularly during so-called sudden stratospheric warming (SSW) events. During SSW events and magnetically quiet conditions, we have observed changes in total electron content (TEC) that are comparable to changes that occur during strongly magnetically disturbed conditions. We present results from recent events as well as outline potential efforts to forecast the ionospheric effects during these events.

  3. Improved Rhenium Thrust Chambers for In-Space Propulsion, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Radiation-cooled, bipropellant thrust chambers are being considered for the ascent/descent engines and reaction control systems for NASA missions such as Mars Sample...

  4. Discrete- vs. Continuous-Time Modeling of Unequally Spaced Experience Sampling Method Data

    Directory of Open Access Journals (Sweden)

    Silvia de Haan-Rietdijk

    2017-10-01

    Full Text Available The Experience Sampling Method is a common approach in psychological research for collecting intensive longitudinal data with high ecological validity. One characteristic of ESM data is that it is often unequally spaced, because the measurement intervals within a day are deliberately varied, and measurement continues over several days. This poses a problem for discrete-time (DT) modeling approaches, which are based on the assumption that all measurements are equally spaced. Nevertheless, DT approaches such as (vector) autoregressive modeling are often used to analyze ESM data, for instance in the context of affective dynamics research. There are equivalent continuous-time (CT) models, but they are more difficult to implement. In this paper we take a pragmatic approach and evaluate the practical relevance of the violated model assumption in DT AR(1) and VAR(1) models, for the N = 1 case. We use simulated data under an ESM measurement design to investigate the bias in the parameters of interest under four different model implementations, ranging from the true CT model that accounts for all the exact measurement times, to the crudest possible DT model implementation, where even the nighttime is treated as a regular interval. An analysis of empirical affect data illustrates how the differences between DT and CT modeling can play out in practice. We find that the size and the direction of the bias in DT (V)AR models for unequally spaced ESM data depend quite strongly on the true parameter in addition to data characteristics. Our recommendation is to use CT modeling whenever possible, especially now that new software implementations have become available.
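    A minimal sketch of the issue the study evaluates: in a continuous-time first-order (Ornstein-Uhlenbeck) process, the effective autoregressive coefficient depends on the interval length as phi(dt) = exp(-theta*dt), so a DT AR(1) model that forces a single phi onto unequal intervals is misspecified. The parameter values and measurement times below are illustrative assumptions, not taken from the paper:

        # Ornstein-Uhlenbeck (continuous-time AR(1)) process observed at unequally
        # spaced ESM-style times; theta, sigma, mu and the times are illustrative.
        import numpy as np

        theta, sigma, mu = 0.5, 1.0, 0.0               # CT drift, diffusion, mean
        times = np.array([0.0, 1.5, 3.0, 6.5, 9.0, 22.0, 24.5, 27.0])  # hours, with an overnight gap
        rng = np.random.default_rng(1)

        x = [mu]
        for dt in np.diff(times):
            phi = np.exp(-theta * dt)                  # interval-specific AR coefficient
            innov_sd = sigma * np.sqrt((1 - phi**2) / (2 * theta))
            x.append(mu + phi * (x[-1] - mu) + innov_sd * rng.standard_normal())

        for t, v in zip(times, x):
            print(f"t = {t:5.1f} h, x = {v:+.3f}")
        # A DT AR(1) fit that ignores dt forces one phi onto all intervals, which
        # biases the estimated dynamics when gaps differ by an order of magnitude.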

  5. Improving KPCA Online Extraction by Orthonormalization in the Feature Space.

    Science.gov (United States)

    Souza Filho, Joao B O; Diniz, Paulo S R

    2018-04-01

    Recently, some online kernel principal component analysis (KPCA) techniques based on the generalized Hebbian algorithm (GHA) were proposed for use in large data sets, defining kernel components using concise dictionaries automatically extracted from data. This brief proposes two new online KPCA extraction algorithms, exploiting orthogonalized versions of the GHA rule. In both cases, the orthogonalization of kernel components is achieved by adding some low-complexity steps to the kernel Hebbian algorithm, thus not substantially affecting the computational cost of the algorithm. Results show improved convergence speed and accuracy of the components extracted by the proposed methods, as compared with the state-of-the-art online KPCA extraction algorithms.
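    For readers unfamiliar with the GHA rule the brief builds on, the following sketch shows the plain linear-PCA version of the update together with an explicit re-orthonormalization step. It only illustrates the underlying rule; the kernelized, dictionary-based algorithms of the brief are not reproduced here, and all data and parameter choices are arbitrary:

        # Linear-PCA GHA (Sanger's rule) with re-orthonormalization after each update;
        # data, learning rate, and dimensions are arbitrary illustrations.
        import numpy as np

        rng = np.random.default_rng(2)
        d, k, n = 10, 3, 5000
        variances = np.linspace(5.0, 0.5, d)           # known principal-axis variances
        X = rng.standard_normal((n, d)) * np.sqrt(variances)

        W = rng.standard_normal((k, d)) * 0.1          # rows approximate principal axes
        eta = 5e-4
        for x in X:
            y = W @ x
            W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)   # GHA update
            W = np.linalg.qr(W.T)[0].T                 # low-cost orthonormalization step

        print("variance along learned components:", np.round(np.var(X @ W.T, axis=0), 2))
        print("top data variances               :", np.round(variances[:k], 2))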

  6. The Hubble Space Telescope Medium Deep Survey Cluster Sample: Methodology and Data

    Science.gov (United States)

    Ostrander, E. J.; Nichol, R. C.; Ratnatunga, K. U.; Griffiths, R. E.

    1998-12-01

    We present a new, objectively selected, sample of galaxy overdensities detected in the Hubble Space Telescope Medium Deep Survey (MDS). These clusters/groups were found using an automated procedure that involved searching for statistically significant galaxy overdensities. The contrast of the clusters against the field galaxy population is increased when morphological data are used to search around bulge-dominated galaxies. In total, we present 92 overdensities above a probability threshold of 99.5%. We show, via extensive Monte Carlo simulations, that at least 60% of these overdensities are likely to be real clusters and groups and not random line-of-sight superpositions of galaxies. For each overdensity in the MDS cluster sample, we provide a richness and the average of the bulge-to-total ratio of galaxies within each system. This MDS cluster sample potentially contains some of the most distant clusters/groups ever detected, with about 25% of the overdensities having estimated redshifts z > ~0.9. We have made this sample publicly available to facilitate spectroscopic confirmation of these clusters and help more detailed studies of cluster and galaxy evolution. We also report the serendipitous discovery of a new cluster close on the sky to the rich optical cluster Cl 0016+16 at z = 0.546. This new overdensity, HST 001831+16208, may be coincident with both an X-ray source and a radio source. HST 001831+16208 is the third cluster/group discovered near to Cl 0016+16 and appears to strengthen the claims of Connolly et al. of superclustering at high redshift.

  7. Evaluation and Improvement of Lighting Efficiency in Working Spaces

    Directory of Open Access Journals (Sweden)

    Ana Castillo-Martinez

    2018-04-01

    Full Text Available Lighting is an essential element of modern life, promoting a sense of wellbeing for users. However, bad illumination may produce health problems such as headaches and fatigue, as well as other vision problems. For that reason, this paper proposes the development of a smartphone-based application to help in lighting evaluation, to guarantee compliance with illumination regulations and to help increase lighting efficiency, reducing energy consumption. To perform this evaluation, the smartphone can be used as a lighting measurement tool, evaluating those measurements through a rule-based intelligent agent capable of guiding the decision-making process. As a result, this tool allows the evaluation of the real working environment to guarantee lighting requirements, helping to prevent health problems derived from bad illumination and improving lighting efficiency at the same time.

  8. Use the Bar Code System to Improve Accuracy of the Patient and Sample Identification.

    Science.gov (United States)

    Chuang, Shu-Hsia; Yeh, Huy-Pzu; Chi, Kun-Hung; Ku, Hsueh-Chen

    2018-01-01

    Timely and correct sample collection is highly related to patient safety. The sample error rate was 11.1% during January to April 2016, owing to mislabeled patient information and the use of wrong sample containers. We developed a bar code "Specimens Identify System" through process reengineering of TRM, using bar code scanners, adding sample container instructions, and a mobile app. In conclusion, the bar code system improved patient safety and created a green environment.

  9. The Lyman alpha reference sample. II. Hubble space telescope imaging results, integrated properties, and trends

    Energy Technology Data Exchange (ETDEWEB)

    Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Melinder, Jens; Rivera-Thorsen, Thøger [Department of Astronomy, Oskar Klein Centre, Stockholm University, AlbaNova University Centre, SE-106 91 Stockholm (Sweden); Adamo, Angela [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Schaerer, Daniel [Université de Toulouse, UPS-OMP, IRAP, F-31000 Toulouse (France); Verhamme, Anne; Orlitová, Ivana [Geneva Observatory, University of Geneva, 51 Chemin des Maillettes, CH-1290 Versoix (Switzerland); Mas-Hesse, J. Miguel; Otí-Floranes, Héctor [Centro de Astrobiología (CSIC-INTA), Departamento de Astrofísica, P.O. Box 78, E-28691 Villanueva de la Cañada (Spain); Cannon, John M.; Pardy, Stephen [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Atek, Hakim [Laboratoire dAstrophysique, École Polytechnique Fédérale de Lausanne (EPFL), Observatoire, CH-1290 Sauverny (Switzerland); Kunth, Daniel [Institut d' Astrophysique de Paris, UMR 7095, CNRS and UPMC, 98 bis Bd Arago, F-75014 Paris (France); Laursen, Peter [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, DK-2100 Copenhagen (Denmark); Herenz, E. Christian, E-mail: matthew@astro.su.se [Leibniz-Institut für Astrophysik (AIP), An der Sternwarte 16, D-14482 Potsdam (Germany)

    2014-02-10

    We report new results regarding the Lyα output of galaxies, derived from the Lyman Alpha Reference Sample, and focused on Hubble Space Telescope imaging. For 14 galaxies we present intensity images in Lyα, Hα, and UV, and maps of Hα/Hβ, Lyα equivalent width (EW), and Lyα/Hα. We present Lyα and UV radial light profiles and show they are well-fitted by Sérsic profiles, but Lyα profiles show indices systematically lower than those of the UV (n ≈ 1-2 instead of ≳ 4). This reveals a general lack of the central concentration in Lyα that is ubiquitous in the UV. Photometric growth curves increase more slowly for Lyα than the far ultraviolet, showing that small apertures may underestimate the EW. For most galaxies, however, flux and EW curves flatten by radii ≈10 kpc, suggesting that if placed at high-z only a few of our galaxies would suffer from large flux losses. We compute global properties of the sample in large apertures, and show total Lyα luminosities to be independent of all other quantities. Normalized Lyα throughput, however, shows significant correlations: escape is found to be higher in galaxies of lower star formation rate, dust content, mass, and nebular quantities that suggest harder ionizing continuum and lower metallicity. Six galaxies would be selected as high-z Lyα emitters, based upon their luminosity and EW. We discuss the results in the context of high-z Lyα and UV samples. A few galaxies have EWs above 50 Å, and one shows f_esc^Lyα of 80%; such objects have not previously been reported at low-z.
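    For reference, the Sérsic profile used in such light-profile fits has the standard form below. This is the general definition only; the paper-specific fitted parameters are not reproduced here:

        % Standard Sersic surface-brightness profile: I_e is the intensity at the
        % effective (half-light) radius R_e, n is the Sersic index, and b_n is set
        % so that R_e encloses half of the total light (b_n ~ 2n - 1/3 for n >~ 1).
        I(R) = I_e \exp\left\{ -b_n \left[ \left( \frac{R}{R_e} \right)^{1/n} - 1 \right] \right\}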

  10. The Lyman alpha reference sample. II. Hubble space telescope imaging results, integrated properties, and trends

    International Nuclear Information System (INIS)

    Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Melinder, Jens; Rivera-Thorsen, Thøger; Adamo, Angela; Schaerer, Daniel; Verhamme, Anne; Orlitová, Ivana; Mas-Hesse, J. Miguel; Otí-Floranes, Héctor; Cannon, John M.; Pardy, Stephen; Atek, Hakim; Kunth, Daniel; Laursen, Peter; Herenz, E. Christian

    2014-01-01

    We report new results regarding the Lyα output of galaxies, derived from the Lyman Alpha Reference Sample, and focused on Hubble Space Telescope imaging. For 14 galaxies we present intensity images in Lyα, Hα, and UV, and maps of Hα/Hβ, Lyα equivalent width (EW), and Lyα/Hα. We present Lyα and UV radial light profiles and show they are well-fitted by Sérsic profiles, but Lyα profiles show indices systematically lower than those of the UV (n ≈ 1-2 instead of ≳ 4). This reveals a general lack of the central concentration in Lyα that is ubiquitous in the UV. Photometric growth curves increase more slowly for Lyα than the far ultraviolet, showing that small apertures may underestimate the EW. For most galaxies, however, flux and EW curves flatten by radii ≈10 kpc, suggesting that if placed at high-z only a few of our galaxies would suffer from large flux losses. We compute global properties of the sample in large apertures, and show total Lyα luminosities to be independent of all other quantities. Normalized Lyα throughput, however, shows significant correlations: escape is found to be higher in galaxies of lower star formation rate, dust content, mass, and nebular quantities that suggest harder ionizing continuum and lower metallicity. Six galaxies would be selected as high-z Lyα emitters, based upon their luminosity and EW. We discuss the results in the context of high-z Lyα and UV samples. A few galaxies have EWs above 50 Å, and one shows f_esc^Lyα of 80%; such objects have not previously been reported at low-z.

  11. IMPROVEMENT OF METHODS FOR HYDROBIOLOGICAL RESEARCH AND MODIFICATION OF STANDARD TOOLS FOR SAMPLE COLLECTION

    Directory of Open Access Journals (Sweden)

    M. M. Aligadjiev

    2015-01-01

    Full Text Available Aim. The paper discusses the improvement of methods of hydrobiological studies by modifying tools for collecting plankton and benthic samples. Methods. In order to improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice of collecting hydrobiological samples in the Caspian Sea shows that the sampling tools used to collect hydrobiological material require modernization. With the introduction of the invasive Azov and Black Sea comb jelly Mnemiopsis leidyi A. Agassiz to the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because of the lack of visual site selection for sampling. Moreover, when sampling with a dredge, samples may be lost, especially in areas with difficult terrain. Conclusion. We propose to modify a small model of the Upstein net (applied in shallow water) to collect zooplankton samples with an upper inverted cone, which will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography, and by using sensors to determine the tilt of the dredge and the position of the valves of the bucket.

  12. Towards an exhaustive sampling of the configurational spaces of the two forms of the peptide hormone guanylin

    NARCIS (Netherlands)

    de Groot, B.L.; Amadei, A; van Aalten, D.M.F.; Berendsen, H.J.C.

    The recently introduced Essential Dynamics sampling method is extended such that an exhaustive sampling of the available (backbone) configurational space can be achieved. From an initial Molecular Dynamics simulation an approximated definition of the essential subspace is obtained. This subspace is

  13. Using spatiotemporal models and distance sampling to map the space use and abundance of newly metamorphosed Western Toads (Anaxyrus boreas)

    Science.gov (United States)

    Chelgren, Nathan D.; Samora, Barbara; Adams, Michael J.; McCreary, Brome

    2011-01-01

    High variability in abundance, cryptic coloration, and small body size of newly metamorphosed anurans have limited demographic studies of this life-history stage. We used line-transect distance sampling and Bayesian methods to estimate the abundance and spatial distribution of newly metamorphosed Western Toads (Anaxyrus boreas) in terrestrial habitat surrounding a montane lake in central Washington, USA. We completed 154 line-transect surveys from the commencement of metamorphosis (15 September 2009) to the date of first snow accumulation in fall (1 October 2009), and located 543 newly metamorphosed toads. After accounting for variable detection probability associated with the extent of barren habitats, estimates of total surface abundance ranged from a posterior median of 3,880 (95% credible intervals from 2,235 to 12,600) in the first week of sampling to 12,150 (5,543 to 51,670) during the second week of sampling. Numbers of newly metamorphosed toads dropped quickly with increasing distance from the lakeshore in a pattern that differed over the three weeks of the study and contradicted our original hypotheses. Though we hypothesized that the spatial distribution of toads would initially be concentrated near the lake shore and then spread outward from the lake over time, we observed the opposite. Ninety-five percent of individuals occurred within 20, 16, and 15 m of shore during weeks one, two, and three respectively, probably reflecting continued emergence of newly metamorphosed toads from the lake and mortality or burrow use of dispersed individuals. Numbers of toads were highest near the inlet stream of the lake. Distance sampling may provide a useful method for estimating the surface abundance of newly metamorphosed toads and relating their space use to landscape variables despite uncertain and variable probability of detection. We discuss means of improving the precision of estimates of total abundance.
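    For context, conventional line-transect distance sampling corrects counts for imperfect detection roughly as follows. This is a generic estimator with a half-normal detection function, shown only for orientation; the study's Bayesian spatiotemporal model is more elaborate:

        % n: number of detections; L: total transect length; w: truncation distance;
        % g(x): probability of detecting an individual at perpendicular distance x
        % (here half-normal with scale sigma); mu: effective strip half-width.
        \hat{D} = \frac{n}{2 L \hat{\mu}}, \qquad
        \hat{\mu} = \int_0^{w} \hat{g}(x)\, dx, \qquad
        \hat{g}(x) = \exp\!\left( -\frac{x^2}{2\sigma^2} \right)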

  14. Improved Procedure for Transport of Dental Plaque Samples and Other Clinical Specimens Containing Anaerobic Bacteria

    Science.gov (United States)

    Spiegel, Carol A.; Minah, Glenn E.; Krywolap, George N.

    1979-01-01

    An improved transport system for samples containing anaerobic bacteria was developed. This system increased the recovery rate of anaerobic bacteria up to 28.8% as compared to a commonly used method. PMID:39087

  15. Improved explosive collection and detection with rationally assembled surface sampling materials

    Energy Technology Data Exchange (ETDEWEB)

    Chouyyok, Wilaiwan; Bays, J. Timothy; Gerasimenko, Aleksandr A.; Cinson, Anthony D.; Ewing, Robert G.; Atkinson, David A.; Addleman, R. Shane

    2016-01-01

    Sampling and detection of trace explosives is a key analytical process in modern transportation safety. In this work we have explored some of the fundamental analytical processes for the collection and detection of trace-level explosives on surfaces with the most widely utilized system, thermal desorption IMS. The performance of the standard muslin swipe material was compared with chemically modified fiberglass cloth. The fiberglass surface was modified to include phenyl functional groups. When compared to standard muslin, the phenyl-functionalized fiberglass sampling material showed better analyte release from the sampling material as well as improved response and repeatability over multiple uses of the same swipe. The improved sample release of the functionalized fiberglass swipes resulted in a significant increase in sensitivity. Various physical and chemical properties were systematically explored to determine optimal performance. The results herein have relevance to improving the detection of other explosive compounds and potentially to a wide range of other chemical sampling and field detection challenges.

  16. Emotional experience improves with age : Evidence based on over 10 years of experience sampling

    NARCIS (Netherlands)

    Carstensen, L.L.; Turan, B.; Scheibe, S.; Ram, N.; Ersner-Hershfield, H.; Samanez-Larkin, G.R.; Brooks, K.P.; Nesselroade, J.R.

    Recent evidence suggests that emotional well-being improves from early adulthood to old age. This study used experience-sampling to examine the developmental course of emotional experience in a representative sample of adults spanning early to very late adulthood. Participants (N = 184, Wave 1; N =

  17. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    Full Text Available OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate in pregnant women strategies to improve ambulatory saliva-sampling compliance, and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with the saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%; F(1,60) = 17.64, p < 0.001), but not the reminder intervention (reminders: 68%, without reminders: 72%; F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p < 0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  18. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
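    The basic decision rule behind acceptance sampling by variables (the "k-method" for a one-sided specification limit) can be sketched as follows; the plan parameters n and k and the simulated data are illustrative assumptions, not values recommended in the NESC assessment:

        # One-sided variables acceptance sampling ("k-method") for an upper
        # specification limit USL; plan (n, k) and data are illustrative only.
        import numpy as np

        def accept_lot(measurements, usl, k):
            """Accept if the sample mean sits at least k sample standard deviations
            below the upper specification limit."""
            x = np.asarray(measurements, dtype=float)
            xbar, s = x.mean(), x.std(ddof=1)
            return (usl - xbar) / s >= k

        rng = np.random.default_rng(3)
        lot = rng.normal(loc=9.0, scale=0.5, size=20)    # simulated unit measurements
        print(accept_lot(lot, usl=10.5, k=1.9))           # plan: n = 20, k = 1.9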

  19. Bacterial communities of disease vectors sampled across time, space, and species.

    Science.gov (United States)

    Jones, Ryan T; Knight, Rob; Martin, Andrew P

    2010-02-01

    A common strategy of pathogenic bacteria is to form close associations with parasitic insects that feed on animals and to use these insects as vectors for their own transmission. Pathogens interact closely with other coexisting bacteria within the insect, and interactions between co-occurring bacteria may influence the vector competency of the parasite. Interactions between particular lineages can be explored through measures of alpha-diversity. Furthermore, general patterns of bacterial community assembly can be explored through measures of beta-diversity. Here, we use pyrosequencing (n=115,924 16S rRNA gene sequences) to describe the bacterial communities of 230 prairie dog fleas sampled across space and time. We use these community characterizations to assess interactions between dominant community members and to explore general patterns of bacterial community assembly in fleas. An analysis of co-occurrence patterns suggests non-neutral negative interactions between dominant community members, and bacterial community composition differed significantly across space (phylotype-based: R=0.418) and time.

  20. Process Improvement for Next Generation Space Flight Vehicles: MSFC Lessons Learned

    Science.gov (United States)

    Housch, Helen

    2008-01-01

    This viewgraph presentation reviews the lessons learned from process improvement for Next Generation Space Flight Vehicles. The contents include: 1) Organizational profile; 2) Process Improvement History; 3) Appraisal Preparation; 4) The Appraisal Experience; 5) Useful Tools; and 6) Is CMMI working?

  1. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  2. A Review of New and Developing Technology to Significantly Improve Mars Sample-Return Missions

    Science.gov (United States)

    Carsey, F.; Brophy, J.; Gilmore, M.; Rodgers, D.; Wilcox, B.

    2000-07-01

    A JPL development activity was initiated in FY 1999 for the purpose of examining and evaluating technologies that could materially improve future (i.e., beyond the 2005 launch) Mars sample return missions. The scope of the technology review was comprehensive and end-to-end; the goal was to improve mass, cost, risk, and scientific return. A specific objective was to assess approaches to sample return with only one Earth launch. While the objective of the study was specifically for sample-return, in-situ missions can also benefit from using many of the technologies examined.

  3. An improved combustion apparatus for the determination of organically bound tritium in environmental samples

    International Nuclear Information System (INIS)

    Du, Lin; Shan, Jian; Ma, Yu-Hua; Wang, Ling; Qin, Lai-Lai; Pi, Li; Zeng, You-Shi; Xia, Zheng-Hai; Wang, Guang-Hua; Liu, Wei

    2016-01-01

    This paper reports an improved combustion apparatus for the determination of organically bound tritium in environmental samples. The performance of this apparatus including the recovery rate and reproducibility was investigated by combusting lettuce and pork samples. To determine the factors for the different recovery rates of lettuce and pork and investigate whether the samples were completely oxidized, the ashes and exhaust gases produced by the combustion were analyzed. The results indicate that the apparatus showed an excellent performance in the combustion of environmental samples. Thus, the improvements conducted in this study were effective. - Highlights: • Three major improvements were made to develop the combustion apparatus for OBT. • The recovery is higher and more stable than that of current equipment. • Little hydrogen was present in the ashes and exhaust after combustion.

  4. Development of a Novel Self-Enclosed Sample Preparation Device for DNA/RNA Isolation in Space

    Science.gov (United States)

    Zhang, Ye; Mehta, Satish K.; Pensinger, Stuart J.; Pickering, Karen D.

    2011-01-01

    Modern biology techniques present potential for a wide range of molecular, cellular, and biochemistry applications in space, including detection of infectious pathogens and environmental contamination, monitoring of drug-resistant microbes and dangerous mutations, and identification of new microbial phenotypes and new life species. However, one of the major technological obstacles to enabling these technologies in space is the lack of devices for sample preparation in the space environment. To overcome this obstacle, we constructed a prototype DNA/RNA isolation device based on our novel designs documented in the NASA New Technology Reporting System (MSC-24811-1/3-1). This device is self-enclosed and pipette-free, purposely designed for use in the absence of gravity. Our design can also be easily modified for preparing samples in space for other applications, such as flow cytometry, immunostaining, cell separation, sample purification and separation according to size and charge, sample chemical labeling, and sample purification. The prototype of our DNA/RNA isolation device was tested for the efficiency of DNA and RNA isolation from various cell types for PCR analysis. The purity and integrity of the purified DNA and RNA were determined as well. Results showed that our DNA/RNA isolation device offers similar efficiency and quality in comparison to samples prepared using the standard laboratory protocol.

  5. Sampling methodologies for epidemiologic surveillance of men who have sex with men and transgender women in Latin America: an empiric comparison of convenience sampling, time space sampling, and respondent driven sampling.

    Science.gov (United States)

    Clark, J L; Konda, K A; Silva-Santisteban, A; Peinado, J; Lama, J R; Kusunoki, L; Perez-Brumer, A; Pun, M; Cabello, R; Sebastian, J L; Suarez-Ognio, L; Sanchez, J

    2014-12-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants' self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods.

  6. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    Science.gov (United States)

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June–August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants’ self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods. PMID:24362754

  7. Approximate distance oracles for planar graphs with improved query time-space tradeoff

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    2016-01-01

    We consider approximate distance oracles for edge-weighted n-vertex undirected planar graphs. Given fixed ϵ > 0, we present a (1 + ϵ)-approximate distance oracle with O(n (log log n)^2) space and O((log log n)^3) query time. This improves the previous best product of query time and space of the oracles of Thorup (FOCS 2001, J. ACM 2004) and Klein (SODA 2002) from O(n log n) to O(n (log log n)^5).

  8. Extracting Hydrologic Understanding from the Unique Space-time Sampling of the Surface Water and Ocean Topography (SWOT) Mission

    Science.gov (United States)

    Nickles, C.; Zhao, Y.; Beighley, E.; Durand, M. T.; David, C. H.; Lee, H.

    2017-12-01

    The Surface Water and Ocean Topography (SWOT) satellite mission is jointly developed by NASA and the French space agency (CNES), with participation from the Canadian and UK space agencies, to serve both the hydrology and oceanography communities. The SWOT mission will sample global surface water extents and elevations (lakes/reservoirs, rivers, estuaries, oceans, sea and land ice) at a finer spatial resolution than is currently possible, enabling hydrologic discovery, model advancements and new applications that are not currently possible or likely even conceivable. Although the mission will provide global coverage, analysis and interpolation of the data generated from the irregular space/time sampling represent a significant challenge. In this study, we explore the applicability of the unique space/time sampling for understanding river discharge dynamics throughout the Ohio River Basin. River network topology, SWOT sampling (i.e., orbit and identified SWOT river reaches) and spatial interpolation concepts are used to quantify the fraction of effective sampling of river reaches each day of the three-year mission. Streamflow statistics for SWOT generated river discharge time series are compared to continuous daily river discharge series. Relationships are presented to transform SWOT generated streamflow statistics to equivalent continuous daily discharge time series statistics intended to support hydrologic applications using low-flow and annual flow duration statistics.

  9. Microstructure and Macrosegregation Study of Directionally Solidified Al-7Si Samples Processed Terrestrially and Aboard the International Space Station

    Science.gov (United States)

    Angart, Samuel; Erdman, R. G.; Poirier, David R.; Tewari, S.N.; Grugel, R. N.

    2014-01-01

    This talk reports research that has been carried out under the aegis of NASA as part of a collaboration between ESA and NASA for solidification experiments on the International Space Station (ISS). The focus has been on the effect of convection on microstructural evolution and macrosegregation in hypoeutectic Al-Si alloys during directional solidification (DS). The DS experiments have been carried out under 1-g at Cleveland State University (CSU) and under low-g on the International Space Station (ISS). The thermal processing history of the experiments is well defined for both the terrestrially processed samples and the ISS-processed samples. We have observed that the primary dendrite arm spacings of two samples grown in the low-g environment of the ISS show good agreement with a dendrite-growth model based on diffusion-controlled growth. Gravity-driven convection (i.e., thermosolutal convection) in terrestrially grown samples has the effect of decreasing the primary dendrite arm spacings and causes macrosegregation. In order to process DS samples aboard the ISS, dendritic seed crystals have to be partially remelted in a stationary thermal gradient before the DS is carried out. Microstructural changes and macrosegregation effects during this period are described.

  10. An improved correlated sampling method for calculating correction factor of detector

    International Nuclear Information System (INIS)

    Wu Zhen; Li Junli; Cheng Jianping

    2006-01-01

    In the case of a small detector lying inside a bulk medium, there are two problems in calculating the detector correction factors. One is that the detector is too small for the particles to arrive at and collide in; the other is that the ratio of two quantities is not accurate enough. The method discussed in this paper, which combines correlated sampling with modified particle-collision auto-importance sampling and has been realized on the MCNP-4C platform, can solve both problems. In addition, three other variance reduction techniques are each combined with correlated sampling to calculate a simple model of the detector correction factors. The results show that, although all the variance reduction techniques combined with correlated sampling can improve the calculating efficiency, the method combining modified particle-collision auto-importance sampling with correlated sampling is the most efficient one. (authors)
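    The core idea of correlated sampling, using the same random histories to score both the nominal and the perturbed configuration so that their ratio is estimated with strongly reduced variance, can be sketched as follows. The toy "detector response" and attenuation values are illustrative and unrelated to the MCNP-4C implementation:

        # Correlated sampling with common random numbers: the same "histories" (here,
        # plain uniform draws) score both the nominal and the perturbed configuration,
        # so their ratio is estimated with much less variance than with independent runs.
        import numpy as np

        def response(u, attenuation):
            path = -np.log(u) / attenuation    # inverse-CDF sampling of an exponential path length
            return (path > 1.0).astype(float)  # scores 1 if the particle reaches the detector

        rng = np.random.default_rng(4)
        n = 100_000
        u = rng.random(n)                      # shared random stream

        ratio_corr = response(u, 1.05).mean() / response(u, 1.00).mean()
        ratio_indep = response(rng.random(n), 1.05).mean() / response(rng.random(n), 1.00).mean()
        print(f"correlated: {ratio_corr:.4f}, independent: {ratio_indep:.4f}  (exact: {np.exp(-0.05):.4f})")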

  11. Using Linked Survey Paradata to Improve Sampling Strategies in the Medical Expenditure Panel Survey

    Directory of Open Access Journals (Sweden)

    Mirel Lisa B.

    2017-06-01

    Full Text Available Using paradata from a prior survey that is linked to a new survey can help a survey organization develop more effective sampling strategies. One example of this type of linkage or subsampling is between the National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS). MEPS is a nationally representative sample of the U.S. civilian, noninstitutionalized population based on a complex multi-stage sample design. Each year a new sample is drawn as a subsample of households from the prior year's NHIS. The main objective of this article is to examine how paradata from a prior survey can be used in developing a sampling scheme in a subsequent survey. A framework for optimal allocation of the sample in substrata formed for this purpose is presented and evaluated for the relative effectiveness of alternative substratification schemes. The framework is applied, using real MEPS data, to illustrate how utilizing paradata from the linked survey offers the possibility of making improvements to the sampling scheme for the subsequent survey. The improvements aim to reduce the data collection costs while maintaining or increasing effective responding sample sizes and response rates for a harder to reach population.
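    One standard way to allocate a fixed sample across paradata-defined substrata is optimal (Neyman) allocation, sketched below. The stratum sizes and standard deviations are illustrative, not MEPS values, and the article's framework may additionally weight costs and response propensities:

        # Optimal (Neyman) allocation across substrata formed from linked-survey
        # paradata; stratum sizes and standard deviations are illustrative only.
        def neyman_allocation(total_n, sizes, sds):
            """Allocate total_n so that n_h is proportional to N_h * S_h."""
            weights = [N * S for N, S in zip(sizes, sds)]
            total_w = sum(weights)
            return [round(total_n * w / total_w) for w in weights]

        sizes = [4000, 2500, 1500]   # N_h: households per paradata-defined stratum
        sds = [1.0, 1.8, 2.6]        # S_h: anticipated outcome standard deviations
        print(neyman_allocation(1000, sizes, sds))   # -> [323, 363, 315]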

  12. Blood venous sample collection: Recommendations overview and a checklist to improve quality.

    Science.gov (United States)

    Giavarina, Davide; Lippi, Giuseppe

    2017-07-01

    The extra-analytical phases of the total testing process have a substantial impact on managed care, as well as an inherently high risk of errors, often greater than that of the analytical phase. The collection of biological samples is a crucial preanalytical activity. Problems or errors occurring shortly before, or soon after, this preanalytical step may impair sample quality and characteristics, or else modify the final results of testing. The standardization of fasting requirements, rest, patient position and the psychological state of the patient is therefore crucial for mitigating the impact of preanalytical variability. Moreover, the quality of materials used for collecting specimens, along with their compatibility, can guarantee sample quality and the persistence of chemical and physical characteristics of the analytes over time, so safeguarding the reliability of testing. Appropriate techniques and sampling procedures are effective in preventing problems such as hemolysis, undue clotting in the blood tube, collection of insufficient sample volume and modification of analyte concentration. Accurate identification of both the patient and the blood samples is a key priority, as for other healthcare activities. Good laboratory practice and appropriate training of operators, specifically targeting the collection of biological samples, blood in particular, may greatly improve this issue, thus lowering the risk of errors and their adverse clinical consequences. The implementation of a simple and rapid checklist, including verification of blood collection devices, patient preparation and sampling techniques, was found to be effective for enhancing sample quality and reducing some preanalytical errors associated with these procedures. The use of this tool, along with the implementation of objective and standardized systems for detecting non-conformities related to unsuitable samples, can be helpful for standardizing preanalytical activities and improving the quality of

  13. Determining Plane-Sweep Sampling Points in Image Space Using the Cross-Ratio for Image-Based Depth Estimation

    Science.gov (United States)

    Ruf, B.; Erdnuess, B.; Weinmann, M.

    2017-08-01

    With the emergence of small consumer Unmanned Aerial Vehicles (UAVs), the importance and interest of image-based depth estimation and model generation from aerial images has greatly increased in the photogrammetric society. In our work, we focus on algorithms that allow an online image-based dense depth estimation from video sequences, which enables the direct and live structural analysis of the depicted scene. Therefore, we use a multi-view plane-sweep algorithm with a semi-global matching (SGM) optimization which is parallelized for general purpose computation on a GPU (GPGPU), reaching sufficient performance to keep up with the key-frames of input sequences. One important aspect to reach good performance is the way to sample the scene space, creating plane hypotheses. A small step size between consecutive planes, which is needed to reconstruct details in the near vicinity of the camera may lead to ambiguities in distant regions, due to the perspective projection of the camera. Furthermore, an equidistant sampling with a small step size produces a large number of plane hypotheses, leading to high computational effort. To overcome these problems, we present a novel methodology to directly determine the sampling points of plane-sweep algorithms in image space. The use of the perspective invariant cross-ratio allows us to derive the location of the sampling planes directly from the image data. With this, we efficiently sample the scene space, achieving higher sampling density in areas which are close to the camera and a lower density in distant regions. We evaluate our approach on a synthetic benchmark dataset for quantitative evaluation and on a real-image dataset consisting of aerial imagery. The experiments reveal that an inverse sampling achieves equal and better results than a linear sampling, with less sampling points and thus less runtime. Our algorithm allows an online computation of depth maps for subsequences of five frames, provided that the relative
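    The effect of non-uniform plane spacing can be illustrated with a minimal sketch that places depth hypotheses uniformly in inverse depth, which yields the dense-near/sparse-far behavior described above. This shows only the spacing idea; the cross-ratio construction the authors use to derive sampling points directly in image space is not reproduced here, and the depth range and plane count are illustrative:

        # Plane-sweep depth hypotheses spaced uniformly in inverse depth (disparity),
        # giving fine steps near the camera and coarse steps far away.
        import numpy as np

        z_near, z_far, n_planes = 2.0, 100.0, 16

        linear_planes = np.linspace(z_near, z_far, n_planes)
        inverse_planes = 1.0 / np.linspace(1.0 / z_near, 1.0 / z_far, n_planes)

        print("linear  :", np.round(linear_planes[:5], 2))    # even steps everywhere
        print("inverse :", np.round(inverse_planes[:5], 2))   # fine steps close to the camera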

  14. DETERMINING PLANE-SWEEP SAMPLING POINTS IN IMAGE SPACE USING THE CROSS-RATIO FOR IMAGE-BASED DEPTH ESTIMATION

    Directory of Open Access Journals (Sweden)

    B. Ruf

    2017-08-01

    Full Text Available With the emergence of small consumer Unmanned Aerial Vehicles (UAVs), the importance and interest of image-based depth estimation and model generation from aerial images has greatly increased in the photogrammetric society. In our work, we focus on algorithms that allow an online image-based dense depth estimation from video sequences, which enables the direct and live structural analysis of the depicted scene. Therefore, we use a multi-view plane-sweep algorithm with a semi-global matching (SGM) optimization which is parallelized for general purpose computation on a GPU (GPGPU), reaching sufficient performance to keep up with the key-frames of input sequences. One important aspect to reach good performance is the way to sample the scene space, creating plane hypotheses. A small step size between consecutive planes, which is needed to reconstruct details in the near vicinity of the camera, may lead to ambiguities in distant regions, due to the perspective projection of the camera. Furthermore, an equidistant sampling with a small step size produces a large number of plane hypotheses, leading to high computational effort. To overcome these problems, we present a novel methodology to directly determine the sampling points of plane-sweep algorithms in image space. The use of the perspective invariant cross-ratio allows us to derive the location of the sampling planes directly from the image data. With this, we efficiently sample the scene space, achieving higher sampling density in areas which are close to the camera and a lower density in distant regions. We evaluate our approach on a synthetic benchmark dataset for quantitative evaluation and on a real-image dataset consisting of aerial imagery. The experiments reveal that an inverse sampling achieves equal and better results than a linear sampling, with less sampling points and thus less runtime. Our algorithm allows an online computation of depth maps for subsequences of five frames, provided that

  15. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    Science.gov (United States)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of a high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach, as well as three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
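
    As a rough illustration of the two ingredients named above (a low-fidelity model corrected by a polynomial trend, plus a sequential sampling rule), the sketch below uses scikit-learn's Gaussian process regressor on the high/low-fidelity discrepancy and adds new high-fidelity points where the predictive uncertainty is largest. It is a simplified stand-in for, not a reproduction of, the ASM-IHK algorithm; the test functions lf() and hf() are made up.

      # Illustrative two-fidelity surrogate: a cheap low-fidelity model scaled by a
      # (here linear) trend, with a Gaussian process modelling the remaining
      # discrepancy.  New high-fidelity samples are added where the predictive
      # uncertainty is largest.  All functions and settings are hypothetical.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def hf(x):                       # "expensive" high-fidelity model (assumed)
          return np.sin(8 * x) + x

      def lf(x):                       # cheap low-fidelity approximation (assumed)
          return 0.8 * np.sin(8 * x) + 0.6 * x + 0.2

      X_hf = np.array([0.05, 0.4, 0.95])           # initial high-fidelity design
      candidates = np.linspace(0.0, 1.0, 201)

      for iteration in range(5):
          # Linear scaling of the low-fidelity prediction (polynomial trend).
          A = np.vstack([lf(X_hf), np.ones_like(X_hf)]).T
          rho, beta = np.linalg.lstsq(A, hf(X_hf), rcond=None)[0]
          trend = lambda x: rho * lf(x) + beta
          # Gaussian process on the residual between HF data and the scaled LF trend.
          gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
          gp.fit(X_hf.reshape(-1, 1), hf(X_hf) - trend(X_hf))
          mean, std = gp.predict(candidates.reshape(-1, 1), return_std=True)
          # Adaptive rule: run the high-fidelity model where uncertainty peaks.
          X_hf = np.append(X_hf, candidates[np.argmax(std)])

      print("high-fidelity samples chosen:", np.round(np.sort(X_hf), 3))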

  16. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures, etc. As usually observed, in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction, and these residual free energy barriers can greatly degrade the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; the "Hamiltonian lagging" problem, in which necessary structural relaxation falls behind the move of the collective variable, is then likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  17. Improved sample preparation and counting techniques for enhanced tritium measurement sensitivity

    Science.gov (United States)

    Moran, J.; Aalseth, C.; Bailey, V. L.; Mace, E. K.; Overman, C.; Seifert, A.; Wilcox Freeburg, E. D.

    2015-12-01

    Tritium (T) measurements offer insight to a wealth of environmental applications including hydrologic tracking, discerning ocean circulation patterns, and aging ice formations. However, the relatively short half-life of T (12.3 years) limits its effective age dating range. Compounding this limitation is the decrease in atmospheric T content by over two orders of magnitude since the 1000-2000 TU peak of 1962, following the end of above-ground nuclear testing in the 1960's. We are developing sample preparation methods coupled to direct counting of T via ultra-low background proportional counters which, when combined, offer improved T measurement sensitivity (~4.5 mmoles of H2 equivalent) and will help expand the application of T age dating to smaller sample sizes linked to persistent environmental questions despite the limitations above. For instance, this approach can be used to T-date ~2.2 mmoles of CH4 collected from sample-limited systems including microbial communities, soils, or subsurface aquifers and can be combined with radiocarbon dating to distinguish the methane's formation age from C age in a system. This approach can also expand investigations into soil organic C where the improved sensitivity will permit resolution of soil C into more descriptive fractions and provide direct assessments of the stability of specific classes of organic matter in soil environments. We are employing a multiple step sample preparation system whereby organic samples are first combusted with the resulting CO2 and H2O being used as a feedstock to synthesize CH4. This CH4 is mixed with Ar and loaded directly into an ultra-low background proportional counter for measurement of T β decay in a shallow underground laboratory. Analysis of water samples requires only the addition of geologic CO2 feedstock with the sample for methane synthesis. The chemical nature of the preparation techniques enables high sample throughput, with only the final measurement requiring T decay counting and total sample analysis time ranging from 2-5 weeks.

  18. Space charge profiles in low density polyethylene samples containing a permittivity/conductivity gradient

    DEFF Research Database (Denmark)

    Bambery, K.R.; Fleming, R.J.; Holbøll, Joachim

    2001-01-01

    .5×107 V m-1. Current density was also measured as a function of temperature and field. Space charge due exclusively to the temperature gradient was detected, with density of order 0.01 C m-3. The activation energy associated with the transport of electrons through the bulk was calculated as 0.09 e...

  19. Image processing improvement for optical observations of space debris with the TAROT telescopes

    Science.gov (United States)

    Thiebaut, C.; Theron, S.; Richard, P.; Blanchet, G.; Klotz, A.; Boër, M.

    2016-07-01

    CNES is involved in the Inter-Agency Space Debris Coordination Committee (IADC) and is observing space debris with two robotic ground based fully automated telescopes called TAROT and operated by the CNRS. An image processing algorithm devoted to debris detection in geostationary orbit is implemented in the standard pipeline. Nevertheless, this algorithm is unable to deal with debris tracking mode images, this mode being the preferred one for debris detectability. We present an algorithm improvement for this mode and give results in terms of false detection rate.

  20. An Improvement on Space Focusing Resolution in Two-Field Time-of-Flight Mass Spectrometers

    International Nuclear Information System (INIS)

    Yildirim, M.; Aydin, R.; Akin, U.; Kilic, H. S.; Sise, O.; Ulu, M.; Dogan, M.

    2007-01-01

    Time-of-Flight Mass Spectrometer (TOFMS) is a sophisticated device for the mass selective analysis of a variety of samples. The main limitation on the TOFMS technique is the obtainable resolution, where the two main limiting factors are the initial space and energy spread of the particles created in the ionization region. Similar charged particles starting at different points will reach the detector at different times; this makes space focusing a very important subject. We present the principles of two-field TOFMS with second-order space focusing, using both analytical methods and ray-tracing simulation. This work aims at a clear understanding of the ion-optical system and gives hints for future developments

  1. Influence of the radial spacing between cathodes on the surface composition of iron samples sintered by hollow cathode electric discharge

    Directory of Open Access Journals (Sweden)

    Brunatto S.F.

    2001-01-01

    Full Text Available The present work reports an investigation of the influence of the radial spacing between cathodes on the iron sintering process by hollow cathode electrical discharge, with surface enrichment of the alloying elements Cr and Ni. Pressed cylindrical samples of 9.5 mm diameter and density of 7.0 ± 0.1 g/cm³ were prepared by compaction of Ancorsteel 1000C iron powder. These samples, constituting the central cathode, were positioned concentrically in the interior of an external cathode machined from a tube of stainless steel AISI 310 (containing 25% Cr, 16% Ni, 1.5% Mn, 1.5% Si, 0.03% C and the remainder Fe). Sintering was done at 1150 °C, for 120 min, utilizing radial spacings between the central and hollow cathodes of 3, 6 and 9 mm and a gas mixture of 80% Ar and 20% H2, with a flow rate of 5 cm³/s at a pressure of 3 Torr. The electric discharge was generated using a pulsed voltage power source, with a period of 200 μs. The radial spacing had only a slight influence on the quantity of atoms of alloying elements deposited and diffused on the surface of the sample. Analysis with a microprobe showed the presence of chromium (up to 4.0%) and nickel (up to 3.0%), in at.%, at the surface of the samples. This surface enrichment can be attributed to the mechanism of sputtering of the metallic atoms present in the external cathode, with the deposition of these elements on the sample surface and consequent diffusion within the sample.

  2. Size, shape, and topology optimization of planar and space trusses using mutation-based improved metaheuristics

    Directory of Open Access Journals (Sweden)

    Ghanshyam G. Tejani

    2018-04-01

    Full Text Available In this study, simultaneous size, shape, and topology optimization of planar and space trusses is investigated. Moreover, the trusses are subjected to constraints for element stresses, nodal displacements, and kinematic stability conditions. Truss Topology Optimization (TTO) removes the superfluous elements and nodes from the ground structure. In this method, difficulties arise due to unacceptable and singular topologies; therefore, Grubler's criterion and positive definiteness are used to handle such issues. Moreover, the TTO is challenging due to its search space, which is implicit, non-convex, non-linear, and often leads to divergence. Therefore, mutation-based metaheuristics are proposed to investigate them. This study compares the performance of four improved metaheuristics (viz. Improved Teaching–Learning-Based Optimization (ITLBO), Improved Heat Transfer Search (IHTS), Improved Water Wave Optimization (IWWO), and Improved Passing Vehicle Search (IPVS)) and four basic metaheuristics (viz. TLBO, HTS, WWO, and PVS) in order to solve structural optimization problems. Keywords: Structural optimization, Mutation operator, Improved metaheuristics, Modified algorithms, Truss topology optimization

  3. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    OpenAIRE

    Christoff Fourie; Elisabeth Schoepfer

    2014-01-01

    Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA). Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, c...

  4. New experimental space for irradiating samples by RA reactor fast neutron flux at temperatures up to 100 deg C

    International Nuclear Information System (INIS)

    Pavicevic, M.; Novakovic, M.; Zecevic, V.

    1961-01-01

    The objective of this paper is to present adaptation of the RA reactor which would enable samples irradiation by fast neutrons and describe new experimental possibilities. New experimental space was achieved using hollow fuel elements which have been reconstructed to enable placement of irradiation capsules inside the tube. This paper includes thermal analysis and describes problems related to operation, safety and radiation protection issues which arise from using reconstructed fuel elements

  5. Improvements to the Chebyshev expansion of attenuation correction factors for cylindrical samples

    International Nuclear Information System (INIS)

    Mildner, D.F.R.; Carpenter, J.M.

    1990-01-01

    The accuracy of the Chebyshev expansion coefficients used for the calculation of attenuation correction factors for cylindrical samples has been improved. An increased order of expansion allows the method to be useful over a greater range of attenuation. It is shown that many of these coefficients are exactly zero, others are rational numbers, and others are rational fractions of π⁻¹. The assumptions of Sears in his asymptotic expression of the attenuation correction factor are also examined. (orig.)
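
    For readers unfamiliar with how such tabulated expansions are used, the short sketch below evaluates a hypothetical Chebyshev series for the attenuation correction factor as a function of the scaled attenuation variable; the coefficients and validity range are placeholders, not the published values.

      # Schematic only: evaluate an attenuation correction factor tabulated as a
      # Chebyshev expansion in x = muR (linear attenuation coefficient times
      # cylinder radius), after mapping the argument to [-1, 1].  The coefficients
      # below are made-up placeholders, not the published ones.
      import numpy as np
      from numpy.polynomial import chebyshev as C

      coeffs = np.array([1.20, -0.45, 0.08, -0.01])   # hypothetical coefficients
      MU_R_MAX = 2.0                                   # assumed validity range

      def attenuation_correction(mu_R):
          """Evaluate the series after mapping mu_R from [0, MU_R_MAX] to [-1, 1]."""
          x = 2.0 * np.asarray(mu_R) / MU_R_MAX - 1.0
          return C.chebval(x, coeffs)

      print(attenuation_correction([0.1, 0.5, 1.0]))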

  6. Reproducibility of preclinical animal research improves with heterogeneity of study samples

    Science.gov (United States)

    Vogt, Lucile; Sena, Emily S.; Würbel, Hanno

    2018-01-01

    Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495
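
    The mechanism behind this result can be reproduced with a toy simulation (the assumptions are ours, not the authors' dataset): when all animals come from one laboratory, the between-laboratory variation is invisible to the study's confidence interval, so coverage of the true effect drops; spreading the same total sample over several laboratories restores it.

      # Toy simulation: each laboratory has its own random offset on top of a common
      # treatment effect.  We compare how often the 95% CI of a single-laboratory
      # study vs. a multi-laboratory study of the same total size covers the truth.
      # All variance components and sizes are assumed, illustrative values.
      import numpy as np

      rng = np.random.default_rng(1)
      TRUE_EFFECT, BETWEEN_LAB_SD, WITHIN_SD = 1.0, 0.5, 1.0
      N_TOTAL, N_SIM = 40, 5000

      def coverage(n_labs):
          hits, per_lab = 0, N_TOTAL // n_labs
          for _ in range(N_SIM):
              lab_offsets = rng.normal(0.0, BETWEEN_LAB_SD, n_labs)
              effects = rng.normal(TRUE_EFFECT + np.repeat(lab_offsets, per_lab),
                                   WITHIN_SD)
              mean = effects.mean()
              sem = effects.std(ddof=1) / np.sqrt(effects.size)
              hits += abs(mean - TRUE_EFFECT) <= 1.96 * sem
          return hits / N_SIM

      for labs in (1, 2, 4):
          print(f"{labs} lab(s): coverage of the 95% CI = {coverage(labs):.2f}")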

  7. Improvement of 137Cs analysis in small volume seawater samples using the Ogoya underground facility

    International Nuclear Information System (INIS)

    Hirose, K.; Komura, K.; Kanazawa University, Ishikawa; Aoyama, M.; Igarashi, Y.

    2008-01-01

    137Cs in seawater is one of the most powerful tracers of water motion. Large volumes of samples have been required for determination of 137Cs in seawater. This paper describes improvement of the separation and purification processes for 137Cs in seawater, which include purification of 137Cs using hexachloroplatinic acid in addition to ammonium phosphomolybdate (AMP) precipitation. As a result, we succeeded in determining 137Cs in seawater with a smaller sample volume of 10 liters by using ultra-low background gamma-spectrometry in the Ogoya underground facility. The 137Cs detection limit was about 0.1 mBq (counting time: 10^6 s). This method is applied to determine 137Cs in small samples of the South Pacific deep waters. (author)

  8. Improved optimum condition for recovery and measurement of 210Po in environmental samples

    International Nuclear Information System (INIS)

    Zal Uyun Wan Mahmood; Norfaizal Mohamed; Nik Azlin Nik Ariffin; Abdul Kadir Ishak

    2012-01-01

    An improved laboratory technique for measurement of polonium-210 (210Po) in environmental samples has been developed in the Radiochemistry and Environmental Laboratory (RAS), Malaysian Nuclear Agency. To further improve this technique, a study was carried out with the objectives to determine the optimum conditions for 210Po deposition and to evaluate the accuracy and precision of the results for the determination of 210Po in environmental samples. Polonium-210, which is an alpha emitter obtained in acidic solution through total digestion and dissolution of samples, has been efficiently plated onto one side of the silver disc in the spontaneous plating process for measurement of its alpha activity. The optimum conditions for deposition of 210Po were achieved using hydrochloric acid (HCl) media at an acidity of 0.5 M with the presence of 1.0 gram hydroxyl ammonium chloride and a plating temperature of 90 degree Celsius. The plating was carried out in 80 ml HCl solution (0.5 M) for 4 hours. The recorded recoveries obtained using 209Po tracers in the CRM IAEA-385 and environmental samples were 85% - 98%, whereby the efficiency of the new technique is a distinct advantage over the existing techniques. Therefore, optimization of the deposition parameters is of prime importance to achieve accurate and precise results as well as economy and time savings. (author)

  9. Improving the space surveillance telescope's performance using multi-hypothesis testing

    Energy Technology Data Exchange (ETDEWEB)

    Chris Zingarelli, J.; Cain, Stephen [Air Force Institute of Technology, 2950 Hobson Way, Bldg 641, Wright Patterson AFB, OH 45433 (United States); Pearce, Eric; Lambour, Richard [Lincoln Laboratory, Massachusetts Institute of Technology, 244 Wood Street, Lexington, MA 02421 (United States); Blake, Travis [Defense Advanced Research Projects Agency, 675 North Randolph Street Arlington, VA 22203 (United States); Peterson, Curtis J. R., E-mail: John.Zingarelli@afit.edu [United States Air Force, 1690 Air Force Pentagon, Washington, DC 20330 (United States)

    2014-05-01

    The Space Surveillance Telescope (SST) is a Defense Advanced Research Projects Agency program designed to detect objects in space like near Earth asteroids and space debris in the geosynchronous Earth orbit (GEO) belt. Binary hypothesis test (BHT) methods have historically been used to facilitate the detection of new objects in space. In this paper a multi-hypothesis detection strategy is introduced to improve the detection performance of SST. In this context, the multi-hypothesis testing (MHT) determines if an unresolvable point source is in either the center, a corner, or a side of a pixel in contrast to BHT, which only tests whether an object is in the pixel or not. The images recorded by SST are undersampled such as to cause aliasing, which degrades the performance of traditional detection schemes. The equations for the MHT are derived in terms of signal-to-noise ratio (S/N), which is computed by subtracting the background light level around the pixel being tested and dividing by the standard deviation of the noise. A new method for determining the local noise statistics that rejects outliers is introduced in combination with the MHT. An experiment using observations of a known GEO satellite are used to demonstrate the improved detection performance of the new algorithm over algorithms previously reported in the literature. The results show a significant improvement in the probability of detection by as much as 50% over existing algorithms. In addition to detection, the S/N results prove to be linearly related to the least-squares estimates of point source irradiance, thus improving photometric accuracy.
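
    One building block described above, the local background estimate with outlier rejection that feeds the S/N used by the hypothesis tests, can be sketched as follows; the window sizes, clipping threshold, and synthetic frame are illustrative choices, and the sub-pixel multi-hypothesis comparison itself is not reproduced here.

      # Sketch: S/N of a candidate detection against a sigma-clipped local background.
      import numpy as np

      def local_snr(image, row, col, inner=1, outer=5, clip=3.0, iters=3):
          """S/N of pixel (row, col) relative to an outlier-rejected local background."""
          patch = image[row - outer:row + outer + 1,
                        col - outer:col + outer + 1].astype(float)
          mask = np.ones_like(patch, dtype=bool)
          mask[outer - inner:outer + inner + 1, outer - inner:outer + inner + 1] = False
          bg = patch[mask]
          for _ in range(iters):          # reject outliers (stars, streaks, hot pixels)
              mu, sigma = bg.mean(), bg.std()
              bg = bg[np.abs(bg - mu) < clip * sigma]
          mu, sigma = bg.mean(), bg.std()
          return (image[row, col] - mu) / sigma

      rng = np.random.default_rng(2)
      frame = rng.normal(100.0, 5.0, (64, 64))
      frame[32, 32] += 40.0               # injected faint point source
      print(f"S/N at the injected source: {local_snr(frame, 32, 32):.1f}")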

  10. Accuracy of the improved quasistatic space-time method checked with experiment

    International Nuclear Information System (INIS)

    Kugler, G.; Dastur, A.R.

    1976-10-01

    Recent experiments performed at the Savannah River Laboratory have made it possible to check the accuracy of numerical methods developed to simulate space-dependent neutron transients. The experiments were specifically designed to emphasize delayed neutron holdback. The CERBERUS code using the IQS (Improved Quasistatic) method has been developed to provide a practical yet accurate tool for spatial kinetics calculations of CANDU reactors. The code was tested on the Savannah River experiments and excellent agreement was obtained. (author)

  11. An Optimization Study on Listening Experiments to Improve the Comparability of Annoyance Ratings of Noise Samples from Different Experimental Sample Sets.

    Science.gov (United States)

    Di, Guoqing; Lu, Kuanguang; Shi, Xiaofan

    2018-03-08

    Annoyance ratings obtained from listening experiments are widely used in studies on the health effects of environmental noise. In listening experiments, participants usually give the annoyance rating of each noise sample according to its relative annoyance degree among all samples in the experimental sample set if there are no reference sound samples, which leads to poor comparability between experimental results obtained from different experimental sample sets. To solve this problem, this study proposed adding several pink noise samples with certain loudness levels into experimental sample sets as reference sound samples. On this basis, the standard curve between logarithmic mean annoyance and loudness level of pink noise was used to calibrate the experimental results, and the calibration procedures were described in detail. Furthermore, as a case study, six different types of noise sample sets were selected to conduct listening experiments using this method to examine its applicability. Results showed that the differences in the annoyance ratings of each identical noise sample from different experimental sample sets were markedly decreased after calibration. The determination coefficient (R²) of linear fitting functions between psychoacoustic annoyance (PA) and mean annoyance (MA) of noise samples from different experimental sample sets increased obviously after calibration. The case study indicated that the method above is applicable to calibrating annoyance ratings obtained from different types of noise sample sets. After calibration, the comparability of annoyance ratings of noise samples from different experimental sample sets can be distinctly improved.

  12. Research on Monte Carlo improved quasi-static method for reactor space-time dynamics

    International Nuclear Information System (INIS)

    Xu Qi; Wang Kan; Li Shirui; Yu Ganglin

    2013-01-01

    With large time steps, improved quasi-static (IQS) method can improve the calculation speed for reactor dynamic simulations. The Monte Carlo IQS method was proposed in this paper, combining the advantages of both the IQS method and MC method. Thus, the Monte Carlo IQS method is beneficial for solving space-time dynamics problems of new concept reactors. Based on the theory of IQS, Monte Carlo algorithms for calculating adjoint neutron flux, reactor kinetic parameters and shape function were designed and realized. A simple Monte Carlo IQS code and a corresponding diffusion IQS code were developed, which were used for verification of the Monte Carlo IQS method. (authors)

  13. Use of space-filling curves to select sample locations in natural resource monitoring studies

    Science.gov (United States)

    Andrew Lister; Charles T. Scott

    2009-01-01

    The establishment of several large area monitoring networks over the past few decades has led to increased research into ways to spatially balance sample locations across the landscape. Many of these methods are well documented and have been used in the past with great success. In this paper, we present a method using geographic information systems (GIS) and fractals...
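
    A generic illustration of the idea (not the authors' GIS workflow) is to order candidate plot locations along a space-filling curve, here a Morton/Z-order curve, and take a systematic sample along that ordering, which spreads the selected plots across the landscape.

      # Spatially balanced sampling via a Z-order (Morton) space-filling curve.
      import numpy as np

      def morton_key(ix, iy, bits=16):
          """Interleave the bits of integer grid coordinates into one Z-order key."""
          key = 0
          for b in range(bits):
              key |= ((ix >> b) & 1) << (2 * b) | ((iy >> b) & 1) << (2 * b + 1)
          return key

      def spatially_balanced_sample(xy, n_sample, bits=16, seed=3):
          xy = np.asarray(xy, dtype=float)
          mins, maxs = xy.min(axis=0), xy.max(axis=0)
          grid = ((xy - mins) / (maxs - mins) * (2**bits - 1)).astype(int)
          order = np.argsort([morton_key(ix, iy, bits) for ix, iy in grid])
          # Systematic pick along the curve: every k-th location, random start.
          step = len(order) // n_sample
          start = np.random.default_rng(seed).integers(step)
          return order[start::step][:n_sample]

      points = np.random.default_rng(4).uniform(0, 1000, size=(500, 2))
      print("indices of selected plot locations:", spatially_balanced_sample(points, 20))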

  14. Space science technology: In-situ science. Sample Acquisition, Analysis, and Preservation Project summary

    Science.gov (United States)

    Aaron, Kim

    1991-01-01

    The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.

  15. Improving the quality of biomarker discovery research: the right samples and enough of them.

    Science.gov (United States)

    Pepe, Margaret S; Li, Christopher I; Feng, Ziding

    2015-06-01

    Biomarker discovery research has yielded few biomarkers that validate for clinical use. A contributing factor may be poor study designs. The goal in discovery research is to identify a subset of potentially useful markers from a large set of candidates assayed on case and control samples. We recommend the PRoBE design for selecting samples. We propose sample size calculations that require specifying: (i) a definition for biomarker performance; (ii) the proportion of useful markers the study should identify (Discovery Power); and (iii) the tolerable number of useless markers amongst those identified (False Leads Expected, FLE). We apply the methodology to a study of 9,000 candidate biomarkers for risk of colon cancer recurrence where a useful biomarker has positive predictive value ≥ 30%. We find that 40 patients with recurrence and 160 without recurrence suffice to filter out 98% of useless markers (2% FLE) while identifying 95% of useful biomarkers (95% Discovery Power). Alternative methods for sample size calculation required more assumptions. Biomarker discovery research should utilize quality biospecimen repositories and include sample sizes that enable markers meeting prespecified performance characteristics for well-defined clinical applications to be identified. The scientific rigor of discovery research should be improved. ©2015 American Association for Cancer Research.
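
    The flavour of such a calculation can be sketched with simple binomial probabilities; the detection rates and cutoff below are assumed values and the marker-performance definition is a stand-in for the paper's PPV-based criterion, so the printed numbers are only illustrative.

      # Expected false leads (useless markers that pass) and discovery power
      # (useful markers that pass) as a function of the number of cases.
      from scipy.stats import binom

      N_CANDIDATES = 9000
      P_USELESS, P_USEFUL = 0.10, 0.40   # per-case positivity of a useless vs. useful marker (assumed)
      CUTOFF = 0.25                      # observed positivity rate required to call a marker a lead

      def study_properties(n_cases):
          k = int(CUTOFF * n_cases)
          pass_if_useless = binom.sf(k - 1, n_cases, P_USELESS)   # P(X >= k)
          pass_if_useful = binom.sf(k - 1, n_cases, P_USEFUL)
          return N_CANDIDATES * pass_if_useless, pass_if_useful

      for n in (20, 40, 80):
          fle, power = study_properties(n)
          print(f"n_cases={n}: expected false leads ~{fle:.0f}, discovery power ~{power:.2f}")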

  16. Assessing respiratory pathogen communities in bighorn sheep populations: Sampling realities, challenges, and improvements.

    Directory of Open Access Journals (Sweden)

    Carson J Butler

    Full Text Available Respiratory disease has been a persistent problem for the recovery of bighorn sheep (Ovis canadensis), but has uncertain etiology. The disease has been attributed to several bacterial pathogens including Mycoplasma ovipneumoniae and Pasteurellaceae pathogens belonging to the Mannheimia, Bibersteinia, and Pasteurella genera. We estimated detection probability for these pathogens using protocols with diagnostic tests offered by a fee-for-service laboratory and not offered by a fee-for-service laboratory. We conducted 2861 diagnostic tests on swab samples collected from 476 bighorn sheep captured across Montana and Wyoming to gain inferences regarding detection probability, pathogen prevalence, and the power of different sampling methodologies to detect pathogens in bighorn sheep populations. Estimated detection probability using fee-for-service protocols was less than 0.50 for all Pasteurellaceae and 0.73 for Mycoplasma ovipneumoniae. Non-fee-for-service Pasteurellaceae protocols had higher detection probabilities, but no single protocol increased detection probability of all Pasteurellaceae pathogens to greater than 0.50. At least one protocol resulted in an estimated detection probability of 0.80 for each pathogen except Mannheimia haemolytica, for which the highest detection probability was 0.45. In general, the power to detect Pasteurellaceae pathogens at low prevalence in populations was low unless many animals were sampled or replicate samples were collected per animal. Imperfect detection also resulted in low precision when estimating prevalence for any pathogen. Low and variable detection probabilities for respiratory pathogens using live-sampling protocols may lead to inaccurate conclusions regarding pathogen community dynamics and causes of bighorn sheep respiratory disease epizootics. We recommend that agencies collect multiple samples per animal for Pasteurellaceae detection, and one sample for Mycoplasma ovipneumoniae detection from
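
    The sampling-power point made above has a simple closed form: the chance of detecting a pathogen in a population depends on its prevalence, the imperfect per-swab detection probability, the number of animals sampled, and the number of replicate swabs per animal. The input values in this sketch are examples, not the study's estimates.

      # Probability that a survey detects the pathogen at least once.
      def detection_power(prevalence, p_detect, n_animals, n_replicates=1):
          p_positive_animal = prevalence * (1 - (1 - p_detect) ** n_replicates)
          return 1 - (1 - p_positive_animal) ** n_animals

      for reps in (1, 2, 3):
          power = detection_power(prevalence=0.10, p_detect=0.45,
                                  n_animals=30, n_replicates=reps)
          print(f"{reps} swab(s) per animal: detection power = {power:.2f}")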

  17. Improving the quality of urban public space through the identification of space utilization index at Imam Bonjol Park, Padang city

    Science.gov (United States)

    Eriawan, Tomi; Setiawati, Lestari

    2017-06-01

    Padang City, a big city with a population approaching one million people, has to address the issue of increasing population activities and the growing need for land and space for those activities. One of the effects of population growth and the development of activities in Padang is the decreasing number of open spaces for outdoor public activities, both natural and artificial. However, Padang City has several open spaces that are built and managed by the government, including 40 units of open spaces in the form of plansum parks, playgrounds, and sports parks, with a total area of 10.88 hectares. Despite their status as public open spaces, not all of them can be used and enjoyed by the public, since most of them are passive parks, laid out only as gardens without any amenities. This study was performed to assess the quality of public space in the central business district of Padang City, namely Imam Bonjol Park (Taman Imam Bonjol). The study proceeded in several stages: identifying the typology of space functions based on Carmona [1] (2008) and assessing the space utilization index based on the Public Space Index approach of Mehta [2] (2007). The space quality was measured based on the variables of the Good Public Space Index: the intensity of use, the intensity of social activity, the duration of activity, the variations in usage, and the diversity of use. Based on the assessment of these five variables, the public space utilization index of Imam Bonjol Park was 0.696, which places it in the Medium category. The parameters indicated several results, including the lack of diversity in users' activity time, less

  18. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    Science.gov (United States)

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  19. Improving interpretation of infrared spectra for OM characterization by subtraction of spectra from incinerated samples

    Science.gov (United States)

    Ellerbrock, Ruth H.; Gerke, Horst H.; Leue, Martin

    2017-04-01

    Non-destructive methods such as diffuse reflectance infrared Fourier transform spectroscopy (DRIFT) have been applied to characterize organic matter (OM) at intact structural surfaces among others. However, it is often difficult to distinguish effects of organic components on DRIFT signal intensities from those of mineral components. The objective of this study was to re-evaluate DRIFT spectra from intact earthworm burrow walls and coated cracks to improve the interpretation of C-H and C=O bands. We compared DRIFT and transmission Fourier transform infrared (FTIR) spectra of entire samples that were from the same pedogenetic soil horizon, but different in mineral composition and texture (i.e., glacial till versus loess). Spectra of incinerated samples were subtracted from the original spectra. Transmission FTIR and DRIFT spectra were almost identical for entire soil samples. However, the DRIFT spectra were affected by the bulk mode bands (i.e., wavenumbers 2000 to 1700 cm-1) that affected spectral resolution and reproducibility. The ratios between C-H and C=O band intensities as indicator for OM quality obtained with DRIFT were smaller than those obtained from transmission FTIR. A spectral subtraction procedure was found to reduce effects of mineral absorption bands on DRIFT spectra allowing an improved interpretation. DRIFT spectroscopy as a non-destructive method for analyzing OM composition at intact surfaces in structured soils could be calibrated with information obtained with the more detailed transmission FTIR and complementary methods.

  20. The Atmospheric Scanning Electron Microscope with open sample space observes dynamic phenomena in liquid or gas.

    Science.gov (United States)

    Suga, Mitsuo; Nishiyama, Hidetoshi; Konyuba, Yuji; Iwamatsu, Shinnosuke; Watanabe, Yoshiyuki; Yoshiura, Chie; Ueda, Takumi; Sato, Chikara

    2011-12-01

    Although conventional electron microscopy (EM) requires samples to be in vacuum, most chemical and physical reactions occur in liquid or gas. The Atmospheric Scanning Electron Microscope (ASEM) can observe dynamic phenomena in liquid or gas under atmospheric pressure in real time. An electron-permeable window made of pressure-resistant 100 nm-thick silicon nitride (SiN) film, set into the bottom of the open ASEM sample dish, allows an electron beam to be projected from underneath the sample. A detector positioned below captures backscattered electrons. Using the ASEM, we observed the radiation-induced self-organization process of particles, as well as phenomena accompanying volume change, including evaporation-induced crystallization. Using the electrochemical ASEM dish, we observed tree-like electrochemical depositions on the cathode. In silver nitrate solution, we observed silver depositions near the cathode forming incidental internal voids. The heated ASEM dish allowed observation of patterns of contrast in melting and solidifying solder. Finally, to demonstrate its applicability for monitoring and control of industrial processes, silver paste and solder paste were examined at high throughput. High resolution, imaging speed, flexibility, adaptability, and ease of use facilitate the observation of previously difficult-to-image phenomena, and make the ASEM applicable to various fields. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Second Harmonic Imaging improves Echocardiograph Quality on board the International Space Station

    Science.gov (United States)

    Garcia, Kathleen; Sargsyan, Ashot; Hamilton, Douglas; Martin, David; Ebert, Douglas; Melton, Shannon; Dulchavsky, Scott

    2008-01-01

    Ultrasound (US) capabilities have been part of the Human Research Facility (HRF) on board the International Space Station (ISS) since 2001. The US equipment on board the ISS includes a first-generation Tissue Harmonic Imaging (THI) option. Harmonic imaging (HI) is the second harmonic response of the tissue to the ultrasound beam and produces robust tissue detail and signal. Since this is a first-generation THI, there are inherent limitations in tissue penetration. As a breakthrough technology, HI extensively advanced the field of ultrasound. In cardiac applications, it drastically improves endocardial border detection and has become a common imaging modality. US images were captured and stored as JPEG stills from the ISS video downlink. US images with and without the harmonic imaging option were randomized and provided to volunteers without medical education or US skills for identification of the endocardial border. The results were processed and analyzed using applicable statistical calculations. Measurements in US images using HI showed improved consistency and reproducibility among observers when compared to fundamental imaging. HI has been embraced by the imaging community at large as it improves the quality and data validity of US studies, especially in difficult-to-image cases. Even with the limitations of the first generation THI, HI improved the quality and measurability of many of the downlinked images from the ISS and should be an option utilized with cardiac imaging on board the ISS in all future space missions.

  2. Improving the Discoverability and Availability of Sample Data and Imagery in NASA's Astromaterials Curation Digital Repository Using a New Common Architecture for Sample Databases

    Science.gov (United States)

    Todd, N. S.; Evans, C.

    2015-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (JSC) is the designated facility for curating all of NASA's extraterrestrial samples. The suite of collections includes the lunar samples from the Apollo missions, cosmic dust particles falling into the Earth's atmosphere, meteorites collected in Antarctica, comet and interstellar dust particles from the Stardust mission, asteroid particles from the Japanese Hayabusa mission, and solar wind atoms collected during the Genesis mission. To support planetary science research on these samples, NASA's Astromaterials Curation Office hosts the Astromaterials Curation Digital Repository, which provides descriptions of the missions and collections, and critical information about each individual sample. Our office is implementing several informatics initiatives with the goal of better serving the planetary research community. One of these initiatives aims to increase the availability and discoverability of sample data and images through the use of a newly designed common architecture for Astromaterials Curation databases.

  3. Sport Sampling Is Associated With Improved Landing Technique in Youth Athletes.

    Science.gov (United States)

    DiStefano, Lindsay J; Beltz, Eleanor M; Root, Hayley J; Martinez, Jessica C; Houghton, Andrew; Taranto, Nicole; Pearce, Katherine; McConnell, Erin; Muscat, Courtney; Boyle, Steve; Trojian, Thomas H

    Sport sampling is recommended to promote fundamental movement skill acquisition and physical activity. In contrast, sport specialization is associated with musculoskeletal injury risk, burnout, and attrition from sport. There is limited evidence to support the influence of sport sampling on neuromuscular control, which is associated with injury risk, in youth athletes. Athletes who participated in only 1 sport during the previous year would demonstrate higher Landing Error Scoring System (LESS) scores than their counterparts. Cross-sectional study. Level 3. A total of 355 youth athletes (age range, 8-14 years) completed a test session with a jump-landing task, which was evaluated using the LESS. Participants were categorized as single sport (SS) or multisport (MS) based on their self-reported sport participation in the past year. Their duration of sport sampling (low, moderate, high) was determined based on their sport participation history. Participants were dichotomized into good (LESS sampling duration (low, moderate, high). The MS group was 2.5 times (95% CI, 1.9-3.1) as likely to be categorized as having good control compared with the SS group (χ²(355) = 10.10, P sampling duration group were 5.8 times (95% CI, 3.1-8.5) and 5.4 times (95% CI, 4.0-6.8) as likely to be categorized as having good control compared with the moderate and low groups (χ²(216) = 11.20, P sampling at a young age is associated with improved neuromuscular control, which may reduce injury risk in youth athletes. Youth athletes should be encouraged to try participating in multiple sports to enhance their neuromuscular control and promote long-term physical activity.

  4. Improved removal of blood contamination from ThinPrep cervical cytology samples for Raman spectroscopic analysis.

    Science.gov (United States)

    Traynor, Damien; Duraipandian, Shiyamala; Martin, Cara M; O'Leary, John J; Lyng, Fiona M

    2018-05-01

    There is an unmet need for methods to help in the early detection of cervical precancer. Optical spectroscopy-based techniques, such as Raman spectroscopy, have shown great potential for diagnosis of different cancers, including cervical cancer. However, relatively few studies have been carried out on liquid-based cytology (LBC) pap test specimens and confounding factors, such as blood contamination, have been identified. Previous work reported a method to remove blood contamination before Raman spectroscopy by pretreatment of the slides with hydrogen peroxide. The aim of the present study was to extend this work to excessively bloody samples to see if these could be rendered suitable for Raman spectroscopy. LBC ThinPrep specimens were treated by adding hydrogen peroxide directly to the vial before slide preparation. Good quality Raman spectra were recorded from negative and high grade (HG) cytology samples with no blood contamination and with heavy blood contamination. Good classification between negative and HG cytology could be achieved for samples with no blood contamination (sensitivity 92%, specificity 93%) and heavy blood contamination (sensitivity 89%, specificity 88%) with poorer classification when samples were combined (sensitivity 82%, specificity 87%). This study demonstrates for the first time the improved potential of Raman spectroscopy for analysis of ThinPrep specimens regardless of blood contamination. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  5. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
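
    The variance-reduction idea can be illustrated on a toy problem (not the paper's communication-system model): estimating the tail probability of Gaussian noise once by plain Monte Carlo and once by importance sampling with a mean-translated density, each sample being re-weighted by the likelihood ratio of the true to the biased density.

      # Plain Monte Carlo vs. mean-translated importance sampling for P(X > t), X ~ N(0,1).
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(5)
      t, n = 4.0, 100_000                      # threshold and number of trials

      # Plain Monte Carlo: very few samples ever exceed t, so the estimate is noisy.
      p_mc = np.mean(rng.normal(0.0, 1.0, n) > t)

      # Importance sampling: draw from N(t, 1) so the rare region is hit often, then
      # re-weight each sample by w(x) = phi(x) / phi_t(x) = exp(-t*x + t^2/2).
      x_is = rng.normal(t, 1.0, n)
      weights = np.exp(-t * x_is + 0.5 * t * t)
      p_is = np.mean((x_is > t) * weights)

      print(f"exact   : {norm.sf(t):.3e}")
      print(f"plain MC: {p_mc:.3e}")
      print(f"IS      : {p_is:.3e}")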

  6. Recent Improvements at CEA on Trace Analysis of Actinides in Environmental Samples

    International Nuclear Information System (INIS)

    Pointurier, F.; Hubert, A.; Faure, A.L.; Pottin, A.C.; Mourier, W.; Marie, O.

    2010-01-01

    In this paper, we present some results of R and D work conducted at CEA to improve, on the one hand, the performance of the techniques already in use for the detection of undeclared activities and, on the other hand, to develop new capabilities, either as alternatives to the existing techniques or as new methods that bring new information, complementary to the isotopic composition. For the trace analysis of plutonium in swipe samples by ICP-MS, we demonstrate that a thorough knowledge of the background in the actinide mass range is highly desirable. In order to avoid false plutonium detection in the femtogram range, corrections for polyatomic interferences including mercury, lead or iridium atoms are in some cases necessary. Efforts must be put into improving the purification procedure. Micro-Raman spectrometry allows determining the chemical composition of uranium compounds at the scale of the microscopic object, using a pre-location of the particles thanks to SEM and a relocation of these particles thanks to mathematical calculations. However, particles below 5 μm are hardly relocated, and a coupling device between the SEM and the micro-Raman spectrometer for direct Raman analysis after location of a particle of interest is currently under testing. Lastly, laser ablation - ICP-MS is an interesting technique for direct isotopic or elemental analysis of various solid samples and proves to be a suitable alternative technique for particle analysis, although precision over isotopic ratio measurement is strongly limited by the short duration and irregularity of the signals. However, sensitivity and sample throughput are high and more developments are in progress to validate and improve this method. (author)

  7. An Improved Method for High Quality Metagenomics DNA Extraction from Human and Environmental Samples

    DEFF Research Database (Denmark)

    Bag, Satyabrata; Saha, Bipasa; Mehta, Ojasvi

    2016-01-01

    and human origin samples. We introduced a combination of physical, chemical and mechanical lysis methods for proper lysis of microbial inhabitants. The community microbial DNA was precipitated by using salt and organic solvent. Both the quality and quantity of isolated DNA was compared with the existing...... methodologies and the supremacy of our method was confirmed. Maximum recovery of genomic DNA in the absence of substantial amount of impurities made the method convenient for nucleic acid extraction. The nucleic acids obtained using this method are suitable for different downstream applications. This improved...

  8. Static, Mixed-Array Total Evaporation for Improved Quantitation of Plutonium Minor Isotopes in Small Samples

    Science.gov (United States)

    Stanley, F. E.; Byerly, Benjamin L.; Thomas, Mariam R.; Spencer, Khalil J.

    2016-06-01

    Actinide isotope measurements are a critical signature capability in the modern nuclear forensics "toolbox", especially when interrogating anthropogenic constituents in real-world scenarios. Unfortunately, established methodologies, such as traditional total evaporation via thermal ionization mass spectrometry, struggle to confidently measure low abundance isotope ratios (evaporation techniques as a straightforward means of improving plutonium minor isotope measurements, which have been resistant to enhancement in recent years because of elevated radiologic concerns. Results are presented for small sample (~20 ng) applications involving a well-known plutonium isotope reference material, CRM-126a, and compared with traditional total evaporation methods.

  9. Improvement of sample preparation for input plutonium accountability measurement by isotope dilution gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Nishida, K.; Kuno, Y.; Sato, S.; Masui, J.; Li, T.K.; Parker, J.L.; Hakkila, E.A.

    1992-01-01

    The sample preparation method for the isotope dilution gamma-ray spectrometry (IDGS) technique has been further improved for simultaneously determining the plutonium concentration and isotopic composition of highly irradiated spent-fuel dissolver solutions. The improvement includes using ion-exchange filter papers (instead of resin beads, as in two previous experiments) for better separation and recovery of plutonium from fission products. The results of IDGS measurements for five dissolver solutions are in good agreement with those obtained by mass spectrometry, within ∼0.4% for the plutonium concentration and ∼0.1% for the 239Pu isotopic composition. The precision of the plutonium concentration is ∼1% with a 1-h count time. The technique could be implemented as an alternative method for input accountability and verification measurements in reprocessing plants
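
    The dilution step rests on the standard isotope-dilution relation, sketched generically below; the abundances and the measured blend ratio are made-up numbers, and the gamma-ray isotopic measurement that IDGS combines with this relation is not modelled here.

      # Generic isotope-dilution relation (sketch of the principle, not the IDGS code).
      def isotope_dilution_moles(n_spike, a_sample_A, a_sample_B,
                                 a_spike_A, a_spike_B, r_blend):
          """
          Moles of analyte in the sample, given the measured blend ratio
          r_blend = (isotope A)/(isotope B) after mixing with n_spike moles of spike.
          """
          return n_spike * (a_spike_A - r_blend * a_spike_B) / (
              r_blend * a_sample_B - a_sample_A)

      # Hypothetical example: 239Pu (A) determined against a 242Pu-enriched spike (B).
      n_pu = isotope_dilution_moles(n_spike=1.0e-6,
                                    a_sample_A=0.94, a_sample_B=0.02,
                                    a_spike_A=0.01, a_spike_B=0.98,
                                    r_blend=2.5)
      print(f"plutonium in the sample: {n_pu:.2e} mol")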

  10. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    Directory of Open Access Journals (Sweden)

    Christoff Fourie

    2014-04-01

    Full Text Available Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA. Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, consisting not only of the segmentation algorithm parameters, but also of low-level, parameterized image processing functions. Such higher dimensional search landscapes potentially allow for achieving better segmentation accuracies. The proposed method is tested with a range of low-level image transformation functions and two segmentation algorithms. The general effectiveness of such an approach is demonstrated compared to a variant only optimising segmentation algorithm parameters. Further, it is shown that the resultant search landscapes obtained from combining mid- and low-level image processing parameter domains, in our problem contexts, are sufficiently complex to warrant the use of population based stochastic search methods. Interdependencies of these two parameter domains are also demonstrated, necessitating simultaneous optimization.
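
    The expanded search landscape can be sketched schematically: a population-based optimizer tunes both a low-level image-transform parameter and the segmentation algorithm's own parameters against an empirical discrepancy measure. The objective function below is a hypothetical stand-in for running the transform and the segmenter and scoring the result against reference segments.

      # Joint tuning of a pre-processing parameter and segmentation parameters with a
      # population-based optimizer; discrepancy() is a made-up placeholder objective.
      import numpy as np
      from scipy.optimize import differential_evolution

      def discrepancy(params):
          smoothing_sigma, seg_scale, seg_compactness = params
          # A real implementation would smooth the image, segment it, and compare the
          # segments against reference geometries; here we just use a toy optimum.
          return ((smoothing_sigma - 1.2) ** 2
                  + (seg_scale - 35.0) ** 2 / 100.0
                  + (seg_compactness - 0.4) ** 2)

      bounds = [(0.0, 5.0),     # low-level transform: Gaussian smoothing sigma
                (5.0, 100.0),   # segmentation scale parameter
                (0.0, 1.0)]     # segmentation compactness parameter

      result = differential_evolution(discrepancy, bounds, seed=6, maxiter=50)
      print("best parameters (sigma, scale, compactness):", np.round(result.x, 2))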

  11. Automated Image Sampling and Classification Can Be Used to Explore Perceived Naturalness of Urban Spaces.

    Directory of Open Access Journals (Sweden)

    Roger Hyam

    Full Text Available The psychological restorative effects of exposure to nature are well established and extend to the mere viewing of images of nature. A previous study has shown that the Perceived Naturalness (PN) of images correlates with their restorative value. This study tests whether it is possible to detect the degree of PN of images using an image classifier. It takes images that have been scored by humans for PN (including a subset that has been assessed for restorative value) and passes them through the Google Vision API image classification service. The resulting labels are assigned to broad semantic classes to create a Calculated Semantic Naturalness (CSN) metric for each image. It was found that CSN correlates with PN. CSN was then calculated for a geospatial sampling of Google Street View images across the city of Edinburgh. CSN was found to correlate with PN in this sample also, indicating the technique may be useful in large scale studies. Because CSN correlates with PN, which correlates with restorativeness, it is suggested that CSN or a similar measure may be useful in automatically detecting restorative images and locations. In an exploratory aside, CSN was not found to correlate with an indicator of socioeconomic deprivation.
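
    The label-aggregation step described above can be sketched as follows; the label lists and the class mapping are made up, the Google Vision API is not called, and the resulting metric is simply rank-correlated with the human PN scores.

      # Toy Calculated Semantic Naturalness (CSN) from classifier labels.
      from scipy.stats import spearmanr

      NATURAL = {"tree", "grass", "sky", "water", "flower"}     # hypothetical mapping
      BUILT = {"building", "car", "road", "window", "sign"}

      def csn(labels):
          """Fraction of recognised labels that fall into the 'natural' class."""
          n_nat = sum(label in NATURAL for label in labels)
          n_built = sum(label in BUILT for label in labels)
          return n_nat / (n_nat + n_built) if (n_nat + n_built) else 0.0

      # Made-up example: per-image label lists and human PN scores (1-5 scale).
      image_labels = [["tree", "grass", "sky"], ["building", "road", "car"],
                      ["tree", "building", "sky"], ["road", "sign", "window"]]
      pn_scores = [4.6, 1.8, 3.2, 1.5]
      csn_scores = [csn(labels) for labels in image_labels]
      print("Spearman rho between CSN and PN:", spearmanr(csn_scores, pn_scores).correlation)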

  12. How iSamples (Internet of Samples in the Earth Sciences) Improves Sample and Data Stewardship in the Next Generation of Geoscientists

    Science.gov (United States)

    Hallett, B. W.; Dere, A. L. D.; Lehnert, K.; Carter, M.

    2016-12-01

    Vast numbers of physical samples are routinely collected by geoscientists to probe key scientific questions related to global climate change, biogeochemical cycles, magmatic processes, mantle dynamics, etc. Despite their value as irreplaceable records of nature the majority of these samples remain undiscoverable by the broader scientific community because they lack a digital presence or are not well-documented enough to facilitate their discovery and reuse for future scientific and educational use. The NSF EarthCube iSamples Research Coordination Network seeks to develop a unified approach across all Earth Science disciplines for the registration, description, identification, and citation of physical specimens in order to take advantage of the new opportunities that cyberinfrastructure offers. Even as consensus around best practices begins to emerge, such as the use of the International Geo Sample Number (IGSN), more work is needed to communicate these practices to investigators to encourage widespread adoption. Recognizing the importance of students and early career scientists in particular to transforming data and sample management practices, the iSamples Education and Training Working Group is developing training modules for sample collection, documentation, and management workflows. These training materials are made available to educators/research supervisors online at http://earthcube.org/group/isamples and can be modularized for supervisors to create a customized research workflow. This study details the design and development of several sample management tutorials, created by early career scientists and documented in collaboration with undergraduate research students in field and lab settings. Modules under development focus on rock outcrops, rock cores, soil cores, and coral samples, with an emphasis on sample management throughout the collection, analysis and archiving process. We invite others to share their sample management/registration workflows and to

  13. Improved sample treatment for the determination of insoluble soap in sewage sludge samples by liquid chromatography with fluorescence detection.

    Science.gov (United States)

    Cantarero, Samuel; Zafra-Gómez, A; Ballesteros, O; Navalón, A; Vílchez, J L; Crovetto, G; Verge, C; de Ferrer, J A

    2010-09-15

    A new selective and sensitive method for the determination of insoluble fatty acid salts (soap) in sewage sludge samples is proposed. The method involves a clean up of sample with petroleum ether, the conversion of calcium and magnesium insoluble salts into soluble potassium salts, potassium salts extraction with methanol, and a derivatization procedure previous to the liquid chromatography with fluorescence detection (LC-FLD) analysis. Three different extraction techniques (Soxhlet, microwave-assisted extraction and ultrasounds) were compared and microwave-assisted extraction (MAE) was selected as appropriate for our purpose. This allowed to reduce the extraction time and solvent waste (50 mL of methanol in contrast with 250 mL for Soxhlet procedure). The absence of matrix effect was demonstrated with two standards (C(13:0) and C(17:0)) that are not commercials and neither of them has been detected in sewage sludge samples. Therefore, it was possible to evaluate the matrix effect since both standards have similar environmental behaviour (adsorption and precipitation) to commercial soaps (C(10:0)-C(18:0)). The method was successfully applied to samples from different sources and consequently, with different composition. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  14. Improved automation of dissolved organic carbon sampling for organic-rich surface waters.

    Science.gov (United States)

    Grayson, Richard P; Holden, Joseph

    2016-02-01

    In-situ UV-Vis spectrophotometers offer the potential for improved estimates of dissolved organic carbon (DOC) fluxes for organic-rich systems such as peatlands because they are able to sample and log DOC proxies automatically through time at low cost. In turn, this could enable improved total carbon budget estimates for peatlands. The ability of such instruments to accurately measure DOC depends on a number of factors, not least of which is how absorbance measurements relate to DOC and the environmental conditions. Here we test the ability of a S::can Spectro::lyser™ for measuring DOC in peatland streams with routinely high DOC concentrations. Through analysis of the spectral response data collected by the instrument we have been able to accurately measure DOC up to 66 mg L(-1), which is more than double the original upper calibration limit for this particular instrument. A linear regression modelling approach resulted in an accuracy >95%. The greatest accuracy was achieved when absorbance values for several different wavelengths were used at the same time in the model. However, an accuracy >90% was achieved using absorbance values for a single wavelength to predict DOC concentration. Our calculations indicated that, for organic-rich systems, in-situ measurement with a scanning spectrophotometer can improve fluvial DOC flux estimates by 6 to 8% compared with traditional sampling methods. Thus, our techniques pave the way for improved long-term carbon budget calculations from organic-rich systems such as peatlands. Copyright © 2015 Elsevier B.V. All rights reserved.
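
    The calibration idea, regressing laboratory DOC concentrations on absorbance at several wavelengths and predicting DOC from new scans, can be sketched with synthetic numbers; the paper's own calibration used field samples and instrument-specific wavelengths.

      # Multi-wavelength absorbance-to-DOC calibration on synthetic data.
      import numpy as np

      rng = np.random.default_rng(7)
      true_doc = rng.uniform(5, 66, size=60)                     # mg/L, "lab-measured"
      absorbance = np.column_stack([                             # synthetic scans at
          0.012 * true_doc + rng.normal(0, 0.02, 60),            #   e.g. 254 nm
          0.008 * true_doc + rng.normal(0, 0.02, 60),            #   e.g. 350 nm
          0.003 * true_doc + rng.normal(0, 0.02, 60),            #   e.g. 450 nm
      ])

      X = np.column_stack([absorbance, np.ones(len(true_doc))])  # add an intercept
      coef, *_ = np.linalg.lstsq(X, true_doc, rcond=None)
      pred = X @ coef
      r2 = 1 - np.sum((true_doc - pred) ** 2) / np.sum((true_doc - true_doc.mean()) ** 2)
      print(f"multi-wavelength calibration R^2 = {r2:.3f}")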

  15. Improved method for solving the neutron transport problem by discretization of space and energy variables

    International Nuclear Information System (INIS)

    Bosevski, T.

    1971-01-01

    Polynomial interpolation of the neutron flux between chosen space and energy variables enables transformation of the integral transport equation into a system of linear equations with constant coefficients. The solutions of this system are the required flux values at the chosen values of the space and energy variables. The proposed improved method for solving the neutron transport problem, including its mathematical formalism, is simple and efficient, since the amount of input data needed for treating both the spatial and energy variables is reduced. The mathematical method based on this approach gives more stable solutions with a significantly decreased probability of numerical errors. A computer code based on the proposed method was used for calculations of one heavy water and one light water reactor cell, and the results were compared to the results of other very precise calculations. The proposed method was superior in terms of convergence rate, computing time and required computer memory. Discretization of the variables enabled direct comparison of theoretical and experimental results.

  16. A new Langmuir probe concept for rapid sampling of space plasma electron density

    International Nuclear Information System (INIS)

    Jacobsen, K S; Pedersen, A; Moen, J I; Bekkeng, T A

    2010-01-01

    In this paper we describe a new Langmuir probe concept that was invented for the in situ investigation of HF radar backscatter irregularities, with the capability to measure absolute electron density at a resolution sufficient to resolve the finest conceivable structure in an ionospheric plasma. The instrument consists of two or more fixed-bias cylindrical Langmuir probes whose radius is small compared to the Debye length. With this configuration, it is possible to acquire absolute electron density measurements independent of electron temperature and rocket/satellite potential. The system was flown on the ICI-2 sounding rocket to investigate the plasma irregularities which cause HF backscatter. It had a sampling rate of more than 5 kHz and successfully measured structures down to the scale of one electron gyro radius. The system can easily be adapted for any ionospheric rocket or satellite, and provides high-quality measurements of electron density at any desired resolution

  17. HI-STAR. Health Improvements through Space Technologies and Resources: Executive Summary

    Science.gov (United States)

    Finarelli, Margaret G.

    2002-01-01

    Our mission is to develop and promote a global strategy to help combat malaria using space technology. Like the tiny yet powerful mosquito, HI-STAR (Health Improvements Through Space Technologies and Resources) is a small program that aspires to make a difference. Timely detection of malaria danger zones is essential to help health authorities and policy makers make decisions about how to manage limited resources for combating malaria. In 2001, the technical support network for prevention and control of malaria epidemics published a study called 'Malaria Early Warning Systems: Concepts, Indicators and Partners.' This study, funded by Roll Back Malaria, a World Health Organization initiative, offers a framework for a monitoring and early warning system. HI-STAR seeks to build on this proposal and enhance the space elements of the suggested framework. HI-STAR focuses on malaria because it is the most common and deadly of the vector-borne diseases. Malaria also shares many commonalities with other diseases, which means the global strategy developed here may also be applicable to other parasitic diseases. HI-STAR would like to contribute to the many malaria groups already making great strides in the fight against malaria. Some examples include: Roll Back Malaria, The Special Program for Research and Training in Tropical Diseases (TDR) and the Multilateral Initiative on Malaria (MIM). Other important groups that are among the first to include space technologies in their models include: The Center for Health Application of Aerospace Related Technologies (CHAART) and Mapping Malaria Risk in Africa (MARA). Malaria is a complex and multi-faceted disease, and combating it must therefore be equally versatile. HI-STAR incorporates an interdisciplinary, international, intercultural approach. It is the work of fifty-three professionals and students from the International Space University's 2002 Summer Session Program held in California, USA.

  18. Projecting technology change to improve space technology planning and systems management

    Science.gov (United States)

    Walk, Steven Robert

    2011-04-01

    Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
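
    One example of the "recognized universal diffusion or substitution patterns" used in quantitative technology forecasting is the logistic (Fisher-Pry) substitution curve. The sketch below fits such a curve to synthetic market-share data and projects when substitution would reach 95%; the data and parameter values are illustrative only.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fisher-Pry logistic substitution: f(t) = 1 / (1 + exp(-k * (t - t0))),
# fitted to the fraction of the market already captured by a new technology.
# The data points below are synthetic.
years = np.array([2000, 2002, 2004, 2006, 2008, 2010])
share = np.array([0.05, 0.12, 0.27, 0.50, 0.73, 0.88])   # fraction substituted

def logistic(t, k, t0):
    return 1.0 / (1.0 + np.exp(-k * (t - t0)))

(k, t0), _ = curve_fit(logistic, years, share, p0=(0.5, 2005))

# Project forward: when does the new technology reach 95% substitution?
t95 = t0 + np.log(0.95 / 0.05) / k
print(f"growth rate k={k:.2f}/yr, midpoint t0={t0:.1f}, 95% reached ~{t95:.1f}")
```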

  19. Spaced education in medical residents: An electronic intervention to improve competency and retention of medical knowledge.

    Directory of Open Access Journals (Sweden)

    Jason Matos

    Full Text Available Spaced education is a novel method that improves medical education through online repetition of core principles often paired with multiple-choice questions. This model is a proven teaching tool for medical students, but its effect on resident learning is less established. We hypothesized that repetition of key clinical concepts in a "Clinical Pearls" format would improve knowledge retention in medical residents. This study investigated spaced education with particular emphasis on using a novel, email-based reinforcement program, and a randomized, self-matched design, in which residents were quizzed on medical knowledge that was either reinforced or not with electronically-administered spaced education. Both reinforced and non-reinforced knowledge was later tested with four quizzes. Overall, respondents incorrectly answered 395 of 1008 questions (0.39; 95% CI, 0.36-0.42). Incorrect response rates varied by quiz (range 0.34-0.49; p = 0.02), but not significantly by post-graduate year (PGY1 0.44, PGY2 0.33, PGY3 0.38; p = 0.08). Although there was no evidence of benefit among residents (RR = 1.01; 95% CI, 0.83-1.22; p = 0.95), we observed a significantly lower risk of incorrect responses to reinforced material among interns (RR = 0.83, 95% CI, 0.70-0.99, p = 0.04). Overall, repetition of Clinical Pearls did not statistically improve test scores amongst junior and senior residents. However, among interns, repetition of the Clinical Pearls was associated with significantly higher test scores, perhaps reflecting their greater attendance at didactic sessions and engagement with Clinical Pearls. Although the study was limited by a low response rate, we employed test and control questions within the same quiz, limiting the potential for selection bias. Further work is needed to determine the optimal spacing and content load of Clinical Pearls to maximize retention amongst medical residents. This particular protocol of spaced education, however, was unique and

  20. The Index to Marine and Lacustrine Geological Samples: Improving Sample Accessibility and Enabling Current and Future Research

    Science.gov (United States)

    Moore, C.

    2011-12-01

    The Index to Marine and Lacustrine Geological Samples is a community designed and maintained resource enabling researchers to locate and request sea floor and lakebed geologic samples archived by partner institutions. Conceived in the dawn of the digital age by representatives from U.S. academic and government marine core repositories and the NOAA National Geophysical Data Center (NGDC) at a 1977 meeting convened by the National Science Foundation (NSF), the Index is based on core concepts of community oversight, common vocabularies, consistent metadata and a shared interface. Form and content of underlying vocabularies and metadata continue to evolve according to the needs of the community, as do supporting technologies and access methodologies. The Curators Consortium, now international in scope, meets at partner institutions biennially to share ideas and discuss best practices. NGDC serves the group by providing database access and maintenance, a list server, digitizing support and long-term archival of sample metadata, data and imagery. Over three decades, participating curators have performed the herculean task of creating and contributing metadata for over 195,000 sea floor and lakebed cores, grabs, and dredges archived in their collections. Some partners use the Index for primary web access to their collections while others use it to increase exposure of more in-depth institutional systems. The Index is currently a geospatially-enabled relational database, publicly accessible via Web Feature and Web Map Services, and text- and ArcGIS map-based web interfaces. To provide as much knowledge as possible about each sample, the Index includes curatorial contact information and links to related data, information and images; 1) at participating institutions, 2) in the NGDC archive, and 3) at sites such as the Rolling Deck to Repository (R2R) and the System for Earth Sample Registration (SESAR). Over 34,000 International GeoSample Numbers (IGSNs) linking to SESAR are

  1. Improved metamodel-based importance sampling for the performance assessment of radioactive waste repositories

    International Nuclear Information System (INIS)

    Cadini, F.; Gioletta, A.; Zio, E.

    2015-01-01

    In the context of a probabilistic performance assessment of a radioactive waste repository, the estimation of the probability of exceeding the dose threshold set by a regulatory body is a fundamental task. This may become difficult when the probabilities involved are very small, since the classically used sampling-based Monte Carlo methods may become computationally impractical. This issue is further complicated by the fact that the computer codes typically adopted in this context require large computational efforts, both in terms of time and memory. This work proposes an original use of a Monte Carlo-based algorithm for (small) failure probability estimation in the context of the performance assessment of a near surface radioactive waste repository. The algorithm, developed within the context of structural reliability, makes use of an estimated optimal importance density and a surrogate, kriging-based metamodel approximating the system response. On the basis of an accurate analytic analysis of the algorithm, a modification is proposed which allows further reducing the computational efforts by a more effective training of the metamodel. - Highlights: • We tackle uncertainty propagation in a radwaste repository performance assessment. • We improve a kriging-based importance sampling for estimating failure probabilities. • We justify the modification by an analytic, comparative analysis of the algorithms. • The probability of exceeding dose thresholds in radwaste repositories is estimated. • The algorithm is further improved reducing the number of its free parameters
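
    The algorithm itself combines a kriging metamodel with an estimated optimal importance density; the abstract does not give enough detail to reproduce it, but the following toy sketch illustrates the underlying importance-sampling step for a small failure probability, with a cheap analytic limit-state function standing in for the expensive performance-assessment code and an assumed shift of the sampling density toward the failure region.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the expensive performance-assessment code: "failure"
# means the computed dose exceeds the regulatory threshold. The limit-state
# function and threshold here are purely illustrative.
def dose_exceeds_threshold(x):
    return x[..., 0] + x[..., 1] > 7.0        # rare event for standard normals

# Crude Monte Carlo would need tens of millions of runs to see this event
# reliably. Importance sampling: draw from a density shifted toward the
# failure region and reweight by the likelihood ratio.
n = 20_000
shift = np.array([2.5, 2.5])                   # assumed design-point guess
z = rng.standard_normal((n, 2)) + shift        # samples from the IS density

# Likelihood ratio of the nominal N(0, I) density to the shifted density.
log_w = -0.5 * (z**2).sum(axis=1) + 0.5 * ((z - shift)**2).sum(axis=1)
w = np.exp(log_w)

p_fail = np.mean(w * dose_exceeds_threshold(z))
print(f"estimated failure probability ≈ {p_fail:.2e}")   # true value ≈ 3.7e-7
```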

  2. Tactile display landing safety and precision improvements for the Space Shuttle

    Science.gov (United States)

    Olson, John M.

    A tactile display belt using 24 electro-mechanical tactile transducers (tactors) was used to determine if a modified tactile display system, known as the Tactile Situation Awareness System (TSAS) improved the safety and precision of a complex spacecraft (i.e. the Space Shuttle Orbiter) in guided precision approaches and landings. The goal was to determine if tactile cues enhance safety and mission performance through reduced workload, increased situational awareness (SA), and an improved operational capability by increasing secondary cognitive workload capacity and human-machine interface efficiency and effectiveness. Using both qualitative and quantitative measures such as NASA's Justiz Numerical Measure and Synwork1 scores, an Overall Workload (OW) measure, the Cooper-Harper rating scale, and the China Lake Situational Awareness scale, plus Pre- and Post-Flight Surveys, the data show that tactile displays decrease OW, improve SA, counteract fatigue, and provide superior warning and monitoring capacity for dynamic, off-nominal, high concurrent workload scenarios involving complex, cognitive, and multi-sensory critical scenarios. Use of TSAS for maintaining guided precision approaches and landings was generally intuitive, reduced training times, and improved task learning effects. Ultimately, the use of a homogeneous, experienced, and statistically robust population of test pilots demonstrated that the use of tactile displays for Space Shuttle approaches and landings with degraded vehicle systems, weather, and environmental conditions produced substantial improvements in safety, consistency, reliability, and ease of operations under demanding conditions. Recommendations for further analysis and study are provided in order to leverage the results from this research and further explore the potential to reduce the risk of spaceflight and aerospace operations in general.

  3. Evaluation of SRAT Sampling Data in Support of a Six Sigma Yellow Belt Process Improvement Project

    International Nuclear Information System (INIS)

    Edwards, Thomas B.

    2005-01-01

    As part of the Six Sigma continuous improvement initiatives at the Defense Waste Processing Facility (DWPF), a Yellow Belt team was formed to evaluate the frequency and types of samples required for the Sludge Receipt and Adjustment Tank (SRAT) receipt in the DWPF. The team asked, via a technical task request, that the Statistical Consulting Section (SCS), in concert with the Immobilization Technology Section (ITS) (both groups within the Savannah River National Laboratory (SRNL)), conduct a statistical review of recent SRAT receipt results to determine if there is enough consistency in these measurements to allow for less frequent sampling. As part of this review process, key decisions made by DWPF Process Engineering that are based upon the SRAT sample measurements are outlined in this report. For a reduction in SRAT sampling to be viable, these decisions must not be overly sensitive to the additional variation that will be introduced as a result of such a reduction. Measurements from samples of SRAT receipt batches 314 through 323 were reviewed as part of this investigation into the frequency of SRAT sampling. The associated acid calculations for these batches were also studied as part of this effort. The results from this investigation showed no indication of a statistically significant relationship between the tank solids and the acid additions for these batches. One would expect that as the tank solids increase there would be a corresponding increase in acid requirements. There was, however, an indication that the predicted reduction/oxidation (REDOX) ratio (the ratio of Fe 2+ to the total Fe in the glass product) that was targeted by the acid calculations based on the SRAT receipt samples for these batches was on average 0.0253 larger than the predicted REDOX based upon Slurry Mix Evaporator (SME) measurements. This is a statistically significant difference (at the 5% significance level), and the study also suggested that the difference was due to

  4. Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem.

    Science.gov (United States)

    Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing

    2015-01-01

    The Vehicle Routing Problem (VRP) is one of the key issues in the optimization of modern logistics systems. In this paper, a modified VRP model with hard time windows is established and a Hybrid Optimization Algorithm (HOA) based on the Fractal Space Filling Curves (SFC) method and a Genetic Algorithm (GA) is introduced. In the proposed algorithm, the SFC method finds an initial feasible solution very quickly, and the GA is then used to improve that initial solution. Experimental software was developed and a large number of computations on Solomon's benchmark instances were carried out. The experimental results demonstrate the feasibility and effectiveness of the HOA.
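
    The abstract does not specify the fractal curve used, but the SFC idea can be sketched as follows: order customers along a space-filling curve through the plane (a Morton/Z-order key is used here as a simple stand-in), then cut the ordering into routes by vehicle capacity. The GA refinement stage is not shown and the data are made up.

```python
# Sketch of the space-filling-curve step for a VRP: order customers along a
# curve through the plane, then split the ordering into routes by capacity.
# A Morton (Z-order) key stands in for the paper's fractal SFC.

def morton_key(x, y, bits=16):
    """Interleave the bits of quantized x and y coordinates in [0, 1]."""
    xi, yi = int(x * (2**bits - 1)), int(y * (2**bits - 1))
    key = 0
    for b in range(bits):
        key |= ((xi >> b) & 1) << (2 * b) | ((yi >> b) & 1) << (2 * b + 1)
    return key

# (id, x, y, demand) with coordinates scaled to [0, 1]; values are made up.
customers = [(1, 0.10, 0.20, 3), (2, 0.80, 0.70, 4), (3, 0.15, 0.25, 2),
             (4, 0.90, 0.65, 5), (5, 0.50, 0.50, 3), (6, 0.45, 0.55, 4)]
capacity = 8

ordered = sorted(customers, key=lambda c: morton_key(c[1], c[2]))

routes, current, load = [], [], 0
for c in ordered:
    if load + c[3] > capacity:            # start a new vehicle when full
        routes.append(current)
        current, load = [], 0
    current.append(c[0])
    load += c[3]
routes.append(current)
print(routes)                              # initial routes for the GA to improve
```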

  5. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    International Nuclear Information System (INIS)

    Hong, Mei; Wang, Dong; Wang, Yuankun; Zeng, Xiankui; Ge, Shanshan; Yan, Hengqian; Singh, Vijay P.

    2016-01-01

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs to be improved. Using a genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, the chaotic characteristics of the model are also analyzed, which shows that the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional "wet years and dry years predictability barrier," to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking about the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated
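
    The improvements described above (genetic-algorithm tuning, extra input variables) cannot be reproduced from the abstract, but the basic phase-space reconstruction step they build on is a standard delay embedding, sketched below with a synthetic runoff series and arbitrarily chosen embedding parameters.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Takens delay embedding: rows are reconstructed phase-space vectors
    [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

# Synthetic "monthly runoff" series standing in for station data.
t = np.arange(240)
runoff = 50 + 30 * np.sin(2 * np.pi * t / 12) \
         + np.random.default_rng(1).normal(0, 5, t.size)

# Embedding dimension and delay would normally be tuned (the paper uses a
# genetic algorithm so that the forecast is less sensitive to this choice).
X = delay_embed(runoff, dim=4, tau=3)
print(X.shape)    # (231, 4) reconstructed state vectors

# A simple predictor then maps each state vector to the runoff one step ahead.
```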

  6. Space Vector Modulation for an Indirect Matrix Converter with Improved Input Power Factor

    Directory of Open Access Journals (Sweden)

    Nguyen Dinh Tuyen

    2017-04-01

    Full Text Available Pulse width modulation strategies have been developed for indirect matrix converters (IMCs) in order to improve their performance. In indirect matrix converters, the LC input filter is used to remove input current harmonics and electromagnetic interference problems. Unfortunately, due to the existence of the input filter, the input power factor is diminished, especially during operation at low voltage outputs. In this paper, a new space vector modulation (SVM) is proposed to compensate for the input power factor of the indirect matrix converter. Both computer simulation and experimental studies through hardware implementation were performed to verify the effectiveness of the proposed modulation strategy.

  7. Continuous Improvements to East Coast Abort Landings for Space Shuttle Aborts

    Science.gov (United States)

    Butler, Kevin D.

    2003-01-01

    Improvement initiatives in the areas of guidance, flight control, and mission operations provide increased capability for successful East Coast Abort Landings (ECAL). Automating manual crew procedures in the Space Shuttle's onboard guidance allows faster and more precise commanding of flight control parameters needed for successful ECALs. Automation also provides additional capability in areas not possible with manual control. Operational changes in the mission concept allow for the addition of new landing sites and different ascent trajectories that increase the regions of a successful landing. The larger regions of ECAL capability increase the safety of the crew and Orbiter.

  8. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Mei [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Wang, Dong, E-mail: wangdong@nju.edu.cn [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Wang, Yuankun; Zeng, Xiankui [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Ge, Shanshan; Yan, Hengqian [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Singh, Vijay P. [Department of Biological and Agricultural Engineering Zachry Department of Civil Engineering, Texas A & M University, College Station, TX 77843 (United States)

    2016-07-15

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs to be improved. Using a genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, the chaotic characteristics of the model are also analyzed, which shows that the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional "wet years and dry years predictability barrier," to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking about the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated

  9. Emotional Experience Improves With Age: Evidence Based on Over 10 Years of Experience Sampling

    Science.gov (United States)

    Carstensen, Laura L.; Turan, Bulent; Scheibe, Susanne; Ram, Nilam; Ersner-Hershfield, Hal; Samanez-Larkin, Gregory R.; Brooks, Kathryn P.; Nesselroade, John R.

    2012-01-01

    Recent evidence suggests that emotional well-being improves from early adulthood to old age. This study used experience-sampling to examine the developmental course of emotional experience in a representative sample of adults spanning early to very late adulthood. Participants (N = 184, Wave 1; N = 191, Wave 2; N = 178, Wave 3) reported their emotional states at five randomly selected times each day for a one week period. Using a measurement burst design, the one-week sampling procedure was repeated five and then ten years later. Cross-sectional and growth curve analyses indicate that aging is associated with more positive overall emotional well-being, with greater emotional stability and with more complexity (as evidenced by greater co-occurrence of positive and negative emotions). These findings remained robust after accounting for other variables that may be related to emotional experience (personality, verbal fluency, physical health, and demographic variables). Finally, emotional experience predicted mortality; controlling for age, sex, and ethnicity, individuals who experienced relatively more positive than negative emotions in everyday life were more likely to have survived over a 13 year period. Findings are discussed in the theoretical context of socioemotional selectivity theory. PMID:20973600

  10. Coherent optical adaptive technique improves the spatial resolution of STED microscopy in thick samples

    Science.gov (United States)

    Yan, Wei; Yang, Yanlong; Tan, Yu; Chen, Xun; Li, Yang; Qu, Junle; Ye, Tong

    2018-01-01

    Stimulated emission depletion (STED) microscopy is a far-field optical microscopy technique that can provide sub-diffraction spatial resolution. The spatial resolution of STED microscopy is determined by the specially engineered beam profile of the depletion beam and its power. However, the beam profile of the depletion beam may be distorted due to aberrations of the optical system and inhomogeneity of the specimen's optical properties, resulting in compromised spatial resolution. The situation deteriorates when thick samples are imaged. In the worst case, severe distortion of the depletion beam profile may cause complete loss of the super-resolution effect no matter how much depletion power is applied to the specimen. Previously, several adaptive optics approaches have been explored to compensate for aberrations of systems and specimens. However, it is hard to correct the complicated high-order optical aberrations of specimens. In this report, we demonstrate that the complicated distorted wavefront from a thick phantom sample can be measured by using the coherent optical adaptive technique (COAT). The full correction can effectively maintain and improve the spatial resolution in imaging thick samples. PMID:29400356

  11. Improved technique for measuring the size distribution of black carbon particles in rainwater and snow samples

    Science.gov (United States)

    Mori, T.; Moteki, N.; Ohata, S.; Koike, M.; Azuma, K. G.; Miyazaki, Y.; Kondo, Y.

    2015-12-01

    Black carbon (BC) is the strongest contributor to sunlight absorption among atmospheric aerosols. A quantitative understanding of the wet deposition of BC, which strongly affects the spatial distribution of BC, is important for improving our understanding of climate change. We have devised a technique for measuring the masses of individual BC particles in rainwater and snow samples, based on a combination of a nebulizer and a single-particle soot photometer (SP2) (Ohata et al. 2011, 2013; Schwarz et al. 2012; Mori et al. 2014). We report two important improvements to this technique: 1) we have extended the upper limit of detectable BC particle diameter from 0.9 μm to about 4.0 μm by modifying the photodetector for measuring the laser-induced incandescence signal; 2) we introduced a pneumatic nebulizer Marin-5 (Cetac Technologies Inc., Omaha, NE, USA) and experimentally confirmed its high extraction efficiency (~50%) independent of particle diameter up to 2.0 μm. Using our improved system, we simultaneously measured the size distributions of BC particles in air and rainwater in Tokyo. We observed that the size distribution of BC in rainwater was larger than that in air, indicating that large BC particles were effectively removed by precipitation. We also observed BC particles with diameters larger than 1.0 μm, indicating that further studies of wet deposition of BC will require the use of the modified SP2.

  12. Optimized, unequal pulse spacing in multiple echo sequences improves refocusing in magnetic resonance.

    Science.gov (United States)

    Jenista, Elizabeth R; Stokes, Ashley M; Branca, Rosa Tamara; Warren, Warren S

    2009-11-28

    A recent quantum computing paper (G. S. Uhrig, Phys. Rev. Lett. 98, 100504 (2007)) analytically derived optimal pulse spacings for a multiple spin echo sequence designed to remove decoherence in a two-level system coupled to a bath. The spacings in what has been called a "Uhrig dynamic decoupling (UDD) sequence" differ dramatically from the conventional, equal pulse spacing of a Carr-Purcell-Meiboom-Gill (CPMG) multiple spin echo sequence. The UDD sequence was derived for a model that is unrelated to magnetic resonance, but was recently shown theoretically to be more general. Here we show that the UDD sequence has theoretical advantages for magnetic resonance imaging of structured materials such as tissue, where diffusion in compartmentalized and microstructured environments leads to fluctuating fields on a range of different time scales. We also show experimentally, both in excised tissue and in a live mouse tumor model, that optimal UDD sequences produce different T(2)-weighted contrast than do CPMG sequences with the same number of pulses and total delay, with substantial enhancements in most regions. This permits improved characterization of low-frequency spectral density functions in a wide range of applications.
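
    For reference, Uhrig's analytic pulse positions over a total evolution time T are t_j = T sin^2(pi j / (2N + 2)) for j = 1..N, compared with the equally spaced CPMG times t_j = T(2j - 1)/(2N). A short sketch of both timing patterns follows; the values of T and N are arbitrary.

```python
import numpy as np

def cpmg_times(n_pulses, total_time):
    """Equally spaced CPMG refocusing pulses: t_j = T * (2j - 1) / (2N)."""
    j = np.arange(1, n_pulses + 1)
    return total_time * (2 * j - 1) / (2 * n_pulses)

def udd_times(n_pulses, total_time):
    """Uhrig dynamic decoupling: t_j = T * sin^2(pi * j / (2N + 2))."""
    j = np.arange(1, n_pulses + 1)
    return total_time * np.sin(np.pi * j / (2 * n_pulses + 2)) ** 2

T, N = 40.0, 8          # total delay (ms) and number of pi pulses (arbitrary)
print("CPMG:", np.round(cpmg_times(N, T), 2))
print("UDD: ", np.round(udd_times(N, T), 2))
# UDD crowds pulses toward the start and end of the echo train, which changes
# the effective filter function and hence the T(2)-weighted contrast.
```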

  13. Extreme robustness of scaling in sample space reducing processes explains Zipf’s law in diffusion on directed networks

    International Nuclear Information System (INIS)

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2016-01-01

    It has been shown recently that a specific class of path-dependent stochastic processes, which reduce their sample space as they unfold, lead to exact scaling laws in frequency and rank distributions. Such sample space reducing processes offer an alternative new mechanism to understand the emergence of scaling in countless processes. The corresponding power law exponents were shown to be related to noise levels in the process. Here we show that the emergence of scaling is not limited to the simplest SSRPs, but holds for a huge domain of stochastic processes that are characterised by non-uniform prior distributions. We demonstrate mathematically that in the absence of noise the scaling exponents converge to −1 (Zipf’s law) for almost all prior distributions. As a consequence it becomes possible to fully understand targeted diffusion on weighted directed networks and its associated scaling laws in node visit distributions. The presence of cycles can be properly interpreted as playing the same role as noise in SSRPs and, accordingly, determine the scaling exponents. The result that Zipf’s law emerges as a generic feature of diffusion on networks, regardless of its details, and that the exponent of visiting times is related to the amount of cycles in a network could be relevant for a series of applications in traffic-, transport- and supply chain management. (paper)
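
    A minimal simulation makes the noise-free result concrete: starting from the largest state, repeatedly jump to a state drawn uniformly from those below the current one, restarting when the lowest state is reached; the visit frequencies of the states then follow Zipf's law with an exponent close to -1. The sketch below (with an arbitrary N and number of restarts) fits that exponent.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(42)

# Minimal sample space reducing process: start at the top state N, jump to a
# state drawn uniformly from those strictly below the current one, and restart
# at N once state 1 is reached. Visit frequencies of states follow ~ 1/rank.
N, n_restarts = 1000, 20_000
visits = Counter()

for _ in range(n_restarts):
    state = N
    while state > 1:
        state = rng.integers(1, state)   # uniform on {1, ..., state - 1}
        visits[state] += 1

states = np.arange(1, N)
freq = np.array([visits[s] for s in states], dtype=float)

# Log-log slope of frequency vs. state index should be close to -1 (Zipf).
mask = freq > 0
slope = np.polyfit(np.log(states[mask]), np.log(freq[mask]), 1)[0]
print(f"fitted exponent ≈ {slope:.2f}")
```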

  14. Improving Sensorimotor Adaptation Following Long Duration Space Flight by Enhancing Vestibular Information Transfer

    Science.gov (United States)

    Mulavara, A. P.; Kofman, I. S.; De Dios, Y. E; Galvan, R.; Goel, R.; Miller, C.; Peters, B.; Cohen, H. S.; Jeevarajan, J.; Reschke, M.

    2014-01-01

    Crewmembers adapted to the microgravity state may need to egress the vehicle within a few minutes for safety and operational reasons after gravitational transitions. The transition from one sensorimotor state to another involves two main mechanisms, strategic and plastic-adaptive, both of which have been demonstrated in astronauts returning after long duration space flight. Strategic modifications represent "early adaptation" - immediate and transitory changes in control that are employed to deal with short-term changes in the environment. If these modifications are prolonged, then plastic-adaptive changes are evoked that modify central nervous system function, automating new behavioral responses. More importantly, this longer-term adaptive recovery mechanism was significantly associated with the astronauts' strategic ability to recover on the first day after return to Earth G. We are developing a method based on stochastic resonance to enhance information transfer by improving the brain's ability to detect vestibular signals (Vestibular Stochastic Resonance, VSR), especially when combined with balance training exercises such as sensorimotor adaptability (SA) training, for rapid improvement in functional skills for standing and mobility. This countermeasure to improve detection of vestibular signals is a wearable/portable stimulus delivery system providing low, imperceptible levels of white-noise-based binaural bipolar electrical stimulation of the vestibular system (stochastic vestibular stimulation). To determine the efficacy of vestibular stimulation on physiological and perceptual responses during otolith-canal conflicts and dynamic perturbations, we have conducted a series of studies: We have shown that imperceptible binaural bipolar electrical stimulation of the vestibular system across the mastoids enhances balance performance in the mediolateral (ML) plane while standing on an unstable surface. We have followed up on the previous study showing VSR stimulation improved balance

  15. MapSentinel: Can the Knowledge of Space Use Improve Indoor Tracking Further?

    Directory of Open Access Journals (Sweden)

    Ruoxi Jia

    2016-04-01

    Full Text Available Estimating an occupant’s location is arguably the most fundamental sensing task in smart buildings. The applications for fine-grained, responsive building operations require the location sensing systems to provide location estimates in real time, also known as indoor tracking. Existing indoor tracking systems require occupants to carry specialized devices or install programs on their smartphone to collect inertial sensing data. In this paper, we propose MapSentinel, which performs non-intrusive location sensing based on WiFi access points and ultrasonic sensors. MapSentinel combines the noisy sensor readings with the floormap information to estimate locations. One key observation supporting our work is that occupants exhibit distinctive motion characteristics at different locations on the floormap, e.g., constrained motion along the corridor or in the cubicle zones, and free movement in the open space. While extensive research has been performed on using a floormap as a tool to obtain correct walking trajectories without wall-crossings, there have been few attempts to incorporate the knowledge of space use available from the floormap into the location estimation. This paper argues that the knowledge of space use as an additional information source presents new opportunities for indoor tracking. The fusion of heterogeneous information is theoretically formulated within the Factor Graph framework, and the Context-Augmented Particle Filtering algorithm is developed to efficiently solve real-time walking trajectories. Our evaluation in a large office space shows that the MapSentinel can achieve accuracy improvement of 31.3% compared with the purely WiFi-based tracking system.
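
    The full Factor Graph formulation and the Context-Augmented Particle Filtering algorithm are not reproduced here, but the core "map as context" idea can be sketched with a toy one-dimensional particle filter in which the floormap forbids wall-crossing moves and a noisy WiFi-like position measurement updates the particle weights; all numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 1-D corridor with a wall at x = 4.5 that particles cannot cross.
# The floormap constrains motion (context), and a noisy position-related
# measurement (e.g. WiFi-derived range) updates the particle weights.
WALL = 4.5
n_particles = 500
particles = rng.uniform(0.0, 4.0, n_particles)   # initial positions
weights = np.ones(n_particles) / n_particles

def step(particles, weights, measured_pos, motion_std=0.3, meas_std=0.5):
    proposed = particles + rng.normal(0.0, motion_std, particles.size)
    # Context constraint: a move that crosses the wall is impossible.
    crossed = (particles < WALL) & (proposed >= WALL)
    proposed = np.where(crossed, particles, proposed)
    # Measurement update with a Gaussian likelihood.
    w = weights * np.exp(-0.5 * ((proposed - measured_pos) / meas_std) ** 2)
    w /= w.sum()
    # Systematic resampling keeps the particle set well spread.
    idx = np.searchsorted(np.cumsum(w), (rng.random() + np.arange(w.size)) / w.size)
    return proposed[idx], np.full(w.size, 1.0 / w.size)

for true_pos in [2.0, 2.4, 2.9, 3.3, 3.8]:
    z = true_pos + rng.normal(0.0, 0.5)          # simulated noisy measurement
    particles, weights = step(particles, weights, z)
    print(f"estimate ≈ {particles.mean():.2f} (true {true_pos})")
```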

  16. SU-F-J-158: Respiratory Motion Resolved, Self-Gated 4D-MRI Using Rotating Cartesian K-Space Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Han, F; Zhou, Z; Yang, Y; Sheng, K; Hu, P [UCLA School of Medicine, Los Angeles, CA (United States)

    2016-06-15

    Purpose: Dynamic MRI has been used to quantify respiratory motion of abdominal organs in radiation treatment planning. Many existing 4D-MRI methods based on 2D acquisitions suffer from limited slice resolution and additional stitching artifacts when evaluated in 3D [1]. To address these issues, we developed a 4D-MRI (3D dynamic) technique with true 3D k-space encoding and respiratory motion self-gating. Methods: The 3D k-space was acquired using a Rotating Cartesian K-space (ROCK) pattern, where the Cartesian grid was reordered in a quasi-spiral fashion with each spiral arm rotated using the golden angle [2]. Each quasi-spiral arm started with the k-space center-line, which was used as a self-gating [3] signal for respiratory motion estimation. The acquired k-space data were then binned into 8 respiratory phases, and the golden angle ensures near-uniform k-space sampling in each phase. Finally, dynamic 3D images were reconstructed using the ESPIRiT technique [4]. 4D-MRI was performed on 6 healthy volunteers, using the following parameters (bSSFP, Fat-Sat, TE/TR=2ms/4ms, matrix size=500×350×120, resolution=1×1×1.2mm, TA=5min, 8 respiratory phases). Supplemental 2D real-time images were acquired in 9 different planes. Dynamic locations of the diaphragm dome and left kidney were measured from both 4D and 2D images. The same protocol was also performed on an MRI-compatible motion phantom where the motion was programmed with different amplitudes (10–30mm) and frequencies (3–10/min). Results: High resolution 4D-MRI was obtained successfully in 5 minutes. Quantitative motion measurements from 4D-MRI agree with those from 2D CINE (<5% error). The 4D images are free of stitching artifacts, and their near-isotropic resolution facilitates 3D visualization and segmentation of abdominal organs such as the liver, kidney and pancreas. Conclusion: Our preliminary studies demonstrated a novel ROCK 4D-MRI technique with true 3D k-space encoding and respiratory
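
    A schematic sketch of the two reordering ideas named above follows: successive quasi-spiral arms rotated by a golden-angle increment, and readouts binned into eight respiratory phases from a self-gating signal. The golden-angle value, timing, and amplitude-based binning below are assumptions for illustration and do not reproduce the actual ROCK trajectory.

```python
import numpy as np

# Assumed golden-angle increment commonly quoted for rotated k-space
# orderings; the exact value and trajectory used by ROCK are not reproduced.
GOLDEN_ANGLE_DEG = 111.25

n_arms = 64
arm_angle_deg = (np.arange(n_arms) * GOLDEN_ANGLE_DEG) % 360.0

# Toy self-gating signal: the k-space center line acquired with each arm
# tracks respiration (here a sinusoid, ~12 breaths/min, one arm per 0.25 s).
t = np.arange(n_arms) * 0.25
self_gating = np.sin(2 * np.pi * t / 5.0)

# Bin arms into 8 respiratory phases by self-gating amplitude (quantiles).
n_phases = 8
edges = np.quantile(self_gating, np.linspace(0.0, 1.0, n_phases + 1))
phase = np.digitize(self_gating, edges[1:-1])

for p in range(n_phases):
    arms = np.flatnonzero(phase == p)
    print(f"respiratory phase {p}: {arms.size} arms, "
          f"first angles {np.round(arm_angle_deg[arms[:3]], 1)} deg")
```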

  17. Experts’ analysis of the improvement spaces of the first phase of reform in health system financial management: A qualitative study

    Directory of Open Access Journals (Sweden)

    P. Bastani

    2016-04-01

    Full Text Available Background: Health financial reforms began in 2005 and proceeded through four phases in order to achieve maximum efficiency and effectiveness in this sector. The first phase was the implementation of accrual accounting instead of the cash method. Objective: The aim of this study was to determine the most important improvement spaces of the first phase of reform in financial management (accrual accounting) from the viewpoint of financial experts employed at the middle and operational levels of Universities of Medical Sciences. Methods: This qualitative study was conducted in Universities of Medical Sciences in 2013 using a non-probability (snowball) sampling method. Saturation was achieved only after 25 semi-structured interviews. Data were analyzed using content analysis following the Kruger model. Findings: Seven areas of improvement, including staff, managers, information system, organizational culture, structure, process, and finances, were identified as main themes. Each theme contained several sub-themes. Conclusion: Attempts and planning should be considered by decision makers in order to improve modifiable determinants through practical mechanisms in the first phase of health system financial management.

  18. Improving the UNC Passive Aerosol Sampler Model Based on Comparison with Commonly Used Aerosol Sampling Methods.

    Science.gov (United States)

    Shirdel, Mariam; Andersson, Britt M; Bergdahl, Ingvar A; Sommar, Johan N; Wingfors, Håkan; Liljelind, Ingrid E

    2018-03-12

    In an occupational environment, passive sampling could be an alternative to active sampling with pumps for sampling of dust. One passive sampler is the University of North Carolina passive aerosol sampler (UNC sampler). It is often analysed by microscopic imaging. Promising results have been shown for particles above 2.5 µm, but indicate large underestimations for PM2.5. The aim of this study was to evaluate, and possibly improve, the UNC sampler for stationary sampling in a working environment. Sampling was carried out at 8-h intervals during 24 h in four locations in an open pit mine with UNC samplers, respirable cyclones, PM10 and PM2.5 impactors, and an aerodynamic particle sizer (APS). The wind was minimal. For quantification, two modifications of the UNC sampler analysis model, UNC sampler with hybrid model and UNC sampler with area factor, were compared with the original one, UNC sampler with mesh factor derived from wind tunnel experiments. The effect of increased resolution for the microscopic imaging was examined. Use of the area factor and a higher resolution eliminated the underestimation for PM10 and PM2.5. The model with area factor had the overall lowest deviation versus the impactor and the cyclone. The intraclass correlation (ICC) showed that the UNC sampler had a higher precision and better ability to distinguish between different exposure levels compared to the cyclone (ICC: 0.51 versus 0.24), but lower precision compared to the impactor (PM10: 0.79 versus 0.99; PM2.5: 0.30 versus 0.45). The particle size distributions as calculated from the different UNC sampler analysis models were visually compared with the distributions determined by APS. The distributions were obviously different when the UNC sampler with mesh factor was used but came to a reasonable agreement when the area factor was used. High resolution combined with a factor based on area only, results in no underestimation of small particles compared to impactors and cyclones and a

  19. Signal improvement in multiphoton microscopy by reflection with simple mirrors near the sample

    Science.gov (United States)

    Rehberg, Markus; Krombach, Fritz; Pohl, Ulrich; Dietzel, Steffen

    2010-03-01

    In conventional fluorescence or confocal microscopy, emitted light is generated not only in the focal plane but also above and below it. The situation is different in multiphoton-induced fluorescence and multiphoton-induced higher harmonic generation. Here, restriction of signal generation to a single focal point means that all emitted photons can contribute to image formation if collected, regardless of their path through the specimen. Often, the intensity of the emitted light is rather low in biological specimens. We present a method to significantly increase the fraction of photons collected by an epi (backward) detector by placing a simple mirror, an aluminum-coated coverslip, directly under the sample. Samples investigated include fluorescent test slides, collagen gels, and thin-layered, intact mouse skeletal muscles. Quantitative analysis revealed an intensity increase of second- and third-harmonic generated signal in skeletal muscle of nine- and sevenfold respectively, and of fluorescent signal in test slides of up to twofold. Our approach thus allows significant signal improvement also for situations where forward detection is impossible, e.g., due to the anatomy of animals in intravital microscopy.

  20. AST: an automated sequence-sampling method for improving the taxonomic diversity of gene phylogenetic trees.

    Science.gov (United States)

    Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying

    2014-01-01

    A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have tested it to solve four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php.
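
    The abstract describes AST only at a high level, so the sketch below shows one simple way to subsample a pool of homologous sequences for taxonomic diversity: group sequences by taxon and pick representatives in round-robin order until a budget of OTUs is reached, so that heavily sequenced taxa do not dominate. Sequence identifiers and taxa are invented, and this is not the algorithm implemented in AST itself.

```python
from collections import defaultdict
from itertools import cycle

# Hypothetical pool of homologous sequences labelled with their source taxon.
pool = [
    ("seq1", "Escherichia coli"), ("seq2", "Escherichia coli"),
    ("seq3", "Escherichia coli"), ("seq4", "Bacillus subtilis"),
    ("seq5", "Bacillus subtilis"), ("seq6", "Synechocystis sp."),
    ("seq7", "Thermus thermophilus"), ("seq8", "Arabidopsis thaliana"),
]
budget = 5          # maximum number of OTUs the tree should contain

# Group sequences by taxon, then take one sequence per taxon in round-robin
# order so that heavily sequenced taxa (e.g. model organisms) do not dominate.
by_taxon = defaultdict(list)
for seq_id, taxon in pool:
    by_taxon[taxon].append(seq_id)

selected = []
queues = {t: iter(seqs) for t, seqs in by_taxon.items()}
for taxon in cycle(list(by_taxon)):
    if len(selected) == budget or not queues:
        break
    if taxon not in queues:
        continue
    try:
        selected.append(next(queues[taxon]))
    except StopIteration:
        del queues[taxon]

print(selected)     # one sequence per taxon before any taxon repeats
```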

  1. Multipurpose Cooling Garment for Improved Space Suit Environmental Control, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Future manned space exploration missions will require space suits with capabilities beyond the current state of the art. Portable Life Support Systems for these...

  2. Post-Flight Microbial Analysis of Samples from the International Space Station Water Recovery System and Oxygen Generation System

    Science.gov (United States)

    Birmele, Michele N.

    2011-01-01

    The Regenerative Environmental Control and Life Support System (ECLSS) on the International Space Station (ISS) includes the Water Recovery System (WRS) and the Oxygen Generation System (OGS). The WRS consists of a Urine Processor Assembly (UPA) and a Water Processor Assembly (WPA). This report describes microbial characterization of wastewater and surface samples collected from the WRS and OGS subsystems and returned to KSC, JSC, and MSFC on consecutive shuttle flights (STS-129 and STS-130) in 2009-10. STS-129 returned two filters that contained fluid samples from the WPA Waste Tank Orbital Recovery Unit (ORU), one from the waste tank and the other from the ISS humidity condensate. Direct count by microscopic enumeration revealed 8.38 x 10(4) cells per mL in the humidity condensate sample, but none of those cells were recoverable on solid agar media. In contrast, 3.32 x 10(5) cells per mL were measured from a surface swab of the WRS waste tank, including viable bacteria and fungi recovered after S12 days of incubation on solid agar media. Based on rDNA sequencing and phenotypic characterization, a fungus recovered from the filter was determined to be Lecythophora mutabilis. The bacterial isolate was identified by rDNA sequence data to be Methylobacterium radiotolerans. Additional UPA subsystem samples were returned on STS-130 for analysis. Both liquid and solid samples were collected from the Russian urine container (EDV), Distillation Assembly (DA) and Recycle Filter Tank Assembly (RFTA) for post-flight analysis. The bacterium Pseudomonas aeruginosa and the fungus Chaetomium brasiliense were isolated from the EDV samples. No viable bacteria or fungi were recovered from the RFTA brine samples (N = 6), but multiple samples (N = 11) from the DA and RFTA were found to contain fungal and bacterial cells. Many recovered cells have been identified to genus by rDNA sequencing and carbon source utilization profiling (BiOLOG Gen III). The presence of viable bacteria and fungi from WRS

  3. NASA Johnson Space Center's Planetary Sample Analysis and Mission Science (PSAMS) Laboratory: A National Facility for Planetary Research

    Science.gov (United States)

    Draper, D. S.

    2016-01-01

    NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division, part of the Exploration Integration and Science Directorate, houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists, most at no cost and on an informal basis. ARES has thus provided substantial leverage to many past and ongoing science projects at the national and international level. Here we propose to formalize that support via an ARES/JSC Planetary Sample Analysis and Mission Science Laboratory (PSAMS Lab). We maintain three major research capabilities: astromaterial sample analysis, planetary process simulation, and robotic-mission analog research. ARES scientists also support planning for eventual human exploration missions, including astronaut geological training. We outline our facility's capabilities and its potential service to the community at large, which, taken together with longstanding ARES experience and expertise in curation and in applied mission science, enable multi-disciplinary planetary research that is possible at no other institution. Comprehensive campaigns incorporating sample data, experimental constraints, and mission science data can be conducted under one roof.

  4. Pursuit of improvement in uranium bulk analysis at the clear facility for safeguards environmental samples

    International Nuclear Information System (INIS)

    Sakurai, S.; Takahashi, M.; Sakakibara, T.; Magara, M.; Kurosawa, S.; Esaka, F.; Takai, K.; Watanabe, K.; Usuda, S.; Adachi, T.

    2002-01-01

    Full text: In order to contribute to the IAEA strengthened safeguards system, a project was started at the Japan Atomic Energy Research Institute (JAERI) in 1998. Consequently, a clean room facility called CLEAR, the Clean Laboratory for Environmental Analysis and Research, was constructed in June 2001 at JAERI Tokai, and analytical techniques for ultra-trace nuclear materials in environmental samples are being developed there. For bulk analysis, the performance of inductively-coupled plasma mass spectrometry (ICP-MS) was mainly examined because sample preparation for ICP-MS is simpler than that for thermal ionization mass spectrometry (TIMS). Interference of polyatomic ions (such as PtAr+) and coexisting elements (such as Na) with the uranium ions, as well as mass bias caused by ICP-MS operating conditions, has been investigated for precise measurement of uranium isotope ratios. The authors have also studied the uranium blanks arising during the sample treatment process. A blank value below 10 pg uranium per sample treatment was obtained; the dominant origin was elution from the Teflon vessel surface during the acid heating step used to dry the sample. Work is in progress to minimize the blank. Compared with the process blank and the minimum uranium amount required for isotope ratio measurement by ICP-MS (ca. 10 pg for natural uranium), the swipe cotton (Texwipe-304) currently used for IAEA environmental sampling contains a much larger amount of natural uranium, at the level of several nanograms. If the amount of uranium collected on a Texwipe-304 swipe is small, sensitive and reliable isotope ratio measurement by bulk analysis will be impossible. The authors are therefore seeking alternative swipe materials with a lower uranium content. Recently, one of the authors devised an effective technique for recovery of uranium-containing particles from Texwipe-304. The technique, named the Vacuum Suction Method, uses a combination of polycarbonate membrane filters and a macro-pipette tip, which is connected to a vacuum pump

  5. Use of a holder-vacuum tube device to save on-site hands in preparing urine samples for head-space gas-chromatography, and its application to determine the time allowance for sample sealing.

    Science.gov (United States)

    Kawai, Toshio; Sumino, Kimiaki; Ohashi, Fumiko; Ikeda, Masayuki

    2011-01-01

    To facilitate urine sample preparation prior to head-space gas-chromatographic (HS-GC) analysis, urine samples containing one of five solvents (acetone, methanol, methyl ethyl ketone, methyl isobutyl ketone and toluene) at the levels of the biological exposure limits were aspirated into a vacuum tube via a holder, a device commercially available for venous blood collection (the vacuum tube method). The urine sample, 5 ml, was quantitatively transferred to a 20-ml head-space vial prior to HS-GC analysis. The loaded tubes were stored at +4 ℃ in the dark for up to 3 d. The vacuum tube method facilitated the on-site procedures of urine sample preparation for HS-GC with no significant loss of solvents from the sample and no need for skilled hands, while on-site sample preparation time was significantly reduced. Furthermore, no loss of solvents was detected during the 3-d storage, irrespective of whether the solvent was hydrophilic (acetone) or lipophilic (toluene). In a pilot application, the high performance of the vacuum tube method in sealing a sample in an air-tight space made it possible to confirm that no solvent is lost when sealing is completed within 5 min after urine voiding, and that the allowance time is as long as 30 min in the case of toluene in urine. The use of the holder-vacuum tube device not only saves hands-on work in transferring the sample to an air-tight space, but also facilitates sample storage prior to HS-GC analysis.

  6. Relating Linear and Volumetric Variables Through Body Scanning to Improve Human Interfaces in Space

    Science.gov (United States)

    Margerum, Sarah E.; Ferrer, Mike A.; Young, Karen S.; Rajulu, Sudhakar

    2010-01-01

    Designing space suits and vehicles for the diverse human population presents unique challenges for the methods of traditional anthropometry. Space suits are bulky and allow the operator to shift position within the suit, which inhibits the ability to identify body landmarks. Limited suit sizing options also cause variability in fit and performance between similarly sized individuals. Space vehicles are restrictive in volume, both for fit and for the ability to collect data. NASA's Anthropometric and Biomechanics Facility (ABF) has used 3D scanning to shift from traditional linear anthropometry toward volumetric capabilities that provide anthropometric solutions for design. Overall, the key goals are to improve human-system performance and to develop new processes to aid in the design and evaluation of space systems. Four case studies are presented that illustrate the shift from purely linear analyses to an augmented volumetric toolset to predict and analyze the human within the space suit and vehicle. The first case study involves the calculation of maximal head volume to estimate the total free volume in the helmet for proper air exchange. Traditional linear measurements resulted in an inaccurate representation of the head shape, yet limited data exist for the determination of a large head volume. Steps were first taken to identify and classify a maximum head volume, and the resulting comparisons to the estimate are presented in this paper. This study illustrates the gap between linear components of anthropometry and the need for overall volume metrics in order to provide solutions. A second case study examines the overlay of space suit scans and components onto scanned individuals to quantify fit and clearance to aid in sizing the suit to the individual. Restrictions in space suit size availability present unique challenges to optimally fit the individual within a limited sizing range while maintaining performance. Quantification of the clearance and

  7. Accuracy improvement of SPACE code using the optimization for CHF subroutine

    International Nuclear Information System (INIS)

    Yang, Chang Keun; Kim, Yo Han; Park, Jong Eun; Ha, Sang Jun

    2010-01-01

    Typically, a subroutine that calculates the critical heat flux (CHF) is included in codes for nuclear power plant safety analysis. Such a subroutine evaluates the CHF for an arbitrary set of conditions (temperature, pressure, flow rate, power, etc.). When a safety analysis is performed, the CHF is one of the most important parameters. However, the correlations used in most codes, such as the Biasi method, deviate appreciably from experimental data, and most CHF subroutines are reliable only within their range of applicability in pressure, mass flux, void fraction, and so on. Even when an accurate CHF subroutine is used in a high-quality nuclear safety analysis code, there is no assurance that the values it predicts outside its application range are acceptable. To overcome this limitation, various approaches to estimating the CHF were examined during the development of the SPACE code, and the Six Sigma technique was adopted for the examination described in this study. The objective of this study is to improve the CHF prediction accuracy of the nuclear power plant safety analysis code using a CHF database and the Six Sigma technique. The study concluded that the Six Sigma technique is useful for quantifying the deviation of predicted values from experimental data and that the CHF prediction method implemented in the SPACE code compares well with other methods
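    One way to make "quantifying the deviation of prediction values from experimental data" concrete is to summarize predicted-to-measured CHF ratios over a database and express the margin to an acceptance band in sigma units. The sketch below uses invented values and a hypothetical ±10% acceptance band; it illustrates the kind of metric involved, not the study's actual criteria.

      import numpy as np

      # Invented predicted and measured CHF values (kW/m^2) over a small database.
      predicted = np.array([1520.0, 980.0, 2210.0, 1340.0, 1760.0, 660.0])
      measured  = np.array([1480.0, 1010.0, 2150.0, 1420.0, 1700.0, 700.0])

      ratio = predicted / measured
      mean, std = ratio.mean(), ratio.std(ddof=1)

      # A Six-Sigma-style capability check against a hypothetical +/-10% acceptance band.
      lower, upper = 0.9, 1.1
      z = min(upper - mean, mean - lower) / std
      print(f"P/M ratio: mean {mean:.3f}, std {std:.3f}, capability ~ {z:.1f} sigma")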

  8. Improving the critical thinking skills of junior high school students on Earth and Space Science (ESS) materials

    Science.gov (United States)

    Marlina, L.; Liliasari; Tjasyono, B.; Hendayana, S.

    2018-05-01

    Critical thinking skills need to be developed in students. With critical thinking skills, students are able to understand concepts in greater depth, become sensitive to problems that occur, understand and solve problems in their surroundings, and apply concepts in different situations. Earth and Space Science (ESS) material is part of the science subjects taught from elementary school through college. This research tested a training program using a quantitative method. The study aims to investigate the improvement of junior high school students' critical thinking skills through training science teachers to design learning media for teaching ESS. With a sample of 24 science teachers and 32 grade 7 students, chosen by purposive sampling from a school in Ogan Ilir District, South Sumatra, the average pre-test and post-test scores for students' critical thinking skills were 52.26 and 67.06, with an average N-gain of 0.31. A survey and a critical-thinking-skills-based test were conducted to obtain the data. The results show a positive impact and an increase in students' critical thinking skills on the ESS material.
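    For reference, the reported N-gain is consistent with Hake's normalized gain computed from the quoted means (a worked check, assuming scores are on a 0-100 scale; the paper does not spell out its formula):

      N\text{-gain} = \frac{\bar{S}_{\text{post}} - \bar{S}_{\text{pre}}}{100 - \bar{S}_{\text{pre}}} = \frac{67.06 - 52.26}{100 - 52.26} = \frac{14.80}{47.74} \approx 0.31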

  9. Environmental DNA (eDNA) sampling improves occurrence and detection estimates of invasive Burmese pythons.

    Directory of Open Access Journals (Sweden)

    Margaret E Hunter

    Full Text Available Environmental DNA (eDNA) methods are used to detect DNA that is shed into the aquatic environment by cryptic or low density species. Applied in eDNA studies, occupancy models can be used to estimate occurrence and detection probabilities and thereby account for imperfect detection. However, occupancy terminology has been applied inconsistently in eDNA studies, and many have calculated occurrence probabilities while not considering the effects of imperfect detection. Low detection of invasive giant constrictors using visual surveys and traps has hampered the estimation of occupancy and detection estimates needed for population management in southern Florida, USA. Giant constrictor snakes pose a threat to native species and the ecological restoration of the Florida Everglades. To assist with detection, we developed species-specific eDNA assays using quantitative PCR (qPCR) for the Burmese python (Python molurus bivittatus), Northern African python (P. sebae), boa constrictor (Boa constrictor), and the green (Eunectes murinus) and yellow anaconda (E. notaeus). Burmese pythons, Northern African pythons, and boa constrictors are established and reproducing, while the green and yellow anaconda have the potential to become established. We validated the python and boa constrictor assays using laboratory trials and tested all species in 21 field locations distributed in eight southern Florida regions. Burmese python eDNA was detected in 37 of 63 field sampling events; however, the other species were not detected. Although eDNA was heterogeneously distributed in the environment, occupancy models were able to provide the first estimates of detection probabilities, which were greater than 91%. Burmese python eDNA was detected along the leading northern edge of the known population boundary. The development of informative detection tools and eDNA occupancy models can improve conservation efforts in southern Florida and support more extensive studies of invasive

  10. Environmental DNA (eDNA) sampling improves occurrence and detection estimates of invasive Burmese pythons.

    Science.gov (United States)

    Hunter, Margaret E; Oyler-McCance, Sara J; Dorazio, Robert M; Fike, Jennifer A; Smith, Brian J; Hunter, Charles T; Reed, Robert N; Hart, Kristen M

    2015-01-01

    Environmental DNA (eDNA) methods are used to detect DNA that is shed into the aquatic environment by cryptic or low density species. Applied in eDNA studies, occupancy models can be used to estimate occurrence and detection probabilities and thereby account for imperfect detection. However, occupancy terminology has been applied inconsistently in eDNA studies, and many have calculated occurrence probabilities while not considering the effects of imperfect detection. Low detection of invasive giant constrictors using visual surveys and traps has hampered the estimation of occupancy and detection estimates needed for population management in southern Florida, USA. Giant constrictor snakes pose a threat to native species and the ecological restoration of the Florida Everglades. To assist with detection, we developed species-specific eDNA assays using quantitative PCR (qPCR) for the Burmese python (Python molurus bivittatus), Northern African python (P. sebae), boa constrictor (Boa constrictor), and the green (Eunectes murinus) and yellow anaconda (E. notaeus). Burmese pythons, Northern African pythons, and boa constrictors are established and reproducing, while the green and yellow anaconda have the potential to become established. We validated the python and boa constrictor assays using laboratory trials and tested all species in 21 field locations distributed in eight southern Florida regions. Burmese python eDNA was detected in 37 of 63 field sampling events; however, the other species were not detected. Although eDNA was heterogeneously distributed in the environment, occupancy models were able to provide the first estimates of detection probabilities, which were greater than 91%. Burmese python eDNA was detected along the leading northern edge of the known population boundary. The development of informative detection tools and eDNA occupancy models can improve conservation efforts in southern Florida and support more extensive studies of invasive constrictors
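    For readers unfamiliar with the occupancy framework mentioned above, the sketch below fits a minimal single-season occupancy model that separates the probability a site is occupied (psi) from the probability of detecting eDNA in a single sampling event (p). The detection matrix is invented for illustration and is not the study's data or its exact model.

      import numpy as np
      from scipy.optimize import minimize

      # Illustrative detection histories: rows = sites, columns = eDNA sampling events
      # (1 = target DNA detected, 0 = not detected). Invented data.
      y = np.array([[1, 0, 1],
                    [0, 0, 0],
                    [1, 1, 1],
                    [0, 1, 0],
                    [0, 0, 0]])

      def neg_log_lik(theta):
          psi = 1.0 / (1.0 + np.exp(-theta[0]))   # occurrence probability
          p = 1.0 / (1.0 + np.exp(-theta[1]))     # per-event detection probability
          k = y.sum(axis=1)                        # detections per site
          n = y.shape[1]                           # events per site
          # Site likelihood: occupied and detected k times, plus (if never detected)
          # the possibility that the site is simply unoccupied.
          lik = psi * p**k * (1 - p)**(n - k) + (k == 0) * (1 - psi)
          return -np.sum(np.log(lik))

      fit = minimize(neg_log_lik, x0=[0.0, 0.0])
      psi_hat, p_hat = 1.0 / (1.0 + np.exp(-fit.x))
      print(f"occupancy ~ {psi_hat:.2f}, per-event detection ~ {p_hat:.2f}")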

  11. Sampling strategies to improve passive optical remote sensing of river bathymetry

    Science.gov (United States)

    Legleiter, Carl; Overstreet, Brandon; Kinzel, Paul J.

    2018-01-01

    Passive optical remote sensing of river bathymetry involves establishing a relation between depth and reflectance that can be applied throughout an image to produce a depth map. Building upon the Optimal Band Ratio Analysis (OBRA) framework, we introduce sampling strategies for constructing calibration data sets that lead to strong relationships between an image-derived quantity and depth across a range of depths. Progressively excluding observations that exceed a series of cutoff depths from the calibration process improved the accuracy of depth estimates and allowed the maximum detectable depth ($d_{max}$) to be inferred directly from an image. Depth retrieval in two distinct rivers also was enhanced by a stratified version of OBRA that partitions field measurements into a series of depth bins to avoid biases associated with under-representation of shallow areas in typical field data sets. In the shallower, clearer of the two rivers, including the deepest field observations in the calibration data set did not compromise depth retrieval accuracy, suggesting that $d_{max}$ was not exceeded and the reach could be mapped without gaps. Conversely, in the deeper and more turbid stream, progressive truncation of input depths yielded a plausible estimate of $d_{max}$ consistent with theoretical calculations based on field measurements of light attenuation by the water column. This result implied that the entire channel, including pools, could not be mapped remotely. However, truncation improved the accuracy of depth estimates in areas shallower than $d_{max}$, which comprise the majority of the channel and are of primary interest for many habitat-oriented applications.
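    A rough sketch of the progressive depth-truncation idea described above: regress depth against an image-derived quantity, repeat while excluding observations deeper than a series of cutoffs, and take the cutoff beyond which the calibration stops improving as a proxy for the maximum detectable depth. The synthetic data and the use of R² as the criterion are assumptions for illustration, not the authors' OBRA implementation.

      import numpy as np

      rng = np.random.default_rng(0)
      d = rng.uniform(0.1, 4.0, 500)                               # field depths (m), synthetic
      X = 0.8 * np.minimum(d, 2.5) + rng.normal(0, 0.05, d.size)   # image quantity saturating past ~2.5 m

      def r2(x, y):
          slope, intercept = np.polyfit(x, y, 1)
          resid = y - (slope * x + intercept)
          return 1 - resid.var() / y.var()

      # Progressively exclude observations deeper than each cutoff and track calibration strength.
      cutoffs = np.arange(0.5, 4.01, 0.25)
      scores = [r2(X[d <= c], d[d <= c]) for c in cutoffs]

      # Simple proxy for the maximum detectable depth: the cutoff where the fit is strongest.
      d_max = cutoffs[int(np.argmax(scores))]
      print(f"inferred d_max ~ {d_max:.2f} m")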

  12. Respondent driven sampling: determinants of recruitment and a method to improve point estimation.

    Directory of Open Access Journals (Sweden)

    Nicky McCreesh

    Full Text Available Respondent-driven sampling (RDS) is a variant of a link-tracing design intended for generating unbiased estimates of the composition of hidden populations that typically involves giving participants several coupons to recruit their peers into the study. RDS may generate biased estimates if coupons are distributed non-randomly or if potential recruits present for interview non-randomly. We explore if biases detected in an RDS study were due to either of these mechanisms, and propose and apply weights to reduce bias due to non-random presentation for interview. Using data from the total population, and the population to whom recruiters offered their coupons, we explored how age and socioeconomic status were associated with being offered a coupon, and, if offered a coupon, with presenting for interview. Population proportions were estimated by weighting by the assumed inverse probabilities of being offered a coupon (as in existing RDS methods), and also of presentation for interview if offered a coupon, by age and socioeconomic status group. Younger men were under-recruited primarily because they were less likely to be offered coupons. The under-recruitment of higher socioeconomic status men was due in part to them being less likely to present for interview. Consistent with these findings, weighting for non-random presentation for interview by age and socioeconomic status group greatly improved the estimate of the proportion of men in the lowest socioeconomic group, reducing the root-mean-squared error of RDS estimates of socioeconomic status by 38%, but had little effect on estimates for age. The weighting also improved estimates for tribe and religion (reducing root-mean-squared errors by 19-29%), but had little effect for sexual activity or HIV status. Data collected from recruiters on the characteristics of men to whom they offered coupons may be used to reduce bias in RDS studies. Further evaluation of this new method is required.
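    The weighting idea can be illustrated with a toy calculation: each recruit is weighted by the inverse of the product of the probability of being offered a coupon and the probability of presenting for interview, estimated within strata. The groups and probabilities below are invented and are not the study's estimates.

      import pandas as pd

      # Invented per-recruit data: group membership plus, for that recruit's stratum,
      # the estimated probabilities of being offered a coupon and of presenting if offered.
      recruits = pd.DataFrame({
          "group":       ["low", "low", "mid", "mid", "mid", "high"],
          "p_offered":   [0.60, 0.60, 0.45, 0.45, 0.45, 0.30],
          "p_presented": [0.80, 0.80, 0.70, 0.70, 0.70, 0.50],
      })

      # Weight = inverse probability of ending up in the sample (offered AND presented).
      recruits["w"] = 1.0 / (recruits["p_offered"] * recruits["p_presented"])

      # Weighted estimate of the population composition by group.
      est = recruits.groupby("group")["w"].sum() / recruits["w"].sum()
      print(est.round(3))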

  13. Droplet digital PCR improves absolute quantification of viable lactic acid bacteria in faecal samples.

    Science.gov (United States)

    Gobert, Guillaume; Cotillard, Aurélie; Fourmestraux, Candice; Pruvost, Laurence; Miguet, Jean; Boyer, Mickaël

    2018-03-14

    Analysing correlations between the observed health effects of ingested probiotics and their survival in digestive tract allows adapting their preparations for food. Tracking ingested probiotic in faecal samples requires accurate and specific tools to quantify live vs dead cells at strain level. Traditional culture-based methods are simpler to use but they do not allow quantifying viable but non-cultivable (VBNC) cells and they are poorly discriminant below the species level. We have set up a viable PCR (vPCR) assay combining propidium monoazide (PMA) treatment and either real time quantitative PCR (qPCR) or droplet digital PCR (ddPCR) to quantify a Lactobacillus rhamnosus and two Lactobacillus paracasei subsp. paracasei strains in piglet faeces. Adjustments of the PMA treatment conditions and reduction of the faecal sample size were necessary to obtain accurate discrimination between dead and live cells. The study also revealed differences of PMA efficiency among the two L. paracasei strains. Both PCR methods were able to specifically quantify each strain and provided comparable total bacterial counts. However, quantification of lower numbers of viable cells was best achieved with ddPCR, which was characterized by a reduced lower limit of quantification (improvement of up to 1.76 log 10 compared to qPCR). All three strains were able to survive in the piglets' gut with viability losses between 0.78 and 1.59 log 10 /g faeces. This study shows the applicability of PMA-ddPCR to specific quantification of small numbers of viable bacterial cells in the presence of an important background of unwanted microorganisms, and without the need to set up standard curves. It also illustrates the need to adapt PMA protocols according to the final matrix and target strain, even for closely related strains. The PMA-ddPCR approach provides a new tool to quantify bacterial survival in faecal samples from a preclinical and clinical trial. Copyright © 2018 The Authors. Published by
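    For context, droplet digital PCR turns the fraction of positive droplets into an absolute concentration through Poisson statistics, which is why no standard curve is needed. The droplet counts and the ~0.85 nl droplet volume in the sketch below are assumptions for illustration, not values from the study.

      import math

      positives, total = 1_250, 18_000      # hypothetical droplet counts
      droplet_volume_ul = 0.85e-3           # ~0.85 nl per droplet, an assumed typical value

      lam = -math.log(1 - positives / total)     # mean target copies per droplet (Poisson)
      copies_per_ul = lam / droplet_volume_ul    # concentration in the reaction
      print(f"{copies_per_ul:.0f} copies/µL of reaction")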

  14. Comparative exploration of hydrogen sulfide and water transmembrane free energy surfaces via orthogonal space tempering free energy sampling.

    Science.gov (United States)

    Lv, Chao; Aitchison, Erick W; Wu, Dongsheng; Zheng, Lianqing; Cheng, Xiaolin; Yang, Wei

    2016-03-05

    Hydrogen sulfide (H2S), a commonly known toxic gas compound, possesses unique chemical features that allow this small solute molecule to quickly diffuse through cell membranes. Taking advantage of the recent orthogonal space tempering (OST) method, we comparatively mapped the transmembrane free energy landscapes of H2S and its structural analogue, water (H2O), seeking to decipher the molecular determinants that govern their drastically different permeabilities. As revealed by our OST sampling results, in contrast to the highly polar water solute, hydrogen sulfide is evidently amphipathic, and thus inside the membrane it is favorably localized at the interfacial region, that is, the interface between the polar head-group and nonpolar acyl chain regions. Because the membrane binding affinity of H2S is mainly governed by its small hydrophobic moiety and the barrier height between the interfacial region and the membrane center is largely determined by its moderate polarity, the transmembrane free energy barriers encountered by this toxic molecule are very small. Moreover, when H2S diffuses from the bulk solution to the membrane center, the above two effects nearly cancel each other, leading to a negligible free energy difference. This study not only explains why H2S can quickly pass through cell membranes but also provides a practical illustration of how to use the OST free energy sampling method to conveniently analyze complex molecular processes. © 2015 Wiley Periodicals, Inc.

  15. Replica Exchange Gaussian Accelerated Molecular Dynamics: Improved Enhanced Sampling and Free Energy Calculation.

    Science.gov (United States)

    Huang, Yu-Ming M; McCammon, J Andrew; Miao, Yinglong

    2018-04-10

    By adding a harmonic boost potential to smooth the system potential energy surface, Gaussian accelerated molecular dynamics (GaMD) provides enhanced sampling and free energy calculation of biomolecules without the need for predefined reaction coordinates. This work continues to improve the acceleration power and energy reweighting of GaMD by combining it with replica exchange algorithms. Two versions of replica exchange GaMD (rex-GaMD) are presented: force constant rex-GaMD and threshold energy rex-GaMD. In force constant rex-GaMD simulations, the boost potential can be exchanged between replicas of different harmonic force constants with fixed threshold energy. In contrast, the threshold energy rex-GaMD algorithm switches the threshold energy between lower and upper bounds to generate different levels of boost potential. Testing simulations on three model systems, including the alanine dipeptide, chignolin, and HIV protease, demonstrate that through continuous exchanges of the boost potential, the rex-GaMD simulations not only enhance the conformational transitions of the systems but also narrow the distribution width of the applied boost potential for accurate energetic reweighting to recover biomolecular free energy profiles.
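    For readers unfamiliar with the boost potential referred to above, a minimal sketch of its standard GaMD definition and the reweighting factor it implies is given below. The threshold energy, force constant and sampled energies are arbitrary numbers chosen for illustration, not values from the paper, and the replica-exchange step itself is not reproduced here.

      import numpy as np

      kB_T = 0.593  # kcal/mol at ~298 K

      def gamd_boost(V, E, k):
          # Harmonic boost added only when the potential V lies below the threshold energy E.
          dV = 0.5 * k * (E - V) ** 2
          return np.where(V < E, dV, 0.0)

      # Hypothetical potential-energy samples from one replica (kcal/mol).
      V = np.random.default_rng(1).normal(-100.0, 2.0, 10_000)
      dV = gamd_boost(V, E=-96.0, k=0.1)

      # Energetic reweighting: each frame carries weight exp(+dV / kBT); a narrow dV
      # distribution (which rex-GaMD aims for) keeps these weights well behaved.
      w = np.exp(dV / kB_T)
      print("mean boost %.2f kcal/mol, std %.2f" % (dV.mean(), dV.std()))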

  16. REMOTE IN-CELL SAMPLING IMPROVEMENTS PROGRAM AT THE SAVANNAH RIVER SITE (SRS) DEFENSE WASTE PROCESSING FACILITY (DWPF)

    International Nuclear Information System (INIS)

    Marzolf, A

    2007-01-01

    Remote Systems Engineering (RSE) of the Savannah River National Laboratory (SRNL), in combination with Defense Waste Processing Facility (DWPF) Engineering and Operations, has evaluated the existing equipment and processes used in the facility sample cells for 'pulling' samples from the radioactive waste stream and performing in-cell equipment repairs and replacements. RSE has designed and tested equipment for improving remote in-cell sampling evolutions and reducing the time required for in-cell maintenance of existing equipment. The equipment in the present process tank sampling system has been in constant use since facility start-up over 17 years ago. At present, the method for taking samples within the sample cells produces excessive maintenance and downtime due to frequent failures of the sampling station equipment and manipulator. The location and orientation of many sampling stations within the sample cells are not conducive to manipulator operation, and the overextension of manipulators required to perform many in-cell operations is a major cause of manipulator failures. To improve sampling operations and reduce downtime due to equipment maintenance, a Portable Sampling Station (PSS), wireless in-cell cameras, and new commercially available sampling technology have been designed, developed and/or adapted and tested. The uniqueness of the designs, the results of the scoping tests, and the benefits relative to in-cell operation and reduction of waste are presented

  17. HUBBLE SPACE TELESCOPE PROPER MOTION (HSTPROMO) CATALOGS OF GALACTIC GLOBULAR CLUSTERS. I. SAMPLE SELECTION, DATA REDUCTION, AND NGC 7078 RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Bellini, A.; Anderson, J.; Van der Marel, R. P.; Watkins, L. L. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); King, I. R. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Bianchini, P. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Chanamé, J. [Instituto de Astrofísica, Pontificia Universidad Católica de Chile, Av. Vicuña Mackenna 4860, Macul 782-0436, Santiago (Chile); Chandar, R. [Department of Physics and Astronomy, The University of Toledo, 2801 West Bancroft Street, Toledo, OH 43606 (United States); Cool, A. M. [Department of Physics and Astronomy, San Francisco State University, 1600 Holloway Avenue, San Francisco, CA 94132 (United States); Ferraro, F. R.; Massari, D. [Dipartimento di Fisica e Astronomia, Università di Bologna, via Ranzani 1, I-40127 Bologna (Italy); Ford, H., E-mail: bellini@stsci.edu [Department of Physics and Astronomy, The Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States)

    2014-12-20

    We present the first study of high-precision internal proper motions (PMs) in a large sample of globular clusters, based on Hubble Space Telescope (HST) data obtained over the past decade with the ACS/WFC, ACS/HRC, and WFC3/UVIS instruments. We determine PMs for over 1.3 million stars in the central regions of 22 clusters, with a median number of ∼60,000 stars per cluster. These PMs have the potential to significantly advance our understanding of the internal kinematics of globular clusters by extending past line-of-sight (LOS) velocity measurements to two- or three-dimensional velocities, lower stellar masses, and larger sample sizes. We describe the reduction pipeline that we developed to derive homogeneous PMs from the very heterogeneous archival data. We demonstrate the quality of the measurements through extensive Monte Carlo simulations. We also discuss the PM errors introduced by various systematic effects and the techniques that we have developed to correct or remove them to the extent possible. We provide in electronic form the catalog for NGC 7078 (M 15), which consists of 77,837 stars in the central 2.'4. We validate the catalog by comparison with existing PM measurements and LOS velocities and use it to study the dependence of the velocity dispersion on radius, stellar magnitude (or mass) along the main sequence, and direction in the plane of the sky (radial or tangential). Subsequent papers in this series will explore a range of applications in globular-cluster science and will also present the PM catalogs for the other sample clusters.

  18. A radial sampling strategy for uniform k-space coverage with retrospective respiratory gating in 3D ultrashort-echo-time lung imaging.

    Science.gov (United States)

    Park, Jinil; Shin, Taehoon; Yoon, Soon Ho; Goo, Jin Mo; Park, Jang-Yeon

    2016-05-01

    The purpose of this work was to develop a 3D radial-sampling strategy which maintains uniform k-space sample density after retrospective respiratory gating, and demonstrate its feasibility in free-breathing ultrashort-echo-time lung MRI. A multi-shot, interleaved 3D radial sampling function was designed by segmenting a single-shot trajectory of projection views such that each interleaf samples k-space in an incoherent fashion. An optimal segmentation factor for the interleaved acquisition was derived based on an approximate model of respiratory patterns such that radial interleaves are evenly accepted during the retrospective gating. The optimality of the proposed sampling scheme was tested by numerical simulations and phantom experiments using human respiratory waveforms. Retrospectively, respiratory-gated, free-breathing lung MRI with the proposed sampling strategy was performed in healthy subjects. The simulation yielded the most uniform k-space sample density with the optimal segmentation factor, as evidenced by the smallest standard deviation of the number of neighboring samples as well as minimal side-lobe energy in the point spread function. The optimality of the proposed scheme was also confirmed by minimal image artifacts in phantom images. Human lung images showed that the proposed sampling scheme significantly reduced streak and ring artifacts compared with the conventional retrospective respiratory gating while suppressing motion-related blurring compared with full sampling without respiratory gating. In conclusion, the proposed 3D radial-sampling scheme can effectively suppress the image artifacts due to non-uniform k-space sample density in retrospectively respiratory-gated lung MRI by uniformly distributing gated radial views across the k-space. Copyright © 2016 John Wiley & Sons, Ltd.
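    One simple way to picture "segmenting a single-shot trajectory into incoherent interleaves" is sketched below: quasi-uniform radial view directions are generated in a fixed ordering and then assigned to interleaves by stride, so each interleaf still covers k-space broadly. The phyllotaxis-style ordering and the stride assignment are illustrative assumptions, not the authors' sampling function or their optimal segmentation factor.

      import numpy as np

      def radial_views(n):
          # Generate n quasi-uniform 3-D radial view directions (golden-angle phyllotaxis ordering).
          golden = (1 + 5 ** 0.5) / 2
          i = np.arange(n)
          z = 1 - 2 * (i + 0.5) / n            # uniform in cos(polar angle)
          phi = 2 * np.pi * i / golden         # golden-ratio increments in azimuth
          r = np.sqrt(1 - z ** 2)
          return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

      n_views, n_interleaves = 12000, 8        # the segmentation factor is a free parameter here
      views = radial_views(n_views)

      # Segment the single-shot ordering so that interleaf m takes views m, m+8, m+16, ...;
      # dropping whole interleaves during gating then thins k-space roughly uniformly.
      interleaves = [views[m::n_interleaves] for m in range(n_interleaves)]
      print([len(s) for s in interleaves])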

  19. Vitality of oligozoospermic semen samples is improved by both swim-up and density gradient centrifugation before cryopreservation.

    Science.gov (United States)

    Counsel, Madeleine; Bellinge, Rhys; Burton, Peter

    2004-05-01

    The aim was to ascertain whether washing sperm from oligozoospermic and normozoospermic samples before cryopreservation improves post-thaw vitality. Normozoospermic (n = 18) and oligozoospermic (n = 16) samples were divided into three aliquots. The first aliquot remained untreated, and the second and third aliquots were subjected to the swim-up and discontinuous density gradient sperm washing techniques, respectively. Vitality staining was performed, samples were mixed with cryopreservation media and frozen. Spermatozoa were thawed, stained, and vitality quantified and expressed as the percentage of live spermatozoa present. Post-thaw vitality in untreated aliquots from normozoospermic samples (24.9% +/- 2.3; mean +/- SEM) was significantly higher (unpaired t-tests) than in untreated oligozoospermic aliquots. Post-thaw vitality was also significantly higher after washing by swim-up in normozoospermic samples (35.6% +/- 2.1) and in oligozoospermic samples (22.4% +/- 1.0). Overall, post-thaw vitality in cryopreserved oligozoospermic samples was improved by both the swim-up and density gradient centrifugation washing techniques prior to freezing.

  20. Vapor space characterization of Waste Tank 241-C-103: Inorganic results from sample Job 7B (May 12-25, 1994)

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Pool, K.H.; Lerner, B.D.

    1994-10-01

    This report provides analytical results for use in safety and toxicological evaluations of the vapor space of Hanford single-shell waste storage tank 241-C-103. Samples were analyzed to determine concentrations of ammonia, nitric oxide, nitrogen dioxide, sulfur oxides, and hydrogen cyanide. In addition to the samples, controls were analyzed, including blanks, spiked blanks, and spiked samples. These controls provided information about the suitability of the sampling and analytical methods. Also included are information describing the methods and sampling procedures used, results of sample analyses, and conclusions and recommendations

  1. Optimizing the reconstruction filter in cone-beam CT to improve periodontal ligament space visualization: An in vitro study

    Energy Technology Data Exchange (ETDEWEB)

    Houno, Yuuki; Kodera, Yoshie [Graduate School of Medicine, Nagoya University, Nagoya (Japan); Hishikawa, Toshimitsu; Naitoh, Munetaka; Mitani, Akio; Noguchi, Toshihide; Ariji, Eiichiro [Aichi Gakuin University, Nisshin (Japan); Gotoh, Kenichi [Div. of Radiology, Dental Hospital, Aichi Gakuin University, Nisshin (Japan)

    2017-09-15

    Evaluation of alveolar bone is important in the diagnosis of dental diseases. The periodontal ligament space is difficult to clearly depict in cone-beam computed tomography images because the reconstruction filter conditions during image processing cause image blurring, resulting in decreased spatial resolution. We examined different reconstruction filters to assess their ability to improve spatial resolution and allow for a clearer visualization of the periodontal ligament space. Cone-beam computed tomography projections of 2 skull phantoms were reconstructed using 6 reconstruction conditions and then compared using the Thurstone paired comparison method. Physical evaluations, including the modulation transfer function and the Wiener spectrum, as well as an assessment of space visibility, were undertaken using experimental phantoms. Image reconstruction using a modified Shepp-Logan filter resulted in better sensory, physical, and quantitative evaluations. The reconstruction conditions substantially improved the spatial resolution and visualization of the periodontal ligament space. The difference in sensitivity was obtained by altering the reconstruction filter. Modifying the characteristics of a reconstruction filter can generate significant improvement in assessments of the periodontal ligament space. A high-frequency enhancement filter improves the visualization of thin structures and will be useful when accurate assessment of the periodontal ligament space is necessary.

  2. Optimizing the reconstruction filter in cone-beam CT to improve periodontal ligament space visualization: An in vitro study

    International Nuclear Information System (INIS)

    Houno, Yuuki; Kodera, Yoshie; Hishikawa, Toshimitsu; Naitoh, Munetaka; Mitani, Akio; Noguchi, Toshihide; Ariji, Eiichiro; Gotoh, Kenichi

    2017-01-01

    Evaluation of alveolar bone is important in the diagnosis of dental diseases. The periodontal ligament space is difficult to clearly depict in cone-beam computed tomography images because the reconstruction filter conditions during image processing cause image blurring, resulting in decreased spatial resolution. We examined different reconstruction filters to assess their ability to improve spatial resolution and allow for a clearer visualization of the periodontal ligament space. Cone-beam computed tomography projections of 2 skull phantoms were reconstructed using 6 reconstruction conditions and then compared using the Thurstone paired comparison method. Physical evaluations, including the modulation transfer function and the Wiener spectrum, as well as an assessment of space visibility, were undertaken using experimental phantoms. Image reconstruction using a modified Shepp-Logan filter resulted in better sensory, physical, and quantitative evaluations. The reconstruction conditions substantially improved the spatial resolution and visualization of the periodontal ligament space. The difference in sensitivity was obtained by altering the reconstruction filter. Modifying the characteristics of a reconstruction filter can generate significant improvement in assessments of the periodontal ligament space. A high-frequency enhancement filter improves the visualization of thin structures and will be useful when accurate assessment of the periodontal ligament space is necessary

  3. Heuristic space diversity control for improved meta-hyper-heuristic performance

    CSIR Research Space (South Africa)

    Grobler, J

    2015-04-01

    Full Text Available This paper expands on the concept of heuristic space diversity and investigates various strategies for the management of heuristic space diversity within the context of a meta-hyper-heuristic algorithm in search of greater performance benefits...

  4. A hybridized membrane-botanical biofilter for improving air quality in occupied spaces

    Science.gov (United States)

    Llewellyn, David; Darlington, Alan; van Ras, Niels; Kraakman, Bart; Dixon, Mike

    Botanical biofilters have been shown to be effective in improving indoor air quality through the removal of the complex mixtures of gaseous contaminants typically found in human-occupied environments. Traditionally, botanical biofilters have consisted of plants rooted into a thin, highly porous synthetic medium that is hung on vertical surfaces. Water flows from the top of the biofilter and air is drawn horizontally through the rooting medium. These botanical biofilters have been successfully marketed in office and institutional settings. They operate efficiently, with adequate contaminant removal and little maintenance, for many years. Depending on climate and outdoor air quality, botanical biofiltration can substantially reduce the costs associated with ventilation of stale indoor air. However, several limitations continue to inhibit widespread acceptance: (1) current designs are architecturally limiting and inefficient at capturing ambient light; (2) these biofilters can add significant amounts of humidity to an indoor space, and the associated water loss leads to a rapid accumulation of dissolved salts, reducing biofilter health and performance; (3) there is the perception that harmful bioaerosols may be actively introduced into the air stream; and (4) design and practical limitations inhibit the entrance of this technology into the lucrative residential marketplace. This paper describes the hybridization of membrane and botanical biofiltration technologies by incorporating a membrane array into the root zone of a conventional interior planting. This technology has the potential to address all of the above limitations, expanding the range of indoor settings where botanical biofiltration can be applied. The technology was developed as the CSA-funded Canadian component of an ESA-MAP project entitled "Biological airfilter for air quality control of life support systems in manned space craft and other closed environments", A0-99-LSS-019. While the project addressed a

  5. Improved Geologic Interpretation of Non-invasive Electrical Resistivity Imaging from In-situ Samples

    Science.gov (United States)

    Mucelli, A.; Aborn, L.; Jacob, R.; Malusis, M.; Evans, J.

    2016-12-01

    Non-invasive geophysical techniques are useful for characterizing the subsurface geology without disturbing the environment; however, the ability to interpret the subsurface is enhanced by invasive work. Because geologic materials have characteristic electrical resistivity values, a geologic interpretation can be made from the variations in electrical resistivity measured by electrical resistivity imaging (ERI). This study focuses on the pre-characterization of the geologic subsurface from ERI collected adjacent to the Montandon Marsh, a wetland located near Lewisburg, PA, within the West Branch of the Susquehanna River watershed. Previous invasive data (boreholes) indicate that the subsurface consists of limestone and shale bedrock overlain with sand and gravel deposits from glacial outwash and aeolian processes. The objective is to improve our understanding of the subsurface at this long-term hydrologic research site by using excavation results, specifically observed variations in geologic materials and electrical resistivity laboratory testing of subsurface samples. The pre-excavation ERI indicated that the shallow-most geologic material had a resistivity value of 100-500 ohm-m. In comparison, the laboratory testing indicated the shallow-most material had the same range of electrical resistivity values depending on saturation levels. The ERI also showed an electrically conductive material, 7 to 70 ohm-m, that was interpreted to be clay and agreed with borehole data; however, the excavation revealed that within this depth range the geologic material varied from stratified clay to clay with cobbles to weathered residual clay. Excavation revealed that the subtle variations in the electrically conductive material corresponded well with the variations in the geologic material. We will use these results to reinterpret previously collected ERI data from the entire long-term research site.

  6. Improving Safety on the International Space Station: Transitioning to Electronic Emergency Procedure Books on the International Space Station

    Science.gov (United States)

    Carter-Journet, Katrina; Clahoun, Jessica; Morrow, Jason; Duncan, Gary

    2012-01-01

    The National Aeronautics and Space Administration (NASA) originally designed the International Space Station (ISS) to operate until 2015, but has extended operations until at least 2020. As part of this very dynamic Program, there is an effort underway to simplify the certification of Commercial-off-the-Shelf (COTS) hardware. This change in paradigm allows the ISS Program to take advantage of technologically savvy and commercially available hardware, such as the iPad. The iPad, a line of tablet computers designed and marketed by Apple Inc., was chosen to support this endeavor. The iPad is functional, portable, and could be easily accessed in an emergency situation. The iPad Electronic Flight Bag (EFB), currently approved for use in flight by the Federal Aviation Administration (FAA), is a fraction of the cost of a traditional Class 2 EFB. In addition, the iPad's ability to use electronic aeronautical data in lieu of paper en route charts and approach plates can cut the annual cost of paper data in half for commercial airlines. ISS may be able to benefit from this type of trade since one of the most important factors considered is information management. Emergency procedures onboard the ISS are currently available to the crew in paper form. Updates to the emergency books can either be launched on an upcoming visiting vehicle such as a Russian Soyuz flight or printed using the onboard ISS printer. In both cases, it is costly to update hardcopy procedures. A new operations concept was proposed to allow for the use of a tablet system that would provide a flexible platform to support space station crew operations. The purpose of the system would be to provide the crew the ability to view and maintain operational data, such as emergency procedures, while also allowing Mission Control Houston to update the procedures. The ISS Program is currently evaluating the safety risks associated with the use of iPads versus paper. Paper products can contribute to the flammability

  7. More space and improved living conditions in cities with autonomous vehicles

    NARCIS (Netherlands)

    Vleugel, J.M.; Bal, Frans

    2017-01-01

    Many people live in cities today. Many more will do so in future. This increases the demand for space and (space for) transport. Space to expand roads is usually scarce. Building tunnels or elevated bridges is very expensive. Solving one bottleneck creates another bottleneck downstream. More road

  8. More space and improved living conditions in cities with autonomous vehicles

    NARCIS (Netherlands)

    Vleugel, J.M.; Bal, Frans

    2017-01-01

    Many people live in cities today. Many more will do so in future. This increases the demand for space and (space for) transport. Space to expand roads is usually scarce. Building tunnels or elevated bridges is very expensive. Solving one bottleneck creates a next bottleneck downstream. More road

  9. Improvement of the Russian system of medical care at the site of space crew landing

    Science.gov (United States)

    Rukavishnikov, Ilya; Bogomolov, Valery; Polyakov, Alexey

    Crew members are currently delivered to the ISS and returned to Earth on the Soyuz TMA spacecraft. The technical means providing for the safe landing of space crews are reliable enough. In spite of that, a complex of negative factors (long-lasting alternating and shock overloads, effects of landing-apparatus rotation on the vestibular system) affects the crew during landing and can reach extreme values under certain conditions. Accordingly, bodily injuries of varying severity may occur in addition to the traditional functional disturbances. The search and rescue group at the landing site includes medical specialists appropriately equipped to manage medical contingencies (strong vestibulo-vegetative reactions, traumas of varying severity, etc.). A medical evacuation complex, which provides acceptable conditions for the cosmonauts, including conditions for medical care, is delivered to the landing site as well. Long-term experience with search and rescue assurance at the landing site has shown that the specialists successfully cope with this task. In some cases medical help was required, and it improved the general condition and physical capacity of the crew members and enabled their evacuation to the places of postflight rehabilitation. At the same time, solving several problems could, in our view, increase the efficacy of medical care for the landing crew. The organization of emergency training under field conditions for medical specialists on a regular basis (not less than once a year) is extremely important. The equipment of the medical specialists requires regular improvement and modernization owing to fast-changing medical technologies and standards. Wearable medical sets must support first aid performed in accordance with modern medical requirements. It is also necessary to include in the list of equipment the textbook of

  10. C-Arm Computed Tomography-Assisted Adrenal Venous Sampling Improved Right Adrenal Vein Cannulation and Sampling Quality in Primary Aldosteronism.

    Science.gov (United States)

    Park, Chung Hyun; Hong, Namki; Han, Kichang; Kang, Sang Wook; Lee, Cho Rok; Park, Sungha; Rhee, Yumie

    2018-05-04

    Adrenal venous sampling (AVS) is a gold standard for subtype classification of primary aldosteronism (PA). However, this procedure has a high failure rate because of the anatomical difficulties in accessing the right adrenal vein. We investigated whether C-arm computed tomography-assisted AVS (C-AVS) could improve the success rate of adrenal sampling. A total of 156 patients diagnosed with PA who underwent AVS from May 2004 through April 2017 were included. Based on the medical records, we retrospectively compared the overall, left, and right catheterization success rates of adrenal veins during the periods without C-AVS (2004 to 2010, n=32) and with C-AVS (2011 to 2016, n=134). The primary outcome was adequate bilateral sampling, defined as a selectivity index (SI) >5. With C-AVS, the rate of adequate bilateral AVS increased from 40.6% to 88.7%. C-AVS was an independent predictor of adequate bilateral sampling in the multivariate model (odds ratio, 9.01). C-AVS improved the overall success rate of AVS, possibly as a result of better catheterization of the right adrenal vein. Copyright © 2018 Korean Endocrine Society.
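    For orientation, the selectivity index used above compares cortisol in the adrenal-vein sample with cortisol in a peripheral sample; a small worked example with made-up cortisol values is shown below (the SI > 5 adequacy threshold is the one quoted in the abstract).

      def selectivity_index(adrenal_cortisol, peripheral_cortisol):
          # SI = adrenal-vein cortisol / peripheral cortisol; SI > 5 counts as adequate here.
          return adrenal_cortisol / peripheral_cortisol

      # Hypothetical cortisol concentrations (µg/dL).
      right_si = selectivity_index(adrenal_cortisol=180.0, peripheral_cortisol=22.0)
      print(f"right adrenal SI = {right_si:.1f} -> {'adequate' if right_si > 5 else 'inadequate'}")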

  11. Sequence space coverage, entropy of genomes and the potential to detect non-human DNA in human samples

    Directory of Open Access Journals (Sweden)

    Maley Carlo C

    2008-10-01

    Full Text Available Abstract Background Genomes store information for building and maintaining organisms. Complete sequencing of many genomes provides the opportunity to study and compare global information properties of those genomes. Results We have analyzed aspects of the information content of Homo sapiens, Mus musculus, Drosophila melanogaster, Caenorhabditis elegans, Arabidopsis thaliana, Saccharomyces cerevisiae, and Escherichia coli (K-12) genomes. Virtually all possible (> 98%) 12 bp oligomers appear in vertebrate genomes while 98% to < 2% of possible oligomers in D. melanogaster (12–17 bp), C. elegans (11–17 bp), A. thaliana (11–17 bp), S. cerevisiae (10–16 bp) and E. coli (9–15 bp). Frequencies of unique oligomers in the genomes follow similar patterns. We identified a set of 2.6 M 15-mers that are more than 1 nucleotide different from all 15-mers in the human genome and so could be used as probes to detect microbes in human samples. In a human sample, these probes would detect 100% of the 433 currently fully sequenced prokaryotes and 75% of the 3065 fully sequenced viruses. The human genome is significantly more compact in sequence space than a random genome. We identified the most frequent 5- to 20-mers in the human genome, which may prove useful as PCR primers. We also identified a bacterium, Anaeromyxobacter dehalogenans, which has an exceptionally low diversity of oligomers given the size of its genome and its GC content. The entropy of coding regions in the human genome is significantly higher than non-coding regions and chromosomes. However chromosomes 1, 2, 9, 12 and 14 have a relatively high proportion of coding DNA without high entropy, and chromosome 20 is the opposite with a low frequency of coding regions but relatively high entropy. Conclusion Measures of the frequency of oligomers are useful for designing PCR assays and for identifying chromosomes and organisms with hidden structure that had not been previously recognized. This information may be used to detect

  12. Sequence space coverage, entropy of genomes and the potential to detect non-human DNA in human samples

    Science.gov (United States)

    Liu, Zhandong; Venkatesh, Santosh S; Maley, Carlo C

    2008-01-01

    Background Genomes store information for building and maintaining organisms. Complete sequencing of many genomes provides the opportunity to study and compare global information properties of those genomes. Results We have analyzed aspects of the information content of Homo sapiens, Mus musculus, Drosophila melanogaster, Caenorhabditis elegans, Arabidopsis thaliana, Saccharomyces cerevisiae, and Escherichia coli (K-12) genomes. Virtually all possible (> 98%) 12 bp oligomers appear in vertebrate genomes while 98% to < 2% of possible oligomers in D. melanogaster (12–17 bp), C. elegans (11–17 bp), A. thaliana (11–17 bp), S. cerevisiae (10–16 bp) and E. coli (9–15 bp). Frequencies of unique oligomers in the genomes follow similar patterns. We identified a set of 2.6 M 15-mers that are more than 1 nucleotide different from all 15-mers in the human genome and so could be used as probes to detect microbes in human samples. In a human sample, these probes would detect 100% of the 433 currently fully sequenced prokaryotes and 75% of the 3065 fully sequenced viruses. The human genome is significantly more compact in sequence space than a random genome. We identified the most frequent 5- to 20-mers in the human genome, which may prove useful as PCR primers. We also identified a bacterium, Anaeromyxobacter dehalogenans, which has an exceptionally low diversity of oligomers given the size of its genome and its GC content. The entropy of coding regions in the human genome is significantly higher than non-coding regions and chromosomes. However chromosomes 1, 2, 9, 12 and 14 have a relatively high proportion of coding DNA without high entropy, and chromosome 20 is the opposite with a low frequency of coding regions but relatively high entropy. Conclusion Measures of the frequency of oligomers are useful for designing PCR assays and for identifying chromosomes and organisms with hidden structure that had not been previously recognized. This information may be used to
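    The basic bookkeeping behind oligomer (k-mer) coverage and entropy can be sketched in a few lines. The toy sequence below is not genomic data, and genome-scale counting would need a more memory-conscious implementation than a plain in-memory counter.

      from collections import Counter
      from math import log2

      def kmer_counts(seq, k):
          # Count every overlapping k-mer in the sequence.
          return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

      def shannon_entropy(counts):
          # Entropy (bits) of the observed k-mer frequency distribution.
          total = sum(counts.values())
          return -sum((c / total) * log2(c / total) for c in counts.values())

      seq = "ACGTACGTGGCCTTAAGGCATGCATGCAAT" * 100   # toy sequence, not genomic data
      counts = kmer_counts(seq, k=12)
      coverage = len(counts) / 4 ** 12              # fraction of all possible 12-mers observed
      print(f"distinct 12-mers: {len(counts)}, sequence-space coverage: {coverage:.2e}")
      print(f"entropy: {shannon_entropy(counts):.2f} bits")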

  13. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    Energy Technology Data Exchange (ETDEWEB)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  14. Quantitative Characterization of Configurational Space Sampled by HIV-1 Nucleocapsid Using Solution NMR, X-ray Scattering and Protein Engineering.

    Science.gov (United States)

    Deshmukh, Lalit; Schwieters, Charles D; Grishaev, Alexander; Clore, G Marius

    2016-06-03

    Nucleic-acid-related events in the HIV-1 replication cycle are mediated by nucleocapsid, a small protein comprising two zinc knuckles connected by a short flexible linker and flanked by disordered termini. Combining experimental NMR residual dipolar couplings, solution X-ray scattering and protein engineering with ensemble simulated annealing, we obtain a quantitative description of the configurational space sampled by the two zinc knuckles, the linker and disordered termini in the absence of nucleic acids. We first compute the conformational ensemble (with an optimal size of three members) of an engineered nucleocapsid construct lacking the N- and C-termini that satisfies the experimental restraints, and then validate this ensemble, as well as characterize the disordered termini, using the experimental data from the full-length nucleocapsid construct. The experimental and computational strategy is generally applicable to multidomain proteins. Differential flexibility within the linker results in asymmetric motion of the zinc knuckles which may explain their functionally distinct roles despite high sequence identity. One of the configurations (populated at a level of ≈40 %) closely resembles that observed in various ligand-bound forms, providing evidence for conformational selection and a mechanistic link between protein dynamics and function. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Improving Sampling, Analysis, and Data Management for Site Investigation and Cleanup

    Science.gov (United States)

    The United States Environmental Protection Agency (EPA) supports the adoption of streamlined approaches to sampling, analysis, and data management activities conducted during site assessment, characterization, and cleanup.

  16. Probabilistic finite element stiffness of a laterally loaded monopile based on an improved asymptotic sampling method

    DEFF Research Database (Denmark)

    Vahdatirad, Mohammadjavad; Bayat, Mehdi; Andersen, Lars Vabbersgaard

    2015-01-01

    The mechanical responses of an offshore monopile foundation mounted in over-consolidated clay are calculated by employing a stochastic approach where a nonlinear p–y curve is incorporated with a finite element scheme. The random field theory is applied to represent a spatial variation for undrained shear strength of clay. Normal and Sobol sampling are employed to provide the asymptotic sampling method to generate the probability distribution of the foundation stiffnesses. Monte Carlo simulation is used as a benchmark. Asymptotic sampling accompanied with Sobol quasi random sampling demonstrates an efficient method for estimating the probability distribution of stiffnesses for the offshore monopile foundation.
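    A compact sketch of the asymptotic sampling idea on a toy reliability problem: standard deviations are inflated by a factor 1/f, the reliability index is estimated at several f < 1 from Sobol quasi-random points, and beta(f) = A·f + B/f is extrapolated back to f = 1. The limit-state function and settings below are invented and unrelated to the monopile-stiffness problem in the paper.

      import numpy as np
      from scipy.stats import norm, qmc

      def g(u):
          # Toy limit-state function in standard normal space: failure when g < 0.
          return 4.0 - u.sum(axis=1) / np.sqrt(u.shape[1])

      def beta_estimate(f, m=17, dim=2, seed=0):
          # Reliability index with standard deviations scaled by 1/f (f < 1 -> more failures),
          # using Sobol quasi-random points mapped into standard normal space.
          u01 = qmc.Sobol(d=dim, scramble=True, seed=seed).random_base2(m)
          u = norm.ppf(u01) / f
          pf = np.mean(g(u) < 0.0)
          return -norm.ppf(pf)

      # Support points at several scale factors, then extrapolate beta(f) = A*f + B/f to f = 1.
      fs = np.array([0.4, 0.5, 0.6, 0.7])
      betas = np.array([beta_estimate(f) for f in fs])
      A, B = np.linalg.lstsq(np.stack([fs, 1 / fs], axis=1), betas, rcond=None)[0]
      print(f"extrapolated beta(1) ~ {A + B:.2f} (exact value for this toy g is 4.0)")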

  17. Medial unicompartmental knee arthroplasty improves congruence and restores joint space width of the lateral compartment.

    Science.gov (United States)

    Khamaisy, Saker; Zuiderbaan, Hendrik A; van der List, Jelle P; Nam, Denis; Pearle, Andrew D

    2016-06-01

    Osteoarthritic progression of the lateral compartment remains a leading indication for medial unicompartmental knee arthroplasty (UKA) revision. Therefore, the purpose of this study was to evaluate the alterations of lateral compartment congruence and joint space width (JSW) following medial UKA. Retrospectively, lateral compartment congruence and JSW were evaluated in 174 knees (74 females, 85 males; mean age 65.5 years, SD±10.1) preoperatively and six weeks postoperatively, and compared to 41 healthy knees (26 men, 15 women; mean age 33.7 years, SD±6.4). Congruence was calculated using validated software that evaluates the geometric relationship between surfaces and calculates a congruence index (CI). JSW was measured on three sides (inner, middle, outer) by subdividing the lateral compartment into four quarters. The CI of the control group was 0.98 (SD±0.01). The preoperative CI was 0.88 (SD±0.01), which improved significantly to 0.93 (SD±0.03) postoperatively. Medial UKA improves congruence and restores the JSW of the lateral compartment. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Nationwide Inpatient Sample and National Surgical Quality Improvement Program give different results in hip fracture studies.

    Science.gov (United States)

    Bohl, Daniel D; Basques, Bryce A; Golinvaux, Nicholas S; Baumgaertner, Michael R; Grauer, Jonathan N

    2014-06-01

    National databases are being used with increasing frequency to conduct orthopaedic research. However, there are important differences in these databases, which could result in different answers to similar questions; this important potential limitation pertaining to database research in orthopaedic surgery has not been adequately explored. The purpose of this study was to explore the interdatabase reliability of two commonly used national databases, the Nationwide Inpatient Sample (NIS) and the National Surgical Quality Improvement Program (NSQIP), in terms of (1) demographics; (2) comorbidities; and (3) adverse events. In addition, using the NSQIP database, we identified (4) adverse events that had a higher prevalence after rather than before discharge, which has important implications for interpretation of studies conducted in the NIS. A retrospective cohort study of patients undergoing operative stabilization of transcervical and intertrochanteric hip fractures during 2009 to 2011 was performed in the NIS and NSQIP. Totals of 122,712 and 5021 patients were included from the NIS and NSQIP, respectively. Age, sex, fracture type, and lengths of stay were compared. Comorbidities common to both databases were compared in terms of more or less than twofold difference between the two databases. Similar comparisons were made for adverse events. Finally, adverse events that had a greater postdischarge prevalence were identified from the NSQIP database. Tests for statistical difference were thought to be of little value given the large sample size and the resulting fact that statistical differences would have been identified even for small, clinically inconsequential differences resulting from the associated high power. Because it is of greater clinical importance to focus on the magnitude of differences, the databases were compared by absolute differences. Demographics and hospital lengths of stay were not different between the two databases. In terms of comorbidities

  19. Experimental study of UC polycrystals in the prospect of improving the as-fabricated sample purity

    Energy Technology Data Exchange (ETDEWEB)

    Raveu, Gaëlle, E-mail: gaelle.raveu@cea.fr [CEA, DEC, 13108 Saint-Paul-Lez-Durance (France); Martin, Guillaume; Fiquet, Olivier; Garcia, Philippe; Carlot, Gaëlle; Palancher, Hervé [CEA, DEC, 13108 Saint-Paul-Lez-Durance (France); Bonnin, Anne [ESRF, 6, rue J. Horowitz, 38500 Grenoble Cedex (France); Khodja, Hicham [CEA, DEC, 13108 Saint-Paul-Lez-Durance (France); Raepsaet, Caroline [CEA, IRAMIS, LEEL, 91191 Gif-Sur-Yvette (France); Sauvage, Thierry; Barthe, Marie-France [CNRS – CEMHTI, 3a Rue de la Férolerie, 45071 Orleans (France)

    2014-12-15

    Uranium and plutonium carbides are candidate fuels for Generation IV nuclear reactors. This study is focused on the characterization of uranium monocarbide samples. The successive fabrication steps were carried out under atmospheres containing low oxygen and moisture concentrations (typically less than 100 ppm) but sample transfers occurred in air. Six samples were sliced from four pellets elaborated by carbothermic reaction under vacuum. Little presence of UC{sub 2} is expected in these samples. The α-UC{sub 2} phase was indeed detected within one of these UC samples during an XRD experiment performed with synchrotron radiation. Moreover, oxygen content at the surface of these samples was depth profiled using a recently developed nuclear reaction analysis method. Large oxygen concentrations were measured in the first micron below the sample surface and particularly in the first 100–150 nm. UC{sub 2} inclusions were found to be more oxidized than the surrounding matrix. This work points out to the fact that more care must be given at each step of UC fabrication since the material readily reacts with oxygen and moisture. A new glovebox facility using a highly purified atmosphere is currently being built in order to obtain single phase UC samples of better purity.

  20. Improved sampling for airborne surveys to estimate wildlife population parameters in the African Savannah

    NARCIS (Netherlands)

    Khaemba, W.; Stein, A.

    2002-01-01

    Parameter estimates, obtained from airborne surveys of wildlife populations, often have large bias and large standard errors. Sampling error is one of the major causes of this imprecision and the occurrence of many animals in herds violates the common assumptions in traditional sampling designs like

  1. Why weight? Modelling sample and observational level variability improves power in RNA-seq analyses.

    Science.gov (United States)

    Liu, Ruijie; Holik, Aliaksei Z; Su, Shian; Jansz, Natasha; Chen, Kelan; Leong, Huei San; Blewitt, Marnie E; Asselin-Labat, Marie-Liesse; Smyth, Gordon K; Ritchie, Matthew E

    2015-09-03

    Variations in sample quality are frequently encountered in small RNA-sequencing experiments, and pose a major challenge in a differential expression analysis. Removal of high variation samples reduces noise, but at a cost of reducing power, thus limiting our ability to detect biologically meaningful changes. Similarly, retaining these samples in the analysis may not reveal any statistically significant changes due to the higher noise level. A compromise is to use all available data, but to down-weight the observations from more variable samples. We describe a statistical approach that facilitates this by modelling heterogeneity at both the sample and observational levels as part of the differential expression analysis. At the sample level this is achieved by fitting a log-linear variance model that includes common sample-specific or group-specific parameters that are shared between genes. The estimated sample variance factors are then converted to weights and combined with observational level weights obtained from the mean-variance relationship of the log-counts-per-million using 'voom'. A comprehensive analysis involving both simulations and experimental RNA-sequencing data demonstrates that this strategy leads to a universally more powerful analysis and fewer false discoveries when compared to conventional approaches. This methodology has wide application and is implemented in the open-source 'limma' package. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
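    A conceptual sketch (in Python rather than the R/limma implementation) of the combination step described above: sample-level variance factors are inverted into sample weights and multiplied into observation-level precision weights before the gene-wise weighted model fits. All numbers below are invented, and the actual estimation of the variance factors and the voom mean-variance trend is not reproduced here.

      import numpy as np

      # Invented example: 4 samples, with sample-level variance factors from a
      # log-linear variance model (larger factor = noisier sample).
      sample_variance_factor = np.array([1.0, 1.1, 3.0, 0.9])
      sample_weight = 1.0 / sample_variance_factor        # down-weight the more variable samples

      # Invented observation-level precision weights (genes x samples), as a
      # voom-like step would produce from the mean-variance trend of log-CPM values.
      obs_weight = np.array([[1.2, 1.0, 0.9, 1.1],
                             [0.8, 1.3, 1.0, 0.7]])

      # Combined weights used in the gene-wise weighted linear model fits.
      combined = obs_weight * sample_weight               # broadcasts across genes
      print(combined)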

  2. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column...... and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness...... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila....

  3. Small-angle X-ray scattering tensor tomography: model of the three-dimensional reciprocal-space map, reconstruction algorithm and angular sampling requirements.

    Science.gov (United States)

    Liebi, Marianne; Georgiadis, Marios; Kohlbrecher, Joachim; Holler, Mirko; Raabe, Jörg; Usov, Ivan; Menzel, Andreas; Schneider, Philipp; Bunk, Oliver; Guizar-Sicairos, Manuel

    2018-01-01

    Small-angle X-ray scattering tensor tomography, which allows reconstruction of the local three-dimensional reciprocal-space map within a three-dimensional sample as introduced by Liebi et al. [Nature (2015), 527, 349-352], is described in more detail with regard to the mathematical framework and the optimization algorithm. For the case of trabecular bone samples from vertebrae it is shown that the model of the three-dimensional reciprocal-space map using spherical harmonics can adequately describe the measured data. The method enables the determination of nanostructure orientation and degree of orientation as demonstrated previously in a single momentum transfer q range. This article presents a reconstruction of the complete reciprocal-space map for the case of bone over extended ranges of q. In addition, it is shown that uniform angular sampling and advanced regularization strategies help to reduce the amount of data required.

  4. Validation Of Intermediate Large Sample Analysis (With Sizes Up to 100 G) and Associated Facility Improvement

    International Nuclear Information System (INIS)

    Bode, P.; Koster-Ammerlaan, M.J.J.

    2018-01-01

    Pragmatic rather than physical correction factors for neutron and gamma-ray shielding were studied for samples of intermediate size, i.e. up to the 10-100 gram range. It was found that for most biological and geological materials, the neutron self-shielding is less than 5 % and the gamma-ray self-attenuation can easily be estimated. A trueness control material of 1 kg size was made from left-over materials used in laboratory intercomparisons. A design study for a large sample pool-side facility, handling plate-type volumes, had to be stopped because of a reduction in the human resources available for this CRP. The large sample NAA facilities were made available to guest scientists from Greece and Brazil. The laboratory for neutron activation analysis participated in the world’s first laboratory intercomparison utilizing large samples. (author)

  5. Improvements to sample processing and measurement to enable more widespread environmental application of tritium.

    Science.gov (United States)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, correlates to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment. Copyright © 2017. Published by Elsevier Ltd.

  6. Improving the Accuracy of the Hyperspectral Model for Apple Canopy Water Content Prediction using the Equidistant Sampling Method.

    Science.gov (United States)

    Zhao, Huan-San; Zhu, Xi-Cun; Li, Cheng; Wei, Yu; Zhao, Geng-Xing; Jiang, Yuan-Mao

    2017-09-11

    The influence of the equidistant sampling method was explored in a hyperspectral model for the accurate prediction of the water content of apple tree canopy. The relationship between spectral reflectance and water content was explored using the sample partition methods of equidistant sampling and random sampling, and a stepwise regression model of the apple canopy water content was established. The results showed that the random sampling model was Y = 0.4797 - 721787.3883 × Z3 - 766567.1103 × Z5 - 771392.9030 × Z6; the equidistant sampling model was Y = 0.4613 - 480610.4213 × Z2 - 552189.0450 × Z5 - 1006181.8358 × Z6. After verification, the equidistant sampling method offered superior prediction ability. The calibration set coefficient of determination of 0.6599 and validation set coefficient of determination of 0.8221 were higher than those of the random sampling model by 9.20% and 10.90%, respectively. The root mean square error (RMSE) of 0.0365 and relative error (RE) of 0.0626 were lower than those of the random sampling model by 17.23% and 17.09%, respectively. Dividing the calibration set and validation set by the equidistant sampling method can improve the prediction accuracy of the hyperspectral model of apple canopy water content.
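
    As a rough illustration of how an equidistant (rank-based) calibration/validation split differs from random partitioning, the sketch below sorts samples by the response and selects calibration samples at even intervals along the sorted sequence. The function name and the exact interval rule are assumptions for illustration; the study's own partitioning procedure is not reproduced here.

```python
import numpy as np

def equidistant_split(y, n_calibration):
    """Split sample indices into calibration/validation sets by taking
    samples at (approximately) equal intervals along the sorted response.
    Illustrative reading of 'equidistant sampling'; details may differ."""
    order = np.argsort(y)                                   # rank samples by water content
    pick = np.linspace(0, len(y) - 1, n_calibration).round().astype(int)
    calib = order[pick]                                      # evenly spaced over the ranking
    valid = np.setdiff1d(order, calib)
    return calib, valid

# Toy usage: 60 hypothetical canopy water-content values.
rng = np.random.default_rng(0)
y = rng.uniform(0.3, 0.7, size=60)
calib_idx, valid_idx = equidistant_split(y, n_calibration=40)
print(len(calib_idx), len(valid_idx))   # 40 calibration, 20 validation samples
```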

  7. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Derivations and Verification of Plans. Volume 1

    Science.gov (United States)

    Johnson, Kenneth L.; White, K, Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
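
    For readers unfamiliar with acceptance sampling by variables, the sketch below illustrates the common one-sided k-method decision rule under a normal approximation: accept the lot when (USL - mean)/s >= k, with k derived from a noncentral t quantile. This is a generic textbook-style illustration, not the NESC-recommended procedure, and the parameter names (usl, beta, p_max) are assumptions.

```python
import numpy as np
from scipy import stats

def variables_acceptance(sample, usl, beta, p_max):
    """One-sided acceptance-sampling-by-variables check (k-method sketch).

    k is chosen so that a lot whose true nonconforming fraction equals p_max
    is accepted with probability beta (consumer's risk), assuming normality.
    """
    n = len(sample)
    xbar, s = np.mean(sample), np.std(sample, ddof=1)
    delta = stats.norm.ppf(1 - p_max) * np.sqrt(n)           # noncentrality parameter
    k = stats.nct.ppf(1 - beta, df=n - 1, nc=delta) / np.sqrt(n)
    return (usl - xbar) / s >= k, k

# Toy usage with simulated measurements and an upper specification limit of 11.0.
rng = np.random.default_rng(0)
measurements = rng.normal(9.0, 0.5, size=30)
accept, k = variables_acceptance(measurements, usl=11.0, beta=0.10, p_max=0.05)
print(accept, round(k, 3))
```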

  8. Lunar and Meteorite Sample Education Disk Program — Space Rocks for Classrooms, Museums, Science Centers, and Libraries

    Science.gov (United States)

    Allen, J.; Luckey, M.; McInturff, B.; Huynh, P.; Tobola, K.; Loftin, L.

    2010-03-01

    NASA’s Lunar and Meteorite Sample Education Disk Program has Lucite disks containing Apollo lunar samples and meteorite samples that are available for trained educators to borrow for use in classrooms, museums, science centers, and libraries.

  9. Study on coupling of three-dimension space time neutron kinetics model and RELAP5 and improvement of RELAP5

    International Nuclear Information System (INIS)

    Gui Xuewen; Cai Qi; Luo Bangqi

    2007-01-01

    A two-group, three-dimensional space-time neutron kinetics model is applied to the RELAP5 code, replacing the point reactor kinetics model. A visual operation interface is designed to facilitate interactive operation between the operator and the computer. The calculation results and practical applications indicate that the functions and precision of the improved RELAP5 are enhanced and that the code is easy to use. The improved RELAP5 shows good potential for application in nuclear power plant simulation. (authors)

  10. Artificial Neural Network for Total Laboratory Automation to Improve the Management of Sample Dilution.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

    Diluting a sample to obtain a measure within the analytical range is a common task in clinical laboratories. However, for urgent samples, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show a simple artificial neural network that can be used to make it unnecessary to predilute a sample using the information available through the laboratory information system. Particularly, the Multilayer Perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to the inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Therefore, such an artificial neural network can be easily implemented into a total automation framework to sensibly reduce the turnaround time of critical orders delayed by the operation required to retrieve, dilute, and retest the sample.
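
    As a sketch of how such a classifier could be assembled from laboratory information system fields, the example below trains a small multilayer perceptron on synthetic data. The feature set (previous troponin I result, time since that result, and a cardiac event/surgery flag), the data, and the decision threshold are illustrative assumptions, not the authors' actual model or records.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic orders: previous result (ng/L), hours since previous result,
# cardiac event/surgery flag (0/1). The label is 1 if predilution is needed.
rng = np.random.default_rng(1)
X = np.column_stack([
    rng.lognormal(3.0, 1.2, 2000),     # previous troponin I result
    rng.uniform(1, 72, 2000),          # hours since previous result
    rng.integers(0, 2, 2000),          # cardiac event or surgery
])
y = (X[:, 0] > 50).astype(int)         # arbitrary toy threshold for "will exceed range"

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
clf.fit(X, y)
print("needs predilution?", clf.predict([[900.0, 6.0, 1]])[0])
```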

  11. Improvements in reading accuracy as a result of increased interletter spacing are not specific to children with dyslexia

    NARCIS (Netherlands)

    Hakvoort, Britt; van den Boer, Madelon; Leenaars, Tineke; Bos, Petra; Tijms, Jurgen

    2017-01-01

    Recently, increased interletter spacing (LS) has been studied as a way to enhance reading fluency. It is suggested that increased LS improves reading performance, especially in poor readers. Theoretically, these findings are well substantiated as a result of diminished crowding effects. Empirically,

  12. The comparability of men who have sex with men recruited from venue-time-space sampling and facebook: a cohort study.

    Science.gov (United States)

    Hernandez-Romieu, Alfonso C; Sullivan, Patrick S; Sanchez, Travis H; Kelley, Colleen F; Peterson, John L; Del Rio, Carlos; Salazar, Laura F; Frew, Paula M; Rosenberg, Eli S

    2014-07-17

    Recruiting valid samples of men who have sex with men (MSM) is a key component of the US human immunodeficiency virus (HIV) surveillance and of research studies seeking to improve HIV prevention for MSM. Social media, such as Facebook, may present an opportunity to reach broad samples of MSM, but the extent to which those samples are comparable with men recruited from venue-based, time-space sampling (VBTS) is unknown. The objective of this study was to assess the comparability of MSM recruited via VBTS and Facebook. HIV-negative and HIV-positive black and white MSM were recruited from June 2010 to December 2012 using VBTS and Facebook in Atlanta, GA. We compared the self-reported venue attendance, demographic characteristics, sexual and risk behaviors, history of HIV-testing, and HIV and sexually transmitted infection (STI) prevalence between Facebook- and VTBS-recruited MSM overall and by race. Multivariate logistic and negative binomial models estimated age/race adjusted ratios. The Kaplan-Meier method was used to assess 24-month retention. We recruited 803 MSM, of whom 110 (34/110, 30.9% black MSM, 76/110, 69.1% white MSM) were recruited via Facebook and 693 (420/693, 60.6% black MSM, 273/693, 39.4% white MSM) were recruited through VTBS. Facebook recruits had high rates of venue attendance in the previous month (26/34, 77% among black and 71/76, 93% among white MSM; between-race P=.01). MSM recruited on Facebook were generally older, with significant age differences among black MSM (P=.02), but not white MSM (P=.14). In adjusted multivariate models, VBTS-recruited MSM had fewer total partners (risk ratio [RR]=0.78, 95% CI 0.64-0.95; P=.01) and unprotected anal intercourse (UAI) partners (RR=0.54, 95% CI 0.40-0.72; PFacebook, to 77% for black and 78% for white MSM recruited at venues. There was no statistically significant differences in retention between the four groups (log-rank P=.64). VBTS and Facebook recruitment methods yielded similar samples of MSM in

  13. Performance improvement of ionic surfactant flooding in carbonate rock samples by use of nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Ahmadi

    2016-07-01

    Full Text Available Various surfactants have been used in upstream petroleum processes like chemical flooding. Ultimately, the performance of these surfactants depends on their ability to reduce the interfacial tension between oil and water. The surfactant concentration in the aqueous solution decreases owing to the loss of the surfactant on the rock surface in the injection process. The main objective of this paper is to inhibit the surfactant loss by means of adding nanoparticles. Sodium dodecyl sulfate and silica nanoparticles were used as the ionic surfactant and the nanoparticles in our experiments, respectively. AEROSIL® 816 and AEROSIL® 200 are hydrophobic and hydrophilic nanoparticles, respectively. To determine the adsorption loss of the surfactant onto rock samples, a conductivity approach was used. Real carbonate rock samples were used as the solid phase in adsorption experiments. It should be noted that the rock samples were water wet. This paper describes how equilibrium adsorption was investigated by examining adsorption behavior in a system of carbonate sample (solid phase) and surfactant solution (aqueous phase). The initial surfactant and nanoparticle concentrations were 500–5000 and 500–2000 ppm, respectively. The rate of surfactant loss was strongly dependent on the concentration of the surfactant in the system, and the adsorption of the surfactant decreased with an increase in the nanoparticle concentration. Also, the hydrophilic nanoparticles were more effective than the hydrophobic nanoparticles.

  14. Biomimetic Sniffing with an Artificial Dog's Nose Leads to Improvements in Vapor Sampling and Detection

    Science.gov (United States)

    2016-07-25


  15. Using temporal sampling to improve attribution of source populations for invasive species.

    Directory of Open Access Journals (Sweden)

    Sharyn J Goldstien

    Full Text Available Numerous studies have applied genetic tools to the identification of source populations and transport pathways for invasive species. However, there are many gaps in the knowledge obtained from such studies because comprehensive and meaningful spatial sampling to meet these goals is difficult to achieve. Sampling populations as they arrive at the border should fill the gaps in source population identification, but such an advance has not yet been achieved with genetic data. Here we use previously acquired genetic data to assign new incursions as they invade populations within New Zealand ports and marinas. We also investigated allelic frequency change in these recently established populations over a two-year period, and assessed the effect of temporal genetic sampling on our ability to assign new incursions to their source population. We observed shifts in the allele frequencies among populations, as well as the complete loss of some alleles and the addition of alleles novel to New Zealand, within these recently established populations. There was no significant level of genetic differentiation observed in our samples between years, and the use of these temporal data did alter the assignment probability of new incursions. Our study further suggests that new incursions can add genetic variation to the population in a single introduction event as the founders themselves are often more genetically diverse than theory initially predicted.

  16. Improving Creative Problem-Solving in a Sample of Third Culture Kids

    Science.gov (United States)

    Lee, Young Ju; Bain, Sherry K.; McCallum, R. Steve

    2007-01-01

    We investigated the effects of divergent thinking training (with explicit instruction) on problem-solving tasks in a sample of Third Culture Kids (Useem and Downie, 1976). We were specifically interested in whether the children's originality and fluency in responding increased following instruction, not only on classroom-based worksheets and the…

  17. Improving Statistics Education through Simulations: The Case of the Sampling Distribution.

    Science.gov (United States)

    Earley, Mark A.

    This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…

  18. Improvement of Thrust Bearing Calculation Considering the Convectional Heating within the Space between the Pads

    OpenAIRE

    Monika Chmielowiec-Jablczyk; Andreas Schubert; Christian Kraft; Hubert Schwarze; Michal Wodtke; Michal Wasilczuk

    2018-01-01

    A modern thrust bearing tool is used to estimate the behavior of tilting pad thrust bearings not only in the oil film between pad and rotating collar, but also in the space between the pads. The oil flow in the space significantly influences the oil film inlet temperature and the heating of pad and collar. For that reason, it is necessary to define an oil mixing model for the space between the pads. In the bearing tool, the solutions of the Reynolds equation including a cavitation model, the ...

  19. Development of a prototype thermoelectric space cooling system using phase change material to improve the performance

    Science.gov (United States)

    Zhao, Dongliang

    The thermoelectric cooling system has advantages over conventional vapor compression cooling devices: it is compact, lightweight, and highly reliable, has no mechanical moving parts and no refrigerant, is powered by direct current, and switches easily between cooling and heating modes. However, it has long suffered from relatively high cost and low energy efficiency, which has restricted its use to niche applications, such as space missions, portable cooling devices, and scientific and medical equipment, where the coefficient of performance (COP) is not as important as reliability, energy availability, and quiet operation. Enhancement of thermoelectric cooling system performance generally relies on two approaches: improving thermoelectric material efficiency and improving the thermal design of the cooling system. This research focused on the latter. A prototype thermoelectric cooling system integrated with a phase change material (PCM) thermal energy storage unit for space cooling has been developed. The PCM thermal storage unit, used for cold storage at night, functions as the thermoelectric cooling system's heat sink during the daytime cooling period and provides a relatively lower hot-side temperature for the thermoelectric cooling system. Experimental testing of the prototype system in a reduced-scale chamber achieved an average cooling COP of 0.87, with a maximum value of 1.22. A further comparison test of the efficacy of the PCM thermal storage unit shows that 35.3% of the electrical energy was saved by using PCM with the thermoelectric cooling system. In general, PCM suffers from poor thermal conductivity in both the solid and liquid phases. This system implemented a finned inner tube to increase heat transfer during the PCM charging (melting) process, which directly impacts the thermoelectric system's performance. A simulation tool for the entire system has been developed, including mathematical models for a single thermoelectric module

  20. Improved Bacterial and Viral Recoveries from 'Complex' Samples using Electrophoretically Assisted Acoustic Focusing

    Energy Technology Data Exchange (ETDEWEB)

    Ness, K; Rose, K; Jung, B; Fisher, K; Mariella, Jr., R P

    2008-03-27

    Automated front-end sample preparation technologies can significantly enhance the sensitivity and reliability of biodetection assays [1]. We are developing advanced sample preparation technologies for biowarfare detection and medical point-of-care diagnostics using microfluidic systems with continuous sample processing capabilities. Here we report an electrophoretically assisted acoustic focusing technique to rapidly extract and enrich viral and bacterial loads from 'complex samples', applied in this case to human nasopharyngeal samples as well as simplified surrogates. The acoustic forces capture and remove large particles (> 2 {micro}m) such as host cells, debris, dust, and pollen from the sample. We simultaneously apply an electric field transverse to the flow direction to transport small ({le} 2 {micro}m), negatively-charged analytes into a separate purified recovery fluid using a modified H-filter configuration [Micronics US Patent 5,716,852]. Hunter and O'Brien combined transverse electrophoresis and acoustic focusing to measure the surface charge on large particles, [2] but to our knowledge, our work is the first demonstration combining these two techniques in a continuous flow device. Marina et al. demonstrated superimposed dielectrophoresis (DEP) and acoustic focusing for enhanced separations [3], but these devices have limited throughput due to the rapid decay of DEP forces. Both acoustic standing waves and electric fields exert significant forces over the entire fluid volume in microchannels, thus allowing channels with larger dimensions (> 100 {micro}m) and high throughputs (10-100 {micro}L/min) necessary to process real-world volumes (1 mL). Previous work demonstrated acoustic focusing of microbeads [4] and biological species [5] in various geometries. We experimentally characterized our device by determining the biological size-cutoff where acoustic radiation pressure forces no longer transport biological particles. Figure 1 shows

  1. An Improved Treatment of AC Space Charge Fields in Large Signal Simulation Codes

    National Research Council Canada - National Science Library

    Dialetis, D; Chernin, D; Antonsen, Jr., T. M; Levush, B

    2006-01-01

    An accurate representation of the AC space charge electric field is required in order to be able to predict the performance of linear beam tubes, including TWT's and klystrons, using a steady state...

  2. Exploiting Orbital Data and Observation Campaigns to Improve Space Debris Models

    Science.gov (United States)

    Braun, V.; Horstmann, A.; Reihs, B.; Lemmens, S.; Merz, K.; Krag, H.

    The European Space Agency (ESA) has been developing the Meteoroid and Space Debris Terrestrial Environment Reference (MASTER) software as the European reference model for space debris for more than 25 years. It is an event-based simulation of all known individual debris-generating events since 1957, including breakups, solid rocket motor firings and nuclear reactor core ejections. In 2014, the upgraded Debris Risk Assessment and Mitigation Analysis (DRAMA) tool suite was released. In the same year an ESA instruction made the standard ISO 24113:2011 on space debris mitigation requirements, adopted via the European Cooperation for Space Standardization (ECSS), applicable to all ESA missions. In order to verify the compliance of a space mission with those requirements, the DRAMA software is used to assess collision avoidance statistics, estimate the remaining orbital lifetime and evaluate the on-ground risk for controlled and uncontrolled reentries. In this paper, the approach to validate the MASTER and DRAMA tools is outlined. For objects larger than 1 cm, thus potentially being observable from ground, the MASTER model has been validated through dedicated observation campaigns. Recent campaign results shall be discussed. Moreover, catalogue data from the Space Surveillance Network (SSN) has been used to correlate the larger objects. In DRAMA, the assessment of collision avoidance statistics is based on orbit uncertainty information derived from Conjunction Data Messages (CDM) provided by the Joint Space Operations Center (JSpOC). They were collected for more than 20 ESA spacecraft in the recent years. The way this information is going to be used in a future DRAMA version is outlined and the comparison of estimated manoeuvre rates with real manoeuvres from the operations of ESA spacecraft is shown.

  3. Blood sample collection and patient identification demand improvement: a questionnaire study of preanalytical practices in hospital wards and laboratories.

    Science.gov (United States)

    Wallin, Olof; Söderberg, Johan; Van Guelpen, Bethany; Stenlund, Hans; Grankvist, Kjell; Brulin, Christine

    2010-09-01

    Scand J Caring Sci 2010; 24: 581-591. Most errors in venous blood testing result from human mistakes occurring before the sample reaches the laboratory. The aims were to survey venous blood sampling (VBS) practices in hospital wards and to compare these practices with those of hospital laboratories. Staff in two hospitals (all wards) and two hospital laboratories (314 respondents, response rate 94%) completed a questionnaire addressing issues relevant to the collection of venous blood samples for clinical chemistry testing. The findings suggest that instructions for patient identification and the collection of venous blood samples were not always followed. For example, 79% of the respondents reported the undesirable practice (UDP) of not always using wristbands for patient identification. Similarly, 87% of the respondents noted the UDP of removing venous stasis after the sampling is finished. Compared with the ward staff, a significantly higher proportion of the laboratory staff reported desirable practices regarding the collection of venous blood samples. Neither education nor the existence of established sampling routines was clearly associated with VBS practices among the ward staff. The results of this study, the first of its kind, suggest that a clinically important risk of error is associated with VBS in the surveyed wards. Most important is the risk of misidentification of patients. Quality improvement of blood sample collection is clearly needed, particularly in hospital wards. © 2009 The Authors. Journal compilation © 2009 Nordic College of Caring Science.

  4. Improved sample preparation method for environmental plutonium analysis by ICP-SFMS and alpha-spectrometry

    International Nuclear Information System (INIS)

    Varga, Z.; Stefanka, Z.; Suranyi, G.; Vajda, N.

    2007-01-01

    A rapid and simple sample preparation method for plutonium determination in environmental samples by inductively coupled plasma sector field mass spectrometry (ICP-SFMS) and alpha-spectrometry is described. The developed procedure involves a selective CaF2 co-precipitation for preconcentration followed by extraction chromatographic separation. The proposed method effectively eliminates the possible interferences in mass spectrometric analysis and also removes interfering radionuclides that may disturb alpha-spectrometric measurement. For 239Pu, 240Pu and 241Pu, limits of detection of 9.0 fg/g (0.021 mBq), 1.7 fg/g (0.014 mBq) and 3.1 fg/g (11.9 mBq), respectively, were achieved by ICP-SFMS, and 0.02 mBq by alpha-spectrometry. Results of certified reference materials agreed well with the recommended values. (author)

  5. IMPROVED SPECTROPHOTOMETRIC CALIBRATION OF THE SDSS-III BOSS QUASAR SAMPLE

    Energy Technology Data Exchange (ETDEWEB)

    Margala, Daniel; Kirkby, David [Frederick Reines Hall, Department of Physics and Astronomy, University of California, Irvine, CA (United States); Dawson, Kyle [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States); Bailey, Stephen [Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720 (United States); Blanton, Michael [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Schneider, Donald P., E-mail: dmargala@uci.edu [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2016-11-10

    We present a model for spectrophotometric calibration errors in observations of quasars from the third generation of the Sloan Digital Sky Survey Baryon Oscillation Spectroscopic Survey (BOSS) and describe the correction procedure we have developed and applied to this sample. Calibration errors are primarily due to atmospheric differential refraction and guiding offsets during each exposure. The corrections potentially reduce the systematics for any studies of BOSS quasars, including the measurement of baryon acoustic oscillations using the Ly α forest. Our model suggests that, on average, the observed quasar flux in BOSS is overestimated by ∼19% at 3600 Å and underestimated by ∼24% at 10,000 Å. Our corrections for the entire BOSS quasar sample are publicly available.

  6. Predicted versus observed cosmic-ray-produced noble gases in lunar samples: improved Kr production ratios

    International Nuclear Information System (INIS)

    Regnier, S.; Hohenberg, C.M.; Marti, K.; Reedy, R.C.

    1979-01-01

    New sets of cross sections for the production of krypton isotopes from targets of Rb, Sr, Y, and Zr were constructed primarily on the basis of experimental excitation functions for Kr production from Y. These cross sections were used to calculate galactic-cosmic-ray and solar-proton production rates for Kr isotopes in the moon. Spallation Kr data obtained from ilmenite separates of rocks 10017 and 10047 are reported. Production rates and isotopic ratios for cosmogenic Kr observed in ten well-documented lunar samples and in ilmenite separates and bulk samples from several lunar rocks with long but unknown irradiation histories were compared with predicted rates and ratios. The agreements were generally quite good. Erosion of rock surfaces affected rates or ratios for only near-surface samples, where solar-proton production is important. There were considerable spreads in predicted-to-observed production rates of 83Kr, due at least in part to uncertainties in chemical abundances. The 78Kr/83Kr ratios were predicted quite well for samples with a wide range of Zr/Sr abundance ratios. The calculated 80Kr/83Kr ratios were greater than the observed ratios when production by the 79Br(n,γ) reaction was included, but were slightly undercalculated if the Br reaction was omitted; these results suggest that Br(n,γ)-produced Kr is not retained well by lunar rocks. The productions of 81Kr and 82Kr were overcalculated by approximately 10% relative to 83Kr. Predicted-to-observed 84Kr/83Kr ratios scattered considerably, possibly because of uncertainties in corrections for trapped and fission components and in cross sections for 84Kr production. Most predicted 84Kr and 86Kr production rates were lower than observed. Shielding depths of several Apollo 11 rocks were determined from the measured 78Kr/83Kr ratios of ilmenite separates. 4 figures, 5 tables

  7. Improved Understanding of Sources of Variability in Groundwater Sampling for Long-Term Monitoring Programs

    Science.gov (United States)

    2013-02-01

    Duplicate analyses by TestAmerica for 1,1-Dichloroethane, Benzene, Chlorobenzene, Ethylbenzene, and Vinyl Chloride resulted in all relative percent difference (RPD) values meeting the RPD criteria; one pair of sample and duplicate results was reported as non-detect for Ethylbenzene and was not included in the RPD calculation.

  8. Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.

    Science.gov (United States)

    Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G

    2018-06-01

    Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin interaction to laboratory glassware surfaces after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing a LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% of ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators with good correlation of the results. Finally, we recommend a LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract Structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).

  9. Application of Compton suppression spectrometry in the improvement of nuclear analytical techniques for biological samples

    International Nuclear Information System (INIS)

    Ahmed, Y. A.; Ewa, I.O.B.; Funtua, I.I.; Jonah, S.A.; Landsberger, S.

    2007-01-01

    Compton Suppression Factors (SF) and Compton Reduction Factors (RF) of the UT Austin Compton suppression spectrometer, the parameters characterizing the system performance, were measured using 137Cs and 60Co point sources. The system performance was evaluated as a function of energy and geometry. The (P/C), A(P/C), (P/T), Cp, and Ce were obtained for each of the parameters. The natural background reduction factors in the anticoincidence mode and in the normal mode were calculated and their effect on the detection limit for biological samples evaluated. Applicability of the spectrometer and the method to biological samples was tested in the measurement of twenty-four elements (Ba, Sr, I, Br, Cu, V, Mg, Na, Cl, Mn, Ca, Sn, In, K, Mo, Cd, Zn, As, Sb, Ni, Rb, Cs, Fe, and Co) commonly found in food, milk, tea and tobacco items. They were determined from seven National Institute of Standards and Technology (NIST) certified reference materials (rice flour, oyster tissue, non-fat powdered milk, peach leaves, tomato leaves, apple leaves, and citrus leaves). Our results show good agreement with the NIST certified values, indicating that the method developed in the present study is suitable for the determination of the aforementioned elements in biological samples without undue interference problems

  10. Improved discovery of NEON data and samples though vocabularies, workflows, and web tools

    Science.gov (United States)

    Laney, C. M.; Elmendorf, S.; Flagg, C.; Harris, T.; Lunch, C. K.; Gulbransen, T.

    2017-12-01

    The National Ecological Observatory Network (NEON) is a continental-scale ecological observation facility sponsored by the National Science Foundation and operated by Battelle. NEON supports research on the impacts of invasive species, land use change, and environmental change on natural resources and ecosystems by gathering and disseminating a full suite of observational, instrumented, and airborne datasets from field sites across the U.S. NEON also collects thousands of samples from soil, water, and organisms every year, and partners with numerous institutions to analyze and archive samples. We have developed numerous new technologies to support processing and discovery of this highly diverse collection of data. These technologies include applications for data collection and sample management, processing pipelines specific to each collection system (field observations, installed sensors, and airborne instruments), and publication pipelines. NEON data and metadata are discoverable and downloadable via both a public API and data portal. We solicit continued engagement and advice from the informatics and environmental research communities, particularly in the areas of data versioning, usability, and visualization.

  11. Improving Precision and Reducing Runtime of Microscopic Traffic Simulators through Stratified Sampling

    Directory of Open Access Journals (Sweden)

    Khewal Bhupendra Kesur

    2013-01-01

    Full Text Available This paper examines the application of Latin Hypercube Sampling (LHS) and Antithetic Variables (AVs) to reduce the variance of estimated performance measures from microscopic traffic simulators. LHS and AV allow for a more representative coverage of input probability distributions through stratification, reducing the standard error of simulation outputs. Two methods of implementation are examined, one where stratification is applied to headways and routing decisions of individual vehicles and another where vehicle counts and entry times are more evenly sampled. The proposed methods have wider applicability in general queuing systems. LHS is found to outperform AV, and reductions of up to 71% in the standard error of estimates of traffic network performance relative to independent sampling are obtained. LHS allows for a reduction in the execution time of computationally expensive microscopic traffic simulators as fewer simulations are required to achieve a fixed level of precision with reductions of up to 84% in computing time noted on the test cases considered. The benefits of LHS are amplified for more congested networks and as the required level of precision increases.
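
    To make the stratification idea concrete, the sketch below draws vehicle headways from an exponential distribution through the inverse-CDF transform, once with plain Monte Carlo uniforms and once with Latin Hypercube stratified uniforms, and compares the spread of the replication means. It is an illustration of LHS variance reduction only, with assumed headway parameters, not the traffic simulator or the paper's experiments.

```python
import numpy as np
from scipy.stats import qmc, expon

n_reps, n_veh, mean_headway = 20, 500, 4.0   # assumed toy settings

def std_of_replication_means(stratified):
    means = []
    for seed in range(n_reps):
        if stratified:
            u = qmc.LatinHypercube(d=1, seed=seed).random(n_veh).ravel()
        else:
            u = np.random.default_rng(seed).random(n_veh)
        headways = expon.ppf(u, scale=mean_headway)   # inverse-CDF transform of the uniforms
        means.append(headways.mean())
    return np.std(means, ddof=1)

print("std of means, independent sampling:", std_of_replication_means(False))
print("std of means, Latin Hypercube:     ", std_of_replication_means(True))
```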

  12. Wrong, but useful: regional species distribution models may not be improved by range-wide data under biased sampling.

    Science.gov (United States)

    El-Gabbas, Ahmed; Dormann, Carsten F

    2018-02-01

    Species distribution modeling (SDM) is an essential method in ecology and conservation. SDMs are often calibrated within one country's borders, typically along a limited environmental gradient with biased and incomplete data, making the quality of these models questionable. In this study, we evaluated how adequate are national presence-only data for calibrating regional SDMs. We trained SDMs for Egyptian bat species at two different scales: only within Egypt and at a species-specific global extent. We used two modeling algorithms: Maxent and elastic net, both under the point-process modeling framework. For each modeling algorithm, we measured the congruence of the predictions of global and regional models for Egypt, assuming that the lower the congruence, the lower the appropriateness of the Egyptian dataset to describe the species' niche. We inspected the effect of incorporating predictions from global models as additional predictor ("prior") to regional models, and quantified the improvement in terms of AUC and the congruence between regional models run with and without priors. Moreover, we analyzed predictive performance improvements after correction for sampling bias at both scales. On average, predictions from global and regional models in Egypt only weakly concur. Collectively, the use of priors did not lead to much improvement: similar AUC and high congruence between regional models calibrated with and without priors. Correction for sampling bias led to higher model performance, whatever prior used, making the use of priors less pronounced. Under biased and incomplete sampling, the use of global bats data did not improve regional model performance. Without enough bias-free regional data, we cannot objectively identify the actual improvement of regional models after incorporating information from the global niche. However, we still believe in great potential for global model predictions to guide future surveys and improve regional sampling in data

  13. Detecting representative data and generating synthetic samples to improve learning accuracy with imbalanced data sets.

    Directory of Open Access Journals (Sweden)

    Der-Chiang Li

    Full Text Available It is difficult for learning models to achieve high classification performances with imbalanced data sets, because with imbalanced data sets, when one of the classes is much larger than the others, most machine learning and data mining classifiers are overly influenced by the larger classes and ignore the smaller ones. As a result, the classification algorithms often have poor learning performances due to slow convergence in the smaller classes. To balance such data sets, this paper presents a strategy that involves reducing the sizes of the majority data and generating synthetic samples for the minority data. In the reducing operation, we use the box-and-whisker plot approach to exclude outliers and the Mega-Trend-Diffusion method to find representative data from the majority data. To generate the synthetic samples, we propose a counterintuitive hypothesis to find the distributed shape of the minority data, and then produce samples according to this distribution. Four real datasets were used to examine the performance of the proposed approach. We used paired t-tests to compare the Accuracy, G-mean, and F-measure scores of the proposed data pre-processing (PPDP) method merging in the D3C method (PPDP+D3C) with those of the one-sided selection (OSS), the well-known SMOTEBoost (SB) study, the normal distribution-based oversampling (NDO) approach, and the proposed data pre-processing (PPDP) method. The results indicate that the classification performance of the proposed approach is better than that of the above-mentioned methods.
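
    The sketch below illustrates the generic two-part workflow the abstract describes: trimming majority-class outliers with the box-and-whisker (1.5 x IQR) rule, and generating synthetic minority samples from a fitted distribution. The distribution-based generation step stands in for the paper's Mega-Trend-Diffusion and PPDP procedure, which is not reproduced; all function names and data are illustrative.

```python
import numpy as np

def iqr_filter(X):
    """Drop rows containing an outlier by the box-and-whisker (1.5*IQR) rule."""
    q1, q3 = np.percentile(X, [25, 75], axis=0)
    lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
    return X[np.all((X >= lo) & (X <= hi), axis=1)]

def synth_minority(X_min, n_new, rng):
    """Generate synthetic minority rows from a per-feature normal fit
    (normal-distribution-based oversampling; a stand-in for the paper's method)."""
    mu, sigma = X_min.mean(axis=0), X_min.std(axis=0, ddof=1)
    return rng.normal(mu, sigma, size=(n_new, X_min.shape[1]))

rng = np.random.default_rng(0)
X_major = rng.normal(0.0, 1.0, size=(500, 3))     # majority class (toy)
X_minor = rng.normal(2.0, 0.5, size=(30, 3))      # minority class (toy)
X_major_reduced = iqr_filter(X_major)
X_minor_balanced = np.vstack([
    X_minor,
    synth_minority(X_minor, len(X_major_reduced) - len(X_minor), rng),
])
print(X_major_reduced.shape, X_minor_balanced.shape)
```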

  14. Cluster chemical ionization for improved confidence level in sample identification by gas chromatography/mass spectrometry.

    Science.gov (United States)

    Fialkov, Alexander B; Amirav, Aviv

    2003-01-01

    Upon the supersonic expansion of helium mixed with vapor from an organic solvent (e.g. methanol), various clusters of the solvent with the sample molecules can be formed. As a result of 70 eV electron ionization of these clusters, cluster chemical ionization (cluster CI) mass spectra are obtained. These spectra are characterized by the combination of EI mass spectra of vibrationally cold molecules in the supersonic molecular beam (cold EI) with CI-like appearance of abundant protonated molecules, together with satellite peaks corresponding to protonated or non-protonated clusters of sample compounds with 1-3 solvent molecules. Like CI, cluster CI preferably occurs for polar compounds with high proton affinity. However, in contrast to conventional CI, for non-polar compounds or those with reduced proton affinity the cluster CI mass spectrum converges to that of cold EI. The appearance of a protonated molecule and its solvent cluster peaks, plus the lack of protonation and cluster satellites for prominent EI fragments, enable the unambiguous identification of the molecular ion. In turn, the insertion of the proper molecular ion into the NIST library search of the cold EI mass spectra eliminates those candidates with incorrect molecular mass and thus significantly increases the confidence level in sample identification. Furthermore, molecular mass identification is of prime importance for the analysis of unknown compounds that are absent in the library. Examples are given with emphasis on the cluster CI analysis of carbamate pesticides, high explosives and unknown samples, to demonstrate the usefulness of Supersonic GC/MS (GC/MS with supersonic molecular beam) in the analysis of these thermally labile compounds. Cluster CI is shown to be a practical ionization method, due to its ease-of-use and fast instrumental conversion between EI and cluster CI, which involves the opening of only one valve located at the make-up gas path. The ease-of-use of cluster CI is analogous

  15. Improved prediction of MHC class I and class II epitopes using a novel Gibbs sampling approach

    DEFF Research Database (Denmark)

    Nielsen, Morten; Lundegaard, Claus; Worning, Peder

    2004-01-01

    Prediction of which peptides will bind a specific major histocompatibility complex (MHC) constitutes an important step in identifying potential T-cell epitopes suitable as vaccine candidates. MHC class II binding peptides have a broad length distribution complicating such predictions. Thus......, identifying the correct alignment is a crucial part of identifying the core of an MHC class II binding motif. In this context, we wish to describe a novel Gibbs motif sampler method ideally suited for recognizing such weak sequence motifs. The method is based on the Gibbs sampling method, and it incorporates...
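
    To make the idea of a Gibbs motif sampler concrete, a minimal didactic sketch is given below: one sequence is left out at a time, a position-specific profile is built from the current motif alignments of the remaining sequences, and a new motif start for the left-out sequence is drawn in proportion to the profile score. This is a textbook-style illustration on toy peptide strings, not the authors' method, which additionally handles variable peptide lengths, sequence weighting, and other refinements.

```python
import numpy as np

def gibbs_motif_sampler(seqs, motif_len, n_iter=500, pseudo=1.0, seed=0):
    """Minimal Gibbs motif sampler for fixed-length motifs (didactic sketch)."""
    rng = np.random.default_rng(seed)
    alphabet = sorted(set("".join(seqs)))
    idx = {a: i for i, a in enumerate(alphabet)}
    starts = [rng.integers(0, len(s) - motif_len + 1) for s in seqs]

    def profile(exclude):
        counts = np.full((motif_len, len(alphabet)), pseudo)   # pseudo-counts
        for j, s in enumerate(seqs):
            if j == exclude:
                continue
            for p, a in enumerate(s[starts[j]:starts[j] + motif_len]):
                counts[p, idx[a]] += 1
        return counts / counts.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        j = rng.integers(len(seqs))            # leave one sequence out
        prof = profile(j)
        s = seqs[j]
        scores = np.array([                    # score every candidate motif start
            np.prod([prof[p, idx[a]] for p, a in enumerate(s[k:k + motif_len])])
            for k in range(len(s) - motif_len + 1)
        ])
        starts[j] = rng.choice(len(scores), p=scores / scores.sum())
    return starts

# Toy usage: recover the shared 9-mer core "QKRPSQRHG" in short peptide-like strings.
peptides = ["AAQKRPSQRHGAAA", "GGQKRPSQRHGTT", "TTAQKRPSQRHG", "QKRPSQRHGAAAA"]
print(gibbs_motif_sampler(peptides, motif_len=9))
```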

  16. An Improved Estimation of Regional Fractional Woody/Herbaceous Cover Using Combined Satellite Data and High-Quality Training Samples

    Directory of Open Access Journals (Sweden)

    Xu Liu

    2017-01-01

    Full Text Available Mapping vegetation cover is critical for understanding and monitoring ecosystem functions in semi-arid biomes. As existing estimates tend to underestimate the woody cover in areas with dry deciduous shrubland and woodland, we present an approach to improve the regional estimation of woody and herbaceous fractional cover in the East Asia steppe. This developed approach uses Random Forest models by combining multiple remote sensing data—training samples derived from high-resolution imagery in a tailored spatial sampling and model inputs composed of specific metrics from the MODIS sensor and ancillary variables including topographic, bioclimatic, and land surface information. We emphasize that effective spatial sampling, high-quality classification, and adequate geospatial information are important prerequisites of establishing appropriate model inputs and achieving high-quality training samples. This study suggests that the optimal models improve estimation accuracy (NMSE 0.47 for woody and 0.64 for herbaceous plants) and show a consistent agreement with field observations. Compared with an existing woody cover product, the proposed woody cover estimation can delineate regions with subshrubs and shrubs, showing an improved capability of capturing spatialized detail of vegetation signals. This approach can be applicable over sizable semi-arid areas such as temperate steppes, savannas, and prairies.
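
    The sketch below shows the general shape of such a Random Forest regression of fractional woody cover on remote-sensing and ancillary predictors, using synthetic data. The predictor names (an NDVI metric, elevation, precipitation, land-surface temperature) and the data-generating relationship are placeholders for the MODIS metrics and ancillary layers described above, not the study's actual inputs.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.uniform(0.1, 0.8, n),    # growing-season NDVI (placeholder metric)
    rng.uniform(200, 1500, n),   # elevation (m)
    rng.uniform(150, 450, n),    # annual precipitation (mm)
    rng.uniform(275, 305, n),    # land-surface temperature (K)
])
# Synthetic "truth": fractional woody cover in [0, 1].
woody_cover = np.clip(0.6 * X[:, 0] + 0.0003 * X[:, 2] - 0.05
                      + rng.normal(0, 0.03, n), 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, woody_cover, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
nmse = mean_squared_error(y_te, rf.predict(X_te)) / np.var(y_te)   # normalized MSE, as in the text
print("NMSE:", round(nmse, 3))
```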

  17. Improved analytical sensitivity for uranium and plutonium in environmental samples: Cavity ion source thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Ingeneri, Kristofer; Riciputi, L.

    2001-01-01

    Following successful field trials, environmental sampling has played a central role as a routine part of safeguards inspections since early 1996 to verify declared and to detect undeclared activity. The environmental sampling program has brought a new series of analytical challenges, and driven a need for advances in verification technology. Environmental swipe samples are often extremely low in concentration of analyte (ng level or lower), yet the need to analyze these samples accurately and precisely is vital, particularly for the detection of undeclared nuclear activities. Thermal ionization mass spectrometry (TIMS) is the standard method of determining isotope ratios of uranium and plutonium in the environmental sampling program. TIMS analysis typically employs 1-3 filaments to vaporize and ionize the sample, and the ions are mass separated and analyzed using magnetic sector instruments due to their high mass resolution and high ion transmission. However, the ionization efficiency (the ratio of material actually detected to material present) of uranium using a standard TIMS instrument is low (0.2%), even under the best conditions. Increasing ionization efficiency by even a small amount would have a dramatic impact for safeguards applications, allowing both improvements in analytical precision and a significant decrease in the amount of uranium and plutonium required for analysis, increasing the sensitivity of environmental sampling

  18. Characterization of spatial distribution of Tetranychus urticae in peppermint in California and implication for improving sampling plan.

    Science.gov (United States)

    Rijal, Jhalendra P; Wilson, Rob; Godfrey, Larry D

    2016-02-01

    Twospotted spider mite, Tetranychus urticae Koch, is an important pest of peppermint in California, USA. Spider mite feeding on peppermint leaves causes physiological changes in the plant, which, coupled with favorable environmental conditions, can lead to increased mite infestations. Significant yield loss can occur in the absence of pest monitoring and timely management. Understanding the within-field spatial distribution of T. urticae is critical for the development of a reliable sampling plan. The study reported here aims to characterize the spatial distribution of mite infestation in four commercial peppermint fields in northern California using two spatial techniques, the variogram and Spatial Analysis by Distance IndicEs (SADIE). Variogram analysis revealed strong evidence for a spatially dependent (aggregated) mite population on 13 of 17 sampling dates, and the physical distance of the aggregation reached a maximum of 7 m in peppermint fields. Using SADIE, 11 of 17 sampling dates showed an aggregated distribution pattern of mite infestation. Combining results from variogram and SADIE analyses, the spatial aggregation of T. urticae was evident in all four fields for all 17 sampling dates evaluated. Comparing spatial association using SADIE, ca. 62% of the total sampling pairs showed a positive association of mite spatial distribution patterns between two consecutive sampling dates, which indicates a strong spatial and temporal stability of mite infestation in peppermint fields. These results are discussed in relation to the behavior of spider mite distribution within fields, and the implications for improving sampling guidelines that are essential for effective pest monitoring and management.
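
    As a minimal sketch of the variogram step only (SADIE and variogram model fitting are not reproduced), the code below computes an empirical semivariogram from point counts on a toy sampling grid: gamma(h) is the mean of 0.5*(z_i - z_j)^2 over pairs whose separation falls in each lag bin. The grid, counts, and lag bins are assumptions for illustration.

```python
import numpy as np

def empirical_semivariogram(coords, values, lags):
    """Empirical semivariogram over distance bins defined by consecutive lag edges."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)   # pairwise distances
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2                    # semivariance terms
    iu = np.triu_indices(len(values), k=1)                                 # each pair once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(lags[:-1], lags[1:])])

# Toy usage: mite counts on a 10 x 10 grid of sample points 1 m apart.
rng = np.random.default_rng(3)
xy = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
counts = rng.poisson(5 + 3 * np.sin(xy[:, 0] / 3.0)).astype(float)
print(empirical_semivariogram(xy, counts, lags=np.arange(1, 8)))
```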

  19. Using Dried Blood Spot Sampling to Improve Data Quality and Reduce Animal Use in Mouse Pharmacokinetic Studies

    Science.gov (United States)

    Wickremsinhe, Enaksha R; Perkins, Everett J

    2015-01-01

    Traditional pharmacokinetic analysis in nonclinical studies is based on the concentration of a test compound in plasma and requires approximately 100 to 200 µL blood collected per time point. However, the total blood volume of mice limits the number of samples that can be collected from an individual animal—often to a single collection per mouse—thus necessitating dosing multiple mice to generate a pharmacokinetic profile in a sparse-sampling design. Compared with traditional methods, dried blood spot (DBS) analysis requires smaller volumes of blood (15 to 20 µL), thus supporting serial blood sampling and the generation of a complete pharmacokinetic profile from a single mouse. Here we compare plasma-derived data with DBS-derived data, explain how to adopt DBS sampling to support discovery mouse studies, and describe how to generate pharmacokinetic and pharmacodynamic data from a single mouse. Executing novel study designs that use DBS enhances the ability to identify and streamline better drug candidates during drug discovery. Implementing DBS sampling can reduce the number of mice needed in a drug discovery program. In addition, the simplicity of DBS sampling and the smaller numbers of mice needed translate to decreased study costs. Overall, DBS sampling is consistent with 3Rs principles by achieving reductions in the number of animals used, decreased restraint-associated stress, improved data quality, direct comparison of interanimal variability, and the generation of multiple endpoints from a single study. PMID:25836959

  20. Improving neutron multiplicity counting for the spatial dependence of multiplication: Results for spherical plutonium samples

    Energy Technology Data Exchange (ETDEWEB)

    Göttsche, Malte, E-mail: malte.goettsche@physik.uni-hamburg.de; Kirchner, Gerald

    2015-10-21

    The fissile mass deduced from a neutron multiplicity counting measurement of high mass dense items is underestimated if the spatial dependence of the multiplication is not taken into account. It is shown that an appropriate physics-based correction successfully removes the bias. It depends on four correction coefficients which can only be exactly determined if the sample geometry and composition are known. In some cases, for example in warhead authentication, available information on the sample will be very limited. MCNPX-PoliMi simulations have been performed to obtain the correction coefficients for a range of spherical plutonium metal geometries, with and without polyethylene reflection placed around the spheres. For hollow spheres, the analysis shows that the correction coefficients can be approximated with high accuracy as a function of the sphere's thickness depending only slightly on the radius. If the thickness remains unknown, less accurate estimates of the correction coefficients can be obtained from the neutron multiplication. The influence of isotopic composition is limited. The correction coefficients become somewhat smaller when reflection is present.

  1. Improving off-line accelerated tryptic digestion. Towards fast-lane proteolysis of complex biological samples.

    Science.gov (United States)

    Vukovic, Jadranka; Loftheim, Håvard; Winther, Bjørn; Reubsaet, J Léon E

    2008-06-27

    Off-line digestion of proteins using immobilized trypsin beads is studied with respect to the format of the digestion reactor, the digestion conditions, the comparison with in-solution digestion and its use in complex biological samples. The use of the filter vial as the most appropriate digestion reactor enables simple, efficient and easy-to-handle off-line digestion of the proteins on trypsin beads. It was shown that complex proteins like bovine serum albumin (BSA) need a much longer time (89 min) and elevated temperature (37 degrees C) to be digested to an acceptable level compared to smaller proteins like cytochrome c (5 min, room temperature). Comparing the BSA digestion using immobilized trypsin beads with conventional in-solution digestion (overnight at 37 degrees C), it was shown that comparable results were obtained with respect to sequence coverage (>90%) and number of missed cleavages (in both cases around 20 peptides with 1 or 2 missed cleavages were detected). However, the digestion using immobilized trypsin beads was considerably less time-consuming. Good reproducibility and signal intensities were obtained for the digestion products of BSA in a complex urine sample. In addition to this, peptide products of proteins typically present in urine were identified.

  2. Improving forensic mental health care to Indigenous Australians: theorizing the intercultural space.

    Science.gov (United States)

    Durey, A; Wynaden, D; O'Kane, M

    2014-05-01

    This paper uses the 'intercultural space' as an educational strategy to prepare nurses to work respectfully with Indigenous patients in a forensic mental health context; offers an educational approach that introduces nurses to Indigenous knowledge, beliefs and values, examines power relations in colonized countries between the dominant white cultural group and the Indigenous population and encourages nurses to critically reflect on their health care practice; and explores the intercultural space as a shared space between cultures fostering open and robust inquiry where neither culture dominates and new positions, representations and understandings can emerge. Given the disproportionately high number of Indigenous people imprisoned in colonized countries, this paper responds to research from Western Australia on the need to prepare forensic mental health nurses to deliver care to Indigenous patients with mental health disorders. The paper highlights the nexus between theory, research and education that can inform the design and implementation of programmes to help nurses navigate the complex, layered and contested 'intercultural space' and deliver culturally safe care to Indigenous patients. Nurses are encouraged to critically reflect on how beliefs and values underpinning their cultural positioning impact on health care to Indigenous patients. The paper draws on intercultural theory to offer a pedagogical framework that acknowledges the negative impacts of colonization on Indigenous health and well-being, repositions and revalues Indigenous cultures and knowledges and fosters open and robust inquiry. This approach is seen as a step towards working more effectively in the intercultural space where ultimately binary oppositions that privilege one culture over another and inhibit robust inquiry are avoided, paving the way for new, more inclusive positions, representations and understandings to emerge. While the intercultural space can be a place of struggle, tension

  3. Analysis of human serum by liquid chromatography-mass spectrometry: improved sample preparation and data analysis.

    Science.gov (United States)

    Govorukhina, N I; Reijmers, T H; Nyangoma, S O; van der Zee, A G J; Jansen, R C; Bischoff, R

    2006-07-07

    Discovery of biomarkers is a fast developing field in proteomics research. Liquid chromatography coupled on-line to mass spectrometry (LC-MS) has become a powerful method for the sensitive detection, quantification and identification of proteins and peptides in biological fluids like serum. However, the presence of highly abundant proteins often masks those of lower abundance and thus generally prevents their detection and identification in proteomics studies. To perform future comparative analyses of samples from a serum bank of cervical cancer patients in a longitudinal and cross-sectional manner, methodology based on the depletion of high-abundance proteins followed by tryptic digestion and LC-MS has been developed. Two sample preparation methods were tested in terms of their efficiency to deplete high-abundance serum proteins and how they affect the repeatability of the LC-MS data sets. The first method comprised depletion of human serum albumin (HSA) on a dye-ligand chromatographic support and of immunoglobulin G (IgG) on an immobilized Protein A support, followed by tryptic digestion, fractionation by cation-exchange chromatography, trapping on a C18 column and reversed-phase LC-MS. The second method included depletion of the six most abundant serum proteins based on multiple immunoaffinity chromatography followed by tryptic digestion, trapping on a C18 column and reversed-phase LC-MS. Repeatability of the overall procedures was evaluated in terms of retention time and peak area for a selected number of endogenous peptides, showing that the second method, besides being less time consuming, gave more repeatable results (retention time: <0.1% RSD; peak area: <30% RSD). Application of an LC-MS component detection algorithm followed by principal component analysis (PCA) enabled discrimination of serum samples that were spiked with horse heart cytochrome C from non-spiked serum and the detection of a concentration trend, which correlated to the amount of spiked horse heart cytochrome C.

  4. Improving the Calibration of Image Sensors Based on IOFBs, Using Differential Gray-Code Space Encoding

    Directory of Open Access Journals (Sweden)

    Carlos Luna Vázquez

    2012-07-01

    This paper presents a fast calibration method to determine the transfer function for spatial correspondences in image transmission devices with Incoherent Optical Fiber Bundles (IOFBs), by performing a scan of the input using differential patterns generated from a Gray code (Differential Gray-Code Space Encoding, DGSE). The results demonstrate that this technique provides a noticeable reduction in processing time and better quality of the reconstructed image compared to other, previously employed techniques, such as point or fringe scanning, or even other known space encoding techniques.
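
    The record describes the encoding idea only at a high level; the following minimal Python sketch illustrates what a differential Gray-code scan could look like for a one-dimensional column index: each Gray-code bit plane is projected as a stripe pattern together with its complement, and the column is recovered from the sign of the difference image. The pattern layout and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gray_encode(n: int) -> int:
    """Binary-reflected Gray code of integer n."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Inverse of gray_encode."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def dgse_patterns(width: int):
    """Yield (positive, negative) stripe-pattern pairs, one pair per Gray bit plane.

    Each pattern is a 1-D array of 0/1 values over `width` input columns; the
    negative pattern is the complement, so the decoder can threshold the
    *difference* of the two acquired images instead of an absolute level.
    """
    n_bits = int(np.ceil(np.log2(width)))
    codes = np.array([gray_encode(c) for c in range(width)])
    for bit in range(n_bits - 1, -1, -1):
        pos = (codes >> bit) & 1
        yield pos.astype(np.uint8), (1 - pos).astype(np.uint8)

def decode_column(bit_signs):
    """Recover the input column index from the per-plane signs of (pos - neg)."""
    g = 0
    for s in bit_signs:          # most significant Gray bit first
        g = (g << 1) | (1 if s > 0 else 0)
    return gray_decode(g)

# toy round-trip check for a 16-column input
width = 16
planes = list(dgse_patterns(width))
col = 11
signs = [int(pos[col]) - int(neg[col]) for pos, neg in planes]
assert decode_column(signs) == col
```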

  5. Application of spinal code for performance improvement in free-space optical communications

    Science.gov (United States)

    Saiki, Naoya; Okamoto, Eiji; Takenaka, Hideki; Toyoshima, Morio

    2017-09-01

    In recent years, the demand for high-capacity communication has grown, and fiber-optic transmission is being used in wired communications to meet this demand. Similarly, free-space optics (FSO), which is an optical wireless communication technology that uses laser light, has attracted much attention and has been considered as a suitable alternative to satisfy this demand in wireless communications. Free-space optical communication uses a hundred THz frequency band and allows for high-speed and radio-regulation free transmission, which may provide a solution for the current shortage of radio frequency bands.

  6. A call to improve sampling methodology and reporting in young novice driver research.

    Science.gov (United States)

    Scott-Parker, B; Senserrick, T

    2017-02-01

    Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention. Evidence-based improvement depends to a great extent upon research methodology quality and its reporting, with known limitations in the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research and their reporting in four peer-review journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  7. Apparent density measurement by mercury pycnometry. Improved accuracy. Simplification of handling for possible application to irradiated samples

    International Nuclear Information System (INIS)

    Marlet, Bernard

    1978-12-01

    The accuracy of the apparent density measurement on massive samples of any geometrical shape has been improved and the method simplified. A standard deviation of ±1 to 5 × 10⁻³ g·ml⁻¹, according to the size and surface state of the sample, was obtained by the use of a flat ground stopper on a mercury pycnometer which fills itself under vacuum. This method saves considerable time and has been adapted to work in shielded cells for the measurement of radioactive materials, especially sintered uranium dioxide leaving the pile. The different parameters are analysed and criticized [fr
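
    As a concrete illustration of the measurement principle, the standard mercury-pycnometry relation gives the apparent density from three weighings; the sketch below (with invented masses, not values from the report) assumes the usual relation in which the sample volume equals the mass of displaced mercury divided by the mercury density.

```python
def apparent_density(m_sample, m_pyc_hg, m_pyc_hg_sample, rho_hg=13.546):
    """Apparent density from mercury pycnometry (standard relation, g/ml).

    m_pyc_hg:        mass of the pycnometer filled with mercury only
    m_pyc_hg_sample: mass of the pycnometer containing the sample, topped up with mercury
    The sample volume equals the mass of mercury it displaces divided by rho_hg.
    """
    displaced_hg = m_pyc_hg + m_sample - m_pyc_hg_sample
    return m_sample * rho_hg / displaced_hg

# illustrative masses in grams (not measurements from the report)
print(round(apparent_density(m_sample=12.30, m_pyc_hg=510.00, m_pyc_hg_sample=506.80), 3))
```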

  8. Improvement of the qualitative and quantitative detection of simultaneously present fluorescent tracers by systematic sample treatment

    International Nuclear Information System (INIS)

    Behrens, H.

    1982-01-01

    The selective instrumental detection of individual fluorescent tracers in mixtures containing further fluorescent dyes is limited by spectral interferences. Therefore additional separations or other suitable procedures have to be included in the analytical technique. With the method described below, the respective tracer to be detected remains at its initial concentration in the sample and is analysed under the appropriate conditions, whereas the interfering tracers are separated or suppressed. The techniques applied here are based on the facts that 1) the fluorescence intensity of the tracers varies differently when the pH-value changes; 2) the tracers show different absorption behaviour and 3) they provide different degrees of light sensitivity. The procedures permit, for example, the detection of uranin when eosin is present at a higher concentration, or the detection of eosin when amidorhodamin G is present. (orig.) [de

  9. Improved sample preparation for CE-LIF analysis of plant N-glycans.

    Science.gov (United States)

    Nagels, Bieke; Santens, Francis; Weterings, Koen; Van Damme, Els J M; Callewaert, Nico

    2011-12-01

    In view of glycomics studies in plants, it is important to have sensitive tools that allow one to analyze and characterize the N-glycans present on plant proteins in different species. Earlier methods combined plant-based sample preparations with CE-LIF N-glycan analysis but suffered from background contaminations, often resulting in non-reproducible results. This publication describes a reproducible and sensitive protocol for the preparation and analysis of plant N-glycans, based on a combination of the 'in-gel release method' and N-glycan analysis on a multicapillary DNA sequencer. Our protocol makes it possible to analyze plant N-glycans starting from low amounts of plant material with highly reproducible results. The developed protocol was validated for different plant species and plant cells. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Unbiased tensor-based morphometry: improved robustness and sample size estimates for Alzheimer's disease clinical trials.

    Science.gov (United States)

    Hua, Xue; Hibar, Derrek P; Ching, Christopher R K; Boyle, Christina P; Rajagopalan, Priya; Gutman, Boris A; Leow, Alex D; Toga, Arthur W; Jack, Clifford R; Harvey, Danielle; Weiner, Michael W; Thompson, Paul M

    2013-02-01

    Various neuroimaging measures are being evaluated for tracking Alzheimer's disease (AD) progression in therapeutic trials, including measures of structural brain change based on repeated scanning of patients with magnetic resonance imaging (MRI). Methods to compute brain change must be robust to scan quality. Biases may arise if any scans are thrown out, as this can lead to the true changes being overestimated or underestimated. Here we analyzed the full MRI dataset from the first phase of the Alzheimer's Disease Neuroimaging Initiative (ADNI-1) and assessed several sources of bias that can arise when tracking brain changes with structural brain imaging methods, as part of a pipeline for tensor-based morphometry (TBM). In all healthy subjects who completed MRI scanning at screening, 6, 12, and 24 months, brain atrophy was essentially linear with no detectable bias in longitudinal measures. In power analyses for clinical trials based on these change measures, only 39 AD patients and 95 mild cognitive impairment (MCI) subjects were needed for a 24-month trial to detect a 25% reduction in the average rate of change using a two-sided test (α=0.05, power=80%). Further sample size reductions were achieved by stratifying the data into Apolipoprotein E (ApoE) ε4 carriers versus non-carriers. We show how selective data exclusion affects sample size estimates, motivating an objective comparison of different analysis techniques based on statistical power and robustness. TBM is an unbiased, robust, high-throughput imaging surrogate marker for large, multi-site neuroimaging studies and clinical trials of AD and MCI. Copyright © 2012 Elsevier Inc. All rights reserved.
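
    The quoted sample sizes follow from a standard two-sample power calculation on the change measure; the sketch below reproduces that type of calculation with the usual normal-approximation formula and purely illustrative effect sizes (not the ADNI-derived rates and variances used in the paper).

```python
from scipy.stats import norm

def n_per_arm(mean_change, sd_change, pct_reduction=0.25, alpha=0.05, power=0.80):
    """Two-sample sample-size estimate (normal approximation, two-sided test).

    mean_change / sd_change: trial-length atrophy rate and its standard deviation
    pct_reduction: treatment effect expressed as a fraction of mean_change
    """
    delta = pct_reduction * mean_change
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return 2 * ((z_a + z_b) * sd_change / delta) ** 2

# illustrative numbers only (not the ADNI estimates from the paper)
print(round(n_per_arm(mean_change=2.0, sd_change=1.1)))  # ~76 subjects per arm
```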

  11. An improved empirical model for diversity gain on Earth-space propagation paths

    Science.gov (United States)

    Hodge, D. B.

    1981-01-01

    An empirical model was generated to estimate diversity gain on Earth-space propagation paths as a function of Earth terminal separation distance, link frequency, elevation angle, and angle between the baseline and the path azimuth. The resulting model reproduces the entire experimental data set with an RMS error of 0.73 dB.
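
    The abstract does not reproduce the model itself, so the sketch below only illustrates the general procedure of fitting an empirical diversity-gain expression to measured data; the saturating-exponential form, the coefficients, and the synthetic data are assumptions for illustration, not Hodge's published model.

```python
import numpy as np
from scipy.optimize import curve_fit

def diversity_gain(X, a, b, c_f, c_th, c_psi):
    """Assumed separable form: a distance term times small corrections in f, theta, psi."""
    d, f, theta, psi = X
    return a * (1.0 - np.exp(-b * d)) * np.exp(-c_f * f) * (1 + c_th * theta) * (1 + c_psi * psi)

# X rows: separation distance [km], frequency [GHz], elevation [deg], baseline-azimuth angle [deg]
# G: measured diversity gain [dB]; synthetic here, replace with real beacon measurements
rng = np.random.default_rng(0)
d, f = rng.uniform(2, 30, 200), rng.uniform(10, 30, 200)
theta, psi = rng.uniform(10, 50, 200), rng.uniform(0, 90, 200)
G = 6 * (1 - np.exp(-0.2 * d)) * np.exp(-0.02 * f) * (1 + 0.005 * theta) + rng.normal(0, 0.5, 200)

popt, _ = curve_fit(diversity_gain, (d, f, theta, psi), G, p0=[5, 0.1, 0.01, 0.01, 0.001])
rms = np.sqrt(np.mean((diversity_gain((d, f, theta, psi), *popt) - G) ** 2))
print(popt, rms)  # RMS error in dB of the fitted model over the data set
```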

  12. Improvement of Thrust Bearing Calculation Considering the Convectional Heating within the Space between the Pads

    Directory of Open Access Journals (Sweden)

    Monika Chmielowiec-Jablczyk

    2018-02-01

    A modern thrust bearing tool is used to estimate the behavior of tilting pad thrust bearings not only in the oil film between pad and rotating collar, but also in the space between the pads. The oil flow in the space significantly influences the oil film inlet temperature and the heating of pad and collar. For that reason, it is necessary to define an oil mixing model for the space between the pads. In the bearing tool, the solutions of the Reynolds equation including a cavitation model, the energy equation and the heat transfer equation are done iteratively with the finite volume method by considering a constant flow rate. Both effects—laminar/turbulent flow and centrifugal force—are considered. The calculation results are compared with measurements done for a flooded thrust bearing with nominal eight tilting pads with an outer diameter of 180 mm. The heat convection coefficients for the pad surfaces mainly influence the pad temperature field and are adjusted to the measurement results. In the following paper, the calculation results for variable space distances, influence of different parameters on the bearing behavior and operating condition at high load are presented.
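
    A full thermo-hydrodynamic tool of the kind described is far beyond a snippet, but the hydrodynamic core can be hinted at with a one-dimensional, isothermal Reynolds equation for a single tilted pad; the geometry, viscosity and sliding speed below are illustrative values only, and the coupling to the energy and heat-transfer equations is omitted.

```python
import numpy as np

def pad_pressure(n=201, L=0.05, h_in=60e-6, h_out=30e-6, U=20.0, mu=0.02):
    """1-D isothermal Reynolds equation d/dx(h^3 dp/dx) = 6*mu*U*dh/dx for a tilted pad.

    Central finite differences on n nodes, p = 0 at both pad edges (ambient).
    Returns x coordinates [m] and gauge pressure [Pa].
    """
    x = np.linspace(0.0, L, n)
    dx = x[1] - x[0]
    h = h_in + (h_out - h_in) * x / L          # linear film-thickness profile
    h3 = h ** 3

    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(1, n - 1):
        h3_e = 0.5 * (h3[i] + h3[i + 1])       # face-averaged h^3 (east/west faces)
        h3_w = 0.5 * (h3[i] + h3[i - 1])
        A[i, i - 1] = h3_w
        A[i, i] = -(h3_e + h3_w)
        A[i, i + 1] = h3_e
        b[i] = 6.0 * mu * U * (h[i + 1] - h[i - 1]) / 2.0 * dx   # RHS * dx^2
    A[0, 0] = A[-1, -1] = 1.0                   # p = 0 boundary conditions
    p = np.linalg.solve(A, b)
    return x, p

x, p = pad_pressure()
print(f"peak film pressure ~ {p.max()/1e6:.2f} MPa")
```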

  13. An improvement of dimension-free Sobolev imbeddings in r.i. spaces

    Czech Academy of Sciences Publication Activity Database

    Fiorenza, A.; Krbec, Miroslav; Schmeisser, H.-J.

    2014-01-01

    Roč. 267, č. 1 (2014), s. 243-261 ISSN 0022-1236 R&D Projects: GA ČR GAP201/10/1920 Institutional support: RVO:67985840 Keywords : imbedding theorem * small Lebesgue space * rearrangement-invariant Banach Subject RIV: BA - General Mathematics Impact factor: 1.322, year: 2014 http://www.sciencedirect.com/science/article/pii/S0022123614001724

  14. Improved Epstein-Glaser Renormalization in Coordinate Space I. Euclidean Framework

    International Nuclear Information System (INIS)

    Gracia-Bondia, Jose M.

    2003-01-01

    In a series of papers, we investigate the reformulation of Epstein-Glaser renormalization in coordinate space, both in analytic and (Hopf) algebraic terms. This first article deals with analytical aspects. Some of the (historically good) reasons for the divorces of the Epstein-Glaser method, both from mainstream quantum field theory and the mathematical literature on distributions, are made plain; and overcome

  15. Improving Problem-Solving Skills with the Help of Plane-Space Analogies

    Science.gov (United States)

    Budai, László

    2013-01-01

    We live our lives in three-dimensional space and encounter geometrical problems (equipment instructions, maps, etc.) every day. Yet there are not sufficient opportunities for high school students to learn geometry. New teaching methods can help remedy this. Specifically our experience indicates that there is great promise for use of geometry…

  16. Using Monte Carlo Simulation To Improve Cargo Mass Estimates For International Space Station Commercial Resupply Flights

    Science.gov (United States)

    2016-12-01

    [Only front-matter fragments are available for this record: table-of-contents entries "The Challenges of ISS Resupply" and "The Importance of Mass Properties in Spacecraft and Mission Design", and part of an abbreviation list.] Executive summary (fragment): Resupplying the International Space Station [...] management priorities. This study addresses those challenges by developing Monte Carlo simulations based on over 13 years of as-flown ISS resupply [...]
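
    Only fragments of this record survive, so the following sketch merely illustrates the general Monte Carlo idea of propagating per-category cargo-mass uncertainty into a total manifest-mass distribution; the categories and distributions are invented for illustration and are not the study's as-flown data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative cargo categories: (mean kg, std kg) per flight; not actual ISS manifest data
categories = {
    "crew_supplies":    (900.0, 120.0),
    "vehicle_hardware": (650.0, 200.0),
    "science_payloads": (500.0, 180.0),
    "spacewalk_equip":  (150.0,  60.0),
}

n_trials = 100_000
total = np.zeros(n_trials)
for mean, std in categories.values():
    # truncate at zero so a draw can never remove mass from the manifest
    total += np.clip(rng.normal(mean, std, n_trials), 0.0, None)

p50, p95 = np.percentile(total, [50, 95])
print(f"median manifest mass {p50:.0f} kg, 95th percentile {p95:.0f} kg")
```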

  17. Improvement of a sample preparation method assisted by sodium deoxycholate for mass-spectrometry-based shotgun membrane proteomics.

    Science.gov (United States)

    Lin, Yong; Lin, Haiyan; Liu, Zhonghua; Wang, Kunbo; Yan, Yujun

    2014-11-01

    In current shotgun-proteomics-based biological discovery, the identification of membrane proteins is a challenge. This is especially true for integral membrane proteins due to their highly hydrophobic nature and low abundance. Thus, much effort has been directed at sample preparation strategies such as use of detergents, chaotropes, and organic solvents. We previously described a sample preparation method for shotgun membrane proteomics, the sodium deoxycholate assisted method, which cleverly circumvents many of the challenges associated with traditional sample preparation methods. However, the method is associated with significant sample loss due to the slightly weaker extraction/solubilization ability of sodium deoxycholate when it is used at relatively low concentrations such as 1%. Hence, we present an enhanced sodium deoxycholate sample preparation strategy that first uses a high concentration of sodium deoxycholate (5%) to lyse membranes and extract/solubilize hydrophobic membrane proteins, and then dilutes the detergent to 1% for a more efficient digestion. We then applied the improved method to shotgun analysis of proteins from rat liver membrane enriched fraction. Compared with other representative sample preparation strategies including our previous sodium deoxycholate assisted method, the enhanced sodium deoxycholate method exhibited superior sensitivity, coverage, and reliability for the identification of membrane proteins particularly those with high hydrophobicity and/or multiple transmembrane domains. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Capture and exploration of sample quality data to inform and improve the management of a screening collection.

    Science.gov (United States)

    Charles, Isabel; Sinclair, Ian; Addison, Daniel H

    2014-04-01

    A new approach to the storage, processing, and interrogation of the quality data for screening samples has improved analytical throughput and confidence and enhanced the opportunities for learning from the accumulating records. The approach has entailed the design, development, and implementation of a database-oriented system, capturing information from the liquid chromatography-mass spectrometry capabilities used for assessing the integrity of samples in AstraZeneca's screening collection. A Web application has been developed to enable the visualization and interactive annotation of the analytical data, monitor the current sample queue, and report the throughput rate. Sample purity and identity are certified automatically on the chromatographic peaks of interest if predetermined thresholds are reached on key parameters. Using information extracted in parallel from the compound registration and container inventory databases, the chromatographic and spectroscopic profiles for each vessel are linked to the sample structures and storage histories. A search engine facilitates the direct comparison of results for multiple vessels of the same or similar compounds, for single vessels analyzed at different time points, or for vessels related by their origin or process flow. Access to this network of information has provided a deeper understanding of the multiple factors contributing to sample quality assurance.
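
    One plausible reading of the automatic certification step described above is a simple threshold rule on purity and mass agreement for the peak of interest; a minimal sketch of such a rule is shown below, with placeholder parameter names and thresholds that are assumptions rather than the system's actual criteria.

```python
def certify_sample(purity_pct, mass_error_ppm, purity_min=85.0, mass_tol_ppm=10.0):
    """Auto-certify a screening sample from its LC-MS results.

    purity_pct:     peak-area purity of the expected component
    mass_error_ppm: difference between observed and theoretical accurate mass
    Thresholds here are illustrative placeholders, not the system's criteria.
    """
    identity_ok = abs(mass_error_ppm) <= mass_tol_ppm
    purity_ok = purity_pct >= purity_min
    return "certified" if (identity_ok and purity_ok) else "flag for manual review"

print(certify_sample(purity_pct=92.4, mass_error_ppm=3.1))   # certified
print(certify_sample(purity_pct=78.0, mass_error_ppm=2.0))   # flag for manual review
```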

  19. Two-dimensional gas chromatography-online hydrogenation for improved characterization of petrochemical samples.

    Science.gov (United States)

    Potgieter, H; Bekker, R; Govender, A; Rohwer, E

    2016-05-06

    The Fischer-Tropsch (FT) process produces a variety of hydrocarbons over a wide carbon number range and during subsequent product workup a large variety of synthetic fuels and chemicals are produced. The complexity of the product slate obtained from this process is well documented and the high temperature FT (HT-FT) process products are spread over gas, oil and water phases. The characterization of these phases is very challenging even when using comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOFMS). Despite the increase in separation power, peak co-elution still occurs when samples containing isomeric compounds are analysed by comprehensive two dimensional GC. The separation of isomeric compounds with the same double bond equivalents is especially difficult since these compounds elute in a similar position on the GC×GC chromatogram and have identical molecular masses and similar fragmentation patterns in their electron ionization (EI) mass spectra. On-line hydrogenation after GC×GC separation is a possible way to distinguish between these isomeric compounds since the number of rings and alkene double bonds can be determined from the mass spectra of the compounds before and after hydrogenation. This paper describes development of a GC×GC method with post column hydrogenation for the determination of the backbone of cyclic/olefinic structures enabling us to differentiate between classes like dienes and cyclic olefins in complex petrochemical streams. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Adrenal vein sampling: substantial need for technical improvement at regional referral centres.

    Science.gov (United States)

    Elliott, Panda; Holmes, Daniel T

    2013-10-01

    Adrenal vein sampling (AVS) is the gold standard for localization of aldosterone producing adenoma. The anatomy of the right adrenal vein makes this procedure technically demanding and it may yield no clinical information if the adrenal veins are not adequately cannulated. Having frequently observed the technical failure of AVS, we undertook a review of 220 procedures in British Columbia, Canada. Subjects were retrospectively identified through the laboratory information system. The following were collected: demographics, screening aldosterone concentration and renin activity/mass, results of dynamic function tests, AVS aldosterone and cortisol results. Standard calculations were performed on AVS data and site-specific success rates were compared. The effect of adrenocorticotropin hormone (ACTH) stimulation on the selectivity index (SI) and lateralization index (LI) was explored. The overall technical success rate of AVS was only 44% in procedures where no ACTH stimulation was used (n=200), but this rose significantly when ACTH stimulation was applied. The overall success of AVS is lower than reported elsewhere. Provided that effects on the LI are considered, the use of ACTH stimulation during AVS assists in the identification of unilateral forms of primary aldosteronism (PA). Copyright © 2013 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  1. Improvement of vitamin B-6 production by gamma radiation in bacterial isolates from soil sample

    International Nuclear Information System (INIS)

    Trongpanich, Yanee; Anutrakunchai, Chitchanok; Piadang, Nattayana

    2006-09-01

    A vitamin B-6 producing bacterium, Rhizobium sp. 6.1C1, was isolated from soil and produced 0.27 mg of vitamin B-6 (mainly pyridoxamine) per liter. Rhizobium sp. 6.1C1 is a mesophilic bacterium that was not able to grow at over 40 °C. The objective of this study was to use gamma radiation to improve vitamin B-6 production at high temperature. The results showed that 677 mutant isolates, obtained at irradiation doses of 0.8 and 1 kGy, were able to grow at 50 °C. Only 4 isolates (08-361, 10-3, 10-94 and 10-98) showed a high amount of vitamin B-6 per mg protein. Tests of optimum temperature and initial medium pH showed that isolate 08-361 produced a higher amount of vitamin B-6 than the wild type; however, this value was lower than that obtained during screening. The forms of vitamin B-6 produced by the mutant were identified by HPLC as PM and PMP, the same as for the wild type. The effect of gamma radiation on the stability of the mutant requires further study.

  2. Variable sampling-time technique for improving count rate performance of scintillation detectors

    International Nuclear Information System (INIS)

    Tanaka, E.; Nohara, N.; Murayama, H.

    1979-01-01

    A new technique is presented to improve the count rate capability of a scintillation spectrometer or a position sensitive detector with minimum loss of resolution. The technique is based on the combination of pulse shortening and selective integration in which the integration period is not fixed but is shortened by the arrival of the following pulse. Theoretical analysis of the degradation of the statistical component of resolution is made for the proposed system with delay line pulse shortening, and the factor of resolution loss is formulated as a function of the input pulse rate. A new method is also presented for determining the statistical component of resolution separately from the non-statistical system resolution. Preliminary experiments with a NaI(Tl) detector have been carried out, the results of which are consistent with the theoretical prediction. However, due to the non-exponential scintillation decay of the NaI(Tl) crystal, a simple delay line clipping is not satisfactory, and an RC high-pass filter has been added, which results in further degradation of the statistical resolution. (Auth.)
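
    The trade-off behind the technique (every event is kept, but early-terminated integrations collect a variable light fraction) can be illustrated with a small simulation; the decay constant, window and rate below are assumed values, and the model ignores the non-exponential decay noted by the authors.

```python
import numpy as np

rng = np.random.default_rng(1)
tau = 0.23          # scintillation decay constant (us), NaI(Tl)-like assumption
t_full = 1.0        # nominal integration window (us)
rate = 2.0e5        # input pulse rate (per second)

def integrated_fraction(t_int, tau=tau):
    """Fraction of total scintillation light collected when integrating for t_int."""
    return 1.0 - np.exp(-t_int / tau)

# time to the *next* pulse is exponential with mean 1/rate (converted to us)
gaps_us = rng.exponential(1.0 / rate * 1e6, size=100_000)

# variable sampling time: integrate until the following pulse arrives, capped at t_full
t_var = np.minimum(gaps_us, t_full)
frac_var = integrated_fraction(t_var)

# fixed window for comparison (events whose gap is shorter would be piled up / rejected)
kept = gaps_us >= t_full
print(f"fixed window:    {kept.mean()*100:.1f}% of events usable")
print(f"variable window: 100% usable, mean collected fraction "
      f"{frac_var.mean():.3f} (relative spread {frac_var.std()/frac_var.mean()*100:.1f}%)")
```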

  3. Critical current density improvements in MgB2 superconducting bulk samples by K2CO3 additions  

    DEFF Research Database (Denmark)

    Grivel, J.-C.

    2018-01-01

    MgB2 bulk samples with potassium carbonate doping were made by means of reaction of elemental Mg and B powders mixed with various amounts of K2CO3. The Tc of the superconducting phase as well as its a-axis parameter were decreased as a result of carbon doping. Potassium escaped the samples during reaction. The critical current density of MgB2 was improved both in self-field and under applied magnetic field for T ≤ 30 K, with optimum results for 1 mol% K2CO3 addition. The normalized flux pinning force (f(b)) shows that the flux pinning mechanism at low field is similar for all samples, following...

  4. T2 image contrast evaluation using three dimension sampling perfection with application optimized contrasts using different flip angle evolution (3D-SPACE)

    International Nuclear Information System (INIS)

    Yamazaki, Ryo; Hiura, Yukikazu; Tsuji, Akio; Nishiki, Shigeo; Uchikoshi, Masato

    2011-01-01

    The sampling perfection with application optimized contrasts using different flip angle evolution (3D-SPACE) sequence enables one to decrease the specific absorption rate (SAR) by using a variable flip angle refocusing pulse. Therefore, it is expected that the contrast obtained with 3D-SPACE sequences differs from that of spin echo (SE) and turbo spin echo (TSE) images. The purpose of this study was to evaluate the characteristics of the signal intensity and central nervous system (CNS) image contrast in T2-weighted 3D-SPACE. Using 3 different sequences (SE, 3D-TSE and 3D-SPACE) with repetition time (TR)/echo time (TE)=3500/70, 90 and 115 ms, we obtained T2-weighted magnetic resonance (MR) images of an in-house phantom and five healthy volunteers' brains. The signal intensity of the phantom, which contains materials with various T1 and T2 values, was evaluated. Tissue contrasts of white/gray matter, cerebrospinal fluid (CSF)/subcutaneous fat and gray matter/subcutaneous fat were evaluated in a clinical image study. The phantom study showed that the signal intensity in 3D-SPACE decreased significantly below a T1 value of 250 ms. It decreased markedly in comparison to the other sequences as the effective echo time (TE) was extended. The white/gray matter contrast of 3D-SPACE was the highest of all sequences. On the other hand, the CSF/fat and gray matter/fat contrasts of 3D-SPACE were higher than TSE but lower than SE. The CNS image contrasts of 3D-SPACE were comparable to those of SE. Signal intensity decreased in the range where T1 and T2 values were extremely short. (author)
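
    The abstract does not state how tissue contrast was computed; a common definition is the normalized signal difference, sketched below with arbitrary ROI values for illustration (the formula choice and the numbers are assumptions, not taken from the study).

```python
def tissue_contrast(signal_a: float, signal_b: float) -> float:
    """Relative contrast between two tissues from ROI mean signal intensities."""
    return (signal_a - signal_b) / (signal_a + signal_b)

# illustrative ROI means (arbitrary units), not values from the study
print(round(tissue_contrast(signal_a=620.0, signal_b=410.0), 3))  # e.g. a white/gray-matter pair
```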

  5. Enhanced bacterial affinity of PVDF membrane: its application as improved sea water sampling tool for environmental monitoring.

    Science.gov (United States)

    Kumar, Sweta Binod; Sharnagat, Preeti; Manna, Paramita; Bhattacharya, Amit; Haldar, Soumya

    2017-02-01

    Isolation of diversified bacteria from seawater is a major challenge in the field of environmental microbiology. In the present study, an attempt has been made to select a specific membrane with an improved property of attaching diversified bacteria. Initially, different concentrations (15, 18, and 20% W/W) of polysulfone (PSF) were used to check their affinity for the attachment of selected gram-positive (Bacillus subtilis) and gram-negative (Escherichia coli) bacteria. Among these, 20% W/W PSF showed maximum attachment. Therefore, membranes prepared from other materials, such as polyvinylidene fluoride (PVDF) and polyether sulfone (PES), were used at the same concentration (20% W/W) to check their bacterial attachment properties. A comparative study of bacterial attachment on the three different membranes revealed that PVDF possessed the highest affinity towards both groups of bacteria. This property was confirmed by different analytical methods, viz. contact angle, atomic force microscopy, zeta potential, and flux studies, and further validated with seawater samples collected from seven sites of the western coast and Lakshadweep island of India, using Biolog EcoPlate™. All the samples showed that bacterial richness and diversity were higher on the PVDF membrane in comparison to the surrounding seawater samples. Interestingly, affinity for more diversified bacteria was reported to be higher in water samples with less turbidity and low bacterial load. This finding can facilitate the development of the PVDF (20% W/W) membrane as a simple, cheap, and less labor-intensive environmental sampling tool for the isolation of diversified bacteria from seawater samples with different physicochemical properties.

  6. The effects of spaced retrieval training in improving hyperphagia of people living with dementia in residential settings.

    Science.gov (United States)

    Hsu, Chia-Ning; Lin, Li-Chan; Wu, Shiao-Chi

    2017-10-01

    To investigate the effectiveness of spaced retrieval for improving hyperphagia in patients with dementia in residential care settings. Although 10-30% of patients with dementia have hyperphagia, most studies have focused on eating difficulties. Only a few studies have focused on hyperphagia. Various memory problems cause hyperphagia in patients with dementia. Spaced retrieval, a cognitive technique for information learning, can be used as a training method to improve memory loss. Recent studies showed that patients who received the training successfully memorised information learned in the training and correctly applied it to their daily lives. Single-blind experiments were performed. The 97 subjects with dementia were recruited from seven institutions. All research participants were stratified into three groups according to cognitive impairment severity and Hyperphagic Behavior Scale scores and then randomly assigned to the experimental and control groups. The experimental group received a six-week one-by-one spaced retrieval training for hyperphagia behaviour. The control group received routine care. After the intervention, the frequency and severity of hyperphagia in the patients with dementia, and food intake were significantly lower in the experimental group than in the control group. However, body mass index did not significantly differ. Our results suggest that the spaced retrieval training could decrease the frequency and severity of hyperphagia in patients with dementia. The content of this training programme is consistent with the normal manner of eating in daily life and is easy for patients to understand and perform. Therefore, it can be applied in residents' daily lives. This study confirms the efficacy of the spaced retrieval training protocol for hyperphagia in patients with dementia. In future studies, the follow-up duration can be increased to determine the long-term effectiveness of the intervention. © 2016 John Wiley & Sons Ltd.

  7. Space Projects: Improvements Needed in Selecting Future Projects for Private Financing

    Science.gov (United States)

    1990-01-01

    The Office of Management and Budget (OMB) and NASA jointly selected seven projects for commercialization to reduce NASA's fiscal year 1990 budget request and to help achieve the goal of increasing private sector involvement in space. However, the efforts to privately finance these seven projects did not increase the commercial sector's involvement in space to the extent desired. The General Accounting Office (GAO) determined that the projects selected were not a fair test of the potential of increasing commercial investment in space at an acceptable cost to the government, primarily because the projects were not properly screened. That is, neither their suitability for commercialization nor the economic consequences of seeking private financing for them were adequately evaluated before selection. Evaluations and market tests done after selection showed that most of the projects were not viable candidates for private financing. GAO concluded that projects should not be removed from NASA's budget for commercial development until after careful screening has been done to determine whether adequate commercial demand exists, development risks are commercially acceptable and private financing is found or judged to be highly likely, and the cost effectiveness of such a decision is acceptable. Premature removal of projects from NASA's budget ultimately can cause project delays and increased costs when unsuccessful commercialization candidates must be returned to the budget. NASA also needs to ensure appropriate comparisons of government and private financing options for future commercialization projects.

  8. A novel method of selective removal of human DNA improves PCR sensitivity for detection of Salmonella Typhi in blood samples.

    Science.gov (United States)

    Zhou, Liqing; Pollard, Andrew J

    2012-07-27

    Enteric fever is a major public health problem, causing an estimated 21 million new cases and 216,000 or more deaths every year. Current diagnosis of the disease is inadequate. Blood culture only identifies 45 to 70% of the cases and is time-consuming. Serological tests have very low sensitivity and specificity. Clinical samples obtained for diagnosis of enteric fever in the field generally have blood, so that even PCR-based methods, widely used for detection of other infectious diseases, are not a straightforward option in typhoid diagnosis. We developed a novel method to enrich target bacterial DNA by selective removal of human DNA from blood samples, enhancing the sensitivity of PCR tests. This method offers the possibility of improving PCR assays directly using clinical specimens for diagnosis of this globally important infectious disease. Blood samples were mixed with ox bile for selective lysis of human blood cells and the released human DNA was then digested with addition of bile resistant micrococcal nuclease. The intact Salmonella Typhi bacteria were collected from the specimen by centrifugation and the DNA extracted with QIAamp DNA mini kit. The presence of Salmonella Typhi bacteria in blood samples was detected by PCR with the fliC-d gene of Salmonella Typhi as the target. Micrococcal nuclease retained activity against human blood DNA in the presence of up to 9% ox bile. Background human DNA was dramatically removed from blood samples through the use of ox bile lysis and micrococcal nuclease for removal of mammalian DNA. Consequently, target Salmonella Typhi DNA was enriched in DNA preparations and the PCR sensitivity for detection of Salmonella Typhi in spiked blood samples was enhanced 1,000-fold. Use of a combination of selective ox-bile blood cell lysis and removal of human DNA with micrococcal nuclease significantly improves PCR sensitivity and offers a better option for improved typhoid PCR assays directly using clinical specimens in diagnosis of

  9. An improved taxonomic sampling is a necessary but not sufficient condition for resolving inter-families relationships in Caridean decapods.

    Science.gov (United States)

    Aznar-Cormano, L; Brisset, J; Chan, T-Y; Corbari, L; Puillandre, N; Utge, J; Zbinden, M; Zuccon, D; Samadi, S

    2015-04-01

    During the past decade, a large number of multi-gene analyses aimed at resolving the phylogenetic relationships within Decapoda. However relationships among families, and even among sub-families, remain poorly defined. Most analyses used an incomplete and opportunistic sampling of species, but also an incomplete and opportunistic gene selection among those available for Decapoda. Here we test in the Caridea if improving the taxonomic coverage following the hierarchical scheme of the classification, as it is currently accepted, provides a better phylogenetic resolution for the inter-families relationships. The rich collections of the Muséum National d'Histoire Naturelle de Paris are used for sampling as far as possible at least two species of two different genera for each family or subfamily. All potential markers are tested over this sampling. For some coding genes the amplification success varies greatly among taxa and the phylogenetic signal is highly saturated. This result probably explains the taxon-heterogeneity among previously published studies. The analysis is thus restricted to the genes homogeneously amplified over the whole sampling. Thanks to the taxonomic sampling scheme the monophyly of most families is confirmed. However the genes commonly used in Decapoda appear non-adapted for clarifying inter-families relationships, which remain poorly resolved. Genome-wide analyses, like transcriptome-based exon capture facilitated by the new generation sequencing methods might provide a sounder approach to resolve deep and rapid radiations like the Caridea.

  10. Improved detection of multiple environmental antibiotics through an optimized sample extraction strategy in liquid chromatography-mass spectrometry analysis.

    Science.gov (United States)

    Yi, Xinzhu; Bayen, Stéphane; Kelly, Barry C; Li, Xu; Zhou, Zhi

    2015-12-01

    A solid-phase extraction/liquid chromatography/electrospray ionization/multi-stage mass spectrometry (SPE-LC-ESI-MS/MS) method was optimized in this study for sensitive and simultaneous detection of multiple antibiotics in urban surface waters and soils. Among the seven classes of tested antibiotics, the extraction efficiencies of macrolides, lincosamide, chloramphenicol, and polyether antibiotics were significantly improved under optimized sample extraction pH. Instead of only using acidic extraction, as in many existing studies, the results indicated that antibiotics with low pKa values (<7) were extracted more efficiently under acidic conditions, whereas antibiotics with high pKa values (>7) were extracted more efficiently under neutral conditions. The effects of pH were more obvious on polar compounds than on non-polar compounds. Optimization of extraction pH resulted in significantly improved sample recovery and better detection limits. Compared with reported values in the literature, the average reduction of minimal detection limits obtained in this study was 87.6% in surface waters (0.06-2.28 ng/L) and 67.1% in soils (0.01-18.16 ng/g dry wt). This method was subsequently applied to detect antibiotics in environmental samples in a heavily populated urban city, and macrolides, sulfonamides, and lincomycin were frequently detected. The antibiotics with the highest detected concentrations were sulfamethazine (82.5 ng/L) in surface waters and erythromycin (6.6 ng/g dry wt) in soils. The optimized sample extraction strategy can be used to improve the detection of a variety of antibiotics in environmental surface waters and soils.
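
    The pKa/pH argument can be made concrete with the Henderson-Hasselbalch relation: the ionized fraction of an analyte at the extraction pH largely determines how well it is retained during extraction. The sketch below uses an illustrative acidic antibiotic with pKa 5; the values are assumptions, not compounds from the study.

```python
def ionized_fraction_acid(pka: float, ph: float) -> float:
    """Fraction of a monoprotic acid present in its ionized (deprotonated) form."""
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

# an acidic antibiotic with pKa 5 stays mostly neutral at pH 3 but is largely ionized near pH 7
for ph in (3.0, 7.0):
    print(ph, round(ionized_fraction_acid(pka=5.0, ph=ph), 3))
```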

  11. STS 131 Return Samples: Assessment of Air Quality Aboard the Shuttle (STS-131) and International Space Station (19A)

    Science.gov (United States)

    James, John T.

    2010-01-01

    The toxicological assessments of 1 grab sample canister (GSC) from the Shuttle are reported in Table 1. Analytical methods have not changed from earlier reports. The recoveries of the 3 surrogates (C-13-acetone, fluorobenzene, and chlorobenzene) from the Shuttle GSC were 100%, 93%, and 101%, respectively. Based on the historical experience using end-of-mission samples, the Shuttle atmosphere was acceptable for human respiration.

  12. Government/contractor partnerships for continuous improvement. A Goddard Space Flight Center example

    Science.gov (United States)

    Tagler, Richard C.

    1992-01-01

    The efforts of a government organization and its major contractors to foster a continuous improvement environment which transcends the traditional government/contractor relationship is discussed. This relationship is aimed at communication, partnership, and trust - creating benefits for all involved.

  13. A Comparative Analysis of Transmission Control Protocol Improvement Techniques over Space-Based Transmission Media

    National Research Council Canada - National Science Library

    Lawson, Joseph M

    2006-01-01

    The purpose of this study was to assess the throughput improvement afforded by the various TCP optimization techniques, with respect to a simulated geosynchronous satellite system, to provide a cost...

  14. Accounting for sampling error when inferring population synchrony from time-series data: a Bayesian state-space modelling approach with applications.

    Directory of Open Access Journals (Sweden)

    Hugues Santin-Janin

    BACKGROUND: Data collected to inform time variations in natural population size are tainted by sampling error. Ignoring sampling error in population dynamics models induces bias in parameter estimators, e.g., density-dependence. In particular, when sampling errors are independent among populations, the classical estimator of the synchrony strength (zero-lag correlation) is biased downward. However, this bias is rarely taken into account in synchrony studies although it may lead to overemphasizing the role of intrinsic factors (e.g., dispersal) with respect to extrinsic factors (the Moran effect) in generating population synchrony as well as to underestimating the extinction risk of a metapopulation. METHODOLOGY/PRINCIPAL FINDINGS: The aim of this paper was first to illustrate the extent of the bias that can be encountered in empirical studies when sampling error is neglected. Second, we presented a state-space modelling approach that explicitly accounts for sampling error when quantifying population synchrony. Third, we exemplify our approach with datasets for which sampling variance (i) has been previously estimated, and (ii) has to be jointly estimated with population synchrony. Finally, we compared our results to those of a standard approach neglecting sampling variance. We showed that ignoring sampling variance can mask a synchrony pattern whatever its true value and that the common practice of averaging few replicates of population size estimates poorly performed at decreasing the bias of the classical estimator of the synchrony strength. CONCLUSION/SIGNIFICANCE: The state-space model used in this study provides a flexible way of accurately quantifying the strength of synchrony patterns from most population size data encountered in field studies, including over-dispersed count data. We provided a user-friendly R-program and a tutorial example to encourage further studies aiming at quantifying the strength of population synchrony to account for
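
    The downward bias described in the background can be reproduced in a few lines of simulation; the toy example below, which is not the authors' R program, generates two synchronized series, adds independent observation error, and compares the naive zero-lag correlation with an attenuation-corrected value.

```python
import numpy as np

rng = np.random.default_rng(3)
n_years, rho_true = 200, 0.8

# latent (true) log-abundances of two populations, correlated through a shared Moran effect
shared = rng.normal(size=n_years)
own1, own2 = rng.normal(size=n_years), rng.normal(size=n_years)
x1 = np.sqrt(rho_true) * shared + np.sqrt(1 - rho_true) * own1
x2 = np.sqrt(rho_true) * shared + np.sqrt(1 - rho_true) * own2

# observed indices add independent sampling error on top of the true signal
sigma_obs = 1.0
y1 = x1 + rng.normal(0, sigma_obs, n_years)
y2 = x2 + rng.normal(0, sigma_obs, n_years)

print("true synchrony: ", round(np.corrcoef(x1, x2)[0, 1], 2))
print("naive estimate: ", round(np.corrcoef(y1, y2)[0, 1], 2))   # biased toward zero
# attenuation factor = var(true) / (var(true) + var(obs error)); dividing by it recovers rho
atten = 1.0 / (1.0 + sigma_obs ** 2)
print("bias-corrected: ", round(np.corrcoef(y1, y2)[0, 1] / atten, 2))
```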

  15. The Orientation of Gastric Biopsy Samples Improves the Inter-observer Agreement of the OLGA Staging System.

    Science.gov (United States)

    Cotruta, Bogdan; Gheorghe, Cristian; Iacob, Razvan; Dumbrava, Mona; Radu, Cristina; Bancila, Ion; Becheanu, Gabriel

    2017-12-01

    stage was found in the present case series. Good quality histopathology specimens were described in 95.43% of the oriented biopsy samples, and in 89.14% of the unoriented biopsy samples, respectively (p=0.0275). The orientation of gastric biopsy specimens improves the inter-observer agreement for the assessment of gastric atrophy.

  16. Spatiotemporal patterns of plant water isotope values from a continental-scale sample network in Europe as a tool to improve hydroclimate proxies

    Science.gov (United States)

    Nelson, D. B.; Kahmen, A.

    2016-12-01

    The hydrogen and oxygen isotopic composition of water available for biosynthetic processes in vascular plants plays an important role in shaping the isotopic composition of organic compounds that these organisms produce, including leaf waxes and cellulose in leaves and tree rings. Characterizing changes in large scale spatial patterns of precipitation, soil water, stem water, and leaf water isotope values over time is therefore useful for evaluating how plants reflect changes in the isotopic composition of these source waters in different environments. This information can, in turn, provide improved calibration targets for understanding the environmental signals that plants preserve. The pathway of water through this continuum can include several isotopic fractionations, but the extent to which the isotopic composition of each of these water pools varies under normal field conditions and over space and time has not been systematically and concurrently evaluated at large spatial scales. Two season-long sampling campaigns were conducted at nineteen sites throughout Europe over the 2014 and 2015 growing seasons to track changes in the isotopic composition of plant-relevant waters. Samples of precipitation, soil water, stem water, and leaf water were collected over more than 200 field days and include more than 500 samples from each water pool. Measurements were used to validate continent-wide gridded estimates of leaf water isotope values derived from a combination of mechanistic and statistical modeling conducted with temperature, precipitation, and relative humidity data. Data-model comparison shows good agreement for summer leaf waters, and substantiates the incorporation of modeled leaf waters in evaluating how plants respond to hydroclimate changes at large spatial scales. These results also suggest that modeled leaf water isotope values might be used in future studies in similar ecosystems to improve the coverage density of spatial or temporal data.

  17. Vapor space characterization of waste tank 241-C-101: Results from samples collected on 9/1/94

    International Nuclear Information System (INIS)

    Lucke, R.B.; Clauss, T.W.; Ligotke, M.W.

    1995-11-01

    This report describes results of the analyses of tank-headspace samples taken from the Hanford waste Tank 241-C-101 (referred to as Tank C-101) and of the ambient air collected ~30 ft upwind of the tank and through the VSS near the tank. Pacific Northwest Laboratory (PNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and to analyze inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The sample job was designated S4056, and samples were collected by WHC on September 1, 1994, using the vapor sampling system (VSS). The samples were inspected upon delivery to the 326/23B laboratory and logged into PNL record book 55408 before implementation of PNL Technical Procedure PNL-TVP-07. Custody of the sorbent traps was transferred to PNL personnel performing the inorganic analysis and stored at refrigerated (≤ 10 degrees C) temperature until the time of analysis. The canisters were stored in the 326/23B laboratory at ambient (25 degrees C) temperature until the time of the analysis. Access to the 326/23B laboratory is limited to PNL personnel working on the waste-tank safety program. Analyses described in this report were performed at PNL in the 300 area of the Hanford Reservation. Analytical methods that were used are described in the text. In summary, sorbent traps for inorganic analyses containing sample materials were either weighed (for water analysis) or desorbed with the appropriate aqueous solutions (for NH3, NO2, and NO analyses). The aqueous extracts were analyzed either by selective electrode or by ion chromatography (IC). Organic analyses were performed using cryogenic preconcentration followed by gas chromatography/mass spectrometry (GC/MS)

  18. Automated Reconstruction of Building LoDs from Airborne LiDAR Point Clouds Using an Improved Morphological Scale Space

    Directory of Open Access Journals (Sweden)

    Bisheng Yang

    2016-12-01

    Reconstructing building models at different levels of detail (LoDs) from airborne laser scanning point clouds is urgently needed for wide application, as this method can balance between the user's requirements and economic costs. Previous methods reconstruct building LoDs from the finest 3D building models rather than from point clouds, resulting in heavy costs and inflexible adaptivity. The scale space is a sound theory for multi-scale representation of an object from a coarser level to a finer level. Therefore, this paper proposes a novel method to reconstruct buildings at different LoDs from airborne Light Detection and Ranging (LiDAR) point clouds based on an improved morphological scale space. The proposed method first extracts building candidate regions following the separation of ground and non-ground points. For each building candidate region, the proposed method generates a scale space by iteratively using the improved morphological reconstruction with the increase of scale, and constructs the corresponding topological relationship graphs (TRGs) across scales. Secondly, the proposed method robustly extracts building points by using features based on the TRG. Finally, the proposed method reconstructs each building at different LoDs according to the TRG. The experiments demonstrate that the proposed method robustly extracts buildings with details (e.g., door eaves and roof furniture) and illustrates good performance in distinguishing buildings from vegetation or other objects, while automatically reconstructing building LoDs from the finest building points.
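
    The morphological scale space at the heart of the method can be sketched on a rasterized surface model: grey-scale openings with growing structuring elements remove progressively larger above-ground objects, and the scale at which a region disappears indicates its footprint size. The snippet below is a simplified raster illustration of that idea, not the paper's point-cloud pipeline or its topological relationship graphs.

```python
import numpy as np
from scipy import ndimage

def morphological_scale_space(dsm: np.ndarray, scales=(3, 7, 15, 31)):
    """Return one opened surface per scale (structuring-element size in cells).

    Small objects (vegetation, cars) vanish from the opened surface at small scales;
    large buildings persist until the opening window exceeds their footprint.
    """
    return {s: ndimage.grey_opening(dsm, size=(s, s)) for s in scales}

# toy DSM: flat ground at 100 m with a 20 x 20-cell "building" 10 m high
dsm = np.full((64, 64), 100.0)
dsm[20:40, 20:40] += 10.0

levels = morphological_scale_space(dsm)
for s, opened in levels.items():
    elevated = (dsm - opened) > 2.0           # residual above the opened surface
    print(f"scale {s:2d}: {elevated.sum():4d} cells flagged as elevated structure")
```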

  19. Improvement in quality of life and sexual functioning in a comorbid sample after the unified protocol transdiagnostic group treatment.

    Science.gov (United States)

    de Ornelas Maia, Ana Claudia Corrêa; Sanford, Jenny; Boettcher, Hannah; Nardi, Antonio E; Barlow, David

    2017-10-01

    Patients with multiple mental disorders often experience sexual dysfunction and reduced quality of life. The unified protocol (UP) is a transdiagnostic treatment for emotional disorders that has the potential to improve quality of life and sexual functioning via improved emotion management. The present study evaluates changes in quality of life and sexual functioning in a highly comorbid sample treated with the UP in a group format. Forty-eight patients were randomly assigned to either a UP active-treatment group or a medication-only control group. Treatment was delivered in 14 sessions over the course of 4 months. Symptoms of anxiety and depression were assessed using the Beck Anxiety Inventory and Beck Depression Inventory. Sexual functioning was assessed by the Arizona Sexual Experience Scale (ASEX), and quality of life was assessed by the World Health Organization Quality of Life-BREF scale (WHOQOL-BREF). Quality of life, anxiety and depression all significantly improved among participants treated with the UP. Some improvement in sexual functioning was also noted. The results support the efficacy of the UP in improving quality of life and sexual functioning in comorbid patients. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Fundamental Science and Improvement of the Quality of Life-Space Quantization to MRI

    International Nuclear Information System (INIS)

    Tannenbaum, M.

    2010-01-01

    This paper discusses the following topics: (1) Science versus technology - a false dichotomy; (2) scientific discovery is vital for future progress; (3) An example: Space quantization to magnetic resonance imaging (MRI) - A timeline from 1911-1977; (4) Modern basic research - what is inside the proton; and (5) The 21st century - beginning of the 3rd millennium. The 20th century started with the study of macroscopic matter which led to the discovery of a whole new submicroscopic world of physics which totally changed our view of nature and led to new quantum applications, both fundamental and practical.

  1. Space Shuttle Orbiter oxygen partial pressure sensing and control system improvements

    Science.gov (United States)

    Frampton, Robert F.; Hoy, Dennis M.; Kelly, Kevin J.; Walleshauser, James J.

    1992-01-01

    A program aimed at developing a new PPO2 oxygen sensor and a replacement amplifier for the Space Shuttle Orbiter is described. Experimental design methodologies used in the test and modeling process made it possible to enhance the effectiveness of the program and to reduce its cost. Significant cost savings are due to the increased lifetime of the basic sensor cell, the maximization of useful sensor life through an increased amplifier gain adjustment capability, the use of streamlined production processes for the manufacture of the assemblies, and the refurbishment capability of the replacement sensor.

  2. Using the experience-sampling method to examine the psychological mechanisms by which participatory art improves wellbeing.

    Science.gov (United States)

    Holt, Nicola J

    2018-01-01

    To measure the immediate impact of art-making in everyday life on diverse indices of wellbeing ('in the moment' and longer term) in order to improve understanding of the psychological mechanisms by which art may improve mental health. Using the experience-sampling method, 41 artists were prompted (with a 'beep' on a handheld computer) at random intervals (10 times a day, for one week) to answer a short questionnaire. The questionnaire tracked art-making and enquired about mood, cognition and state of consciousness. This resulted in 2,495 sampled experiences, with a high response rate in which 89% of questionnaires were completed. Multi-level modelling was used to evaluate the impact of art-making on experience, with 2,495 'experiences' (experiential-level) nested within 41 participants (person-level). Recent art-making was significantly associated with experiential shifts: improvement in hedonic tone, vivid internal imagery and the flow state. Furthermore, the frequency of art-making across the week was associated with person-level measures of wellbeing: eudemonic happiness and self-regulation. Cross-level interactions, between experiential and person-level variables, suggested that hedonic tone improved more for those scoring low on eudemonic happiness, and further that, those high in eudemonic happiness were more likely to experience phenomenological features of the flow state and to experience inner dialogue while art-making. Art-making has both immediate and long-term associations with wellbeing. At the experiential level, art-making affects multiple dimensions of conscious experience: affective, cognitive and state factors. This suggests that there are multiple routes to wellbeing (improving hedonic tone, making meaning through inner dialogue and experiencing the flow state). Recommendations are made to consider these factors when both developing and evaluating public health interventions that involve participatory art.
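
    The experiences-nested-within-participants design corresponds to a random-intercept multilevel model; the sketch below fits such a model with statsmodels on simulated data, using invented variable names (the study's actual variables and estimates are not reproduced here).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated experience-sampling data with invented column names:
# hedonic_tone ~ recent art-making, observations nested within participants.
rng = np.random.default_rng(7)
n_participants, n_beeps = 41, 60
pid = np.repeat(np.arange(n_participants), n_beeps)
art = rng.integers(0, 2, size=pid.size)                      # art-making since the last beep?
person_effect = rng.normal(0, 0.8, n_participants)[pid]      # between-person differences
hedonic = 5.0 + 0.4 * art + person_effect + rng.normal(0, 1.0, pid.size)

df = pd.DataFrame({"pid": pid, "art": art, "hedonic": hedonic})
model = smf.mixedlm("hedonic ~ art", data=df, groups=df["pid"])   # random intercept per person
result = model.fit()
print(result.summary())   # the fixed effect of 'art' approximates the within-person shift
```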

  3. Use of microwaves to improve nutritional value of soybeans for future space inhabitants

    Science.gov (United States)

    Singh, G.

    1983-01-01

    Whole soybeans from four different varieties at different moisture contents were microwaved for varying times to determine the conditions for maximum destruction of trypsin inhibitor and lipoxygenase activities, and optimal growth of chicks. Microwaving 150 g samples of soybeans (at 14 to 28% moisture) for 1.5 min was found optimal for reduction of trypsin inhibitor and lipoxygenase activities. Microwaving 1 kg samples of soybeans for 9 minutes destroyed 82% of the trypsin inhibitor activity and gave optimal chick growth. It should be pointed out that the microwaving time would vary according to the weight of the sample and the power of the microwave oven. The microwave oven used in the above experiments was rated at 650 watts, 2450 MHz.

  4. Improved Arousal and Motor Function Using Zolpidem in a Patient With Space-Occupying Intracranial Lesions: A Case Report.

    Science.gov (United States)

    Bomalaski, Martin Nicholas; Smith, Sean Robinson

    2017-08-01

    Patients with disorders of consciousness (DOC) have profound functional limitations with few treatment options for improving arousal and quality of life. Zolpidem is a nonbenzodiazepine hypnotic used to treat insomnia that has also been observed to paradoxically improve arousal in those with DOC, such as the vegetative or minimally conscious states. Little information exists on its use in patients with DOC who have intracranial space-occupying lesions. We present a case of a 24-year-old man in a minimally conscious state due to central nervous system lymphoma who was observed to have increased arousal and improved motor function after the administration of zolpidem. V. Copyright © 2017 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  5. Vapor space characterization of waste tank 241-BY-105 (in situ): Results from samples collected on May 9, 1994

    International Nuclear Information System (INIS)

    McVeety, B.D.; Pool, K.H.; Ligotke, M.W.; Clauss, T.W.; Lucke, R.B.; Sharma, A.K.; McCulloch, M.; Fruchter, J.S.; Goheen, S.C.

    1995-05-01

    This report describes inorganic and organic analysis results from in situ samples obtained from the tank headspace of the Hanford waste storage Tank 241-BY-105 (referred to as Tank BY-105). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds NH3, NO2, NO, HCN, and H2O. Sampling for sulfur oxides was not requested. Results of the inorganic samples were affected by sampling errors that led to an undefined uncertainty in sample volume. Consequently, tank-headspace concentrations are estimates only. Thirty-nine tentatively identified organic analytes were observed above the detection limit of ca. 10 ppbv, but standards for most of these were not available at the time of analysis, and their quantitation is beyond the scope of this study. In addition, we looked for the 41 standard TO-14 analytes. Of these, only a few were observed above the 2-ppbv detection limit. The 16 organic analytes with the highest estimated concentrations are listed. These 16 analytes account for approximately 68% of the total organic components in Tank BY-105

  6. Space-time wind speed forecasting for improved power system dispatch

    KAUST Repository

    Zhu, Xinxin

    2014-02-27

    To support large-scale integration of wind power into electric energy systems, state-of-the-art wind speed forecasting methods should be able to provide accurate and adequate information to enable efficient, reliable, and cost-effective scheduling of wind power. Here, we incorporate space-time wind forecasts into electric power system scheduling. First, we propose a modified regime-switching, space-time wind speed forecasting model that allows the forecast regimes to vary with the dominant wind direction and with the seasons, hence avoiding a subjective choice of regimes. Then, results from the wind forecasts are incorporated into a power system economic dispatch model, the cost of which is used as a loss measure of the quality of the forecast models. This, in turn, leads to cost-effective scheduling of system-wide wind generation. Potential economic benefits arise from system-wide generation cost savings and from ancillary service cost savings. We illustrate the economic benefits using a test system in the northwest region of the United States. Compared with persistence and autoregressive models, our model suggests that cost savings from the integration of wind power could be on the scale of tens of millions of dollars annually in regions with high wind penetration, such as Texas and the Pacific northwest. © 2014 Sociedad de Estadística e Investigación Operativa.
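
    As an illustrative aside (not the authors' model), the core idea of a regime-switching autoregressive forecast, with the regime chosen by the dominant wind direction, can be sketched in a few lines of Python; the number of direction sectors, the AR(1) form, and the synthetic data below are assumptions made only for the example.

      import numpy as np

      def fit_ar1(x):
          """Least-squares fit of x[t] = a + b * x[t-1] + noise."""
          X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
          coef, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
          return coef  # (a, b)

      def regime_of(direction_deg, n_sectors=4):
          """Assign a regime index from the dominant wind direction (degrees)."""
          return int(direction_deg % 360 // (360 / n_sectors))

      # Synthetic training data: wind speed (m/s) and dominant direction (deg).
      rng = np.random.default_rng(0)
      speed = np.abs(10 + np.cumsum(rng.normal(0, 0.5, 500)))
      direction = rng.uniform(0, 360, 500)

      # Fit one AR(1) model per direction regime.
      models = {}
      for r in range(4):
          mask = np.array([regime_of(d) == r for d in direction])
          # Crude: fit on the subsequence of speeds observed in this regime.
          if mask.sum() > 10:
              models[r] = fit_ar1(speed[mask])

      # One-step forecast given the latest observation and its regime.
      last_speed, last_dir = speed[-1], direction[-1]
      a, b = models[regime_of(last_dir)]
      print("one-step forecast:", a + b * last_speed)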

  7. Component Selection, Accelerated Testing, and Improved Modeling of AMTEC Systems for Space Power (abstract)

    Science.gov (United States)

    Williams, R. M.; Jeffries-Nakamura, B.; Ryan, M. A.; Underwood, M. L.; Suitor, J.; O'Connor, D.

    1993-01-01

    Alkali metal thermal to electric converter (AMTEC) designs for space power are numerous, but selection of materials for construction of long-lived AMTEC devices has been limited to electrodes, current collectors, and the solid electrolyte. AMTEC devices with lifetimes greater than 5 years require careful selection and life testing of all hot-side components. The likely selection of a remote condensed design for initial flight test and probable use with a GPHS in AMTEC-powered outer planet probes requires the device to be constructed to tolerate T greater than 1150 K, as well as exposure to Na(g), and to Na(liq) on the high-pressure side. The temperatures involved make high strength and chemical resistance to Na containing Na2O critical. Selection among materials which can be worked should not be driven by ease of fabricability, as high temperature stability is the critical issue. These concepts drive the selection of Mo alloys for Na(liq) containment in AMTEC cells for operation at T up to 1150 K, as they are significantly stronger than comparable Nb or Ta alloys, are less soluble in Na(liq) containing dissolved Na2O, are workable compared to W alloys (which might be used for certain components), and are ductile at the T greater than 500 K of proposed AMTEC modules in space applications.

  8. Improving the beam quality of rf guns by correction of rf and space-charge effects

    International Nuclear Information System (INIS)

    Serafini, L.

    1992-01-01

    In this paper we describe two possible strategies to attain ultra-low emittance electron beam generation by laser-driven RF guns. The first one is based on the exploitation of multi-mode resonant cavities to neutralize the emittance degradation induced by RF effects. When cigar-like (long and thin) electron bunches are accelerated in multi-mode operated RF guns, the space-charge-induced emittance is strongly decreased at the same time: highly charged bunches, as typically requested by future TeV e-e+ colliders, can be delivered by the gun at a quite low transverse emittance and with good behaviour in the longitudinal phase space, so that they can be magnetically compressed to reach higher peak currents. The second strategy consists in using disk-like electron bunches, produced by very short laser pulses illuminating the photocathode. By means of an analytical study a new regime has been found, where the normalized transverse emittance scales like the inverse of the peak current, provided that the laser pulse intensity distribution is properly shaped in the transverse direction. Preliminary numerical simulations confirm the analytical predictions and show that the minimum achievable emittance is set, in this new regime, by the wake-field interaction between the bunch and the metallic cathode wall

  9. Tank Vapor Characterization Project: Vapor space characterization of waste Tank A-101, Results from samples collected on June 8, 1995

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; McVeety, B.D.; Evans, J.C.; Thomas, B.L.; Olsen, K.B.; Fruchter, J.S.; Ligotke, M.W.

    1995-11-01

    This report describes the analytical results of vapor samples taken from the headspace of the waste storage tank 241-A-101 (Tank A-101) at the Hanford Site in Washington State. The results described in this report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank-farm operations. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company (WHC) and provided for analysis to Pacific Northwest National Laboratory (PNL). Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNL. Analyte concentrations were based on analytical results and, where appropriate, sample volumes provided by WHC. A summary of the results is listed in Table 1. Detailed descriptions of the analytical results appear in the text

  10. Evaluation of aqueductal patency in patients with hydrocephalus: Three-dimensional high-sampling efficiency technique(SPACE) versus two-dimensional turbo spin echo at 3 Tesla

    International Nuclear Information System (INIS)

    Ucar, Murat; Guryildirim, Melike; Tokgoz, Nil; Kilic, Koray; Borcek, Alp; Oner, Yusuf; Akkan, Koray; Tali, Turgut

    2014-01-01

    To compare the accuracy of diagnosing aqueductal patency and image quality between high spatial resolution three-dimensional (3D) high-sampling-efficiency technique (sampling perfection with application optimized contrast using different flip angle evolutions [SPACE]) and T2-weighted (T2W) two-dimensional (2D) turbo spin echo (TSE) at 3-T in patients with hydrocephalus. This retrospective study included 99 patients diagnosed with hydrocephalus. T2W 3D-SPACE was added to the routine sequences, which consisted of T2W 2D-TSE, 3D-constructive interference steady state (CISS), and cine phase-contrast MRI (PC-MRI). Two radiologists independently evaluated the patency of the cerebral aqueduct and image quality on the T2W 2D-TSE and T2W 3D-SPACE. PC-MRI and 3D-CISS were used as the reference for aqueductal patency and image quality, respectively. Inter-observer agreement was calculated using kappa statistics. The evaluations of aqueductal patency by T2W 3D-SPACE and T2W 2D-TSE were in agreement with PC-MRI in 100% (99/99; sensitivity, 100% [83/83]; specificity, 100% [16/16]) and 83.8% (83/99; sensitivity, 80.7% [67/83]; specificity, 100% [16/16]), respectively (p < 0.001). There was no significant difference in image quality between T2W 2D-TSE and T2W 3D-SPACE (p = 0.056). The kappa values for inter-observer agreement were 0.714 for T2W 2D-TSE and 0.899 for T2W 3D-SPACE. Three-dimensional SPACE is superior to 2D-TSE for the evaluation of aqueductal patency in hydrocephalus. T2W 3D-SPACE may hold promise as a highly accurate alternative to PC-MRI for the physiological and morphological evaluation of aqueductal patency.

  11. Evaluation of aqueductal patency in patients with hydrocephalus: Three-dimensional high-sampling efficiency technique(SPACE) versus two-dimensional turbo spin echo at 3 Tesla

    Energy Technology Data Exchange (ETDEWEB)

    Ucar, Murat; Guryildirim, Melike; Tokgoz, Nil; Kilic, Koray; Borcek, Alp; Oner, Yusuf; Akkan, Koray; Tali, Turgut [School of Medicine, Gazi University, Ankara (Turkey)

    2014-12-15

    To compare the accuracy of diagnosing aqueductal patency and image quality between high spatial resolution three-dimensional (3D) high-sampling-efficiency technique (sampling perfection with application optimized contrast using different flip angle evolutions [SPACE]) and T2-weighted (T2W) two-dimensional (2D) turbo spin echo (TSE) at 3-T in patients with hydrocephalus. This retrospective study included 99 patients diagnosed with hydrocephalus. T2W 3D-SPACE was added to the routine sequences, which consisted of T2W 2D-TSE, 3D-constructive interference steady state (CISS), and cine phase-contrast MRI (PC-MRI). Two radiologists independently evaluated the patency of the cerebral aqueduct and image quality on the T2W 2D-TSE and T2W 3D-SPACE. PC-MRI and 3D-CISS were used as the reference for aqueductal patency and image quality, respectively. Inter-observer agreement was calculated using kappa statistics. The evaluations of aqueductal patency by T2W 3D-SPACE and T2W 2D-TSE were in agreement with PC-MRI in 100% (99/99; sensitivity, 100% [83/83]; specificity, 100% [16/16]) and 83.8% (83/99; sensitivity, 80.7% [67/83]; specificity, 100% [16/16]), respectively (p < 0.001). There was no significant difference in image quality between T2W 2D-TSE and T2W 3D-SPACE (p = 0.056). The kappa values for inter-observer agreement were 0.714 for T2W 2D-TSE and 0.899 for T2W 3D-SPACE. Three-dimensional SPACE is superior to 2D-TSE for the evaluation of aqueductal patency in hydrocephalus. T2W 3D-SPACE may hold promise as a highly accurate alternative to PC-MRI for the physiological and morphological evaluation of aqueductal patency.

  12. Space space space

    CERN Document Server

    Trembach, Vera

    2014-01-01

    Space is an introduction to the mysteries of the Universe. Included are Task Cards for independent learning, Journal Word Cards for creative writing, and Hands-On Activities for reinforcing skills in Math and Language Arts. Space is a perfect introduction to further research of the Solar System.

  13. Weak Convergence and Banach Space-Valued Functions: Improving the Stability Theory of Feynman’s Operational Calculi

    International Nuclear Information System (INIS)

    Nielsen, Lance

    2011-01-01

    In this paper we investigate the relation between weak convergence of a sequence {μ_n} of probability measures on a Polish space S to the probability measure μ and continuous, norm-bounded functions into a Banach space X. We show that, given a norm-bounded continuous function f: S → X, it follows that lim_{n→∞} ∫_S f dμ_n = ∫_S f dμ; this is the limit one has for bounded, continuous real- (or complex-)valued functions on S. This result is then applied to the stability theory of Feynman’s operational calculus, where it is shown that the theory can be significantly improved over previous results.

  14. Design of cladding rods-assisted depressed-core few-mode fibers with improved modal spacing

    Science.gov (United States)

    Han, Jiawei; Zhang, Jie

    2018-03-01

    This paper investigates the design details of cladding rods-assisted (CRA) depressed-core (DC) few-mode fibers (FMFs) that feature more equally spaced linearly polarized (LP) modal effective indices, suitable for high-spatial-density weakly-coupled mode-division multiplexing systems. The influences of the index profile of the cladding rods on the LP mode-resolved effective index, bending sensitivity, and effective area Aeff are numerically described. Based on the design considerations of LP modal Aeff-dependent spatial efficiency and LP modal bending loss-dependent robustness, the small LP21-LP02 and LP22-LP03 modal spacing limitations, encountered in state-of-the-art weakly-coupled step-index FMFs, have been substantially improved by at least 25%. In addition, the proposed CRA DC FMFs also show sufficiently large effective areas (in excess of 110 μm²) for all guided LP modes, which are expected to exhibit good nonlinear performance.

  15. Improved Methods of Carnivore Faecal Sample Preservation, DNA Extraction and Quantification for Accurate Genotyping of Wild Tigers

    Science.gov (United States)

    Harika, Katakam; Mahla, Ranjeet Singh; Shivaji, Sisinthy

    2012-01-01

    Background: Non-invasively collected samples allow a variety of genetic studies on endangered and elusive species. However, due to low amplification success and high genotyping error rates, fewer samples can be identified up to the individual level. The number of PCRs needed to obtain reliable genotypes also increases noticeably. Methods: We developed a quantitative PCR assay to measure and grade amplifiable nuclear DNA in feline faecal extracts. We determined DNA degradation in experimentally aged faecal samples and tested a suite of pre-PCR protocols to considerably improve DNA retrieval. Results: Average DNA concentrations of Grade I, II and III extracts were 982 pg/µl, 9.5 pg/µl and 0.4 pg/µl respectively. Nearly 10% of extracts had no amplifiable DNA. Microsatellite PCR success and allelic dropout rates were 92% and 1.5% in Grade I, 79% and 5% in Grade II, and 54% and 16% in Grade III respectively. Our results on experimentally aged faecal samples showed that ageing has a significant effect on the quantity and quality of amplifiable DNA; DNA degradation occurs within 3 days of exposure to direct sunlight. DNA concentrations of Day 1 samples stored by ethanol and silica methods for a month varied significantly from fresh Day 1 extracts. DNA concentrations of fresh tiger and leopard faecal extracts without addition of carrier RNA were 816.5 pg/µl (±115.5) and 690.1 pg/µl (±207.1), while concentrations with addition of carrier RNA were 49414.5 pg/µl (±9370.6) and 20982.7 pg/µl (±6835.8) respectively. Conclusions: Our results indicate that carnivore faecal samples should be collected as freshly as possible, are better preserved by a two-step method and should be extracted with the addition of carrier RNA. We recommend quantification of template DNA as this facilitates several downstream protocols. PMID:23071624

  16. Lot quality assurance sampling to monitor supplemental immunization activity quality: an essential tool for improving performance in polio endemic countries.

    Science.gov (United States)

    Brown, Alexandra E; Okayasu, Hiromasa; Nzioki, Michael M; Wadood, Mufti Z; Chabot-Couture, Guillaume; Quddus, Arshad; Walker, George; Sutter, Roland W

    2014-11-01

    Monitoring the quality of supplementary immunization activities (SIAs) is a key tool for polio eradication. Regular monitoring data, however, are often unreliable, showing high coverage levels in virtually all areas, including those with ongoing virus circulation. To address this challenge, lot quality assurance sampling (LQAS) was introduced in 2009 as an additional tool to monitor SIA quality. Now used in 8 countries, LQAS provides a number of programmatic benefits: identifying areas of weak coverage quality with statistical reliability, differentiating areas of varying coverage with greater precision, and allowing for trend analysis of campaign quality. LQAS also accommodates changes to survey format, interpretation thresholds, evaluations of sample size, and data collection through mobile phones to improve timeliness of reporting and allow for visualization of campaign quality. LQAS becomes increasingly important to address remaining gaps in SIA quality and help focus resources on high-risk areas to prevent the continued transmission of wild poliovirus. © Crown copyright 2014.
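
    To make the LQAS logic concrete, the following sketch computes the probability that a surveyed area passes as a function of its true coverage; the lot sample size of 60 children and decision threshold of 3 unvaccinated children are illustrative assumptions, not the programme's actual parameters.

      from math import comb

      def prob_lot_accepted(true_coverage, n=60, d=3):
          """Probability that a lot 'passes' LQAS: at most d of n sampled
          children are found unvaccinated, given the lot's true coverage."""
          p_missed = 1.0 - true_coverage  # chance a sampled child is unvaccinated
          return sum(comb(n, k) * p_missed**k * (1 - p_missed)**(n - k)
                     for k in range(d + 1))

      # Operating characteristic: high-coverage lots should almost always pass,
      # low-coverage lots should almost always fail.
      for coverage in (0.95, 0.90, 0.80, 0.70):
          print(f"coverage {coverage:.0%}: P(pass) = {prob_lot_accepted(coverage):.3f}")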

  17. Vapor space characterization of waste Tank 241-BY-108: Results from samples collected on 10/27/94

    International Nuclear Information System (INIS)

    McVeety, B.D.; Clauss, T.W.; Ligotke, M.W.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-BY-108 (referred to as Tank BY-108). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water vapor (H 2 O). Trends in NH 3 and H 2 O samples indicated a possible sampling problem. Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, the authors looked for the 40 TO-14 compounds plus an additional 15 analytes. Of these, 17 were observed above the 5-ppbv reporting cutoff. Also, eighty-one organic tentatively identified compounds (TICs) were observed above the reporting cutoff (ca.) 10 ppbv, and are reported with concentrations that are semiquantitative estimates based on internal standard response factors. The nine organic analytes with the highest estimated concentrations are listed in Summary Table 1 and account for approximately 48% of the total organic components in the headspace of Tank BY-108. Three permanent gases, hydrogen (H 2 ), carbon dioxide (CO 2 ), and nitrous oxide (N 2 O) were also detected. Tank BY-108 is on the Ferrocyanide Watch List

  18. Individual thermal profiles as a basis for comfort improvement in space and other environments

    Science.gov (United States)

    Koscheyev, V. S.; Coca, A.; Leon, G. R.; Dancisak, M. J.

    2002-01-01

    BACKGROUND: The development of individualized countermeasures to address problems in thermoregulation is of considerable importance for humans in space and other extreme environments. A methodology is presented for evaluating minimal/maximal heat flux from the total human body and specific body zones, and for assessing individual differences in the efficiency of heat exchange from these body areas. The goal is to apply this information to the design of individualized protective equipment. METHODS: A multi-compartment conductive plastic tubing liquid cooling/warming garment (LCWG) was developed. Inlet water temperatures of 8-45 degrees C were imposed sequentially to specific body areas while the remainder of the garment was maintained at 33 degrees C. RESULTS: There were significant differences in heat exchange level among body zones in both the 8 degrees and 45 degrees C temperature conditions. Developing individual thermal profiles is feasible for better comfort of astronauts on long-duration missions and personnel in other extreme environments.

  19. Using spaced retrieval and Montessori-based activities in improving eating ability for residents with dementia.

    Science.gov (United States)

    Lin, Li-Chan; Huang, Ya-Ju; Su, Su-Gen; Watson, Roger; Tsai, Belina W-J; Wu, Shiao-Chi

    2010-10-01

    To construct a training protocol for spaced retrieval (SR) and to investigate the effectiveness of SR and Montessori-based activities in decreasing eating difficulty in older residents with dementia. A single-evaluator, blinded, randomized controlled trial was used. Eighty-five residents with dementia were chosen from three special care units for residents with dementia in long-term care facilities in Taiwan. To avoid any confounding of subjects, the three institutions were randomized into three groups: spaced retrieval, Montessori-based activities, and a control group. The intervention consisted of three 30-40 min sessions per week for 8 weeks. After receiving the intervention, the Edinburgh Feeding Evaluation in Dementia (EdFED) scores and assisted feeding scores for the SR and Montessori-based activity groups were significantly lower than those of the control group. However, the frequencies of physical assistance and verbal assistance for the Montessori-based activity group after intervention were significantly higher than those of the control group, which suggests that residents who received Montessori-based activity needed more physical and verbal assistance during mealtimes. In terms of the effects on nutritional status after intervention, the Mini-Nutritional Assessment (MNA) score in the SR group was significantly higher than that of the control group. This study confirms the efficacy of SR and Montessori-based activities for eating difficulty and eating ability. A longitudinal study to follow the long-term effects of SR and Montessori-based activities on eating ability and nutritional status is recommended. Copyright © 2010 John Wiley & Sons, Ltd.

  20. Performance Improvement of Near Earth Space Survey (NESS) Wide-Field Telescope (NESS-2) Optics

    Directory of Open Access Journals (Sweden)

    Sung-Yeol Yu

    2010-06-01

    We modified the optical system of the 500 mm wide-field telescope whose point spread function showed an irregularity. The telescope has been operated for the Near Earth Space Survey (NESS) at Siding Spring Observatory (SSO) in Australia, and the optical system was brought back to Korea in January 2008. After performing a numerical simulation with the tested value of the surface figure error of the primary mirror using an optical design program, we found that the surface figure error of the mirror should be fabricated to less than root mean square (RMS) λ/10 in order to obtain a stellar full width at half maximum (FWHM) below 28 μm. However, we started to figure the mirror for a target value of RMS λ/20, because the system surface figure error would be increased by errors induced by the optical axis adjustment, mirror cell installation, and other factors. The radius of curvature of the primary mirror was 1,946 mm after the correction. Its measured surface figure error was less than RMS λ/20 on the table of the polishing machine, and RMS λ/15 after installation in the primary mirror cell. A test observation was performed at Daeduk Observatory at the Korea Astronomy and Space Science Institute using the existing mount and resulted in a stellar FWHM of 39.8 μm. This was larger than the value from the numerical simulation and showed a wing-shaped stellar image. It turned out that the measured curvature of the secondary mirror, 1,820 mm, was not the same as the designed value, 1,795.977 mm. We fabricated the secondary mirror to the designed value, and finally obtained a stellar FWHM of 27 μm after re-installation of the optical system at the SSO NESS Observatory in Australia.

  1. Active-matrix OLED (AMOLED) microdisplay for augmented-reality applications with improved color space

    OpenAIRE

    Thomschke, Michael; Fehse, Karsten; Richter, Bernd; Wartenberg, Philipp; Pfeifer, Richard; Vogel, Uwe

    2013-01-01

    Our contribution describes the optimization of OLED microdisplays to increase the color gamut and to reduce the OLED complexity. We show that these improvements can be achieved with a 3-color RGB-white OLED approach that features a single-layer multicolor emitting zone.

  2. Hypothesis Tests for Bernoulli Experiments: Ordering the Sample Space by Bayes Factors and Using Adaptive Significance Levels for Decisions

    Directory of Open Access Journals (Sweden)

    Carlos A. de B. Pereira

    2017-12-01

    The main objective of this paper is to find the relation between the adaptive significance level presented here and the sample size. We statisticians know of the inconsistency, or paradox, in the current classical tests of significance that are based on p-value statistics that are compared to the canonical significance levels (10%, 5%, and 1%): “Raise the sample size to reject the null hypothesis” is the recommendation of some ill-advised scientists! This paper will show that it is possible to eliminate this problem of significance tests. We present here the beginning of a larger research project. The intention is to extend its use to more complex applications such as survival analysis, reliability tests, and other areas. The main tools used here are the Bayes factor and the extended Neyman–Pearson Lemma.
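
    A minimal sketch of the ordering idea, under hypotheses chosen here purely for illustration (a point null θ = 0.5 against a uniform-prior alternative for n Bernoulli trials): each possible outcome is scored by its Bayes factor in favour of the null, the sample space is ordered by that score, and a rejection region is accumulated from the least favourable outcomes. The fixed 5% cap below is only a stand-in; the paper's adaptive significance level depends on the sample size.

      from math import comb

      def bf01(x, n):
          """Bayes factor of H0: theta = 0.5 against H1: theta ~ Uniform(0,1)
          for x successes in n Bernoulli trials (illustrative hypotheses)."""
          p_x_given_h0 = comb(n, x) * 0.5**n
          p_x_given_h1 = 1.0 / (n + 1)   # binomial likelihood integrated over a uniform prior
          return p_x_given_h0 / p_x_given_h1

      n = 20
      # Order the sample space {0, ..., n} by increasing evidence for H0.
      ordered = sorted(range(n + 1), key=lambda x: bf01(x, n))

      # Build a rejection region from the outcomes least favourable to H0,
      # stopping before the accumulated null probability exceeds the cap.
      region, alpha = [], 0.0
      for x in ordered:
          p = comb(n, x) * 0.5**n
          if alpha + p > 0.05:
              break
          region.append(x)
          alpha += p

      print("rejection region:", sorted(region), "attained level:", round(alpha, 4))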

  3. Vapor space characterization of waste tank 241-C-106: Results from samples collected on February 15, 1994

    International Nuclear Information System (INIS)

    McVeety, B.D.; Clauss, T.W.; Young, J.S.; Ligotke, M.W.; Goheen, S.C.; Lucke, R.B.; Pool, K.H.; McCulloch, M.; Fruchter, J.S.

    1995-06-01

    This document presents the details of the inorganic and organic analyses that were performed on samples from the headspace of Hanford waste tank 241-C-106. The results described were obtained to support the safety and toxicological evaluations. A summary of the results for the inorganic and organic analytes is included, as well as a detailed description of the results, which appears in the text

  4. Improvement of sampling plans for Salmonella detection in pooled table eggs by use of real-time PCR.

    Science.gov (United States)

    Pasquali, Frédérique; De Cesare, Alessandra; Valero, Antonio; Olsen, John Emerdhal; Manfreda, Gerardo

    2014-08-01

    Eggs and egg products have been described as the most critical food vehicles of salmonellosis. The prevalence and level of contamination of Salmonella on table eggs are low, which severely affects the sensitivity of sampling plans applied voluntarily in some European countries, where one to five pools of 10 eggs are tested by the culture-based reference method ISO 6579:2004. In the current study we compared the testing sensitivity of the reference culture method ISO 6579:2004 and an alternative real-time PCR method on Salmonella-contaminated egg pools of different sizes (4-9 uninfected eggs mixed with one contaminated egg) and contamination levels (10^0-10^1, 10^1-10^2, 10^2-10^3 CFU/eggshell). Two hundred and seventy samples, corresponding to 15 replicates per pool size and inoculum level, were tested. At the lowest contamination level, real-time PCR detected Salmonella in 40% of contaminated pools vs 12% using ISO 6579. The results were used in a Monte Carlo simulation to estimate the lowest number of sample units that need to be tested in order to have 95% certainty of not falsely accepting a contaminated lot. According to this simulation, at least 16 pools of 10 eggs each need to be tested by ISO 6579 in order to obtain this confidence level, while the minimum number of pools to be tested was reduced to 8 pools of 9 eggs each when real-time PCR was applied as the analytical method. This result underlines the importance of including analytical methods with higher sensitivity in order to improve the efficiency of sampling and reduce the number of samples to be tested. Copyright © 2013 Elsevier B.V. All rights reserved.
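
    The sample-size reasoning can be made concrete with a toy Monte Carlo under a deliberately simplified assumption that each pool drawn from a contaminated lot is detected with a fixed probability (taken loosely from the per-pool detection rates quoted above for the lowest contamination level). This is not the study's full contamination model, so it will not reproduce the 16- and 8-pool figures exactly.

      import random

      def detection_probability(n_pools, p_detect_per_pool, n_sim=20000, rng=random.Random(1)):
          """Fraction of simulated contaminated lots in which at least one of
          n_pools tested pools gives a positive result."""
          hits = sum(
              any(rng.random() < p_detect_per_pool for _ in range(n_pools))
              for _ in range(n_sim)
          )
          return hits / n_sim

      def min_pools_for_confidence(p_detect_per_pool, target=0.95):
          """Smallest number of pools giving at least `target` probability of
          detecting a contaminated lot under the simplified model."""
          n = 1
          while detection_probability(n, p_detect_per_pool) < target:
              n += 1
          return n

      # Illustrative per-pool detection rates at the lowest contamination level
      # (40% for real-time PCR, 12% for ISO 6579, as quoted in the abstract).
      for method, p in (("real-time PCR", 0.40), ("ISO 6579", 0.12)):
          print(method, "->", min_pools_for_confidence(p), "pools")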

  5. Vapor space characterization of waste Tank 241-TX-118 (in situ): Results from samples collected on 9/7/94

    International Nuclear Information System (INIS)

    Thomas, B.L.; Clauss, T.W.; Ligotke, M.W.; Pool, K.H.; McVeety, B.D.; Olsen, K.B.; Fruchter, J.S.; Goheen, S.C.

    1995-10-01

    This report describes inorganic and organic analyses results from in situ samples obtained from the headspace of the Hanford waste storage Tank 241-TX-118 (referred to as Tank TX-118). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), hydrogen cyanide (HCN), and water (H 2 O). Sampling for sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 13 analytes. Hexane, normally included in the additional analytes, was removed because a calibration standard was not available during analysis of Tank TX-118 SUMMA™ canisters. Of these, 12 were observed above the 5-ppbv reporting cutoff. Fourteen tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 86% of the total organic components in Tank TX-118. Permanent gas analysis was not conducted on the tank-headspace samples. Tank TX-118 is on both the Ferrocyanide and Organic Watch Lists

  6. Vapor space characterization of Waste Tank 241-TY-104: Results from samples collected on 4/27/95

    International Nuclear Information System (INIS)

    Klinger, G.S.; Olsen, K.B.; Clauss, T.W.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-TY-104 (referred to as Tank TY-104). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 8 were observed above the 5-ppbv reporting cutoff. Five tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 94% of the total organic components in Tank TY-104. Nitrous oxide (N 2 O) was the only permanent gas detected in the tank-headspace samples. Tank TY-104 is on the Ferrocyanide Watch List

  7. Vapor space characterization of Waste Tank 241-U-105: Results from samples collected on 2/24/95

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Ligotke, M.W.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-U-105 (referred to as Tank U-105). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, six were observed above the 5-ppbv reporting cutoff. Three tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. All nine of the organic analytes identified are listed in Table 1 and account for 100% of the total organic components in Tank U-105. Nitrous oxide (N 2 O) was the only permanent gas detected in the tank-headspace sample. Tank U-105 is on the Hydrogen Watch List

  8. Vapor space characterization of Waste Tank 241-U-107: Results from samples collected on 2/17/95

    International Nuclear Information System (INIS)

    McVeety, B.D.; Clauss, T.W.; Ligotke, M.W.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-U-107 (referred to as Tank U-107). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 10 were observed above the 5-ppbv reporting cutoff. Sixteen organic tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv, and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 88% of the total organic components in Tank U-107. Nitrous oxide (N 2 O) was the only permanent gas detected in the tank-headspace samples. Tank U-107 is on the Organic and the Hydrogen Watch Lists

  9. Vapor space characterization of waste Tank 241-SX-103: Results from samples collected on 3/23/95

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Clauss, T.W.; Pool, K.H.; McVeety, B.D.; Klinger, G.S.; Olsen, K.B.; Bredt, O.P.; Fruchter, J.S.; Goheen, S.C.

    1995-11-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage tank 241-SX-103 (referred to as Tank SX-103). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water vapor (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, two were observed above the 5-ppbv reporting cutoff. Two tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The four organic analytes identified are listed in Table 1 and account for approximately 100% of the total organic components in Tank SX-103. Carbon dioxide (CO 2 ) was the only permanent gas detected in the tank-headspace samples. Tank SX-103 is on the Hydrogen Watch List

  10. Vapor space characterization of Waste Tank 241-U-106 (in situ): Results from samples collected on 8/25/94

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Lucke, R.B.; Pool, K.H.

    1995-10-01

    This report describes inorganic and organic analyses results from in situ samples obtained from the headspace of the Hanford waste storage Tank 241-U-106 (referred to as Tank U-106). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not performed. In addition, the authors looked for the 39 TO-14 compounds plus an additional 14 target analytes. Of these, six were observed above the 5-ppbv reporting cutoff. Ten organic tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv in two or more of the three samples collected and are reported with concentrations that are semiquantitative estimates based on internal standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 89% of the total organic components in Tank U-106. Methyl isocyanate, a compound of possible concern in Tank U-106, was not detected. Tank U-106 is on the Organic Watch List

  11. An Improved Seeding Algorithm of Magnetic Flux Lines Based on Data in 3D Space

    Directory of Open Access Journals (Sweden)

    Jia Zhong

    2015-05-01

    This paper proposes an approach to increase the accuracy and efficiency of seeding algorithms for magnetic flux lines in magnetic field visualization. To obtain accurate and reliable visualization results, the density of the magnetic flux lines should map the magnetic induction intensity, and the seed points determine the density of the magnetic flux lines. However, the traditional seeding algorithm, which is a statistical algorithm based on data, produces errors when computing magnetic flux through subdivision of the plane. To achieve higher accuracy, more subdivisions must be made, which reduces efficiency. This paper analyzes the errors made when the traditional seeding algorithm is used and gives an improved algorithm. It then validates the accuracy and efficiency of the improved algorithm by comparing the results of the two algorithms with results from the equivalent magnetic flux algorithm.
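
    The principle that seed density should map the magnetic induction intensity can be illustrated with a short sketch (this is not the paper's improved algorithm): the plane is subdivided into cells, the flux through each cell is approximated from the normal field component, and seed points are allocated in proportion to that flux. The field function and grid used below are assumptions for the example.

      import numpy as np

      def allocate_seeds(bz_on_grid, cell_area, n_total_seeds):
          """Distribute seed points over grid cells in proportion to the magnetic
          flux |Bz| * area through each cell (simple proportional allocation).
          Rounding means the total may differ slightly from n_total_seeds."""
          flux = np.abs(bz_on_grid) * cell_area
          weights = flux / flux.sum()
          return np.round(weights * n_total_seeds).astype(int)

      # Example: a dipole-like normal field sampled on a coarse 2D grid (assumed).
      x, y = np.meshgrid(np.linspace(-1, 1, 10), np.linspace(-1, 1, 10))
      bz = 1.0 / (x**2 + y**2 + 0.1)          # stronger field near the origin
      seeds_per_cell = allocate_seeds(bz, cell_area=0.04, n_total_seeds=200)

      print("total seeds placed:", seeds_per_cell.sum())
      print("max seeds in one cell:", seeds_per_cell.max())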

  12. [Patient identification errors and biological samples in the analytical process: Is it possible to improve patient safety?].

    Science.gov (United States)

    Cuadrado-Cenzual, M A; García Briñón, M; de Gracia Hills, Y; González Estecha, M; Collado Yurrita, L; de Pedro Moro, J A; Fernández Pérez, C; Arroyo Fernández, M

    2015-01-01

    Errors in the identification of patients and of biological samples are among the problems with the highest risk of causing an adverse event for the patient. The aims were to detect and analyse the causes of patient identification errors in analytical requests (PIEAR) from emergency departments, and to develop improvement strategies. A process and protocol were designed, to be followed by all professionals involved in the requesting and performing of laboratory tests. Evaluation and monitoring indicators of PIEAR were determined before and after the implementation of these improvement measures (years 2010-2014). A total of 316 PIEAR were detected in 483,254 emergency service requests during the study period, representing a mean of 6.80/10,000 requests. Patient identification failure was the most frequent error in all the 6-monthly periods assessed, with a significant difference. However, we must continue working with this strategy, promoting a culture of safety for all the professionals involved, and trying to achieve the goal that 100% of the analytical requests and samples are properly identified. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.

  13. Improving imperfect data from health management information systems in Africa using space-time geostatistics.

    Directory of Open Access Journals (Sweden)

    Peter W Gething

    2006-06-01

    Reliable and timely information on disease-specific treatment burdens within a health system is critical for the planning and monitoring of service provision. Health management information systems (HMIS) exist to address this need at national scales across Africa but are failing to deliver adequate data because of widespread underreporting by health facilities. Faced with this inadequacy, vital public health decisions often rely on crudely adjusted regional and national estimates of treatment burdens. This study has taken the example of presumed malaria in outpatients within the largely incomplete Kenyan HMIS database and has defined a geostatistical modelling framework that can predict values for all data that are missing through space and time. The resulting complete set can then be used to define treatment burdens for presumed malaria at any level of spatial and temporal aggregation. Validation of the model has shown that these burdens are quantified to an acceptable level of accuracy at the district, provincial, and national scale. The modelling framework presented here provides, to our knowledge for the first time, reliable information from imperfect HMIS data to support evidence-based decision-making at national and sub-national levels.
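
    As a rough, generic sketch of space-time gap filling in the spirit of the framework described above (not the authors' model), a Gaussian-process regression over longitude, latitude, and month can predict a facility-month value that was never reported, together with an uncertainty estimate; the synthetic data, kernel choice, and length scales below are assumptions.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Synthetic "reported" facility-month records: (lon, lat, month) -> treatment count.
      rng = np.random.default_rng(0)
      X_obs = np.column_stack([
          rng.uniform(34, 42, 120),    # longitude
          rng.uniform(-4, 4, 120),     # latitude
          rng.integers(1, 13, 120),    # month
      ])
      y_obs = 200 + 30 * np.sin(2 * np.pi * X_obs[:, 2] / 12) + rng.normal(0, 10, 120)

      # Separate length scales for space and time, plus a noise term.
      kernel = RBF(length_scale=[2.0, 2.0, 3.0]) + WhiteKernel(noise_level=50.0)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_obs, y_obs)

      # Predict a missing facility-month, with an uncertainty estimate.
      X_missing = np.array([[38.0, 0.5, 7]])
      mean, std = gp.predict(X_missing, return_std=True)
      print(f"predicted burden: {mean[0]:.0f} +/- {std[0]:.0f}")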

  14. Optimizing the design of nanostructures for improved thermal conduction within confined spaces

    Directory of Open Access Journals (Sweden)

    Fan Jintu

    2011-01-01

    Maintaining constant temperature is of particular importance to the normal operation of electronic devices. With this aim, this paper proposes an optimum design of nanostructures made of highly thermally conductive nanomaterials to provide outstanding heat dissipation from the confined interior (possibly nanosized) to the micro-spaces of electronic devices. The design incorporates a carbon nanocone for conducting heat from the interior to the exterior of a miniature electronic device, with the optimum diameter D_0 of the nanocone satisfying the relationship D_0^2(x) ∝ x^(1/2), where x is the position along the length direction of the carbon nanocone. Branched structures made of single-walled carbon nanotubes (CNTs) are shown to be particularly suitable for the purpose. It was found that the total thermal resistance of a branched structure reaches a minimum when the diameter ratio β* satisfies the relationship β* = γ^(-0.25b) N^(-1/k*), where γ is the ratio of lengths, b = 0.3 to approximately 0.4 for single-walled CNTs, b = 0.6 to approximately 0.8 for multi-walled CNTs, k* = 2, and N is the bifurcation number (N = 2, 3, 4, ...). The findings of this research provide a blueprint for designing miniaturized electronic devices with outstanding heat dissipation. PACS numbers: 44.10.+i, 44.05.+e, 66.70.-f, 61.48.De
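
    The two scaling relations quoted above can be evaluated directly. The sketch below simply plugs in numbers; the anchor diameter, length ratio, bifurcation number, and the value of b are illustrative assumptions chosen within the ranges given in the abstract.

      import numpy as np

      def cone_diameter_profile(x, d_ref, x_ref):
          """Optimal nanocone diameter profile from D0^2(x) proportional to x^(1/2),
          i.e. D0(x) proportional to x^(1/4), anchored so that D0(x_ref) = d_ref."""
          return d_ref * (x / x_ref) ** 0.25

      def optimal_diameter_ratio(gamma, b, n_branches, k_star=2):
          """Optimal branch-to-trunk diameter ratio beta* = gamma^(-0.25*b) * N^(-1/k*)."""
          return gamma ** (-0.25 * b) * n_branches ** (-1.0 / k_star)

      # Illustrative numbers (assumed): single-walled CNT branching with b = 0.35,
      # a length ratio gamma = 0.7, and a bifurcation number of N = 2.
      x = np.linspace(0.1, 1.0, 5)          # position along the cone (arbitrary units)
      print("diameter profile:", cone_diameter_profile(x, d_ref=10.0, x_ref=1.0))
      print("optimal beta*:", optimal_diameter_ratio(gamma=0.7, b=0.35, n_branches=2))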

  15. Performance improvement of coherent free-space optical communication with quadrature phase-shift keying modulation using digital phase estimation.

    Science.gov (United States)

    Li, Xueliang; Geng, Tianwen; Ma, Shuang; Li, Yatian; Gao, Shijie; Wu, Zhiyong

    2017-06-01

    The performance of coherent free-space optical (CFSO) communication with phase modulation is limited by both phase fluctuations and intensity scintillations induced by atmospheric turbulence. To improve the system performance, one effective way is to use digital phase estimation. In this paper, a CFSO communication system with quadrature phase-shift keying modulation is studied. With consideration of the effects of log-normal amplitude fluctuations and Gaussian phase fluctuations, a two-stage Mth power carrier phase estimation (CPE) scheme is proposed. The simulation results show that the phase noise can be suppressed greatly by this scheme, and the system symbol error rate with the two-stage Mth power CPE can be three orders of magnitude lower than that of the single-stage Mth power CPE. Therefore, the two-stage CPE we propose can contribute to performance improvements of the CFSO communication system and provides useful guidance for its practical application.
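
    A minimal, single-stage sketch of Mth-power (Viterbi-Viterbi-style) carrier phase estimation for QPSK is given below to illustrate the operation the two-stage scheme builds on; the block length, noise level, and phase offset are assumptions, and the paper's two-stage refinement is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(0)

      # QPSK symbols (constellation at 0, 90, 180, 270 degrees) with an unknown
      # constant carrier phase offset plus additive noise.
      symbols = np.exp(1j * (np.pi / 2) * rng.integers(0, 4, 4096))
      phase_offset = 0.3                                     # radians (assumed)
      noise = 0.05 * (rng.normal(size=symbols.size) + 1j * rng.normal(size=symbols.size))
      rx = symbols * np.exp(1j * phase_offset) + noise

      # Mth-power estimate (M = 4 for QPSK): raising to the 4th power strips the
      # modulation, leaving 4x the carrier phase, which is then block-averaged.
      M = 4
      block = rx[:512]
      est = np.angle(np.mean(block ** M)) / M

      # Note: the estimate is ambiguous modulo 2*pi/M (= pi/2); it recovers a
      # constant offset directly only when |offset| < pi/4, as assumed here.
      corrected = rx * np.exp(-1j * est)
      print(f"true offset {phase_offset:.3f} rad, estimated {est:.3f} rad")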

  16. CMsearch: simultaneous exploration of protein sequence space and structure space improves not only protein homology detection but also protein structure prediction

    KAUST Repository

    Cui, Xuefeng

    2016-06-15

    Motivation: Protein homology detection, a fundamental problem in computational biology, is an indispensable step toward predicting protein structures and understanding protein functions. Despite the advances in recent decades on sequence alignment, threading and alignment-free methods, protein homology detection remains a challenging open problem. Recently, network methods that try to find transitive paths in the protein structure space demonstrate the importance of incorporating network information of the structure space. Yet, current methods merge the sequence space and the structure space into a single space, and thus introduce inconsistency in combining different sources of information. Method: We present a novel network-based protein homology detection method, CMsearch, based on cross-modal learning. Instead of exploring a single network built from the mixture of sequence and structure space information, CMsearch builds two separate networks to represent the sequence space and the structure space. It then learns sequence–structure correlation by simultaneously taking sequence information, structure information, sequence space information and structure space information into consideration. Results: We tested CMsearch on two challenging tasks, protein homology detection and protein structure prediction, by querying all 8332 PDB40 proteins. Our results demonstrate that CMsearch is insensitive to the similarity metrics used to define the sequence and the structure spaces. By using HMM–HMM alignment as the sequence similarity metric, CMsearch clearly outperforms state-of-the-art homology detection methods and the CASP-winning template-based protein structure prediction methods.

  17. The improvement of MOSFET prediction in space environments using the conversion model

    International Nuclear Information System (INIS)

    Shvetzov-Shilovsky, I.N.; Cherepko, S.V.; Pershenkov, V.S.

    1994-01-01

    The modeling of MOS device response to low-dose-rate irradiation has been performed. The existing conversion model, based on the linear dependence between positive oxide charge annealing and interface trap buildup, accurately predicts the long-time response of MOSFETs with relatively thick oxides but overestimates the threshold voltage shift for radiation-hardened MOSFETs with thin oxides. To explain this fact, the authors investigate the impulse response function for the threshold voltage. A revised model, which incorporates the different energy levels of hole traps in the oxide, improves the fit between the model and the data and explains the dependence of the fitting parameters on the oxide field

  18. Vapor space characterization of waste tank 241-BY-109 (in situ): Results from samples collected on 9/22/94

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Ligotke, M.W.

    1995-06-01

    This report describes inorganic and organic analyses results from in situ samples obtained from the headspace of the Hanford waste storage Tank 241-BY-109 (referred to as Tank BY-109). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Summary Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. Organic compounds were also quantitatively determined. Twenty-three organic tentatively identified compounds (TICs) were observed above the detection limit of (ca.) 10 ppbv, but standards for most of these were not available at the time of analysis, and the reported concentrations are semiquantitative estimates. In addition, we looked for the 40 standard TO-14 analytes. We observed 38. Of these, only a few were observed above the 2-ppbv calibrated instrument detection limit. The ten organic analytes with the highest estimated concentrations are listed in Summary Table 1. The ten analytes account for approximately 84% of the total organic components in Tank BY-109

  19. Vapor space characterization of waste tank 241-TY-103: Results from samples collected on 4/11/95

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Clauss, T.W.; Pool, K.H.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-TY-103 (referred to as Tank TY-103). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 16 were observed above the 5-ppbv reporting cutoff. Sixteen tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 95% of the total organic components in Tank TY-103. Two permanent gases, carbon dioxide (CO 2 ) and nitrous oxide (N 2 O), were also detected

  20. Vapor space characterization of Waste Tank 241-S-111: Results from samples collected on 3/21/95

    International Nuclear Information System (INIS)

    Klinger, G.S.; Clauss, T.W.; Ligotke, M.W.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-S-111 (referred to as Tank S-111). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, seven were observed above the 5-ppbv reporting cutoff. Five tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 98% of the total organic components in Tank S-111. Two permanent gases, hydrogen (H 2 ) and nitrous oxide (N 2 O), were also detected. Tank S-111 is on the Hydrogen Watch List

  1. Vapor space characterization of waste Tank 241-U-103: Results from samples collected on 2/15/95

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Pool, K.H.; Clauss, T.W.; McVeety, B.D.; Klinger, G.S.; Olsen, K.B.; Bredt, O.P.; Fruchter, J.S.; Goheen, S.C.

    1995-11-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-U-103 (referred to as Tank U-103). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water vapor (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 11 were observed above the 5-ppbv reporting cutoff. Eleven tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 90% of the total organic components in Tank U-103. Two permanent gases, hydrogen (H 2 ) and nitrous oxide (N 2 O), were also detected. Tank U-103 is on the Hydrogen Watch List

  2. Vapor space characterization of waste Tank 241-SX-106: Results from samples collected on 3/24/95

    International Nuclear Information System (INIS)

    Klinger, G.S.; Clauss, T.W.; Ligotke, M.W.

    1995-11-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-SX-106 (referred to as Tank SX-106). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 4 were observed above the 5-ppbv reporting cutoff. Three tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 7 organic analytes identified are listed in Table 1 and account for approximately 100% of the total organic components in Tank SX-106. Carbon dioxide (CO 2 ) was the only permanent gas detected. Tank SX-106 is on the Ferrocyanide Watch List

  3. Vapor space characterization of waste tank 241-TX-118: Results from samples collected on 12/16/94

    International Nuclear Information System (INIS)

    Lucke, R.B.; Ligotke, M.W.; McVeety, B.D.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-TX-118 (referred to as Tank TX-118). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 3 were observed above the 5-ppbv reporting cutoff. Twenty three organic tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv, and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 84% of the total organic components in Tank TX-118. Two permanent gases, carbon dioxide (CO 2 ) and nitrous oxide (N 2 O), were also detected

  4. Vapor space characterization of waste tank 241-S-102: Results from samples collected on 3/14/95

    International Nuclear Information System (INIS)

    Pool, K.H.; McVeety, B.D.; Clauss, T.W.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-S-102 (referred to as Tank S-102). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 11 were observed above the 5-ppbv reporting cutoff. Eleven tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 95% of the total organic components in Tank S-102. Two permanent gases, hydrogen (H 2 ) and nitrous oxide (N 2 O), were also detected

  5. Vapor space characterization of Waste Tank 241-TY-104 (in situ): Results from samples collected on 8/5/94

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Pool, K.H.; Lucke, R.B.

    1995-10-01

    This report describes inorganic and organic analyses results from in situ samples obtained from the headspace of the Hanford waste storage Tank 241-TY-104 (referred to as Tank TY-104). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not performed. In addition, the authors looked for the 39 TO-14 compounds plus an additional 14 analytes. Of these, eight were observed above the 5-ppbv reporting cutoff. Twenty-four organic tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 86% of the total organic components in Tank TY-104. Tank TY-104 is on the Ferrocyanide Watch List

  6. Vapor space characterization of waste Tank 241-TY-101: Results from samples collected on 4/6/95

    International Nuclear Information System (INIS)

    Klinger, G.S.; Clauss, T.W.; Ligotke, M.W.; Pool, K.H.; McVeety, B.D.; Olsen, K.B.; Bredt, O.P.; Fruchter, J.S.; Goheen, S.C.

    1995-11-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-TY-101 (referred to as Tank TY-101). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water vapor (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SO x ) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 5 were observed above the 5-ppbv reporting cutoff. One tentatively identified compound (TIC) was observed above the reporting cutoff of (ca.) 10 ppbv and is reported with a concentration that is a semiquantitative estimate based on internal-standard response factors. The six organic analytes identified are listed in Table 1 and account for approximately 100% of the total organic components in Tank TY-101. Two permanent gases, carbon dioxide (CO 2 ) and nitrous oxide (N 2 O), were also detected. Tank TY-101 is on the Ferrocyanide Watch List

  7. Vapor space characterization of waste tank 241-BX-104: Results from samples collected on 12/30/94

    International Nuclear Information System (INIS)

    Pool, K.H.; Ligotke, M.W.; McVeety, B.D.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-BX-104 (referred to as Tank BX-104). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH 3 ), nitrogen dioxide (NO 2 ), nitric oxide (NO), and water (H 2 O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SOx) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 13 were observed above the 5-ppbv reporting cutoff. Sixty-six organic tentatively identified compounds (TICs) were observed above the reporting cutoff of (ca.) 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 70% of the total organic components in Tank BX-104. Two permanent gases, carbon dioxide (CO 2 ) and nitrous oxide (N 2 O), were also detected

  8. Vapor space characterization of waste Tank 241-BY-107: Results from in situ sample collected on 3/25/94

    International Nuclear Information System (INIS)

    Sharma, A.K.; Lucke, R.B.; Clauss, T.W.; McVeety, B.D.; Fruchter, J.S.; Goheen, S.C.

    1995-06-01

    This report describes organic results from vapors of the Hanford single-shell waste storage Tank 241-BY-107 (referred to as Tank BY-107). Samples for selected inorganic compounds were obtained but not analyzed (Section 2.0). Quantitative results were obtained for several organic analytes, but quantities of analytes not listed in US Environmental Protection Agency (EPA) compendium Method TO-14 were estimated. Approximately 80 tentatively identified organic analytes were observed above the detection limit of (ca.) 10 ppbv, but standards for most of these were not available at the time of analysis, and their quantitative determination is beyond the scope of this study. The SUMMA(TM) canister samples were also analyzed for the 41 organic compounds listed in EPA compendium Method TO-14. Of these, only a few were observed above the 2-ppbv detection limits. These are summarized in Table 3.1. Estimated quantities were determined for tentatively identified compounds (TICs); a summary of these results gives the quantities of all TICs above a concentration of ca. 10 ppbv, a list of more than 80 organic analytes. The 12 organic analytes with the highest estimated concentrations are shown

  9. An Improved X-Band Maser System for Deep Space Network Applications

    Science.gov (United States)

    Britcliffe, M.; Hanson, T.; Fernandez, J.

    2000-01-01

    An 8450-MHz (X-band) maser system utilizing a commercial Gifford-McMahon (GM) closed-cycle cryocooler (CCR) was designed, fabricated, and demonstrated. The CCR system was used to cool a maser operating at 8450 MHz. The prototype GM CCR system meets or exceeds all Deep Space Network requirements for maser performance. The two-stage GM CCR operates at 4.2 K; for comparison, the DSN's current three-stage cryocooler, which uses a Joule-Thomson cooling stage in addition to GM cooling, operates at 4.5 K. The new CCR withstands heat loads of 1.5 W at 4.2 K as compared to 1 W at 4.5 K for the existing DSN cryocooler used for cooling masers. The measured noise temperature, T_e, of the maser used for these tests is defined at the ambient connection to the antenna feed system. The T_e measured 5.0 K at a CCR temperature of 4.5 K, about 1.5 K higher than the noise temperature of a typical DSN Block II-A X-band traveling-wave maser (TWM). Reducing the temperature of the CCR significantly lowers the maser noise temperature and increases maser gain and bandwidth. The new GM CCR gives future maser systems significant operational advantages, including reduced maintenance time and logistics requirements. The results of a demonstration of this new system are presented. Advantages of using a GM-cooled maser and the effects of the reduced CCR temperature on maser performance are discussed.

  10. Biosentinel: Improving Desiccation Tolerance of Yeast Biosensors for Deep-Space Missions

    Science.gov (United States)

    Dalal, Sawan; Santa Maria, Sergio R.; Liddell, Lauren; Bhattacharya, Sharmila

    2017-01-01

    BioSentinel is one of 13 secondary payloads to be deployed on Exploration Mission 1 (EM-1) in 2019. We will use the budding yeast Saccharomyces cerevisiae as a biosensor to determine how deep-space radiation affects living organisms and to potentially quantify radiation levels through radiation damage analysis. Radiation can damage DNA through double strand breaks (DSBs), which can normally be repaired by homologous recombination. Two yeast strains will be air-dried and stored in microfluidic cards within the payload: a wild-type control strain and a radiation sensitive rad51 mutant that is deficient in DSB repairs. Throughout the mission, the microfluidic cards will be rehydrated with growth medium and an indicator dye. Growth rates of each strain will be measured through LED detection of the reduction of the indicator dye, which correlates with DNA repair and the amount of radiation damage accumulated. Results from BioSentinel will be compared to analog experiments on the ISS and on Earth. It is well known that desiccation can damage yeast cells and decrease viability over time. We performed a screen for desiccation-tolerant rad51 strains. We selected 20 re-isolates of rad51 and ran a weekly screen for desiccation-tolerant mutants for five weeks. Our data shows that viability decreases over time, confirming previous research findings. Isolates L2, L5 and L14 indicate desiccation tolerance and are candidates for whole-genome sequencing. More time is needed to determine whether a specific strain is truly desiccation tolerant. Furthermore, we conducted an intracellular trehalose assay to test how intracellular trehalose concentrations affect or protect the mutant strains against desiccation stress. S. cerevisiae cell and reagent concentrations from a previously established intracellular trehalose protocol did not yield significant absorbance measurements, so we tested varying cell and reagent concentrations and determined proper concentrations for successful

  11. Applications of tuned mass dampers to improve performance of large space mirrors

    Science.gov (United States)

    Yingling, Adam J.; Agrawal, Brij N.

    2014-01-01

    In order for future imaging spacecraft to meet higher resolution imaging capability, it will be necessary to build large space telescopes with primary mirror diameters that range from 10 m to 20 m and do so with nanometer surface accuracy. Due to launch vehicle mass and volume constraints, these mirrors have to be deployable and lightweight, such as segmented mirrors using active optics to correct mirror surfaces with closed loop control. As a part of this work, system identification tests revealed that dynamic disturbances inherent in a laboratory environment are significant enough to degrade the optical performance of the telescope. Research was performed at the Naval Postgraduate School to identify the vibration modes most affecting the optical performance and evaluate different techniques to increase damping of those modes. Based on this work, tuned mass dampers (TMDs) were selected because of their simplicity in implementation and effectiveness in targeting specific modes. The selected damping mechanism was an eddy current damper where the damping and frequency of the damper could be easily changed. System identification of segments was performed to derive TMD specifications. Several configurations of the damper were evaluated, including the number and placement of TMDs, damping constant, and targeted structural modes. The final configuration consisted of two dampers located at the edge of each segment and resulted in 80% reduction in vibrations. The WFE for the system without dampers was 1.5 waves, with one TMD the WFE was 0.9 waves, and with two TMDs the WFE was 0.25 waves. This paper provides details of some of the work done in this area and includes theoretical predictions for optimum damping which were experimentally verified on a large aperture segmented system.
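
    As a rough illustration of how TMD parameters are chosen for a targeted structural mode, the sketch below applies the classical Den Hartog tuning rules; the abstract does not state which formulas the authors used, and the mass ratio, modal mass, and modal frequency here are illustrative assumptions only.

        # Classical Den Hartog tuning of a tuned mass damper (TMD) for one mode.
        # A textbook starting point, not necessarily the procedure used in the paper.
        import math

        def den_hartog_tmd(modal_mass_kg, modal_freq_hz, mass_ratio=0.02):
            """Return (tmd_mass_kg, tmd_frequency_hz, tmd_damping_ratio)."""
            mu = mass_ratio                              # assumed TMD-to-modal mass ratio
            f_ratio = 1.0 / (1.0 + mu)                   # optimal frequency ratio
            zeta = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))  # optimal damping ratio
            return mu * modal_mass_kg, f_ratio * modal_freq_hz, zeta

        # Example: a hypothetical 12-kg segment mode at 40 Hz with a 2% mass ratio.
        print(den_hartog_tmd(12.0, 40.0))                # -> (0.24, ~39.2 Hz, ~0.084)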

  12. Grad-Shafranov reconstruction: overview and improvement of the numerical solution used in space physics

    Energy Technology Data Exchange (ETDEWEB)

    Ojeda Gonzalez, A.; Domingues, M.O.; Mendes, O., E-mail: ojeda.gonzalez.a@gmail.com [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil); Kaibara, M.K. [Universidade Federal Fluminense (GMA/IME/UFF), Niteroi, RJ (Brazil); Prestes, A. [Universidade do Vale do Paraiba (IP and D/UNIVAP), Sao Jose dos Campos, SP (Brazil). Lab. de Fisica e Astronomia

    2015-10-15

    The Grad-Shafranov equation is a Poisson-type equation, i.e., a partial differential equation of elliptic type. The problem depends on the initial condition and can be treated as a Cauchy problem. Although it is ill-posed or ill-conditioned, it can be integrated numerically. In the integration of the GS equation, singularities with large values of the potential arise after a certain number of integration steps away from the original data line, and a filter should be used. The Grad-Shafranov reconstruction (GSR) technique was developed from 1996 to 2000 for recovering two-dimensional structures in the magnetopause in an ideal MHD formulation. Other works have used the GSR technique to study magnetic flux ropes in the solar wind and in the magnetotail from a single-spacecraft dataset; subsequently, it was extended to treat measurements from multiple satellites. From the Vlasov equation, it is possible to arrive at the GS equation as a function of the normalized vector potential. A general solution is obtained using complex variable theory. A specific solution was chosen as a benchmark case for solving the GS equation numerically. We propose some changes in the resolution scheme of the GS equation to improve the solution. The result of each method is compared with the solution proposed by Hau and Sonnerup (J. Geophys. Res. 104(A4), 6899-6917 (1999)). The main improvement found in the GS resolution was the need to filter the Bx values at each y value. (author)
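
    To make the marching-plus-filtering idea concrete, here is a minimal Python sketch of one integration step away from the data line; the source term dPt_dA, the smoothing kernel, and the toy initial data are assumptions for illustration and do not reproduce the authors' exact scheme.

        # One explicit step of the Grad-Shafranov (GS) Cauchy march in y, with the
        # smoothing filter the abstract notes is needed to suppress growing modes.
        import numpy as np

        def gs_step(A, dA_dy, dx, dy, dPt_dA, mu0=4e-7 * np.pi):
            d2A_dx2 = np.gradient(np.gradient(A, dx), dx)
            d2A_dy2 = -mu0 * dPt_dA(A) - d2A_dx2        # GS equation solved for A_yy
            A_next = A + dy * dA_dy + 0.5 * dy ** 2 * d2A_dy2
            dA_dy_next = dA_dy + dy * d2A_dy2
            kernel = np.array([0.25, 0.5, 0.25])        # simple low-pass filter
            smooth = lambda f: np.convolve(f, kernel, mode="same")
            return smooth(A_next), smooth(dA_dy_next)

        # Toy usage: placeholder source term and Gaussian data line.
        x = np.linspace(-1.0, 1.0, 201)
        A1, dA1 = gs_step(np.exp(-x ** 2), np.zeros_like(x), dx=x[1] - x[0],
                          dy=0.01, dPt_dA=lambda A: 0.5 * A)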

  13. High-resolution space-time characterization of convective rain cells: implications on spatial aggregation and temporal sampling operated by coarser resolution instruments

    Science.gov (United States)

    Marra, Francesco; Morin, Efrat

    2017-04-01

    Forecasting the occurrence of flash floods and debris flows is fundamental to save lives and protect infrastructures and properties. These natural hazards are generated by high-intensity convective storms, on space-time scales that cannot be properly monitored by conventional instrumentation. Consequently, a number of early-warning systems are nowadays based on remote sensing precipitation observations, e.g. from weather radars or satellites, that have proved effective in a wide range of situations. However, the uncertainty affecting rainfall estimates represents an important issue undermining the operational use of early-warning systems. The uncertainty related to remote sensing estimates results from (a) an instrumental component, intrinsic to the measurement operation, and (b) a discretization component, caused by the discretization of the continuous rainfall process. Improved understanding of these sources of uncertainty will provide crucial information to modelers and decision makers. This study aims at advancing knowledge on the (b) discretization component. To do so, we take advantage of an extremely high-resolution X-band weather radar (60 m, 1 min) recently installed in the Eastern Mediterranean. The instrument monitors a semiarid to arid transition area also covered by an accurate C-band weather radar and by a relatively sparse rain gauge network (ca. 1 gauge per 450 km2). Radar quantitative precipitation estimation includes corrections reducing the errors due to ground echoes, orographic beam blockage and attenuation of the signal in heavy rain. Intense, convection-rich flooding events that recently occurred in the area serve as study cases. We (i) describe with very high detail the spatiotemporal characteristics of the convective cores, and (ii) quantify the uncertainty due to spatial aggregation (spatial discretization) and temporal sampling (temporal discretization) operated by coarser resolution remote sensing instruments. We show that instantaneous rain intensity
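
    A minimal numerical sketch of the two discretization effects discussed above (spatial aggregation and temporal sampling) is given below; the synthetic gamma-distributed field merely stands in for the 60 m / 1 min radar grids, and the block size and revisit interval are illustrative assumptions.

        # Quantify how coarser pixels and longer revisit times change the apparent
        # peak rain intensity of a high-resolution (time, y, x) field.
        import numpy as np

        rng = np.random.default_rng(0)
        field = rng.gamma(shape=0.3, scale=8.0, size=(60, 64, 64))   # mm/h, 1-min frames

        def aggregate_space(f, block):       # average block x block pixels
            t, ny, nx = f.shape
            return f.reshape(t, ny // block, block, nx // block, block).mean(axis=(2, 4))

        def sample_time(f, every):           # keep one instantaneous scan per interval
            return f[::every]

        print(field.max(),                              # full-resolution peak
              aggregate_space(field, 8).max(),          # spatial smoothing lowers the peak
              sample_time(field, 5).max())              # sparse sampling may miss it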

  14. Improvement of the equivalent sphere model for better estimates of skin or eye dose in space radiation environments

    International Nuclear Information System (INIS)

    Lin, Z.W.

    2011-01-01

    It is often useful to get a quick estimate of the dose or dose equivalent of an organ, such as blood-forming organs, the eye or the skin, in a radiation field. Sometimes an equivalent sphere is used to represent the organ for this purpose. For space radiation environments, recently it has been shown that the equivalent sphere model does not work for the eye or the skin in solar particle event environments. In this study, we improve the representation of the eye and the skin using a two-component equivalent sphere model. Motivated by the two-peak structure of the body organ shielding distribution for the eye and the skin, we use an equivalent sphere with two radius parameters, for example a partial spherical shell of a smaller thickness over a proper fraction of the full solid angle combined with a concentric partial spherical shell of a larger thickness over the rest of the full solid angle, to represent the eye or the skin. We find that using an equivalent sphere with two radius parameters instead of one drastically improves the accuracy of the estimates of dose and dose equivalent in space radiation environments. For example, in solar particle event environments the average error in the estimate of the skin dose equivalent using an equivalent sphere with two radius parameters is about 8%, while the average error of the conventional equivalent sphere model using one radius parameter is around 100%.
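
    The two-parameter idea reduces to a weighted sum of the doses behind two shell thicknesses; a small sketch follows, in which the depth-dose function and the fraction and thickness values are placeholders rather than the values fitted in the study.

        # Two-parameter equivalent-sphere estimate: a fraction f of the solid angle
        # is shielded by thickness t1, the remainder by t2.
        import math

        def dose_behind(t_gcm2):
            # Placeholder depth-dose curve; a real estimate would use transport-code output.
            return 10.0 * math.exp(-t_gcm2 / 5.0)

        def two_radius_dose(t1, t2, f):
            return f * dose_behind(t1) + (1.0 - f) * dose_behind(t2)

        # Illustrative skin-like case: thin shielding over 30% of the solid angle.
        print(two_radius_dose(t1=0.5, t2=10.0, f=0.3))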

  15. Progress in Space Weather Modeling and Observations Needed to Improve the Operational NAIRAS Model Aircraft Radiation Exposure Predictions

    Science.gov (United States)

    Mertens, C. J.; Kress, B. T.; Wiltberger, M. J.; Tobiska, W.; Xu, X.

    2011-12-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a prototype operational model for predicting commercial aircraft radiation exposure from galactic and solar cosmic rays. NAIRAS predictions are currently streaming live from the project's public website, and the exposure rate nowcast is also available on the SpaceWx smartphone app for iPhone, iPad, and Android. Cosmic rays are the primary source of human exposure to high linear energy transfer radiation at aircraft altitudes, which increases the risk of cancer and other adverse health effects. Thus, the NAIRAS model addresses an important national need with broad societal, public health and economic benefits. The processes responsible for the variability in the solar wind, interplanetary magnetic field, solar energetic particle spectrum, and the dynamical response of the magnetosphere to these space environment inputs, strongly influence the composition and energy distribution of the atmospheric ionizing radiation field. During the development of the NAIRAS model, new science questions were identified that must be addressed in order to obtain a more reliable and robust operational model of atmospheric radiation exposure. Addressing these science questions requires improvements in both space weather modeling and observations. The focus of this talk is to present these science questions, the proposed methodologies for addressing these science questions, and the anticipated improvements to the operational predictions of atmospheric radiation exposure. The overarching goal of this work is to provide a decision support tool for the aviation industry that will enable an optimal balance to be achieved between minimizing health risks to passengers and aircrew while simultaneously minimizing costs to the airline companies.

  16. Use of amplicon sequencing to improve sensitivity in PCR-based detection of microbial pathogen in environmental samples.

    Science.gov (United States)

    Saingam, Prakit; Li, Bo; Yan, Tao

    2018-06-01

    DNA-based molecular detection of microbial pathogens in complex environments is still plagued by sensitivity, specificity and robustness issues. We propose to address these issues by viewing them as inadvertent consequences of requiring specific and adequate amplification (SAA) of target DNA molecules by current PCR methods. Using the invA gene of Salmonella as the model system, we investigated if next generation sequencing (NGS) can be used to directly detect target sequences in false-negative PCR reactions (PCR-NGS) in order to remove the SAA requirement from PCR. False-negative PCR and qPCR reactions were first created using serial dilutions of laboratory-prepared Salmonella genomic DNA and then analyzed directly by NGS. Target invA sequences were detected in all false-negative PCR and qPCR reactions, which lowered the method detection limits near the theoretical minimum of single gene copy detection. The capability of the PCR-NGS approach in correcting false negativity was further tested and confirmed under more environmentally relevant conditions using Salmonella-spiked stream water and sediment samples. Finally, the PCR-NGS approach was applied to ten urban stream water samples and detected invA sequences in eight samples that would otherwise be deemed Salmonella negative. Analysis of the non-target sequences in the false-negative reactions helped to identify primer dimer-like short sequences as the main cause of the false negativity. Together, the results demonstrated that the PCR-NGS approach can significantly improve method sensitivity, correct false-negative detections, and enable sequence-based analysis for failure diagnostics in complex environmental samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Colour Doppler and microbubble contrast agent ultrasonography do not improve cancer detection rate in transrectal systematic prostate biopsy sampling.

    Science.gov (United States)

    Taverna, Gianluigi; Morandi, Giovanni; Seveso, Mauro; Giusti, Guido; Benetti, Alessio; Colombo, Piergiuseppe; Minuti, Francesco; Grizzi, Fabio; Graziotti, Pierpaolo

    2011-12-01

    What's known on the subject? and What does the study add? Transrectal gray-scale ultrasonography-guided prostate biopsy sampling is the method for diagnosing prostate cancer (PC) in patients with an increased prostate specific antigen level and/or abnormal digital rectal examination. Several imaging strategies have been proposed to optimize the diagnostic value of biopsy sampling, although at the first biopsy nearly 10-30% of PC still remains undiagnosed. This study compares the PC detection rate when employing colour Doppler ultrasonography with or without the injection of SonoVue™ microbubble contrast agent, versus the transrectal ultrasonography-guided systematic biopsy sampling. The limited accuracy, sensitivity, specificity and the additional cost of using the contrast agent do not justify its routine application in PC detection. • To compare prostate cancer (PC) detection rate employing colour Doppler ultrasonography with or without SonoVue™ contrast agent with transrectal ultrasonography-guided systematic biopsy sampling. • A total of 300 patients with negative digital rectal examination and transrectal grey-scale ultrasonography, with PSA values ranging between 2.5 and 9.9 ng/mL, were randomized into three groups: 100 patients (group A) underwent transrectal ultrasonography-guided systematic bioptic sampling; 100 patients (group B) underwent colour Doppler ultrasonography, and 100 patients (group C) underwent colour Doppler ultrasonography before and during the injection of SonoVue™. • Contrast-enhanced targeted biopsies were sampled into hypervascularized areas of peripheral, transitional, apical or anterior prostate zones. • All the patients included in Groups B and C underwent a further 13 systematic prostate biopsies. The cancer detection rate was calculated for each group. • In 88 (29.3%) patients a histological diagnosis of PC was made, whereas 22 (7.4%) patients were diagnosed with high-grade prostatic intraepithelial

  18. An Integrated Approach to Thermal Management of International Space Station Logistics Flights, Improving the Efficiency

    Science.gov (United States)

    Holladay, Jon; Day, Greg; Roberts, Barry; Leahy, Frank

    2003-01-01

    The efficiency of re-useable aerospace systems requires a focus on the total operations process rather than just orbital performance. For the Multi-Purpose Logistics Module this activity included special attention to terrestrial conditions both pre-launch and post-landing and how they inter-relate to the mission profile. Several of the efficiencies implemented for the MPLM Mission Engineering were NASA firsts and all served to improve the overall operations activities. This paper will provide an explanation of how various issues were addressed and the resulting solutions. Topics range from statistical analysis of over 30 years of atmospheric data at the launch and landing site to a new approach for operations with the Shuttle Carrier Aircraft. In each situation the goal was to "tune" the thermal management of the overall flight system for minimizing requirement risk while optimizing power and energy performance.

  19. Application of reversible denoising and lifting steps with step skipping to color space transforms for improved lossless compression

    Science.gov (United States)

    Starosolski, Roman

    2016-07-01

    Reversible denoising and lifting steps (RDLS) are lifting steps integrated with denoising filters in such a way that, despite the inherently irreversible nature of denoising, they are perfectly reversible. We investigated the application of RDLS to reversible color space transforms: RCT, YCoCg-R, RDgDb, and LDgEb. In order to improve RDLS effects, we propose a heuristic for image-adaptive denoising filter selection, a fast estimator of the compressed image bitrate, and a special filter that may result in skipping of the steps. We analyzed the properties of the presented methods, paying special attention to their usefulness from a practical standpoint. For a diverse image test-set and lossless JPEG-LS, JPEG 2000, and JPEG XR algorithms, RDLS improves the bitrates of all the examined transforms. The most interesting results were obtained for an estimation-based heuristic filter selection out of a set of seven filters; the cost of this variant was similar to or lower than the transform cost, and it improved the average lossless JPEG 2000 bitrates by 2.65% for RDgDb and by over 1% for other transforms; bitrates of certain images were improved to a significantly greater extent.
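
    The reason RDLS can be lossless despite using a lossy filter is that the denoised signal is derived only from data the decoder also has; the toy lifting step below makes that explicit (it is a generic illustration, not the specific RCT/RDgDb steps examined in the paper).

        # Reversible denoising lifting step: channel x0 is stored unchanged, so the
        # decoder recomputes denoise(x0) exactly and inverts the prediction step.
        import numpy as np

        def denoise(x):                      # any fixed filter works; 3-tap average here
            return np.convolve(x, np.array([1, 2, 1]) / 4.0, mode="same")

        def forward(x0, x1):
            return x0, x1 - np.round(denoise(x0)).astype(np.int64)

        def inverse(x0, d):
            return x0, d + np.round(denoise(x0)).astype(np.int64)

        x0 = np.random.randint(0, 256, 32)
        x1 = np.random.randint(0, 256, 32)
        assert np.array_equal(inverse(*forward(x0, x1))[1], x1)   # perfectly reversible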

  20. Experiments for improving the roentgen imaging and thus the caries diagnosis in the approximal space of lateral teeth

    International Nuclear Information System (INIS)

    Gramm, E.

    1982-01-01

    This is a study of possibilities to improve X-ray pictures of the teeth with regard to detail sharpness in the interdental space in a closed row of lateral teeth. For this purpose, X-ray pictures were made of a phantom showing a closed row of lateral teeth, with two different films being used. The row of teeth was made to include two healthy teeth, one tooth with two spots of initial caries, and one tooth with a caries lesion already showing a cavity. The two films used were the usual one, SUPER DOZAHN, and a fine-grain, insensitive film usually chosen for materials testing (NDT 55). A loss in contrast with increasing kV was observed with all X-ray pictures; the insensitive film was in every case richer in contrast than the usual dental X-ray film. Use of a special paste on the teeth in the interdental space led to improved detail sharpness for visual detection. The spots of special interest, i.e. those with initial caries, could in no case be clearly identified as such, whereas the caries lesion became evident on all images. The radiation dose was 4.4 times higher when using the insensitive, fine-grain film, as compared to the dental film; use of the paste increased the radiation dose by a further factor of 1.6. The results show that the measures studied in this thesis are not suited to improving the diagnostic value of the X-ray pictures taken as described above. (orig./MG)

  1. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach ... is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according ... implementation generally improved the algorithm's ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful ...
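
    As a sketch of the area-proportional sampling step described above, the fragment below draws virtual training points per map polygon in proportion to its area, labels them according to the polygon's stated class proportions, and trains a random forest on stand-in covariates; all polygons, proportions, and covariates are invented for illustration.

        # One simplified DSMART-style iteration with area-proportional sampling.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)
        polygons = [(12.0, {"A": 0.7, "B": 0.3}),      # (area_km2, class proportions)
                    (3.0, {"B": 0.6, "C": 0.4})]
        points_per_km2 = 5
        X, y = [], []
        for area, classes in polygons:
            n = int(area * points_per_km2)              # area-proportional point count
            y.append(rng.choice(list(classes), size=n, p=list(classes.values())))
            X.append(rng.normal(size=(n, 4)))           # stand-in terrain covariates
        model = RandomForestClassifier(n_estimators=100).fit(np.vstack(X), np.concatenate(y))
        print(model.predict(rng.normal(size=(3, 4))))   # predicted soil classes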

  2. SOIL MOISTURE SPACE-TIME ANALYSIS TO SUPPORT IMPROVED CROP MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Bruno Montoani Silva

    2015-02-01

    The knowledge of the water content in the soil profile is essential for efficient management of crop growth and development. This work aimed to use geostatistical techniques in a spatio-temporal study of soil moisture in an Oxisol in order to provide that information for improved crop management. Data were collected in a coffee crop area at São Roque de Minas, in the upper São Francisco River basin, MG state, Brazil. The soil moisture was measured with a multi-sensor capacitance (MCP) probe at 10-, 20-, 30-, 40-, 60- and 100-cm depths between March and December, 2010. After fitting the spherical semivariogram model, which gave the best fit, by ordinary least squares, the values were interpolated by kriging in order to obtain a continuous surface relating depth x time (CSDT) and the soil water availability to plants (SWAP). The results allowed additional insight into the dynamics of soil water and its availability to plants, and pointed to the effects of climate on the soil water content. These results also allowed identifying when and where there was greater water consumption by the plants, and the soil layers where water was available and potentially explored by the plant root system.
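
    A minimal sketch of the semivariogram-fitting step mentioned above is given below; the lag distances and empirical semivariances are placeholders, not the soil-moisture data of the study.

        # Fit a spherical semivariogram model by ordinary least squares before kriging.
        import numpy as np
        from scipy.optimize import curve_fit

        def spherical(h, nugget, sill, rng_):
            h = np.asarray(h, dtype=float)
            gamma = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
            return np.where(h < rng_, gamma, sill)

        lags = np.array([5.0, 10, 20, 30, 40, 60, 80])              # illustrative lags
        gamma_emp = np.array([0.8, 1.4, 2.4, 3.0, 3.3, 3.5, 3.5])   # illustrative values
        (nugget, sill, rng_), _ = curve_fit(spherical, lags, gamma_emp, p0=[0.5, 3.5, 50.0])
        print(nugget, sill, rng_)    # fitted parameters then feed the kriging system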

  3. Vapor space characterization of waste Tank 241-C-109 (in situ): Results from samples collected on 6/23/94

    International Nuclear Information System (INIS)

    Clauss, T.W.; Ligotke, M.W.; Pool, K.H.; Lucke, R.B.; McVeety, B.D.; Sharma, A.K.; McCulloch, M.; Fruchter, J.S.; Goheen, S.C.

    1995-10-01

    This report describes organic analyses results from in situ samples obtained from the headspace of the Hanford waste storage Tank 241-C-109 (referred to as Tank C-109). The results described here were obtained to support safety and toxicological evaluations. Organic compounds were quantitatively determined. Thirteen organic tentatively identified compounds (TICs) were observed above the detection limit of (ca.) 10 ppbv, but standards for most of these were not available at the time of analysis, and the reported concentrations are semiquantitative estimates. In addition, the authors looked for the 40 standard TO-14 analytes. Of these, only one was observed above the 2-ppbv calibrated instrumental detection limit. However, even though the values for dichlorodifluoromethane and trichlorofluoromethane are below the instrumental detection limit, they are believed to be accurate at these low concentrations. The six analytes account for approximately 100% of the total organic components in Tank C-109. These six organic analytes with the highest estimated concentrations are listed in Summary Table 1. Detailed descriptions of the results appear in the text

  4. Designing urban spaces and buildings to improve sustainability and quality of life in a warmer world

    International Nuclear Information System (INIS)

    Smith, Claire; Levermore, Geoff

    2008-01-01

    It is in cities that the negative impacts of a warming climate will be felt most strongly. The summer time comfort and well-being of the urban population will become increasingly compromised under future scenarios for climate change and urbanisation. In contrast to rural areas, where night-time relief from high daytime temperatures occurs as heat is lost to the sky, the city environment stores and traps heat and offers little respite from high temperatures. This urban heat island effect is responsible for temperature differences of up to 7 deg. C between cities and the country in the UK. We already have experience of the potential hazards of these higher temperatures. The majority of heat-related fatalities during the summer of 2003 were in urban areas. This means that the cooling of the urban environment is a high priority for urban planners and designers. Proven ways of doing this include altering the urban microclimate by modifying its heat absorption and emission, for example through urban greening, the use of high-reflectivity materials, and by increasing openness to allow cooling winds. Buildings themselves can also deliver improved comfort and higher levels of sustainability by taking advantage of exemplary facade, glazing and ventilation designs. In addition, changed behaviour by building occupants can help keep urban areas cool. The technology to reduce the future vulnerability of city dwellers to thermal discomfort is already largely in existence. But there is a need for complementary policy and planning commitments to manage its implementation, especially in existing buildings and urban areas

  5. Technical innovation in dynamic contrast-enhanced magnetic resonance imaging of musculoskeletal tumors: an MR angiographic sequence using a sparse k-space sampling strategy.

    Science.gov (United States)

    Fayad, Laura M; Mugera, Charles; Soldatos, Theodoros; Flammang, Aaron; del Grande, Filippo

    2013-07-01

    We demonstrate the clinical use of an MR angiography sequence performed with sparse k-space sampling (MRA), as a method for dynamic contrast-enhanced (DCE)-MRI, and apply it to the assessment of sarcomas for treatment response. Three subjects with sarcomas (2 with osteosarcoma, 1 with high-grade soft tissue sarcomas) underwent MRI after neoadjuvant therapy/prior to surgery, with conventional MRI (T1-weighted, fluid-sensitive, static post-contrast T1-weighted sequences) and DCE-MRI (MRA, time resolution = 7-10 s, TR/TE 2.4/0.9 ms, FOV 40 cm(2)). Images were reviewed by two observers in consensus who recorded image quality (1 = diagnostic, no significant artifacts, 2 = diagnostic, 75 % with good response, >75 % with poor response). DCE-MRI findings were concordant with histological response (arterial enhancement with poor response, no arterial enhancement with good response). Unlike conventional DCE-MRI sequences, an MRA sequence with sparse k-space sampling is easily integrated into a routine musculoskeletal tumor MRI protocol, with high diagnostic quality. In this preliminary work, tumor enhancement characteristics by DCE-MRI were used to assess treatment response.

  6. Technical innovation in dynamic contrast-enhanced magnetic resonance imaging of musculoskeletal tumors: an MR angiographic sequence using a sparse k-space sampling strategy

    International Nuclear Information System (INIS)

    Fayad, Laura M.; Mugera, Charles; Grande, Filippo del; Soldatos, Theodoros; Flammang, Aaron

    2013-01-01

    We demonstrate the clinical use of an MR angiography sequence performed with sparse k-space sampling (MRA), as a method for dynamic contrast-enhanced (DCE)-MRI, and apply it to the assessment of sarcomas for treatment response. Three subjects with sarcomas (2 with osteosarcoma, 1 with high-grade soft tissue sarcomas) underwent MRI after neoadjuvant therapy/prior to surgery, with conventional MRI (T1-weighted, fluid-sensitive, static post-contrast T1-weighted sequences) and DCE-MRI (MRA, time resolution = 7-10 s, TR/TE 2.4/0.9 ms, FOV 40 cm 2 ). Images were reviewed by two observers in consensus who recorded image quality (1 = diagnostic, no significant artifacts, 2 = diagnostic, 75 % with good response, >75 % with poor response). DCE-MRI findings were concordant with histological response (arterial enhancement with poor response, no arterial enhancement with good response). Unlike conventional DCE-MRI sequences, an MRA sequence with sparse k-space sampling is easily integrated into a routine musculoskeletal tumor MRI protocol, with high diagnostic quality. In this preliminary work, tumor enhancement characteristics by DCE-MRI were used to assess treatment response. (orig.)

  7. Robust nonhomogeneous training samples detection method for space-time adaptive processing radar using sparse-recovery with knowledge-aided

    Science.gov (United States)

    Li, Zhihui; Liu, Hanwei; Zhang, Yongshun; Guo, Yiduo

    2017-10-01

    The performance of space-time adaptive processing (STAP) may degrade significantly when some of the training samples are contaminated by the signal-like components (outliers) in nonhomogeneous clutter environments. To remove the training samples contaminated by outliers in nonhomogeneous clutter environments, a robust nonhomogeneous training samples detection method using the sparse-recovery (SR) with knowledge-aided (KA) is proposed. First, the reduced-dimension (RD) overcomplete spatial-temporal steering dictionary is designed with the prior knowledge of system parameters and the possible target region. Then, the clutter covariance matrix (CCM) of cell under test is efficiently estimated using a modified focal underdetermined system solver (FOCUSS) algorithm, where a RD overcomplete spatial-temporal steering dictionary is applied. Third, the proposed statistics are formed by combining the estimated CCM with the generalized inner products (GIP) method, and the contaminated training samples can be detected and removed. Finally, several simulation results validate the effectiveness of the proposed KA-SR-GIP method.
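
    The screening stage can be summarized by the generalized inner product statistic: each training snapshot x_k is scored with eta_k = x_k^H R^-1 x_k and outliers are dropped. The sketch below uses a plain sample covariance in place of the paper's sparse-recovery (FOCUSS) estimate of the CCM, purely to keep the example short; the threshold is an assumption.

        # GIP-based nonhomogeneous training sample screening (simplified).
        import numpy as np

        def gip_screen(X, threshold=2.0):
            """X: (num_samples, dim) complex training snapshots; returns kept snapshots."""
            R = (X.conj().T @ X) / X.shape[0] + 1e-6 * np.eye(X.shape[1])
            eta = np.real(np.einsum("ki,ij,kj->k", X.conj(), np.linalg.inv(R), X))
            keep = eta < threshold * np.median(eta)     # flag snapshots with outsized GIP
            return X[keep], eta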

  8. Administration of recombinant interleukin-11 improves the hemodynamic functions and decreases third space fluid loss in a porcine model of hemorrhagic shock and resuscitation.

    Science.gov (United States)

    Honma, Kaneatsu; Koles, Nancy L; Alam, Hasan B; Rhee, Peter; Rollwagen, Florence M; Olsen, Cara; Keith, James C; Pollack, Matthew

    2005-06-01

    We have previously demonstrated that the administration of recombinant human interleukin-11 (rhIL-11) during resuscitation improves the blood pressure in a rodent model of hemorrhagic shock. The purpose of this study was to determine whether the effects of rhIL-11 could be reproduced in a large animal model and to elucidate the impact of rhIL-11 administration on the intravascular volume status and the degree of third space fluid loss after resuscitation. A 40% blood volume hemorrhage was induced in swine (n = 45, weight of 25-35 kg) followed by a 1-h shock period and resuscitation with 0.9% sodium chloride (three times the shed blood volume). The animals were randomized to receive sham hemorrhage (group I, sham); sham hemorrhage and 50 microg/kg rhIL-11 (group II, sham + IL-11); no drug (group III, saline); or 50 microg/kg rhIL-11 (group IV, IL-11). Blood and urine samples were obtained and analyzed at baseline, at the end of hemorrhaging, and thereafter once every hour. The pleural and peritoneal effusions were precisely quantified by using clinically accepted criteria. The mean arterial pressure (MAP) was higher postresuscitation (PR) in groups I, II, and IV (71.4 +/- 7.5 mmHg, 71.0 +/- 8.9 mmHg, and 72.9 +/- 12.3 mmHg, respectively) than in group III (59.9 +/- 10.9 mmHg), and the cardiac output of PR was higher in group IV (3.46 +/- 0.56 L/min) than in group III (2.99 +/- 0.62 L/min; P < 0.01). The difference in MAP between groups I and II became statistically significant at 40 min after rhIL-11 injection and such a difference persisted for 90 min. After resuscitation, the urine output was higher, and the urine specific gravity and third space fluid loss were lower in group IV (1434 +/- 325 mL and 1.0035, 82 +/- 21 mL) than in group III (958 +/- 390 mL and 1.0053, 125 +/- 32 mL; P < 0.05). In a porcine model of hemorrhagic shock, the administration of rhIL-11 at the start of resuscitation significantly improved the cardiac output and blood pressure. This

  9. Improving 3d Spatial Queries Search: Newfangled Technique of Space Filling Curves in 3d City Modeling

    Science.gov (United States)

    Uznir, U.; Anton, F.; Suhaibah, A.; Rahman, A. A.; Mioc, D.

    2013-09-01

    The advantages of three dimensional (3D) city models can be seen in various applications including photogrammetry, urban and regional planning, computer games, etc.. They expand the visualization and analysis capabilities of Geographic Information Systems on cities, and they can be developed using web standards. However, these 3D city models consume much more storage compared to two dimensional (2D) spatial data. They involve extra geometrical and topological information together with semantic data. Without a proper spatial data clustering method and its corresponding spatial data access method, retrieving portions of and especially searching these 3D city models, will not be done optimally. Even though current developments are based on an open data model allotted by the Open Geospatial Consortium (OGC) called CityGML, its XML-based structure makes it challenging to cluster the 3D urban objects. In this research, we propose an opponent data constellation technique of space-filling curves (3D Hilbert curves) for 3D city model data representation. Unlike previous methods, that try to project 3D or n-dimensional data down to 2D or 3D using Principal Component Analysis (PCA) or Hilbert mappings, in this research, we extend the Hilbert space-filling curve to one higher dimension for 3D city model data implementations. The query performance was tested using a CityGML dataset of 1,000 building blocks and the results are presented in this paper. The advantages of implementing space-filling curves in 3D city modeling will improve data retrieval time by means of optimized 3D adjacency, nearest neighbor information and 3D indexing. The Hilbert mapping, which maps a subinterval of the [0, 1] interval to the corresponding portion of the d-dimensional Hilbert's curve, preserves the Lebesgue measure and is Lipschitz continuous. Depending on the applications, several alternatives are possible in order to cluster spatial data together in the third dimension compared to its
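
    To illustrate the clustering idea (store and index 3D objects in the order given by a space-filling curve so that spatial neighbours stay close on disk), the sketch below sorts building centroids by a Morton (Z-order) key; this is a simpler stand-in for the 3D Hilbert curve proposed in the abstract, which preserves locality better but takes more code to implement. The building names and coordinates are invented.

        # Order 3D building centroids along a space-filling curve (Z-order stand-in).
        def morton3d(x, y, z, bits=10):
            """Interleave the bits of integer coordinates in [0, 2**bits)."""
            key = 0
            for i in range(bits):
                key |= ((x >> i) & 1) << (3 * i)
                key |= ((y >> i) & 1) << (3 * i + 1)
                key |= ((z >> i) & 1) << (3 * i + 2)
            return key

        buildings = [("b12", 101, 57, 3), ("b07", 99, 60, 2), ("b31", 500, 12, 9)]
        buildings.sort(key=lambda b: morton3d(*b[1:]))   # storage / index order
        print([b[0] for b in buildings])                 # nearby buildings end up adjacent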

  10. High spatial resolution 3D MR cholangiography with high sampling efficiency technique (SPACE): Comparison of 3 T vs. 1.5 T

    Energy Technology Data Exchange (ETDEWEB)

    Arizono, Shigeki [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: arizono@kuhp.kyoto-u.ac.jp; Isoda, Hiroyoshi [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: sayuki@kuhp.kyoto-u.ac.jp; Maetani, Yoji S. [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: mbo@kuhp.kyoto-u.ac.jp; Hirokawa, Yuusuke [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: yuusuke@kuhp.kyoto-u.ac.jp; Shimada, Kotaro [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: kotaro@kuhp.kyoto-u.ac.jp; Nakamoto, Yuji [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: ynakamo1@kuhp.kyoto-u.ac.jp; Shibata, Toshiya [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: ksj@kuhp.kyoto-u.ac.jp; Togashi, Kaori [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan)], E-mail: ktogashi@kuhp.kyoto-u.ac.jp

    2010-01-15

    Purpose: The aim of this study was to evaluate image quality of 3D MR cholangiography (MRC) using high sampling efficiency technique (SPACE) at 3 T compared with 1.5 T. Methods and materials: An IRB approved prospective study was performed with 17 healthy volunteers using both 3 and 1.5 T MR scanners. MRC images were obtained with free-breathing navigator-triggered 3D T2-weighted turbo spin-echo sequence with SPACE (TR, >2700 ms; TE, 780 ms at 3 T and 801 ms at 1.5 T; echo-train length, 121; voxel size, 1.1 mm x 1.0 mm x 0.84 mm). The common bile duct (CBD) to liver contrast-to-noise ratios (CNRs) were compared between 3 and 1.5 T. A five-point scale was used to compare overall image quality and visualization of the third branches of bile duct (B2, B6, and B8). The depiction of cystic duct insertion and the highest order of bile duct visible were also compared. The results were compared using the Wilcoxon signed-ranks test. Results: CNR between the CBD and liver was significantly higher at 3 T than 1.5 T (p = 0.0006). MRC at 3 T showed a significantly higher overall image quality (p = 0.0215) and clearer visualization of B2 (p = 0.0183) and B6 (p = 0.0106) than at 1.5 T. In all analyses of duct visibility, 3 T showed higher scores than 1.5 T. Conclusion: 3 T MRC using SPACE offered better image quality than 1.5 T. SPACE technique facilitated high-resolution 3D MRC with excellent image quality at 3 T.
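
    The two quantitative comparisons described above amount to a per-volunteer contrast-to-noise ratio calculation and a paired Wilcoxon signed-ranks test; a small sketch with invented signal and noise numbers follows.

        # CBD-to-liver contrast-to-noise ratio and paired comparison of field strengths.
        import numpy as np
        from scipy.stats import wilcoxon

        def cnr(signal_cbd, signal_liver, noise_sd):
            return (signal_cbd - signal_liver) / noise_sd

        cnr_3t = np.array([cnr(520, 110, 14), cnr(480, 95, 12), cnr(555, 120, 15),
                           cnr(500, 100, 13), cnr(530, 115, 14)])
        cnr_15t = np.array([cnr(300, 100, 13), cnr(280, 90, 12), cnr(310, 105, 14),
                            cnr(290, 95, 13), cnr(305, 100, 12)])
        stat, p = wilcoxon(cnr_3t, cnr_15t)   # paired, per-volunteer comparison
        print(stat, p)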

  11. Application of space and aviation technology to improve the safety and reliability of nuclear power plant operations. Final report

    International Nuclear Information System (INIS)

    1980-04-01

    This report investigates various technologies that have been developed and utilized by the aerospace community, particularly the National Aeronautics and Space Administration (NASA) and the aviation industry, that would appear to have some potential for contributing to improved operational safety and reliability at commercial nuclear power plants of the type being built and operated in the United States today. The main initiator for this study, as well as many others, was the accident at the Three Mile Island (TMI) nuclear power plant in March 1979. Transfer and application of technology developed by NASA, as well as other public and private institutions, may well help to decrease the likelihood of similar incidents in the future

  12. Increasing Genome Sampling and Improving SNP Genotyping for Genotyping-by-Sequencing with New Combinations of Restriction Enzymes.

    Science.gov (United States)

    Fu, Yong-Bi; Peterson, Gregory W; Dong, Yibo

    2016-04-07

    Genotyping-by-sequencing (GBS) has emerged as a useful genomic approach for exploring genome-wide genetic variation. However, GBS commonly samples a genome unevenly and can generate a substantial amount of missing data. These technical features would limit the power of various GBS-based genetic and genomic analyses. Here we present software called IgCoverage for in silico evaluation of genomic coverage through GBS with an individual or pair of restriction enzymes on one sequenced genome, and report a new set of 21 restriction enzyme combinations that can be applied to enhance GBS applications. These enzyme combinations were developed through an application of IgCoverage on 22 plant, animal, and fungus species with sequenced genomes, and some of them were empirically evaluated with different runs of Illumina MiSeq sequencing in 12 plant species. The in silico analysis of 22 organisms revealed up to eight times more genome coverage for the new combinations consisting of paired four- or five-cutter restriction enzymes than for the commonly used enzyme combination PstI + MspI. The empirical evaluation of the new enzyme combination (HinfI + HpyCH4IV) in 12 plant species showed 1.7-6 times more genome coverage than PstI + MspI, and 2.3 times more genome coverage in dicots than monocots. Also, the SNP genotyping in 12 Arabidopsis and 12 rice plants revealed that HinfI + HpyCH4IV generated 7 and 1.3 times more SNPs (with 0-16.7% missing observations) than PstI + MspI, respectively. These findings demonstrate that these novel enzyme combinations can be utilized to increase genome sampling and improve SNP genotyping in various GBS applications. Copyright © 2016 Fu et al.
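
    As a rough illustration of the in silico coverage evaluation, the sketch below finds the recognition sites of an enzyme pair in a toy sequence and keeps the fragments between consecutive cut sites that are short enough to sequence; a full GBS pipeline would additionally require a different enzyme at each fragment end and the exact cut offsets within each site, both omitted here for brevity.

        # Simplified in-silico double digest (HinfI = GANTC, HpyCH4IV = ACGT).
        import re

        ENZYMES = {"HinfI": "GA[ACGT]TC", "HpyCH4IV": "ACGT"}

        def candidate_fragments(genome, enz_a, enz_b, max_len=400):
            # Match starts are used as approximate cut positions.
            cuts = sorted({0, len(genome)}
                          | {m.start() for m in re.finditer(ENZYMES[enz_a], genome)}
                          | {m.start() for m in re.finditer(ENZYMES[enz_b], genome)})
            return [(s, e) for s, e in zip(cuts, cuts[1:]) if 0 < e - s <= max_len]

        genome = "ACGTGATTCAAACGTTTGACTCGGGACGTAAGAGTC" * 20   # toy sequence
        print(len(candidate_fragments(genome, "HinfI", "HpyCH4IV")))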

  13. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  14. Periodic TiO2 Nanostructures with Improved Aspect and Line/Space Ratio Realized by Colloidal Photolithography Technique

    Directory of Open Access Journals (Sweden)

    Loïc Berthod

    2017-10-01

    Full Text Available This paper presents substantial improvements of the colloidal photolithography technique (also called microsphere lithography with the goal of better controlling the geometry of the fabricated nano-scale structures—in this case, hexagonally arranged nanopillars—printed in a layer of directly photopatternable sol-gel TiO2. Firstly, to increase the achievable structure height the photosensitive layer underneath the microspheres is deposited on a reflective layer instead of the usual transparent substrate. Secondly, an increased width of the pillars is achieved by tilting the incident wave and using multiple exposures or substrate rotation, additionally allowing to better control the shape of the pillar’s cross section. The theoretical analysis is carried out by rigorous modelling of the photonics nanojet underneath the microspheres and by optimizing the experimental conditions. Aspect ratios (structure height/lateral structure size greater than 2 are predicted and demonstrated experimentally for structure dimensions in the sub micrometer range, as well as line/space ratios (lateral pillar size/distance between pillars greater than 1. These nanostructures could lead for example to materials exhibiting efficient light trapping in the visible and near-infrared range, as well as improved hydrophobic or photocatalytic properties for numerous applications in environmental and photovoltaic systems.

  15. Operations research to add postpartum family planning to maternal and neonatal health to improve birth spacing in Sylhet District, Bangladesh.

    Science.gov (United States)

    Ahmed, Salahuddin; Norton, Maureen; Williams, Emma; Ahmed, Saifuddin; Shah, Rasheduzzaman; Begum, Nazma; Mungia, Jaime; Lefevre, Amnesty; Al-Kabir, Ahmed; Winch, Peter J; McKaig, Catharine; Baqui, Abdullah H

    2013-08-01

    Short birth intervals are associated with increased risk of adverse maternal and neonatal health (MNH) outcomes. Improving postpartum contraceptive use is an important programmatic strategy to improve the health and well-being of women, newborns, and children. This article documents the intervention package and evaluation design of a study conducted in a rural district of Bangladesh to evaluate the effects of an integrated, community-based MNH and postpartum family planning program on contraceptive use and birth-interval lengths. The study integrated family planning counseling within 5 community health worker (CHW)-household visits to pregnant and postpartum women, while a community mobilizer (CM) led community meetings on the importance of postpartum family planning and pregnancy spacing for maternal and child health. The CM and the CHWs emphasized 3 messages: (1) Use of the Lactational Amenorrhea Method (LAM) during the first 6 months postpartum and transition to another modern contraceptive method; (2) Exclusive, rather than fully or nearly fully, breastfeeding to support LAM effectiveness and good infant breastfeeding practices; (3) Use of a modern contraceptive method after a live birth for at least 24 months before attempting another pregnancy (a birth-to-birth interval of about 3 years) to support improved infant health and nutrition. CHWs provided only family planning counseling in the original study design, but we later added community-based distribution of methods, and referrals for clinical methods, to meet women's demand. Using a quasi-experimental design, and relying primarily on pre/post-household surveys, we selected pregnant women from 4 unions to receive the intervention (n = 2,280) and pregnant women from 4 other unions (n = 2,290) to serve as the comparison group. Enrollment occurred between 2007 and 2009, and data collection ended in January 2013. Formative research showed that women and their family members generally did not perceive

  16. Evaluation of Total Nitrite Pattern Visualization as an Improved Method for Gunshot Residue Detection and its Application to Casework Samples.

    Science.gov (United States)

    Berger, Jason; Upton, Colin; Springer, Elyah

    2018-04-23

    Visualization of nitrite residues is essential in gunshot distance determination. Current protocols for the detection of nitrites include, among other tests, the Modified Griess Test (MGT). This method is limited because nitrite residues are unstable in the environment and restricted to partially burned gunpowder. Previous research demonstrated the ability of alkaline hydrolysis to convert nitrates to nitrites, allowing visualization of unburned gunpowder particles using the MGT. This is referred to as Total Nitrite Pattern Visualization (TNV). TNV techniques were modified and a study conducted to streamline the procedure outlined in the literature and maximize the efficacy of the TNV in casework, while reducing the required time from 1 h to 5 min and enhancing effectiveness on blood-soiled samples. The TNV method was found to provide a significant improvement in the ability to detect nitrite residues, without sacrificing efficiency, allowing for the determination of the muzzle-to-target distance. © 2018 American Academy of Forensic Sciences.

  17. Improved diffusion technique for 15N:14N analysis of ammonium and nitrate from aqueous samples by stable isotope spectrometry

    International Nuclear Information System (INIS)

    Goerges, T.; Dittert, K.

    1998-01-01

    Nitrogen (N) isotope ratio mass spectrometry (IRMS) by Dumas combustion and continuous flow mass spectrometry has become a widespread tool for studies of N turnover. The speed and labor efficiency of 15N determinations from aqueous solutions such as soil solutions or soil extracts are often limited by sample preparation. Several procedures for the conversion of dissolved ammonium (NH4+) or nitrate (NO3-) to gaseous ammonia and its subsequent trapping in acidified traps have been elaborated in the last decades. They are based on the use of acidified filters kept either above the respective solution or in floating PTFE envelopes. In this paper, we present an improved diffusion method with a fixed PTFE trap. The diffusion containers are continuously kept in a vertical rotary shaker. Quantitative diffusion can thus be achieved in only three days. For solutions with NH4+ levels of only 1 mg N kg-1 and NO3- concentrations of 12 mg N kg-1, recovery rates of 98.8-102% were obtained. By addition of 15N-labeled and non-labeled NH4+ and NO3-, it was shown that no cross-contamination from NH4+ to NO3- or vice versa takes place, even when one form is labeled to more than 1 atom % 15N while the other form has natural 15N abundance. The method requires no intermediate step of ammonia volatilization before NO3- conversion

  18. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    Science.gov (United States)

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

    Temporal and spatial variability in streams result in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions with similar effort as used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.
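
    The adjustment of catch data by modeled capture probability amounts to dividing the observed catch by the predicted proportion of available fish that the gear identifies. The following is a minimal sketch of that adjustment only; the reach names, catches and capture probabilities are hypothetical and are not taken from the study.

    # Minimal sketch: adjust reach-level catch by a modeled capture probability
    # to approximate absolute abundance. All numbers are hypothetical.
    catch_by_reach = {"reach_1": 42, "reach_2": 17, "reach_3": 63}        # fish counted
    p_capture = {"reach_1": 0.35, "reach_2": 0.22, "reach_3": 0.51}       # model-predicted capture probability

    def adjusted_abundance(catch, p):
        """Estimate absolute abundance as catch divided by capture probability."""
        if not 0 < p <= 1:
            raise ValueError("capture probability must be in (0, 1]")
        return catch / p

    for reach, c in catch_by_reach.items():
        n_hat = adjusted_abundance(c, p_capture[reach])
        print(f"{reach}: catch={c}, p={p_capture[reach]:.2f}, N_hat={n_hat:.0f}")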

  19. Fully three-dimensional reconstruction from data collected on concentric cubes in Fourier space: implementation and a sample application to MRI [magnetic resonance imaging

    International Nuclear Information System (INIS)

    Herman, G.T.; Roberts, D.; Axel, L.

    1992-01-01

    An algorithm is proposed for rapid and accurate reconstruction from data collected in Fourier space at points arranged on a grid of concentric cubes. The whole process has computational complexity of the same order as required for the 3D fast Fourier transform and so (for medically relevant sizes of the data set) it is faster than backprojection into the same size rectangular grid. The design of the algorithm ensures that no interpolations are needed, in contrast to methods involving backprojection with their unavoidable interpolations. As an application, a 3D data collection method for MRI has been designed which directly samples the Fourier transform of the object to be reconstructed on concentric cubes as needed for the algorithm. (author)
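
    For orientation only, the Cartesian baseline against which such methods are compared is a plain inverse 3D FFT of gridded k-space data. The sketch below shows that baseline step, not the concentric-cube algorithm itself; the 64-cubed test volume is hypothetical.

    import numpy as np

    def cartesian_recon(kspace):
        """Inverse 3D FFT of a fully sampled Cartesian k-space volume."""
        return np.fft.fftshift(np.fft.ifftn(np.fft.ifftshift(kspace)))

    # Hypothetical test object: a single point at the volume centre.
    obj = np.zeros((64, 64, 64), dtype=complex)
    obj[32, 32, 32] = 1.0
    kspace = np.fft.fftshift(np.fft.fftn(np.fft.ifftshift(obj)))
    print(np.allclose(cartesian_recon(kspace), obj, atol=1e-10))  # True: exact round trip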

  20. Laser-induced breakdown spectroscopy for space exploration applications: Influence of the ambient pressure on the calibration curves prepared from soil and clay samples

    International Nuclear Information System (INIS)

    Salle, Beatrice; Cremers, David A.; Maurice, Sylvestre; Wiens, Roger C.

    2005-01-01

    Recently, there has been an increasing interest in the laser-induced breakdown spectroscopy (LIBS) technique for stand-off detection of geological samples for use on landers and rovers on Mars, and for other space applications. For space missions, LIBS analysis capabilities must be investigated and instrumental development is required to take into account constraints such as size, weight, power and the effect of the environmental atmosphere (pressure and ambient gas) on flight instrument performance. In this paper, we study the in-situ LIBS method at reduced pressure (7 Torr CO2 to simulate the Martian atmosphere) and near vacuum (50 mTorr in air to begin to simulate the pressure at the Moon or asteroids) as well as at atmospheric pressure in air (for Earth conditions and comparison). Here in-situ corresponds to distances on the order of 150 mm, in contrast to stand-off analysis at distances of many meters. We show the influence of the ambient pressure on the calibration curves prepared from certified soil and clay pellets. In order to detect simultaneously all the elements commonly observed in terrestrial soils, we used an Echelle spectrograph. The results are discussed in terms of calibration curves, measurement precision, plasma light collection system efficiency and matrix effects.

  1. Sodium magnetic resonance imaging. Development of a 3D radial acquisition technique with optimized k-space sampling density and high SNR-efficiency

    International Nuclear Information System (INIS)

    Nagel, Armin Michael

    2009-01-01

    A 3D radial k-space acquisition technique with homogeneous distribution of the sampling density (DA-3D-RAD) is presented. This technique enables the short echo times (TE) required for 23Na-MRI and provides a high SNR-efficiency. The gradients of the DA-3D-RAD sequence are designed such that the average sampling density in each spherical shell of k-space is constant. The DA-3D-RAD sequence provides 34% more SNR than a conventional 3D radial sequence (3D-RAD) if T2*-decay is neglected. This SNR gain is enhanced if T2*-decay is present, so a 1.5 to 1.8 fold higher SNR is measured in brain tissue with the DA-3D-RAD sequence. Simulations and experimental measurements show that the DA-3D-RAD sequence yields a better resolution in the presence of T2*-decay and fewer image artefacts when B0-inhomogeneities exist. Using the developed sequence, T1-, T2*- and inversion-recovery 23Na image contrasts were acquired for several organs and 23Na relaxation times were measured (brain tissue: T1 = 29.0 ± 0.3 ms; T2s* ≈ 4 ms; T2l* ≈ 31 ms; cerebrospinal fluid: T1 = 58.1 ± 0.6 ms; T2* = 55 ± 3 ms (B0 = 3 T)). T1- and T2*-relaxation times of cerebrospinal fluid are independent of the selected magnetic field strength (B0 = 3 T/7 T), whereas the relaxation times of brain tissue increase with field strength. Furthermore, 23Na signals of oedemata were suppressed in patients and thus signals from different tissue compartments were selectively measured. (orig.)
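
    A rough sketch of the density-adaptation idea (our own schematic reasoning, not equations taken from the thesis): with a fixed number of projections N_p and constant dwell time, the sampling density in a spherical k-space shell of radius k is set by how fast the trajectory crosses that shell, so demanding a constant shell-averaged density constrains the readout gradient:

    \[
      \rho(k) \;\propto\; \frac{N_p}{4\pi k^2\,\lvert \mathrm{d}k/\mathrm{d}t \rvert} \;=\; \text{const.}
      \quad\Longrightarrow\quad
      \frac{\mathrm{d}k}{\mathrm{d}t} \;\propto\; \frac{1}{k^2},
      \qquad
      G(t) \;=\; \frac{1}{\gamma}\,\frac{\mathrm{d}k}{\mathrm{d}t} \;\propto\; \frac{1}{k(t)^2}
      \;\;\Longrightarrow\;\;
      k(t) \;\propto\; t^{1/3}.
    \]

    In words: after an initial gradient ramp the readout gradient is ramped down so that k grows only as t to the power 1/3, spending proportionally more time in the outer shells and equalizing the sampling density.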

  2. Improving the modelling of redshift-space distortions - I. A bivariate Gaussian description for the galaxy pairwise velocity distributions

    Science.gov (United States)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2015-01-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such function can be described as a superposition of virtually infinite local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and dispersion σ. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and non-linear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation as observed in simulations and data. Also, the recently proposed single-Gaussian description of RSD is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. We also show how this description naturally allows for the Taylor expansion of 1 + ξS(s) around 1 + ξR(r), which leads to the Kaiser linear formula when truncated to second order, explicating its connection with the moments of the velocity distribution functions. More work is needed, but these results indicate a very promising path to make definitive progress in our programme to improve RSD estimators.
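
    For orientation, the streaming model referred to above can be written schematically as follows (simplified notation, velocities expressed in comoving-distance units; this is a sketch of the standard formulation, not text quoted from the paper):

    \[
      1 + \xi_S(s_\perp, s_\parallel)
        \;=\; \int \mathrm{d}r_\parallel \,\bigl[1 + \xi_R(r)\bigr]\,
              \mathcal{P}\!\left(v_\parallel = s_\parallel - r_\parallel \,\middle|\, \mathbf{r}\right),
      \qquad r = \sqrt{s_\perp^2 + r_\parallel^2},
    \]
    \[
      \mathcal{P}(v_\parallel \mid \mathbf{r})
        \;=\; \int \mathrm{d}\mu\, \mathrm{d}\sigma\;
              \mathcal{N}\!\left(v_\parallel;\, \mu, \sigma^2\right)\,
              \mathcal{F}(\mu, \sigma \mid r),
    \]

    where F is the bivariate Gaussian in (mu, sigma) described in the abstract; collapsing F to a Dirac delta recovers the single-Gaussian description mentioned there.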

  3. Improved Models and Tools for Prediction of Radiation Effects on Space Electronics in Wide Temperature Range, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — All NASA exploration systems operate in the extreme environments of space (Moon, Mars, etc.) and require reliable electronics capable of handling a wide temperature...

  4. Improved Understanding of Space Radiation Effects on Exploration Electronics by Advanced Modeling of Nanoscale Devices and Novel Materials, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Future NASA space exploration missions will use nanometer-scale electronic technologies which call for a shift in how radiation effects in such devices and materials...

  5. Improved Models and Tools for Prediction of Radiation Effects on Space Electronics in Wide Temperature Range, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — All NASA exploration systems operate in the extreme environments of space and require reliable electronics capable of handling a wide temperature range (-180ºC to...

  6. An improved method of sample preparation on AnchorChip targets for MALDI-MS and MS/MS and its application in the liver proteome project

    DEFF Research Database (Denmark)

    Zhang, Xumin; Shi, Liang; Shu, Shaokung

    2007-01-01

    An improved method for sample preparation for MALDI-MS and MS/MS using AnchorChip targets is presented. The method, termed the SMW method (sample, matrix wash), results in better sensitivity for peptide mass fingerprinting as well as for sequencing by MS/MS than previously published methods. The ...

  7. Effect of interaction between irradiation-induced defects and intrinsic defects in the pinning improvement of neutron irradiated YBaCuO sample

    International Nuclear Information System (INIS)

    Topal, Ugur; Sozeri, Huseyin; Yavuz, Hasbi

    2004-01-01

    Interaction between the intrinsic (native) defects and the irradiation-induced defects created by neutron irradiation was examined for the YBCO sample. For this purpose, the non-superconducting Y-211 phase was added to the Y-123 samples at different contents as a source of large pinning centers. The enhancement of the critical current density with irradiation for these samples was analysed, and the role of the defects in the pinning improvement is then discussed.

  8. Effect of interaction between irradiation-induced defects and intrinsic defects in the pinning improvement of neutron irradiated YBaCuO sample

    Energy Technology Data Exchange (ETDEWEB)

    Topal, Ugur; Sozeri, Huseyin; Yavuz, Hasbi

    2004-08-01

    Interaction between the intrinsic (native) defects and the irradiation-induced defects created by neutron irradiation was examined for the YBCO sample. For this purpose, the non-superconducting Y-211 phase was added to the Y-123 samples at different contents as a source of large pinning centers. The enhancement of the critical current density with irradiation for these samples was analysed, and the role of the defects in the pinning improvement is then discussed.

  9. An improved Bayesian tensor regularization and sampling algorithm to track neuronal fiber pathways in the language circuit.

    Science.gov (United States)

    Mishra, Arabinda; Anderson, Adam W; Wu, Xi; Gore, John C; Ding, Zhaohua

    2010-08-01

    The purpose of this work is to design a neuronal fiber tracking algorithm, which will be more suitable for reconstruction of fibers associated with functionally important regions in the human brain. The functional activations in the brain normally occur in the gray matter regions. Hence the fibers bordering these regions are weakly myelinated, resulting in poor performance of conventional tractography methods in tracing the fiber links between them. A lower fractional anisotropy in this region makes it even more difficult to track the fibers in the presence of noise. In this work, the authors focused on a stochastic approach to reconstruct these fiber pathways based on a Bayesian regularization framework. To estimate the true fiber direction (propagation vector), the a priori and conditional probability density functions are calculated in advance and are modeled as multivariate normal. The variance of the estimated tensor element vector is associated with the uncertainty due to noise and partial volume averaging (PVA). Adaptive, multiple sampling of the estimated tensor element vector, as a function of the pre-estimated variance, overcomes the effect of noise and PVA in this work. The algorithm has been rigorously tested using a variety of synthetic data sets. The quantitative comparison of the results to standard algorithms motivated the authors to implement it for in vivo DTI data analysis. The algorithm has been implemented to delineate fibers in two major language pathways (Broca's to SMA and Broca's to Wernicke's) across 12 healthy subjects. Though the mean standard deviation was marginally larger than that of the conventional (Euler's) approach [P. J. Basser et al., "In vivo fiber tractography using DT-MRI data," Magn. Reson. Med. 44(4), 625-632 (2000)], the number of extracted fibers in this approach was significantly higher. The authors also compared the performance of the proposed method to Lu's method [Y. Lu et al., "Improved fiber tractography with Bayesian
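
    The general flavor of stochastic streamline propagation can be illustrated with a toy loop in which the local principal direction is perturbed at every step by noise whose spread reflects the local uncertainty, and many streamlines are launched from one seed. This is only an illustrative sketch, not the authors' Bayesian regularization scheme; the direction field and uncertainty map below are synthetic stand-ins.

    import numpy as np

    rng = np.random.default_rng(0)

    def principal_direction(p):
        """Synthetic stand-in for the locally estimated fiber direction at point p."""
        d = np.array([1.0, 0.2 * np.sin(0.5 * p[0]), 0.0])
        return d / np.linalg.norm(d)

    def local_uncertainty(p):
        """Synthetic stand-in for the direction uncertainty (e.g., from noise and PVA)."""
        return 0.05 + 0.02 * abs(np.sin(p[1]))

    def track(seed, n_steps=100, step=0.5):
        """Propagate one streamline, sampling a perturbed direction at every step."""
        p = np.array(seed, dtype=float)
        path = [p.copy()]
        for _ in range(n_steps):
            d = principal_direction(p) + rng.normal(0.0, local_uncertainty(p), size=3)
            d /= np.linalg.norm(d)
            p = p + step * d
            path.append(p.copy())
        return np.array(path)

    streamlines = [track(seed=(0.0, 0.0, 0.0)) for _ in range(20)]  # a bundle of sampled paths
    print(len(streamlines), streamlines[0].shape)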

  10. Social Connectedness and Life Satisfaction: Comparing Mean Levels for 2 Undergraduate Samples and Testing for Improvement Based on Brief Counseling

    Science.gov (United States)

    Blau, Gary; DiMino, John; DeMaria, Peter A.; Beverly, Clyde; Chessler, Marcy; Drennan, Rob

    2016-01-01

    Objectives: Comparing the mean levels of social connectedness and life satisfaction, and analyzing their relationship for 2 undergraduate samples, and testing for an increase in their means for a brief counseling sample. Participants: Between October 2013 and May 2015, 3 samples were collected: not-in-counseling (NIC; n = 941), initial counseling…

  11. k-space sampling optimization for ultrashort TE imaging of cortical bone: Applications in radiation therapy planning and MR-based PET attenuation correction

    International Nuclear Information System (INIS)

    Hu, Lingzhi; Traughber, Melanie; Su, Kuan-Hao; Pereira, Gisele C.; Grover, Anu; Traughber, Bryan; Muzic, Raymond F. Jr.

    2014-01-01

    Purpose: The ultrashort echo-time (UTE) sequence is a promising MR pulse sequence for imaging cortical bone which is otherwise difficult to image using conventional MR sequences and also poses strong attenuation for photons in radiation therapy and PET imaging. The authors report here a systematic characterization of cortical bone signal decay and a scanning time optimization strategy for the UTE sequence through k-space undersampling, which can result in up to a 75% reduction in acquisition time. Using the undersampled UTE imaging sequence, the authors also attempted to quantitatively investigate the MR properties of cortical bone in healthy volunteers, thus demonstrating the feasibility of using such a technique for generating bone-enhanced images which can be used for radiation therapy planning and attenuation correction with PET/MR. Methods: An angularly undersampled, radially encoded UTE sequence was used for scanning the brains of healthy volunteers. Quantitative MR characterization of tissue properties, including water fraction and R2* = 1/T2*, was performed by analyzing the UTE images acquired at multiple echo times. The impact of different sampling rates was evaluated through systematic comparison of the MR image quality, bone-enhanced image quality, image noise, water fraction, and R2* of cortical bone. Results: A reduced angular sampling rate of the UTE trajectory achieves acquisition durations in proportion to the sampling rate and in as short as 25% of the time required for full sampling using a standard Cartesian acquisition, while preserving unique MR contrast within the skull at the cost of a minimal increase in noise level. The R2* of human skull was measured as 0.2–0.3 ms⁻¹ depending on the specific region, which is more than ten times greater than the R2* of soft tissue. The water fraction in human skull was measured to be 60%–80%, which is significantly less than the >90% water fraction in brain. High-quality, bone
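
    The R2* characterization described above rests on fitting a mono-exponential decay S(TE) = S0 * exp(-R2* * TE) to images acquired at several echo times. Below is a minimal sketch of such a fit; the echo times and signal values are hypothetical (chosen near the cortical-bone R2* range quoted in the abstract), and a simple log-linear least-squares fit stands in for whatever fitting procedure the authors actually used.

    import numpy as np

    def fit_r2star(te_ms, signal):
        """Log-linear least-squares fit of S(TE) = S0 * exp(-R2* * TE)."""
        te = np.asarray(te_ms, dtype=float)
        y = np.log(np.asarray(signal, dtype=float))
        slope, intercept = np.polyfit(te, y, 1)
        return -slope, np.exp(intercept)            # R2* in 1/ms, S0 in signal units

    te_ms = [0.07, 1.0, 2.0, 4.0]                   # hypothetical echo times (ms)
    true_r2s, s0 = 0.25, 100.0                      # hypothetical bone-like R2* and S0
    signal = [s0 * np.exp(-true_r2s * te) for te in te_ms]
    r2s_hat, s0_hat = fit_r2star(te_ms, signal)
    print(f"R2* = {r2s_hat:.3f} /ms, S0 = {s0_hat:.1f}")   # approx. 0.250 /ms, 100.0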

  12. Multi-domain computerized cognitive training program improves performance of bookkeeping tasks: a matched-sampling active-controlled trial.

    Science.gov (United States)

    Lampit, Amit; Ebster, Claus; Valenzuela, Michael

    2014-01-01

    Cognitive skills are important predictors of job performance, but the extent to which computerized cognitive training (CCT) can improve job performance in healthy adults is unclear. We report, for the first time, that a CCT program aimed at attention, memory, reasoning and visuo-spatial abilities can enhance productivity in healthy younger adults on bookkeeping tasks with high relevance to real-world job performance. 44 business students (77.3% female, mean age 21.4 ± 2.6 years) were assigned to either (a) 20 h of CCT, or (b) 20 h of computerized arithmetic training (active control) by a matched sampling procedure. Both interventions were conducted over a period of 6 weeks, 3-4 1-h sessions per week. Transfer of skills to performance on a 60-min paper-based bookkeeping task was measured at three time points-baseline, after 10 h and after 20 h of training. Repeated measures ANOVA found a significant Group X Time effect on productivity (F = 7.033, df = 1.745; 73.273, p = 0.003) with a significant interaction at both the 10-h (Relative Cohen's effect size = 0.38, p = 0.014) and 20-h time points (Relative Cohen's effect size = 0.40, p = 0.003). No significant effects were found on accuracy or on Conners' Continuous Performance Test, a measure of sustained attention. The results are discussed in reference to previous findings on the relationship between brain plasticity and job performance. Generalization of results requires further study.

  13. Multi-domain computerized cognitive training program improves performance of bookkeeping tasks: a matched-sampling active-controlled trial

    Directory of Open Access Journals (Sweden)

    Amit eLampit

    2014-07-01

    Cognitive skills are important predictors of job performance, but the extent to which Computerized Cognitive Training (CCT) can improve job performance in healthy adults is unclear. We report, for the first time, that a CCT program aimed at attention, memory, reasoning and visuo-spatial abilities can enhance productivity in healthy younger adults on bookkeeping tasks with high relevance to real-world job performance. 44 business students (77.3% female, mean age 21.4 ± 2.6 years) were assigned to either (a) 20 hours of CCT, or (b) 20 hours of computerized arithmetic training (active control) by a matched sampling procedure. Both interventions were conducted over a period of six weeks, 3-4 one-hour sessions per week. Transfer of skills to performance on a 60-minute paper-based bookkeeping task was measured at three time points: baseline, after 10 hours and after 20 hours of training. Repeated measures ANOVA found a significant Group X Time effect on productivity (F=7.033, df=1.745; 73.273, p=0.003) with a significant interaction at both the 10-hour (Relative Cohen's effect size = 0.38, p=0.014) and 20-hour time points (Relative Cohen's effect size = 0.40, p=0.003). No significant effects were found on accuracy or on Conners' Continuous Performance Test, a measure of sustained attention. The results are discussed in reference to previous findings on the relationship between brain plasticity and job performance. Generalization of results requires further study.

  14. High-Speed Rail Train Timetabling Problem: A Time-Space Network Based Method with an Improved Branch-and-Price Algorithm

    Directory of Open Access Journals (Sweden)

    Bisheng He

    2014-01-01

    A time-space network based optimization method is designed for the high-speed rail train timetabling problem to improve the service level of high-speed rail. A general time-space path cost is presented which considers both the train travel time and the high-speed rail operation requirements: (1) service frequency requirement; (2) stopping plan adjustment; and (3) priority of train types. The train timetabling problem based on time-space paths aims to minimize the total general time-space path cost of all trains. An improved branch-and-price algorithm is applied to solve the large-scale integer programming problem. Within the algorithm, rapid branching and node selection for the branch-and-price tree and a heuristic train time-space path generation for column generation are adopted to speed up computation. The computational results of a set of experiments on China's high-speed rail system are presented, with discussion of the model validation, the effectiveness of the general time-space path cost, and the improved branch-and-price algorithm.

  15. Improving the quality factor of an RF spiral inductor with non-uniform metal width and non-uniform coil spacing

    International Nuclear Information System (INIS)

    Shen Pei; Zhang Wanrong; Huang Lu; Jin Dongyue; Xie Hongyun

    2011-01-01

    An improved inductor layout with non-uniform metal width and non-uniform spacing is proposed to increase the quality factor (Q factor). In this layout, from the outer coil to the inner coil, the metal width is reduced by an arithmetic-progression step, while the metal spacing is increased by a geometric-progression step. The layout with variable width and spacing improves the Q factor of the RF spiral inductor by approximately 42.86%, mainly due to the suppression of eddy-current loss by weakening the current-crowding effect in the center of the spiral inductor. To increase the Q factor further, a patterned ground shield (PGS) is used together with the optimized layout in the novel inductor. The results indicate that, in the range of 0.5 to 16 GHz, the Q factor of the novel inductor is the highest: it improves by 67% over conventional inductors with uniform geometry dimensions (equal width and equal spacing), by nearly 23% over a PGS inductor with uniform geometry dimensions, and by almost 20% over an inductor with the improved layout alone. (semiconductor devices)
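
    The layout rule described above is easy to state programmatically: from the outer turn to the inner turn the metal width falls by a fixed (arithmetic) step while the turn-to-turn spacing grows by a fixed (geometric) ratio. The sketch below only illustrates that progression; the turn count, dimensions and ratio are hypothetical, not values from the paper.

    def non_uniform_layout(n_turns, w_outer_um, dw_um, s_outer_um, ratio):
        """Return per-turn widths (arithmetic decrease) and spacings (geometric increase)."""
        widths = [w_outer_um - i * dw_um for i in range(n_turns)]       # arithmetic progression
        spacings = [s_outer_um * ratio ** i for i in range(n_turns)]    # geometric progression
        return widths, spacings

    widths, spacings = non_uniform_layout(n_turns=5, w_outer_um=12.0, dw_um=1.5,
                                          s_outer_um=2.0, ratio=1.4)
    print("widths (um):  ", [round(w, 2) for w in widths])
    print("spacings (um):", [round(s, 2) for s in spacings])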

  16. Stability of 4-dimensional space-time from the IIB matrix model via the improved mean field approximation

    International Nuclear Information System (INIS)

    Aoyama, Tatsumi; Kawai, Hikaru; Shibusa, Yuuichiro

    2006-01-01

    We investigate the origin of our four-dimensional space-time by considering dynamical aspects of the IIB matrix model using the improved mean field approximation. Previous works have focused on the specific choices of configurations as ansatz which preserve SO(d) rotational symmetry. In this report, an extended ansatz is proposed and examined up to a third-order approximation which includes both the SO(4) ansatz and the SO(7) ansatz in their respective limits. From the solutions of the self-consistency condition represented by the extrema of the free energy of the system, it is found that some of the solutions found in the SO(4) or SO(7) ansatz disappear in the extended ansatz. This implies that the extension of ansatz can be used to distinguish stable solutions from unstable solutions. It is also found that there is a non-trivial accumulation of extrema including the SO(4)-preserving solution, which may lead to the formation of a plateau. (author)

  17. Performance Improvement of Space Shift Keying MIMO Systems with Orthogonal Codebook-Based Phase-Rotation Precoding

    Directory of Open Access Journals (Sweden)

    Mohammed Al-Ansi

    2017-01-01

    This paper considers codebook-based precoding for a Space Shift Keying (SSK) modulation MIMO system. Codebook-based precoding avoids the necessity for full knowledge of Channel State Information (CSI) at the transmitter and alleviates the complexity of generating a CSI-optimized precoder. The receiver selects the codeword that maximizes the Minimum Euclidean Distance (MED) of the received constellation and feeds back its index to the transmitter. In this paper, we first develop a new accurate closed-form Bit Error Rate (BER) expression for SSK without precoding. Then, we investigate several phase-rotation codebooks with a quantized set of phases and systematic structure, namely the Full-Combination, Walsh-Hadamard, Quasi-Orthogonal Sequences, and Orthogonal Array Testing codebooks. In addition, since the size of the Full-Combination codebook may be large, we develop an iterative search method for fast selection of its best codeword. The proposed codebooks significantly improve the BER performance in Rayleigh and Nakagami fading channels, even at high spatial correlation among transmit antennas and with CSI estimation error. Moreover, we show that only four phases {+1,+j,-1,-j} are sufficient and that further phase granularity does not yield significant gain. This avoids hardware multiplication during codebook search and codeword application.
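
    The codeword-selection rule described above can be sketched directly: for every codeword of per-antenna phases drawn from {+1, +j, -1, -j}, the receiver forms the received SSK constellation points w_i * h_i (one per transmit antenna), computes their minimum pairwise Euclidean distance, and feeds back the index of the codeword that maximizes it. The channel, antenna counts and full-combination codebook below are hypothetical illustrations, not the paper's simulation setup.

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    nt, nr = 4, 2                                    # transmit / receive antennas (hypothetical)
    H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

    phases = np.array([1, 1j, -1, -1j])
    codebook = [np.array(c) for c in itertools.product(phases, repeat=nt)]  # full-combination codebook

    def min_euclidean_distance(H, w):
        """MED of the received SSK constellation {w_i * h_i}."""
        points = H * w                               # column i is w_i * h_i
        med = np.inf
        for i in range(points.shape[1]):
            for j in range(i + 1, points.shape[1]):
                med = min(med, np.linalg.norm(points[:, i] - points[:, j]))
        return med

    best = max(range(len(codebook)), key=lambda k: min_euclidean_distance(H, codebook[k]))
    print("selected codeword index:", best, "MED:", round(min_euclidean_distance(H, codebook[best]), 3))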

  18. Evaluating energy, health and carbon co-benefits from improved domestic space heating: A randomised community trial

    International Nuclear Information System (INIS)

    Preval, Nick; Chapman, Ralph; Pierse, Nevil; Howden-Chapman, Philippa

    2010-01-01

    In order to value the costs and benefits associated with improved space heating we analysed the Housing, Heating and Health Study, a randomised community trial involving installation of energy efficient and healthy heaters (heat pump, wood pellet burner or flued gas heater) in homes with basic insulation and poor heating, occupied by households which included a child with asthma. We compared the initial purchase and installation cost of heaters with changes in the number of visits to health professionals, time off work/school, caregiving, and pharmaceutical use for household members and changes in total household energy use and carbon emissions following the intervention. We used two scenarios to analyse the results over the predicted 12-year life-span of the heaters. The targeted approach (Scenario A - assuming high rates of household asthma throughout the period of analysis) produced enough health-related benefits to offset the cost of the heaters, and when total energy use and carbon emission savings were included in the analysis the ratio of benefits to costs was 1.09:1. The untargeted approach (Scenario B - assuming typical New Zealand asthma rates throughout the period of analysis) had a ratio of total benefits to costs of 0.31:1.

  19. Evaluating energy, health and carbon co-benefits from improved domestic space heating. A randomised community trial

    Energy Technology Data Exchange (ETDEWEB)

    Preval, Nick; Pierse, Nevil; Howden-Chapman, Philippa [He Kainga Oranga/Housing and Health Research Programme, University of Otago, Wellington, PO Box 7343, Wellington South (New Zealand); Chapman, Ralph [School of Geography, Graduate Programme in Environmental Studies, Environment and Earth Sciences, Victoria University, PO Box 600, Wellington 6140 (New Zealand)

    2010-08-15

    In order to value the costs and benefits associated with improved space heating we analysed the Housing, Heating and Health Study, a randomised community trial involving installation of energy efficient and healthy heaters (heat pump, wood pellet burner or flued gas heater) in homes with basic insulation and poor heating, occupied by households which included a child with asthma. We compared the initial purchase and installation cost of heaters with changes in the number of visits to health professionals, time off work/school, caregiving, and pharmaceutical use for household members and changes in total household energy use and carbon emissions following the intervention. We used two scenarios to analyse the results over the predicted 12-year life-span of the heaters. The targeted approach (Scenario A - assuming high rates of household asthma throughout the period of analysis) produced enough health-related benefits to offset the cost of the heaters, and when total energy use and carbon emission savings were included in the analysis the ratio of benefits to costs was 1.09:1. The untargeted approach (Scenario B - assuming typical New Zealand asthma rates throughout the period of analysis) had a ratio of total benefits to costs of 0.31:1. (author)

  20. Evaluating energy, health and carbon co-benefits from improved domestic space heating: A randomised community trial

    Energy Technology Data Exchange (ETDEWEB)

    Preval, Nick [He Kainga Oranga/Housing and Health Research Programme, University of Otago, Wellington, PO Box 7343, Wellington South (New Zealand); Chapman, Ralph, E-mail: Ralph.chapman@vuw.ac.n [School of Geography, Graduate Programme in Environmental Studies, Environment and Earth Sciences, Victoria University, PO Box 600, Wellington 6140 (New Zealand); Pierse, Nevil; Howden-Chapman, Philippa [He Kainga Oranga/Housing and Health Research Programme, University of Otago, Wellington, PO Box 7343, Wellington South (New Zealand)

    2010-08-15

    In order to value the costs and benefits associated with improved space heating we analysed the Housing, Heating and Health Study, a randomised community trial involving installation of energy efficient and healthy heaters (heat pump, wood pellet burner or flued gas heater) in homes with basic insulation and poor heating, occupied by households which included a child with asthma. We compared the initial purchase and installation cost of heaters with changes in the number of visits to health professionals, time off work/school, caregiving, and pharmaceutical use for household members and changes in total household energy use and carbon emissions following the intervention. We used two scenarios to analyse the results over the predicted 12-year life-span of the heaters. The targeted approach (Scenario A - assuming high rates of household asthma throughout the period of analysis) produced enough health-related benefits to offset the cost of the heaters, and when total energy use and carbon emission savings were included in the analysis the ratio of benefits to costs was 1.09:1. The untargeted approach (Scenario B - assuming typical New Zealand asthma rates throughout the period of analysis) had a ratio of total benefits to costs of 0.31:1.

  1. Improving the thermal integrity of new single-family detached residential buildings: Documentation for a regional database of capital costs and space conditioning load savings

    International Nuclear Information System (INIS)

    Koomey, J.G.; McMahon, J.E.; Wodley, C.

    1991-07-01

    This report summarizes the costs and space-conditioning load savings from improving new single-family building shells. It relies on survey data from the National Association of Home-builders (NAHB) to assess current insulation practices for these new buildings, and NAHB cost data (aggregated to the Federal region level) to estimate the costs of improving new single-family buildings beyond current practice. Space-conditioning load savings are estimated using a database of loads for prototype buildings developed at Lawrence Berkeley Laboratory, adjusted to reflect population-weighted average weather in each of the ten federal regions and for the nation as a whole

  2. Communication spaces.

    Science.gov (United States)

    Coiera, Enrico

    2014-01-01

    Annotations to physical workspaces such as signs and notes are ubiquitous. When densely annotated, work areas become communication spaces. This study aims to characterize the types and purpose of such annotations. A qualitative observational study was undertaken in two wards and the radiology department of a 440-bed metropolitan teaching hospital. Images were purposefully sampled; 39 were analyzed after excluding inferior images. Annotation functions included signaling identity, location, capability, status, availability, and operation. They encoded data, rules or procedural descriptions. Most aggregated into groups that either created a workflow by referencing each other, supported a common workflow without reference to each other, or were heterogeneous, referring to many workflows. Higher-level assemblies of such groupings were also observed. Annotations make visible the gap between work done and the capability of a space to support work. Annotations are repairs of an environment, improving fitness for purpose, fixing inadequacy in design, or meeting emergent needs. Annotations thus record the missing information needed to undertake tasks, typically added post-implementation. Measuring annotation levels post-implementation could help assess the fit of technology to task. Physical and digital spaces could meet broader user needs by formally supporting user customization, 'programming through annotation'. Augmented reality systems could also directly support annotation, addressing existing information gaps, and enhancing work with context-sensitive annotation. Communication spaces offer a model of how work unfolds. Annotations make visible local adaptation that makes technology fit for purpose post-implementation and suggest an important role for annotatable information systems and digital augmentation of the physical environment.

  3. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty with the sampling-based method is evaluated by repeating transport calculations with a number of cross-section data sets sampled from the covariance uncertainty data. In the transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses such as keff, reaction rates, flux and power distribution can be directly obtained all at one time without code modification. However, a major drawback of the sampling-based method is that it requires an expensive computational load to obtain statistically reliable results (within a 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified by estimating the GODIVA benchmark problem, and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to its own active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated with the proposed and previous methods. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of keff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with the sampling-based method.
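
    The essence of the sampling-based approach is to draw many cross-section sets from the covariance data, evaluate the response for each set, and take the spread of the responses as the uncertainty. The sketch below shows that loop with a stand-in response function; the cross sections, covariance matrix and k-eff model are hypothetical, and a real analysis would call a transport code (or, as proposed above, assign each sampled set to its own active cycle group within one Monte Carlo run).

    import numpy as np

    rng = np.random.default_rng(42)

    mean_xs = np.array([1.80, 0.05, 2.43])              # hypothetical group cross sections
    cov_xs = np.diag([0.02, 0.001, 0.01]) ** 2           # hypothetical covariance data

    def mock_keff(xs):
        """Stand-in for a transport calculation returning k-eff for one sampled set."""
        fission, capture, nu = xs
        return nu * fission / (fission + capture + 0.8)

    samples = rng.multivariate_normal(mean_xs, cov_xs, size=500)
    keff = np.array([mock_keff(s) for s in samples])
    print(f"k-eff = {keff.mean():.5f} +/- {keff.std(ddof=1):.5f} (sampling-based estimate)")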

  4. An improved enzyme-linked immunosorbent assay for whole-cell determination of methanogens in samples from anaerobic reactors

    DEFF Research Database (Denmark)

    Sørensen, A.H.; Ahring, B.K.

    1997-01-01

    An enzyme-linked immunosorbent assay was developed for the detection of whole cells of methanogens in samples from anaerobic continuously stirred tank digesters treating slurries of solid waste. The assay was found to allow for quantitative analysis of the most important groups of methanogens in samples from anaerobic digesters in a reproducible manner. Polyclonal antisera against eight strains of methanogens were employed in the test. The specificities of the antisera were increased by adsorption with cross-reacting cells. The reproducibility of the assay depended on the use of high-quality microtiter plates and the addition of dilute hydrochloric acid to the samples. In an experiment on different digester samples, the test demonstrated a unique pattern of different methanogenic strains present in each sample. The limited preparatory work required for the assay and the simple assay design make...

  5. Anthropogenic resource subsidies determine space use by Australian arid zone dingoes: an improved resource selection modelling approach.

    Directory of Open Access Journals (Sweden)

    Thomas M Newsome

    Dingoes (Canis lupus dingo) were introduced to Australia and became feral at least 4,000 years ago. We hypothesized that dingoes, being of domestic origin, would be adaptable to anthropogenic resource subsidies and that their space use would be affected by the dispersion of those resources. We tested this by analyzing Resource Selection Functions (RSFs) developed from GPS fixes (locations) of dingoes in arid central Australia. Using Generalized Linear Mixed-effect Models (GLMMs), we investigated resource relationships for dingoes that had access to abundant food near mine facilities, and for those that did not. From these models, we predicted the probability of dingo occurrence in relation to anthropogenic resource subsidies and other habitat characteristics over ∼18,000 km². Very small standard errors, and the consequent pervasive statistical significance of results, will become more important as the size of data sets, such as our GPS tracking logs, increases. Therefore, we also investigated methods to minimize the effects of serial and spatio-temporal correlation among samples and unbalanced study designs. Using GLMMs, we accounted for some of the correlation structure of GPS animal tracking data; however, parameter standard errors remained very small and all predictors were highly significant. Consequently, we developed an alternative approach that allowed us to review effect sizes at different spatial scales and determine which predictors were sufficiently ecologically meaningful to include in final RSF models. We determined that the most important predictor for dingo occurrence around mine sites was distance to the refuse facility. Away from mine sites, close proximity to human-provided watering points was predictive of dingo dispersion as were other landscape factors including palaeochannels, rocky rises and elevated drainage depressions. Our models demonstrate that anthropogenically supplemented food and water can alter dingo-resource relationships. The
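
    A much-simplified sketch of a used/available resource selection fit is given below: a plain logistic regression (no random effects, unlike the GLMMs in the abstract) on simulated points, with two predictors standing in for distance to the refuse facility and distance to a watering point. All data and coefficients are synthetic and only illustrate the direction of the relationships described above.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 2000
    dist_refuse = rng.uniform(0, 50, n)       # km, hypothetical
    dist_water = rng.uniform(0, 20, n)        # km, hypothetical

    # Simulate stronger use close to the refuse facility and to water points.
    logit_p = 1.0 - 0.12 * dist_refuse - 0.05 * dist_water
    used = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    X = sm.add_constant(np.column_stack([dist_refuse, dist_water]))
    rsf = sm.Logit(used, X).fit(disp=False)
    print(rsf.params)   # negative distance coefficients => selection for areas near the subsidies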

  6. Improved technical success and radiation safety of adrenal vein sampling using rapid, semi-quantitative point-of-care cortisol measurement.

    Science.gov (United States)

    Page, Michael M; Taranto, Mario; Ramsay, Duncan; van Schie, Greg; Glendenning, Paul; Gillett, Melissa J; Vasikaran, Samuel D

    2018-01-01

    Objective: Primary aldosteronism is a curable cause of hypertension which can be treated surgically or medically depending on the findings of adrenal vein sampling studies. Adrenal vein sampling studies are technically demanding, with a high failure rate in many centres. The use of intraprocedural cortisol measurement could improve the success rates of adrenal vein sampling but may be impracticable due to cost and effects on procedural duration. Design: Retrospective review of the results of adrenal vein sampling procedures since commencement of point-of-care cortisol measurement using a novel single-use semi-quantitative measuring device for cortisol, the adrenal vein sampling Accuracy Kit. Outcomes were the success rate and complications of adrenal vein sampling procedures before and after use of the adrenal vein sampling Accuracy Kit. Routine use of the adrenal vein sampling Accuracy Kit device for intraprocedural measurement of cortisol commenced in 2016. Results: The technical success rate of adrenal vein sampling increased from 63% of 99 procedures to 90% of 48 procedures (P = 0.0007) after implementation of the adrenal vein sampling Accuracy Kit. Failure of right adrenal vein cannulation was the main reason for an unsuccessful study. Radiation dose decreased from 34.2 Gy·cm² (interquartile range, 15.8-85.9) to 15.7 Gy·cm² (6.9-47.3) (P = 0.009). No complications were noted, and implementation costs were minimal. Conclusions: Point-of-care cortisol measurement during adrenal vein sampling improved cannulation success rates and reduced radiation exposure. The use of the adrenal vein sampling Accuracy Kit is now standard practice at our centre.

  7. Thermal stability improvement of a multiple finger power SiGe heterojunction bipolar transistor under different power dissipations using non-uniform finger spacing

    International Nuclear Information System (INIS)

    Chen Liang; Zhang Wan-Rong; Jin Dong-Yue; Shen Pei; Xie Hong-Yun; Ding Chun-Bao; Xiao Ying; Sun Bo-Tao; Wang Ren-Qing

    2011-01-01

    A method of non-uniform finger spacing is proposed to enhance the thermal stability of a multiple-finger power SiGe heterojunction bipolar transistor under different power dissipations. The temperature distribution on the emitter fingers of a multi-finger SiGe heterojunction bipolar transistor is studied using a numerical electro-thermal model. The results show that the SiGe heterojunction bipolar transistor with non-uniform finger spacing has a smaller temperature difference between fingers than a traditional heterojunction bipolar transistor with uniform finger spacing at the same power dissipation. Most importantly, the ability to improve temperature uniformity is not weakened as power dissipation increases, so the method of non-uniform finger spacing is very effective in enhancing the thermal stability and power handling capability of the power device. Experimental results verify these conclusions. (interdisciplinary physics and related areas of science and technology)

  8. Improved sperm kinematics in semen samples collected after 2 h versus 4-7 days of ejaculation abstinence

    DEFF Research Database (Denmark)

    Alipour, H; Van Der Horst, G; Christiansen, O B

    2017-01-01

    STUDY QUESTION: Does a short abstinence period of only 2 h yield spermatozoa with better motility characteristics than samples collected after 4-7 days? SUMMARY ANSWER: Despite lower semen volume, sperm concentration, total sperm counts and total motile counts, higher percentages of motile...... a controlled repeated-measures design based on semen samples from 43 male partners, in couples attending for IVF treatment, who had a sperm concentration above 15 million/ml. Data were collected between June 2014 and December 2015 in the Fertility Unit of Aalborg University Hospital (Aalborg, Denmark......). PARTICIPANTS/MATERIALS, SETTING, METHODS: Participants provided a semen sample after 4-7 days of abstinence followed by another sample after only 2 h. For both ejaculates, sperm concentration, total sperm counts, motility groups and detailed kinematic parameters were assessed and compared by using the Sperm...

  9. Application of bias correction methods to improve U3Si2 sample preparation for quantitative analysis by WDXRF

    International Nuclear Information System (INIS)

    Scapin, Marcos A.; Guilhen, Sabine N.; Azevedo, Luciana C. de; Cotrim, Marycel E.B.; Pires, Maria Ap. F.

    2017-01-01

    The determination of silicon (Si), total uranium (U) and impurities in uranium-silicide (U3Si2) samples by the wavelength-dispersive X-ray fluorescence technique (WDXRF) has already been validated and is currently implemented at IPEN's X-Ray Fluorescence Laboratory (IPEN-CNEN/SP) in São Paulo, Brazil. Sample preparation requires the use of approximately 3 g of H3BO3 as sample holder and 1.8 g of U3Si2. However, because boron is a neutron absorber, this procedure precludes recovery of the U3Si2 sample, which, over time and considering routine analysis, may account for significant unusable uranium waste. An estimated average of 15 samples per month is expected to be analyzed by WDXRF, resulting in approx. 320 g of U3Si2 that would not return to the nuclear fuel cycle. This not only results in production losses but also creates another problem: radioactive waste management. The purpose of this paper is to present the mathematical models that may be applied for the correction of systematic errors when the H3BO3 sample holder is replaced by cellulose acetate [C6H7O2(OH)3-m(OOCCH3)m], m = 0∼3, thus enabling recovery of the U3Si2 sample. The results demonstrate that the adopted mathematical model is statistically satisfactory, allowing optimization of the procedure. (author)

  10. Improvement of the Work Environment and Work-Related Stress: A Cross-Sectional Multilevel Study of a Nationally Representative Sample of Japanese Workers.

    Science.gov (United States)

    Watanabe, Kazuhiro; Tabuchi, Takahiro; Kawakami, Norito

    2017-03-01

    This cross-sectional multilevel study aimed to investigate the relationship between improvement of the work environment and work-related stress in a nationally representative sample in Japan. The study was based on a national survey that randomly sampled 1745 worksites and 17,500 nested employees. The survey asked the worksites whether improvements of the work environment were conducted; and it asked the employees to report the number of work-related stresses they experienced. Multilevel multinominal logistic and linear regression analyses were conducted. Improvement of the work environment was not significantly associated with any level of work-related stress. Among men, it was significantly and negatively associated with the severe level of work-related stress. The association was not significant among women. Improvements to work environments may be associated with reduced work-related stress among men nationwide in Japan.

  11. Improving the Yield of Histological Sampling in Patients With Suspected Colorectal Cancer During Colonoscopy by Introducing a Colonoscopy Quality Assurance Program.

    Science.gov (United States)

    Gado, Ahmed; Ebeid, Basel; Abdelmohsen, Aida; Axon, Anthony

    2011-08-01

    Masses discovered by clinical examination, imaging or endoscopic studies that are suspicious for malignancy typically require biopsy confirmation before treatment is initiated. Biopsy specimens may fail to yield a definitive diagnosis if the lesion is extensively ulcerated or otherwise necrotic and viable tumor tissue is not obtained on sampling. The diagnostic yield is improved when multiple biopsy samples (BSs) are taken. A colonoscopy quality-assurance program (CQAP) was instituted in 2003 in our institution. The aim of this study was to determine the effect of instituting a CQAP on the yield of histological sampling in patients with suspected colorectal cancer (CRC) during colonoscopy. Initial assessment of colonoscopy practice was performed in 2003. A total of five patients with suspected CRC during colonoscopy were documented in 2003. BSs confirmed CRC in three (60%) patients and were nondiagnostic in two (40%). A quality-improvement process was instituted which required a minimum of six BSs, of adequate size, from any suspected CRC during colonoscopy. A total of 37 patients for the period 2004-2010 were prospectively assessed. The diagnosis of CRC was confirmed with histological examination of BSs obtained during colonoscopy in 63% of patients in 2004, 60% in 2005, 50% in 2006, 67% in 2007, 100% in 2008, 67% in 2009 and 100% in 2010. The yield of histological sampling increased significantly over this period. The colonoscopy quality assurance and improvement program thus increased the yield of histological sampling in patients with suspected CRC during colonoscopy.

  12. Improving the growth of Ge/Si islands by modulating the spacing between screen and accelerator grids in ion beam sputtering deposition system

    International Nuclear Information System (INIS)

    Yang, Jie; Zhao, Bo; Wang, Chong; Qiu, Feng; Wang, Rongfei; Yang, Yu

    2016-01-01

    Highlights: • Ge islands were prepared by ion beam sputtering with different grid-to-grid gaps. • Ge islands with larger sizes and low density are observed in 1-mm-spaced samples. • The island growth was determined by sputter energy and the quality of Si buffer. • The crystalline volume fraction of buffer must be higher than 72% to grow islands. - Abstract: Ge islands were fabricated on Si buffer layer by ion beam sputtering deposition with a spacing between the screen and accelerator grids of either 1 mm or 2 mm. The Si buffer layer exhibits mixed-phase microcrystallinity for samples grown with 1 mm spacing and crystallinity for those with 2 mm spacing. Ge islands are larger and less dense than those grown on the crystalline buffer because of the selective growth mechanism on the microcrystalline buffer. Moreover, the nucleation site of Ge islands formed on the crystalline Si buffer is random. Ge islands grown at different grid-to-grid gaps are characterized by two key factors, namely, divergence half angle of ion beam and crystallinity of buffer layer. High grid-to-grid spacing results in small divergence half angle, thereby enhancing the sputtering energy and redistribution of sputtered atoms. The crystalline volume fraction of the microcrystalline Si buffer was obtained based on the integrated intensity ratio of Raman peaks. The islands show decreased density with decreasing crystalline volume fraction and are difficult to observe at crystalline volume fractions lower than 72%.

  13. Improving the growth of Ge/Si islands by modulating the spacing between screen and accelerator grids in ion beam sputtering deposition system

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jie; Zhao, Bo [Institute of Optoelectronic Information Materials, School of Materials Science and Engineering, Yunnan University, Kunming 650091 (China); Yunnan Key Laboratory for Micro/Nano Materials and Technology, Yunnan University, Kunming 650091 (China); Wang, Chong, E-mail: cwang@mail.sitp.ac.cn [Institute of Optoelectronic Information Materials, School of Materials Science and Engineering, Yunnan University, Kunming 650091 (China); Yunnan Key Laboratory for Micro/Nano Materials and Technology, Yunnan University, Kunming 650091 (China); Qiu, Feng; Wang, Rongfei [Institute of Optoelectronic Information Materials, School of Materials Science and Engineering, Yunnan University, Kunming 650091 (China); Yunnan Key Laboratory for Micro/Nano Materials and Technology, Yunnan University, Kunming 650091 (China); Yang, Yu, E-mail: yuyang@ynu.edu.cn [Institute of Optoelectronic Information Materials, School of Materials Science and Engineering, Yunnan University, Kunming 650091 (China); Yunnan Key Laboratory for Micro/Nano Materials and Technology, Yunnan University, Kunming 650091 (China)

    2016-11-15

    Highlights: • Ge islands were prepared by ion beam sputtering with different grid-to-grid gaps. • Ge islands with larger sizes and low density are observed in 1-mm-spaced samples. • The island growth was determined by sputter energy and the quality of Si buffer. • The crystalline volume fraction of buffer must be higher than 72% to grow islands. - Abstract: Ge islands were fabricated on Si buffer layer by ion beam sputtering deposition with a spacing between the screen and accelerator grids of either 1 mm or 2 mm. The Si buffer layer exhibits mixed-phase microcrystallinity for samples grown with 1 mm spacing and crystallinity for those with 2 mm spacing. Ge islands are larger and less dense than those grown on the crystalline buffer because of the selective growth mechanism on the microcrystalline buffer. Moreover, the nucleation site of Ge islands formed on the crystalline Si buffer is random. Ge islands grown at different grid-to-grid gaps are characterized by two key factors, namely, divergence half angle of ion beam and crystallinity of buffer layer. High grid-to-grid spacing results in small divergence half angle, thereby enhancing the sputtering energy and redistribution of sputtered atoms. The crystalline volume fraction of the microcrystalline Si buffer was obtained based on the integrated intensity ratio of Raman peaks. The islands show decreased density with decreasing crystalline volume fraction and are difficult to observe at crystalline volume fractions lower than 72%.

  14. Improved reproducibility in genome-wide DNA methylation analysis for PAXgene® fixed samples compared to restored FFPE DNA

    DEFF Research Database (Denmark)

    Andersen, Gitte Brinch; Hager, Henrik; Hansen, Lise Lotte

    2014-01-01

    Chip. Quantitative DNA methylation analysis demonstrated that the methylation profile in PAXgene-fixed tissues showed, in comparison with restored FFPE samples, a higher concordance with the profile detected in frozen samples. We demonstrate, for the first time, that DNA from PAXgene conserved tissue performs better......Formalin fixation has been the standard method for conservation of clinical specimens for decades. However, a major drawback is the high degradation of nucleic acids, which complicates its use in genome-wide analyses. Unbiased identification of biomarkers, however, requires genome-wide studies......, precluding the use of the valuable archives of specimens with long-term follow-up data. Therefore, restoration protocols for DNA from formalin-fixed and paraffin-embedded (FFPE) samples have been developed, although they are cost-intensive and time-consuming. An alternative to FFPE and snap...

  15. Space charge inhibition effect of nano-Fe3O4 on improvement of impulse breakdown voltage of transformer oil based on improved Kerr optic measurements

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Qing, E-mail: yangqing@cqu.edu.cn; Yu, Fei; Sima, Wenxia [State Key Laboratory of Power Transmission Equipment & System Security and New Technology, Chongqing University, Shapingba District, Chongqing, 400044 (China); Zahn, Markus [Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)

    2015-09-15

    Transformer oil-based nanofluids (NFs) with 0.03 g/L Fe3O4 nanoparticle content exhibit 11.2% higher positive impulse breakdown voltage levels than pure transformer oils. To study the effects of the Fe3O4 nanoparticles on the space charge in transformer oil and to explain why the nano-modified transformer oil exhibits improved impulse breakdown voltage characteristics, the traditional Kerr electro-optic field mapping technique is improved by increasing the length of the parallel-plate electrodes and by using a photodetector array as a high light sensitivity device. The space charge distributions of pure transformer oil and of NFs containing Fe3O4 nanoparticles can be measured using the improved Kerr electro-optic field mapping technique. Test results indicate a significant reduction in space charge density in the transformer oil-based NFs with the Fe3O4 nanoparticles. The fast electrons are captured by the nanoparticles and are converted into slow-charged particles in the NFs, which then reduce the space charge density and result in a more uniform electric field distribution. Streamer propagation in the NFs is also obstructed, and the breakdown strengths of the NFs under impulse voltage conditions are also improved.

  16. Fail-Safe, Controllable Liquid Spring/Damper System for Improved Rover Space Vehicle Mobility, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA is planning to return to the moon in 2020 to explore thousands of miles of the moon?s surface with individual missions, lasting six months or longer. Surface...

  17. Histopathological examination of nerve samples from pure neural leprosy patients: obtaining maximum information to improve diagnostic efficiency

    Directory of Open Access Journals (Sweden)

    Sérgio Luiz Gomes Antunes

    2012-03-01

    Nerve biopsy examination is an important auxiliary procedure for diagnosing pure neural leprosy (PNL). When acid-fast bacilli (AFB) are not detected in the nerve sample, the value of other nonspecific histological alterations should be considered along with pertinent clinical, electroneuromyographical and laboratory data (the detection of Mycobacterium leprae DNA with polymerase chain reaction and the detection of serum anti-phenolic glycolipid 1 antibodies) to support a possible or probable PNL diagnosis. Three hundred forty nerve samples [144 from PNL patients and 196 from patients with non-leprosy peripheral neuropathies (NLN)] were examined. Both AFB-negative and AFB-positive PNL samples had more frequent histopathological alterations (epithelioid granulomas, mononuclear infiltrates, fibrosis, perineurial and subperineurial oedema and decreased numbers of myelinated fibres) than the NLN group. Multivariate analysis revealed that independently, mononuclear infiltrate and perineurial fibrosis were more common in the PNL group and were able to correctly classify AFB-negative PNL samples. These results indicate that even in the absence of AFB, these histopathological nerve alterations may justify a PNL diagnosis when observed in conjunction with pertinent clinical, epidemiological and laboratory data.

  18. Histopathological examination of nerve samples from pure neural leprosy patients: obtaining maximum information to improve diagnostic efficiency.

    Science.gov (United States)

    Antunes, Sérgio Luiz Gomes; Chimelli, Leila; Jardim, Márcia Rodrigues; Vital, Robson Teixeira; Nery, José Augusto da Costa; Corte-Real, Suzana; Hacker, Mariana Andréa Vilas Boas; Sarno, Euzenir Nunes

    2012-03-01

    Nerve biopsy examination is an important auxiliary procedure for diagnosing pure neural leprosy (PNL). When acid-fast bacilli (AFB) are not detected in the nerve sample, the value of other nonspecific histological alterations should be considered along with pertinent clinical, electroneuromyographical and laboratory data (the detection of Mycobacterium leprae DNA with polymerase chain reaction and the detection of serum anti-phenolic glycolipid 1 antibodies) to support a possible or probable PNL diagnosis. Three hundred forty nerve samples [144 from PNL patients and 196 from patients with non-leprosy peripheral neuropathies (NLN)] were examined. Both AFB-negative and AFB-positive PNL samples had more frequent histopathological alterations (epithelioid granulomas, mononuclear infiltrates, fibrosis, perineurial and subperineurial oedema and decreased numbers of myelinated fibres) than the NLN group. Multivariate analysis revealed that independently, mononuclear infiltrate and perineurial fibrosis were more common in the PNL group and were able to correctly classify AFB-negative PNL samples. These results indicate that even in the absence of AFB, these histopathological nerve alterations may justify a PNL diagnosis when observed in conjunction with pertinent clinical, epidemiological and laboratory data.
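
    As a toy illustration of the kind of multivariate classification described above, the sketch below fits a logistic regression with the two independent predictors the study identified (mononuclear infiltrate and perineurial fibrosis). The data are synthetic and the effect sizes are assumptions; this is not the authors' fitted model or data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Synthetic stand-in data: outcome 1 = PNL, 0 = non-leprosy neuropathy (NLN).
      rng = np.random.default_rng(0)
      n = 340
      infiltrate = rng.integers(0, 2, n)          # 1 = mononuclear infiltrate present
      fibrosis = rng.integers(0, 2, n)            # 1 = perineurial fibrosis present
      logit = -1.5 + 1.8 * infiltrate + 1.2 * fibrosis   # assumed effect sizes
      y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      X = np.column_stack([infiltrate, fibrosis])
      model = LogisticRegression().fit(X, y)
      print("odds ratios:", np.exp(model.coef_[0]))      # per-predictor odds ratios
      print("in-sample accuracy:", model.score(X, y))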

  19. Finnish-Kazakhstan cooperation on aerosol sampling - testing of new methods for nuclear monitoring improvement

    International Nuclear Information System (INIS)

    Tarvajnen, M.; Akhmetov, M.A.; Ptitskaya, L.D.; Osintsev, A.Yu.; Zhantikin, T.M.; Eligbaeva, G.

    2001-01-01

    Aerosol sampling is a powerful method for monitoring air radioactivity of both natural and artificial origin. To date, the IAEA has not made use of aerosol sampling studies. To examine the suitability of this method for radiation monitoring, the state authorities of Finland and the Republic of Kazakhstan - the Radiation and Nuclear Safety Authority (STUK) and the Kazakhstan Atomic Energy Committee - jointly carried out field tests in Kazakhstan. The tests began in Kurchatov in April 2000, at the request of the IAEA working team on Iraq, close to the former Semipalatinsk test site, and ended in Astana in August 2001. The main aim of the field tests was to assess the feasibility and appropriateness of the aerosol sampling concept and technology under the full range of environmental conditions. The paper also discusses the roles of the participating parties in the field tests and the main results and conclusions. The experience gained will allow the aerosol sampling method to be developed for strengthening IAEA international safeguards measures.

  20. A Novel Qualitative Method to Improve Access, Elicitation, and Sample Diversification for Enhanced Transferability Applied to Studying Chemistry Outreach

    Science.gov (United States)

    Pratt, Justin M.; Yezierski, Ellen J.

    2018-01-01

    Conducting qualitative research in any discipline warrants two actions: accessing participants and eliciting their ideas. In chemistry education research (CER), survey techniques have been used to increase access to participants and diversify samples. Interview tasks (such as card sorting, using demonstrations, and using simulations) have been…

  1. Improving the Network Scale-Up Estimator: Incorporating Means of Sums, Recursive Back Estimation, and Sampling Weights.

    Directory of Open Access Journals (Sweden)

    Patrick Habecker

    Full Text Available Researchers interested in studying populations that are difficult to reach through traditional survey methods can now draw on a range of methods to access these populations. Yet many of these methods are more expensive and difficult to implement than studies using conventional sampling frames and trusted sampling methods. The network scale-up method (NSUM) provides a middle ground for researchers who wish to estimate the size of a hidden population but lack the resources to conduct a more specialized hidden population study. Through this method it is possible to generate population estimates for a wide variety of groups that are perhaps unwilling to self-identify as such (for example, users of illegal drugs or other stigmatized populations) via traditional survey tools such as telephone or mail surveys, by asking a representative sample to estimate the number of people they know who are members of such a "hidden" subpopulation. The original estimator is formulated to minimize the weight a single scaling variable can exert upon the estimates. We argue that this introduces hidden and difficult-to-predict biases, and instead propose a series of methodological advances on the traditional scale-up estimation procedure, including a new estimator. Additionally, we formalize the incorporation of sample weights into the network scale-up estimation process, and propose a recursive process of back-estimation "trimming" to identify and remove poorly performing predictors from the estimation process. To demonstrate these suggestions we use data from a network scale-up mail survey conducted in Nebraska during 2014. We find that using the new estimator and recursive trimming process provides more accurate estimates, especially when used in conjunction with sampling weights.
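
    As a sketch of the arithmetic underlying the method, the classical "sum ratio" scale-up estimator below folds survey sampling weights into the estimate. This is only the baseline form that the article builds on, not its proposed estimator or the recursive back-estimation trimming, and all numbers are illustrative.

      import numpy as np

      def nsum_estimate(known_hidden, network_size, weights, total_population):
          """Weighted 'sum ratio' network scale-up estimate of a hidden population.

          known_hidden     : per-respondent count of hidden-population members known
          network_size     : per-respondent personal network size (degree)
          weights          : survey sampling weights
          total_population : size of the overall population the sample represents
          """
          known_hidden = np.asarray(known_hidden, dtype=float)
          network_size = np.asarray(network_size, dtype=float)
          weights = np.asarray(weights, dtype=float)
          prevalence = (weights * known_hidden).sum() / (weights * network_size).sum()
          return prevalence * total_population

      # Illustrative call with made-up values for three respondents:
      print(nsum_estimate(known_hidden=[1, 0, 2],
                          network_size=[250, 300, 400],
                          weights=[1.2, 0.8, 1.0],
                          total_population=1_900_000))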

  2. Smart Air Sampling Instruments Have the Ability to Improve the Accuracy of Air Monitoring Data Comparisons Among Nuclear Industry Facilities

    International Nuclear Information System (INIS)

    Gavila, F. M.

    2008-01-01

    Valid inter-comparisons of operating performance parameters among all members of the nuclear industry are essential for the implementation of continuous improvement and for obtaining credibility among regulators and the general public. It is imperative that the comparison of performances among different industry facilities be as accurate as possible and normalized to industry-accepted reference standards

  3. Preliminary evidence that yoga practice progressively improves mood and decreases stress in a sample of UK prisoners

    NARCIS (Netherlands)

    Bilderbeck, A.C.; Brazil, I.A.; Farias, M.

    2015-01-01

    Objectives. In the first randomized controlled trial of yoga on UK prisoners, we previously showed that yoga practice was associated with improved mental wellbeing and cognition. Here, we aimed to assess how class attendance, self-practice, and demographic factors were related to outcome amongst

  4. Improvement of sampling plans for Salmonella detection in pooled table eggs by use of real-time PCR

    DEFF Research Database (Denmark)

    Pasquali, Frédérique; De Cesare, Alessandra; Valero, Antonio

    2014-01-01

    Eggs and egg products have been described as the most critical food vehicles of salmonellosis. The prevalence and level of contamination of Salmonella on table eggs are low, which severely affects the sensitivity of sampling plans applied voluntarily in some European countries, where one to five pools of 10 eggs are tested by the culture-based reference method ISO 6579:2004. In the current study we have compared the testing sensitivity of the reference culture method ISO 6579:2004 and an alternative real-time PCR method on Salmonella-contaminated egg pools of different sizes (4-9 uninfected eggs mixed with one contaminated egg) and contamination levels (10{sup 0}-10{sup 1}, 10{sup 1}-10{sup 2}, 10{sup 2}-10{sup 3} CFU/eggshell). Two hundred and seventy samples corresponding to 15 replicates per pool size and inoculum level were tested. At the lowest contamination level real-time PCR detected Salmonella in 40
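
    One way to summarise such a replicate design is a per-condition detection sensitivity with an exact binomial confidence interval, as in the sketch below. The positive counts are placeholders, not the study's reported results; scipy 1.7 or later is assumed for binomtest.

      from scipy.stats import binomtest

      # Placeholder positives out of 15 replicates per (pool size, inoculum level).
      design = {
          (5, "10^0-10^1 CFU"): 6,
          (5, "10^1-10^2 CFU"): 13,
          (10, "10^0-10^1 CFU"): 4,
          (10, "10^1-10^2 CFU"): 12,
      }

      for (pool, level), positives in design.items():
          ci = binomtest(positives, n=15).proportion_ci(confidence_level=0.95)
          print(f"pool of {pool:>2} eggs, {level}: sensitivity = {positives / 15:.2f} "
                f"(95% CI {ci.low:.2f}-{ci.high:.2f})")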

  5. Real-time photonic sampling with improved signal-to-noise and distortion ratio using polarization-dependent modulators

    Science.gov (United States)

    Liang, Dong; Zhang, Zhiyao; Liu, Yong; Li, Xiaojun; Jiang, Wei; Tan, Qinggui

    2018-04-01

    A real-time photonic sampling structure with effective nonlinearity suppression and excellent signal-to-noise ratio (SNR) performance is proposed. The key elements of this scheme are the polarization-dependent modulators (P-DMZMs) and the Sagnac loop structure. Thanks to the polarization-sensitive characteristic of P-DMZMs, the differences between the transfer functions of the fundamental signal and the distortion become visible. Meanwhile, the selection of specific biases in the P-DMZMs helps achieve better linearized performance with a low noise level for real-time photonic sampling. Compared with the quadrature-biased scheme, the proposed scheme is capable of effective nonlinearity suppression and provides better SNR performance even over a large frequency range. The proposed scheme is shown to be effective and easily implemented for real-time photonic applications.
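
    The quadrature-biased Mach-Zehnder scheme that the authors use as a comparison baseline can be sketched as below to show where the third-order intermodulation products come from; the proposed P-DMZM/Sagnac structure itself is not modelled here, and the drive parameters are illustrative assumptions.

      import numpy as np

      fs = 100e9                          # sample rate, 100 GS/s (assumed)
      t = np.arange(4096) / fs
      f1, f2 = 9.8e9, 10.2e9              # two-tone RF drive, Hz (assumed)
      v_pi = 5.0                          # modulator half-wave voltage, V (assumed)
      v_in = 0.8 * (np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t))

      # Quadrature-biased Mach-Zehnder transfer function.
      p_out = 0.5 * (1 + np.cos(np.pi * v_in / v_pi + np.pi / 2))

      spectrum = np.abs(np.fft.rfft(p_out * np.hanning(len(p_out))))
      freqs = np.fft.rfftfreq(len(p_out), 1 / fs)

      def level(f):                       # magnitude of the bin nearest frequency f
          return spectrum[np.argmin(np.abs(freqs - f))]

      fundamental = max(level(f1), level(f2))
      for f_imd in (2 * f1 - f2, 2 * f2 - f1):   # third-order intermodulation products
          print(f"IMD3 at {f_imd / 1e9:.1f} GHz: "
                f"{20 * np.log10(level(f_imd) / fundamental):.1f} dBc")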

  6. Conditional estimation of local pooled dispersion parameter in small-sample RNA-Seq data improves differential expression test.

    Science.gov (United States)

    Gim, Jungsoo; Won, Sungho; Park, Taesung

    2016-10-01

    High-throughput sequencing technology in transcriptomics studies contributes to the understanding of gene regulation mechanisms and their cellular functions, but it also increases the need for accurate statistical methods to assess quantitative differences between experiments. Many methods have been developed to account for the specifics of count data: non-normality, a dependence of the variance on the mean, and small sample size. Among these, the small number of samples in typical experiments remains a challenge. Here we present a method for differential analysis of count data, using conditional estimation of local pooled dispersion parameters. A comprehensive evaluation of our proposed method for differential gene expression analysis using both simulated and real data sets shows that the proposed method is more powerful than other existing methods while controlling the false discovery rate. By introducing conditional estimation of local pooled dispersion parameters, we successfully overcome the limitation of low power and enable powerful differential expression testing with small numbers of samples.
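
    The idea of borrowing dispersion information across genes can be sketched as below: a method-of-moments negative-binomial dispersion estimate is pooled within bins of similar mean count, so each gene draws on its neighbours when only a few replicates are available. This is only a simplified stand-in for the idea, not the conditional estimator proposed in the paper.

      import numpy as np

      def local_pooled_dispersion(counts, n_bins=20):
          """counts: genes x samples matrix of (library-size normalised) counts."""
          counts = np.asarray(counts, dtype=float)
          means = counts.mean(axis=1)
          variances = counts.var(axis=1, ddof=1)
          # Method of moments per gene: var = mu + phi * mu**2  =>  phi = (var - mu) / mu**2
          with np.errstate(divide="ignore", invalid="ignore"):
              phi_gene = (variances - means) / means ** 2
          phi_gene = np.clip(np.nan_to_num(phi_gene), 0, None)

          # Pool within bins of genes with similar mean expression.
          order = np.argsort(means)
          phi_pooled = np.empty_like(phi_gene)
          for idx in np.array_split(order, n_bins):
              phi_pooled[idx] = np.median(phi_gene[idx])
          return phi_pooled

      # Example with simulated gamma-Poisson counts (true dispersion 0.25), six samples.
      rng = np.random.default_rng(1)
      mu = rng.gamma(2.0, 50.0, size=(1000, 1))
      sim = rng.poisson(rng.gamma(4.0, mu / 4.0, size=(1000, 6)))
      print(local_pooled_dispersion(sim)[:5])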

  7. Improved detection of endoparasite DNA in soil sample PCR by the use of anti-inhibitory substances.

    Science.gov (United States)

    Krämer, F; Vollrath, T; Schnieder, T; Epe, C

    2002-09-26

    Although there have been numerous microbial examinations of soil for the presence of developmental stages of the human-pathogenic parasites Ancylostoma caninum and Toxocara canis, molecular techniques (e.g. DNA extraction, purification and subsequent PCR) have scarcely been applied. Here, DNA preparations of soil samples artificially contaminated with genomic DNA or parasite eggs were examined by PCR. A. caninum- and T. canis-specific primers based on the ITS-2 sequence were used for amplification. After DNA preparation alone, a high content of PCR-interfering substances was still detectable. Subsequently, two different inhibitors of PCR-interfering agents (GeneReleaser, Bioventures Inc. and Maximator, Connex GmbH) were compared in PCR. Both substances increased PCR sensitivity greatly. However, comparison of the increase in sensitivity achieved with the two compounds demonstrated the superiority of Maximator, which enhanced sensitivity to the point of permitting positive detecti