WorldWideScience

Sample records for sample changers

  1. A pneumatic sample changer for gamma-ray spectroscopy

    Science.gov (United States)

    Massoni, C.J.; Fones, R.V.; Simon, F.O.

    1973-01-01

A gravity-feed, pneumatic-ejection sample changer has been developed. The changer is suitable for both flat and well-type detectors and permits the continuous use of gamma-ray spectroscopy equipment 24 h a day, 7 days a week. The electronic circuitry has a fail-safe feature which stops the operation of the changer if a malfunction occurs. © 1973 The American Institute of Physics.
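The fail-safe behaviour described in the abstract can be sketched as a latching interlock: the changer advances only while every status check passes, and any malfunction stops operation until an operator intervenes. The sensor names below are hypothetical illustrations, not taken from the paper.

```python
# Illustrative fail-safe interlock for a pneumatic sample changer.
# Sensor names are hypothetical; the paper describes only the behaviour:
# any malfunction stops the changer until it is explicitly reset.

class SampleChanger:
    def __init__(self):
        self.faulted = False
        self.samples_counted = 0

    def checks_ok(self, sensors):
        # All interlock conditions must hold before a sample moves;
        # a missing jam sensor is treated as a jam (fail-safe default).
        return (sensors.get("air_pressure_ok", False)
                and sensors.get("sample_seated", False)
                and not sensors.get("jam_detected", True))

    def advance(self, sensors):
        # Fail-safe: a fault latches until reset() is called explicitly.
        if self.faulted:
            return False
        if not self.checks_ok(sensors):
            self.faulted = True   # stop operation on any malfunction
            return False
        self.samples_counted += 1
        return True

    def reset(self):
        self.faulted = False
```

The latch is the essential design choice: a transient sensor glitch must not let the changer resume on its own.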

  2. BioSAXS Sample Changer: a robotic sample changer for rapid and reliable high-throughput X-ray solution scattering experiments.

    Science.gov (United States)

    Round, Adam; Felisaz, Franck; Fodinger, Lukas; Gobbo, Alexandre; Huet, Julien; Villard, Cyril; Blanchet, Clement E; Pernot, Petra; McSweeney, Sean; Roessle, Manfred; Svergun, Dmitri I; Cipriani, Florent

    2015-01-01

    Small-angle X-ray scattering (SAXS) of macromolecules in solution is in increasing demand by an ever more diverse research community, both academic and industrial. To better serve user needs, and to allow automated and high-throughput operation, a sample changer (BioSAXS Sample Changer) that is able to perform unattended measurements of up to several hundred samples per day has been developed. The Sample Changer is able to handle and expose sample volumes of down to 5 µl with a measurement/cleaning cycle of under 1 min. The samples are stored in standard 96-well plates and the data are collected in a vacuum-mounted capillary with automated positioning of the solution in the X-ray beam. Fast and efficient capillary cleaning avoids cross-contamination and ensures reproducibility of the measurements. Independent temperature control for the well storage and for the measurement capillary allows the samples to be kept cool while still collecting data at physiological temperatures. The Sample Changer has been installed at three major third-generation synchrotrons: on the BM29 beamline at the European Synchrotron Radiation Facility (ESRF), the P12 beamline at the PETRA-III synchrotron (EMBL@PETRA-III) and the I22/B21 beamlines at Diamond Light Source, with the latter being the first commercial unit supplied by Bruker ASC.

  3. BioSAXS Sample Changer: a robotic sample changer for rapid and reliable high-throughput X-ray solution scattering experiments

    Energy Technology Data Exchange (ETDEWEB)

    Round, Adam, E-mail: around@embl.fr; Felisaz, Franck [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Fodinger, Lukas; Gobbo, Alexandre [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Huet, Julien [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Villard, Cyril [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Blanchet, Clement E., E-mail: around@embl.fr [EMBL c/o DESY, Notkestrasse 85, 22603 Hamburg (Germany); Pernot, Petra; McSweeney, Sean [ESRF, 6 Rue Jules Horowitz, 38000 Grenoble (France); Roessle, Manfred; Svergun, Dmitri I. [EMBL c/o DESY, Notkestrasse 85, 22603 Hamburg (Germany); Cipriani, Florent, E-mail: around@embl.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France)

    2015-01-01

    A robotic sample changer for solution X-ray scattering experiments optimized for speed and to use the minimum amount of material has been developed. This system is now in routine use at three high-brilliance European synchrotron sites, each capable of several hundred measurements per day. Small-angle X-ray scattering (SAXS) of macromolecules in solution is in increasing demand by an ever more diverse research community, both academic and industrial. To better serve user needs, and to allow automated and high-throughput operation, a sample changer (BioSAXS Sample Changer) that is able to perform unattended measurements of up to several hundred samples per day has been developed. The Sample Changer is able to handle and expose sample volumes of down to 5 µl with a measurement/cleaning cycle of under 1 min. The samples are stored in standard 96-well plates and the data are collected in a vacuum-mounted capillary with automated positioning of the solution in the X-ray beam. Fast and efficient capillary cleaning avoids cross-contamination and ensures reproducibility of the measurements. Independent temperature control for the well storage and for the measurement capillary allows the samples to be kept cool while still collecting data at physiological temperatures. The Sample Changer has been installed at three major third-generation synchrotrons: on the BM29 beamline at the European Synchrotron Radiation Facility (ESRF), the P12 beamline at the PETRA-III synchrotron (EMBL@PETRA-III) and the I22/B21 beamlines at Diamond Light Source, with the latter being the first commercial unit supplied by Bruker ASC.
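The measurement/cleaning cycle described above (load about 5 µl from a 96-well plate, expose it in the capillary, then clean before the next sample) can be sketched as a simple control loop; the callables and names here are hypothetical stand-ins for the changer's actual control software.

```python
# Schematic of the BioSAXS measurement cycle per the abstract (5 microlitre
# samples, 96-well plates, clean between samples to avoid cross-contamination).
# load/expose/clean are hypothetical hooks, not the real control API.

def run_plate(wells, load, expose, clean, volume_ul=5.0):
    """Process a plate: load each sample into the capillary, expose it
    to the beam, then clean the capillary before the next sample."""
    results = {}
    for well in wells:
        load(well, volume_ul)      # aspirate sample into the capillary
        results[well] = expose()   # collect SAXS data for this sample
        clean()                    # wash/dry capillary: no cross-contamination
    return results
```

The point of the structure is that cleaning is unconditional per cycle, which is what makes unattended runs of hundreds of samples reproducible.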

  4. RoboDiff: combining a sample changer and goniometer for highly automated macromolecular crystallography experiments.

    Science.gov (United States)

    Nurizzo, Didier; Bowler, Matthew W; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A

    2016-08-01

    Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically.

  5. A multi-sample changer coupled to an electron cyclotron resonance source for accelerator mass spectrometry experiments.

    Science.gov (United States)

    Vondrasek, R; Palchan, T; Pardo, R; Peters, C; Power, M; Scott, R

    2014-02-01

    A new multi-sample changer has been constructed allowing rapid changes between samples. The sample changer has 20 positions and is capable of moving between samples in 1 min. The sample changer is part of a project using Accelerator Mass Spectrometry (AMS) at the Argonne Tandem Linac Accelerator System (ATLAS) facility to measure neutron capture rates on a wide range of actinides in a reactor environment. This project will require the measurement of a large number of samples previously irradiated in the Advanced Test Reactor at Idaho National Laboratory. The AMS technique at ATLAS is based on production of highly charged positive ions in an electron cyclotron resonance ion source followed by acceleration in the ATLAS linac. The sample material is introduced into the plasma via laser ablation chosen to limit the dependency of material feed rates upon the source material composition as well as minimize cross-talk between samples.

  6. FlexED8: the first member of a fast and flexible sample-changer family for macromolecular crystallography.

    Science.gov (United States)

    Papp, Gergely; Felisaz, Franck; Sorez, Clement; Lopez-Marrero, Marcos; Janocha, Robert; Manjasetty, Babu; Gobbo, Alexandre; Belrhali, Hassan; Bowler, Matthew W; Cipriani, Florent

    2017-10-01

    Automated sample changers are now standard equipment for modern macromolecular crystallography synchrotron beamlines. Nevertheless, most are only compatible with a single type of sample holder and puck. Recent work aimed at reducing sample-handling efforts and crystal-alignment times at beamlines has resulted in a new generation of compact and precise sample holders for cryocrystallography: miniSPINE and NewPin [see the companion paper by Papp et al. (2017, Acta Cryst., D73, 829-840)]. With full data collection now possible within seconds at most advanced beamlines, and future fourth-generation synchrotron sources promising to extract data in a few tens of milliseconds, the time taken to mount and centre a sample is rate-limiting. In this context, a versatile and fast sample changer, FlexED8, has been developed that is compatible with the highly successful SPINE sample holder and with the miniSPINE and NewPin sample holders. Based on a six-axis industrial robot, FlexED8 is equipped with a tool changer and includes a novel open sample-storage dewar with a built-in ice-filtering system. With seven versatile puck slots, it can hold up to 112 SPINE sample holders in uni-pucks, or 252 miniSPINE or NewPin sample holders, with 36 samples per puck. Additionally, a double gripper, compatible with the SPINE sample holders and uni-pucks, allows a reduction in the sample-exchange time from 40 s, the typical time with a standard single gripper, to less than 5 s. Computer vision-based sample-transfer monitoring, sophisticated error handling and automatic error-recovery procedures ensure high reliability. The FlexED8 sample changer has been successfully tested under real conditions on a beamline.

  7. Game Changers

    DEFF Research Database (Denmark)

    Helms, Niels Henrik

    2012-01-01

to attempt to describe some of the mechanisms that make some of these creative industries genuinely creative and innovative, so that they are able not only to survive but also to change and develop content, form and organization, becoming what in management parlance is called game changers…

  8. A multi-purpose robotic sample changer for texture and powder measurements on the HIPPO neutron diffractometer

    Science.gov (United States)

    Losko, Adrian; Vogel, Sven

    2012-10-01

Automation of sample changes is essential on neutron diffractometers with short count times per sample (as little as 1 min for steel samples), such as the high-pressure preferred orientation (HIPPO) instrument at the Los Alamos Neutron Science Center (LANSCE), to allow a high sample throughput. Efficient use of the available neutron flux is indispensable and reduces instrument downtime and the workload of beamline personnel. High-precision motion in Cartesian coordinates permits accurate sample alignment and increased coverage of sample directions for texture measurements. Using the geometrical properties of diffraction by crystals, corrections for sample displacement in strain measurements can minimize the artificial strain due to misalignment of the sample position: a sample rotation ensures that the same grain population diffracts into two different detectors, allowing the center of gravity of the diffraction signal, and hence any sample position offset, to be determined. Such corrections are achievable only with a combination of high-precision sample positioning and large detector coverage, as on HIPPO. Here we present the capabilities of the new robotic sample changer to help improve texture and strain measurements on the HIPPO instrument.

  9. Static electromagnetic frequency changers

    CERN Document Server

    Rozhanskii, L L

    1963-01-01

Static Electromagnetic Frequency Changers is about the theory, design, construction, and applications of static electromagnetic frequency changers, devices used for multiplication or division of alternating-current frequency. It was originally published in Russian. The book is organized into five chapters. The first three chapters introduce the reader to the principles of operation, the construction, and the potential applications of static electromagnetic frequency changers, and to the principles of their design. The two concluding chapters use some hitherto unpublished work…

  10. IT as a Game Changer

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Information technology can be a game changer in higher education, as it has been in other sectors. Information technology has brought about much of the economic growth of the past century, accelerating globalization and fostering democracy. This chapter explores many ways that information technology can be a game changer. Some are as simple as…

  11. Electronic tap-changer for distribution transformers

    CERN Document Server

    Faiz, Jawad

    2011-01-01

This reference collects all relevant aspects of electronic tap-changers and presents them in a comprehensive and orderly manner. It explains logically and systematically the design and optimization of a fully electronic tap-changer for distribution transformers. The book provides fresh insight into all possible structures of the power-section design and categorizes them comprehensively, including the cost factors of each design. In the control-section design, the authors review mechanical tap-changer control systems and present the modeling of a fully electronic tap-changer as well as a closed-loop…

  12. Novice Career Changers Weather the Classroom Weather

    Science.gov (United States)

    Gifford, James; Snyder, Mary Grace; Cuddapah, Jennifer Locraft

    2013-01-01

A close look at one professional's career change into teaching illustrates unique challenges and qualities, showing in stark relief what makes the induction smoother and the experience more successful. This article presents the story of a novice career-changer teacher, illustrating their unique problems and dispositions, as well as…

  13. Synthetic Biology: game changer in intellectual property

    Directory of Open Access Journals (Sweden)

    Laurens Landeweerd

    2016-12-01

Synthetic biology can be considered a game changer that plays an important role in the current NBIC (or BINC) convergence of nano-, bio-, info- and cognitive sciences. Although most synthetic biology experts are unaware of it, the field appeals to the imagination in its adherence to targets that were usually associated with premodern alchemist science. This paper elaborates several aspects of synthetic biology as well as its consequences for long-held notions of intellectual property, for the ontological categories of scientific discovery on the one hand and engineering on the other, for the distinction between natural and artificial, and between the grown and the made.

  14. Game changer: the topology of creativity.

    Science.gov (United States)

    de Vaan, Mathijs; Stark, David; Vedres, Balazs

    2015-01-01

This article examines the sociological factors that explain why some creative teams are able to produce game changers: cultural products that stand out as distinctive while also being critically recognized as outstanding. The authors build on work pointing to structural folding, the network property of a cohesive group whose membership overlaps with that of another cohesive group. They hypothesize that the effects of structural folding on game-changing success are especially strong when overlapping groups are cognitively distant. Measuring social distance separately from cognitive distance, and distinctiveness independently from critical acclaim, the authors test their hypothesis about structural folding and cognitive diversity by analyzing team reassembly for 12,422 video games and the career histories of 139,727 video game developers. When combined with cognitive distance, structural folding channels and mobilizes a productive tension of rules, roles, and codes that promotes successful innovation. In addition to serving as pipes and prisms, network ties are also the source of tools and tensions.

  15. Online Collaboration Processes of Career Changers Seeking Alternative Teacher Certification

    Science.gov (United States)

    Moraes Varjao, Jaqueline Urania

    2012-01-01

In March 2011, the U.S. Department of Education released a list of Teacher Shortage Areas (TSAs) nationwide. In most states, the need for certified teachers falls in four main areas: English as a Second Language (ESOL), Mathematics and Science (6-12), and Foreign Language. The current study explored the ways in which career changers enrolled in online…

  16. On-load Tap Changer Diagnosis on High-Voltage Power Transformers using Dynamic Resistance Measurements

    NARCIS (Netherlands)

    Erbrink, J.J.

    2011-01-01

    High-voltage transformers have tap changers to regulate the voltage in the high-voltage network when the load changes. Those tap changers are subject to different degradation mechanisms and need regular maintenance. Various defects, like contact degradation, often remain undetected and the

  17. AHP 40: THE MOUNTAIN CHANGERS: LIFESTYLE MIGRATION IN SOUTHWEST CHINA

    Directory of Open Access Journals (Sweden)

    Gary Sigley

    2016-02-01

In the early twenty-first century, the People's Republic of China (PRC) continues its remarkable transformation, which encompasses all facets of social life. One of the most significant, visible forms of such change is urbanization. Chinese cities are rapidly expanding and, according to some reports, will grow by a staggering 400 million people over the next several decades. In just under forty years China will have transformed from a predominantly rural to an urban society, a pace of urbanization not matched in previous human experience. Yet while migration in China has in recent decades been overwhelmingly of people moving from the countryside to the city, and to a lesser extent (but also quite large given the size of China's population) of more well-educated urbanites and professionals moving between cities, there has been a small flow of people in the other direction: those leaving the metropolises of the eastern seaboard to seek out alternative lifestyles in the mountains of western China, in particular in places like Yunnan in the southwest. These are akin to the "sea changers" and "tree changers" found in more affluent Western societies and can be included in the relatively new phenomenon of "lifestyle migration." This paper provides a preliminary overview of this phenomenon in the context of Dali, a prefectural city in Yunnan Province.

  18. "Elite" Career-Changers and Their Experience of Initial Teacher Education

    Science.gov (United States)

    Wilkins, Chris

    2017-01-01

    This study explores the motivation of "high-status" professionals to change career and enter teaching, and their experience of undertaking initial teacher education (ITE) programmes in England. The study builds on previous research which found that career-changers are disproportionately more likely to fail to complete their ITE studies,…

  19. Decentralized Voltage Control Coordination of On-Load Tap Changer Transformers, Distributed Generation Units and Flexible Loads

    DEFF Research Database (Denmark)

    Romani Dalmau, Aina; Martinez Perez, David; Diaz de Cerio Mendaza, Iker

    2015-01-01

approach is introduced in order to coordinate the voltage control capability from different assets in medium voltage networks. In this sense, the on-load tap changer strategy of the primary substation and the power factor control of wind turbines and combined heat and power plants are combined… of the on-load tap changer and the reactive power from dispersed generation while maximizing the capacity usage of the Power-to-Gas load.

  20. Voltage Management in Unbalanced Low Voltage Networks Using a Decoupled Phase-Tap-Changer Transformer

    OpenAIRE

    Coppo, Massimiliano; Turri, Roberto; Marinelli, Mattia; Han, Xue

    2014-01-01

The paper studies a medium voltage-low voltage transformer with decoupled on-load tap-changer capability on each phase. The overall objective is to evaluate the potential benefits of such a capability for a low voltage network. A realistic Danish low voltage network is used for the analysis. The load profiles are characterized using single-phase measurement data on voltages, currents and active powers with a 10-minute resolution. Different scenarios are considered: no tap action, th…

  1. Business Model Generation: A handbook for visionaries, game changers and challengers

    OpenAIRE

    Oliveira, MAY; João José Pinto Ferreira

    2011-01-01

The book entitled "Business Model Generation: A Handbook for Visionaries, Game Changers and Challengers", though written by Osterwalder and Pigneur (2010), was also co-created by 470 practitioners from 45 countries. The book is thus a good example of how a global creative collaboration effort can contribute positively to the business and management literature and, subsequently, to the advancement of society. The book "Business Model Generation" has both narrative and visual detail. Befor…

  2. Can Unconventional Gas be a Game Changer in European Gas Markets?

    OpenAIRE

    2010-01-01

Although unconventional gas development will not be a game changer for European gas markets overall, it could have a significant impact in individual countries, although probably not this decade. Florence Gény's study argues that much more stringent European environmental standards, difficulties of access to land and fresh water, and a lack of incentives for landowners to allow companies to drill will require a completely different business model for unconventional gas development in Europe compare…

  3. How game changers catalyzed, disrupted, and incentivized social innovation: three historical cases of nature conservation, assimilation, and women's rights

    Directory of Open Access Journals (Sweden)

    Frances R. Westley

    2016-12-01

We explore the impact of "game changers" on the dynamics of innovation over time in three problem domains: wilderness protection, women's rights, and the assimilation of indigenous children in Canada. Taking a specifically historical and cross-scale approach, we look at one social innovation in each problem domain. We explore the origins and history of the development of the National Parks in the USA, the legalization of contraception in the USA and Canada, and the residential school system in Canada. Based on a comparison of these cases, we identify three kinds of game changers: those that catalyze social innovation, which we define as "seminal"; those that disrupt the continuity of social innovation, which we label exogenous shocks; and those that provide opportunities for novel combinations and recombinations, which we label endogenous game changers.

  4. Is Insulin Pump a game changer in the management of Diabetes Mellitus: A case series

    Directory of Open Access Journals (Sweden)

    Haider Ghazanfar

    2016-02-01

Prevalence of diabetes has increased drastically over the past decade, with a debilitating effect on quality of life. The insulin pump is a relatively new modality in the management of diabetic patients. The objective of our case series was to assess the impact of the insulin pump on glycemic control and quality of life in diabetic patients. We present four cases of Type 2 diabetes exhibiting common complications associated with diabetes and its management, particularly hypoglycemic episodes. Worldwide introduction of the insulin pump could prove to be a trend changer in the management of diabetes.

  5. Voltage Management in Unbalanced Low Voltage Networks Using a Decoupled Phase-Tap-Changer Transformer

    DEFF Research Database (Denmark)

    Coppo, Massimiliano; Turri, Roberto; Marinelli, Mattia

    2014-01-01

The paper studies a medium voltage-low voltage transformer with decoupled on-load tap-changer capability on each phase. The overall objective is to evaluate the potential benefits of such a capability for a low voltage network. A realistic Danish low voltage network is used for the analysis… The load profiles are characterized using single-phase measurement data on voltages, currents and active powers with a 10-minute resolution. Different scenarios are considered: no tap action, three-phase coordinated tap action, single-phase discrete step and single-phase continuous tap action…
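The decoupled per-phase tap action described above can be illustrated with a minimal sketch: each phase independently picks the discrete tap that brings its voltage closest to 1.0 pu. The tap range, step size and the linear voltage-tap assumption are illustrative choices, not the paper's model (a real study reruns a power flow per tap position).

```python
# Sketch of per-phase tap selection for a decoupled phase-tap-changer
# transformer: each phase independently picks the discrete tap ratio that
# brings its voltage closest to 1.0 pu. Tap range/step are illustrative.

TAPS = [1 + 0.0125 * n for n in range(-8, 9)]   # +/-10% in 1.25% steps

def pick_taps(phase_voltages_pu):
    """For each phase, choose the tap ratio minimizing |V * tap - 1.0|.

    Assumes (simplistically) that the regulated voltage scales linearly
    with the tap ratio; a real analysis would solve a power flow per tap.
    """
    chosen = []
    for v in phase_voltages_pu:
        best = min(TAPS, key=lambda t: abs(v * t - 1.0))
        chosen.append(best)
    return chosen
```

The contrast with a conventional three-phase tap changer is that an unbalanced network (one phase high, one low) gets a different tap on each phase instead of one compromise tap for all three.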

  6. Inserting the tap values of the tap changer transformers into the Jacobian matrix as control variables

    Directory of Open Access Journals (Sweden)

    Faruk Yalçın

    2013-06-01

The series and shunt admittance values of under-load tap-changer transformers change with the tap position. Because this alters the structure of the bus admittance matrix, the matrix must be rebuilt at every tap change in power-flow studies. In this paper, a new approach is proposed that incorporates the tap-changing effects into the Jacobian matrix, removing the need to rebuild the bus admittance matrix at each tap change during a power-flow study and thus achieving fast convergence of the power-flow algorithm. Although similar studies exist in the literature, the proposed approach additionally handles the case in which more than one under-load tap-changer transformer is connected to the same bus with different connection combinations. To this end, new power equations and new equations for calculating the Jacobian-matrix components are derived. The proposed approach is tested on the IEEE 57-bus test system and its accuracy is demonstrated.
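The motivation stated above can be seen from the standard branch model of an under-load tap-changer transformer: its 2x2 admittance stamp depends on the tap ratio, so a conventional power flow must rebuild the bus admittance matrix whenever the tap moves. The sketch below is the textbook model for illustration, not the paper's derivation; the admittance is taken as real for simplicity.

```python
# Standard 2x2 bus-admittance contribution of a transformer with series
# admittance y and off-nominal tap ratio a on the primary side:
#   [[ y/a^2, -y/a ],
#    [ -y/a,   y   ]]
# Because every entry except y itself depends on a, changing the tap
# changes Ybus, which is why conventional power flows rebuild the matrix.

def tap_branch_stamp(y, a):
    """2x2 admittance stamp for a tap-changer branch (illustrative, real y)."""
    return [[y / (a * a), -y / a],
            [-y / a,       y]]
```

Treating the tap `a` as a control variable inside the Jacobian, as the paper proposes, avoids regenerating these stamps at every tap adjustment.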

  7. Design and Development of an Automatic Tool Changer for an Articulated Robot Arm

    Science.gov (United States)

    Ambrosio, H.; Karamanoglu, M.

    2014-07-01

In the creative industries, the time between the ideation stage and the making of physical objects is decreasing due to the use of CAD/CAM systems and additive manufacturing. Natural anisotropic materials such as solid wood can also be shaped using CAD/CAM systems, but only with subtractive processes such as machining with CNC routers. While some three-axis CNC routing machines are affordable and widely available, more flexible five-axis routing machines still represent too large an investment for small companies. Small refurbished articulated robots can be a cheaper alternative, but they require a light end-effector. This paper presents a new lightweight tool changer that converts a small 3 kg payload six-DOF robot into a robot apprentice able to machine wood and similar soft materials.

  8. Four-bar linkage-based automatic tool changer: Dynamic modeling and torque optimization

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sangho; Seo, TaeWon [Yeungnam University, Gyeongsan (Korea, Republic of); Kim, Jong-Won; Kim, Jongwon [Seoul National University, Seoul (Korea, Republic of)

    2017-05-15

An automatic tool changer (ATC) is a device used in a tapping machine to reduce process time. This paper presents the optimization of a peak-torque reduction mechanism (PTRM) for an ATC. It is necessary to reduce the fatigue load and the energy consumed, both of which are related to the peak torque. The PTRM uses a torsion spring to reduce the peak torque and was applied to a novel ATC mechanism, which was modeled using inverse dynamics. Optimization of the PTRM is required to minimize the peak torque. The design parameters are the initial angle and the stiffness of the torsion spring, and the objective function is the peak torque of the input link. The torque was simulated, and the peak torque was decreased by 10%. The energy consumed was also reduced by the optimization.
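The optimization described above can be sketched as a small grid search over the two design parameters (the spring's free angle and stiffness), minimizing the peak input-link torque over a motion cycle. The linear spring model and the load profile are assumptions for the sketch, not the paper's inverse-dynamics model.

```python
# Illustrative version of the PTRM optimization: choose the torsion
# spring's free angle theta0 and stiffness k so that the spring assist
# k*(theta0 - theta) cancels as much of the load torque as possible,
# minimizing the peak input-link torque. Models are assumed, not the paper's.

def peak_torque(theta0, k, motion):
    """Peak |input torque| over a motion cycle given as (angle, load torque) pairs."""
    return max(abs(tau - k * (theta0 - th)) for th, tau in motion)

def optimize_spring(motion, theta0_grid, k_grid):
    """Exhaustive grid search; returns (peak torque, theta0, stiffness)."""
    return min((peak_torque(t0, k, motion), t0, k)
               for t0 in theta0_grid for k in k_grid)
```

A grid search is enough here because the design space is only two-dimensional; a gradient-based optimizer would be the natural next step for a finer resolution.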

  9. Game Changers

    Science.gov (United States)

    McCollum, Sean

    2011-01-01

New concepts of PE and sports programs are making it more fun for everyone to play. Diets featuring fast food and sugary soft drinks, together with declining physical activity, have contributed to a tripling of childhood obesity rates in the United States in the last 30 years, according to the U.S. Centers for Disease Control. Today, nearly a third of American…

  10. LifeChanger: A Pilot Study of a Game-Based Curriculum for Sexuality Education.

    Science.gov (United States)

    Gilliam, Melissa; Jagoda, Patrick; Heathcock, Stephen; Orzalli, Sarah; Saper, Carolyn; Dudley, Jessyca; Wilson, Claire

    2016-04-01

To assess the feasibility and acceptability of a game-based sexuality education curriculum, the evaluation used descriptive statistics, observation, and qualitative and quantitative data collection. The study was conducted with students from three eighth-grade classrooms in Chicago, Illinois, at a school using the game-based curriculum. The intervention had 11 modules and used an ecological model informed by the extant literature. It was developed by the Game Changer Chicago Design Lab and featured a card game designed with youth participation. The study outcomes of interest included learning, feasibility, and acceptability of the curriculum. Students highly rated frank conversation via "Ask the Doctor" sessions and role-playing, but raised concerns about the breadth of activities, preferring to explore fewer topics in greater depth. A game-based curriculum was feasible, yet students placed the highest value on frank discussion about sexuality. Copyright © 2016 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  11. Africa's game changers and the catalysts of social and system innovation

    Directory of Open Access Journals (Sweden)

    Mark Swilling

    2016-03-01

It is widely recognized that many African economies are being transformed by rapid economic growth driven largely by rising demand for the abundant natural resources scattered across the African continent. I critically review the mainstream game-changing dynamics driving this process, with special reference to a set of influential policy-oriented documents. This is followed by an analysis of less-recognized game-changing dynamics that have, in turn, been affected by the mainstream game-changing dynamics. These less-recognized game-changing dynamics include energy infrastructure challenges in a context of climate change, securing access to water, access to arable soils, slum urbanism, and food security responses. These mainstream and less-recognized game-changing dynamics provide the context for analyzing a range of African actor networks engaged in social and system innovations. I use a transdisciplinary framework to discuss these actor networks and how they construct their understanding of the game changers affecting their programs and actions. Based on a case study of the iShack initiative in Stellenbosch, South Africa, I conclude that social and system innovations will need to be driven by transformation knowledge co-produced by researchers and social actors who can actively link game-changing dynamics that operate at multiple scales with local-level innovations with potential societal impacts.

  12. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Recently, intelligent systems technology has become one of the major items in the development of machine tools. One crucial technology is the machinery status monitoring function, which is required for abnormal warnings and the improvement of cutting efficiency. During processing, the motion of the spindle unit drives the most frequently used and most important parts, such as the automatic tool changer. The vibration detection system comprises both hardware and software, including a vibration meter, a signal acquisition card, a data processing platform, and a machine control program. Because of differences between mechanical configurations and the desired characteristics, a vibration detection system cannot simply be assembled from commercially available kits. For this reason, self-development was chosen, along with the exploration of a parametric study significant enough to represent the machine's characteristics and states. The functional parts of the system were developed in parallel. Finally, the conditions and parameters generated from these states and characteristics were entered into the developed system to verify its feasibility.

  13. Photovoltaic Hosting Capacity of Feeders with Reactive Power Control and Tap Changers

    Energy Technology Data Exchange (ETDEWEB)

    Ceylan, Oğuzhan; Paudyal, Sumit; Bhattarai, Bishnu P.; Myers, Kurt S.

    2017-06-01

    This paper proposes an algorithm to determine the photovoltaic (PV) hosting capacity of power distribution networks as a function of the number of PV injection nodes, reactive power support from the PVs, and the substation load tap changers (LTCs). In the proposed method, several minute-by-minute simulations are run based on randomly chosen PV injection nodes, daily PV output profiles, and daily load profiles drawn from a pool of high-resolution realistic data. The simulation setup is built using OpenDSS and MATLAB. The performance of the proposed method is investigated on the IEEE 123-node distribution feeder for multiple scenarios. The case studies are performed for one, two, five, and ten PV injection nodes, focusing on the maximum voltage deviations. The case studies show that the PV hosting capacity of the 123-node feeder differs greatly with the number of PV injection nodes. We have also observed that the PV hosting capacity increases with reactive power support and a higher tap position of the substation LTC.
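The random-placement search outlined in the abstract can be sketched in a few lines. This is a toy stand-in, not the authors' code: the linear per-node voltage-sensitivity model, the +5% voltage limit, and all numeric values are illustrative assumptions replacing the OpenDSS/MATLAB power-flow simulations used in the paper.

```python
import random

# Toy Monte Carlo hosting-capacity search: grow total PV in fixed steps and,
# at each step, test many random placements over the injection nodes; return
# the largest total that never violated the voltage limit in any trial.
V_LIMIT = 1.05  # per-unit upper voltage limit (ANSI-style +5%)

def hosting_capacity(n_inj_nodes, n_feeder_nodes=123, trials=200,
                     step_kw=10, seed=1):
    rng = random.Random(seed)
    # assumed p.u. voltage rise per kW of PV at each feeder node
    sens = [rng.uniform(1e-5, 5e-4) for _ in range(n_feeder_nodes)]
    total_kw = 0
    while True:
        cand = total_kw + step_kw
        for _ in range(trials):
            nodes = rng.sample(range(n_feeder_nodes), n_inj_nodes)
            # split the candidate capacity evenly over the chosen nodes
            v = 1.0 + sum(sens[n] * cand / n_inj_nodes for n in nodes)
            if v > V_LIMIT:
                return total_kw  # last size that never violated the limit
        total_kw = cand

# Spreading PV over more injection nodes typically raises the feasible capacity:
print(hosting_capacity(1), hosting_capacity(10))
```

Swapping the one-line voltage model for a call into an actual distribution power flow turns this skeleton into the method the abstract describes.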

  14. Reducing emissions from deforestation and forest degradation (REDD+): game changer or just another quick fix?

    Science.gov (United States)

    Venter, Oscar; Koh, Lian Pin

    2012-02-01

    Reducing emissions from deforestation and forest degradation (REDD+) provides financial compensation to land owners who avoid converting standing forests to other land uses. In this paper, we review the main opportunities and challenges for REDD+ implementation, including expectations for REDD+ to deliver on multiple environmental and societal cobenefits. We also highlight a recent case study, the Norway-Indonesia REDD+ agreement, and discuss how it might be a harbinger of outcomes in other forest-rich nations seeking REDD+ funds. Looking forward, we critically examine the fundamental assumptions of REDD+ as a solution for the atmospheric buildup of greenhouse gas emissions and tropical deforestation. We conclude that REDD+ is currently the most promising mechanism driving the conservation of tropical forests. Yet, to emerge as a true game changer, REDD+ must still demonstrate that it can access low-transaction-cost, high-volume carbon markets or funds, while also providing or complementing a suite of nonmonetary incentives to encourage a developing nation's transition from forest losing to forest gaining, and align with, not undermine, a globally cohesive attempt to mitigate anthropogenic climate change. © 2012 New York Academy of Sciences.

  15. Business model generation a handbook for visionaries, game changers, and challengers

    CERN Document Server

    Osterwalder, Alexander

    2010-01-01

    Business Model Generation is a handbook for visionaries, game changers, and challengers striving to defy outmoded business models and design tomorrow's enterprises. If your organization needs to adapt to harsh new realities, but you don't yet have a strategy that will get you out in front of your competitors, you need Business Model Generation. Co-created by 470 "Business Model Canvas" practitioners from 45 countries, the book features a beautiful, highly visual, 4-color design that takes powerful strategic ideas and tools and makes them easy to implement in your organization. It explains the most common Business Model patterns, based on concepts from leading business thinkers, and helps you reinterpret them for your own context. You will learn how to systematically understand, design, and implement a game-changing business model, or analyze and renovate an old one. Along the way, you'll understand at a much deeper level your customers, distribution channels, partners, revenue streams, costs...

  16. The "glymphatic" mechanism for solute clearance in Alzheimer's disease: game changer or unproven speculation?

    Science.gov (United States)

    Smith, Alex J; Verkman, Alan S

    2017-11-03

    How solutes and macromolecules are removed from brain tissue is of central importance in normal brain physiology and in how toxic protein aggregates are cleared in neurodegenerative conditions, including Alzheimer's disease (AD). Conventionally, solute transport in the narrow and tortuous extracellular space in brain parenchyma has been thought to be primarily diffusive and nondirectional. The recently proposed "glymphatic" (glial-lymphatic) hypothesis posits that solute clearance is convective and driven by active fluid transport from para-arterial to paravenous spaces through aquaporin-4 water channels in astrocyte endfeet. Glymphatic, convective solute clearance has received much attention because of its broad implications for AD and other brain pathologies and even the function of sleep. However, the theoretical plausibility of glymphatic transport has been questioned, and recent data have challenged its experimental underpinnings. A substantiated mechanism of solute clearance in the brain is of considerable importance because of its implications for pathogenic mechanisms of neurologic diseases and delivery of therapeutics.-Smith, A. J., Verkman, A. S. The "glymphatic" mechanism for solute clearance in Alzheimer's disease: game changer or unproven speculation? © FASEB.

  17. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  18. Experimental Testing and Model Validation of a Decoupled-Phase On-Load Tap Changer Transformer in an Active Network

    DEFF Research Database (Denmark)

    Zecchino, Antonio; Hu, Junjie; Coppo, Massimiliano

    2016-01-01

    this problem, distribution transformers with on-load tapping capability are under development. This paper presents model and experimental validation of a 35 kVA three-phase power distribution transformer with independent on-load tap changer control capability on each phase. With the purpose of investigating...... to reproduce the main feature of an unbalanced grid. The experimental activities are recreated in by carrying out dynamics simulation studies, aiming at validating the implemented models of both the transformer as well as the other grid components. Phase-neutral voltages’ deviations are limited, proving...

  19. Voltage Control for Unbalanced Low Voltage Grids Using a Decoupled-Phase On-Load Tap-Changer Transformer and Photovoltaic Inverters

    DEFF Research Database (Denmark)

    Zecchino, Antonio; Marinelli, Mattia; Hu, Junjie

    2015-01-01

    This paper presents modeling and analysis of the potential benefits of joint actions of a MV/LV three-phase power distribution transformer with independent on-load tap-changer control on each phase and photovoltaic inverters provided with reactive power control capability, in terms of accommodating...

  20. Coordinated voltage control scheme for distribution grid with on-load tap-changer and distributed energy resources in a market context

    DEFF Research Database (Denmark)

    Han, Xue; Bindner, Henrik W.; Mehmedalic, Jasmin

    2015-01-01

    The evolutionary changes in the electricity system are reshaping the system operation and control to achieve a more sustainable environment. In this transition, distributed energy resources (DERs) may introduce some problems, such as intermittent features, but could also play an important role on...... for case study. The necessity of the coordination between DER units and the grid facilities, e.g., on-load tap-changer (OLTC), is addressed....

  1. Being a Game Changer

    Science.gov (United States)

    Herrig, Brian; Taranto, Greg

    2012-01-01

    One of the key features that draws many people to play video games is the fact that they are interactive. Video games allow the user to be actively engaged and in control of the action (Prensky, 2006). Seventh grade students at Canonsburg Middle School are actively engaging in the creation of video games. The students are engaged at a much deeper…

  2. Towards a compact and precise sample holder for macromolecular crystallography

    Science.gov (United States)

    Rossi, Christopher; Janocha, Robert; Sorez, Clement; Astruc, Anthony; McCarthy, Andrew; Belrhali, Hassan; Cipriani, Florent

    2017-01-01

    Most of the sample holders currently used in macromolecular crystallography offer limited storage density and poor initial crystal-positioning precision upon mounting on a goniometer. This has now become a limiting factor at high-throughput beamlines, where data collection can be performed in a matter of seconds. Furthermore, this lack of precision limits the potential benefits emerging from automated harvesting systems that could provide crystal-position information which would further enhance alignment at beamlines. This situation provided the motivation for the development of a compact and precise sample holder with corresponding pucks, handling tools and robotic transfer protocols. The development process included four main phases: design, prototype manufacture, testing with a robotic sample changer and validation under real conditions on a beamline. Two sample-holder designs are proposed: NewPin and miniSPINE. They share the same robot gripper and allow the storage of 36 sample holders in uni-puck footprint-style pucks, which represents 252 samples in a dry-shipping dewar commonly used in the field. The pucks are identified with human- and machine-readable codes, as well as with radio-frequency identification (RFID) tags. NewPin offers a crystal-repositioning precision of up to 10 µm but requires a specific goniometer socket. The storage density could reach 64 samples using a special puck designed for fully robotic handling. miniSPINE is less precise but uses a goniometer mount compatible with the current SPINE standard. miniSPINE is proposed for the first implementation of the new standard, since it is easier to integrate at beamlines. An upgraded version of the SPINE sample holder with a corresponding puck named SPINEplus is also proposed in order to offer a homogenous and interoperable system. The project involved several European synchrotrons and industrial companies in the fields of consumables and sample-changer robotics. Manual handling of mini

  3. Helga-Jane Scarwell et Isabelle Roussel (dir., 2010, Le changement climatique : Quand le climat nous pousse à changer d’ère, Presses Universitaire du Septentrion, Lille, 358 pages

    Directory of Open Access Journals (Sweden)

    Laurence Rocher

    2010-12-01

    The volume "Le changement climatique. Quand le climat nous pousse à changer d'ère", published by Presses Universitaires du Septentrion in the Environnement et société collection and coordinated by Helga-Jane Scarwell and Isabelle Roussel, brings together several contributions on the question of climate change by researchers in geography, most of them members of the TVES laboratory in Lille. The book offers an original and interesting perspective, combining case studies that include a fo...

  4. Game Changer: Linked Learning Detroit

    Science.gov (United States)

    ConnectEd: The California Center for College and Career, 2016

    2016-01-01

    JP Morgan Chase joins the Skillman Foundation, the Ford Foundation, and the Ford Motor Company Fund, whose grants total $7 million and will connect 10,000 Detroit high school students to career education and work experiences over the next three years through Linked Learning Detroit. Learn about Linked Learning Detroit through interviews with…

  5. El Nino: The climate changer

    Digital Repository Service at National Institute of Oceanography (India)

    RameshBabu, V.

    ... South Africa, the major food producer in the African continent, was forced to import corn from the USA to make up for production lost on account of drought. In addition to these effects, the 1982-83 El Nino was also responsible for the failure...

  6. Distance Learning: A Game Changer

    Science.gov (United States)

    Bates, Rodger; LaBrecque, Bryan

    2017-01-01

    Previous research identified a variety of special populations which may be serviced through online learning activities. These have included the military, Native Americans, prisoners, remote occupations, and others. This paper focuses the growing role of distance learning opportunities for student and professional athletes. Special attention is…

  7. Autonomous sample switcher for Mössbauer spectroscopy

    Science.gov (United States)

    López, J. H.; Restrepo, J.; Barrero, C. A.; Tobón, J. E.; Ramírez, L. F.; Jaramillo, J.

    2017-11-01

    In this work we show the design and implementation of an autonomous sample switcher device to be used as part of the experimental setup in transmission Mössbauer spectroscopy, which can be extended to other spectroscopic techniques employing radioactive sources. The changer is intended to minimize radiation exposure times for the users or technical staff and to optimize the use of radioactive sources without compromising the resolution of measurements or spectra. This proposal is motivated firstly by the potential hazards arising from the use of radioactive sources and secondly by the expensive costs involved and, in other cases, the short lifetimes, where a suitable and optimum use of the sources is crucial. The switcher system includes a PIC microcontroller for simple tasks involving sample displacement and positioning, in addition to a virtual instrument developed using LabVIEW. The shuffle of the samples proceeds sequentially based on the number of counts and the signal-to-noise ratio as selection criteria, whereas the virtual instrument allows remote monitoring of the spectra status from a PC via the Internet and taking control decisions. As an example, we show a case study involving a series of akaganeite samples. An efficiency and economic analysis is finally presented and discussed.
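The count/signal-to-noise selection criterion described above can be sketched as simple control logic. This is a hypothetical reconstruction, not the authors' PIC firmware or LabVIEW instrument; the thresholds and the `acquire` interface are illustrative assumptions.

```python
# Minimal sketch of the switching logic: keep acquiring on the current sample
# until both a minimum-count and a signal-to-noise criterion are met, then
# advance the changer to the next position. (A real system would also enforce
# a maximum live time per sample.)

MIN_COUNTS = 1_000_000   # assumed total-count target per spectrum
MIN_SNR = 20.0           # assumed signal-to-noise target

def ready_to_switch(total_counts, signal, noise):
    """True when the running spectrum satisfies both selection criteria."""
    snr = signal / noise if noise > 0 else float("inf")
    return total_counts >= MIN_COUNTS and snr >= MIN_SNR

def run_sequence(samples, acquire):
    """Sequentially measure each sample; `acquire` yields cumulative stats."""
    finished = []
    for sample in samples:
        for total, signal, noise in acquire(sample):
            if ready_to_switch(total, signal, noise):
                break
        finished.append(sample)  # move changer: next sample into the beam
    return finished
```

In the described device this decision would run on the PC-side virtual instrument, with the microcontroller handling only the mechanical displacement.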

  8. Autonomous sample switcher for Mössbauer spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    López, J. H., E-mail: jolobotero@gmail.com; Restrepo, J., E-mail: jrestre@gmail.com [University of Antioquia, Group of Magnetism and Simulation, Institute of Physics (Colombia); Barrero, C. A., E-mail: cesar.barrero.meneses@gmail.com [University of Antioquia, Group of Solid State Physics, Institute of Physics (Colombia); Tobón, J. E., E-mail: nobotj@gmail.com; Ramírez, L. F., E-mail: luisf.ramirez@udea.edu.co; Jaramillo, J., E-mail: jdex87@gmail.com [University of Antioquia, Group of Scientific Instrumentation and Microelectronics, Institute of Physics (Colombia)

    2017-11-15

    In this work we show the design and implementation of an autonomous sample switcher device to be used as part of the experimental setup in transmission Mössbauer spectroscopy, which can be extended to other spectroscopic techniques employing radioactive sources. The changer is intended to minimize radiation exposure times for the users or technical staff and to optimize the use of radioactive sources without compromising the resolution of measurements or spectra. This proposal is motivated firstly by the potential hazards arising from the use of radioactive sources and secondly by the expensive costs involved and, in other cases, the short lifetimes, where a suitable and optimum use of the sources is crucial. The switcher system includes a PIC microcontroller for simple tasks involving sample displacement and positioning, in addition to a virtual instrument developed using LabVIEW. The shuffle of the samples proceeds sequentially based on the number of counts and the signal-to-noise ratio as selection criteria, whereas the virtual instrument allows remote monitoring of the spectra status from a PC via the Internet and taking control decisions. As an example, we show a case study involving a series of akaganeite samples. An efficiency and economic analysis is finally presented and discussed.

  9. Improved sample manipulation at the STRESS-SPEC neutron diffractometer using an industrial 6-axis robot for texture and strain analyses

    Energy Technology Data Exchange (ETDEWEB)

    Randau, C. [Institute for Materials Science and Engineering, Clausthal University of Technology, D-38678 Clausthal-Zellerfeld (Germany); Brokmeier, H.G., E-mail: heinz-guenter.brokmeier@tu-clausthal.de [Institute for Materials Science and Engineering, Clausthal University of Technology, D-38678 Clausthal-Zellerfeld (Germany); Institute of Materials Research, Helmholtz-Centre Geesthacht, D-21502 Geesthacht (Germany); Gan, W.M. [Institute of Materials Research, Helmholtz-Centre Geesthacht, D-21502 Geesthacht (Germany); Hofmann, M.; Voeller, M. [Forschungsneutronenquelle Heinz Maier-Leibnitz (FRM II), TU München, D-85747 Garching (Germany); Tekouo, W. [Institute for Machine Tools and Industrial Management, TU München, D-85747 Garching (Germany); Al-hamdany, N. [Institute for Materials Science and Engineering, Clausthal University of Technology, D-38678 Clausthal-Zellerfeld (Germany); Seidl, G. [Forschungsneutronenquelle Heinz Maier-Leibnitz (FRM II), TU München, D-85747 Garching (Germany); Schreyer, A. [Institute of Materials Research, Helmholtz-Centre Geesthacht, D-21502 Geesthacht (Germany)

    2015-09-11

    The materials science neutron diffractometer STRESS-SPEC, located at FRM II, is a dedicated instrument for strain and pole-figure measurements. The two methods make complementary demands on sample handling: pole-figure measurements need a high degree of freedom to orient small samples, while strain investigations often require handling large and heavy components. Therefore a robot-based sample positioning system was developed that is capable of providing both. Based on this new robot system, further developments such as a fully automated sample changer for texture measurements were accomplished. Moreover, this system opens the door for combined strain and texture analysis at STRESS-SPEC.

  10. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling
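The balancing idea can be illustrated with a deliberately naive rejection scheme: draw simple random samples of locations until the sample mean of the covariate matches its known population mean within a tolerance. Real balanced designs use dedicated algorithms such as the cube method; this sketch only makes the balancing constraint concrete, and the covariate values are simulated, not from the paper.

```python
import random

# Naive "balanced sampling" by rejection: accept the first simple random
# sample whose covariate mean is within `tol` of the known population mean,
# so the linear property-covariate relation can be exploited in estimation.

def balanced_sample(covariate, n, tol, rng):
    pop_mean = sum(covariate) / len(covariate)
    while True:
        idx = rng.sample(range(len(covariate)), n)   # distinct locations
        samp_mean = sum(covariate[i] for i in idx) / n
        if abs(samp_mean - pop_mean) <= tol:
            return idx

rng = random.Random(3)
cov = [rng.uniform(0, 10) for _ in range(200)]   # e.g. elevation at 200 sites
sample_idx = balanced_sample(cov, n=20, tol=0.1, rng=rng)
```

Rejection becomes impractical as constraints multiply, which is exactly why production designs use the cube method instead; the sketch is for intuition only.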

  11. Development of a system using the library of the Genie spectroscopy software and exchange of samples

    Energy Technology Data Exchange (ETDEWEB)

    Lapolli, Andre L.; Munita, Casimiro S., E-mail: alapolli@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    One of the great difficulties in using the NAA method is the time that the operator spends exchanging samples after each measurement. This becomes a serious problem in routine analyses when various chemical elements are determined and each sample must be measured at different decay times. An automatic sample exchanger reduces the analysis time by several hours and eliminates a tedious manual operation, so the effective use of NAA depends on the availability of a suitable automatic sample changer. Some systems are sold commercially, but many laboratories cannot acquire them because they are costly. This paper presents the modified program G2KNAA.REX, which creates a screen making automatic or manual acquisition possible by calling the old program NAAACQ.rex for manual acquisitions and the new program NAAACQ2.rex for automatic ones. In conclusion, as can be seen in the program lines, the synchronization that unites the three systems (the computer, the Canberra set, and the sample exchanger) is done in a timely manner. The system was tested and is functioning satisfactorily. (author)
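The manual/automatic dispatch the record describes might look like the following sketch: one entry point routes either to a single manual acquisition or to an automatic loop that synchronizes the detector with the sample exchanger. Names such as `acquire_automatic`, `detector`, and `exchanger` are illustrative placeholders, not the actual Genie-2000/REXX interfaces.

```python
import time

# Hedged sketch of the acquisition dispatch: "manual" runs one measurement on
# the mounted sample; "automatic" moves the exchanger through each position,
# waits for it to settle, and acquires a spectrum per sample.

def acquire_manual(detector):
    detector.start()
    detector.wait_done()
    return detector.read_spectrum()

def acquire_automatic(detector, exchanger, n_samples, settle_s=2.0):
    spectra = []
    for position in range(n_samples):
        exchanger.move_to(position)   # place the next sample on the detector
        time.sleep(settle_s)          # let the changer settle before counting
        spectra.append(acquire_manual(detector))
    return spectra

def acquire(mode, detector, exchanger=None, n_samples=1):
    if mode == "manual":
        return [acquire_manual(detector)]
    return acquire_automatic(detector, exchanger, n_samples)
```

The point of the original work is precisely the timing glue between these three parties (computer, counting electronics, exchanger), which here is reduced to a single settle delay.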

  12. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that —on the whole— our sampling technique compares favourably with other methods, especially in the case of exploratory research.

  13. Venous Sampling

    Science.gov (United States)

    ... neck to help locate abnormally functioning glands or pituitary adenoma. This test is most often used after an unsuccessful neck exploration. Inferior petrosal sinus sampling, in which blood samples are taken from veins that drain the pituitary gland to study disorders related to pituitary hormone ...

  14. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  15. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our...

  16. Environmental sampling

    Energy Technology Data Exchange (ETDEWEB)

    Puckett, J.M.

    1998-12-31

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation.

  17. Elevating sampling

    Science.gov (United States)

    Labuz, Joseph M.; Takayama, Shuichi

    2014-01-01

    Sampling – the process of collecting, preparing, and introducing an appropriate volume element (voxel) into a system – is often underappreciated and pushed behind the scenes in lab-on-a-chip research. What often stands in the way between proof-of-principle demonstrations of potentially exciting technology and its broader dissemination and actual use, however, is the effectiveness of sample collection and preparation. The power of micro- and nanofluidics to improve reactions, sensing, separation, and cell culture cannot be accessed if sampling is not equally efficient and reliable. This perspective will highlight recent successes as well as assess current challenges and opportunities in this area. PMID:24781100

  18. Robotic Tool Changer for Planetary Exploration Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Future planetary exploration missions will require compact, lightweight robotic manipulators for handling a variety of tools & instruments without increasing the...

  19. STEM Career Changers' Transformation into Science Teachers

    Science.gov (United States)

    Snyder, Catherine; Oliveira, Alandeom W.; Paska, Lawrence M.

    2013-06-01

    This study examines the transformation (professional growth) of career-changing women scientists who decided to become teachers. Drawing upon Mezirow's Transformative Learning Theory, we tracked their transformation for 3 years. Our findings revealed multiple identities, disorientation, a perceived sense of meaninglessness, loss and eventual regain in confidence, gain in pedagogical knowledge and skill, and changed perceptions of the social roles of science teachers and scientists. Driven by personal choice or need (financial, intellectual), such transformations were achieved through active pursuit of meaning in one's work, critical assessment of assumptions, planning, and trying on the unfamiliar role of a science teacher. It is argued that such transition entails complex changes in thinking about science teaching and identifying oneself as a science teacher.

  20. iLearning, The Game Changer

    Science.gov (United States)

    2012-03-21

    After swearing in, Johnny will receive a military-issue iPad loaded with a suite of training applications ("apps", as they are commonly known). He...their first unit of assignment. This "unhinging" of learning from the classroom also opens a new and exciting IMT paradigm shift. Unhinged learning

  1. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance...
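The effect the abstract studies is easy to reproduce numerically: for a smooth integrand, exact one-dimensional systematic sampling can have essentially zero variance, while small random errors in point placement inflate it. The test function and the Gaussian error model below are illustrative choices, not taken from the paper.

```python
import math
import random

# Estimate the mean of f over [0, 1] by systematic sampling with a random
# grid start, with and without Gaussian jitter on each sample location,
# and compare the variances of the two estimators across many repetitions.

def f(x):
    return math.sin(2 * math.pi * x) ** 2   # true mean over [0, 1] is 0.5

def systematic_estimate(n, rng, jitter_sd=0.0):
    u = rng.random() / n                     # random start of the grid
    total = 0.0
    for k in range(n):
        x = u + k / n + rng.gauss(0.0, jitter_sd)
        total += f(x % 1.0)                  # wrap errors around the interval
    return total / n

def variance(estimates):
    m = sum(estimates) / len(estimates)
    return sum((e - m) ** 2 for e in estimates) / len(estimates)

rng = random.Random(7)
exact = [systematic_estimate(20, rng) for _ in range(2000)]
jitter = [systematic_estimate(20, rng, jitter_sd=0.01) for _ in range(2000)]
print(variance(exact), variance(jitter))
```

For this integrand the exact 20-point systematic estimate is analytically constant (the cosine terms cancel over the grid), so any nonzero variance in the jittered case is attributable to the placement errors alone.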

  2. An Abecedary of Sampling.

    Science.gov (United States)

    Doyle, Kenneth O., Jr.

    1979-01-01

    The vocabulary of sampling is examined in order to provide a clear understanding of basic sampling concepts. The basic vocabulary of sampling (population, probability sampling, precision and bias, stratification), the fundamental grammar of sampling (random sample), sample size and response rate, and cluster, multiphase, snowball, and panel…

  3. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes. Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem. More Intricacies: Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  4. Lunar Sample Atlas

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lunar Sample Atlas provides pictures of the Apollo samples taken in the Lunar Sample Laboratory, full-color views of the samples in microscopic thin-sections,...

  5. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital
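The claimed arrangement can be sketched behaviourally: the channels take turns tracking the input, each holds its value, and each channel's own converter digitises the held sample, multiplying the aggregate rate by the channel count. The ideal 8-bit quantizer, round-robin scheduling, and signal below are simplifying assumptions, not the patented circuit.

```python
import math

# Behavioural model of a time-interleaved (time-multiplexed) sampler:
# M track-and-hold channels sample in round-robin order, each feeding its
# own ideal ADC; the per-channel streams are then re-interleaved.

def quantize(x, bits=8, full_scale=1.0):
    """Ideal mid-rise ADC: map [-full_scale, +full_scale) to integer codes."""
    levels = 2 ** bits
    step = 2 * full_scale / levels
    return max(0, min(levels - 1, int((x + full_scale) / step)))

def interleaved_sample(signal, fs, n_samples, n_channels=4):
    out = [[] for _ in range(n_channels)]
    for k in range(n_samples):
        ch = k % n_channels               # round-robin channel selection
        held = signal(k / fs)             # channel ch tracks, then holds
        out[ch].append(quantize(held))    # per-channel ADC digitises the hold
    # interleave the per-channel streams back into one output sequence
    return [out[k % n_channels][k // n_channels] for k in range(n_samples)]

sig = lambda t: 0.5 * math.sin(2 * math.pi * 1e6 * t)  # 1 MHz test tone
codes = interleaved_sample(sig, fs=8e6, n_samples=16)  # aggregate 8 MS/s
```

The design payoff modelled here is that each channel's converter only needs to run at fs/M while the merged stream still represents the signal at the full rate fs.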

  6. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital

  7. A Mars Sample Return Sample Handling System

    Science.gov (United States)

    Wilson, David; Stroker, Carol

    2013-01-01

    We present a sample handling system, a subsystem of the proposed Dragon-landed Mars Sample Return (MSR) mission [1], that can return to Earth orbit a significant mass of frozen Mars samples potentially consisting of rock cores, subsurface drilled rock and ice cuttings, pebble-sized rocks, and soil scoops. The sample collection, storage, retrieval, and packaging assumptions and concepts in this study are applicable to NASA's MPPG MSR mission architecture options [2]. Our study assumes a predecessor rover mission collects samples for return to Earth to address questions on past life, climate change, water history, age dating, understanding Mars interior evolution [3], and human safety and in-situ resource utilization. Hence the rover will have "integrated priorities for rock sampling" [3] that cover collection of subaqueous or hydrothermal sediments, low-temperature fluid-altered rocks, unaltered igneous rocks, regolith, and atmosphere samples. Samples could include drilled rock cores, alluvial and fluvial deposits, subsurface ice and soils, clays, sulfates, salts including perchlorates, aeolian deposits, and concretions. Thus samples will have a broad range of bulk densities, and where practical require for Earth-based analysis: in-situ characterization, management of degradation such as perchlorate deliquescence and volatile release, and contamination management. We propose to adopt a sample container with a set of cups, each with a sample from a specific location. We considered two sample cup sizes: (1) a small cup sized for samples matching those submitted to in-situ characterization instruments, and (2) a larger cup for 100 mm rock cores [4] and pebble-sized rocks, thus providing diverse samples and optimizing the MSR sample mass payload fraction for a given payload volume. We minimize sample degradation by keeping samples frozen in the MSR payload sample canister using Peltier chip cooling. The cups are sealed by interference-fitted, heat-activated memory

  8. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
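    The optimal-stopping computation described above can be sketched numerically. The following is a minimal illustration, not the authors' model: it assumes a hypothetical linear cost per cue, and uses the Rayleigh CDF for the chance that the mean of n cues drawn from a circular Gaussian lands within the target radius. All parameter values are invented for the example.

```python
import math

def expected_gain(n, reward0, cost_per_cue, sigma, target_radius):
    # The mean of n cues from a circular Gaussian (sd sigma per axis) misses
    # the center by a Rayleigh-distributed distance with scale sigma/sqrt(n),
    # so P(hit) = 1 - exp(-r^2 * n / (2 * sigma^2)).
    p_hit = 1.0 - math.exp(-target_radius ** 2 * n / (2.0 * sigma ** 2))
    return (reward0 - cost_per_cue * n) * p_hit

def optimal_num_cues(reward0, cost_per_cue, sigma, target_radius, n_max=100):
    # Exhaustively search for the number of cues maximizing expected gain.
    return max(range(1, n_max + 1),
               key=lambda n: expected_gain(n, reward0, cost_per_cue,
                                           sigma, target_radius))

# Hypothetical condition: 100-point initial reward, 1 point per cue,
# cue sd 30 px, target radius 10 px (numbers are not from the study).
n_star = optimal_num_cues(100.0, 1.0, 30.0, 10.0)
```

    Each extra cue raises the hit probability but lowers the attainable reward, so the expected gain is unimodal in n and the maximizer is the normative stopping point.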

  9. Sampling and P-sampling expansions

    Indian Academy of Sciences (India)

    Using the hyperfinite representation of functions and generalized functions this paper develops a rigorous version of the so-called `delta method' approach to sampling theory. This yields a slightly more general version of the classical WKS sampling theorem for band-limited functions.

  10. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
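    The expected value and standard error of a sampling distribution mentioned above can be demonstrated by simulation. The sketch below uses arbitrary illustrative population parameters and shows that the simulated standard error tracks the theoretical σ/√n.

```python
import random
import statistics

random.seed(42)

def sampling_distribution(pop_mean, pop_sd, n, num_samples=20000):
    """Simulate the sampling distribution of the mean for samples of size n."""
    means = [statistics.fmean(random.gauss(pop_mean, pop_sd) for _ in range(n))
             for _ in range(num_samples)]
    return statistics.fmean(means), statistics.stdev(means)

# Theory: E[x-bar] = mu and SE = sigma / sqrt(n), so quadrupling the sample
# size should halve the standard error.
m4, se4 = sampling_distribution(pop_mean=50, pop_sd=12, n=4)     # SE ~ 6
m16, se16 = sampling_distribution(pop_mean=50, pop_sd=12, n=16)  # SE ~ 3
```

    The simulated means center on the population mean regardless of n, while the spread shrinks with √n, which is exactly the behavior the central limit theorem predicts.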

  11. Chorionic Villus Sampling (CVS)

    Science.gov (United States)

    Chorionic villus sampling (CVS) is a prenatal test. It’s used to ...

  12. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  13. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  14. Samples 9 (2010)

    OpenAIRE

    Arbeitskreis Studium Populärer Musik e.V. (ASPM)

    2010-01-01

    MAIN TOPIC: SAMPLING IN HIP-HOP. Guest editors: Oliver Kautny and Adam Krims. Oliver Kautny: Talkin' All That Jazz - A Plea for the Analysis of Sampling in Hip-Hop (editorial). Adam Krims: Sampling in Scholarship (English editorial). Mark Katz: Sampling before Sampling. The Link Between DJ and Producer. Sascha Klammt aka Quasi Modo: The Sample - A Unique Snapshot as the Basis for a New Composition. Detlev Rick aka DJ Rick Ski: The Genesis of the A...

  15. Sampling in Practice

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    A basic knowledge of the Theory of Sampling (TOS) and a set of only eight sampling unit operations is all the practical sampler needs to ensure representativeness of samples extracted from all kinds of lots: production batches, truckloads, barrels, sub-division in the laboratory, sampling in nature and in the field (environmental sampling, forestry, geology, biology), from raw materials or manufacturing processes, etc. Here we can only give a brief introduction to the Fundamental Sampling Principle (FSP) and these eight Sampling Unit Operations (SUOs). Always respecting FSP and invoking only...

  16. Sampling and P-sampling expansions

    Indian Academy of Sciences (India)

    In this paper we consider instead a non-standard approach to sampling theory. The hyperfinite representation of functions and generalized functions has been studied in an earlier paper [2], and the same notation and conventions will be used here. In particular, ... denotes a given even infinite hypernatural number ...

  17. Genetic Sample Inventory

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected primarily from the U.S. east coast. The collection includes samples from field programs,...

  18. Genetic Sample Inventory - NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected in the North-Central Gulf of Mexico from 2010-2015. The collection includes samples from...

  19. Superposition Enhanced Nested Sampling

    Science.gov (United States)

    Martiniani, Stefano; Stevenson, Jacob D.; Wales, David J.; Frenkel, Daan

    2014-07-01

    The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.
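    The nested sampling procedure this abstract builds on can be illustrated with a toy implementation. This is a minimal sketch of plain nested sampling on a one-dimensional Gaussian likelihood over a uniform prior, not the superposition-enhanced variant the paper describes; the constrained-prior draw uses naive rejection sampling, which is only viable for tiny problems like this one.

```python
import math
import random

random.seed(1)

def log_like(x):
    # Gaussian likelihood centered at 0.5 (sd 0.05) over a uniform [0, 1] prior
    return -0.5 * ((x - 0.5) / 0.05) ** 2 - math.log(0.05 * math.sqrt(2 * math.pi))

def nested_sampling(n_live=100, n_iter=600):
    """Estimate the evidence Z = integral of L(x) dx by nested sampling."""
    live = [random.random() for _ in range(n_live)]
    log_l = [log_like(x) for x in live]
    z, x_prev = 0.0, 1.0  # x_prev tracks the remaining prior mass
    for i in range(1, n_iter + 1):
        worst = min(range(n_live), key=lambda j: log_l[j])
        l_star = log_l[worst]
        x_i = math.exp(-i / n_live)  # expected geometric shrinkage per step
        z += math.exp(l_star) * (x_prev - x_i)
        x_prev = x_i
        # Replace the worst point by a prior draw above the likelihood
        # threshold (naive rejection sampling; fine only for toy problems).
        while True:
            x_new = random.random()
            if log_like(x_new) > l_star:
                live[worst], log_l[worst] = x_new, log_like(x_new)
                break
    # Contribution of the remaining live points
    z += x_prev * sum(math.exp(l) for l in log_l) / n_live
    return z

z = nested_sampling()  # true evidence is ~1.0 for this toy problem
```

    The replacement step is exactly where broken ergodicity bites in realistic landscapes, and where the paper's use of global optimization to seed the constrained draws comes in.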

  20. Mold Testing or Sampling

    Science.gov (United States)

    In most cases, if visible mold growth is present, sampling is unnecessary. Since no EPA or other federal limits have been set for mold or mold spores, sampling cannot be used to check a building's compliance with federal mold standards.

  1. Chorionic villus sampling

    Science.gov (United States)

    Chorionic villus sampling (CVS) is a test some pregnant women have ...

  2. Sampling on Quasicrystals

    OpenAIRE

    Grepstad, Sigrid

    2011-01-01

    We prove that quasicrystals are universal sets of stable sampling in any dimension. Necessary and sufficient density conditions for stable sampling and interpolation sets in one dimension are studied in detail.

  3. Superposition Enhanced Nested Sampling

    Directory of Open Access Journals (Sweden)

    Stefano Martiniani

    2014-08-01

    The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.

  4. Lunar Sample Compendium

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of the Lunar Sample Compendium is to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon....

  5. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

    Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut and dry procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features and application of improved…

  6. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
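    The link between detection probability and sample size described above is commonly computed from a draw-without-replacement model. The sketch below uses the standard approximation 1 − P ≈ (1 − n/N)^d; the numbers are illustrative only, not actual IAEA parameters.

```python
import math

def sample_size(N, d, detection_prob):
    """Smallest n so that sampling n of N items, d of which are defective,
    finds at least one defect with probability >= detection_prob, using the
    draw-without-replacement approximation 1 - P ~ (1 - n/N)**d."""
    return math.ceil(N * (1.0 - (1.0 - detection_prob) ** (1.0 / d)))

# Illustrative only: 300 items, 10 falsified items would conceal a
# significant quantity, 90% detection probability required.
n = sample_size(N=300, d=10, detection_prob=0.90)  # -> 62
```

    Note how the required sample size depends on the number of items that would have to be falsified, not directly on the population size: halving d roughly doubles n.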

  7. Aerosol sampling system

    Science.gov (United States)

    Masquelier, Donald A.

    2004-02-10

    A system for sampling air and collecting particulate of a predetermined particle size range. A low pass section has an opening of a preselected size for gathering the air but excluding particles larger than the sample particles. An impactor section is connected to the low pass section and separates the air flow into a bypass air flow that does not contain the sample particles and a product air flow that does contain the sample particles. A wetted-wall cyclone collector, connected to the impactor section, receives the product air flow and traps the sample particles in a liquid.

  8. Sample Proficiency Test exercise

    Energy Technology Data Exchange (ETDEWEB)

    Alcaraz, A; Gregg, H; Koester, C

    2006-02-05

    The current format of the OPCW proficiency tests has multiple sets of 2 samples sent to an analysis laboratory. In each sample set, one is identified as a sample, the other as a blank. This method of conducting proficiency tests differs from how an OPCW designated laboratory would receive authentic samples (a set of three containers, each not identified, consisting of the authentic sample, a control sample, and a blank sample). This exercise was designed to test how results would be reported if proficiency tests were conducted in that authentic-sample format. As such, this is not an official OPCW proficiency test, and the attached report is one method by which LLNL might report their analyses under a more realistic testing scheme. Therefore, the title on the report, "Report of the Umpteenth Official OPCW Proficiency Test", is meaningless, and provides a bit of whimsy for the analysts and readers of the report.

  9. The Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides back up material for public displays, captures information found only in abstracts, grey literature and curatorial databases and serves as a ready access to the now-vast scientific literature.

  10. Fast mixing hyperdynamic sampling

    OpenAIRE

    Sminchisescu, Cristian; Triggs, Bill

    2006-01-01

    Special issue on ECCV'02 papers; International audience; Sequential random sampling (‘Markov Chain Monte-Carlo') is a popular strategy for many vision problems involving multi-modal distributions over high-dimensional parameter spaces. It applies both to importance sampling (where one wants to sample points according to their ‘importance' for some calculation, but otherwise fairly) and to global-optimization (where one wants to find good minima, or at least good starting points for local mini...
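    The sequential random sampling ("Markov Chain Monte-Carlo") strategy the abstract refers to can be illustrated with a minimal random-walk Metropolis sampler on a bimodal target. This sketch is generic MCMC, not the hyperdynamic method of the paper; the target and tuning values are invented for the example.

```python
import math
import random

random.seed(0)

def log_target(x):
    # Bimodal target: equal mixture of N(-3, 1) and N(+3, 1), unnormalized
    p = 0.5 * math.exp(-0.5 * (x + 3) ** 2) + 0.5 * math.exp(-0.5 * (x - 3) ** 2)
    return math.log(p) if p > 0.0 else -math.inf

def metropolis(n_steps=50000, step=4.0):
    """Random-walk Metropolis: accept a proposal with prob min(1, p(x')/p(x))."""
    x, samples = 0.0, []
    for _ in range(n_steps):
        proposal = x + random.uniform(-step, step)
        if random.random() < math.exp(min(0.0, log_target(proposal) - log_target(x))):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis()
```

    With a proposal width wide enough to bridge the modes the chain visits both roughly equally; shrink `step` well below the mode separation and the chain gets trapped, which is precisely the multimodality problem hyperdynamic sampling targets.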

  11. Hyperdynamics Importance Sampling

    OpenAIRE

    Sminchisescu, Cristian; Triggs, Bill

    2002-01-01

    International audience; Sequential random sampling (‘Markov Chain Monte-Carlo') is a popular strategy for many vision problems involving multimodal distributions over high-dimensional parameter spaces. It applies both to importance sampling (where one wants to sample points according to their ‘importance' for some calculation, but otherwise fairly) and to global optimization (where one wants to find good minima, or at least good starting points for local minimization, regardless of fairness)....

  12. Laboratory Sampling Guide

    Science.gov (United States)

    2012-05-11

    The guide covers metal analytes collected on paper or MCE filters (aluminum, antimony, arsenic, barium, beryllium, cadmium, chromium (total), cobalt, copper, iron, lead, magnesium, manganese, molybdenum, nickel, potassium) and provides a sampling equipment selection guide organized by matrix, sampling device, device-specific guidance, and sample type; for liquid matrices it recommends an automatic sampler per ASTM D

  13. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
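    A standard instance of the sample size determination problem the book addresses, estimating a population mean to within a given margin of error, can be sketched as follows. This uses the textbook normal-approximation formula n = (zσ/E)², plus the usual correction for the finite-population case the summary mentions; the numbers are illustrative.

```python
import math

def sample_size_for_mean(sigma, margin_of_error, z=1.96):
    """n so that a confidence interval for the mean (normal approximation,
    critical value z) has half-width <= margin_of_error: n = (z*sigma/E)^2."""
    return math.ceil((z * sigma / margin_of_error) ** 2)

def finite_population_correction(n, N):
    """Adjust an infinite-population n for a finite population of size N."""
    return math.ceil(n / (1.0 + (n - 1.0) / N))

# Illustrative: sd 15, desired 95% CI half-width of 2 units
n = sample_size_for_mean(sigma=15, margin_of_error=2)  # -> 217
n_small = finite_population_correction(n, N=1000)      # -> 179
```

    The correction matters whenever the uncorrected n is a non-trivial fraction of the population, which is common in survey sampling from finite lots.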

  14. Sample Differentiation: Cocaine Example.

    Science.gov (United States)

    Baugh, L D; Liu, R H

    1991-12-01

    Since the analyses of drug samples in crime laboratories are often associated with investigations, potential differentiations of test samples are frequently requested and explored. Cocaine sample differentiation requires the determination of synthetic or natural origin. Synthetic samples are characterized by the presence of optical isomers, certain diastereoisomers and other by-products, and chemical residues used in synthesis. Samples derived from a natural origin (coca leaves) are characterized by the presence of certain natural products or their derivatives that are carried through the overall process and by residual chemicals reflecting the treatment procedures. Various approaches and analytical data available in the literature concerning the differentiation of cocaine samples are reviewed. Each sample must carry its own "signature"; however, true sample "individualization" cannot be accomplished using the technologies commonly available and used in crime laboratories, and is not usually needed. Alternatively, "classifying" cocaine samples in certain categories or groups can be accomplished routinely and often provides adequate information for investigatory purposes. Copyright © 1991 Central Police University.

  15. Game-Changer: The Illusion of War Without Risk

    Science.gov (United States)

    2017-04-28

    NAVAL WAR COLLEGE Newport, R.I. ‘GAME-CHANGER’: THE ILLUSION OF WAR WITHOUT RISK ... Midway could have been a feint, as Dutch Harbor was. The idea of perfect intelligence is a dream, and not even a beautiful one: it teaches commanders

  16. social enterprise as the game-changer: embracing innovation

    African Journals Online (AJOL)

    Mugumbate

    issues such as poverty, joblessness, homelessness, drug abuse, child abuse, divorce, juvenile delinquency and many others continue to ... of research into business and social sector relationships in Newcastle and the Hunter region of New South Wales, Australia. Asia Pacific Journal of Social Work, 12(2), 96–122. Gray, M ...

  17. Stakeholder engagement: the game changer for program management

    National Research Council Canada - National Science Library

    Baugh, Amy

    2015-01-01

    .... The first section of the book covers stakeholder engagement in the program definition phase, including how to identify key stakeholders, gain their trust, and build relationships through effective communication...

  18. Metagenomics: The Next Culture-Independent Game Changer

    Directory of Open Access Journals (Sweden)

    Jessica D. Forbes

    2017-07-01

    A trend towards the abandonment of obtaining pure culture isolates in frontline laboratories is at a crossroads with the ability of public health agencies to perform their basic mandate of foodborne disease surveillance and response. The implementation of culture-independent diagnostic tests (CIDTs) including nucleic acid and antigen-based assays for acute gastroenteritis is leaving public health agencies without laboratory evidence to link clinical cases to each other and to food or environmental substances. This limits the efficacy of public health epidemiology and surveillance as well as outbreak detection and investigation. Foodborne outbreaks have the potential to remain undetected or have insufficient evidence to support source attribution and may inadvertently increase the incidence of foodborne diseases. Next-generation sequencing of pure culture isolates in clinical microbiology laboratories has the potential to revolutionize the fields of food safety and public health. Metagenomics and other ‘omics’ disciplines could provide the solution to a cultureless future in clinical microbiology, food safety and public health. Data mining of information obtained from metagenomics assays can be particularly useful for the identification of clinical causative agents or foodborne contamination, detection of AMR and/or virulence factors, in addition to providing high-resolution subtyping data. Thus, metagenomics assays may provide a universal test for clinical diagnostics, foodborne pathogen detection, subtyping and investigation. This information has the potential to reform the field of enteric disease diagnostics and surveillance and also infectious diseases as a whole. The aim of this review will be to present the current state of CIDTs in diagnostic and public health laboratories as they relate to foodborne illness and food safety. 
Moreover, we will also discuss the diagnostic and subtyping utility and concomitant bias limitations of metagenomics and comparable detection techniques in clinical microbiology, food and public health laboratories. Early advances in the discipline of metagenomics, however, have indicated noteworthy challenges. Through forthcoming improvements in sequencing technology and analytical pipelines among others, we anticipate that within the next decade, detection and characterization of pathogens via metagenomics-based workflows will be implemented in routine usage in diagnostic and public health laboratories.

  19. Ares V: Game Changer for National Security Launch

    Science.gov (United States)

    Sumrall, Phil; Morris, Bruce

    2009-01-01

    NASA is designing the Ares V cargo launch vehicle to vastly expand exploration of the Moon begun in the Apollo program and enable the exploration of Mars and beyond. As the largest launcher in history, Ares V also represents a national asset offering unprecedented opportunities for new science, national security, and commercial missions of unmatched size and scope. The Ares V is the heavy-lift component of NASA's dual-launch architecture that will replace the current space shuttle fleet, complete the International Space Station, and establish a permanent human presence on the Moon as a stepping-stone to destinations beyond. During extensive independent and internal architecture and vehicle trade studies as part of the Exploration Systems Architecture Study (ESAS), NASA selected the Ares I crew launch vehicle and the Ares V to support future exploration. The smaller Ares I will launch the Orion crew exploration vehicle with four to six astronauts into orbit. The Ares V is designed to carry the Altair lunar lander into orbit, rendezvous with Orion, and send the mated spacecraft toward lunar orbit. The Ares V will be the largest and most powerful launch vehicle in history, providing unprecedented payload mass and volume to establish a permanent lunar outpost and explore significantly more of the lunar surface than was done during the Apollo missions. The Ares V consists of a Core Stage, two Reusable Solid Rocket Boosters (RSRBs), Earth Departure Stage (EDS), and a payload shroud. For lunar missions, the shroud would cover the Lunar Surface Access Module (LSAM). The Ares V Core Stage is 33 feet in diameter and 212 feet in length, making it the largest rocket stage ever built. It is the same diameter as the Saturn V first stage, the S-IC. However, its length is about the same as the combined length of the Saturn V first and second stages. 
    The Core Stage uses a cluster of five Pratt & Whitney Rocketdyne RS-68B rocket engines, each supplying about 700,000 pounds of thrust. Its propellants are liquid hydrogen and liquid oxygen. The two solid rocket boosters provide about 3.5 million pounds of thrust at liftoff. These 5.5-segment boosters are derived from the 4-segment boosters now used on the Space Shuttle, and are similar to those used in the Ares I first stage. The EDS is powered by one J-2X engine. The J-2X, which has roughly 294,000 pounds of thrust, also powers the Ares I Upper Stage. It is derived from the J-2 that powered the Saturn V second and third stages. The EDS performs two functions. Its initial suborbital burns will place the lunar lander into a stable Earth orbit. After the Orion crew vehicle, launched separately on an Ares I, docks with the lander/EDS stack, EDS will ignite a second time to put the combined 65-metric ton vehicle into a lunar transfer orbit. When it stands on the launch pad at Kennedy Space Center late in the next decade, the Ares V stack will be approximately 381 feet tall and have a gross liftoff mass of 8.1 million pounds. The current point-of-departure design exceeds Saturn V's mass capability by approximately 40 percent. Using the current payload shroud design, Ares V can carry 315,000 pounds to 29-degree low Earth orbit (LEO) or 77,000 pounds to a geosynchronous orbit. Another unique aspect of the Ares V is the 33-foot-diameter payload shroud, which encloses approximately 30,400 cubic feet of usable volume. A larger hypothetical shroud for encapsulating larger payloads has been studied. While Ares V makes possible larger payload masses and volumes, it may alternately make possible more cost-effective mission design if the relevant payload communities are willing to consider an alternative to the existing approach that has driven them to employ complexity to solve current launch vehicle mass and volume constraints. 
    By using Ares V's mass and volume capabilities as margin, payload designers stand to reduce development risk and cost. Significant progress has been made on the Ares V to support a planned fiscal 2011 authority-to-proceed (ATP) milestone. The Ares V team is actively reaching out to external organizations during this early concept phase to ensure that the Ares V vehicle can be leveraged for national security, science, and commercial development needs. This presentation will discuss Ares V vehicle configuration, the path to the current concept, accomplishments to date, and potential payload utilization opportunities.

  20. Life Satisfaction of Former-Military, Second-Career Changers

    Science.gov (United States)

    Robertson, Heather

    2014-01-01

    One hundred thirty-six former-military members with an average age of 51 transitioning to a second career in teaching were surveyed regarding life satisfaction and were found to be satisfied with their lives. The research complements earlier studies of second-career teachers as effective teachers, yet provides additional insight on former-military…

  1. Metagenomics: The Next Culture-Independent Game Changer

    Science.gov (United States)

    Forbes, Jessica D.; Knox, Natalie C.; Ronholm, Jennifer; Pagotto, Franco; Reimer, Aleisha

    2017-01-01

    A trend towards the abandonment of obtaining pure culture isolates in frontline laboratories is at a crossroads with the ability of public health agencies to perform their basic mandate of foodborne disease surveillance and response. The implementation of culture-independent diagnostic tests (CIDTs) including nucleic acid and antigen-based assays for acute gastroenteritis is leaving public health agencies without laboratory evidence to link clinical cases to each other and to food or environmental substances. This limits the efficacy of public health epidemiology and surveillance as well as outbreak detection and investigation. Foodborne outbreaks have the potential to remain undetected or have insufficient evidence to support source attribution and may inadvertently increase the incidence of foodborne diseases. Next-generation sequencing of pure culture isolates in clinical microbiology laboratories has the potential to revolutionize the fields of food safety and public health. Metagenomics and other ‘omics’ disciplines could provide the solution to a cultureless future in clinical microbiology, food safety and public health. Data mining of information obtained from metagenomics assays can be particularly useful for the identification of clinical causative agents or foodborne contamination, detection of AMR and/or virulence factors, in addition to providing high-resolution subtyping data. Thus, metagenomics assays may provide a universal test for clinical diagnostics, foodborne pathogen detection, subtyping and investigation. This information has the potential to reform the field of enteric disease diagnostics and surveillance and also infectious diseases as a whole. The aim of this review will be to present the current state of CIDTs in diagnostic and public health laboratories as they relate to foodborne illness and food safety. 
Moreover, we will also discuss the diagnostic and subtyping utility and concomitant bias limitations of metagenomics and comparable detection techniques in clinical microbiology, food and public health laboratories. Early advances in the discipline of metagenomics, however, have indicated noteworthy challenges. Through forthcoming improvements in sequencing technology and analytical pipelines among others, we anticipate that within the next decade, detection and characterization of pathogens via metagenomics-based workflows will be implemented in routine usage in diagnostic and public health laboratories. PMID:28725217

  2. Local Identification of Voltage Instability from Load Tap Changer Response

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Papangelis, Lampros; Vournas, Costas D.

    2017-01-01

    This paper presents a local long-term voltage instability monitoring method, which is suitable for on-line applications. The proposed extended-time Local Identification of Voltage Emergency Situations (eLIVES) method is a significantly modified version of the previously presented LIVES method. The method applies ... to acquired distribution voltage measurements and a new set of rules to detect a voltage emergency situation. The effectiveness of the eLIVES method is presented on the IEEE Nordic test system for voltage stability and security assessment.

  3. Metagenomics: The Next Culture-Independent Game Changer.

    Science.gov (United States)

    Forbes, Jessica D; Knox, Natalie C; Ronholm, Jennifer; Pagotto, Franco; Reimer, Aleisha

    2017-01-01

    A trend towards the abandonment of obtaining pure culture isolates in frontline laboratories is at a crossroads with the ability of public health agencies to perform their basic mandate of foodborne disease surveillance and response. The implementation of culture-independent diagnostic tests (CIDTs) including nucleic acid and antigen-based assays for acute gastroenteritis is leaving public health agencies without laboratory evidence to link clinical cases to each other and to food or environmental substances. This limits the efficacy of public health epidemiology and surveillance as well as outbreak detection and investigation. Foodborne outbreaks have the potential to remain undetected or have insufficient evidence to support source attribution and may inadvertently increase the incidence of foodborne diseases. Next-generation sequencing of pure culture isolates in clinical microbiology laboratories has the potential to revolutionize the fields of food safety and public health. Metagenomics and other 'omics' disciplines could provide the solution to a cultureless future in clinical microbiology, food safety and public health. Data mining of information obtained from metagenomics assays can be particularly useful for the identification of clinical causative agents or foodborne contamination, detection of AMR and/or virulence factors, in addition to providing high-resolution subtyping data. Thus, metagenomics assays may provide a universal test for clinical diagnostics, foodborne pathogen detection, subtyping and investigation. This information has the potential to reform the field of enteric disease diagnostics and surveillance and also infectious diseases as a whole. The aim of this review will be to present the current state of CIDTs in diagnostic and public health laboratories as they relate to foodborne illness and food safety. 
We also discuss the diagnostic and subtyping utility, and the concomitant bias limitations, of metagenomics and comparable detection techniques in clinical microbiology, food and public health laboratories. Early advances in the discipline of metagenomics, however, have indicated noteworthy challenges. Through forthcoming improvements in sequencing technology and analytical pipelines, among others, we anticipate that within the next decade, detection and characterization of pathogens via metagenomics-based workflows will be implemented for routine use in diagnostic and public health laboratories.

  4. Social enterprise as the game-changer: embracing innovation and ...

    African Journals Online (AJOL)

    The paper underscores that since social enterprise is premised on a culture of innovation, openness and adaptation, it represents a hands-on approach to sustainable community economic development. This article concludes by proposing strategies for incorporating entrepreneurial initiatives in community and social sector ...

  5. A Game Changer: Electrifying Remote Communities by Using Isolated Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Xiaonan; Wang, Jianhui

    2017-06-01

    Microgrids, as self-contained entities, are of increasing interest in modern electric grids. Microgrids provide a sustainable solution to aggregate distributed energy resources (DERs) [e.g., photovoltaics (PVs), wind turbines], energy storage, and loads in a localized manner, especially in distribution systems. As a controllable unit, a microgrid can manage and balance the source and load power inside it to ensure stable and reliable operation. Moreover, through coordination with upper-level control systems, it can be dispatched and can respond to control commands issued by the central controller in the distribution system; in other words, by what is effectively a distribution management system (DMS).

  6. Digital Suicide Prevention: Can Technology Become a Game-changer?

    Science.gov (United States)

    Vahabzadeh, Arshya; Sahin, Ned; Kalali, Amir

    2016-01-01

    Suicide continues to be a leading cause of death and has been recognized as a significant public health issue. Rapid advances in data science can provide us with useful tools for suicide prevention, and help to dynamically assess suicide risk in quantitative data-driven ways. In this article, the authors highlight the most current international research in digital suicide prevention, including the use of machine learning, smartphone applications, and wearable sensor driven systems. The authors also discuss future opportunities for digital suicide prevention, and propose a novel Sensor-driven Mental State Assessment System.

  7. Large Stationary Gravity Waves: A Game Changer for Venus' Science

    Science.gov (United States)

    Navarro, T.; Schubert, G.; Lebonnois, S.

    2017-11-01

    In 2015, the Akatsuki spacecraft discovered an astonishing, unexpected, 10,000 km long meridional structure at the cloud top, stationary with respect to the surface. This discovery calls into question our very basic understanding of Venus.

  8. Industrial Hygiene Sampling Instructions

    Science.gov (United States)

    1987-03-01

    transport. g. Field blank tubes will be submitted with each set of samples. If the number of samples in a set exceeds 10, then submit at the rate of one... [table fragment of filter media and cassettes: Gelman PSPJ037 (for PAH), 37 mm; Membrana PVC, 37 mm; Gelman 66467, 37 mm; Nuclepore 361850 (filter only) / 240810 (pad only), 37 mm; Millipore Swinnex cassette, 13 mm]

  9. Simple street tree sampling

    Science.gov (United States)

    David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry. Bond

    2015-01-01

    Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...

  10. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  11. Extraterrestrial Samples at JSC

    Science.gov (United States)

    Allen, Carlton C.

    2007-01-01

    A viewgraph presentation on the curation of extraterrestrial samples at NASA Johnson Space Center is shown. The topics include: 1) Apollo lunar samples; 2) Meteorites from Antarctica; 3) Cosmic dust from the stratosphere; 4) Genesis solar wind ions; 5) Stardust comet and interstellar grains; and 6) Space-exposed hardware.

  12. Gaussian Boson Sampling

    Science.gov (United States)

    Hamilton, Craig S.; Kruse, Regina; Sansoni, Linda; Barkhofen, Sonja; Silberhorn, Christine; Jex, Igor

    2017-10-01

    Boson sampling has emerged as a tool to explore the advantages of quantum over classical computers as it does not require universal control over the quantum system, which favors current photonic experimental platforms. Here, we introduce Gaussian Boson sampling, a classically hard-to-solve problem that uses squeezed states as a nonclassical resource. We relate the probability to measure specific photon patterns from a general Gaussian state in the Fock basis to a matrix function called the Hafnian, which answers the last remaining question of sampling from Gaussian states. Based on this result, we design Gaussian Boson sampling, a #P hard problem, using squeezed states. This demonstrates that Boson sampling from Gaussian states is possible, with significant advantages in the photon generation probability, compared to existing protocols.
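
    The record above relates photon-pattern probabilities to the Hafnian, the matrix function that sums products of entries over all perfect matchings of the indices. As a minimal illustration (my own sketch, not the scalable algorithms used in Gaussian Boson sampling research), a brute-force Hafnian can be written by enumerating matchings directly:

    ```python
    from math import prod

    def hafnian(A):
        """Brute-force Hafnian of a symmetric 2n x 2n matrix: the sum, over
        all perfect matchings of the indices, of the product of the matched
        entries A[i][j]. Exponential time; for illustration only."""
        n = len(A)
        assert n % 2 == 0, "the Hafnian is defined for even-dimensional matrices"

        def matchings(rest):
            # Recursively enumerate perfect matchings of the index list `rest`.
            if not rest:
                yield []
                return
            first, tail = rest[0], rest[1:]
            for k, j in enumerate(tail):
                for m in matchings(tail[:k] + tail[k + 1:]):
                    yield [(first, j)] + m

        return sum(prod(A[i][j] for i, j in m) for m in matchings(list(range(n))))
    ```

    For example, the Hafnian of the 4 x 4 all-ones matrix is 3, one term per perfect matching of four indices; this exponential growth in matchings is what makes the problem classically hard.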

  13. Sample pretreatment in microsystems

    DEFF Research Database (Denmark)

    Perch-Nielsen, Ivan R.

    2003-01-01

    When a sample, e.g. from a patient, is processed using conventional methods, the sample must be transported to the laboratory where it is analyzed, after which the results are sent back. By integrating the separate steps of the analysis in a micro total analysis system (μTAS), results can be obtained fast and better, preferably with all the processes from sample to signal moved to the bedside of the patient. Of course there is still much to learn and study in the process of miniaturization. DNA analysis is one process subject to integration. There are roughly three steps in a DNA analysis: Sample preparation → DNA amplification → DNA analysis. The overall goal of the project is integration of as many as possible of these steps. This thesis covers mainly pretreatment in a microchip. Some methods for sample pretreatment have been tested; most conventional is fluorescence activated cell sort...

  14. Biological sample collector

    Science.gov (United States)

    Murphy, Gloria A [French Camp, CA

    2010-09-07

    A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  15. Rapid Active Sampling Package

    Science.gov (United States)

    Peters, Gregory

    2010-01-01

    A field-deployable, battery-powered Rapid Active Sampling Package (RASP), originally designed for sampling strong materials during lunar and planetary missions, shows strong utility for terrestrial geological use. The technology is proving to be simple and effective for sampling and processing strong materials. Although it originally was intended for planetary and lunar applications, the RASP is very useful as a powered hand tool for geologists and the mining industry to quickly sample and process rocks in the field on Earth. The RASP allows geologists to surgically acquire samples of rock for later laboratory analysis. This tool, roughly the size of a wrench, allows the user to cut away swaths of weathering rinds, revealing pristine rock surfaces for observation and subsequent sampling with the same tool. RASPing deeper (~3.5 cm) exposes single rock strata in situ. Where a geologist's hammer can only expose unweathered layers of rock, the RASP can do the same, and then has the added ability to capture and process samples into powder with particle sizes less than 150 microns, making them easier to analyze by XRD/XRF (X-ray diffraction/X-ray fluorescence). The tool uses a rotating rasp bit (or two counter-rotating bits) that resides inside or above the catch container. The container has an open slot to allow the bit to extend outside the container and to allow cuttings to enter and be caught. When the slot and rasp bit are in contact with a substrate, the bit is plunged into it in a matter of seconds to reach pristine rock. A user in the field may sample a rock multiple times at multiple depths in minutes, instead of having to cut out huge, heavy rock samples for transport back to a lab for analysis. Because of the speed and accuracy of the RASP, hundreds of samples can be taken in one day. RASP-acquired samples are small and easily carried. A user can characterize more area in less time than by using conventional methods. The field-deployable RASP used a Ni...

  16. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  17. Sample quality criteria.

    Science.gov (United States)

    Ramsey, Charles A; Wagner, Claas

    2015-01-01

    The concept of Sample Quality Criteria (SQC) is the initial step in the scientific approach to representative sampling. It includes the establishment of sampling objectives, Decision Unit (DU), and confidence. Once fully defined, these criteria serve as input, in addition to material properties, to the Theory of Sampling for developing a representative sampling protocol. The first component of the SQC establishes these questions: What is the analyte(s) of concern? What is the concentration level of interest of the analyte(s)? How will inference(s) be made from the analytical data to the DU? The second component of the SQC establishes the DU, i.e., the scale at which decisions are to be made. On a large scale, a DU could be a ship or rail car; examples for small-scale DUs are individual beans, seeds, or kernels. A well-defined DU is critical because it defines the spatial and temporal boundaries of sample collection. SQC are not limited to a single DU; they can also include multiple DUs. The third SQC component, the confidence, establishes the desired probability that a correct inference (decision) can be made. The confidence level should typically correlate to the potential consequences of an incorrect decision (e.g., health or economic). The magnitude of combined errors in the sampling, sample processing and analytical protocols determines the likelihood of an incorrect decision. Thus, controlling error to a greater extent increases the probability of a correct decision. The required confidence level directly affects the sampling effort and QC measures.
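
    The trade-off the third SQC component describes, that a higher required confidence means more sampling effort, can be made concrete with a standard normal-approximation sample-size calculation (a generic sketch of mine, not part of the SQC paper; the function names and the independent-increments assumption are illustrative):

    ```python
    from math import ceil, sqrt, erf

    def z_for_confidence(conf):
        """Two-sided z-score for a given confidence level, found by bisection
        on the standard normal CDF (avoids external dependencies)."""
        target = 0.5 + conf / 2.0
        lo, hi = 0.0, 10.0
        for _ in range(100):
            mid = (lo + hi) / 2
            if 0.5 * (1 + erf(mid / sqrt(2))) < target:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    def increments_needed(sd, tolerance, confidence):
        """Increments needed so the mean of the Decision Unit is estimated
        within +/- tolerance at the stated confidence, assuming independent
        increments and a normal sampling distribution."""
        z = z_for_confidence(confidence)
        return ceil((z * sd / tolerance) ** 2)
    ```

    Raising the required confidence from 90% to 99% for the same tolerance roughly doubles the number of increments, which is the sense in which the confidence level "directly affects the sampling effort."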

  18. Inference for Noisy Samples

    Directory of Open Access Journals (Sweden)

    I. A. Ahmad

    2012-07-01

    In the current work, some well-known inference procedures, including testing and estimation, are adjusted to accommodate noisy data that lead to nonidentically distributed samples. The main two cases addressed are the Poisson and the normal distributions. Both one- and two-sample cases are addressed. Other cases, including the exponential and the Pareto distributions, are briefly mentioned. In the Poisson case, the situation when the sample size is random is mentioned.

  19. Lunar Sample Display Locations

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA provides a number of lunar samples for display at museums, planetariums, and scientific expositions around the world. Lunar displays are open to the public....

  20. Sample Return Robot Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This Challenge requires demonstration of an autonomous robotic system to locate and collect a set of specific sample types from a large planetary analog area and...

  1. Open port sampling interface

    Science.gov (United States)

    Van Berkel, Gary J

    2017-04-25

    A system for sampling a sample material includes a probe which can have an outer probe housing with an open end. A liquid supply conduit within the housing has an outlet positioned to deliver liquid to the open end of the housing. The liquid supply conduit can be connectable to a liquid supply for delivering liquid at a first volumetric flow rate to the open end of the housing. A liquid exhaust conduit within the housing is provided for removing liquid from the open end of the housing. A liquid exhaust system can be provided for removing liquid from the liquid exhaust conduit at a second volumetric flow rate, the first volumetric flow rate exceeding the second volumetric flow rate, wherein liquid at the open end will receive sample, liquid containing sample material will be drawn into and through the liquid exhaust conduit, and liquid will overflow from the open end.

  2. Open port sampling interface

    Energy Technology Data Exchange (ETDEWEB)

    Van Berkel, Gary J.

    2018-01-16

    A system for sampling a sample material includes a probe which can have an outer probe housing with an open end. A liquid supply conduit within the housing has an outlet positioned to deliver liquid to the open end of the housing. The liquid supply conduit can be connectable to a liquid supply for delivering liquid at a first volumetric flow rate to the open end of the housing. A liquid exhaust conduit within the housing is provided for removing liquid from the open end of the housing. A liquid exhaust system can be provided for removing liquid from the liquid exhaust conduit at a second volumetric flow rate, the first volumetric flow rate exceeding the second volumetric flow rate, wherein liquid at the open end will receive sample, liquid containing sample material will be drawn into and through the liquid exhaust conduit, and liquid will overflow from the open end.

  3. Stardust Sample Catalog

    Data.gov (United States)

    National Aeronautics and Space Administration — This Catalog summarizes the samples examined in the course of the Preliminary Examination (PE) Team (PET) of the Stardust Mission to comet Wild 2, and the results of...

  4. Roadway sampling evaluation.

    Science.gov (United States)

    2014-09-01

    The Florida Department of Transportation (FDOT) has traditionally required that all sampling and testing of asphalt mixtures be at the Contractor's production facility. With recent staffing cuts, as well as budget reductions, FDOT has been cons...

  5. UFA Auction Sampling Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Between 1984 January - 2002 June, personnel from NMFS/PIFSC/FRMD/FMB/FMAP and Hawaii Department of Aquatic Resources (DAR) conducted port sampling at the United...

  6. Sample Encapsulation Device Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's Science Mission Directorate is currently considering various sample cache and return missions to the Moon, Mars and asteroids. These missions involve the use...

  7. Mini MAX - Medicaid Sample

    Data.gov (United States)

    U.S. Department of Health & Human Services — To facilitate wider use of MAX, CMS contracted with Mathematica to convene a technical expert panel (TEP) and determine the feasibility of creating a sample file for...

  8. Dissolution actuated sample container

    Science.gov (United States)

    Nance, Thomas A.; McCoy, Frank T.

    2013-03-26

    A sample collection vial and process of using a vial is provided. The sample collection vial has an opening secured by a dissolvable plug. When dissolved, liquids may enter into the interior of the collection vial passing along one or more edges of a dissolvable blocking member. As the blocking member is dissolved, a spring actuated closure is directed towards the opening of the vial which, when engaged, secures the vial contents against loss or contamination.

  9. Two phase sampling

    CERN Document Server

    Ahmad, Zahoor; Hanif, Muhammad

    2013-01-01

    The development of estimators of population parameters based on two-phase sampling schemes has seen a dramatic increase in the past decade. Various authors have developed estimators of population parameters using either one or two auxiliary variables. The present volume is a comprehensive collection of estimators available in single and two phase sampling. The book covers estimators which utilize information on single, two and multiple auxiliary variables of both quantitative and qualitative nature. Th...

  10. Sample size for beginners.

    OpenAIRE

    Florey, C D

    1993-01-01

    The common failure to include an estimation of sample size in grant proposals imposes a major handicap on applicants, particularly for those proposing work in any aspect of research in the health services. Members of research committees need evidence that a study is of adequate size for there to be a reasonable chance of a clear answer at the end. A simple illustrated explanation of the concepts in determining sample size should encourage the faint hearted to pay more attention to this increasingly important aspect of grantsmanship.

  11. Wet gas sampling

    Energy Technology Data Exchange (ETDEWEB)

    Welker, T.F.

    1997-07-01

    The quality of gas has changed drastically in the past few years. Most gas is wet with hydrocarbons, water, and heavier contaminants that tend to condense if not handled properly. If a gas stream is contaminated with condensables, the sampling of that stream must be done in a manner that will ensure all of the components in the stream are introduced into the sample container as the composite. The sampling and handling of wet gas is extremely difficult under ideal conditions. There are no ideal conditions in the real world. The problems related to offshore operations and other wet gas systems, as well as the transportation of the sample, are additional problems that must be overcome if the analysis is to mean anything to the producer and gatherer. The sampling of wet gas systems is decidedly more difficult than sampling conventional dry gas systems. Wet gas systems were generally going to result in the measurement of one heating value at the inlet of the pipe and a drastic reduction in the heating value of the gas at the outlet end of the system. This is caused by the fallout or accumulation of the heavier products that, at the inlet, may be in the vapor state in the pipeline; hence, the high gravity and high BTU. But, in fact, because of pressure and temperature variances, these liquids condense and form a liquid that is actually running down the pipe as a stream or is accumulated in drips to be blown from the system. (author)

  12. Recommended protocols for sampling macrofungi

    Science.gov (United States)

    Gregory M. Mueller; John Paul Schmit; Sabine M. Hubndorf; Leif Ryvarden; Thomas E. O'Dell; D. Jean Lodge; Patrick R. Leacock; Milagro Mata; Loengrin Umania; Qiuxin (Florence) Wu; Daniel L. Czederpiltz

    2004-01-01

    This chapter discusses several issues regarding recommended protocols for sampling macrofungi: opportunistic sampling of macrofungi, sampling conspicuous macrofungi using fixed-size plots, sampling small Ascomycetes using microplots, and sampling a fixed number of downed logs.

  13. Nonuniform sampling by quantiles.

    Science.gov (United States)

    Craft, D Levi; Sonstrom, Reilly E; Rovnyak, Virginia G; Rovnyak, David

    2018-02-13

    A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic, however higher dimensional schedules are similar within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license. Copyright © 2018 Elsevier Inc. All rights reserved.
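
    The 1D quantile idea described above, dividing a weighting function into regions of equal probability and taking one grid point per region, can be sketched for an exponential decay weighting (the grid size, decay constant, and function names here are illustrative assumptions of mine, not taken from the paper's QSched program):

    ```python
    from math import exp

    def quantile_schedule(n_grid, n_samples, decay=0.05):
        """1D quantile-directed NUS schedule (sketch). Weight each Nyquist
        grid point by an exponential decay, split the cumulative distribution
        into n_samples regions of equal probability, and take the grid point
        where the CDF first reaches each region's midpoint. Deterministic for
        a given weighting, as described for 1D schedules."""
        w = [exp(-decay * t) for t in range(n_grid)]
        total = sum(w)
        cdf, acc = [], 0.0
        for x in w:
            acc += x
            cdf.append(acc / total)
        picks = []
        for k in range(n_samples):
            target = (k + 0.5) / n_samples  # midpoint of the k-th equal-probability region
            i = next(i for i, c in enumerate(cdf) if c >= target)
            if i not in picks:  # duplicates can occur under strong weighting
                picks.append(i)
        return picks
    ```

    The resulting schedule is dense where the weighting (and signal) is large and sparse toward the end of the evolution period, with gaps that grow smoothly rather than clustering, which is the stated advantage of quantile scheduling over random draws.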

  14. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

    The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling; thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...

  15. Career Change and Motivation: A Matter of Balance

    Science.gov (United States)

    Green, Liz; Hemmings, Brian; Green, Annette

    2007-01-01

    The study was designed to consider the motivations of career changers and the perceived outcomes of their career change. Data were collected from a sample of career changers (N = 81), approximately half of whom had used the services of a career coach. The analysis showed: firstly, that the reported outcomes associated with career change appeared…

  16. Ethics and sample size.

    Science.gov (United States)

    Bacchetti, Peter; Wolf, Leslie E; Segal, Mark R; McCulloch, Charles E

    2005-01-15

    The belief is widespread that studies are unethical if their sample size is not large enough to ensure adequate power. The authors examine how sample size influences the balance that determines the ethical acceptability of a study: the balance between the burdens that participants accept and the clinical or scientific value that a study can be expected to produce. The average projected burden per participant remains constant as the sample size increases, but the projected study value does not increase as rapidly as the sample size if it is assumed to be proportional to power or inversely proportional to confidence interval width. This implies that the value per participant declines as the sample size increases and that smaller studies therefore have more favorable ratios of projected value to participant burden. The ethical treatment of study participants therefore does not require consideration of whether study power is less than the conventional goal of 80% or 90%. Lower power does not make a study unethical. The analysis addresses only ethical acceptability, not optimality; large studies may be desirable for other than ethical reasons.
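
    The paper's central quantitative claim, that value per participant declines as the sample size grows when study value is taken proportional to power, can be checked with a small normal-approximation sketch (the effect size, alpha, and the proportionality assumption are illustrative choices of mine, not values from the paper):

    ```python
    from math import erf, sqrt

    def power(n, effect=0.5, z_alpha=1.96):
        """Approximate power of a one-sample z-test detecting a standardized
        effect `effect` with n participants at two-sided alpha = 0.05."""
        phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))  # standard normal CDF
        return phi(sqrt(n) * effect - z_alpha)

    # If projected study value is proportional to power, the value per
    # participant (power / n) falls monotonically as the study grows.
    value_per_participant = {n: power(n) / n for n in (20, 40, 80, 160)}
    ```

    Power rises with n but saturates near 1, so dividing by n gives a strictly decreasing value-per-participant curve, which is the basis of the authors' argument that lower power alone does not make a study unethical.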

  17. Interactive Sample Book (ISB)

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen; Lenau, Torben Anker; Guglielmi, Michel

    2009-01-01

    Supervisor: Torben A. Lenau. Interactive textiles are still quite an unknown phenomenon to many. It is thus often difficult to communicate what kind of potentials lie within these materials. This is why the ISB project was started, as a practice based research project... and senses in relation to integrated decoration and function, primarily for indoor applications. The result of the project will be a number of interactive textiles, to be gathered in an interactive sample book (ISB), in a similar way as the sample books of wallpapers one can take home from the shop and choose from. In other words, it is a kind of display material, which in a simple manner can illustrate how different techniques and smart materials work. The sample book should display a number of possibilities where sensor technology, smart materials and textiles are mixed to such an extent that the textile...

  18. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L

    2010-01-01

    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  19. Strategic Sample Selection

    DEFF Research Database (Denmark)

    Di Tillio, Alfredo; Ottaviani, Marco; Sørensen, Peter Norman

    2017-01-01

    What is the impact of sample selection on the inference payoff of an evaluator testing a simple hypothesis based on the outcome of a location experiment? We show that anticipated selection locally reduces noise dispersion and thus increases informativeness if and only if the noise distribution is double logconvex, as with normal noise. The results are applied to the analysis of strategic sample selection by a biased researcher and extended to the case of uncertain and unanticipated selection. Our theoretical analysis offers applied research a new angle on the problem of selection in empirical...

  20. Sample size for beginners.

    Science.gov (United States)

    Florey, C D

    1993-05-01

    The common failure to include an estimation of sample size in grant proposals imposes a major handicap on applicants, particularly for those proposing work in any aspect of research in the health services. Members of research committees need evidence that a study is of adequate size for there to be a reasonable chance of a clear answer at the end. A simple illustrated explanation of the concepts in determining sample size should encourage the faint hearted to pay more attention to this increasingly important aspect of grantsmanship.

  1. Quantum private data sampling

    Science.gov (United States)

    Fattal, David; Fiorentino, Marco; Beausoleil, Raymond G.

    2009-08-01

    We present a novel quantum communication protocol for "Private Data Sampling", where a player (Bob) obtains a random sample of limited size of a classical database, while the database owner (Alice) remains oblivious as to which bits were accessed. The protocol is efficient in the sense that the communication complexity per query scales at most linearly with the size of the database. It does not violate Lo's "no-go" theorem for one-sided two-party secure computation, since a given joint input by Alice and Bob can result in randomly different protocol outcomes. After outlining the main security features of the protocol, we present our first experimental results.

  2. Request for wood samples

    NARCIS (Netherlands)

    NN,

    1977-01-01

    In recent years the wood collection at the Rijksherbarium was greatly expanded following a renewed interest in wood anatomy as an aid for solving classification problems. Staff members of the Rijksherbarium added to the collection by taking interesting wood samples with them from their expeditions.

  3. Determination of Sample Size

    OpenAIRE

    Naing, Nyi Nyi

    2003-01-01

    Determining the basic minimum required sample size 'n' to recognize a particular measurement of a particular population is of particular importance. This article has highlighted the determination of an appropriate size to estimate population parameters.

  4. Drafting Work Sample.

    Science.gov (United States)

    Shawsheen Valley Regional Vocational-Technical High School, Billerica, MA.

    This manual contains a work sample intended to assess a handicapped student's interest in basic mechanical drawing and to screen interested students into a training program in that field. (The course is based on the entry level of an assistant drafter.) Section 1 describes the assessment, correlates the work performed and worker traits required for…

  5. Optimal Sampling and Interpolation

    NARCIS (Netherlands)

    Shekhawat, Hanumant

    2012-01-01

    The main objective in this thesis is to design optimal samplers, downsamplers and interpolators (holds) which are required in signal processing. The sampled-data system theory is used to fulfill this objective in a generic setup. Signal processing, which includes signal transmission, storage and

  6. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability in the sample collection and analysis process. This paper addresses many, but not all, of these sources. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks on 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Coupons designated to include interfering material were coated with grime (Section 3.3). Samples were prepared and analyzed at PNNL using the CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day: four coupons with 10 spores deposited and four with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering material) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was about the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  7. Sampling system and method

    Energy Technology Data Exchange (ETDEWEB)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2017-03-07

    In one embodiment, the present disclosure provides an apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. In various examples, the clamp is external to the tubing bundle or integral with the tubing bundle. According to one method, a tubing bundle and wireline are deployed together and the tubing bundle periodically secured to the wireline using a clamp. In another embodiment, the present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit. In a specific example, one or more clamps are used to connect the first and/or second conduits to an external wireline.

  8. Steered transition path sampling.

    Science.gov (United States)

    Guttenberg, Nicholas; Dinner, Aaron R; Weare, Jonathan

    2012-06-21

    We introduce a path sampling method for obtaining statistical properties of an arbitrary stochastic dynamics. The method works by decomposing a trajectory in time, estimating the probability of satisfying a progress constraint, modifying the dynamics based on that probability, and then reweighting to calculate averages. Because the progress constraint can be formulated in terms of occurrences of events within time intervals, the method is particularly well suited for controlling the sampling of currents of dynamic events. We demonstrate the method for calculating transition probabilities in barrier crossing problems and survival probabilities in strongly diffusive systems with absorbing states, which are difficult to treat by shooting. We discuss the relation of the algorithm to other methods.

  9. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing of blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  10. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  11. Repeated judgment sampling: Boundaries

    Directory of Open Access Journals (Sweden)

    Johannes Muller-Trede

    2011-06-01

    This paper investigates the boundaries of the recent result that eliciting more than one estimate from the same person and averaging these can lead to accuracy gains in judgment tasks. It first examines its generality, analysing whether the kind of question being asked has an effect on the size of potential gains. Experimental results show that the question type matters. Previous results reporting potential accuracy gains are reproduced for year-estimation questions, and extended to questions about percentage shares. On the other hand, no gains are found for general numerical questions. The second part of the paper tests repeated judgment sampling's practical applicability by asking judges to provide a third and final answer on the basis of their first two estimates. In an experiment, the majority of judges do not consistently average their first two answers. As a result, they do not realise the potential accuracy gains from averaging.
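    The averaging effect studied in this record can be sketched in a small simulation. This is an illustrative Python sketch only, assuming idealized independent estimate errors; real repeated judgments from one person are correlated, which is why the practical gains the paper measures are smaller:

```python
import random
import statistics

def simulate_averaging_gain(truth=100.0, noise_sd=10.0, n_judges=2000, seed=7):
    """Compare the mean squared error of a single estimate against the
    average of two estimates of the same quantity."""
    rng = random.Random(seed)
    first_sq_err, avg_sq_err = [], []
    for _ in range(n_judges):
        e1 = truth + rng.gauss(0, noise_sd)   # first estimate
        e2 = truth + rng.gauss(0, noise_sd)   # second, independent estimate
        first_sq_err.append((e1 - truth) ** 2)
        avg_sq_err.append(((e1 + e2) / 2 - truth) ** 2)
    return statistics.mean(first_sq_err), statistics.mean(avg_sq_err)

mse_single, mse_avg = simulate_averaging_gain()
# With independent errors, the averaged estimate has roughly half the MSE.
```

    The function name and parameters are ours, not taken from the paper.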

  12. Sampling the Hydrogen Atom

    Directory of Open Access Journals (Sweden)

    Graves N.

    2013-01-01

    A model is proposed for the hydrogen atom in which the electron is an objectively real particle orbiting at very near to light speed. The model is based on the postulate that certain velocity terms associated with orbiting bodies can be considered as being affected by relativity. This leads to a model for the atom in which the stable electron orbits are associated with orbital velocities where Gamma is n/α, leading to the idea that it is Gamma that is quantized and not angular momentum as in the Bohr and other models. The model provides a mechanism which leads to quantization of energy levels within the atom and also provides a simple mechanical explanation for the Fine Structure Constant. The mechanism is closely associated with the Sampling theorem and the related phenomenon of aliasing developed in the mid-20th century by engineers at Bell Labs.
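    As a reminder of the aliasing phenomenon this abstract invokes, here is a minimal Python sketch of the standard frequency-folding rule for uniform sampling (textbook signal processing, not specific to this atomic model):

```python
def alias_frequency(f_signal, f_sample):
    """Apparent (folded) frequency of a pure tone after uniform sampling:
    frequencies above the Nyquist rate f_sample/2 fold back into the
    band [0, f_sample/2]."""
    f = f_signal % f_sample       # sampling cannot distinguish f from f + k*f_sample
    return min(f, f_sample - f)   # ...nor f from f_sample - f

# A 9 Hz tone sampled at 10 Hz is indistinguishable from a 1 Hz tone:
assert alias_frequency(9.0, 10.0) == 1.0
assert alias_frequency(3.0, 10.0) == 3.0   # below Nyquist: unchanged
```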

  13. Sample collection, biobanking, and analysis

    NARCIS (Netherlands)

    Ahsman, Maurice J.; Tibboel, Dick; Mathot, Ron A. A.; de Wildt, Saskia N.

    2011-01-01

    Pediatric pharmacokinetic studies require sampling of biofluids from neonates and children. Limitations on sampling frequency and sample volume complicate the design of these studies. In addition, strict guidelines, designed to guarantee patient safety, are in place. This chapter describes the

  14. Variable Sampling Mapping

    Science.gov (United States)

    Smith, Jeffrey, S.; Aronstein, David L.; Dean, Bruce H.; Lyon, Richard G.

    2012-01-01

    The performance of an optical system (for example, a telescope) is limited by the misalignments and manufacturing imperfections of the optical elements in the system. The impact of these misalignments and imperfections can be quantified by the phase variations imparted on light traveling through the system. Phase retrieval is a methodology for determining these variations. Phase retrieval uses images taken with the optical system and using a light source of known shape and characteristics. Unlike interferometric methods, which require an optical reference for comparison, and unlike Shack-Hartmann wavefront sensors that require special optical hardware at the optical system's exit pupil, phase retrieval is an in situ, image-based method for determining the phase variations of light at the system's exit pupil. Phase retrieval can be used both as an optical metrology tool (during fabrication of optical surfaces and assembly of optical systems) and as a sensor used in active, closed-loop control of an optical system, to optimize performance. One class of phase-retrieval algorithms is the iterative transform algorithm (ITA). ITAs estimate the phase variations by iteratively enforcing known constraints in the exit pupil and at the detector, determined from modeled or measured data. The Variable Sampling Mapping (VSM) technique is a new method for enforcing these constraints in ITAs. VSM is an open framework for addressing a wide range of issues that have previously been considered detrimental to high-accuracy phase retrieval, including undersampled images, broadband illumination, images taken at or near best focus, chromatic aberrations, jitter or vibration of the optical system or detector, and dead or noisy detector pixels. The VSM is a model-to-data mapping procedure.
In VSM, fully sampled electric fields at multiple wavelengths are modeled inside the phase-retrieval algorithm, and then these fields are mapped to intensities on the light detector, using the properties

  15. Sample size estimation and sampling techniques for selecting a representative sample

    OpenAIRE

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect ...

  16. Two-Step Sequential Sampling

    NARCIS (Netherlands)

    Moors, J.J.A.; Strijbosch, L.W.G.

    2000-01-01

    Deciding upon the optimal sample size in advance is a difficult problem in general. Often, the investigator regrets not having drawn a larger sample; in many cases additional observations are done. This implies that the actual sample size is no longer deterministic; hence, even if all sample

  17. Jenis Sample: Keuntungan dan Kerugiannya

    OpenAIRE

    Suprapto, Agus

    1994-01-01

    A sample is a part of a population that is used in a study for the purpose of making estimations about the nature of the total population, obtained with a sampling technique. Sampling is more advantageous than a census because it can reduce cost and time, and it can gather deeper information and more accurate data. It is useful to distinguish two major types of sampling techniques. First, probability sampling, i.e. simple random sampling. Second, non-probability sampling, i.e. systematic sampling...

  18. Sample design for Understanding Society

    OpenAIRE

    Lynn, Peter

    2009-01-01

    This paper describes the design of the sample for "Understanding Society". The sample consists of five components. The largest component is a newly-selected general population sample. The other four components are an ethnic minority 'boost' sample, a general population comparison sample, the ex-BHPS (British Household Panel Survey) sample, and the innovation panel sample. For each component, the paper outlines the design and explains the rationale behind the main features of the desig...

  19. Basic design of sample container for transport of extraterrestrial samples

    Science.gov (United States)

    Dirri, F.; Longobardo, A.; Palomba, E.; Hutzler, A.; Ferrière, L.

    2017-09-01

    The aim of this work is to provide, in the framework of the EURO-CARES (European Curation of Astromaterials Returned from Exploration of Space) project, a technical overview based on the sample container used in previous sample return missions (e.g., Hayabusa1, Stardust, etc.) and to define a basic design of a sample container aimed at transporting the extraterrestrial returned samples within a Sample Curation Facility (SCF) or from a SCF to another laboratory (and vice versa). The sample container structure and the transportation criticalities (such as contamination and mechanical stress) are discussed in detail in each scenario.

  20. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. A Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
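    The classical Poisson sampling with permanent random numbers that this work builds on can be sketched as follows. This is an illustrative Python sketch of the baseline technique only; the article's fixed-size Conditional Poisson variants are more involved, and the function name is ours:

```python
import random

def poisson_sample_prn(prns, probs):
    """Classical Poisson sampling driven by permanent random numbers
    (PRNs): unit i is selected iff its PRN falls below its inclusion
    probability. The realized sample size is random."""
    return [i for i, (u, p) in enumerate(zip(prns, probs)) if u < p]

rng = random.Random(42)
N = 10
prns = [rng.random() for _ in range(N)]   # drawn once, stored, and reused
probs_t1 = [0.5] * N                      # design at occasion 1
probs_t2 = [0.6] * N                      # design at occasion 2

s1 = set(poisson_sample_prn(prns, probs_t1))
s2 = set(poisson_sample_prn(prns, probs_t2))
# Reusing the PRNs yields positive coordination; here, since every
# inclusion probability only increased, the first sample is nested
# in the second.
assert s1 <= s2
```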

  1. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The more precision required, the greater the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can be generalized to the target population.
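    The sample size calculation for a proportion described above can be sketched in Python using the standard normal-approximation formula n = z²·p(1−p)/d². The helper name is ours, not from the article, and it omits the finite-population correction:

```python
import math
from statistics import NormalDist

def sample_size_proportion(p, margin, confidence=0.95):
    """Required n to estimate a population proportion p to within
    +/- margin at the given confidence level (normal approximation,
    large population; no finite-population correction)."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided critical value
    return math.ceil(z * z * p * (1 - p) / (margin * margin))

# Worst-case proportion p = 0.5, 5% margin, 95% confidence:
n = sample_size_proportion(0.5, 0.05)   # the familiar n = 385
```

    As the abstract states, tightening the required precision raises n: halving the margin to 2.5% roughly quadruples the required sample size.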

  2. Sample Acquisition for Materials in Planetary Exploration (SAMPLE) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  3. Sampling and chemical analysis in environmental samples around Nuclear Power Plants and some environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yong Woo; Han, Man Jung; Cho, Seong Won; Cho, Hong Jun; Oh, Hyeon Kyun; Lee, Jeong Min; Chang, Jae Sook [KORTIC, Taejon (Korea, Republic of)

    2002-12-15

    Twelve kinds of environmental samples, such as soil, seawater, and underground water, were collected around Nuclear Power Plants (NPPs). Tritium analysis was carried out on samples of rain water, pine needles, air, seawater, underground water, chinese cabbage, grains of rice, and milk collected around the NPPs, and on surface seawater and rain water sampled over the country. Strontium was analyzed in soil sampled at 60 districts across Korea. Tritium was also analyzed in 21 samples of surface seawater around the Korean peninsula that were supplied by KFRDI (National Fisheries Research and Development Institute). Sampling and chemical analysis of environmental samples around the Kori, Woolsung, Youngkwang, and Wooljin NPPs and the Taeduk science town for tritium and strontium analysis proceeded according to plan. All samples were handed over to KINS after analysis.

  4. A Note on Information-Directed Sampling and Thompson Sampling

    OpenAIRE

    Zhou, Li

    2015-01-01

    This note introduces three Bayesian-style multi-armed bandit algorithms: Information-Directed Sampling, Thompson Sampling, and Generalized Thompson Sampling. The goal is to give an intuitive explanation of these three algorithms and their regret bounds, and to provide some derivations that are omitted in the original papers.
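    The textbook Thompson Sampling loop for a Bernoulli bandit can be sketched as follows. This is an illustrative Python sketch of the standard algorithm only; the note's Information-Directed and Generalized variants are not shown, and the function name is ours:

```python
import random

def thompson_bernoulli(true_means, horizon, seed=0):
    """Thompson Sampling for a Bernoulli bandit: maintain a Beta(a, b)
    posterior per arm, sample a mean from each posterior, and pull the
    arm whose sampled mean is largest."""
    rng = random.Random(seed)
    k = len(true_means)
    a = [1.0] * k                 # Beta(1, 1) uniform priors
    b = [1.0] * k
    pulls = [0] * k
    for _ in range(horizon):
        sampled = [rng.betavariate(a[i], b[i]) for i in range(k)]
        arm = max(range(k), key=sampled.__getitem__)
        reward = 1 if rng.random() < true_means[arm] else 0
        a[arm] += reward          # posterior update: successes
        b[arm] += 1 - reward      # posterior update: failures
        pulls[arm] += 1
    return pulls

pulls = thompson_bernoulli([0.2, 0.8], horizon=2000)
# The better arm should receive the large majority of pulls.
```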

  5. Metadata, Identifiers, and Physical Samples

    Science.gov (United States)

    Arctur, D. K.; Lenhardt, W. C.; Hills, D. J.; Jenkyns, R.; Stroker, K. J.; Todd, N. S.; Dassie, E. P.; Bowring, J. F.

    2016-12-01

    Physical samples are integral to much of the research conducted by geoscientists. The samples used in this research are often obtained at significant cost and represent an important investment for future research. However, making information about samples - whether considered data or metadata - available for researchers to enable discovery is difficult: a number of key elements related to samples are difficult to characterize in common ways, such as classification, location, sample type, sampling method, repository information, subsample distribution, and instrumentation, because these differ from one domain to the next. Unifying these elements or developing metadata crosswalks is needed. The iSamples (Internet of Samples) NSF-funded Research Coordination Network (RCN) is investigating ways to develop these types of interoperability and crosswalks. Within the iSamples RCN, one of its working groups, WG1, has focused on the metadata related to physical samples. This includes identifying existing metadata standards and systems, and how they might interoperate with the International Geo Sample Number (IGSN) schema (schema.igsn.org) in order to help inform leading practices for metadata. For example, we are examining lifecycle metadata beyond the IGSN `birth certificate.' As a first step, this working group is developing a list of relevant standards and comparing their various attributes. In addition, the working group is looking toward technical solutions to facilitate developing a linked set of registries to build the web of samples. Finally, the group is also developing a comparison of sample identifiers and locators. This paper will provide an overview and comparison of the standards identified thus far, as well as an update on the technical solutions examined for integration. We will discuss how various sample identifiers might work in complementary fashion with the IGSN to more completely describe samples, facilitate retrieval of contextual information, and

  6. New prior sampling methods for nested sampling - Development and testing

    Science.gov (United States)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
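    The "central problem" described above, drawing from the prior restricted to the likelihood-constrained region, can be illustrated with plain rejection sampling. This Python sketch is the naive baseline the paper's new methods improve upon, and all names in it are ours:

```python
import random

def sample_above_threshold(log_likelihood, draw_from_prior, threshold, rng,
                           max_tries=100_000):
    """Draw from the prior restricted to the region where the likelihood
    exceeds the current threshold, by plain rejection: propose from the
    full prior and keep the first accepted point. This grows inefficient
    as nested sampling's likelihood-restricted region shrinks."""
    for _ in range(max_tries):
        x = draw_from_prior(rng)
        if log_likelihood(x) > threshold:
            return x
    raise RuntimeError("restricted region too small for naive rejection")

# Toy 1-D setup: uniform prior on [0, 1], log-likelihood peaked at 0.5.
rng = random.Random(0)
loglike = lambda x: -((x - 0.5) ** 2)
x = sample_above_threshold(loglike, lambda r: r.random(), threshold=-0.01, rng=rng)
assert abs(x - 0.5) < 0.1   # the constraint holds by construction
```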

  7. Public Use Microdata Samples (PUMS)

    Data.gov (United States)

    National Aeronautics and Space Administration — Public Use Microdata Samples (PUMS) are computer-accessible files containing records for a sample of housing units, with information on the characteristics of each...

  8. Graph Sampling for Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Fangyan; Zhang, Song; Chung Wong, Pak

    2017-07-01

    Effectively visualizing large graphs and capturing the statistical properties are two challenging tasks. To aid in these two tasks, many sampling approaches for graph simplification have been proposed, falling into three categories: node sampling, edge sampling, and traversal-based sampling. It is still unknown which approach is the best. We evaluate commonly used graph sampling methods through a combined visual and statistical comparison of graphs sampled at various rates. We conduct our evaluation on three graph models: random graphs, small-world graphs, and scale-free graphs. Initial results indicate that the effectiveness of a sampling method is dependent on the graph model, the size of the graph, and the desired statistical property. This benchmark study can be used as a guideline in choosing the appropriate method for a particular graph sampling task, and the results presented can be incorporated into graph visualization and analysis tools.
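    Node and edge sampling, two of the three categories named above, can be sketched as follows. This is an illustrative Python sketch; the function names are ours, not from the paper:

```python
import random

def node_sample(nodes, edges, rate, rng):
    """Node sampling: keep each node independently, then keep the edges
    induced between surviving nodes."""
    kept = {n for n in nodes if rng.random() < rate}
    return kept, [(u, v) for (u, v) in edges if u in kept and v in kept]

def edge_sample(edges, rate, rng):
    """Edge sampling: keep each edge independently; the sampled node set
    is whatever the surviving edges touch."""
    kept_edges = [e for e in edges if rng.random() < rate]
    kept_nodes = {n for e in kept_edges for n in e}
    return kept_nodes, kept_edges

rng = random.Random(1)
nodes = list(range(100))
edges = [(i, (i + 1) % 100) for i in range(100)]   # a 100-node cycle

ns_nodes, ns_edges = node_sample(nodes, edges, 0.5, rng)
es_nodes, es_edges = edge_sample(edges, 0.5, rng)
# Node sampling tends to lose edges faster than nodes (an edge survives
# only if both endpoints do); edge sampling keeps edges at the nominal
# rate - one way the sampled statistics diverge between methods.
```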

  9. Treat Medication Samples with Respect

    Science.gov (United States)

    A physician may give you samples of a particular medication at the time of your office or clinic visit. ...

  10. Representative mass reduction in sampling

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Harry Kim; Dahl, Casper Kierulf

    2004-01-01

    dividers, the Boerner Divider, the "spoon method", alternate/fractional shoveling and grab sampling. Only devices based on riffle splitting principles (static or rotational) pass the ultimate representativity test (with minor, but significant relative differences). Grab sampling, the overwhelmingly...... always be representative in the full Theory of Sampling (TOS) sense. This survey also allows empirical verification of the merits of the famous "Gy's formula" for order-of-magnitude estimation of the Fundamental Sampling Error (FSE)....
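    For reference, "Gy's formula" mentioned in this record is usually quoted in the following order-of-magnitude form (standard notation supplied here, not taken from this abstract):

```latex
% Relative variance of the Fundamental Sampling Error (FSE),
% for a sample of mass M_S taken from a lot of mass M_L:
%   c = mineralogical composition factor,  f = particle shape factor,
%   g = granulometric (size-range) factor, \ell = liberation factor,
%   d = nominal top particle size.
\sigma^{2}_{\mathrm{FSE}} \;=\; c\, f\, g\, \ell\, d^{3}
    \left( \frac{1}{M_S} - \frac{1}{M_L} \right)
```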

  11. The rise of survey sampling

    NARCIS (Netherlands)

    Bethlehem, J.

    2009-01-01

    This paper is about the history of survey sampling. It describes how sampling became an accepted scientific method. From the first ideas in 1895, it took some 50 years before the principles of probability sampling were widely accepted. This paper focuses on developments in official statistics in

  12. Sample inhomogeneity in PIXE analysis

    Science.gov (United States)

    Kajfosz, J.; Szymczyk, S.; Kornaś, G.

    1984-04-01

    The influence of sample inhomogeneity on the precision of analytical results obtained by PIXE was investigated. A simple method for the determination of sample inhomogeneity is proposed and its applicability is shown on a series of examples. Differences in the distribution of individual elements in samples were observed.

  13. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  14. Mars Sample Quarantine Protocol Workshop

    Science.gov (United States)

    DeVincenzi, Donald L. (Editor); Bagby, John (Editor); Race, Margaret (Editor); Rummel, John (Editor)

    1999-01-01

    The Mars Sample Quarantine Protocol (QP) Workshop was convened to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent uncontrolled release of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of live organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. During the first part of the Workshop, several tutorials were presented on topics related to the workshop in order to give all participants a common basis in the technical areas necessary to achieve the objectives of the Workshop.

  15. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact...... a sample liquid comprising the sample and the first preparation system is adapted to receive a receiving liquid. In a particular embodiment, a magnetic sample transport component, such as a permanent magnet or an electromagnet, is arranged to move magnetic beads in between the first and second substrates....

  16. Improve natural gas sampling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jiskoot, R.J.J. (Jiskoot Autocontrol, Kent (United Kingdom))

    1994-02-01

    Accurate and reliable sampling systems are imperative when confirming natural gas' commercial value. Buyers and sellers need accurate hydrocarbon-composition information to conduct fair sale transactions. Because poor sample extraction, preparation or analysis can invalidate a sale, more attention should be directed toward improving representative sampling. Consider all sampling components, i.e., gas types, line pressure and temperature, equipment maintenance and service needs, etc. The paper discusses gas sampling, design considerations (location, probe type, extraction devices, controller, and receivers), operating requirements, and system integration.

  17. Rotary Percussive Sample Acquisition Tool

    Science.gov (United States)

    Klein, K.; Badescu, M.; Haddad, N.; Shiraishi, L.; Walkemeyer, P.

    2012-01-01

    As part of a potential Mars Sample Return campaign, NASA is studying a sample caching mission to Mars, with a possible 2018 launch opportunity. As such, a Sample Acquisition Tool (SAT) has been developed in support of the Integrated Mars Sample Acquisition and Handling (IMSAH) architecture as it relates to the proposed Mars Sample Return (MSR) campaign. The tool allows for core generation and capture directly into a sample tube. In doing so, the sample tube becomes the fundamental handling element within the IMSAH sample chain, reducing the risk associated with sample contamination as well as the need to handle a sample of unknown geometry. The tool's functionality was verified utilizing a proposed rock test suite that encompasses a series of rock types that have been utilized in the past to qualify Martian surface sampling hardware. The corresponding results have shown the tool can effectively generate, fracture, and capture rock cores while maintaining torque margins of no less than 50%, with an average power consumption of no greater than 90 W and a tool mass of less than 6 kg.

  18. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters......, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal...... corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  19. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  20. Sampling designs dependent on sample parameters of auxiliary variables

    CERN Document Server

    Wywiał, Janusz L

    2015-01-01

    The book offers a valuable resource for students and statisticians whose work involves survey sampling. An estimation of the population parameters in finite and fixed populations assisted by auxiliary variables is considered. New sampling designs dependent on moments or quantiles of auxiliary variables are presented on the background of the classical methods. Accuracies of the estimators based on original sampling design are compared with classical estimation procedures. Specific conditional sampling designs are applied to problems of small area estimation as well as to estimation of quantiles of variables under study.

  1. Comet coma sample return instrument

    Science.gov (United States)

    Albee, A. L.; Brownlee, Don E.; Burnett, Donald S.; Tsou, Peter; Uesugi, K. T.

    1994-01-01

    The sample collection technology and instrument concept for the Sample of Comet Coma Earth Return Mission (SOCCER) are described. The scientific goals of this Flyby Sample Return are to return coma dust and volatile samples from a known comet source, which will permit accurate elemental and isotopic measurements for thousands of individual solid particles and volatiles, detailed analysis of the dust structure, morphology, and mineralogy of the intact samples, and identification of the biogenic elements or compounds in the solid and volatile samples. Having these intact samples, morphologic, petrographic, and phase structural features can be determined. Information on dust particle size, shape, and density can be ascertained by analyzing penetration holes and tracks in the capture medium. Time and spatial data of dust capture will provide understanding of the flux dynamics of the coma and the jets. Additional information will include the identification of cosmic ray tracks in the cometary grains, which can provide a particle's process history and perhaps even the age of the comet. The measurements will be made with the same equipment used for studying micrometeorites for decades past; hence, the results can be directly compared without extrapolation or modification. The data will provide a powerful and direct technique for comparing the cometary samples with all known types of meteorites and interplanetary dust. This sample collection system will provide the first sample return from a specifically identified primitive body and will allow, for the first time, a direct method of matching meteoritic materials captured on Earth with known parent bodies.

  2. Optimization of sampling parameters for standardized exhaled breath sampling.

    Science.gov (United States)

    Doran, Sophie; Romano, Andrea; Hanna, George B

    2017-09-05

    The lack of standardization of breath sampling is a major contributing factor to the poor repeatability of results and hence represents a barrier to the adoption of breath tests in clinical practice. On-line and bag breath sampling have advantages but do not suit multicentre clinical studies, whereas storage and robust transport are essential for the conduct of wide-scale studies. Several devices have been developed to control sampling parameters, to concentrate volatile organic compounds (VOCs) onto thermal desorption (TD) tubes, and subsequently to transport those tubes for laboratory analysis. We conducted three experiments to investigate (i) the fraction of breath sampled (whole vs. lower expiratory exhaled breath); (ii) breath sample volume (125, 250, 500 and 1000 ml) and (iii) breath sample flow rate (400, 200, 100 and 50 ml/min). The target VOCs were acetone and potential volatile biomarkers for oesophago-gastric cancer belonging to the aldehyde, fatty acid and phenol chemical classes. We also examined the collection execution time and the impact of environmental contamination. The experiments showed that the use of exhaled breath-sampling devices requires the selection of optimum sampling parameters. Increasing the sample volume improved the levels of VOCs detected. However, the influence of the fraction of exhaled breath and the flow rate depends on the target VOCs measured. The concentration of potential volatile biomarkers for oesophago-gastric cancer was not significantly different between the whole and lower airway exhaled breath. While the recovery of phenols and acetone from TD tubes was lower when breath sampling was performed at a higher flow rate, other VOCs were not affected. A dedicated 'clean air supply' overcomes the contamination from ambient air, but the breath collection device itself can be a source of contaminants. In clinical studies using VOCs to diagnose gastro-oesophageal cancer, the optimum parameters are 500 ml sample volume

  3. Influence of sampling depth and post-sampling analysis time

    African Journals Online (AJOL)

    Paradise is released directly to the ocean at this point. ... cooled (0 ºC - 0.2 ºC) plastic bottle, sealed and labelled in the field. During mussel sampling, 10-12 animals (sufficient to yield 150-250 g of soft tissue) were taken from each shellfish stock sample, sealed in cool sterile plastic bags and kept in plastic containers.

  4. Quantum sampling problems, BosonSampling and quantum supremacy

    Science.gov (United States)

    Lund, A. P.; Bremner, Michael J.; Ralph, T. C.

    2017-04-01

    There is a large body of evidence for the potential of greater computational power using information carriers that are quantum mechanical over those governed by the laws of classical mechanics. But the question of the exact nature of the power contributed by quantum mechanics remains only partially answered. Furthermore, there exists doubt over the practicality of achieving a large enough quantum computation that definitively demonstrates quantum supremacy. Recently, the study of computational problems that produce samples from probability distributions has both added to our understanding of the power of quantum algorithms and lowered the requirements for demonstration of fast quantum algorithms. The proposed quantum sampling problems do not require a quantum computer capable of universal operations and also permit physically realistic errors in their operation. This is an encouraging step towards an experimental demonstration of quantum algorithmic supremacy. In this paper, we will review sampling problems and the arguments that have been used to deduce when sampling problems are hard for classical computers to simulate. Two classes of quantum sampling problems that demonstrate the supremacy of quantum algorithms are BosonSampling and Instantaneous Quantum Polynomial-time Sampling. We will present the details of these classes and recent experimental progress towards demonstrating quantum supremacy in BosonSampling.

  5. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
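The simpler Bayesian formulation that the abstract treats as a special case can be sketched as follows: with a Beta prior on per-item acceptability and all n sampled items found acceptable, the number of acceptable items among the unsampled remainder follows a beta-binomial distribution. The function names and the uniform Beta(1, 1) default are illustrative, not from the paper; the two-group judgmental model is more involved than this.

```python
from math import ceil, comb, exp, lgamma

def _lbeta(x, y):
    """Log of the Beta function, via lgamma for numerical stability."""
    return lgamma(x) + lgamma(y) - lgamma(x + y)

def betabinom_pmf(k, m, alpha, beta):
    """Beta-binomial pmf: P(k acceptable out of m) with Beta(alpha, beta)
    mixing distribution on the per-item acceptability probability."""
    return comb(m, k) * exp(_lbeta(k + alpha, m - k + beta) - _lbeta(alpha, beta))

def prob_high_compliance(N, n, frac=0.95, a=1.0, b=1.0):
    """Posterior probability that at least `frac` of the N - n unsampled
    items are acceptable, given that all n sampled items were acceptable.
    Prior on per-item acceptability is Beta(a, b); observing n acceptable
    items updates it to Beta(a + n, b)."""
    m = N - n
    threshold = ceil(frac * m)
    return sum(betabinom_pmf(k, m, a + n, b) for k in range(threshold, m + 1))
```

Sampling more of the population raises the assurance: for a population of 100 with a uniform prior, clearing 90 items gives a markedly higher posterior probability of 95% compliance than clearing only 10.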

  6. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Full Text Available Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  7. Sampling problems in twin research.

    Science.gov (United States)

    Torgersen, S

    1987-01-01

    Sampling problems in twin research may be of at least three kinds: small samples, self-selection, and unrepresentative ascertainment. The article mostly discusses the third type of sampling problem. Data from a nationwide Norwegian twin study show that the results of twin studies will be quite different depending upon the ascertainment procedure. According to both the ICD-9 and the DSM-III classification systems, only samples ascertained from mental hospitals treating severe cases are able to demonstrate hereditary factors of any strength.

  8. Neonatal blood gas sampling methods

    African Journals Online (AJOL)

    Blood gas sampling is part of everyday practice in the care of babies admitted to the neonatal intensive care unit, particularly for those receiving respiratory support. There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual ...

  9. Learning to Reason from Samples

    Science.gov (United States)

    Ben-Zvi, Dani; Bakker, Arthur; Makar, Katie

    2015-01-01

    The goal of this article is to introduce the topic of "learning to reason from samples," which is the focus of this special issue of "Educational Studies in Mathematics" on "statistical reasoning." Samples are data sets, taken from some wider universe (e.g., a population or a process) using a particular procedure…

  10. Learning to reason from samples

    NARCIS (Netherlands)

    Ben-Zvi, Dani; Bakker, Arthur; Makar, Katie

    2015-01-01

    The goal of this article is to introduce the topic of learning to reason from samples, which is the focus of this special issue of Educational Studies in Mathematics on statistical reasoning. Samples are data sets, taken from some wider universe (e.g., a population or a process) using a particular

  11. Simulated Sampling of Estuary Plankton

    Science.gov (United States)

    Fortner, Rosanne W.; Jenkins, Deborah Bainer

    2009-01-01

    To find out about the microscopic life in the valuable estuary environment, it is usually necessary to be near the water. This dry lab offers an alternative, using authentic data and a simulation of plankton sampling. From the types of organisms found in the sample, middle school students can infer relationships in the biological and physical…

  12. Sample-whitened matched filters

    DEFF Research Database (Denmark)

    Andersen, Ib

    1973-01-01

    A sample-whitened matched filter (SWMF) for a channel with intersymbol interference and additive white Gaussian noise is defined as a linear filter with the properties that its output samples are a sufficient statistic for the MAP estimation of the transmitted sequence and have uncorrelated noise...

  13. Sampling by Fluidics and Microfluidics

    Directory of Open Access Journals (Sweden)

    V. Tesař

    2002-01-01

    Full Text Available Selecting one from several available fluid samples is a procedure often performed, especially in chemical engineering. It is usually done by an array of valves sequentially opened and closed. Not generally known is an advantageous alternative: fluidic sampling units without moving parts. In the absence of complete pipe closure, cross-contamination between samples cannot be ruled out. This is eliminated by arranging for small protective flows that clear the cavities and remove any contaminated fluid. Although this complicates the overall circuit layout, fluidic sampling units with these "guard" flows were successfully built and tested. Recent interest in microchemistry leads to additional problems due to very low operating Reynolds numbers. This necessitated the design of microfluidic sampling units based on new operating principles.

  14. Hermetic Seal Designs for Sample Return Sample Tubes

    Science.gov (United States)

    Younse, Paulo J.

    2013-01-01

    Prototypes have been developed of potential hermetic sample sealing techniques for encapsulating samples in a ~1-cm-diameter thin-walled sample tube that are compatible with the IMSAH (Integrated Mars Sample Acquisition and Handling) architecture. Techniques include a heat-activated, finned, shape memory alloy plug; a contracting shape memory alloy activated cap; an expanding shape memory alloy plug; and an expanding torque plug. Initial helium leak testing of the shape memory alloy cap and finned shape memory alloy plug seals showed hermetic-seal capability compared against an industry standard of seal integrity after Martian diurnal cycles. Developmental testing is currently being done on the expanding torque plug and expanding shape memory alloy plug seal designs. The finned shape memory alloy (SMA) plug currently shows hermetic sealing capability based on preliminary tests.

  15. Sampling Theorem in Terms of the Bandwidth and Sampling Interval

    Science.gov (United States)

    Dean, Bruce H.

    2011-01-01

    An approach has been developed for interpolating non-uniformly sampled data, with applications in signal and image reconstruction. This innovation generalizes the Whittaker-Shannon sampling theorem by emphasizing two assumptions explicitly (definition of a band-limited function and construction by periodic extension). The Whittaker-Shannon sampling theorem is thus expressed in terms of two fundamental length scales that are derived from these assumptions. The result is more general than what is usually reported, and contains the Whittaker-Shannon form as a special case corresponding to Nyquist-sampled data. The approach also shows that the preferred basis set for interpolation is found by varying the frequency component of the basis functions in an optimal way.
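The Nyquist-sampled special case mentioned above can be illustrated with a direct Whittaker-Shannon reconstruction, sketched below for uniformly sampled data. This is a minimal sketch of the classical theorem, not the generalized non-uniform method the record describes, and the function name is illustrative.

```python
import numpy as np

def sinc_interp(samples, t_s, t_query):
    """Whittaker-Shannon reconstruction of a band-limited signal from
    uniform samples taken at interval T:

        x(t) = sum_n x[n] * sinc((t - n*T) / T)

    np.sinc is the normalized sinc, sin(pi x) / (pi x), as required."""
    T = t_s[1] - t_s[0]  # uniform sampling interval
    return np.array([np.sum(samples * np.sinc((t - t_s) / T)) for t in t_query])
```

For example, a 1 Hz sinusoid sampled at 10 Hz (well above the 2 Hz Nyquist rate) is recovered between sample points to within the truncation error of the finite sum, which shrinks as the sample window widens around the query point.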

  16. Defining sample size and sampling strategy for dendrogeomorphic rockfall reconstructions

    Science.gov (United States)

    Morel, Pauline; Trappmann, Daniel; Corona, Christophe; Stoffel, Markus

    2015-05-01

    Optimized sampling strategies have been recently proposed for dendrogeomorphic reconstructions of mass movements with a large spatial footprint, such as landslides, snow avalanches, and debris flows. Such guidelines have, by contrast, been largely missing for rockfalls and cannot be transposed owing to the sporadic nature of this process and the occurrence of individual rocks and boulders. Based on a data set of 314 European larch (Larix decidua Mill.) trees (i.e., 64 trees/ha), growing on an active rockfall slope, this study bridges this gap and proposes an optimized sampling strategy for the spatial and temporal reconstruction of rockfall activity. Using random extractions of trees, iterative mapping, and a stratified sampling strategy based on an arbitrary selection of trees, we investigate subsets of the full tree-ring data set to define optimal sample size and sampling design for the development of frequency maps of rockfall activity. Spatially, our results demonstrate that the sampling of only 6 representative trees per ha can be sufficient to yield a reasonable mapping of the spatial distribution of rockfall frequencies on a slope, especially if the oldest and most heavily affected individuals are included in the analysis. At the same time, however, sampling such a low number of trees risks causing significant errors especially if nonrepresentative trees are chosen for analysis. An increased number of samples therefore improves the quality of the frequency maps in this case. Temporally, we demonstrate that at least 40 trees/ha are needed to obtain reliable rockfall chronologies. These results will facilitate the design of future studies, decrease the cost-benefit ratio of dendrogeomorphic studies and thus will permit production of reliable reconstructions with reasonable temporal efforts.

  17. Curation of Samples from Mars

    Science.gov (United States)

    Lindstrom, D.; Allen, C.

    One of the strong scientific reasons for returning samples from Mars is to search for evidence of current or past life in the samples. Because of the remote possibility that the samples may contain life forms that are hazardous to the terrestrial biosphere, the National Research Council has recommended that all samples returned from Mars be kept under strict biological containment until tests show that they can safely be released to other laboratories. It is possible that Mars samples may contain only scarce or subtle traces of life or prebiotic chemistry that could readily be overwhelmed by terrestrial contamination. Thus, the facilities used to contain, process, and analyze samples from Mars must have a combination of high-level biocontainment and organic/inorganic chemical cleanliness that is unprecedented. We have been conducting feasibility studies and developing designs for a facility that would be at least as capable as current maximum containment BSL-4 (BioSafety Level 4) laboratories, while simultaneously maintaining cleanliness levels exceeding those of the cleanest electronics manufacturing labs. Unique requirements for the processing of Mars samples have inspired a program to develop handling techniques that are much more precise and reliable than the approach (currently used for lunar samples) of employing gloved human hands in nitrogen-filled gloveboxes. Individual samples from Mars are expected to be much smaller than lunar samples, the total mass of samples returned by each mission being 0.5-1 kg, compared with many tens of kg of lunar samples returned by each of the six Apollo missions. Smaller samples require much more of the processing to be done under microscopic observation. In addition, the requirements for cleanliness and high-level containment would be difficult to satisfy while using traditional gloveboxes. JSC has constructed a laboratory to test concepts and technologies important to future sample curation. The Advanced Curation

  18. Sampling of illicit drugs for quantitative analysis--part III: sampling plans and sample preparations.

    Science.gov (United States)

    Csesztregi, T; Bovens, M; Dujourdy, L; Franc, A; Nagy, J

    2014-08-01

    The findings in this paper are based on the results of our drug homogeneity studies and particle size investigations. Using that information, a general sampling plan (depicted in the form of a flow-chart) was devised that could be applied to the quantitative instrumental analysis of the most common illicit drugs: namely heroin, cocaine, amphetamine, cannabis resin, MDMA tablets and herbal cannabis in 'bud' form (type I). Other more heterogeneous forms of cannabis (type II) were found to require alternative, more traditional sampling methods. A table was constructed which shows the sampling uncertainty expected when a particular number of random increments are taken and combined to form a single primary sample. It also includes a recommended increment size; which is 1 g for powdered drugs and cannabis resin, 1 tablet for MDMA and 1 bud for herbal cannabis in bud form (type I). By referring to that table, individual laboratories can ensure that the sampling uncertainty for a particular drug seizure can be minimised, such that it lies in the same region as their analytical uncertainty for that drug. The table shows that assuming a laboratory wishes to quantitatively analyse a seizure of powdered drug or cannabis resin with a 'typical' heterogeneity, a primary sample of 15×1 g increments is generally appropriate. The appropriate primary sample for MDMA tablets is 20 tablets, while for herbal cannabis (in bud form) 50 buds were found to be appropriate. Our study also showed that, for a suitably homogenised primary sample of the most common powdered drugs, an analytical sample size of between 20 and 35 mg was appropriate and for herbal cannabis the appropriate amount was 200 mg. 
The need to ensure that the results from duplicate or multiple incremental sampling were compared, to demonstrate whether or not a particular seized material has a 'typical' heterogeneity and that the sampling procedure applied has resulted in a 'correct sample', was highlighted and the setting
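The 1/sqrt(n) relationship behind such a table (sampling uncertainty shrinking as increments are combined into a primary sample) can be sketched as follows, assuming independent increments of equal variance. The numbers below are illustrative, not values from the paper's table.

```python
from math import sqrt

def composite_rsd(increment_rsd, n_increments):
    """Relative standard deviation (RSD) of a primary sample formed by
    combining n independent increments of equal variance: the classical
    1/sqrt(n) reduction in sampling uncertainty."""
    return increment_rsd / sqrt(n_increments)

# e.g. a hypothetical 20% per-increment RSD drops to about 5.2%
# when 15 x 1 g increments are combined
print(composite_rsd(0.20, 15))
```

This is why a laboratory can tune the number of increments until the sampling uncertainty lies in the same region as its analytical uncertainty for the drug in question.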

  19. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  20. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality

    Science.gov (United States)

    Jandura, L.; Burke, K.; Kennedy, B.; Melko, J.; Okon, A.; Sunshine, D.

    2009-12-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system scheduled to launch in 2011. The SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. Also on the turret is a dust removal tool for clearing the surface of scientific targets, and two science instruments mounted on vibration isolators. The SA/SPaH can acquire powder from rocks at depths of 20 to 50 mm and can also pick up loose regolith with its scoop. The acquired sample is sieved and portioned and delivered to one of two instruments inside the rover for analysis. The functionality of the system will be described along with the targets the system can acquire and the sample that can be delivered. (Figure: top view of the SA/SPaH on the rover.)

  1. Continuous sampling from distributed streams

    DEFF Research Database (Denmark)

    Graham, Cormode; Muthukrishnan, S.; Yi, Ke

    2012-01-01

    A fundamental problem in data management is to draw and maintain a sample of a large data set, for approximate query answering, selectivity estimation, and query planning. With large, streaming data sets, this problem becomes particularly difficult when the data is shared across multiple...... distributed sites. The main challenge is to ensure that a sample is drawn uniformly across the union of the data while minimizing the communication needed to run the protocol on the evolving data. At the same time, it is also necessary to make the protocol lightweight, by keeping the space and time costs low...... for each participant. In this article, we present communication-efficient protocols for continuously maintaining a sample (both with and without replacement) from k distributed streams. These apply to the case when we want a sample from the full streams, and to the sliding window cases of only the W most...
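The single-site building block for such protocols is classical reservoir sampling, which maintains a uniform without-replacement sample of size k over a stream in one pass. The sketch below shows only that building block; the communication-efficient coordination across distributed streams is the paper's contribution and is not shown here.

```python
import random

def reservoir_sample(stream, k, rng=random):
    """One-pass uniform sample of k items from a stream of unknown length.
    Each item seen so far ends up in the reservoir with probability
    k / (items seen), which yields a uniform without-replacement sample."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)          # fill the reservoir first
        else:
            j = rng.randint(0, i)            # inclusive bounds
            if j < k:
                reservoir[j] = item          # replace a random slot
    return reservoir
```

A usage example: `reservoir_sample(range(1000), 10)` returns 10 distinct stream elements, and if the stream is shorter than k the whole stream is returned.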

  2. Sample size determination and power

    CERN Document Server

    Ryan, Thomas P, Jr

    2013-01-01

    THOMAS P. RYAN, PhD, teaches online advanced statistics courses for Northwestern University and The Institute for Statistics Education in sample size determination, design of experiments, engineering statistics, and regression analysis.

  3. Microfluidic Sample Preparation for Immunoassays

    Energy Technology Data Exchange (ETDEWEB)

    Visuri, S; Benett, W; Bettencourt, K; Chang, J; Fisher, K; Hamilton, J; Krulevitch, P; Park, C; Stockton, C; Tarte, L; Wang, A; Wilson, T

    2001-08-09

    Researchers at Lawrence Livermore National Laboratory are developing means to collect and identify fluid-based biological pathogens in the forms of proteins, viruses, and bacteria. To support detection instruments, they are developing a flexible fluidic sample preparation unit. The overall goal of this Microfluidic Module is to input a fluid sample, containing background particulates and potentially target compounds, and deliver a processed sample for detection. They are developing techniques for sample purification, mixing, and filtration that would be useful to many applications including immunologic and nucleic acid assays. Many of these fluidic functions are accomplished with acoustic radiation pressure or dielectrophoresis. They are integrating these technologies into packaged systems with pumps and valves to control fluid flow through the fluidic circuit.

  4. Subsurface Noble Gas Sampling Manual

    Energy Technology Data Exchange (ETDEWEB)

    Carrigan, C. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sun, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-18

    The intent of this document is to provide information about best available approaches for performing subsurface soil gas sampling during an On Site Inspection or OSI. This information is based on field sampling experiments, computer simulations and data from the NA-22 Noble Gas Signature Experiment Test Bed at the Nevada National Security Site (NNSS). The approaches should optimize the gas concentration from the subsurface cavity or chimney regime while simultaneously minimizing the potential for atmospheric radioxenon and near-surface Argon-37 contamination. Where possible, we quantitatively assess differences in sampling practices for the same sets of environmental conditions. We recognize that all sampling scenarios cannot be addressed. However, if this document helps to inform the intuition of the reader about addressing the challenges resulting from the inevitable deviations from the scenario assumed here, it will have achieved its goal.

  5. SWOT ANALYSIS ON SAMPLING METHOD

    National Research Council Canada - National Science Library

    CHIS ANCA OANA; BELENESI MARIOARA;

    2014-01-01

    .... Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical forms, the method is very important for auditors...

  6. Biological Sample Monitoring Database (BSMDBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Biological Sample Monitoring Database System (BSMDBS) was developed for the Northeast Fisheries Regional Office and Science Center (NER/NEFSC) to record and...

  7. Tests on standard concrete samples

    CERN Multimedia

    CERN PhotoLab

    1973-01-01

    Compression and tensile tests on standard concrete samples. The use of centrifugal force in tensile testing has been developed by the SB Division and the instruments were built in the Central workshops.

  8. More practical critical height sampling.

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2015-01-01

    Critical Height Sampling (CHS) (Kitamura 1964) can be used to predict cubic volumes per acre without using volume tables or equations. The critical height is defined as the height at which the tree stem appears to be in borderline condition using the point-sampling angle gauge (e.g. prism). An estimate of cubic volume per acre can be obtained from multiplication of the...

  9. Biological Environmental Sampling Technologies Assessment

    Science.gov (United States)

    2015-12-01

    array-based measurements that are made on carbon ink surfaces. The instrument can process a wide variety of sample types and is targeted at... from all types of surfaces and absorb unknown liquids. The Aklus Shield system can also be used to sample debris, soil, or vegetation. For this... system is already part of the selected JBTDS system. Therefore, if the InnovaPrep system was selected, it would reduce the logistical footprint of

  10. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    big data in Section II, followed by a description of the analytic environment D4M in Section III. We then describe the types of sampling methods and... signal reconstruction steps are used to do these operations. Big Data analytics, often characterized by analytics applied to datasets that strain available... Sampling Operations on Big Data. Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller. Lincoln

  11. Biofouling development on plasma treated samples versus layers coated samples

    Science.gov (United States)

    Hnatiuc, B.; Exnar, P.; Sabau, A.; Spatenka, P.; Dumitrache, C. L.; Hnatiuc, M.; Ghita, S.

    2016-12-01

    Biofouling is the most important cause of naval corrosion. In order to reduce biofouling development on naval materials such as steel or resin, several new methods have been tested. These methods could help meet the new IMO environmental regulations, and they could replace a few of the standard operations performed before painting small ships; replacing these operations means a reduction in maintenance costs. Their action must influence especially the first two steps of biofouling development, called microfouling, which take about 24 hours. This work presents comparative results of biofouling development on two classic naval materials, steel and resin, for three treated samples immersed in sea water. Non-thermal plasma, produced by GlidArc technology, was applied to the first sample, called GD; the plasma treatment lasted 10 minutes. The last two samples, called AE9 and AE10, are covered by hydrophobic layers prepared from a special organic-inorganic sol synthesized by the sol-gel method. Theoretically, because of the hydrophobic properties, biofouling formation should be delayed for AE9 and AE10. The biofouling development on each treated sample was compared with an untreated control sample. The microbiological analyses were carried out over 24 hours by epifluorescence microscopy, available for a single layer.

  12. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  13. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique in wide use, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly for a fair view of financial statements, to satisfy the needs of all financial users. In order to be applied correctly, the method must be understood by all its users, and mainly by auditors; otherwise, incorrect application risks loss of reputation and discredit, litigation, and even prison. Since there is no unified practice and methodology for applying the technique, the risk of applying it incorrectly is fairly high. SWOT analysis is a technique that lays out the advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity, and the users of financial statements. The study shows that by applying the sampling method, the audit company and the audited entity both save time, effort, and money. The disadvantages of the method are the difficulty of applying it and of understanding its subtleties. Being widely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  14. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S

    2015-01-01

    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...

  15. Sample Results from Routine Salt Batch 7 Samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. [Savannah River Site (SRS), Aiken, SC (United States)

    2015-05-13

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 7B have been analyzed for 238Pu, 90Sr, 137Cs, Inductively Coupled Plasma Emission Spectroscopy (ICPES), and Ion Chromatography Anions (IC-A). The results from the current microbatch samples are similar to those from earlier samples from this and previous macrobatches. The Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU) continue to show more than adequate Pu and Sr removal, and there is a distinct positive trend in Cs removal, due to the use of the Next Generation Solvent (NGS). The Savannah River National Laboratory (SRNL) notes that historically, most measured Concentration Factor (CF) values during salt processing have been in the 12-14 range. However, recent processing gives CF values closer to 11. This observation does not indicate that the solvent performance is suffering, as the Decontamination Factor (DF) has still maintained consistently high values. Nevertheless, SRNL will continue to monitor for indications of process upsets. The bulk chemistry of the DSSHT and SEHT samples do not show any signs of unusual behavior.

  16. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1997-01-01

    Environmental surveillance of the Hanford Site and surrounding areas is conducted by the Pacific Northwest National Laboratory (PNNL)(a) for the US Department of Energy (DOE). This document contains the planned 1997 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP) and Drinking Water Monitoring Project. In addition, Section 3.0, Biota, also reflects a rotating collection schedule identifying the year a specific sample is scheduled for collection. The purpose of these monitoring projects is to evaluate levels of radioactive and nonradioactive pollutants in the Hanford environs, as required in DOE Order 5400.1, General Environmental Protection Program, and DOE Order 5400.5, Radiation Protection of the Public and the Environment. The sampling methods will be the same as those described in the Environmental Monitoring Plan, US Department of Energy, Richland Operations Office, DOE/RL91-50, Rev. 1, US Department of Energy, Richland, Washington.

  17. β-NMR sample optimization

    CERN Document Server

    Zakoucka, Eva

    2013-01-01

    During my summer student programme I was working on sample optimization for a new β-NMR project at the ISOLDE facility. The β-NMR technique is well established in solid-state physics, and only recently is it being introduced for applications in biochemistry and life sciences. The β-NMR collaboration will be applying for beam time to the INTC committee in September for three nuclei: Cu, Zn and Mg. Sample optimization for Mg was already performed last year during the summer student programme, so sample optimization for Cu and Zn had to be completed as well for the project proposal. My part in the project was to perform thorough literature research on techniques for studying Cu and Zn complexes in native conditions, search for relevant binding candidates for Cu and Zn applicable to β-NMR, and eventually evaluate selected binding candidates using UV-VIS spectrometry.

  18. Ball assisted device for analytical surface sampling

    Science.gov (United States)

    ElNaggar, Mariam S; Van Berkel, Gary J; Covey, Thomas R

    2015-11-03

    A system for sampling a surface includes a sampling probe having a housing and a socket, and a rolling sampling sphere within the socket. The housing has a sampling fluid supply conduit and a sampling fluid exhaust conduit. The sampling fluid supply conduit supplies sampling fluid to the sampling sphere. The sampling fluid exhaust conduit has an inlet opening for receiving sampling fluid carried from the surface by the sampling sphere. A surface sampling probe and a method for sampling a surface are also disclosed.

  19. The ocean sampling day consortium

    DEFF Research Database (Denmark)

    Kopf, Anna; Bicak, Mesude; Kottmann, Renzo

    2015-01-01

    Ocean Sampling Day was initiated by the EU-funded Micro B3 (Marine Microbial Biodiversity, Bioinformatics, Biotechnology) project to obtain a snapshot of the marine microbial biodiversity and function of the world’s oceans. It is a simultaneous global mega-sequencing campaign aiming to generate...... the largest standardized microbial data set in a single day. This will be achievable only through the coordinated efforts of an Ocean Sampling Day Consortium, supportive partnerships and networks between sites. This commentary outlines the establishment, function and aims of the Consortium and describes our...... vision for a sustainable study of marine microbial communities and their embedded functional traits....

  20. Sampling for stereology in lungs

    Directory of Open Access Journals (Sweden)

    J. R. Nyengaard

    2006-12-01

    Full Text Available The present article reviews the relevant stereological estimators for obtaining reliable quantitative structural data from the lungs. Stereological sampling achieves reliable, quantitative information either about the whole lung or complete lobes, whilst minimising the workload. Studies have used systematic random sampling, which has fixed and constant sampling probabilities on all blocks, sections and fields of view. For an estimation of total lung or lobe volume, the Cavalieri principle can be used, but it is not useful in estimating individual cell volume due to various effects from over- or underprojection. If the number of certain structures is required, two methods can be used: the disector and the fractionator. The disector method is a three-dimensional stereological probe for sampling objects according to their number. However, it may be affected by tissue deformation and, therefore, the fractionator method is often the preferred sampling principle. In this method, a known and predetermined fraction of an object is sampled in one or more steps, with the final step estimating the number. Both methods can be performed in a physical and optical manner, therefore enabling numbers of cells and larger lung structures (e.g. the number of alveoli) to be estimated. Some estimators also require randomisation of orientation, so that all directions have an equal chance of being chosen. Using such isotropic sections, surface area, length, and diameter can be estimated on a Cavalieri set of sections. Stereology can also illustrate the potential for transport between two compartments by analysing the barrier width. Estimating the individual volume of cells can be achieved by local stereology using a two-step procedure that first samples lung cells using the disector and then introduces individual volume estimation of the sampled cells.
The coefficient of error of most unbiased stereological estimators is a combination of variance from blocks, sections, fields
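
    The two estimators named in the review reduce to simple arithmetic, which can be sketched as follows (illustrative helpers with our own names; not code from the article):

    ```python
    def cavalieri_volume(section_areas, spacing):
        """Cavalieri principle: for systematic uniform random sections a
        distance `spacing` apart, an unbiased volume estimate is
        V = spacing * sum(A_i)."""
        return spacing * sum(section_areas)

    def fractionator_total(counted, sampling_fractions):
        """Fractionator: count objects in a known fraction of the organ,
        then scale up; with fractions f1, f2, ... the total is
        N = counted / (f1 * f2 * ...)."""
        fraction = 1.0
        for f in sampling_fractions:
            fraction *= f
        return counted / fraction
    ```

    For example, three sections of measured area 2, 3 and 4 mm² spaced 0.5 mm apart give an estimated volume of 4.5 mm³, and 30 alveoli counted after sampling 10% of blocks and 50% of sections within blocks scale up to an estimated 600 alveoli.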

  1. Sample Return Primer and Handbook

    Science.gov (United States)

    Barrow, Kirk; Cheuvront, Allan; Faris, Grant; Hirst, Edward; Mainland, Nora; McGee, Michael; Szalai, Christine; Vellinga, Joseph; Wahl, Thomas; Williams, Kenneth; hide

    2007-01-01

    This three-part Sample Return Primer and Handbook provides a road map for conducting the terminal phase of a sample return mission. The main chapters describe element-by-element analyses and trade studies, as well as required operations plans, procedures, contingencies, interfaces, and corresponding documentation. Based on the experiences of the lead Stardust engineers, the topics include systems engineering (in particular range safety compliance), mission design and navigation, spacecraft hardware and entry, descent, and landing certification, flight and recovery operations, mission assurance and system safety, test and training, and the very important interactions with external support organizations (non-NASA tracking assets, landing site support, and science curation).

  2. Succinct Sampling from Discrete Distributions

    DEFF Research Database (Denmark)

    Bringmann, Karl; Larsen, Kasper Green

    2013-01-01

    We revisit the classic problem of sampling from a discrete distribution: Given n non-negative w-bit integers x_1,...,x_n, the task is to build a data structure that allows sampling i with probability proportional to x_i. The classic solution is Walker's alias method that takes, when implemented on a Word RAM, O(n) preprocessing time, O(1) expected query time for one sample, and n(w+2 lg n+o(1)) bits of space. Using the terminology of succinct data structures, this solution has redundancy 2n lg n+o(n) bits, i.e., it uses 2n lg n+o(n) bits in addition to the information theoretic minimum required... requirement of the classic solution for a fundamental sampling problem; on the other hand, they provide the strongest known separation between the systematic and non-systematic case for any data structure problem. Finally, we also believe our upper bounds are practically efficient and simpler than Walker...
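
    For orientation, the alias method named in the abstract can be sketched in Python (a standard textbook construction with our own function names, not the authors' succinct variant): O(n) preprocessing builds two tables, after which each sample costs one uniform bucket draw plus one biased coin flip.

    ```python
    import random

    def build_alias_table(weights):
        """Walker's alias method: O(n) preprocessing for O(1) sampling."""
        n = len(weights)
        total = sum(weights)
        # Scale weights so the average bucket holds probability mass 1.
        scaled = [w * n / total for w in weights]
        prob = [0.0] * n
        alias = [0] * n
        small = [i for i, p in enumerate(scaled) if p < 1.0]
        large = [i for i, p in enumerate(scaled) if p >= 1.0]
        while small and large:
            s, l = small.pop(), large.pop()
            prob[s] = scaled[s]       # bucket s keeps mass prob[s] ...
            alias[s] = l              # ... and donates the rest to l
            scaled[l] -= 1.0 - scaled[s]
            (small if scaled[l] < 1.0 else large).append(l)
        for i in small + large:       # leftovers are exactly full buckets
            prob[i] = 1.0
        return prob, alias

    def alias_sample(prob, alias, rng=random):
        """Draw one index i with probability proportional to weights[i]."""
        i = rng.randrange(len(prob))
        return i if rng.random() < prob[i] else alias[i]
    ```

    A usage sketch: `prob, alias = build_alias_table([1, 2, 3, 4])`, then repeated calls to `alias_sample(prob, alias)` return index 3 about 40% of the time.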

  3. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    has more than 60 research publications in various journals and is a Member of the International Statistical Institute. Rao is on the Governing Council of the National Sample Survey Organisation and is the Managing Editor of Sankhya, Series B, as well as a coeditor. Resonance, June 1999.

  4. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  5. Environmental surveillance master sampling schedule

    Energy Technology Data Exchange (ETDEWEB)

    Bisping, L.E.

    1994-02-01

    This document contains the planned 1994 schedules for routine collection of samples for the Surface Environmental Surveillance Project (SESP), Drinking Water Project, and Ground-Water Surveillance Project. Samples are routinely collected for the SESP and analyzed to determine the quality of air, surface water, soil, sediment, wildlife, vegetation, foodstuffs, and farm products at Hanford Site and surrounding communities. The responsibility for monitoring the onsite drinking water falls outside the scope of the SESP. The Hanford Environmental Health Foundation is responsible for monitoring the nonradiological parameters as defined in the National Drinking Water Standards while PNL conducts the radiological monitoring of the onsite drinking water. PNL conducts the drinking water monitoring project concurrent with the SESP to promote efficiency and consistency, utilize the expertise developed over the years, and reduce costs associated with management, procedure development, data management, quality control and reporting. The ground-water sampling schedule identifies ground-water sampling events used by PNL for environmental surveillance of the Hanford Site.

  6. The Lyman alpha reference sample

    DEFF Research Database (Denmark)

    Hayes, M.; Östlin, G.; Schaerer, D.

    2013-01-01

    We report on new imaging observations of the Lyman alpha emission line (Lyα), performed with the Hubble Space Telescope, that comprise the backbone of the Lyman alpha Reference Sample. We present images of 14 starburst galaxies at redshifts 0.028

  7. The RECONS 10 Parsec Sample

    Science.gov (United States)

    Henry, Todd; Dieterich, Sergio; Finch, C.; Ianna, P. A.; Jao, W.-C.; Riedel, Adric; Subasavage, John; Winters, J.; RECONS Team

    2018-01-01

    The sample of stars, brown dwarfs, and exoplanets known within 10 parsecs of our Solar System as of January 1, 2017 is presented. The current census is comprised of 416 objects made up of 371 stars (including the Sun and white dwarfs) and 45 brown dwarfs. The stars are known to be orbited by 43 planets (eight in our Solar System and 35 exoplanets). There are 309 systems within 10 pc, including 275 with stellar primaries and 34 systems containing only brown dwarfs.Via a long-term astrometric effort at CTIO, the RECONS (REsearch Consortium On Nearby Stars, www.recons.org) team has added 44 stellar systems to the sample, accounting for one of every seven systems known within 10 pc. Overall, the 278 red dwarfs clearly dominate the sample, accounting for 75% of all stars known within 10 pc. The completeness of the sample is assessed, indicating that a few red, brown, and white dwarfs within 10 pc may be discovered, both as primaries and secondaries, although we estimate that 90% of the stellar systems have been identified. The evolution of the 10 pc sample over the past century is outlined to illustrate our growing knowledge of the solar neighborhood.The luminosity and mass functions for stars within 10 pc are described. In contrast to many studies, once all known close multiples are resolved into individual components, the true mass function rises to the end of the stellar main sequence, followed by a precipitous drop in the number of brown dwarfs, which are outnumbered 8.2 to 1 by stars. Of the 275 stellar primaries in the sample, 182 (66%) are single, 75 (27%) have at least one stellar companion, only 8 (3%) have a brown dwarf companion, and 19 (7%) systems are known to harbor planets. Searches for brown dwarf companions to stars in this sample have been quite rigorous, so the brown dwarf companion rate is unlikely to rise significantly. In contrast, searches for exoplanets, particularly terrestrial planets, have been limited. 
Thus, overall the solar neighborhood is

  8. Apparatus for Sampling Surface Contamination

    Science.gov (United States)

    Wells, Mark

    2008-01-01

    An apparatus denoted a swab device has been developed as a convenient means of acquiring samples of contaminants from surfaces and suspending the samples in liquids. (Thereafter, the liquids can be dispensed, in controlled volumes, into scientific instruments for analysis of the contaminants.) The swab device is designed so as not to introduce additional contamination and to facilitate, simplify, and systematize the dispensing of controlled volumes of liquid into analytical instruments. The swab device is a single apparatus into which are combined all the equipment and materials needed for sampling surface contamination. The swab device contains disposable components stacked together on a nondisposable dispensing head. One of the disposable components is a supply cartridge holding a sufficient volume of liquid for one complete set of samples. (The liquid could be clean water or another suitable solvent, depending on the application.) This supply of liquid is sealed by Luer valves. At the beginning of a sampling process, the user tears open a sealed bag containing the supply cartridge. A tip on the nondisposable dispensing head is engaged with a Luer valve on one end of the supply cartridge and rotated, locking the supply cartridge on the dispensing head and opening the valve. The swab tip includes a fabric swab that is wiped across the surface of interest to acquire a sample. A sealed bag containing a disposable dispensing tip is then opened, and the swab tip is pushed into the dispensing tip until seated. The dispensing head contains a piston that passes through a spring-loaded lip seal. The air volume displaced by this piston forces the liquid out of the supply cartridge, over the swab, and into the dispensing tip. The piston is manually cycled to enforce oscillation of the air volume and thereby to cause water to flow to wash contaminants from the swab and cause the resulting liquid suspension of contaminants to flow into the dispensing tip. 
After several cycles

  9. Authentication of forensic DNA samples.

    Science.gov (United States)

    Frumkin, Dan; Wasserstrom, Adam; Davidson, Ariane; Grafit, Arnon

    2010-02-01

    Over the past twenty years, DNA analysis has revolutionized forensic science, and has become a dominant tool in law enforcement. Today, DNA evidence is key to the conviction or exoneration of suspects of various types of crime, from theft to rape and murder. However, the disturbing possibility that DNA evidence can be faked has been overlooked. It turns out that standard molecular biology techniques such as PCR, molecular cloning, and recently developed whole genome amplification (WGA), enable anyone with basic equipment and know-how to produce practically unlimited amounts of in vitro synthesized (artificial) DNA with any desired genetic profile. This artificial DNA can then be applied to surfaces of objects or incorporated into genuine human tissues and planted in crime scenes. Here we show that the current forensic procedure fails to distinguish between such samples of blood, saliva, and touched surfaces with artificial DNA, and corresponding samples with in vivo generated (natural) DNA. Furthermore, genotyping of both artificial and natural samples with Profiler Plus® yielded full profiles with no anomalies. In order to effectively deal with this problem, we developed an authentication assay, which distinguishes between natural and artificial DNA based on methylation analysis of a set of genomic loci: in natural DNA, some loci are methylated and others are unmethylated, while in artificial DNA all loci are unmethylated. The assay was tested on natural and artificial samples of blood, saliva, and touched surfaces, with complete success. Adopting an authentication assay for casework samples as part of the forensic procedure is necessary for maintaining the high credibility of DNA evidence in the judiciary system.

  10. Superfund Site Information - Site Sampling Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This asset includes Superfund site-specific sampling information including location of samples, types of samples, and analytical chemistry characteristics of...

  11. Adaptive Sampling in Hierarchical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.
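
    As a loose illustration of the caching idea only (nearest-neighbour reuse standing in for the paper's kriging interpolation and metric-tree database; names and structure are ours), consider:

    ```python
    class AdaptiveSampler:
        """Toy surrogate cache for hierarchical simulation: reuse a stored
        fine-scale response when a previously evaluated input lies within
        `tol` of the query, otherwise run the expensive fine-scale model
        and store the new result."""

        def __init__(self, fine_model, tol):
            self.fine_model = fine_model
            self.tol = tol
            self.points = []       # previously queried inputs
            self.values = []       # stored fine-scale responses
            self.evaluations = 0   # count of expensive evaluations

        def query(self, x):
            # Linear scan stands in for a metric-tree nearest-neighbour search.
            best, best_d = None, float('inf')
            for p, v in zip(self.points, self.values):
                d = abs(x - p)
                if d < best_d:
                    best, best_d = v, d
            if best_d <= self.tol:
                return best                 # cheap: reuse cached response
            y = self.fine_model(x)          # expensive fine-scale evaluation
            self.evaluations += 1
            self.points.append(x)
            self.values.append(y)
            return y
    ```

    Querying a cluster of nearby points then triggers only one fine-scale evaluation per cluster, which is the cost reduction the adaptive sampling methodology aims for.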

  12. Network reconstruction via density sampling

    CERN Document Server

    Squartini, Tiziano; Gabrielli, Andrea; Garlaschelli, Diego

    2016-01-01

    Reconstructing weighted networks from partial information is necessary in many important circumstances, e.g. for a correct estimation of systemic risk. It has been shown that, in order to achieve an accurate reconstruction, it is crucial to reliably replicate the empirical degree sequence, which is however unknown in many realistic situations. More recently, it has been found that the knowledge of the degree sequence can be replaced by the knowledge of the strength sequence, which is typically accessible, complemented by that of the total number of links, thus considerably relaxing the observational requirements. Here we further relax these requirements and devise a procedure valid when even the total number of links is unavailable. We assume that, apart from the heterogeneity induced by the degree sequence itself, the network is homogeneous, so that its link density can be estimated by sampling subsets of nodes with representative density. We show that the best way of sampling nodes is the random selecti...
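
    The density-sampling idea can be illustrated with a toy estimator (a hypothetical helper of our own, assuming an undirected graph stored as adjacency sets): average the link density of induced subgraphs over random node subsets.

    ```python
    import itertools
    import random

    def estimate_density(adj, subset_size, n_samples, rng=random):
        """Estimate a graph's link density from random node subsets.

        adj: dict mapping node -> set of neighbours (undirected).
        Under the homogeneity assumption of the abstract, the mean
        density of induced subgraphs estimates the global density.
        """
        nodes = list(adj)
        densities = []
        for _ in range(n_samples):
            subset = rng.sample(nodes, subset_size)
            pairs = list(itertools.combinations(subset, 2))
            links = sum(1 for u, v in pairs if v in adj[u])
            densities.append(links / len(pairs))
        return sum(densities) / len(densities)
    ```

    On a complete graph every induced subgraph has density 1, so the estimator is exact there; on heterogeneous graphs the subset-sampling scheme matters, which is precisely the question the paper studies.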

  13. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen

    2007-01-01

    Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or...... presented cases of variography either solved the initial problems or served to understand the reasons and causes behind the specific process structures revealed in the variograms. Process Analytical Technologies (PAT) are not complete without process TOS....

  14. Pseudo-Marginal Slice Sampling

    OpenAIRE

    Murray, Iain; Graham, Matthew

    2015-01-01

    Markov chain Monte Carlo (MCMC) methods asymptotically sample from complex probability distributions. The pseudo-marginal MCMC framework only requires an unbiased estimator of the unnormalized probability distribution function to construct a Markov chain. However, the resulting chains are harder to tune to a target distribution than conventional MCMC, and the types of updates available are limited. We describe a general way to clamp and update the random numbers used in a pseudo-marginal meth...
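
    For readers unfamiliar with the underlying technique, a minimal univariate slice sampler with stepping-out and shrinkage can be sketched as follows; this is the generic building block, not the pseudo-marginal variant with clamped random numbers that the abstract describes:

    ```python
    import math
    import random

    def slice_sample(logp, x0, n, w=1.0, rng=random):
        """Univariate slice sampling with stepping-out and shrinkage.

        logp: log of an (unnormalized) target density.
        w: initial bracket width for the stepping-out procedure.
        """
        x = x0
        out = []
        for _ in range(n):
            # Auxiliary height: uniform under the density at x, in log space.
            logy = logp(x) + math.log(rng.random())
            # Step out to bracket the slice {x : logp(x) > logy}.
            lo = x - w * rng.random()
            hi = lo + w
            while logp(lo) > logy:
                lo -= w
            while logp(hi) > logy:
                hi += w
            # Shrinkage: propose uniformly, shrink the bracket on rejection.
            while True:
                x1 = lo + (hi - lo) * rng.random()
                if logp(x1) > logy:
                    x = x1
                    break
                if x1 < x:
                    lo = x1
                else:
                    hi = x1
            out.append(x)
        return out
    ```

    Run against a standard normal log-density, `slice_sample(lambda x: -0.5 * x * x, 0.0, 20000)` yields samples with mean near 0 and variance near 1; the pseudo-marginal setting replaces `logp` with a noisy unbiased estimator, which is where tuning becomes hard.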

  15. Accurate sampling using Langevin dynamics

    CERN Document Server

    Bussi, Giovanni

    2008-01-01

    We show how to derive a simple integrator for the Langevin equation and illustrate how it is possible to check the accuracy of the obtained distribution on the fly, using the concept of effective energy introduced in a recent paper [J. Chem. Phys. 126, 014101 (2007)]. Our integrator leads to correct sampling also in the difficult high-friction limit. We also show how these ideas can be applied in practical simulations, using a Lennard-Jones crystal as a paradigmatic case.
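
    The paper derives its own integrator; purely as a minimal point of reference, here is the plain Euler-Maruyama scheme for overdamped Langevin dynamics (our own sketch, without the paper's effective-energy accuracy check), where `grad_u` is the gradient of the potential U:

    ```python
    import math
    import random

    def langevin_sample(grad_u, x0, n_steps, dt, beta=1.0, rng=random):
        """Euler-Maruyama integration of overdamped Langevin dynamics:
            x <- x - grad_u(x)*dt + sqrt(2*dt/beta) * xi,  xi ~ N(0, 1).
        For small dt this samples approximately from exp(-beta * U(x))."""
        x = x0
        traj = []
        noise_scale = math.sqrt(2.0 * dt / beta)
        for _ in range(n_steps):
            x = x - grad_u(x) * dt + noise_scale * rng.gauss(0.0, 1.0)
            traj.append(x)
        return traj
    ```

    For a harmonic potential U(x) = x²/2 (so `grad_u = lambda x: x`) at beta = 1, the trajectory's sample variance approaches 1, up to an O(dt) discretization bias; the paper's point is precisely that such bias can be monitored and corrected on the fly.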

  16. Model-based distance sampling

    OpenAIRE

    Buckland, Stephen Terrence; Oedekoven, Cornelia Sabrina; Borchers, David Louis

    2015-01-01

    CSO was part-funded by EPSRC/NERC Grant EP/1000917/1. Conventional distance sampling adopts a mixed approach, using model-based methods for the detection process, and design-based methods to estimate animal abundance in the study region, given estimated probabilities of detection. In recent years, there has been increasing interest in fully model-based methods. Model-based methods are less robust for estimating animal abundance than conventional methods, but offer several advantages: they ...

  17. Focused conformational sampling in proteins

    Science.gov (United States)

    Bacci, Marco; Langini, Cassiano; Vymětal, Jiří; Caflisch, Amedeo; Vitalis, Andreas

    2017-11-01

    A detailed understanding of the conformational dynamics of biological molecules is difficult to obtain by experimental techniques due to resolution limitations in both time and space. Computer simulations avoid these in theory but are often too short to sample rare events reliably. Here we show that the progress index-guided sampling (PIGS) protocol can be used to enhance the sampling of rare events in selected parts of biomolecules without perturbing the remainder of the system. The method is very easy to use as it only requires as essential input a set of several features representing the parts of interest sufficiently. In this feature space, new states are discovered by spontaneous fluctuations alone and in unsupervised fashion. Because there are no energetic biases acting on phase space variables or projections thereof, the trajectories PIGS generates can be analyzed directly in the framework of transition networks. We demonstrate the possibility and usefulness of such focused explorations of biomolecules with two loops that are part of the binding sites of bromodomains, a family of epigenetic "reader" modules. This real-life application uncovers states that are structurally and kinetically far away from the initial crystallographic structures and are also metastable. Representative conformations are intended to be used in future high-throughput virtual screening campaigns.

  18. Characterization of superconducting multilayers samples

    CERN Document Server

    Antoine, C Z; Berry, S; Bouat, S; Jacquot, J F; Villegier, J C; Lamura, G; Gurevich, A

    2009-01-01

    The best bulk niobium RF accelerating cavities have nearly reached their ultimate limits at an RF equatorial magnetic field H ≈ 200 mT, close to the thermodynamic critical field Hc. In 2006 Gurevich proposed to use nanoscale layers of superconducting materials with high values of Hc > Hc(Nb) for magnetic shielding of bulk niobium to increase the breakdown magnetic field inside SC RF cavities [1]. Depositing good quality layers inside a whole cavity is rather difficult, but we have sputtered high quality samples by applying the technique used for the preparation of superconducting electronics circuits and characterized these samples by X-ray reflectivity, dc resistivity (PPMS) and dc magnetization (SQUID). Dc magnetization curves of a 250 nm thick Nb film have been measured, with and without a magnetron sputtered coating of a single or multiple stack of 15 nm MgO and 25 nm NbN layers. The Nb samples with/without the coating clearly exhibit different behaviors. Because SQUID measurements are influenced by edge an...

  19. Synchronizing data from irregularly sampled sensors

    Science.gov (United States)

    Uluyol, Onder

    2017-07-11

    A system and method include receiving a set of sampled measurements for each of multiple sensors, wherein the sampled measurements are at irregular intervals or different rates, re-sampling the sampled measurements of each of the multiple sensors at a higher rate than one of the sensor's set of sampled measurements, and synchronizing the sampled measurements of each of the multiple sensors.
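
    The interpolation-based version of this idea can be sketched as follows; `synchronize`, the target rate, and the toy sensor streams are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def synchronize(sensors, rate):
    """Resample irregularly sampled sensor streams onto a common uniform
    time base by linear interpolation. The target rate is chosen higher
    than any individual sensor's native rate, as the abstract suggests.

    sensors: dict name -> (times, values), both 1-D arrays
    rate:    target sampling rate in Hz
    """
    t_start = max(t[0] for t, _ in sensors.values())   # overlap window only
    t_end = min(t[-1] for t, _ in sensors.values())
    grid = np.arange(t_start, t_end, 1.0 / rate)       # common time base
    return grid, {name: np.interp(grid, t, v) for name, (t, v) in sensors.items()}

# two sensors sampled at irregular, mutually different instants
a = (np.array([0.0, 0.3, 0.9, 1.6, 2.0]), np.array([0.0, 3.0, 9.0, 16.0, 20.0]))
b = (np.array([0.1, 0.7, 1.1, 2.0]), np.array([1.0, 7.0, 11.0, 20.0]))
grid, synced = synchronize({"a": a, "b": b}, rate=10)
```

    Once both streams are mapped onto `grid`, their samples are aligned index-by-index and can be compared or fused directly.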

  20. Sampling for Machine Translation Evaluation

    OpenAIRE

    de la Fuente, Rubén

    2014-01-01

    This paper intends to provide an overview of best practices developed within PayPal for designing and preparing samples for different tasks included in the process of machine translation evaluation.

  1. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    The problem of quality control of components is considered for the special case where the acceptable failure rate is low, the test costs are high, and it may be difficult or impossible to test the condition of interest directly. Based on classical control theory and the concept of condition indicators introduced by Benjamin and Cornell (1970), a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...
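
    The core Bayesian step, updating a component's defect probability from an indirect condition indicator, can be sketched as below; the function name and all probability values are hypothetical, not taken from the paper:

```python
def posterior_defect_prob(prior, p_ind_given_defect, p_ind_given_ok, indicator):
    """Bayes update of a component's defect probability given whether an
    indirect condition indicator was observed (indicator=True/False)."""
    if indicator:
        like_defect, like_ok = p_ind_given_defect, p_ind_given_ok
    else:
        like_defect, like_ok = 1 - p_ind_given_defect, 1 - p_ind_given_ok
    num = like_defect * prior
    return num / (num + like_ok * (1 - prior))

# a positive indicator raises an assumed 1% prior defect rate substantially
p = posterior_defect_prob(0.01, p_ind_given_defect=0.9, p_ind_given_ok=0.05,
                          indicator=True)
```

    Sampling many components' indicators and updating in this way is what lets the scheme avoid direct (expensive or impossible) condition testing.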

  2. Biobanking and international interoperability: samples.

    Science.gov (United States)

    Kiehntopf, Michael; Krawczak, Michael

    2011-09-01

    In terms of sample exchange, international collaborations between biobanks, or between biobanks and their research partners, have two important aspects. First, the donors' consent usually implies that the scope and purpose of any sample transfer to third parties is subject to major constraints. Since the legal, ethical and political framework of biobanking may differ substantially, even between countries of comparable jurisdictional systems, general rules for the international sharing of biomaterial are difficult, if not impossible, to define. Issues of uncertainty include the right to transfer the material, the scope of research allowed, and intellectual property rights. Since suitable means of international law enforcement may not be available in the context of biobanking, collaborators are advised to clarify any residual uncertainty by means of bilateral contracts, for example, in the form of material transfer agreements. Second, biobank partners may rightly expect that the biomaterial they receive for further analysis attains a certain level of quality. This implies that a biobank has to implement stringent quality control measures covering, in addition to the material transfer itself, the whole process of material acquisition, transport, pre-analytical handling and storage. Again, it may be advisable for biobank partners to claim contractual warranties for the type and quality of the biomaterial they wish to acquire.

  3. Cold SQUIDs and hot samples

    Energy Technology Data Exchange (ETDEWEB)

    Lee, T.S.C. [Univ. of California, Berkeley, CA (United States). Dept. of Physics]; [Lawrence Berkeley National Lab., CA (United States). Materials Sciences Div.]

    1997-05-01

    Low transition temperature (low-Tc) and high-Tc Superconducting QUantum Interference Devices (SQUIDs) have been used to perform high-resolution magnetic measurements on samples whose temperatures are much higher than the operating temperatures of the devices. Part 1 of this work focuses on measurements of the rigidity of flux vortices in high-Tc superconductors using two low-Tc SQUIDs, one on either side of a thermally-insulated sample. The correlation between the signals of the SQUIDs is a direct measure of the extent of correlation between the movements of opposite ends of vortices. These measurements were conducted under the previously-unexplored experimental conditions of nominally-zero applied magnetic field, such that vortex-vortex interactions were unimportant, and with zero external current. At specific temperatures, the authors observed highly-correlated noise sources, suggesting that the vortices moved as rigid rods. At other temperatures, the noise was mostly uncorrelated, suggesting that the relevant vortices were pinned at more than one point along their length. Part 2 describes the design, construction, performance, and applications of a scanning high-Tc SQUID microscope optimized for imaging room-temperature objects with very high spatial resolution and magnetic source sensitivity.

  4. NASA's Aerosol Sampling Experiment Summary

    Science.gov (United States)

    Meyer, Marit E.

    2016-01-01

    In a spacecraft cabin environment, the size range of indoor aerosols is much larger and they persist longer than on Earth because they are not removed by gravitational settling. A previous aerosol experiment in 1991 documented that over 90% of the mass concentration of particles in the NASA Space Shuttle air were between 10 µm and 100 µm based on measurements with a multi-stage virtual impactor and a nephelometer (Liu et al. 1991). While the now-retired Space Shuttle had short duration missions (less than two weeks), the International Space Station (ISS) has been continually inhabited by astronauts for over a decade. High concentrations of inhalable particles on ISS are potentially responsible for crew complaints of respiratory and eye irritation and comments about 'dusty' air. Air filtration is the current control strategy for airborne particles on the ISS, and filtration modeling, performed for engineering and design validation of the air revitalization system in ISS, predicted that PM requirements would be met. However, aerosol monitoring has never been performed on the ISS to verify PM levels. A flight experiment is in preparation which will provide data on particulate matter in ISS ambient air. Particles will be collected with a thermophoretic sampler as well as with passive samplers which will extend the particle size range of sampling. Samples will be returned to Earth for chemical and microscopic analyses, providing the first aerosol data for ISS ambient air.

  5. TRU waste-sampling program

    Energy Technology Data Exchange (ETDEWEB)

    Warren, J.L.; Zerwekh, A.

    1985-08-01

    As part of a TRU waste-sampling program, Los Alamos National Laboratory retrieved and examined 44 drums of 238Pu- and 239Pu-contaminated waste. The drums ranged in age from 8 months to 9 years. The majority of drums were tested for pressure, and gas samples withdrawn from the drums were analyzed by a mass spectrometer. Real-time radiography and visual examination were used to determine both void volumes and waste content. Drum walls were measured for deterioration, and selected drum contents were reassayed for comparison with original assays and WIPP criteria. Each drum tested at atmospheric pressure. Mass spectrometry revealed no problem with 239Pu-contaminated waste, but three 8-month-old drums of 238Pu-contaminated waste contained a potentially hazardous gas mixture. Void volumes fell within the 81 to 97% range. Measurements of drum walls showed no significant corrosion or deterioration. All reassayed contents were within WIPP waste acceptance criteria. Five of the drums opened and examined (15%) could not be certified as packaged. Three contained free liquids, one had corrosive materials, and one had too much unstabilized particulate. Eleven drums had the wrong (or not the most appropriate) waste code. In many cases, disposal volumes had been inefficiently used. 2 refs., 23 figs., 7 tabs.

  6. Sampled-data controller implementation

    Science.gov (United States)

    Wang, Yu; Leduc, Ryan J.

    2012-09-01

    The setting of this article is the implementation of timed discrete-event systems (TDES) as sampled-data (SD) controllers. An SD controller is driven by a periodic clock and sees the system as a series of inputs and outputs. On each clock edge (tick event), it samples its inputs, changes states and updates its outputs. In this article, we establish a formal representation of an SD controller as a Moore synchronous finite state machine (FSM). We describe how to translate a TDES supervisor to an FSM, as well as necessary properties to be able to do so. We discuss how to construct a single centralised controller as well as a set of modular controllers, and show that they will produce equivalent output. We briefly discuss how the recently introduced SD controllability definition relates to our translation method. SD controllability is an extension of TDES controllability which captures several new properties that are useful in dealing with concurrency issues, as well as make it easier to translate a TDES supervisor into an SD controller. We next discuss the application of SD controllability to a small flexible manufacturing system (FMS) from the literature. The example demonstrates the successful application of the new SD properties. We describe the design of the system in detail to illustrate the new conditions and to provide designers with guidance on how to apply the properties. We also present some FSM translation issues encountered, as well as the FSM version of the system's supervisors.
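
    The Moore-machine view of an SD controller (sample inputs on the clock tick, take one transition, emit outputs that depend only on the new state) can be illustrated with a toy sketch; the states and I/O names are invented, and this is not the article's TDES-to-FSM translation procedure:

```python
from dataclasses import dataclass

@dataclass
class MooreFSM:
    """A sampled-data controller modelled as a Moore machine: on each
    clock tick it samples its inputs, takes one transition, and its
    outputs are a function of the resulting state alone."""
    transitions: dict  # (state, sampled inputs) -> next state
    outputs: dict      # state -> outputs held until the next tick
    state: str

    def tick(self, inputs):
        self.state = self.transitions[(self.state, inputs)]
        return self.outputs[self.state]

# toy plant: a motor that runs only while the sampled "start" line is high
fsm = MooreFSM(
    transitions={("idle", ("start",)): "run",
                 ("idle", ()): "idle",
                 ("run", ("start",)): "run",
                 ("run", ()): "idle"},
    outputs={"idle": ("motor_off",), "run": ("motor_on",)},
    state="idle",
)
```

    Because outputs depend only on the state, they stay constant between ticks, which is exactly the sampled-data behavior the article formalizes.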

  7. Towards Cost-efficient Sampling Methods

    CERN Document Server

    Peng, Luo; Chong, Wu

    2014-01-01

    The sampling method has been paid much attention in the field of complex network in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small part of vertices with high node degree can possess the most structure information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and selects the high degree nodes with higher probability by classifying the nodes according to their degree distribution. The second sampling method improves the existing snowball sampling method so that it enables to sample the targeted nodes selectively in every sampling step. Besides, the two proposed sampling methods not only sample the nodes but also pick the edges directly connected to these nodes. In order to demonstrate the two methods' availability and accuracy, we compare them with the existing sampling methods in...
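
    A simplified sketch of the degree-weighted idea common to both methods: nodes are drawn with probability proportional to degree, and the edges directly connected to each sampled node are kept. The function is illustrative and is not the authors' exact algorithms:

```python
import random

def degree_biased_sample(adj, k, seed=None):
    """Sample k distinct nodes with probability proportional to their
    degree, and collect the edges directly connected to them.

    adj: dict node -> set of neighbouring nodes
    """
    rng = random.Random(seed)
    nodes, sampled = list(adj), []
    weights = [len(adj[n]) for n in nodes]       # degree of each node
    while len(sampled) < k and nodes:
        (pick,) = rng.choices(nodes, weights=weights, k=1)
        i = nodes.index(pick)                    # draw without replacement
        nodes.pop(i)
        weights.pop(i)
        sampled.append(pick)
    edges = {frozenset((n, m)) for n in sampled for m in adj[n]}
    return sampled, edges

# star graph: the high-degree hub is strongly favoured by the sampler
adj = {"hub": {"a", "b", "c"}, "a": {"hub"}, "b": {"hub"}, "c": {"hub"}}
nodes, edges = degree_biased_sample(adj, k=2, seed=1)
```

    Weighting by degree concentrates the sample on hubs, which, per the abstract, carry most of the structural information of the network.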

  8. 40 CFR 141.21 - Coliform sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Coliform sampling. 141.21 Section 141... sampling. (a) Routine monitoring. (1) Public water systems must collect total coliform samples at sites... must collect at least one repeat sample from the sampling tap where the original total coliform...

  9. 40 CFR 1065.150 - Continuous sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Continuous sampling. 1065.150 Section... ENGINE-TESTING PROCEDURES Equipment Specifications § 1065.150 Continuous sampling. You may use continuous sampling techniques for measurements that involve raw or dilute sampling. Make sure continuous sampling...

  10. 40 CFR 89.420 - Background sample.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Background sample. 89.420 Section 89... Procedures § 89.420 Background sample. (a) Background samples are produced by continuously drawing a sample... background samples may be produced and analyzed for each mode. Hence, a unique background value will be used...

  11. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
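
    The two probability-sampling designs named above can be sketched as follows; the `patients` population, strata and sizes are invented for illustration:

```python
import random

def simple_random_sample(population, n, seed=None):
    """Probability sampling: every unit has the same chance of selection."""
    return random.Random(seed).sample(population, n)

def stratified_random_sample(strata, fraction, seed=None):
    """Draw the same fraction from each stratum (proportional allocation).

    strata: dict stratum name -> list of units
    """
    rng = random.Random(seed)
    return {name: rng.sample(units, max(1, round(fraction * len(units))))
            for name, units in strata.items()}

patients = {"male": [f"m{i}" for i in range(40)],
            "female": [f"f{i}" for i in range(60)]}
srs = simple_random_sample(patients["male"] + patients["female"], 10, seed=0)
strat = stratified_random_sample(patients, fraction=0.1, seed=0)
```

    Proportional allocation guarantees each stratum is represented in proportion to its size (here 4 of 40 males and 6 of 60 females), which a simple random sample achieves only on average.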

  12. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  13. Sample Return Systems for Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Since the Apollo era, sample return missions have been primarily limited to asteroid sampling. More comprehensive sampling could yield critical information on the...

  14. Methodology series module 5: Sampling strategies

    OpenAIRE

    Maninder Singh Setia

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice of a population that is accessible and available. Some of the non-probabilit...

  15. Zamak samples analyses using EDXRF

    Energy Technology Data Exchange (ETDEWEB)

    Assis, J.T. de; Lima, I.; Monin, V., E-mail: joaquim@iprj.uerj.b, E-mail: inaya@iprj.uerj.b, E-mail: monin@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico. Dept. de Engenharia Mecanica e Energia; Anjos, M. dos; Lopes, R.T., E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear; Alves, H., E-mail: marcelin@uerj.b, E-mail: haimon.dlafis@gmail.co [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada e Termodinamica

    2009-07-01

    Zamak is a family of alloys with a zinc base and aluminium, magnesium and copper as alloying elements. Among non-ferrous metal alloys, Zamak is one of the most widely applied, owing to its physical and mechanical properties and the ease with which it can be electrodeposited. It has good resistance to corrosion, traction, shock and wear. Its low melting point (approximately 400 °C) extends mold life, allowing larger production runs of cast parts. Zamak is used in many areas, for example to produce residential and industrial locks, construction and carpentry components, and refrigerator hinges. It has been observed that in some cases the quality of these products is not very good; the problem may lie in the quality of the Zamak alloy purchased by the industries. One technique that can be used to investigate the quality of these alloys is energy-dispersive X-ray fluorescence. In this paper we present results for eight samples of Zamak alloy obtained by this technique; it was possible to classify the Zamak alloys and to verify some irregularities in them. (author)

  16. Graph Sampling for Covariance Estimation

    KAUST Repository

    Chepuri, Sundeep Prabhakar

    2017-04-25

    In this paper the focus is on subsampling as well as reconstructing the second-order statistics of signals residing on nodes of arbitrary undirected graphs. Second-order stationary graph signals may be obtained by graph filtering zero-mean white noise and they admit a well-defined power spectrum whose shape is determined by the frequency response of the graph filter. Estimating the graph power spectrum forms an important component of stationary graph signal processing and related inference tasks such as Wiener prediction or inpainting on graphs. The central result of this paper is that by sampling a significantly smaller subset of vertices and using simple least squares, we can reconstruct the second-order statistics of the graph signal from the subsampled observations, and more importantly, without any spectral priors. To this end, both a nonparametric approach as well as parametric approaches including moving average and autoregressive models for the graph power spectrum are considered. The results specialize for undirected circulant graphs in that the graph nodes leading to the best compression rates are given by the so-called minimal sparse rulers. A near-optimal greedy algorithm is developed to design the subsampling scheme for the non-parametric and the moving average models, whereas a particular subsampling scheme that allows linear estimation for the autoregressive model is proposed. Numerical experiments on synthetic as well as real datasets related to climatology and processing handwritten digits are provided to demonstrate the developed theory.
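
    The nonparametric least-squares step can be sketched on a toy path graph, whose Laplacian has distinct eigenvalues: stationary signals are generated by graph-filtering white noise, and the power spectrum is recovered from the sample covariance of a vertex subset. This is an illustrative reconstruction under simplifying assumptions (known eigenbasis, hand-picked subset), not the paper's sampler designs or greedy algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# path graph on N vertices; observe only the vertices in `keep`
N, keep = 6, [0, 2, 3, 5]
A = np.zeros((N, N))
i = np.arange(N - 1)
A[i, i + 1] = A[i + 1, i] = 1
L = np.diag(A.sum(1)) - A
lam, U = np.linalg.eigh(L)                # graph frequencies and eigenbasis

# second-order stationary signals: a graph filter h(L) applied to white noise
h = np.exp(-lam)                          # filter frequency response
x = U @ (h[:, None] * rng.standard_normal((N, 20000)))

# sample covariance of the subsampled observations only
Cs = np.cov(x[keep], bias=True)

# nonparametric least squares: vec(Cs) is linear in the power spectrum p_k
M = np.stack([np.outer(U[keep, k], U[keep, k]).ravel() for k in range(N)], axis=1)
p_hat, *_ = np.linalg.lstsq(M, Cs.ravel(), rcond=None)   # estimate of h**2
```

    With enough realizations, `p_hat` approaches the true spectrum `h**2`, provided the chosen vertex subset makes the least-squares system identifiable, even though only four of the six vertices are ever observed.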

  17. Bilateral inferior petrosal sinus sampling.

    Science.gov (United States)

    Zampetti, Benedetta; Grossrubatscher, Erika; Dalino Ciaramella, Paolo; Boccardi, Edoardo; Loli, Paola

    2016-07-01

    Simultaneous bilateral inferior petrosal sinus sampling (BIPSS) plays a crucial role in the diagnostic work-up of Cushing's syndrome. It is the most accurate procedure in the differential diagnosis of hypercortisolism of pituitary or ectopic origin, as compared with clinical, biochemical and imaging analyses, with a sensitivity and specificity of 88-100% and 67-100%, respectively. In the setting of hypercortisolemia, ACTH levels obtained from venous drainage of the pituitary are expected to be higher than the levels of peripheral blood, thus suggesting pituitary ACTH excess as the cause of hypercortisolism. Direct stimulation of the pituitary corticotroph with corticotrophin-releasing hormone enhances the sensitivity of the procedure. The procedure must be undertaken in the presence of hypercortisolemia, which suppresses both the basal and stimulated secretory activity of normal corticotrophic cells: ACTH measured in the sinus is, therefore, the result of the secretory activity of the tumor tissue. The poor accuracy in lateralization of BIPSS (positive predictive value of 50-70%) makes interpetrosal ACTH gradient alone not sufficient for the localization of the tumor. An accurate exploration of the gland is recommended if a tumor is not found in the predicted area. Despite the fact that BIPSS is an invasive procedure, the occurrence of adverse events is extremely rare, particularly if it is performed by experienced operators in referral centres. © 2016 The authors.

  18. Bilateral inferior petrosal sinus sampling

    Directory of Open Access Journals (Sweden)

    Benedetta Zampetti

    2016-08-01

    Simultaneous bilateral inferior petrosal sinus sampling (BIPSS) plays a crucial role in the diagnostic work-up of Cushing’s syndrome. It is the most accurate procedure in the differential diagnosis of hypercortisolism of pituitary or ectopic origin, as compared with clinical, biochemical and imaging analyses, with a sensitivity and specificity of 88–100% and 67–100%, respectively. In the setting of hypercortisolemia, ACTH levels obtained from venous drainage of the pituitary are expected to be higher than the levels of peripheral blood, thus suggesting pituitary ACTH excess as the cause of hypercortisolism. Direct stimulation of the pituitary corticotroph with corticotrophin-releasing hormone enhances the sensitivity of the procedure. The procedure must be undertaken in the presence of hypercortisolemia, which suppresses both the basal and stimulated secretory activity of normal corticotrophic cells: ACTH measured in the sinus is, therefore, the result of the secretory activity of the tumor tissue. The poor accuracy in lateralization of BIPSS (positive predictive value of 50–70%) makes interpetrosal ACTH gradient alone not sufficient for the localization of the tumor. An accurate exploration of the gland is recommended if a tumor is not found in the predicted area. Despite the fact that BIPSS is an invasive procedure, the occurrence of adverse events is extremely rare, particularly if it is performed by experienced operators in referral centres.

  19. 21 CFR 203.38 - Sample lot or control numbers; labeling of sample units.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Sample lot or control numbers; labeling of sample... SERVICES (CONTINUED) DRUGS: GENERAL PRESCRIPTION DRUG MARKETING Samples § 203.38 Sample lot or control numbers; labeling of sample units. (a) Lot or control number required on drug sample labeling and sample...

  20. Tank 12H residuals sample analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L. N. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Shine, E. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Diprete, D. P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hay, M. S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-06-11

    The Savannah River National Laboratory (SRNL) was requested by Savannah River Remediation (SRR) to provide sample preparation and analysis of the Tank 12H final characterization samples to determine the residual tank inventory prior to grouting. Eleven Tank 12H floor and mound residual material samples and three cooling coil scrape samples were collected and delivered to SRNL between May and August of 2014.

  1. Credit in Acceptance Sampling on Attributes

    NARCIS (Netherlands)

    Klaassen, Chris A.J.

    2000-01-01

    Credit is introduced in acceptance sampling on attributes and a Credit Based Acceptance sampling system is developed that is very easy to apply in practice. The credit of a producer is defined as the total number of items accepted since the last rejection. In our sampling system the sample size for a

  2. 45 CFR 1356.84 - Sampling.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Sampling. 1356.84 Section 1356.84 Public Welfare....84 Sampling. (a) The State agency may collect and report the information required in section 1356.83(e) of this part on a sample of the baseline population consistent with the sampling requirements...

  3. 30 CFR 90.208 - Bimonthly sampling.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling. 90.208 Section 90.208... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.208 Bimonthly sampling. (a) Each operator shall take one valid respirable dust sample for...

  4. 30 CFR 90.207 - Compliance sampling.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Compliance sampling. 90.207 Section 90.207... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.207 Compliance sampling. (a) The operator shall take five valid respirable dust samples for...

  5. 42 CFR 402.109 - Statistical sampling.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling study...

  6. 40 CFR 61.34 - Air sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Air sampling. 61.34 Section 61.34... sampling. (a) Stationary sources subject to § 61.32(b) shall locate air sampling sites in accordance with a... concentrations calculated within 30 days after filters are collected. Records of concentrations at all sampling...

  7. 7 CFR 51.17 - Official sampling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Official sampling. 51.17 Section 51.17 Agriculture... Inspection Service § 51.17 Official sampling. Samples may be officially drawn by any duly authorized... time and place of the sampling and the brands or other identifying marks of the containers from which...

  8. 40 CFR 90.422 - Background sample.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Background sample. 90.422 Section 90... Procedures § 90.422 Background sample. (a) Background samples are produced by drawing a sample of the dilution air during the exhaust collection phase of each test cycle mode. (1) An individual background...

  9. Improved variance estimation along sample eigenvectors

    NARCIS (Netherlands)

    Hendrikse, A.J.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    Second order statistics estimates in the form of sample eigenvalues and sample eigenvectors give a suboptimal description of the population density. So far, only attempts have been made to reduce the bias in the sample eigenvalues. However, because the sample eigenvectors differ from the population

  10. Sampling wild species to conserve genetic diversity

    Science.gov (United States)

    Sampling seed from natural populations of crop wild relatives requires choice of the locations to sample from and the amount of seed to sample. While this may seem like a simple choice, in fact careful planning of a collector’s sampling strategy is needed to ensure that a crop wild collection will ...

  11. Nanopipettes: probes for local sample analysis.

    Science.gov (United States)

    Saha-Shah, Anumita; Weber, Anna E; Karty, Jonathan A; Ray, Steven J; Hieftje, Gary M; Baker, Lane A

    2015-06-01

    Nanopipettes (pipettes with diameters nanopipette shank was studied to optimize sampling volume and probe geometry. This method was utilized to collect nanoliter volumes (nanopipettes for surface sampling of mouse brain tissue sections was also explored. Lipid analyses were performed on mouse brain tissues with spatial resolution of sampling as small as 50 μm. Nanopipettes were shown to be a versatile tool that will find further application in studies of sample heterogeneity and population analysis for a wide range of samples.

  12. Separating Interviewer and Sampling-Point Effects

    OpenAIRE

    Rainer Schnell; Frauke Kreuter

    2003-01-01

    "Data used in nationwide face-to-face surveys are almost always collected in multistage cluster samples. The relative homogeneity of the clusters selected in this way can lead to design effects at the sampling stage. Interviewers can further homogenize answers within the small geographic clusters that form the sampling points. The study presented here was designed to distinguish between interviewer effects and sampling-point effects using interpenetrated samples for conducting ...

  13. Concrete samples for organic samples, data package and 222-S validation summary report. Addendum 1A

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, R.E.

    1994-11-01

    This document is in two parts: the first is the data package entitled "Concrete Samples for Organic Samples" and the second is entitled "Concrete Samples for Organic Samples - Addendum 1A", which is the 222-S validation summary report.

  14. Using Inverse Probability Bootstrap Sampling to Eliminate Sample Induced Bias in Model Based Analysis of Unequal Probability Samples.

    Science.gov (United States)

    Nahorniak, Matthew; Larsen, David P; Volk, Carol; Jordan, Chris E

    2015-01-01

    In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design-based statistical analysis tools are appropriate for seamless integration of sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model-based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model-based tools may require, for ensuring unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite numerous method-specific tools available to properly account for sampling design, too often in the analysis of ecological data, sample design is ignored and consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available for researchers to properly account for sampling design in model-based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model-based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model-based analysis tools: linear regression, quantile regression, and boosted regression tree analysis.
In all models, using both simulated and actual ecological data, we found inferences to be
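The IPB idea described in this abstract can be sketched in a few lines: draw with-replacement re-samples in which each unit's selection weight is proportional to the inverse of its inclusion probability, so the re-sample behaves approximately like an equal-probability sample. This is a minimal illustration on an invented toy population, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def ipb_resample(data, inclusion_prob, size=None):
    """Draw one inverse probability bootstrap (IPB) re-sample.

    Units are drawn with replacement with weight proportional to
    1/inclusion_prob, approximating an equal-probability sample
    from the original population.
    """
    data = np.asarray(data)
    inv = 1.0 / np.asarray(inclusion_prob, dtype=float)
    weights = inv / inv.sum()                      # normalized selection weights
    n = len(data) if size is None else size
    idx = rng.choice(len(data), size=n, replace=True, p=weights)
    return data[idx]

# Toy unequal-probability design: inclusion probability proportional to x,
# so large-x (and hence large-y) units are over-represented in the sample.
x_pop = rng.uniform(1, 10, 100_000)
y_pop = 2.0 * x_pop + rng.normal(0.0, 1.0, x_pop.size)
incl = x_pop / x_pop.sum() * 5_000                 # expected sample size ~5000
selected = rng.random(x_pop.size) < incl
y_s, p_s = y_pop[selected], incl[selected]

# The naive sample mean is biased upward; averaging IPB re-sample means
# recovers the population mean.
ipb_means = [ipb_resample(y_s, p_s).mean() for _ in range(200)]
print(y_pop.mean(), y_s.mean(), np.mean(ipb_means))
```

The same re-samples can be fed to any model-based tool, e.g. fitting a regression on each re-sample and averaging the coefficients.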

  15. Data to change the world | CRDI - Centre de recherches ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    28 March 2017 ... "Because, let's be honest, a government alone will never be able to fight corruption or global warming." And this whole drive to "liberate data" is intimately linked to the nature of the Web. What better than the Internet to distribute information at will, without ...

  16. Frequency response variation of two offshore wind park transformers with different tap changer positions

    DEFF Research Database (Denmark)

    Arana Aristi, Iván; Holbøll, Joachim; Sørensen, T

    2010-01-01

    This paper presents the results of several sweep frequency response analysis (SFRA) measurements performed on two identical offshore wind farm transformers. A comparison is made between the transformers based on different recommended measurements and procedures, different measurement systems...

  17. Global risk & global challenges - Space as a game changer for socioeconomic sustainable development

    Science.gov (United States)

    Lehnert, Christopher; Karlsson, Evelina; Giannopapa, Christina

    2017-11-01

    The world's societies at the beginning of the 21st century are better off than ever before (Gapminder, 2015). At the same time, the world is threatened by global challenges in which space, as a tool, can and does play a pivotal role. The challenges range from climate change, through mass unemployment, to terrorism and migration, to name but a few. Space activities have started to respond to this changing world, not only by providing a deeper understanding of our universe, but by using space as an additional sphere and sector through which humankind can increase and secure its wealth; it is thus game changing in the way we sustain humanity's existence. This paper is meant to capture this development. In the first part, an overview is given of the risks that humankind is facing. The second part describes the way that space can be used as a tool to prevent and manage these risks. The overview in the first part is based on an examination of the most recent reports and overall strategies of key international governmental and non-governmental organisations involved in agenda-setting, policy formulation and implementation. The second part includes an overview of current activities of the European Space Agency (ESA) that play a role in responding to these risks. To better understand ESA's activities that help contain humanity's risks, a standard classification for risk management is used, which distinguishes between four components: identification, assessment, management and communication (Renn, 2005). The analysis reveals how space activities already play a pivotal role in all four types of risk management. Space activities contribute very tangibly to the management of risks through space missions, but also more indirectly, by providing the technical backbone for stable and reliable cooperation in the international governance arena and by serving as a crucial economic stimulator.
    The overall results show that space activities touch upon every aspect of responding to humanity's risks. Especially in the identification and the preventive management of humanity's risks, space systems are a crucial enabler. They are also an important part of dealing with risks related to scarcity of resources. It is thus important that current levels of investment in space infrastructure be maintained, as the benefits of space activities are essential to humankind's existence, and that stakeholders involved with the management of risks be consulted on further programmatic decisions.

  18. An Extreme Event as a Game Changer in Coastal Erosion Management

    DEFF Research Database (Denmark)

    Sørensen, Carlo Sass; Drønen, Nils K.; Knudsen, Per

    2016-01-01

    Cyclone Xaver in December 2013, with its severe coastal erosion, led to collaboration between the involved municipalities to work on a coherent solution for the entire coastline that involves sand nourishments, renovation and optimization of hard protection structures, and the restoration of recreational values...

  19. The Axisymmetric Tandem Mirror: A Magnetic Mirror Concept Game Changer Magnet Mirror Status Study Group

    Energy Technology Data Exchange (ETDEWEB)

    Simonen, T; Cohen, R; Correll, D; Fowler, K; Post, D; Berk, H; Horton, W; Hooper, E B; Fisch, N; Hassam, A; Baldwin, D; Pearlstein, D; Logan, G; Turner, B; Moir, R; Molvik, A; Ryutov, D; Ivanov, A A; Kesner, J; Cohen, B; McLean, H; Tamano, T; Tang, X Z; Imai, T

    2008-10-24

    Experimental results, theory and innovative ideas now point with increased confidence to the possibility of a Gas Dynamic Trap (GDT) neutron source which would be on the path to an attractively simple Axisymmetric Tandem Mirror (ATM) power plant. Although magnetic mirror research was terminated in the US 20 years ago, experiments continued in Japan (Gamma 10) and Russia (GDT), with a very small US effort. This research has now yielded data, increased understanding, and generated ideas resulting in the new concepts described here. Early mirror research was carried out with circular axisymmetric magnets. These plasmas were MHD unstable due to the unfavorable magnetic curvature near the mid-plane. Then the minimum-B concept emerged, in which the field line curvature was everywhere favorable and the plasma was situated in an MHD-stable magnetic well (70% average beta in 2XII-B). The Ioffe-bar or baseball-coil became the standard for over 40 years. In the 1980s, driven by success with minimum-B stabilization and the control of ion cyclotron instabilities in PR6 and 2XII-B, mirrors were viewed as a potentially attractive concept with near-term advantages as a lower-Q neutron source for applications such as a hybrid fission fuel factory or toxic waste burner. However, there are downsides to the minimum-B geometry: coil construction is complex, and restraining magnetic forces limit field strength and mirror ratios. Furthermore, the magnetic field lines have geodesic curvature, which introduces resonant and neoclassical radial transport, as observed in early tandem mirror experiments. So what now leads us to think that simple axisymmetric mirror plasmas can be stable? The Russian GDT experiment achieves on-axis 60% beta by peaking of the kinetic plasma pressure near the mirror throat (where the curvature is favorable) to counter-balance the average unfavorable mid-plane curvature. Then a modest augmentation of plasma pressure in the expander results in stability.
    The GDT experiments have confirmed the physics of effluent plasma stabilization predicted by theory. The plasma had a mean ion energy of 10 keV and a density of 5×10^19 m^-3. If successful, the axisymmetric tandem mirror extension of the GDT idea could lead to a Q ≈ 10 power plant of modest size and would yield important applications at lower Q. In addition to the GDT method, there are four other ways to augment stability that have been demonstrated: plasma rotation (MCX), divertor coils (Tara), ponderomotive stabilization (Phaedrus and Tara), and end wall funnel shape (Nizhni Novgorod). There are also five stabilization techniques predicted but not yet demonstrated: expander kinetic pressure (KSTM-Post), pulsed ECH dynamic stabilization (Post), wall stabilization (Berk), non-paraxial end mirrors (Ryutov), and cusp ends (Kesner). These options should be examined further, together with conceptual engineering designs. Physics issues that need further analysis include: electron confinement, MHD and trapped-particle modes, analysis of micro-stability, radial transport, evaluation and optimization of Q, and the plasma density needed to bridge to the expansion region. While promising, all should be examined through increased theory effort, university-scale experiments, and increased international collaboration with the substantial facilities in Russia and Japan. The conventional wisdom on magnetic mirrors was that they would never work as a fusion concept for a number of reasons. This conventional wisdom is most probably all wrong or not applicable, especially for applications such as a low-Q (DT neutron source) aimed at materials testing or a Q ≈ 3-5 fusion neutron source applied to destroying actinides in fission waste and breeding fissile fuel.

  20. Game-Changer: Operationalizing the Common Core Using WebQuests and "Gamification" in Teacher Education

    Science.gov (United States)

    Levitt, Roberta; Piro, Joseph

    2014-01-01

    Technology integration and Information and Communication Technology (ICT)-based education have enhanced the teaching and learning process by introducing a range of web-based instructional resources for classroom practitioners to deepen and extend instruction. One of the most durable of these resources has been the WebQuest. Introduced around the…

  1. Smartphone Applications for Hypertension Management: a Potential Game-Changer That Needs More Control.

    Science.gov (United States)

    Parati, Gianfranco; Torlasco, Camilla; Omboni, Stefano; Pellegrini, Dario

    2017-06-01

    This review article summarizes available data on mobile applications for the management of hypertension, highlighting their potential for clinical use, their current limitations, and the issues still to be addressed in future studies. The number of available applications related to arterial hypertension, and their usage by smartphone owners, is constantly increasing. However, most applications lack standardization and scientific validation, and security flaws could be an important, yet still underrated, issue. Small studies have shown that treatment strategies based on telemonitoring of home blood pressure with mobile applications could improve blood pressure control, but there are no data on hard outcomes, and the high heterogeneity of the available studies severely limits the possibility of reaching a definitive conclusion on the impact of such strategies. Smartphone applications for arterial hypertension represent a great opportunity to improve the management of this condition. Results from small studies are promising, but there is a strong need for large, long-term, well-designed clinical trials before these potential solutions can be reliably applied in real-life patient care.

  2. Genetic markers as therapeutic target in rheumatoid arthritis: A game changer in clinical therapy?

    Science.gov (United States)

    Ali, A M Mohamed Thoufic; Vino, S

    2016-11-01

    Rheumatoid arthritis (RA) is a chronic, inflammatory, multi-systemic autoimmune disease influenced by genetic and environmental factors. These factors are crucial but, on their own, insufficient for the development of the disease; however, they can point to potential therapeutic targets and to predictors of response to clinical therapy. Insights into the contribution of genetic risk factors are currently in progress, with studies querying genetic variation, its role in the expression of coding and non-coding genes, and other mechanisms of disease. In this review, we describe the genetic marker architecture of RA as revealed through genome-wide association studies and meta-analyses. We also discuss the mechanisms of disease pathogenesis investigated through the combined findings of functional and genetic studies of individual RA-associated genes, which include HLA-DRB1, HLA-DQB1, HLA-DPB1, PADI4, PTPN22, TRAF1-C5, STAT4 and C5orf30. However, the genetic background of RA remains to be clearly depicted. Future efforts in the post-genomic and functional-genomics era may move toward a realistic assessment of the genetic effect on RA. The discovery of novel genes associated with the disease may help identify potential biomarkers, which could assist in early diagnosis and aggressive treatment.

  3. Hybrid 3D printing: a game-changer in personalized cardiac medicine?

    Science.gov (United States)

    Kurup, Harikrishnan K N; Samuel, Bennett P; Vettukattil, Joseph J

    2015-12-01

    Three-dimensional (3D) printing in congenital heart disease has the potential to increase procedural efficiency and patient safety by improving interventional and surgical planning and reducing radiation exposure. Cardiac magnetic resonance imaging and computed tomography are usually the source datasets from which 3D-printed models are derived. More recently, 3D echocardiography has also been demonstrated to yield 3D-printed models. The integration of multiple imaging modalities for hybrid 3D printing has also been shown to create accurate printed heart models, which may prove to be beneficial for interventional cardiologists and cardiothoracic surgeons, and as an educational tool. Further advancements in the integration of different imaging modalities into a single platform for hybrid 3D printing and virtual 3D models will drive the future of personalized cardiac medicine.

  4. (GameChanger) Multifunctional Design of Hybrid Composites of Load Bearing Antennas

    Science.gov (United States)

    2011-06-01

    changing technology whose realization will provide a new paradigm in load-bearing electronics and RF front ends. At this moment, we have only...

  5. The Curious Case of NG2 Cells: Transient Trend or Game Changer?

    Directory of Open Access Journals (Sweden)

    Jean-Marie Mangin

    2011-02-01

    Full Text Available It has been 10 years since the seminal work of Dwight Bergles and collaborators demonstrated that NG2 (nerve/glial antigen 2)-expressing oligodendrocyte progenitor cells (NG2 cells) receive functional glutamatergic synapses from neurons (Bergles et al., 2000), contradicting the old dogma that only neurons possess the complex and specialized molecular machinery necessary to receive synapses. While this surprising discovery may have been initially shunned as a novelty item of undefined functional significance, the study of neuron-to-NG2 cell neurotransmission has since become a very active and exciting field of research. Many laboratories have now confirmed and extended the initial discovery, showing for example that NG2 cells can also receive inhibitory GABAergic synapses (Lin and Bergles, 2004) or that neuron-to-NG2 cell synaptic transmission is a rather ubiquitous phenomenon that has been observed in all brain areas explored so far, including white matter tracts (Kukley et al., 2007; Ziskin et al., 2007; Etxeberria et al., 2010). Thus, while still being in its infancy, this field of research has already brought many surprising and interesting discoveries, and has become part of a continuously growing effort in neuroscience to re-evaluate the long underestimated role of glial cells in brain function (Barres, 2008). However, this area of research is now reaching an important milestone and its long-term significance will be defined by its ability to uncover the still elusive function of NG2 cells and their synapses in the brain, rather than by its sensational but transient successes at upsetting the old order established by neuronal physiology.
    To participate in the effort to facilitate such a transition, here we propose a critical review of the latest findings in the field of NG2 cell physiology, discussing how they inform us on the possible function(s) of NG2 cells in the brain, and we present some personal views on new directions the field could benefit from in order to achieve lasting significance.

  6. Profiles of STEM Students: Persisters, Joiners, Changers and Departers. ACT Research Report Series 2017-3

    Science.gov (United States)

    Westrick, Paul

    2017-01-01

    This study is an extension of two previous studies that provided profiles of persisting STEM majors overall (regardless of academic performance) and persisting STEM majors who earned semester GPAs of 3.0 or higher (Westrick, 2016, 2017). Using data from 25 four-year institutions, this study compared the mean ACT assessment scores, HSGPAs, and ACT…

  7. Metabolic changers in oxygen transport in patients with diabetes mellitus type 2. Possibilities for correction

    Directory of Open Access Journals (Sweden)

    I Z Bondarenko

    2009-06-01

    Full Text Available Diabetes mellitus type 2 (DM2) is an independent predictor of the development of heart failure (HF). Spiroergometry is a method for studying blood gas exchange parameters, commonly used for the characterization of HF. The purpose: 1. To study features of gas exchange in patients with DM2 without cardiovascular disease in comparison with healthy controls. 2. To estimate the efficiency of metoprolol for the correction of metabolic disturbances in patients with DM2. Materials and methods: 12 patients with DM2, aged 48.4±8, without a history of cardiovascular disease, and 15 control subjects, aged 43.6±8, underwent a cardiopulmonary exercise test on a treadmill according to the Bruce protocol. Exercise energy, peak VO2, MET, max VE and VCO2 production were observed. Results: Patients with DM2 had a reduced exercise duration (p<0.001), lower peak oxygen consumption (p<0.001), and lower VCO2 production and MET (p<0.005) than controls, representing the same state of hypoxia as in patients with ischemic heart disease (IHD) of functional class 2. The introduction of metoprolol in patients with DM2 significantly increased exercise duration and VCO2 production (p<0.005). Conclusions: 1. VO2 consumption in patients with DM2 is decreased to the same levels as in persons without DM2 who have IHD and HF. 2. Changes in oxygen transport in persons with DM2 may serve as a marker of the negative influence of the disease on cardiovascular system status. 3. Metoprolol improves parameters of the cardio-respiratory system in patients with DM2.

  8. Can a sacred text change? Variations on the Gospel

    OpenAIRE

    Brunet, Étienne

    2000-01-01

    Text republished in 2016 by Bénédicte Pincemin as chapter 20 of volume III of the Écrits choisis of Étienne Brunet, published by Champion (Tous comptes faits. Écrits choisis, tome III. Questions linguistiques). This text is one of the "digital chapters", not printed in the paper volume but an integral part of its content: scholarly re-edition of the text, contribution to part 6, "Texte", of the work, taken into account in the introductory sections, commentaries, index...

  9. Game Changers: Six Technologies That Are Transforming Community College Education and Job Training

    Science.gov (United States)

    Zurier, Steve

    2013-01-01

    Few professors teaching at community colleges today expect students to learn every nuance of every technological device they are likely to encounter throughout the course of their careers. Educators can expose students to enough of a base so that when the technology does evolve--and it will--they have the confidence and ability to adapt and…

  10. Can the flipped classroom change school?

    OpenAIRE

    Peraya, Daniel

    2015-01-01

    In a flipped classroom, students first encounter the subject matter and course content at home through a variety of resources, both traditional and online, while class time is devoted to explanations that build understanding, to deeper exploration, and to exercises and individual and/or group work. What impact does the flipped classroom have on learning? What do the first evaluations say of this pedagogical approach, often presented ...

  11. Game Changers: The Role Athletic Identity and Racial Identity Play on Academic Performance

    Science.gov (United States)

    Bimper, Albert Y., Jr.

    2014-01-01

    The purpose of this study was to examine the degree to which athletic and racial identity predict the academic outcomes of Black student athletes participating in National Collegiate Athletic Association Division 1 Football Bowl Series football. The academic outcomes of Black student athletes are a growing concern to both scholars and…

  12. In sport and social justice, is genetic enhancement a game changer?

    Science.gov (United States)

    Parker, Lisa S

    2012-12-01

    The possibility of genetic enhancement to increase the likelihood of success in sport and life's prospects raises questions for accounts of sport and theories of justice. These questions obviously include the fairness of such enhancement and its relationship to the goals of sport and demands of justice. Of equal interest, however, is the effect on our understanding of individual effort, merit, and desert of either discovering genetic contributions to components of such effort or recognizing the influence of social factors on the development and exercise of individual effort. This paper analyzes arguments about genetic enhancement with the goal of raising questions about how sport and justice regard unchosen, undeserved inequalities and what is assumed to be their opposite-namely, the exercise and results of individual effort. It is suggested that contemplating enhancement of natural assets previously outside human control may reinforce recognition of responsibility to intervene with regard to social advantages so as to support individual effort and improve individuals' life prospects.

  13. Multidrug therapy for leprosy: a game changer on the path to elimination.

    Science.gov (United States)

    Smith, Cairns S; Aerts, Ann; Saunderson, Paul; Kawuma, Joseph; Kita, Etsuko; Virmond, Marcos

    2017-09-01

    Leprosy is present in more than 100 countries, where it remains a major cause of peripheral neuropathy and disability. Attempts to eliminate the disease have faced various obstacles, including characteristics of the causative bacillus Mycobacterium leprae: the long incubation period, limited knowledge about its mode of transmission, and its poor growth on culture media. Fortunately, the leprosy bacillus is sensitive to several antibiotics. The first antibiotic to be widely used for leprosy treatment was dapsone in the 1950s, which had to be taken over several years and was associated with increasing bacterial resistance. Therefore, in 1981, WHO recommended that all registered patients with leprosy should receive combination therapy with three antibiotics: rifampicin, clofazimine, and dapsone. Global implementation of this highly effective multidrug therapy took about 15 years. In 1985, 5·3 million patients were receiving multidrug therapy; by 1991, this figure had decreased to 3·1 million (a decrease of 42%) and, by 2000, to 597 232 (a decrease of almost 90%). This reduction in the number of patients registered for treatment was due to shortening of the treatment regimen and achievement of 100% coverage with multidrug therapy. This achievement, which owed much to WHO and the donors of the multidrug therapy components, prompted WHO in 1991 to set a global target of less than one case per 10 000 population by 2000 to eliminate the disease as a public health problem. All but 15 countries achieved this target. Since 2000, about 250 000 new cases of leprosy have been detected every year. We believe an all-out campaign by a global leprosy coalition is needed to bring that figure down to zero. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Open development: create, exchange, engage | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    11 March 2016 ... Digital technologies and network platforms that leverage open software and open data empower people to participate in social and political processes, to collaborate in science and education, and to benefit from services. Matthew Smith and Ruhiya Seward, ...

  15. Open Innovation as Business Model Game-changer in the Public Sector

    DEFF Research Database (Denmark)

    Gaur, Aakanksha; Osella, Michele; Ferro, Enrico

    2017-01-01

    Organizations are increasingly looking to tap into external knowledge sources through open innovation initiatives. Most public sector agencies are in the early stages of adoption of open innovation and are in the process of defining relevant issues. One such issue concerns how open innovation...... be better aligned with open innovation strategies (in our case crowdsourcing). Our results indicate that in adopting a crowd-based open innovation strategy, the content, structure and governance dimensions of the public sector business model need to be aligned accordingly. The content of the business model...

  16. Hypersonic Vehicles: State-of-the-Art and Potential Game Changers for Future Warfare

    OpenAIRE

    Besser, Hans-Ludwig; Huggins, Michael; Zimper, Dirk; Göge, Dennis

    2016-01-01

    Hypersonic flight lacks a scientific definition but is typically understood as flight within the atmosphere at speeds around and beyond Mach 5. In this regime, dissociation of air starts to become significant, and kinetic heating results in increasingly severe problems for a vehicle as the flight Mach number increases. The temperatures to be dealt with roughly double between Mach 4 and 6 and quadruple between Mach 4 and 9. Drag forces get huge and limit longer flight to altitudes within th...

  17. Sampling a guide for internal auditors

    CERN Document Server

    Apostolou, Barbara

    2004-01-01

    While it is possible to examine 100 percent of an audit customer's data, the time and cost associated with such a study are often prohibitive. To obtain sufficient, reliable, and relevant information with a limited data set, sampling is an efficient and effective tool. It can help you evaluate the customer's assertions, as well as reach audit conclusions and provide reasonable assurance to your organization. This handbook will help you understand sampling. It also serves as a guide for auditors and students preparing for certification. Topics include: An overview of sampling. Statistical and nonstatistical sampling issues. Sampling selection methods and risks. The pros and cons of popular sampling plans.
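The core idea of this handbook, estimating a population characteristic from a limited random sample rather than testing 100 percent of the data, can be sketched as follows. The invoice population and 3% deviation rate are invented for illustration:

```python
import random

random.seed(42)

# Hypothetical audit population: 5,000 invoice IDs, a small fraction of
# which contain a control deviation (3% here, invented for illustration).
population = list(range(1, 5001))
deviations = set(random.sample(population, 150))

# Simple random sample without replacement: every invoice has an equal
# chance of selection, which is what supports statistical conclusions.
sample = random.sample(population, 200)
observed = sum(1 for invoice in sample if invoice in deviations)
rate = observed / len(sample)
print(f"tested {len(sample)} of {len(population)} invoices; "
      f"observed deviation rate {rate:.1%}")
```

In practice the auditor would compare such an observed rate against a tolerable deviation rate, with an allowance for sampling risk.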

  18. Dynamic Method for Identifying Collected Sample Mass

    Science.gov (United States)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
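The estimation idea in this record, inferring collected mass from force-sensor measurements taken during known thruster firings, can be sketched with a simplified one-axis model. This is not the G-Sample software itself; the linear model, noise level, and all numbers below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simplified one-axis model: during thruster firings the force sensor reads
# F_i = (m_tool + m_sample) * a_i + noise, with the acceleration profile a_i
# assumed known.  Under Gaussian noise the maximum-likelihood estimate of the
# total mass is the least-squares slope of F against a.
m_tool = 1.200      # kg, known mass of the empty collection device (assumed)
m_sample = 0.085    # kg, "true" collected sample mass to recover
a = rng.uniform(0.05, 0.5, 400)            # m/s^2, thruster acceleration profile
noise = rng.normal(0.0, 5e-5, a.size)      # N, assumed sensor noise floor
F = (m_tool + m_sample) * a + noise        # N, simulated force measurements

m_total_hat = np.sum(F * a) / np.sum(a * a)   # least-squares slope estimate
m_sample_hat = m_total_hat - m_tool
print(f"estimated sample mass: {m_sample_hat * 1000:.1f} g")
```

As the abstract notes, errors in the assumed thrust (here, acceleration) profile propagate almost one-to-one into the mass estimate, which is why the profile must be well characterized beforehand.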

  19. Small Sample Whole-Genome Amplification

    Energy Technology Data Exchange (ETDEWEB)

    Hara, C A; Nguyen, C P; Wheeler, E K; Sorensen, K J; Arroyo, E S; Vrankovich, G P; Christian, A T

    2005-09-20

    Many challenges arise when trying to amplify and analyze human samples collected in the field due to limitations in sample quantity and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large-volume reactions; in cases where insufficient sample is present, whole genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step to WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples with the use of carrier DNA, while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or can be used in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  20. Aerobot Sampling and Handling System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Honeybee Robotics proposes to: ?Derive and document the functional and technical requirements for Aerobot surface sampling and sample handling across a range of...

  1. 1990 sampling of treated aspen stands

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — In mid-August, 1990, sampling of aspen stand exclosures were conducted at the National Elk Refuge. This sampling is part of a study to monitor aspen regeneration on...

  2. Revised Total Coliform Rule Lab Sampling Form

    Science.gov (United States)

    This form should be completed when a water system collects any required Revised Total Coliform Rule (RTCR) samples. It should also be used when collecting “Special” non-compliance samples for the RTCR.

  3. Extreme Environment Sampling System Deployment Mechanism Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Future Venus or Comet mission architectures may feature robotic sampling systems comprised of a Sampling Tool and Deployment Mechanism. Since 2005, Honeybee has been...

  4. AFSC/ABL: 2009 Chinook Excluder Samples

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project genetically analyzed 1,620 chinook salmon samples from the 2009 spring salmon excluder device test. These samples were collected over a short period of...

  5. ISCO Grab Sample Ion Chromatography Analytical Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — ISCO grab samples were collected from river, wastewater treatment plant discharge, and public drinking water intakes. Samples were analyzed for major ions (ppb)...

  6. Commutability of food microbiology proficiency testing samples.

    Science.gov (United States)

    Abdelmassih, M; Polet, M; Goffaux, M-J; Planchon, V; Dierick, K; Mahillon, J

    2014-03-01

Food microbiology proficiency testing (PT) is a useful tool to assess analytical performance among laboratories. PT items should be close to routine samples to accurately evaluate the acceptability of the methods. However, most PT providers distribute exclusively artificial samples such as reference materials or irradiated foods. This raises the issue of the suitability of these samples because the equivalence-or 'commutability'-between results obtained on artificial vs. authentic food samples has not been demonstrated. In the clinical field, the use of noncommutable PT samples has led to erroneous evaluation of performance when different analytical methods were used. This study aimed to provide a first assessment of the commutability of samples distributed in food microbiology PT. REQUASUD and IPH organized 13 food microbiology PTs involving 10-28 participants. Three types of PT items were used: genuine food samples, sterile food samples and reference materials. The commutability of the artificial samples (reference materials or sterile samples) was assessed by plotting the distribution of the results on natural and artificial PT samples. This comparison highlighted matrix-correlated issues when nonfood matrices, such as reference materials, were used. Artificially inoculated food samples, on the other hand, raised only isolated commutability issues. In the organization of a PT scheme, authentic or artificially inoculated food samples are necessary to accurately evaluate analytical performance. Reference materials, used as PT items because of their convenience, may present commutability issues leading to inaccurate, penalizing conclusions for methods that would have provided accurate results on food samples. For the first time, the commutability of food microbiology PT samples was investigated. The nature of the samples provided by the organizer turned out to be an important factor because matrix effects can impact the analytical results.

  7. Optimal Design in Geostatistics under Preferential Sampling

    OpenAIRE

    Ferreira, Gustavo da Silva; Gamerman, Dani

    2015-01-01

    This paper analyses the effect of preferential sampling in Geostatistics when the choice of new sampling locations is the main interest of the researcher. A Bayesian criterion based on maximizing utility functions is used. Simulated studies are presented and highlight the strong influence of preferential sampling in the decisions. The computational complexity is faced by treating the new local sampling locations as a model parameter and the optimal choice is then made by analysing its posteri...

  8. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Process sampling of moving streams of particulate matter, fluids and slurries (over time or space) or stationary one-dimensional (1-D) lots is often carried out according to existing tradition or protocol not taking the theory of sampling (TOS) into account. In many situations, sampling errors...

  9. 7 CFR 275.11 - Sampling.

    Science.gov (United States)

    2010-01-01

    ... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... two samples for the food stamp quality control review process, an active case sample and a negative... quality control review has an equal or known chance of being selected in the sample. Since the food stamp...

  10. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, A.; Klumperink, Eric A.M.; Nauta, Bram; Bohsali, M.; Djabbari, A.; Socci, G.

    2010-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  11. Low power and low spur sampling PLL

    NARCIS (Netherlands)

    Gao, X.; Klumperink, Eric A.M.; Bahai, A.; Bohsali, M.; Nauta, Bram; Djabbari, A.; Socci, G.

    2010-01-01

Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of one or more sampling control signals, power consumption by the reference signal buffer and spurious output signals from the sampling PLL being controlled can be reduced.

  12. 45 CFR 160.536 - Statistical sampling.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Statistical sampling. 160.536 Section 160.536... REQUIREMENTS GENERAL ADMINISTRATIVE REQUIREMENTS Procedures for Hearings § 160.536 Statistical sampling. (a) In... statistical sampling study as evidence of the number of violations under § 160.406 of this part, or the...

  13. 40 CFR 761.348 - Contemporaneous sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Contemporaneous sampling. 761.348... PROHIBITIONS Sampling Non-Liquid, Non-Metal PCB Bulk Product Waste for Purposes of Characterization for PCB Disposal in Accordance With § 761.62, and Sampling PCB Remediation Waste Destined for Off-Site Disposal...

  14. 10 CFR 430.63 - Sampling.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Sampling. 430.63 Section 430.63 Energy DEPARTMENT OF... Enforcement § 430.63 Sampling. (a) For purposes of a certification of compliance, the determination that a... the case of faucets, showerheads, water closets, and urinals) shall be based upon the sampling...

  15. 40 CFR 61.44 - Stack sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Stack sampling. 61.44 Section 61.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL... Firing § 61.44 Stack sampling. (a) Sources subject to § 61.42(b) shall be continuously sampled, during...

  16. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, A.; Klumperink, Eric A.M.; Nauta, Bram; Bohsali, M.; Djabbari, A.; Socci, G.

    2012-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  17. Spur reduction technique for sampling PLLs

    NARCIS (Netherlands)

    Gao, X.; Bahai, Ahmad; Bohsali, Mounhir; Djabbari, Ali; Klumperink, Eric A.M.; Nauta, Bram; Socci, Gerard

    2013-01-01

    Control circuitry and method of controlling a sampling phase locked loop (PLL). By controlling the duty cycle of a sampling control signal, in accordance with the PLL reference and output signals, spurious output signals from the sampling PLL being controlled can be reduced.

  18. 19 CFR 151.10 - Sampling.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sampling. 151.10 Section 151.10 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE General § 151.10 Sampling. When necessary, the port director...

  19. 7 CFR 75.18 - Sampling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Sampling. 75.18 Section 75.18 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections... CERTIFICATION OF QUALITY OF AGRICULTURAL AND VEGETABLE SEEDS Inspection § 75.18 Sampling. Sampling, when...

  20. 42 CFR 1003.133 - Statistical sampling.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Statistical sampling. 1003.133 Section 1003.133... AUTHORITIES CIVIL MONEY PENALTIES, ASSESSMENTS AND EXCLUSIONS § 1003.133 Statistical sampling. (a) In meeting... statistical sampling study as evidence of the number and amount of claims and/or requests for payment as...

  1. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
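Two families of normalization methods commonly discussed in such reviews can be sketched in a few lines. The snippet below is an illustrative sketch only (function names are hypothetical, not taken from this review): total-sum normalization scales each profile to unit total intensity, while fold-change (PQN-style) normalization divides a sample by its median fold change against a reference profile.

```python
import statistics

def total_sum_normalize(sample):
    # Scale intensities so the whole profile sums to 1.
    total = sum(sample)
    return [x / total for x in sample]

def fold_change_normalize(sample, reference):
    # PQN-style: divide by the median fold change relative to a reference profile,
    # which undoes a uniform dilution of the whole sample.
    folds = [s / r for s, r in zip(sample, reference) if r > 0]
    factor = statistics.median(folds)
    return [x / factor for x in sample]

# A sample that is simply a 2x dilution of the reference normalizes back onto it.
reference = [10.0, 20.0, 5.0, 40.0]
diluted = [5.0, 10.0, 2.5, 20.0]
recovered = fold_change_normalize(diluted, reference)   # -> [10.0, 20.0, 5.0, 40.0]
```

The fold-change variant is robust to a few genuinely changed metabolites because the median ignores them, which is why it is often preferred over total-sum scaling when large biological changes are expected.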

  2. Mixture model analysis of complex samples

    NARCIS (Netherlands)

    Wedel, M; ter Hofstede, F; Steenkamp, JBEM

    1998-01-01

    We investigate the effects of a complex sampling design on the estimation of mixture models. An approximate or pseudo likelihood approach is proposed to obtain consistent estimates of class-specific parameters when the sample arises from such a complex design. The effects of ignoring the sample
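As an illustrative sketch of the pseudo-likelihood idea (not the authors' actual mixture estimator), each observation's likelihood contribution can be weighted by the inverse of its inclusion probability so the design no longer biases the estimates; for a normal mean, the weighted MLE reduces to a Horvitz-Thompson style weighted mean:

```python
def pseudo_mle_mean(values, inclusion_probs):
    # Weight each observation by the inverse of its inclusion probability
    # so over-sampled strata do not dominate the estimate.
    weights = [1.0 / p for p in inclusion_probs]
    return sum(w * y for w, y in zip(weights, values)) / sum(weights)

# Stratum A (values near 10) sampled at prob 0.5, stratum B (near 20) at prob 0.1:
values = [10, 10, 20]
probs = [0.5, 0.5, 0.1]
est = pseudo_mle_mean(values, probs)   # closer to 20 than the naive mean
```

The naive mean of the three values is about 13.3, while the design-weighted estimate is about 17.1, reflecting that the under-sampled stratum represents far more of the population.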

  3. Lunar and Meteorite Sample Disk for Educators

    Science.gov (United States)

    Foxworth, Suzanne; Luckey, M.; McInturff, B.; Allen, J.; Kascak, A.

    2015-01-01

NASA Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation and distribution of samples for research, education and public outreach. Between 1969 and 1972, six Apollo missions brought back 382 kilograms of lunar rock, core, and regolith samples from the lunar surface. JSC also curates meteorites collected from a US cooperative effort among NASA, the National Science Foundation (NSF) and the Smithsonian Institution that funds expeditions to Antarctica. The meteorites that are collected include rocks from the Moon, Mars, and many asteroids including Vesta. The sample disks for educational use include these different samples. Active relevant learning has always been important to teachers, and the Lunar and Meteorite Sample Disk Program provides this active style of learning for students and the general public. The Lunar and Meteorite Sample Disks permit students to conduct investigations comparable to actual scientists. The Lunar Sample Disk contains 6 samples; Basalt, Breccia, Highland Regolith, Anorthosite, Mare Regolith and Orange Soil. The Meteorite Sample Disk contains 6 samples; Chondrite L3, Chondrite H5, Carbonaceous Chondrite, Basaltic Achondrite, Iron and Stony-Iron. Teachers are given different activities that adhere to their standards with the disks. During a Sample Disk Certification Workshop, teachers participate in the activities as students gain insight into the history, formation and geologic processes of the moon, asteroids and meteorites.

  4. Multivariate stratified sampling by stochastic multiobjective optimisation

    OpenAIRE

    Diaz-Garcia, Jose A.; Ramos-Quiroga, Rogelio

    2011-01-01

    This work considers the allocation problem for multivariate stratified random sampling as a problem of integer non-linear stochastic multiobjective mathematical programming. With this goal in mind the asymptotic distribution of the vector of sample variances is studied. Two alternative approaches are suggested for solving the allocation problem for multivariate stratified random sampling. An example is presented by applying the different proposed techniques.
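For comparison with the stochastic multiobjective formulation studied in the paper, the classical deterministic answer to the univariate allocation problem is Neyman allocation, n_h proportional to N_h * S_h. A minimal sketch (hypothetical helper name; naive rounding that a production allocator would need to repair so the parts always sum to n):

```python
def neyman_allocation(strata_sizes, strata_stds, n_total):
    # Allocate n_total units proportionally to N_h * S_h (Neyman allocation),
    # which minimises the variance of the stratified mean for fixed total n.
    products = [N * S for N, S in zip(strata_sizes, strata_stds)]
    total = sum(products)
    # Naive rounding: in general the rounded parts may not sum to n_total.
    return [round(n_total * p / total) for p in products]

# Three strata: the large, highly variable stratum receives most of the sample.
alloc = neyman_allocation([5000, 3000, 2000], [10.0, 4.0, 1.0], 100)   # [78, 19, 3]
```

The multivariate problem of the paper arises because each survey variable implies a different S_h, so no single allocation is optimal for all of them simultaneously.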

  5. Illustration of Launching Samples Home from Mars

    Science.gov (United States)

    2005-01-01

One crucial step in a Mars sample return mission would be to launch the collected sample away from the surface of Mars. This artist's concept depicts a Mars ascent vehicle for starting a sample of Mars rocks on its trip to Earth.

  6. Self-Digitization of Sample Volumes

    Science.gov (United States)

    Cohen, Dawn E.; Schneider, Thomas; Wang, Michelle; Chiu, Daniel T.

    2010-01-01

This paper describes a very simple and robust microfluidic device for digitizing samples into an array of discrete volumes. The device is based on an inherent fluidic phenomenon, where an incoming aqueous sample divides itself into an array of chambers that have been primed with an immiscible phase. Self-digitization of sample volumes results from the interplay between fluidic forces, interfacial tension, channel geometry, and the final stability of the digitized samples in the chambers. Here we describe experiments and simulations that were used to characterize these parameters and the conditions under which the self-digitization process occurred. Unlike existing methods used to partition samples into arrays, our method is able to digitize 100% of a sample into a localized array without any loss of sample volume. The final volume of the discretized sample at each location is defined by the geometry and size of each chamber. Thus, we can form an array of samples with varying but predefined volumes. We exploited this feature to separate the crystal growth of otherwise concomitant polymorphs from a single solution. Additionally, we demonstrated the removal of the digitized samples from the chambers for downstream analysis, as well as the addition of reagents to the digitized samples. We believe this simple method will be useful in a broad range of applications where a large array of discretized samples is required, including digital PCR, single-cell analysis, and cell-based drug screening. PMID:20550137

  7. 7 CFR 28.908 - Samples.

    Science.gov (United States)

    2010-01-01

    .... Samples may be drawn in gins equipped with mechanical samplers approved by the Division and operated... that were drawn by a mechanical sampler at the gin may be transported with the bales to the warehouse... sample from a bale for review classification if the producer so desires. (b) Drawing of samples manual...

  8. Social network sampling using spanning trees

    Science.gov (United States)

    Jalali, Zeinab S.; Rezvanian, Alireza; Meybodi, Mohammad Reza

    2016-12-01

Due to the large scales and limitations in accessing most online social networks, it is hard or infeasible to directly access them in a reasonable amount of time for study and analysis. Hence, network sampling has emerged as a suitable technique to study and analyze real networks. The main goal of sampling online social networks is constructing a small-scale sampled network which preserves the most important properties of the original network. In this paper, we propose two sampling algorithms for sampling online social networks using spanning trees. The first proposed sampling algorithm finds several spanning trees from randomly chosen starting nodes; then the edges in these spanning trees are ranked according to the number of times that each edge has appeared in the set of found spanning trees in the given network. The sampled network is then constructed as a sub-graph of the original network which contains a fraction of nodes that are incident on highly ranked edges. In order to avoid traversing the entire network, the second sampling algorithm is proposed using partial spanning trees. The second sampling algorithm is similar to the first algorithm except that it uses partial spanning trees. Several experiments are conducted to examine the performance of the proposed sampling algorithms on well-known real networks. The obtained results in comparison with other popular sampling methods demonstrate the efficiency of the proposed sampling algorithms in terms of Kolmogorov-Smirnov distance (KSD), skew divergence distance (SDD) and normalized distance (ND).
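The first algorithm described above can be sketched as follows, under assumed details not given in the abstract (randomised DFS to draw each spanning tree; all names hypothetical):

```python
import random
from collections import Counter

def random_spanning_tree(adj, start):
    # Randomised DFS from `start`; returns the tree edges as frozensets.
    visited = {start}
    stack = [start]
    edges = []
    while stack:
        node = stack.pop()
        nbrs = [n for n in adj[node] if n not in visited]
        random.shuffle(nbrs)
        for n in nbrs:
            visited.add(n)
            edges.append(frozenset((node, n)))
            stack.append(n)
    return edges

def spanning_tree_sample(adj, n_trees, fraction, seed=0):
    # Rank edges by how often they appear across several random spanning
    # trees, then keep nodes incident to the highest-ranked edges.
    random.seed(seed)
    counts = Counter()
    for _ in range(n_trees):
        counts.update(random_spanning_tree(adj, random.choice(list(adj))))
    target = max(1, int(fraction * len(adj)))
    sampled = set()
    for edge, _ in counts.most_common():
        sampled.update(edge)
        if len(sampled) >= target:
            break
    return sampled

# Small toy graph: a triangle attached to a path.
adj = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
nodes = spanning_tree_sample(adj, n_trees=20, fraction=0.6)
```

Bridge edges (here 3-4 and 4-5) appear in every spanning tree, so they rank highest and their endpoints are kept first, which is the intuition behind the edge-frequency ranking.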

  9. Optimization of environmental sampling using interactive GIS.

    NARCIS (Netherlands)

    Groenigen, van J.W.; Stein, A.; Zuurbier, R.

    1997-01-01

    An interactive sampling procedure is proposed to optimize environmental risk assessment. Subsequent sampling stages were used as quantitative pre-information. With this pre-information probability maps were made using indicator kriging to direct subsequent sampling. In this way, optimal use of the

  10. Radar Doppler Processing with Nonuniform Sampling.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

Conventional signal processing to estimate radar Doppler frequency often assumes uniform pulse/sample spacing. This is for the convenience of the processing. More recent performance enhancements in processor capability allow optimal processing of nonuniform pulse/sample spacing, thereby overcoming some of the baggage that attends uniform sampling, such as Doppler ambiguity and SNR losses due to sidelobe control measures.
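One way to process nonuniformly spaced pulses, sketched here as an assumption rather than the report's actual method, is to evaluate the spectrum directly at candidate Doppler frequencies; unlike an FFT, this direct sum needs no uniform time grid:

```python
import math
import random

def dominant_frequency(times, samples, freq_grid):
    # Direct evaluation of signal power at candidate frequencies; valid for
    # arbitrary (nonuniform) sample times, unlike an FFT.
    best_f, best_power = None, -1.0
    for f in freq_grid:
        re = sum(s * math.cos(2 * math.pi * f * t) for t, s in zip(times, samples))
        im = sum(s * math.sin(2 * math.pi * f * t) for t, s in zip(times, samples))
        power = re * re + im * im
        if power > best_power:
            best_f, best_power = f, power
    return best_f

# Sinusoid at 37 Hz sampled at random (nonuniform) instants over 1 second.
random.seed(1)
times = sorted(random.uniform(0.0, 1.0) for _ in range(200))
samples = [math.sin(2 * math.pi * 37.0 * t) for t in times]
grid = [f * 0.5 for f in range(1, 200)]   # 0.5 Hz steps up to ~100 Hz
est = dominant_frequency(times, samples, grid)
```

Because the sample times are irregular, the usual Nyquist-rate aliasing (Doppler ambiguity) is suppressed: no frequency other than the true one lines up coherently with the random time stamps.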

  11. METHODOLOGICAL ASPECTS OF STRATIFICATION OF AUDIT SAMPLING

    OpenAIRE

    Vilena A. Yakimova

    2013-01-01

The article presents the methodological foundations for constructing a stratified audit sample for attribute-based sampling. Sampling techniques from Russian and foreign practice are studied and stratified. The role of stratification in the audit is described. Approaches to constructing the stratification are revealed on the basis of professional judgment (qualitative methods), statistical groupings (quantitative methods) and combinatory ones (complex qualitative stratifications). Gro...

  12. Global Unique Identification of Geoscience Samples: The International Geo Sample Number (IGSN) and the System for Earth Sample Registration (SESAR)

    Science.gov (United States)

    Lehnert, K. A.; Goldstein, S. L.; Vinayagamoorthy, S.; Lenhardt, W. C.

    2005-12-01

    Data on samples represent a primary foundation of Geoscience research across disciplines, ranging from the study of climate change, to biogeochemical cycles, to mantle and continental dynamics and are key to our knowledge of the Earth's dynamical systems and evolution. Different data types are generated for individual samples by different research groups, published in different papers, and stored in different databases on a global scale. The utility of these data is critically dependent on their integration. Such integration can be achieved within a Geoscience Cyberinfrastructure, but requires unambiguous identification of samples. Currently, naming of samples is arbitrary and inconsistent and therefore severely limits our ability to share, link, and integrate sample-based data. Major problems include name duplication, and changing of names as a sample is passed along over many years to different investigators. SESAR, the System for Earth Sample Registration (http://www.geosamples.org), addresses this problem by building a registry that generates and administers globally unique identifiers for Geoscience samples: the International Geo Sample Number (IGSN). Implementation of the IGSN in data publication and digital data management will dramatically advance interoperability among information systems for sample-based data, opening an extensive range of new opportunities for discovery and for interdisciplinary approaches in research. The IGSN will also facilitate the ability of investigators to build on previously collected data on samples as new measurements are made or new techniques are developed. With potentially broad application to all types of Geoscience samples, SESAR is global in scope. It is a web-based system that can be easily accessed by individual users through an interactive web interface and by distributed client systems via standard web services. 
Samples can be registered individually or in batches and at various levels of granularity from entire cores

  13. [Variance estimation considering multistage sampling design in multistage complex sample analysis].

    Science.gov (United States)

    Li, Yichong; Zhao, Yinjun; Wang, Limin; Zhang, Mei; Zhou, Maigeng

    2016-03-01

Multistage sampling is a frequently-used method in random sampling surveys in public health. Clustering or independence between observations often exists in samples generated by multistage sampling, often called complex samples. Sampling error may be underestimated and the probability of type I error may be increased if the multistage sample design is not taken into consideration in the analysis. As the variance (error) estimator for a complex sample is often complicated, statistical software usually adopts the ultimate cluster variance estimate (UCVE) to approximate the estimation, which simply assumes that the sample comes from one-stage sampling. However, with an increased sampling fraction of primary sampling units, the contribution from subsequent sampling stages is no longer trivial, and the ultimate cluster variance estimate may, therefore, lead to invalid variance estimation. This paper summarizes a method of variance estimation that considers the multistage sampling design. Its performance is compared with UCVE by simulating random sampling under different sampling schemes using real-world data. Simulation showed that as the primary sampling unit (PSU) sampling fraction increased, UCVE tended to generate increasingly biased estimates, whereas accurate estimates were obtained by the method considering the multistage sampling design.
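The UCVE being discussed can be sketched in a few lines (hypothetical function name): each PSU-level estimate is treated as a single observation and only the between-PSU variation enters the estimate, which is exactly why it becomes biased when the first-stage sampling fraction is large:

```python
def ultimate_cluster_variance(psu_estimates, fpc=1.0):
    # UCVE for the mean: use only between-PSU variation. `fpc` marks where a
    # finite-population correction such as (1 - n/N) would enter when the
    # PSU sampling fraction is non-negligible.
    n = len(psu_estimates)
    mean = sum(psu_estimates) / n
    between = sum((y - mean) ** 2 for y in psu_estimates) / (n - 1)
    return fpc * between / n

# PSU-level means from a hypothetical two-stage sample:
psu_means = [12.0, 15.0, 11.0, 14.0]
v = ultimate_cluster_variance(psu_means)   # variance of the overall mean estimate
```

Nothing about the second-stage (within-PSU) sampling appears in the formula; a multistage-aware estimator of the kind the paper summarizes adds the within-PSU variance contributions stage by stage.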

  14. Handling missing data in ranked set sampling

    CERN Document Server

    Bouza-Herrera, Carlos N

    2013-01-01

    The existence of missing observations is a very important aspect to be considered in the application of survey sampling, for example. In human populations they may be caused by a refusal of some interviewees to give the true value for the variable of interest. Traditionally, simple random sampling is used to select samples. Most statistical models are supported by the use of samples selected by means of this design. In recent decades, an alternative design has started being used, which, in many cases, shows an improvement in terms of accuracy compared with traditional sampling. It is called R

  15. Statistical aspects of food safety sampling.

    Science.gov (United States)

    Jongenburger, I; den Besten, H M W; Zwietering, M H

    2015-01-01

    In food safety management, sampling is an important tool for verifying control. Sampling by nature is a stochastic process. However, uncertainty regarding results is made even greater by the uneven distribution of microorganisms in a batch of food. This article reviews statistical aspects of sampling and describes the impact of distributions on the sampling results. Five different batch contamination scenarios are illustrated: a homogeneous batch, a heterogeneous batch with high- or low-level contamination, and a batch with localized high- or low-level contamination. These batch contamination scenarios showed that sampling results have to be interpreted carefully, especially when heterogeneous and localized contamination in food products is expected.
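The homogeneous scenario admits a one-line detection model, sketched below with illustrative names; for contamination localized in an unsampled part of the batch this formula becomes an upper bound, which is the article's point about interpreting sampling results carefully:

```python
def detect_prob_homogeneous(n_samples, frac_contaminated):
    # Probability that at least one of n independent samples hits contamination
    # when a fraction of the batch is contaminated and well mixed.
    return 1.0 - (1.0 - frac_contaminated) ** n_samples

# Sampling a batch in which 1% of units are contaminated:
p10 = detect_prob_homogeneous(10, 0.01)   # ~0.096
p60 = detect_prob_homogeneous(60, 0.01)   # ~0.45
```

Even 60 samples detect 1% homogeneous contamination less than half the time, and for localized low-level contamination the true detection probability can be far lower still.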

  16. Rotary Mode Core Sample System availability improvement

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, W.W.; Bennett, K.L.; Potter, J.D. [Westinghouse Hanford Co., Richland, WA (United States); Cross, B.T.; Burkes, J.M.; Rogers, A.C. [Southwest Research Institute (United States)

    1995-02-28

The Rotary Mode Core Sample System (RMCSS) is used to obtain stratified samples of the waste deposits in single-shell and double-shell waste tanks at the Hanford Site. The samples are used to characterize the waste in support of ongoing and future waste remediation efforts. Four sampling trucks have been developed to obtain these samples. Truck 1 was the first in operation and is currently being used to obtain samples where the push mode is appropriate (i.e., no rotation of the drill). Truck 2 is similar to truck 1, except for added safety features, and is in operation to obtain samples using either a push mode or rotary drill mode. Trucks 3 and 4 are now being fabricated to be essentially identical to truck 2.

  17. Quality evaluation of processed clay soil samples.

    Science.gov (United States)

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku

    2016-01-01

This study assessed the microbial quality of clay samples sold on two of the major Ghanaian markets. The study was a cross-sectional evaluation of processed clay and the effects it has on the nutrition of consumers in the political capital town of Ghana. The items examined were processed clay soil samples. Staphylococcus spp. and fecal coliforms including Klebsiella, Escherichia, Shigella and Enterobacter spp. were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable counts, 6.5 Log cfu/g, and Staphylococcal count, 5.8 Log cfu/g. For fecal coliforms, Madina market samples had the highest count, 6.5 Log cfu/g, and also recorded the highest levels of yeast and mould. For Koforidua, total viable count was highest in the samples from the Zongo market, 6.3 Log cfu/g. Central market samples had the highest count of fecal coliforms, 4.6 Log cfu/g, and yeasts and moulds, 6.5 Log cfu/g. The "Small" market recorded the highest staphylococcal count, 6.2 Log cfu/g. The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra respectively. The clay samples were found to contain Klebsiella spp., Escherichia, Enterobacter, Shigella spp., Staphylococcus spp., yeast and mould. These have health implications when consumed.

  18. Static versus dynamic sampling for data mining

    Energy Technology Data Exchange (ETDEWEB)

    John, G.H.; Langley, P. [Stanford Univ., CA (United States)

    1996-12-31

As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
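A dynamic scheme of the kind described can be sketched as progressive sampling: grow the sample geometrically and stop once the mining tool's result stabilises. This is an illustrative sketch with hypothetical names, using a mean estimate as a stand-in for a mining tool:

```python
import random

def progressive_sample(data, evaluate, start=100, tol=0.005, seed=0):
    # Dynamic sampling sketch: double the sample size and stop once the
    # tool's result changes by less than `tol` between successive sizes.
    rng = random.Random(seed)
    n, prev = start, None
    while n < len(data):
        sample = rng.sample(data, n)
        score = evaluate(sample)
        if prev is not None and abs(score - prev) < tol:
            return sample, score
        prev, n = score, min(2 * n, len(data))
    return data, evaluate(data)

# Toy "mining tool": estimate the mean of a large numeric column.
data = [(-1) ** i * (i % 7) for i in range(100_000)]
sample, score = progressive_sample(data, evaluate=lambda s: sum(s) / len(s))
```

The stopping rule is mining-tool-specific, which is the paper's contrast with static tests: the same database might need a small sample for a coarse tool and a much larger one for a sensitive tool.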

  19. Uncertainty and sampling issues in tank characterization

    Energy Technology Data Exchange (ETDEWEB)

Liebetrau, A.M.; Pulsipher, B.A.; Kashporenko, D.M. [and others]

    1997-06-01

A defensible characterization strategy must recognize that uncertainties are inherent in any measurement or estimate of interest and must employ statistical methods for quantifying and managing those uncertainties. Estimates of risk and therefore key decisions must incorporate knowledge about uncertainty. This report focuses on statistical methods that should be employed to ensure confident decision making and appropriate management of uncertainty. Sampling is a major source of uncertainty that deserves special consideration in the tank characterization strategy. The question of whether sampling will ever provide the reliable information needed to resolve safety issues is explored. The issue of sample representativeness must be resolved before sample information is reliable. Representativeness is a relative term but can be defined in terms of bias and precision. Currently, precision can be quantified and managed through an effective sampling and statistical analysis program. Quantifying bias is more difficult and is not being addressed under the current sampling strategies. Bias could be bounded by (1) employing new sampling methods that can obtain samples from other areas in the tanks, (2) putting in new risers on some worst case tanks and comparing the results from existing risers with new risers, or (3) sampling tanks through risers under which no disturbance or activity has previously occurred. With some bound on bias and estimates of precision, various sampling strategies could be determined and shown to be either cost-effective or infeasible.

  20. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Brad G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Abrecht, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hayes, James C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mendoza, Donaldo P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

Soil gas sampling is currently conducted in support of Nuclear Test Ban treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  1. Distributed MIMO radar using compressive sampling

    CERN Document Server

    Petropulu, Athina P; Poor, H Vincent

    2009-01-01

    A distributed MIMO radar is considered, in which the transmit and receive antennas belong to nodes of a small scale wireless network. The transmit waveforms could be uncorrelated, or correlated in order to achieve a desirable beampattern. The concept of compressive sampling is employed at the receive nodes in order to perform direction of arrival (DOA) estimation. According to the theory of compressive sampling, a signal that is sparse in some domain can be recovered based on far fewer samples than required by the Nyquist sampling theorem. The DOAs of targets form a sparse vector in the angle space, and therefore, compressive sampling can be applied for DOA estimation. The proposed approach achieves the superior resolution of MIMO radar with far fewer samples than other approaches. This is particularly useful in a distributed scenario, in which the results at each receive node need to be transmitted to a fusion center.
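The sparse-recovery step can be illustrated under strong simplifying assumptions (a single target, a real-valued random sensing matrix, and one matching-pursuit iteration rather than a full compressive-sampling reconstruction): the occupied angle bin is the sensing-matrix column most correlated with the measurements.

```python
import math
import random

def recover_sparse_doa(measurements, sensing_matrix):
    # One matching-pursuit step: for a 1-sparse angle vector, the occupied
    # bin is the column with the largest normalised correlation.
    best_k, best_score = None, -1.0
    n_bins = len(sensing_matrix[0])
    for k in range(n_bins):
        col = [row[k] for row in sensing_matrix]
        corr = abs(sum(m * c for m, c in zip(measurements, col)))
        norm = math.sqrt(sum(c * c for c in col))
        if corr / norm > best_score:
            best_k, best_score = k, corr / norm
    return best_k

random.seed(7)
n_meas, n_bins, true_bin = 40, 200, 23          # far fewer measurements than bins
A = [[random.gauss(0, 1) for _ in range(n_bins)] for _ in range(n_meas)]
y = [row[true_bin] * 3.0 for row in A]          # 1-sparse scene: one target at bin 23
found = recover_sparse_doa(y, A)
```

With 40 measurements against 200 angle bins, recovery below the Nyquist count is possible precisely because the angle vector is sparse; multiple targets would require iterating (orthogonal matching pursuit) or solving an l1-minimisation problem.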

  2. Automatic polarization control in optical sampling system

    Science.gov (United States)

    Zhao, Zhao; Yang, Aiying; Feng, Lihui

    2015-08-01

    In an optical sampling system for high-speed optical communications, polarization control is one of the most important parts of the system, regardless of whether the sampling is nonlinear or linear. A simple method based on variance calculation of the sampled data is proposed in this paper to tune the wave plates in a motor-driven polarization controller. In the experiment, an optical sampling system based on SFG in PPLN is carried out for a 10 Gbit/s or faster optical data signal. The results demonstrate that, with the proposed method, the error of the Q factor estimated from the sampled data is minimized, and the tuning time to reach the optimized polarization state is less than 30 seconds with an accuracy of +/-1°.
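
    The variance criterion can be illustrated with a hedged toy model. The cos² coupling law, the specific angles, and the scan procedure below are all assumptions for illustration, not the authors' experimental setup; measurement noise is omitted for clarity. The point is simply that misalignment shrinks the sampled amplitude, so scanning a wave-plate angle and maximizing the variance of the sampled data locates the optimal polarization state.

```python
import numpy as np

# Hedged toy model of the variance criterion (purely illustrative).
rng = np.random.default_rng(0)
theta_opt = 37.0                                 # unknown optimal angle, degrees
pattern = rng.choice([0.0, 1.0], size=2000)      # sampled marks and spaces

def sampled_variance(theta_deg):
    # Malus-like coupling efficiency between signal and sampling gate.
    eff = np.cos(np.deg2rad(theta_deg - theta_opt)) ** 2
    return np.var(eff * pattern)                 # variance shrinks off-alignment

angles = np.arange(0.0, 180.0, 1.0)              # coarse wave-plate scan
best = angles[int(np.argmax([sampled_variance(a) for a in angles]))]
print(best)   # 37.0: the variance peaks at the aligned state
```

    A real controller would iterate such a scan (coarse, then fine) on each wave plate, which is consistent with the sub-30-second tuning time reported above.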

  3. Reconstruction of Intensity From Covered Samples

    Energy Technology Data Exchange (ETDEWEB)

    Barabash, Rozaliya [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Watkins, Thomas R [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Meisner, Roberta Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burchell, Timothy D [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rosseel, Thomas M [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    The safe handling of activated samples requires containment and covering the sample to eliminate any potential for contamination. Subsequent characterization of the surface with x-rays ideally necessitates a thin film. While many films appear visually transparent, they are not necessarily x-ray transparent. Each film material has a unique beam attenuation and sometimes has amorphous peaks that can superimpose on those of the sample. To reconstruct the intensity of the underlying activated sample, the x-ray attenuation and signal due to the film need to be removed from those of the sample. This requires the calculation of unique deconvolution parameters for the film. The development of a reconstruction procedure for a contained/covered sample is described.
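
    A minimal sketch of the correction idea, under an assumed model (not ORNL's actual procedure): the measured pattern is the sample pattern scaled by the film transmission plus the film's own scattering, I_meas = T_film · I_sample + I_film, so the sample intensity is recovered by subtracting a separately measured film pattern and dividing by the transmission. All coefficients and peak shapes below are hypothetical.

```python
import numpy as np

# Assumed model: I_meas = T_film * I_sample + I_film, with T_film = exp(-mu*t)
# applied twice (incident and diffracted beam both traverse the film).
mu = 0.8       # hypothetical film attenuation coefficient, 1/mm
t = 0.1        # hypothetical film thickness traversed, mm
T_film = np.exp(-mu * t) ** 2

two_theta = np.linspace(10, 80, 701)
I_film = 5.0 * np.exp(-((two_theta - 20.0) / 8.0) ** 2)          # broad amorphous hump
I_sample_true = 100.0 * np.exp(-((two_theta - 44.0) / 0.3) ** 2)  # sample Bragg peak

I_meas = T_film * I_sample_true + I_film          # what the detector sees
# Reconstruction: subtract the film's own pattern, rescale by its transmission.
I_reconstructed = (I_meas - I_film) / T_film

print(np.max(np.abs(I_reconstructed - I_sample_true)))  # reconstruction error
```

    In practice the film pattern and attenuation would be measured on a blank film, and peak overlap makes the deconvolution less clean than in this idealized sketch.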

  4. Subsurface Sample Acquisition and Transfer Systems (SSATS)

    Science.gov (United States)

    Rafeek, S.; Gorevan, S. P.; Kong, K. Y.

    2001-01-01

    In the exploration of planets and small bodies, scientists will need the services of a deep drilling and material handling system not only to obtain the samples necessary for analyses but also to precisely transfer and deposit those samples in in-situ instruments on board a landed craft or rover. The technology for such a deep sampling system as the SSATS is currently being developed by Honeybee Robotics through a PIDDP effort. The SSATS has its foundation in a one-meter prototype (SATM) drill that was developed under the New Millennium Program for ST4/Champollion. Additionally, the SSATS includes relevant coring technology from a coring drill (Athena Mini-Corer) developed for the Mars Sample Return Mission. These highly developed technologies, along with the current PIDDP effort, are combined to produce a sampling system that can acquire and transfer samples from various depths. Additional information is contained in the original extended abstract.

  5. Sample size in qualitative interview studies

    DEFF Research Database (Denmark)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit Kristiane

    2016-01-01

    Sample sizes must be ascertained in qualitative studies, as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is “saturation.” Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose...... the concept “information power” to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power...... depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning...

  6. Numerical simulations of regolith sampling processes

    Science.gov (United States)

    Schäfer, Christoph M.; Scherrer, Samuel; Buchwald, Robert; Maindl, Thomas I.; Speith, Roland; Kley, Wilhelm

    2017-07-01

    We present recent improvements in the simulation of regolith sampling processes in microgravity using the numerical particle method smooth particle hydrodynamics (SPH). We use an elastic-plastic soil constitutive model for large deformation and failure flows to describe the dynamical behaviour of regolith. In the context of projected small body (asteroid or small moon) sample return missions, we investigate the efficiency and feasibility of a particular material sampling method: brushes sweep material from the asteroid's surface into a collecting tray. We analyze the influence of different material parameters of regolith, such as cohesion and angle of internal friction, on the sampling rate. Furthermore, we study the sampling process in two environments by varying the surface gravity (Earth's and Phobos'), and we apply different rotation rates for the brushes. We find good agreement of our sampling simulations on Earth with experiments and provide estimations for the influence of the material properties on the collecting rate.

  7. Diagnostic herd sensitivity using environmental samples

    DEFF Research Database (Denmark)

    Vigre, Håkan; Josefsen, Mathilde Hartmann; Seyfarth, Anne Mette

    Due to logistic and economic benefits, the animal industry has an increased interest in using environmental samples to classify herds free of infections. For a valid interpretation of results obtained from environmental samples, the performance of the diagnostic method using these samples must...... In our example, the prevalence of infected pigs in each herd was estimated from the pooled samples of nasal swabs. Logistic regression was used to estimate the effect of animal prevalence on the probability to detect MRSA in the dust and air samples at herd level. The results show a significant increase...... of the within-herd prevalence, and performed almost perfectly at a prevalence of 25% infected pigs (sensitivity = 99%). In general, the dependence on within-herd prevalence should be considered in designing surveillance programs based on environmental samples.

  8. Sample transport with thermocapillary force for microfluidics

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, N-T; Pang, W W; Huang, X [School of Mechanical and Production Engineering, Nanyang Technological University 50 Nanyang Avenue, Singapore 639798 (Singapore)

    2006-04-01

    This paper presents a novel concept for the transport of aqueous samples in capillaries. The concept is based on the thermocapillary effect, which utilizes the temperature dependency of surface tension to drive a sample droplet. To date, the major problem of this concept has been the evaporation of the aqueous sample. In our approach, a liquid-liquid system was used for delivering the sample. The aqueous sample is protected by silicone oil, so evaporation can be avoided. A transient temperature field drives both liquids away from a heater. The paper first presents a theoretical model for the coupled thermocapillary problem. Next, the paper compares and discusses experimental results with different capillary sizes. The results show the huge potential of this concept for handling sample droplets dispersed in oil, which are often created by droplet-based microfluidics.

  9. Direct impact aerosol sampling by electrostatic precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Braden, Jason D.; Harter, Andrew G.; Stinson, Brad J.; Sullivan, Nicholas M.

    2016-02-02

    The present disclosure provides apparatuses for collecting aerosol samples by ionizing an air sample at different degrees. An air flow is generated through a cavity in which at least one corona wire is disposed and electrically charged to form a corona therearound. At least one grounded sample collection plate is provided downstream of the at least one corona wire so that aerosol ions generated within the corona are deposited on the at least one grounded sample collection plate. A plurality of aerosol samples ionized to different degrees can be generated. The at least one corona wire may be perpendicular to the direction of the flow, or may be parallel to the direction of the flow. The apparatus can include a serial connection of a plurality of stages such that each stage is capable of generating at least one aerosol sample, and the air flow passes through the plurality of stages serially.

  10. Incremental Sampling Methodology (ISM) for Metallic Residues

    Science.gov (United States)

    2013-08-01

    ...result in improved precision for Cu, or if other changes, such as increasing the digestion aliquot mass or digestion interval or increasing the number... 200 g of material. The soil samples were air-dried at ambient temperature, sieved to remove the greater-than-2-mm fraction, and the less-than-2-mm... yielding a 25-kg sample. The incremental sample was air-dried at ambient temperature and passed through a 2-mm sieve. A rotary splitter was...

  11. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data......) that fully cover all practical aspects of sampling and provides a handy “toolbox” for samplers, engineers, laboratory and scientific personnel....

  12. Field Sampling from a Segmented Image

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-06-01

    Full Text Available. Field Sampling from a Segmented Image. P. Debba (1), A. Stein (2), F.D. van der Meer (2), E.J.M. Carranza (2), A. Lucieer (3). (1) The Council for Scientific and Industrial Research (CSIR), Logistics and Quantitative Methods, CSIR Built Environment, P... Presentation outline: objective; study site; methods (the ICM algorithm, sampling per category, sample size per category, fitness function per category, simulated annealing per category); results; experiment case study; conclusions.

  13. Efficient Monte Carlo sampling by parallel marginalization

    OpenAIRE

    Weare, Jonathan

    2007-01-01

    Markov chain Monte Carlo sampling methods often suffer from long correlation times. Consequently, these methods must be run for many steps to generate an independent sample. In this paper, a method is proposed to overcome this difficulty. The method utilizes information from rapidly equilibrating coarse Markov chains that sample marginal distributions of the full system. This is accomplished through exchanges between the full chain and the auxiliary coarse chains. Results of numerical tests on the bridge sampling and filtering/smoothing problems for a stochastic differential equation are presented.

  14. Techniques for geothermal liquid sampling and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kindle, C.H.; Woodruff, E.M.

    1981-07-01

    A methodology has been developed that is particularly suited to liquid-dominated resources and adaptable to a variety of situations. It is intended to be a base methodology upon which variations can be made to meet specific needs or situations. The approach consists of recording flow conditions at the time of sampling, a specific insertable probe sampling system, a sample stabilization procedure, commercially available laboratory instruments, and data quality check procedures.

  15. METALLOGRAPHIC SAMPLE PREPARATION STATION-CONSTRUCTIVE CONCEPT

    Directory of Open Access Journals (Sweden)

    AVRAM Florin Timotei

    2016-11-01

    Full Text Available. In this paper we present the issues involved in the constructive design of a station for metallographic sample preparation. This station is intended for laboratory work. The metallographic station is composed of an ABB IRB1600 robot, a metallographic microscope, a gripping device, a manipulator, and a laboratory grinding and polishing machine. The robot is used to manipulate the samples, and the manipulator takes each sample for processing.

  16. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks towards achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art overview of the developments in this field to date.

  17. It's in the Sample: The Effects of Sample Size and Sample Diversity on the Breadth of Inductive Generalization

    Science.gov (United States)

    Lawson, Chris A.; Fisher, Anna V.

    2011-01-01

    Developmental studies have provided mixed evidence with regard to the question of whether children consider sample size and sample diversity in their inductive generalizations. Results from four experiments with 105 undergraduates, 105 school-age children (M = 7.2 years), and 105 preschoolers (M = 4.9 years) showed that preschoolers made a higher…

  18. Investigation of Hardened Filling Grout Samples

    DEFF Research Database (Denmark)

    Sørensen, Eigil V.

    Suzlon Wind Energy A/S requested on August 28, 2007 an investigation of 2 samples of a hardened filling grout to be carried out, comprising drilling and strength determination of 4 test cylinders, and description of the surface characteristics of the samples.

  19. Unbiased sampling and meshing of isosurfaces

    KAUST Repository

    Yan, Dongming

    2014-11-01

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c, of a function, F, is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating/sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.
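
    The height-function idea can be sketched in a few lines. Within one cell, a trilinear F is linear in z for fixed (x, y), so the isosurface height has a closed form. The corner values below are hypothetical (chosen so that F straddles c everywhere in the cell), and this sketch omits the slope correction and the cycling over all three axis directions that the paper uses to make the samples unbiased.

```python
import numpy as np

# Toy sketch: sample points on a trilinear isosurface F(x,y,z) = c in one cell
# by treating it as a height function z = g(x,y).  Not the paper's full method.
rng = np.random.default_rng(1)
c = 0.5
# Hypothetical F values at the 8 corners; corner[i, j, k] sits at (x=i, y=j, z=k).
corner = np.array([[[0.1, 0.9], [0.2, 0.8]],
                   [[0.3, 0.7], [0.0, 1.0]]])

def bilinear(face, x, y):
    # Bilinear interpolation of a 2x2 face of corner values.
    return (face[0, 0] * (1 - x) * (1 - y) + face[1, 0] * x * (1 - y)
            + face[0, 1] * (1 - x) * y + face[1, 1] * x * y)

samples = []
for _ in range(100):
    x, y = rng.random(2)
    a = bilinear(corner[:, :, 0], x, y)    # F at (x, y, 0)
    b = bilinear(corner[:, :, 1], x, y)    # F at (x, y, 1)
    z = (c - a) / (b - a)                  # F is linear in z: solve F(x,y,z) = c
    samples.append((x, y, z))              # with these corners, z is always in [0, 1]

print(len(samples))
```

    Uniform (x, y) makes these samples denser where the surface is flat in z; weighting by the local slope (and combining the three axis directions) removes that bias.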

  20. How to calculate sample size and why.

    Science.gov (United States)

    Kim, Jeehyoung; Seo, Bong Soo

    2013-09-01

    Calculating the sample size is essential to reduce the cost of a study and to prove the hypothesis effectively. Referring to pilot studies and previous research studies, we can choose a proper hypothesis and simplify the studies by using a website or Microsoft Excel sheet that contains formulas for calculating sample size in the beginning stage of the study. There are numerous formulas for calculating the sample size for complicated statistics and studies, but most studies can use basic calculating methods for sample size calculation.
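
    One of the basic calculating methods the abstract alludes to can be sketched directly: the standard per-group sample size for comparing two means with a two-sided test. The formula n = 2((z_{α/2} + z_β)·σ/δ)² is textbook-standard; the particular numbers in the example are illustrative.

```python
from math import ceil
from statistics import NormalDist

# Per-group sample size for detecting a difference delta between two means,
# given common standard deviation sigma, significance level alpha, and power.
def sample_size_two_means(delta, sigma, alpha=0.05, power=0.8):
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Example: detect a 5-point difference with SD 10 at alpha = 0.05, 80% power.
print(sample_size_two_means(delta=5, sigma=10))   # 63 per group
```

    A pilot study supplies the estimates of σ and δ; spreadsheet or web calculators implement this same formula (with variants for proportions, paired designs, etc.).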

  1. Sampling in the Linear Canonical Transform Domain

    Directory of Open Access Journals (Sweden)

    Bing-Zhao Li

    2012-01-01

    Full Text Available. This paper investigates the interpolation formulae and the sampling theorem for bandpass signals in the linear canonical transform (LCT) domain. Firstly, one of the important relationships between bandpass signals in the Fourier domain and bandpass signals in the LCT domain is derived. Secondly, two interpolation formulae from uniformly sampled points at half of the sampling rate, associated with the bandpass signals and their generalized Hilbert transform or derivatives in the LCT domain, are obtained. Thirdly, the interpolation formulae from nonuniform samples are investigated. Simulation results are also presented to verify the correctness of the derived results.

  2. Efficient Monte Carlo sampling by parallel marginalization.

    Science.gov (United States)

    Weare, Jonathan

    2007-07-31

    Markov chain Monte Carlo sampling methods often suffer from long correlation times. Consequently, these methods must be run for many steps to generate an independent sample. In this paper, a method is proposed to overcome this difficulty. The method utilizes information from rapidly equilibrating coarse Markov chains that sample marginal distributions of the full system. This is accomplished through exchanges between the full chain and the auxiliary coarse chains. Results of numerical tests on the bridge sampling and filtering/smoothing problems for a stochastic differential equation are presented.
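
    A minimal sketch in the spirit of the method, under assumptions that are not the paper's construction (target distribution, proposal scales, and exchange schedule are all chosen here for illustration): a full chain on a strongly correlated 2-D Gaussian periodically exchanges its x-coordinate with an auxiliary coarse chain that samples the (here, known) marginal of x, which helps decorrelate the full chain.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.95                                  # strong correlation: slow plain MCMC

def log_joint(x, y):                        # target: correlated bivariate normal
    return -(x * x - 2 * rho * x * y + y * y) / (2 * (1 - rho ** 2))

def log_marg(x):                            # marginal of x: standard normal
    return -0.5 * x * x

xy = np.array([0.0, 0.0])                   # full chain state
xc = 0.0                                    # coarse (marginal) chain state
xs = []
for it in range(40000):
    # Metropolis step on the full chain (small steps due to correlation).
    prop = xy + 0.3 * rng.standard_normal(2)
    if np.log(rng.random()) < log_joint(*prop) - log_joint(*xy):
        xy = prop
    # Metropolis step on the rapidly mixing coarse chain.
    pc = xc + 1.0 * rng.standard_normal()
    if np.log(rng.random()) < log_marg(pc) - log_marg(xc):
        xc = pc
    # Exchange move: propose swapping the x-coordinates of the two chains.
    if it % 5 == 0:
        dlog = (log_joint(xc, xy[1]) + log_marg(xy[0])
                - log_joint(*xy) - log_marg(xc))
        if np.log(rng.random()) < dlog:
            xy[0], xc = xc, float(xy[0])
    xs.append(float(xy[0]))

print(np.mean(xs), np.std(xs))              # should approach 0 and 1
```

    The swap acceptance ratio keeps π(x,y)·π_x(xc) invariant, so the exchanges are exact; the paper's contribution is doing this with coarse chains for marginals that are not known in closed form.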

  3. Sample size determination for the fluctuation experiment.

    Science.gov (United States)

    Zheng, Qi

    2017-01-01

    The Luria-Delbrück fluctuation experiment protocol is increasingly employed to determine microbial mutation rates in the laboratory. An important question raised at the planning stage is "How many cultures are needed?" For over 70 years sample sizes have been determined either by intuition or by following published examples where sample sizes were chosen intuitively. This paper proposes a practical method for determining the sample size. The proposed method relies on existing algorithms for computing the expected Fisher information under two commonly used mutant distributions. The role of partial plating in reducing sample size is discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Drone inflight mixing of biochemical samples.

    Science.gov (United States)

    Katariya, Mayur; Chung, Dwayne Chung Kim; Minife, Tristan; Gupta, Harshit; Zahidi, Alifa Afiah Ahmad; Liew, Oi Wah; Ng, Tuck Wah

    2018-01-04

    Autonomous systems for sample transport to the laboratory for analysis can be improved in terms of timeliness, cost and error mitigation in the pre-analytical testing phase. Drones have been reported for outdoor sample transport but incorporating devices on them to attain homogenous mixing of reagents during flight to enhance sample processing timeliness is limited by payload issues. It is shown here that flipping maneuvers conducted with quadcopters are able to facilitate complete and gentle mixing. This capability incorporated during automated sample transport serves to address an important factor contributing to pre-analytical variability which ultimately impacts on test result reliability. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Sampling Criterion for EMC Near Field Measurements

    DEFF Research Database (Denmark)

    Franek, Ondrej; Sørensen, Morten; Ebert, Hans

    2012-01-01

    An alternative, quasi-empirical sampling criterion for EMC near field measurements intended for close coupling investigations is proposed. The criterion is based on maximum error caused by sub-optimal sampling of near fields in the vicinity of an elementary dipole, which is suggested as a worst-case representative of a signal trace on a typical printed circuit board. It has been found that the sampling density derived in this way is in fact very similar to that given by the antenna near field sampling theorem, if an error less than 1 dB is required. The principal advantage of the proposed formulation is its...

  6. DXC'13 Industrial Track Sample Data

    Data.gov (United States)

    National Aeronautics and Space Administration — The sample scenarios provided here are competition scenarios from previous DXC competitions. They are identical to the competition data associated with previous...

  7. Collecting Wipe Samples for VX Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koester, C; Hoppes, W G

    2010-02-11

    This standard operating procedure (SOP) provides uniform procedures for the collection of wipe samples of VX residues from surfaces. Personnel may use this procedure to collect and handle wipe samples in the field. Various surfaces, including building materials (wood, metal, tile, vinyl, etc.) and equipment, may be sampled based on this procedure. The purpose of such sampling is to determine whether or not the relevant surfaces are contaminated, to determine the extent of their contamination, to evaluate the effectiveness of decontamination procedures, and to determine the amount of contaminant that might be present as a contact hazard.

  8. Method and apparatus for sampling atmospheric mercury

    Science.gov (United States)

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  9. Colloid characterization and quantification in groundwater samples

    Energy Technology Data Exchange (ETDEWEB)

    K. Stephen Kung

    2000-06-01

    This report describes the work conducted at Los Alamos National Laboratory for studying groundwater colloids for the Yucca Mountain Project in conjunction with the Hydrologic Resources Management Program (HRMP) and the Underground Test Area (UGTA) Project. Colloidal particle size distributions and total particle concentrations in groundwater samples are quantified and characterized. Colloid materials from cavity waters collected near underground nuclear explosion sites by HRMP field sampling personnel at the Nevada Test Site (NTS) were quantified. Selected colloid samples were further characterized by electron microscopy to evaluate colloid shapes, elemental compositions, and mineral phases. The authors evaluated the colloid size and concentration in a natural groundwater sample that was collected from the ER-20-5 well and stored in a 50-gallon (about 200-liter) barrel for several months. This groundwater sample was studied because HRMP personnel had identified trace levels of radionuclides in the water sample. Colloid results show that even though the water sample had been filtered through a series of Millipore filters, high colloid concentrations were identified in all unfiltered and filtered samples. The authors studied samples that were diluted with distilled water and found that the diluted samples contained more colloids than the undiluted ones. These results imply that the colloids are probably not stable under the storage conditions. Furthermore, the results demonstrate that undesired colloids were introduced into the samples during the storage, filtration, and dilution processes. The authors evaluated possible sources of colloid contamination associated with sample collection, filtration, storage, and analysis of natural groundwaters. The effects of container type and sample storage time on colloid size distribution and total concentration were studied to evaluate colloid stability using J13 groundwater. The data suggest that groundwater samples

  10. Comet Odyssey: Comet Surface Sample Return

    Science.gov (United States)

    Weissman, Paul R.; Bradley, J.; Smythe, W. D.; Brophy, J. R.; Lisano, M. E.; Syvertson, M. L.; Cangahuala, L. A.; Liu, J.; Carlisle, G. L.

    2010-10-01

    Comet Odyssey is a proposed New Frontiers mission that would return the first samples from the surface of a cometary nucleus. Stardust demonstrated the tremendous power of analysis of returned samples in terrestrial laboratories versus what can be accomplished in situ with robotic missions. But Stardust collected only 1 milligram of coma dust, and the 6.1 km/s flyby speed heated samples up to 2000 K. Comet Odyssey would collect two independent 800 cc samples directly from the surface in a far more benign manner, preserving the primitive composition. Given a minimum surface density of 0.2 g/cm3, this would return two 160 g surface samples to Earth. Comet Odyssey employs solar-electric propulsion to rendezvous with the target comet. After 180 days of reconnaissance and site selection, the spacecraft performs a "touch-and-go” maneuver with surface contact lasting 3 seconds. A brush-wheel sampler on a remote arm collects up to 800 cc of sample. A duplicate second arm and sampler collects the second sample. The samples are placed in a return capsule and maintained at colder than -70 C during the return flight and at colder than -30 C during re-entry and for up to six hours after landing. The entire capsule is then refrigerated and transported to the Astromaterials Curatorial Facility at NASA/JSC for initial inspection and sample analysis by the Comet Odyssey team. Comet Odyssey's planned target was comet 9P/Tempel 1, with launch in December 2017 and comet arrival in June 2022. After a stay of 300 days at the comet, the spacecraft departs and arrives at Earth in May 2027. Comet Odyssey is a forerunner to a flagship Cryogenic Comet Sample Return mission that would return samples from deep below the nucleus surface, including volatile ices. This work was supported by internal funds from the Jet Propulsion Laboratory.

  11. Sample-Clock Phase-Control Feedback

    Science.gov (United States)

    Quirk, Kevin J.; Gin, Jonathan W.; Nguyen, Danh H.; Nguyen, Huy

    2012-01-01

    To demodulate a communication signal, a receiver must recover and synchronize to the symbol timing of a received waveform. In a system that utilizes digital sampling, the fidelity of synchronization is limited by the time between the symbol boundary and the closest sample time location. To reduce this error, one typically uses a sample clock in excess of the symbol rate in order to provide multiple samples per symbol, thereby lowering the error limit to a fraction of a symbol time. For systems with a large modulation bandwidth, the required sample clock rate is prohibitive due to current technological barriers and processing complexity. With precise control of the phase of the sample clock, one can sample the received signal at times arbitrarily close to the symbol boundary, thus obviating the need, from a synchronization perspective, for multiple samples per symbol. Sample-clock phase-control feedback was developed for use in the demodulation of an optical communication signal, where multi-GHz modulation bandwidths would require prohibitively large sample clock frequencies for rates in excess of the symbol rate. A custom mixed-signal (RF/digital) offset phase-locked loop circuit was developed to control the phase of the 6.4-GHz clock that samples the photon-counting detector output. The offset phase-locked loop is driven by a feedback mechanism that continuously corrects for variation in the symbol time due to motion between the transmitter and receiver as well as oscillator instability. This innovation will allow significant improvements in receiver throughput; for example, the throughput of a pulse-position modulation (PPM) with 16 slots can increase from 188 Mb/s to 1.5 Gb/s.

  12. Corrosion of metal samples rapidly measured

    Science.gov (United States)

    Maskell, C. E.

    1966-01-01

    Corrosion of a large number of metal samples that have been exposed to controlled environment is accurately and rapidly measured. Wire samples of the metal are embedded in clear plastic and sectioned for microexamination. Unexposed wire can be included in the matrix as a reference.

  13. METABOLITE CHARACTERIZATION IN SERUM SAMPLES FROM ...

    African Journals Online (AJOL)

    Preferred Customer

    take advantage of larger chemical shift spread of 13C resonances allowing a more detailed identification of ... fingerprints of various metabolites of serum samples of normal healthy control have been obtained which can ... fasting 10 mL of blood sample from each individual was taken and was allowed to clot in plastic.

  14. 7 CFR 29.34 - Sample seal.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Sample seal. 29.34 Section 29.34 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Definitions § 29.34 Sample seal. A seal approved by the Director for sealing official...

  15. K-Median: Random Sampling Procedure

    Indian Academy of Sciences (India)

    K-Median: Random Sampling Procedure. Sample a set of 1/ε + 1 points from P. Let Q = first 1/ε points, p = last point. Let T = Avg. 1-Median cost of P, c = 1-Median. Let B1 = B(c, T/2), B2 = B(p, T). Let P' = points in B1.

  16. Sampling Lesbian, Gay, and Bisexual Populations

    Science.gov (United States)

    Meyer, Ilan H.; Wilson, Patrick A.

    2009-01-01

    Sampling has been the single most influential component of conducting research with lesbian, gay, and bisexual (LGB) populations. Poor sampling designs can result in biased results that will mislead other researchers, policymakers, and practitioners. Investigators wishing to study LGB populations must therefore devote significant energy and…

  17. 40 CFR 61.33 - Stack sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Stack sampling. 61.33 Section 61.33 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL... sampling. (a) Unless a waiver of emission testing is obtained under § 61.13, each owner or operator...

  18. Sampling depth confounds soil acidification outcomes

    Science.gov (United States)

    In the northern Great Plains (NGP) of North America, surface sampling depths of 0-15 or 0-20 cm are suggested for testing soil characteristics such as pH. However, acidification is often most pronounced near the soil surface. Thus, sampling deeper can potentially dilute (increase) pH measurements an...

  19. Sampled Noise in Switched Current Circuits

    DEFF Research Database (Denmark)

    Jørgensen, Ivan Herald Holger; Bogason, Gudmundur

    1997-01-01

    The understanding of noise in analog sampled data systems is vital for the design of high resolution circuitry. In this paper a general description of sampled and held noise is presented. The noise calculations are verified by measurements on an analog delay line implemented using switched...

  20. Sampling scheme optimization from hyperspectral data

    NARCIS (Netherlands)

    Debba, P.

    2006-01-01

    This thesis presents statistical sampling scheme optimization for geo-environmental purposes on the basis of hyperspectral data. It integrates derived products of the hyperspectral remote sensing data into individual sampling schemes. Five different issues are being dealt with. First, the optimized

  1. 40 CFR 1065.805 - Sampling system.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Sampling system. 1065.805 Section 1065.805 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Testing With Oxygenated Fuels § 1065.805 Sampling system. (a) Dilute engine...

  2. Sampling for validation of digital soil maps

    NARCIS (Netherlands)

    Brus, D.J.; Kempen, B.; Heuvelink, G.B.M.

    2011-01-01

    The increase in digital soil mapping around the world means that appropriate and efficient sampling strategies are needed for validation. Data used for calibrating a digital soil mapping model typically are non-random samples. In such a case we recommend collection of additional independent data and

  3. Additional Considerations in Determining Sample Size.

    Science.gov (United States)

    Levin, Joel R.; Subkoviak, Michael J.

    Levin's (1975) sample-size determination procedure for completely randomized analysis of variance designs is extended to designs in which information on antecedent or blocking variables is considered. In particular, a researcher's choice of designs is framed in terms of determining the respective sample sizes necessary to detect specified contrasts…

  4. Determining Sample Size for Research Activities

    Science.gov (United States)

    Krejcie, Robert V.; Morgan, Daryle W.

    1970-01-01

    A formula for determining sample size, which originally appeared in 1960, has lacked a table for easy reference. This article supplies a graph of the function and a table of values which permits easy determination of the size of sample needed to be representative of a given population. (DG)
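The table supplied by this article is generated by a closed-form, chi-square-based formula widely attributed to it. A sketch of that computation, assuming the conventional defaults (chi-square of 3.841 for 95% confidence, assumed proportion 0.5, 5% margin of error):

```python
import math

def required_sample_size(population, chi_sq=3.841, p=0.5, margin=0.05):
    """Sample size for a finite population (Krejcie & Morgan style table).

    chi_sq: chi-square value for 1 df at the desired confidence level
    p:      assumed population proportion (0.5 maximizes the required size)
    margin: acceptable margin of error
    """
    num = chi_sq * population * p * (1 - p)
    den = margin ** 2 * (population - 1) + chi_sq * p * (1 - p)
    return math.ceil(num / den)
```

With these defaults the function reproduces the familiar table entries, e.g. a population of 100 needs 80 respondents and a population of 1,000 needs 278.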

  5. Sampling of temporal networks: Methods and biases

    Science.gov (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
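Of the four strategies the authors compare, uniform sampling of nodes is reported to perform best in most scenarios. A minimal sketch of that strategy on a temporal edge list; the (time, u, v) event representation and the induced-subnetwork convention are assumptions for illustration, not the paper's data format:

```python
import random

def uniform_node_subsample(events, nodes, k, seed=0):
    """Subsample a temporal network: keep events whose endpoints both fall
    in a set of k uniformly chosen nodes (induced temporal subnetwork).

    events: iterable of (time, u, v) contact events
    nodes:  the full node set
    k:      number of nodes to retain
    """
    rng = random.Random(seed)
    kept = set(rng.sample(sorted(nodes), k))
    return [(t, u, v) for (t, u, v) in events if u in kept and v in kept]
```

Statistics such as link activity or epidemic outcomes can then be computed on the subsample and compared against the full network to quantify sampling bias.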

  6. Writing for Distance Education. Samples Booklet.

    Science.gov (United States)

    International Extension Coll., Cambridge (England).

    Approaches to the format, design, and layout of printed instructional materials for distance education are illustrated in 36 samples designed to accompany the manual, "Writing for Distance Education." Each sample is presented on a single page with a note pointing out its key features. Features illustrated include use of typescript layout, a comic…

  7. Statistical aspects of food safety sampling

    NARCIS (Netherlands)

    Jongenburger, I.; Besten, den H.M.W.; Zwietering, M.H.

    2015-01-01

    In food safety management, sampling is an important tool for verifying control. Sampling by nature is a stochastic process. However, uncertainty regarding results is made even greater by the uneven distribution of microorganisms in a batch of food. This article reviews statistical aspects of
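The stochastic nature of sampling noted here is often summarized by a simple binomial argument: if a fraction p of units is contaminated, the chance that n samples include at least one positive is 1 − (1 − p)^n. A sketch of that calculation (illustrative only; it assumes independent, well-mixed units, an assumption the article points out is violated by uneven microbial distribution in a batch):

```python
from math import ceil, log

def detection_probability(p, n):
    """Probability that at least one of n independent samples is positive
    when a fraction p of units in the batch is contaminated."""
    return 1.0 - (1.0 - p) ** n

def samples_needed(p, target=0.95):
    """Smallest n such that detection_probability(p, n) >= target."""
    return ceil(log(1.0 - target) / log(1.0 - p))
```

For example, with 10% prevalence, ten samples give only about a 65% chance of detection, and 29 samples are needed to reach 95%.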

  8. Accuracy assessment with complex sampling designs

    Science.gov (United States)

    Raymond L. Czaplewski

    2010-01-01

    A reliable accuracy assessment of remotely sensed geospatial data requires a sufficiently large probability sample of expensive reference data. Complex sampling designs reduce cost or increase precision, especially with regional, continental and global projects. The General Restriction (GR) Estimator and the Recursive Restriction (RR) Estimator separate a complex...

  9. Methodological Choices in Rating Speech Samples

    Science.gov (United States)

    O'Brien, Mary Grantham

    2016-01-01

    Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…

  10. Analysing designed experiments in distance sampling

    Science.gov (United States)

    Stephen T. Buckland; Robin E. Russell; Brett G. Dickson; Victoria A. Saab; Donal N. Gorman; William M. Block

    2009-01-01

    Distance sampling is a survey technique for estimating the abundance or density of wild animal populations. Detection probabilities of animals inherently differ by species, age class, habitats, or sex. By incorporating the change in an observer's ability to detect a particular class of animals as a function of distance, distance sampling leads to density estimates...
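The core of distance sampling is a detection function g(x) that declines with distance from the observer; density then follows from the effective strip (or search) width. A sketch of the conventional line-transect estimator under an assumed half-normal detection function — this is the standard textbook estimator, not the designed-experiment extension developed in the paper:

```python
import math

def effective_strip_width(sigma):
    """Effective strip half-width for a half-normal detection function
    g(x) = exp(-x**2 / (2 * sigma**2)), i.e. the integral of g over [0, inf)."""
    return sigma * math.sqrt(math.pi / 2.0)

def density_line_transect(n_detections, total_line_length, sigma):
    """Conventional line-transect density estimator D = n / (2 * L * ESW).

    n_detections:      number of animals detected
    total_line_length: summed transect length (same distance units as sigma)
    sigma:             scale of the fitted half-normal detection function
    """
    return n_detections / (2.0 * total_line_length * effective_strip_width(sigma))
```

In practice sigma is fitted to the observed perpendicular distances, and may itself be modeled as a function of species, age class, habitat, or sex, as the abstract describes.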

  11. Sampling low-density gypsy moth populations

    Science.gov (United States)

    William E. Wallner; Clive G. Jones; Joseph S. Elkinton; Bruce L. Parker

    1991-01-01

    The techniques and methodology for sampling gypsy moth, Lymantria dispar L., at low densities, less than 100 egg masses/ha (EM/ha), are compared. Forest managers have constraints of time and cost, and need a useful, simple predictable means to assist them in sampling gypsy moth populations. A comparison of various techniques coupled with results of...

  12. Quantum algorithm for exact Monte Carlo sampling

    OpenAIRE

    Destainville, Nicolas; Georgeot, Bertrand; Giraud, Olivier

    2010-01-01

    We build a quantum algorithm which uses the Grover quantum search procedure in order to sample the exact equilibrium distribution of a wide range of classical statistical mechanics systems. The algorithm is based on recently developed exact Monte Carlo sampling methods, and yields a polynomial gain compared to classical procedures.

  13. Multilingualism remixed: Sampling, braggadocio and the stylisation ...

    African Journals Online (AJOL)

    Multilingualism remixed: Sampling, braggadocio and the stylisation of local voice. ... is the question of how multilingual voice may carry across media, modalities and context. In this paper, we ... Specifically, we ask how emcees sample local varieties of language, texts and registers to stage their particular stylisation of voice.

  14. Chemical fingerprinting of unevaporated automotive gasoline samples.

    Science.gov (United States)

    Sandercock, P M L; Du Pasquier, E

    2003-06-24

    The comparison of two or more samples of liquid gasoline (petrol) to establish a common origin is a difficult problem in the forensic investigation of arsons and suspicious fires. A total of 35 randomly collected samples of unevaporated gasoline, covering three different grades (regular unleaded, premium unleaded and lead replacement), were examined. The high-boiling fraction of the gasoline was targeted with a view to apply the techniques described herein to evaporated gasoline samples in the future. A novel micro solid phase extraction (SPE) technique using activated alumina was developed to isolate the polar compounds and the polycyclic aromatic hydrocarbons (PAHs) from a 200 µl sample of gasoline. Samples were analysed using full-scan gas chromatography-mass spectrometry (GC-MS) and potential target compounds identified. Samples were then re-analysed directly, without prior treatment, using GC-MS in selected ion monitoring (SIM) mode for target compounds that exhibited variation between gasoline samples. Principal component analysis (PCA) was applied to the chromatographic data. The first two principal components (PCs) accounted for 91.5% of the variation in the data. Linear discriminant analysis (LDA) performed on the PCA results showed that the 35 samples tested could be classified into 32 different groups.
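PCA, as used in this study, projects the target-compound chromatographic data onto a few components that capture most of the variance. As an illustration of the mechanics only — toy 2-D data, not GC-MS peak tables — the 2×2 case has a closed-form eigendecomposition:

```python
import math

def pca_2d(points):
    """Principal component eigenvalues of 2-D data via the closed-form
    eigendecomposition of the 2x2 covariance matrix.

    Returns (largest eigenvalue, smallest eigenvalue, fraction of total
    variance explained by the first component).
    """
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    # Eigenvalues of [[sxx, sxy], [sxy, syy]] from trace and determinant.
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    l1, l2 = tr / 2.0 + disc, tr / 2.0 - disc
    explained = l1 / (l1 + l2) if (l1 + l2) else 1.0
    return l1, l2, explained
```

Perfectly collinear data yields an explained-variance fraction of 1.0; the 91.5% figure in the abstract is the analogous quantity for the first two components of the real, higher-dimensional data.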

  15. Evaluation of diesel particulate matter sampling techniques

    CSIR Research Space (South Africa)

    Pretorius, CJ

    2011-09-01

    Full Text Available The study evaluated diesel particulate matter (DPM) sampling methods used in the South African mining industry. The three-piece cassette respirable, open face and stopper sampling methods were compared with the SKC DPM cassette method to find a...

  16. Personal gravimetric dust sampling and risk assessment.

    CSIR Research Space (South Africa)

    Unsted, AD

    1996-03-01

    Full Text Available . At all the sampling sites extremely large variation in dust concentrations were measured on a day to day and shift basis. Correlation of dust concentrations between personal and stationary samples was very poor as was the correlation between quartz...

  17. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. Optimal 1-Mean is approximation of centroid (Inaba et al). S = random sample of size O(1/ε); centroid of S is a (1+ε)-approx centroid of P with constant probability.
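The result quoted on this slide (Inaba et al.) says that the centroid of a small random sample approximates the optimal 1-mean of the full point set. A toy sketch of the idea; the sample size and seed below are illustrative, and the full-data centroid serves as the exact optimum for comparison, since the centroid minimizes the sum of squared distances:

```python
import random

def sq_cost(points, c):
    """1-Mean objective: sum of squared distances from 2-D points to center c."""
    return sum((x - c[0]) ** 2 + (y - c[1]) ** 2 for x, y in points)

def centroid(points):
    """Exact centroid, the unique minimizer of sq_cost."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def sampled_centroid(points, m, seed=1):
    """Centroid of a uniform random sample of m points (Inaba et al.:
    m = O(1/eps) gives a (1+eps)-approximation with constant probability)."""
    rng = random.Random(seed)
    return centroid(rng.sample(points, m))
```

Because the full centroid is optimal, the sampled centroid's cost is never lower; the theorem bounds how much higher it is likely to be.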

  18. Decisions from Experience: Why Small Samples?

    Science.gov (United States)

    Hertwig, Ralph; Pleskac, Timothy J.

    2010-01-01

    In many decisions we cannot consult explicit statistics telling us about the risks involved in our actions. In lieu of such data, we can arrive at an understanding of our dicey options by sampling from them. The size of the samples that we take determines, ceteris paribus, how good our choices will be. Studies of decisions from experience have…

  19. 27 CFR 6.91 - Samples.

    Science.gov (United States)

    2010-04-01

    ... TREASURY LIQUORS “TIED-HOUSE” Exceptions § 6.91 Samples. The act by an industry member of furnishing or giving a sample of distilled spirits, wine, or malt beverages to a retailer who has not purchased the brand from that industry member within the last 12 months does not constitute a means to induce within...

  20. Statistical Literacy and Sample Survey Results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-01-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In…

  1. 40 CFR 61.54 - Sludge sampling.

    Science.gov (United States)

    2010-07-01

    ..., preparation, and analysis of sludge samples shall be accomplished according to Method 105 in appendix B of... may use Method 105 of appendix B and the procedures specified in this section. (1) A sludge test shall... be sampled according to paragraph (c)(1) of this section, sludge charging rate for the plant shall be...

  2. Gamma-ray spectrometry of LDEF samples

    Energy Technology Data Exchange (ETDEWEB)

    Winn, W.G.

    1991-01-01

    A total of 31 samples from the Long Duration Exposure Facility (LDEF), including materials of aluminum, vanadium, and steel trunnions were analyzed by ultra-low-level gamma spectroscopy. The study quantified particle induced activations of {sup 22}Na, {sup 46}Sc, {sup 51}Cr, {sup 54}Mn, {sup 56}Co, {sup 57}Co, {sup 58}Co, and {sup 60}Co. The samples of trunnion sections exhibited increasing activity toward the outer end of the trunnion and decreasing activity toward its radial center. The trunnion sections did not include end pieces, which have been reported to collect noticeable {sup 7}Be on their leading surfaces. No significant {sup 7}Be was detected in the samples analyzed. The Underground Counting Facility at Savannah River Laboratory (SRL) was used in this work. The facility is 50 ft. underground, constructed with low-background shielding materials, and operated as a clean room. The most sensitive analyses were performed with a 90%-efficient HPGe gamma-ray detector, which is enclosed in a purged active/passive shield. Each sample was counted for one to six days in two orientations to yield more representative average activities for the sample. The non-standard geometries of the LDEF samples prompted the development of a novel calibration method, whereby the efficiency about the samples surfaces (measured with point sources) predicted the efficiency for the bulk sample.

  4. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    OpenAIRE

    Mouw, Ted; Verdery, Ashton M.

    2012-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its ...

  5. A sampling algorithm for segregation analysis

    Directory of Open Access Journals (Sweden)

    Henshall John

    2001-11-01

    Full Text Available Abstract Methods for detecting Quantitative Trait Loci (QTL without markers have generally used iterative peeling algorithms for determining genotype probabilities. These algorithms have considerable shortcomings in complex pedigrees. A Monte Carlo Markov chain (MCMC method which samples the pedigree of the whole population jointly is described. Simultaneous sampling of the pedigree was achieved by sampling descent graphs using the Metropolis-Hastings algorithm. A descent graph describes the inheritance state of each allele and provides pedigrees guaranteed to be consistent with Mendelian sampling. Sampling descent graphs overcomes most, if not all, of the limitations incurred by iterative peeling algorithms. The algorithm was able to find the QTL in most of the simulated populations. However, when the QTL was not modeled or found then its effect was ascribed to the polygenic component. No QTL were detected when they were not simulated.
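The descent-graph sampler described in this abstract rests on the Metropolis-Hastings acceptance rule. The genetics-specific machinery (descent graphs, peeling) is beyond a snippet, but the rule itself can be sketched on a toy discrete state space; the weights and the ±1 proposal below are invented purely for illustration:

```python
import random

def metropolis_hastings(score, propose, state, steps, seed=0):
    """Generic Metropolis-Hastings sampler with a symmetric proposal.

    score(s):        unnormalized target probability of state s (e.g. the
                     probability of a descent graph given the marker data)
    propose(s, rng): returns a random neighbouring state; assumed symmetric
    Returns the list of visited states (the chain).
    """
    rng = random.Random(seed)
    chain = []
    for _ in range(steps):
        cand = propose(state, rng)
        # Accept with probability min(1, score(cand) / score(state)).
        if score(cand) >= score(state) or rng.random() < score(cand) / score(state):
            state = cand
        chain.append(state)
    return chain
```

Run long enough, the chain visits states in proportion to their (unnormalized) scores, which is what lets the full-pedigree sampler estimate genotype probabilities without iterative peeling.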

  6. East Mountain Area 1995 air sampling results

    Energy Technology Data Exchange (ETDEWEB)

    Deola, R.A. [Sandia National Labs., Albuquerque, NM (United States). Air Quality Dept.

    1996-09-01

    Ambient air samples were taken at two locations in the East Mountain Area in conjunction with thermal testing at the Lurance Canyon Burn Site (LCBS). The samples were taken to provide measurements of particulate matter with a diameter less than or equal to 10 micrometers (PM{sub 10}) and volatile organic compounds (VOCs). This report summarizes the results of the sampling performed in 1995. The results from small-scale testing performed to determine the potentially produced air pollutants in the thermal tests are included in this report. Analytical results indicate few samples produced measurable concentrations of pollutants believed to be produced by thermal testing. Recommendations for future air sampling in the East Mountain Area are also noted.

  7. Importance sampling the Rayleigh phase function

    DEFF Research Database (Denmark)

    Frisvad, Jeppe Revall

    2011-01-01

    Rayleigh scattering is used frequently in Monte Carlo simulation of multiple scattering. The Rayleigh phase function is quite simple, and one might expect that it should be simple to importance sample it efficiently. However, there seems to be no one good way of sampling it in the literature. This paper provides the details of several different techniques for importance sampling the Rayleigh phase function, and it includes a comparison of their performance as well as hints toward efficient implementation.
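One exact technique for this task inverts the CDF of the phase function over x = cos θ, which turns out to be a cubic solvable in closed form. A sketch of that inversion as a standard derivation (not necessarily the variant the paper recommends): the pdf over x on [-1, 1] is (3/8)(1 + x²), so the CDF equation F(x) = ξ reduces to x³ + 3x = 8ξ − 4, solved by Cardano's substitution x = u − 1/u:

```python
import math

def rayleigh_cdf(x):
    """CDF over x = cos(theta) for the Rayleigh pdf (3/8) * (1 + x**2) on [-1, 1]."""
    return (x ** 3 + 3.0 * x) / 8.0 + 0.5

def sample_rayleigh_cos_theta(xi):
    """Map a uniform variate xi in [0, 1] to cos(theta) by exact CDF inversion.

    With c = 8*xi - 4, the substitution x = u - 1/u turns x**3 + 3*x = c
    into u**3 - 1/u**3 = c, giving u**3 = (c + sqrt(c**2 + 4)) / 2 > 0.
    """
    c = 8.0 * xi - 4.0
    u = ((c + math.sqrt(c * c + 4.0)) / 2.0) ** (1.0 / 3.0)
    return u - 1.0 / u
```

Round-tripping through the CDF checks the inversion: rayleigh_cdf(sample_rayleigh_cos_theta(ξ)) returns ξ for any ξ in [0, 1].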

  8. Custom sample environments at the ALBA XPEEM

    Energy Technology Data Exchange (ETDEWEB)

    Foerster, Michael, E-mail: mfoerster@cells.es; Prat, Jordi; Massana, Valenti; Gonzalez, Nahikari; Fontsere, Abel; Molas, Bernat; Matilla, Oscar; Pellegrin, Eric; Aballe, Lucia

    2016-12-15

    A variety of custom-built sample holders offer users a wide range of non-standard measurements at the ALBA synchrotron PhotoEmission Electron Microscope (PEEM) experimental station. Some of the salient features are: an ultrahigh vacuum (UHV) suitcase compatible with many offline deposition and characterization systems, built-in electromagnets for uni- or biaxial in-plane (IP) and out-of-plane (OOP) fields, as well as the combination of magnetic fields with electric fields or current injection. Electronics providing a synchronized sinusoidal signal for sample excitation enable time-resolved measurements at the 500 MHz storage ring RF frequency. - Highlights: • Custom sample environment for XPEEM at ALBA. • Sample holders with electromagnets, in-plane dipole, in-plane quadrupole and out-of-plane. • Sample holders with printed circuit boards for electric contacts including electromagnets. • UHV suitcase adapter. • Synchronized 500 MHz electrical excitation for time resolved measurements.

  9. A Geology Sampling System for Small Bodies

    Science.gov (United States)

    Naids, Adam J.; Hood, Anthony D.; Abell, Paul; Graff, Trevor; Buffington, Jesse

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a small body. Currently, the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in microgravity environment pose unique challenges. In preparation for such missions a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  10. A Geology Sampling System for Microgravity Bodies

    Science.gov (United States)

    Hood, Anthony; Naids, Adam

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a microgravity body. Currently, the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in microgravity environment pose unique challenges. In preparation for such missions a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  11. Recommended Maximum Temperature For Mars Returned Samples

    Science.gov (United States)

    Beaty, D. W.; McSween, H. Y.; Czaja, A. D.; Goreva, Y. S.; Hausrath, E.; Herd, C. D. K.; Humayun, M.; McCubbin, F. M.; McLennan, S. M.; Hays, L. E.

    2016-01-01

    The Returned Sample Science Board (RSSB) was established in 2015 by NASA to provide expertise from the planetary sample community to the Mars 2020 Project. The RSSB's first task was to address the effect of heating during acquisition and storage of samples on scientific investigations that could be expected to be conducted if the samples are returned to Earth. Sample heating may cause changes that could adversely affect scientific investigations. Previous studies of temperature requirements for returned martian samples fall within a wide range (-73 to 50 degrees Centigrade) and, for mission concepts that have a life detection component, the recommended threshold was less than or equal to -20 degrees Centigrade. The RSSB was asked by the Mars 2020 project to determine whether or not a temperature requirement was needed within the range of 30 to 70 degrees Centigrade. There are eight expected temperature regimes to which the samples could be exposed, from the moment that they are drilled until they are placed into a temperature-controlled environment on Earth. Two of those - heating during sample acquisition (drilling) and heating while cached on the Martian surface - potentially subject samples to the highest temperatures. The RSSB focused on the upper temperature limit that Mars samples should be allowed to reach. We considered 11 scientific investigations where thermal excursions may have an adverse effect on the science outcome. Those are: (T-1) organic geochemistry, (T-2) stable isotope geochemistry, (T-3) prevention of mineral hydration/dehydration and phase transformation, (T-4) retention of water, (T-5) characterization of amorphous materials, (T-6) putative Martian organisms, (T-7) oxidation/reduction reactions, (T-8) {sup 4}He thermochronometry, (T-9) radiometric dating using fission, cosmic-ray or solar-flare tracks, (T-10) analyses of trapped gases, and (T-11) magnetic studies.

  12. CHARACTERIZATION OF TANK 19F SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L.; Diprete, D.; Click, D.

    2009-12-17

    The Savannah River National Laboratory (SRNL) was asked by Liquid Waste Operations to characterize Tank 19F closure samples. Tank 19F slurry samples analyzed included the liquid and solid fractions derived from the slurry materials along with the floor scrape bottom Tank 19F wet solids. These samples were taken from Tank 19F in April 2009 and made available to SRNL in the same month. Because of limited amounts of solids observed in Tank 19F samples, the samples from the north quadrants of the tank were combined into one Tank 19F North Hemisphere sample and similarly the south quadrant samples were combined into one Tank 19F South Hemisphere sample. These samples were delivered to the SRNL shielded cell. The Tank 19F samples were analyzed for radiological, chemical and elemental components. Where analytical methods yielded additional contaminants other than those requested by the customer, these results were also reported. The target detection limits for isotopes analyzed were based on detection values of 1E-04 {micro}Ci/g for most radionuclides and customer desired detection values of 1E-05 {micro}Ci/g for I-129, Pa-231, Np-237, and Ra-226. While many of the target detection limits, as specified in the technical task request and task technical and quality assurance plans were met for the species characterized for Tank 19F, some were not met. In a number of cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. SRNL, in conjunction with the plant customer, reviewed all these cases and determined that the impacts were negligible.

  13. CHARACTERIZATION OF THE TANK 18F SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Oji, L.; Click, D.; Diprete, D.

    2009-12-17

    The Savannah River National Laboratory (SRNL) was asked by Liquid Waste Operations to characterize Tank 18F closure samples. Tank 18F slurry samples analyzed included the liquid and solid fractions derived from the 'as-received' slurry materials along with the floor scrape bottom Tank 18F wet solids. These samples were taken from Tank 18F in March 2009 and made available to SRNL in the same month. Because of limited amounts of solids observed in Tank 18F samples, the samples from the north quadrants of the tank were combined into one North Tank 18F Hemisphere sample and similarly the south quadrant samples were combined into one South Tank 18F Hemisphere sample. These samples were delivered to the SRNL shielded cell. The Tank 18F samples were analyzed for radiological, chemical and elemental components. Where analytical methods yielded additional contaminants other than those requested by the customer, these results were also reported. The target detection limits for isotopes analyzed were 1E-04 {micro}Ci/g for most radionuclides and customer desired detection values of 1E-05 {micro}Ci/g for I-129, Pa-231, Np-237, and Ra-226. While many of the minimum detection limits, as specified in the technical task request and task technical and quality assurance plans were met for the species characterized for Tank 18F, some were not met due to spectral interferences. In a number of cases, the relatively high levels of radioactive species of the same element or a chemically similar element precluded the ability to measure some isotopes to low levels. SRNL, in conjunction with the plant customer, reviewed all these cases and determined that the impacts were negligible.

  14. Cooled membrane for high sensitivity gas sampling.

    Science.gov (United States)

    Jiang, Ruifen; Pawliszyn, Janusz

    2014-04-18

    A novel sample preparation method that combines the advantages of high surface area geometry and cold surface effect was proposed to achieve high sensitivity gas sampling. To accomplish this goal, a device that enables the membrane to be cooled down was developed for sampling, and a gas chromatograph-mass spectrometer was used for separation and quantification analysis. Method development included investigation of the effect of membrane temperature, membrane size, gas flow rate and humidity. Results showed that high sensitivity for equilibrium sampling, such as limonene sampling in the current study could be achieved by either cooling down the membrane and/or using a large volume extraction phase. On the other hand, for pre-equilibrium extraction, in which the extracted amount was mainly determined by membrane surface area and diffusion coefficient, high sensitivity could be obtained by using thinner membranes with a larger surface and/or a higher sampling flow rate. In addition, humidity showed no significant influence on extraction efficiency, due to the absorption property of the liquid extraction phase. Next, the limit of detection (LOD) was found, and the reproducibility of the developed cooled membrane gas sampling method was evaluated. Results showed that LODs with a membrane diameter of 19 mm at room temperature sampling were 9.2 ng/L, 0.12 ng/L, 0.10 ng/L for limonene, cinnamaldehyde and 2-pentadecanone, respectively. Intra- and inter-membrane sampling reproducibility revealed RSD% lower than 8% and 13%, respectively. Results uniformly demonstrated that the proposed cooled membrane device could serve as an alternative powerful tool for future gas sampling. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. The LITA Drill and Sample Delivery System

    Science.gov (United States)

    Paulsen, G.; Yoon, S.; Zacny, K.; Wettergreeng, D.; Cabrol, N. A.

    2013-12-01

    The Life in the Atacama (LITA) project has a goal of demonstrating autonomous roving, sample acquisition, delivery and analysis operations in Atacama, Chile. To enable the sample handling requirement, Honeybee Robotics developed a rover-deployed, rotary-percussive, autonomous drill, called the LITA Drill, capable of penetrating to ~80 cm in various formations, capturing and delivering subsurface samples to a 20 cup carousel. The carousel has a built-in capability to press the samples within each cup, and position target cups underneath instruments for analysis. The drill and sample delivery system had to have mass and power requirements consistent with a flight system. The drill weighs 12 kg and uses less than 100 watt of power to penetrate ~80 cm. The LITA Drill auger has been designed with two distinct stages. The lower part has deep and gently sloping flutes for retaining powdered sample, while the upper section has shallow and steep flutes for preventing borehole collapse and for efficient movement of cuttings and fall back material out of the hole. The drill uses the so called 'bite-sampling' approach that is samples are taken in short, 5-10 cm bites. To take the first bite, the drill is lowered onto the ground and upon drilling of the first bite it is then retracted into an auger tube. The auger with the auger tube are then lifted off the ground and positioned next to the carousel. To deposit the sample, the auger is rotated and retracted above the auger tube. The cuttings retained on the flutes are either gravity fed or are brushed off by a passive side brush into the cup. After the sample from the first bite has been deposited, the drill is lowered back into the same hole to take the next bite. This process is repeated until a target depth is reached. The bite sampling is analogous to peck drilling in the machining process where a bit is periodically retracted to clear chips. 
If there is some fall back into the hole once the auger has cleared the hole, this

  16. Biostatistics Series Module 5: Determining Sample Size.

    Science.gov (United States)

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Determining the appropriate sample size for a study, whatever its type, is a fundamental aspect of biomedical research. An adequate sample ensures that the study will yield reliable information, regardless of whether the data ultimately suggest a clinically important difference between the interventions or elements being studied. The probabilities of Type 1 and Type 2 errors, the expected variance in the sample and the effect size are the essential determinants of sample size in interventional studies. Any method for deriving a conclusion from experimental data carries with it some risk of drawing a false conclusion. Two types of false conclusion may occur, called Type 1 and Type 2 errors, whose probabilities are denoted by the symbols α and β. A Type 1 error occurs when one concludes that a difference exists between the groups being compared when, in reality, it does not. This is akin to a false positive result. A Type 2 error occurs when one concludes that a difference does not exist when, in reality, a difference does exist and is equal to or larger than the effect size defined by the alternative to the null hypothesis. This may be viewed as a false negative result. When considering the risk of Type 2 error, it is more intuitive to think in terms of the power of the study, or (1 - β). Power denotes the probability of detecting a difference when a difference does exist between the groups being compared. A smaller α or larger power will increase the sample size. Conventional acceptable values for power and α are 80% or above and 5% or below, respectively, when calculating sample size. Increasing variance in the sample tends to increase the sample size required to achieve a given power level. The effect size is the smallest clinically important difference that is sought to be detected; rather than statistical convention, it is a matter of past experience and clinical judgment. Larger samples are required if smaller differences are to be detected. Although the
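The determinants listed above (α, power, variance, effect size) combine in the standard normal-approximation formula for a two-sample comparison of means. A minimal sketch follows; the function name and defaults are illustrative, not from the module itself.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means: n = 2((z_a + z_b) * sigma / delta)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for Type 1 error
    z_beta = z.inv_cdf(power)            # power = 1 - beta
    return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# Effect size of 0.5 standard deviations, conventional alpha and power:
print(n_per_group(delta=0.5, sigma=1.0))   # 63 per group
```

Raising the power to 90% or shrinking the detectable difference increases the required n, exactly as the abstract states.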

  17. 1997 Baseline Sampling and Analysis Sample Locations, Geographic NAD83, LOSCO (2004) [BSA_1997_sample_locations_LOSCO_2004]

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis (BSA) program coordinated by the Louisiana Oil Spill Coordinator's Office....

  18. 1999 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1999_sample_locations_LOSCO_2004]

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  19. 1998 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1998_sample_locations_LOSCO_2004]

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  20. Galahad: medium class asteroid sample return mission

    Science.gov (United States)

    Cheng, Andrew; Rivkin, Andrew; Adler, Mark

    The Galahad asteroid sample return mission proposal to the NASA New Frontiers solicitation met all of the objectives for the Asteroid Rover/Sample Return mission as defined in that announcement. Galahad is in many ways similar to the Marco Polo and OSIRIS-REx proposals. All three missions plan bulk sample returns from primitive, C- or B-class near-Earth asteroids. Galahad in particular will rendezvous with and orbit the binary C-asteroid 1996 FG3, making extensive orbital measurements. It will then land and collect over 60 g of well-documented samples with geologic context for return to Earth. The samples are expected to provide abundant materials from the early solar system, including chondrules and CAIs, as well as a primitive assemblage of organics, presolar grains and probably hydrated minerals. Analyses of these samples will yield new understanding of the early solar system, planetary accretion, and the nature and origins of prebiotic organic material. We will discuss scientific and technical approaches to characterization of, landing on, and sample collection from small primitive bodies.

  1. Downsampling Non-Uniformly Sampled Data

    Directory of Open Access Journals (Sweden)

    Fredrik Gustafsson

    2007-10-01

    Decimating a uniformly sampled signal by a factor D involves low-pass anti-alias filtering with normalized cutoff frequency 1/D, followed by picking out every Dth sample. Alternatively, decimation can be done in the frequency domain using the fast Fourier transform (FFT) algorithm, by zero-padding the signal and truncating the FFT. We outline three approaches to decimating non-uniformly sampled signals, all based on interpolation. The interpolation is done in different domains, and the inter-sample behavior does not need to be known. The first approach interpolates the signal onto a uniform sampling grid, after which standard decimation can be applied. The second interpolates a continuous-time convolution integral that implements the anti-alias filter, after which every Dth sample can be picked out. The third, frequency-domain approach computes an approximate Fourier transform, after which truncation and the IFFT give the desired result. Simulations indicate that the second approach is particularly useful. A thorough analysis is therefore performed for this case, under the assumption that the non-uniformly distributed sampling instants are generated by a stochastic process.
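The first of the three approaches (interpolate onto a uniform grid, anti-alias filter, keep every Dth sample) can be sketched as below. The boxcar filter is a deliberate simplification of the proper low-pass filter with cutoff 1/D, and all names are illustrative.

```python
import numpy as np

def decimate_nonuniform(t, x, D, n_uniform=None):
    """Sketch of approach one: linearly interpolate non-uniform samples
    (t, x) onto a uniform grid, apply a crude boxcar anti-alias filter,
    then retain every D-th sample."""
    if n_uniform is None:
        n_uniform = len(t)
    tu = np.linspace(t[0], t[-1], n_uniform)   # uniform time grid
    xu = np.interp(tu, t, x)                   # linear interpolation
    kernel = np.ones(D) / D                    # boxcar stand-in for low-pass
    xf = np.convolve(xu, kernel, mode="same")
    return tu[::D], xf[::D]

# Non-uniform samples of a slow sinusoid, decimated by D = 4:
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 1, 400))
x = np.sin(2 * np.pi * 2 * t)
td, xd = decimate_nonuniform(t, x, D=4)
print(len(td))   # 100
```

For signals well below the new Nyquist rate, the decimated samples track the underlying sinusoid closely away from the interval edges.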

  2. Sampling of Complex Networks: A Datamining Approach

    Science.gov (United States)

    Loecher, Markus; Dohrmann, Jakob; Bauer, Gernot

    2007-03-01

    Efficient and accurate sampling of large complex networks is still an unsolved problem. As the degree distribution is one of the most commonly used attributes to characterize a network, there have been many attempts in recent papers to derive the original degree distribution from the data obtained during a traceroute-like sampling process. This talk describes a strategy for predicting the original degree of a node from the data obtained by traceroute-like sampling, making use of datamining techniques. Only local quantities (the sampled degree k, the redundancy of node detection r, the time of first discovery of a node t, and the distance to the sampling source d) are used as input for the datamining models. Global properties like the betweenness centrality are ignored. These local quantities are examined theoretically and in simulations to increase their value for the predictions. The accuracy of the models is discussed as a function of the number of sources used in the sampling process and the underlying topology of the network. The purpose of this work is to introduce the techniques of the relatively young field of datamining to the discussion on network sampling.

  3. Enzymatic Purification of Microplastics in Environmental Samples.

    Science.gov (United States)

    Löder, Martin G J; Imhof, Hannes K; Ladehoff, Maike; Löschel, Lena A; Lorenz, Claudia; Mintenig, Svenja; Piehl, Sarah; Primpke, Sebastian; Schrank, Isabella; Laforsch, Christian; Gerdts, Gunnar

    2017-12-19

    Micro-Fourier transform infrared (micro-FTIR) spectroscopy and Raman spectroscopy enable the reliable identification and quantification of microplastics (MPs) in the lower micron range. Since concentrations of MPs in the environment are usually low, the large sample volumes required for these techniques lead to an excess of co-enriched organic or inorganic materials. While inorganic materials can be separated from MPs using density separation, the organic fraction impedes the ability to conduct reliable analyses. Hence, the purification of MPs from organic materials is crucial prior to conducting an identification via spectroscopic techniques. Strong acidic or alkaline treatments bear the danger of degrading sensitive synthetic polymers. We suggest an alternative method, which uses a series of technical grade enzymes for purifying MPs in environmental samples. A basic enzymatic purification protocol (BEPP) proved to be efficient while reducing 98.3 ± 0.1% of the sample matrix in surface water samples. After showing a high recovery rate (84.5 ± 3.3%), the BEPP was successfully applied to environmental samples from the North Sea where numbers of MPs range from 0.05 to 4.42 items m⁻³. Experiences with different environmental sample matrices were considered in an improved and universally applicable version of the BEPP, which is suitable for focal plane array detector (FPA)-based micro-FTIR analyses of water, wastewater, sediment, biota, and food samples.

  4. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to 10 cm deep and to collect five ice samples of approximately 7 cm³ each. The energy required to penetrate the top 10 cm of ice in a vacuum is approximately 26 Wh, and the energy to melt 7 cm³ of ice is approximately 1.2 Wh. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same length of 16 cm could be utilized; it would require approximately 6.4 Wh to penetrate the top 10 cm of ice and 0.02 Wh to collect a 0.1 g sample. The thermal drill has the advantage of simplicity of design and operation and the ability to penetrate ice over a range of densities and hardnesses while maintaining sample integrity.
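As a rough sanity check on the quoted ~1.2 Wh melt energy, one can combine handbook ice properties with an assumed ~100 K starting temperature; the starting temperature and the mean heat capacity are our assumptions, not values from the abstract.

```python
# Back-of-envelope check: energy to warm and melt 7 cm^3 of ice,
# assuming a ~100 K Europa surface temperature (our assumption).
RHO_ICE = 0.917     # g/cm^3, density of water ice
C_ICE = 1.5         # J/(g K), rough mean heat capacity over 100-273 K
L_FUSION = 334.0    # J/g, latent heat of fusion

volume = 7.0                                  # cm^3, per the abstract
mass = RHO_ICE * volume                       # ~6.4 g
heat_j = mass * (C_ICE * (273.0 - 100.0) + L_FUSION)
heat_wh = heat_j / 3600.0                     # convert J -> Wh
print(round(heat_wh, 2))                      # ~1.1 Wh, same order as 1.2 Wh
```

The result lands within about 15% of the quoted figure, with the gap plausibly covered by conduction losses and the exact assumed starting temperature.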

  5. Apollo Lunar Sample Photograph Digitization Project Update

    Science.gov (United States)

    Todd, N. S.; Lofgren, G. E.

    2012-01-01

    This is an update on the progress of a 4-year data restoration effort, funded by the LASER program and undertaken by the Astromaterials Acquisition and Curation Office at JSC, to digitize photographs of the Apollo lunar rock samples and create high-resolution digital images [1]. The project is currently in its last year of funding. We also provide an update on the derived products that make use of the digitized photos, including the Lunar Sample Catalog and Photo Database [2] and Apollo sample data files for Google Moon [3].

  6. Sampling the Uppermost Surface of Airless Bodies

    Science.gov (United States)

    Noble, S. K.; Keller, L. P.; Christoffersen, R.

    2011-01-01

    The uppermost surface of an airless body is a critical source of ground-truth information for the various remote sensing techniques that only penetrate nanometers to micrometers into the surface. Such samples will also be vital for understanding conditions at the surface and acquiring information about how the body interacts with its environment, including solar wind interaction, grain charging and levitation [1]. Sampling the uppermost surface while preserving its structure (e.g. porosity, grain-to-grain contacts), however, is a daunting task that has not been achieved on any sample return mission to date.

  7. Basic Statistical Concepts for Sample Size Estimation

    Directory of Open Access Journals (Sweden)

    Vithal K Dhulkhed

    2008-01-01

    For grant proposals the investigator has to include an estimation of sample size. The sample should be large enough that there is sufficient data to reliably answer the research question being addressed by the study. The investigator has to involve the statistician at the very planning stage of the study, and to have a meaningful dialogue with the statistician every research worker should be familiar with the basic concepts of statistics. This paper is concerned with simple principles of sample size calculation; concepts are explained on the basis of logic rather than rigorous mathematical calculation, to help the reader assimilate the fundamentals.

  8. Fluidics platform and method for sample preparation

    Science.gov (United States)

    Benner, Henry W.; Dzenitis, John M.

    2016-06-21

    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  9. Visual Sample Plan (VSP) - FIELDS Integration

    Energy Technology Data Exchange (ETDEWEB)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Hassig, Nancy L.; Carlson, Deborah K.; Bing-Canar, John; Cooper, Brian; Roth, Chuck

    2003-04-19

    Two software packages, VSP 2.1 and FIELDS 3.5, are being used by environmental scientists to plan the number and type of samples required to meet project objectives, display those samples on maps, query a database of past sample results, produce spatial models of the data, and analyze the data in order to arrive at defensible decisions. VSP 2.0 is an interactive tool to calculate optimal sample size and optimal sample location based on user goals, risk tolerance, and variability in the environment and in lab methods. FIELDS 3.0 is a set of tools to explore the sample results in a variety of ways to make defensible decisions with quantified levels of risk and uncertainty. However, FIELDS 3.0 has a small sample design module. VSP 2.0, on the other hand, has over 20 sampling goals, allowing the user to input site-specific assumptions such as non-normality of sample results, separate variability between field and laboratory measurements, make two-sample comparisons, perform confidence interval estimation, use sequential search sampling methods, and much more. Over 1,000 copies of VSP are in use today. FIELDS is used in nine of the ten U.S. EPA regions, by state regulatory agencies, and most recently by several international countries. Both software packages have been peer-reviewed, enjoy broad usage, and have been accepted by regulatory agencies as well as site project managers as key tools to help collect data and make environmental cleanup decisions. Recently, the two software packages were integrated, allowing the user to take advantage of the many design options of VSP, and the analysis and modeling options of FIELDS. The transition between the two is simple for the user – VSP can be called from within FIELDS, automatically passing a map to VSP and automatically retrieving sample locations and design information when the user returns to FIELDS. This paper will describe the integration, give a demonstration of the integrated package, and give users download

  10. Sampling system for in vivo ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jorgen Arendt; Mathorne, Jan

    1991-01-01

    Newly developed algorithms for processing medical ultrasound images use the high-frequency sampled transducer signal. This paper describes the demands imposed on a sampling system suitable for acquiring such data and gives details of a prototype constructed. It acquires full clinical images at a sampling frequency of 20 MHz with a resolution of 12 bits. The prototype can be used for real-time image processing. An example of a clinical in vivo image is shown and various aspects of the data acquisition process are discussed.
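A quick back-of-envelope on the sustained data rate implied by those acquisition figures (a single channel is assumed here, which the abstract does not state):

```python
# Data rate implied by 20 MHz sampling at 12-bit resolution.
fs = 20e6           # samples per second
bits = 12           # bits per sample
rate_mbit = fs * bits / 1e6    # megabits per second
rate_mbyte = rate_mbit / 8     # megabytes per second
print(rate_mbit, rate_mbyte)   # 240.0 Mbit/s, 30.0 MB/s per channel
```

Sustaining 30 MB/s per channel explains why such acquisition hardware had to be purpose-built for real-time use at the time.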

  11. METHODOLOGICAL ASPECTS OF STRATIFICATION OF AUDIT SAMPLING

    Directory of Open Access Journals (Sweden)

    Vilena A. Yakimova

    2013-01-01

    The article presents the methodological foundations for constructing a stratified audit sample for attribute-based sampling. Sampling techniques from Russian and foreign practice are studied and classified. The role of stratification in the audit is described. Approaches to constructing the stratification are revealed on the basis of professional judgment (qualitative methods), statistical groupings (quantitative methods) and combined ones (complex qualitative stratifications). A grouping of accounting information for the purpose of constructing an optimal stratification, together with its criteria, is proposed. The stratification methods are worked out and tested on the example of ABC analysis.

  12. A Comet Surface Sample Return System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed Phase II investigation will focus on the development of spacecraft systems required to obtain a sample from the nucleus of a comet, hermetically seal...

  13. Fetal scalp blood sampling during labor

    DEFF Research Database (Denmark)

    Chandraharan, Edwin; Wiberg, Nana

    2014-01-01

    Fetal cardiotocography is characterized by low specificity; therefore, in an attempt to ensure fetal well-being, fetal scalp blood sampling has been recommended by most obstetric societies in the case of a non-reassuring cardiotocography. The scientific agreement on the evidence for using fetal scalp blood sampling to decrease the rate of operative delivery for fetal distress is ambiguous. Based on the same studies, a Cochrane review states that fetal scalp blood sampling increases the rate of instrumental delivery while decreasing neonatal acidosis, whereas the National Institute for Health and Clinical Excellence guideline considers that fetal scalp blood sampling decreases instrumental delivery without differences in other outcome variables. The fetal scalp is supplied by vessels outside the skull below the level of the cranial vault, which is likely to be compressed during contractions...

  14. On Invertible Sampling and Adaptive Security

    DEFF Research Database (Denmark)

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio

    2011-01-01

    Secure multiparty computation (MPC) is one of the most general and well studied problems in cryptography. We focus on MPC protocols that are required to be secure even when the adversary can adaptively corrupt parties during the protocol, and under the assumption that honest parties cannot reliably ... functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient sampling algorithm A, obtain another sampling algorithm B such that the output of B is computationally indistinguishable from the output of A, but B can be efficiently inverted (even if A cannot). This invertible sampling problem is independently motivated by other cryptographic applications. We show ...

  15. GeoLab Sample Handling System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop  a robotic sample handling/ manipulator system for the GeoLab glovebox. This work leverages from earlier GeoLab work and a 2012 collaboration with a...

  16. Guam Commercial Fisheries BioSampling (CFBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Guam Commercial Fisheries Biosampling program, which collects length and weight frequency data for whole commercial catches, and samples 4-8 species for in-depth...

  17. Silicon based ultrafast optical waveform sampling

    DEFF Research Database (Denmark)

    Ji, Hua; Galili, Michael; Pu, Minhao

    2010-01-01

    A 300 nm × 450 nm × 5 mm silicon nanowire is designed and fabricated for a four-wave-mixing-based non-linear optical gate. Based on this silicon nanowire, an ultra-fast optical sampling system is successfully demonstrated using a free-running fiber laser with a carbon-nanotube-based mode-locker as the sampling source. A clear eye diagram of a 320 Gbit/s data signal is obtained. The temporal resolution of the sampling system is estimated to be 360 fs.

  18. A Comet Surface Sample Return System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed Phase I investigation will focus on the development of spacecraft systems required to obtain a sample from the nucleus of a comet, hermetically seal the...

  19. Statistical sampling method, used in the audit

    Directory of Open Access Journals (Sweden)

    Gabriela-Felicia UNGUREANU

    2010-05-01

    The rapid increase in the size of U.S. companies from the early twentieth century created the need for audit procedures based on the selection of a part of the total population audited, to obtain reliable audit evidence characterizing the entire population, which consists of account balances or classes of transactions. Sampling is not used only in auditing; it is used in sample surveys, market analysis and medical research, in which one wants to reach a conclusion about a large body of data by examining only a part of it. The difference lies in the "population" from which the sample is selected, i.e. the set of data about which a conclusion is to be drawn. Audit sampling applies only to certain types of audit procedures.

  20. Sample preparation in biological mass spectrometry

    CERN Document Server

    Ivanov, Alexander R

    2011-01-01

    The aim of this book is to provide the researcher with important sample preparation strategies for a wide variety of analyte molecules, specimens, methods, and biological applications requiring mass spectrometric analysis as a detection end-point.

  1. ROE Gulf of Mexico Hypoxia Sample Locations

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset describes dissolved oxygen levels in the Gulf of Mexico. Individual sampling sites are represented by point data. The background polygon shows areas...

  2. National Sample Survey of Registered Nurses

    Data.gov (United States)

    U.S. Department of Health & Human Services — The National Sample Survey of Registered Nurses (NSSRN) Download makes data from the survey readily available to users in a one-stop download. The Survey has been...

  3. DXC'09 Industrial Track Sample Data

    Data.gov (United States)

    National Aeronautics and Space Administration — Sample data, including nominal and faulty scenarios, for Tier 1 and Tier 2 of the First International Diagnostic Competition. Three file formats are provided,...

  4. Bisphenol A levels in multimedia samples

    Data.gov (United States)

    U.S. Environmental Protection Agency — Levels of bisphenol A in multimedia samples. This dataset is associated with the following publication: Morgan, M., M. Nash, D. Boyd Barr, J. Starr, M. Clifton, and...

  5. Surface sampling concentration and reaction probe

    Science.gov (United States)

    Van Berkel, Gary J; Elnaggar, Mariam S

    2013-07-16

    A method of analyzing a chemical composition of a specimen is described. The method can include providing a probe comprising an outer capillary tube and an inner capillary tube disposed co-axially within the outer capillary tube, where the inner and outer capillary tubes define a solvent capillary and a sampling capillary in fluid communication with one another at a distal end of the probe; contacting a target site on a surface of a specimen with a solvent in fluid communication with the probe; maintaining a plug volume proximate a solvent-specimen interface, wherein the plug volume is in fluid communication with the probe; draining plug sampling fluid from the plug volume through the sampling capillary; and analyzing a chemical composition of the plug sampling fluid with an analytical instrument. A system for performing the method is also described.

  6. NMFS Menhaden Biostatistical (Port Samples) Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Data set consists of port samples of gulf and Atlantic menhaden from the reduction purse-seine fisheries: data include specimen fork length, weight and age (yrs), as...

  7. Optimising uncertainty in physical sample preparation.

    Science.gov (United States)

    Lyn, Jennifer A; Ramsey, Michael H; Damant, Andrew P; Wood, Roger

    2005-11-01

    Uncertainty associated with the result of a measurement can be dominated by the physical sample preparation stage of the measurement process. In view of this, the Optimised Uncertainty (OU) methodology has been further developed to allow the optimisation of the uncertainty from this source, in addition to that from the primary sampling and the subsequent chemical analysis. This new methodology for the optimisation of physical sample preparation uncertainty (u(prep), estimated as s(prep)) is applied, for the first time, to a case study of myclobutanil in retail strawberries. An increase in expenditure (+7865%) on the preparatory process was advised in order to achieve the recommended 69% reduction in s(prep). This reduction is desirable given the predicted overall saving, under optimised conditions, of £33,000 per batch. The new methodology has been shown to provide guidance on the appropriate distribution of resources between the three principal stages of a measurement process, including physical sample preparation.

  8. DXC'11 Industrial Track Sample Data

    Data.gov (United States)

    National Aeronautics and Space Administration — The sample scenarios provided here are competition scenarios from Diagnostic Problems I and II of DXC'10. The zip file has a spreadsheet (and pdf) that lists the...

  9. Nitrate Waste Treatment Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    Vigil-Holterman, Luciana R. [Los Alamos National Laboratory; Martinez, Patrick Thomas [Los Alamos National Laboratory; Garcia, Terrence Kerwin [Los Alamos National Laboratory

    2017-07-05

    This plan is designed to outline the collection and analysis of nitrate salt-bearing waste samples required by the New Mexico Environment Department- Hazardous Waste Bureau in the Los Alamos National Laboratory (LANL) Hazardous Waste Facility Permit (Permit).

  10. Optimizing sampling approaches along ecological gradients

    DEFF Research Database (Denmark)

    Schweiger, Andreas; Irl, Severin D. H.; Steinbauer, Manuel

    2016-01-01

    1. Natural scientists and especially ecologists use manipulative experiments or field observations along gradients to differentiate patterns driven by processes from those caused by random noise. A well-conceived sampling design is essential for identifying, analysing and reporting underlying...

  11. Particle size distribution in ground biological samples.

    Science.gov (United States)

    Koglin, D; Backhaus, F; Schladot, J D

    1997-05-01

    Modern trace and retrospective analysis of Environmental Specimen Bank (ESB) samples requires surplus material prepared and characterized as reference materials. Before the biological samples can be analyzed and stored for long periods at cryogenic temperatures, the materials have to be pre-crushed; a milling and homogenization procedure then follows. For this preparation, a grinding device is cooled with liquid nitrogen to a temperature of -190 °C. A significant condition for homogeneous samples is that at least 90% of the particles should be smaller than 200 microns. In the German ESB the particle size distribution of the processed material is determined by means of a laser particle sizer. The decrease in particle size of deer liver and bream muscle after different grinding procedures, as well as the consequences of ultrasonic treatment of the sample before particle size measurement, have been investigated.

  12. Two-stage sampling for acceptance testing

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.; Bryan, M.F.

    1992-09-01

    Sometimes a regulatory requirement or a quality-assurance procedure sets an allowed maximum on a confidence limit for a mean. If the sample mean of the measurements is below the allowed maximum, but the confidence limit is above it, a very widespread practice is to increase the sample size and recalculate the confidence bound. The confidence level of this two-stage procedure is rarely found correctly, but instead is typically taken to be the nominal confidence level, found as if the final sample size had been specified in advance. In typical settings, the correct nominal α should be between the desired P(Type I error) and half that value. This note gives tables for the correct α to use, some plots of power curves, and an example of correct two-stage sampling.
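A small Monte Carlo sketch illustrates the inflation the authors describe, using a simplified known-sigma, z-based upper confidence bound rather than the exact tabulated procedure; the sample sizes and seed are illustrative.

```python
import numpy as np
from statistics import NormalDist

# Two-stage procedure with true mean exactly at the regulatory limit:
# accept at stage one if the UCL is below the limit; otherwise, if the
# sample mean is below the limit, enlarge the sample and recompute.
alpha, sigma, limit = 0.05, 1.0, 0.0
n1, n2 = 10, 20                        # initial and enlarged sample sizes
z = NormalDist().inv_cdf(1 - alpha)    # one-sided critical value

rng = np.random.default_rng(1)
reps = 200_000
first = rng.normal(limit, sigma, (reps, n1))
extra = rng.normal(limit, sigma, (reps, n2 - n1))

m1 = first.mean(axis=1)
ucl1 = m1 + z * sigma / np.sqrt(n1)
accept1 = ucl1 < limit                       # pass at stage one
retry = (~accept1) & (m1 < limit)            # mean OK but UCL too high
m2 = np.hstack([first, extra]).mean(axis=1)
ucl2 = m2 + z * sigma / np.sqrt(n2)
accept = accept1 | (retry & (ucl2 < limit))

print(accept.mean())   # ~0.08: well above the nominal alpha = 0.05
```

The estimated Type I error lands between the nominal 5% and twice that value, consistent with the range the abstract gives for the correct nominal α.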

  14. Designing optimal sampling schemes for field visits

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-10-01

    This presentation describes a statistical method for deriving optimal spatial sampling schemes. The research focuses on ground verification of minerals derived from hyperspectral data. Spectral angle mapper (SAM) and spectral feature fitting (SFF...

  15. Sample Return Systems for Extreme Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In Phase I we were able to demonstrate that sample return missions utilizing high velocity penetrators (0.1- 1 km/s) could provide substantial new capabilities for...

  16. BioSampling Data from LHP Cruises

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set includes separate bioSampling logs from each LHP Bottomfishing cruise both within and outside of the Main Hawaiian Islands, as well as a master file...

  17. Guidelines for sampling fish in inland waters

    National Research Council Canada - National Science Library

    Backiel, Tadeusz; Welcomme, R. L

    1980-01-01

    The book is addressed mainly to Fishery Biologists but it is hoped that Fishing Gear Technologists also can acquire some basic knowledge of sampling problems and procedures which, in turn, can result...

  18. Sampling: Making Electronic Discovery More Cost Effective

    Directory of Open Access Journals (Sweden)

    Milton Luoma

    2011-06-01

    Full Text Available With the huge volumes of electronic data subject to discovery in virtually every instance of litigation, time and costs of conducting discovery have become exceedingly important when litigants plan their discovery strategies.  Rather than incurring the costs of having lawyers review every document produced in response to a discovery request in search of relevant evidence, a cost effective strategy for document review planning is to use statistical sampling of the database of documents to determine the likelihood of finding relevant evidence by reviewing additional documents.  This paper reviews and discusses how sampling can be used to make document review more cost effective by considering issues such as an appropriate sample size, how to develop a sampling strategy, and taking into account the potential value of the litigation in relation to the costs of additional discovery efforts. 
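The sample-size question the article raises is commonly answered with the standard formula for estimating a proportion. A minimal sketch (the 95%/±5% figures are illustrative defaults, not taken from the article):

```python
import math

def sample_size(confidence_z=1.96, p=0.5, margin=0.05):
    """Documents to review so that the observed prevalence of relevant
    material is within `margin` of the true rate, at the confidence
    level implied by z (worst case p = 0.5)."""
    return math.ceil(confidence_z**2 * p * (1 - p) / margin**2)

# 95% confidence, +/- 5 percentage points -> 385 documents to review,
# essentially regardless of whether the collection holds ten thousand
# or ten million files
print(sample_size())  # 385
```

This independence from collection size is what makes sampling attractive in e-discovery: the review cost is driven by the desired precision, not by the data volume.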

  19. Boson sampling on a photonic chip.

    Science.gov (United States)

    Spring, Justin B; Metcalf, Benjamin J; Humphreys, Peter C; Kolthammer, W Steven; Jin, Xian-Min; Barbieri, Marco; Datta, Animesh; Thomas-Peter, Nicholas; Langford, Nathan K; Kundys, Dmytro; Gates, James C; Smith, Brian J; Smith, Peter G R; Walmsley, Ian A

    2013-02-15

    Although universal quantum computers ideally solve problems such as factoring integers exponentially more efficiently than classical machines, the formidable challenges in building such devices motivate the demonstration of simpler, problem-specific algorithms that still promise a quantum speedup. We constructed a quantum boson-sampling machine (QBSM) to sample the output distribution resulting from the nonclassical interference of photons in an integrated photonic circuit, a problem thought to be exponentially hard to solve classically. Unlike universal quantum computation, boson sampling merely requires indistinguishable photons, linear state evolution, and detectors. We benchmarked our QBSM with three and four photons and analyzed sources of sampling inaccuracy. Scaling up to larger devices could offer the first definitive quantum-enhanced computation.
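The classical hardness referred to above rests on the matrix permanent: each boson-sampling output probability is proportional to |Perm(A)|² for a submatrix A of the interferometer's transition matrix, and the permanent, unlike the determinant, has no known efficient classical algorithm. A naive sketch, adequate for the three- and four-photon scale of this experiment:

```python
from itertools import permutations

def permanent(m):
    """Permanent by direct summation over all permutations, O(n * n!).
    Practical only for small n, which is exactly the point: no
    determinant-style shortcut is known."""
    n = len(m)
    total = 0
    for perm in permutations(range(n)):
        prod = 1
        for i, j in enumerate(perm):
            prod *= m[i][j]
        total += prod
    return total

print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```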

  20. Hydraulically controlled discrete sampling from open boreholes

    Science.gov (United States)

    Harte, Philip T.

    2013-01-01

    Groundwater sampling from open boreholes in fractured-rock aquifers is particularly challenging because of mixing and dilution of fluid within the borehole from multiple fractures. This note presents an alternative to traditional sampling in open boreholes with packer assemblies. The alternative system called ZONFLO (zonal flow) is based on hydraulic control of borehole flow conditions. Fluid from discrete fractures zones are hydraulically isolated allowing for the collection of representative samples. In rough-faced open boreholes and formations with less competent rock, hydraulic containment may offer an attractive alternative to physical containment with packers. Preliminary test results indicate a discrete zone can be effectively hydraulically isolated from other zones within a borehole for the purpose of groundwater sampling using this new method.

  1. Water Sample Points, Navajo Nation, 2000, USACE

    Data.gov (United States)

    U.S. Environmental Protection Agency — This point shapefile presents the locations and results for water samples collected on the Navajo Nation by the US Army Corps of Engineers (USACE) for the US...

  2. CNMI Commercial Fisheries BioSampling (CFBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The main market sampling program in the Commonwealth of the Northern Mariana Islands (CNMI) is the new biosampling program implemented in late 2010 on the island of...

  3. Sampling, Probability Models and Statistical Reasoning Statistical ...

    Indian Academy of Sciences (India)

Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58 ...

  4. Spatial-dependence recurrence sample entropy

    Science.gov (United States)

    Pham, Tuan D.; Yan, Hong

    2018-03-01

Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, sample entropy does not take sequential information, which is inherently useful, into account in its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of sample entropy and enables the capture of the sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
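For reference, the classic sample entropy that this method builds on can be sketched directly from its definition (a plain O(N²) implementation, not the authors' code; m and r are the usual template length and tolerance):

```python
import math
import random

def sample_entropy(series, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts pairs of length-m templates that
    match within tolerance r (Chebyshev distance), A counts the same
    for length m+1. Lower values indicate a more regular signal."""
    def matches(mm):
        t = [series[i:i + mm] for i in range(len(series) - mm + 1)]
        return sum(
            1
            for i in range(len(t))
            for j in range(i + 1, len(t))
            if max(abs(a - b) for a, b in zip(t[i], t[j])) <= r
        )
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

regular = [0.0, 1.0] * 30          # perfectly periodic signal
rng = random.Random(7)
noisy = [rng.random() for _ in range(60)]
print(sample_entropy(regular), sample_entropy(noisy))
```

The periodic signal scores far lower than the noisy one, which is the behavior the co-occurrence extension described above inherits and refines with sequential information.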

  5. Importance Sampling Variance Reduction in GRESS ATMOSIM

    Energy Technology Data Exchange (ETDEWEB)

    Wakeford, Daniel Tyler [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-26

    This document is intended to introduce the importance sampling method of variance reduction to a Geant4 user for application to neutral particle Monte Carlo transport through the atmosphere, as implemented in GRESS ATMOSIM.

  6. Commercial Fisheries Database Biological Sample (CFDBS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Age and length frequency data for finfish and invertebrate species collected during commercial fishing vessels. Samples are collected by fisheries reporting...

  7. Comparison of metagenomic samples using sequence signatures

    Directory of Open Access Journals (Sweden)

    Jiang Bai

    2012-12-01

Full Text Available Abstract Background Sequence signatures, as defined by the frequencies of k-tuples (or k-mers, k-grams), have been used extensively to compare genomic sequences of individual organisms, to identify cis-regulatory modules, and to study the evolution of regulatory sequences. Recently, many next-generation sequencing (NGS) read data sets of metagenomic samples from a variety of different environments have been generated. The assembly of these reads can be difficult, and analysis methods based on mapping reads to genes or pathways are also restricted by the availability and completeness of existing databases. Sequence-signature-based methods, however, do not need the complete genomes or existing databases and thus can potentially be very useful for the comparison of metagenomic samples using NGS read data. Still, the applications of sequence signature methods for the comparison of metagenomic samples have not been well studied. Results We studied several dissimilarity measures, including d2, d2* and d2S recently developed by our group, a measure (hereinafter noted as Hao) used in CVTree developed by Hao's group (Qi et al., 2004), measures based on relative di-, tri-, and tetra-nucleotide frequencies as in Willner et al. (2009), as well as standard lp measures between the frequency vectors, for the comparison of metagenomic samples using sequence signatures. We compared their performance using a series of extensive simulations and three real next-generation sequencing (NGS) metagenomic datasets: 39 fecal samples from 33 mammalian host species, 56 marine samples across the world, and 13 fecal samples from human individuals. Results showed that the dissimilarity measure d2S can achieve superior performance when comparing metagenomic samples by clustering them into different groups as well as recovering environmental gradients affecting microbial samples. New insights into the environmental factors affecting microbial compositions in metagenomic samples
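The d2S and d2* measures studied here centre the k-tuple counts by their expectations under a background Markov model; the sketch below shows only the simplest member of the family, a cosine-type d2 dissimilarity on raw k-tuple counts, to make the signature idea concrete:

```python
from collections import Counter
from math import sqrt

def kmer_counts(seq, k=3):
    """Sequence signature: counts of all overlapping k-tuples."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2_dissimilarity(s1, s2, k=3):
    """1 minus the cosine similarity of raw k-tuple count vectors
    (the d2* and d2S variants additionally centre the counts by their
    expectations under a background model)."""
    c1, c2 = kmer_counts(s1, k), kmer_counts(s2, k)
    dot = sum(c1[w] * c2[w] for w in c1)       # Counter returns 0 for misses
    n1 = sqrt(sum(v * v for v in c1.values()))
    n2 = sqrt(sum(v * v for v in c2.values()))
    return 1 - dot / (n1 * n2)

print(d2_dissimilarity("ACGTACGTACGT", "ACGTACGTACGT"))  # ~0: identical
print(d2_dissimilarity("AAAAAAAA", "CCCCCCCC"))          # 1.0: no shared 3-mers
```

Because only k-tuple counts are needed, the same computation applies directly to unassembled NGS reads, which is the property the paper exploits.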

  8. Streaming Gibbs Sampling for LDA Model

    OpenAIRE

    Gao, Yang; Chen, Jianfei; Zhu, Jun

    2016-01-01

Streaming variational Bayes (SVB) is successful in learning LDA models in an online manner. However, previous attempts toward developing online Monte Carlo methods for LDA have had little success, often having much worse perplexity than their batch counterparts. We present a streaming Gibbs sampling (SGS) method, an online extension of the collapsed Gibbs sampling (CGS). Our empirical study shows that SGS can reach similar perplexity as CGS, much better than SVB. Our distributed version of SGS,...
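As context, the batch collapsed Gibbs sampler (CGS) that SGS extends can be sketched in a few lines. This is a toy implementation on integer word ids, not the authors' code; the update resamples each token's topic conditioned on all other assignments:

```python
import random

def collapsed_gibbs_lda(docs, n_topics=2, alpha=0.1, beta=0.1,
                        iters=200, seed=0):
    """Batch CGS for LDA. For each token with word id w in document d:
    p(z = k) is proportional to (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)."""
    rng = random.Random(seed)
    V = max(w for d in docs for w in d) + 1
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    n_dk = [[0] * n_topics for _ in docs]         # doc-topic counts
    n_kw = [[0] * V for _ in range(n_topics)]     # topic-word counts
    n_k = [0] * n_topics                          # topic totals
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            n_dk[d][k] += 1; n_kw[k][w] += 1; n_k[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                       # remove current assignment
                n_dk[d][k] -= 1; n_kw[k][w] -= 1; n_k[k] -= 1
                weights = [(n_dk[d][t] + alpha) * (n_kw[t][w] + beta) /
                           (n_k[t] + V * beta) for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights, k=1)[0]
                z[d][i] = k                       # record new assignment
                n_dk[d][k] += 1; n_kw[k][w] += 1; n_k[k] += 1
    return z, n_kw

# Two tiny "topics": documents over words {0,1} versus words {2,3}
docs = [[0, 1, 0, 1, 0], [2, 3, 2, 3, 2], [0, 1, 1, 0], [3, 2, 3, 2]]
z, n_kw = collapsed_gibbs_lda(docs)
print(sum(map(sum, n_kw)))  # 18: one count per token, conserved by the sweep
```

The streaming variant keeps these count matrices across arriving mini-batches instead of resampling the full corpus each sweep.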

  9. Exact sampling hardness of Ising spin models

    Science.gov (United States)

    Fefferman, B.; Foss-Feig, M.; Gorshkov, A. V.

    2017-09-01

    We study the complexity of classically sampling from the output distribution of an Ising spin model, which can be implemented naturally in a variety of atomic, molecular, and optical systems. In particular, we construct a specific example of an Ising Hamiltonian that, after time evolution starting from a trivial initial state, produces a particular output configuration with probability very nearly proportional to the square of the permanent of a matrix with arbitrary integer entries. In a similar spirit to boson sampling, the ability to sample classically from the probability distribution induced by time evolution under this Hamiltonian would imply unlikely complexity theoretic consequences, suggesting that the dynamics of such a spin model cannot be efficiently simulated with a classical computer. Physical Ising spin systems capable of achieving problem-size instances (i.e., qubit numbers) large enough so that classical sampling of the output distribution is classically difficult in practice may be achievable in the near future. Unlike boson sampling, our current results only imply hardness of exact classical sampling, leaving open the important question of whether a much stronger approximate-sampling hardness result holds in this context. The latter is most likely necessary to enable a convincing experimental demonstration of quantum supremacy. As referenced in a recent paper [A. Bouland, L. Mancinska, and X. Zhang, in Proceedings of the 31st Conference on Computational Complexity (CCC 2016), Leibniz International Proceedings in Informatics (Schloss Dagstuhl-Leibniz-Zentrum für Informatik, Dagstuhl, 2016)], our result completes the sampling hardness classification of two-qubit commuting Hamiltonians.

  10. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  11. Object Detection with Active Sample Harvesting

    OpenAIRE

    Canévet, Olivier

    2017-01-01

    The work presented in this dissertation lies in the domains of image classification, object detection, and machine learning. Whether it is training image classifiers or object detectors, the learning phase consists in finding an optimal boundary between populations of samples. In practice, all the samples are not equally important: some examples are trivially classified and do not bring much to the training, while others close to the boundary or misclassified are the ones that truly matter. S...

  12. Applicability of passive sampling to groundwater monitoring

    OpenAIRE

    Berho, Catherine; Togola, Anne; Ghestem, Jean Philippe

    2011-01-01

    Passive sampling technology has become of great importance in the field of environmental monitoring for several years, due to its well-known advantages (low perturbation of the sample, time weighted average concentration estimation ...). Although passive samplers have been successfully used in a variety of field studies in surface waters, only a few studies have tested their applicability in groundwater. Indeed, groundwater presents specificity such as a low velocity of water which might affe...

  13. Sampling for assurance of future reliability

    Science.gov (United States)

    Klauenberg, Katy; Elster, Clemens

    2017-02-01

Ensuring measurement trueness, compliance with regulations and conformity with standards are key tasks in metrology which are often considered at the time of an inspection. Current practice does not always verify quality after or between inspections, calibrations, laboratory comparisons, conformity assessments, etc. Statistical models describing behavior over time may ensure reliability, i.e. they may give the probability of functioning, compliance or survival until some future point in time. It may not always be possible or economic to inspect a whole population of measuring devices or other units. Selecting a subset of the population according to statistical sampling plans and inspecting only these allows conclusions about the quality of the whole population with a certain confidence. Combining these issues of sampling and aging raises questions such as: how many devices need to be inspected, and at least how many of them must conform, so that one can be sure that more than 100p% of the population will comply until the next inspection? This research aims to raise awareness and offer a simple answer to such time- and sample-based quality statements in metrology and beyond. Reliability demonstration methods, such as the prevailing Weibull binomial model, quantify the confidence in future reliability on the basis of a sample. We adapt the binomial model to be applicable to sampling without replacement and simplify the Weibull model so that sampling plans may be determined on the basis of existing ISO standards. Provided the model is suitable, no additional information and no software are needed; and yet, the consumer is protected against future failure. We establish new sampling plans for utility meter surveillance, which are required by a recent modification of German law. These sampling plans are given in tables similar to the previous ones, which demonstrates their suitability for everyday use.
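The simplest instance of the question posed above is the zero-failure ("success-run") case of the binomial model: how many units must be inspected, all conforming, to claim a given reliability with a given confidence. The paper adapts this to sampling without replacement and to Weibull aging; the sketch below shows only the basic binomial case:

```python
import math

def zero_failure_sample_size(reliability, confidence):
    """Smallest n such that, if all n inspected units conform, one is
    `confidence` sure the population reliability exceeds `reliability`
    (binomial model: reliability**n <= 1 - confidence)."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

# 95% confidence that more than 90% of the meters comply:
print(zero_failure_sample_size(0.90, 0.95))  # 29
```

Tightening the claim is expensive: demonstrating 99% rather than 90% reliability at the same confidence raises the zero-failure sample from 29 to 299 units.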

  14. Harpoon-based sample Acquisition System

    Science.gov (United States)

    Bernal, Javier; Nuth, Joseph; Wegel, Donald

    2012-02-01

Acquiring information about the composition of comets, asteroids, and other near-Earth objects is very important because they may contain the primordial ooze of the solar system and the origins of life on Earth. Sending a spacecraft is the obvious answer, but once it gets there it needs to collect and analyze samples. Conceptually, a drill or a shovel would work, but both require something extra to anchor them to the comet, adding to the cost and complexity of the spacecraft. Since comets and asteroids are very low gravity objects, drilling becomes a problem: without a grappling mechanism, the drill would push the spacecraft off the surface. Harpoons have been proposed as grappling mechanisms in the past and are currently flying on missions such as ROSETTA. We propose to use a hollow, core-sampling harpoon to act as the anchoring mechanism as well as the sample collecting device. By combining these two functions, mass is reduced, more samples can be collected, and the spacecraft can carry more propellant. Although challenging, returning the collected samples to Earth allows them to be analyzed in laboratories with much greater detail than possible on a spacecraft. Also, bringing the samples back to Earth allows future generations to study them.

  15. High-efficiency multiphoton boson sampling

    Science.gov (United States)

    Wang, Hui; He, Yu; Li, Yu-Huai; Su, Zu-En; Li, Bo; Huang, He-Liang; Ding, Xing; Chen, Ming-Cheng; Liu, Chang; Qin, Jian; Li, Jin-Peng; He, Yu-Ming; Schneider, Christian; Kamp, Martin; Peng, Cheng-Zhi; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei

    2017-06-01

    Boson sampling is considered as a strong candidate to demonstrate 'quantum computational supremacy' over classical computers. However, previous proof-of-principle experiments suffered from small photon number and low sampling rates owing to the inefficiencies of the single-photon sources and multiport optical interferometers. Here, we develop two central components for high-performance boson sampling: robust multiphoton interferometers with 99% transmission rate and actively demultiplexed single-photon sources based on a quantum dot-micropillar with simultaneously high efficiency, purity and indistinguishability. We implement and validate three-, four- and five-photon boson sampling, and achieve sampling rates of 4.96 kHz, 151 Hz and 4 Hz, respectively, which are over 24,000 times faster than previous experiments. Our architecture can be scaled up for a larger number of photons and with higher sampling rates to compete with classical computers, and might provide experimental evidence against the extended Church-Turing thesis.

  16. Micro contactor based on isotachophoretic sample transport.

    Science.gov (United States)

    Goet, Gabriele; Baier, Tobias; Hardt, Steffen

    2009-12-21

    It is demonstrated how isotachophoresis (ITP) in a microfluidic device may be utilized to bring two small sample volumes into contact in a well-controlled manner. The ITP contactor serves a similar purpose as micromixers that are designed to mix two species rapidly in a microfluidic channel. In contrast to many micromixers, the ITP contactor does not require complex channel architectures and allows a sample processing in the spirit of "digital microfluidics", i.e. the samples always remain in a compact volume. It is shown that the ITP zone transport through microchannels proceeds in a reproducible and predictable manner, and that the sample trajectories follow simple relationships obtained from Ohm's law. Firstly, the micro contactor can be used to synchronize two ITP zones having reached a channel at different points in time. Secondly, fulfilling its actual purpose it is capable of bringing two samples in molecular contact via an interpenetration of ITP zones. It is demonstrated that the contacting time is proportional to the ITP zone extension. This opens up the possibility of using that type of device as a special type of micromixer with "mixing times" significantly below one second and an option to regulate the duration of contact through specific parameters such as the sample volume. Finally, it is shown how the micro contactor can be utilized to conduct a hybridization reaction between two ITP zones containing complementary DNA strands.

  17. On sampling fractions and electron shower shapes

    Energy Technology Data Exchange (ETDEWEB)

    Peryshkin, Alexander; Raja, Rajendran; /Fermilab

    2011-12-01

We study the usage of various definitions of sampling fractions in understanding electron shower shapes in a sampling multilayer electromagnetic calorimeter. We show that the sampling fractions obtained by the conventional definition (I) of (average observed energy in layer)/(average deposited energy in layer) will not give the best energy resolution for the calorimeter. The reason for this is shown to be the presence of layer-by-layer correlations in an electromagnetic shower. The best resolution is obtained by minimizing the deviation from the total input energy using a least squares algorithm. The 'sampling fractions' obtained by this method (II) are shown to give the best resolution for overall energy. We further show that the method (II) sampling fractions are obtained by summing the columns of a non-local λ tensor that incorporates the correlations. We establish that the sampling fractions (II) cannot be used to predict the layer-by-layer energies and that one needs to employ the full λ tensor for this purpose. This effect is again a result of the correlations.
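The distinction between the two definitions can be illustrated on a toy two-layer model (entirely hypothetical numbers, not the paper's calorimeter): method I divides each layer's signal by its average sampling fraction, while method II fits weights by least squares against the known total energy. Because the least-squares optimum is taken over the same class of linear estimators, it can only improve the in-sample resolution:

```python
import random

# Toy two-layer sampling calorimeter: fixed incident energy E, an
# event-by-event longitudinal split f (this creates the layer-to-layer
# anti-correlation), a 2% sampling fraction and independent readout noise.
rng = random.Random(0)
E, events = 100.0, []
for _ in range(5000):
    f = rng.gauss(0.6, 0.1)
    x1 = 0.02 * f * E + rng.gauss(0.0, 0.1)
    x2 = 0.02 * (1 - f) * E + rng.gauss(0.0, 0.1)
    events.append((x1, x2, f))

# Method I: per-layer sampling fraction = <observed>/<deposited>
s1 = sum(x1 for x1, _, _ in events) / sum(f * E for _, _, f in events)
s2 = sum(x2 for _, x2, _ in events) / sum((1 - f) * E for _, _, f in events)

# Method II: weights minimizing sum (w1*x1 + w2*x2 - E)^2, via the
# 2x2 normal equations solved explicitly
S11 = sum(x1 * x1 for x1, _, _ in events)
S12 = sum(x1 * x2 for x1, x2, _ in events)
S22 = sum(x2 * x2 for _, x2, _ in events)
b1 = sum(x1 * E for x1, _, _ in events)
b2 = sum(x2 * E for _, x2, _ in events)
det = S11 * S22 - S12 * S12
w1 = (b1 * S22 - b2 * S12) / det
w2 = (S11 * b2 - S12 * b1) / det

def rms(residuals):
    return (sum(r * r for r in residuals) / len(residuals)) ** 0.5

rms_i = rms([x1 / s1 + x2 / s2 - E for x1, x2, _ in events])
rms_ii = rms([w1 * x1 + w2 * x2 - E for x1, x2, _ in events])
print(rms_i, rms_ii)  # least squares can only do better in-sample
```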

  18. Electrostatic sampling of trace DNA from clothing.

    Science.gov (United States)

    Zieger, Martin; Defaux, Priscille Merciani; Utz, Silvia

    2016-05-01

    During acts of physical aggression, offenders frequently come into contact with clothes of the victim, thereby leaving traces of DNA-bearing biological material on the garments. Since tape-lifting and swabbing, the currently established methods for non-destructive trace DNA sampling from clothing, both have their shortcomings in collection efficiency and handling, we thought about a new collection method for these challenging samples. Testing two readily available electrostatic devices for their potential to sample biological material from garments made of different fabrics, we found one of them, the electrostatic dust print lifter (DPL), to perform comparable to well-established sampling with wet cotton swabs. In simulated aggression scenarios, we had the same success rate for the establishment of single aggressor profiles, suitable for database submission, with both the DPL and wet swabbing. However, we lost a substantial amount of information with electrostatic sampling, since almost no mixed aggressor-victim profiles suitable for database entry could be established, compared to conventional swabbing. This study serves as a proof of principle for electrostatic DNA sampling from items of clothing. The technique still requires optimization before it might be used in real casework. But we are confident that in the future it could be an efficient and convenient contribution to the toolbox of forensic practitioners.

  19. Quota sampling in internet research: practical issues.

    Science.gov (United States)

    Im, Eun-Ok; Chee, Wonshik

    2011-07-01

Quota sampling has been suggested as a potentially good method for Internet-based research and has been used by several researchers working with Internet samples. However, very little is known about the issues or concerns in using a quota sampling method in Internet research. The purpose of this article was to present the practical issues encountered in using quota sampling in an Internet-based study. During the Internet study, the research team recorded all recruitment issues that arose and made written notes indicating the possible reasons for the problems. In addition, biweekly team discussions were conducted, for which written records were kept. Overall, quota sampling was effective in ensuring that an adequate number of midlife women were recruited from the targeted ethnic groups. However, during the study process, we encountered the following practical issues using quota sampling: (1) difficulty reaching out to women in lower socioeconomic classes, (2) difficulty ensuring authenticity of participants' identities, (3) participants giving inconsistent answers for the screening questions versus the Internet survey questions, (4) potential problems with a question on socioeconomic status, (5) resentment toward the research project and/or researchers because of rejection, and (6) a longer time and more expense than anticipated.
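Mechanically, quota sampling amounts to a screening gate: a respondent is enrolled only while her group's quota remains unfilled. A minimal sketch of that gate (the group labels and quota sizes are purely illustrative, not from the study):

```python
def make_quota_screener(quotas):
    """Return an enroll(group) function that accepts a respondent only
    while that group's quota is unfilled, plus the live count dict."""
    counts = {g: 0 for g in quotas}
    def enroll(group):
        if group in counts and counts[group] < quotas[group]:
            counts[group] += 1
            return True
        return False                     # quota full, or unknown group
    return enroll, counts

quotas = {"group_A": 3, "group_B": 2}
enroll, counts = make_quota_screener(quotas)
stream = ["group_A", "group_A", "group_B", "group_A",
          "group_A", "group_B", "group_B"]
accepted = [g for g in stream if enroll(g)]
print(counts)  # {'group_A': 3, 'group_B': 2}: each group capped at its quota
```

Issues (2) and (3) in the abstract arise precisely because this gate relies on self-reported screening answers that the survey itself may later contradict.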

  20. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  1. Demystifying Theoretical Sampling in Grounded Theory Research

    Directory of Open Access Journals (Sweden)

Jenna Breckenridge, BSc (Hons), PhD Candidate

    2009-06-01

    Full Text Available Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is ‘grounded’ in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.

  2. Collecting Samples in Gale Crater, Mars; an Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    Science.gov (United States)

    Anderson, R. C.; Jandura, L.; Okon, A. B.; Sunshine, D.; Roumeliotis, C.; Beegle, L. W.; Hurowitz, J.; Kennedy, B.; Limonadi, D.; McCloskey, S.; Robinson, M.; Seybold, C.; Brown, K.

    2012-09-01

    The Mars Science Laboratory Mission (MSL), scheduled to land on Mars in the summer of 2012, consists of a rover and a scientific payload designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem will be the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed and separated into fine particles and distributed to two onboard analytical science instruments SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy) or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, Alpha Particle X-Ray Spectrometer (APXS), and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and or mast mounted instruments (e.g. Mast Cameras (MastCam) and the Chemistry and Micro-Imaging instruments (ChemCam)).

  3. UNLABELED SELECTED SAMPLES IN FEATURE EXTRACTION FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES WITH LIMITED TRAINING SAMPLES

    Directory of Open Access Journals (Sweden)

    A. Kianisarkaleh

    2015-12-01

Full Text Available Feature extraction plays a key role in hyperspectral image classification. Using unlabeled samples, which are often available in almost unlimited numbers, unsupervised and semisupervised feature extraction methods show better performance when only a limited number of training samples exists. This paper illustrates the importance of selecting the appropriate unlabeled samples used in feature extraction methods, and proposes a new method for unlabeled sample selection using spectral and spatial information. The proposed method has four parts: PCA, prior classification, posterior classification and sample selection. As the hyperspectral image passes through these parts, the selected unlabeled samples can be used in arbitrary feature extraction methods. The effectiveness of the proposed unlabeled selected samples in unsupervised and semisupervised feature extraction is demonstrated using two real hyperspectral datasets. Results show that, by selecting appropriate unlabeled samples, the proposed method can improve the performance of feature extraction methods and increase classification accuracy.

  4. Determining the Mineralogy of Lunar Samples Using Micro Raman Spectroscopy: Comparisons Between Polished and Unpolished Samples

    Science.gov (United States)

    Bower, D. M.; Curran, N. M.; Cohen, B. A.

    2017-10-01

    Raman spectroscopy is a versatile non-destructive analytical technique that provides compositional and contextual information for geologic samples, including lunar rocks. We have analyzed a suite of Apollo 16 samples using micro Raman spectroscopy.

  5. Sample results from the interim salt disposition program macrobatch 9 tank 21H qualification samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-11-01

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Macrobatch (Salt Batch) 9 for the Interim Salt Disposition Program (ISDP). This document reports characterization data on the samples of Tank 21H.

  6. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    Science.gov (United States)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.

  7. Procedures for sampling and sample reduction within quality assurance systems for solid biofuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The objective of this experimental study on sampling was to determine the size and number of samples of biofuels required (taken at two sampling points in each case) and to compare two methods of sampling. The first objective of the sample-reduction exercise was to compare the reliability of various sampling methods, and the second objective was to measure the variations introduced as a result of reducing the sample size to form suitable test portions. The materials studied were sawdust, wood chips, wood pellets and bales of straw, and these were analysed for moisture, ash, particle size and chloride. The sampling procedures are described. The study was conducted in Scandinavia. The results of the study were presented in Leipzig in October 2004. The work was carried out as part of the UK's DTI Technology Programme: New and Renewable Energy.

  8. GSAMPLE: Stata module to draw a random sample

    OpenAIRE

    Jann, Ben

    2006-01-01

    gsample draws a random sample from the data in memory. Simple random sampling (SRS) is supported, as well as unequal probability sampling (UPS), of which sampling with probabilities proportional to size (PPS) is a special case. Both methods, SRS and UPS/PPS, allow sampling with and without replacement. Furthermore, stratified sampling and cluster sampling are supported.
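A minimal NumPy sketch of the two schemes named above, SRS and PPS; gsample itself is a Stata module, and the toy data and size measure below are assumptions made for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.arange(10)  # hypothetical records "in memory"
size = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5], dtype=float)  # hypothetical size measure

# Simple random sampling (SRS) without replacement
srs = rng.choice(data, size=4, replace=False)

# PPS: selection probabilities proportional to size, with replacement
# (exact PPS without replacement needs specialized algorithms not shown here)
p = size / size.sum()
pps = rng.choice(data, size=4, replace=True, p=p)
```

Note that drawing without replacement while honoring exact PPS inclusion probabilities is a harder problem (e.g. Brewer's or Sampford's method), which is part of what makes a dedicated module like gsample useful.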

  9. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    Science.gov (United States)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  10. Influence of sampling depth and post-sampling analysis time on the ...

    African Journals Online (AJOL)

    Bacteriological analysis was carried out for samples taken at water depth and at 1, 6, 12 and 24 hours post-sampling. It was observed that the total and faecal coliform bacteria were significantly higher in the 3 m water depth samples than in the surface water samples (ANOVA, F = 59.41, 26.751, 9.82 (T.C); 46.41, 26.81, ...

  11. Replicating studies in which samples of participants respond to samples of stimuli.

    Science.gov (United States)

    Westfall, Jacob; Judd, Charles M; Kenny, David A

    2015-05-01

    In a direct replication, the typical goal is to reproduce a prior experimental result with a new but comparable sample of participants in a high-powered replication study. Often in psychology, the research to be replicated involves a sample of participants responding to a sample of stimuli. In replicating such studies, we argue that the same criteria should be used in sampling stimuli as are used in sampling participants. Namely, a new but comparable sample of stimuli should be used to ensure that the original results are not due to idiosyncrasies of the original stimulus sample, and the stimulus sample must often be enlarged to ensure high statistical power. In support of the latter point, we discuss the fact that in experiments involving samples of stimuli, statistical power typically does not approach 1 as the number of participants goes to infinity. As an example of the importance of sampling new stimuli, we discuss the bygone literature on the risky shift phenomenon, which was almost entirely based on a single stimulus sample that was later discovered to be highly unrepresentative. We discuss the use of both resampled stimulus sets and expanded stimulus sets, that is, stimulus samples that include the original stimuli plus new stimuli. © The Author(s) 2015.

  12. Single versus duplicate blood samples in ACTH stimulated adrenal vein sampling

    NARCIS (Netherlands)

    Dekkers, T.; Arntz, M.; Wilt, G.J. van der; Schultze Kool, L.J.; Sweep, F.C.; Hermus, A.R.M.M.; Lenders, J.W.M.; Deinum, J.

    2013-01-01

    BACKGROUND: Adrenal vein sampling (AVS) is the preferred test for subtyping primary aldosteronism. However, the procedure is technically demanding and costly. In AVS it is common practice to take duplicate blood samples at each location. In this paper we explore whether a single sample procedure ...

  13. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Science.gov (United States)

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  14. Why sampling scheme matters: the effect of sampling scheme on landscape genetic results

    Science.gov (United States)

    Michael K. Schwartz; Kevin S. McKelvey

    2008-01-01

    There has been a recent trend in genetic studies of wild populations where researchers have changed their sampling schemes from sampling pre-defined populations to sampling individuals uniformly across landscapes. This reflects the fact that many species under study are continuously distributed rather than clumped into obvious "populations". Once individual...

  15. Vibronic Boson Sampling: Generalized Gaussian Boson Sampling for Molecular Vibronic Spectra at Finite Temperature.

    Science.gov (United States)

    Huh, Joonsuk; Yung, Man-Hong

    2017-08-07

    Molecular vibronic spectroscopy, where the transitions involve non-trivial bosonic correlations due to the Duschinsky rotation, is strongly believed to be in a complexity class similar to that of Boson Sampling. At finite temperature, the problem is represented as a Boson Sampling experiment with correlated Gaussian input states. This molecular problem with temperature effects is intimately related to the various versions of Boson Sampling, which share a similar computational complexity. Here we provide a full description of this relation in the context of Gaussian Boson Sampling. We find a hierarchical structure that illustrates the relationship among the various Boson Sampling schemes. Specifically, we show that every instance of Gaussian Boson Sampling with an initial correlation can be simulated by an instance of Gaussian Boson Sampling without initial correlation, with only a polynomial overhead. Since every Gaussian state is associated with a thermal state, our result implies that every sampling problem in molecular vibronic transitions, at any temperature, can be simulated by Gaussian Boson Sampling associated with a product of vacuum modes. We refer to such a generalized Gaussian Boson Sampling, motivated by the molecular sampling problem, as Vibronic Boson Sampling.

  16. Sample Size for Measuring Grammaticality in Preschool Children from Picture-Elicited Language Samples

    Science.gov (United States)

    Eisenberg, Sarita L.; Guo, Ling-Yu

    2015-01-01

    Purpose: The purpose of this study was to investigate whether a shorter language sample elicited with fewer pictures (i.e., 7) would yield a percent grammatical utterances (PGU) score similar to that computed from a longer language sample elicited with 15 pictures for 3-year-old children. Method: Language samples were elicited by asking forty…

  17. Lunar Samples: Apollo Collection Tools, Curation Handling, Surveyor III and Soviet Luna Samples

    Science.gov (United States)

    Allton, J.H.

    2009-01-01

    The 6 Apollo missions that landed on the lunar surface returned 2196 samples with a total mass of 382 kg. The 58 samples weighing 21.5 kg collected on Apollo 11 expanded to 741 samples weighing 110.5 kg by the time of Apollo 17. The main goal on Apollo 11 was to obtain some material and return it safely to Earth. As we gained experience, the sampling tools and a more specific sampling strategy evolved. A summary of the sample types returned is shown in Table 1. By 1989, some statistics on allocation by sample type had been compiled [2]. The "scientific interest index" is based on the assumption that the more allocations per gram of sample, the higher the scientific interest; it is essentially a reflection of the amount of diversity within a given sample type. Samples were also set aside for biohazard testing. The samples set aside and used for biohazard testing were representative, as opposed to diverse. They tended to be larger and to consist of less scientifically valuable material, such as dust and debris from the bottom of sample containers.

  18. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Science.gov (United States)

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...

  19. Biological Sterilization of Returned Mars Samples

    Science.gov (United States)

    Allen, C. C.; Albert, F. G.; Combie, J.; Bodnar, R. J.; Hamilton, V. E.; Jolliff, B. L.; Kuebler, K.; Wang, A.; Lindstrom, D. J.; Morris, P. A.

    1999-01-01

    Martian rock and soil, collected by robotic spacecraft, will be returned to terrestrial laboratories early in the next century. Current plans call for the samples to be immediately placed into biological containment and tested for signs of present or past life and biological hazards. It is recommended that "Controlled distribution of unsterilized materials from Mars should occur only if rigorous analyses determine that the materials do not constitute a biological hazard. If any portion of the sample is removed from containment prior to completion of these analyses it should first be sterilized." While sterilization of Mars samples may not be required, an acceptable method must be available before the samples are returned to Earth. The sterilization method should be capable of destroying a wide range of organisms with minimal effects on the geologic samples. A variety of biological sterilization techniques and materials are currently in use, including dry heat, high-pressure steam, gases, plasmas and ionizing radiation. Gamma radiation is routinely used to inactivate viruses and destroy bacteria in medical research. Many commercial sterilizers use Co-60, which emits gamma photons of 1.17 and 1.33 MeV. Absorbed doses of approximately 1 Mrad (10^8 ergs/g) destroy most bacteria. This study investigates the effects of lethal doses of Co-60 gamma radiation on materials similar to those anticipated to be returned from Mars. The goals are to determine the gamma dose required to kill microorganisms in rock and soil samples and to determine the effects of gamma sterilization on the samples' isotopic, chemical and physical properties. Additional information is contained in the original extended abstract.
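As a quick arithmetic check on the dose figure quoted above, 1 Mrad can be converted to the equivalent erg/g and SI (kGy) values:

```python
# 1 rad deposits 100 erg per gram of absorber; 1 rad = 0.01 Gy (1 Gy = 1 J/kg).
ERG_PER_G_PER_RAD = 100.0
GRAY_PER_RAD = 0.01

mrad = 1e6  # 1 Mrad expressed in rad

erg_per_g = mrad * ERG_PER_G_PER_RAD     # 1e8 erg/g, matching the abstract
kilogray = mrad * GRAY_PER_RAD / 1000.0  # 10 kGy, a typical commercial sterilizing dose
```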

  20. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

    Background: Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods: We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness-of-fit measures. As a control, we used an un-weighted fitting method. Results: A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 and 560 annotated samples to achieve mean average and root mean squared error below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method (p < …). Conclusions: This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine annotation sample size for supervised machine learning.
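A minimal sketch of the approach the abstract describes: fit an inverse power law to early learning-curve points with weighted nonlinear least squares, then extrapolate to larger sample sizes. The model form, weighting scheme, and data points below are assumptions for illustration, not the paper's actual values:

```python
import numpy as np
from scipy.optimize import curve_fit

def inverse_power_law(x, a, b, c):
    # performance approaches the plateau a as x grows (with b < 0, c < 0)
    return a + b * np.power(x, c)

# hypothetical learning-curve points: (training-set size, accuracy)
x = np.array([50, 100, 200, 400, 800], dtype=float)
y = np.array([0.70, 0.78, 0.84, 0.88, 0.905])

# weight larger-sample points more heavily, in the spirit of the paper's
# weighted fitting; the exact weights here are an assumption
sigma = 1.0 / np.sqrt(x)

popt, _ = curve_fit(inverse_power_law, x, y, p0=(0.95, -1.0, -0.5), sigma=sigma)

# extrapolate the fitted curve to predict performance at a larger sample size
predicted_at_5000 = inverse_power_law(5000.0, *popt)
```

Inverting the fitted curve then answers the practical question: how many annotated samples are needed to reach a target performance.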